<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
		>

<channel>
	<title>Desobediencia Robótica | Site-Wide Activity</title>
	<link>https://desobedienciarobotica.org/comunidad/</link>
	<atom:link href="https://desobedienciarobotica.org/comunidad/feed/" rel="self" type="application/rss+xml" />
	<description>Activity feed for the entire site.</description>
	<lastBuildDate>Wed, 08 Apr 2026 01:57:22 +0000</lastBuildDate>
	<generator>https://buddypress.org/?v=2.8.30</generator>
	<language>es</language>
	<ttl>30</ttl>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>2</sy:updateFrequency>
		
								<item>
				<guid isPermaLink="false">712e6b3e11df64e42ee3aefa7734874c</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Linguistic Disobedience x NLP [Machine Unlearning]: [TASK] Fine-Tune Base Model w/ "Disobedient" Dataset</title>
				<link>https://desobedienciarobotica.org/comunidad/p/187/</link>
				<pubDate>Mon, 17 Nov 2025 01:04:40 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Fine-Tune Base Model w/ «Disobedient» Dataset</b></p>
<p>We have a dataset that challenges Spanish norms, and we are trying to fine-tune our base GPT-2 model so that we can start «contaminating» the output, aiming to invent new words out of this «contamination».</p>
<p>Our dataset&#8217;s clean version is «Las tres mitades de Ino Moxo y otros brujos de la Amazonía», a book by the Peruvian poet César Calvo that highlights many words from Amazonian dialects, creating poetic, fantastical, yet «incorrect» narrations.</p>
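<p>As a rough illustration of the preprocessing step, the book text has to be cut into consecutive fixed-length training blocks before fine-tuning. A minimal sketch in plain Python (the sample string and block size are placeholders, and a real pipeline would chunk by tokens, not characters):</p>

```python
# Sketch: split raw text into consecutive fixed-length blocks for
# causal-LM fine-tuning. Placeholder values, not the project's actual ones.

def make_blocks(text, block_size=64):
    """Normalize whitespace, then cut the text into blocks of block_size chars."""
    text = " ".join(text.split())
    return [text[i:i + block_size] for i in range(0, len(text), block_size)]

sample = "Las tres mitades de Ino Moxo y otros brujos de la Amazonía. " * 10
blocks = make_blocks(sample)
print(len(blocks), "blocks, longest:", max(len(b) for b in blocks))
```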
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">9691b13c02870b36b8f738b61fea2a2e</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Linguistic Disobedience x NLP [Machine Unlearning]: [TASK] [COMPLETE] &#x2714; Download Our First Base Model to Disobediently Train</title>
				<link>https://desobedienciarobotica.org/comunidad/p/186/</link>
				<pubDate>Sun, 16 Nov 2025 21:47:00 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Download Our First Base Model to Disobediently Train</b></p>
<p>We should experiment with various approaches, but we will begin by downloading a small GPT-2 Spanish model and training it on our dataset. To do this, we need to download the base model into our environment:</p>
<pre>python - &lt;&lt; 'EOF'<br />from transformers import AutoTokenizer, AutoModelForCausalLM<br />model_name = "DeepESP/gpt2-spanish"<br />print("Loading model:", model_name)<br />tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=True)<br />model = AutoModelForCausalLM.from_pretrained(model_name)<br />print("Loaded OK.")<br />EOF</pre>
<p>The «EOF» syntax (a shell heredoc) enables Python scripts to be written directly within the terminal window, eliminating the need to create a separate Python file. In theory, we could have written a Python file, for example &#8216;download_base_model.py&#8217;, with the following inside:</p>
<pre>from transformers import AutoTokenizer, AutoModelForCausalLM<br />model_name = "DeepESP/gpt2-spanish"<br />print("Loading model:", model_name)<br />tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=True)<br />model = AutoModelForCausalLM.from_pretrained(model_name)<br />print("Loaded OK.")</pre>
<p>And then run it by typing &#8216;python3 download_base_model.py&#8217; in our Terminal window; the two approaches are equivalent. Now that we have loaded our base model, we can test it:</p>
<pre>python - &lt;&lt; 'EOF'<br />import torch<br />from transformers import AutoTokenizer, AutoModelForCausalLM<br /><br />model_name = "DeepESP/gpt2-spanish"<br />print("Loading tokenizer/model...")<br />tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=True)<br />model = AutoModelForCausalLM.from_pretrained(model_name)<br /><br /># Move to MPS if available<br />device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")<br />model.to(device)<br />print("Using device:", device)<br /><br />prompt = "Un robot desobediente es"<br />inputs = tokenizer(prompt, return_tensors="pt").to(device)<br />with torch.no_grad():<br />    outputs = model.generate(<br />        **inputs,<br />        max_new_tokens=60,<br />        temperature=0.9,<br />        top_p=0.95,<br />        do_sample=True,<br />        pad_token_id=tokenizer.eos_token_id,<br />    )<br />text = tokenizer.decode(outputs[0], skip_special_tokens=True)<br />print("\n--- OUTPUT ---")<br />print(text)<br />EOF</pre>
<p>This asks the model to generate a maximum of 60 new tokens after the phrase (prompt) «Un robot desobediente es». The output is interesting and confirms our GPT-2 model works:</p>
<p><b>Un robot desobediente es el que se ha apoderado de la máquina, y la máquina, a fin de que no se haga daño o que la máquina deje de salir. «</b></p>
<p><b>«La máquina es el único superviviente de las máquinas humanas que ha sido destruida», reflexionó.</b></p>
<p>From here, we will want to begin «corrupting» our model. <b>This task has been completed.</b></p>
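<p>For intuition, the temperature and top_p parameters passed to model.generate() above can be sketched in plain Python on an invented next-token distribution (the vocabulary and scores are made up for the example; this is not the Transformers implementation):</p>

```python
import math, random

# Toy sketch of temperature + top_p (nucleus) sampling over invented scores.

def sample_top_p(logits, temperature=0.9, top_p=0.95, rng=random.Random(0)):
    # Temperature rescales logits; softmax turns them into probabilities.
    weights = [math.exp(l / temperature) for l in logits]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Keep the smallest set of tokens whose cumulative mass reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the kept tokens and sample one of them.
    mass = sum(probs[i] for i in kept)
    r, acc = rng.random() * mass, 0.0
    for i in kept:
        acc += probs[i]
        if r <= acc:
            return i
    return kept[-1]

logits = [2.0, 1.0, 0.1, -3.0]  # invented scores for 4 candidate tokens
choice = sample_top_p(logits)
print("sampled token index:", choice)
```

The very unlikely last token falls outside the nucleus here, so it is never sampled; that is the pruning top_p performs inside generate().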
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">c673287c59c0e28386853e88d7dd72cf</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Linguistic Disobedience x NLP [Machine Unlearning]: [TASK] [COMPLETE] &#x2714; Create a New Environment Using 'Miniconda3' to Train New NLP Model</title>
				<link>https://desobedienciarobotica.org/comunidad/p/185/</link>
				<pubDate>Sun, 16 Nov 2025 21:24:13 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Create a New Environment Using &#8216;Miniconda3&#8217; to Train New NLP Model</b></p>
<p>We have already installed Miniconda3, which we use as the default for our Linguistic Disobedience project. If Miniconda/conda isn&#8217;t already installed on the computer being used, install it first. Knowing that we have Miniconda3, we create a new environment:</p>
<pre>conda create -n disobedient-nlp python=3.11</pre>
<p>We are requesting that the environment run on Python 3.11, even though Python 3.12 is the default version installed through Homebrew on the system, because Python 3.11 is currently a safer, more compatible choice for machine learning work (PyTorch, Transformers, PEFT, etc.).</p>
<p>After this step, our environment will be created. An environment is important because, within it, we can select options such as a different Python version, among other things, without affecting the entire system. Once the environment is created, we can verify it exists:</p>
<pre>conda env list</pre>
<p>Then, we can activate it:</p>
<pre>conda activate disobedient-nlp</pre>
<p>In my case, once the environment is activated, the prompt shows its name in parentheses:</p>
<pre>(disobedient-nlp) NKA-MacBook-Pro:~ nka$</pre>
<p>Then, if we want to deactivate this environment:</p>
<pre>conda deactivate</pre>
<p>To configure the environment for what we want, we need to install some additional tools:</p>
<pre>pip install torch torchvision torchaudio   # For an M1 MBP, this will automatically install with MPS (Metal Performance Shaders) support</pre>
<pre>pip install transformers datasets peft accelerate sentencepiece   # Hugging Face tools</pre>
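<p>For reproducibility, the same setup can be captured in a single conda environment file and created in one step with «conda env create -f environment.yml». A sketch that mirrors the commands above (the file name is our choice):</p>

```
# environment.yml -- sketch mirroring the manual steps above
name: disobedient-nlp
channels:
  - defaults
dependencies:
  - python=3.11
  - pip
  - pip:
      - torch
      - torchvision
      - torchaudio
      - transformers
      - datasets
      - peft
      - accelerate
      - sentencepiece
```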
<p><b>This task is now complete.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">37e3a3c45cc542c43ed608fa30475273</guid>
				<title>Nicolas Kisic Aguirre created the group Linguistic Disobedience x NLP [Machine Unlearning]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/184/</link>
				<pubDate>Sun, 16 Nov 2025 21:17:56 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">f65f5e68c418c74302ac957458f563bb</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Raspberry Pi Zero 2 WH: [TASK] [COMPLETE] &#x2714; Enable 2 Wi-Fi Connections Using an External Wi-Fi Adapter</title>
				<link>https://desobedienciarobotica.org/comunidad/p/183/</link>
				<pubDate>Sat, 15 Nov 2025 19:42:33 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Enable 2 Wi-Fi Connections Using an External Wi-Fi Adapter</b></p>
<p>We want two connections to be available, as we have with robots DR1, DR2, and DR3. For our setup using the RPi Zero 2 WH (DRX), we want to connect to the local &#8216;disobedientrobots&#8217; network using the internal Wi-Fi card and, when an external Wi-Fi adapter is connected, to access the internet through it. To do so, we need to modify /etc/netplan/50-cloud-init.yaml:</p>
<pre>sudo nano /etc/netplan/50-cloud-init.yaml</pre>
<p>Then, we modify it to contain the following:</p>
<pre>network:<br />    version: 2<br />    wifis:<br />        wlx90de8023b2b9:   #the name of the external Wi-Fi adapter<br />            dhcp4: true<br />            access-points:<br />                "MCD-PUBLIC": {}<br />            optional: true<br />        wlan0:   # the internal Wi-Fi adapter<br />            dhcp4: false<br />            dhcp6: false<br />            addresses:<br />                - 192.168.2.111/24   # static IP address<br />            nameservers:<br />                addresses: [192.168.2.1, 8.8.8.8]<br />            access-points:<br />                "disobedientrobots":<br />                    password: "disobedientrobots"<br />            optional: true<br />    ethernets:<br />        eth0:<br />            optional: true<br />            dhcp4: true</pre>
<p>This should work. If for some reason the internet connection isn&#8217;t working, disconnect and reconnect the external Wi-Fi adapter, then wait briefly before trying again. Our DRX-server machine is now «111».</p>
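<p>As a quick side check (not part of the original setup), Python&#8217;s standard ipaddress module can confirm that the static address and the gateway above sit on the same /24 subnet:</p>

```python
import ipaddress

# Values copied from the netplan config above.
iface = ipaddress.ip_interface("192.168.2.111/24")
gateway = ipaddress.ip_address("192.168.2.1")

same_subnet = gateway in iface.network
print("host:", iface.ip, "network:", iface.network, "gateway on subnet:", same_subnet)
```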
<p><b>Marked as complete.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">726919af4d76d537dc29f0b46061c945</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Raspberry Pi Zero 2 WH: [TASK] [COMPLETE] &#x2714; Enable Multiple Terminal Windows Using Ubuntu 24 LTS Server</title>
				<link>https://desobedienciarobotica.org/comunidad/p/181/</link>
				<pubDate>Fri, 14 Nov 2025 22:43:18 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Enable Multiple Terminal Windows Using Ubuntu 24 LTS Server</b></p>
<p>Ubuntu Server has no GUI, meaning it works only within the «terminal». To enable multiple «windows», we can install tmux:</p>
<pre>sudo apt install tmux</pre>
<p>Then, to run tmux, we simply type in:</p>
<pre>tmux</pre>
<p>Ctrl+B, then C will create a new window.</p>
<p>Ctrl+B, then &#34; will split the screen horizontally, while Ctrl+B, then % will split it vertically. Once the screen is split, we can navigate between panes using Ctrl+B followed by the arrow keys.</p>
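<p>Optionally, tmux can be personalized through a ~/.tmux.conf file; a small sketch (these settings are our suggestions, not required by anything above):</p>

```
# ~/.tmux.conf -- optional tweaks
set -g mouse on           # click panes to switch, drag borders to resize
set -g base-index 1       # number windows starting at 1 instead of 0
setw -g pane-base-index 1 # same for panes
```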
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">60e266e42c99c8d8ce2e6a6af6827686</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Raspberry Pi Zero 2 WH: [TASK] [COMPLETE] &#x2714; Install ROS2 Jazzy (Bare Bones) into the Raspberry Pi Zero 2 WH</title>
				<link>https://desobedienciarobotica.org/comunidad/p/180/</link>
				<pubDate>Fri, 14 Nov 2025 21:50:15 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Install ROS2 Jazzy (Bare Bones) into the Raspberry Pi Zero 2 WH</b></p>
<p>After successfully installing Ubuntu 24 LTS Server onto the Pi Zero&#8217;s SD card using the Raspberry Pi Imager, we&#8217;ll need to install the «bare bones» version of ROS2 Jazzy on the system.</p>
<p>Before proceeding, we need to make sure we&#8217;re connected to the internet with the Pi Zero by adding the following to the /etc/netplan/50-cloud-init.yaml network config file:</p>
<pre>network:<br />  version: 2<br />  wifis:<br />    wlan0:<br />      dhcp4: true<br />      access-points:<br />        "MCD-PUBLIC": {}   # In our case, the network name at the Newlab is "MCD-PUBLIC" and has no password, hence the {}<br />      optional: true</pre>
<p>Then, following the <a target='_blank' href="https://docs.ros.org/en/jazzy/Installation/Ubuntu-Install-Debs.html" rel="nofollow">Official Ubuntu Install Instructions</a> for ROS2 bare bones (ros-jazzy-ros-base) should work well. It worked for us!</p>
<p>We had to install ros-jazzy-demo-nodes-cpp and ros-jazzy-demo-nodes-py after the fact to run the classic talker/listener test. It worked!</p>
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">71368699bca3809e80441da496051b0c</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Raspberry Pi Zero 2 WH: Using a Raspberry Pi Zero 2 WH to try a simple ROS2 Jazzy installation</title>
				<link>https://desobedienciarobotica.org/comunidad/p/179/</link>
				<pubDate>Fri, 14 Nov 2025 21:11:31 +0000</pubDate>

									<content:encoded><![CDATA[<p>Using a <a target='_blank' href="https://www.raspberrypi.com/products/raspberry-pi-zero-2-w/" rel="nofollow"><b>Raspberry Pi Zero 2 WH</b></a> to try a simple ROS2 Jazzy installation, enabling communication between robots and smaller devices like solenoid modules, small components, and other objects that could be part of our robot family without using too much energy or costing too much.</p>
<p>Following the <a target='_blank' href="https://docs.ros.org/en/jazzy/Tutorials/Beginner-CLI-Tools/Configuring-ROS2-Environment.html" rel="nofollow">ROS2 tutorials</a>, we set-up the <b>ROS_DOMAIN_ID = 1</b>, so that it matches DR1, we are assuming we will use DRX at .111 to communicate with DR1, if this changes, we will have to change the Domain ID too. </p>
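<p>Setting the domain ID is a per-shell environment variable (usually persisted in ~/.bashrc so every session matches); a minimal sketch:</p>

```shell
# Match DRX's ROS 2 domain to DR1; change the value if the pairing changes.
export ROS_DOMAIN_ID=1
echo "ROS_DOMAIN_ID=$ROS_DOMAIN_ID"
```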
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">bcf42fbb3d4b2173c21ff4524e7a152f</guid>
				<title>Nicolas Kisic Aguirre created the group Raspberry Pi Zero 2 WH</title>
				<link>https://desobedienciarobotica.org/comunidad/p/178/</link>
				<pubDate>Fri, 14 Nov 2025 21:07:36 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">88fc7c0c274a8eeec2d13bc60f5e4a8a</guid>
				<title>Nicolas Kisic Aguirre changed their profile photo</title>
				<link>https://desobedienciarobotica.org/comunidad/p/177/</link>
				<pubDate>Sun, 27 Jul 2025 03:28:50 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">fc711d819f7045f3db08edc648e35c3e</guid>
				<title>Nicolas Kisic Aguirre posted an update: Hi all, I've been working on having robot DRM communicate with DR1, 2, and 3, and I finally found out why I couldn't.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/173/</link>
				<pubDate>Thu, 05 Jun 2025 18:06:01 +0000</pubDate>

									<content:encoded><![CDATA[<p>Hi all, I&#8217;ve been working in having robot DRM communicated with DR1, 2,  and 3, and I finally found out why I couldn&#8217;t. Each robot had a  different DDS implementation, having cyclone as the default for AGNS  system and fast dds for some reason as the default for DRM. </p>
<p>I  finally have them all compatible so that means that I can run the  ros2osc bridge node in order to start playing with receiving messages  within Supercollider. </p>
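<p>In ROS 2, the DDS layer is chosen with the RMW_IMPLEMENTATION environment variable, so one way to keep every robot compatible is to pin them all to the same value (Cyclone DDS here); a sketch of the kind of fix described above:</p>

```shell
# Pin this machine to Cyclone DDS so all robots use the same DDS layer.
# The rmw_cyclonedds_cpp package must be installed for this to take effect.
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
echo "$RMW_IMPLEMENTATION"
```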
<p>Yesterday I added an additional stream of sound to the DRM: white noise that slowly fades in and out, with a unique noise source for each of the 16 loudspeakers instead of all 16 sharing the same source. This produces an interesting «3D» effect inside the DRM.</p>
<p>An interesting thing we noticed yesterday while taking pictures of the project was that, inside the DRM, sounds coming from outside, like voices or even sounds from the AGNS robots, enter the dome and transform themselves! It is an unexpected effect, and at the same time very fitting for its name as a machine that mutates outside reality. A gift from the gods, perhaps.</p>
<p>I will continue working on linking robot movement and sound within the DRM and will report later!</p>
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">edc8fd4b28c89b12c1c8639b9de6d1f7</guid>
				<title>Nicolas Kisic Aguirre posted an update: Hi All! Today I start my exhibit at Centro Nacional de las Artes - CENART in Mexico City.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/164/</link>
				<pubDate>Tue, 03 Jun 2025 18:45:00 +0000</pubDate>

									<content:encoded><![CDATA[<p>Hi All! Today I start my exhibit at Centro Nacional de las Artes &#8211; CENART in Mexico City. The exhibit will run from today, Tuesday June 3rd, to Sunday June 8th. I will be coming here every day from 10am to 6pm to continously work on robots (AGNS) and dispositivo de realidad mutada (DRM) so that they can evolve over the next few days.</p>
<p>This is my take; an exhibition space and time is, for me, an excellent time to continue experimenting instead of leaving work that is «finished». As a matter of fact, I think that one dimension of «disobedience» is all about considering work always as «unfinished» in the sense that it is constantly evolving. And exhibiting this way, as a way of experimenting and evolving through time, is for me an idea and perhaps a statement I will follow.</p>
<p>I will be writing posts as the day(s) go by, in a way this will be my exhibition journal! So I will start with what&#8217;s happening right now at the beginning:</p>
<p>&#8211; DRM is now supported by its own legs, they can flex and adjust to the height of different people seating below it. Adjustment isn&#8217;t automatic, rather I need to operate it so that it goes to the appropriate height. I&#8217;m not sure if automatic adjustment will happen in the future, both because it is a difficult technological feat, and because me being present as a performatic operator is also an interesting aspect to explore.</p>
<p>&#8211; Sound coming out of the DRM is currently at its most basic, and I hope this will evolve over the next few days. There is currently no connection between robots movement but there is a way to do it since DRM is already running in its own RPi5 with ROS2 installed. It&#8217;s just a matter of running the OSC bridge while receiving signals from the other robots.</p>
<p>&#8211; Robots are currently expressing the clicking sound that they had last time they were active in Seattle. This is the last bit of their memory but this will start changing as I include different recordings from CDMX and their own construction as ways to explore their expression.</p>
<p>&#8211; Robots movement is currently limited because coding is made for bigger spaces, they get stuck too easily. I have some ideas, for example, having robots move not all 3 at the same time (I mean navigate, they can still move up and down while parked).</p>
<p>I will probably write in the comments as I come up with ideas and changes! Stay tuned!! </p>
<p>And so it begins…</p>
]]></content:encoded>
				
									<slash:comments>4</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">3ec8eb44af66422a710ffbcc9adb440d</guid>
				<title>eunsun choi became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/150/</link>
				<pubDate>Tue, 18 Mar 2025 22:04:20 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">d6c490f4218776097d1ff51e3be48652</guid>
				<title>rosrobotman posted an update in the group Raspberry Pi 5 8GB [DR2024]: What ROS2 Python library did you use for interacting with the RPi 5 GPIOs? I tried a couple today with no luck.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/146/</link>
				<pubDate>Fri, 03 Jan 2025 04:31:49 +0000</pubDate>

									<content:encoded><![CDATA[<p>What ROS2 Python library did you use for interacting with the RPi 5 GPIOs? I tried a couple today with no luck.</p>
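<p>For what it&#8217;s worth: the legacy RPi.GPIO interface doesn&#8217;t work on the Pi 5 (its GPIO moved to the RP1 chip), which may be why those attempts failed; gpiozero with its lgpio backend, or python3-gpiod, generally do work. Below is a minimal sketch of driving a pin from an rclpy node; the pin number and topic name are illustrative, not from this project.</p>

```python
# Sketch: toggling an RPi 5 GPIO pin from a ROS2 (rclpy) node via gpiozero.
# On the Pi 5, gpiozero uses the lgpio pin factory, which avoids the
# RPi.GPIO incompatibility. Pin 17 and the 'gpio_cmd' topic are hypothetical.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool
from gpiozero import LED  # generic digital output

class GpioToggle(Node):
    def __init__(self):
        super().__init__('gpio_toggle')
        self.pin = LED(17)  # BCM numbering, hypothetical pin
        self.create_subscription(Bool, 'gpio_cmd', self.on_msg, 10)

    def on_msg(self, msg):
        # Drive the pin high or low according to the incoming message.
        if msg.data:
            self.pin.on()
        else:
            self.pin.off()

def main():
    rclpy.init()
    rclpy.spin(GpioToggle())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```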
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">2d88344a8ecdb1ca5d5a3f3a4a999a28</guid>
				<title>rosrobotman joined the group Hardware [DR2024]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/145/</link>
				<pubDate>Fri, 03 Jan 2025 04:05:15 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">e8efafd17632edada44b3f857dbbbf57</guid>
				<title>rosrobotman joined the group RPLiDAR A1M8 [DR2024]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/142/</link>
				<pubDate>Fri, 29 Nov 2024 18:05:25 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">8d5ee9a27c9cfd4dd780f2aad9121ed2</guid>
				<title>rosrobotman joined the group Raspberry Pi 5 8GB [DR2024]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/141/</link>
				<pubDate>Fri, 29 Nov 2024 18:05:13 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">30d6859d5b08842021fd801371949694</guid>
				<title>krishpri posted an update: </title>
				<link>https://desobedienciarobotica.org/comunidad/p/140/</link>
				<pubDate>Thu, 14 Nov 2024 23:44:03 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">c3b0a0c41c1f80451a2fb1f0a2d99a83</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group MicroROS: [TASK] Install microROSFrom Ethan:One thing to note, platformio configuration will need to be done for the specific type of microcontroller you have and the version of microros/ros2 you are targeting. In this case we chose nodemcu-32s (it was the one we had) and iron for our distro, though you may be using different systems.Then,</title>
				<link>https://desobedienciarobotica.org/comunidad/p/137/</link>
				<pubDate>Thu, 14 Nov 2024 23:03:34 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Install microROS</b></p>
<p><b>From Ethan:</b></p>
<pre>One thing to note, platformio configuration will need to be done for the specific type of microcontroller you have and the version of microros/ros2 you are targeting. In this case we chose nodemcu-32s (it was the one we had) and iron for our distro, though you may be using different systems.</pre>
<p>Then,</p>
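<p>For reference, a PlatformIO configuration matching the choices mentioned above (nodemcu-32s board, iron distro) might look like the following. This is a sketch, not the group&#8217;s actual file; the settings follow micro_ros_platformio&#8217;s documented options.</p>

```ini
; platformio.ini sketch for micro-ROS on an ESP32 (nodemcu-32s), targeting iron
[env:nodemcu-32s]
platform = espressif32
board = nodemcu-32s
framework = arduino
board_microros_distro = iron
board_microros_transport = serial
lib_deps =
    https://github.com/micro-ROS/micro_ros_platformio
```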
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">e4a42950d85f6a0322e248f974be5fe4</guid>
				<title>Nicolas Kisic Aguirre created the group MicroROS</title>
				<link>https://desobedienciarobotica.org/comunidad/p/136/</link>
				<pubDate>Thu, 14 Nov 2024 22:59:21 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">d7969382d513eb133d1a1fbeae6d13fa</guid>
				<title>baileyambrose became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/129/</link>
				<pubDate>Mon, 21 Oct 2024 07:05:37 +0000</pubDate>

				
									<slash:comments>1</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">4a6aef3d76734596038a1bf8120388f6</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Ubuntu [DR2024]: [TASK] Launch File to Launch All Nodes at Once

To do this, we will use tmux. First, install tmux:
sudo apt update
sudo apt install tmux
Then, we should enable the use of the mouse in tmux:
nano ~/.tmux.conf   # if the file doesn't exist it will be created
Inside this file, we need to add the following line:
set -g mouse on
Then close any tmux sessions, the new ones will have the mouse enabled (to activate panes, for example).

To make things even easier, we will install tmuxinator. In a Terminal window, run:
gem install tmuxinator
Then, tmuxinator needs an editor to be defined, so if 'echo $EDITOR' provides no answer, we can:
export EDITOR=nano
Or if we want to save this for all bash/Terminal sessions, we can:
echo 'export EDITOR=nano' &#62;&#62; ~/.bashrc
source ~/.bashrc
Then, we can create our first project:
tmuxinator new DR1_session
It will launch the tmuxinator config file, in which we will add:


# /home/disrobot/.config/tmuxinator/DR1_session.yml

name: DR1_session
root: ~/slam_ws
windows:
  - editor:
      layout: tiled # Ensures an even tiled layout for the panes
      panes:
        - source install/local_setup.bash # roboclaw_node pending: ros2 launch roboclaw_node_ros roboclaw_node.launch.py
        - source install/local_setup.bash &#38;&#38; ros2 launch twist_mux twist_mux_launch.py config_topics:=/home/disrobot/slam_ws/src/twist_mux/config/twist_mux_topics.yaml
        - source install/local_setup.bash &#38;&#38; ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r /cmd_vel:=/key_vel
        - source install/local_setup.bash &#38;&#38; ros2 launch rplidar_ros rplidar_a1_launch.py
        - source install/local_setup.bash &#38;&#38; ros2 launch rplidar_ros rplidar_a1b_launch.py serial_port:=/dev/bottom_lidar frame_id:=bottom_laser remap_scan:=/bottom_scan
        - source install/local_setup.bash &#38;&#38; ros2 launch robot_wander startwandering.launch.py
        - source install/local_setup.bash &#38;&#38; ros2 run ros2osc ros2osc_node
        - source install/local_setup.bash &#38;&#38; ros2 launch lift_routine liftroutine.launch.py
        - source install/local_setup.bash &#38;&#38; ros2 launch timer_conductor timer.launch.py
        - QT_QPA_PLATFORM=offscreen sclang ~/SC/DR1_SCvol.scd


This file is located in /home/disrobot/.config/tmuxinator/DR1_session.yml

To launch this session we need to
tmuxinator start DR1_session
And to end it
tmuxinator stop DR1_session</title>
				<link>https://desobedienciarobotica.org/comunidad/p/128/</link>
				<pubDate>Mon, 21 Oct 2024 00:42:52 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Launch File to Launch All Nodes at Once</b></p>
<p>To do this, we will use <b>tmux</b>. First, install tmux:</p>
<pre>sudo apt update
sudo apt install tmux</pre>
<p>Then, we should enable the use of the mouse in tmux:</p>
<pre>nano ~/.tmux.conf   # if the file doesn't exist it will be created</pre>
<p>Inside this file, we need to add the following line:</p>
<pre>set -g mouse on</pre>
<p>Then close any tmux sessions, the new ones will have the mouse enabled (to activate panes, for example).</p>
<p>To make things even easier, we will install <b>tmuxinator</b>. In a Terminal window, run:</p>
<pre>gem install tmuxinator</pre>
<p>Then, tmuxinator needs an editor to be defined, so if &#8216;echo $EDITOR&#8217; returns nothing, we can:</p>
<pre>export EDITOR=nano</pre>
<p>Or if we want to save this for all bash/Terminal sessions, we can:</p>
<pre>echo 'export EDITOR=nano' &gt;&gt; ~/.bashrc
source ~/.bashrc</pre>
<p>Then, we can create our first project:</p>
<pre>tmuxinator new DR1_session</pre>
<p>It will launch the tmuxinator config file, in which we will add:</p>
<pre>
# /home/disrobot/.config/tmuxinator/DR1_session.yml

name: DR1_session
root: ~/slam_ws
windows:
  - editor:
      layout: tiled # Ensures an even tiled layout for the panes
      panes:
        - source install/local_setup.bash # roboclaw_node pending: ros2 launch roboclaw_node_ros roboclaw_node.launch.py
        - source install/local_setup.bash &amp;&amp; ros2 launch twist_mux twist_mux_launch.py config_topics:=/home/disrobot/slam_ws/src/twist_mux/config/twist_mux_topics.yaml
        - source install/local_setup.bash &amp;&amp; ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r /cmd_vel:=/key_vel
        - source install/local_setup.bash &amp;&amp; ros2 launch rplidar_ros rplidar_a1_launch.py
        - source install/local_setup.bash &amp;&amp; ros2 launch rplidar_ros rplidar_a1b_launch.py serial_port:=/dev/bottom_lidar frame_id:=bottom_laser remap_scan:=/bottom_scan
        - source install/local_setup.bash &amp;&amp; ros2 launch robot_wander startwandering.launch.py
        - source install/local_setup.bash &amp;&amp; ros2 run ros2osc ros2osc_node
        - source install/local_setup.bash &amp;&amp; ros2 launch lift_routine liftroutine.launch.py
        - source install/local_setup.bash &amp;&amp; ros2 launch timer_conductor timer.launch.py
        - QT_QPA_PLATFORM=offscreen sclang ~/SC/DR1_SCvol.scd
</pre>
<p>This file is located in /home/disrobot/.config/tmuxinator/DR1_session.yml</p>
<p>To launch this session, we run:</p>
<pre>tmuxinator start DR1_session</pre>
<p>And to end it:</p>
<pre>tmuxinator stop DR1_session</pre>
]]></content:encoded>
				
									<slash:comments>3</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">538763a93268267345250b62662de3ba</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Ubuntu [DR2024]: [TASK] [COMPLETE] &#x2714; Create Password-Less SSH Access Between RobotsAs we are going to be using one of the robots (101) as the master repository from which 102 and 103 synchronize, we will need password-less SSH access between them.To do so, we need to first generate an SSH key. From robots that will connect to the master, we need to run this command in the Terminal:ssh-keygen -t rsa -b 4096   # leave default locations and do not input a password (leave blank)Then, we need to copy the key to the master or leader robot:ssh-copy-id disrobot@192.168.2.101   # the system will ask for the password to access 101After this, we should be able to access Robot 1 from other robots, by simply:ssh disrobot@192.168.2.101   # no need for passwordThis task has been completed.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/127/</link>
				<pubDate>Mon, 21 Oct 2024 00:23:31 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Create Password-Less SSH Access Between Robots</b></p>
<p>As we are going to be using one of the robots (101) as the master repository from which 102 and 103 synchronize, we will need password-less SSH access between them.</p>
<p>To do so, we need to first generate an SSH key. From robots that will connect to the master, we need to run this command in the Terminal:</p>
<pre>ssh-keygen -t rsa -b 4096   # leave default locations and do not input a password (leave blank)</pre>
<p>Then, we need to copy the key to the master or leader robot:</p>
<pre>ssh-copy-id disrobot@192.168.2.101   # the system will ask for the password to access 101</pre>
<p>After this, we should be able to access Robot 1 from other robots, by simply:</p>
<pre>ssh disrobot@192.168.2.101   # no need for password</pre>
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">af8cc9c4a61826b2612a448e347f270c</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group SuperCollider [DR2024]: [TASK] Use Supercollider Directly from the Terminal Window Using SSHThese instructions will guide us through using Supercollider via a Terminal window connected to our robots via SSH. For example, in a Macbook Pro, after doing ssh disrobot@192.168.2.101, we will be in our remote computer. We will need multiple windows. Towards the end we will try to create a script to open everything we need at once.First, it is recommended to start jackd:jackd -d alsa --device hw:S3 --rate 44100 --period 2048 &#38;   # hw:S3 is the location of our SoundBlaster external sound card.Then, we need to boot the server from another Terminal window:scsynth -u 57110 &#38;Then, we need to launch Supercollider from another Terminal window, this is how we launch it:QT_QPA_PLATFORM=offscreen sclangFrom within the window that activated sclang, we need to connect to our already booted server:s = Server.local;s.boot;Then we can finally use Supercollider from within the Terminal window. A simple test should confirm:{ SinOsc.ar(440, 0, 0.2) }.play;To stop:s.freeAllKnowing this, we now want to be able to run an entire .scd file from the terminal. Let's see.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/123/</link>
				<pubDate>Fri, 18 Oct 2024 22:59:42 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Use Supercollider Directly from the Terminal Window Using SSH</b></p>
<p>These instructions will guide us through using Supercollider via a Terminal window connected to our robots over SSH. For example, on a MacBook Pro, after running ssh disrobot@192.168.2.101, we will be on the remote computer. We will need multiple windows. Towards the end, we will try to create a script that opens everything we need at once.</p>
<p>First, it is recommended to start jackd:</p>
<pre>jackd -d alsa --device hw:S3 --rate 44100 --period 2048 &amp;   # hw:S3 is the location of our SoundBlaster external sound card.</pre>
<p>Then, we need to boot the server from another Terminal window:</p>
<pre>scsynth -u 57110 &amp;</pre>
<p>Then, we need to launch Supercollider from another Terminal window, this is how we launch it:</p>
<pre>QT_QPA_PLATFORM=offscreen sclang</pre>
<p>From within the window that activated sclang, we need to connect to our already booted server:</p>
<pre>s = Server.local;<br />s.boot;</pre>
<p>Then we can finally use Supercollider from within the Terminal window. A simple test should confirm:</p>
<pre>{ SinOsc.ar(440, 0, 0.2) }.play;</pre>
<p>To stop:</p>
<pre>s.freeAll</pre>
<p>Knowing this, we now want to be able to run an entire .scd file from the terminal. Let&#8217;s see.</p>
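<p>One way to do that, sticking to the offscreen approach above, is to pass the .scd file to sclang directly and let the file boot the server itself. The filename here is illustrative:</p>

```shell
# Run a whole SuperCollider file non-interactively (filename is hypothetical)
QT_QPA_PLATFORM=offscreen sclang ~/SC/example.scd
```

<p>Inside the .scd file, wrapping the synth code in s.waitForBoot { ... } lets the file boot scsynth and only then start playing, replacing the manual s.boot step.</p>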
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">cfcf2f88acced28567f6984a3a7591eb</guid>
				<title>rosrobotman became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/119/</link>
				<pubDate>Wed, 16 Oct 2024 01:26:37 +0000</pubDate>

				
									<slash:comments>3</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">72f4d2f407a1068b264083d1b84a6722</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Ubuntu [DR2024]: [TASK] [COMPLETED] &#x2714; Symlinks for USB ConnectionsThis is mainly for the RPLiDARS, considering that we will have two identical lidars for each robot, we need to be able to identify them. The way we will identify them is by the exact physical USB port they're connected to. This way, if we run the command:udevadm info -a -n /dev/ttyUSB0   # when only the top lidar is connectedWe will see a lot of information, one of which will look like this:KERNELS=="4-1:1.0"   # this refers to the physical locationFor our top lidar, the location is on the top left corner of the USB ports. Then, we must create rules:sudo nano /etc/udev/rules.d/99-lidar.rules   # this will create a new document the first timeInside the document, we will add the symlink rules:SUBSYSTEM=="tty", KERNELS=="4-1:1.0", SYMLINK+="top_lidar"Then we close the rules document and, to have the rule be effective immediately, we must reload udev rules:sudo udevadm control --reload-rulessudo udevadm triggerWe can repeat the process for the bottom lidar, adding to the rules document a second line with:SUBSYSTEM=="tty", KERNELS=="2-1:1.0", SYMLINK+="bottom_lidar"The bottom lidar is connected to the bottom left USB port. Then reload udev rules, now both lidars will be accessible from /dev/top_lidar and /dev/bottom_lidar instead of /dev/ttyUSB0 and /dev/ttyUSB1.This task is complete.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/116/</link>
				<pubDate>Tue, 15 Oct 2024 00:39:05 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETED] &#x2714; Symlinks for USB Connections</b></p>
<p>This is mainly for the RPLiDARs: since we will have two identical lidars on each robot, we need to be able to tell them apart. We will identify them by the exact physical USB port they&#8217;re connected to. This way, if we run the command:</p>
<pre>udevadm info -a -n /dev/ttyUSB0   # when only the top lidar is connected</pre>
<p>We will see a lot of information, one line of which will look like this:</p>
<pre>KERNELS=="4-1:1.0"   # this refers to the physical location</pre>
<p>For our top lidar, the location is on the top left corner of the USB ports. Then, we must create rules:</p>
<pre>sudo nano /etc/udev/rules.d/99-lidar.rules   # this will create a new document the first time</pre>
<p>Inside the document, we will add the symlink rules:</p>
<pre>SUBSYSTEM=="tty", KERNELS=="4-1:1.0", SYMLINK+="top_lidar"</pre>
<p>Then we close the rules document and, to have the rule be effective immediately, we must reload udev rules:</p>
<pre>sudo udevadm control --reload-rules<br />sudo udevadm trigger</pre>
<p>We can repeat the process for the bottom lidar, adding to the rules document a second line with:</p>
<pre>SUBSYSTEM=="tty", KERNELS=="2-1:1.0", SYMLINK+="bottom_lidar"</pre>
<p>The bottom lidar is connected to the bottom left USB port. Then reload the udev rules; now both lidars will be accessible from /dev/top_lidar and /dev/bottom_lidar instead of /dev/ttyUSB0 and /dev/ttyUSB1.</p>
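<p>Once both rules are loaded, a quick check confirms the symlinks resolve (device names as defined above):</p>

```shell
# Should list two symlinks pointing at the underlying ttyUSB* devices
ls -l /dev/top_lidar /dev/bottom_lidar
```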
<p><b>This task is complete.</b></p>
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">c59bb4b3a3d542ddde86ceceecb08edb</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Hardware [DR2024]: [TASK] A Physical STOP ButtonWe need a button that acts as a lock and stops all movement and sound from the robot but doesn't turn it off (hence not the on/off switch). We need to be able to press it to stop and press it again to resume. </title>
				<link>https://desobedienciarobotica.org/comunidad/p/114/</link>
				<pubDate>Mon, 14 Oct 2024 07:43:01 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] A Physical STOP Button</b></p>
<p>We need a button that acts as a lock and stops all movement and sound from the robot but doesn&#8217;t turn it off (hence not the on/off switch). We need to be able to press it to stop and press it again to resume. </p>
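<p>One way to get the press-to-stop, press-again-to-resume behavior is a small latch node: a GPIO button callback flips a boolean and publishes it, and the motion/sound nodes gate on that topic. This is a sketch only; the pin number and topic name are assumptions, not decisions from this thread.</p>

```python
# Sketch: a toggle-latch STOP button as an rclpy node (gpiozero for the GPIO).
# Pin 26 and the 'stop' topic are hypothetical; subscribers would pause
# movement and sound while the latched value is True.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool
from gpiozero import Button

class StopButton(Node):
    def __init__(self):
        super().__init__('stop_button')
        self.stopped = False
        self.pub = self.create_publisher(Bool, 'stop', 10)
        self.button = Button(26, bounce_time=0.05)  # debounced, hypothetical pin
        self.button.when_pressed = self.toggle     # press to stop, press again to resume

    def toggle(self):
        self.stopped = not self.stopped
        self.pub.publish(Bool(data=self.stopped))

def main():
    rclpy.init()
    rclpy.spin(StopButton())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```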
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">c6a8d5fd592620541d922dbff7475173</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group ROS2 Jazzy [DR2024]: [TASK] Create a Robot &#62; Wander NodeWe want to create a node that enables a "wandering" behavior. This means that our robot should move around space, avoiding obstacles without any specific target. We will create a package to code this node using Python, as detailed here:cd ~/&#60;workspace_name_ws&#62;/src
ros2 pkg create --build-type ament_python robot_wanderThis will create a series of files and folders that, in some cases, we need to modify slightly. First, we will change 'setup.py.' Inside /src/setup.py, by default, the code looks like this:from setuptools import find_packages, setuppackage_name = 'robot_wander'setup(    name=package_name,    version='0.0.0',    packages=find_packages(exclude=['test']),    data_files=[        ('share/ament_index/resource_index/packages',            ['resource/' + package_name]),        ('share/' + package_name, ['package.xml']),    ],    install_requires=['setuptools'],    zip_safe=True,    maintainer='disrobot',    maintainer_email='disrobot@todo.todo',    description='TODO: Package description',    license='TODO: License declaration',    tests_require=['pytest'],    entry_points={        'console_scripts': [        ],    },)We need to change</title>
				<link>https://desobedienciarobotica.org/comunidad/p/113/</link>
				<pubDate>Fri, 11 Oct 2024 20:47:18 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Create a Robot &gt; Wander Node</b></p>
<p>We want to create a node that enables a «wandering» behavior. This means that our robot should move around space, avoiding obstacles without any specific target. </p>
<p>We will create a package to code this node using Python, as detailed <a target='_blank' href="https://disobedientrobots.org/members/nka/activity/55/" rel="nofollow">here</a>:</p>
<pre>cd ~/&lt;workspace_name_ws&gt;/src
ros2 pkg create --build-type ament_python robot_wander</pre>
<p>This will create a series of files and folders that, in some cases, we need to modify slightly. First, we will change &#8216;setup.py.&#8217; Inside /src/setup.py, by default, the code looks like this:</p>
<pre>from setuptools import find_packages, setup

package_name = 'robot_wander'

setup(
    name=package_name,
    version='0.0.0',
    packages=find_packages(exclude=['test']),
    data_files=[
        ('share/ament_index/resource_index/packages',
            ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
    ],
    install_requires=['setuptools'],
    zip_safe=True,
    maintainer='disrobot',
    maintainer_email='disrobot@todo.todo',
    description='TODO: Package description',
    license='TODO: License declaration',
    tests_require=['pytest'],
    entry_points={
        'console_scripts': [
        ],
    },
)</pre>
<p>We need to change</p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">48618a1162b9e0272bdb0e2a3cfd76c2</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group ROS2 Jazzy [DR2024]: [TASK] Create a Node that Publishes OSC Messages to Connect w/ SupercolliderWe need to install 'python-osc' first:pip install python-osc --break-system-packages   # we need to break system packages, but this won't hurt the systemThen, </title>
				<link>https://desobedienciarobotica.org/comunidad/p/111/</link>
				<pubDate>Wed, 09 Oct 2024 23:26:33 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Create a Node that Publishes OSC Messages to Connect w/ Supercollider</b></p>
<p>We need to install &#8216;python-osc&#8217; first:</p>
<pre>pip install python-osc --break-system-packages   # we need to break system packages, but this won't hurt the system</pre>
<p>Then, </p>
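<p>A minimal shape for such a node, once python-osc is installed, could be: subscribe to a ROS2 topic and forward each message as an OSC message to sclang&#8217;s default language port (57120). The topic name and OSC address here are illustrative, not this project&#8217;s actual ones.</p>

```python
# Sketch: forwarding a ROS2 topic to SuperCollider as OSC with python-osc.
# 127.0.0.1:57120 is sclang's default port; '/robot/speed' and the 'speed'
# topic are hypothetical names for illustration.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32
from pythonosc.udp_client import SimpleUDPClient

class Ros2Osc(Node):
    def __init__(self):
        super().__init__('ros2osc')
        self.client = SimpleUDPClient('127.0.0.1', 57120)
        self.create_subscription(Float32, 'speed', self.on_msg, 10)

    def on_msg(self, msg):
        # Each ROS message becomes one OSC message that SuperCollider
        # can receive with an OSCdef.
        self.client.send_message('/robot/speed', msg.data)

def main():
    rclpy.init()
    rclpy.spin(Ros2Osc())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```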
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">7086fc466e262db3a0f97d77e2f2e9e3</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Hardware [DR2024]: [TASK] [COMPLETE] &#x2714; Measure Ticks /Meter &#38; /Rotation for 118RPM MotorsTicks measure encoder data; we need to know how many ticks are recorded per wheel rotation and how many ticks are recorded per meter traveled. The following table contains all of our data both for the (old) 313pm motors and the (new) 118rpm motors:https://docs.google.com/spreadsheets/d/1gXIYVflYtpmY3lc_lc8FVnALPolJMXfjmJMj7WFF6CY/edit?usp=sharing</title>
				<link>https://desobedienciarobotica.org/comunidad/p/110/</link>
				<pubDate>Sun, 06 Oct 2024 01:17:40 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Measure Ticks /Meter &amp; /Rotation for 118RPM Motors</b></p>
<p>Ticks measure encoder data; we need to know how many ticks are recorded per wheel rotation and how many ticks are recorded per meter traveled. The following table contains all of our data, both for the (old) 313rpm motors and the (new) 118rpm motors:</p>
<p><a target='_blank' href="https://docs.google.com/spreadsheets/d/1gXIYVflYtpmY3lc_lc8FVnALPolJMXfjmJMj7WFF6CY/edit?usp=sharing" rel="nofollow">https://docs.google.com/spreadsheets/d/1gXIYVflYtpmY3lc_lc8FVnALPolJMXfjmJMj7WFF6CY/edit?usp=sharing</a></p>
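<p>For reference, the two quantities are related through the wheel circumference: ticks per meter = ticks per rotation / (π × wheel diameter). A quick sketch with made-up numbers (the measured values live in the spreadsheet above):</p>

```python
import math

def ticks_per_meter(ticks_per_rotation, wheel_diameter_m):
    """Encoder ticks per meter traveled, given ticks per wheel rotation."""
    circumference_m = math.pi * wheel_diameter_m  # meters traveled per rotation
    return ticks_per_rotation / circumference_m

# Illustrative numbers only; real values are in the linked spreadsheet.
print(round(ticks_per_meter(1440, 0.1)))  # 0.1 m wheel, 1440 ticks/rev -> 4584
```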
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">57e836594c82e01061539dcdac2d609a</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group ROS2 Jazzy [DR2024]: SYNC WORKSPACE PULLING FROM MASTER ROBOT (101)Make sure SSH access is password-less by following steps here.tmuxinator-button.servicesudo rsync -avz --delete -e ssh disrobot@192.168.2.101:/etc/systemd/system/tmuxinator-button.service /etc/systemd/system/tmuxinator-button.servicePython Codesrsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/PythonCodes/ /home/disrobot/PythonCodes/tmuxinator configs (Consolidated Launch Files)rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/.config/tmuxinator/ /home/disrobot/.config/tmuxinator/slam_ws (ROS2 workspace):rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/slam_ws/ /home/disrobot/slam_ws/SC (Supercollider folder)rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/SC/ /home/disrobot/SC/ROS2 LAUNCH COMMANDSConsolidated Launch w/ Tmux &#124; Tmuxinatortmuxinator start DR1_sessiontmuxinator start DRX_sessionIndividual Launch CommandsSSH Terminal Windows:DR3: ssh disrobot@192.168.0.53 #change IP address accordinglyRoboclaw Node:cd ~/slam_ws. install/local_setup.bashros2 launch roboclaw_node_ros roboclaw_node.launch.pyTwist Mux:cd ~/slam_ws. install/local_setup.bashros2 launch twist_mux twist_mux_launch.py config_topics:=/home/disrobot/slam_ws/src/twist_mux/config/twist_mux_topics.yamlRobot State Publisher:cd ~/slam_ws. install/local_setup.bashros2 launch disrobot_description urdf_visualize.launch.py Cartographer:cd ~/slam_ws. install/local_setup.bashros2 launch cartographer_ros backpack_2d_custom.launch.py configuration_directory:=/home/disrobot/slam_ws/src/cartographer_ros/configuration_files configuration_basename:=backpack_2d_custom.lua—or—. install/local_setup.bashros2 launch cartographer_ros backpack_2d_custom.launch.py configuration_directory:=/home/disrobot/ros2_ws/src/cartographer_ros/configuration_files configuration_basename:=backpack_2d_custom.lua—x—Teleop:cd ~/slam_ws. 
install/local_setup.bashros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r /cmd_vel:=/key_vel  # correct remappingNav2:cd ~/slam_ws. install/local_setup.bashros2 launch nav2_bringup bringup_launch.py params_file:=/home/disrobot/slam_ws/src/nav2_bringup/params/nav2_params.yaml—or—cd ~/ros2_ws. install/local_setup.bashros2 launch nav2_bringup bringup_launch.py params_file:=/home/disrobot/ros2_ws/src/nav2_bringup/params/nav2_params.yaml—x—————In Remote Desktop:*TOP* RPLiDAR Node:sudo chmod 777 /dev/dri/renderD128cd ~/slam_ws. install/local_setup.bashQT_QPA_PLATFORM=xcb ros2 launch rplidar_ros view_rplidar_a1_launch.py—or—cd ~/slam_ws. install/local_setup.bashros2 launch rplidar_ros rplidar_a1_launch.py*BOTTOM* RPLiDAR Node:sudo chmod 777 /dev/dri/renderD128cd ~/slam_ws. install/local_setup.bashQT_QPA_PLATFORM=xcb ros2 launch rplidar_ros view_rplidar_a1b_launch.py serial_port:=/dev/bottom_lidar frame_id:=bottom_laser remap_scan:=/bottom_scan—or—cd ~/slam_ws. install/local_setup.bashros2 launch rplidar_ros rplidar_a1b_launch.py serial_port:=/dev/bottom_lidar frame_id:=bottom_laser remap_scan:=/bottom_scanJoint State Publisher Guicd ~/slam_ws. install/local_setup.bashros2 run joint_state_publisher_gui joint_state_publisher_gui———Robot Wander:cd ~/slam_ws. install/local_setup.bashros2 launch robot_wander startwandering.launch.py  # RPLiDAR, Twist_Mux, Teleop, Roboclaw &#60; Necessaryros2osc:cd ~/slam_ws. install/local_setup.bashros2 run ros2osc ros2osc_nodelift_routine:cd ~/slam_ws. install/local_setup.bashros2 launch lift_routine liftroutine.launch.pytimer_conductor:cd ~/slam_ws. install/local_setup.bashros2 launch timer_conductor timer.launch.py</title>
				<link>https://desobedienciarobotica.org/comunidad/p/109/</link>
				<pubDate>Sat, 05 Oct 2024 21:35:13 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>SYNC WORKSPACE PULLING FROM MASTER ROBOT (101)</b></p>
<p>Make sure SSH access is password-less by following steps <a target='_blank' href="https://disobedientrobots.org/community/p/127/" rel="nofollow">here</a>.</p>
<p><b>tmuxinator-button.service</b></p>
<pre>sudo rsync -avz --delete -e ssh disrobot@192.168.2.101:/etc/systemd/system/tmuxinator-button.service /etc/systemd/system/tmuxinator-button.service</pre>
<p><b>Python Codes</b></p>
<pre>rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/PythonCodes/ /home/disrobot/PythonCodes/</pre>
<p><b>tmuxinator configs (Consolidated Launch Files)</b></p>
<pre>rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/.config/tmuxinator/ /home/disrobot/.config/tmuxinator/</pre>
<p><b>slam_ws (ROS2 workspace):</b></p>
<pre>rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/slam_ws/ /home/disrobot/slam_ws/</pre>
<p><b>SC (Supercollider folder)</b></p>
<pre>rsync -avz --delete -e ssh disrobot@192.168.2.101:/home/disrobot/SC/ /home/disrobot/SC/</pre>
<p><b>ROS2 LAUNCH COMMANDS</b></p>
<p><b>Consolidated Launch w/ Tmux | Tmuxinator</b></p>
<pre>tmuxinator start DR1_session<br />tmuxinator start DRX_session</pre>
<p><b>Individual Launch Commands</b></p>
<p><b>SSH Terminal Windows:</b></p>
<p><b>DR3: </b></p>
<pre>ssh disrobot@192.168.0.53   # change IP address accordingly</pre>
<p><b>Roboclaw Node:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch roboclaw_node_ros roboclaw_node.launch.py</pre>
<p><b>Twist Mux:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch twist_mux twist_mux_launch.py config_topics:=/home/disrobot/slam_ws/src/twist_mux/config/twist_mux_topics.yaml</pre>
<p><b>Robot State Publisher:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch disrobot_description urdf_visualize.launch.py </pre>
<p><b>Cartographer:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch cartographer_ros backpack_2d_custom.launch.py configuration_directory:=/home/disrobot/slam_ws/src/cartographer_ros/configuration_files configuration_basename:=backpack_2d_custom.lua</pre>
<p><b>—or—</b></p>
<pre>. install/local_setup.bash<br />ros2 launch cartographer_ros backpack_2d_custom.launch.py configuration_directory:=/home/disrobot/ros2_ws/src/cartographer_ros/configuration_files configuration_basename:=backpack_2d_custom.lua</pre>
<p><b>—x—</b></p>
<p><b>Teleop:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r /cmd_vel:=/key_vel  # correct remapping</pre>
<p><b>Nav2:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch nav2_bringup bringup_launch.py params_file:=/home/disrobot/slam_ws/src/nav2_bringup/params/nav2_params.yaml</pre>
<p><b>—or—</b></p>
<pre>cd ~/ros2_ws<br />. install/local_setup.bash<br />ros2 launch nav2_bringup bringup_launch.py params_file:=/home/disrobot/ros2_ws/src/nav2_bringup/params/nav2_params.yaml</pre>
<p><b>—x—</b></p>
<p><b>————</b></p>
<p><b>In Remote Desktop:</b></p>
<p><b>*TOP* RPLiDAR Node:</b></p>
<pre>sudo chmod 777 /dev/dri/renderD128<br />cd ~/slam_ws<br />. install/local_setup.bash<br />QT_QPA_PLATFORM=xcb ros2 launch rplidar_ros view_rplidar_a1_launch.py</pre>
<p><b>—or—</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch rplidar_ros rplidar_a1_launch.py</pre>
<p><b>*BOTTOM* RPLiDAR Node:</b></p>
<pre>sudo chmod 777 /dev/dri/renderD128<br />cd ~/slam_ws<br />. install/local_setup.bash<br />QT_QPA_PLATFORM=xcb ros2 launch rplidar_ros view_rplidar_a1b_launch.py serial_port:=/dev/bottom_lidar frame_id:=bottom_laser remap_scan:=/bottom_scan</pre>
<p><b>—or—</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch rplidar_ros rplidar_a1b_launch.py serial_port:=/dev/bottom_lidar frame_id:=bottom_laser remap_scan:=/bottom_scan</pre>
<p><b>Joint State Publisher Gui</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 run joint_state_publisher_gui joint_state_publisher_gui</pre>
<p><b>———</b></p>
<p><b>Robot Wander:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch robot_wander startwandering.launch.py  # RPLiDAR, Twist_Mux, Teleop, Roboclaw &lt; Necessary</pre>
<p><b>ros2osc:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 run ros2osc ros2osc_node</pre>
<p><b>lift_routine:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch lift_routine liftroutine.launch.py</pre>
<p><b>timer_conductor:</b></p>
<pre>cd ~/slam_ws<br />. install/local_setup.bash<br />ros2 launch timer_conductor timer.launch.py</pre>
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">e7e918fcc84074b502f49c1f52e41c33</guid>
				<title>lucasb became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/106/</link>
				<pubDate>Fri, 04 Oct 2024 20:39:06 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">6de0299c16df5e2e4843045161bfae1f</guid>
				<title>zev became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/104/</link>
				<pubDate>Fri, 04 Oct 2024 20:38:28 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">d9c80d7a20b4f4e22354b51eb7b94c5e</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group ROS2 Jazzy [DR2024]: [TASK] Synchronized Time/Clock Between All RobotsSynchronization is an essential aspect of creating coordinated movement. We must be able to simultaneously send a command to all robots, making them as synchronized as possible.SOLUTIONBy having all robots in the same network (see this post), we can take advantage of the ROS2 middleware's Data Distribution Service (DDS) features, which distribute data (topics) not only within the individual system running ROS2 nodes but also to other systems running nodes within the same network.However, we must be careful to keep topics private from nodes not meant to be shared. An example of a topic that has to stay within its domain is /cmd_vel (the topic that sends movement instructions to the motor controller). This topic needs to remain private. Otherwise, the movement command from one robot could end up moving another robot in the same network, causing conflicting instructions and potential damage.We can have a distinct Domain ID for each robot to avoid topic conflicts. Robot 1 can have Domain ID 1. Robot 2 can have Domain ID 2, and so on. Topics within a shared domain ID can be published and subscribed through the network, but topics within different domain IDs can't.For each robot, we must first ensure the Domain ID is distinct:echo $ROS_DOMAIN_IDIf there is no result or if the Domain ID returned is not the desired one, we need to change it:export ROS_DOMAIN_ID=&#60;your_domain_id&#62; # For example, 1.Finally, if we want to save this setting so that every shell (Terminal) session defaults to the Domain ID that we want, we need to do the following:echo "export ROS_DOMAIN_ID=&#60;your_domain_id&#62;" &#62;&#62; ~/.bashrcUpon opening new shell sessions, "echo $ROS_DOMAIN_ID" will return the number we inscribed as the default. 
This will ensure that each robot is operating, for the most part, with all topics and nodes within its domain.Now that we have separated each domain to avoid conflicts, we can run specific nodes within different Domain IDs. All we need to do is open a new shell session where, before running our nodes, we will re-define the Domain ID for only that single shell session (or Terminal window). So if our robot is operating in Domain ID = 3, but we want to run specific nodes that synchronize with other robots (for example, another robot operating in Domain ID=1), then we can, for that single shell session:export ROS_DOMAIN_ID=1If we were to publish a topic from within this window after defining Domain ID=1, for example:. install/local_setup.bashros2 topic pub /robot_topic std_msgs/String "data: 'Hello from Robot 3'"Then in robot 1, we will be able to find /robot_topic:. install/local_setup.bashros2 topic list   # this will show the list of topics currently available, one of which should be /robot_topicros2 topic echo /robot_topic.  # this will publish the message received from robot 3.We can see that </title>
				<link>https://desobedienciarobotica.org/comunidad/p/103/</link>
				<pubDate>Thu, 03 Oct 2024 07:41:31 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Synchronized Time/Clock Between All Robots</b></p>
<p>Synchronization is an essential aspect of creating coordinated movement. We must be able to simultaneously send a command to all robots, making them as synchronized as possible.</p>
<p><b>SOLUTION</b></p>
<p>By having all robots in the same network (<a target='_blank' href="https://disobedientrobots.org/community/p/102/" rel="nofollow">see this post</a>), we can take advantage of the ROS2 middleware&#8217;s Data Distribution Service (DDS) features, which distribute data (topics) not only within the individual system running ROS2 nodes but also to other systems running nodes within the same network.</p>
<p>However, we must be careful to keep topics private from nodes not meant to be shared. An example of a topic that has to stay within its domain is /cmd_vel (the topic that sends movement instructions to the motor controller). This topic needs to remain private. Otherwise, the movement command from one robot could end up moving another robot in the same network, causing conflicting instructions and potential damage.</p>
<p>We can have a distinct <a target='_blank' href="https://docs.ros.org/en/jazzy/Concepts/Intermediate/About-Domain-ID.html" rel="nofollow">Domain ID</a> for each robot to avoid topic conflicts. Robot 1 can have Domain ID 1. Robot 2 can have Domain ID 2, and so on. Topics within a shared domain ID can be published and subscribed through the network, but topics within different domain IDs can&#8217;t.</p>
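<p>Why distinct Domain IDs isolate traffic: DDS derives its UDP ports from the Domain ID using the RTPS «well-known ports» rule, so participants in different domains listen on disjoint ports and never discover each other. A small sketch, assuming the RTPS default port parameters (DDS vendors can reconfigure them):</p>

```python
# Sketch of the RTPS "well-known ports" rule with its default parameters
# (PB=7400, DG=250, PG=2, offsets d0=0, d1=10, d2=1, d3=11): DDS derives
# its UDP ports from the Domain ID, so different domains use disjoint ports.
PB, DG, PG = 7400, 250, 2      # port base, domain gain, participant gain
D0, D1, D2, D3 = 0, 10, 1, 11  # discovery / user-traffic offsets

def discovery_multicast_port(domain_id):
    """SPDP discovery multicast port for a given Domain ID."""
    return PB + DG * domain_id + D0

def user_unicast_port(domain_id, participant_id):
    """User-traffic unicast port for a participant in a given Domain ID."""
    return PB + DG * domain_id + D3 + PG * participant_id

# Domain 1 and Domain 2 never meet: their ports are disjoint.
print(discovery_multicast_port(1))  # 7650
print(discovery_multicast_port(2))  # 7900
```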
<p>For each robot, we must first ensure the Domain ID is distinct:</p>
<pre>echo $ROS_DOMAIN_ID</pre>
<p>If there is no result or if the Domain ID returned is not the desired one, we need to change it:</p>
<pre>export ROS_DOMAIN_ID=&lt;your_domain_id&gt; # For example, 1.</pre>
<p>Finally, if we want to save this setting so that every shell (Terminal) session defaults to the Domain ID that we want, we need to do the following:</p>
<pre>echo "export ROS_DOMAIN_ID=&lt;your_domain_id&gt;" &gt;&gt; ~/.bashrc</pre>
<p>Upon opening new shell sessions, «echo $ROS_DOMAIN_ID» will return the number we set as the default. This ensures that each robot operates, for the most part, with all topics and nodes within its own domain.</p>
<p>Now that we have separated each domain to avoid conflicts, we can run specific nodes within different Domain IDs. All we need to do is open a new shell session where, before running our nodes, we will re-define the Domain ID for <b>only</b> that single shell session (or Terminal window). So if our robot is operating in Domain ID = 3, but we want to run specific nodes that synchronize with other robots (for example, another robot operating in Domain ID=1), then we can, for that single shell session:</p>
<pre>export ROS_DOMAIN_ID=1</pre>
<p>If we were to publish a topic from within this window after defining Domain ID=1, for example:</p>
<pre>. install/local_setup.bash<br />ros2 topic pub /robot_topic std_msgs/String "data: 'Hello from Robot 3'"</pre>
<p>Then in robot 1, we will be able to find /robot_topic:</p>
<pre>. install/local_setup.bash<br />ros2 topic list   # this will show the list of topics currently available, one of which should be /robot_topic<br />ros2 topic echo /robot_topic  # this will print the messages received from robot 3</pre>
<p>We can see that the message published from robot 3&#8217;s shell session (now in Domain ID 1) arrives at robot 1, while each robot&#8217;s other topics stay isolated within its own domain.</p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">b631ed0422dcd91264c5317762de0553</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Ubuntu [DR2024]: [TASK] [COMPLETED] &#x2714; Double Wi-Fi Network ConnectionWe need to use two networks to have a ubiquitous network solution for our robots. The first one is to connect to any local Wi-Fi networks available. For example, the DXARTS Gallery has its network, while the DXARTS Electronics has another one. The University of Washington has its network, too, and so on. However, the connection between remote and local computers to access Ubuntu remotely should occur from within our router. This will provide a stable solution that can be taken, together with our router, anywhere we want instead of depending on the local Wi-Fi.We need to be able to connect to two Wi-Fi networks simultaneously to have both internet access and a local network.

SOLUTIONWe have purchased a TP-LINK Archer A6 Wireless Router:TP-Link AC1200 Gigabit WiFi Router (Archer A6)With our new router, we created the "disobedientrobots" Wi-Fi network. When we first turned our router on, we accessed the configuration pages at 192.168.0.1. After creating a password for the first time, we changed the network name to be "disobedientrobots" and the password "disobedientrobots" too. Then, we accessed the LAN settings by clicking on Advanced at the top of the interface and then, from the sidebar, selecting Network &#62; LAN.Change the current LAN IP address from 192.168.0.1 to 192.168.2.1. By using 192.168.2.1, we're making sure that we don't conflict with the default subnet many routers usually pick (192.168.0.x).Then, to access two Wi-Fi networks using a single Raspberry Pi, we need to use an additional USB Wi-Fi adapter. For that purpose, we are using:Geekworm NVIDIA Jetson Nano Dual Band Wireless USB 3.0 WiFi Adapter 5GHz+2.4GHz 1200MThis adapter is "plug-and-play" with our Raspberry Pi 5's. As soon as it is connected via USB to the RPi, we can connect to 2 separate internet connections, one with the internal Wi-Fi antenna and one with our newly inserted USB adapter antenna.Using the internal Wi-Fi antenna, we will connect to the no-internet "disobedientrobots" network. The computer will be assigned an IP Address within the subnet range, such as 192.168.2.101.We will then enter the configuration settings for this network, and under the IPv4 tab, we will select "manual" instead of "automatic." This will allow us to assign a static IP to each robot manually instead of automatically assigning IP addresses. For the Address field, we will use the computer's assigned IP address, for example, 192.168.2.101. 
For the Netmask, we will use 255.255.255.0; for the Gateway, we will use the router's IP address, 192.168.2.1.We will connect to the internet network, in our case, "DX-STUDIO" or "DX-GALLERY," with the external USB adapter Wi-Fi antenna.NOTE: If we do this the other way around, attempting to connect to our RPis via RDP remotely will fail. This could be because Ubuntu has the routing internally assigned to the internal Wi-Fi card.Once connected to both networks, we can access these computers from within the network. From a Mac OS, we won't be able to connect to two separate networks using an USB adapter antenna. However, by using a virtual machine (the UTM app) running Ubuntu within my MacBook Pro, I was able to access the internet using the USB adapter antenna while the main Mac OS Wi-Fi connection was linked to "disobedientrobots". It is a workaround to be able to have internet access although it happens through a Virtual Machine.Now we can use software like Royal TSX to connect to our robots using IP Addresses from the 192.168.2.x subnet. The network is stable, and we are able to move around our own router.For now, this problem is solved.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/102/</link>
				<pubDate>Thu, 03 Oct 2024 00:37:49 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETED] &#x2714; Double Wi-Fi Network Connection</b></p>
<p>We need two networks to have a ubiquitous networking solution for our robots. The first is for connecting to whatever local Wi-Fi network is available. For example, the DXARTS Gallery has its own network, the DXARTS Electronics lab has another, the University of Washington has its own, and so on. However, remote and local computers should connect to Ubuntu remotely through our own router. This provides a stable solution that can be taken anywhere, together with our router, instead of depending on the local Wi-Fi.</p>
<p>We need to be able to connect to two Wi-Fi networks simultaneously to have both internet access and a local network.</p>
<p><b>SOLUTION</b></p>
<p>We have purchased a TP-LINK Archer A6 Wireless Router:</p>
<ul>
<li><b><a target='_blank' href="https://www.tp-link.com/us/home-networking/wifi-router/archer-a6/" rel="nofollow">TP-Link AC1200 Gigabit WiFi Router (Archer A6)</a></b></li>
</ul>
<p>With our new router, we created the «disobedientrobots» Wi-Fi network. When we first turned our router on, we accessed the configuration pages at 192.168.0.1. After creating a password for the first time, we changed the network name to be «disobedientrobots» and the password «disobedientrobots» too. Then, we accessed the LAN settings by clicking on Advanced at the top of the interface and then, from the sidebar, selecting Network &gt; LAN.</p>
<ul>
<li>Change the current LAN IP address from <b>192.168.0.1</b> to <b>192.168.2.1</b>. By using 192.168.2.1, we&#8217;re making sure that we don&#8217;t conflict with the default subnet many routers usually pick (192.168.0.x).</li>
</ul>
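<p>The subnet reasoning in the bullet above can be checked with Python&#8217;s standard ipaddress module, e.g. with a robot address like 192.168.2.101:</p>

```python
import ipaddress

# The router LAN was moved to 192.168.2.0/24 so robot addresses cannot
# collide with the 192.168.0.x subnet many routers use by default.
robots = ipaddress.ip_network("192.168.2.0/24")
default = ipaddress.ip_network("192.168.0.0/24")

addr = ipaddress.ip_address("192.168.2.101")  # a robot's static IP
print(addr in robots)            # True
print(addr in default)           # False
print(robots.overlaps(default))  # False -> no subnet conflict
```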
<p>Then, to access two Wi-Fi networks using a single Raspberry Pi, we need to use an additional USB Wi-Fi adapter. For that purpose, we are using:</p>
<ul>
<li><b><a target='_blank' href="https://geekworm.com/products/geekworm-nvidia-jetson-nano-dual-band-wireless-usb-3-0-adapter-5ghz-2-4ghz-1200m" rel="nofollow">Geekworm NVIDIA Jetson Nano Dual Band Wireless USB 3.0 WiFi Adapter 5GHz+2.4GHz 1200M</a></b></li>
</ul>
<p>This adapter is «plug-and-play» with our Raspberry Pi 5s. As soon as it is connected via USB to the RPi, we can connect to two separate Wi-Fi networks: one with the internal Wi-Fi antenna and one with the newly added USB adapter antenna.</p>
<ul>
<li>Using the internal Wi-Fi antenna, we will connect to the no-internet «disobedientrobots» network. The computer will be assigned an IP Address within the subnet range, such as 192.168.2.101.</li>
<li>We will then enter the configuration settings for this network, and under the IPv4 tab, we will select «<b>manual</b>» instead of «automatic.» This will allow us to assign a static IP to each robot manually instead of automatically assigning IP addresses. For the <b>Address</b> field, we will use the computer&#8217;s assigned IP address, for example, <u>192.168.2.101</u>. For the <b>Netmask</b>, we will use <u>255.255.255.0</u>; for the <b>Gateway</b>, we will use the router&#8217;s IP address, <u>192.168.2.1</u>.</li>
<li>We will connect to the internet network, in our case, «DX-STUDIO» or «DX-GALLERY,» with the external USB adapter Wi-Fi antenna.</li>
<li><b>NOTE</b>: If we do this the other way around, attempting to connect to our RPis via RDP remotely will fail. This could be because Ubuntu has the routing internally assigned to the internal Wi-Fi card.</li>
</ul>
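<p>As a sanity check for the manual IPv4 settings listed above, the static address and the gateway must fall inside the subnet implied by the netmask, which the ipaddress module can verify:</p>

```python
import ipaddress

# Manual IPv4 settings from the list above; the static address and the
# gateway must sit in the same subnet implied by the netmask.
address, netmask, gateway = "192.168.2.101", "255.255.255.0", "192.168.2.1"

iface = ipaddress.ip_interface(f"{address}/{netmask}")
print(iface.network)                                   # 192.168.2.0/24
print(ipaddress.ip_address(gateway) in iface.network)  # True
```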
<p>Once connected to both networks, we can access these computers from within the network. On macOS, we were not able to connect to two separate networks using a USB adapter antenna. However, by running Ubuntu in a virtual machine (the UTM app) on my MacBook Pro, I was able to access the internet through the USB adapter antenna while the main macOS Wi-Fi connection stayed linked to «disobedientrobots». It is a workaround, but it provides internet access even though it goes through a virtual machine.</p>
<p>Now we can use software like Royal TSX to connect to our robots using IP addresses from the 192.168.2.x subnet. The network is stable, and we can take it anywhere along with our own router.</p>
<p><b>For now, this problem is solved.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">9fc350447ef46d6f0d2f9b19bebe71c4</guid>
				<title>krishpri became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/100/</link>
				<pubDate>Fri, 30 Aug 2024 06:43:19 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">c620214252f4d104445b9b982d4c71eb</guid>
				<title>michael became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/98/</link>
				<pubDate>Fri, 16 Aug 2024 08:45:51 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">b7c19661237eea052662fcb84b478432</guid>
				<title>gmc posted an update in the group Cytron MD10C [DR2024]: [Task] [Complete] &#x2714; Get Relay and Motor Controller to Work on Raspberry Pi using PythonWhen trying to install packages to use the GPIO pins, it was found that there is an issue with installing RPi.GPIO, but there is an alternative package called gpiozero. This has almost all the functionalities required to get the lift moving with the relay and motor controller. First, we need to install gpiozero:sudo apt-get install python3-gpiozeroAlways use (replace … with import you want to use):from gpiozero import …Using example codes and combining them, a test code to get the GPIO pins to work with the relay and motor controller was made.from gpiozero import PWMOutputDevice, DigitalOutputDevice, OutputDevicefrom time import sleep# Define pinsREL_PIN = 25PWM_PIN = 12DIR_PIN = 16# Initialize PWM and direction pinmotor_pwm = PWMOutputDevice(PWM_PIN, frequency=1000)motor_dir = DigitalOutputDevice(DIR_PIN)device = OutputDevice(REL_PIN)# Function to set motor directiondef set_motor_direction(forward=True):    motor_dir.value = 0 if forward else 1try:    # Run motor in the opposite direction    device.on() # Turns the relay on        set_motor_direction(forward=False)    motor_pwm.value = 0.5  # 50% duty cycle    print("Motor moving down at 50% speed")    sleep(10)    # Stop motor    motor_pwm.value = 0    print("Motor stopped")    device.off() # Turns the relay off    sleep(1)    for i in range(1,10):        device.on()        # Run motor in one direction        set_motor_direction(forward=True)        motor_pwm.value = (i/10) # duty cycle        print("Motor moving up at", i/10)        sleep(5)        motor_pwm.value = 0        print("Motor stopped")        device.off()        sleep(1)        device.on()        set_motor_direction(forward=False)        motor_pwm.value = (i/10) # duty cycle        print("Motor moving down at", i/10)        sleep(5)finally:    # Ensure motor is stopped    motor_pwm.value = 0 
   print("Motor stopped")    device.off()</title>
				<link>https://desobedienciarobotica.org/comunidad/p/97/</link>
				<pubDate>Tue, 13 Aug 2024 04:03:51 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[Task] [Complete]</b><b> &#x2714; Get Relay and Motor Controller to Work on Raspberry Pi using Python</b></p>
<p>When trying to install packages to use the GPIO pins, we found an issue with installing RPi.GPIO; however, there is an alternative package called gpiozero, which has almost all the functionality required to get the lift moving with the relay and motor controller. First, we need to install gpiozero:</p>
<pre>sudo apt-get install python3-gpiozero</pre>
<p>Always use (replacing … with the names you want to import):</p>
<pre>from gpiozero import …</pre>
<p>Using example codes and combining them, a test code to get the GPIO pins to work with the relay and motor controller was made.</p>
<pre>from gpiozero import PWMOutputDevice, DigitalOutputDevice, OutputDevice<br />from time import sleep</pre>
<pre># Define pins<br />REL_PIN = 25<br />PWM_PIN = 12<br />DIR_PIN = 16</pre>
<pre># Initialize PWM and direction pin<br />motor_pwm = PWMOutputDevice(PWM_PIN, frequency=1000)<br />motor_dir = DigitalOutputDevice(DIR_PIN)<br />device = OutputDevice(REL_PIN)</pre>
<pre># Function to set motor direction<br />def set_motor_direction(forward=True):<br />    motor_dir.value = 0 if forward else 1</pre>
<pre>try:<br />    # Run motor in the opposite direction<br />    device.on() # Turns the relay on    <br />    set_motor_direction(forward=False)<br />    motor_pwm.value = 0.5  # 50% duty cycle<br />    print("Motor moving down at 50% speed")<br />    sleep(10)</pre>
<pre>    # Stop motor<br />    motor_pwm.value = 0<br />    print("Motor stopped")<br />    device.off() # Turns the relay off<br />    sleep(1)</pre>
<pre>    for i in range(1,10):<br />        device.on()<br />        # Run motor in one direction<br />        set_motor_direction(forward=True)<br />        motor_pwm.value = (i/10) # duty cycle<br />        print("Motor moving up at", i/10)<br />        sleep(5)</pre>
<pre>        motor_pwm.value = 0<br />        print("Motor stopped")<br />        device.off()<br />        sleep(1)</pre>
<pre>        device.on()<br />        set_motor_direction(forward=False)<br />        motor_pwm.value = (i/10) # duty cycle<br />        print("Motor moving down at", i/10)<br />        sleep(5)<br />        motor_pwm.value = 0  # stop before the next iteration reverses direction<br />        print("Motor stopped")<br />        device.off()<br />        sleep(1)</pre>
<pre>finally:<br />    # Ensure motor is stopped<br />    motor_pwm.value = 0<br />    print("Motor stopped")<br />    device.off()</pre>
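<p>The loop above steps the duty cycle through i/10 for i = 1…9 in both directions. A small helper (hypothetical, not part of the original script) makes that ramp sequence explicit and reusable:</p>

```python
# Hypothetical helper, not part of the original script: the test loop ramps
# the duty cycle through i/10 for i in 1..9; this makes the sequence explicit.
def ramp_steps(n=9):
    """Duty-cycle values used by the ramp test: i/10 for i in 1..n."""
    return [i / 10 for i in range(1, n + 1)]

for duty in ramp_steps():
    # motor_pwm.value = duty  # on the robot; here we just show the values
    print(f"duty cycle {duty:.1f}")
```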
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">d852eb5b289345f6fc31e98e2b74eb8a</guid>
				<title>gmc posted an update in the group Hardware [DR2024]: [Task] [Complete] &#x2714; Connecting MD10C motor controller to Raspberry Pi 5Started off connecting motor controller to lift; the connection can be visualized using the electrical diagram in Hardware section. The DIR, PWM, and GND pins on the motor controller are connected to the GPIO pins on the Raspberry Pi, while the Motor A and Motor B pins are connected to the motor on the lift. The power to the motor controller is connected to a relay, which will turn on the motor controller only when the lift is needed. This is to keep the motor controller safe since it produces too much heat in idle state. The relay is then connected to a 24V power supply.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/94/</link>
				<pubDate>Tue, 13 Aug 2024 03:53:06 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[Task] [Complete] &#x2714; Connecting MD10C motor controller to Raspberry Pi 5</b></p>
<p>We started by connecting the motor controller to the lift; the connection can be visualized using the electrical diagram in the Hardware section. The DIR, PWM, and GND pins on the motor controller are connected to the GPIO pins on the Raspberry Pi, while the Motor A and Motor B pins are connected to the motor on the lift.</p>
<p>The power to the motor controller is connected to a relay, which turns on the motor controller only when the lift is needed. This keeps the motor controller safe, since it produces excessive heat in its idle state. The relay is then connected to a 24V power supply.</p>
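<p>For reference, the wiring described above can be summarized in code. The pin assignments below match the companion gpiozero post in this group (BCM numbering); the constant names are ours, not from the original script:</p>

```python
# Wiring summary (BCM pin numbers follow the companion gpiozero post in this
# group; the constant names are hypothetical). Motor A/B go to the lift
# motor, and the relay switches the MD10C's 24V supply.
WIRING = {
    "DIR": 16,    # MD10C DIR  -> GPIO16
    "PWM": 12,    # MD10C PWM  -> GPIO12
    "RELAY": 25,  # relay coil -> GPIO25 (powers the MD10C only when lifting)
}

for signal, pin in sorted(WIRING.items()):
    print(f"{signal}: GPIO{pin}")
```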
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">2a6b2fdee3bb7a500e9268bcc949801c</guid>
				<title>gmc joined the group Cytron MD10C [DR2024]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/93/</link>
				<pubDate>Tue, 13 Aug 2024 02:52:09 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">d3920a1787af9562266c61958e1551a2</guid>
				<title>gmc joined the group Hardware [DR2024]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/92/</link>
				<pubDate>Tue, 13 Aug 2024 02:51:11 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">1bac265cc8ef463fc5f0cbbddf8f3229</guid>
				<title>Nicolas Kisic Aguirre created the group Cytron MD10C [DR2024]</title>
				<link>https://desobedienciarobotica.org/comunidad/p/91/</link>
				<pubDate>Tue, 13 Aug 2024 02:51:06 +0000</pubDate>

				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">fd618effa0e799779c89eda0e52a8af5</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group ROS2 Jazzy [DR2024]: [TASK] Mapping w/ CartographerFirst, we need to install Cartographer for ROS2:sudo apt updatesudo apt install ros-jazzy-cartographersudo apt install ros-jazzy-cartographer-rosThen, we need to set open permissions for the files in both cartographer packages. Otherwise, we will encounter permission problems:sudo chmod -R 777 /opt/ros/jazzy/share/cartographer/sudo chmod -R 777 /opt/ros/jazzy/share/cartographer_ros/We are first going to map using the backpack_2d.launch.py launch file. However, we will create a custom version of the same launch file for three reasons. The first one is that we don't need everything that is being launched with that launch file. Second, we must create parameters to modify with the launch command. Third, we need to have a launch file and parameter files that aren't affected by future ROS2 upgrades, which could reset modifications to the original files.cd /opt/ros/jazzy/share/cartographer_ros/launch/touch backpack_2d_custom.launch.py  # this creates the fileThen, we add the following code to the launch file:"""  Copyright 2018 The Cartographer Authors  Copyright 2022 Wyca Robotics (for the ros2 conversion)  Licensed under the Apache License, Version 2.0 (the "License");  you may not use this file except in compliance with the License.  You may obtain a copy of the License at       http://www.apache.org/licenses/LICENSE-2.0  Unless required by applicable law or agreed to in writing, software  distributed under the License is distributed on an "AS IS" BASIS,  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  
&#8230;</title>
				<link>https://desobedienciarobotica.org/comunidad/p/90/</link>
				<pubDate>Fri, 09 Aug 2024 23:03:25 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Mapping w/ Cartographer</b></p>
<p>First, we need to install Cartographer for ROS2:</p>
<pre>sudo apt update<br />sudo apt install ros-jazzy-cartographer<br />sudo apt install ros-jazzy-cartographer-ros</pre>
<p>Then, we need to set open permissions for the files in both cartographer packages. Otherwise, we will encounter permission problems:</p>
<pre>sudo chmod -R 777 /opt/ros/jazzy/share/cartographer/<br />sudo chmod -R 777 /opt/ros/jazzy/share/cartographer_ros/</pre>
<p>We are first going to map using the backpack_2d.launch.py launch file. However, we will create a custom version of it, for three reasons. First, we don&#8217;t need everything that the original launch file starts. Second, we want launch arguments that we can override from the command line. Third, we need launch and parameter files that aren&#8217;t affected by future ROS2 upgrades, which could reset modifications to the original files.</p>
<pre>cd /opt/ros/jazzy/share/cartographer_ros/launch/<br />touch backpack_2d_custom.launch.py  # this creates the file</pre>
<p>Then, we add the following code to the launch file:</p>
<pre>"""<br />  Copyright 2018 The Cartographer Authors<br />  Copyright 2022 Wyca Robotics (for the ros2 conversion)<br />  Licensed under the Apache License, Version 2.0 (the "License");<br />  you may not use this file except in compliance with the License.<br />  You may obtain a copy of the License at<br />       http://www.apache.org/licenses/LICENSE-2.0<br />  Unless required by applicable law or agreed to in writing, software<br />  distributed under the License is distributed on an "AS IS" BASIS,<br />  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.<br />  See the License for the specific language governing permissions and<br />  limitations under the License.<br />"""<br />from launch import LaunchDescription<br />from launch.actions import DeclareLaunchArgument, IncludeLaunchDescription<br />from launch.conditions import IfCondition, UnlessCondition<br />from launch.substitutions import LaunchConfiguration<br />from launch_ros.actions import Node, SetRemap<br />from launch_ros.substitutions import FindPackageShare<br />from launch.launch_description_sources import PythonLaunchDescriptionSource<br />import os<br />def generate_launch_description():<br />    ## ***** Launch arguments *****<br />    use_sim_time_arg = DeclareLaunchArgument('use_sim_time', default_value = 'False')<br />    config_dir_arg = DeclareLaunchArgument(<br />        'configuration_directory',<br />        default_value=os.path.join(FindPackageShare('cartographer_ros').find('cartographer_ros'), 'configuration_files'),<br />        description='Configuration directory for Cartographer')<br />    config_basename_arg = DeclareLaunchArgument(<br />        'configuration_basename',<br />        default_value='backpack_2d.lua',<br />        description='Configuration file for Cartographer Backpack 2d')<br />    ## ***** File paths ******<br />    pkg_share = FindPackageShare('cartographer_ros').find('cartographer_ros')<br />    # urdf_dir = os.path.join(pkg_share, 
'urdf')<br />    # urdf_file = os.path.join(urdf_dir, 'backpack_2d.urdf')<br />    # with open(urdf_file, 'r') as infp:<br />    #    robot_desc = infp.read()<br />    ## ***** Nodes *****<br />    # robot_state_publisher_node = Node(<br />    #    package = 'robot_state_publisher',<br />    #    executable = 'robot_state_publisher',<br />    #    parameters=[<br />    #        {'robot_description': robot_desc},<br />    #        {'use_sim_time': LaunchConfiguration('use_sim_time')}],<br />    #    output = 'screen'<br />    #    )<br />    cartographer_node = Node(<br />        package = 'cartographer_ros',<br />        executable = 'cartographer_node',<br />        parameters = [{'use_sim_time': LaunchConfiguration('use_sim_time')}],<br />        arguments = [<br />            '-configuration_directory', LaunchConfiguration('configuration_directory'),<br />            '-configuration_basename', LaunchConfiguration('configuration_basename')],<br />        remappings = [<br />            ('echoes', 'horizontal_laser_2d')],<br />        output = 'screen'<br />        )<br />    cartographer_occupancy_grid_node = Node(<br />        package = 'cartographer_ros',<br />        executable = 'cartographer_occupancy_grid_node',<br />        parameters = [<br />            {'use_sim_time': True},<br />            {'resolution': 0.05}],<br />        )<br />    return LaunchDescription([<br />        use_sim_time_arg,<br />        config_dir_arg,<br />        config_basename_arg,<br />        # Nodes<br />        # robot_state_publisher_node,<br />        cartographer_node,<br />        cartographer_occupancy_grid_node,<br />    ])</pre>
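<p>As a side note, the override mechanics of the two declared launch arguments can be pictured with a plain-Python sketch (this is an illustration, not the ROS2 API): a value passed on the command line replaces the DeclareLaunchArgument default.</p>
<pre>
# Illustrative only: defaults mirror the DeclareLaunchArgument calls above.
DEFAULTS = {
    "use_sim_time": "False",
    "configuration_basename": "backpack_2d.lua",
}

def resolve(cli_overrides):
    """Merge command-line overrides over the declared defaults."""
    return {**DEFAULTS, **cli_overrides}
</pre>
<p>This is why, later on, passing configuration_basename:=backpack_2d_custom.lua at launch time takes precedence over the default.</p>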
<p>Next, we create a configuration_files folder within our own workspace:</p>
<pre>sudo mkdir -p /home/disrobot/ros2_ws/src/cartographer_ros/configuration_files/</pre>
<p>We copy the contents of the ROS2 installation Cartographer Configuration Files onto our own ros2_ws workspace:</p>
<pre>sudo cp -r /opt/ros/jazzy/share/cartographer_ros/configuration_files/ /home/disrobot/ros2_ws/src/cartographer_ros/</pre>
<p>Then, we edit permissions to make sure they&#8217;re set correctly:</p>
<pre>sudo chown -R disrobot:disrobot /home/disrobot/ros2_ws/src/cartographer_ros/</pre>
<p>Now, we create our custom configuration file:</p>
<pre>cd ~/ros2_ws/src/cartographer_ros/configuration_files/<br />touch backpack_2d_custom.lua</pre>
<p>Using an editor, we paste the following code into backpack_2d_custom.lua:</p>
<pre>-- Copyright 2016 The Cartographer Authors<br />--<br />-- Licensed under the Apache License, Version 2.0 (the "License");<br />-- you may not use this file except in compliance with the License.<br />-- You may obtain a copy of the License at<br />--<br />--      http://www.apache.org/licenses/LICENSE-2.0<br />--<br />-- Unless required by applicable law or agreed to in writing, software<br />-- distributed under the License is distributed on an "AS IS" BASIS,<br />-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.<br />-- See the License for the specific language governing permissions and<br />-- limitations under the License.<br />include "map_builder.lua"<br />include "trajectory_builder.lua"<br />options = {<br />  map_builder = MAP_BUILDER,<br />  trajectory_builder = TRAJECTORY_BUILDER,<br />  map_frame = "map",<br />  tracking_frame = "laser",<br />  published_frame = "base_link",<br />  odom_frame = "odom",<br />  provide_odom_frame = false,<br />  publish_frame_projected_to_2d = true,<br />  use_pose_extrapolator = true,<br />  use_odometry = false,<br />  use_nav_sat = false,<br />  use_landmarks = false,<br />  num_laser_scans = 1,<br />  num_multi_echo_laser_scans = 0,<br />  num_subdivisions_per_laser_scan = 1,<br />  num_point_clouds = 0,<br />  lookup_transform_timeout_sec = 0.2,<br />  submap_publish_period_sec = 0.3,<br />  pose_publish_period_sec = 5e-3,<br />  trajectory_publish_period_sec = 30e-3,<br />  rangefinder_sampling_ratio = 1.,<br />  odometry_sampling_ratio = 1.,<br />  fixed_frame_pose_sampling_ratio = 1.,<br />  imu_sampling_ratio = 1.,<br />  landmarks_sampling_ratio = 1.,<br />}<br />MAP_BUILDER.use_trajectory_builder_2d = true<br />TRAJECTORY_BUILDER_2D.num_accumulated_range_data = 1  -- must be a positive integer<br />TRAJECTORY_BUILDER_2D.max_range = 3.5<br />TRAJECTORY_BUILDER_2D.missing_data_ray_length = 3.<br />TRAJECTORY_BUILDER_2D.use_imu_data = false<br />TRAJECTORY_BUILDER_2D.use_online_correlative_scan_matching = true
<br />TRAJECTORY_BUILDER_2D.motion_filter.max_angle_radians = math.rad(0.1)<br />POSE_GRAPH.constraint_builder.min_score = 0.65<br />POSE_GRAPH.constraint_builder.global_localization_min_score = 0.7<br />return options</pre>
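<p>The last lines tune when new scans are accepted. For instance, the angular gate set by «motion_filter.max_angle_radians = math.rad(0.1)» means a scan is considered redundant unless the robot has rotated past that threshold since the last accepted scan. A simplified sketch of the angular condition only (the real motion filter also considers elapsed time and traveled distance):</p>
<pre>
import math

MAX_ANGLE_RAD = math.radians(0.1)  # the 0.1 degrees configured above

def passes_angle_gate(rotation_since_last_accepted_rad):
    """True if the rotation alone is enough for the scan to be kept."""
    return abs(rotation_since_last_accepted_rad) > MAX_ANGLE_RAD
</pre>
<p>A small threshold like this keeps almost every scan while the robot turns, at the cost of more data to match.</p>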
<p>Finally, to run our Cartographer launch file, we use the following commands:</p>
<pre>cd ~/ros2_ws<br />. install/local_setup.bash<br />ros2 launch cartographer_ros backpack_2d_custom.launch.py configuration_directory:=/home/disrobot/ros2_ws/src/cartographer_ros/configuration_files configuration_basename:=backpack_2d_custom.lua</pre>
<p>We should now be able to build a map.</p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">19cb3cde95aab99176a24b22c1e63adb</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group SuperCollider [DR2024]: [TASK] [COMPLETED] Playing Audio Locally While Controlling it RemotelyWe need to play audio coming directly out of the robot while accessing from the remote login (RDP) option. First, we need to open alsa-state.service, found in /usr/lib/systemd/system/alsa-state.service.Then, only add "!"  as "=!/etc/alsa/state-daemon.conf".This should allow the audio system to start without a problem. As described in a post below, the Sound Blaster card needs to be selected from the qjackctl setup menu at some point. This task has been completed.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/89/</link>
				<pubDate>Thu, 08 Aug 2024 02:57:30 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETED] Playing Audio Locally While Controlling it Remotely</b></p>
<p>We need to play audio coming directly out of the robot while accessing from the remote login (RDP) option. </p>
<p>First, we need to open alsa-state.service, found in /usr/lib/systemd/system/alsa-state.service.</p>
<p>Then, add a single «!» after the «=» of the ConditionPathExists line, so that it reads «ConditionPathExists=!/etc/alsa/state-daemon.conf».</p>
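<p>In systemd unit files, a leading «!» negates a path condition, so the unit starts only when the file is absent. Assuming the stock alsa-state.service (surrounding lines may differ by version), the edit looks like this:</p>
<pre>
# /usr/lib/systemd/system/alsa-state.service
# before:
ConditionPathExists=/etc/alsa/state-daemon.conf
# after (start only when the file does NOT exist):
ConditionPathExists=!/etc/alsa/state-daemon.conf
</pre>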
<p>This should allow the audio system to start without a problem. As described in a post below, the Sound Blaster card needs to be selected from the qjackctl setup menu at some point. </p>
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">b39e36f58d68010aad1b463101503e50</guid>
				<title>ruiqi became a registered member</title>
				<link>https://desobedienciarobotica.org/comunidad/p/85/</link>
				<pubDate>Wed, 07 Aug 2024 03:07:43 +0000</pubDate>

				
									<slash:comments>1</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">560865bff48ad6dd500c1eb92bd24652</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Ubuntu [DR2024]: [TASK] Adding Persistent SSH (port 22) Reverse Tunnel on Remote Machine &#38; Connect LocallyRemotely:sshpass -p '*********' autossh -R 4004:localhost:22 disobedientrobot@209.42.195.109 -p 2200Locally:ssh -i '/Users/nka/Desktop/DXARTS Projects/SSH Keys/DR2024_local' -L 4004:localhost:4004 disobedientrobot@209.42.195.109 -p 2200

ssh -i '~/DR2024_local' -L 4004:localhost:4004 disobedientrobot@209.42.195.109 -p 2200   # UTM Ubuntu DR4Then, locally:ssh -p 4004 disrobot@localhostThen we are connected via SSH in terminal to our remote computer. We need to add the remote command to initialize automatically at start-up. TBD.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/84/</link>
				<pubDate>Sun, 28 Jul 2024 22:19:29 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] Adding Persistent SSH (port 22) Reverse Tunnel on Remote Machine &amp; Connect Locally</b></p>
<p>Remotely:</p>
<pre>sshpass -p '*********' autossh -R 4004:localhost:22 disobedientrobot@209.42.195.109 -p 2200</pre>
<p>Locally:</p>
<pre>ssh -i '/Users/nka/Desktop/DXARTS Projects/SSH Keys/DR2024_local' -L 4004:localhost:4004 disobedientrobot@209.42.195.109 -p 2200

ssh -i '~/DR2024_local' -L 4004:localhost:4004 disobedientrobot@209.42.195.109 -p 2200   # UTM Ubuntu DR4</pre>
<p><span>Then, locally:</span></p>
<pre>ssh -p 4004 disrobot@localhost</pre>
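<p>Put together, the two tunnels compose into one path. A plain-Python sketch of the forwarding chain (the host labels «laptop», «server», and «robot» are illustrative, not a network implementation):</p>
<pre>
# On the robot, "-R 4004:localhost:22" makes the server's port 4004
# forward to the robot's sshd (port 22).
# On the laptop, "-L 4004:localhost:4004" makes the laptop's port 4004
# forward to the server's port 4004.
FORWARDS = {
    ("laptop", 4004): ("server", 4004),  # local forward (-L)
    ("server", 4004): ("robot", 22),     # reverse forward (-R)
}

def resolve_endpoint(endpoint):
    """Follow the forwarding chain to its final destination."""
    while endpoint in FORWARDS:
        endpoint = FORWARDS[endpoint]
    return endpoint

# "ssh -p 4004 disrobot@localhost" on the laptop therefore reaches robot:22.
</pre>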
<p>We are then connected via SSH in a terminal to our remote computer. We still need to make the remote command run automatically at start-up. TBD.</p>
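<p>For the start-up step, one possible approach is a systemd unit on the robot that keeps the reverse tunnel alive. This is a hypothetical sketch, not a tested configuration: the unit name is made up, and it assumes key-based authentication has replaced sshpass:</p>
<pre>
# /etc/systemd/system/reverse-tunnel.service (hypothetical name)
[Unit]
Description=Persistent reverse SSH tunnel
After=network-online.target

[Service]
User=disrobot
# -M 0 disables autossh's monitor port; the ServerAlive options let
# ssh itself detect a dead connection so autossh can restart it.
ExecStart=/usr/bin/autossh -M 0 -N -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" -R 4004:localhost:22 -p 2200 disobedientrobot@209.42.195.109
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
</pre>
<p>It would then be enabled with «sudo systemctl enable --now reverse-tunnel.service».</p>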
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">7bc80c3ebf27eb26630021fadc6537c1</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group RPLiDAR A1M8 [DR2024]: [TASK] [COMPLETED] &#x2714; Better Support for Graphics in RViZ2Upon running RViZ2 (remotely), I encountered the following errors:~/ros2_ws$ QT_QPA_PLATFORM=xcb rviz2   # run rviz2 remotelyTU: error: ../src/freedreno/vulkan/tu_knl.cc:208: failed to open device /dev/dri/renderD128 (VK_ERROR_INCOMPATIBLE_DRIVER)
Opening /dev/dri/renderD128 failed: Permission denied
MESA: error: ZINK: vkEnumeratePhysicalDevices failed (VK_ERROR_INITIALIZATION_FAILED)
MESA: error: ZINK: failed to choose pdev
glx: failed to create drisw screen
TU: error: ../src/freedreno/vulkan/tu_knl.cc:208: failed to open device /dev/dri/renderD128 (VK_ERROR_INCOMPATIBLE_DRIVER)
Opening /dev/dri/renderD128 failed: Permission denied
MESA: error: ZINK: vkEnumeratePhysicalDevices failed (VK_ERROR_INITIALIZATION_FAILED)
MESA: error: ZINK: failed to choose pdev
glx: failed to create drisw screen
[INFO] [1722121592.924541298] [rviz2]: Stereo is NOT SUPPORTED
[INFO] [1722121592.924652502] [rviz2]: OpenGl version: 4.5 (GLSL 4.5)
[INFO] [1722121593.003604851] [rviz2]: Stereo is NOT SUPPORTEDTo fix them, I added the user to the video group:sudo usermod -aG video disrobotThen, I installed Vulkan:sudo apt install vulkan-tools   # alternatively vulkan-utils (untested)Finally, I installed mesa drivers:sudo apt install mesa-utilsThis resulted in an RViZ2 launch with no errors, except for "Stereo is NOT SUPPORTED" which is more of a warning.This task has been completed.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/82/</link>
				<pubDate>Sat, 27 Jul 2024 23:28:25 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETED] &#x2714; Better Support for Graphics in RViZ2</b></p>
<p>Upon running RViZ2 (remotely), I encountered the following errors:</p>
<pre><b>~/ros2_ws$ QT_QPA_PLATFORM=xcb rviz2   # run rviz2 remotely<br /></b>TU: error: ../src/freedreno/vulkan/tu_knl.cc:208: failed to open device /dev/dri/renderD128 (VK_ERROR_INCOMPATIBLE_DRIVER)
Opening /dev/dri/renderD128 failed: Permission denied
MESA: error: ZINK: vkEnumeratePhysicalDevices failed (VK_ERROR_INITIALIZATION_FAILED)
MESA: error: ZINK: failed to choose pdev
glx: failed to create drisw screen
TU: error: ../src/freedreno/vulkan/tu_knl.cc:208: failed to open device /dev/dri/renderD128 (VK_ERROR_INCOMPATIBLE_DRIVER)
Opening /dev/dri/renderD128 failed: Permission denied
MESA: error: ZINK: vkEnumeratePhysicalDevices failed (VK_ERROR_INITIALIZATION_FAILED)
MESA: error: ZINK: failed to choose pdev
glx: failed to create drisw screen
[INFO] [1722121592.924541298] [rviz2]: Stereo is NOT SUPPORTED
[INFO] [1722121592.924652502] [rviz2]: OpenGl version: 4.5 (GLSL 4.5)
[INFO] [1722121593.003604851] [rviz2]: Stereo is NOT SUPPORTED</pre>
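<p>The «Permission denied» on /dev/dri/renderD128 can be checked directly before and after applying fixes (a small diagnostic sketch; the device path comes from the log above):</p>
<pre>
import os

def render_node_accessible(path="/dev/dri/renderD128"):
    """True if the render node exists and is readable/writable by us."""
    return os.path.exists(path) and os.access(path, os.R_OK | os.W_OK)
</pre>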
<p>To fix them, I added the user to the video group:</p>
<pre>sudo usermod -aG video disrobot</pre>
<p>Then, I installed Vulkan:</p>
<pre><span>sudo apt install vulkan-tools   # alternatively vulkan-utils (untested)</span></pre>
<p><span>Finally, I installed mesa drivers:</span></p>
<pre>sudo apt install mesa-utils</pre>
<p>This resulted in an RViZ2 launch with no errors, except for «Stereo is NOT SUPPORTED», which is informational rather than an error.</p>
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>1</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">d999e10615169a3dfa74dabfa98ec75e</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Ubuntu [DR2024]: [TASK] [COMPLETED] &#x2714; Automatically Give Permissions to User 'disrobot' to Access /dev/ttyACM*, etc.To prevent us from having to run the following command:sudo chmod 777 /dev/ttyACM0Whenever a program needs to access anything connected to our USB ports, we can add our user 'disrobot' to the 'dialout' group. To see in which groups our user participates, we can run the following command:groups disrobotTo add 'disrobot' to the 'dialout' group, which gives permanent permission/access to serial ports, we run:sudo usermod -aG dialout disrobot   # change 'disrobot' for your username if it is different.Afterwards, log out and log back in. Now, there should not be a need to give permissions every time we need to access devices.This task has been completed.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/81/</link>
				<pubDate>Sat, 27 Jul 2024 07:38:16 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETED] &#x2714; Automatically Give Permissions to User &#8216;disrobot&#8217; to Access /dev/ttyACM*, etc.</b></p>
<p>Whenever a program needs to access a device connected to our USB ports, we currently have to run:</p>
<pre>sudo chmod 777 /dev/ttyACM0</pre>
<p>To avoid this, we can add our user &#8216;disrobot&#8217; to the &#8216;dialout&#8217; group. To see which groups our user belongs to, we can run:</p>
<pre>groups disrobot</pre>
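<p>To check one specific membership programmatically, a small helper sketch (note it reads supplementary groups from /etc/group and will not see a user&#8217;s primary group):</p>
<pre>
import grp

def in_group(user, group):
    """True if `user` is listed as a supplementary member of `group`."""
    try:
        return user in grp.getgrnam(group).gr_mem
    except KeyError:  # group does not exist on this system
        return False
</pre>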
<p>To add &#8216;disrobot&#8217; to the &#8216;dialout&#8217; group, which gives permanent permission/access to serial ports, we run:</p>
<pre>sudo usermod -aG dialout disrobot   # change 'disrobot' for your username if it is different.</pre>
<p>Afterwards, log out and log back in. Now, there should not be a need to give permissions every time we need to access devices.</p>
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>0</slash:comments>
				
							</item>
					<item>
				<guid isPermaLink="false">b25208e12333975cbbd2563e7e19d1b0</guid>
				<title>Nicolas Kisic Aguirre posted an update in the group Hardware [DR2024]: [TASK] [COMPLETE] &#x2714; Add a Remote Switch to Turn On/Off RobotsWe acquired a Shelly Plus 1 remote switch and followed the instructions in this video (the video explains it for 12V DC power) to attach it to our 24V power supply. We can now power on/off our robot (only DR2 for now) from anywhere in the world with an internet connection. This task has been completed.</title>
				<link>https://desobedienciarobotica.org/comunidad/p/79/</link>
				<pubDate>Sat, 27 Jul 2024 06:24:15 +0000</pubDate>

									<content:encoded><![CDATA[<p><b>[TASK] [COMPLETE] &#x2714; Add a Remote Switch to Turn On/Off Robots</b></p>
<p>We acquired a <a target='_blank' href="https://www.shelly.com/en/products/shop/shelly-plus-1" rel="nofollow">Shelly Plus 1</a> remote switch and followed the instructions in <a target='_blank' href="https://www.youtube.com/watch?v=XkRUZn8XnU8" rel="nofollow">this video</a> (the video explains it for 12V DC power) to attach it to our 24V power supply. We can now power on/off our robot (only DR2 for now) from anywhere in the world with an internet connection. </p>
<p><b>This task has been completed.</b></p>
]]></content:encoded>
				
									<slash:comments>2</slash:comments>
				
							</item>
		
	</channel>
</rss>
		