What's behind MR. ROBOT?
February 25, 2024, by Keerti Nain

![blog feature image](https://atom-robotics-lab.github.io/images/blog/mr_robot/cover_thumbnai_mr_robot.jpeg)

# MR Robot

Let's think of something that people have made in their own likeness: robots. We get excited when we talk about robots, but there's also a healthy dose of curiosity, which often results in the development of ever more human-like machines. That is exactly why we at A.T.O.M. came up with the idea of fabricating "MR Robot". Fascinating, right? Let's dive into "MR Robot" then.

![MR Robot](/images/blog/mr_robot/cover_mr_robot.jpeg)

MR Robot is an autonomous mobile robot (AMR): it can understand and move through its environment without being directly overseen by an operator or limited to a fixed, predetermined path. It is designed to navigate using the ROS (Robot Operating System) navigation stack, a collection of software packages that enable robot mapping and navigation. This lets MR Robot explore its environment precisely and avoid obstacles using sensors such as LiDAR, cameras, and an IMU. The project is also designed with versatility in mind: it is composed of plug-and-play components that can be swapped to accommodate various use cases.

MR Robot leverages Simultaneous Localization and Mapping (SLAM) algorithms to navigate, understand, and interact with its environment autonomously. By simultaneously estimating its own position and constructing a map of its surroundings from sensor data, it can make decisions in real time and adapt its behavior in dynamic environments.
The modularity of the MR Robot project is one of its main advantages: sensors, manipulators, and other elements are interchangeable, so the platform can be tailored to a variety of uses, from medical robots to warehouse automation. Combining the strength of the ROS navigation stack with the adaptability of a modular design, MR Robot represents an exciting advancement in the field of autonomous robotics.

## Working of MR Robot

### Hardware Working

MR Robot is equipped with two encoder motors, an ESP32 microcontroller, a Raspberry Pi 4, an IMU, a LiDAR, a motor driver, and a 5V buck converter. The encoders measure the movement of the motors and report it to the microcontroller, which publishes the data to a topic from which odometry is calculated. The LiDAR and IMU are connected to the Raspberry Pi 4, and the motor driver forms an H-bridge for the differential drive. Let's go over each of these components in detail now; a short odometry sketch closes out this subsection.

- Encoder motors: MR Robot is equipped with two encoder motors. Each encoder measures the movement of its motor and sends the ticks to the microcontroller, where they are used to calculate odometry.
- ESP32 microcontroller: Receives the encoder data, processes it, and publishes it to a topic that is used to calculate odometry.
- Raspberry Pi 4: Hosts the LiDAR and IMU and processes the data from both sensors.
- IMU: Measures the orientation, acceleration, and angular velocity of the robot. The Raspberry Pi 4 fuses the IMU readings with a Kalman filter to estimate these quantities.
- LiDAR (X2L model): A LiDAR sensor that measures the distance and bearing of objects in the environment. It is connected to the Raspberry Pi 4 and used to build a map of the surroundings.
- Motor driver: Forms an H-bridge for the differential drive. It is connected to the microcontroller and the encoder motors and controls the speed and direction of the motors.
- 5V buck converter: Steps the battery voltage down to 5V for the electronics.

To wrap things up, MR Robot is an intricately engineered robotic system that integrates these sensors and components for seamless environmental navigation. The LiDAR, IMU, and encoder motors give the robot precise awareness of its movements; the motor driver handles motor control, while the microcontroller manages sensor data processing. In essence, MR Robot is a capable and adaptable platform, well suited to a wide range of tasks.
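To make the encoder-to-odometry pipeline concrete, here is a minimal rospy sketch of the kind of computation the firmware and the odometry node perform together. The topic names (`/left_ticks`, `/right_ticks`), encoder resolution, and wheel geometry below are assumptions for illustration, not MR Robot's actual values.

```python
#!/usr/bin/env python3
# Minimal differential-drive odometry sketch. Topic names, encoder
# resolution, and wheel geometry are assumed, not MR Robot's actual values.
import math
import rospy
from std_msgs.msg import Int64
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion
from tf.transformations import quaternion_from_euler

TICKS_PER_REV = 360.0     # assumed encoder resolution
WHEEL_RADIUS = 0.033      # metres (assumed)
WHEEL_SEPARATION = 0.17   # metres (assumed)

class DiffOdom:
    def __init__(self):
        self.x = self.y = self.yaw = 0.0
        self.left = self.right = 0
        self.last_left = self.last_right = None
        rospy.Subscriber("/left_ticks", Int64, self.left_cb)
        rospy.Subscriber("/right_ticks", Int64, self.right_cb)
        self.pub = rospy.Publisher("/odom", Odometry, queue_size=10)

    def left_cb(self, msg):
        self.left = msg.data

    def right_cb(self, msg):
        self.right = msg.data

    def update(self, _event):
        if self.last_left is None:
            self.last_left, self.last_right = self.left, self.right
            return
        # Convert tick deltas to wheel travel, then to pose increments.
        dl = (self.left - self.last_left) / TICKS_PER_REV * 2 * math.pi * WHEEL_RADIUS
        dr = (self.right - self.last_right) / TICKS_PER_REV * 2 * math.pi * WHEEL_RADIUS
        self.last_left, self.last_right = self.left, self.right
        ds = (dl + dr) / 2.0                    # distance travelled by robot centre
        dyaw = (dr - dl) / WHEEL_SEPARATION     # change in heading
        self.x += ds * math.cos(self.yaw + dyaw / 2.0)
        self.y += ds * math.sin(self.yaw + dyaw / 2.0)
        self.yaw += dyaw
        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        odom.pose.pose.orientation = Quaternion(*quaternion_from_euler(0, 0, self.yaw))
        self.pub.publish(odom)

if __name__ == "__main__":
    rospy.init_node("diff_odom_sketch")
    node = DiffOdom()
    rospy.Timer(rospy.Duration(0.05), node.update)  # integrate at 20 Hz
    rospy.spin()
```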
### Software Working

MR Robot employs the ROS Serial protocol for communication with its components, Twist to PWM for motor control, and ESP_Diff_TF for odometry calculation. Additionally, a LiDAR sensor interfaced with the Raspberry Pi 4 provides the data for robust mapping and navigation.

- ROS Serial protocol: Used to communicate with the robot's components, such as the microcontroller and motor driver. It allows messages to be exchanged between the robot and the computer, carrying motor commands in one direction and sensor data in the other.
- Twist to PWM: Controls the speed and direction of the robot's motors. It takes in Twist messages, which represent the desired linear and angular velocities of the robot, and converts them to PWM signals for the motor driver (see the sketch after this list).
- ESP_Diff_TF: Calculates the robot's odometry. It takes the encoder data from the motor driver and uses it to compute the robot's position and orientation, which are then passed to the navigation stack.
- AMCL: The Adaptive Monte Carlo Localization (AMCL) node localizes the robot within a map. It takes sensor data, such as the LiDAR scans from the Raspberry Pi 4, and uses it to estimate the robot's position. AMCL is part of the navigation stack and is responsible for accurately determining where the robot is.
- Move Base: Handles navigation. Given a goal location, it uses the odometry and the localization estimate from AMCL to drive the robot there, combining a global planner and a local planner to plan a path and avoid obstacles in real time.
- LiDAR data: Used for mapping and navigation. The LiDAR scans the environment and produces a 2D map of the surroundings, which the navigation stack uses to determine the robot's position and plan a path to the goal.
- GMapping: The LiDAR data can also be used for mapping with the GMapping node, a SLAM algorithm that builds a map of the environment. The generated map can then be used by the navigation stack.
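As referenced in the Twist to PWM item above, a converter of this kind can be sketched as a small rospy node. The `/cmd_vel` input is the ROS convention; the output topics, maximum wheel speed, and PWM range below are illustrative assumptions, not MR Robot's actual interface.

```python
#!/usr/bin/env python3
# Sketch of a Twist -> PWM converter for a differential drive.
# Output topics, max wheel speed, and PWM range are assumed.
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Int16

WHEEL_SEPARATION = 0.17  # metres (assumed)
MAX_WHEEL_SPEED = 0.5    # wheel speed in m/s that maps to full PWM (assumed)
PWM_MAX = 255

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def cmd_vel_cb(msg):
    # Differential-drive kinematics: split the commanded linear and
    # angular velocity into individual wheel speeds.
    left = msg.linear.x - msg.angular.z * WHEEL_SEPARATION / 2.0
    right = msg.linear.x + msg.angular.z * WHEEL_SEPARATION / 2.0
    # Scale wheel speeds to signed PWM duty values for the motor driver.
    left_pwm.publish(Int16(int(clamp(left / MAX_WHEEL_SPEED, -1, 1) * PWM_MAX)))
    right_pwm.publish(Int16(int(clamp(right / MAX_WHEEL_SPEED, -1, 1) * PWM_MAX)))

if __name__ == "__main__":
    rospy.init_node("twist_to_pwm_sketch")
    left_pwm = rospy.Publisher("/left_pwm", Int16, queue_size=10)
    right_pwm = rospy.Publisher("/right_pwm", Int16, queue_size=10)
    rospy.Subscriber("/cmd_vel", Twist, cmd_vel_cb)
    rospy.spin()
```

A node like this sits between whatever produces velocity commands (the navigation stack or a teleoperation interface) and the serial link to the ESP32.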
Summarizing, MR Robot is a robust robotic platform that leverages ROS Serial, Twist to PWM, and ESP_Diff_TF for motor control and odometry calculation. Navigation and localization are handled by the navigation stack through AMCL and Move Base, while the Raspberry Pi 4 feeds LiDAR data to the stack for localization and path planning, or to GMapping for building maps. In essence, MR Robot is a versatile platform suited to a multitude of tasks.

Now, setting aside the technical aspects, let's delve into the more enjoyable facets of MR Robot. MR Robot is equipped with teleoperation capabilities, allowing manual control via a joystick interface. This feature lets users navigate the robot remotely, directing its motion with precision and flexibility. By manipulating the joystick, operators can command the robot to move forward or backward, turn, and adjust its speed in real time, giving smooth and intuitive control (a minimal joystick sketch appears below). Map-based techniques, by contrast, follow pre-defined routes or optimal paths, with dynamic replanning adjusting the trajectory based on real-time data.

A costmap is a crucial tool for MR Robot to navigate its surroundings efficiently. By integrating data from sensors such as LiDAR and cameras (VSLAM), the robot builds a grid representation of its environment in which each cell carries a traversal cost. This map lets MR Robot identify obstacles, landmarks, and other relevant features, so it can plan and execute safe paths autonomously. By updating the costmap in real time, it can continuously refine its position estimate and adjust its trajectory, ensuring precise and reliable navigation through complex environments.
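The joystick sketch mentioned above could look like the following: a node that maps `sensor_msgs/Joy` axes to a Twist on `/cmd_vel`. The axis indices and scale factors are assumed, not taken from MR Robot's configuration.

```python
#!/usr/bin/env python3
# Sketch of joystick teleoperation: map sensor_msgs/Joy axes to a
# geometry_msgs/Twist on /cmd_vel. Axis indices and scales are assumed.
import rospy
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

LINEAR_AXIS = 1      # stick up/down (assumed layout)
ANGULAR_AXIS = 0     # stick left/right (assumed layout)
LINEAR_SCALE = 0.5   # m/s at full deflection (assumed)
ANGULAR_SCALE = 1.5  # rad/s at full deflection (assumed)

def joy_cb(msg):
    # Translate stick deflection directly into velocity commands.
    twist = Twist()
    twist.linear.x = LINEAR_SCALE * msg.axes[LINEAR_AXIS]
    twist.angular.z = ANGULAR_SCALE * msg.axes[ANGULAR_AXIS]
    cmd_pub.publish(twist)

if __name__ == "__main__":
    rospy.init_node("joy_teleop_sketch")
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rospy.Subscriber("/joy", Joy, joy_cb)
    rospy.spin()
```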
ROSJS serves as a vital bridge between the ROS-controlled MR Robot and users on the same network, offering a seamless interface for control and interaction. By leveraging JavaScript, ROSJS makes it possible to build web-based graphical user interfaces (GUIs) through which users can command MR Robot remotely: anyone sharing the network can reach the robot's functionality from a web browser, and developers can embed robot-control interfaces directly into web applications. With ROSJS, the barrier between web-based interfaces and ROS-controlled systems disappears, giving intuitive, accessible control of the robot within a local network.

MR Robot takes on a futuristic and visually captivating appearance with the integration of NeoPixel lights. Whether used for aesthetic appeal or for functional purposes such as signaling, these lights enhance MR Robot's visual presence and make it stand out in various environments. The ability to synchronize the lighting with the robot's movements or interactions adds an extra layer of sophistication and charm. With a 2D map integrated, setting a destination goal lets MR Robot traverse the path effortlessly. Adding to its versatility, adjusting the speed allows for mood modulation, enhancing the interactive experience with MR Robot (a small sketch of this mapping follows the list):

- Normal speed ~ happy mood
- Sudden acceleration ~ excited mood
- Braking ~ angry mood
- Immobile ~ bored mood
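As a rough illustration of how such a speed-to-mood mapping might be wired up, the sketch below watches `/cmd_vel` and publishes a colour for the lights. The thresholds, the `/led_color` topic, and the colour choices are all invented for illustration; MR Robot's actual mood logic may differ.

```python
#!/usr/bin/env python3
# Sketch of speed-based mood selection. Thresholds, the /led_color
# topic, and the colour choices are illustrative assumptions.
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import ColorRGBA

MOODS = {
    "happy":   ColorRGBA(0.0, 1.0, 0.0, 1.0),  # normal speed -> green
    "excited": ColorRGBA(1.0, 1.0, 0.0, 1.0),  # sudden acceleration -> yellow
    "angry":   ColorRGBA(1.0, 0.0, 0.0, 1.0),  # braking -> red
    "bored":   ColorRGBA(0.0, 0.0, 1.0, 1.0),  # immobile -> blue
}

last_speed = 0.0

def cmd_cb(msg):
    global last_speed
    speed = abs(msg.linear.x)
    delta = speed - last_speed  # change in commanded speed between messages
    last_speed = speed
    if speed < 0.01:
        mood = "bored"     # immobile
    elif delta > 0.2:
        mood = "excited"   # sudden acceleration
    elif delta < -0.2:
        mood = "angry"     # braking
    else:
        mood = "happy"     # cruising at normal speed
    led_pub.publish(MOODS[mood])

if __name__ == "__main__":
    rospy.init_node("mood_led_sketch")
    led_pub = rospy.Publisher("/led_color", ColorRGBA, queue_size=10)
    rospy.Subscriber("/cmd_vel", Twist, cmd_cb)
    rospy.spin()
```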
MR Robot represents the cutting edge of technological advancement: it can navigate complex environments with precision, interact seamlessly with users via intuitive controls, and exude personality through mood modulation and dynamic lighting. The combination of technologies like GMapping, joystick control, NeoPixel lighting, live video feeds, mood modulation, and SLAM with plug-and-play components has transformed the landscape of autonomous mobile robots. These features improve not only MR Robot's functionality and versatility but also its visual appeal and user experience, and they hold promise for a wide range of applications across industries. As we continue to push the boundaries of MR Robot's innovation, the future holds even more exciting opportunities for the evolution of autonomous mobile robots.

Source links:
[https://github.com/atom-robotics-lab/atom-robotics-lab.github.io/tree/main/content](https://github.com/atom-robotics-lab/atom-robotics-lab.github.io/tree/main/content)