It isn’t a secret that many kids find math boring, and it is easy for them to develop an attitude of “when am I ever going to use this?” But math is incredibly useful in the real world, from blue-collar machinists using trigonometry to quantum physicists unveiling the secrets of our universe through advanced calculus. By engaging children early on in fun, intuitive ways, we can lay a mathematical foundation to build upon, and TIEboard is a unique electronic toy that could help.
Developed by researchers from the Keio Graduate School of Media Design and University of Auckland, TIEboard is an interactive digital tool aimed at teaching kids geometric concepts. It is a bit like the classic Lite-Brite toy, but for geometric shapes and smart enough to guide learning. It consists of a grid of points, each of which is a hole that can be lit by an LED and accept a “thread.” Those threads are fiber optic and light up. They’re also conductive and make contact with pads around the holes.
A basic lesson to guide the construction of a square would light up four points. The child could then string threads between those points to form the sides of the square in glowing colors. More complex lessons are possible, and kids can progress through them as they grasp the fundamentals of shapes and geometry.
An Arduino Nano Every board provides that guidance, setting the colors of the LEDs and monitoring the matrix of copper pads around the holes to detect where threads have been placed. Buttons let the pupil move through the different lessons.
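As a rough illustration of how lesson checking could work (a hypothetical sketch, not the researchers' code), each detected thread can be treated as an unordered pair of grid points and compared against the lesson's target shape:

```python
# Hypothetical lesson-checking logic (not the researchers' code): each thread
# the hardware detects is treated as an unordered pair of grid points.

def normalize(edges):
    """Threads have no direction, so store each as a sorted point pair."""
    return {tuple(sorted(edge)) for edge in edges}

def lesson_complete(target_edges, detected_edges):
    """True once the placed threads exactly match the lesson's target shape."""
    return normalize(target_edges) == normalize(detected_edges)

# Lesson: a unit square across four lit grid points.
square = [((0, 0), (0, 1)), ((0, 1), (1, 1)), ((1, 1), (1, 0)), ((1, 0), (0, 0))]

# The child has strung three of the four sides so far.
placed = [((0, 1), (0, 0)), ((0, 1), (1, 1)), ((1, 0), (1, 1))]

print(lesson_complete(square, placed))        # False: one side is missing
print(normalize(square) - normalize(placed))  # the side still to be placed
```

The set difference is what lets the toy hint at which connection is still missing rather than just reporting pass or fail.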
The lessons created for the TIEboard prototype are limited and the researchers found that some of the test participants struggled to follow along, but the concept is strong and lesson refinement would likely improve the results in the future.
Hard data is hard to find, but roughly 100 million books were published prior to the 21st century. Of those, a significant portion were never available in a digital format and haven’t yet been digitized, which means their content is effectively inaccessible to most people today. To bring that content into the digital world, Redditor bradmattson built this machine that automatically scans books from cover to cover.
There are, of course, already machines on the market for scanning books. But the inexpensive models require manual page-turning and the more feature-packed models are very expensive. Bradmattson’s book scanner is fully automatic and can scan a whole stack of books without the assistance of a human operator. And the machine is relatively affordable to build, which makes it easier to justify the digitization of books that might otherwise be overlooked.
Oh, and it is portable. The whole thing folds up into a briefcase, so the operator can take it from location to location, digitizing books along the way.
As you’d expect, this machine is fairly complex. But the basic gist is that a stack of books rests on one side and gravity drops each one down onto a feed mechanism, which carries the book to the scanning area. There, a suction gripper lifts the cover. Next, a plexiglass press holds down the pages while a camera snaps a photo. To flip to the next page, a PC fan creates negative pressure to gently grip the paper and then the whole process repeats. When the whole book has been scanned, it slides over to the output area and the next book enters the scanning area.
A computer running Python oversees the process and catalogs the images. It controls the various motors through an Arduino GIGA R1 WiFi board paired with a CNC shield, as well as additional relays and a servo driver board.
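The scanning cycle described above can be modeled as a simple ordered sequence of hardware steps (the step names here are illustrative, not taken from bradmattson's actual code):

```python
# Illustrative model of the scanning cycle (step names are hypothetical, not
# from bradmattson's code): feed a book in, open it, then press, photograph,
# and flip until every page is captured.

def scan_sequence(num_pages):
    """Yield the ordered hardware steps for scanning one book."""
    yield "feed_book"          # gravity drops the book onto the feed mechanism
    yield "lift_cover"         # the suction gripper opens the cover
    for _ in range(num_pages):
        yield "press_pages"    # the plexiglass press flattens the spread
        yield "capture_image"  # the camera photographs the page
        yield "flip_page"      # the PC fan's suction grips and turns the page
    yield "eject_book"         # the book slides over to the output area

print(list(scan_sequence(2)))
```

In the real machine, each step corresponds to a command the Python supervisor sends down to the Arduino GIGA R1 WiFi, waiting for completion before moving on.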
We all love the immense convenience provided by robot vacuum cleaners, but what happens when they get too old to function? Rather than throwing it away, Milos Rasic from element14 Presents wanted to extract the often-expensive components and repurpose them into an entirely new robot, inspired by the TurtleBot3: the PlatypusBot.
Rasic quickly got to work by disassembling the bot into its drive motors, pump, and several other small parts. Luckily, the main drive motors already had integrated encoders, which made it easy to connect them to an Arduino UNO R4 WiFi and an L298N motor driver for precise position data and control. To improve the robot’s spatial awareness further, Rasic added a 360-degree lidar module and left enough space for a Raspberry Pi in order to run SLAM algorithms in the future.
For now, this 3D-printed robot assembled from reclaimed robot parts is controlled via a joystick over UDP and Wi-Fi. The host PC converts the joystick’s locations into a vector for the motors to follow, after which the values are sent to the UNO R4 WiFi for processing.
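One common way to do that joystick-to-wheels conversion is "arcade" mixing, sketched below; the exact scaling and packet format in Rasic's project may differ:

```python
# One common way to turn joystick axes into wheel speeds ("arcade" mixing);
# the exact scaling and UDP packet format in Rasic's project may differ.

def joystick_to_wheels(x, y):
    """Map joystick axes (x = turn, y = forward), each in [-1, 1],
    to (left, right) wheel speeds, also in [-1, 1]."""
    left = y + x
    right = y - x
    scale = max(1.0, abs(left), abs(right))  # keep both speeds in range
    return left / scale, right / scale

print(joystick_to_wheels(0.0, 1.0))  # (1.0, 1.0): straight ahead
print(joystick_to_wheels(1.0, 0.0))  # (1.0, -1.0): spin in place
```

The resulting pair would then be serialized and sent over UDP for the UNO R4 WiFi to apply to the motor driver.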
We know that introducing AI into your coding environment comes with questions – about safety, accuracy, privacy, and trust. That’s why we want to be transparent about how we built the recently-announced Arduino AI Assistant in the Cloud IDE, and why we chose to power it with Claude by Anthropic, available via Amazon Web Services (AWS) Bedrock. This feature is not a shortcut. It’s a tool to help you learn faster, test smarter, and stay focused on the creative side of building. Here’s how, and why, we made it.
Arduino AI Assistant: Your smart coding companion
Claude was designed from the ground up to be a collaborator – not just a chatbot. It’s one of the top-performing large language models (LLMs) when it comes to writing, explaining, and editing code. It is available through Amazon Bedrock, a fully managed service that makes foundation models accessible via API. We integrated Claude via AWS because it allowed us to easily access a secure and scalable model directly within the infrastructure we already trust and use.
We tested multiple models, and Claude stood out for its ability to understand context, generate cleaner code, and explain concepts clearly. It was also a good match for our goals: not just delivering answers, but helping you learn, debug, and iterate.
Context-aware with less hallucination
In developing the Arduino Cloud AI Assistant, we’ve implemented Retrieval Augmented Generation (RAG) – a technique that gives the AI more relevant context before it answers your question. Basically, when you ask the assistant something, we don’t just send your prompt to Claude directly. Instead, we first provide it with hand-picked, structured documentation based on your sketch, board, and use case.
This means you’re more likely to get reliable, Arduino-specific answers, and less likely to see hallucinated or misleading code. We regularly update these documents based on product releases and user feedback – so the system continues to improve over time.
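Conceptually, a RAG flow like this looks roughly as follows (a toy illustration in which the documents, retriever, and prompt format are all made up; Arduino's actual pipeline is more sophisticated than this):

```python
# Toy illustration of a RAG flow (the documents, retriever, and prompt
# format here are all made up, not Arduino's actual pipeline).

DOCS = {
    "uno r4 wifi": "The UNO R4 WiFi connects to networks via the WiFiS3 library.",
    "servo": "Use the Servo library: call attach() on a pin before write().",
}

def retrieve(prompt):
    """Naive keyword lookup standing in for a real document retriever."""
    return [text for key, text in DOCS.items() if key in prompt.lower()]

def build_augmented_prompt(user_prompt, sketch):
    """Bundle instructions, retrieved docs, and the user's sketch for the LLM."""
    return "\n\n".join([
        "You are the Arduino AI Assistant. Answer only Arduino questions.",
        "Relevant documentation:\n" + "\n".join(retrieve(user_prompt)),
        "Current sketch:\n" + sketch,
        "User question: " + user_prompt,
    ])

print(build_augmented_prompt("How do I get my UNO R4 WiFi online?", "void setup() {}"))
```

The key idea is that the model answers from curated, up-to-date documentation placed directly in its context, rather than from whatever it happens to remember.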
Privacy comes first
We’ve built clear guardrails into the AI Assistant’s behavior – both our own and the ones provided by AWS Bedrock. These include:
No personal or identifiable data (like private sketches or account info) is ever shared with the LLM.
Every response stays within the Arduino context – the assistant won’t answer or suggest anything unrelated to our platform.
Guardrails help prevent suggestions for harmful or inappropriate projects, reinforcing our community guidelines.
We’ve also taken a minimal-data approach. The assistant only sees what it needs to generate a useful reply – no more, no less.
Community-led AI Assistant
This assistant wasn’t designed in a vacuum. Before launch, we worked closely with users through interviews and beta testing to identify the most common questions and pain points. The feedback we received shaped everything from prompt engineering to UI design.
We’re continuing to build this tool with you. That’s why every answer includes a thumbs up/down feedback option, and why we monitor the results closely. Some of the most useful improvements – like support for more libraries, better error messages, and undo/redo functionality – came directly from user suggestions.
Your input helps us tune the assistant – and the documents it draws from – to serve the real needs of real developers.
Supporting learning, not replacing it
We’ve heard the concerns about generative AI – from hallucinated code to worries that AI tools could erode developer skills or take over human jobs. We share some of these concerns, and we’ve taken a careful approach.
We designed the Arduino AI Assistant to be just that: an assistant, not a replacement. It’s not there to write your entire project. It’s there to help you fix bugs, understand syntax, explore ideas, and stay in flow while you build. For example, you can ask the assistant: “Explain this sketch”, and it will walk you through the code step by step, helping you understand a project written by someone else or clarify syntax you’re unfamiliar with.
We’ve added lightweight signals – like “experimental” tags and a friendly reminder not to blindly trust code – to encourage self-learning.
Have you tried the Arduino Cloud AI Assistant yet?
The Arduino Cloud AI Assistant is available to everyone – even on the free plan. You can try it today with up to 30 free interactions per month, right inside the Cloud Editor.
If you need more, our Maker and School plans include 1,500 monthly interactions, and Team or Enterprise plans unlock unlimited usage.
Get started now at cloud.arduino.cc/features and let the assistant help you code smarter, debug faster, and stay in flow.
Yes, the title of this article sounds pretty crazy. But not only is it entirely possible through the lens of physics, but it is also practical to achieve in the real world using affordable parts. Jon Bumstead pulled it off with an Arduino, a photoresistor, and an inexpensive portable projector.
Today’s digital camera sensors are the result of a fairly linear progression from the camera obscura up through film cameras. The light from the scene enters through a lens that focuses all of that light onto a 2D plane at the same time. The digital “sensor” is actually a whole grid of tiny sensors that each measure the light they receive. The camera records those values, and assembling them into a grid gives you a digital image.
Bumstead’s “camera” works differently and only records a single point of light at a time. The entire camera is actually just an Arduino Mega 2560 (an UNO also works) with a photoresistor. The photoresistor provides a single analog light measurement and the Arduino reads that measurement, assigns a digital value, and passes the data to a PC.
Here’s the cool part: by only illuminating one point of the scene at a time, the camera can record each “pixel” in sequence. Those pixel values can then be reconstructed into an image. In this case, Bumstead used a portable video projector to provide the illumination. It scans the illumination point across the scene as the Arduino collects data from the photoresistor.
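The scan-and-reconstruct idea is easy to simulate (an illustrative toy, not Bumstead's code): light one point at a time, read a single sensor value, and rebuild the image from the sequence of readings.

```python
# Toy simulation of the single-pixel scan (illustrative, not Bumstead's code):
# light one point at a time, read one sensor value, rebuild the image.

SCENE = [  # ground-truth brightness the projector-plus-photoresistor "sees"
    [0, 9, 0],
    [9, 9, 9],
    [0, 9, 0],
]

def read_sensor(row, col):
    """Stand-in for the photoresistor: light level when only (row, col) is lit."""
    return SCENE[row][col]

def capture(height, width):
    """Scan the illumination point across the scene, one pixel per reading."""
    return [[read_sensor(r, c) for c in range(width)] for r in range(height)]

print(capture(3, 3) == SCENE)  # True: one reading per pixel rebuilds the scene
```

The real build works the same way, except `read_sensor` is an `analogRead()` of the photoresistor while the projector lights the corresponding point.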
Bumstead also experimented with more complex techniques that rely on projected patterns and a lot of very fancy math to achieve similar results.
Finally, Bumstead showed that this also works when the photoresistor doesn’t have line-of-sight to the scene. In that demonstration, light from the scene bounces off a piece of paper, kind of like a mirror. The photodetector only sees the reflected light. But that doesn’t matter — remember, the photodetector is only seeing a single point of light anyway. Whether that light came directly from the surface of objects in the scene or bounced off paper first, the result is the same (just with a bit less quality, because the paper isn’t a perfect reflector).
Are you an educator looking to make coding easier and faster to teach?
Join Andrea Richetta, Principal Product Evangelist at Arduino, and Roxana Escobedo, EDU Product Marketing Specialist, for a special Arduino Cloud Café live webinar on July 7th at 5PM CET.
You will discover how the new AI Assistant in Arduino Cloud can help you save valuable time in the classroom. We’ll also show you how the AI Assistant can generate, explain, and fix code, giving both you and your students the support you need to focus on creativity and learning.
What to expect
Watch live demos with the UNO R4 WiFi and Plug and Make Kit
Learn how to generate sketches, fix errors, and understand your code better
Get Andrea Richetta’s top 5 expert tips to work smarter with AI
Ask your questions live during our open Q&A
Whether you’re teaching STEM in a classroom or mentoring young developers, this session will help you engage with smarter, faster, AI-powered teaching.
Register now
Don’t miss your chance to see the AI Assistant in action and find out how AI is shaping the future of Arduino development.
One reason that fans prefer mechanical keyboards over membrane alternatives is that mechanical key switches provide a very noticeable tactile sensation at the moment a key press registers. Whether consciously or not, users notice that and stop pressing the key all the way through the maximum travel — reducing strain and RSI potential. Developed by researchers at KAIST’s HCI Tech Lab, UltraBoard is a novel wearable that provides similar tactile feedback while typing in virtual reality.
UltraBoard’s designers wanted a device suitable for VR typing that would provide on-demand haptic feedback sensations, without complicated physical actuators. They achieved that with an array of ultrasonic transducers that produce strong soundwaves that the user can feel, but not hear. That array sits below the hand and can project localized soundwaves targeting specific points. So, typing the letter “A” on a virtual reality keyboard would cause the transducer array to blast soundwaves at the tip of the pinky finger.
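That localized targeting relies on standard phased-array timing: each transducer fires with a delay chosen so that all the wavefronts arrive at the target point at the same instant. A minimal sketch of that math (illustrative, not the UltraBoard firmware):

```python
import math

# Illustrative phased-array timing (not the UltraBoard firmware): to focus
# ultrasound at a point, each transducer fires with a delay that compensates
# for its distance to the target, so all wavefronts arrive together.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def focus_delays(transducers, target):
    """Per-transducer firing delays (seconds) to focus at `target`.
    Points are (x, y, z) tuples in meters."""
    dists = [math.dist(t, target) for t in transducers]
    farthest = max(dists)
    # The farthest transducer fires first (delay 0); nearer ones wait.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# Two transducers with a target 10 cm above the first: the nearer one waits.
print(focus_delays([(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)], (0.0, 0.0, 0.1)))
```

Sweeping the target point across different fingertips is then just a matter of recomputing the delay pattern.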
An UltraBoard straps on to each of the user’s wrists and a servo motor near the strap tilts the transducer array to match the wrist angle, ensuring that the array is always directly underneath the user’s hand.
The prototype UltraBoard device uses both an Arduino Mega 2560 and an Arduino Micro board. They share duties, with the Micro controlling the servo motor and the transducer board, while the Mega controls an Ultraino driver board. They follow commands from a connected PC, which runs the virtual reality software that the user interacts with through a virtual reality headset.
The results of testing were mixed: UltraBoard didn’t appear to provide a statistically significant improvement in typing speed. Even so, the concept is interesting and further testing may reveal other benefits, such as a more comfortable typing experience.
If you ask someone to think of a battery, they’re probably going to picture a chemical battery, like a AA alkaline or a rechargeable lithium-ion battery. But there are other kinds of batteries that store energy without any fancy chemistry at all. If you find a way to save energy for later, you have a useful battery. Erik, of the Concept Crafted Creations YouTube channel, achieved that by storing kinetic energy in a spinning flywheel weighted with water.
This isn’t a crazy idea, because flywheels exist specifically to store kinetic energy in a spinning mass. In this case, most of that mass comes from tubes full of water. Water is cheaper than something like cast iron and it is easy to adjust the levels to maintain perfect balance.
But this wet flywheel has another trick up its sleeve: an adjustable moment of inertia. Watch an ice skater as they tuck into a spin and you’ll understand this. By pulling their arms and legs close to their axis of rotation, the skater reduces their overall moment of inertia and increases their speed. Erik’s flywheel can do the same thing by actuating the cylinders of water to bring them in closer to the rotational axis.
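The underlying physics is conservation of angular momentum: with no external torque, L = I·ω stays constant, so shrinking the moment of inertia speeds up the spin. A quick back-of-the-envelope example with made-up numbers:

```python
# Back-of-the-envelope physics for the ice-skater effect (illustrative
# numbers, not Erik's measurements). With no external torque, angular
# momentum L = I * w is conserved, so shrinking I speeds up the spin.

def moment_of_inertia(mass_kg, radius_m):
    """Point-mass approximation: water cylinders of mass m at radius r."""
    return mass_kg * radius_m ** 2

def new_angular_velocity(w1, i1, i2):
    """Conservation of angular momentum: I1*w1 = I2*w2, so w2 = w1*(I1/I2)."""
    return w1 * (i1 / i2)

i_out = moment_of_inertia(8.0, 0.4)  # cylinders extended, 40 cm from the axis
i_in = moment_of_inertia(8.0, 0.2)   # cylinders pulled in to 20 cm
print(new_angular_velocity(10.0, i_out, i_in))  # 40.0: halving r quadruples w
```

Because the moment of inertia scales with the square of the radius, halving the distance of the water cylinders from the axis quadruples the spin rate.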
To control that process, Erik used an Arduino Nano board housed in a simple laser-cut box with a potentiometer for adjusting speed, and buttons to control power and the arm actuation. A beefy brushless DC motor spins up the flywheel under power. Then, when it is time to collect that power (such as to power the lightbulb Erik used for demonstration), that motor acts as a dynamo, like in a generator.
As a battery for long-term power storage, this isn’t very practical. In a vacuum with perfect frictionless bearings, it would be. But in the real world, the flywheel will slow down on its own in short order. Even so, it is still a great illustration of the concept.
We are proud to announce two groundbreaking additions to the Arduino Pro portfolio: the Arduino Stella and Portenta UWB Shield, developed in partnership with Truesense. These advanced tools leverage ultra-wideband (UWB) technology to redefine precision tracking, indoor navigation, and contactless human-machine interactions, empowering IoT innovation across industries. Whatever you have in mind, you’ll leverage streamlined development thanks to ready-to-use Arduino IDE libraries, examples, and tutorials, enabling you to move from concept to prototype faster.
With UWB technology, you can achieve pinpoint accuracy in even the most complex environments, connect effortlessly with UWB-enabled smartphones and cloud platforms, and ensure your data remains private and secure thanks to UWB’s hard-to-intercept signals. You can learn more about our collaboration with Truesense and the power of UWB technology in our recent blog post: Arduino and Truesense partner to bring UWB technology to millions.
Arduino Stella shines for precision and versatility
Featuring an nRF52840 microcontroller and Truesense DCU040 module, the Arduino Stella delivers unparalleled accuracy for real-time tracking. Its compact design and seamless integration with UWB-enabled smartphones and apps like NXP Trimension, Apple’s Nearby Interaction, and Android’s UWB Jetpack library make it the perfect solution for modern tracking and automation needs.
Stella excels in industries such as healthcare, logistics, and smart buildings, offering advanced functionality like:
Pinpoint location tracking for high-value assets
Intuitive human-machine interaction
Automated safety and monitoring systems
Reliable indoor navigation
Portenta UWB Shield extends the end-to-end capabilities of the Portenta family
Powered by the Truesense DCU150, the Portenta UWB Shield easily adds UWB connectivity to the Portenta C33. This versatile shield acts as a base station and a client device, enabling precise real-time location services (RTLS) and two-way ranging.
With its modular and robust design, the Portenta UWB Shield is ideal for:
Smart logistics with dynamic route optimization
Interactive environments for enhanced user experiences
Secure and responsive IoT systems
Expand possibilities with ultra-wideband!
Every new addition to our ecosystem is a tool designed to make innovation accessible and scalable for professionals across industries. The Arduino Stella and Portenta UWB Shield, in particular, make it easier than ever to tackle applications such as:
Human-machine interaction: Enable intuitive commands and real-time feedback using UWB-equipped devices.
Follow-me AGVs: Automate logistics with autonomous vehicles that dynamically follow workers in warehouses.
Secure item transportation: Track critical items with proximity alerts and temperature monitoring during transit, leveraging compatibility with Modulino nodes.
Residential access control: Automate door access for authorized personnel with UWB-enabled smartphones.
EV automatic recharge: Streamline EV charging by triggering the process based on real-time vehicle positioning.
High-value asset tracking: Monitor valuable equipment in real time with location alerts and optimization tools.
Ready to elevate your IoT projects to new heights, with unmatched precision, seamless integration, and secure communication? Find the Arduino Stella and Portenta UWB Shield on the Arduino Store today!
Robot arms are very cool and can be quite useful, but they also tend to be expensive. That isn’t just markup either, because the components themselves are pricey. However, you can save a lot of money if you make some sacrifices and build everything yourself. In that case, you can follow Ruben Sanchez’s tutorial to create your own four-degrees-of-freedom robot arm from scratch.
This design has four actuated axes: the base, the shoulder, the elbow, and the wrist. Depending on the end effector you need, a gripper might count as another. It has a reach of up to 80cm and a maximum payload capacity of 350g, which is enough to move small objects.
Sanchez reduced the cost of this robot arm (compared to typical designs) in two ways. The first is by constructing the frame from aluminum sheet cut by hand, with laser markings as a guide template. The second is by using DC gear motors with external encoders for actuation, rather than purpose-built robotic actuators. They won’t have as much accuracy or repeatability, but they’re affordable.
An Arduino Due board controls the motors through Pololu drivers. The Arduino receives movement commands from a connected PC, which can view the work area through an Intel RealSense camera mounted near the end effector.
Sanchez provides the Arduino Sketch to get started, but encourages users to develop their own control software. To help with that, his writeup includes some nice explanations of inverse kinematics, the math involved, and how to implement it.
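For a feel of the math involved, here is the standard textbook inverse-kinematics solution for a two-link planar arm (a generic example, not Sanchez's code): given a target point and the two link lengths, it returns the shoulder and elbow angles.

```python
import math

# Standard two-link planar inverse kinematics (a generic textbook example,
# not Sanchez's code): find joint angles that place the arm's tip at (x, y).

def ik_2link(x, y, l1, l2):
    """Return (shoulder, elbow) angles in radians for link lengths l1, l2.
    Elbow-down solution; raises ValueError if the target is out of reach."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1.0 + 1e-9:
        raise ValueError("target out of reach")
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # guard against rounding error
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

# Fully extended along the x-axis: both joint angles are approximately zero.
print(ik_2link(0.8, 0.0, 0.4, 0.4))
```

A real four-axis arm adds a base rotation and a wrist angle on top of this, but the same law-of-cosines reasoning sits at the core.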