Adafruit Weekly Editorial Round-Up: September 15th to September 21st, Celebrating the Adafruit Discord Community, National Hispanic Heritage Month, All the Internet of Things – Ep. 5 and more!



ADAFRUIT WEEKLY EDITORIAL ROUND-UP


We’ve got so much happening here at Adafruit that it’s not always easy to keep up! Don’t fret, we’ve got you covered. Each week we’ll be posting a handy round-up of what we’ve been up to, ranging from learn guides to blog articles, videos, and more.


BLOG


14,000 THANK YOUs! Celebrating 14,000 members in the Adafruit Discord Community!

Together as a community, we've reached more than 14,000 humans. Thank you! We share projects, coordinate events, make new friends, and build open source together, including CircuitPython. We've worked really hard to make this a special place for everyone to share their projects, code, and the things they make.

Join today! https://adafru.it/discord

Check out the full post here!

More BLOG:

Keeping with tradition, we covered quite a bit this past week. Here's a kinda short, nearing-medium-length list of highlights:


LEARN

Magical Cardboard Craft Obsidian Sword

This guide takes you through the process of creating your own fantasy weapon that begins to glow as soon as you pick it up.

The design for this sword is taken from the Cartoon Network animated series ‘Steven Universe’.

See the full guide here!

More LEARN:

Browse all that’s new in the Adafruit Learning System here!

DeepRC Robot Car is a new kind of Smart Car #MachineLearning #ArtificialIntelligence #SmartPhone #3Dprinting #Robot #DeepLearning #TensorFlow @pyetras @hackaday

From the ‘DeepRC Robot Car’ project on hackaday.io by Piotr Sokólski

 

The ‘DeepRC Robot Car’ project on hackaday.io by Piotr Sokólski aims to create a miniature self-driving car that can be trained at home. One of the coolest parts of this project is that it uses a smartphone in place of several pieces of dedicated hardware. For instance, a mirror shifts the phone camera's view to the front of the vehicle so the car can see the road. The chassis is 3D printed, and a handful of other small electronics (like the nRF52 SoC) complete the car.

For controlling the actuators and reading telemetry data a small number of electronic components are installed on the chassis. The main circuit board is based on an excellent NRF52 SOC. It provides a Bluetooth LE radio to communicate with the phone. The servo is controlled by the chip directly, however the motor requires an additional Electronic Speed Controller (ESC).
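For a rough sense of what "controlled by the chip directly" involves (an illustrative sketch, not code from the project): a hobby servo is typically driven with a 50 Hz PWM signal whose pulse width, roughly 1–2 ms, encodes the commanded angle. In Python, the angle-to-duty-cycle mapping looks like this:

```python
def servo_duty_cycle(angle_deg, freq_hz=50, min_pulse_ms=1.0, max_pulse_ms=2.0):
    """Map a servo angle (0-180 degrees) to a PWM duty-cycle fraction.

    Hobby servos conventionally expect a ~1 ms pulse at 0 degrees and
    a ~2 ms pulse at 180 degrees, repeated at 50 Hz.
    """
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle must be within 0-180 degrees")
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180
    period_ms = 1000 / freq_hz  # 20 ms period at 50 Hz
    return pulse_ms / period_ms  # e.g. 1.5 ms / 20 ms = 0.075 at 90 degrees
```

On an actual board, this fraction would then be scaled into whatever units the PWM peripheral expects (for example, a 16-bit duty-cycle register).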

From the ‘DeepRC Robot Car’ project on hackaday.io by Piotr Sokólski

The software powering the robot is split between an app on the phone and a computer. @pyetras trained the robot to avoid collisions using deep reinforcement learning; specifically, the TF-Agents implementation of the Soft Actor-Critic algorithm was used.

For a Deep Reinforcement Learning algorithm I chose Soft Actor-Critic (SAC) (specifically the tf-agents implementation). I picked this algorithm since it promises to be sample-efficient (therefore decreasing data collection time, an important feature when running on a real robot and not a simulation) and there were already some successful applications on simulated cars and real robots.
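To make the "soft" in Soft Actor-Critic concrete (a hand-rolled sketch of the critic's target, not the tf-agents implementation the project used): SAC augments the usual bootstrapped return with an entropy bonus weighted by a temperature α, and takes the minimum of two target critics to curb overestimation:

```python
def sac_q_target(reward, done, gamma, alpha, next_q1, next_q2, next_log_prob):
    """Critic target for Soft Actor-Critic:

        y = r + gamma * (1 - done) * (min(Q1', Q2') - alpha * log pi(a'|s'))

    The -alpha * log pi term rewards the policy for staying stochastic
    (exploration); min(Q1', Q2') is the clipped double-Q trick.
    """
    soft_value = min(next_q1, next_q2) - alpha * next_log_prob
    return reward + gamma * (1.0 - float(done)) * soft_value
```

For example, with reward 1.0, gamma 0.99, alpha 0.2, target-critic values 2.0 and 3.0, and log-probability −1.0, the target works out to 1 + 0.99 × (2.0 + 0.2) = 3.178; the entropy term nudges the value up because the action was unlikely.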

The model followed the methodology of several projects, including “Learning to Drive Smoothly” and “Learning to Drive in a Day”. If you would like to learn more about this project, check out @pyetras’ YouTube video or GitHub.

From the ‘DeepRC Robot Car’ project on hackaday.io by Piotr Sokólski

Optical Machine Learning with Diffractive Deep Neural Networks #MachineLearning #3Dprinting #DeepLearning #NeuralNetworks #TensorFlow @InnovateUCLA

From techxplore.com. Credit: UCLA Engineering Institute for Technology Advancement

 

The Ozcan Lab at UCLA has created optical neural networks using 3D printing and lithography. TensorFlow models were trained on the MNIST, Fashion-MNIST, and CIFAR-10 data sets using beefy GPUs, and the trained models were then translated into multiple diffractive layers that together form the optical neural network. What the model lacks in adaptability it gains in speed: it can make predictions “at the speed of light”, with no power beyond the light that illuminates the input. The basic workflow is to pass light through an input object; the light is then filtered through the optical neural network to a detector, which captures the result.

…each network is physically fabricated, using for example 3-D printing or lithography, to engineer the trained network model into matter. This 3-D structure of engineered matter is composed of transmissive and/or reflective surfaces that altogether perform machine learning tasks through light-matter interaction and optical diffraction, at the speed of light, and without the need for any power, except for the light that illuminates the input object. This is especially significant for recognizing target objects much faster and with significantly less power compared to standard computer based machine learning systems, and might provide major advantages for autonomous vehicles and various defense related applications, among others.
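As a very rough toy model of that inference step (an illustrative NumPy sketch, not the Ozcan Lab's actual physics), each diffractive layer can be treated as a fixed phase mask applied to a complex optical field, with a unitary 2D FFT standing in for free-space propagation between layers; the detector at the end simply measures intensity:

```python
import numpy as np

def diffractive_forward(field, phase_masks):
    """Toy forward pass of a diffractive network.

    Each layer multiplies the complex field by a fixed phase mask
    (standing in for one fabricated surface), then "propagates" it to
    the next plane; a unitary 2D FFT is a crude stand-in for the real
    Fresnel diffraction between layers. The detector reads |field|^2.
    """
    for mask in phase_masks:
        field = field * np.exp(1j * mask)         # phase modulation
        field = np.fft.fft2(field, norm="ortho")  # propagate to next plane
    return np.abs(field) ** 2                     # detector intensity

rng = np.random.default_rng(0)
input_object = np.ones((8, 8), dtype=complex)            # uniformly lit object
layers = [rng.uniform(0, 2 * np.pi, (8, 8)) for _ in range(3)]
intensity = diffractive_forward(input_object, layers)
```

Because the masks are phase-only and the FFT is unitary, the total detected energy equals the input energy; training would shape where that energy lands so each class lights up a different detector region.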

If you’d like to learn more about photonics, check out the research happening at the Ozcan Lab. If you’d like more details about diffractive deep neural networks, check out this publication in Science or the Ozcan Lab’s most recent publication on the topic.

Adafruit Weekly Editorial Round-Up: July 14th – July 20th





BLOG


13,000 THANK YOUs! Celebrating 13,000 members in the Adafruit Discord Community!

Together as a community, we've reached more than 13,000 humans. Thank you! We share projects, coordinate events, make new friends, and build open source together, including CircuitPython. We've worked really hard to make this a special place for everyone to share their projects, code, and the things they make.

Join today! https://adafru.it/discord

Check out the full post here!

More BLOG:



LEARN

NeoTrellis Sound Board

Use an Adafruit Feather M4 and Prop-Maker FeatherWing to make a portable NeoTrellis sound box! Play and trigger motion-activated audio samples with CircuitPython. Build and assemble the 3D-printed enclosure to make your own, with a built-in speaker and rechargeable battery!

See the full guide here!


Adafruit Weekly Editorial Round-Up: May 23rd – May 29th





BLOG

Tiny Machine Learning on the Edge with TensorFlow Lite Running on SAMD51

Tiny Machine Learning on the Edge with TensorFlow Lite Running on SAMD51 (video; skip to the demo at 7 min 28 secs if you like). You’ve heard of machine learning (ML), but what is it? And do you have to buy specialty hardware to experiment? If you have some Adafruit hardware, you can build some TinyML projects today!

Check out the full post here!

More BLOG:



LEARN

1,900th GUIDE! Trash Panda 2: Garbage Day

Our 1,900th guide has landed in the Learn System! It’s John Park’s Trash Panda 2: Garbage Day and it’s lots of fun! You can make it with MakeCode Arcade, and mod and hack it all you like.

In Trash Panda 2: Garbage Day, you play as a suburban dweller just trying to get some sleep when the raccoons and cats decide it’s time to make noise and throw garbage out of the trash bins! You must try to stop them by shining your flashlight on them. But you can only play at night, so be sure your PyGamer or PyBadge’s light sensor indicates it’s dark out!

See the full guide here!
