2021.01.24 19:00 UTC



Getting Started with Raspberry Pi Pico using MicroPython and C

2021-01-24 10:43 CNXSoft Jean-Luc Aufranc (CNXSoft)

Raspberry Pi Pico board was just launched last Thursday, but thanks to Cytron I received...


DevEBox STM32H7 Development Boards are made for Factory Automation

2021-01-23 04:08 CNXSoft Abhishek Jadhav

When it comes to STMicroelectronics’ STM32H7 series, there are three product lines: Dual-core lines, Single-core...


PoE-enabled Apollo Lake system triggers machine vision

2021-01-22 19:35 LinuxGizmos Eric Brown

Axiomtek’s compact “MVS100-323-FL” machine vision computer combines Apollo Lake with 3x GbE ports — 2x with PoE — plus lighting controls, trigger I/O, isolated DIO, and mini-PCIe. Axiomtek has previously launched vision I/O computers based on Intel’s 7th Gen Kaby Lake with products like the MVS900-511-FL, IPS962-512-PoE, and IPS960-511-PoE. The new MVS100-323-FL is a far […]


Mastering embedded AI

2021-01-22 19:01 Embedded.com Ariel Hershkovitz

The appeal of putting AI in embedded applications is obvious, for example using face-id to authorize access to machine controls on a factory floor. Facial recognition, voice control, anomaly detection, with AI there are so many possibilities. I’ll use face-id as an example in this blog. So much easier to use, more intelligent and more robust than traditional human-machine interfaces and passwords. Not to mention that everyone else is doing it. How AI works may seem magical, but what it can do is fast becoming a minimum expectation. No one wants to evaluate products transparently based on yesterday’s technology.

(Source: CEVA)

The challenge

There’s a problem for a product builder. AI-based development is quite different from standard embedded development. You aren’t writing software, at least for the core function. You have to train a neural net to recognize patterns (like images), just as you would train a child in school. Then you must optimize that net to the constrained footprint of your embedded device, to meet size and power goals. Neural nets may not be conventional code, but the net and its calculations still consume memory and burn power. As an embedded developer you know how important it is to squeeze these metrics as much as possible. I’ll get to this in my next blog. For now let’s understand at least some of how these neural nets work.

The basics

I don’t want to walk you through a lengthy explanation of neural nets; just what you’re going to have to do to make your application work. A neural net is conceptually a series of layers of “neurons”. Each neuron reads two (or more) inputs from a previous layer or the input data, applies a calculation using trained weights and feeds forward a result. Based on these weights, a layer detects features, progressively more complex as you move through layers, eventually recognizing a complex image at the output.
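The per-neuron calculation just described can be sketched in a few lines of plain Python (a minimal illustration: the sigmoid activation, weights and input values here are invented for the example, not taken from any real network):

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of the inputs plus a bias, passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes the result into (0, 1)

def layer(inputs, weight_rows, biases):
    """One fully connected layer: each neuron reads every output of the previous layer."""
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Two input values feeding a layer of three neurons (arbitrary "trained" weights).
out = layer([0.5, -1.2],
            [[0.4, 0.1], [-0.3, 0.8], [0.9, -0.5]],
            [0.0, 0.1, -0.2])
```

Stacking several such layers, each feeding the next, is all a feed-forward net is; the intelligence lives entirely in the weight values.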

The first clever part then is in designing the net – how many layers, connections between the layers and so on – the core neural net algorithm. The second clever part is in training. This is a process in which many images are run through the net, with labeling to identify what should be recognized. These runs build up the weight values needed for recognition.

If you feel ambitious, you might build your own neural net from scratch in one of the standard frameworks such as TensorFlow. You could also start from an open-source option such as this one for face-id. You can build all of this into an app which can run on a laptop, which will be handy for customers who want to register new approved faces. Now you can start training your network with a test set of approved faces in multiple poses.

Why not just do this in the cloud?

There are services that will do face recognition online – no need to get into messy AI on your device. Just take the picture, upload it to the cloud, the app transmits back an OK and your product approves the next step.

But – all your approved employees need to have their photos and other credentials in the cloud. Maybe not such a great idea for security and privacy. You’ll burn quite a bit of power communicating the image to the cloud every time a worker wants access to a machine. And if your Internet connection is down, no-one can be approved until it comes back up. Doing authentication right on the device preserves privacy and security, keeps the power demand low and continues to work even when the network connection is down.

Up next – embedding your trained network

Now that the hard AI part is done, you have to download it to your device. That’s an interesting step in its own right, where you’ll definitely need help from your AI platform. I’ll talk about that more in my next blog. Meanwhile, for more information, check out “Deep learning for the real-time embedded world.”

Ariel Hershkovitz serves as CEVA’s Senior Manager of Customer Solutions for Software Development Tools. Ariel brings over 14 years of multi-disciplinary experience, spanning software development, verification, integration and deployment of software deliveries, in both technical and managerial roles. He is passionate about user experience, ease of use and innovative technology, and is highly proficient in analyzing complex problems and simplifying them for rapid resolution. Ariel holds a B.Sc. in Computer Science from Ben-Gurion University, and an MBA from Bar-Ilan University.

Related Contents:

For more Embedded, subscribe to Embedded’s weekly email newsletter.

The post Mastering embedded AI appeared first on Embedded.com.


Most Read articles – RP2040, SpaceX satellites, automotive chips

2021-01-22 15:27 ElectronicsWeekly Alun Williams

There's the new Raspberry Pi RP2040, Lockheed Martin handing over the Orion spacecraft, the head of ARM China, more SpaceX success for StarLink launches, and special pleading from the US auto industry...

This story continues at Most Read articles – RP2040, SpaceX satellites, automotive chips

Or just read more coverage at Electronics Weekly


The Tough World Of Mixed Signal

2021-01-22 15:12 ElectronicsWeekly David Manners

David Milne had a huge success with Wolfson Microelectronics taking it from a university design-house spin-off to become the best mixed signal fabless company in the world. Back in 2002, Milne summed up the mixed signal scene: “ST is strong, Zoran is big in DVDs, TI is the 900lb Gorilla in DSP, Philips should be ...

This story continues at The Tough World Of Mixed Signal

Or just read more coverage at Electronics Weekly


Interesting PWM on Raspberry Pi RP2040 microcontroller

2021-01-22 11:46 ElectronicsWeekly Steve Bush

I am supposed to be on other duties this week, but cannot resist pointing you in the direction of the new Raspberry Pi in-house MCU, the RP2040. Much as I want to write chapter and verse, time limits do not allow, so I will restrict myself to pointing out the PWM block, which has some ...

This story continues at Interesting PWM on Raspberry Pi RP2040 microcontroller

Or just read more coverage at Electronics Weekly


ASPs on the up

2021-01-22 11:44 ElectronicsWeekly David Manners

Renesas, NXP, and ST are among semiconductor manufacturers which have hiked prices by between 10 and 20%, reports the Nikkei, while Toshiba has entered into negotiations with customers about settling higher prices. Soaring demand and lack of capacity are likely to drive prices higher for the rest of the year. The car industry is in particularly bad ...

This story continues at ASPs on the up

Or just read more coverage at Electronics Weekly


Whitepaper: ON Semi – IoT and Energy Harvesting Zigbee Green Power Switches

2021-01-22 11:37 ElectronicsWeekly EW Staff

A new whitepaper available on Electronics Weekly details energy harvesting for remote switches using Zigbee, for example for use in Smart Homes and Buildings. Download the whitepaper » Written by ON Semiconductor, with an IoT focus, it covers how best to achieve the benefits of wireless, battery-free switches. It writes: “These devices offer unparalleled flexibility for deployment within buildings ...

This story continues at Whitepaper: ON Semi – IoT and Energy Harvesting Zigbee Green Power Switches

Or just read more coverage at Electronics Weekly


Rockchip RV1126 AI Camera SoC features 2.0 TOPS NPU, promises 250ms fast boot

2021-01-22 11:14 CNXSoft Jean-Luc Aufranc (CNXSoft)

The Rockchip Developer Conference that took place at the end of November 2020 allowed us...


Arduino builds on the Raspberry Pi RP2040

2021-01-22 11:03 ElectronicsWeekly Alun Williams

Raspberry Pi launched its first microcontroller-class product yesterday – the RP2040 – and Arduino have now announced they are working with it for their own board, which will be the Arduino Nano RP2040 Connect. The RP2040 (right) was newly developed at the Raspberry Pi Foundation and the Pi Pico builds on it as a standalone ...

This story continues at Arduino builds on the Raspberry Pi RP2040

Or just read more coverage at Electronics Weekly


Samsung plans $10bn 3nm Texas fab

2021-01-22 10:28 ElectronicsWeekly David Manners

Samsung is planning to build a $10 billion 3nm fab alongside its existing fab in Austin Texas, reports Bloomberg. The plan involves financial help from the new US administration which is said to be currently under negotiation. The plan is to start building this year with manufacturing equipment to be installed in 2022 and first ...

This story continues at Samsung plans $10bn 3nm Texas fab

Or just read more coverage at Electronics Weekly


Polymer as strong as polyamide with 15x its bending fatigue limit.

2021-01-22 07:12 ElectronicsWeekly David Manners

Toray Industries has created a polymer that retains the outstanding thermal resistance, rigidity, and strength of polyamide 6 (PA6) while delivering a bending fatigue limit that is 15-fold that of conventional polymers. Prospective applications for such exceptional durability include automobiles, appliances, and sporting goods. Toray looks to initiate full-fledged sample work in fiscal year 2021 ...

This story continues at Polymer as strong as polyamide with 15x its bending fatigue limit.

Or just read more coverage at Electronics Weekly


UK gets a Space Sector Landscape Map

2021-01-22 07:08 ElectronicsWeekly Alun Williams

The Knowledge Transfer Network (KTN) and UK Space Agency (UKSA) have mapped out a Space Sector Landscape for the UK. The tool outlines the universities, companies, funding bodies and networks that collectively form the nation’s space industry. You can view the interactive map online, to drill down into the data by various categories – see ...

This story continues at UK gets a Space Sector Landscape Map

Or just read more coverage at Electronics Weekly


Ultra-wide DC-DC converters have 91%+ efficiency

2021-01-22 07:03 ElectronicsWeekly David Manners

Murata has brought out two new ultra-wide 10:1 ratio DC-DC converters, the 250W IRH-W80 half-brick and the 150W IRQ-W80 quarter-brick from Murata Power Solutions. Both high power density modules feature efficiency levels above 91% with a 16 – 160 Vdc input voltage range. The 250W IRH-W80 and 150W IRQ-W80 modules are designed for embedded applications ...

This story continues at Ultra-wide DC-DC converters have 91%+ efficiency

Or just read more coverage at Electronics Weekly


GNSS front end integrates impedance matching and ESD circuitry

2021-01-22 07:00 ElectronicsWeekly David Manners

ST’s BPF8089-01SC6 RF front-end for GNSS integrates the impedance-matching and ESD protection circuitry typically implemented using discrete components. The device provides a 50Ω matched interface between the receiver’s antenna and low-noise amplifier (LNA), and is ready to plug-and-play with ST’s STA8089 and STA8090 LNAs. The chip typically replaces a matching network containing up to five capacitors, ...

This story continues at GNSS front end integrates impedance matching and ESD circuitry

Or just read more coverage at Electronics Weekly


Android 10 ported to RISC-V board powered by Alibaba T-Head XuanTie C910 Processor

2021-01-22 05:06 CNXSoft Jean-Luc Aufranc (CNXSoft)

RISC-V has made a lot of progress in just a few years, but for anything...


Decline In US Anti-Government Extremist Groups

2021-01-22 02:05 ElectronicsWeekly David Manners

The Southern Poverty Law Center, a prominent U.S. civil rights group, identified 576 extreme anti-government groups in 2019, 181 of which were militias. That represents a reduction on the 612 groups documented in 2018 and it is also far fewer than the 1,360 that were active in 2012. The decline in active groups under Trump ...

This story continues at Decline In US Anti-Government Extremist Groups

Or just read more coverage at Electronics Weekly


Raspberry Pi HAT offers NMEA 2000 link for marine applications

2021-01-21 23:15 LinuxGizmos Eric Brown

Copperhill’s $99 “PiCAN-M” HAT for the Raspberry Pi provides CAN-based NMEA 2000 and RS-422 driven NMEA 0183 ports for marine applications. The HAT includes a 3A SMPS supply and a Qwiic link. In 2019, Copperhill Technologies launched a PiCAN3 CAN-Bus HAT for the Raspberry Pi 4. The new PiCAN-M (for Marine), built for Copperhill by […]


Zora P1 Amlogic A311D Development Board interfaces with Orbbec 3D Cameras

2021-01-21 20:00 CNXSoft Abhishek Jadhav

We have already seen the powerful Amlogic A311D powered Khadas VIM3 SBC, and Orbbec announced...


Raspberry Pi designs its own MCU along with $4 board

2021-01-21 19:13 Embedded.com Nitin Dahad

Raspberry Pi has designed its own microcontroller (MCU), the RP2040, and launched a new $4 board based on the new MCU, the Raspberry Pi Pico, programmable in C and MicroPython.

The RP2040 features a dual-core Arm Cortex-M0+ processor with 264KB internal RAM and support for up to 16MB of off-chip Flash. A wide range of flexible I/O options includes I2C, SPI, as well as programmable I/O (PIO). Raspberry Pi’s chief operating officer, James Adams, said in a blog, “We had three principal design goals for RP2040: high performance, particularly for integer workloads; flexible I/O, to allow us to talk to almost any external device; and of course, low cost, to eliminate barriers to entry. We ended up with an incredibly powerful little chip, cramming all this into a 7 × 7 mm QFN-56 package containing just two square millimetres of 40 nm silicon.”

He added that with six independent banks of RAM, and a fully connected switch at the heart of its bus fabric, it is easy to arrange for the cores and DMA engines to run in parallel without contention. In addition, since the Cortex-M0+ lacks a floating-point unit, Raspberry Pi commissioned optimized floating-point functions from Mark Owen, author of the Qfplib libraries; these are substantially faster than their GCC library equivalents and are licensed for use on any RP2040-based product.

Adams said, “With two fast cores and a large amount of on-chip RAM, RP2040 is a great platform for machine learning applications. For power users, we provide a complete C SDK, a GCC-based toolchain, and Visual Studio Code integration. For beginners, and other users who prefer high-level languages, we’ve worked with Damien George, creator of MicroPython, to build a polished port for RP2040; it exposes all of the chip’s hardware features, including our innovative PIO subsystem. And our friend Aivar Annamaa has added RP2040 MicroPython support to the popular Thonny IDE.”
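To give a flavour of that MicroPython port, the canonical first program for the Pico is a blink sketch along these lines (this runs on the board itself under MicroPython, not on a PC; GPIO 25 drives the on-board LED on the original Pico):

```python
from machine import Pin  # MicroPython's hardware-access module
import time

led = Pin(25, Pin.OUT)   # on-board LED of the original Pico

while True:
    led.toggle()         # flip the LED state
    time.sleep(0.5)      # blink at roughly 1 Hz
```

Dropping a compiled UF2 of the MicroPython firmware onto the board via the mass-storage boot mode, then pasting code like this into the REPL (for instance from the Thonny IDE mentioned above), is the expected beginner workflow.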

The Raspberry Pi Pico $4 board featuring its own dual core Arm Cortex-M0+ based microcontroller (Image: Raspberry Pi)

In his blog, Adams explained the reasons for producing its own silicon. He said the Raspberry Pi has been quite successful in helping bridge the worlds of software and hardware, and as a result has sold 37 million units to date. However, he said the existing boards do have limits – for example, a Raspberry Pi Zero consumes on the order of 100mW; a Raspberry Pi on its own does not support analog input; and while it is possible to run “bare metal” software on a Raspberry Pi, software running under a general-purpose operating system like Linux is not well suited to low-latency control of individual I/O pins.

He added that many applications tend to pair their Raspberry Pi with a microcontroller: the Raspberry Pi handles the computation, network access and storage, while the microcontroller handles analog input and low-latency I/O.

Hence by making its own silicon, it could improve performance, I/O and cost. In developing the RP2040, he said, they’d learned the lessons from using other microcontrollers in Raspberry Pi products.

In addition to the microcontroller and the board, Raspberry Pi said it has also been working with various partners to create both a variety of other boards based on the RP2040 silicon platform, as well as accessories for the Raspberry Pi Pico. These partners include Adafruit, Arduino, Pimoroni, and Sparkfun.

The Raspberry Pi Pico board pinout. (Image: Raspberry Pi)

Key features of the RP2040

  • Dual-core Arm Cortex-M0+ @ 133MHz
  • 264KB (remember kilobytes?) of on-chip RAM
  • Support for up to 16MB of off-chip Flash memory via dedicated QSPI bus
  • DMA controller
  • Interpolator and integer divider peripherals
  • 30 GPIO pins, 4 of which can be used as analogue inputs
  • 2 × UARTs, 2 × SPI controllers, and 2 × I2C controllers
  • 16 × PWM channels
  • 1 × USB 1.1 controller and PHY, with host and device support
  • 8 × Raspberry Pi Programmable I/O (PIO) state machines
  • USB mass-storage boot mode with UF2 support, for drag-and-drop programming.

The $4 board

Along with the new microcontroller, the company also introduced a $4 board, the Raspberry Pi Pico. This pairs the RP2040 with 2MB of Flash memory, and a power supply chip supporting input voltages from 1.8-5.5V. This allows the Pico to be powered up from a wide variety of sources, including two or three AA cells in series, or a single lithium-ion cell.

The Pico board provides a single LED and a single push button, which can be used to enter USB mass-storage mode at boot time and also as a general input. It exposes 26 of the 30 GPIO pins on RP2040, including three of the four analog inputs, on 0.1”-pitch pads; headers can be soldered to these pads, or the castellated edges allow the Pico to be soldered directly to a carrier board. Volume customers will be able to buy pre-reeled Pico units. The Pico PCB layout was co-designed with the RP2040 silicon and package: a two-layer PCB with a solid ground plane and a GPIO breakout that Adams said “just works”.

Full technical specifications of the RP2040, the Raspberry Pi Pico, and the software development kit (SDK), whether for C/C++ or MicroPython, are available here.

Related Contents:

For more Embedded, subscribe to Embedded’s weekly email newsletter.

The post Raspberry Pi designs its own MCU along with $4 board appeared first on Embedded.com.


Raspberry Pi goes MCU with open-spec Pico

2021-01-21 18:58 LinuxGizmos Eric Brown

RPi Ltd. has launched a $4 “Raspberry Pi Pico” board based on an “RP2040” chip with dual Cortex-M0+. The Pico adds 2MB flash, micro-USB, and 26 GPIO. RP2040-based boards are also available from Adafruit, Arduino, Pimoroni, and SparkFun. The Raspberry Pi project was modeled in part on the Arduino open hardware project that continues to […]


Arduino, and Pi Pico, builds on the Raspberry Pi RP2040

2021-01-21 18:53 ElectronicsWeekly Alun Williams

Not only has Raspberry Pi launched its first microcontroller-class product - the Raspberry Pi Pico - but then Arduino have built on it with the Arduino Nano RP2040 Connect.

This story continues at Arduino, and Pi Pico, builds on the Raspberry Pi RP2040

Or just read more coverage at Electronics Weekly


Fable: A Good Screw

2021-01-21 15:18 ElectronicsWeekly David Manners

There was once a great mathematician who was asked to come up with a way to remove the bilge-water in ships. His solution, which was adopted, was a device with a revolving screw-shaped blade inside a cylinder. When the screw was turned it could raise low-lying water and pump it out at a higher level. ...

This story continues at Fable: A Good Screw

Or just read more coverage at Electronics Weekly


Third-party Raspberry Pi RP2040 boards from Arduino, Adafruit, Sparkfun and Pimoroni

2021-01-21 09:19 CNXSoft Jean-Luc Aufranc (CNXSoft)

I’ve just written about the launch of the Raspberry Pi Pico board and Raspberry Pi...


Raspberry Pi Pico

2021-01-21 09:05 ElectronicsWeekly David Manners

Farnell has today announced availability of the first product built on Raspberry Pi-designed silicon: Raspberry Pi Pico. It brings Raspberry Pi’s signature values of high performance, low cost, and ease of use to the microcontroller market, in a game-changing $4 development kit. Farnell customers will be able to purchase the Raspberry Pi Pico from the Farnell ...

This story continues at Raspberry Pi Pico

Or just read more coverage at Electronics Weekly


Mecool KM6 Deluxe (Amlogic S905X4) TV Box Review

2021-01-21 09:00 CNXSoft Nico Dekerf

Back in September 2020, Jean-Luc wrote about the Mecool KM6 TV Box. This comes with...


$4 Raspberry Pi Pico board features RP2040 dual-core Cortex-M0+ MCU

2021-01-21 07:59 CNXSoft Jean-Luc Aufranc (CNXSoft)

The Raspberry Pi Foundation introduced the Linux capable Raspberry Pi board in 2012 to teach...


Four European Network Heavyweights Back O-RAN

2021-01-21 07:20 ElectronicsWeekly David Manners

Deutsche Telekom, Orange, Telefónica, and Vodafone have issued a joint MOU about O-RAN roll-out. The operators said they will work together with existing and new ecosystem partners like the O-RAN Alliance and the Telecom Infra Project (TIP) to ensure that O-RAN quickly reaches competitive parity with traditional solutions. “Open RAN is the natural evolution of ...

This story continues at Four European Network Heavyweights Back O-RAN

Or just read more coverage at Electronics Weekly


ICs complete RF signal chain

2021-01-21 07:13 ElectronicsWeekly David Manners

Renesas has added to its portfolio for macro base transceiver stations (BTS) with four devices, offering customers access to a complete RF signal chain. This expansion includes quad-channel F4482/1 TX variable gain amplifiers (VGA) and the F011x family of dual-channel first-stage low noise amplifiers (LNA). The device set also includes the F1471 RF driver amplifier ...

This story continues at ICs complete RF signal chain

Or just read more coverage at Electronics Weekly


Element14 rallies disaster responses

2021-01-21 07:09 ElectronicsWeekly David Manners

Element14 is giving its members the opportunity to come up with creative and altruistic solutions for global disasters. Members are encouraged to think outside the box and share ideas that could potentially be used to combat anything from global droughts and flooding, to world hunger or even complications from long term COVID-19 symptoms or other ...

This story continues at Element14 rallies disaster responses

Or just read more coverage at Electronics Weekly


SpaceX passes Starlink 1,000-satellite launch mark

2021-01-21 07:07 ElectronicsWeekly Alun Williams

The UK government-owned OneWeb may be scaling back on its planned satellite broadband constellation, but Elon Musk’s rival SpaceX is pressing ahead: the latest deployment of 60 Starlink satellites sees the total pass 1,000 launched. It means 1,015 Starlink satellites have been delivered into space, although it is reported that of those 951 are actually still ...

This story continues at SpaceX passes Starlink 1,000-satellite launch mark

Or just read more coverage at Electronics Weekly


Shake Up Coming For Auto industry

2021-01-21 02:00 ElectronicsWeekly David Manners

The traditional car industry business model is going to be turned on its head by Foxconn and Apple, Malcolm Penn, CEO of Future Horizons, told IFS2021 earlier this week. “Apple and Foxconn will change the automotive world,” said Penn, “the auto industry is re-structuring – it’s easier for Apple and Tesla to get into EVs ...

This story continues at Shake Up Coming For Auto industry

Or just read more coverage at Electronics Weekly


Fanless embedded PC supports industrial GRE Tiger Lake CPUs

2021-01-20 22:50 LinuxGizmos Eric Brown

Avalue’s fanless, rugged “EMS-TGL” embedded PC runs Linux or Win 10 on embedded versions of Intel’s 11th Gen ULP3 Core CPUs with up to 64GB DDR4-3200, 3x M.2, 1GbE and 2.5GbE ports, and optional “IET” expansion. Avalue, which recently launched a pair of NUC-APL mini-PCs based on Intel’s Apollo Lake, announced a larger, but similarly […]


Renesas adds to RZ/G2 line with three Cortex-A55 SoCs

2021-01-20 19:59 LinuxGizmos Eric Brown

Renesas unveiled three low-end “RZ/G2L” members of its RZ/G2 family of Linux-driven IoT SoCs with single or dual Cortex-A55 cores plus a Mali-G31, Cortex-M33, and up to dual GbE support. There is also a SMARC module and dev kit. Renesas’ RZ/G2 line of industrial-focused system-on-chips includes the hexa-core RZ/G2M and octa-core RZ/G2H, both with mixtures […]


How data-driven control using ML improves 5G network performance

2021-01-20 19:21 Embedded.com Alex Saad-Falcon

5G is quickly moving from an idealized future to a very real present. The first 5G-ready iPhone has already been released. As with all generation upgrades, 5G promises significant speed improvements over its predecessor. 4G-LTE offers a peak download rate of 100Mbit/s, with the average being 25-50Mbit/s. In stark contrast, 5G offers up to 1.8Gbps, an improvement of almost 20x. Additionally, 5G has lower latency, with the primary latency contributor being airtime. Verizon has reported a latency of under 30ms in early deployment. There are additional gains in user mobility, energy efficiency, and number of simultaneous connections.

These changes do not come for free, however. 5G deployment requires extensive changes to existing infrastructure to handle new technologies for higher frequencies, beamforming, edge computing, and more. Luckily, these infrastructure improvements also enable previously unrealizable applications. For instance, 5G has the potential to make augmented reality much more tenable.

The sheer amount of data generated by 5G systems enables data-driven control architectures, powered by machine learning, to make 5G even more powerful and efficient. In this article, we will discuss an architecture to apply machine learning to a 5G system. New architectures and algorithms can lead to huge service improvements and cost savings for 5G systems as they become more widespread in the coming years.

New 5G technologies

We will first mention a few of the new technologies coming with 5G that allow the data-driven architecture to become a reality. The primary drivers relate to mobile edge computing (MEC) and radio access networks (RANs).

MEC moves computing from centralized servers closer to the mobile data users in different areas. Normally, data is forwarded from the base station to a central server. The central server handles the data and sends a response back to the base station. Given that the central server can be halfway across the country from the base station, the roundtrip time for data is on the order of tens to hundreds of milliseconds, which limits how responsive the cellular network can be.

MEC brings decentralized computing to cellular networks. Instead of a single central server, compute devices are distributed around the country, with one or multiple per service area. This reduction in processing latency enables much more sophisticated algorithms than previously possible, especially real-time and region-specific algorithms.

Figure 1. A comparison between traditional (left) and 5G (right) network architecture latency.

The second key driver for data-driven 5G architectures is radio access network (RAN) improvements. The RAN is responsible for transferring data from user devices to the core network. 5G technologies add multiple frequency bands, beamforming, and massive MIMO to RANs. These allow huge reconfigurability in how data is delivered to users, but they present challenges in orchestration. This reconfigurability enables, for example, crowded concerts to have hugely improved service.

In the first full set of 5G standards, the 3GPP specifies splitting the monolithic base station of previous generations into separate units: centralized units, distributed units, and radio units (CUs, DUs, and RUs, respectively). The decentralized, flexible nature of the 5G RAN enables sophisticated control schemes based on data from the thousands of units in each service area.

Data-driven cellular architecture

One possible data-driven architecture to take advantage of the distributed RAN consists of the following:

  • a cloud controller which manages RAN controllers for a given service area
  • RAN controllers that orchestrate the centralized and decentralized units to handle user device actions, like RAN transfers and load balancing
  • centralized and distributed units that handle data-delivery operations
  • radio units that control the RF transceivers that put data over the air

This architecture is presented and described in greater detail by Polese et al. In the paper, they propose an edge-controller-based architecture for cellular networks and evaluate its performance with real data from hundreds of base stations of a major U.S. operator. They provide insights on how to dynamically cluster and associate base stations and controllers, according to the global mobility patterns of the users.

Both cloud and RAN controllers and even the centralized units can be deployed in the MEC. The distribution of components provides a separation of concerns for different layers in the protocol stack and, therefore, allows the cloud and RAN controllers to make higher level decisions without having to worry about low-level operations like channel coding and beamforming.

Figure 2. Polese et al.’s proposed 5G distributed control architecture.

For example, the RAN controller can aggregate data from all of its corresponding centralized and distributed units and run machine learning algorithms to optimize service in real time. The cloud controller can then aggregate data from multiple RAN controllers and determine which algorithms are performing the best. It can also create an estimate of user behavior and can monitor network congestion in different areas throughout the day.

Machine learning

Polese et al. tested their architecture on real 4G-LTE data from a cellular provider in California. LTE architecture is fully distributed, but it does not have the aggregation and data sharing extant in 5G architectures. Their study found that, in comparison to the LTE architecture, a controller-based 5G architecture, as described above, greatly improves prediction accuracy by aggregating data from multiple sources in the cloud and RAN controllers. This access to information from a plethora of sources makes the architecture a great candidate for new data-driven strategies and machine learning. Algorithms can be run on the cloud and RAN controllers, which can disseminate decisions to their respective CUs, DUs and RUs.

In terms of machine learning algorithms, Polese et al. experimented with random forest, Bayesian ridge, and Gaussian process regressors. The authors used these algorithms to predict different key performance indicators. The authors also experimented with a cluster-based approach in contrast to a local-based one. The cluster-based approach tries to group controllers based on location or data. The authors found that data-based clustering was more effective. The data-based clusters must be updated periodically in response to network activity, which requires network overhead to coordinate between clusters, but the authors found that daily updates had comparable performance to 15-minute updates.
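The idea of data-based clustering can be illustrated with a toy one-dimensional k-means pass in plain Python (the KPI readings and the choice of k-means are illustrative; the paper's own clustering method may differ):

```python
def kmeans_1d(values, centers, iterations=10):
    """Tiny k-means: group scalar KPI readings around the k nearest centers."""
    for _ in range(iterations):
        # Assignment step: each value joins its nearest center.
        clusters = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Update step: move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Per-controller average throughput (invented values, Mbit/s): two natural groups
# emerge from the data itself, regardless of where the controllers are located.
kpis = [11, 12, 13, 48, 50, 52]
centers, clusters = kmeans_1d(kpis, centers=[0.0, 100.0])
```

Grouping by data rather than geography means two physically distant controllers with similar traffic patterns can share a prediction model.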

The most successful algorithm, in terms of RMSE (root mean-squared error), was the cluster-based Gaussian process regressor, followed by cluster-based random forest and local-based Bayesian ridge. The cluster-based GPR outperformed all other algorithms for all time lags, from 1 to 10 minutes. Additionally, using a cluster-based approach brings a 53% reduction in RMSE when compared to a local-based approach, directly showing the potential improvement of the 5G architecture over LTE.
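For reference, the RMSE metric used to rank these predictors is straightforward to compute (the KPI forecasts below are invented to show the comparison; the 53% figure comes from the paper's real data, not from this example):

```python
import math

def rmse(predicted, actual):
    """Root mean-squared error between two equal-length sequences."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Invented KPI forecasts: the "cluster" predictor tracks the truth more closely.
actual  = [10.0, 12.0, 11.0, 13.0]
local   = [ 8.0, 15.0,  9.0, 16.0]
cluster = [ 9.5, 12.5, 10.5, 13.5]

local_err = rmse(local, actual)      # larger error
cluster_err = rmse(cluster, actual)  # smaller error
```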

Key takeaways

As 5G continues to roll out, new applications that take advantage of the unique properties of 5G networks will be needed to fully realize performance and efficiency gains. Using data-driven techniques like machine learning, RAN controllers can orchestrate how decentralized base stations provide service. The simple addition of data sharing to the fully decentralized LTE architecture can bring a 53% reduction in RMSE to regression algorithms. The ability to forecast load, throughput, and outage duration with this accuracy is highly beneficial for managing the network efficiently.

Alex Saad-Falcon is a published research engineer at an internationally acclaimed research institute, where he leads internal and sponsored projects. Alex has an MS in electrical engineering from Georgia Tech and is pursuing a PhD in machine learning. He is a content writer for Do Supply Inc.

Related Content:

For more Embedded, subscribe to Embedded’s weekly email newsletter.

The post How data-driven control using ML improves 5G network performance appeared first on Embedded.com.


Gadget Watch: CES, the Gadget Fest’s Gadget Fest

2021-01-20 16:24 ElectronicsWeekly Alun Williams

When it comes to the Gadget Watch category on Gadget Master, surely CES represents the ultimate Gadget Watch roundup? (Albeit in Virtual Form this year, of course).

This story continues at Gadget Watch: CES, the Gadget Fest’s Gadget Fest

Or just read more coverage at Electronics Weekly


ON Semi adds Quuppa AoA location technology to Bluetooth SoC

2021-01-20 15:31 Embedded.com Nitin Dahad

ON Semiconductor has added the Quuppa Intelligent Locating System to its RSL10 low power Flash-based Bluetooth low energy radio system on chip (SoC). Provided in a user-friendly CMSIS-Pack format, the solution allows manufacturers to design ultra-low-power indoor asset tracking applications with direction finding features and advanced angle of arrival (AoA) technology.
The Intelligent Locating System is a technology platform developed by Quuppa, a company established in 2012 in Espoo, Finland, by the team which was previously responsible for the invention of high accuracy indoor positioning (HAIP) at the Nokia Research Center. Its direction finding methodology and positioning algorithms enable real-time tracking of tags and devices with centimeter-level accuracy, even in challenging environments. Quuppa technology allows positioning updates to be sent up to 50 times per second, providing a reliable real-time locating system (RTLS) solution for many industries.

The Quuppa AOA location technology uses the radio signal direction measured by antennas to then compute a tag’s position using advanced algorithms (Image: Quuppa)

The RSL10 Quuppa RTLS AoA tag CMSIS-Pack is available as part of a comprehensive asset management development ecosystem from ON Semiconductor and technology partners. Designed to provide manufacturers with flexible deployment options, the ecosystem features a range of RSL10-based solutions including sensor development kits and software resources. For turnkey solutions, ON Semiconductor has collaborated with Tatwah sa to develop a portfolio of Bluetooth tags and beacons including the recently added Quuppa trackable tags.

On Semiconductor RSL10-Quuppa
The Quuppa AoA location technology enables low power indoor asset tracking (Image: ON Semiconductor)

“Monitoring and tracking of assets enables new capabilities and huge improvements in operational efficiencies across a wide variety of applications,” said Wiren Perera, who heads IoT at ON Semiconductor. “Ultra-low-power wireless sensing and accurate location identification are essential to delivering on this promise. Implementing the Quuppa trackable technology on our industry-leading Bluetooth Low Energy wireless platform addresses this need, and our portfolio of solutions should unleash the full market potential.”
“Over the years, we have witnessed increasing demand for RTLS technologies, where companies seek to gain visibility into their production lines and workflows,” said Fabio Belloni, chief customer officer, Quuppa. “The need for a variety of different types of sensors and tags to empower those use cases is endless.”
The RSL10 Quuppa AoA RTLS Tag CMSIS pack featuring the Quuppa Intelligent Locating System is compatible with all RSL10-based development hardware and is available now for free download.

Related Contents:

For more Embedded, subscribe to Embedded’s weekly email newsletter.

The post ON Semi adds Quuppa AoA location technology to Bluetooth SoC appeared first on Embedded.com.


Bug Changes Traffic Congestion Pattern

2021-01-20 15:23 ElectronicsWeekly David Manners

The bug changed traffic congestion patterns last year, says TomTom’s 2020 Traffic Index, with 387 cities experiencing a significant decrease (average of 21%) in overall congestion and a 28% average decrease in congestion during rush hours. By contrast, only 13 cities saw their traffic jams increase. Bengaluru in India has been displaced by Moscow in top ...

This story continues at Bug Changes Traffic Congestion Pattern

Or just read more coverage at Electronics Weekly


Understanding the UART

2021-01-20 15:04 Embedded.com Eric Peña and Mary Grace Legaspi

The UART, or universal asynchronous receiver-transmitter, is one of the most used device-to-device communication protocols. This article shows how to use a UART as a hardware communication protocol by following the standard procedure.

When properly configured, the UART can work with many different types of serial protocols that involve transmitting and receiving serial data. In serial communication, data is transferred bit by bit over a single line or wire. In two-way communication, two wires are used for successful serial data transfer. Depending on the application and system requirements, serial communication needs less circuitry and fewer wires, which reduces the cost of implementation.

In this article, we will discuss the fundamental principles of using a UART, with a focus on packet transmission, the standard frame protocol, and customized frame protocols, which add value for security compliance when implemented, especially during code development. This article also shares some basic steps for checking a data sheet during product development.

The goal is a better understanding of, and compliance with, UART standards, so you can maximize its capabilities and applications, particularly when developing new products.

“The single biggest problem in communication is the illusion that it has taken place.”

—George Bernard Shaw

Communication protocol plays a big role in organizing communication between devices. It is designed in different ways based on system requirements, and these protocols have a specific rule agreed upon between devices to achieve successful communication.

Embedded systems, microcontrollers, and computers mostly use a UART as a form of device-to-device hardware communication protocol. Among the available communication protocols, a UART uses only two wires for its transmitting and receiving ends.

Despite being a widely used hardware communication protocol, the UART is not always fully optimized. Proper implementation of a frame protocol is commonly disregarded when using the UART module inside a microcontroller.

By definition, UART is a hardware communication protocol that uses asynchronous serial communication with configurable speed. Asynchronous means there is no clock signal to synchronize the output bits from the transmitting device going to the receiving end.


Figure 1. Two UARTs directly communicate with each other.

The two signals of each UART device are named:

  • Transmitter (Tx)
  • Receiver (Rx)

The main purpose of a transmitter and receiver line for each device is to transmit and receive serial data intended for serial communication.

Figure 2. UART with data bus.

The transmitting UART is connected to a controlling data bus that sends data in a parallel form. From this, the data will now be transmitted on the transmission line (wire) serially, bit by bit, to the receiving UART. This, in turn, will convert the serial data into parallel for the receiving device.

The UART lines serve as the communication medium to transmit and receive data between the two devices. Note that each UART device has a pin dedicated to transmitting and a pin dedicated to receiving.

For UART and most serial communications, the baud rate needs to be set the same on both the transmitting and receiving devices. The baud rate is the rate at which information is transferred over a communication channel. In the serial port context, the configured baud rate is the maximum number of bits per second to be transferred.

Table 1 summarizes what we must know about the UART.

Table 1. UART Summary

Wires 2
Speed 9600, 19200, 38400, 57600, 115200, 230400, 460800, 921600, 1000000, 1500000
Methods of Transmission Asynchronous
Maximum Number of Masters 1
Maximum Number of Slaves 1

The UART interface does not use a clock signal to synchronize the transmitter and receiver devices; it transmits data asynchronously. Instead of a clock signal, the transmitter generates a bitstream based on its own clock signal, while the receiver uses its internal clock signal to sample the incoming data. Synchronization is managed by having the same baud rate on both devices. Failure to do so may affect the timing of sending and receiving data, causing discrepancies during data handling. The allowable difference in baud rate is up to 10% before the timing of bits gets too far off.
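To see why matched baud rates matter, note that the receiver re-synchronizes only on each start-bit edge, so clock error accumulates across the frame. A rough sketch of that drift (this simple model ignores any mid-frame edge re-synchronization, so it is pessimistic):

```python
def sampling_drift(tx_baud, rx_baud, frame_bits=10):
    """Accumulated sampling drift, in bit periods, by the center of the
    last bit of a frame (10 bits = start + 8 data + 1 stop). The receiver
    re-synchronizes on each start bit, so only per-frame drift matters."""
    tx_bit = 1.0 / tx_baud          # transmitter's bit period (s)
    rx_bit = 1.0 / rx_baud          # receiver's bit period (s)
    # Center of the last bit as each side sees it, measured from the
    # start-bit edge, expressed in transmitter bit periods.
    offset = frame_bits - 0.5
    return abs(offset * rx_bit - offset * tx_bit) / tx_bit

# Perfectly matched clocks: no drift at all.
print(sampling_drift(9600, 9600))            # 0.0
# A 5% mismatch drifts ~0.45 bit periods by the last bit; drift >= 0.5
# means the receiver is sampling the wrong bit entirely.
print(sampling_drift(9600, 9600 * 1.05))
```

In practice many receivers tolerate more than this model predicts because they oversample and can re-align on data edges mid-frame, which is why quoted tolerances vary between sources.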

Data Transmission

In a UART, the mode of transmission is in the form of a packet. The logic that connects the transmitter and receiver handles the creation of serial packets and controls the physical hardware lines. A packet consists of a start bit, a data frame, a parity bit, and stop bits.

Figure 3. UART packet.

Start Bit

The UART data transmission line is normally held at a high voltage level when it’s not transmitting data. To start the transfer of data, the transmitting UART pulls the transmission line from high to low for one (1) clock cycle. When the receiving UART detects the high to low voltage transition, it begins reading the bits in the data frame at the frequency of the baud rate.

Figure 4. Start bit.

Data Frame

The data frame contains the actual data being transferred. It can be five (5) bits up to eight (8) bits long if a parity bit is used. If no parity bit is used, the data frame can be nine (9) bits long. In most cases, the data is sent with the least significant bit first.

Figure 5. Data frame.


Parity

Parity describes the evenness or oddness of a number. The parity bit is a way for the receiving UART to tell if any data has changed during transmission. Bits can be changed by electromagnetic radiation, mismatched baud rates, or long-distance data transfers.

After the receiving UART reads the data frame, it counts the number of bits with a value of 1 and checks if the total is an even or odd number. If the parity bit is a 0 (even parity), the 1 or logic-high bit in the data frame should total to an even number. If the parity bit is a 1 (odd parity), the 1 bit or logic highs in the data frame should total to an odd number.

When the parity bit matches the data, the UART knows that the transmission was free of errors. But if the parity bit is a 0, and the total is odd, or the parity bit is a 1, and the total is even, the UART knows that bits in the data frame have changed.

Figure 6. Parity bits.
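The parity rules above are straightforward to express in code. A minimal sketch (function names are illustrative):

```python
def parity_bit(data_bits, even=True):
    """Parity bit the transmitter appends so the total count of 1s in
    data + parity is even (even parity) or odd (odd parity)."""
    ones = sum(data_bits)
    return ones % 2 if even else (ones + 1) % 2

def parity_ok(data_bits, received_parity, even=True):
    """Receiver-side check: recompute parity and compare."""
    return parity_bit(data_bits, even) == received_parity

frame = [1, 0, 1, 1, 0, 0, 0, 0]        # three 1s in the data frame
print(parity_bit(frame, even=True))     # 1 -> total count of 1s becomes four (even)
print(parity_ok(frame, 1, even=True))   # True: transmission looks error-free
print(parity_ok([1, 0, 0, 1, 0, 0, 0, 0], 1, even=True))  # False: a bit flipped
```

Note that a single parity bit only detects an odd number of flipped bits; two flips in the same frame cancel out, which is one motivation for the frame-level CRC discussed later.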

Stop Bits

To signal the end of the data packet, the sending UART drives the data transmission line from a low voltage to a high voltage for a duration of one (1) to two (2) bits.

Figure 7. Stop bits.

Steps of UART Transmission

First: The transmitting UART receives data in parallel from the data bus.

Figure 8. Data bus to the transmitting UART.

Second: The transmitting UART adds the start bit, parity bit, and the stop bit(s) to the data frame.

Figure 9. UART data frame at the Tx side.

Third: The entire packet is sent serially starting from start bit to stop bit from the transmitting UART to the receiving UART. The receiving UART samples the data line at the preconfigured baud rate.

Figure 10. UART transmission.

Fourth: The receiving UART discards the start bit, parity bit, and stop bit from the data frame.

Figure 11. The UART data frame at the Rx side.

Fifth: The receiving UART converts the serial data back into parallel and transfers it to the data bus on the receiving end.

Figure 12. Receiving UART to data bus.
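The five steps above can be sketched end to end in a few lines: wrap a byte in start, optional parity, and stop bits on the transmit side, then strip them again on the receive side (a sketch assuming an 8-bit data frame sent LSB first):

```python
def uart_frame(byte, parity=None, stop_bits=1):
    """Build the serial bit sequence for one byte: start bit (0),
    8 data bits LSB first, optional parity bit, then stop bit(s) (1)."""
    data = [(byte >> i) & 1 for i in range(8)]   # LSB first
    bits = [0] + data                            # start bit pulls the line low
    if parity == "even":
        bits.append(sum(data) % 2)
    elif parity == "odd":
        bits.append((sum(data) + 1) % 2)
    return bits + [1] * stop_bits                # line returns high

def uart_unframe(bits, parity=None, stop_bits=1):
    """Receive side: discard start/parity/stop bits and rebuild the byte."""
    data = bits[1:9]                             # skip the start bit
    return sum(b << i for i, b in enumerate(data))

frame = uart_frame(0x55)          # 0x55 = 0b01010101
print(frame)                      # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
print(hex(uart_unframe(frame)))   # 0x55
```

Real UART hardware does this shifting with registers rather than lists, but the bit order and framing are the same.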

Frame Protocol

One key feature available in the UART yet not fully used is the implementation of a frame protocol. Its main use and importance is the added security and protection it gives each device.

For instance, when a device connects to a UART without checking the frame protocol configuration, it may end up connected to the wrong pins or misinterpret the data, which can cause malfunctions in the system.

On the other hand, implementing this ensures security because of the need to parse the information received in alignment with the design frame protocol. Each frame protocol is specifically designed to be unique and secure.

In designing a frame protocol, designers can set the desired headers and trailers, including CRC, to different devices. In Figure 13, two (2) bytes are set as part of the header.


Figure 13. Sample UART frame protocol.

Based on the sample, you can set a header, trailer, and CRC that are unique to your device.

Header 1 (H1 is 0xAB) and Header 2 (H2 is 0xCD)

Header is the unique identifier that determines if you are communicating with the correct device.

Command (CMD) Selection

The command will depend on the list of commands designed to create the communication between the two devices.

Data Length (DL) per Command

Data length is based on the command chosen, so it can vary with the selection. The data length can be adjusted accordingly.

Data n (Varying Data)

Data is the payload to be transferred from devices.

Trailer 1 (T1 is 0xE1) and Trailer 2 (T2 is 0xE2)

Trailers are data added after the transmission ends. Just like the header, they can be uniquely identified.

Cyclic Redundancy Checking (CRC Formula)

The cyclic redundancy check formula is an added error-detecting mechanism for catching accidental changes to raw data. The CRC value computed by the transmitting device must always equal the CRC computed on the receiver’s end.

It is advisable to add security by implementing frame protocols for each UART device. The frame protocol needs identical configurations on both the transmitting and receiving devices.
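A minimal sketch of the sample frame protocol in Figure 13, using the header and trailer bytes given above. The article does not specify a CRC algorithm, so a single XOR checksum byte stands in for a real CRC here:

```python
H1, H2 = 0xAB, 0xCD   # header bytes from the sample protocol
T1, T2 = 0xE1, 0xE2   # trailer bytes

def checksum(data):
    """Stand-in for the CRC: XOR of all body bytes. A real design would
    use a proper CRC polynomial agreed on by both devices."""
    crc = 0
    for b in data:
        crc ^= b
    return crc

def build_packet(cmd, payload):
    body = [cmd, len(payload)] + list(payload)   # CMD, DL, Data n
    return [H1, H2] + body + [checksum(body), T1, T2]

def parse_packet(pkt):
    """Receiver side: reject anything that does not match the protocol."""
    if pkt[:2] != [H1, H2] or pkt[-2:] != [T1, T2]:
        return None                              # wrong device / framing
    body, crc = pkt[2:-3], pkt[-3]
    if checksum(body) != crc or body[1] != len(body) - 2:
        return None                              # corrupted in transit
    return body[0], body[2:]                     # (cmd, payload)

pkt = build_packet(0x01, [0x10, 0x20])
print(pkt)                 # header, CMD, DL, payload, checksum, trailer
print(parse_packet(pkt))   # (1, [16, 32])
```

The parsing step is what provides the protection described above: bytes arriving from a device that does not share the same header, trailer, and checksum convention are simply rejected.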

UART Operations

When using any hardware communication protocol, it’s a prerequisite to check the data sheet and hardware reference manual.

Here are the steps to follow:

First: Check the data sheet interface of the device.

Figure 14. Microcontroller data sheet.

Figure 15. Microcontroller memory map.

Second: Under the memory map, check the UART address.

Third: Check the specific details of the UART port, such as the operation mode, data bit length, parity bit, and stop bits. Sample UART port details from a data sheet:

The sample MCUs provide a full-duplex UART port, which is fully compatible with PC standard UARTs. The UART port provides a simplified UART interface to other peripherals or hosts, supporting full-duplex, DMA, and asynchronous transfer of serial data. The UART port includes support for five to eight data bits, and none, even, or odd parity. A frame is terminated by one and a half or two stop bits.

Fourth: Check the UART operation details, including the baud rate computation. Baud rate is configured using the following sample formula. This formula varies depending on the microcontroller.

Sample details of UART operations:

  • 5 to 8 data bits
  • 1, 2, or 1 and ½ stop bits
  • None, even, or odd parity
  • Programmable oversample rate of 4, 8, 16, or 32
  • Baud rate = PCLK/((M + N/2048) × 2^(OSR+2) × DIV)


OSR (oversample rate): UART_LCR2.OSR = 0 to 3
DIV (baud rate divider): UART_DIV = 1 to 65535
M (DIVM, fractional baud rate M): UART_FBR.DIVM = 1 to 3
N (DIVN, fractional baud rate N): UART_FBR.DIVN = 0 to 2047

Fifth: For the baud rate, make sure to check which peripheral clock (PCLK) to use. In this example, 26 MHz and 16 MHz PCLKs are available. Notice that OSR, DIV, DIVM, and DIVN vary per device.

Table 2. Baud Rate Example Based on 26 MHz PCLK

Baud Rate OSR DIV DIVM DIVN
9600 3 24 3 1078
115200 3 4 1 1563

Table 3. Baud Rate Example Based on 16 MHz PCLK

Baud Rate OSR DIV DIVM DIVN
9600 3 17 3 1078
115200 3 2 2 348
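The register values in Tables 2 and 3 can be sanity-checked against the sample baud rate formula above. A quick sketch using the 26 MHz PCLK rows (remember the formula varies per microcontroller):

```python
def uart_baud(pclk, osr, div, m, n):
    """Sample baud rate formula: PCLK / ((M + N/2048) * 2^(OSR+2) * DIV)."""
    return pclk / ((m + n / 2048) * 2 ** (osr + 2) * div)

# Table 2 rows (26 MHz PCLK): OSR, DIV, DIVM, DIVN
print(uart_baud(26e6, 3, 24, 3, 1078))   # ~9600
print(uart_baud(26e6, 3, 4, 1, 1563))    # ~115200
```

The computed rates come out within a fraction of a percent of the nominal 9600 and 115200 baud, which is well inside the tolerance discussed earlier.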

Sixth: The next step is to check the detailed registers for UART configuration. Look at the parameters used in computing the baud rate, such as UART_LCR2, UART_DIV, and UART_FBR. Table 4 points to the specific registers to cover.

Table 4. UART Register Descriptions

Name Description
UART_DIV Baud rate divider
UART_FBR Fractional baud rate
UART_LCR2 Second line control

Seventh: Under each register, check the details and substitute the values to compute the baud rate, then start implementing the UART.

Why Is It Important?

Familiarity with the UART communication protocol is advantageous when developing robust, quality-driven products. Knowing how to send data using only two wires, as well as how to transport a whole pack of data or a payload, will help ensure that data is transferred and received without error. Since UART is the most commonly used hardware communication protocol, this knowledge can enable design flexibility in future designs.

Use Cases

You can use UART for many applications, such as:

  • Debugging: Early detection of system bugs is important during development. Adding UART can help in this scenario by capturing messages from the system.
  • Manufacturing function-level tracing: Logs are very important in manufacturing. They determine functionalities by alerting operators to what is happening on the manufacturing line.
  • Customer or client updates: Software updates are highly important. Having complete, dynamic hardware with update-capable software is important to having a complete system.
  • Testing/verification: Verifying products before they leave the manufacturing process helps deliver the best quality products possible to customers.


“Basics of UART Communication.” Electronics Hub, July 2017.

Campbell, Scott. “Basics of UART Communication.” Circuit Basics.

Keim, Robert. “Back to Basics: The Universal Asynchronous Receiver/Transmitter.” All About Circuits, December 2016.

“What Is UART Protocol? UART Communication Explained.” Arrow.

Eric Peña is a senior firmware engineer and part of the Design and Layout Team working with the Consumer Software Engineering Group at Analog Devices. He joined ADI in Cavite, Philippines in April 2019. He graduated from Adamson University in Manila with a bachelor’s degree in computer engineering. Eric previously worked at Technology Enabler Designer as a firmware engineer and also as a systems engineer at Fujitsu Ten Solutions. He can be reached at eric.pena@analog.com.
Mary Grace Legaspi is a firmware engineer and part of the Design and Layout Team working with the Consumer Software Engineering Group at Analog Devices. She joined ADI in Cavite, Philippines in September 2018. She graduated from Tarlac State University with a bachelor’s degree in electronics engineering. She is currently studying toward a Master of Management at the University of the Philippines. She can be reached at mary.legaspi@analog.com.

Related Contents:

For more Embedded, subscribe to Embedded’s weekly email newsletter.

The post Understanding the UART appeared first on Embedded.com.


MediaTek Dimensity 1100 and 1200 5G mobile SoCs reach up to 3GHz

2021-01-20 12:01 CNXSoft Jean-Luc Aufranc (CNXSoft)

MediaTek has just unveiled not one, but two premium 5G SoCs with Dimensity 1100 and...


Little Bee is an affordable, open hardware current & magnetic field probe (Crowdfunding)

2021-01-20 10:59 CNXSoft Jean-Luc Aufranc (CNXSoft)

Little Bee is an affordable, open-source hardware, and high-performance current probe and magnetic field probe...


Trump pardons Levandowski

2021-01-20 08:49 ElectronicsWeekly David Manners

Anthony Levandowski (pictured) the autonomous driving pioneer who downloaded 14,000 Waymo files before moving to Uber Technologies, has been pardoned by outgoing US president Donald Trump. In August Levandowski was sentenced to 18 months in jail after pleading guilty to the theft. Levandowski was also ordered to pay Google $756,499 in restitution and was fined ...

This story continues at Trump pardons Levandowski

Or just read more coverage at Electronics Weekly


Renesas RZ/G2L MPUs Feature Cortex-A55 & Cortex-M33 Cores for AI Applications

2021-01-20 08:25 CNXSoft Saumitra Jagdale

Renesas Electronics Corporation announced RZ/G2L MPUs, allowing enhanced processing for an extensive variety of AI...


Semiconductor R&D spend to rise 4%

2021-01-20 08:23 ElectronicsWeekly David Manners

Semiconductor industry R&D spending will rise 4% in 2021, after a record 2020, says IC Insights. Intel stays on top of the R&D ranking, but its share of total industry R&D expenditures dipped after its spending decreased 4% in 2020. AMD moved into the R&D top 10. Research and development spending by semiconductor companies worldwide ...

This story continues at Semiconductor R&D spend to rise 4%

Or just read more coverage at Electronics Weekly


Valetudo is a cloud-free web interface for robot vacuum cleaners

2021-01-20 08:07 CNXSoft Jean-Luc Aufranc (CNXSoft)

In my review of Kyvol Cybovac S31 LDS smart robot vacuum cleaner, I noted that...


2021 IC industry to grow 18%

2021-01-20 07:30 ElectronicsWeekly David Manners

The chip industry will grow 18% this year to reach a market size of $520.6 billion, forecast Malcolm Penn (pictured), CEO of Future Horizons, at IFS 2021 yesterday. Three of the four leading market indicators – demand, capacity, and prices – are pointing to a big growth year, the only exception being the general economy ...

This story continues at 2021 IC industry to grow 18%

Or just read more coverage at Electronics Weekly


Leti researchers use RRAM for ML

2021-01-20 07:11 ElectronicsWeekly David Manners

CEA-Leti scientists have demonstrated a machine-learning technique exploiting what have been previously considered as “non-ideal” traits of resistive-RAM (RRAM) devices, overcoming barriers to developing RRAM-based edge-learning systems. The research team demonstrated how RRAM, or memristor, technology can be used to create intelligent systems that learn locally at the edge, independent of the cloud. The learning algorithms ...

This story continues at Leti researchers use RRAM for ML

Or just read more coverage at Electronics Weekly


OneWeb massively scales back constellation plans

2021-01-20 07:06 ElectronicsWeekly Alun Williams

OneWeb is reducing the planned size of its satellite constellation, down from 47,884 to 6,372 satellites. The Low Earth Orbit (LEO) satellite communications company – which is part owned by the UK government since June 2020, along with Bharti Global and the SoftBank Group – informed the US Federal Communications Commission (FCC) of the reduction ...

This story continues at OneWeb massively scales back constellation plans

Or just read more coverage at Electronics Weekly


On Semi supplies Quuppa locating system for BLE SoC

2021-01-20 07:00 ElectronicsWeekly David Manners

ON Semiconductor is supplying the Quuppa Intelligent Locating System for the RSL10, the Flash-based BLE radio SoC. Provided in a CMSIS-Pack format, the solution allows manufacturers to design ultra-low-power indoor asset tracking applications with Direction Finding features and advanced Angle of Arrival (AoA) technology. The Quuppa Intelligent Locating System is a platform for location-based services and ...

This story continues at On Semi supplies Quuppa locating system for BLE SoC

Or just read more coverage at Electronics Weekly