Attitude shift towards the Internet of Things and smart homes

The AIoT, or “Artificial Intelligence of Things,” is a technology ecosystem that emerged during the pandemic, and the modern smart home has developed alongside it.

AIoT combines connected devices (the IoT) with the artificial intelligence (AI) embedded in those devices.

The past 12 months have been tough. The pandemic has wreaked havoc around the world and people are now realizing that Covid-19 is here to stay.

We now accept this fact and seek ways to adapt our lives and our interactions with the world. To ensure that people lead safe, productive and happy lives, governments, industries and businesses are constantly changing the status quo.

People have had to change how and where they work. Over the past year, working from home has become the norm, and many businesses will continue to allow remote work as long as employees stay productive. This shift has put renewed emphasis on the value of our homes, and discussions around tech-enabled smart homes are more topical than ever.

Smart homes, and the technologies behind them, are still a very young industry. Last year, researchers surveyed electronics engineers to identify the barriers preventing AIoT from becoming a reality, and found significant issues at both the market level and the device level. The researchers then repeated the study a year later to see how things had improved.

AI's reliance on data raises security concerns: the more information a device ingests, the smarter it is. Engineers found that processing data locally can resolve many of these privacy issues. Homes can keep their data within their own walls instead of sharing it with third parties in the cloud, and simply minimizing the data handed to third parties reduces the risk of leakage.
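The local-first idea above can be sketched in a few lines. This is a hypothetical illustration, not a real smart-home API: raw sensor readings stay on the device, and only a coarse summary is ever prepared for a third party.

```python
# Hypothetical sketch of local-first processing: raw sensor data stays on
# the device; only a minimal summary ever leaves the home. The readings,
# function names, and payload shape are illustrative assumptions.
raw_readings = [21.4, 21.6, 22.0, 21.8]  # kept in local storage only


def local_summary(readings):
    """Reduce raw data to the minimum a third party ever sees."""
    return {"avg_temp": round(sum(readings) / len(readings), 1)}


# Only this aggregate would ever be shared outside the home.
payload_for_cloud = local_summary(raw_readings)
print(payload_for_cloud)  # {'avg_temp': 21.7}
```

The design choice is simply data minimization: the less raw data that crosses the home boundary, the less there is to leak.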


If a smart home stores its data locally, a remote cybercriminal would have to become an ordinary burglar to steal it. Although that scenario is unlikely, device manufacturers should still ensure that data handling on their devices is secure.

Device-level security features such as secure key storage, accelerated encryption, and true random number generation deliver much better protection for both data and decision-making.
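As a rough illustration of why a device-local key matters, here is a minimal sketch of authenticating a sensor reading with an HMAC. It is an assumption-laden toy: on a real AIoT device the key would sit in secure key storage and randomness would come from a hardware true random number generator, not Python's `secrets` module.

```python
import hashlib
import hmac
import secrets

# Stand-in for a key held in secure key storage on the device.
device_key = secrets.token_bytes(32)


def sign_reading(reading: bytes, key: bytes) -> bytes:
    """Tag a reading so tampering in transit is detectable."""
    return hmac.new(key, reading, hashlib.sha256).digest()


def verify_reading(reading: bytes, tag: bytes, key: bytes) -> bool:
    # compare_digest avoids leaking information through timing.
    return hmac.compare_digest(sign_reading(reading, key), tag)


reading = b"temp=21.5C"
tag = sign_reading(reading, device_key)
print(verify_reading(reading, tag, device_key))        # True
print(verify_reading(b"temp=99.9C", tag, device_key))  # False
```

The point of the sketch: because the key never leaves the device, a recipient can trust a reading only if it was signed on that device.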

Last year, engineers identified connectivity as a significant barrier to deploying AI. Today, only 27% of industry professionals see connectivity as a significant barrier to the technology, though 38% remain concerned about its ability to overcome latency issues. Home healthcare monitoring, for example, cannot afford to be hampered by poor connectivity when making decisions about potentially life-threatening events such as heart attacks. On-device processing, however, makes network latency irrelevant.

If the industry wants to develop applications that do not suffer from latency, it should move to on-device computing. Some AIoT chips can now make decisions in nanoseconds, allowing products to think quickly and act with precision.
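The healthcare example above can be made concrete with a toy decision loop. This is a hypothetical sketch: the window size, the alert threshold, and the readings are all illustrative assumptions, not medical guidance. The only point it demonstrates is that the decision happens entirely on the device, with no cloud round trip in the critical path.

```python
# Hypothetical on-device heart-rate check: the decision is made locally,
# so network latency never delays an alert. Threshold is an assumption.
from statistics import mean

WINDOW = 5        # number of recent samples per decision
ALERT_BPM = 150   # illustrative alert threshold, not clinical advice


def on_device_decision(readings: list[int]) -> str:
    """Decide from the last few samples without leaving the device."""
    recent = readings[-WINDOW:]
    return "ALERT" if mean(recent) > ALERT_BPM else "OK"


print(on_device_decision([72, 75, 74, 73, 76]))       # OK
print(on_device_decision([150, 160, 170, 165, 172]))  # ALERT
```

A cloud-first version of the same loop would add a network round trip (and a failure mode) between each reading and each decision, which is exactly the latency cost the article argues against.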


Engineers also pointed to scaling as an issue last year. The number of connected devices continues to grow, which puts additional pressure on cloud infrastructure; in 2020, about 25% of engineers saw scalability as a barrier to edge AI's success. Experts, however, are beginning to recognize the scalability benefits inherent in edge-based IoT.

With computing done at the edge, the cloud is no longer a limiting factor, negating potential scaling and growth issues. Today, fewer than a fifth of engineers think cloud infrastructure can hold back edge AI.

The good news? The electronics industry no longer needs the cloud to ensure the scalability of the IoT. One of the major technical barriers to IoT expansion, the need for cloud computing to manage billions of additional devices and petabytes of data, has now been eliminated.

Increased processing capability, reduced power consumption

The AIoT market has grown over the past year, and the technology has matured. On-device AI processing capabilities have improved while power requirements and cost have fallen. Chipmakers can now tailor chips to different AIoT needs at an affordable price point.

How can engineers transition to using AIoT chips as a realistic option for product manufacturers?

The development environment is a critical consideration. New chip architectures often mean immature and untested proprietary programming platforms that engineers have to learn and get comfortable with.

Instead, engineers should look for chips that support the industry-standard methods they already know: full programmability and runtime environments such as FreeRTOS, TensorFlow Lite, and C. With familiar platforms, engineers can program chips quickly without learning new languages, tools, or techniques.

A single programming environment that can handle all of an IoT system's computing requirements is essential. That capability will be key to achieving the design speed needed to bring fast, secure AI into the home in the new post-Covid era.

Image credit: Kindel Media; Pexels

Deanna Ritchie

Editor-in-chief at ReadWrite

Deanna is the editor of ReadWrite. Previously, she worked as an editor for Startup Grind and has over 20 years of experience in content management and development.
