Internet of Things Asked by acp_ on August 23, 2021
With many IoT devices connecting to the internet to store information in large databases, I was wondering how they receive data back from those databases. For example, if I have a smart thermostat that I can control over the internet, how does it know when I have updated its configuration on whatever web service it uses? My initial thought was to just send an HTTP request to an API every so often, maybe every five to thirty seconds. One problem I can see with this is that each device would be sending literally thousands of requests a day, which could be quite expensive on the server side of things. Is there an industry standard for this? Is MQTT used as opposed to HTTP?
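To put a number on the polling concern in the question, here is a quick back-of-the-envelope sketch (the 100,000-device fleet size is a hypothetical, not from the question):

```python
# Cost of naive HTTP polling as described above: one request every
# 5-30 seconds, per device, around the clock.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def requests_per_day(poll_interval_s: int) -> int:
    """Number of polls a single device issues per day at a fixed interval."""
    return SECONDS_PER_DAY // poll_interval_s

for interval in (5, 30):
    per_device = requests_per_day(interval)
    fleet = per_device * 100_000  # hypothetical fleet of 100k thermostats
    print(f"poll every {interval:2d}s -> {per_device:,} req/day/device, "
          f"{fleet:,} req/day across the fleet")
```

Even at the slow end (every 30 s) that is 2,880 requests per day per device, and 17,280 at the fast end, so "literally thousands" is accurate, and it multiplies linearly with fleet size.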
Assume your IoT device (e.g. a Raspberry Pi or other embedded Linux computer, or a micro:bit) has some TCP/IP connectivity.
Then your device could access (over TCP/IP, perhaps through Wi-Fi or Li-Fi) remote databases such as PostgreSQL or MongoDB running on some more or less distant server.
See also this draft report, the CHARIOT, DECODER, and VESSEDIA European projects, and the CompCert compiler.
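A minimal sketch of the pattern this answer describes: the device pushes a sensor reading into a database. An in-memory sqlite3 database stands in here for the remote PostgreSQL/MongoDB server so the snippet is self-contained; with PostgreSQL the same SQL would go through a network client library such as psycopg2. The table and column names are illustrative assumptions.

```python
# Device-side sketch: record a temperature reading into a database.
# sqlite3 in-memory is a stand-in for a remote SQL server.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE readings (
                  device_id TEXT,
                  taken_at  REAL,
                  temp_c    REAL)""")

def record_reading(device_id: str, temp_c: float) -> None:
    """What the device (or its gateway) would do after taking a measurement."""
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)",
                 (device_id, time.time(), temp_c))
    conn.commit()

record_reading("thermostat-42", 21.5)
rows = conn.execute("SELECT device_id, temp_c FROM readings").fetchall()
print(rows)
```

Note that this covers the upstream direction (device writes data); the question of how *configuration changes* reach the device is what the pub/sub answer below the fold addresses.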
Answered by Basile Starynkevitch on August 23, 2021
I think @hardillb is covering the details pretty well. I'd like to provide some insight regarding the overall architecture.
Regardless of what the RDBMS crowd keeps repeating, once data volumes exploded back in the early 2000s there was an uprising of "NoSQL" data stores. RDBMSs do a lot of things well, but they are far from being the only game in town by now. The game we are playing now is "data ingestion", where you see things like AWS Kinesis, Apache Kafka, Spark, Redshift, Elasticsearch... and that's mostly just for ingestion and storage - taking the data from the devices into the cloud.
Due to inertia, it happens that (some of) those newer systems support SQL as well, but make no mistake: they are not RDBMSs.
Completing the picture, you have container orchestration tech (e.g. Kubernetes), which means you can automatically scale your server infrastructure up and down in real time. I recall a presentation where the speaker claimed their core count varied by 10k CPU cores per second, so I'm inclined to say that at cloud level this problem is pretty well understood.
Answered by MaxDZ8 on August 23, 2021
Yes, in many cases pub/sub protocols that are initiated from the device to the cloud (like MQTT) are used.
With things like persistent subscriptions and message queuing for offline clients, updates/commands will still reach a device when it comes back online, and the same connection gives the device a way to publish its state and any sensor readings back up to the cloud as well.
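The routing in a pub/sub broker like MQTT hinges on hierarchical topic filters. Below is a minimal, pure-Python sketch of MQTT-style filter matching (`+` matches exactly one topic level, `#` matches everything below); the thermostat topic names are hypothetical, not from any real product's API, and a real broker enforces additional rules (e.g. `#` only as the last level) that this sketch omits:

```python
# Minimal MQTT-style topic-filter matcher. '+' matches one topic level,
# '#' matches the remainder of the topic.

def topic_matches(filter_: str, topic: str) -> bool:
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":                      # matches this level and all deeper ones
            return True
        if i >= len(t_parts):             # filter is longer than the topic
            return False
        if f != "+" and f != t_parts[i]:  # literal level must match exactly
            return False
    return len(f_parts) == len(t_parts)

# The thermostat subscribes to its own command topic...
assert topic_matches("home/thermostat/42/cmd/+",
                     "home/thermostat/42/cmd/setpoint")
# ...while the backend subscribes to state from every device:
assert topic_matches("home/thermostat/+/state/#",
                     "home/thermostat/42/state/temperature")
```

Because the device opens the connection outbound to the broker and then just waits for matching publishes, no polling loop and no inbound firewall hole are needed; a setpoint change published by the web service arrives as soon as the device is connected.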
Answered by hardillb on August 23, 2021