With IoT the latest buzzword in technology, many innovative and disruptive IoT startups have emerged in the past few years. With products and solutions ranging from solving consumer pain points to optimizing enterprise operations and maximizing revenue, these startups have successfully shown how the future of IoT could take shape in India.
Tuesday, 10 January 2017
Thursday, 2 June 2016
Soli: A Google project for touchless interaction
Technology has already changed the world in incredible ways, and this may be yet another milestone. Google envisions Soli changing how we interact with the world without touching anything. The latest experiment comes from Google's Advanced Technology and Projects (ATAP) group in San Francisco, which is applying fine-grained motion control of hand gestures to the smartwatch, the tablet, the radio, medical equipment and other everyday appliances. ATAP put particular stress on the size of the chip: they designed Soli to be as small as possible so that it can be embedded in almost any device.
Physical buttons and switches were first replaced by input devices such as the mouse, keyboard and touchscreen, and these in turn can be replaced by gesture control. Project Soli takes a particular approach: radar. The chip has no rotating parts, can be embedded inside devices, and eliminates the need for a button. A signal emitted from the chip strikes a moving object, such as the hand we gesture with, and is reflected back. The difference between the outgoing and incoming signal encodes the distance and motion of the target, and signal processing and machine-learning techniques turn that into a recognized gesture. The Soli chip operates in the 60 GHz radar spectrum and has a refresh rate of up to 10,000 frames per second.
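As a rough sketch of the time-of-flight idea described above (this is not Soli's actual signal-processing pipeline, which also relies on Doppler analysis and machine learning), a radar sensor can turn a round-trip signal delay into a distance estimate:

```python
# Sketch: estimating range from a radar signal's round-trip time of flight.
C = 299_792_458.0  # speed of light in m/s

def range_from_delay(round_trip_s: float) -> float:
    """Distance to the reflecting object, given the round-trip delay."""
    return C * round_trip_s / 2.0  # halve it: the signal travels out and back

# A hand about 30 cm from the sensor reflects the signal in roughly 2 ns.
delay = 2 * 0.30 / C
print(range_from_delay(delay))  # ~0.30 m
```

At 60 GHz the wavelength is about 5 mm, which is what lets such a small chip resolve sub-millimeter finger motion.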
The chip can detect tiny hand motions: sliding the thumb along the side of the index finger raises or lowers the volume, a vertical swipe scrolls, and tapping the thumb and index finger together registers as a button press.
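Once a micro-gesture has been recognized, mapping it to a UI action is straightforward. A hypothetical dispatch table (all names here are invented for illustration, not Soli's API) might look like:

```python
# Hypothetical mapping from recognized micro-gesture labels to UI actions,
# following the examples in the article above.
GESTURE_ACTIONS = {
    "thumb_slide_along_index": "adjust_volume",
    "vertical_swipe": "scroll",
    "thumb_index_tap": "press_button",
}

def dispatch(gesture: str) -> str:
    """Return the action for a recognized gesture, or 'ignore' if unknown."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(dispatch("thumb_index_tap"))  # press_button
```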
Ivan Poupyrev noted that companies like Leap Motion and Intel already offer camera-based gesture technology, but those systems are cumbersome and require additional hardware, so Google built a miniature chip that can simply be embedded inside any device. He remarked on how simple a radar-emitting chip is: no camera tracking is needed, and you can place Soli wherever you want.
Because of Soli's compact size, it can be embedded in any device you want. Soli's gesture technology is a step toward a future of touchless interaction with devices.
Image credit: atap.google.com
Written by Radhika Soni
She believes and she visualizes. Loves to take risks and grab opportunities; curious about technology.
Monday, 18 April 2016
Brain-like chip, soon to be a reality
“The human brain, sitting on your shoulders is the most complicated object in the known universe”
Throughout history, people have compared the brain to different inventions; in the past it has been likened to a water clock and a telephone switchboard. These days, the favorite invention the brain is compared to is the computer. Some people use this comparison to argue that the computer is better than the brain; others argue that it shows the brain is better than the computer. Perhaps it is best to say that the brain is better at some jobs and the computer is better at others.
In an era when nanotechnology researchers talk of nanorobots that could change hair color to match our mood, a brain-like chip is no great stretch. Futurists have predicted that by 2020 most home computers would have the raw computing power of a human brain. That would not make them brains, but it would mean that, in terms of raw processing, they could handle bits as fast as a brain can.
So the question that arises is: how long will it take to develop a machine as smart as we are? The good news is that we no longer have to wait, because the first step toward this dream project has already been taken.
The year 2014 ended with a surprise. In August 2014, IBM said it had used conventional silicon manufacturing techniques to create what it calls a neurosynaptic processor, one that could rival a traditional supercomputer by handling highly complex computations while consuming no more power than a typical hearing-aid battery supplies. The chip is also one of the biggest ever built, boasting some 5.4 billion transistors, about a billion more than an Intel Xeon chip. To achieve this, the researchers designed the chip as a mesh network of 4,096 neurosynaptic cores. Each core contains elements that handle computing, memory and communication with other parts of the chip, and every core operates in parallel with the others.
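IBM's published TrueNorth figures of 256 neurons per core and 256 synapses per neuron multiply out to the chip's headline totals:

```python
# TrueNorth's published per-core figures multiplied out to chip totals.
cores = 4_096
neurons_per_core = 256
synapses_per_neuron = 256

neurons = cores * neurons_per_core          # 1,048,576 neurons
synapses = neurons * synapses_per_neuron    # 268,435,456 synapses
print(f"{neurons:,} neurons, {synapses:,} synapses")
```

That is where the "million neurons" figure cited below comes from.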
The human brain has a "clock speed" (neuron firing rate) measured in tens of hertz and a total power consumption of around 20 watts. A modern silicon chip, despite having features almost on the same tiny scale as biological neurons and synapses, can consume thousands or millions of times more energy to perform the same task as a human brain. As we move toward more advanced areas of computing, such as artificial general intelligence and big-data analysis, areas that IBM happens to be deeply involved with, it would really help to have a silicon chip capable of brain-like efficiency.
The chip would seem to represent a breakthrough in one of the long-term problems in computing: Computers are really good at doing math and reading words, but discerning and understanding meaning and context, or recognizing and classifying objects — things that are easy for humans — have been difficult for traditional computers. One way IBM tested the chip was to see if it could detect people, cars, trucks and buses in video footage and correctly recognize them. It worked.
In terms of complexity, the TrueNorth chip has a million neurons, about the same number as the brain of a common honeybee; a typical human brain averages 100 billion. But given time, the technology could be used to build computers that can not only see and hear but understand what is going on around them. Currently, the chip is capable of 46 billion synaptic operations per second (SOPS) per watt. That makes for a tricky apples-to-oranges comparison with a traditional supercomputer, where performance is measured in floating-point operations per second (FLOPS), but the most energy-efficient supercomputer now running tops out at about 4.5 billion FLOPS per watt.
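Taking the quoted figures at face value, and remembering that SOPS and FLOPS count different kinds of operations, a back-of-the-envelope comparison gives a sense of the efficiency gap:

```python
# Rough per-watt comparison of the efficiency figures quoted above.
# SOPS and FLOPS are different operations, so treat the ratio as indicative only.
truenorth_sops_per_watt = 46e9
supercomputer_flops_per_watt = 4.5e9  # most efficient supercomputer circa 2014

ratio = truenorth_sops_per_watt / supercomputer_flops_per_watt
print(f"~{ratio:.0f}x more operations per watt")  # ~10x
```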
Down the road, the researchers write in their paper, they foresee TrueNorth-like chips being combined with traditional systems, each solving the problems it is best suited to handle. It also means that systems that in some ways rival the capabilities of current supercomputers could fit into a machine the size of your smartphone while consuming even less energy.
The project was funded by the Department of Defense's research organization, and IBM collaborated with researchers at Cornell Tech and iniLabs. With TrueNorth's novel architecture, IBM is now a big step closer to building a brain on a chip, and that could be big news for the future of computing.
Article by Rishibha Tuteja
Last-minute blogger, fangirl by profession. A bibliophile by heart, tech enthusiast by choice.
She breathes dreams like air and can be reached at https://twitter.com/BibliophileRish
All photos in this article credit: research.ibm.com