I am writing this blog post because of a question that crossed my mind last night.
So how many ways are there to access a sensor, for example the gyroscope, on a smartphone or tablet? There are at least a couple of approaches:
You can use the sensor APIs in a native app. Take a look at some Android sensor topics, starting from here.
You can even hack into the system-level code and manipulate whatever you like.
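To make the first, native route concrete, here is a minimal sketch of the Android sensor API for reading the gyroscope. The class name GyroActivity is illustrative, not from the original post; the SensorManager calls are the standard framework API:

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

// Hypothetical activity name; a minimal sketch of native gyroscope access on Android.
public class GyroActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor gyroscope;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        gyroscope = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register only while the activity is visible; SENSOR_DELAY_NORMAL
        // is the slowest (and least power-hungry) of the standard rates.
        sensorManager.registerListener(this, gyroscope,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Unregister so the sensor stops draining the battery in the background.
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Angular speed around the device's x, y, z axes, in rad/s.
        float x = event.values[0], y = event.values[1], z = event.values[2];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Note that this code is tied to the Android framework, which is exactly the portability problem discussed next.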
However, does something feel inconsistent here? Suppose we use a set of sensor APIs to build an app for our smartphone. But which platform do we choose? iOS, Android, or something else? It matters: a developer has to spend extra time porting code from one platform to another.
On the other hand, the web is the trend, and mobility is the trend. Web technology gives us universal access to the information most people use daily. In my opinion, the web is the simplest way to exchange information on the Internet: you do not need to worry much about the platform, or at most you make minor changes to your code. So why not integrate sensor access into the web?
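In fact, browsers already expose some of this: device rotation is available to any web page through the DeviceOrientationEvent API, with no platform-specific code. A minimal browser-only sketch (the permission request is required on iOS Safari 13+; older browsers may deliver null angles):

```typescript
// Listens for device-orientation data exposed by the web platform.
function startOrientationTracking(): void {
  window.addEventListener("deviceorientation", (event: DeviceOrientationEvent) => {
    // alpha/beta/gamma are rotation angles in degrees; null if unsupported.
    console.log(`alpha=${event.alpha} beta=${event.beta} gamma=${event.gamma}`);
  });
}

// On iOS Safari 13+, the page must first ask permission from a user gesture
// (e.g. a button click); elsewhere requestPermission does not exist.
async function requestAndStart(): Promise<void> {
  const doe = DeviceOrientationEvent as any;
  if (typeof doe.requestPermission === "function") {
    const state: string = await doe.requestPermission();
    if (state !== "granted") return;
  }
  startOrientationTracking();
}
```

The same page works on both Android and iOS browsers, which is precisely the portability the native approach lacks.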
I used to think this was a big open issue that might drive some research papers; I even wanted to step into the field and propose a mechanism. Now those concerns seem to be gradually disappearing.
The third concern is energy. In my experiments, the sensors on a smartphone are extremely battery-hungry: they drain the battery at an unbelievable speed. If the design is not efficient enough, users will hate your web app.
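One concrete mitigation on Android is sensor batching: ask the hardware to buffer samples and deliver them in bursts, so the main processor can sleep between deliveries. A sketch of the relevant call, assuming a component that already holds a SensorManager and a SensorEventListener named listener (actual batching depends on the device having a hardware FIFO; this overload exists since API level 19):

```java
// samplingPeriodUs = 200_000: one sample every 200 ms.
// maxReportLatencyUs = 1_000_000: the driver may hold samples for up to 1 s
// and deliver them in a batch, letting the SoC sleep between bursts.
sensorManager.registerListener(
        listener,
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
        200_000,
        1_000_000);
```

Combined with unregistering the listener whenever the app is not visible, this goes a long way toward keeping a sensor-driven app from emptying the battery.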