
Track Your Smartphone in 2D With JavaScript

Imagine what you can do with this new medium of interaction

[Photo by the author.]

With a fundamental shift to the web, we are able to do really cool things right from our browser. In this tutorial, we will be utilising the Generic Sensor API to turn your smartphone into a pointer with real-time tracking.

Here’s what we will be making; you can try the demo at the Repl linked below.

Prerequisites

  • As of writing, the Generic Sensor API is not yet supported on iOS. In addition, some Android smartphones don’t have the required sensors. However, it is still possible to work through this tutorial with the simulated sensors found in Chrome DevTools (under More tools > Sensors, which lets you override the device orientation).
  • It is also possible to view Chrome’s console output from your Android smartphone via USB using remote debugging (chrome://inspect), although this requires some further setup.
  • The Generic Sensor API requires a secure context, so HTTPS is required. You can either work on localhost with the sensor simulator or use an online code editor (such as Repl.it) with your smartphone.

Note: Everything for the tutorial can be found at this Repl, where you can browse and edit the code as well as try out the demo.

Tracking Your Smartphone

Let’s start off with a generic controller.html file and corresponding controller.js script:
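The original embedded files aren’t preserved in this copy, so here is a minimal sketch of what they might look like. The file names match the tutorial; everything else is plain boilerplate:

```html
<!-- controller.html: a bare page that loads our script -->
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <title>Controller</title>
  </head>
  <body>
    <!-- Clicking anywhere on the body will later calibrate the pointer -->
    <script src="controller.js"></script>
  </body>
</html>
```

controller.js can start out empty; we will fill it in below.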

The Generic Sensor API supports multiple sensors. However, for our requirements, we will be using the AbsoluteOrientationSensor.

According to MDN web docs, the AbsoluteOrientationSensor is a sensor fusion API that “describes the device's physical orientation in relation to the Earth's reference coordinate system.”

By combining data from multiple real sensors, new virtual sensors can be implemented that combine and filter the data so that it’s easier to use — these are known as fusion sensors. In this case, data from the onboard magnetometer, accelerometer, and gyroscope are used for the AbsoluteOrientationSensor’s implementation.

Below is the code for interfacing with this virtual sensor:
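The original snippet isn’t preserved in this copy, but a minimal sketch looks like this (the 60 Hz frequency is an assumption; any supported rate works):

```javascript
// controller.js
// Create the fusion sensor, polling at 60 readings per second (assumed rate).
const sensor = new AbsoluteOrientationSensor({ frequency: 60 });

// sensor.quaternion holds the latest reading as [x, y, z, w].
sensor.addEventListener("reading", () => handleSensor(sensor.quaternion));
sensor.addEventListener("error", (e) => console.error(e.error));

function handleSensor(quaternion) {
  console.log(quaternion);
}

// Begin the reading process.
sensor.start();
```

And that’s it!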

First, the sensor object is initialised with a set frequency: the rate at which the sensor is read and the corresponding handleSensor callback is fired. Calling sensor.start() then begins the reading process.

After refreshing your page, move your phone around to see the following output. You should see a stream of quaternions:


But what are quaternions?

“Quaternions are a number system that extends the complex numbers.” — Wikipedia

They can be used as an alternative to Euler angles for describing the orientation of an object in space. Quaternions are used extensively in game development, as calculations with them are less computationally expensive.

However, for simplicity, let’s convert these to the more intuitive Euler angles. Following the conversion formula, below is a JavaScript implementation. As we are tracking in two dimensions, pitch has been omitted:
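Here is a sketch of that conversion. The helper name quaternionToEuler is ours; the Generic Sensor API supplies the quaternion as [x, y, z, w]:

```javascript
// Convert a quaternion [x, y, z, w] to Euler angles in radians.
// Pitch is omitted, as we are only tracking in two dimensions.
function quaternionToEuler([x, y, z, w]) {
  // Roll: rotation around the x-axis.
  const roll = Math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y));
  // Yaw: rotation around the z-axis.
  const yaw = Math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z));
  return { roll, yaw };
}
```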

Update your handleSensor function to print the converted Euler angles using the function above:
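Assuming the helper above, the callback might now read:

```javascript
function handleSensor(quaternion) {
  const { roll, yaw } = quaternionToEuler(quaternion);
  console.log(`roll: ${roll.toFixed(2)}, yaw: ${yaw.toFixed(2)}`);
}
```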


You should see the angle output in radians, which should change intuitively as you rotate your phone around. Below are the dimensions we will be using to track our pointer in 2D.

[Diagram by the author: the dimensions used to track the pointer in 2D.]

We have successfully interfaced with the Generic Sensor API to obtain the sensor data required to track your smartphone’s orientation in real time. We now need to translate these changing angles into movement projected on the screen.

But first, we need a method of calibration — setting an initial start position from which all distances are measured:
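A minimal sketch, assuming the conversion helper above (the variable names currentAngles and startAngles are ours):

```javascript
// The latest Euler angles, updated on every sensor reading.
let currentAngles = { roll: 0, yaw: 0 };
// The calibrated start orientation; null until the user calibrates.
let startAngles = null;

// Clicking anywhere on the page stores the current orientation
// as the origin from which all distances are measured.
document.body.addEventListener("click", () => {
  startAngles = { ...currentAngles };
});
```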

When you click on the controller page body, the current orientation of the phone is set as the start point from which all angles — and therefore distances — are measured.

[Diagram by the author: angles measured from the calibrated start point.]

To calculate the relative distance moved, we apply simple trigonometry to the change in angle from the start point.

When taking differences, we need a way to wrap angles around at the 180° and -180° boundary so that the result stays correct.

Here’s the code:
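This is a sketch under the assumptions above; which angle maps to which screen axis may differ from the original:

```javascript
// Wrap an angle difference into [-π, π] so that crossing the
// 180°/-180° boundary doesn't produce a huge jump.
function wrapAngle(angle) {
  while (angle > Math.PI) angle -= 2 * Math.PI;
  while (angle < -Math.PI) angle += 2 * Math.PI;
  return angle;
}

// Project the change in angle onto the screen plane.
function calculateDistance(angles, start) {
  const dx = wrapAngle(angles.yaw - start.yaw);
  const dy = wrapAngle(angles.roll - start.roll);
  // 800 is the virtual distance from the controller to the canvas.
  return {
    x: Math.tan(dx) * 800,
    y: Math.tan(dy) * 800,
  };
}
```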

Note: The number 800 in the final calculation determines the virtual distance of the controller from the canvas. This has no direct physical meaning; in practice, it can be used to tune the sensitivity of movement.

Update your handleSensor function to print out the calculated distance using the function above:
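Continuing the sketch:

```javascript
function handleSensor(quaternion) {
  currentAngles = quaternionToEuler(quaternion);
  // Only report distances once a start point has been calibrated.
  if (startAngles) {
    console.log(calculateDistance(currentAngles, startAngles));
  }
}
```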

And that’s it!

You now have the ability to track your smartphone’s movement in real-time and translate it into a distance measurement for on-screen movement.

Pointing With Your Smartphone

With a simple Node.js server, some SocketIO magic, and an HTML canvas element, the distance measurements above can turn your smartphone into a digital pointer with support for multiple controllers.

Note: As this article focuses on utilising the Sensor API, SocketIO explanations have been glossed over, although the code should be self-explanatory. For more information, take a look at the documentation.

The server

This simple server serves our HTML pages from the public directory and sends controller data to all connected web clients via SocketIO for the pointers to be rendered on canvas.

By storing a list of connected controller clients, it is possible to add support for multiple controllers.

As a result, the array of controllers and corresponding distances moved are sent to all connected clients at a constant rate with setInterval.
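Putting those pieces together, here is one possible shape for the server. Express, port 3000, and the event names (controller, distance, controllers) are assumptions for illustration, not necessarily the original’s choices:

```javascript
// server.js: a minimal sketch
const express = require("express");
const http = require("http");
const { Server } = require("socket.io");

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// Serve controller.html and the canvas page from ./public.
app.use(express.static("public"));

// Latest distance for each connected controller, keyed by socket id.
const controllers = {};

io.on("connection", (socket) => {
  // A client announces itself as a controller.
  socket.on("controller", () => {
    controllers[socket.id] = { x: 0, y: 0 };
  });

  // A controller streams its latest calculated distance.
  socket.on("distance", (distance) => {
    if (controllers[socket.id]) controllers[socket.id] = distance;
  });

  socket.on("disconnect", () => {
    delete controllers[socket.id];
  });
});

// Broadcast all controller distances to every client at ~60 Hz.
setInterval(() => {
  io.emit("controllers", Object.values(controllers));
}, 1000 / 60);

server.listen(3000, () => console.log("Listening on port 3000"));
```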

The controller

The controller reads sensor data, calculates distances, and sends them to the server to be broadcast. Most of this was covered above; the only addition is the communication with our SocketIO server.
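A sketch of the additions to controller.js, mirroring the event names assumed in the server sketch above:

```javascript
// Requires the Socket.IO client, e.g. <script src="/socket.io/socket.io.js">.
const socket = io();

// Announce ourselves as a controller so the server tracks us.
socket.emit("controller");

function handleSensor(quaternion) {
  currentAngles = quaternionToEuler(quaternion);
  if (startAngles) {
    // Stream each calculated distance to the server for broadcasting.
    socket.emit("distance", calculateDistance(currentAngles, startAngles));
  }
}
```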

The digital canvas


The canvas page renders the calculated distances received from the server as circular pointers on a canvas.

Most of it is self-explanatory, but let’s look at the draw function:
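A sketch of the canvas script (the element ID, colours, and event name are assumptions carried over from the sketches above):

```javascript
const canvas = document.getElementById("canvas");
const ctx = canvas.getContext("2d");
const colours = ["#e74c3c", "#3498db", "#2ecc71", "#f1c40f"];

// Latest distances for every connected controller, pushed by the server.
let controllers = [];
const socket = io();
socket.on("controllers", (data) => (controllers = data));

function draw() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);

  controllers.forEach((distance, i) => {
    // Offset so a zero distance sits at the centre of the canvas
    // rather than the top-left corner, where canvas coordinates begin.
    const x = canvas.width / 2 + distance.x;
    const y = canvas.height / 2 + distance.y;

    // Render the pointer as a circle in the next available colour.
    ctx.beginPath();
    ctx.arc(x, y, 10, 0, 2 * Math.PI);
    ctx.fillStyle = colours[i % colours.length];
    ctx.fill();
  });

  // Ask the browser to run draw again before the next repaint.
  requestAnimationFrame(draw);
}

draw();
```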

To support multiple controllers, the controller array is iterated over, and each distance is offset so that the pointer starts at the centre of the canvas rather than at the top-left corner, where the canvas coordinate system begins.

These are then rendered as circles using the Canvas API with the next available colour. For more information, see the docs.

Finally, the requestAnimationFrame function is called to tell the browser to perform the previous operation again before the next repaint. This cycle continues, allowing the pointer to move in real time with your smartphone.

Conclusion

If you have a working smartphone pointer, congratulations! But what next?

Maybe a tool of communication? A smarter alternative to a laser pointer? Or perhaps something like Paintr, a collaborative digital canvas where your smartphone becomes your paintbrush.

This is a new medium of interaction with something we all carry around in our pockets. The possibilities with this kind of setup are endless.

Thanks for reading. Let me know your thoughts below.

