Sensors

The document discusses various sensors commonly found in smartphones, including ambient light sensors, proximity sensors, GPS, accelerometers, compasses, gyroscopes, and back-illuminated sensors. It provides examples and brief explanations of how each sensor works and what it is used for.

Uploaded by Nisha Daniel

By Asad-Uj-Jaman

MDI Bangladesh
Sensor technology in smartphones is growing tremendously and will expand dramatically over the next few years. The success of smartphones is driving an increasing number of MEMS devices and sensors into mobile phones, to provide new features and services to end users, to reduce cost through greater integration, or to improve hardware performance. Today, I am going to begin a brief, basic discussion of these sensor technologies, one by one.
Ambient light sensor (ALS):
Adding an ambient light sensor to portable devices such as tablets, smartphones, and laptops extends battery life and enables easy-to-view displays that are optimized to the environment. Because of the spectral sensitivity of the human eye, modern ambient light sensors accept an incomplete match to the photopic CIE curve [Prashanth Holenarsipur (Jun 20, 2011), "Ambient-Light Sensing Optimizes Visibility and Battery Life of Portable Displays"] and instead use the principle of superposition to calculate ambient brightness. Most light sensors on the market today use two or more types of photodiodes, each sensitive to a different portion of the light spectrum. By combining these photodiode outputs mathematically, each with a suitably adjusted gain, the sensor can output a fairly accurate measurement of ambient brightness for commonly available light sources. In short, an ambient light sensor adjusts the display brightness based on how much ambient light is present, which in turn saves battery power. As an example, the TSL2x72 series [Anonymous (July 11, 2011), "Sensors for Smartphone detect ambient light, proximity"] provides ambient light sensing (ALS) and proximity detection in environments over 60,000 lux (direct sunlight), creating the dynamic range of operation necessary for mobile products to adapt to all lighting conditions. That dynamic range is enabled by the device's programmable gain modes, including a reduced-gain mode. Proximity-detection features include high SNR performance and a programmable register that allows compensation for optical crosstalk.
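The gain-combination idea above can be sketched in a few lines. This is a minimal illustration assuming a hypothetical two-channel sensor; the channel arrangement and gain coefficients are made up for the example, not taken from any real datasheet:

```java
// Sketch of the superposition step: a broadband (visible + IR) photodiode
// channel minus a weighted IR-only channel approximates the eye's response.
// GAIN_BROADBAND and GAIN_IR are illustrative values only.
class AmbientLightEstimator {
    static final double GAIN_BROADBAND = 0.46;
    static final double GAIN_IR = 0.54;

    /** Combine raw photodiode counts into an approximate brightness in lux. */
    static double estimateLux(double broadbandCount, double irCount) {
        double lux = GAIN_BROADBAND * broadbandCount - GAIN_IR * irCount;
        return Math.max(lux, 0.0); // brightness cannot be negative
    }
}
```

Real parts such as the TSL2x72 family typically expose the raw channel counts to the host processor, and the manufacturer's datasheet documents the exact lux equation and gain settings to use.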
Proximity Sensor:
A proximity sensor is very useful in a smartphone. It detects how close the screen of the phone is to your body, which allows the phone to sense when you have brought it up to your ear. At that point the display turns off to save battery, and the phone stops detecting touches, to avoid unwanted input, until you take it away from your ear. The proximity sensor can also help in detecting towers and sources of interference, so that they can be amplified or filtered out using beam-forming or other techniques. In the case of the iPhone, the proximity sensor shuts off the screen and touch-sensitive circuitry when the iPhone is brought close to the face, both to save battery and to prevent unintentional touches. In a word, the proximity sensor senses how close the phone is to the user's cheek or face, so that it can pause whatever activity it is in the middle of (playing music or browsing the web, for example) while the user takes a phone call. When the phone is removed from the ear after the call, it resumes its previous activity.
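The near/far decision described above can be sketched as a simple predicate. This assumes the common convention where the sensor reports a distance in centimeters and any reading below the sensor's maximum range counts as "near"; many phone sensors are effectively binary, reporting either 0 or their maximum range:

```java
// Sketch: deciding when to blank the screen from a proximity reading.
// distanceCm is the sensor's reported distance; maxRangeCm its maximum range.
class ProximityPolicy {
    /** True when the phone is close enough to a face to turn the screen off. */
    static boolean shouldBlankScreen(float distanceCm, float maxRangeCm) {
        return distanceCm < maxRangeCm;
    }
}
```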
Global Positioning System (GPS):
GPS was originally intended for military applications, but in the 1980s the U.S. government made the system available for civilian use. It provides navigation tracking, often with a map in the background, showing where you have been and allowing routes to be preprogrammed, giving a line you can follow on the smartphone's screen. GPS satellites circle the earth twice a day in a very precise orbit and transmit signal information to earth. GPS receivers take this information and use triangulation to calculate the user's exact location. [Anonymous, November 22, 2011. "What is GPS?"] As an example, iPhone models use A-GPS, or Assisted GPS, which in basic terms accesses an intermediary server when it is not possible to connect directly via satellite (indoors, for example); this server provides the nearest satellite with additional information that makes it possible to determine a user's position more accurately. The iPhone 3G, 3GS, and 4 employ A-GPS, and the iPhone 3GS and 4 also have a digital compass. The iPhone 4S supports the GLONASS global positioning system in addition to GPS.
Accelerometer:
The accelerometer allows a smartphone to detect its orientation and adapt the content to suit the new orientation. For example, when you rotate your device sideways, the web browser automatically switches the screen to landscape mode so that you have a wider viewing space. Similarly, the camera relies on the accelerometer to tell it whether you are taking a picture in portrait or landscape mode. The accelerometer in smart devices measures the acceleration of the device relative to free fall. A value of 1 indicates that the device is experiencing 1 g of force (1 g being the gravitational pull of the earth, which your device experiences when it is stationary). The accelerometer measures the acceleration of the device along three different axes: X, Y, and Z. [Wei-Meng Lee (May 11, 2010), "Using the Accelerometer on the iPhone"]
As an example of this sensor in the iPhone: a 3-axis accelerometer senses the orientation of the phone and changes the screen accordingly, allowing the user to easily switch between portrait and landscape mode. Photo browsing, web browsing, and music playing support both upright and left or right widescreen orientations. Unlike the iPad, the iPhone does not rotate the screen when turned upside down, with the Home button above the screen, unless the running program has been specifically designed to do so. The 3.0 update added landscape support for still other applications, such as email, and introduced shaking the unit as a form of input. The accelerometer can also be used to control third-party apps, notably games. The iPhone 4 also includes a gyroscopic sensor, enhancing its perception of how it is moved.
Compass:
Traditional compasses align with the earth's poles using magnets, but a modern smartphone does not use magnets: magnetic interference would render the smartphone's cellular capabilities useless, since once introduced, magnetic interference drops signal strength parabolically across frequencies and ranges consistent with the GSM bands. As an example, the AK8973 chip in the iPhone is a small sensor that "listens" for an ultra-low-frequency signal. If that signal comes from a specific direction such as North, then, paired with the accelerometer, the device can calculate its orientation and direction. The compass in the iPhone 4 is the AKM AK8975, which is very similar to the AKM AK8973 in the iPhone 3GS. It senses orientation relative to the Earth's magnetic field using the Hall effect. The Hall effect occurs when [Elson Liu, 2006, MacBook Pro (Core Duo)] a magnetic field is applied transverse to a flowing current: the magnetic field deflects the moving charges that make up the current, inducing a voltage (called the Hall voltage) that is transverse to the current. The Hall voltage can then be measured and used to determine the strength of the component of the magnetic field that was transverse to the current. By using multiple sensors oriented in different directions, and by using a disk of high-permeability material called a magnetic concentrator to bend magnetic field lines that are parallel to the sensor plane so that they acquire a component perpendicular to the sensor plane, the device can measure the total magnetic field vector and therefore determine its orientation relative to that field.
Gyros:
A gyroscope is a device for measuring or maintaining orientation based on the principles of angular momentum. Gyroscopic sensors are used in navigation systems and gesture-recognition systems in smartphones and tablet PCs, where they help determine the position and orientation of the device; they are also used in wireless pointing devices, controlling the pointer based on the movement of the mouse itself. As a group of researchers noted, earlier this year the iPhone 4 became the first smartphone to boast a built-in gyroscope in addition to an accelerometer, proximity sensor, and ambient light sensor. Combining a gyroscope with an accelerometer allows the device to sense motion on six axes (left, right, up, down, forward, and backward, as well as roll, pitch, and yaw rotations), allowing for more accurate motion sensing, comparable to a game controller such as the Wii Remote. The iPhone 4 uses a MEMS (micro-electro-mechanical systems) gyroscope, but a newly developed optical gyroscope, small enough to fit on the head of a pin, could allow the integration of more accurate motion-sensing technology not only in smartphones but also in medical devices inside the human body.
A New Sensor (Back-illuminated sensor):
A back-illuminated sensor, also known as a backside-illumination (BSI or BI) sensor, is a type of digital image sensor that uses a novel arrangement of the imaging elements to increase the amount of light captured, thereby improving low-light performance. The technique was used for some time in specialized roles such as low-light security cameras and astronomy sensors, but it was complex to build and required further refinement to become widely used. Sony was the first to reduce these problems and their costs sufficiently to introduce a 5 Mpx, 1.75 µm BI CMOS sensor at general consumer prices in 2009. BI sensors from OmniVision Technologies have since been used in consumer electronics from other manufacturers, such as HTC's EVO 4G, and as a major selling point for the camera in Apple's iPhone 4. A back-illuminated sensor contains the same elements as a conventional one, but orients the wiring behind the photocathode layer by flipping the silicon wafer during manufacturing and then thinning its reverse side so that light can strike the photocathode layer without passing through the wiring layer.
As the applications and programs built on sensor technology in smartphones grow geometrically, this rapid growth may give us a new direction: a smart planet in our hands.

1) Android
Motion Sensors
The Android platform provides several sensors that let you monitor the motion of a device. Two
of these sensors are always hardware-based (the accelerometer and gyroscope), and three of
these sensors can be either hardware-based or software-based (the gravity, linear acceleration,
and rotation vector sensors). For example, on some devices the software-based sensors derive
their data from the accelerometer and magnetometer, but on other devices they may also use the
gyroscope to derive their data. Most Android-powered devices have an accelerometer, and many
now include a gyroscope. The availability of the software-based sensors is more variable because
they often rely on one or more hardware sensors to derive their data.
Motion sensors are useful for monitoring device movement, such as tilt, shake, rotation, or
swing. The movement is usually a reflection of direct user input (for example, a user steering a
car in a game or a user controlling a ball in a game), but it can also be a reflection of the physical
environment in which the device is sitting (for example, moving with you while you drive your
car). In the first case, you are monitoring motion relative to the device's frame of reference or
your application's frame of reference; in the second case you are monitoring motion relative to

the world's frame of reference. Motion sensors by themselves are not typically used to monitor
device position, but they can be used with other sensors, such as the geomagnetic field sensor, to
determine a device's position relative to the world's frame of reference (see Position Sensors for
more information).
All of the motion sensors return multi-dimensional arrays of sensor values for each
SensorEvent. For example, during a single sensor event the accelerometer returns acceleration
force data for the three coordinate axes, and the gyroscope returns rate of rotation data for the
three coordinate axes. These data values are returned in a float array (values) along with other
SensorEvent parameters. Table 1 summarizes the motion sensors that are available on the
Android platform.
Android Open Source Project Sensors

The Android Open Source Project (AOSP) provides three software-based motion sensors: a
gravity sensor, a linear acceleration sensor, and a rotation vector sensor. These sensors were
updated in Android 4.0 and now use a device's gyroscope (in addition to other sensors) to
improve stability and performance. If you want to try these sensors, you can identify them by
using the getVendor() method and the getVersion() method (the vendor is Google Inc.; the
version number is 3). Identifying these sensors by vendor and version number is necessary
because the Android system considers these three sensors to be secondary sensors. For example,
if a device manufacturer provides their own gravity sensor, then the AOSP gravity sensor shows
up as a secondary gravity sensor. All three of these sensors rely on a gyroscope: if a device does
not have a gyroscope, these sensors do not show up and are not available for use.
Using the Accelerometer

An acceleration sensor measures the acceleration applied to the device, including the force of
gravity. The following code shows you how to get an instance of the default acceleration sensor:
private SensorManager mSensorManager;
private Sensor mSensor;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

Conceptually, an acceleration sensor determines the acceleration that is applied to a device (Ad)
by measuring the forces that are applied to the sensor itself (Fs) using the following relationship:
Ad = -∑Fs / mass

However, the force of gravity is always influencing the measured acceleration according to the
following relationship:
Ad = -g - ∑F / mass

For this reason, when the device is sitting on a table (and not accelerating), the accelerometer
reads a magnitude of g = 9.81 m/s2. Similarly, when the device is in free fall and therefore
rapidly accelerating toward the ground at 9.81 m/s2, its accelerometer reads a magnitude of g = 0
m/s2. Therefore, to measure the real acceleration of the device, the contribution of the force of
gravity must be removed from the accelerometer data. This can be achieved by applying a high-pass filter. Conversely, a low-pass filter can be used to isolate the force of gravity.
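The low-pass/high-pass split described above is commonly written as a simple exponential filter. A minimal sketch, where ALPHA is an illustrative smoothing factor chosen for the example rather than a value mandated by Android:

```java
// Sketch: isolating gravity with a low-pass filter, then recovering linear
// acceleration by subtracting it (the high-pass step described above).
class GravityFilter {
    static final float ALPHA = 0.8f;        // illustrative smoothing factor
    static float[] gravity = new float[3];  // running low-pass estimate

    /** Feed one accelerometer sample (x, y, z); returns gravity-free values. */
    static float[] removeGravity(float[] accel) {
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * accel[i]; // low-pass
            linear[i] = accel[i] - gravity[i];                        // high-pass
        }
        return linear;
    }
}
```

After many samples of a stationary, flat device, gravity converges toward roughly 9.81 m/s2 on the vertical axis and the returned linear values converge toward zero.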

Using the Gravity Sensor


The gravity sensor provides a three-dimensional vector indicating the direction and magnitude of
gravity. The following code shows you how to get an instance of the default gravity sensor:
private SensorManager mSensorManager;
private Sensor mSensor;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);

The units are the same as those used by the acceleration sensor (m/s2), and the coordinate system
is the same as the one used by the acceleration sensor.
Note: When a device is at rest, the output of the gravity sensor should be identical to that of the
accelerometer.

Using the Gyroscope


The gyroscope measures the rate of rotation in rad/s around a device's x, y, and z axes. The
following code shows you how to get an instance of the default gyroscope:
private SensorManager mSensorManager;
private Sensor mSensor;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);

The sensor's coordinate system is the same as the one used for the acceleration sensor. Rotation
is positive in the counter-clockwise direction; that is, an observer looking from some positive
location on the x, y or z axis at a device positioned on the origin would report positive rotation if
the device appeared to be rotating counter clockwise. This is the standard mathematical
definition of positive rotation and is not the same as the definition for roll that is used by the
orientation sensor.
Usually, the output of the gyroscope is integrated over time to calculate a rotation describing the
change of angles over the timestep.
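The integration step mentioned above can be sketched as a single Euler step per sample; real sensor-fusion code also compensates for gyroscope drift, which this simplification ignores:

```java
// Sketch: integrating a rate-of-rotation reading (rad/s) over one timestep
// to accumulate the change in angle around one axis.
class GyroIntegrator {
    /** One Euler integration step: angle += rate * dt. */
    static double step(double angleRad, double rateRadPerSec, double dtSeconds) {
        return angleRad + rateRadPerSec * dtSeconds;
    }
}
```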

Using the Linear Accelerometer


The linear acceleration sensor provides you with a three-dimensional vector representing
acceleration along each device axis, excluding gravity. The following code shows you how to get
an instance of the default linear acceleration sensor:
private SensorManager mSensorManager;
private Sensor mSensor;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);

Conceptually, this sensor provides you with acceleration data according to the following
relationship:
linear acceleration = acceleration - acceleration due to gravity

You typically use this sensor when you want to obtain acceleration data without the influence of
gravity. For example, you could use this sensor to see how fast your car is going. The linear
acceleration sensor always has an offset, which you need to remove. The simplest way to do this
is to build a calibration step into your application. During calibration you can ask the user to set
the device on a table, and then read the offsets for all three axes. You can then subtract that offset
from the acceleration sensor's direct readings to get the actual linear acceleration.
The sensor coordinate system is the same as the one used by the acceleration sensor, as are the
units of measure (m/s2).
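The calibration step described above might be sketched like this; the class and method names are illustrative, not part of the Android API:

```java
// Sketch: removing the linear-acceleration sensor's constant offset.
// Record the readings while the device rests on a table, then subtract
// them from every subsequent sample.
class LinearAccelCalibrator {
    private final float[] offset = new float[3];

    /** Call once while the device is flat and motionless. */
    void calibrate(float[] restingSample) {
        System.arraycopy(restingSample, 0, offset, 0, 3);
    }

    /** Subtract the stored offsets from a raw sensor reading. */
    float[] correct(float[] raw) {
        return new float[] {
            raw[0] - offset[0], raw[1] - offset[1], raw[2] - offset[2]
        };
    }
}
```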

Using the Rotation Vector Sensor


The rotation vector represents the orientation of the device as a combination of an angle and an
axis, in which the device has rotated through an angle around an axis (x, y, or z). The following
code shows you how to get an instance of the default rotation vector sensor:
private SensorManager mSensorManager;
private Sensor mSensor;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);

The three elements of the rotation vector are expressed as follows:

x*sin(θ/2)
y*sin(θ/2)
z*sin(θ/2)

where the magnitude of the rotation vector is equal to sin(θ/2), and the direction of the rotation vector is equal to the direction of the axis of rotation.

Figure 1. Coordinate system used by the rotation vector sensor.


The three elements of the rotation vector are equal to the last three components of a unit quaternion (cos(θ/2), x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)). Elements of the rotation vector are unitless. The x, y, and z axes are defined in the same way as for the acceleration sensor. The reference coordinate system is defined as a direct orthonormal basis (see figure 1). This coordinate system has the following characteristics:

X is defined as the vector product Y × Z. It is tangential to the ground at the device's current location and points approximately East.

Y is tangential to the ground at the device's current location and points toward the geomagnetic North Pole.

Z points toward the sky and is perpendicular to the ground plane.
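Because the vector's magnitude is sin(θ/2), the rotation angle can be recovered directly from the three elements. A small sketch (Android's SensorManager also provides getRotationMatrixFromVector for the full conversion to a rotation matrix):

```java
// Sketch: recovering the rotation angle θ from the rotation vector
// (x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)).
class RotationVectorMath {
    /** Returns θ in radians, given the three rotation-vector elements. */
    static double angle(double x, double y, double z) {
        double sinHalf = Math.sqrt(x * x + y * y + z * z); // |v| = sin(θ/2)
        return 2.0 * Math.asin(Math.min(sinHalf, 1.0));    // clamp for rounding
    }
}
```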

The Android SDK provides a sample application that shows how to use the rotation vector
sensor. The sample application is located in the API Demos code (OS - RotationVectorDemo).

2) JAVA
Working with the Mobile Sensor API
by Vikram Goyal

This article explains how the Mobile Sensor API (JSR 256 for Java Platform, Micro
Edition) enables Java MIDlets to talk to the physical and virtual sensors available on a
device.
Published September 2010

Downloads:
Java ME
The Mobile Sensor API (JSR 256 for Java Platform, Micro Edition [Java ME]) provides the
ability for Java MIDlets to talk to the physical and virtual sensors available on the device. This is
a particularly useful API, because it can provide the MIDlet the means to gauge the health not
only of the device but also of the outside physical environment, which makes some very
interesting applications possible. Of course, as with all APIs, the implementation of this API by
the device manufacturer determines the actual sensors that are available.
In this article, I provide an introduction to this API and some simple code to determine the
available sensors on a device. I then create an example MIDlet to figure out the current tilt of a
device using the accelerometer. The tilt orientation (axisX, axisY, and axisZ) can help Java
devices figure out which way a user is holding the device in order to manipulate the user
interface accordingly.
Introduction to the Mobile Sensor API

The Mobile Sensor API grew out of JSR 256. The API provides ways to fetch and monitor data
from physical and virtual sensors uniformly. The simplest example of a sensor is a battery
monitor, which tells the MIDlet the level of battery charge. Using the API, you can make
informed decisions in your MIDlet about what the battery sensor is telling you, if you first create
a connection to the sensor and then register to listen to events originating from this sensor.
A sensor might or might not allow the user to control it via a programmatic interface. For
example, if a physical sensor connected to a device allows the user to measure the current
temperature, it might also provide controls to physically start, stop, or calibrate the thermometer.
To this end, the Mobile Sensor API is divided into two parts:

The sensor part, covered by the package javax.microedition.sensor

The control part, covered by the package javax.microedition.sensor.control

The sensor part is solely responsible for getting information out of the sensor (physical or
virtual). The control part provides the interfaces for the controls.

Defining Sensors

Sensors, within the context of the Mobile Sensor API, are defined using two parameters:

Quantity

Context

Think of quantity as the actual item that is being measured, for example, the battery charge. The
context (or to use the correct term, context type) is the context within which that quantity is being
measured or monitored, for example, the device. So, if a sensor were to provide you the battery
charge of a device, the quantity that you are monitoring or measuring is the battery charge, and
the context is the device.
If, for argument's sake, you were interested in the battery charge of an electric vehicle (through a sensor connected to your device), you would be using the sensor defined with the quantity of battery charge and the context of vehicle. Continuing with our silly analogy, if a user has a measurable battery-charge sensor, the quantity would be battery charge, and the context would be user. There is one last context type, called ambient, which is used to define something being measured within the surrounding environment. (I could say "sensing the battery charge of the surrounding environment," but that would be too silly.)
So there are four context types that are used to define a sensor:

Ambient

User

Device

Vehicle

If the context types define the "WHERE" of sensors, the quantity defines the "WHAT." The
sensor will measure (what?) the battery charge (where?) on the device.
As you might expect, there can be hundreds of quantities, many of which are standardized by the
API and defined in the javax.microedition.sensor.SensorInfo interface. Companies can
introduce their own quantities, as long as they can keep them separate from existing ones on the
device.
The API specifies that the information about a sensor be encapsulated in the SensorInfo interface. This interface provides information about a sensor and must provide the quantity being measured, the context type, the type of connection to the sensor (embedded, wired, short-range wireless, or remote), the sensor model, the URL to the sensor, the sensor's description, and, finally, meta-information about the sensor in terms of channels (unit, scale, ranges, and so on). A lot of information is required of a SensorInfo object. All of it is mandatory, per the API, and must not be null or empty. (Unfortunately, as with most API definitions, this requirement is contradicted within the API itself in several places; it seems that the only properties that are really required are the quantity and context type.) Besides these, the API also defines some optional parameters.
The final piece of the puzzle for how a sensor is defined is solved by the SensorConnection
interface. The implementation of this interface provides the means to connect to the actual sensor
and collect data from it, which can be done either synchronously or asynchronously. As
expected, asynchronous notifications require you to implement a DataListener interface in your
receiver class. Two getData methods within this interface provide the mechanism to get the
actual data from the underlying sensor.
Finding Sensors

To identify and work with sensors, the API provides a manager class called SensorManager,
which has two methods to find sensors within a device:

findSensors(String quantity, String contextType) returns an array of SensorInfo objects that match the given quantity and context type.

findSensors(String url) requires you to know the URL of an existing sensor. URLs are defined similarly to HTTP URLs, with sensor replacing http. Therefore, sensor:battery_charge;context_type=device is a valid URL. This method returns an array of possible SensorInfo objects.
Finding All Sensors in a Device

To find all sensors available in a device, all you need to do is to call the findSensors(String
quantity, String contextType) method with null values for both parameters. This should
return all available sensors, in theory. The following code shows a MIDlet that prints out the
available sensors on a device.
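The original MIDlet listing is not reproduced in this copy. As a rough stand-in, the discovery-and-formatting step might look like the sketch below; on a device the lookup would be javax.microedition.sensor.SensorManager.findSensors(null, null), which is stubbed here (SensorInfoStub and describeAll are illustrative names, not part of JSR 256) so the sketch can run anywhere:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: what a sensor-finder MIDlet does with the SensorInfo[] array that
// SensorManager.findSensors(null, null) returns. SensorInfoStub stands in
// for javax.microedition.sensor.SensorInfo.
class SensorFinderSketch {
    static class SensorInfoStub {
        final String quantity, contextType;
        SensorInfoStub(String quantity, String contextType) {
            this.quantity = quantity;
            this.contextType = contextType;
        }
    }

    /** One printable line per discovered sensor, as a paint method might draw. */
    static List<String> describeAll(SensorInfoStub[] sensors) {
        List<String> lines = new ArrayList<>();
        for (SensorInfoStub s : sensors) {
            lines.add(s.quantity + "; context_type=" + s.contextType);
        }
        return lines;
    }
}
```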
SensorCanvas searches for the sensors in the searchSensors method and uses the paint method to draw information about the sensors on the screen. Before creating SensorCanvas, the MIDlet checks whether the current device supports the Mobile Sensor API by trying to find the microedition.sensor.version system property. The MIDlet does not proceed further if this property is null.

Handling Sensor Information

Once you find a sensor you are interested in, how do you handle the data that it produces? As
mentioned in an earlier section, you use the SensorInfo implementation for that sensor and after
getting a SensorConnection to the sensor, you use this connection to get the data, either directly
or through a listener.
The actual data is encapsulated in the Data interface. Since the data sent by a sensor might not be a single unit of measurement, the data comes in channels. For example, although the data sent by a thermometer sensor would be a single unit (temperature), the data sent by a sensor measuring the tilt of the device will have at least three channels (X, Y, and Z) that together make up the data sent by the sensor. Therefore, the retrieved data is an array of Data objects, each with its own channel. The ChannelInfo interface provides methods to query information about the channel, such as the name, data type, accuracy, and so on.
The Data interface provides three methods to retrieve the value of the data:

getIntValues

getDoubleValues

getObjectValues

You can use the ChannelInfo.getDataType method to get the type of the data that is expected.
Using the information that we have so far, it is easy to create a simple MIDlet that echoes the
current tilt of the device. The following code snippet shows the TiltCanvas class that captures
and shows this information.
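The TiltCanvas snippet itself is missing from this copy. The core of its dataReceived callback maps the three channels to axes; in the sketch below the JSR 256 Data objects are reduced to their latest double values so the logic can run anywhere (TiltState is an illustrative name, and on a device you would implement javax.microedition.sensor.DataListener instead):

```java
// Sketch: the channel-to-axis mapping a TiltCanvas.dataReceived method
// performs. Channel 0 -> X, 1 -> Y, 2 -> Z, as observed on the
// Java ME SDK emulator.
class TiltState {
    double axisX, axisY, axisZ;

    /** Receives the latest value from each of the three channels. */
    void dataReceived(double[] latestPerChannel) {
        axisX = latestPerChannel[0];
        axisY = latestPerChannel[1];
        axisZ = latestPerChannel[2];
    }
}
```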
As you can see, the tilt canvas needs to look for the tilt sensor using the keyword acceleration for the quantity and SensorInfo.CONTEXT_TYPE_DEVICE for the context type. This is the only combination that works in the Java ME SDK 3.0; if you run this code on other devices or emulation platforms, you will need to find the correct values with the SensorFinder MIDlet described earlier. The canvas implements the dataReceived method to capture the data events sent by the acceleration sensor attached to the emulator. Since we know which channel is for which axis (we can find this out by trial and error, as I did, or from the documentation that comes with the emulator), we can assign the channel data to the correct axis. In this case, data[0] is axis X, data[1] is axis Y, and data[2] is axis Z.

Conclusion

This article introduced you to the Mobile Sensor API (JSR 256), which has been around for quite some time and is starting to gain a lot of traction in applications for the Java ME platform. We looked at the details of this API and its interfaces, and we learned how to find all the sensors within a device or emulator. Finally, we learned how to use the information returned by our sensors, using a simple example that retrieves data from an accelerometer to figure out the current tilt of a device.

3) MathWorks Mobile Sensor Connectivity Team.


If you own a smartphone, you're probably holding a device with a number of sensors that are constantly collecting information from their surroundings. Many of the apps on your phone use these various sensors to operate. Now you can read from your phone's sensors using MATLAB! If you have R2013a or newer and an Android phone or an iPhone/iPad, you will want to check this out.
After installing a free app on your Android or iPhone/iPad, you can receive sensor data from your phone directly in your MATLAB session. The interface is written using MATLAB classes and provides a simple way to read sensor data from your phone.
Establish a connection with your phone:
sensor = sensorgroup('AndroidMobile')
sensor =
sensorgroup logging data from Android device on port 60000
Measurements: (showLatestValues)
Acceleration, Orientation, Latitude, Longitude, Altitude, Speed, MagneticField

Read the latest values from the sensors:

showLatestValues(sensor)

Measurement      Latest Values                    Units     Log Size
-------------    ------------------------------   -------   --------
Acceleration     4.51       -3.30      -8.16      m/s^2     <1x3>
MagneticField    -3.56e-06  -2.49e-05  1.11e-05   Tesla     <2x3>
Orientation      27.08      27.14      -19.39     degrees   <2x3>

Waiting for: Latitude, Longitude, Altitude, and Speed.

Get specific measurements:


sensor.Orientation

ans =

   20.6081   34.4375  -15.0781

The entry also comes with a MATLAB app that shows a live view of the data coming in, as well
as an example of using the magnetic field information from the phone to rotate a 3D plot.
