Movement part 1 — OS

This is part one of a ten-part series about building movement-aware applications.

Introduction

This article focuses on Android and iOS when talking about the evolution from location to movement because, at the time of writing, the two together represent a 99.7% market share.

App development, as we know it today, began with the general release of these two Operating Systems to developers in 2008. A few hundred apps were published on the platforms at the time of release. At the start of 2018, there are 5 million apps for users to choose from.

Location-based services were made available to developers from the first release. Device sensors and OS APIs have evolved by leaps and bounds since then, along with the collective imagination of developers.

Apple’s iOS and Google’s Android do a great job of generating location and activity data for developers to consume. However, developers are left to do a lot more to generate, manage and consume this data to build product experiences around the movement of app users. Let’s take a deeper look.

Beginning

In the thick of the 2008 financial crisis that led to a global recession, a new world order was born with Apple and Google releasing Operating Systems for developers to build apps on.

The App Store was born on 10 July 2008. iOS 2.0 was released the next day along with the iPhone 3G. Developers had basic access to device location from the day they started developing apps for iPhones.

A few months later, on 23 Sep 2008, Android 1.0 was released to developers. API Level 1 gave developers basic access to device location from the day they started developing apps for Android devices.

Now

In this section, we take a closer look at the current state of the OS. Specifically, what the OS tells the device user about location services, movement features offered by the OS to device users, location and movement APIs offered by the OS to developers, and OS privacy considerations. Let’s go one OS at a time starting with iOS.

iOS

What iOS tells device user about location services

Take a look at this screen recording on iOS 11 for: Settings => Privacy => Location Services => About Location Services & Privacy…

The top paragraph in the iOS note says “Location Services allows Apple and third-party apps and websites to gather and use information based on the current location of your iPhone or Apple Watch to provide a variety of location-based services. For example, an app might use your location data and location search query to help you find nearby coffee shops or theaters, or your device may set its time zone automatically based on current location.”

This paragraph refers to use cases that use location at a point-in-time, such as when you perform a local search or when you switch your device on.

Movement features offered by iOS to device users

When you scroll further down the screen, you will see new additions that Apple has made over the last few releases. These include Traffic, Popular Near Me, Significant Locations, Location-Based Apple Ads, Location-Based Suggestions, Location-Based Alerts, Share My Location and HomeKit. These are services offered by Apple using information about user movement sent by the device in “an anonymous and encrypted form to Apple”.

Let’s take a closer look at one of these features called Significant Locations. Here is a screen recording of the feature on an iOS 11 device. Get there from Settings => Privacy => Location Services => System Services => Significant Locations => Touch ID for security.

Apple’s description of the feature says: “Your iPhone will keep track of places you have recently been, as well as how often and when you visited them, in order to learn places that are significant to you. This data is transmitted end-to-end encrypted between your iCloud connected devices and will not be shared without your consent. It will be used to provide you with personalized services, such as predictive traffic routing, and to build better Photos Memories.”

APIs offered by iOS to developers

iOS organizes its APIs into Core Location Services and Core Motion Services. Let’s take a closer look.

Core Location Services

The centerpiece of Core Location services is CLLocationManager. This is “the object that you use to start and stop the delivery of location-related events to your app”.

The most common use of CLLocationManager is to subscribe to changes in the user’s location by specifying distance or time thresholds.
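
For illustration, here is a minimal Swift sketch of that subscription pattern, assuming an app that has already declared the location usage strings in its Info.plist; the 100-meter distance filter is an arbitrary choice:

```swift
import CoreLocation

final class LocationSubscriber: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.distanceFilter = 100 // meters the device must move before the next update is delivered
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    // Core Location calls this each time the device crosses the distance threshold
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        print("Moved to \(latest.coordinate.latitude), \(latest.coordinate.longitude)")
    }
}
```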

iOS added Geofencing to Core Location with iOS 5 in 2011, and when-in-use and always authorization levels with iOS 8 in 2014.
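
To make the geofencing piece concrete, here is a hedged Swift sketch of region monitoring; the coordinates, radius and identifier are placeholders, and a delegate is assumed to be set up as in the earlier snippet:

```swift
import CoreLocation

// Geofencing: monitor entry into and exit from a circular region.
// Region monitoring needs the "always" authorization level.
let manager = CLLocationManager()
manager.requestAlwaysAuthorization()

let center = CLLocationCoordinate2D(latitude: 37.3318, longitude: -122.0312) // placeholder coordinates
let geofence = CLCircularRegion(center: center, radius: 200, identifier: "office") // placeholder identifier
geofence.notifyOnEntry = true
geofence.notifyOnExit = true

if CLLocationManager.isMonitoringAvailable(for: CLCircularRegion.self) {
    manager.startMonitoring(for: geofence)
    // Entry/exit events arrive on the delegate via
    // locationManager(_:didEnterRegion:) and locationManager(_:didExitRegion:).
}
```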

Core Motion Services

The Core Motion services "process accelerometer, gyroscope, pedometer and environment-related events". CMMotionActivityManager, which tells us if the user is walking, running, driving or stationary, was introduced with iOS 7 in 2013. The iPhone 5s, launched with iOS 7, packed the powerful M7 Motion Coprocessor that reduced power consumption while acting as the sensor hub for the accelerometer, gyroscope and compass. Cycling activity was added later with iOS 8 in 2014.
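
As a rough Swift sketch of consuming these activity updates (it assumes the user has granted the Motion & Fitness permission discussed below):

```swift
import CoreMotion

let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        // Each CMMotionActivity carries booleans for the detected mode plus a confidence level.
        if activity.automotive {
            print("driving")
        } else if activity.cycling {   // cycling arrived with iOS 8
            print("cycling")
        } else if activity.running {
            print("running")
        } else if activity.walking {
            print("walking")
        } else if activity.stationary {
            print("stationary")
        }
    }
}
```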

CMMotionManager made raw accelerometer and gyroscope data available to developers with iOS 4 in 2010; magnetometer was added with iOS 5 in 2011.
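
A minimal sketch of reading raw accelerometer samples through CMMotionManager; the 0.1-second update interval is an arbitrary choice:

```swift
import CoreMotion

let motionManager = CMMotionManager()

if motionManager.isAccelerometerAvailable {
    motionManager.accelerometerUpdateInterval = 0.1 // seconds between samples (arbitrary choice)
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let acceleration = data?.acceleration else { return }
        // Raw values in g along the device's x, y and z axes.
        print("x: \(acceleration.x), y: \(acceleration.y), z: \(acceleration.z)")
    }
}
```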

CMPedometer is a more recent addition, arriving with iOS 8 in 2014 (alongside the launch of HealthKit and the Health app), and received significant updates with iOS 9 in 2015 (alongside the launch of Apple Watch).
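
For example, querying the step count accumulated since midnight might look like the following sketch:

```swift
import CoreMotion

let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
    let startOfDay = Calendar.current.startOfDay(for: Date())
    pedometer.queryPedometerData(from: startOfDay, to: Date()) { data, _ in
        guard let data = data else { return }
        // Steps and distance are accumulated by the motion coprocessor,
        // so the app does not need to track continuously to answer this query.
        print("Steps today: \(data.numberOfSteps), distance: \(data.distance ?? 0) meters")
    }
}
```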

Apple brought the motion manager closer to the silicon through the M-series motion coprocessor. This way, iOS does not need to wake the main CPU to collect movement data, which makes it possible to collect ambient motion data with minimal battery impact. As is evident the first time you set up the Health app, iOS maintains a historic count of steps and walking/running distances in persistent memory, while Android purges this data at restart.

CMMotionActivityManager and CMMotionManager were primarily consumed by health and fitness apps in the early days. The user permission on iOS that allows apps to use this data continues to be called “Motion & Fitness”. Using Core Motion to improve Core Location beyond fitness use cases is a recent phenomenon.

Observations about movement services by iOS

Some observations before we move on to the next section:

  • Apple’s communication of Location Services to iPhone users focuses on point-in-time features
  • Movement features offered by Apple are private and fairly hidden in the OS
  • APIs offered by iOS to developers are limited in comparison to what Apple is capable of offering
  • Much progress from location to movement has happened in the last few years, indicating strong demand from developers and improved capability of devices
  • Location and motion services are silos with minimal integration at the device level

Android

What Android tells device user about location services

Take a look at this screen recording on Android Oreo for: Settings => Security & Location => Location (under Privacy) => Help

Like in iOS, Location is under Privacy. The popular Help docs that talk about location are Manage location settings for apps, Turn location on or off for your device, and Manage or delete your Location History.

Unlike iOS that allows device users to turn Location Services on or off, Android additionally allows device users to choose from one of three location modes:

  • High accuracy: Use GPS, Wi-Fi, Bluetooth, or mobile networks to determine location
  • Battery saving: Use Wi-Fi, Bluetooth, or mobile networks to determine location
  • Device only: Use GPS and device sensors to determine location

The top paragraph in the Android Help note for Manage location settings for apps says “You can let your apps use your Android device’s location to do things for you or give you information. For example, apps can use your device’s location to check in, see commute traffic, or find nearby restaurants”.

The second paragraph in the Android Help note for Turn location on or off for your device says “When location is on for your device, you can get information based on where your Android device has been. For example, you can get automatic commute predictions or better search results.”

Like in iOS, the use cases refer to point-in-time features such as local search when talking about other apps using location. More sophisticated movement features remain private to Android and Google. Let’s take a closer look at the latter.

Movement features offered by Android to device users

Scrolling down the screen for Settings => Security & location => Location (under Privacy) reveals two location services: Google Location History and Google Location Sharing.

For parity with Significant Locations on iOS, let’s take a closer look at one of these features, Google Location History. Here is a screen recording of the feature on Android Oreo. The OS has a button called Manage Timeline that hands off to the Google Maps app.

You need to open the Your timeline view from the hamburger menu of Google Maps to see the places where the device user has been, along with activity segments that indicate the type of movement between these places. All of this data is organized chronologically through the day as a timeline.

Android does not provide direct access to this data through its location APIs. However, it provides lower level primitives with which developers can construct this data with significant effort.

APIs offered by Android to developers

Android organizes most movement related APIs under LocationServices, “the main entry point for location services integration.” Some APIs are available under android.hardware.

Two important APIs relevant to movement are FusedLocationProviderClient and ActivityRecognitionClient. Let’s take a closer look.

FusedLocationProviderClient

FusedLocationProviderClient is the “main entry point for interacting with the fused location provider”.

Like in iOS, the most common use of FusedLocationProviderClient is to subscribe to changes in the user’s location by specifying distance or time thresholds.

Android added Geofencing to LocationServices with Android 4.3 (Jelly Bean) in 2013. Location permissions in Android have remained ACCESS_COARSE_LOCATION and ACCESS_FINE_LOCATION since API Level 1, and there is no direct equivalent to when-in-use and always authorizations in iOS.

ActivityRecognitionClient

ActivityRecognitionClient is the “main entry point for interacting with the activity recognition”. requestActivityUpdates, which tells us if the user is walking, running, driving, cycling or stationary, was introduced with Android 4.3 (Jelly Bean) in 2013.

SensorManager in android.hardware made raw accelerometer data available to developers with API Level 1 in 2008. Gyroscope and magnetometer were added with API Level 3 in 2009. As the counterpart to CMPedometer in iOS, Sensor in android.hardware gained STRING_TYPE_STEP_COUNTER and STRING_TYPE_STEP_DETECTOR with Android 4.4 (KitKat) in 2013.

Observations about movement services by Android

Some observations before we move on to the next section:

  • Like iOS, Android’s communication of Location Services to Android users centers around point-in-time features to allay privacy concerns
  • Movement features offered by Android are private. Due to Google Maps, they are not as hidden from the user, yet remain features within an app that is primarily used for local search and navigation to places
  • Like iOS, APIs offered to developers by Android are limited in comparison to what Android and Google are capable of offering
  • Much progress from location to movement has happened in the last few years, indicating strong demand from developers and improved capability of devices
  • Location and motion services are silos with minimal integration at the device level, although a bit more than in iOS

Cloud

As we saw in the previous section, the OS APIs offered to developers are significantly restricted in comparison with movement features built by the OS for its own device users.

This is because movement features require a non-trivial device-to-cloud stack to bring them to life. Offering cloud-based services to developers is not a priority for the OS.

Beyond a point, the cloud is necessary to manage accuracy, real-time performance and battery efficiency.

Beyond a point, the cloud is necessary to put movement data in context of places, roads and traffic information in the world of maps.

The OS is in the business of providing the best possible APIs on the device. The primary considerations for the OS are:

  • to protect the user’s privacy and
  • to be efficient with resource consumption on the device.

The OS is not in the business of providing the best possible APIs in the cloud. The primary considerations for the developers are:

  • to power a great product experience and
  • to be efficient in building new features without having to build complex infrastructure

Privacy

Apple and Google are the two most valuable companies on earth at the time of writing this article.

Apple is in the business of selling devices and value-added-services on top. Google is in the business of selling ads with the best targeting of user intent.

Generating the most powerful movement information about their users directly aligns with their respective missions.

Managing and consuming this information in the cloud is important to them for private consumption across various use cases. However, opening up the management and consumption of movement data to developers in a use-case-ready manner opens a can of worms with regard to user privacy.

Think about it as a device user. You would throw a fit if you found out that the device OS was selling your information to third-party app developers. Consumers worldwide were surprised to learn that popular apps like Uber and Google Maps were able to track them even after the trip was complete.

In response to the consumer furor,

  • iOS 11 (2017) mandated that app developers offer a while-in-use permission option for getting location, and added a blue status bar to call out an app using background location with the while-in-use permission (see the sketch after this list)
  • Android 8.0 Oreo (2017) added background location limits that restrict how frequently an app in the background can receive location updates each hour
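
To illustrate what the iOS 11 change means on the developer side, here is a hedged Swift sketch; the Info.plist keys named in the comments are the standard usage-description keys, and the upgrade to always authorization is left commented out:

```swift
import CoreLocation

// Under iOS 11, an app that wants "always" access must also offer the when-in-use level
// and declare both usage strings in Info.plist:
//   NSLocationWhenInUseUsageDescription          - shown for while-in-use access
//   NSLocationAlwaysAndWhenInUseUsageDescription - shown when requesting always access
let manager = CLLocationManager()
manager.requestWhenInUseAuthorization() // start with the least privileged level
// Upgrade only when a background-location feature genuinely needs it:
// manager.requestAlwaysAuthorization()
```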

Conclusion

Developers want to build awesome product experiences with movement data of users of their apps. The OS does a great job of generating this data on the device.

This data is available to developers in a restricted way because:

  • The OS is limited to offering APIs on the device and not in the cloud. For instance, Android deprecated FusedLocationProviderApi in favor of FusedLocationProviderClient to go offline.
  • The role of the OS is to protect user privacy rather than to make it easier for developers to build movement features, even as the OS continues to build sophisticated movement features for its own consumption.

Developers are on their own to generate, manage and consume movement data. The OS is the starting point for generation.

Previous: Introduction

Next: Part 2—Accuracy

Original post on HyperTrack blog by Ravi Jain (Feb 26, 2018)
