Title:
SINGLE-INSTANCE MULTI-USER SUPPORT FOR VEHICLE OPERATING SYSTEMS
Document Type and Number:
WIPO Patent Application WO/2024/039996
Kind Code:
A1
Abstract:
A computing device comprising a memory and one or more processors may be configured to implement various aspects of the techniques. The memory may store an instance of a vehicle operating system, where the instance of the vehicle operating system facilitates concurrent access by multiple user profiles. The one or more processors may execute the instance of the vehicle operating system, which is configured to authorize the multiple user profiles to interface with the instance of the vehicle operating system; and present multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles. The multiple user interfaces may interface with multiple users associated with the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system and control functionality associated with the vehicle head unit.

Inventors:
KANTEK ANTONIO (US)
HEO YUN CHEOL (US)
GOEL SHASHANK (US)
AZUCENA OSCAR ARMANDO (US)
YERAVADEKAR MANJIRI (US)
LEME FELIPE (US)
JEONG DONGKYUN (US)
PARK KEUN YOUNG (US)
WANG XIANG (US)
Application Number:
PCT/US2023/072010
Publication Date:
February 22, 2024
Filing Date:
August 10, 2023
Assignee:
GOOGLE LLC (US)
International Classes:
B60W40/08; B60K35/00
Foreign References:
US20180357233A12018-12-13
US20190176625A12019-06-13
US20150232045A12015-08-20
US 63/371,445 P
US 63/371,451 P
Attorney, Agent or Firm:
GAGE, Matthew K. (US)
Claims:
WHAT IS CLAIMED IS:

1. A method comprising: executing, by a vehicle head unit, an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; authorizing, by the instance of the vehicle operating system, the multiple user profiles to interface with the instance of the vehicle operating system; presenting, by the instance of the vehicle operating system, multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and interfacing, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system to control functionality associated with the vehicle head unit.

2. The method of claim 1, wherein the instance of a vehicle operating system comprises a first instance of the vehicle operating system, wherein the method further comprises executing a second instance of the vehicle operating system, and wherein the first instance of the vehicle operating system includes an interface by which to communicate with the second instance of the vehicle operating system to facilitate concurrent access by the multiple user profiles.

3. The method of claim 2, wherein the vehicle head unit includes a high processing capacity processor that executes the first instance of the vehicle operating system, wherein the vehicle head unit includes a low processing capacity processor that executes the second instance of the vehicle operating system, and wherein the high processing capacity processor provides more processing capacity than the low processing capacity processor.

4. The method of any of claims 1 and 2, wherein the multiple displays are displaced about a cabin of a vehicle that includes the vehicle head unit.

5. The method of any of claims 1-4, wherein the multiple displays include two or more of: a first display integrated into an operator side of a front dashboard of a cabin of a vehicle that includes the vehicle head unit; a second display integrated into a center console of the front dashboard; a third display integrated into a passenger side of the front dashboard; and a fourth display integrated into a rear passenger compartment of the cabin.

6. The method of any of claims 1-5, wherein the multiple displays include one or more computing devices associated with at least one of the multiple users that is a passenger of a vehicle that includes the vehicle head unit.

7. The method of any of claims 1-6, wherein interfacing with the multiple users associated with the one or more of the multiple user profiles includes interfacing with the multiple users to control audio playback within one or more zones of a cabin of a vehicle that includes the vehicle head unit.

8. The method of claim 7, wherein the one or more zones of the cabin include one or more of a front operator zone, a front passenger zone, an operator-side rear passenger zone, and a passenger-side rear passenger zone.

9. The method of any of claims 7 and 8, wherein interfacing with the multiple users to control the audio playback includes interfacing with the multiple users to control audio volume in a single one of the one or more zones of the cabin.

10. The method of any of claims 7-9, wherein interfacing with the multiple users to control the audio playback includes interfacing with the multiple users to control a focus of audio playback in at least one of the one or more zones of the cabin.

11. The method of any of claims 7-10, further comprising capturing, by one or more audio capture devices communicatively coupled to the vehicle head unit, audio data representative of a soundfield at each of the one or more zones.

12. The method of claim 11, further comprising determining, based on the audio data representative of the soundfield occurring at each of the one or more zones, that a first user of the multiple users in a first zone of the one or more zones is speaking in an attempt to audibly interface with the vehicle head unit.

13. The method of any of claims 11 and 12, further comprising adjusting, based on the audio data representative of the soundfield occurring at each of the one or more zones, audio playback at the one or more zones.

14. The method of claim 13, wherein adjusting the audio playback includes: determining, based on the audio data representative of the soundfield occurring at each of the one or more zones, a noise level; and adjusting, based on the noise level, the audio playback at the one or more zones.

15. The method of claim 1, wherein interfacing with the multiple users associated with the one or more of the multiple user profiles includes interfacing, via a first user interface of the multiple user interfaces associated with a first user profile of the multiple user profiles, with a first user of the multiple users to interact with content presented by a second user interface of the multiple user interfaces associated with a second user profile of the multiple user profiles.

16. The method of claim 1, wherein interfacing with the multiple users associated with the one or more of the multiple user profiles includes interfacing, via a first user interface of the multiple user interfaces associated with a first user profile of the multiple user profiles, with a first user of the multiple users to share content presented by the first user interface with a second user interface of the multiple user interfaces associated with a second user profile of the multiple user profiles.

17. A computing device comprising: a memory configured to store an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; one or more processors configured to execute the instance of the vehicle operating system, wherein the instance of the vehicle operating system is configured to: authorize the multiple user profiles to interface with the instance of the vehicle operating system; and present multiple user interfaces across multiple displays communicatively coupled to the computing device, each of the multiple user interfaces associated with one or more of the multiple user profiles, wherein the multiple user interfaces are configured to interface with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system to control functionality associated with the computing device.

18. The computing device of claim 17, wherein the instance of a vehicle operating system comprises a first instance of the vehicle operating system, wherein the one or more processors are further configured to execute a second instance of the vehicle operating system, and wherein the first instance of the vehicle operating system includes an interface by which to communicate with the second instance of the vehicle operating system to facilitate concurrent access by the multiple user profiles.

19. The computing device of claim 18, wherein the one or more processors include a high processing capacity processor that executes the first instance of the vehicle operating system, wherein the one or more processors include a low processing capacity processor that executes the second instance of the vehicle operating system, and wherein the high processing capacity processor provides more processing capacity than the low processing capacity processor.

20. A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause one or more processors of a vehicle head unit to: execute an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; authorize, via execution of the instance of the vehicle operating system, the multiple user profiles to interface with the instance of the vehicle operating system; present, via execution of the instance of the vehicle operating system, multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and interface, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system to control functionality associated with the vehicle head unit.

Description:
SINGLE-INSTANCE MULTI-USER SUPPORT FOR VEHICLE OPERATING SYSTEMS

[0001] This application claims priority to U.S. Provisional Application Serial No. 63/371,445, entitled "CONTEXT AWARE SAFETY FEATURES FOR VEHICLE OPERATING SYSTEMS," filed August 15, 2022, and U.S. Provisional Application Serial No. 63/371,451, entitled "SINGLE-INSTANCE MULTI-USER SUPPORT FOR VEHICLE OPERATING SYSTEMS," filed August 15, 2022, each of which is incorporated by reference as if set out in their respective entireties herein.

BACKGROUND

[0002] A vehicle head unit (which may be referred to as an infotainment system) may be configured to execute a vehicle operating system to facilitate control of, to provide a few examples, entertainment (such as music, video, images, etc.), information, navigation, and voice calls, as well as vehicle systems, such as heating, ventilation, and air conditioning (HVAC) systems, lighting systems, and seat control systems (including heating and/or cooling, seat adjustment, etc.). Vehicle operating systems may enable a single user profile to access a given instance of the vehicle operating system, where the single user profile is typically the operator of the vehicle. While the instance of the vehicle operating system may support multiple user profiles, the instance of the vehicle operating system may only allow a single user profile of the multiple user profiles to access (or in other words “log into”) the instance of the vehicle operating system. The instance of the vehicle operating system may limit access to a single user profile to ensure that the operator of the vehicle does not become distracted by other user profile behavior while operating the vehicle.

SUMMARY

[0003] In general, various aspects of the techniques set forth in this disclosure are directed to a single instance of a vehicle operating system that provides concurrent multi-user support. Rather than limit access to a single user profile for the single instance of the vehicle operating system, a single instance of the vehicle operating system described in this disclosure may allow multiple user profiles to access the single instance of the vehicle operating system. As vehicles have begun integrating increasingly more displays in a cabin of the vehicle, more users may safely interface with the vehicle operating system without distracting an operator of the vehicle (considering that in some examples, the operator either has limited view of these displays or has no direct view of these displays, which may be disposed behind the operator for rear seat passengers). As such, multiple users associated with multiple user profiles may access the vehicle operating systems and interact with the vehicle to control various functionality provided via the vehicle head unit, such as entertainment, information, navigation, and voice calls as well as various vehicle systems (which may be, to some extent, dependent on the multiple user locations within the vehicle).

[0004] The multiple users may also interface with the vehicle operating system to jointly coordinate activities among the multiple users, such as sharing content between the multiple users, reviewing content viewed by other users of the multiple users, controlling audio playback between the multiple users (such as changing audio playback volume, coordinating on playlists, etc.), messaging between the multiple users, collaborating by the multiple users jointly on navigation directions, and the like. In some instances, the multiple users may use gestures, such as a swipe gesture, a pinch gesture, and a tap gesture, to indicate sharing of such content, which may allow for intuitive control of sharing between the multiple users.

[0005] As a result, various aspects of the techniques described in this disclosure may facilitate a better user experience when traveling in the vehicle while still preserving safety concerns with distracting the operator of the vehicle. As a result of permitting multiple user profiles to access the vehicle operating system, the vehicle operating system may tailor the user experience to each individual user profile, allowing for preferences to be applied that are specific to each individual user profile. The vehicle operating system may also allow the multiple users to coordinate on activities (as noted above) that may improve the user experience while traveling in the vehicle. Given that the additional displays (which may be separate from a main display of the vehicle head unit) may be located in the cabin where the operator of the vehicle cannot directly see the content being presented, the vehicle operating system may permit increased permissions in terms of access to content while also potentially limiting any distractions to the operator.

[0006] In one example, this disclosure describes a method comprising: executing, by a vehicle head unit, an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; authorizing, by the instance of the vehicle operating system, the multiple user profiles to interface with the instance of the vehicle operating system; presenting, by the instance of the vehicle operating system, multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and interfacing, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0007] In another example, this disclosure describes a computing device comprising: a memory configured to store an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; one or more processors to execute the instance of the vehicle operating system, wherein the instance of the vehicle operating system is configured to: authorize the multiple user profiles to interface with the instance of the vehicle operating system; and present multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles, wherein the multiple user interfaces are configured to interface with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0008] In another example, this disclosure describes a non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause one or more processors to: execute an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; authorize, via execution of the first instance of the vehicle operating system, the multiple user profiles to interface with the instance of the vehicle operating system; present, via execution of the first instance of the vehicle operating system, multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and interface, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0009] In another example, this disclosure describes a computing device comprising: means for executing an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; means for authorizing the multiple user profiles to interface with the instance of the vehicle operating system; means for presenting multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and means for interfacing, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0010] The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description, drawings, and claims.

BRIEF DESCRIPTION OF DRAWINGS

[0011] FIG. 1 is a block diagram illustrating an example computing system that is configured to provide a single-instance, multi-user vehicle operating system in accordance with various aspects of the techniques described in this disclosure.

[0012] FIG. 2 is a diagram illustrating an example of a vehicle that includes a computing system configured to execute a vehicle operating system that operates in accordance with various aspects of the single-instance multi-user techniques described in this disclosure.

[0013] FIG. 3 is a diagram illustrating different interaction models with multiple displays including interactions that occur via single instance multi-user models and multi-instance multi-user models in accordance with the vehicle operating system techniques described in this disclosure.

[0014] FIG. 4 is a diagram illustrating an example vehicle that includes a vehicle head unit configured to control audio content in accordance with various aspects of the single-instance multi-user vehicle operating system techniques described in this disclosure.

[0015] FIGS. 5 and 6 are diagrams illustrating audio control in a single-instance, multi-user vehicle operating system in accordance with various aspects of the techniques described in this disclosure.

[0016] FIG. 7 is a flowchart illustrating example operation of the computing system shown in FIG. 1 in executing a vehicle operating system configured to perform various aspects of the techniques described in this disclosure.

DETAILED DESCRIPTION

[0017] FIG. 1 is a block diagram illustrating an example computing system that is configured to provide a single-instance, multi-user vehicle operating system in accordance with various aspects of the techniques described in this disclosure. As shown in the example of FIG. 1, a computing system 100 includes a computing device 102. Although described with respect to a vehicle, the computing system 100 may be utilized in different contexts, including standalone computing systems (including laptop computers, desktop computers, workstations and the like), gaming systems, cellular telephones (including so-called "smartphones"), media systems (including streaming media systems), audio/visual (A/V) receivers, televisions (including so-called "smart televisions"), smart speakers, smart watches, thermostats (including so-called "smart thermostats"), smart glasses, or any other computing system.

[0018] In any event, computing device 102 is an example of a vehicle computing device, such as a vehicle head unit. FIG. 1 illustrates only one particular example of computing device 102, and many other examples of computing device 102 may be used in other instances and may include a subset of the components included in example computing device 102 or may include additional components not shown in FIG. 1.

[0019] As shown in the example of FIG. 1, computing device 102 includes presence-sensitive display 112, one or more processors 140, one or more communication units 142, one or more input components 144, one or more output components 146, one or more storage devices 148, and communication channels 149. Communication channels 149 may interconnect each of the components 112, 140, 142, 146, and/or 148 for inter-component communications (physically, communicatively, and/or operatively) and thereby allow components 112, 140, 142, 146, and 148 to communicate with one another. In some examples, communication channels 149 may include a system bus, a network connection, one or more inter-process communication data structures, or any other components for communicating data (also referred to as information). Although shown as including components 112, 140, 142, 146, and 148, main computing device 102 may include other components or fewer components than those shown, where such components may be included in other control units, such as a telematic control unit (TCU).

[0020] One or more communication units 142 of computing device 102 may communicate with external devices by transmitting and/or receiving data. For example, computing device 102 may use one or more of communication units 142 to transmit and/or receive radio signals on a radio network such as a cellular radio network. In some examples, communication units 142 may transmit and/or receive satellite signals on a satellite network such as a Global Positioning System (GPS) network. Examples of communication units 142 include a network interface card (e.g., an Ethernet card), an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that can send and/or receive information. Other examples of communication units 142 may include short wave radios (e.g., NFC, BLUETOOTH (including BLE)), GPS, 3G, 4G, 5G, and WIFI radios found in mobile devices as well as Universal Serial Bus (USB) controllers and the like.

[0021] One or more input components 144 of computing device 102 may receive input. Examples of input are tactile, audio, kinetic, and optical input, to name only a few examples. Input components 144 of computing device 102 include, in one example, a mouse, keyboard, touchpad, voice responsive system, video camera, buttons, scroll wheel, dial, control pad, a microphone (or, in other words, an audio capture device), or any other type of device for detecting input from a human or machine. Input components 144 may include cameras. In some examples, input component 144 may be a presence-sensitive input component, which may include a presence-sensitive screen, touch-sensitive screen, etc. separate from presence-sensitive display 112.

[0022] One or more output components 146 of computing device 102 may generate output. Examples of output are tactile, audio, and video output. Output components 146 of computing device 102, in some examples, include a presence-sensitive screen (possibly separate from presence-sensitive display 112), sound card, video graphics adapter card, speaker, cathode ray tube (CRT) monitor, liquid crystal display (LCD), organic light emitting diode (OLED), or any other type of device for generating tactile, audio and/or visual output to a human or machine.

[0023] In some examples, presence-sensitive display 112 of computing device 102 may include functionality of input component 144 and/or output components 146. In the example of FIG. 1, presence-sensitive display 112 may include a presence-sensitive input (PSI) component 104 ("PSI component 104"), such as a presence-sensitive screen or touch-sensitive screen. In some examples, presence-sensitive input component 104 may detect an object at and/or near the presence-sensitive input component. As one example range, presence-sensitive input component 104 may detect an object, such as a finger or stylus, that is within two inches or less of presence-sensitive input component 104. Presence-sensitive input component 104 may determine a location (e.g., an (x,y) coordinate) of the presence-sensitive input component at which the object was detected. In another example range, presence-sensitive input component 104 may detect an object two inches or less from presence-sensitive input component 104, and other ranges are also possible. Presence-sensitive input component 104 may determine the location of presence-sensitive input component 104 selected by a user's finger using capacitive, inductive, and/or optical recognition techniques.

[0024] In some examples, presence-sensitive display 112 may also provide output to a user using tactile, audio, or video stimuli as described with respect to output component 146. For instance, presence-sensitive display 112 may include display component 103 that displays a graphical user interface. Display component 103 may be any type of output component that provides visual output, such as described with respect to output components 146. While illustrated as an integrated component of computing device 102, presence-sensitive display 112 may, in some examples, be an external component that shares a data or information path with other components of computing device 102 for transmitting and/or receiving input and output. For instance, presence-sensitive display 112 may be a built-in component of computing device 102 located within and physically connected to the external packaging of computing device 102 (e.g., an in-vehicle screen mounted in a dashboard of a vehicle). In another example, presence-sensitive display 112 may be an external component of computing device 102 located outside and physically separated from the packaging of computing device 102 (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with an electronic control unit of the vehicle). In some examples, presence-sensitive display 112, when located outside of and physically separated from the packaging of computing device 102, may be implemented by two separate components: a presence-sensitive input component 104 for receiving input and a display component 103 for providing output.

[0025] One or more storage devices 148 within computing device 102 may store information for processing during operation of computing device 102 (e.g., computing device 102 may store data accessed by operating system (OS) 160A and OS 160B during execution at computing device 102). As shown in the example of FIG. 1, one or more storage devices 148 may store a first instance of an operating system 160A (OS 160A) and a second instance of operating system 160B (OS 160B). In some examples, storage component 148 is a temporary memory, meaning that a primary purpose of storage component 148 is not long-term storage. Storage devices 148 of computing device 102 may be configured for short-term storage of information as volatile memory and therefore not retain stored contents if powered off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art.

[0026] Storage devices 148, in some examples, also include one or more computer-readable storage media. Storage devices 148 in some examples include one or more non-transitory computer-readable storage mediums. Storage devices 148 may be configured to store larger amounts of information than typically stored by volatile memory. Storage devices 148 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles. Examples of non-volatile memories include magnetic hard discs, optical discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. Storage devices 148 may store program instructions and/or information (e.g., data) associated with OS 160A and/or 160B. Storage devices 148 may include a memory configured to store data or other information (which is not shown for ease of illustration purposes) associated with OS 160A and OS 160B.

[0027] One or more processors 140 may implement functionality and/or execute instructions associated with computing device 102. Examples of processors 140 include application processors, display controllers, auxiliary processors, one or more sensor hubs, and any other hardware configured to function as a processor, a processing unit, or a processing device. OS 160A and/or 160B may be operable (or, in other words, executed) by processors 140 to perform various actions, operations, or functions of computing device 102. That is, OS 160A and/or OS 160B may form executable bytecode, which when executed, causes processors 140 to perform specific operations (and thereby causing computing device 102 to become a specific-purpose computer by which to perform) in accordance with various aspects of the techniques described herein. For example, processors 140 of computing device 102 may retrieve and execute instructions stored by storage devices 148 that cause processors 140 to perform the operations described herein that are attributed to OS 160A and/or 160B. The instructions, when executed by processors 140, may cause computing device 102 to store information within storage devices 148.

[0028] As described above, computing system 100 may be integrated or otherwise included within a vehicle. The vehicle may include one or more of a bicycle, a tricycle, a unicycle, a motorcycle, an automobile, farm equipment (such as a tractor, combine, etc.), construction equipment (a dump truck, crane, etc.), military vehicle or equipment (a tank, armament, etc.), a truck, a semi-tractor (or, in other words, a semi-trailer), aviation equipment (such as a plane), nautical equipment (such as a boat, carrier, submarine, etc.), or any other type of vehicle.

[0029] Computing device 102 (which may be, as noted above, referred to as vehicle head unit 102 and also as an infotainment system 102) may be configured to execute a vehicle operating system, such as one or more of OS 160A and 160B (which as a result may also be referred to as vehicle OS 160A - VOS 160A - and VOS 160B), to facilitate control of, to provide a few examples, entertainment (such as music, video, images, etc.), information, navigation, and voice calls, as well as vehicle systems, such as heating, ventilation, and air conditioning (HVAC) systems, lighting systems, and seat control systems (including heating and/or cooling, seat adjustment, etc.). In some instances, vehicle operating systems may allow a single user profile to access a given instance of the vehicle operating system at once, where the single user profile is typically the operator of the vehicle.

[0030] This vehicle operating system may be referred to as a single-instance, single-user vehicle operating system. That is, a single instance of the vehicle operating system may only allow a single user profile to access the single instance of the vehicle operating system and require a user profile switch in which the single user profile is replaced with another single user profile. In this respect, only a single user profile can access a single instance of the vehicle operating system at any given time.

[0031] To enable multiple concurrent user profiles, processors 140 may execute another instance of the vehicle operating system, which may require additional processors and/or higher processing capacity processors (which are typically more expensive). However, there may be difficulties in enabling efficient communication between both instances of the vehicle operating system such that users associated with the multiple user profiles are unable to share content between one another, thereby potentially limiting the user experience.

[0032] As such, while the instance of the vehicle operating system may support multiple user profiles, the instance of the vehicle operating system may only allow a single user profile of the multiple user profiles to access (or in other words “log into”) the instance of the vehicle operating system. The instance of the vehicle operating system may limit access to a single user profile to ensure that the operator of the vehicle does not become distracted by other user profile behavior while operating the vehicle.

[0033] In accordance with various aspects of the techniques described in this disclosure, computing system 100 may implement a single instance of a vehicle operating system, such as VOS 160A and/or 160B ("VOS 160"), that provides concurrent multi-user support. Rather than limit access to a single user profile for the single instance of VOS 160, a single instance of the VOS 160 described in this disclosure may allow multiple user profiles to access the single instance of VOS 160. As vehicles have begun integrating increasingly more displays, such as supporting presence-sensitive displays 150A-150N ("supporting presence-sensitive displays 150," which may also be referred to as "displays 150") in a cabin of the vehicle, more users may safely interface with VOS 160 without distracting an operator of the vehicle (considering that in some examples, the operator either has limited view of these displays 150 or has no direct view of these displays 150, which may be disposed behind the operator for rear seat passengers). As such, multiple users associated with multiple user profiles may access VOS 160 and interact with computing device 102 to control various functionality provided via computing device 102, such as entertainment, information, navigation, and voice calls as well as various vehicle systems (which may be, to some extent, dependent on the multiple user locations within the vehicle).

[0034] The multiple users may also interface with VOS 160 to jointly coordinate activities among the multiple users, such as sharing content between the multiple users, reviewing content viewed by other users of the multiple users, controlling audio playback between the multiple users (such as changing audio playback volume, coordinating on playlists, etc.), messaging between the multiple users, collaborating by the multiple users jointly on navigation directions, and the like. In some instances, the multiple users may use gestures, such as a swipe gesture, a pinch gesture, and a tap gesture, to indicate sharing of such content, which may allow for intuitive control of sharing between the multiple users.

[0035] In operation, computing device 102 may, as shown in the example of FIG. 1, interface with displays 150, which may be similar to if not substantially similar to presence-sensitive display 112. However, rather than integrate into computing device 102 similar to presence-sensitive display 112, displays 150 may be communicatively coupled (via wire or wirelessly) to computing device 102 and integrated throughout the cabin of the vehicle or separate from the vehicle. That is, displays 150 may also, in some instances, represent tablets, smartphones, laptops, portable gaming devices, portable video devices, or any other device capable of interfacing with computing device 102 to present a user interface associated with VOS 160 (and/or applications executing in an application space presented by VOS 160, which may be separate from a privileged kernel space in which VOS 160 executes to facilitate interactions between the applications and underlying hardware, such as units 112, and 140-148).

[0036] In any event, processors 140 may execute an instance of a vehicle operating system, such as VOS 160A or VOS 160B, wherein VOS 160A and/or 160B facilitates concurrent access by multiple user profiles that are shown in the example of FIG. 1 as user profiles (UP) 161A-161N ("UP 161") for VOS 160A and UP 163A-163N ("UP 163") for VOS 160B. UP 161/163 may each define a set of access rights (or in other words privileges), preferences (e.g., for user interfaces, VOS 160 settings, etc.), and other user-specific configuration data or information. UP 161/163 may each be associated with a different user, although UP 161 and UP 163 may include user profiles for the same user. For example, UP 161A and UP 163A may be associated with the same user.

[0037] Processors 140 may execute, as one example, VOS 160A and authorize multiple user profiles of UP 161 to interface with VOS 160A. To authorize multiple user profiles of UP 161, each of the users may register with VOS 160A, entering a username (associated with a given one of UP 161) and a password to log into VOS 160A. Alternatively, any other way by which to log into VOS 160A may be enabled, such as scanning a quick response (QR) code with a camera of a smartphone, entering a personal identification number (PIN), performing a biometric process (e.g., a fingerprint scan, a retina scan, etc.), facial recognition, or any other way by which to register a given user profile of UP 161 with VOS 160A. OS 160A may compare the entered information with authentication information stored to UP 161 to authorize (or in other words authenticate) each of the multiple user profiles of UP 161 to interface with VOS 160A.
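As a rough illustration of the concurrent-authorization flow described above, the following Kotlin sketch shows how a single VOS instance might keep several user profiles authorized at the same time instead of swapping one profile for another. The Credential, UserProfile, and ProfileRegistry names are hypothetical and are not taken from the application or from any published vehicle operating system API.

```kotlin
// Hypothetical sketch of concurrent multi-profile authorization in one VOS instance.
sealed class Credential {
    data class Password(val username: String, val password: String) : Credential()
    data class Pin(val pin: String) : Credential()
    data class QrToken(val token: String) : Credential()
    object Guest : Credential() // guest profiles are authorized without authentication
}

data class UserProfile(
    val id: String,
    val displayName: String,
    val isGuest: Boolean = false,
    val preferences: Map<String, String> = emptyMap()
)

class ProfileRegistry(private val storedSecrets: Map<String, String>) {

    // Profiles currently authorized against this single OS instance; adding one
    // never removes another, which is the multi-user behavior described above.
    private val active = mutableMapOf<String, UserProfile>()

    fun authorize(profile: UserProfile, credential: Credential): Boolean {
        val ok = when (credential) {
            is Credential.Password -> storedSecrets[credential.username] == credential.password
            is Credential.Pin -> storedSecrets[profile.id] == credential.pin
            is Credential.QrToken -> credential.token.isNotBlank() // token validated elsewhere
            Credential.Guest -> true
        }
        if (ok) active[profile.id] = profile
        return ok
    }

    fun activeProfiles(): Collection<UserProfile> = active.values
}
```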

[0038] While discussed with respect to authentication, VOS 160A may also allow guest user profiles (as one of UP 161) in which authorization is provided without requiring any authentication. In this respect, guest user profiles may provide a limited experience (compared to authenticated UP 161) in terms of maintaining preferences, applications, etc. across different sessions but still enable functionality described below with respect to sharing content between multiple UP 161 and the like.

[0039] In any event, VOS 160A may present multiple user interfaces across multiple displays (such as display 112 and one or more of displays 150) communicatively coupled to computing device 102. VOS 160A may, for authenticated UP of UP 161, present the user interface specific to each of the authenticated UP that maintains preferences in terms of application organization, user interface theme, and various VOS 160A settings (e.g., regarding notifications, accessibility, display configuration in terms of brightness, resolution, orientation, and the like, and any other type of OS setting). In any event, each of the multiple user interfaces is associated with one or more of UP 161.
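To picture the per-profile presentation in paragraph [0039], the short sketch below binds each authorized profile to a display and applies that profile's saved preferences when building its user interface; MultiUserPresenter and the surrounding types are assumed names for illustration only.

```kotlin
// Hypothetical sketch: one OS instance presenting a separate, preference-aware
// user interface for each authorized profile on its own display.
data class DisplayId(val value: Int)

data class UiPreferences(val theme: String, val brightness: Int, val notificationsEnabled: Boolean)

data class UserSession(val profileId: String, val display: DisplayId, val preferences: UiPreferences)

class MultiUserPresenter {
    private val sessions = mutableListOf<UserSession>()

    // Bind an authorized profile to a display, applying that profile's saved preferences.
    fun present(profileId: String, display: DisplayId, saved: UiPreferences): UserSession {
        val session = UserSession(profileId, display, saved)
        sessions += session
        return session
    }

    // Each display renders independently of the operator's main display.
    fun sessionsOn(display: DisplayId): List<UserSession> = sessions.filter { it.display == display }
}
```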

[0040] VOS 160A may next interface, via the multiple user interfaces, with multiple users associated with multiple UP of UP 161 to enable the multiple users to interface with VOS 160A for purposes of controlling functionality associated with computing device 102. This functionality may include controlling navigation, changing content playback, changing user interface settings, controlling HVAC settings, sharing content between UP 161, and other functionality described below in more detail.

[0041] In some examples, processors 140 may represent a high processing capacity processor (which may be referred to as "processor 140A") and a low processing capacity processor (which may be referred to as "processor 140B"). In some instances, processor 140 may represent a processor capable of variable processing capacity having two or more distinct operating voltages that enable, at high voltages, high processing capacity and, at low voltages, low processing capacity. In this sense, a single processor may represent both processor 140A and processor 140B, switching between the different processing modes (e.g., high and low processing modes) to facilitate power conservation. Processor 140A may provide additional processor cores compared to processor 140B (or, in the instance where a single processor switches between processing modes, enable additional cores in the high processing mode when compared to the low processing mode).

[0042] Processor 140A may execute VOS 160A, while processor 140B may execute VOS 160B. Processor 140A may execute VOS 160A for tasks that require higher processing, such as gaming, video conferencing, navigation, and other processor intensive tasks/applications. Processor 140B may execute VOS 160B for tasks that require lower processing, such as streaming audio, telephone calls, viewing images, text messaging, or other less processor intensive tasks/applications.

[0043] In some examples, processor 140A may execute VOS 160A concurrent to processor 140B executing VOS 160B. In instances of concurrent execution of VOS 160A and 160B, VOS 160A may present an interface by which VOS 160B may communicate with VOS 160A to facilitate various functionalities of computing device 102. This interface may represent an application programming interface that VOS 160B may invoke to facilitate inter-VOS communication between VOS 160A and VOS 160B to cooperatively facilitate support to functionality provided by computing device 102.
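A minimal sketch of the inter-VOS interface described in the preceding paragraph might look like the following, assuming a simple in-process Kotlin interface; the method names (requestProfileAuthorization, forwardUiEvent, delegateTask) are hypothetical stand-ins for whatever application programming interface the two instances would actually expose to one another.

```kotlin
// Hypothetical sketch: an interface the primary instance (VOS 160A) exposes so the
// secondary instance (VOS 160B) can cooperate on multi-user functionality.
interface InterVosChannel {
    // Ask the primary instance to authorize a profile known to the secondary instance.
    fun requestProfileAuthorization(profileId: String): Boolean

    // Forward a user-interface event from a display owned by the secondary instance
    // so both instances stay consistent.
    fun forwardUiEvent(displayId: Int, event: String)

    // Hand a processing-heavy task (e.g., navigation, video conferencing) to the
    // high processing capacity processor running the primary instance.
    fun delegateTask(taskName: String, payload: ByteArray): Boolean
}

class PrimaryVos : InterVosChannel {
    override fun requestProfileAuthorization(profileId: String): Boolean = profileId.isNotBlank()

    override fun forwardUiEvent(displayId: Int, event: String) {
        println("display=$displayId event=$event")
    }

    override fun delegateTask(taskName: String, payload: ByteArray): Boolean {
        println("running $taskName on the high-capacity processor")
        return payload.isNotEmpty()
    }
}
```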

[0044] Utilizing different processing capacity processors may reduce energy consumption. Further, it may be cost prohibitive to install a high processing capacity processor 140 capable of all of the functionality supported by computing device 102, where a low processing capacity processor 140 may allow the manufacturer to upgrade over time to add additional processors having high processing capacity (and/or possibly low processing capacity). Providing a framework for inter-VOS communication may allow manufacturers to upgrade over time without having to configure a separate inter-VOS communication interface by which to facilitate access to the functionality of computing device 102 described in more detail below.

[0045] As further shown in the example of FIG. 1, computing device 102 may also interface with supporting audio capture devices 152A-152N ("supporting audio capture devices 152" or "audio capture devices 152"). Audio capture devices 152 may each represent a microphone or other transducer configured to capture audio data representative of a soundfield. Supporting audio capture devices 152 may also be referred to as "microphones 152." In some examples, audio capture devices 152 may be integrated throughout one or more zones of the cabin of the vehicle that includes computing device 102. These zones may include an operator zone, one or more front passenger zones, and one or more rear passenger zones. As described below in more detail, these microphones 152 may capture (or, in other words, record, detect or otherwise sense) audio data that VOS 160 may use to adjust audio playback as well as support additional functionality provided by computing device 102.

[0046] In this way, various aspects of the techniques may facilitate a better user experience when traveling in the vehicle while still preserving safety concerns with distracting the operator of the vehicle. As a result of permitting multiple UP 161/163 to access VOS 160A/160B, VOS 160A/160B may tailor the user experience to each individual UP of UP 161/163, allowing for preferences to be applied that are specific to each individual UP. VOS 160A/160B may also allow the multiple users to coordinate on activities (as noted above) that may improve the user experience while traveling in the vehicle. Given that additional displays 150 (which may be separate from a main display, e.g., presence-sensitive display 112, of vehicle head unit 102) may be located in the cabin where the operator of the vehicle cannot directly see the content being presented, VOS 160A/160B may permit increased permissions in terms of access to content while also potentially limiting any distractions to the operator.

[0047] FIG. 2 is a diagram illustrating an example of a vehicle that includes a computing system configured to execute a vehicle operating system that operates in accordance with various aspects of the single-instance multi-user techniques described in this disclosure. As shown in the example of FIG. 2, an interior (which may be referred to as a “cabin”) of vehicle 200 may include a computing system in the form of vehicle head unit 202, which represents an example of computing device 102.

[0048] Vehicle head unit 202 is, in the example of FIG. 2, integrated into approximately a center portion (e.g., a center console) of a front dashboard 220. Vehicle head unit 202 includes a display 212, which may represent one example of presence-sensitive display 112. Vehicle 200 may also include displays 250A, 250B, and 250C, which may represent examples of displays 150 described above with respect to FIG. 1. Display 250A is integrated into a passenger side of the front dashboard 220. Although not shown in the example of FIG. 2, a display similar to display 250A may be integrated into an operator side of front dashboard 220. Displays 250B and 250C are integrated into a rear passenger compartment of the cabin (i.e., in headrests of the front seats in the example of FIG. 2) on both the operator side (display 250B) and a passenger side (display 250C).

[0049] As described above, VOS 160A may interface, via a first user interface (presented for example by display 250A) associated with a first user profile (e.g., UP 161A), with a first user (a front passenger) to interact with content presented by a second user interface (e.g., presented for example by display 250B) associated with a second user profile (e.g., UP 161B) of multiple UP 161. In some instances, VOS 160A may interface with the first user to one or more of view or initiate playback, at the first user interface, of the content presented by the second user interface.

[0050] VOS 160A may also interface with the first user to control audio playback by the second user interface presented by display 250B. VOS 160A may interface with the first user to, via the first user interface presented by display 250A, change an audio volume associated with the audio playback by the second user interface presented by display 250B.

[0051] VOS 160A may also interface with the first user to enable the first user and the second user (e.g., an operator-side rear passenger) to jointly interact with the content presented by the second user interface presented by display 250B. In this example, VOS 160A may interface with the first user to enable the first user and the second user to jointly contribute to audio playback by the second user interface (such as by jointly collaborating on building an audio playlist). VOS 160A may also, as another example, interface with the first user to enable the first user and the second user to jointly contribute to a multi-user activity presented by the second user interface, where such multi-user activity may include a navigation activity in which both the first user and the second user contribute to navigation of vehicle 200 that includes vehicle head unit 202 and/or a multi-player video game presented by the second user interface.
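The cross-profile interactions in paragraphs [0049] through [0051] (one profile viewing another profile's content, adjusting its volume, or jointly building a playlist) might be organized roughly as in this sketch; SharedPlaybackService and its members are assumed names rather than the application's implementation.

```kotlin
// Hypothetical sketch: cross-profile viewing and control of playback on another
// profile's display, plus a jointly built playlist.
data class PlaybackState(var title: String, var volume: Int, var playing: Boolean)

class SharedPlaybackService {
    private val stateByProfile = mutableMapOf<String, PlaybackState>()
    private val sharedPlaylist = mutableListOf<String>()

    fun register(profileId: String, state: PlaybackState) {
        stateByProfile[profileId] = state
    }

    // A first user (e.g., a front passenger) views what a second user's display presents.
    fun view(targetProfile: String): PlaybackState? = stateByProfile[targetProfile]

    // The first user changes the audio volume of the second user's playback.
    fun setVolume(targetProfile: String, volume: Int) {
        stateByProfile[targetProfile]?.volume = volume.coerceIn(0, 100)
    }

    // Both users contribute to a playlist presented on the second user's display.
    fun addToSharedPlaylist(track: String) {
        sharedPlaylist += track
    }

    fun playlist(): List<String> = sharedPlaylist.toList()
}
```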

[0052] While shown as multiple physical displays 212/250, various aspects of the techniques may operate with respect to a single physical display (or multiple physical displays) having separate logically separated displays (or so-called virtual displays). That is, a single physical display 212/250 may be logically split (e.g., via software) to represent two distinct physical displays (from the perspective of the instance of VOS 160). In this respect, a single physical display may represent multiple different displays and displays 212/250 may each represent a logically separate virtual display.

[0053] In this respect, various aspects of the techniques described herein may enable a number of different use cases, including the following:

● A parent would like to see what their children are watching;
● Parent or driver would like to control the volume from the children's displays;

● Two or more passengers may want to contribute to some shared playlist; and

● Two or more passengers may want to interact on the same content (like a multi-player game).

These use cases may be enabled given that the techniques allow for the following:

● Mirror display contents when running on Android Auto;
● Allow drivers and passengers to mirror their display contents; and
● Let a driver or passenger send inputs (e.g., key, rotary, d-pad, and touch events) to other passengers' displays.

[0054] In other words, various aspects of the techniques described herein may address problems associated with enabling mobile apps to interact with each other (same or different app) across concurrent multiple users on a single Android instance. As such, the techniques may enable interaction from the rear seat to the operator for adding a stop in navigation and/or interaction from an operator to a rear seat (such as a child) for playing video.

[0055] VOS 160A may also interface, via the first user interface associated with UP 161A, with a first user (front passenger) of the multiple users to share content presented by the first user interface with a second user interface of the multiple user interfaces associated with UP 161B. For example, VOS 160A may receive, at the first user interface presented as an example by display 250A, a gesture that indicates the content presented by the first user interface is to be shared with the second user interface presented for example by display 250B. The gesture may include one or more of a swipe gesture, a pinch gesture, a tap gesture, or any other gesture associated with using presence-sensitive displays, such as display 250A.

[0056] VOS 160A may be configured to, in some instances, present, at the first user interface, an animation indicating initiation of the content being shared. In addition or alternatively, VOS 160A may play audio indicating initiation of the content being shared. In some examples, VOS 160A may, when playing audio indicating initiation of the content being shared, play spatialized audio to reflect a position of the first user interface relative to a position of the second user interface.
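The gesture-driven sharing, hand-off animation, and spatialized audio cue described in paragraphs [0055] and [0056] might be wired together roughly as below; the ContentSharingController type, the zone-position map, and the println placeholders are illustrative assumptions rather than an actual implementation.

```kotlin
// Hypothetical sketch: interpret a gesture on one display as "share this content with
// another display", then signal the hand-off with an animation and a spatial audio cue.
enum class Gesture { SWIPE, PINCH, TAP }

data class ZonePosition(val x: Float, val y: Float)

class ContentSharingController(
    private val zonePositions: Map<Int, ZonePosition> // display id -> position in the cabin
) {
    fun onGesture(gesture: Gesture, sourceDisplay: Int, targetDisplay: Int, contentUri: String) {
        // Any of the supported gestures indicates that the content should be shared.
        animateHandOff(sourceDisplay, targetDisplay)
        playSpatialCue(sourceDisplay, targetDisplay)
        println("sharing $contentUri from display $sourceDisplay to $targetDisplay via $gesture")
    }

    private fun animateHandOff(from: Int, to: Int) {
        // A real UI would slide the content card toward the target display here.
        println("animation: display $from -> display $to")
    }

    private fun playSpatialCue(from: Int, to: Int) {
        // Pan the cue from the source zone toward the target zone so the sound
        // appears to travel with the content.
        val a = zonePositions[from] ?: return
        val b = zonePositions[to] ?: return
        println("spatial audio pans from (${a.x}, ${a.y}) to (${b.x}, ${b.y})")
    }
}
```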

[0057] In this respect, various aspects of the techniques may make cross-display interaction, such as sharing screens, more immersive using advanced user interface technologies. Based on the locational relationship between passenger zones in the vehicle, the techniques may enable the following:

● Sending content from one display to another using gesture (swipe, click, etc.);

● Visualizing the interaction using animation; and

● Using spatial audio to show where the content is moving to make it more immersive.

[0058] FIG. 3 is a diagram illustrating different interaction models with multiple displays including interactions that occur via single instance multi-user models and multi-instance multi-user models in accordance with the vehicle operating system techniques described in this disclosure. As discussed in this disclosure, vehicle displays, such as displays 212 and 250, are evolving from operator centric to whole car experiences, where passengers can have the following:

● Personal curated experiences while commuting to work;

● Communal experiences to enjoy a family fun trip; and

● Shared experiences, while in a ride-sharing environment.

[0059] Original equipment manufacturers (OEMs) may recognize these needs and are building cars to provide users with these experiences. These are no longer seen as premium experiences but as a value for a desirable vehicle.

[0060] In this respect, users may look for concurrent experiences across displays for a seamless and immersive experience, such as:

● Navigation - add stops to the driver's ongoing journey from their own display;

● Entertainment - Play and control video for kids in backseat or play games to keep them engaged & entertained;

● Users - Access their own curated apps, content and data for entertainment and/or productivity; and

● Personalization - Control my seat settings, HVAC control, alerts for my seat.

[0061] OEMs are thinking about the following user interaction models, based on anticipation of user behaviors. There are three types of interaction models (shown in the example of FIG. 3):

1. The in-vehicle infotainment (IVI) display is the primary controller for content on front seat entertainment (FSE) and rear seat entertainment (RSE);
   a. FSE and RSE have minimal controls - audio, play/pause, on/off, etc.;

2. The IVI screen is mirrored across FSE/RSE, where passenger displays cannot select content or controls; and

3. Content can be shared across all the screens, with each screen having its own individual controls.

[0062] There are the following user personas for multi-display scenarios:

● Driver:
  o Interacting with cluster & IVI screens for driving and interacting with other screens in the car;
● Front seat passenger:
  o Assisting the driver with navigation and passenger content using the FSE;

● Rear seat passengers:
  o Interacting with RSE for content, controls, and suggestions to the driver for navigation;
● Guest:
  o Temporarily riding in the car and wanting to access some apps on passenger displays.

[0063] FIG. 4 is a diagram illustrating an example vehicle that includes a vehicle head unit configured to control audio content in accordance with various aspects of the single-instance multi-user vehicle operating system techniques described in this disclosure. In the example of FIG. 4, a vehicle 400 may represent an example of vehicle 200 (shown in FIG. 2) in which a vehicle head unit 202 may enable privacy and sharing between different audio zones 404A-404D ("audio zones 404") within vehicle 400. Audio zone 404A may represent a front seat operator-side zone (also denoted a "front operator zone 404A"). Audio zone 404B may represent a front seat passenger-side zone (also denoted a "front passenger zone 404B"). Audio zone 404C may represent an operator-side rear passenger zone, while audio zone 404D may represent a passenger-side rear passenger zone.

[0064] In each of audio zones 404, vehicle 400 may include a respective one of audio capture devices 452A-452D ("audio capture devices 452") and a respective one of speakers 454A-454D ("speakers 454"). Audio capture devices 452 may represent examples of audio capture devices 152. Both audio capture devices 452 and speakers 454 may represent transducers capable of converting, in the instance of audio capture devices 452, sound pressure into electrical signals representative of a soundfield and, in the instance of speakers 454, electrical signals representative of a soundfield into the corresponding soundfield.

[0065] Although described as having a single one of audio capture devices 452 and a single one of speakers 454 in each of audio zones 404, each of audio zones 404 may include more or fewer audio capture devices 452 and more or fewer speakers 454. Further, while described as being transducers, any type of device capable of capturing audio data (transformed from the electrical signals captured by the audio capture devices 452) and reproducing a soundfield from electrical signals (transformed from audio data) may be utilized in one or more of audio zones 404. In addition, while four audio zones 404 are shown in the example of FIG. 4, vehicle 400 may include more or fewer audio zones 404.

[0066] In any event, VOS 160A may interface with the multiple users to control audio playback within one or more of zones 404 of a cabin of vehicle 400 that includes vehicle head unit 202. VOS 160A may, for example, interface with the multiple users to control audio volume in a single one of zones 404 of the cabin. As another example, VOS 160A may interface with the multiple users to control a focus of audio playback in at least one of zones 404 of the cabin.

[0067] As such, various aspects of the techniques may provide a central service for audio controls concurrently for all users (e.g., driver and/or passengers) to control audio settings (volume up/down, mute, unmute, or any other control). To facilitate this control, VOS 160A may provide a service to manage audio controls (e.g., volume, mute, settings, ducking, interruptions, etc.) so as to provide the following (a sketch of such a service follows this list): ● Manage controls independently for each user in their respective audio zone;

● Manage focus for other users in separate zones;

● While taking into consideration current safety restrictions; and

● While taking into consideration current user roles (driver, passenger, disabled passenger, rear seat passenger).
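
The following is a minimal Kotlin sketch of such a central audio control service. It is purely illustrative: the types UserRole, ZoneAudioState, and ZoneAudioControlService are hypothetical and do not correspond to any actual VOS 160A interface, and the safety restriction is modeled as a simple predicate supplied by the caller.

enum class UserRole { DRIVER, FRONT_PASSENGER, REAR_PASSENGER, GUEST }

data class ZoneAudioState(var volume: Int = 50, var muted: Boolean = false)

class ZoneAudioControlService(private val safetyRestrictionsActive: () -> Boolean) {
    private val zones = mutableMapOf<Int, ZoneAudioState>()

    fun registerZone(zoneId: Int) {
        zones.getOrPut(zoneId) { ZoneAudioState() }
    }

    // A user may always adjust their own zone; changing another zone is reserved for the
    // driver role and is blocked while safety restrictions are active.
    fun setVolume(role: UserRole, requesterZoneId: Int, targetZoneId: Int, volume: Int): Boolean {
        val crossZone = requesterZoneId != targetZoneId
        if (crossZone && (role != UserRole.DRIVER || safetyRestrictionsActive())) return false
        val state = zones[targetZoneId] ?: return false
        state.volume = volume.coerceIn(0, 100)
        return true
    }

    fun setMuted(requesterZoneId: Int, muted: Boolean): Boolean {
        val state = zones[requesterZoneId] ?: return false
        state.muted = muted
        return true
    }
}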

[0068] In addition, one or more of audio capture devices 452 may capture audio data representative of a soundfield at each of one or more zones 404 (which is another way to refer to audio zones 404). VOS 160A may determine, based on the audio data representative of the soundfield occurring at each of zones 404, that a first user of the multiple users in a first zone (e.g., zone 404C) of the one or more zones is speaking in an attempt to audibly interface with vehicle head unit 202. VOS 160A may adjust, based on the audio data representative of the soundfield occurring at each of zones 404, audio playback at zones 404.

[0069] In terms of adjusting audio playback, VOS 160A may determine, based on the audio data representative of the soundfield occurring at each of zones 404, a noise level. VOS 160A may next adjust, based on the noise level, the audio playback at one or more of zones 404.
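
A minimal Kotlin sketch of the detection and adjustment described above follows. It uses per-zone sample buffers and simple RMS levels as stand-ins for the captured soundfield data; ZoneCapture, detectSpeakingZone, and adjustPlaybackForNoise are hypothetical names, and a production system would use proper voice activity detection rather than a bare threshold.

import kotlin.math.sqrt

data class ZoneCapture(val zoneId: String, val samples: List<Double>)

fun rmsLevel(samples: List<Double>): Double =
    if (samples.isEmpty()) 0.0
    else sqrt(samples.fold(0.0) { acc, s -> acc + s * s } / samples.size)

// Treat the zone with the loudest capture as the speaking zone, if it exceeds a threshold.
fun detectSpeakingZone(captures: List<ZoneCapture>, speechThreshold: Double = 0.2): String? =
    captures.maxByOrNull { rmsLevel(it.samples) }
        ?.takeIf { rmsLevel(it.samples) > speechThreshold }
        ?.zoneId

// Raise or lower playback volume in each zone in proportion to the noise measured there.
fun adjustPlaybackForNoise(noiseLevelByZone: Map<String, Double>, baseVolume: Int = 50): Map<String, Int> =
    noiseLevelByZone.mapValues { (_, noise) -> (baseVolume + (noise * 40).toInt()).coerceIn(0, 100) }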

[0070] In this respect, various aspects of the techniques may address how to enable multi-mic support for VOS 160A. VOS 160A may, using audio data captured by audio capture devices 452, determine and/or perform the following: ● Use the microphones to determine which user is speaking or talking, and adjust the volume controls by recognizing that user; ● Map the speaking user to the display being used;

● Provide better assistant response for the (speaking) user;

● Detect the noise level for each user (around their respective area) and perform the audio playback changes required for comfortable listening; and

● Also combine the audio information for each user to determine noise levels in the car and apply changes to the speakers for a better listening environment.

[0071] FIGS. 5 and 6 are diagrams illustrating audio control in a single-instance, multi-user vehicle operating system in accordance with various aspects of the techniques described in this disclosure. In some instances, vehicle operating systems have limited scope in that such vehicle operating systems can only send audio from one zone to another for that particular application unique identifier (UID). Various aspects of the techniques described in this disclosure enable zones 404 to share audio to the cabin with the following objectives: ● Share media audio from a passenger to the main cabin;

● Provide a mechanism for passengers not in the main cabin to request audio playback in the main zone (e.g., front operator zone 404A);

● Provide a mechanism for the driver to allow audio playback in the main zone; ● Provide a mechanism to share audio focus between two different zones; and ● Prevent users other than the main cabin owner from controlling the main cabin volume.

[0072] For example, a passenger listening to media in the rear seat entertainment (RSE) zone (e.g., one of zones 404C or 404D) may send audio to the main cabin to allow everyone in the car to listen to the audio. While the passenger in the back seat RSE would select to send audio to the main cabin, the operator (or main cabin user) would still maintain control of allowing the audio to play, as well as retain control of the volume and other audio settings.
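
The Kotlin sketch below illustrates one way the request/approval flow described above could be coordinated. The class and result names are hypothetical and are not part of any published VOS 160A API; surfacing the driver prompt and actually moving audio focus and routing are left abstract.

class CabinAudioShareCoordinator(private var autoAllow: Boolean = false) {

    enum class ShareResult { GRANTED, DENIED, PENDING_DRIVER_APPROVAL }

    private val pending = mutableMapOf<String, () -> Unit>()

    // A passenger in an RSE zone asks to play their media in the main cabin.
    fun requestShareToMainCabin(passengerZoneId: String, onGranted: () -> Unit): ShareResult {
        if (autoAllow) { onGranted(); return ShareResult.GRANTED }
        pending[passengerZoneId] = onGranted
        // A real system would surface a prompt on the driver's display at this point.
        return ShareResult.PENDING_DRIVER_APPROVAL
    }

    // The driver answers the prompt; the operator keeps final control over playback.
    fun driverRespond(passengerZoneId: String, accept: Boolean): ShareResult {
        val onGranted = pending.remove(passengerZoneId) ?: return ShareResult.DENIED
        return if (accept) { onGranted(); ShareResult.GRANTED } else ShareResult.DENIED
    }

    // The driver may opt in to automatically allowing passenger audio in the main cabin.
    fun setAutoAllow(enabled: Boolean) { autoAllow = enabled }
}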

[0073] FIG. 5 shows a high-level overview of the current audio architecture. Audio zones 404 are defined in configuration and are used to set up the audio routing for each audio zone 404.

Each of audio zones 404 is defined as a collection of volume groups; each group contains a set of devices that are controlled together when the volume of the group changes. Each device can have different audio contexts routed into the device. The vehicle audio service may use the routing information on each audio zone to define a set of audio mixes, which VOS 160A may use to configure the audio routing for each of zones 404.
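
The configuration model described above (zones made of volume groups, groups made of devices, contexts routed per device) can be sketched in Kotlin as follows. These types are illustrative placeholders for the actual car audio configuration, not its real schema.

enum class AudioContext { MUSIC, NAVIGATION, VOICE_COMMAND, CALL, ALARM, NOTIFICATION }

data class OutputDevice(val address: String, val contexts: Set<AudioContext>)

data class VolumeGroup(val id: Int, val devices: List<OutputDevice>, var volumeIndex: Int = 5)

data class AudioZoneConfig(val zoneId: Int, val volumeGroups: List<VolumeGroup>)

// Changing a volume group's index applies to every device in that group.
fun setGroupVolume(zone: AudioZoneConfig, groupId: Int, index: Int) {
    zone.volumeGroups.firstOrNull { it.id == groupId }?.volumeIndex = index
}

// Derive a per-zone routing table (context -> device address), analogous to a set of audio
// mixes; if two devices claim the same context, the last one listed wins in this sketch.
fun buildRouting(zone: AudioZoneConfig): Map<AudioContext, String> =
    zone.volumeGroups
        .flatMap { group -> group.devices }
        .flatMap { device -> device.contexts.map { context -> context to device.address } }
        .toMap()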

[0074] Referring next to FIG. 6, a configuration set-up is shown between the vehicle audio service and the vehicle occupant zone service. For audio, the vehicle audio service may read the audio zone (audioZoneId) to occupant zone ID (occupantZoneId) mapping from the configuration. This information may be sent to the car occupant zone service to set up the occupant zone configuration during initialization. The occupant zone service may maintain information about the occupant zone configuration and display port mapping, which may be read from the different configuration information.
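
As a small, hypothetical Kotlin illustration of this initialization step, the audio service could hand a parsed audioZoneId-to-occupantZoneId map to an occupant zone service object; the service type and the example mapping values are assumptions, not the actual configuration format.

class OccupantZoneService {
    private var audioZoneToOccupantZone: Map<Int, Int> = emptyMap()

    fun setAudioZoneMapping(mapping: Map<Int, Int>) {
        audioZoneToOccupantZone = mapping
    }

    fun occupantZoneFor(audioZoneId: Int): Int? = audioZoneToOccupantZone[audioZoneId]
}

fun initializeFromConfig(occupantZoneService: OccupantZoneService) {
    // e.g., parsed from the same configuration that defines the audio zones.
    val mapping = mapOf(0 to 0, 1 to 1, 2 to 2, 3 to 3) // audioZoneId -> occupantZoneId
    occupantZoneService.setAudioZoneMapping(mapping)
}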

[0075] The car audio service may register a VehicleOccupantZoneCallback on the occupant zone service. VOS 160A may trigger this callback when there are any of the following changes in the vehicle occupant zone: ● Display activation; ● Audio config changes;

● Occupant Zone user assignments;

● Passenger Start; and ● Passenger Stop.

[0076] When the audio service receives the onOccupantZoneConfigChanged signal from the callback, the audio service of VOS 160A may automatically assign the users to their corresponding audio zone as follows:

● Unloads previous user settings;

● Removes previous user audio policy routing;

● Loads audio settings for the new user;

● Sets up audio policy routing for the new user; and ● Resets the audio focus mapping for the new user.

[0077] The audio service of VOS 160A may use the audio focus mapping to determine where the incoming focus request should be assigned.
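
The Kotlin sketch below combines the callback trigger of paragraph [0075] with the reassignment steps of paragraph [0076]. The callback interface and the injected settings/routing hooks are illustrative stand-ins for the internal services; only the ordering of the steps mirrors the list above.

interface VehicleOccupantZoneCallback {
    fun onOccupantZoneConfigChanged(changeFlags: Int)
}

class CarAudioUserAssignment(
    private val unloadSettings: (userId: Int) -> Unit,
    private val removeRouting: (userId: Int) -> Unit,
    private val loadSettings: (userId: Int) -> Unit,
    private val setUpRouting: (userId: Int, audioZoneId: Int) -> Unit,
    private val resetFocusMapping: (userId: Int, audioZoneId: Int) -> Unit
) : VehicleOccupantZoneCallback {

    private val userByZone = mutableMapOf<Int, Int>() // audioZoneId -> userId

    fun assignUserToZone(audioZoneId: Int, newUserId: Int) {
        userByZone[audioZoneId]?.let { previousUserId ->
            unloadSettings(previousUserId)        // unload previous user settings
            removeRouting(previousUserId)         // remove previous user audio policy routing
        }
        loadSettings(newUserId)                   // load audio settings for the new user
        setUpRouting(newUserId, audioZoneId)      // set up audio policy routing for the new user
        resetFocusMapping(newUserId, audioZoneId) // reset the audio focus mapping for the new user
        userByZone[audioZoneId] = newUserId
    }

    override fun onOccupantZoneConfigChanged(changeFlags: Int) {
        // A real service would re-read the occupant-zone-to-user assignments here and call
        // assignUserToZone for each audio zone whose assigned user changed.
    }
}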

[0078] To allow the passenger to send audio to the main cabin, a couple of things need to be in place:

● Handle the focus request in the main cabin; and ● Route the passenger's audio to the main cabin.

Once the focus request has successfully changed from the passenger's own audio zone in vehicle 400, the audio routing can take place. For audio playback not yet started, the focus handling will automatically request focus in the main cabin zone with the rules already set with regard to audio usage priority (e.g., audio for media will be rejected by a current phone call).

[0079] The passenger should be able to request to send audio to the main cabin, at which point a prompt would pop up for the driver to accept. Alternatively, the driver can enable an automatic allow for passengers to play audio. If accepted, the audio focus request can be transferred to the main cabin, if needed, and the playback can commence. If not accepted, the passenger can be prompted with a message.

[0080] Because one goal of the functionality is to play media from passengers in the main cabin, the focus request for the passenger can be restricted to media only. Focus requests for other sounds (e.g., alarm, call, notification, etc.) should remain in the passenger's respective zone. For the media focus request, the logic could be as follows (a focus-request sketch follows this list):

● Send transient focus loss to the passenger's media apps;

● Request focus for the passenger's media app in the main cabin;

● Send focus gain if focus request is granted; and

● If not granted due to a higher priority sound (emergency, phone call): o Set the focus request as delayed focus; o Focus will be granted once the higher priority focus is done; and o Alternative: Send focus back to the user's zone.
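
The delayed-focus behavior in the last bullet can be illustrated with the standard Android audio focus API, which supports requests that are granted later once a higher-priority sound ends. This shows the general pattern only and is not asserted to be the internal focus logic of VOS 160A.

import android.media.AudioAttributes
import android.media.AudioFocusRequest
import android.media.AudioManager

fun requestMainCabinMediaFocus(audioManager: AudioManager, onChange: (Int) -> Unit): Int {
    val attributes = AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
        .build()
    val request = AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN)
        .setAudioAttributes(attributes)
        .setAcceptsDelayedFocusGain(true) // allow the grant to arrive after, e.g., a call ends
        .setOnAudioFocusChangeListener { change -> onChange(change) }
        .build()
    // Returns AUDIOFOCUS_REQUEST_GRANTED, AUDIOFOCUS_REQUEST_FAILED,
    // or AUDIOFOCUS_REQUEST_DELAYED.
    return audioManager.requestAudioFocus(request)
}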

[0081] For passengers, where the user ID is used, a possible driver for audio routing may be the user ID device affinity routing. This can still be utilized for sending the passenger's audio to the main cabin as follows (see the sketch after this list):

● Find the main cabin's media device (preferably a separate high-quality device); and

● Reset the passenger's device affinities to share the main cabin's media device.
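
A hypothetical Kotlin sketch of this affinity reset is shown below. The registry and device types are illustrative; in practice the per-user device affinity lives in the platform audio policy rather than in application code.

data class AudioOutputDevice(val address: String, val highQuality: Boolean = false)

class UserIdDeviceAffinityRegistry {
    private val affinities = mutableMapOf<Int, List<AudioOutputDevice>>()

    fun setAffinity(userId: Int, devices: List<AudioOutputDevice>) {
        affinities[userId] = devices
    }

    fun devicesFor(userId: Int): List<AudioOutputDevice> = affinities[userId] ?: emptyList()
}

// Re-point a passenger's affinity at the main cabin's media device so their audio is
// rendered in the main cabin (preferably on a separate high-quality device).
fun shareToMainCabin(
    registry: UserIdDeviceAffinityRegistry,
    passengerUserId: Int,
    mainCabinDevices: List<AudioOutputDevice>
) {
    val mediaDevice = mainCabinDevices.firstOrNull { it.highQuality }
        ?: mainCabinDevices.firstOrNull()
        ?: return
    registry.setAffinity(passengerUserId, listOf(mediaDevice))
}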

[0082] In addition, multi-zone audio (MZA) may facilitate playback of audio by various users, all playing audio on each individual audio zone 404. This may allow for users in the main cabin to play media or any other sounds, while users in a rear entertainment system also play media in their respective zones. This enables OEMs to potentially design complex audio infotainment systems where each passenger can tailor their own experience in vehicle 400.

[0083] Example use cases include the following:

● In the car, the driver can play music while users in the backseat can have their own music / media played;

● Users in the backseat can control their music volume without affecting the music playback of the driver;

● First-party music apps launched in the backseat can play sound in the backseat without any change.

[0084] One mechanism used for MZA is based on the dynamic audio policy, in particular UID/userId-based routing. This may allow for the audio policy to define routing based on audio attribute usages, UID, or userId. This may allow for applications or services to leverage the automatic routing of audio as configured by the dynamic audio policy. However, there exist APIs that can be used to route audio outside of the assigned audio devices. This has some impact in the car, as applications are able to send audio to a particular zone without the user's permission, for example: ● A rear seat entertainment (RSE) application from user A sending audio directly to the output device for the driver user in the main cabin.

● An RSE application from user A sending audio directly to the output device for a different RSE assigned to user B.

This may raise some concerns for driver safety, as the main driver could be distracted upon arrival of unwanted audio in the main cabin.

[0085] One possible goal of this aspect of the techniques is to allow the current dynamic audio policy to continue working while also preventing users (and their respective applications/services) from playing audio outside of a set of assigned zones, irrespective of the mechanism used to select devices for audio playback. A potential benefit for the users in the car is that such users will be able to listen to audio in the car in a private and consistent manner.

[0086] The dynamic audio policy may provide a mechanism to set the devices that can be used by a user for automatic routing via an audio policy API. This audio policy API may either be used or extended to prevent applications from using audio routing, relative to a set of preferred devices, to route audio outside of the limits of the audio policy user assignment. This should potentially limit the “forced” routing for devices defined within the audio policy.
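
One way to picture such a restriction is sketched below in Kotlin: whatever output device an application asks for, the resolved device is clamped to the zones assigned to that application's user. The policy type and lookup tables are hypothetical; they are not the audio policy API itself.

class ZoneRoutingPolicy(
    private val zonesByUser: Map<Int, Set<Int>>,        // userId -> assigned audio zones
    private val zoneByDeviceAddress: Map<String, Int>   // device address -> audio zone
) {
    // Returns the requested device if it lies inside the user's assigned zones; otherwise
    // falls back to a device inside an assigned zone, or null to reject the request.
    fun resolveDevice(userId: Int, requestedDeviceAddress: String): String? {
        val allowedZones = zonesByUser[userId] ?: return null
        val requestedZone = zoneByDeviceAddress[requestedDeviceAddress]
        if (requestedZone != null && requestedZone in allowedZones) return requestedDeviceAddress
        return zoneByDeviceAddress.entries.firstOrNull { it.value in allowedZones }?.key
    }
}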

[0087] FIG. 7 is a flowchart illustrating example operation of the computing system shown in FIG. 1 in executing a vehicle operating system configured to perform various aspects of the techniques described in this disclosure. As described above, one or more processors 140 of computing device 102 may execute an instance of a vehicle operating system, such as VOS 160A or VOS 160B, wherein VOS 160A and/or 160B facilitates concurrent access by multiple user profiles that are shown in the example of FIG. 1 as user profiles (UP) 161A-161N (“UP 161”) for VOS 160A and UP 163A-163N (“UP 163”) for VOS 160B (700).

[0088] Processors 140 may execute, as one example, VOS 160A and authorize multiple user profiles of UP 161 to interface with VOS 160A (702). VOS 160A may present multiple user interfaces across multiple displays (such as display 112 and one or more of displays 150) communicatively coupled to computing device 102 (704). VOS 160A may next interface, via the multiple user interfaces, with multiple users associated with multiple UP of UP 161 to enable the multiple users to interface with VOS 160A for purposes of controlling functionality associated with computing device 102 (706). This functionality may include controlling navigation, changing content playback, changing user interface settings, controlling HVAC settings, sharing content between UP 161, and other functionality described above.
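
A highly simplified, hypothetical Kotlin sketch of this flow (700-706) follows; the profile, display, and command types are invented for illustration and do not correspond to the actual VOS 160A interfaces.

data class UserProfile(val id: String)
data class VehicleDisplay(val id: Int)

class VehicleOsInstance(private val displays: List<VehicleDisplay>) {
    private val authorized = mutableSetOf<UserProfile>()
    private val profileByDisplay = mutableMapOf<Int, UserProfile>()

    // (702) Authorize the user profiles to interface with the instance.
    fun authorize(profiles: List<UserProfile>) {
        authorized.addAll(profiles)
    }

    // (704) Present one user interface per display, one profile per display in this sketch.
    fun presentUserInterfaces(profiles: List<UserProfile>) {
        displays.zip(profiles).forEach { (display, profile) ->
            if (profile in authorized) profileByDisplay[display.id] = profile
        }
    }

    // (706) Route input from a display's user interface to head-unit functionality.
    fun handleInput(displayId: Int, command: String) {
        val profile = profileByDisplay[displayId] ?: return
        println("Profile ${profile.id} on display $displayId -> $command")
        // e.g., command could be "navigate", "play", "hvac:72F", or "share-content".
    }
}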

[0089] In this way, the above-described techniques may enable the following examples:

[0090] Example 1. A method comprising: executing, by a vehicle head unit, an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; authorizing, by the instance of the vehicle operating system, the multiple user profiles to interface with the instance of the vehicle operating system; presenting, by the instance of the vehicle operating system, multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and interfacing, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0091] Example 2. The method of example 1, wherein the instance of a vehicle operating system comprises a first instance of the vehicle operating system, wherein the method further comprises executing a second instance of the vehicle operating system, and wherein the first instance of the vehicle operating system includes an interface by which to communicate with the second instance of the vehicle operating system to facilitate concurrent access by the multiple user profiles.

[0092] Example 3. The method of example 2, wherein the vehicle head unit includes a high processing capacity processor that executes the first instance of the vehicle operating system, wherein the vehicle head unit includes a low processing capacity processor that executes the second instance of the vehicle operating system, and wherein the high processing capacity processor provides more processing capacity than the low processing capacity processor.

[0093] Example 4. The method of any combination of examples 1-3, wherein the multiple displays are displaced about a cabin of a vehicle that includes the vehicle head unit.

[0094] Example 5. The method of any combination of examples 1-4, wherein the multiple displays include two or more of: a first display integrated into an operator side of a front dashboard of a cabin of a vehicle that includes the vehicle head unit; a second display integrated into a center console of the front dashboard; a third display integrated into a passenger side of the front dashboard; and a fourth display integrated into a rear passenger compartment of the cabin.

[0095] Example 6. The method of any combination of examples 1-5, wherein the multiple displays include one or more computing devices associated with at least one of the multiple users that is a passenger of a vehicle that includes the vehicle head unit.

[0096] Example 7. The method of any combination of examples 1-6, wherein interfacing with the multiple users associated with the one or more of the multiple user profiles includes interfacing with the multiple users to control audio playback within one or more zones of a cabin of a vehicle that includes the vehicle head unit.

[0097] Example 8. The method of example 7, wherein the one or more zones of the cabin include one or more of a front operator zone, a front passenger zone, an operator-side rear passenger zone, and a passenger-side rear passenger zone.

[0098] Example 9. The method of any combination of examples 7 and 8, wherein interfacing with the multiple users to control the audio playback includes interfacing with the multiple users to control audio volume in a single one of the one or more zones of the cabin.

[0099] Example 10. The method of any combination of examples 7-9, wherein interfacing with the multiple users to control the audio playback includes interfacing with the multiple users to control a focus of audio playback in at least one of the one or more zones of the cabin.

[0100] Example 11. The method of any combination of examples 7-10, further comprising capturing, by one or more audio capture devices communicatively coupled to the vehicle head unit, audio data representative of a soundfield at each of the one or more zones.

[0101] Example 12. The method of example 11, further comprising determining, based on the audio data representative of the soundfield occurring at each of the one or more zones, that a first user of the multiple users in a first zone of the one or more zones is speaking in an attempt to audibly interface with the vehicle head unit.

[0102] Example 13. The method of any combination of examples 11 and 12, further comprising adjusting, based on the audio data representative of the soundfield occurring at each of the one or more zones, audio playback at the one or more zones.

[0103] Example 14. The method of example 13, wherein adjusting the audio playback includes: determining, based on the audio data representative of the soundfield occurring at each of the one or more zones, a noise level; and adjusting, based on the noise level, the audio playback at the one or more zones.

[0104] Example 15. The method of any combination of examples 1-14, wherein interfacing with the multiple users associated with the one or more of the multiple user profiles includes interfacing, via a first user interface of the multiple user interfaces associated with a first user profile of the multiple user profiles, with a first user of the multiple users to interact with content presented by a second user interface of the multiple user interfaces associated with a second user profile of the multiple user profiles.

[0105] Example 16. The method of example 15, wherein interfacing with the first user comprises interfacing with the first user to one or more of view or initiate playback, at the first user interface, the content presented by the second user interface.

[0106] Example 17. The method of any combination of examples 15 and 16, wherein interfacing with the first user of the multiple users comprises interfacing with the first user to control audio playback by the second user interface.

[0107] Example 18. The method of example 17, wherein interfacing with the first user to control the audio playback comprises interfacing with the first user to, via the first user interface, change an audio volume associated with the audio playback by the second user interface.

[0108] Example 19. The method of any combination of examples 15-18, wherein interfacing with the first user comprises interfacing with the first user to enable the first user and the second user to jointly interact with the content presented by the second user interface.

[0109] Example 20. The method of example 19, wherein interfacing with the first user to enable the first user and the second user to jointly interact with the content comprises interfacing with the first user to enable the first user and the second user to jointly contribute to audio playback by the second user interface.

[0110] Example 21. The method of any combination of examples 19 and 20, wherein interfacing with the first user to enable the first user and the second user to jointly interact with the content comprises interfacing with the first user to enable the first user and the second user to jointly contribute to a multi-user activity presented by the second user interface.

[0111] Example 22. The method of example 21, wherein the multi-user activity comprises a navigation activity in which both the first user and the second user contribute to navigation of a vehicle that includes the vehicle head unit.

[0112] Example 23. The method of any combination of examples 21 and 22, wherein the multi-user activity comprises a multi-player video game presented by the second user interface.

[0113] Example 24. The method of any combination of examples 1-23, wherein interfacing with the multiple users associated with the one or more of the multiple user profiles includes interfacing, via a first user interface of the multiple user interfaces associated with a first user profile of the multiple user profiles, with a first user of the multiple users to share content presented by the first user interface with a second user interface of the multiple user interfaces associated with a second user profile of the multiple user profiles.

[0114] Example 25. The method of example 24, wherein interfacing with the first user comprises receiving, at the first user interface, a gesture that indicates the content presented by the first user interface is to be shared with the second user interface.

[0115] Example 26. The method of example 25, wherein the gesture includes one or more of a swipe gesture, a pinch gesture, and a tap gesture.

[0116] Example 27. The method of any combination of examples 24-26, further comprising presenting, by the first user interface, an animation indicating initiation of the content being shared.

[0117] Example 28. The method of any combination of examples 24-27, further comprising playing audio indicating initiation of the content being shared.

[0118] Example 29. The method of example 28, wherein playing the audio comprises playing spatialized audio to reflect a position of the first user interface relative to a position of the second user interface.

[0119] Example 30. A computing device comprising: a memory configured to store an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; one or more processors to execute the instance of the vehicle operating system, wherein the instance of the vehicle operating system is configured to: authorize the multiple user profiles to interface with the instance of the vehicle operating system; and present multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles, wherein the multiple user interfaces are configured to interface with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0120] Example 31. The computing device of example 30, wherein the instance of a vehicle operating system comprises a first instance of the vehicle operating system, wherein the one or more processors are further configured to execute a second instance of the vehicle operating system, and wherein the first instance of the vehicle operating system includes an interface by which to communicate with the second instance of the vehicle operating system to facilitate concurrent access by the multiple user profiles.

[0121] Example 32. The computing device of example 31, wherein the one or more processors include a high processing capacity processor that executes the first instance of the vehicle operating system, wherein the one or more processors include a low processing capacity processor that executes the second instance of the vehicle operating system, and wherein the high processing capacity processor provides more processing capacity than the low processing capacity processor.

[0122] Example 33. The computing device of any combination of examples 30-32, wherein the multiple displays are displaced about a cabin of a vehicle that includes the vehicle head unit.

[0123] Example 34. The computing device of any combination of examples 30-33, wherein the multiple displays include two or more of: a first display integrated into an operator side of a front dashboard of a cabin of a vehicle that includes the vehicle head unit; a second display integrated into a center console of the front dashboard; a third display integrated into a passenger side of the front dashboard; and a fourth display integrated into a rear passenger compartment of the cabin.

[0124] Example 35. The computing device of any combination of examples 30-34, wherein the multiple displays include one or more computing devices associated with at least one of the multiple users that is a passenger of a vehicle that includes the vehicle head unit.

[0125] Example 36. The computing device of any combination of examples 30-35, wherein the multiple user interfaces are configured to interface with the multiple users to control audio playback within one or more zones of a cabin of a vehicle that includes the vehicle head unit.

[0126] Example 37. The computing device of example 36, wherein the one or more zones of the cabin include one or more of a front operator zone, a front passenger zone, an operator-side rear passenger zone, and a passenger-side rear passenger zone.

[0127] Example 38. The computing device of any combination of examples 36 and 37, wherein the multiple user interfaces are configured to interface with the multiple users to control audio volume in a single one of the one or more zones of the cabin.

[0128] Example 39. The computing device of any combination of examples 36-38, wherein the multiple user interfaces are configured to interface with the multiple users to control a focus of audio playback in at least one of the one or more zones of the cabin.

[0129] Example 40. The computing device of any combination of examples 36-39, further comprising one or more audio capture devices configured to capture audio data representative of a soundfield at each of the one or more zones.

[0130] Example 41. The computing device of example 40, wherein the instance of the vehicle operating system is further configured to determine, based on the audio data representative of the soundfield occurring at each of the one or more zones, that a first user of the multiple users in a first zone of the one or more zones is speaking in an attempt to audibly interface with the vehicle head unit.

[0131] Example 42. The computing device of any combination of examples 40 and 41, wherein the instance of the vehicle operating system is further configured to adjust, based on the audio data representative of the soundfield occurring at each of the one or more zones, audio playback at the one or more zones.

[0132] Example 43. The computing device of example 42, wherein the instance of the vehicle operating system is further configured to: determine, based on the audio data representative of the soundfield occurring at each of the one or more zones, a noise level; and adjust, based on the noise level, the audio playback at the one or more zones.

[0133] Example 44. The computing device of any combination of examples 30-43, wherein a first user interface of the multiple user interfaces associated with a first user profile of the multiple user profiles is configured to interface with a first user of the multiple users to interact with content presented by a second user interface of the multiple user interfaces associated with a second user profile of the multiple user profiles.

[0134] Example 45. The computing device of example 44, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to interface with the first user to one or more of view or initiate playback, at the first user interface, the content presented by the second user interface.

[0135] Example 46. The computing device of any combination of examples 44 and 45, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to interface with the first user to control audio playback by the second user interface.

[0136] Example 47. The computing device of example 46, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to interface with the first user to, via the first user interface, change an audio volume associated with the audio playback by the second user interface.

[0137] Example 48. The computing device of any combination of examples 44 and 45, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to interface with the first user to enable the first user and the second user to jointly interact with the content presented by the second user interface.

[0138] Example 49. The computing device of example 48, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to interface with the first user to enable the first user and the second user to jointly contribute to audio playback by the second user interface.

[0139] Example 50. The computing device of any combination of examples 48 and 49, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to interface with the first user to enable the first user and the second user to jointly contribute to a multi-user activity presented by the second user interface.

[0140] Example 51. The computing device of example 50, wherein the multi-user activity comprises a navigation activity in which both the first user and the second user contribute to navigation of a vehicle that includes the vehicle head unit.

[0141] Example 52. The computing device of any combination of examples 50 and 51, wherein the multi-user activity comprises a multi-player video game presented by the second user interface.

[0142] Example 53. The computing device of any combination of examples 30-52, wherein a first user interface of the multiple user interfaces associated with a first user profile of the multiple user profiles is configured to interface with a first user of the multiple users to share content presented by the first user interface with a second user interface of the multiple user interfaces associated with a second user profile of the multiple user profiles.

[0143] Example 54. The computing device of example 53, wherein the first user interface of the multiple user interfaces associated with the first user profile of the multiple user profiles is configured to receive a gesture that indicates the content presented by the first user interface is to be shared with the second user interface.

[0144] Example 55. The computing device of example 54, wherein the gesture includes one or more of a swipe gesture, a pinch gesture, and a tap gesture.

[0145] Example 56. The computing device of any combination of examples 53-55, wherein the first user interface is further configured to present an animation indicating initiation of the content being shared.

[0146] Example 57. The computing device of any combination of examples 53-56, wherein the first user interface is further configured to play audio indicating initiation of the content being shared.

[0147] Example 58. The computing device of example 57, wherein the first user interface is configured to play spatialized audio to reflect a position of the first user interface relative to a position of the second user interface.

[0148] Example 59. A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause one or more processors to: execute an instance of a vehicle operating system, wherein the instance of the vehicle operating system facilitates concurrent access by multiple user profiles; authorize, via execution of the instance of the vehicle operating system, the multiple user profiles to interface with the instance of the vehicle operating system; present, via execution of the instance of the vehicle operating system, multiple user interfaces across multiple displays communicatively coupled to the vehicle head unit, each of the multiple user interfaces associated with one or more of the multiple user profiles; and interface, via the multiple user interfaces, with multiple users associated with the one or more of the multiple user profiles to enable the multiple users to interface with the instance of the vehicle operating system for purposes of controlling functionality associated with the vehicle head unit.

[0149] In one or more examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over, as one or more instructions or code, a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory, or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.

[0150] By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media. Disk and disc, as used, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, ultra Blu-ray, etc., where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.

[0151] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor,” as used, may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described. In addition, in some aspects, the functionality described may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.

[0152] The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.

[0153] Various examples have been described. These and other examples are within the scope of the following claims.