Title:
METHOD AND APPARATUS FOR PROCESSING SEARCH RESULTS
Document Type and Number:
WIPO Patent Application WO/2015/195364
Kind Code:
A1
Abstract:
The present principles of the embodiments generally relate to an apparatus and a method for processing media searches and/or displaying media search results. In one exemplary embodiment, the present invention is able to dynamically process a search query and display a corresponding search result with a plurality of items in a fly-out or pop-up portion of a display. In one exemplary embodiment, a substitute graphical image icon is displayed instead when a still picture corresponding to a media asset is unavailable from the search result. In another embodiment, additional information is presented in a different region when a user highlights or selects one of the items in the search result.

Inventors:
AZMOON TROY (US)
ASH ARDEN A (US)
NOGUEROL JUAN MANUEL (US)
DUNN MICHAEL (US)
Application Number:
PCT/US2015/034387
Publication Date:
December 23, 2015
Filing Date:
June 05, 2015
Assignee:
THOMSON LICENSING (FR)
International Classes:
G06F17/30
Domestic Patent References:
WO2013048360A12013-04-04
WO2010078523A12010-07-08
Foreign References:
US20110161242A12011-06-30
US20060069670A12006-03-30
US20090094197A12009-04-09
US8745136B12014-06-03
Attorney, Agent or Firm:
SHEDD, Robert D. et al. (3rd Floor, Princeton, New Jersey, US)
Claims:
CLAIMS

1. A method comprising:

receiving (1200) a first user input, said first user input comprising a search query;

determining (1210) a search result in response to said search query, said search result being positioned in a first region; said search result comprising a plurality of items relevant to said search query;

determining (1220) availability of one or more of respective still pictures corresponding to one or more of said items;

outputting (1230) said items with said one or more of said respective still pictures of said items when said one or more of the respective still pictures are available; and

outputting (1240) one or more of respective predetermined substitute image icons corresponding to said one or more of said items when said one or more of said respective still pictures are unavailable.

2. The method of claim 1 where said respective predetermined substitute image icons are selected based on respective types of said items.

3. The method of claim 2 wherein said respective types comprise one of: (1) movie, (2) TV show, (3) audio program, (4) person, and (5) electronic book.

4. The method of claim 1 wherein said first region is a fly-out region.

5. The method of claim 1 wherein said items are presented as a list.

6. The method of claim 5, wherein said list is presented in an order of relevancy to said search query.

7. The method of claim 1 wherein said items may comprise one or more of: (1) media asset, (2) actor, (3) director, and (4) producer.

8. The method of claim 7 wherein the media asset is one of: (1) video program, (2) audio program, (3) movie, (4) TV show, and (5) electronic book.

9. The method of claim 1 wherein said search query is a text string.

10. The method of claim 9 wherein said search result is outputted dynamically as respective letters of said text string are incrementally received.

11. The method of claim 1, further comprising:

receiving (1250) a second user input; and

outputting (1250) on a second region additional information for one of said plurality of items selected in response to said second user input.

12. The method of claim 11, wherein

said additional information includes at least one of:

(1) a title of a media asset, (2) a length of a media asset, (3) a rating of a media asset, (4) a still-picture representing a media asset, and (5) a person's name.

13. An apparatus comprising:

an interface (120, 215, 280) for receiving a first user input, said first user input comprising a search query;

a processor (115, 210, 265) for outputting a search result in response to said search query, said search result being positioned in a first region; said search result comprising a plurality of items relevant to said search query, said items being accompanied by respective ones of a plurality of corresponding visual representations of said items, said plurality of visual representations of said items comprising a plurality of respective still pictures of said items; and said processor outputting a predetermined substitute image icon corresponding to an item of said items when a still picture of the item is unavailable.

14. The apparatus of claim 13 where the substitute image is predetermined by a respective type of said items.

15. The apparatus of claim 14 wherein the respective type comprises one of: (1) movie, (2) TV show, (3) audio program, (4) person, and (5) electronic book.

16. The apparatus of claim 13 wherein said first region is a fly-out region.

17. The apparatus of claim 13 wherein said items are presented as a list.

18. The apparatus of claim 13, wherein said list is presented in an order of relevancy to said search query.

19. The apparatus of claim 13 wherein said items may comprise one or more of: (1) media asset, (2) actor, (3) director, and (4) producer.

20. The apparatus of claim 19 wherein the media asset is one of: (1) video program, (2) audio program, (3) movie, (4) TV show, and (5) electronic book.

21. The apparatus of claim 13 wherein said search query is a text string.

22. The apparatus of claim 21 wherein said search result is outputted dynamically as respective letters of said text string are incrementally received.

23. The apparatus of claim 13, further comprising: said processor receiving a second user input and outputting on a second region additional information for one of said plurality of items selected in response to said second user input.

24. The apparatus of claim 23, wherein said additional information includes at least one of: (1) a title of a media asset, (2) a length of a media asset, (3) a rating of a media asset, (4) a still-picture representing a media asset, and (5) a person's name.

25. A computer program product stored in a non-transitory computer-readable storage media comprising computer-executable instructions for:

receiving (1200) a first user input, said first user input comprising a search query;

determining (1210) a search result in response to said search query, said search result being positioned in a first region; said search result comprising a plurality of items relevant to said search query;

determining (1220) availability of respective still pictures corresponding to said items;

outputting (1230) said items with the respective still pictures of said items when the respective still pictures are available; and

outputting (1240) respective predetermined substitute image icons corresponding to said items when the respective still pictures are unavailable.

26. An apparatus comprising:

an interface (120, 215, 280) for receiving a first user input, said first user input comprising a search query;

a processor (115, 210, 265) for determining a search result in response to said search query, said search result being positioned in a first region and comprising a plurality of items relevant to said search query; and

said processor for determining availability of respective still pictures corresponding to said items; and for outputting said items with the respective still pictures of said items when the respective still pictures are available and for outputting respective predetermined substitute image icons corresponding to said items when the respective still pictures are unavailable.

27. An apparatus comprising:

first means (120, 215, 280) for receiving a first user input, said first user input comprising a search query;

second means (115, 210, 265) for determining a search result in response to said search query, said search result being positioned in a first region and comprising a plurality of items relevant to said search query; and

said second means (115, 210, 265) determining availability of respective still pictures corresponding to said items; and for outputting said items with the respective still pictures of said items when the respective still pictures are available and for outputting respective predetermined substitute image icons corresponding to said items when the respective still pictures are unavailable.

28. An apparatus comprising:

a first means (120, 215, 280) for receiving a first user input, said first user input comprising a search query;

a second means (115, 210, 265) for outputting a search result in response to said search query, said search result being positioned in a first region; said search result comprising a plurality of items relevant to said search query, said items being accompanied by respective ones of a plurality of corresponding visual representations of said items, said plurality of visual representations of said items comprising a plurality of respective still pictures of said items; and

said second means outputting a predetermined substitute image icon corresponding to an item of said items when a still picture of the item is unavailable.

29. The apparatus of claim 28 where the substitute image is predetermined by a respective type of said items.

30. The apparatus of claim 14 wherein the respective type comprises one of: (1) movie, (2) TV show, (3) audio program, (4) person, and (5) electronic book.

31. The apparatus of claim 28 wherein said first region is a fly-out region.

32. The apparatus of claim 28 wherein said items are presented as a list.

33. The apparatus of claim 28, wherein said list is presented in an order of relevancy to said search query.

34. The apparatus of claim 28 wherein said items may comprise one or more of: (1) media asset, (2) actor, (3) director, and (4) producer.

35. The apparatus of claim 34 wherein the media asset is one of: (1) video program, (2) audio program, (3) movie, (4) TV show, and (5) electronic book.

36. The apparatus of claim 28 wherein said search query is a text string.

37. The apparatus of claim 36 wherein said search result is outputted dynamically as respective letters of said text string are incrementally received.

38. The apparatus of claim 28, further comprising:

said second means receiving a second user input and outputting on a second region additional information for one of said plurality of items selected in response to said second user input.

39. The apparatus of claim 38, wherein said additional information includes at least one of: (1) a title of a media asset, (2) a length of a media asset, (3) a rating of a media asset, (4) a still-picture representing a media asset, and (5) a person's name.

Description:
METHOD AND APPARATUS FOR PROCESSING SEARCH RESULTS

CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority to and all benefits accruing from the provisional application filed in the United States Patent and Trademark Office on June 18, 2014 and assigned serial number 62/013,620.

BACKGROUND OF THE INVENTION

Field of the Invention

The present principles of the embodiments generally relate to an apparatus and a method for processing media searches and/or displaying media search results. In one exemplary embodiment, the present invention is able to dynamically process a search query and display a corresponding search result with a plurality of items in a fly-out or pop-up portion of a display. In one exemplary embodiment, a substitute graphical image icon is displayed instead when a still picture corresponding to a media asset is unavailable from the search result. In another embodiment, additional information is presented in a different region when a user highlights or selects one of the items in the search result.

Background Information

Electronic devices such as televisions, personal computers (PCs), tablets, cellphones, etc., require a control system that includes a user interface system. Typically, a user interface system provides information to and receives information from a user and simplifies usage of a device. One example of a user interface system is an electronic program guide and its associated user interaction menu and control functions in an electronic device, as shown in FIG. 1 and FIG. 2. An electronic program guide facilitates the searching of media assets or programs, program channels, media sources, and other information related to the media assets.

An electronic program guide system may comprise a program or media asset information database and interactive user interface screens. An electronic program guide system may obtain program information about media contents from an information provider (which may or may not be the same as the content provider), and may display the program information to a user. The program content may be, e.g., video, audio media, and/or electronic book assets from various sources, such as, for example, broadcast, satellite, internet, local storage media, etc.

Program guide information typically comprises programming information for a program or a media asset such as, for example, media asset title, program station channel/name or media asset source, start time, end time, elapsed time, time remaining, review rating, parental guide rating, genre, actors/actresses, director, producer, description of the program's content, etc. For example, as illustrated in FIG. 2, when a user highlights a media asset such as the movie ZULU in a program guide 310, additional program information for the movie ZULU is shown in 320, including, e.g., information about movie title, stars, producer, parental guide rating, review rating, and plot.

U.S. Patent No. 6,111,611, issued to Ozkan et al., describes in detail an exemplary embodiment of an electronic program guide system for providing program guide information to an electronic device, including an exemplary data packet structure for carrying the program guide information from a provider to an electronic device. The exemplary data packet structure is designed so that both the channel information (e.g., channel name, call letters, channel number, etc.) and the program description information (e.g., title, rating, program description, etc.) relating to a program may be transmitted from a program guide database provider to a receiving apparatus. The teachings of this patent are incorporated herein by reference in their entirety.

In addition, different streaming media sites (e.g., Hulu, Netflix, M-GO, etc.) currently provide various user interfaces for users to search media asset information such as available video titles and their related information on their respective websites. The users may search, e.g., different movie titles available on these websites by typing in a query string using keyboards on, e.g., their PCs, laptops, cellphones, etc.

SUMMARY OF THE INVENTION

The present inventors recognize the need to improve the existing processing of media asset searches and the display of media asset search results.

According to an exemplary aspect of the present invention, an apparatus is presented, comprising: an interface for receiving a first user input, the first user input comprising a search query; a processor for determining a search result in response to the search query, the search result being positioned in a first region and comprising a plurality of items relevant to the search query; and the processor for determining availability of respective still pictures corresponding to the items; and for outputting the items with the respective still pictures of the items when the respective still pictures are available and for outputting respective predetermined substitute image icons corresponding to the items when the respective still pictures are unavailable.

In another exemplary embodiment, a method is presented comprising: receiving a first user input, the first user input comprising a search query; determining a search result in response to the search query, the search result being positioned in a first region; the search result comprising a plurality of items relevant to the search query; determining availability of one or more of respective still pictures corresponding to one or more of the items;

outputting the items with the one or more of the respective still pictures of the items when the one or more of the respective still pictures are available; and outputting one or more of respective predetermined substitute image icons corresponding to the one or more of the items when the one or more of the respective still pictures are unavailable.

In accordance with another exemplary aspect of the present invention, a computer program product stored in a non-transitory computer-readable storage media is presented, comprising computer-executable instructions for: receiving a first user input, the first user input comprising a search query; determining a search result in response to the search query, the search result being positioned in a first region; the search result comprising a plurality of items relevant to the search query; determining availability of respective still pictures corresponding to the items; outputting the items with the respective still pictures of the items when the respective still pictures are available; and outputting respective predetermined substitute image icons corresponding to the items when the respective still pictures are unavailable.

In accordance with another exemplary aspect of the present invention, an apparatus is presented, comprising: an interface for receiving a first user input, the first user input comprising a search query; a processor for outputting a search result in response to the search query, the search result being positioned in a first region; the search result comprising a plurality of items relevant to the search query, the items being accompanied by respective ones of a plurality of corresponding visual representations of the items, the plurality of visual representations of the items comprising a plurality of respective still pictures of the items; and the processor outputting a predetermined substitute image icon corresponding to an item of the items when a still picture of the item is unavailable.

In accordance with another exemplary aspect of the present invention, an apparatus is presented, comprising: a first means, including an interface, for receiving a first user input, the first user input comprising a search query; a second means, including a processor, for determining a search result in response to the search query, the search result being positioned in a first region and comprising a plurality of items relevant to the search query; and the second means determining availability of respective still pictures corresponding to the items; and for outputting the items with the respective still pictures of the items when the respective still pictures are available and for outputting respective predetermined substitute image icons corresponding to the items when the respective still pictures are unavailable.

In accordance with another exemplary aspect of the present invention, an apparatus is presented, comprising: a first means, including an interface, for receiving a first user input, the first user input comprising a search query; a second means, including, a processor, for determining a search result in response to the search query, the search result being positioned in a first region and comprising a plurality of items relevant to the search query; and the second mean determining availability of respective still pictures corresponding to the items; and for outputting the items with the respective still pictures of the items when the respective still pictures are available and for outputting respective predetermined substitute image icons corresponding to the items when the respective still pictures are unavailable.

DETAILED DESCRIPTION OF THE DRAWINGS

The above-mentioned and other features and advantages of the invention, and the manner of attaining them, will become more apparent and the invention will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings, wherein:

FIG. 1 shows an existing program guide in an electronic device;

FIG. 2 shows another existing program guide in an electronic device;

FIG. 3 shows an exemplary apparatus according to the principles of the present invention;

FIG. 4 shows an example system according to the principles of the present invention;

FIG. 5 to FIG. 10 show exemplary user interfaces and their functions of an exemplary electronic device according to the principles of the present invention;

FIG. 11 illustrates how various elements in an exemplary user interface system may be stored in a database according to the principles of the present invention; and

FIG. 12 shows an exemplary process according to the principles of the present invention.

The examples set out herein illustrate exemplary embodiments of the invention. Such examples are not to be construed as limiting the scope of the invention in any manner.

DETAILED DESCRIPTION

Referring now to the drawings, and more particularly to FIG. 3, FIG. 3 shows an exemplary embodiment of an electronic device 3 capable of processing audio and video signals and various associated processes and programs in accordance with the principles of the present invention. As described herein, the system shown in FIG. 3 is an MPEG compatible system for receiving MPEG encoded transport streams representing broadcast programs. MPEG compatible systems may include systems capable of processing compression-encoded video signals such as, e.g., MPEG-1, MPEG-2, MPEG-4, MPEG-4 Part 10 (H.264), MPEG-H Part 2 (H.265), or a future improved version thereof. However, the system shown in FIG. 3 is exemplary only. Other non-MPEG compatible systems, i.e., systems capable of processing other types of non-MPEG related encoded data streams (e.g., VP8), may also be used according to the principles of the present invention.

Other exemplary devices may include mobile devices such as cellular telephones, tablets, PCs, or devices combining computer and television functions such as the so-called "PCTVs". The terms "program" and "media asset" as used herein are interchangeable and represent any form of content data such as digital video and/or audio information, including streaming and stored data content received via cable, satellite, broadcast and other telecommunications networks, or via local networks or connections, such as WiFi, USB, HDMI, or FireWire connections. The media content can be from a remote source (e.g., a server) or from a local source, such as from local storage media (e.g., hard drives, memory cards or USB memory sticks, etc.).

As an overview, in the video receiver system 3 of FIG. 3, a carrier modulated with video data and/or audio data is received by antenna 10 and processed by input processor unit 15. The resultant digital output signal is demodulated by demodulator 20 and decoded by decoder 30. The output from decoder 30 is processed by transport system 25 which is responsive to commands from a user control/remote control unit 125. System 25 provides compressed data outputs for storage, further decoding, or communication to other devices.

Video decoder/processor 85 and audio decoder/processor 80 respectively decode the compressed data from system 25 to provide outputs for display 89 and speakers 88. Data port 75 provides an interface for communication of the compressed data from system 25 to/from other devices such as a computer or a High Definition Television (HDTV) receiver, for example. Storage device 90 stores compressed data from system 25 on storage medium 105. Device 90, in a playback mode, also supports retrieval of the compressed or uncompressed video and audio data from storage medium 105 for processing by system 25 for decoding, communication to other devices or storage on a different storage medium (not shown to simplify drawing).

Considering FIG. 3 in detail, a carrier modulated with video and/or audio data, including, e.g., digital radio data, received by antenna 10, is converted to digital form and processed by input processor 15. Input processor 15 includes one or more radio frequency (RF) tuners 16-1 to 16-N for tuning to one or more broadcast channels concurrently. The input processor 15, which comprises tuners 16-1 to 16-N and intermediate frequency (IF) mixer and amplifier 17, then tunes and down-converts the respective input video signal to a lower frequency band suitable for further processing. The resultant digital output signal is demodulated by demodulator 20 and decoded by decoder 30. The output from decoder 30 is further processed by transport system 25.

Multiplexer (mux) 37 of service detector 33 is provided, via selector 35, with either the output from decoder 30 or the decoder 30 output further processed by a descrambling unit 40. Descrambling unit 40 may be, for example, a removable unit such as a smart card in accordance with ISO 7816 and NRSS (National Renewable Security Standards) Committee standards (the NRSS removable conditional access system is defined in EIA Draft Document IS-679, Project PN-3639), or a CableCARD used in U.S. cable systems. Selector 35 detects the presence of an insertable, compatible, descrambling card and provides the output of unit 40 to mux 37 only if the card is currently inserted in the video receiver unit. Otherwise selector 35 provides the output from decoder 30 to mux 37. The presence of the insertable card permits unit 40 to descramble additional premium program channels, for example, and provide additional program services to a viewer. It should be noted that in the preferred embodiment NRSS unit 40 and smart card unit 130 (smart card unit 130 is discussed later) share the same system 25 interface such that only either an NRSS card or a smart card may be inserted at any one time. However, the interfaces may also be separate to allow parallel operation.

The data provided to mux 37 from selector 35 is in the form of an MPEG compliant packetized transport data stream as defined, e.g., in the MPEG-2 Systems standard ISO/IEC 13818-1 and may include program guide information and the data content of one or more program channels. The individual packets that comprise particular program channels are identified by Packet Identifiers (PIDs). The transport stream contains Program Specific Information (PSI) for use in identifying the PIDs and assembling individual data packets to recover the content of all the program channels that comprise the packetized data stream. Transport system 25, under the control of the system controller or processor 115, acquires and collates program guide information from the input transport stream, storage device 90 or an Internet service provider via the communication interface unit 116. The individual packets that comprise either particular program channel content or program guide information are identified by their Packet Identifiers (PIDs) contained within header information. Program guide information may contain descriptions for a program which may comprise different program descriptive fields such as title, star, ratings, genre, detailed event description, etc., relating to a program.
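
As an illustration of the PID-based packet identification described above, the following is a minimal sketch of how a transport demultiplexer might extract the 13-bit PID from 188-byte MPEG-2 transport packets. The packet layout follows ISO/IEC 13818-1, but the function names and the simplified handling of the adaptation field are assumptions made for illustration only, not the implementation of transport system 25.

```python
# Sketch: pulling Packet Identifiers (PIDs) out of an MPEG-2 transport stream.
# The 188-byte packet size, 0x47 sync byte, and 13-bit PID field come from
# ISO/IEC 13818-1; error handling and adaptation fields are simplified.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def iter_pids(stream: bytes):
    """Yield (pid, payload) for each well-formed transport packet."""
    for offset in range(0, len(stream) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = stream[offset:offset + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue                                   # resynchronization omitted here
        pid = ((packet[1] & 0x1F) << 8) | packet[2]    # 13-bit PID from header bytes 1-2
        yield pid, packet[4:]                          # 4-byte header assumed (no adaptation field)

def packets_for_channel(stream: bytes, channel_pids: set):
    """Collect the payloads belonging to one selected program channel."""
    return [payload for pid, payload in iter_pids(stream) if pid in channel_pids]
```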

The user interface system incorporated in the video receiver 3 shown in FIG. 3 enables a user to activate various features by selecting a desired feature from an on-screen display (OSD) menu. The OSD menu may include an electronic program guide as described above and other selectable user features according to the principles of the present invention, and to be described in more detail below. Data representing information displayed in the OSD menu is generated by, e.g., system controller 115 in response to stored program guide information, stored graphics information, system and user interface control information as described herein and in accordance with an exemplary control program to be shown in FIG. 12, and to be described in detail later. The software control programs may be stored, for example, in embedded memory of system controller 115, or other suitable memory (both not shown) as well known by one skilled in the art.

An exemplary embodiment of a user control unit 125 may include one or more of, e.g., a wired or wireless remote control, a mouse, a keyboard, a voice-activated device, a gesture-activated device, etc. A user may use a user control unit 125 to move a cursor (e.g., 513 in FIG. 5) to select one of the user selectable icons shown in FIG. 5 to FIG. 10. Such user selectable icons may represent, e.g., a media asset to be selected, a navigational icon, etc. A user is able to make various user selections via user control unit 125 as described above. System controller/processor 115 uses the selection information, provided via remote unit interface 120, to configure the various associated elements of system 3 shown in FIG. 3, in response to the selections. For example, system controller 115 provides associated control information to audio processor 80 and video processor 85 via control signal paths 72 and 73 respectively to control their respective functions.

In addition, when a user selects programs for viewing or storage, system controller 115 generates PSI (Program Specific Information) suitable for the selected storage device and media. Controller 115 also configures system 25 elements 45, 47, 50, 55, 65 and 95 by setting control register values within these elements via a data bus and by selecting signal paths via muxes 37 and 110 with control signal C.

In response to control signal C, mux 37 selects either the transport stream from unit 35 or, in a playback mode, a data stream retrieved from storage device 90 via store interface 95. In normal, non-playback operation, the data packets comprising the program that the user selected to view are identified by their PIDs by selection unit 45. If an encryption indicator in the header data of the selected program packets indicates the packets are encrypted, unit 45 provides the packets to decryption unit 50. Otherwise unit 45 provides non-encrypted packets to transport decoder 55. Similarly, the data packets comprising the programs that the user selected for storage are identified by their PIDs by selection unit 47. Unit 47 provides encrypted packets to decryption unit 50 or non-encrypted packets to mux 110 based on the packet header encryption indicator information.

The functions of decryptors 40 and 50 may be implemented in a single removable smart card which is compatible with the NRSS standard. This approach places all security related functions in one removable unit that can easily be replaced if a service provider decides to change the encryption technique, or to permit easily changing the security system, e.g., to descramble a different service.

Units 45 and 47 employ PID detection filters that match the PIDs of incoming packets provided by mux 37 with PID values pre-loaded in control registers within units 45 and 47 by controller 115. The pre-loaded PIDs are used in units 47 and 45 to identify the data packets that are to be stored and the data packets that are to be decoded for use in providing a video image. The pre-loaded PIDs are stored in look-up tables in units 45 and 47. The PID look-up tables are memory mapped to encryption key tables in units 45 and 47 that associate encryption keys with each pre-loaded PID. The memory mapped PID and encryption key look-up tables permit units 45 and 47 to match encrypted packets containing a pre-loaded PID with associated encryption keys that permit their decryption. Non-encrypted packets do not have associated encryption keys. Units 45 and 47 provide both identified packets and their associated encryption keys to decryptor 50. The PID look-up table in unit 45 is also memory mapped to a destination table that matches packets containing pre-loaded PIDs with corresponding destination buffer locations in packet buffer 60. The encryption keys and destination buffer location addresses associated with the programs selected by a user for viewing or storage are pre-loaded into units 45 and 47 along with the assigned PIDs by controller 115. The encryption keys are generated by ISO 7816-3 compliant smart card system 130 from encryption codes extracted from the input data stream. The generation of the encryption keys is subject to customer entitlement determined from coded information in the input data stream and/or pre-stored on the insertable smart card itself (International Standards Organization document ISO 7816-3 of 1989 defines the interface and signal structures for a smart card system).
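
The following is a hedged sketch of the kind of look-up just described for selection units 45 and 47: pre-loaded PIDs are matched against incoming packets and paired with the encryption key and destination buffer associated with each PID. The class and field names are illustrative assumptions; the actual units use hardware control registers and memory-mapped tables rather than Python dictionaries.

```python
# Sketch: a PID filter whose table maps each pre-loaded PID to an optional
# encryption key and a destination, mirroring the memory-mapped key and
# destination tables described for units 45/47.

from dataclasses import dataclass
from typing import Optional

@dataclass
class PidEntry:
    key: Optional[bytes]      # None for non-encrypted packets
    destination: str          # e.g. "video", "audio", "controller" (illustrative labels)

class PidFilter:
    def __init__(self):
        self.table: dict[int, PidEntry] = {}

    def preload(self, pid: int, key: Optional[bytes], destination: str):
        """Controller pre-loads PIDs, keys and destinations before selection begins."""
        self.table[pid] = PidEntry(key, destination)

    def route(self, pid: int, packet: bytes):
        """Return a routing decision for one incoming packet, or None if not selected."""
        entry = self.table.get(pid)
        if entry is None:
            return None                                   # packet not selected for viewing/storage
        if entry.key is not None:
            return ("decrypt", entry.key, entry.destination, packet)
        return ("pass", None, entry.destination, packet)
```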

The packets provided by units 45 and 47 to unit 50 are encrypted using an encryption technique such as the Data Encryption Standard (DES) defined in Federal Information Standards (FIPS) Publications 46, 74 and 81 provided by the National Technical Information Service, Department of Commerce. Unit 50 decrypts the encrypted packets using corresponding encryption keys provided by units 45 and 47 by applying decryption techniques appropriate for the selected encryption algorithm. The decrypted packets from unit 50 and the non-encrypted packets from unit 45 that comprise the program for display are provided to decoder 55. The decrypted packets from unit 50 and the non-encrypted packets from unit 47 that comprise the program for storage are provided to mux 110. Unit 60 contains four packet buffers accessible by controller 115. One of the buffers is assigned to hold data destined for use by controller 115 and the other three buffers are assigned to hold packets that are destined for use by application devices 75, 80 and 85. Access to the packets stored in the four buffers within unit 60 by both controller 115 and application interface 70 is controlled by buffer control unit 65. Unit 45 provides a destination flag to unit 65 for each packet identified by unit 45 for decoding. The flags indicate the individual unit 60 destination locations for the identified packets and are stored by control unit 65 in an internal memory table. Control unit 65 determines a series of read and write pointers associated with packets stored in buffer 60 based on the First-In-First-Out (FIFO) principle. The write pointers in conjunction with the destination flags permit sequential storage of an identified packet from units 45 or 50 in the next empty location within the appropriate destination buffer in unit 60. The read pointers permit sequential reading of packets from the appropriate unit 60 destination buffers by controller 115 and application interface 70.
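
A minimal sketch of the buffering scheme just described, modelling packet buffer 60 as one FIFO queue per destination and using the destination flag to select the queue. The queue abstraction stands in for the read/write pointer logic of buffer control unit 65, and the destination names are assumptions, not labels from the patent.

```python
# Sketch: packet buffer 60 as four FIFO destination buffers selected by a
# destination flag, approximating the write/read pointer behaviour of
# buffer control unit 65.

from collections import deque

DESTINATIONS = ("controller", "data_port", "audio", "video")   # illustrative names

class PacketBuffer60:
    def __init__(self):
        self.queues = {name: deque() for name in DESTINATIONS}

    def write(self, destination_flag: str, packet: bytes):
        """Store a packet in the next empty slot of its destination buffer (FIFO)."""
        self.queues[destination_flag].append(packet)

    def read(self, destination_flag: str):
        """Sequentially read packets back out for the controller or an application device."""
        queue = self.queues[destination_flag]
        return queue.popleft() if queue else None
```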

The non-encrypted and decrypted packets provided by units 45 and 50 to decoder 55 contain a transport header as defined by section 2.4.3.2 of the MPEG systems standard. Decoder 55 determines from the transport header whether the non-encrypted and decrypted packets contain an adaptation field (per the MPEG systems standard). The adaptation field contains timing information including, for example, Program Clock References (PCRs) that permit synchronization and decoding of content packets. Upon detection of a timing information packet, that is, a packet containing an adaptation field, decoder 55 signals controller 115, via an interrupt mechanism by setting a system interrupt, that the packet has been received. In addition, decoder 55 changes the timing packet destination flag in unit 65 and provides the packet to unit 60. By changing the unit 65 destination flag, unit 65 diverts the timing information packet provided by decoder 55 to the unit 60 buffer location assigned to hold data for use by controller 115, instead of an application buffer location.

Upon receiving the system interrupt set by decoder 55, controller 115 reads the timing information and PCR value and stores it in internal memory. PCR values of successive timing information packets are used by controller 115 to adjust the system 25 master clock (27 MHz). The difference between PCR based and master clock based estimates of the time interval between the receipt of successive timing packets, generated by controller 115, is used to adjust the system 25 master clock. Controller 115 achieves this by applying the derived time estimate difference to adjust the input control voltage of a voltage controlled oscillator used to generate the master clock. Controller 115 resets the system interrupt after storing the timing information in internal memory.
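
The clock-recovery step described above can be illustrated with a short sketch that compares the interval between successive PCR values with the interval measured by the local 27 MHz master clock and uses the difference to nudge the oscillator control voltage. Treating both counters as plain 27 MHz tick counts and the proportional gain value are simplifying assumptions, not details from the patent.

```python
# Sketch: PCR-based master clock adjustment. The drift between the
# PCR-derived interval and the locally measured interval is fed back,
# with a small proportional gain, into the VCO control voltage.

SYSTEM_CLOCK_HZ = 27_000_000   # system 25 master clock frequency

def clock_error(prev_pcr, curr_pcr, prev_local, curr_local):
    """Return the drift (in seconds) between PCR time and local master-clock time."""
    pcr_interval = (curr_pcr - prev_pcr) / SYSTEM_CLOCK_HZ
    local_interval = (curr_local - prev_local) / SYSTEM_CLOCK_HZ
    return pcr_interval - local_interval

def adjust_vco(control_voltage, error, gain=0.1):
    """Apply a small proportional correction to the oscillator control voltage."""
    return control_voltage + gain * error
```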

Packets received by decoder 55 from units 45 and 50 that contain program content including audio, video, caption, and other information, are directed by unit 65 from decoder 55 to the designated application device buffers in packet buffer 60. Application control unit 70 sequentially retrieves the audio, video, caption and other data from the designated buffers in buffer 60 and provides the data to corresponding application devices 75, 80 and 85. The application devices comprise audio and video decoders 80 and 85 and high speed data port 75. For example, packet data corresponding to a composite program guide generated by the controller 115 as described above, may be transported to the video decoder 85 for formatting into a video signal suitable for display on a display monitor 89 connected to the video decoder 85. Also, for example, data port 75 may be used to provide high speed data such as computer programs, for example, to a computer. Alternatively, port 75 may be used to output or receive data to and from an HDTV to display or process images corresponding to a selected program or a program guide, for example. One example of port 75 may be an HDMI data port.

Packets that contain PSI information are recognized by unit 45 as destined for the controller 115 buffer in unit 60. The PSI packets are directed to this buffer by unit 65 via units 45, 50 and 55 in a similar manner to that described for packets containing program content. Controller 115 reads the PSI from unit 60 and stores it in internal memory.

Controller 115 also generates condensed PSI (CPSI) from the stored PSI and incorporates the CPSI in a packetized data stream suitable for storage on a selectable storage medium. The packet identification and direction is governed by controller 115 in conjunction with the unit 45 and unit 47 PID, destination and encryption key look-up tables and control unit 65 functions in the manner previously described.

In addition, controller 115 is coupled to a communication interface unit 116. Unit 116 provides the capability to upload and download information to and from the internet. Communication interface unit 116 includes, for example, communication circuitry for connecting to an Internet service provider, e.g., via a wired or wireless connection such as an Ethernet or WiFi connection, or via cable, fiber or telephone line. The communication capability allows the system shown in FIG. 3 to provide, e.g., Internet related features such as program content streaming and web browsing, in addition to receiving television and radio programming. Also, it allows the exemplary system shown in FIG. 3 to obtain electronic program guide information from a provider through the internet.

FIG. 4 is an exemplary system according to the principles of the present invention. FIG. 4 is a diagram representing, for example, a system capable of providing a streaming media service. For example, various devices 260-1 to 260-n in FIG. 4 may access a media asset over the internet 250. The media service or video service is hosted, e.g., by a web server 205. Web server 205 may be a server having a processor 210 such as, e.g., an Intel processor, running an appropriate operating system such as, e.g., Windows 2008 R2, Windows Server 2012, Linux operating system, etc. Devices 260-1 to 260-n may access streaming media assets provided by server 205 using a streaming protocol such as, e.g., Apple HTTP Live Streaming (HLS) protocol, Adobe Real-Time Messaging Protocol (RTMP), Microsoft Silverlight Smooth Streaming Transport Protocol, etc.

In addition, a server administrator may interact with server 205 using user I/O devices such as a keyboard and a display, as is well known in the art, through an interface 215. Media assets and their associated metadata (e.g., program guide and related information) may be stored in a database 225 and accessed by processor 210 in server 205 as needed. Database 225 may reside in appropriate storage media, such as, e.g., one or more hard drives. Server 205 is connected to, e.g., the internet or a local or wide area network through a communication interface 220 to be connected to one or more of user devices 260-1 to 260-n. In addition, one skilled in the art would readily recognize that other server components, such as, e.g., RAM memories, are also needed, but are not shown in FIG. 4 to simplify the drawing.

User devices 260-1 to 260-n may be, e.g., a PC, laptop, tablet, cellphone, etc. Such a user device may be, e.g., a Microsoft Windows 7 or Windows 8 computer, an Android phone (e.g., Samsung S3, S4, or S5), an Apple iOS phone (e.g., iPhone 5S or 5C), an Apple iPad, etc. For example, a detailed block diagram of an exemplary device according to the principles of the present invention is illustrated in block 260-1 of FIG. 4. Device 260-1 comprises a processor 265 for processing various data and for controlling various functions and components of the device 260-1, user I/O components 280 (which may include a virtual or physical touch keyboard and/or a display) for inputting and/or outputting user data, memory 285 for processing and storing various information as necessary, and a communication interface 270 for connecting and communicating to/from web server 205 via, e.g., the internet 250 using, e.g., a cable network, FIOS network, Wi-Fi network, and/or a cellphone network such as, e.g., 3G, 4G, LTE, etc.

FIG. 5 to FIG. 10 illustrate exemplary user interface screens and functions according to the principles of the present invention. These user interface screens and functions may be controlled and/or provided by, e.g., system controller/processor 115 in receiver 3 of FIG. 3, processor 265 in device 260-1 of FIG. 4, or processor 210 in web server 205 of FIG. 4.

FIG. 12 is a flow chart of an exemplary process according to principles of the present invention. In one embodiment, the exemplary process may be implemented as computer executable instructions which may be executed by, e.g., a system controller/processor 115 in receiver 3 of FIG. 3, a processor 265 in device 260-1 in FIG. 4, or a processor 210 in server 205 of FIG. 4. For example, a computer program product having the computer-executable instructions may be stored in non-transitory computer-readable storage media of the respective devices 3, 260-1, or 205. The exemplary control program shown in FIG. 12, when executed, facilitates processing and displaying of user interfaces shown, for example, in FIG. 5 to FIG. 10, and controlling of their respective functions and interactions with a user. One skilled in the art can readily recognize that the exemplary process shown in FIG. 12 may also be implemented in hardware (e.g., logic arrays or ASIC), or a combination of hardware and software (e.g., a firmware implementation).

At step 1200 of FIG. 12, a first user input which represents a search query is received. One example of a search query is shown, e.g., as "batman" 570 entered by a user in the query field 590 of FIG. 5.

At step 1210, a search result is determined in response to the search query "batman" 570 and will be positioned in a region 520 as shown in FIG. 5. For example, a search result may comprise a list of a plurality of items 550-1 to 550-7 relevant to the search query "batman" 570 as shown in FIG. 5. In one exemplary embodiment, the search result may be shown in such a list in order of relevancy to the search query.

As shown in FIG. 5, items 550-1 to 550-7 are accompanied by respective ones of a plurality of corresponding visual representations 555-1 to 555-7 of the items. The visual representations of the items may comprise respective still pictures of, e.g., a media asset (e.g., movie, album, song track, electronic book, etc.), an actor, a director, a producer, etc. The corresponding still pictures may be produced and/or obtained from images such as, e.g., a movie disk cover, movie poster art, an album cover, a book cover, other advertising material related to the media, a person's stock photo, or one of the beginning frames of a video asset, etc., as shown in FIG. 5.

At step 1220, a determination is made as to whether a corresponding still picture is available from, e.g., database 225 for the respective items 550-1 to 550-7. As shown in FIG. 5, if a still picture is available for, e.g., item "Batman (1989)" 550-1, then that still picture will be outputted and displayed (see, e.g., 555-1) along with the item 550-1 at step 1230.

However, if it is determined at step 1220 that a still picture corresponding to a respective item is not available, then a predetermined substitute image icon corresponding to that item will be outputted and displayed instead, as shown at step 1240. This example is illustrated in FIG. 5, e.g., when a corresponding still picture is not available for item "Tony Batman (Actor)" 550-4. In that example, a predetermined graphical image icon 555-4 in the shape of an upper body of a person is outputted and displayed instead of a still photograph of actor Tony Batman. The choice of the image icon to be used as a substitution when a still picture corresponding to an item is unavailable is determined by the type of the corresponding item, as shown in FIG. 6. The substitute generic graphic image icons may be designed and chosen to give a viewer an easy understanding of the type (e.g., nature or category) of the item being represented. For example, an image icon 655-1 in the shape of a film would represent and correspond to a movie item "Batman (1989)" 650-1 (i.e., as a substitute of still picture 613, if still picture 613 is unavailable). Likewise, an image icon 655-2 in the shape of a TV would accompany a TV episode item "Batman Begins (1966)" 650-2 (i.e., as a substitute of still picture 623, if still picture 623 is unavailable). Additional exemplary image icons shown in FIG. 6 comprise, e.g., an image icon in the shape of a person 655-4 representing an item 650-4 relating to a person, an image icon 655-8 in the shape of a musical note representing an audio media asset item 650-8, and an image icon 655-9 in the shape of a book representing an e-book asset item 650-9.
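
A minimal sketch of steps 1220-1240 follows, assuming a hypothetical get_still_picture() lookup against database 225 and illustrative icon file names: each result item is shown with its still picture when one is available, otherwise with a generic icon chosen by item type as in FIG. 6.

```python
# Sketch: choosing the image to show next to a search result item.
# Icon file names and the database interface are illustrative assumptions.

TYPE_ICONS = {
    "movie": "icon_film.png",      # film-shaped icon, cf. 655-1
    "tv_show": "icon_tv.png",      # TV-shaped icon, cf. 655-2
    "person": "icon_person.png",   # upper-body silhouette, cf. 655-4
    "audio": "icon_note.png",      # musical note, cf. 655-8
    "ebook": "icon_book.png",      # book, cf. 655-9
}

def image_for_item(item, database):
    """Return the image to display next to a search result item."""
    still = database.get_still_picture(item["id"])            # step 1220: availability check
    if still is not None:
        return still                                           # step 1230: use the still picture
    return TYPE_ICONS.get(item["type"], "icon_generic.png")   # step 1240: substitute icon by type
```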

In one exemplary aspect of the present invention, when a user types out the letters of a query text string in search query field 590, if there is an item matching the entered letters, a first search result region 520 will appear or "fly out". As the user continues to type out the search query, the search result shown in region 520 will change dynamically in response to the letters entered. When search query field 590 is no longer empty, an "X" icon 580 will appear. This allows the user to clear the search query as well as the dynamic search result in region 520. In one exemplary embodiment, text descriptions for search result items in region 520 may be bolded if they match the entered search text query, as shown in, e.g., FIG. 5.
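
The dynamic fly-out behaviour can be sketched as follows; the catalog.search() backend, the UI calls, and the bold markers are placeholders assumed for illustration, not part of the disclosed embodiment.

```python
# Sketch: re-run the query on each keystroke, refresh the fly-out region 520,
# bold the matched text, and show the "X" clear icon 580 once text is present.

def on_query_changed(query: str, catalog, ui):
    if not query:
        ui.hide_flyout()                     # empty query: no fly-out region
        ui.hide_clear_icon()                 # and no "X" icon
        return
    ui.show_clear_icon()                     # "X" icon 580 appears when text is entered
    items = catalog.search(query)            # items assumed ordered by relevance
    ui.show_flyout([
        {"text": bold_match(item["title"], query), "item": item}
        for item in items
    ])

def bold_match(title: str, query: str) -> str:
    """Wrap the matched portion of the title in bold markers."""
    i = title.lower().find(query.lower())
    if i < 0:
        return title
    return title[:i] + "<b>" + title[i:i + len(query)] + "</b>" + title[i + len(query):]
```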

At step 1250, and as illustrated in FIG. 5, a second region such as, e.g., the left region 530 displays additional information about one of the search result items 550-1 to 550-7. For example, if a user highlights an item, e.g., "Batman Begins (1996)" 550-2, on the search result list (e.g., by moving a cursor 513 over the item), then the second region 530 will show additional information relating to the highlighted item 550-2. On the other hand, if a user has not highlighted any of the items 550-1 to 550-7, second region 530 may display additional information about the first item on the list of items in region 520 by default.

Additional information presented in the left region depends on the type or nature of the item highlighted. In one example as illustrated in FIG. 5, additional information for a movie is shown and may comprise, e.g., movie title 531, critic and/or viewer rating 532, a large still picture representing the movie 533, parental rating of the movie 534, length of the movie 535, genre of the movie 536, major cast members 537, and a more detailed description of the movie 538. In one exemplary embodiment, if the user clicks on the still picture 533, movie title 531, or the description 538, another web page with further content information about this media asset will be presented to the user.
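
A sketch of step 1250 under the assumption of a simple item dictionary and a hypothetical ui.show_details() renderer: when an item in region 520 is highlighted (or, by default, the first item), the second region 530 is populated with type-specific fields such as 531-538 of FIG. 5.

```python
# Sketch: populate the second region (530) with additional information for
# the highlighted item, falling back to the first item by default.

def render_detail_region(items, highlighted_index, ui):
    item = items[highlighted_index if highlighted_index is not None else 0]
    if item["type"] == "movie":
        ui.show_details({
            "title": item["title"],                 # 531
            "rating": item.get("viewer_rating"),    # 532
            "poster": item.get("still_picture"),    # 533
            "parental": item.get("parental"),       # 534
            "length": item.get("length"),           # 535
            "genre": item.get("genre"),             # 536
            "cast": item.get("cast"),               # 537
            "description": item.get("synopsis"),    # 538
        })
    else:
        # Other item types (person, audio, e-book) would expose their own fields.
        ui.show_details({"title": item["title"], "type": item["type"]})
```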

FIGS. 7 to 10 show additional exemplary embodiments of how the first and the second regions can be rendered for different types or natures of search result items in response to a search query. In an optional embodiment (as shown in FIG. 8), additional information about the most likely matching item "Iron Man" 850 may be placed in the topmost part 810 of the left region 830, and additional information about the second most likely matching item "Iron Man 2" 860 may be placed at the second position of the left region 830 pane. Of course, one skilled in the art can readily recognize that more positions (i.e., >2) may be added in the left region 830 accordingly to show more information about more respective items on the search result list.

In one exemplary embodiment, system controller/processor 115 in receiver 3 of FIG. 3, processor 265 in device 260-1 in FIG. 4, or processor 210 in server 205 of FIG. 4 may distinguish between media asset types for the same title. For example, the movie version of Iron Man is displayed as the most likely matching result even though Iron Man the television show is also displayed, but at a lower position (e.g., less relevancy) on the search result list. Similarly, the song Iron Man by Black Sabbath may not even be shown on the screen even though it has the same search term as other results. Hence, media asset types can be important for sorting the search results, depending on, e.g., whether the service is primarily a video, an audio, or an e-book service. In a further alternative embodiment, when a particular person matches in different capacities (e.g., actor, director, writer), the person's name will be shown with the capacity most relevant to the search result item. For example, for Clint Eastwood, his results could be for his work as an actor, director, composer, writer, producer, etc. That is, if the work of Clint Eastwood as a director is the most relevant for a search result, the indication "Clint Eastwood (Director)" will be shown in the search results. If Clint Eastwood as an actor is the most relevant in response to a search query, the indication "Clint Eastwood (Actor)" will be shown. Optionally, in another alternative embodiment, a person will only appear once. Hence, if Clint Eastwood as a director is more significant than Clint Eastwood as an actor, the "director" results will be shown. The actor results will be filtered out.
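
The ranking and de-duplication rules discussed above might be sketched as follows; the priority values, relevance scores, and field names are illustrative assumptions for a video-first service, not values from the disclosure.

```python
# Sketch: order results by media-type priority (video-first) and relevance,
# and collapse a person matching in several capacities to a single entry
# labelled with the most relevant capacity.

TYPE_PRIORITY = {"movie": 0, "tv_show": 1, "audio": 2, "ebook": 3}   # illustrative, video-first

def rank_results(results):
    """Most important media types first, then higher relevance first."""
    return sorted(results, key=lambda r: (TYPE_PRIORITY.get(r["type"], 99), -r["relevance"]))

def dedupe_people(results):
    """Keep only the most relevant capacity (actor, director, ...) per person."""
    best = {}
    for r in results:
        if r["type"] != "person":
            continue
        name = r["name"]
        if name not in best or r["relevance"] > best[name]["relevance"]:
            best[name] = r
    others = [r for r in results if r["type"] != "person"]
    people = [dict(r, label=f'{r["name"]} ({r["capacity"].title()})') for r in best.values()]
    return others + people
```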

FIG. 11 illustrates how data corresponding to the different information shown on the user interface screen of FIG. 5 may be stored in an exemplary database 1110 and processed by, e.g., system controller/processor 115 in receiver 3 of FIG. 3, processor 265 in device 260-1 in FIG. 4, and/or processor 210 in server 205 of FIG. 4. As noted previously, such an exemplary database may be database 225 in web server 205 of FIG. 4.
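
Purely as an illustration, and without reproducing the actual structure of database 1110 shown in FIG. 11, one plausible way to store the per-item fields that drive the screens of FIG. 5 is a simple table such as the following; the column names and SQLite usage are assumptions.

```python
# Sketch: a hypothetical per-item record for the search and detail screens.
# A NULL still_picture triggers the substitute icon path of step 1240.

import sqlite3

schema = """
CREATE TABLE IF NOT EXISTS media_items (
    id            INTEGER PRIMARY KEY,
    title         TEXT NOT NULL,
    item_type     TEXT NOT NULL,      -- movie, tv_show, audio, ebook, person
    still_picture TEXT,               -- NULL when no still picture is available
    parental      TEXT,
    length_min    INTEGER,
    genre         TEXT,
    synopsis      TEXT
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute(
    "INSERT INTO media_items (title, item_type, still_picture) VALUES (?, ?, ?)",
    ("Batman (1989)", "movie", "posters/batman_1989.jpg"),
)
```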

While several embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the functions and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the present embodiments. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings herein is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereof, the embodiments disclosed may be practiced otherwise than as specifically described and claimed. The present embodiments are directed to each individual feature, system, article, material and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials and/or methods, if such features, systems, articles, materials and/or methods are not mutually inconsistent, is included within the scope of the present embodiments.