US Pat. No. 10,715,856

DEVICES, METHODS, AND PROGRAM PRODUCTS ENABLING CONCURRENT PERSONAL NETWORK STREAMING AND WIDE AREA NETWORK ACCESS

SLING MEDIA PVT LTD, Ban...

1. A method carried out by a client media receiver utilized in conjunction with a Personal Video Recorder (PVR) and a display device, the method comprising:identifying a user desire to conduct a wireless PVR viewing session utilizing the client media receiver, while maintaining wireless access to a Wide Area Network (WAN);
when identifying a user desire to conduct a wireless PVR viewing session, while maintaining wireless access to the WAN, utilizing a software application executing on the client media receiver to:
create a personal Local Area Network (LAN) enabling wireless communication between the client media receiver and the PVR; and
establish Dual Virtual Antenna (DVA) parameters defining a first virtual antenna assigned to the WAN and a second virtual antenna assigned to the personal LAN; and
utilizing the second virtual antenna to support a wireless PVR viewing session during which the client media receiver receives streaming media content from the PVR over the personal LAN for presentation on the display device, while concurrently routing WAN data through the first virtual antenna.

US Pat. No. 10,715,855

METHOD, SYSTEM, AND APPARATUS FOR PROGRAMMATICALLY GENERATING A CHANNEL INCREMENTALITY RATIO

Groupon, Inc., Chicago, ...

1. A method for programmatically generating a channel incrementality ratio, comprising:receiving a plurality of touchpoint signals from a plurality of client devices, wherein each touchpoint signal of the plurality of touchpoint signals is associated with a respective channel of a plurality of channels, each channel of the plurality of channels associated with a unique channel identifier, each channel of the plurality of channels being a source of network traffic;
receiving a plurality of transaction signals from the plurality of client devices, wherein each transaction signal of the plurality of transaction signals is associated with a respective channel of the plurality of channels;
extracting a transaction timestamp from each transaction signal of the plurality of transaction signals into a plurality of transaction timestamps;
extracting a touchpoint timestamp from each touchpoint signal of the plurality of touchpoint signals into a plurality of touchpoint timestamps;
generating a touchpoint timestamps subset, the touchpoint timestamps subset comprising touchpoint timestamps of the plurality of touchpoint timestamps associated with a particular transaction timestamp of the plurality of transaction timestamps, wherein each touchpoint timestamp of the touchpoint timestamps subset was received within a period of network time prior to the particular transaction timestamp;
for each different channel of the plurality of channels associated with at least one touchpoint timestamp of the touchpoint timestamps subset, locating a latest channel landing-page touchpoint timestamp for a channel;
generating a sorted list using each channel identifier of the plurality of channels by using a time order associated with the latest channel landing-page touchpoint timestamp for each different channel associated with the touchpoint timestamps subset;
assigning a weighting factor to each channel based on a location of a channel identifier associated with the channel in the sorted list; and
generating a channel incrementality ratio associated with each channel using a machine learning model and based at least on the plurality of transaction timestamps, a plurality of touchpoint timestamps subsets, the sorted list, and the weighting factor for each channel.
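The windowing, sorting, and weighting steps of this claim can be illustrated with a minimal Python sketch (the lookback window, record layout, and the position-based weighting scheme are assumptions for illustration, not the patented method):

```python
from datetime import datetime, timedelta

def weight_channels(touchpoints, transaction_ts, window_hours=24):
    """Illustrative sketch: keep touchpoints received within a lookback
    window before the transaction, find each channel's latest touchpoint,
    sort channel identifiers by that time, and assign position-based
    weights (later touchpoints weigh more)."""
    window_start = transaction_ts - timedelta(hours=window_hours)
    # Touchpoint timestamps subset for this particular transaction.
    subset = [t for t in touchpoints
              if window_start <= t["ts"] < transaction_ts]
    # Latest landing-page touchpoint timestamp per channel.
    latest = {}
    for t in subset:
        ch = t["channel"]
        if ch not in latest or t["ts"] > latest[ch]:
            latest[ch] = t["ts"]
    # Sorted list of channel identifiers by latest touchpoint time.
    sorted_channels = sorted(latest, key=lambda ch: latest[ch])
    n = len(sorted_channels)
    # Weighting factor based on position in the sorted list.
    return {ch: (i + 1) / n for i, ch in enumerate(sorted_channels)}
```

In the claim, a machine learning model would then consume these weights, together with the timestamp subsets, to generate the per-channel incrementality ratio.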

US Pat. No. 10,715,854

METHOD AND APPARATUS FOR PUSHING INFORMATION

Baidu Online Network Tech...

1. A method for pushing information, comprising:acquiring video information of a video played by a user using a terminal, the video information comprising bullet screen information, wherein the bullet screen information comprises user comments on a video in a scrolling form when the video is played;
analyzing the video information and bullet screen information to generate a keyword set, the keyword set including a keyword included in the bullet screen information;
selecting at least one piece of candidate push information including the keyword as a set of push information including the keyword, based on a matching relationship between the keyword set and each piece of candidate push information; and
pushing the set of push information to the terminal in a form of bullet screen information, wherein the pushing the set of push information to the terminal comprises: presenting the set of push information including the keyword on the video, when the bullet screen information including the keyword is presented on the video.

US Pat. No. 10,715,853

PERSON LEVEL VIEWERSHIP PROBABILISTIC ASSIGNMENT MODEL WITH MARKOV CHAIN

Comscore, Inc., Reston, ...

1. A computer-implemented method, comprising:accessing panelist viewing data representing viewing events associated with panelists, each viewing event in the panelist viewing data including an identification of a program associated with the viewing event and a panelist minutes value representing a number of minutes a panelist was exposed to the program during the viewing event;
determining, based on the panelist viewing data, a plurality of state values, each of the state values corresponding to a different portion of the program and representing whether the corresponding portion was watched by a given one of the panelists, wherein the plurality of state values comprise a state value indicating that a particular portion of the program was watched and another state value indicating that another particular portion of the program was not watched;
accessing tuning data representing tuning events associated with particular households, each tuning event in the tuning data including an identification of a particular program associated with the tuning event and a household minutes value representing a number of minutes the particular program was played at a particular household in association with the tuning event; and
for at least one tuning event represented by the tuning data:
accessing household member data descriptive of one or more individual members of a household associated with the tuning event;
determining that at least a portion of the panelist viewing data, which includes information descriptive of the panelists, matches at least a portion of the household member data;
determining, for each of the one or more individual members of the household, a total number of watched portions and a total number of series of continuous portion-watching of the particular program identified by the at least one tuning event; and
generating, based on the total numbers, an output representative of a probability that at least a portion of the particular program identified by the at least one tuning event was watched by at least one of the one or more individual members.
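The per-member counting step can be sketched in a few lines (the 0/1 state encoding is an assumption for illustration; the claim only requires state values indicating whether each portion was watched):

```python
def portion_stats(states):
    """Illustrative sketch: given per-portion state values
    (1 = portion watched, 0 = not watched), return the total number
    of watched portions and the number of maximal series of
    continuous portion-watching."""
    watched = sum(states)
    runs = 0
    prev = 0
    for s in states:
        if s == 1 and prev == 0:
            runs += 1  # a new continuous-watching series begins here
        prev = s
    return watched, runs
```

These totals would then feed the probability output of the final step, e.g. favoring members with more watched portions and fewer, longer continuous series.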

US Pat. No. 10,715,852

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM

DENTSU INC., Tokyo (JP)

1. An information processing method, comprising:acquiring first identification information at least including an Internet Protocol (IP) address, information regarding a television identifier, and information regarding a television viewing log, and second identification information at least including an IP address and an advertisement ID (Identifier); and
performing control to identify a combination of the viewing log and the advertisement ID corresponding to the same IP address on the basis of the first identification information and the second identification information,
wherein the control is performed by a control unit to determine whether the information regarding the television identifier and the advertisement ID belong to the same household based on at least one of a first criterion and a second criterion,
wherein the first criterion includes a probability increasing as the number of consecutively linked records increases when a table of the first identification information, including a time stamp, and a table of the second identification information, including a time stamp, are respectively sorted with a set of a time stamp and an IP address, and
wherein the second criterion includes a probability increasing as the number of records in which the same information regarding the identifier of the television and the same advertisement ID are associated with different IP addresses increases as a result of matching performed at every predetermined period.
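The second criterion lends itself to a short sketch (the record layout and function name are assumptions for illustration):

```python
def cross_ip_match_count(match_records):
    """Illustrative sketch of the second criterion: count television
    identifier / advertisement ID pairs that the periodic matching has
    associated with more than one distinct IP address. A higher count
    raises the probability that the pair belongs to the same household
    (e.g. a household whose IP address changed between periods)."""
    ips_per_pair = {}
    for tv_id, ad_id, ip in match_records:
        ips_per_pair.setdefault((tv_id, ad_id), set()).add(ip)
    return sum(1 for ips in ips_per_pair.values() if len(ips) > 1)
```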

US Pat. No. 10,715,851

DIGITAL RIGHTS MANAGED VIRTUAL REALITY CONTENT SHARING

BigScreen, Inc., Walnut,...

1. A method for providing audiovisual content in a virtual reality (VR) viewing experience, comprising:receiving, from an application associated with a VR device, a request to access audiovisual content, wherein the VR device is in use by a user during a valid user session; in response to the request, determining that the user is connected to an unexpired ticket for the user to access the audiovisual content;
sending a content request to a content delivery network (CDN) storing an encrypted copy of the audiovisual content;
receiving, from the CDN, a playback manifest for the audiovisual content comprising at least license information for the audiovisual content;
sending the playback manifest to a digital rights management (DRM) subsystem of the VR device, wherein the DRM subsystem is configured to execute a DRM protocol capable of processing the playback manifest;
receiving notification from the DRM subsystem that the encrypted audiovisual content has been downloaded from the CDN and decrypted within a protected graphical processor unit (GPU) memory of the VR device; and
providing the decrypted audiovisual content for playback on the VR device from within the protected GPU memory of the VR device;
wherein the determining that the user is connected to an unexpired ticket comprises at least:
receiving, from the application associated with the VR device, a visual representation of an eye of the user accessing the VR device, and
matching the visual representation of the eye of the user with a preexisting unique eye signature associated with one of the valid, unexpired tickets from the set of tickets.

US Pat. No. 10,715,850

RECOMMENDING RECENTLY OBTAINED CONTENT TO ONLINE SYSTEM USERS BASED ON CHARACTERISTICS OF OTHER USERS INTERACTING WITH THE RECENTLY OBTAINED CONTENT

Facebook, Inc., Menlo Pa...

1. A method comprising:maintaining an embedding corresponding to each user of an online system, the embedding corresponding to a user based on interactions by the user with content presented to the user by the online system;
obtaining a content item at the online system for presentation to users of the online system, the content item including video data;
presenting content from the content item to viewing users of the online system;
identifying a set of the viewing users to whom the content item was presented and who performed one or more specific actions with the content item, the one or more specific actions with the content item comprising viewing at least a threshold amount of the video data included in the content item;
subsequent to identifying the set of viewing users, generating a content embedding associated with the content item based on embeddings corresponding to each of the set of viewing users;
identifying a candidate user of the online system to whom content from the content item has not been presented;
retrieving an embedding corresponding to the candidate user of the online system;
determining a similarity of the embedding corresponding to the candidate user of the online system and the content embedding; and
communicating a recommendation for the content item to a client device associated with the candidate user in response to the similarity equaling or exceeding a threshold.
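The embedding comparison in the final steps can be sketched as follows (averaging the viewer embeddings and using cosine similarity are assumptions for illustration; the claim does not fix how the content embedding or the similarity is computed):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recommend(viewer_embeddings, candidate_embedding, threshold=0.5):
    """Illustrative sketch: derive a content embedding from the
    embeddings of viewers who performed the specific action (here, by
    averaging), then recommend when the candidate user's similarity
    equals or exceeds the threshold."""
    dim = len(candidate_embedding)
    content = [sum(e[i] for e in viewer_embeddings) / len(viewer_embeddings)
               for i in range(dim)]
    return cosine(content, candidate_embedding) >= threshold
```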

US Pat. No. 10,715,849

AUTOMATICALLY GENERATING A RECOMMENDATION BASED ON AUTOMATIC AGGREGATION AND ANALYSIS OF DATA

Accenture Global Solution...

1. A system, comprising:a memory; and
one or more processors to:
receive, from multiple data providers or multiple platforms, data associated with content, a content provider that provides the content, and multiple channels via which the content is provided or consumed,
the data including first data, received from a first data provider or a first platform of the multiple data providers or the multiple platforms, that is associated with a different file type than second data received from a second data provider or a second platform of the multiple data providers or the multiple platforms,
the first data to include contextual data that identifies a context of the content,
 the contextual data including data about one or more of:
  a motif, or
  a character present in a particular scene or frame of the content, and
the first data being identified using time-based metadata for the content,
the second data including information relating to audience or viewership consumption of the content via two or more distribution channels, of the multiple channels, and received from an audience measurement entity,
 the two or more distribution channels including at least two of:
  a television source device,
  an over-the-top (OTT) source device, or
  a social media source device, and
the system including a data model associated with the first data and the second data;
aggregate the data from the multiple data providers or the multiple platforms;
extract the data based on identifying a pattern in the aggregated data;
identify, from the extracted data, a relationship between the first data and the second data,
the first data and the second data being different types of data, and
the relationship being used to determine whether an increase or decrease in content that includes a particular type of the contextual data increases or decreases the audience or viewership consumption of the content; and
perform an action based on the relationship between the first data and the second data,
where the one or more processors, when performing the action, are to:
generate a recommendation to increase or decrease content that includes the particular type of the contextual data based on the relationship.

US Pat. No. 10,715,848

METHODS AND SYSTEMS FOR GENERATING AND PROVIDING PROGRAM GUIDES AND CONTENT

Pluto Inc., West Hollywo...

1. A system, comprising:a network interface;
at least one processing device;
non-transitory memory storing programmatic code that when executed by the at least one processing device, cause the system to:
generate, for at least a first user, a first prediction model, the first prediction model generated based at least in part on a determination, with respect to a plurality of different positionings in time of items of ancillary content relative to a beginning of one or more items of streaming content, of:
how quickly the first user fast forwarded through, skipped, or navigated away from one or more of the items of ancillary content;
access, using the network interface, data from a first user device that enables identification of the first user;
use the identification of the first user to access associated user data;
determine a first number of items of ancillary content that are to be presented during streaming of a first digital program;
select, using at least the accessed user data and a determined prediction model accuracy, a prediction model from a pool of prediction models, the pool of prediction models comprising prediction models, including the first prediction model, configured to generate predictions of user responses to different positionings in time of ancillary content with respect to digital programs;
execute the prediction model, selected from the pool of prediction models, to generate predictions of user responses to a plurality of different positionings in time of the first number of items of ancillary content relative to a beginning of the first digital program;
determine a positioning in time of the first number of items of ancillary content relative to the beginning of the first digital program based at least in part on the generated predictions of user responses to the plurality of different positionings in time of the first number of items of ancillary content relative to the beginning of the first digital program;
based at least in part on the determined positioning in time relative to the beginning of the first digital program of the first number of items of ancillary content, enable at least a portion of the first number of items of ancillary content to be streamed to and displayed by the first user device in accordance with the determined positioning in time;
monitor, via information received using the network interface from the first user device, responses of the first user to items of ancillary content displayed in accordance with the determined positioning in time by the first user device;
based at least in part on the monitored responses of the first user to the displayed items of ancillary content in accordance with the determined positioning in time, enhance the prediction model, wherein the prediction model comprises one or more hidden layers, a given hidden layer comprising nodes, the nodes associated with node weights, wherein enhancing the prediction model comprises adjusting one or more node weights; and
use the enhanced prediction model to enable a second number of items of ancillary content to be streamed to and displayed by the first user device in accordance with a second determined positioning in time.

US Pat. No. 10,715,847

CUSTOM DATA INDICATING NOMINAL RANGE OF SAMPLES OF MEDIA CONTENT

Microsoft Technology Lice...

1. In a computer system that implements a video processing tool, a method comprising:receiving range data and encoded video content in a first format, wherein the range data indicates nominal range of samples of the encoded video content, the samples of the encoded video content having a sample depth that indicates an available range of values of the samples of the encoded video content, wherein the nominal range is a range of values within the available range for the sample depth of the samples of the encoded video content, and wherein the range data indicates one of multiple possible options for the nominal range, the multiple possible options for the nominal range including:
full range characterized by values from 0 . . . 2^n-1 for samples of bit depth n; and
a limited range characterized by values in less than the full range;
parsing the range data;
decoding the encoded video content, thereby producing samples of reconstructed video output in the first format; and
converting the samples of the reconstructed video output from the first format to a second format for display, wherein converting uses logic that changes depending at least in part on the nominal range.
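The range-dependent conversion in the final step can be illustrated with a sketch (the 16..235 limited range is the conventional 8-bit video luma range and is an assumption here; the claim covers any signaled limited range):

```python
def limited_to_full(sample, bit_depth=8, lo=16, hi=235):
    """Illustrative sketch: expand a limited-range sample (e.g. the
    conventional 16..235 range for 8-bit video) to the full range
    0 .. 2^n - 1, as converting logic might do when the range data
    signals a limited nominal range."""
    full_max = (1 << bit_depth) - 1
    # Values outside the nominal range are clipped before scaling.
    clipped = min(max(sample, lo), hi)
    return round((clipped - lo) * full_max / (hi - lo))
```

When the range data instead signals full range, the converting logic would pass samples through unscaled, which is the "logic that changes depending . . . on the nominal range" in the claim.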

US Pat. No. 10,715,846

STATE-BASED IMAGE DATA STREAM PROVISIONING

Amazon Technologies, Inc....

1. A computing system for image data stream provisioning comprising:one or more processors; and
one or more memories having stored therein instructions that, upon execution by the one or more processors, cause the computing system to perform operations comprising:
receiving, for a viewer, a selected attribute value of a service attribute used by an image data streaming service, wherein the service attribute is associated with a video game attribute used within a video game, the video game attribute having current game values that change during playing of the video game based on input from one or more players of the video game;
receiving, by an image data streaming service, a plurality of image data streams captured from the video game;
receiving, by the image data streaming service, a plurality of state data streams including state data that describes contents of the plurality of image data streams, wherein the state data includes a plurality of stream attribute values of the service attribute, wherein the plurality of stream attribute values are updated to reflect changes to the current game values as the video game is being played, and wherein conversion instructions included in the video game are used to convert the video game attribute to the service attribute and to convert the current game values to the plurality of stream attribute values;
providing, to the viewer, a currently matching stream indication of each of the plurality of image data streams that currently matches an attribute value selection of the viewer;
matching a first stream attribute value of the plurality of stream attribute values in a first state data stream of the plurality of state data streams to the selected attribute value, wherein the first state data stream corresponds to a first image data stream of the plurality of image data streams;
including the first image data stream in the currently matching stream indication; and
transmitting the first image data stream to the viewer.

US Pat. No. 10,715,845

BROADCAST SIGNAL TRANSMISSION/RECEPTION DEVICE AND METHOD

LG ELECTRONICS INC., Seo...

1. A broadcast signal transmitting method comprising:generating low level signaling (LLS) information, wherein the LLS information includes table ID information for identifying a type of the LLS information;
processing the LLS information into IP packets;
generating link layer packets by link layer processing the IP packets; and
generating a broadcast signal by physical layer processing the link layer packets,
wherein the LLS information further includes an emergency alert table (EAT),
wherein the EAT includes at least one emergency alert message,
wherein the at least one emergency alert message includes message identification information for identifying the at least one emergency alert message, priority information for indicating a priority of an alert, a text of an emergency notification, Uniform Resource Locator (URL) information for identifying a media file that provides additional information related to the text of the emergency notification, Uniform Resource Identifier (URI) information of another media file associated with the media file, and at least one live media information, and
wherein the at least one live media information includes broadcast stream identifier (BSID) information for identifying a broadcast stream including an emergency-related live audio/video (A/V) service and service ID information for identifying the emergency-related live A/V service.

US Pat. No. 10,715,844

METHOD AND APPARATUS FOR TRANSCEIVING DATA FOR MULTIMEDIA TRANSMISSION SYSTEM

Samsung Electronics Co., ...

1. An apparatus for receiving media content in a system, the apparatus comprising:a receiver configured to receive a multimedia data packet generated based on a data unit, the data unit being fragmented into at least one sub data unit, the multimedia data packet including a packet header and a payload; and
a processor configured to decode the multimedia data packet to recover the media content,
wherein the multimedia data packet comprises type information indicating whether payload data included in the payload comprises either metadata of the data unit or a data element derived from the at least one sub data unit,
wherein the type information comprises a first value indicating that the payload data comprises the metadata of the data unit if the payload data comprises the metadata of the data unit, and
wherein the type information comprises a second value indicating that the payload data comprises the data element derived from the at least one sub data unit if the payload data comprises the data element derived from the at least one sub data unit.
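The type-information dispatch can be sketched in a few lines (the one-byte header layout and the 0/1 type values are assumptions for illustration, not the MMT packet format itself):

```python
def classify_payload(packet):
    """Illustrative sketch: a one-byte type field indicates whether the
    payload data comprises metadata of the data unit (first value) or a
    data element derived from its sub data units (second value)."""
    TYPE_METADATA, TYPE_DATA_ELEMENT = 0, 1
    type_info, payload = packet[0], packet[1:]
    if type_info == TYPE_METADATA:
        return ("metadata", payload)
    if type_info == TYPE_DATA_ELEMENT:
        return ("data_element", payload)
    raise ValueError("unknown type information")
```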

US Pat. No. 10,715,842

METHOD AND SYSTEM FOR DISTRIBUTING INTERNET CARTOON CONTENT, AND RECORDING MEDIUM

NAVER WEBTOON CORPORATION...

1. A content distribution method executed by at least one computer for distributing cartoon content having a plurality of unit scenes with a basic definition original image and at least one unit scene having a high-definition original image, the method comprising:registering the cartoon content including a high-definition original image for at least one unit scene among all unit scenes of the cartoon content;
displaying the plurality of unit scenes of the cartoon content together on a display screen;
visibly dividing the plurality of unit scenes displayed on the display screen into a predetermined plurality of unit cuts and displaying the plurality of unit cuts together on the display screen in response to an input of a user action on the display screen for capturing at least one of the plurality of unit scenes; and
capturing a high-definition original image of a unit scene selected by a user from the plurality of unit cuts of the cartoon content displayed on the display screen,
wherein each of the plurality of unit cuts is a drawing created by the creator of the cartoon content and corresponds to one of the plurality of unit scenes.

US Pat. No. 10,715,841

METHOD AND SYSTEM FOR REMOTELY CONTROLLING CONSUMER ELECTRONIC DEVICES

Gracenote, Inc., Emeryvi...

1. A method for use in connection with a client device and a sequence of media content that includes a first portion followed by a second portion, the method comprising:presenting, by the client device, the first portion of the sequence of media content;
performing, by the client device, a content replacement operation, wherein performing the content replacement operation comprises presenting replacement media content instead of the second portion of the sequence of media content;
while performing the content replacement operation:
(i) monitoring, by the client device, advancement of the second portion of the sequence of media content, for steganographic data included in the second portion of the sequence of media content, and
(ii) based on the monitoring, determining, by the client device, that the second portion of the sequence of media content was interrupted by interrupting media content; and
responsive to determining that the second portion of the sequence of media content was interrupted by interrupting media content, switching, by the client device, from presenting the replacement media content to presenting the interrupting media content instead.

US Pat. No. 10,715,840

ESTABLISHMENT AND USE OF TIME MAPPING BASED ON INTERPOLATION USING LOW-RATE FINGERPRINTING, TO HELP FACILITATE FRAME-ACCURATE CONTENT REVISION

Gracenote, Inc., Emeryvi...

1. A method comprising:ascertaining a plurality of matching points between (i) query fingerprints representing a media stream being received by a client and (ii) reference fingerprints, wherein each identified matching point defines a respective match between (i) a query fingerprint that is timestamped with client time defined according to a clock of the client and (ii) a reference fingerprint that is timestamped with true time according to a timeline within a known media stream;
performing linear regression based on the timestamps of the ascertained plurality of matching points, the linear regression establishing a mapping between true time and client time;
using the established mapping as a basis to determine a client-time point at which the client should perform an action with respect to the media stream being received by the client; and
performing, at the determined client-time point, the action with respect to the media stream being received by the client,
wherein each matching point is based on a comparison between (i) a respective bundle of the query fingerprints and (ii) the reference fingerprints, wherein a given bundle of the query fingerprints is consecutive query fingerprints, and wherein the method further comprises detecting that the consecutive query fingerprints of the given bundle of the query fingerprints match each other and, responsive to the detecting that the consecutive query fingerprints of the given bundle match each other, excluding the given bundle from use to establish a matching point for the linear regression.
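The linear-regression step fits a line mapping true time to client time over the matching points; a minimal sketch by ordinary least squares (assuming each matching point is a (true_time, client_time) pair):

```python
def fit_time_mapping(points):
    """Illustrative sketch: ordinary least squares over (true_time,
    client_time) matching points, yielding the mapping
    client_time = a * true_time + b."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

The client would then apply the fitted mapping to a true-time revision point to obtain the client-time point at which to act.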

US Pat. No. 10,715,839

SCHEME FOR DETERMINING THE LOCATIONS AND TIMING OF ADVERTISEMENTS AND OTHER INSERTIONS IN MEDIA

Sony Interactive Entertai...

1. A non-transitory computer readable storage medium storing one or more computer programs adapted to cause a processor based system to execute steps comprising:analyzing a sequence of frames of content;
identifying a first area of interest in a scene depicted by the sequence of frames that should not be obstructed from being viewed;
determining whether an insertion area exists in the scene where additional content can be inserted without obstructing the first area of interest;
determining whether the insertion area includes two or more regions within the insertion area that generally have different colors; and
dividing the insertion area into the two or more regions that generally have different colors;
wherein at least one of the two or more regions that generally have different colors has a greater area than the additional content.

US Pat. No. 10,715,838

REMOTE-CONTROLLED MEDIA STUDIO

Sling Media L.L.C., Fost...

1. A method comprising:receiving, by a computing device, a plurality of individual media items;
decoding the plurality of individual media items using at least a plurality of decoders to generate a plurality of media input items, each of the plurality of individual media items received from a respective one or more inputs, at least one of the plurality of individual media items received from an operator console remote from the computing device;
processing the plurality of media input items to generate a first media output item;
generating a multiview media item that includes multiple of the media input items, the multiview media item combining the multiple of the media input items for a common display;
sending, to the operator console remote from the computing device, the multiview media item;
receiving one or more commands from the operator console, the one or more commands identifying at least one of the multiple of the media input items included in the multiview media item to be included in a second media output item;
generating the second media output item based at least in part on the at least one of the multiple of the media input items; and
outputting the first media output item and the second media output item to one or more remote devices for viewing and/or storage of the first media output item and the second media output item.

US Pat. No. 10,715,837

DETERMINATION OF A SERVICE OFFICE OF A MEDIA CONTENT DISTRIBUTION SYSTEM TO RECORD A MEDIA CONTENT ITEM WITH A NETWORK RECORDER

1. A method comprising:receiving, at a network device of a media content distribution system from a requesting device, a request to record a media content item, the request including an identifier of the media content item and an identifier of the requesting device, the media content distribution system including multiple tiers of service offices with overlapping delivery areas;
determining, based on the identifier of the requesting device, a first service office of the media content distribution system that is associated with the requesting device, wherein the determining the first service office is based on identifying a lowest tier of the multiple tiers of the media content distribution system having a service office with a non-zero count of requests to record the media content item, wherein the first service office is selected based on being in the determined lowest tier;
incrementing, at the network device based on the request, a total count of requests to record the media content item and a count of requests to record the media content item attributed to the first service office;
making a determination, based on a first selection criteria, whether to record the media content item at the first service office, wherein the first selection criteria comprises a first storage cost, a first bandwidth delay between a top tier and the first service office, and a second bandwidth delay between the first service office and first requesting devices;
responsive to the determination indicating to record the media content item at the first service office:
sending a command from the network device to the first service office to schedule recording of the media content item by a first network recorder at a scheduled time associated with the media content item; and
adjusting, at the network device, the total count by subtracting the count of requests to record the media content item attributed to the first service office;
in response to the adjusting of the total count, determining to additionally record the media content item at a second network recorder of a second service office of the media content distribution system based on the total count being greater than zero after the adjusting of the total count;
determining a lowest tier of the multiple tiers of the media content distribution system having the second service office with a non-zero count of requests to record the media content item; and
selecting the second network recorder of the second service office in the determined lowest tier based on a second selection criteria that comprises a second storage cost, a third bandwidth delay between the top tier and the second service office, and a fourth bandwidth delay between the second service office and second requesting devices,
wherein the first and second service offices are in an intermediate tier that is above a bottom tier of the multiple tiers.

US Pat. No. 10,715,836

METHOD AND APPARATUS FOR DELOCALIZED MANAGEMENT OF VIDEO DATA

INTERDIGITAL CE PATENT HO...

1. A method for managing video data in a storage system, the video data comprising frames, the method comprising:storing one or more frames in the storage system;
associating a unique identifier with each of the one or more frames;
generating one or more modified frames by processing one or more frames stored in the storage system; and
associating a derived unique identifier with each modified frame, wherein the derived unique identifier comprises references to the unique identifiers of the one or more processed frames.

US Pat. No. 10,715,835

SIGNAL PROCESSING APPARATUS AND METHODS

1. An apparatus for promoting and selectively delivering programming at a receiver station comprising:a first receiver section that receives a promotional video program at said receiver station, said promotional video program promoting said programming and including a request for a viewer response to receive said promoted programming;
output apparatus including a plurality of output components for outputting programming, one output component operatively functional at said receiver station to display said promotional video program;
programmable processor apparatus operatively functional at said receiver station to receive a signal that designates said programming;
a second receiver section operatively connected to said programmable processor apparatus to receive a viewer response to said request for a viewer response to receive said promoted programming;
a storage device operatively connected to said programmable processor apparatus to store said viewer response, said programmable processor apparatus accessing said storage device in response to said viewer response to at least one of control reception of said promoted programming and control presentation of said promoted programming;
a third receiver section that receives said promoted programming at said receiver station; and
a selective transfer device at said receiver station, that under control of said programmable processor apparatus based on said signal that designates programming, selectively transfers said promoted programming from said third receiver section to a selected output component of said plurality of output components of said output apparatus.

US Pat. No. 10,715,833

ADAPTIVE SYNTAX GROUPING AND COMPRESSION IN VIDEO DATA USING A DEFAULT VALUE AND AN EXCEPTION VALUE

Apple Inc., Cupertino, C...

1. A system comprising:a receiver receiving encoded video data;
a video decoder decoding the encoded video data organized as a plurality of groups of pixels of a frame, each group having a plurality of subgroups of pixels, the decoding using decoding parameters including a prediction mode derived respectively for the subgroups; and
a controller that for at least one group of the plurality of groups:
determines from the received encoded video data, a default prediction mode flag of the one group from a group-level syntax element, wherein the default prediction mode flag indicates a default prediction mode when true,
when the default prediction mode flag is true, the prediction mode for every subgroup in the group is inferred to be the default prediction mode; and
when the default prediction mode flag is false, the prediction mode for at least one subgroup is derived from a corresponding subgroup-level syntax element;
wherein the video decoder decodes each subgroup of pixels according to its corresponding prediction mode.

US Pat. No. 10,715,832

METHOD AND APPARATUS OF BLOCK PARTITION FOR VR360 VIDEO CODING

MEDIATEK INC., Hsin-chu ...

1. A method of video coding for a video encoder or decoder, the method comprising:receiving a target processing unit in a VR picture, wherein the VR picture corresponds to a 2D (two-dimensional) picture projected from a 3D (three-dimensional) picture according to a target projection format;
partitioning the target processing unit into one or more child processing units using quadtree (QT) partition and/or binary tree (BT) partition;
when one target child processing unit contains only one face edge and maximum QT depth and/or maximum BT depth associated with said one target child processing unit is not reached, splitting said one target child processing unit horizontally into two horizontal child processing units when only one face edge is one horizontal face edge and splitting said one target child processing unit vertically into two vertical child processing units when only one face edge is one vertical face edge; and
encoding or decoding one or more final leaf processing units comprising said two horizontal child processing units or said two vertical child processing units, or two or more second horizontal child processing units or two or more second vertical child processing units further partitioned from said two horizontal child processing units or said two vertical child processing units respectively.
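The face-edge splitting rule in the claim above can be sketched as a small decision helper. This is a minimal illustration, not the patented method; the function name, the boolean edge flags, and the depth bookkeeping are all assumptions.

```python
def face_edge_split(has_horizontal_edge, has_vertical_edge, depth, max_depth):
    """Decide how to split a child processing unit that may contain a face edge.

    Per the claim: split only when the unit contains exactly one face edge
    and the maximum partition depth has not been reached; split horizontally
    for a horizontal face edge, vertically for a vertical face edge.
    """
    contains_exactly_one_edge = has_horizontal_edge != has_vertical_edge
    if not contains_exactly_one_edge or depth >= max_depth:
        return None  # leave as a final leaf processing unit
    return "horizontal" if has_horizontal_edge else "vertical"
```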

US Pat. No. 10,715,830

LUMA-BASED CHROMA INTRA-PREDICTION FOR VIDEO CODING

TEXAS INSTRUMENTS INCORPO...

1. A method comprising:filtering samples of a down-sampled luma block of a video frame;
computing parameters α and β of a linear model using the filtered samples of the down-sampled luma block, wherein the linear model is PredC[x,y]=α RecL′[x,y]+β, wherein x and y are coordinates, PredC is predicted chroma samples, and RecL′ is the samples of the down-sampled luma block; and
computing the predicted chroma block samples from the samples of the down-sampled luma block using the linear model and the parameters.
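The linear model in the claim above is the standard least-squares fit of chroma against reconstructed down-sampled luma. A minimal sketch, assuming α and β are fit over neighboring reconstructed samples (the function names and the DC fallback are illustrative, not from the patent):

```python
def fit_linear_model(luma, chroma):
    """Least-squares fit of chroma = alpha * luma + beta."""
    n = len(luma)
    sum_l = sum(luma)
    sum_c = sum(chroma)
    sum_ll = sum(l * l for l in luma)
    sum_lc = sum(l * c for l, c in zip(luma, chroma))
    denom = n * sum_ll - sum_l * sum_l
    if denom == 0:
        return 0.0, sum_c / n  # flat luma: fall back to a DC offset
    alpha = (n * sum_lc - sum_l * sum_c) / denom
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta

def predict_chroma(down_sampled_luma, alpha, beta):
    """PredC[x,y] = alpha * RecL'[x,y] + beta, applied per sample."""
    return [alpha * l + beta for l in down_sampled_luma]
```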

US Pat. No. 10,715,829

MOVING IMAGE PREDICTION ENCODING/DECODING SYSTEM

NTT DOCOMO, INC., Tokyo ...

1. A video predictive encoding device comprising:a processor;
an input terminal executable by the processor to accept input of a plurality of pictures constituting a video sequence;
a predicted signal generator executable by the processor to encode each of the input pictures by a method of either intra-frame prediction or inter-frame prediction to generate compressed picture data including a random access picture serving as a picture of random access, and the predicted signal generator further executable by the processor to encode data providing display order information of each of the pictures;
a restoration unit executable by the processor to decode the generated compressed picture data to restore a reproduced picture;
a frame memory to store the restored reproduced picture as a reference picture to be used for encoding of a subsequent picture; and
a memory management unit executable by the processor to control the frame memory,
wherein after completion of an encoding process to generate the random access picture, the memory management unit is further executable by the processor to refresh the frame memory by setting every reference picture stored in the frame memory except for the random access picture as “not used as reference pictures”, the frame memory being refreshed immediately before encoding a picture having display order information larger than display order information of the random access picture.

US Pat. No. 10,715,828

IMAGE ENCODING METHOD AND IMAGE DECODING METHOD

KABUSHIKI KAISHA TOSHIBA,...

1. An image decoding apparatus comprising:circuitry configured to:
select one or more available blocks from a plurality of candidate blocks, the available blocks including different motion information, the plurality of candidate blocks comprising a first block, a second block, a third block, and a fourth block, the first block being adjacent to a left of a target block, the second block being adjacent to a top of the target block, the third block being adjacent to an upper right of the target block, and the fourth block being adjacent to an upper left of the target block;
decode, from input encoded data, selection information specifying one of the available blocks, and not decode the selection information when the number of the available blocks is 1;
select, as a selection block, one available block from the available blocks in accordance with the selection information; and
generate a predicted image of the target block using, as motion information corresponding to the target block, motion information corresponding to the selection block,
wherein the circuitry is configured to select the one or more available blocks from the plurality of candidate blocks by performing at least (1) determining whether the first block is available, (2) determining whether the second block is available after (1), (3) determining whether the third block is available after (2), and (4) determining whether the fourth block is available after (3).

US Pat. No. 10,715,827

MULTI-HYPOTHESES MERGE MODE

MEDIATEK INC., Hsinchu (...

1. A method comprising:selecting a first motion predictor and a second motion predictor from a list of candidate motion predictors for a block of pixels;
coding a motion prediction code word that identifies the first and second motion predictors;
generating a first set of predicted pixels that corresponds to a first spatial region within the block according to the first motion predictor;
generating a second set of predicted pixels that corresponds to a second spatial region within the block different from the first spatial region according to the second motion predictor;
computing a combined prediction based on the first set of predicted pixels and the second set of predicted pixels; and
coding the block of pixels according to the combined prediction for motion compensation.
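The combined prediction in the claim above draws each pixel from the predictor of the spatial region it belongs to. A simplified sketch using a hard region partition; the actual scheme may blend the two hypotheses near the region boundary, and the mask representation is an assumption:

```python
def combine_predictions(pred_first, pred_second, in_first_region):
    """Combine two motion-compensated predictions over one block of pixels.

    Each pixel takes the value from the predictor of the spatial region it
    falls in: the first predictor where the mask is True, the second elsewhere.
    """
    return [p1 if in_r1 else p2
            for p1, p2, in_r1 in zip(pred_first, pred_second, in_first_region)]
```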

US Pat. No. 10,715,826

METHOD OF PERFORMING MOTION VECTOR PREDICTION, AND APPARATUS THEREOF

SUN PATENT TRUST, New Yo...

1. An encoding method for encoding a current block of a picture to generate a bitstream, the encoding method comprising:deriving a candidate for a motion vector predictor to encode a current motion vector of the current block, from a first motion vector of a first block which is
(i) a neighboring block that is included in a current picture including the current block and is adjacent to the current block or
(ii) a co-located block included in a picture different from the current picture;
adding the derived candidate to a candidate list;
deriving at least one motion vector predictor based on a candidate selected from the candidate list;
encoding the current motion vector using the derived at least one motion vector predictor; and
encoding the current block using the current motion vector,
wherein the deriving of the candidate includes determining whether to derive the candidate from the first motion vector, based on a type of a current reference picture and a type of a first reference picture, the current reference picture being referred to from the current block using the current motion vector, and the first reference picture being referred to from the first block using the first motion vector,
each of the type of the current reference picture and the type of the first reference picture is one of a long term reference picture and a short term reference picture, and
in the determining of whether to derive the candidate from the first motion vector, the candidate is determined to be derived from the first motion vector when the type of the current reference picture and the type of the first reference picture are the same.
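The reference-type gate in the claim above reduces to a simple equality test between picture types. A minimal sketch (the boolean encoding of "long term" vs. "short term" is an illustrative choice):

```python
def may_derive_candidate(current_ref_is_long_term, first_ref_is_long_term):
    """Per the claim: derive the motion vector predictor candidate from the
    first motion vector only when the current reference picture and the first
    reference picture are of the same type (both long-term or both short-term).
    """
    return current_ref_is_long_term == first_ref_is_long_term
```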

US Pat. No. 10,715,825

METHOD FOR ENCODING AND DECODING IMAGE INFORMATION AND DEVICE USING SAME

LG ELECTRONIC INC., Seou...

1. A decoding method for an inter prediction, the method comprising:receiving, by a decoding apparatus, inter prediction mode information and merge index information for a current block;
determining, by the decoding apparatus, that a merge mode is used as an inter prediction mode for the current block based on the inter prediction mode information;
configuring, by the decoding apparatus, merge candidates including spatial merge candidates and a temporal merge candidate based on spatial neighboring blocks and a temporal neighboring block of the current block, wherein the temporal neighboring block is located in a collocated picture;
selecting, by the decoding apparatus, a merge candidate from the configured merge candidates for the merge mode based on the merge index information;
deriving, by the decoding apparatus, motion information of the current block based on motion information for the selected merge candidate, wherein the motion information of the current block includes a motion vector and a reference picture index; and
generating, by the decoding apparatus, prediction samples of the current block using the motion vector and a reference picture indicated by the reference picture index,
wherein motion information for a spatial merge candidate is derived from motion information of a spatial neighboring block of the current block,
wherein a motion vector for the temporal merge candidate is derived from a motion vector of the temporal neighboring block,
wherein a reference picture index for the temporal merge candidate is set equal to 0 regardless of any of values of reference picture indexes of the spatial neighboring blocks and the temporal neighboring block, and
wherein the spatial neighboring block includes a left neighboring block, a bottom left neighboring block, a top left neighboring block, a top neighboring block and a top right neighboring block of the current block.

US Pat. No. 10,715,824

SYSTEM AND METHOD FOR DATA COMPRESSING OPTICAL SENSOR DATA PRIOR TO TRANSFERRING TO A HOST SYSTEM

Facebook Technologies, LL...

1. A method, comprising:receiving a stream of image data of an eye generated by a sensor;
assigning pixels of the image data to a pupil region of the eye, a glint region of the eye, and a background region by comparing pixel values of the pixels to threshold values;
generating encoded image data by applying an encoding algorithm to the image data for the pixels of the pupil region, the glint region, and the background region, the encoded image data having a smaller data size than the image data, the encoded image data having an encoding format using a sequence of bits for encoding a region, the sequence of bits including either a first plurality of bits to define the region as one of the pupil region or the glint region or a single bit to define the region as the background region, the sequence of bits further including a second plurality of bits to define a run length of pixels assigned to the region; and
transmitting an output stream including the encoded image data to a computing device.
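The encoding format in the claim above can be sketched as a bit-string builder: a single bit marks a background run, a longer prefix marks a pupil or glint run, and a fixed-width field carries the run length. The specific bit patterns and the 8-bit run-length width below are assumptions for illustration; the patent does not fix these values here.

```python
# Assumed prefixes: background = '0' (single bit); pupil = '10'; glint = '11'.
REGION_BITS = {"background": "0", "pupil": "10", "glint": "11"}
RUN_BITS = 8  # assumed width of the run-length field

def encode_runs(runs):
    """Encode a list of (region, run_length) pairs into a bit string."""
    out = []
    for region, length in runs:
        assert 0 < length < (1 << RUN_BITS), "run length must fit the field"
        out.append(REGION_BITS[region])              # region prefix
        out.append(format(length, f"0{RUN_BITS}b"))  # fixed-width run length
    return "".join(out)
```

Encoding runs of identically classified pixels rather than raw samples is what gives the encoded stream its smaller data size relative to the source image data.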

US Pat. No. 10,715,823

METHOD AND APPARATUS FOR EXECUTING DECODING COMMAND, METHOD AND APPARATUS FOR GENERATING DECODING COMMAND, AND RECORDING MEDIUM

SAMSUNG ELECTRONICS CO., ...

1. A method of executing a decoding command, the method comprising:acquiring the decoding command;
obtaining a type code included in the decoding command;
identifying the type code included in the decoding command, wherein the type code indicates one of a plurality of operations including an operation of storing context information of a current encoded symbol, an operation of storing information on an address value at which context information of a symbol following the encoded symbol is stored and an operation of decoding the encoded symbol;
determining an operation corresponding to the identified type code; and
performing the determined operation based on information included in the decoding command,
wherein the determining of the operation comprises comparing a database, in which relationship information between at least one decoding command and a type of operation is prestored, with the acquired decoding command, and
wherein the performing of the operation comprises:
performing the operation of storing context information of the encoded symbol when a result of the comparison indicates that the acquired decoding command is a first decoding command; and
performing the operation of decoding the encoded symbol when the result of the comparison indicates that the acquired decoding command is a second decoding command,
wherein the operation of decoding begins with a leftmost bit (most significant bit, MSB) or a rightmost bit (least significant bit, LSB) in a bitstream, and
wherein the method further comprises determining whether or not a loop count value is identical to a preset loop value.

US Pat. No. 10,715,822

IMAGE ENCODING METHOD AND ENCODER

SZ DJI TECHNOLOGY CO., LT...

1. An image encoding method comprising:determining N P-frames from a sequence of images, N being a positive integer;
for each P-frame, determining a source refreshing region in the P-frames, the source refreshing region being a portion less than a whole region of the P-frame;
obtaining reconstructed images corresponding to the source refreshing regions by performing a first encoding on the source refreshing regions, including:
performing the first encoding on each of the source refreshing regions using a quantization parameter according to the P-frames; and
recording the quantization parameter using a first quantization, the quantization parameter being obtained by adjusting a bit rate required in transmitting the P-frames;
obtaining updated P-frames by updating the source refreshing regions with the reconstructed images; and
performing a second encoding on the updated P-frames.

US Pat. No. 10,715,821

EMBEDDING INFORMATION ABOUT EOB POSITIONS

GOOGLE LLC, Mountain Vie...

1. A method for decoding a transform block of quantized transform coefficients, comprising:decoding, from an encoded bitstream, a predetermined number of coefficients of the quantized transform coefficients, wherein the transform block comprises the predetermined number of coefficients and subsequent quantized transform coefficients;
determining a value for the predetermined number of coefficients;
decoding, from the encoded bitstream, a subsequent quantized transform coefficient of the subsequent quantized transform coefficients; and
determining whether to decode an end-of-block (EOB) indicator based on the value that is determined for the predetermined number of coefficients.
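The decode control flow in the claim above can be sketched as follows. The policy shown for the derived value (read the EOB flag only when some of the first k coefficients are nonzero) is an illustrative assumption, not the patent's actual rule, and the reader callbacks are placeholders for a real bitstream parser:

```python
def decode_coefficients(read_coeff, read_eob, k):
    """Decode k coefficients unconditionally, derive a value from them, decode
    one subsequent coefficient, then conditionally decode an EOB indicator."""
    coeffs = [read_coeff() for _ in range(k)]      # predetermined coefficients
    value = any(c != 0 for c in coeffs)            # derived value (assumed rule)
    coeffs.append(read_coeff())                    # one subsequent coefficient
    eob = read_eob() if value else False           # conditional EOB decode
    return coeffs, eob
```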

US Pat. No. 10,715,820

INTERFRAME PREDICTIVE CODING METHOD AND DEVICE

1. An inter-frame predictive encoding method, wherein the method comprises:dividing a frame to be encoded into a plurality of blocks to be encoded;
determining, for each block to be encoded, a forward encoding block and a backward encoding block corresponding to the block to be encoded;
determining a first weighting parameter corresponding to the forward encoding block;
determining a fourth weighting parameter corresponding to the backward encoding block according to the first weighting parameter and the following formula: fourth weighting parameter = 1 − first weighting parameter;
determining an overall brightness-based second weighting parameter of a reference image containing the forward encoding block, and determining an overall brightness-based third weighting parameter of a reference image containing the backward encoding block;
determining predictive actual weighting parameters of the block to be encoded corresponding to the forward encoding block and the backward encoding block according to the first weighting parameter, the second weighting parameter, the third weighting parameter and the fourth weighting parameter; and
performing predictive encoding on the block to be encoded by the predictive actual weighting parameter;
wherein, determining an overall brightness-based weighting parameter of a reference image comprises:
determining the overall brightness-based weighting parameter of the reference image according to the image to be encoded and the reference image through minimal residual technique, comprising
performing a down-sampling on the image to be encoded and the reference image;
obtaining a weighted reference image with a brightness weighting parameter;
dividing the down-sampled to-be-encoded image into blocks;
obtaining a residual between each of the blocks with a matching block in the weighted reference image; and
selecting, from brightness weighting parameters, one that produces a minimum sum of the squares of the obtained residuals as the brightness weighting parameter of the reference image.
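The weighting relationship and the minimal-residual selection in the claim above can be sketched directly. The complementary weights (backward = 1 − forward) come straight from the claim; the candidate-weight search below is a simplified stand-in for the down-sampled block-matching procedure, and all names are illustrative:

```python
def bidirectional_prediction(forward_block, backward_block, w_forward):
    """Weighted bi-prediction: the backward weight is 1 - forward weight,
    so the two contributions always sum to unity."""
    w_backward = 1.0 - w_forward
    return [w_forward * f + w_backward * b
            for f, b in zip(forward_block, backward_block)]

def pick_brightness_weight(blocks, matched_blocks, candidate_weights):
    """Minimal-residual sketch: choose the brightness weighting parameter
    that minimizes the sum of squared residuals between the blocks to be
    encoded and their weighted matching blocks in the reference image."""
    def sum_squared_residuals(w):
        return sum((p - w * q) ** 2
                   for blk, ref in zip(blocks, matched_blocks)
                   for p, q in zip(blk, ref))
    return min(candidate_weights, key=sum_squared_residuals)
```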

US Pat. No. 10,715,819

METHOD AND APPARATUS FOR REDUCING FLICKER

Canon Kabushiki Kaisha, ...

1. An image encoding apparatus that divides an image in a video into blocks made up of a plurality of pixels, and encodes the blocks, the image encoding apparatus comprising:a first motion vector computation unit configured to compute a first motion vector of a block to be encoded in an image; and
a motion vector replacement unit configured to replace the first motion vector computed in the first motion vector computation unit with a second motion vector when a condition is satisfied,
wherein the condition comprises (1) at least one of the following conditions: (i) an absolute value of the first motion vector is greater than a first threshold, and (ii) an absolute value of a difference between the first motion vector and the second motion vector is greater than a second threshold, and (2) a condition where encoding cost corresponding to the block to be encoded if the second motion vector is used is less than a third threshold.
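The two-part replacement condition in the claim above can be expressed as a single boolean check. A minimal sketch; the argument names and thresholds are placeholders, and how the encoding cost is computed is outside this fragment:

```python
def should_replace_mv(abs_mv, abs_mv_diff, cost_with_second_mv,
                      t_magnitude, t_difference, t_cost):
    """Replace the first motion vector with the second when:
    (1) the first MV's magnitude exceeds a first threshold, OR the magnitude
        of the difference between the two MVs exceeds a second threshold; AND
    (2) the encoding cost using the second MV is below a third threshold.
    """
    condition_1 = abs_mv > t_magnitude or abs_mv_diff > t_difference
    condition_2 = cost_with_second_mv < t_cost
    return condition_1 and condition_2
```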

US Pat. No. 10,715,818

TECHNIQUES FOR HARDWARE VIDEO ENCODING

Intel Corporation, Santa...

1. A system for video encoding, comprising:an encoder, wherein the encoder comprises a plurality of fixed function hardware units comprising at least a hierarchical motion estimation unit, an integer motion estimation unit, and a fractional motion estimation unit, wherein when the encoder is to execute the plurality of fixed function hardware units, the plurality of fixed function hardware units is operable to:
execute a multiple reference hierarchical motion estimation search on a plurality of reference frames;
execute a first integer motion estimation search based on at least the results of the multiple reference hierarchical motion estimation search to obtain a precise motion vector and a second integer motion estimation search using a pseudo predicted motion vector and the precise motion vector;
partition a current macroblock based on the second integer motion estimation search; and
execute a fractional motion estimation search based on the current macroblock partitioning to determine a best inter-predicted macroblock coding decision;
a memory that is to store instructions and that is communicatively coupled to the encoder; and
a processor communicatively coupled to the encoder and the memory, wherein when the processor is to execute the instructions, the processor is to:
compare the best inter-predicted macroblock coding decision, an intra-predicted macroblock coding decision, and a skip to determine a final macroblock coding decision, wherein the intra-predicted macroblock coding decision is derived from a spatial neighboring macroblock and the skip is based on a predicted motion vector.

US Pat. No. 10,715,817

APPARATUS AND METHOD FOR ENHANCING MOTION ESTIMATION BASED ON USER INPUT

NVIDIA CORPORATION, Sant...

1. A method comprising:receiving input data representative of a user based input using an input device;
determining, based at least in part on the input data, that the user based input is associated with altering data rendered for display;
based at least in part on the user based input being associated with altering the data, analyzing the input data to determine a type of input device and an input value corresponding to the type of input device;
converting the input value to a device agnostic input value;
associating the device agnostic input value with a target frame of a video stream;
computing, based at least in part on the device agnostic input value, displacement coordinates for a current block within the target frame of the video stream with respect to a reference frame of the video stream;
executing a search, within the reference frame, for a best match block corresponding to the current block within the target frame, the search beginning at an initial search location determined using the displacement coordinates; and
responsive to identifying the best match block, determining a prediction error based at least in part on computing a difference between the current block within the target frame and the best match block within the reference frame.

US Pat. No. 10,715,816

ADAPTIVE CHROMA DOWNSAMPLING AND COLOR SPACE CONVERSION TECHNIQUES

Apple Inc., Cupertino, C...

1. An image conversion method, comprising:determining values for Cb and Cr for image data according to a transform from the image data's source color space to a Y′CrCb color space;
producing a reconstructed Cb* value and a reconstructed Cr* value by processing the Cb and Cr values according to a lossy process, then inverting the lossy process;
determining a plurality of candidate Y′ values from the image data in the source color space and the Cb* and Cr* values;
deriving a final Y′ value from the plurality of candidate Y′ values; and
outputting the derived final Y′ value, Cb value, and Cr value as converted image data.

US Pat. No. 10,715,815

METHOD AND APPARATUS FOR REAL-TIME SAO PARAMETER ESTIMATION

TEXAS INSTRUMENTS INCORPO...

1. A method comprising:receiving a first pixel and a second pixel;
receiving a first deblocked pixel of the first pixel and a second deblocked pixel of the second pixel;
categorizing the first deblocked pixel in a band category;
categorizing the second deblocked pixel in an edge category;
estimating a first error based on a difference between the first deblocked pixel and the first pixel;
estimating a second error based on a difference between the second deblocked pixel and the second pixel;
determining a first candidate offset with the band category;
determining a second candidate offset with the edge category;
estimating a first rate distortion (RD) cost associated with the first candidate offset; and
estimating a second RD cost associated with the second candidate offset.
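The per-category offset and cost estimation in the claim above can be sketched with two small helpers. Using the mean signed error as the candidate offset and a distortion-plus-lambda-times-rate cost is a common estimator pattern, assumed here for illustration; the patent's exact derivations may differ:

```python
def estimate_offset(deblocked, original):
    """Candidate SAO offset for one category: the mean signed error between
    the original samples and the deblocked samples in that category."""
    errors = [o - d for o, d in zip(original, deblocked)]
    return round(sum(errors) / len(errors))

def rd_cost(deblocked, original, offset, lam, rate_bits):
    """Rate-distortion cost for a candidate offset: sum of squared errors
    after applying the offset, plus lambda times the signaling rate."""
    distortion = sum((o - (d + offset)) ** 2
                     for o, d in zip(original, deblocked))
    return distortion + lam * rate_bits
```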

US Pat. No. 10,715,814

TECHNIQUES FOR OPTIMIZING ENCODING PARAMETERS FOR DIFFERENT SHOT SEQUENCES

NETFLIX, INC., Los Gatos...

1. A computer-implemented method, comprising:partitioning a source video sequence associated with a media title into a plurality of subsequences that includes a first subsequence and a second subsequence, wherein the first subsequence and the second subsequence are associated with different periods of time;
generating a first encoded subsequence based on the first subsequence and a first value of an encoding parameter;
generating a second encoded subsequence based on the second subsequence and a second value for the encoding parameter, wherein the second value of the encoding parameter is not equal to the first value of the encoding parameter; and
aggregating the first encoded subsequence and the second encoded subsequence to generate a first encoded video sequence,
wherein at least a portion of the first encoded video sequence is subsequently streamed to an endpoint device during a playback of the media title.

US Pat. No. 10,715,813

METHOD AND APPARATUS FOR PERFORMING BLOCK PREDICTION SEARCH BASED ON RESTORED SAMPLE VALUES DERIVED FROM STORED SAMPLE VALUES IN DATA BUFFER

MEDIATEK INC., Hsin-Chu ...

1. A block prediction search method, comprising:encoding or decoding a first pixel line to generate sample values of a plurality of samples in the first pixel line;
generating a bit-depth reduced sample value by sampling each of the sample values of the samples in the first pixel line according to a difference between a bit depth of said each of the sample values and a buffer bit depth of a data buffer, wherein the bit depth of said each of the sample values is larger than the buffer bit depth of the data buffer, and a bit depth of the bit-depth reduced sample value is smaller than the bit depth of said each of the sample values;
storing bit-depth reduced sample values of the samples in the first pixel line into the data buffer;
obtaining restored sample values by reading the data buffer, wherein the restored sample values are derived from stored bit-depth reduced sample values in the data buffer, and a bit depth of each of the restored sample values is not smaller than a bit depth of each of the stored bit-depth reduced sample values;
detecting occurrence of an edge in the first pixel line according to the restored sample values;
determining a block prediction vector of a pixel group in a second pixel line different from the first pixel line based at least partly on a last edge count value indicative of a number of samples in the first pixel line that have gone by since the edge occurs, wherein determining the block prediction vector of the pixel group in the second pixel line comprises:
generating a comparison result by comparing a first set of N consecutive samples in the first pixel line with each of second sets of N consecutive samples in the first pixel line, wherein N is a positive integer, a rightmost sample of the first set of N consecutive samples in the first pixel line and a rightmost sample of the pixel group in the second pixel line have a same column position in an image, and rightmost samples of the first set of N consecutive samples and the second sets of N consecutive samples are different samples in the first pixel line;
referring to the comparison result to select one of the second sets of N consecutive samples;
utilizing a position offset between the rightmost sample of the first set of N consecutive samples and a rightmost sample of said one of the second sets of N consecutive samples as the block prediction vector of the pixel group; and
checking at least the last edge count value to determine whether or not the block prediction vector is selected as a final block prediction vector of the pixel group in the second pixel line; and
when the block prediction vector is the final block prediction vector of the pixel group in the second pixel line, encoding or decoding the pixel group in the second pixel line according to the final block prediction vector.

US Pat. No. 10,715,812

METHOD AND APPARATUS FOR VIDEO CODING

Tencent America LLC, Pal...

7. An apparatus for video decoding, comprising:processing circuitry configured to:
decode prediction information for a block in a current picture from a coded video bitstream, the prediction information being indicative of an inter prediction mode that signals one or more offsets for each base motion vector predictor candidate in a candidate list, each base motion vector predictor candidate associated with a respective coverage window different from coverage windows of other base motion vector predictor candidates, the coverage window of each base motion vector predictor candidate is defined by a rectangular area that covers an X and Y axis area defined by the one or more offsets of the respective base motion vector predictor candidate;
determine whether a new motion vector predictor satisfies a spacing requirement that specifies that a coverage window of the new motion vector predictor does not overlap with the respective coverage window of each base motion vector predictor candidate in the candidate list;
add the new motion vector predictor into the candidate list in response to a determination that the new motion vector predictor satisfies the spacing requirement;
decode an index of a specific base motion vector predictor candidate in the candidate list and the offset associated with the specific base motion vector predictor candidate to determine a final motion vector for block reconstruction; and
reconstruct at least a sample of the block according to the final motion vector.
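The spacing requirement in the claim above reduces to a rectangle-disjointness test: a new motion vector predictor joins the candidate list only if its coverage window (the predictor position extended by the signaled X and Y offsets) overlaps no existing candidate's window. A hedged sketch; all names and the shared-offset simplification are illustrative.

```python
def windows_overlap(w1, w2):
    """Axis-aligned rectangle overlap; each window is (x_min, y_min, x_max, y_max)."""
    return not (w1[2] < w2[0] or w2[2] < w1[0] or
                w1[3] < w2[1] or w2[3] < w1[1])

def coverage_window(mv, off_x, off_y):
    """Rectangular area covering the X/Y offset range around predictor mv."""
    x, y = mv
    return (x - off_x, y - off_y, x + off_x, y + off_y)

def maybe_add_candidate(candidates, new_mv, off_x, off_y):
    """Append new_mv only if its window is disjoint from every
    existing candidate's window (the spacing requirement)."""
    new_win = coverage_window(new_mv, off_x, off_y)
    if all(not windows_overlap(new_win, coverage_window(mv, off_x, off_y))
           for mv in candidates):
        candidates.append(new_mv)
    return candidates
```

With offsets of 2 in each direction, a predictor at (1, 1) is rejected against an existing candidate at (0, 0), while one at (5, 5) is accepted.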

US Pat. No. 10,715,811

METHOD AND APPARATUS FOR DETERMINING MERGE MODE

SAMSUNG ELECTRONICS CO., ...

1. A method of determining a merge mode, the method implemented by at least one processor, comprising:determining, by the at least one processor, at least one first merge candidate to be used in a first merge mode of a coding unit of a first depth from among previous prediction units that are spatially and temporally associated with the coding unit of the first depth;
obtaining a first cost of encoding the coding unit of the first depth according to the first merge mode by using motion information of the at least one first merge candidate;
obtaining coding units of a second depth by splitting the coding unit of the first depth;
determining at least one second merge candidate to be used in a second merge mode of the coding units of the second depth from among previous prediction units that are spatially and temporally associated with one of the coding units of the second depth;
obtaining second costs of encoding the coding units of the second depth according to the second merge mode by using the at least one second merge candidate based on a partial cost of the first cost of encoding the coding unit of the first depth according to the first merge mode obtained by using motion information of the at least one first merge candidate corresponding to the at least one second merge candidate;
comparing the first cost with a sum of the second costs;
determining a merge mode, having a smaller cost based on a result of the comparison, from among the first merge mode and the second merge mode; and
outputting information of the merge mode,
wherein the obtaining the second costs comprises:
when there exists an area of the at least one first merge candidate of the coding unit of the first depth, the area being equal to the at least one second merge candidate, and the at least one first merge candidate of the first depth and the at least one second merge candidate of the second depth are a same prediction unit having same motion information, obtaining the partial cost of the first cost of encoding the coding unit of the first depth according to the first merge mode by using the area of the at least one first merge candidate of the coding unit of the first depth; and
determining the partial cost of the first cost of encoding the coding unit of the first depth according to the first merge mode as one of the second costs of the second merge mode of a coding unit of the second depth based on an area of the coding unit of the second depth while skipping an operation of calculating the one of the second costs of encoding each of the coding units of the second depth according to the second merge mode by using the motion information of the at least one second merge candidate.
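The decision in the claim above amounts to: cost the coding unit once at the first depth, reuse the applicable partial cost for any second-depth sub-unit whose merge candidate is the same prediction unit with the same motion information (skipping recomputation), then keep whichever mode is cheaper. A minimal sketch under assumed names and cost model:

```python
def second_cost(sub_unit, shared_partial_costs, compute_cost):
    """Reuse the partial first-depth cost when this sub-unit's merge
    candidate matches the first-depth candidate; otherwise compute it."""
    if sub_unit in shared_partial_costs:
        return shared_partial_costs[sub_unit]   # skip the recalculation
    return compute_cost(sub_unit)

def choose_merge_mode(first_cost, second_costs):
    """Compare the first-depth cost with the sum of second-depth costs
    and return the cheaper mode and its cost."""
    total_second = sum(second_costs)
    if first_cost <= total_second:
        return ("first", first_cost)
    return ("second", total_second)
```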

US Pat. No. 10,715,810

SIMPLIFIED LOCAL ILLUMINATION COMPENSATION

Qualcomm Incorporated, S...

1. A method of processing video data, the method comprising:determining, by processing circuitry, a plurality of neighboring samples for predicting a current block, wherein the plurality of neighboring samples are arranged outside of a region of a current picture, the region comprising the current block, a row of samples adjacent to a top row of the current block, and a column of samples adjacent to a left column of the current block;
deriving, by the processing circuitry, local illumination compensation information for the current block using the plurality of neighboring samples, wherein deriving the local illumination compensation information comprises calculating a and/or b,
and wherein Rec_neig denotes the plurality of neighboring samples, wherein Rec_refneig denotes a plurality of reference neighboring samples, and wherein N denotes a pixel number in Rec_neig and Rec_refneig; and
generating, by the processing circuitry, a prediction block using the local illumination compensation information.
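The claim's formulas for the scale a and offset b are not reproduced in the text above. The derivation commonly used for local illumination compensation is a least-squares fit of the current block's neighboring samples Rec_neig against the reference neighboring samples Rec_refneig; the sketch below shows that standard form, which may differ from the patent's exact expressions.

```python
def lic_params(rec_neig, rec_refneig):
    """Least-squares scale a and offset b such that
    rec_neig ~= a * rec_refneig + b (standard LIC-style derivation)."""
    n = len(rec_neig)
    sx = sum(rec_refneig)
    sy = sum(rec_neig)
    sxx = sum(x * x for x in rec_refneig)
    sxy = sum(x * y for x, y in zip(rec_refneig, rec_neig))
    denom = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / denom if denom else 1.0   # fall back to identity scale
    b = (sy - a * sx) / n
    return a, b

# Usage: neighbors related by y = 2x + 1 recover a = 2, b = 1.
a, b = lic_params([3, 5, 7, 9], [1, 2, 3, 4])
```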

US Pat. No. 10,715,809

ENTROPY CODING SUPPORTING MODE SWITCHING

GE VIDEO COMPRESSION, LLC...

1. A decoder for decoding a data stream including encoded data of a video, the decoder comprising:an entropy decoding engine configured to decode data from the data stream based on an entropy decoding scheme of a plurality of entropy decoding schemes to obtain a sequence of symbols, wherein the plurality of entropy decoding schemes includes a context adaptive binary arithmetic coding scheme and wherein, with respect to at least one symbol of the sequence of symbols, the entropy decoding engine is configured to:
select a context corresponding to the at least one symbol, and
decode the at least one symbol using the selected context based on the entropy decoding scheme, wherein the entropy decoding includes updating a probability model associated with the selected context at one of a first update rate and a second update rate, which is lower than the first update rate, wherein the first update rate is associated with a high-efficiency mode of entropy decoding and the second update rate is associated with a low-complexity mode of entropy decoding;
a desymbolizer configured to desymbolize the sequence of symbols to obtain a sequence of syntax elements; and
a reconstructor configured to reconstruct at least a portion of the video based on the sequence of syntax elements.
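The dual update rate in the claim above can be illustrated as one context model updated with two different step sizes: a larger step (first rate) for the high-efficiency mode and a smaller step (second, lower rate) for the low-complexity mode. The exponential-moving-average form and the shift values below are assumptions for illustration, not the patent's scheme.

```python
def update_probability(p, bit, high_efficiency):
    """Move probability p (of bit == 1) toward the observed bit.
    A larger shift means a smaller step, i.e. a lower update rate."""
    shift = 4 if high_efficiency else 6      # 1/16 step vs 1/64 step (assumed)
    target = 1.0 if bit else 0.0
    return p + (target - p) / (1 << shift)
```

Starting from p = 0.5 and observing a 1-bit, the high-efficiency mode moves the model further (to 0.53125) than the low-complexity mode (to 0.5078125).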

US Pat. No. 10,715,808

IMAGE CODING APPARATUS FOR CODING TILE BOUNDARIES

SUN PATENT TRUST, New Yo...

1. Circuitry executing operations, the operations comprising:dividing a picture into tiles;
coding the tiles to generate pieces of coded data, each of which corresponds to a different one of the tiles; and
generating a bitstream including the pieces of coded data,
wherein the coding of the tiles includes:
generating a first code string by:
coding a first tile of the tiles with reference to coding information of an already-coded tile neighboring the first tile when a boundary between the first tile and the already-coded tile is a first boundary; and
coding the first tile without reference to the coding information of the already-coded tile when the boundary between the first tile and the already-coded tile is a second boundary, and
wherein, in the generating of the bitstream, the bitstream is generated to include tile boundary independence information, the tile boundary independence information indicating whether each boundary between the tiles is one of the first boundary and the second boundary.

US Pat. No. 10,715,807

METHOD AND APPARATUS FOR PYRAMID VECTOR QUANTIZATION INDEXING AND DE-INDEXING OF AUDIO/VIDEO SAMPLE VECTORS

TELEFONAKTIEBOLAGET LM ER...

1. A method for pyramid vector quantization indexing of coefficients derived from an audio or video signal, wherein the method comprises:obtaining an input vector having a plural number of the coefficients;
extracting a leading sign from the input vector, the leading sign being a sign of a non-zero coefficient in the input vector;
indexing the input vector with a modified pyramid vector quantization enumeration scheme into an output index, which output index together with the leading sign represent the coefficients;
said modified pyramid vector quantization enumeration scheme neglecting said leading sign when performing the indexing of the input vector into the output index; and
outputting the leading sign and the output index as codewords in an outgoing bit stream.
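The leading-sign step above can be sketched in a few lines: the sign of the first non-zero coefficient is extracted, and the vector handed to the enumeration scheme carries no information about that sign (here, by negating the whole vector when the leading sign is negative). The enumeration itself is omitted; the function name and normalization choice are assumptions.

```python
def extract_leading_sign(vec):
    """Return (leading_sign, normalized_vec): leading_sign is +1 or -1,
    and normalized_vec always has a non-negative first non-zero entry,
    so indexing it neglects the leading sign."""
    for x in vec:
        if x != 0:
            sign = 1 if x > 0 else -1
            if sign < 0:
                vec = [-v for v in vec]
            return sign, list(vec)
    return 1, list(vec)   # all-zero vector: the sign is immaterial
```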

US Pat. No. 10,715,806

SYSTEMS, METHODS, AND MEDIA FOR TRANSCODING VIDEO DATA

DIVX, LLC, San Diego, CA...

1. A method for transcoding a source video file into a set of multiple alternate video streams, the method comprising:generating, at a computer system configured as a media metadata generation device, media metadata related to the source video file prior to decoding, during a transcoding of, at least a portion of the source video file, where the media metadata comprises scene complexity information;
providing information based on the media metadata from the computer system to a plurality of transcoding devices; and
performing the following at each of the plurality of transcoding devices in parallel:
receiving the at least a portion of the source video file, including a first plurality of encoded images encoded according to a source format, from a media content source;
decoding the at least a portion of the source video file based on the source format to generate a decoded portion of video including a plurality of decoded images;
receiving the information based on the media metadata from the computer system; and
encoding the plurality of decoded images of the decoded portion of video into an alternate video stream including a second plurality of encoded images based on a target format and the information based on the media metadata, the alternate video stream being one of the set of multiple alternate video streams.

US Pat. No. 10,715,805

METHOD AND DEVICE FOR SUBBAND CODING FREQUENCY CONVERSION UNIT, AND METHOD AND DEVICE FOR IMAGE ENCODING/DECODING USING SAME

SK TELECOM CO., LTD., Se...

1. A method performed by an apparatus for decoding a current block corresponding to a two dimensional frequency conversion unit having a size larger than or equal to an 8×8 pixel size, the method comprising:generating a predicted block by predicting the current block;
reconstructing frequency coefficients of the frequency conversion unit by decoding a bitstream, to generate a frequency conversion block having a size of the frequency conversion unit, wherein the frequency conversion unit is a unit of transforming residual signals that are differences between original pixel values and predicted pixel values, and the frequency coefficients are values generated by transform of the residual signals from a spatial domain into a frequency domain;
inversely transforming the frequency coefficients in the frequency conversion block from the frequency domain into the spatial domain by using a transform size identical to the size of the frequency conversion unit, to reconstruct a residual block having the residual signals; and
adding the reconstructed residual block to the predicted block, to thereby reconstruct the current block,
wherein the frequency conversion block is generated by reconstructing and inversely scanning the frequency coefficients of the frequency conversion unit in the unit of two dimensional subblocks divided from the frequency conversion unit, all of the subblocks divided from the frequency conversion unit being equal-sized square blocks which have a 4×4 pixel size regardless of the size of the frequency conversion unit,
wherein the reconstructing and inversely scanning of the frequency coefficients in the unit of the subblocks comprises:
decoding, from the bitstream, subblock encoding information indicating whether a subblock of the 4×4 pixel size in the frequency conversion unit corresponding thereto has at least one non-zero frequency coefficient, wherein
a value of the decoded subblock encoding information is 0 when the subblock does not have at least one non-zero frequency coefficient, and
the value of the decoded subblock encoding information is 1 when the subblock has at least one non-zero frequency coefficient;
reconstructing from the bitstream and inversely scanning frequency coefficients corresponding to the subblock, when the decoded subblock encoding information indicates that the subblock has at least one non-zero frequency coefficient; and
setting all frequency coefficients corresponding to the subblock to 0, when the decoded subblock encoding information indicates that the subblock does not have at least one non-zero frequency coefficient,
wherein each of the subblocks includes frequency coefficients corresponding to a frequency range, the frequency range of each of the subblocks being different from each other,
wherein the frequency coefficients which are located at different positions in each of the subblocks are values in the frequency domain which correspond to different frequencies.
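The per-subblock reconstruction above can be sketched as: one flag per 4×4 subblock says whether any non-zero coefficient is present; significant subblocks are reconstructed from the bitstream, and the rest are filled with zeros. Bitstream reading is stubbed with an iterator, and the names are illustrative.

```python
def decode_subblocks(flags, coeff_source, num_subblocks):
    """flags[i] is the decoded subblock encoding information (0 or 1);
    coeff_source yields 16-coefficient lists for significant subblocks.
    Returns one 16-coefficient list per 4x4 subblock."""
    blocks = []
    for i in range(num_subblocks):
        if flags[i] == 1:
            # reconstruct from the bitstream and inversely scan
            blocks.append(next(coeff_source))
        else:
            # subblock has no non-zero coefficient: set all to 0
            blocks.append([0] * 16)
    return blocks
```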

US Pat. No. 10,715,804

ENCODING APPARATUS AND ENCODING METHOD AS WELL AS DECODING APPARATUS AND DECODING METHOD

SONY CORPORATION, Tokyo ...

1. An encoding apparatus, comprising:an encoding unit configured to encode an input image by a non-reversible encoding method, wherein encoded data is obtained by encoding the input image;
a database configured to:
register a plurality of texture components; and
register a plurality of bases of the plurality of texture components, wherein the plurality of bases is obtained based on conversion of each texture component of the plurality of texture components into a basis;
a decoding unit configured to decode the encoded data into a decoded image;
a basis synthesis unit configured to generate a restoration component for each texture component of the plurality of texture components registered in the database, wherein
the restoration component restores a texture component of the input image, and
the restoration component is generated based on a basis synthesis in which the decoded image and a basis of a texture component of the plurality of texture components are used;
a match component determination unit configured to determine, as a match component, the restoration component whose error with respect to the input image is minimum from among a plurality of restoration components generated for the plurality of texture components registered in the database; and
a transmission unit configured to transmit identification information and the encoded data, wherein
the identification information is for identification of the match component from among the plurality of texture components registered in the database, and
the match component is a texture component that matches with the input image.

US Pat. No. 10,715,802

METHOD FOR ENCODING/DECODING VIDEO SIGNAL BY USING SINGLE OPTIMIZED GRAPH

LG ELECTRONICS INC., Seo...

1. A method of encoding a video signal using a single optimized graph, comprising:obtaining a residual block;
generating graphs from the residual block;
generating an optimal graph and an optimal transform by combining the graphs, wherein the graphs are combined based on an optimization process, wherein the optimal graph is determined based on an intra prediction mode used in the residual block; and
performing a transform for the residual block based on the optimal graph and the optimal transform,
wherein the optimization process comprises:
determining a common graph transform indicative of an optimal graph transform for a given set of graph Laplacian matrices;
determining a common graph frequency based on the common graph transform, wherein the common graph frequency indicates a function of common graph transform frequencies obtained with respect to the given set of graph Laplacian matrices, and corresponds to a diagonal matrix Λ = diag([0, λ2, . . . , λd]) where λj is the (j, j)-th entry of Λ; and
determining an optimal graph Laplacian matrix using a maximum likelihood function based on the common graph transform and the common graph frequency.

US Pat. No. 10,715,801

METHOD FOR PALETTE TABLE INITIALIZATION AND MANAGEMENT

HFI Innovation Inc., Zhu...

1. A method of palette management for palette coding in a video coding system, the method comprising:receiving input data associated with a current block in a high-level picture structure, wherein the high-level picture structure corresponds to a slice, tile, coding tree unit (CTU) row, or wavefront structure associated with wavefront parallel processing (WPP), sequence, or picture;
performing initialization of a palette predictor for the high-level picture structure to include one or more initial color entries in the palette predictor, the initialization of the palette predictor being performed only once for the high-level picture structure at a beginning of coding the high-level picture structure;
when the current block is coded according to a palette mode:
applying the palette coding to the current block according to a current palette, and
updating the palette predictor for the high-level picture structure based on the current palette to become an updated palette predictor for the high-level picture structure usable for a next block that is coded according to the palette mode; and
when the current block is coded according to a non-palette mode, keeping the palette predictor for the high-level picture structure unchanged and usable for the next block that is coded according to the palette mode,
wherein
the initialization of the palette predictor uses initialization values that include zero, a mid-level value, or a derived value, the derived value being determined according to brightness or hue associated with pixels of the high-level picture structure, and
the performing the initialization of the palette predictor includes:
determining whether the pixels of the high-level picture structure are coded using a YUV color format; and
when the pixels of the high-level picture structure are determined to be coded using the YUV format, setting an initial color entry of the one or more initial color entries to
the mid-level value for a U component of the initial color entry, and
the mid-level value for a V component of the initial color entry.
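For 8-bit YUV content, the initialization above sets the U and V components of each initial entry to the mid-level value (128 for 8-bit samples). A minimal sketch; the Y initialization value of zero is an assumption here, since the claim text only fixes U and V.

```python
def init_palette_predictor(num_entries, bit_depth=8, yuv=True):
    """Initialize palette predictor entries once, at the beginning of
    coding the high-level picture structure."""
    mid = 1 << (bit_depth - 1)                 # mid-level value, e.g. 128 at 8 bits
    if yuv:
        # Y component zero (assumed); U and V set to the mid-level value
        return [(0, mid, mid) for _ in range(num_entries)]
    return [(0, 0, 0) for _ in range(num_entries)]
```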

US Pat. No. 10,715,800

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

SONY CORPORATION, Tokyo ...

1. An image processing apparatus, comprising:an encoding section, in a case where a prediction mode of a luminance component of an image is an intra BC prediction mode, encoding information indicating a prediction mode of a color component of the image by using, as a context, that the prediction mode of the luminance component is the intra BC prediction mode,
wherein in a case where it is used as the context that the prediction mode of the luminance component is the intra BC prediction mode, the encoding section encodes the information indicating the prediction mode of the color component in such a way that in a case where the prediction mode of the color component is the intra BC prediction mode, a compression rate becomes high, and
wherein the encoding section is implemented via at least one processor.

US Pat. No. 10,715,799

METHODS AND SYSTEMS FOR IMAGE INTRA-PREDICTION MODE MANAGEMENT

Dolby Laboratories Licens...

1. A method for decoding a digital image, the method comprising:decoding each of blocks into which an image is divided;
performing intra-prediction for predicting values of pixels located along a specified direction in a target block to be decoded;
estimating a prediction mode for the target block;
obtaining first information indicating whether the estimated prediction mode is to be selected as an intra prediction mode of the target block; and
responsive to determining that the first information indicates that the estimated prediction mode is not to be selected as the intra prediction mode of the target block, obtaining second information indicating the intra prediction mode for the target block;
wherein
the intra-prediction is based on at least a vertical prediction mode using a prediction value being a pixel value of a first block located adjacent to and above the target block, a horizontal prediction mode using a prediction value being a pixel value of a second block located adjacent to the left side of the target block, a DC prediction mode using a prediction value being an average of the pixel values of the first block located adjacent to and above the target block and the second block located adjacent to and to the left side of the target block, a Diagonal Down/Left prediction mode using the specified direction being diagonally downward to the left at approximately a 45 degree angle, a Diagonal Down/Right prediction mode using the specified direction being diagonally downward to the right at approximately a 45 degree angle as the prediction mode,
the prediction modes are numbered with increasingly larger numbers, in a predetermined order of the vertical prediction mode, the horizontal prediction mode, the DC prediction mode, the Diagonal Down/Left prediction mode and the Diagonal Down/Right prediction mode,
an estimated prediction mode is determined to have the lower mode number among the prediction mode of the first block located adjacent to and above the target block and the prediction mode of the second block located adjacent to the left side of the target block as the prediction mode for the target block, wherein a mode number associated with each of the prediction mode of the first block and the prediction mode of the second block is in accordance with the predetermined order of prediction modes.
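The estimation rule above fits in a few lines: the five modes are numbered in the fixed order vertical, horizontal, DC, diagonal down/left, diagonal down/right, and the estimated mode for the target block is the lower mode number of the block above and the block to the left. The constant names are illustrative.

```python
# Mode numbering in the predetermined order stated in the claim.
VERTICAL, HORIZONTAL, DC, DIAG_DOWN_LEFT, DIAG_DOWN_RIGHT = range(5)

def estimate_prediction_mode(mode_above, mode_left):
    """Estimated mode = the lower mode number of the two neighbors."""
    return min(mode_above, mode_left)
```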

US Pat. No. 10,715,797

DEVICE FOR TESTING ANGLE OF VIEW OF CAMERA

LG INNOTEK CO., LTD., Se...

1. A device for testing an angle of view of a camera, the device comprising:an image sensor;
a first light source disposed over the image sensor, wherein the first light source comprises a first side having a first width and extending in a first lengthwise direction, and a second side having a second width extending in a second lengthwise direction different from the first lengthwise direction, and wherein the second width is different from the first width; and
a second light source and a third light source that are respectively disposed on opposite sides, in the first lengthwise direction, of the image sensor.

US Pat. No. 10,715,796

METHOD AND DEVICE FOR MONITORING AN IMAGE SENSOR

Robert Bosch GmbH, Stutt...

1. A method for monitoring an image sensor, the image sensor having a communication interface configured for communicating with an external monitoring unit, the method comprising:reading-in a request signal, generated by the monitoring unit, via the communication interface;
modifying a test pattern, generated by the image sensor, using the request signal, to obtain a modified test pattern; and
outputting the modified test pattern to the communication interface, to have the modified test pattern evaluated by the monitoring unit;
wherein using the request signal, at least one further reply signal is output in the outputting step to have the further reply signal evaluated by the monitoring unit.

US Pat. No. 10,715,795

METHOD FOR DETERMINING A DIAGNOSTIC CONDITION OF A VEHICULAR VIDEO CONNECTION

MAGNA ELECTRONICS INC., ...

1. A method for determining a diagnostic condition of a vehicular video connection, the method comprising:providing a video driver, a video cable and a video driver power supply;
providing a current sensor and a microcontroller;
powering the video driver via the video driver power supply;
sensing, via the current sensor, current flowing to the video driver from the video driver power supply;
determining, via the microcontroller, a current level sensed by the current sensor during operation of the video driver; and
determining an open circuit condition when the determined current level of the current flowing into the video driver falls below a predetermined quantity.
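The diagnostic above reduces to a threshold test on the sensed supply current: when the current flowing into the video driver falls below the predetermined quantity, the connection is reported as open-circuit. The threshold value and units below are illustrative assumptions.

```python
OPEN_CIRCUIT_THRESHOLD_MA = 5.0   # assumed "predetermined quantity", in mA

def diagnose_video_connection(sensed_current_ma):
    """Classify the vehicular video connection from the current level
    sensed by the current sensor during driver operation."""
    return "open" if sensed_current_ma < OPEN_CIRCUIT_THRESHOLD_MA else "ok"
```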

US Pat. No. 10,715,794

METHOD AND SYSTEM FOR TRACKING EYE MOVEMENT IN CONJUNCTION WITH A LIGHT SCANNING PROJECTOR

Magic Leap, Inc., Planta...

1. An eye tracking system comprising:a pair of glasses including two frames;
a light scanning projector coupled to the pair of glasses and operable to scan a beam of light to project an image frame including a plurality of pixels, each of the plurality of pixels being associated with a pixel position within the image frame;
an eyepiece mounted in one of the two frames and optically coupled to the light scanning projector, the eyepiece including an exit pupil expander operable to direct at least a portion of the beam of light towards an eye of a user;
one or more photodetectors coupled to the pair of glasses and operable to detect one or more time-varying reflected signals, wherein each of the one or more time-varying reflected signals is associated with the plurality of pixels of the image frame and includes a series of reflected intensities; and
a processor coupled to the light scanning projector and the one or more photodetectors, wherein the processor is operable to correlate the reflected intensities in the series of reflected intensities of each of the one or more time-varying reflected signals with the pixel position of each of the plurality of pixels to form a spatial map between the pixel position of each of the plurality of pixels and the reflected intensities in the series of reflected intensities of each of the one or more time-varying reflected signals and determine a first eye orientation.

US Pat. No. 10,715,793

TWO DIMENSIONAL TO THREE DIMENSIONAL MOVING IMAGE CONVERTER

Steven M. Hoffberg, West...

1. A method of converting a 2D video file to a 3D representation, comprising:receiving a 2D video file comprising audio information and a plurality of visual objects in a scene having relative motion with respect to each other, at least one of the visual objects being associated with emission of a sound;
automatically calculating a distance and an angle of a viewer with respect to the scene;
calculating at least one characteristic delay of the sound in the audio information coming from a respective visual object in the 2D video file, based on at least one autocorrelation of a component of the audio information with a movement of the respective visual object in the 2D video file;
assigning a depth to the plurality of visual objects within the 2D video file based on the at least one characteristic delay;
applying at least one transform to the 2D video file, to produce a 3D representation of the scene as a series of pairs of parallax images and associated audio information for the viewer over time, dependent on the relative motion, the assigned depth of the plurality of visual objects, and the automatically calculated distance and angle, comprising distinct parallax values for each of the respective plurality of visual objects and modified audio information;
storing the 3D representation of the scene in a memory.
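The characteristic-delay step above can be sketched as a lag search: the lag at which an audio component correlates best with a visual object's motion signal is taken as that object's delay, which the method then maps to depth. A pure-Python correlation over a small lag range; names and the unnormalized correlation are assumptions.

```python
def characteristic_delay(audio, motion, max_lag):
    """Return the lag (in samples) maximizing the correlation of
    audio[t] with motion[t - lag]."""
    best_lag, best_corr = 0, float("-inf")
    for lag in range(max_lag + 1):
        corr = sum(a * m for a, m in zip(audio[lag:], motion))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

# Usage: an audio impulse trailing the motion impulse by 3 samples.
delay = characteristic_delay([0, 0, 0, 0, 0, 1, 0, 0],
                             [0, 0, 1, 0, 0, 0, 0, 0], max_lag=5)
```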

US Pat. No. 10,715,792

DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME

SAMSUNG ELECTRONICS CO., ...

1. A display device comprising:a display panel comprising a plurality of pixels, each of which comprises a plurality of sub pixels;
a prism panel at one side of the display panel and comprising a first layer comprising a prism array, and a second layer comprising a liquid crystal and stacked on the first layer, the first layer being closer to a light source of the display device than the second layer;
a prism panel driver configured to apply voltage to the prism panel; and
a controller configured to display a plurality of image frames on the display panel and to control a driving state of the prism panel variably while the plurality of image frames is displayed,
wherein the controller is further configured to:
divide an image frame into the plurality of image frames based on a number of the sub pixels comprised in each of the plurality of pixels,
control the display panel to shift and display sub pixel values for each of the plurality of sub pixels for each of the plurality of image frames,
control the prism panel driver to apply a voltage of variable level to the prism panel while the sub pixel values are shifted and displayed, and
in response to the plurality of image frames being divided into a first image frame, a second image frame, and a third image frame, and a first level voltage applied to the prism panel positioned in a front direction of the display panel and light emitted from the display panel refracted with a first refraction angle by the prism panel, control to display a sub pixel value with respect to the first image frame on a first sub pixel corresponding to the first refraction angle, control to display a sub pixel value with respect to the second image frame on a second sub pixel corresponding to the first refraction angle, and control to display a sub pixel value with respect to the third image frame on a third sub pixel corresponding to the first refraction angle.

US Pat. No. 10,715,791

VIRTUAL EYEGLASS SET FOR VIEWING ACTUAL SCENE THAT CORRECTS FOR DIFFERENT LOCATION OF LENSES THAN EYES

GOOGLE LLC, Mountain Vie...

1. A virtual eyeglass set, comprising:a frame configured to mount onto a user's head and hold a first virtual lens in front of the user's left eye and a second virtual lens in front of the user's right eye;
the first virtual lens supported by the frame and the second virtual lens supported by the frame, each of the first virtual lens and the second virtual lens being supported by the frame so that when the virtual eyeglass set is mounted onto the user's head, a first side of the lens will face the user and a second side of the lens will face away from the user, each of the first virtual lens and the second virtual lens comprising:
a light field display on the first side;
a direction camera on the first side configured to capture a direction that the user's respective eye is facing; and
a light field camera on the second side, the light field camera being configured to capture images from multiple viewpoints; and
a processor supported by the frame, the processor being configured to construct, for display on each of the light field displays based on image data from the multiple viewpoints received via each of the light field cameras and the captured direction, an image from a perspective of the user's respective eye, the user's respective eye being behind the light field camera.

US Pat. No. 10,715,790

SYSTEM AND METHOD FOR LEAD FOOT ANGLE INSPECTION USING MULTIVIEW STEREO VISION

GENERIC POWER PTE LTD, S...

1. A system for analyzing the lead foot angle of leads extending outwards from an integrated circuit package, the leads disposed in rows of leads with each row of leads extending from a respective one side of the object, each of the leads including a lead shoulder, a lead leg, and a lead foot, wherein an end of each lead foot is a lead tip, wherein the rows of leads define a top row of leads, a bottom row of leads, a left row of leads and a right row of leads, the system comprising:a) a support for said integrated circuit package;
b) a light source;
c) a first image capturing device comprising a first lens and a first sensor and being mounted at a first bottom viewing angle that is perpendicular to a plane where said integrated circuit package is placed for capturing a first bottom view image,
d) a second image capturing device comprising a second lens and a second sensor and being mounted at a second perspective viewing angle from said integrated circuit package for capturing a second perspective view image,
e) a third image capturing device comprising a third lens and a third sensor and being mounted at a third perspective viewing angle from said integrated circuit package for capturing a third perspective view image,
wherein the first, second and third image capturing devices form a corner shape that defines an L-shape with the first image capturing device in the center, the second image capturing device on a left side of the first image capturing device and the third image capturing device on a front side of the first image capturing device;
wherein a first optical axis of the first imaging capturing device is a line passing through the center of said first lens and the center of said first sensor, a second optical axis of the second image capturing device is a line passing through the center of said second lens and the center of said second sensor and a third optical axis of the third image capturing device is a line passing through the center of said third lens and the center of said third sensor;
wherein the first optical axis and the second optical axis form a first alignment plane with a normal direction and the first optical axis and third optical axis form a second alignment plane with a normal direction;
wherein said first alignment plane and said second alignment plane are orthogonal to each other;
wherein the normal direction of said first alignment plane is along the lead extending orientation of the leads extending outwards from the front side and the back side of said integrated circuit package;
wherein the normal direction of said second alignment plane is along the lead extending orientation of the leads extending outwards from the left side and the right side of said integrated circuit package; and
wherein the first, second and third image capturing devices are calibrated with multi-view stereo vision principles;
wherein, for one or more leads on each of the rows of leads, the system is configured to detect conjugate lead tip points, wherein the conjugate lead tip points are detected by the first image capturing device and either the second image capturing device or the third image capturing device, and three dimensional coordinates are established for each lead tip of the one or more leads from the detected conjugate lead tip points and the calibration;
wherein, the system is configured to establish a three-dimensional reference plane from the three dimensional coordinates of each lead tip of the one or more leads;
wherein, for a first lead of the one or more leads, the system is configured to:
detect lead foot middle points, wherein the lead foot middle points are detected by the same image capturing devices that detect the conjugate lead tip points for the first lead, and three dimensional coordinates are established for the lead foot middle points of the first lead from the detected middle points and the calibration;
determine a first lead foot line for the first lead, between the three dimensional coordinates of the lead tip of the first lead and the three dimensional coordinates of the lead foot middle points for the first lead; and
determine a first lead foot angle for the first lead, between the first lead foot line and the reference plane.
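
The geometry this claim walks through (triangulated 3D lead tips, a reference plane through them, and the angle between each lead-foot line and that plane) can be pictured with a short sketch. This is not the patented implementation: the function names are invented, and the reference plane is taken through three tips rather than fit to all of them.

```python
import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def norm(a): return math.sqrt(dot(a, a))

def seating_plane_normal(tip_a, tip_b, tip_c):
    """Normal of a reference (seating) plane through three lead tips."""
    return cross(sub(tip_b, tip_a), sub(tip_c, tip_a))

def lead_foot_angle_deg(tip, foot_middle, plane_normal):
    """Angle, in degrees, between the lead-foot line (lead tip to lead foot
    middle point) and the reference plane with the given normal."""
    d = sub(foot_middle, tip)
    s = abs(dot(d, plane_normal)) / (norm(d) * norm(plane_normal))
    return math.degrees(math.asin(s))
```

For instance, with a seating plane through tips in the z = 0 plane and a foot line rising one unit over one unit of run, the computed lead foot angle is approximately 45 degrees.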

US Pat. No. 10,715,789

LIGHT-DEFLECTION THREE-DIMENSIONAL IMAGING DEVICE AND PROJECTION DEVICE, AND APPLICATION THEREOF

Ningbo Sunny Opotech Co.,...

1. An imaging device which is installed in an electronic mobile device which is selected from the group consisting of a mobile phone, a laptop and a tablet computer, comprising:
a light deflection projection device comprising a light source which emits a projective light, at least a light deflection device which comprises a fixed light deflection element deflecting said projective light, a grating, a condensing lens group, and an emission lens, wherein when said projective light emitted by said light source passes through said grating, said projective light is then refracted and aggregated by said condensing lens group, wherein said projective light is then deflected by said light deflection element and eventually emitted out of said light deflection projection device from said emission lens, wherein a relative position between said light source and said light deflection element is fixed, wherein after the deflection of said light deflection element, said deflected projective light is projected to the outside of the light projection device from a side thereof in such a manner that a projection direction of said deflected projective light is transversely changed to a direction along a thickness of said light deflection projection device, wherein a thickness of said light deflection projection device corresponds to a total thickness of said light deflection element and said emission lens, so as to reduce a thickness of the electronic mobile device;
at least a receiving device; and
a processor, wherein said projective light emitted from said light projection device is reflected after reaching a surface of a target object, wherein said receiving device receives said projective light reflected by the surface of the target object and transmits information of said projective light to said processor, wherein said processor processes the information to obtain 3D image information.

US Pat. No. 10,715,787

DEPTH IMAGING SYSTEM AND METHOD FOR CONTROLLING DEPTH IMAGING SYSTEM THEREOF

ULSee Inc., Taipei (TW)

1. A depth imaging system, comprising:
a first imaging device, configured to capture a first image;
a second imaging device, configured to capture a second image;
a sliding base, wherein the first imaging device and the second imaging device are mounted on the sliding base;
a detecting module, configured to detect a target region of the first image and the second image in real time;
an estimating module, coupled to the detecting module, and configured to estimate an initial depth of the target region;
a calculating module, coupled to the estimating module, and configured to calculate a baseline corresponding to the initial depth;
a control module, coupled to the calculating module and the sliding base, and configured to control the sliding base to adjust a relative distance between the first imaging device and the second imaging device;
wherein the calculating module is further configured to generate an adjusted baseline according to the adjusted relative distance between the first imaging device and the second imaging device, such that the adjusted baseline is closer to the calculated baseline.
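
The baseline calculation this claim describes follows the standard pinhole stereo relation disparity = focal length × baseline / depth. A minimal sketch, not the patented modules: the function names and the step-limited sliding-base model are assumptions for illustration.

```python
def required_baseline(depth, focal_px, target_disparity_px):
    """Baseline (same units as depth) yielding the target disparity at the
    estimated depth, from the pinhole stereo relation d = f * B / Z."""
    return depth * target_disparity_px / focal_px

def adjust_baseline(current, desired, max_step):
    """Move the sliding base toward the calculated baseline one bounded step
    at a time, so the adjusted baseline gets closer to the calculated one."""
    step = max(-max_step, min(max_step, desired - current))
    return current + step
```

For example, at 2 m estimated depth with a 1000 px focal length, a 50 px target disparity calls for a 0.1 m baseline; a base limited to 2 cm per adjustment approaches it over several steps.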

US Pat. No. 10,715,786

SYSTEMS AND METHOD FOR GPU BASED VIRTUAL REALITY VIDEO STREAMING SERVER

AlcaCruz Inc., San Mateo...

15. A system comprising:
one or more processors; and
a memory storing instructions that, when executed by at least one processor among the one or more processors, cause the system to perform operations comprising:
detecting selection of a first field of view among multiple fields of view that also include a second field of view;
in response to the selection of the first field of view, initiating conversion of a first spherical image that corresponds to the first field of view into a first equirectangular image that corresponds to the first field of view and initiating storage of the first equirectangular image;
detecting selection of the second field of view among the multiple fields of view; and
in response to the detecting of the selection of the second field of view occurring after the initiating of the conversion of the first spherical image into the first equirectangular image but before completion of the storage of the first equirectangular image, converting the first spherical image that corresponds to the first field of view into a second equirectangular image that corresponds to the second field of view and storing the second equirectangular image that corresponds to the second field of view instead of the first equirectangular image that corresponds to the first field of view.

US Pat. No. 10,715,785

ELECTRONIC DEVICE AND METHOD FOR CONTROLLING THE SAME

Canon Kabushiki Kaisha, ...

1. An electronic device comprising a memory and at least one processor and/or at least one circuit to perform the operations of the following units:
a playback unit configured to play back a viewing-direction-changeable moving image;
a management unit configured to store viewpoint information as a record of a viewing direction used by the playback unit in playing back the moving image; and
a control unit configured
to carry out control so that a range of the moving image including a viewing direction identical to a viewing direction on a previous playback occasion is extracted and displayed based on the viewpoint information stored in the management unit, in response to predetermined user operation (i) different from an instruction for playback at a normal speed and (ii) related to (a) playback of the moving image or (b) specification of a playback position in the moving image, and
to carry out control so that the displaying of the range based on the viewpoint information is not performed in response to an instruction for playback at the normal speed.

US Pat. No. 10,715,784

METHODS AND SYSTEMS FOR PRESERVING PRECISION IN COMPRESSED DEPTH DATA REPRESENTATIVE OF A SCENE

Verizon Patent and Licens...

1. A method comprising:
accessing, by a data precision preservation system, a depth representation of a virtual reality scene associated with a world coordinate space;
dividing, by the data precision preservation system, the depth representation into a plurality of sections associated with a plurality of different clip coordinate spaces, the plurality of sections including a particular section associated with a particular clip coordinate space;
determining, by the data precision preservation system, a lowest non-null depth value and a highest non-null depth value represented in the particular section of the depth representation, wherein the lowest and highest non-null depth values are each numbers represented by a plurality of data bits;
determining, by the data precision preservation system based on the lowest and highest non-null depth values, an inverse view-projection transform for transforming depth values from the particular clip coordinate space to the world coordinate space;
converting, by the data precision preservation system, an original set of depth values represented in the particular section of the depth representation to a compressed set of depth values normalized based on the lowest and highest non-null depth values represented in the particular section; and
providing, by the data precision preservation system to a media player device by way of a network, a virtual reality dataset representative of the virtual reality scene, the virtual reality dataset including data representative of the compressed set of depth values and the inverse view-projection transform.
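
The conversion step above can be pictured as a per-section min/max normalization: the lowest and highest non-null depths of the section travel with the normalized values so full precision can be recovered on the player side. A simplified sketch; the null handling and names are assumptions, and the claim's inverse view-projection transform is omitted.

```python
def compress_section(depths, null_value=0.0):
    """Normalize the non-null depth values of one section into [0, 1]
    using that section's lowest and highest non-null depths."""
    valid = [d for d in depths if d != null_value]
    lo, hi = min(valid), max(valid)
    scale = (hi - lo) or 1.0  # guard against a flat section
    compressed = [None if d == null_value else (d - lo) / scale for d in depths]
    return compressed, lo, hi

def decompress(value, lo, hi):
    """Recover an original depth from a normalized value and the section's range."""
    return lo + value * (hi - lo)
```

Because each section is normalized against its own range rather than the global one, the available bits of the compressed representation are spent only on depths that actually occur in that section.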

US Pat. No. 10,715,783

STEREO-AWARE PANORAMA CONVERSION FOR IMMERSIVE MEDIA

Adobe Inc., San Jose, CA...

1. In a digital medium environment for editing digital images, a computer-implemented method for generating a panoramic image from different panoramic projections, the method comprising:
calculating, by at least one processor and for each of a plurality of pixels in an output image, Euler angle coordinates at first UV coordinates corresponding to the respective pixel, the first UV coordinates representing a two-dimensional texture of a first three-dimensional surface associated with the respective pixel in the output image;
mapping, by the at least one processor and for each of the pixels in the output image, the Euler angle coordinates to a geometric ray;
calculating, by the at least one processor and for each geometric ray, second UV coordinates corresponding to an intersection between the respective geometric ray and a pixel of an input image, the second UV coordinates representing a two-dimensional texture of a second three-dimensional surface associated with the respective pixel in the input image, wherein the first three-dimensional surface of the output image is different from the second three-dimensional surface of the input image;
converting, by the at least one processor and for each intersection, the second UV coordinates at the respective intersection to one of the pixels in the input image; and
generating, by the at least one processor and for each intersection, a combined panoramic image, metadata descriptive of the combined panoramic image, or both, the generating being based on the converted pixel in the input image, the combined panoramic image representing a combination of the output image and the input image, wherein the combined panoramic image retains at least a portion of content of the output image and at least a portion of content of the input image.
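
The per-pixel mapping this claim steps through (output UV coordinates → angles → geometric ray → input UV coordinates) can be sketched for the special case where the surface is a sphere sampled equirectangularly. Real projections differ per surface type, so these helpers are illustrative only; the names are invented.

```python
import math

def uv_to_ray(u, v):
    """Equirectangular UV in [0, 1]^2 -> unit direction on the sphere."""
    yaw = (u - 0.5) * 2.0 * math.pi   # longitude
    pitch = (0.5 - v) * math.pi       # latitude
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def ray_to_uv(x, y, z):
    """Unit direction -> equirectangular UV; inverse of uv_to_ray."""
    yaw = math.atan2(x, z)
    pitch = math.asin(max(-1.0, min(1.0, y)))
    return (yaw / (2.0 * math.pi) + 0.5, 0.5 - pitch / math.pi)
```

Round-tripping UV through the ray and back recovers the original coordinates, which is the property the method relies on when the input and output use the same surface; with different surfaces, a different `ray_to_uv` is used per projection.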

US Pat. No. 10,715,782

3D SYSTEM INCLUDING A MARKER MODE

VEFXi Corporation, North...

1. A system for conversion using a processor of a series of two dimensional images into a series of three dimensional images for display on a display comprising:
(a) said processor receiving said series of two dimensional images;
(b) said processor processing said two dimensional images to determine respective depth maps associated with said two dimensional images;
(c) said processor converting said series of two dimensional images to said series of three dimensional images based upon said depth maps;
(d) said processor processing said three dimensional images to render said three dimensional images on said display as a series of at least three spatially displayed three dimensional views across said display that have an appearance to a viewer as being presented simultaneously where at least one of said three dimensional views, and less than all said three dimensional views, is replaced with visual content not associated with said three dimensional images nor said series of two dimensional images that is readily observable by a user as not associated with either said three dimensional images or said series of two dimensional images.

US Pat. No. 10,715,781

MULTIPLE KILL VEHICLE (MKV) INTERCEPTOR WITH IMPROVED PRE-EJECTION ACQUISITION AND DISCRIMINATION

Raytheon Company, Waltha...

1. A multiple kill vehicle (MKV) interceptor, comprising:
a carrier vehicle (CV) including an internal communication bus and a central processor; and
a plurality M of KVs mounted on the CV such that their fields-of-view (FOV) overlap but are not spatially registered and coupled to the internal communication bus, each said KV including an IR sensor mounted to capture IR images in the FOV,
wherein pre-ejection each KV transmits the IR images via the internal communication bus to said central processor, which spatially registers and sums at least the IR images from the M KVs over a reduced common FOV to form a registered spatially averaged image.

US Pat. No. 10,715,780

DISPLAY CONTROLLING APPARATUS, DISPLAY CONTROLLING METHOD, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A display controlling apparatus comprising:
one or more hardware processors; and
one or more memories which store instructions executable by the one or more hardware processors to cause the display controlling apparatus to perform at least:
(1) obtaining viewpoint path information for specifying a movement path of a virtual viewpoint corresponding to a virtual viewpoint image that is generated based on a plurality of captured images of an area captured with a plurality of image capturing apparatuses;
(2) generating, based on the obtained viewpoint path information, a path representation image for representing concurrently a plurality of movement paths of the virtual viewpoint; and
(3) causing a display screen to display the generated path representation image,
wherein in the generated path representation image, a figure representing a movement path of the virtual viewpoint is superimposed on an image of the area viewed from a specific viewpoint that is not on a movement path of the virtual viewpoint.

US Pat. No. 10,715,779

SHARING OF MOTION VECTOR IN 3D VIDEO CODING

NOKIA TECHNOLOGIES OY, E...

10. An apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and computer program code configured to, with the processor, cause the apparatus to:
decode a first motion vector from the bitstream;
decode a second motion vector from the bitstream;
decode a third motion vector from the bitstream;
decode a second depth map picture, wherein the first motion vector is used to predict the second depth map picture from a first depth map picture, wherein the first and second depth map pictures are associated with first and second texture pictures, respectively, and are auxiliary pictures coded independently of the corresponding texture pictures, and wherein the first depth map picture belongs to a first view and the second depth map picture belongs to a second view;
decode the second texture picture, wherein the second motion vector is used to predict the second texture picture from the first texture picture or a third texture picture; and
decode the third texture picture, wherein the third motion vector is used to predict the third texture picture from the first texture picture or the second texture picture.

US Pat. No. 10,715,778

VIDEO SIGNAL TRANSMISSION DEVICE, VIDEO SIGNAL RECEPTION DEVICE AND VIDEO SIGNAL TRANSFERRING SYSTEM

THINE ELECTRONICS, INC., ...

1. A video signal transmission device, comprising:
a packer unit configured to capture a data enable signal and a video signal constituted by one or more pixel signals, each of which corresponds to one pixel and includes a color signal and a sync signal, and apply packetizing processing to the video signal, to generate a plurality of block signals;
an encoder unit configured to apply encoding processing to the plurality of block signals to generate a plurality of encoded block signals; and
a serializer configured to apply parallel-serial conversion to the plurality of encoded block signals to generate a serial signal, wherein
the packer unit includes a pixel packer and a color packer to generate a control signal including a pulse having a pulse width corresponding to the number of pixels and the number of tone bits of the color signal, the pixel packer applying packetizing processing to the video signal in accordance with the data enable signal so that the video signal has a packet configuration size corresponding to the number of pixels per video signal, the color packer applying packetizing processing to the video signal in accordance with the data enable signal so that the video signal has a packet configuration size corresponding to the number of tone bits of the color signal, and
the encoder unit applies encoding processing of encoding efficiencies different between a first period of the control signal in which the pulse exists and a second period of the control signal distinguished from the first period depending on existence or non-existence of the pulse.

US Pat. No. 10,715,777

INFORMATION PROCESSING APPARATUS, INFORMATION RECORDING MEDIUM, INFORMATION PROCESSING METHOD, AND PROGRAM

SONY CORPORATION, Tokyo ...

1. An information processing apparatus comprising:
circuitry configured to
execute a data reproduction process and a decoding of reproduction control information associated with subtitle data defined by an extensible markup language (XML) format based on timed text markup language (TTML), the subtitle data having information recorded therein including color space designation information that is scripted as a string indicating a type of color space, electro-optical transfer function (EOTF) designation information that is scripted as a string indicating a type of EOTF that is to be applied to the subtitle data, and color depth designation information that is scripted as a variably-set numerical value indicating a color depth of the subtitle data,
acquire, from the decoded reproduction control information, at least one selected from a group consisting of the color space designation information, the EOTF designation information, and the color depth designation information, and
generate output subtitle data in accordance with the information acquired from the decoded reproduction control information.

US Pat. No. 10,715,776

PACKED I-FRAMES

Apple Inc., Cupertino, C...

1. A method of streaming a media asset at a media player, comprising:
storing, at a media source, first segments of a first media stream for the media asset, the first segments containing video data of frames of the media asset coded according to temporal prediction techniques;
storing, at the media source, second segments of a second media stream for the media asset that contain frames of the media asset coded solely as intra-coded (I) frames, wherein the second segments contain frames of the media asset that are redundant to corresponding frames of the first segments and wherein at least one I frame in the second segments corresponds to a frame in the first segments that is not an I frame;
storing, at the media source, manifest data that contains for the first media stream:
an identifier of a network location from which each of the first segments may be downloaded,
configuration information for the first media stream including a bitrate of the stream,
an identifier of a network location from which each of the second segments may be downloaded, and
an indication that both the first segments and the second segments are associated with the first media stream, that the first segments are normal playback segments, and that the second segments are enhanced playback segments; and
providing, by the media source, data from the first segments and the second segments in response to request(s) from a client device.

US Pat. No. 10,715,775

MULTIMEDIA DEVICE FOR PROCESSING VIDEO SIGNAL AND CONTROL METHOD THEREOF

LG ELECTRONICS INC., Seo...

1. A multimedia device for processing a video signal, the multimedia device comprising:
a memory;
a receiving unit configured to receive a video signal; and
a controller configured to:
perform, by referring to the memory, a first tone mapping on the received video signal by applying a first linear curve to the received video signal based on a determination that the received video signal corresponds to a high dynamic range (HDR) video signal;
perform, by referring to the memory, a second tone mapping on the received video signal by
applying a non-linear curve amplifying a second gradient region of the received video signal and applying a second linear curve to a first gradient region and a third gradient region of the received video signal based on a determination that the received video signal corresponds to a standard dynamic range (SDR) video signal, wherein the non-linear curve is different from the first linear curve and the second linear curve, wherein the second gradient region is a signal region higher than the first gradient region, the third gradient region is a signal region higher than the second gradient region, and the first gradient region, the second gradient region, and the third gradient region do not overlap with each other; and
perform a dynamic contrast function on the second tone mapped received SDR video signal based on the determination that the received video signal corresponds to the SDR video signal by applying different curves based on a gradient region of each of the first tone mapped HDR received video signal and the second tone mapped SDR received video signal, wherein the first tone mapping and the second tone mapping are performed in a first color space and the dynamic contrast function is performed in a second color space that is different from the first color space; and
perform the dynamic contrast function on the first tone mapped HDR received video signal based on the determination that the received video signal corresponds to the HDR video signal.

US Pat. No. 10,715,774

COLOR CONVERSION FOR AMBIENT-ADAPTIVE DIGITAL CONTENT

MICROSOFT TECHNOLOGY LICE...

1. A method for digital image color conversion, comprising:
at a first computing device, capturing a first digital image of a first real-world environment;
based on ambient lighting conditions of the first real-world environment, generating a first ambient lighting-agnostic digital image from the first digital image using a capture-side ambient lighting color conversion;
transmitting the first ambient lighting-agnostic digital image to a second computing device in a second real-world environment;
receiving, from the second computing device, a second ambient lighting-agnostic digital image of the second real-world environment, the second ambient lighting-agnostic digital image having been generated by the second computing device from a second digital image using the capture-side ambient lighting color conversion;
based on the ambient lighting conditions of the first real-world environment, generating a lighting-corrected digital image from the second ambient lighting-agnostic digital image using a display-side ambient lighting color conversion; and
displaying the lighting-corrected digital image on the first computing device.
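
One way to picture the symmetric capture-side/display-side pair of conversions in this claim is a von Kries-style white-point adaptation: divide out the sender's ambient white to obtain a lighting-agnostic image, then multiply in the receiver's ambient white before display. The patent does not specify this particular model; it is only a sketch, and the function names are invented.

```python
def to_agnostic(rgb, sender_ambient_white):
    """Capture-side conversion: divide out the capture environment's ambient
    white point (a diagonal, von Kries-style adaptation)."""
    return tuple(c / w for c, w in zip(rgb, sender_ambient_white))

def to_display(rgb_agnostic, viewer_ambient_white):
    """Display-side conversion: re-apply the viewer's ambient white point."""
    return tuple(c * w for c, w in zip(rgb_agnostic, viewer_ambient_white))
```

Because the transmitted image is ambient-agnostic, two devices in differently lit rooms each apply only their own local correction, and a round trip under identical lighting is the identity.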

US Pat. No. 10,715,773

METHOD AND SYSTEM OF LENS SHADING COLOR CORRECTION USING BLOCK MATCHING

Intel Corporation, Santa...

1. A processor system comprising:
memory to store first shade correction data; and
processor circuitry to:
cluster blocks of an image into at least one cluster based on a criterion;
process the at least one cluster to determine at least one modification parameter;
modify the first shade correction data with the at least one modification parameter to determine second shade correction data; and
correct a lens shade effect associated with the image based on the second shade correction data.

US Pat. No. 10,715,772

HIGH DYNAMIC RANGE COLOR CONVERSION CORRECTION

NETFLIX, INC., Los Gatos...

14. A method, comprising:
downsampling first color space values to generate downsampled color space values;
upsampling the downsampled color space values to generate second color space values;
determining a first new value for at least one component value included in the downsampled color space values based on a first component value included in the first color space values, a second component value included in the second color space values, and an approximation of a nonlinear transfer function;
determining that a first color component value associated with the first new value is outside of a color space range; and
determining a second new value for the at least one component value, wherein the first color component value associated with the second new value is within the color space range.
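
The correction loop this claim describes can be sketched as: measure the upsampling error through an approximation of the nonlinear transfer function, form a first new value, and substitute a second, in-range value when the first falls outside the color-space range. Everything below (a gamma-2.2 transfer, single-channel values, clamping as the range fix) is an assumption chosen for illustration.

```python
def oetf(x, gamma=2.2):
    """Approximation of a nonlinear transfer function (simple power law)."""
    return x ** (1.0 / gamma)

def refine_downsampled(orig_linear, upsampled_linear, current, lo=0.0, hi=1.0):
    """One correction step: nudge the downsampled component by the error
    measured after upsampling, through the approximate transfer function;
    if the first new value leaves the valid range, return a second new
    value clamped back inside it."""
    error = oetf(orig_linear) - oetf(upsampled_linear)
    first_new = current + error
    if lo <= first_new <= hi:
        return first_new
    return min(max(first_new, lo), hi)  # second new value, inside the range
```

The key point the claim captures is that the comparison happens in the nonlinear (perceptual) domain while the stored component stays in the working domain, and out-of-range candidates are never emitted.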

US Pat. No. 10,715,769

PROJECTOR

JVC KENWOOD CORPORATION, ...

1. A projector comprising:
a blue light source configured to emit blue illumination light; and
an infrared light source configured to emit infrared illumination light;
a light source controller configured to switch each of the blue light source and the infrared light source on and off in such a way that the blue light source and the infrared light source are alternately turned on and off;
a phosphor configured to convert a part of the blue illumination light into yellow illumination light;
a first dichroic mirror configured to synthesize the blue illumination light, the yellow illumination light, and the infrared illumination light with one another;
a second dichroic mirror configured to separate the blue illumination light from a synthesized illumination light in which the blue illumination light and illumination light of a color other than blue are synthesized;
a third dichroic mirror configured to separate the yellow illumination light and the infrared illumination light into red illumination light, the infrared illumination light, and green illumination light;
a first image display element configured to modulate first illumination light in which the red illumination light and the infrared illumination light are alternately switched, based on an image signal for a visible light image and an image signal for an infrared light image, respectively, and to emit first image light in which red image light and infrared image light are alternately switched;
a second image display element configured to modulate the green illumination light based on a green image signal, and to emit green image light;
a third image display element configured to modulate the blue illumination light based on a blue image signal, and to emit blue image light;
a synthesizer configured to synthesize the first image light, the green image light, and the blue image light with one another, and to obtain synthesized image light; and
a projection unit configured to project the synthesized image light.

US Pat. No. 10,715,763

VIDEOCONFERENCING CALIBRATION SYSTEMS, CONTROLLERS AND METHODS FOR CALIBRATING A VIDEOCONFERENCING SYSTEM

ZELLER DIGITAL INNOVATION...

1. A controller for calibrating a videoconferencing system, the system including a first codec connected to a second codec through a videoconferencing connection, the controller comprising:
an output in communication with the first codec for controlling the first codec to transmit a videoconferencing signal to the second codec through the videoconferencing connection; and
an input for receiving a calibration adjustment value from another controller over a network, the other controller in communication with the second codec, the received calibration adjustment value determined by the other controller by comparing at least one signal level of the videoconferencing signal received at the second codec to a calibration target according to at least one calibration adjustment rule;
wherein the controller is configured to adjust a signal level setting of the first codec using a level adjustment command of the first codec, the level adjustment command determined according to the calibration adjustment value transmitted by the other controller.

US Pat. No. 10,715,762

METHOD AND APPARATUS FOR PROVIDING IMAGE SERVICE

Samsung Electronics Co., ...

1. An electronic device comprising:
a camera;
a display;
a communication unit configured to establish wireless communication with another electronic device using at least one protocol; and
a processor configured to be functionally connected to the camera, the display, and the communication unit,
wherein the processor is configured to:
perform a voice call with the other electronic device based on a first network,
detect a user input for performing a video call with the other electronic device while performing the voice call with the other electronic device,
connect the video call based on a second network in response to the user input and display a user interface associated with the video call on the display, and
display a video acquired using the camera through the user interface and transmit the displayed video to the other electronic device through the second network,
wherein the performing of the voice call based on the first network is maintained while the displayed video is transmitted through the second network,
wherein the processor is further configured to:
receive, from the other electronic device, a control signal for adjusting the video while transmitting the displayed video to the other electronic device through the second network,
based on the control signal, generate an adjusted video of which at least one of an exposure level, a focus, a resolution, a bit rate, a frame rate, a color temperature, a gamma value, noise, or a zooming level is adjusted from the video, and
display the adjusted video through the user interface and transmit the adjusted video to the other electronic device through the second network such that the other electronic device displays the adjusted video.

US Pat. No. 10,715,759

ATHLETIC ACTIVITY HEADS UP DISPLAY SYSTEMS AND METHODS

adidas AG, Herzogenaurac...

1. A method of using an athletic activity heads up display system during an athletic activity, comprising:
a body-mountable heads up display unit receiving information about a sport ball being used in the athletic activity, wherein the information comprises acceleration data about the sport ball received wirelessly from the sport ball;
the heads up display unit taking a still photo or video clip in response to receiving the acceleration data from the sport ball;
the heads up display unit further receiving information about a plurality of individuals engaged in the athletic activity; and
the heads up display unit displaying a plurality of images to a user wearing the heads up display unit based on the information about the sport ball and the information about the plurality of individuals,
wherein the images are overlaid on the user's present field of view of an environment as viewed through the heads up display unit,
wherein a first image includes a strike zone icon overlaid on top of the sport ball in the user's present field of view, and
wherein a second image includes a point of impact icon overlaid on top of a ball icon, such that the point of impact icon is representative of a calculated point of impact based in part on the acceleration data received from the sport ball.

US Pat. No. 10,715,755

SOLID-STATE IMAGE SENSOR AND IMAGING DEVICE

SONY CORPORATION, Tokyo ...

1. A solid-state image sensor, comprising:
a photoelectric conversion circuit configured to generate an image signal, wherein the image signal corresponds to incident light incident on the photoelectric conversion circuit;
a plurality of image signal lines;
a plurality of output control units, wherein
each output control unit of the plurality of output control units is connected to a respective image signal line of the plurality of image signal lines;
a selection information retention unit configured to retain selection information, wherein the selection information is for selection of one image signal line of the plurality of image signal lines; and
a selection control unit configured to select a first output control unit of the plurality of output control units based on the selection information retained in the selection information retention unit, wherein
the first output control unit is configured to output the generated image signal to a first image signal line of the plurality of image signal lines,
the first image signal line is connected to the first output control unit, and
the first image signal line is configured to transmit the generated image signal received from the first output control unit.

US Pat. No. 10,715,754

SINGLE REFERENCE CLOCK TIME TO DIGITAL CONVERTER

1. A time to digital converter (TDC) comprising:
a clock input configured to receive a reference clock that is synchronized with a first event, the reference clock having a first frequency;
a clock generation circuit configured to generate a first clock at a first output of the clock generation circuit based on the reference clock, the first clock having a second frequency lower than the first frequency;
a data input configured to receive an input stream of pulses, wherein the input stream of pulses is based on the first event;
a sampling circuit having an input register, the sampling circuit coupled to the data input, the sampling circuit configured to continuously sample the input stream of pulses into the input register based on the reference clock; and
output terminals configured to stream time stamps based on the input stream of pulses at the second frequency, wherein the stream of time stamps is synchronized with the first clock.

US Pat. No. 10,715,753

SOLID-STATE IMAGE PICKUP ELEMENT AND ELECTRONIC APPARATUS

SONY CORPORATION, Tokyo ...

1. A solid-state image pickup element, comprising:
a pixel array that comprises a plurality of pixel blocks, wherein
each of the plurality of pixel blocks comprises a plurality of pixels, and
the plurality of pixels includes a normal pixel and a pixel for phase difference detection;
a plurality of Analog-Digital (AD) converters,
wherein each of the plurality of AD converters corresponds to a respective pixel block of the plurality of pixel blocks; and
circuitry configured to:
read out pixel signals of the plurality of pixels of each pixel block of the plurality of pixel blocks; and
supply the pixel signals of the plurality of pixels of each pixel block of the plurality of pixel blocks to a corresponding AD converter of the plurality of AD converters, wherein
each of the plurality of AD converters is configured to AD-convert the pixel signals of the plurality of pixels of the respective pixel block,
the pixel for phase difference detection of a first pixel block of the plurality of pixel blocks and the pixel for phase difference detection of a second pixel block of the plurality of pixel blocks are arranged in different positions in the first pixel block and the second pixel block, respectively, and
an order of AD conversion of a pixel signal of the pixel for phase difference detection of each of the first pixel block and the second pixel block is common among the first pixel block and the second pixel block.

US Pat. No. 10,715,752

SYSTEM AND METHOD FOR MONITORING SENSOR PERFORMANCE ON AN AGRICULTURAL MACHINE

CNH Industrial Canada, Lt...

1. A system for monitoring sensor performance on an agricultural machine, the system comprising:
an agricultural machine;
a vision-based sensor mounted on the agricultural machine, the vision-based sensor being configured to capture first and second images; and
a controller communicatively coupled to the vision-based sensor, the controller being configured to:
receive the first and second images from the vision-based sensor;
determine an image parameter value associated with each of a plurality of pixels contained within each of the first and second images;
for each respective pixel of the plurality of pixels, determine a variance between the image parameter value of the respective pixel in the first image and the image parameter value of the respective pixel in the second image; and
when the variance associated with the image parameter values for a given pixel of the plurality of pixels falls below a threshold value, identify the given pixel as being at least one of obscured or inoperative.
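The monitoring logic of this claim can be illustrated in a few lines: compare the per-pixel image parameter values of two captured frames and flag any pixel whose inter-frame variance falls below a threshold as obscured or inoperative. This is a minimal sketch, not the patented implementation; the function name, the use of absolute difference as the variance measure, and the list-of-lists image representation are all assumptions.

```python
def flag_suspect_pixels(image_a, image_b, threshold):
    """Return (row, col) positions whose inter-frame variance falls below threshold.

    image_a, image_b: two same-sized images as lists of rows of parameter values
    (e.g. intensity). A pixel that barely changes between frames is suspect.
    """
    suspect = []
    for r, (row_a, row_b) in enumerate(zip(image_a, image_b)):
        for c, (va, vb) in enumerate(zip(row_a, row_b)):
            if abs(va - vb) < threshold:  # little change across frames
                suspect.append((r, c))
    return suspect
```

A pixel covered by mud, for example, would report nearly identical values in consecutive images of a moving scene and so would appear in the returned list.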

US Pat. No. 10,715,751

SOLID-STATE IMAGE PICKUP ELEMENT, IMAGE PICKUP DEVICE, AND METHOD OF MANUFACTURING SOLID-STATE IMAGE PICKUP ELEMENT

SONY CORPORATION, Tokyo ...

1. A solid-state image pickup element, comprising:
a pixel having a plurality of protrusions on a light-receiving surface of the pixel, wherein
an interval between respective representative points of the plurality of protrusions is one of 250 nanometers or more,
the pixel includes:
an infrared-light pixel comprising a first photoelectric conversion portion to photoelectrically convert infrared light, and
a visible-light pixel comprising a second photoelectric conversion portion to photoelectrically convert visible light, and
each of the first photoelectric conversion portion and the second photoelectric conversion portion is below the light-receiving surface in a first direction;
a light-receiving-surface-sided trench around the pixel, wherein
the light-receiving-surface-sided trench is at the light-receiving surface of the pixel,
the first photoelectric conversion portion has a first depth in the first direction,
the light-receiving-surface-sided trench has a second depth in the first direction,
the second photoelectric conversion portion has a third depth in the first direction, and
the second depth of the light-receiving-surface-sided trench is smaller than the first depth of the first photoelectric conversion portion and greater than the third depth of the second photoelectric conversion portion; and
a light-receiving-surface-sided member buried in the light-receiving-surface-sided trench.

US Pat. No. 10,715,750

DATA RATE CONTROL FOR EVENT-BASED VISION SENSOR

1. A change detection sensor, comprising:
a pixel array comprising pixels that detect light, wherein the pixels comprise photosensors and photoreceptor circuits that output first signals based on the response of the photosensors;
event detectors for detecting events associated with light received by the pixels by comparing the first signals and a first reference voltage and a second reference voltage, selectively;
an event rate detector for counting the events with a counter; and
a controller for changing how the pixels detect the events based on the event rate detector by changing at least one of the first reference voltage and the second reference voltage;
wherein the controller compares an event rate provided by the event rate detector to a maximum event rate and a minimum event rate, if the event rate is larger than the maximum event rate and if the event threshold is smaller than a maximum event threshold then at least one of the first reference voltage and the second reference voltage is changed, and if the event rate is smaller than the minimum event rate and if the event threshold is larger than a minimum event threshold then at least one of the first reference voltage and the second reference voltage is changed.
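The controller's regulation rule can be sketched as a simple bounded feedback loop. The claim adjusts one or both reference voltages; in this sketch a single scalar event threshold stands in for them, and the function name, parameter names, and fixed step size are illustrative assumptions.

```python
def adjust_threshold(event_rate, event_threshold,
                     max_rate, min_rate,
                     max_threshold, min_threshold, step=1):
    """Raise the event threshold when events arrive too fast, lower it when
    too slow, staying within [min_threshold, max_threshold]."""
    if event_rate > max_rate and event_threshold < max_threshold:
        return event_threshold + step  # desensitize: fewer events pass
    if event_rate < min_rate and event_threshold > min_threshold:
        return event_threshold - step  # sensitize: more events pass
    return event_threshold             # rate within band, or threshold clamped
```

The two guard conditions mirror the claim: the threshold only moves when the measured rate is outside the [min, max] band and the threshold itself has room to move.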

US Pat. No. 10,715,749

INFRARED BAND PASS SYSTEM FOR OPTICAL DETECTION OF PARAMETERS

Schott Glass Technologies...

1. A glass substrate having an average thickness of the glass substrate from 0.01 to 1.2 mm and having a temperature dependence of refractive index at a wavelength of 850 nm in a temperature range from −40° C. to 60° C. of not more than 10×10⁻⁶/K.

US Pat. No. 10,715,748

SYSTEM AND METHOD FOR CONTROLLING A PROJECTOR VIA A PASSIVE CONTROL STRIP

Dell Products, L.P., Rou...

1. A projector comprising:
an infrared generator to provide an infrared curtain on a screen onto which the projector projects an image;
an infrared camera to detect a selection of an icon of a plurality of icons based on a first infrared signature of the infrared curtain and a second infrared signature of an object over the icon on a control strip attached to the screen and located within a first boundary of the infrared curtain on the screen, and wherein the control strip is outside of a second boundary of an image projection on the screen, to determine a location of the icon, and to send the location of the icon to an infrared control module;
the infrared control module to communicate with the infrared generator and the infrared camera, the infrared control module to determine a coordinate position of the icon within the infrared curtain based on the location of the icon, and to determine an orientation of the control strip within the infrared curtain associated with the icon, wherein the orientation of the control strip is based on a first infrared wavelength of a first infrared calibration tag and a second infrared wavelength of a second infrared calibration tag, wherein the icons are in an order based on the orientation of the control strip, and wherein the icons are located between the first infrared calibration tag and the second infrared calibration tag of the control strip; and
a controller to communicate with the infrared control module, the controller to determine a command for the projector based on the coordinate position of the icon and the orientation of the control strip, and to execute the command.

US Pat. No. 10,715,747

SENSOR SUPPORT SYSTEM, TERMINAL, SENSOR, AND METHOD FOR SUPPORTING SENSOR

OMRON Corporation, Kyoto...

1. A sensor support system comprising:
a sensor; and
a terminal comprising a processor configured with a program to perform operations to display form information of the sensor, wherein
the sensor comprises
a storage tag readable from the terminal and configured to store the form information of the sensor,
the terminal comprises:
a camera configured to take an image of the sensor;
the processor configured with the program to perform operations comprising:
operation as a reading unit configured to read information from the storage tag of the sensor;
operation as a position calculator configured to calculate position information of the sensor from the image of the sensor taken by the camera,
operation as an image generator configured to generate a virtual image of a detection direction and a detection area of the sensor included in the form information, based on the position information calculated by the position calculator; and
operation as a synthesizing unit configured to superimpose the virtual image on the image of the sensor taken by the camera to synthesize a synthetic image; and
a display configured to display the synthetic image synthesized by the synthesizing unit.

US Pat. No. 10,715,746

ENHANCED TELESTRATOR FOR WEARABLE DEVICES

Realwear, Inc., Vancouve...

1. A method for operating a wearable computing device that includes a frame member, a display, a photon-detector and photon-emitter, comprising:
receiving an instruction to provide a visual element at a first location within a first field-of-view (FOV) of the photon-detector;
in response to receiving the instruction and a determination that the display is currently located at a first position relative to the frame member, employing the display to display a composite image that includes a combination of the first FOV of the photon detector and the visual element located at the first location within the first FOV of the photon-detector; and
in response to a determination that the display has been transitioned from the first position relative to the frame member to a second position relative to the frame member, transmitting, by the photon-emitter, one or more photon beams such that the visual element is projected onto a surface of the first location within the first FOV of the photon-detector.

US Pat. No. 10,715,745

CONSTRUCTING AN IMAGE USING MORE PIXEL DATA THAN PIXELS IN AN IMAGE SENSOR

Imagination Technologies ...

1. An apparatus for an imaging device arranged to capture an image of a scene using an image sensor comprising a plurality of pixels and an optical arrangement operable to focus light from a portion of the scene onto the image sensor whilst preventing light from other portions of the scene from being focused onto the image sensor, the apparatus comprising:
a processor configured to receive from the image sensor a sequence of captured portions of the scene each formed from light focused from a respective portion of the scene; and construct an image of the scene so that the constructed image contains pixel data for a number of pixels greater than the number of pixels in the image sensor;
wherein the optical arrangement is configured to prevent light from portions of the scene from being focused onto the image sensor by masking those portions of the scene and/or by focusing light from those portions in regions of a focal plane surrounding the sensor.

US Pat. No. 10,715,744

VEHICULAR DISPLAY CONTROL DEVICE, VEHICULAR DISPLAY SYSTEM, VEHICULAR DISPLAY CONTROL METHOD, AND NON-TRANSITORY STORAGE MEDIUM

JVC KENWOOD Corporation, ...

1. A vehicular display control device comprising:
a video acquiring unit configured to acquire video data captured by a rear camera configured to capture a rear view of a vehicle;
a frequency detecting unit configured to detect, for each of multiple directions of movement of a driver's head, a frequency of the driver's action in the vehicle for changing a range of view with respect to a display installed in the vehicle;
a video data generating unit configured to
in response to determining that the frequency of the driver's action detected by the frequency detecting unit is less than a predetermined value, clip the video data to generate clipped video data comprising a normal clipping range analogous to a range viewable while the driver's head is in a position to look squarely at an optical rearview mirror, and
in response to determining that the frequency of the driver's action detected by the frequency detecting unit is equal to or greater than the predetermined value, clip the video data to generate the clipped video data comprising a shifted clipping range including a range that is not within the normal clipping range and is analogous to a range viewable while the driver's head is moved from the position to look squarely at the optical rearview mirror, wherein the shifted clipping range is obtained by shifting the normal clipping range in a first direction opposite to a second direction of a movement of the driver's head; and
a display controller configured to cause the display installed in the vehicle to display the clipped video data generated by the video data generating unit.
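The clipping-range decision above reduces to a small rule: below the frequency threshold, use the normal range; at or above it, shift the range opposite to the direction of the driver's head movement. The sketch below models that rule with a horizontal pixel offset; the function name, the signed-direction convention (positive = rightward head movement), and the fixed shift amount are assumptions, not JVC KENWOOD's implementation.

```python
def clipping_offset(action_frequency, frequency_threshold,
                    head_direction, shift_px):
    """Return the horizontal offset of the clipping window in pixels.

    head_direction: +1 for a rightward head movement, -1 for leftward.
    The shifted range moves opposite to the head movement, per the claim.
    """
    if action_frequency < frequency_threshold:
        return 0                 # normal clipping range
    return -shift_px if head_direction > 0 else shift_px
```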

US Pat. No. 10,715,743

SYSTEM AND METHOD FOR PHOTOGRAPHIC EFFECTS

International Business Ma...

1. A computer-implemented method for creating a photographic effect, comprising:
determining a group distance for a first group of pixels on an acquired image, wherein the first group of pixels represents a first object depicted in the acquired image;
determining a group distance for a second group of pixels on the acquired image, wherein the second group of pixels represents a second object depicted in the acquired image;
assigning a first expansion factor to the first group of pixels based on the group distance for the first group of pixels;
assigning a second expansion factor to the second group of pixels based on the group distance for the second group of pixels, wherein the second expansion factor is a value different from a value of the first expansion factor;
resizing the first group of pixels based on the assigned first expansion factor;
resizing the second group of pixels based on the assigned second expansion factor; and
creating a processed image based on the acquired image, wherein the processed image comprises the resized first group of pixels and the resized second group of pixels.
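The distance-to-expansion-factor mapping in this method can be sketched with a simple rule: nearer pixel groups receive larger factors than farther ones, so distinct objects are resized by distinct amounts. The inverse-distance rule, the function names, and modeling "resizing" as scaling a group's pixel count are all illustrative assumptions.

```python
def expansion_factor(group_distance, reference_distance=1.0):
    """Assign an expansion factor from a group's distance; closer groups get
    larger factors (inverse-distance rule, an assumption for illustration)."""
    return reference_distance / group_distance

def resize_group(pixel_count, factor):
    """Model resizing a pixel group as scaling its pixel count by the factor."""
    return round(pixel_count * factor)
```

Because the factor depends only on group distance, two objects at different depths necessarily receive different factors, which is the condition the claim imposes on the first and second groups.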

US Pat. No. 10,715,742

MODULAR SURVEILLANCE TOWER SYSTEM

1. A modular surveillance tower system (10) for monitoring a site (12) and for modifying the system as a function of changing conditions, the system comprising, in combination:
a base (16) in a tubular configuration and having an upper end (18) and a lower end (20), the lower end being positioned on a recipient surface at the site to be monitored, the upper end being at an elevation sufficient to allow monitoring of the site, the base being hollow and having a circular cross sectional configuration with a base diameter, the base having a base height between the upper end and the lower end, the base height being several times greater than the base diameter; a support plate (22) at the lower end of the base, to allow the system to be freestanding;
a plurality of modules (24A) (24B) (24C) (24D), each module being similarly configured in a tubular configuration, each module having an upper end (28) and a lower end (30), each module being hollow and having a circular cross sectional configuration, with a module diameter equal to the base diameter, each module having a module height between the upper end and the lower end, the module height being greater than the module diameter; a cylindrical projection (32) on the lower end of each module to facilitate removable coupling to the base or another one of the plurality of modules there beneath, the diameter of the cylindrical projection being smaller than the module and base diameter;
the plurality of modules including a lowermost module (24A) removably supported on the upper end of the base, a hub (36) positioned within the lowermost module, the hub adapted to receive digital data, manipulate received digital data, and transmit manipulated digital data;
the plurality of modules including a lower module (24B) removably supported on the upper end of the lowermost module, a first window (48) positioned in the lower module, a first camera (50) elevationally adjustable within the lower module adjacent to the first window, the first camera adapted to monitor a first half of the site and generate first images, the first camera adapted to transmit digital data to the hub representative of the first images;
the plurality of modules including an upper module (24C) removably supported on the upper end of the lower module, a second window (54) positioned in the upper module, the second window being located 180 degrees rotated from the first window, a second camera (56) elevationally adjustable within the upper module adjacent to the second window, the second camera adapted to monitor a second half of the site and generate second images, the second camera adapted to transmit digital data to the hub representative of the second images; the plurality of modules including an uppermost module (24D) removably supported on the upper end of the upper module, a third window (60) positioned in the uppermost module, a fourth window (62) positioned in the uppermost module, the fourth window being located 180 degrees rotated from the third window, a first infrared lighting fixture (64) positioned within the uppermost module adjacent to the third window, a second infrared lighting fixture (66) positioned within the uppermost module adjacent to the fourth window; a lid (70) positioned on the upper end of the uppermost module, the lid and the base and the first module and the second module and the third module and the fourth module being fabricated of a rigid material and constituting a tower (72);
a transceiver (76) in the base adapted to transmit and receive digital signals to and from the first camera and the second camera and the first lighting fixture and the second lighting fixture; a digital device (78)(80) located remote from the tower adapted to control the system, the digital device selected from the class consisting of lap tops (78) and smart phones (80); and a processor (84) operatively coupled between the digital device and the tower for communicating there between.

US Pat. No. 10,715,741

OCCULTING DEVICE AND METHODS OF USING SAME

1. An occulting device comprising:
a sensor capable of capturing an image;
a computer in communication with the sensor and configured and arranged to receive a captured image from the sensor and select a portion of the image as a target;
a photochromic film disposed in a field of view of the sensor;
a projector disposed adjacent the photochromic film and configured and arranged to darken portions of the film that correspond to the target while leaving other portions of the film clear; and
a motor, a shaft coupled to the motor, the shaft being coupled to the photochromic film and configured and arranged to move the photochromic film.

US Pat. No. 10,715,739

ELECTRONIC DEVICE INCLUDING LIGHT-EMITTING ELEMENTS AND METHOD OF OPERATING ELECTRONIC DEVICE

SAMSUNG ELECTRONICS CO., ...

1. An electronic device comprising:
a camera configured to obtain an image of a subject;
a light source comprising a plurality of light-emitting diodes; and
at least one processor configured to:
divide the image into a plurality of regions;
determine a distance to each of the plurality of regions, based on any one or any combination of an area of the subject included in each of the plurality of regions and the distance to the subject; and
control a luminance of each of a light respectively emitted by the plurality of light-emitting diodes to a plurality of photographing regions corresponding to the plurality of regions, based on the distance to each of the plurality of regions.
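Controlling each LED's luminance from its region's distance can be sketched with an inverse-square compensation rule: light falling on a region weakens with the square of distance, so driving farther regions quadratically harder illuminates them as brightly as near ones. The quadratic rule, the clamp to a hardware maximum, and all names are assumptions for illustration, not Samsung's claimed control law.

```python
def led_luminance(distance, base_luminance=1.0, max_luminance=10.0):
    """Scale an LED's drive level with the square of its region's distance
    (inverse-square compensation), clamped to the hardware maximum."""
    return min(base_luminance * distance ** 2, max_luminance)
```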

US Pat. No. 10,715,738

ASYMMETRICAL LICENSE PLATE READING (ALPR) CAMERA SYSTEM

Axon Enterprise, Inc., S...

1. A light emitting apparatus mountable to a transportation vehicle, the apparatus comprising:
a body;
a camera;
a light source comprising a plurality of light emitting diodes configured to emit light; and
a micro-controller communicatively coupled to the light source and the camera,
a distance measurement component communicatively coupled to the micro-controller and configured to measure an approximate distance to a target vehicle in a lane near one on which the transportation vehicle is traveling; and
a speed delta measurement component communicatively coupled to the micro-controller and configured to calculate a relative speed of the target vehicle in the lane relative to a speed of the transportation vehicle in its own lane,
wherein the micro-controller is configured to dynamically adjust at least illumination power of the light source and exposure time of the camera based on the approximate distance measured by the distance measurement component and the relative speed calculated by the speed delta measurement component,
wherein the light emitted by the plurality of light emitting diodes creates an asymmetrical illumination cone pattern towards a lane near to one on which the transportation vehicle is traveling.

US Pat. No. 10,715,737

IMAGING DEVICE, STILL IMAGE CAPTURING METHOD, AND STILL IMAGE CAPTURING PROGRAM

FUJIFILM Corporation, To...

1. An imaging device comprising:
a MOS type imaging element comprising a plurality of pixels;
a mechanical shutter disposed in front of the imaging element;
a driving unit that drives the imaging element;
an imaging control unit that performs still image exposure control for controlling the driving unit in a state in which the mechanical shutter is open to simultaneously start exposure of the plurality of pixels and closing the mechanical shutter to simultaneously end the exposure of the plurality of pixels, first readout control for controlling the driving unit during the exposure to read out a first captured image signal from first pixels included in the plurality of pixels at each of a plurality of timings during the exposure, and second readout control for reading out, after the exposure ends, a second captured image signal from at least second pixels included in the plurality of pixels and being other than the first pixels from among the first pixels and the second pixels;
a display control unit that generates live view image data based on the first captured image signal sequentially read out from the first pixels through the first readout control and causes an image based on the live view image data to be displayed on a display unit;
a first storage unit that integrates and stores the first captured image signal sequentially read out from the first pixels through at least the first readout control among the first readout control and the second readout control, and stores the second captured image signal read out from the second pixels through the second readout control; and
an image processing unit that processes a third captured image signal comprising the first captured image signal and the second captured image signal stored in the first storage unit to generate captured image data and stores the captured image data in a storage medium,
wherein the imaging control unit performs control for reading out the first captured image signal from the first pixels and reading out the second captured image signal from the second pixels as the second readout control in a case in which a first timing at which the reading out of the first captured image signal is completed immediately before the exposure ends and a second timing at which the mechanical shutter is closed do not match each other, and performs control for reading out the second captured image signal from only the second pixels as the second readout control in a case in which the first timing and the second timing match each other.

US Pat. No. 10,715,735

HEAD-MOUNTED DISPLAY, DISPLAY CONTROL METHOD, AND PROGRAM

Sony Interactive Entertai...

1. A head-mounted display worn by a user, comprising:
a display section configured to be disposed in front of eyes of the user;
a camera attached to the head-mounted display worn by the user configured to capture images of what is ahead of the display section;
a detection section configured to detect varying directions of a visual line of the user's eyes;
a camera control section configured to change directions of an optical axis of the camera in accordance with the varying directions of the visual line of the user's eyes, such that the capture images change in accordance with the varying directions of the visual line of the user's eyes; and
a display control section configured to cause the display section to display partial images cut out from the capture images captured by the camera, wherein:
the partial images are cut out from different positions within the capture images in response to: (i) the varying directions of the visual line of the user's eyes, and (ii) changing directions of the optical axis of the camera, such that when the direction of the optical axis of the camera is changed, the different position in the capture image from which the partial image is cut out is changed opposite to the changed direction of the optical axis of the camera.

US Pat. No. 10,715,734

BIRD'S-EYE VIEW VIDEO GENERATION DEVICE, BIRD'S-EYE VIEW VIDEO GENERATION METHOD, AND NON-TRANSITORY STORAGE MEDIUM

JVC KENWOOD Corporation, ...

1. A bird's-eye view video generation device comprising:
a memory device;
a controller that performs functions of multiple components including:
a video data acquisition unit configured to acquire video data from a plurality of cameras configured to capture videos of surroundings of a vehicle;
a bird's-eye view video generator configured to generate a first bird's-eye view video from a virtual viewpoint at a position above the vehicle by performing viewpoint conversion processing on the video data acquired by the video data acquisition unit to synthesize viewpoint-converted video;
an obstacle information acquisition unit configured to acquire information from at least one detector configured to detect at least one obstacle around the vehicle and to specify a position of the detected obstacle on the first bird's-eye view video; and
a display controller configured to display the first bird's-eye view video in a display, wherein, when the position of the detected obstacle that is specified by the obstacle information acquisition unit overlaps a synthesis boundary between videos in the first bird's-eye view video, the bird's-eye view video generator is further configured to generate a second bird's-eye view video obtained by changing the position of the virtual viewpoint of the first bird's-eye view video to a position from which the detected obstacle does not overlap the synthesis boundary in the first bird's-eye view video, and the bird's-eye view video generator is further configured to change the position of the virtual viewpoint to a position from which the detected obstacle does not overlap the synthesis boundary in the first bird's-eye view video due to a display area, after changing the position of the virtual viewpoint, of the viewpoint-converted video of the plurality of cameras in which the detected obstacle is contained becoming wider than that of the viewpoint-converted video before changing the position of the virtual viewpoint.

US Pat. No. 10,715,733

IMAGING APPARATUS, IMAGING-DISPLAYING APPARATUS, AND CONTROL METHOD THEREOF

Seiko Epson Corporation, ...

1. An imaging apparatus comprising:
an imaging unit which outputs an imaging signal denoting a captured image;
a storage unit;
an image processing unit which generates an image signal by performing image processing with respect to the imaging signal, and writes the image signal in the storage unit;
an image signal output unit which reads the image signal from the storage unit, and outputs the image signal to a display unit; and
a timing control unit which
generates a first vertical synchronizing signal for driving the imaging unit and supplies the first vertical synchronizing signal to the imaging unit; and
generates a second vertical synchronizing signal which is obtained by delaying the first vertical synchronizing signal at least by a predetermined time which is variable according to contents of the image processing performed and supplies the second vertical synchronizing signal to the display unit,
wherein the timing control unit controls the image signal output unit so that the image signal is read from the storage unit by being delayed by a phase difference between the first vertical synchronizing signal and the second vertical synchronizing signal, after outputting of the imaging signal from the imaging unit.

US Pat. No. 10,715,732

TRANSMISSION APPARATUS, SETTING APPARATUS, TRANSMISSION METHOD, RECEPTION METHOD, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image transmission apparatus comprising:
a reception unit configured to receive at least one of a first command and a second command from an external apparatus, wherein the first and second commands are commands to superpose first text information on a picked-up video image picked up by an image pickup unit; and
a transmission unit configured to transmit a superposed video image in which the first text information is superposed on the picked-up video image, when the reception unit receives at least one of the first command and the second command,
wherein, in a case where the reception unit receives the first command, the superposed video image is a video image in which the first text information is always superposed on a fixed position in the picked-up video image, regardless of a change of at least one of a pan angle, a tilt angle, and a zoom magnification of the image pickup unit, and
wherein, in a case where the reception unit receives the second command, the superposed video image is a video image in which the first text information is superposed in accordance with the change of at least one of the pan angle, the tilt angle, and the zoom magnification.

US Pat. No. 10,715,731

IMAGE STABILIZATION IN ZOOM MODE

Qualcomm Incorporated, S...

1. A method for image processing at a device, comprising:
generating a non-zoom autofocus configuration, a first zoom autofocus configuration, and a second zoom autofocus configuration;
detecting blurring in one or more images from a stream of images captured by a camera based at least in part on monitoring the stream of images;
identifying a level of zoom based at least in part on detecting the blurring;
selecting an autofocus configuration from a plurality of autofocus configurations based at least in part on the identified level of zoom satisfying a first zoom threshold, or a second zoom threshold greater than the first zoom threshold, or both, the plurality of autofocus configurations including at least one of the non-zoom autofocus configuration, the first zoom autofocus configuration, or the second zoom autofocus configuration; and
performing an autofocus operation using the selected autofocus configuration.
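The configuration-selection step of this claim can be sketched as a simple threshold check; the threshold values and configuration names below are illustrative assumptions only:

```python
def select_autofocus_config(zoom_level, first_threshold=2.0, second_threshold=4.0):
    """Select among non-zoom / first-zoom / second-zoom autofocus configurations
    based on the identified zoom level and the two zoom thresholds
    (second threshold greater than the first)."""
    if zoom_level >= second_threshold:
        return "second_zoom_config"
    if zoom_level >= first_threshold:
        return "first_zoom_config"
    return "non_zoom_config"
```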

US Pat. No. 10,715,730

DAMPER ARRANGEMENT FOR ACTUATOR DAMPING

Apple Inc., Cupertino, C...

1. A device, comprising:
a stationary component of a lens actuator;
a dynamic component of the lens actuator, wherein the dynamic component is configured to hold one or more lens elements and move at least along an optical axis of the one or more lens elements;
a pocket configured in at least one of the stationary component or the dynamic component, wherein the pocket comprises a viscoelastic material; and
an interface member extending from the stationary component or the dynamic component and at least partially embedded in the viscoelastic material, wherein the interface member is configured to traverse within the viscoelastic material to dampen motion of the dynamic component during operation of the lens actuator.

US Pat. No. 10,715,729

IMAGE PROCESSING APPARATUS FOR DETECTING MOVING SUBJECT, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image processing apparatus to detect a moving subject region, the image processing apparatus comprising:
a memory that stores instructions; and
one or more processors configured to execute the instructions to cause the image processing apparatus to function as:
a likelihood generation unit configured to detect a motion of a region in a first image and, based on at least two input images, generate a moving subject likelihood for each region in the first image,
a similarity detection unit configured to detect, as a detected similarity, a similarity between a target region and a peripheral region of the target region for at least one of the at least two input images, and
a correction unit configured to correct the generated moving subject likelihood of the target region based on the detected similarity and the generated moving subject likelihood of the peripheral region,
wherein the moving subject region is detected based on the corrected moving subject likelihood of the target region.

US Pat. No. 10,715,728

SUB-FRAME JITTER COMPENSATION

RAYTHEON COMPANY, Waltha...

1. An imaging detector including:
an image detection device that includes an array of digital pixels, each digital pixel including an output that provides a pulse each time a charge stored in the digital pixel exceeds a threshold;
an accelerometer connected to the image detection device; and
a readout integrated circuit (ROIC) connected to the accelerometer and connected to the output of each of the digital pixels and that receives pulses from each pixel, the ROIC including:
a plurality of accumulators, each of the plurality of accumulators associated with a respective digital pixel;
wherein the ROIC is configured to receive a pulse from a first digital pixel of the array of digital pixels and to assign the received pulse to an accumulator associated with another digital pixel of the array based on information received from the accelerometer.

US Pat. No. 10,715,727

SYNTHETIC LONG EXPOSURE IMAGE WITH OPTIONAL ENHANCEMENT USING A GUIDE IMAGE

APPLE INC., Cupertino, C...

1. A method comprising:
obtaining a plurality of source images;
stabilizing the plurality of source images to generate a plurality of stabilized images;
averaging the plurality of stabilized images to generate a synthetic long exposure image; and
generating a dynamism map indicative of a level of pixel value variation of the pixels of the plurality of stabilized images, wherein generating a particular pixel of the dynamism map at a particular pixel location comprises:
determining one or more pixel values of the plurality of stabilized images in a neighborhood surrounding the particular pixel location; and
determining a dynamism value of the particular pixel indicative of variation of the one or more pixel values.
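The averaging and dynamism-map steps of this claim can be sketched as below, using flat lists as one-row "frames" for brevity; the neighborhood radius and variance-based dynamism measure are assumptions, since the claim leaves the exact variation measure open:

```python
def synthetic_long_exposure(stabilized):
    """Average the stabilized frames pixel-by-pixel (frames as flat pixel lists)."""
    n = len(stabilized)
    return [sum(frame[i] for frame in stabilized) / n
            for i in range(len(stabilized[0]))]

def dynamism_map(stabilized, radius=1):
    """Per-pixel variance of pixel values in a neighborhood surrounding each
    pixel location, across all stabilized frames."""
    width = len(stabilized[0])
    result = []
    for i in range(width):
        lo, hi = max(0, i - radius), min(width, i + radius + 1)
        values = [frame[j] for frame in stabilized for j in range(lo, hi)]
        mean = sum(values) / len(values)
        result.append(sum((v - mean) ** 2 for v in values) / len(values))
    return result
```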

US Pat. No. 10,715,725

METHOD AND SYSTEM FOR HANDLING 360 DEGREE IMAGE CONTENT

Samsung Electronics Co., ...

1. A method for handling 360 degree image content captured by a 360 degree camera, the method comprising:
obtaining, by an electronic device, a sequence of 360 degree projection format image frames associated with the 360 degree image content and inertial measurement unit (IMU) rotation data associated with the 360 degree image content;
determining a first relative rotation angle for each of the 360 degree projection format image frames with respect to a first 360 degree projection format image frame by tracking a set of feature points in each of the 360 degree projection format image frames;
determining a second relative rotation angle for each of the 360 degree projection format image frames with respect to the first 360 degree projection format image frame by using the IMU rotation data;
computing, by the electronic device, a relative rotation of the 360 degree image content based on the first and second relative rotation angles; and
applying, by the electronic device, the relative rotation to at least one of the 360 degree projection format image frames.
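The fusion step of this claim (computing a relative rotation from both the feature-tracking angle and the IMU angle) could be sketched as a weighted blend; the weighting scheme below is purely an assumption, since the claim only requires that both angles be used:

```python
def fused_relative_rotation(feature_angle_deg, imu_angle_deg, imu_weight=0.5):
    """Blend the feature-tracking rotation estimate (first relative rotation
    angle) with the IMU-derived estimate (second relative rotation angle)."""
    return imu_weight * imu_angle_deg + (1.0 - imu_weight) * feature_angle_deg
```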

US Pat. No. 10,715,722

DISPLAY DEVICE, METHOD OF CONTROLLING THEREOF AND DISPLAY SYSTEM

SAMSUNG ELECTRONICS CO., ...

1. A display device comprising:
a display;
a communicator configured to communicate with a first external device, a first nearby display device connected to a second external device, and a second nearby display device connected to a third external device; and
a processor configured to:
control the communicator to transmit a first image received from the first external device and a second image received from the first nearby display device to the second nearby display device,
control the communicator to receive first device information on the first external device from the first external device, to receive second device information on the second external device corresponding to the second image from the first nearby display device, and to receive third device information on the third external device from the first nearby display device,
generate an image source list using the first, the second, and the third device information, and control the display to display the generated image source list, and
in response to receiving a control command to select one from among the first, the second, and the third external device as an image source, control the display to display an image corresponding to the selected external device.

US Pat. No. 10,715,721

CONTROL DEVICE, CONTROL METHOD, AND PROGRAM

Sony Corporation, Tokyo ...

1. A control device, comprising:
an imaging controller configured to control imaging in an imaging device; and
a detector configured to detect a photographic subject on a basis of a first captured image in which a detection region set at a part within a field angle of the imaging device has been imaged, wherein
the imaging controller makes the imaging device perform imaging of the detection region or imaging of an acquisition region set within the field angle of the imaging device,
in a case where the photographic subject has been detected on a basis of the first captured image, the imaging controller makes the imaging device perform imaging of the acquisition region, and
in a case where a plurality of the detection regions are set independently, the detector detects the photographic subject for each of the plurality of the detection regions on a basis of the first captured image corresponding to each of the plurality of the detection regions.

US Pat. No. 10,715,720

INTELLIGENT MANUAL ADJUSTMENT OF AN IMAGE CONTROL ELEMENT

Intuitive Surgical Operat...

1. A method for intelligent manual adjustment of a manually rotatable focus control of a camera, the method comprising:
a processor determining a depth value of a target area relative to the camera;
the processor determining a desirable focus adjustment according to a direction and an amount that a focal point of the camera is to be moved in order to coincide with the depth value of the target area; and
the processor assisting an operator of the manually rotatable focus control to the desirable focus adjustment by:
defining and setting a first rotation direction of the manually rotatable focus control as corresponding to moving the focal point towards the depth value of the target area, regardless of whether the focal point is in front of the depth value or behind the depth value;
defining and setting a second rotation direction of the manually rotatable focus control as corresponding to moving the focal point away from the depth value of the target area, regardless of whether the focal point is in front of the depth value or behind the depth value;
upon receiving a first indication that the operator has manually rotated the manually rotatable focus control by a first rotation angle in the first rotation direction, moving the focal point towards the depth value of the target area according to the first rotation angle and the desirable focus adjustment; and
upon receiving a second indication that the operator has manually rotated the manually rotatable focus control by a second rotation angle in the second rotation direction, moving the focal point away from the depth value of the target area according to the second rotation angle and in an opposite direction than that of the desirable focus adjustment.
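The direction-agnostic mapping in this claim (the first rotation direction always moves the focal point towards the target, the second always away, regardless of which side of the target the focal point sits on) can be sketched as follows; the gain and direction labels are illustrative assumptions:

```python
def adjust_focal_point(focal_depth, target_depth, rotation, angle_deg, gain=0.1):
    """Move the focal point towards ('first') or away from ('second') the
    target depth, regardless of whether the focal point is in front of or
    behind the target."""
    step = gain * angle_deg
    towards = 1.0 if target_depth >= focal_depth else -1.0
    if rotation == "first":
        return focal_depth + towards * step
    return focal_depth - towards * step
```

Note that "first" from in front of the target (focal depth 0, target 10) and from behind it (focal depth 20, target 10) both move the focal point closer to 10.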

US Pat. No. 10,715,719

IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF

CANON KABUSHIKI KAISHA, ...

1. An image capturing apparatus comprising:
a first display capable of displaying a predetermined item and being seen through a finder;
a detection unit that detects a touch operation on a touch panel;
an operation unit, provided at a position different from the positions of the first display and the touch panel, capable of being operated by a finger of the same hand as another finger that is operating the touch panel; and
a control unit configured to
while a predetermined operation that is pressing the operation unit is not made to the operation unit and a subject image is seen through the finder, display the predetermined item at a position on the first display based on a position at which touch is started on the touch panel, not based on a display position of the predetermined item before the touch is started, in response to the touch operation detected by the detection unit, and
while the predetermined operation is made to the operation unit and the subject image is seen through the finder, display the predetermined item on the first display at a position moved from the position where the predetermined item is displayed before the touch position is moved by an amount corresponding to the moved amount of the touch position, and not to move the predetermined item in response to starting of the touch.

US Pat. No. 10,715,718

PHASE DETECT AUTO-FOCUS THREE DIMENSIONAL IMAGE CAPTURE SYSTEM

Taiwan Semiconductor Manu...

1. A method of capturing a three-dimensional image, the method comprising:
performing an image capture process while moving a lens to capture image data across a range of focal depths using a phase detect auto-focus sensor, wherein the phase detect auto-focus sensor has a plurality of pixels, and a pixel focus depth is determined for each pixel;
performing a three dimensional image reconstruction process to generate a three dimensional image based on the image data;
rendering a two-dimensional image including focused image data from across the range of focal depths; and
fusing the two dimensional image with the three dimensional image to generate a focused three dimensional model.
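The per-pixel focus-depth determination in the first step of this claim resembles depth-from-focus: for each pixel, pick the focal depth at which that pixel was sharpest. A minimal sketch (flat per-pixel sharpness lists; all names are assumptions):

```python
def depth_from_focus_stack(sharpness_stack, focal_depths):
    """For each pixel, return the focal depth whose frame has the highest
    sharpness score at that pixel (frames as flat lists of sharpness values)."""
    width = len(sharpness_stack[0])
    return [focal_depths[max(range(len(sharpness_stack)),
                             key=lambda f: sharpness_stack[f][i])]
            for i in range(width)]
```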

US Pat. No. 10,715,717

LENS CONTROL DEVICE, LENS CONTROL METHOD, AND RECORDING MEDIUM

CANON KABUSHIKI KAISHA, ...

1. A lens control device that performs control of a focus lens whose position information is acquirable as a relative position, comprising:
an input member configured to receive a designation from a user, wherein the input member receives the designation of the position of the focus lens in a predetermined mode,
at least one processor programmed to perform the operations of following units:
an acquisition unit configured to acquire first information regarding the focus position in the predetermined mode designated via the input member; and
a control unit configured to control the drive of the focus lens in the predetermined mode based on the first information,
wherein, when the control unit receives the designation via the input member, the control unit starts driving the focus lens from a current focus position to an end position of a movable range of the focus lens, and
wherein the first information is acquired based on information corresponding to a driving amount from a reset position to the current focus position.

US Pat. No. 10,715,716

ELECTRONIC DEVICE FOR RECORDING IMAGE AS PER MULTIPLE FRAME RATES USING CAMERA AND METHOD FOR OPERATING SAME

Samsung Electronics Co., ...

1. An electronic device, comprising:
a camera;
a display;
a memory; and
at least one processor configured to:
obtain a plurality of first images as per a first frame rate using the camera based on a signal related to image recording and control the camera to perform focusing of a lens included in the camera on at least one of one or more objects in the plurality of first images while obtaining the plurality of first images,
provide a first portion of the plurality of first images as a preview through the display,
control the camera to lock the focusing on the at least one object,
identify a designated event for slow motion recording while obtaining the plurality of first images,
based at least in part on the designated event, obtain a plurality of second images as per a second frame rate higher than the first frame rate using the camera focusing-locked on the at least one object, and
provide a video related to the at least one object using a second portion of the plurality of first images and at least one of the plurality of second images.

US Pat. No. 10,715,715

FOCUS DETECTION APPARATUS, METHOD OF CONTROLLING SAME, AND STORAGE MEDIUM

CANON KABUSHIKI KAISHA, ...

1. A focus detection apparatus, comprising:
an image sensor having a pixel line in which a plurality of pixels for receiving and photo-electrically converting a pair of light beams that passed through differing pupil areas of an imaging optical system are arrayed; and
at least one processor or circuit configured to function as following units:
a drive control unit configured to cause accumulation times of a plurality of pixel lines in the image sensor to differ each predetermined number of lines;
a first calculation unit configured to, after performing compensation processing for compensating for a difference of the accumulation times between lines based on signals of the accumulation times that differ between lines in the same frame, calculate a defocus amount by using the signals after the compensation processing, wherein the first calculation unit calculates a defocus amount corresponding to a first correlation direction by using signals of pixels arrayed in the first correlation direction, in accordance with a focusing status; and
a second calculation unit configured to calculate a defocus amount corresponding to a second correlation direction by using signals of pixels arrayed in the second correlation direction, in accordance with a focusing status.

US Pat. No. 10,715,714

MACHINE LEARNING-BASED DEVICE PLACEMENT AND CONFIGURATION SERVICE

Verizon Patent and Licens...

1. A method comprising:
storing, by a network device, a regression model that stores data associated with parameters derived from previously installed and tested video cameras at sites;
obtaining, by a network device, location information of a candidate site at which a service is to be provided in a service area based on an installation of one or more video cameras;
obtaining, by the network device, map information of the candidate site that includes the service area;
displaying, by the network device via an interactive graphical user interface, the map information;
receiving, by the network device via the interactive graphical user interface, one or more selections of one or more assets in the map information at which one or more virtual video cameras are to be virtually affixed;
receiving, by the network device via the interactive graphical user interface, geo-location parameters for the one or more virtual video cameras to be virtually affixed, wherein the geo-location parameters include location, height, heading, pitch, and roll;
calculating, by the network device, an anticipated accuracy of detection value for the service area or a portion of the service area based on the geo-location parameters, the regression model, the map information, and one or more fields of view for the one or more virtual video cameras;
determining, by the network device subsequent to the calculating, whether the anticipated accuracy of detection value satisfies a target accuracy of detection value for the candidate site; and
using the geo-location parameters to install and test the one or more video cameras at the candidate site in response to determining that the anticipated accuracy of detection value satisfies the target accuracy of detection value.
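The accuracy-prediction and acceptance steps of this claim can be sketched with a simple linear regression over the geo-location parameters; the linear form, weights, and bias below are illustrative assumptions (the patent's regression model is not specified here):

```python
def anticipated_accuracy(geo_params, weights, bias):
    """Predict an accuracy-of-detection value from geo-location parameters
    (e.g. location, height, heading, pitch, roll) via a linear model."""
    return bias + sum(p * w for p, w in zip(geo_params, weights))

def placement_acceptable(geo_params, weights, bias, target_accuracy):
    """Proceed with installing and testing the cameras only if the anticipated
    accuracy satisfies the target accuracy for the candidate site."""
    return anticipated_accuracy(geo_params, weights, bias) >= target_accuracy
```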

US Pat. No. 10,715,713

INTERACTIVE APPLICATION ADAPTED FOR USE BY MULTIPLE USERS VIA A DISTRIBUTED COMPUTER-BASED SYSTEM

Breakthrough PerformanceT...

1. A system, comprising:
a processing device;
a computer readable medium that stores programmatic instructions that, when executed by the processing device, are configured to cause the system to perform operations comprising:
detect a selection of a multimedia module comprising a plurality of components including a first component and a second component, the second component comprising a plurality of subcomponents including a first subcomponent corresponding to a first content type, a second subcomponent corresponding to a second content type, and a third subcomponent corresponding to a third content type;
render, during a first mode of operation, a dynamic, graphical, navigation flow control in association with first multimedia content of the multimedia module,
wherein the rendered dynamic, graphical, navigation flow control indicates the plurality of components and visually emphasizes the first component, the first component corresponding to a first current navigation position;
store a first result to memory obtained by execution of an interactive event during the first mode of operation;
based at least on the first result, enter a second mode of operation; and
re-render, during the second mode of operation, the dynamic, graphical, navigation flow control while the first content type and the second content type are displayed, and the third content type is not displayed;
wherein the re-rendered dynamic navigation flow control indicates the plurality of components and the plurality of subcomponents of the second component, and indicates a second current navigation position by visually emphasizing the first subcomponent of the second component corresponding to the displayed first content type, the second subcomponent of the second component corresponding to the displayed second content type, and not visually emphasizing the third subcomponent of the second component.

US Pat. No. 10,715,712

BARCODE READERS AND COMPONENT ARRANGEMENTS THEREOF

Symbol Technologies, LLC,...

1. A barcode reader comprising:
a housing having a handgrip portion and an upper body portion;
a first printed circuit board (PCB) extending into the upper body portion; and
an imaging module positioned within the upper body portion, the imaging module including:
an imaging system having an imager and an imaging lens assembly, the imaging system having a field of view with a central imaging axis passing through a window in the upper body portion and lying on a horizontal plane;
an aiming light system configured to emit an aiming light pattern, the aiming light system offset from the imaging system along the horizontal plane; and
a second PCB,
wherein the first PCB is positioned at an oblique angle relative to the central imaging axis, and
wherein both of the imager and a portion of the aiming light system are positioned on the second PCB.

US Pat. No. 10,715,711

ADAPTIVE THREE-DIMENSIONAL IMAGING SYSTEM AND METHODS AND USES THEREOF

Marvel Research Ltd., Wa...

1. An adaptive 3D imaging system comprising:
an imaging part and a lens part detachably connected thereto, wherein the lens part has a first end and a second end;
the imaging part comprising a sensor and a reflector configured to transmit a plurality of captured light field images to the sensor;
wherein the lens part comprises a first camera lens positioned at the first end of the lens part, a second camera lens positioned at the second end of the lens part, an entrance pupil plane and matching device positioned between the first camera lens and the second camera lens and being adaptive to different focal lengths of the second camera lens, and an internal reflection unit positioned between the first camera lens and the entrance pupil plane and matching device and configured to decompose the captured light field images and refract them into a plurality of secondary images with different angular offsets.

US Pat. No. 10,715,709

IMAGING DEVICE, OPTICAL DEVICE PROVIDED WITH SAME, ELECTRONIC DEVICE PROVIDED WITH SAME, AND METHOD FOR PRODUCING IMAGING DEVICE

NIDEC COPAL CORPORATION, ...

1. An imaging device comprising:
a lens holding member holding at least one lens;
a base member holding the lens holding member;
a substrate, secured to the base member, that has an imaging element;
a screw-fastening member in contact with the base member, wherein the lens holding member is movable in a direction of an optical axis of the lens by rotating the screw-fastening member with respect to the base member; and
a biasing member configured to bias the lens holding member away from the substrate,
wherein the lens holding member is biased in a direction away from the substrate,
wherein the screw-fastening member includes a pressing member having a first screw-fastening portion,
wherein the base member has a second screw-fastening portion,
wherein the pressing member is biased together with the lens holding member in a direction away from the substrate, wherein the first screw-fastening portion and the second screw-fastening portion are movable from a first position that are not in contact with each other to a second position that are in contact with each other when being pressed by a biasing force of the biasing member.

US Pat. No. 10,715,708

CAMERA MODULE PACKAGE AND METHOD OF MANUFACTURING THE SAME

Hyundai Mobis Co., Ltd., ...

1. A camera module package for a vehicle comprising:
a front body comprising a housing, a lens, and a retainer;
a rear body disposed behind the front body; and
a printed circuit board (PCB) provided with an image sensor and configured to be mounted between the front body and the rear body, wherein:
the front body is formed of a laser light transmissive molded body,
the rear body is formed of a laser light absorptive molded body,
the front body, the rear body, and the PCB are subjected to six-axis alignment and then laser-welded together, and
the retainer is made of rubber or silicone material.

US Pat. No. 10,715,707

IMAGING DEVICE ABLE TO PERFORM AUTO FOCUS AND VIBRATION COMPENSATION

SINTAI OPTICAL (SHENZHEN)...

1. An imaging device comprising:
a base;
a prism module fixedly disposed in the base;
a lens module comprising a lens unit, a first carrier configured to fix the lens unit, and a second carrier configured to accommodate the first carrier, wherein the lens unit has an optical axis extending in a first direction, the second carrier is connected to the base through an axial elastomer extending in the first direction, and the prism module and the lens module are disposed in the base and sequentially arranged in the first direction; and
an imaging module;
wherein the second carrier comprises two side walls opposite to each other, an end wall connected to end portions of the side walls, and a bottom plate connected between bottom portions of the side walls; the lens module further comprises a planar elastomer configured to provide a restoring force for return of the first carrier and the second carrier when the first carrier and the second carrier are moved away from each other in the first direction; the planar elastomer is connected between the first carrier and the second carrier; the planar elastomer is in a plane defined by the first direction and a third direction; and the third direction is perpendicular to the first direction.

US Pat. No. 10,715,706

CAMERA MODULE AND ELECTRONIC EQUIPMENT USING THE CAMERA MODULE

NANNING FUGUI PRECISION I...

1. A camera module comprising:
a light sensing module comprising a light sensing element adapted for receiving lights and converting lights into image signals;
a control circuit adapted for setting a photographing setting according to a control signal, and processing the image signals according to the photographing setting;
a mounting member, wherein one side of the mounting member is contacted with the light sensing module;
an external lens module detachably mounted on another side of the mounting member opposite to the light sensing module, the external lens module comprising a lens and at least one magnetic component, wherein the at least one magnetic component has a configuration corresponding to a type of the external lens module;
a magnetic sensing element adapted for detecting the configuration of the at least one magnetic component, and providing the control signal to the control circuit according to the configuration of the at least one magnetic component; and
wherein the configuration of the at least one magnetic component is selected from magnetic pole directions of the magnetic components facing the mounting member.

US Pat. No. 10,715,705

CHARACTERIZING OPTICAL CHARACTERISTICS OF OPTICAL ELEMENTS

Pony AI Inc., Grand Caym...

1. A system comprising:
an optical element mount configured to carry an optical element;
a calibration display configured to display a calibration object, the calibration object including a known visual pattern;
an image sensor mount configured to carry an image sensor;
one or more processors; and
a memory storing instructions that, when executed by the one or more processors, cause the system to perform:
obtaining images including different perspectives of the calibration object, the images being captured using the optical element carried by the optical element mount;
characterizing optical characteristics of the optical element based on the known visual pattern and the different perspectives of the calibration object; and
calibrating the image sensor to compensate the images, based on the optical characteristics of the optical element, for a distortion of light traveling through the optical element.

US Pat. No. 10,715,704

LENS BRACKET ASSEMBLY AND GIMBAL USED THEREWITH

SZ DJI OSMO TECHNOLOGY CO...

1. A lens bracket assembly for supporting an imaging device, the imaging device comprising a body and a lens connected to the body, the lens bracket assembly comprising:
a mounting plate, configured for sliding along one or more guiding posts via a first adaptor;
a supporting plate slidably disposed on the mounting plate, the supporting plate being configured for mounting the imaging device and comprising a first side;
a bracket, comprising:
a supporting portion having a shape that matches the lens of the imaging device, and configured for supporting the lens; and
a fixing portion connected with the supporting portion and fixedly arranged at the first side, the fixing portion having a guide hole; and
a fixing member, comprising an assemble-in portion and an abutting portion connected to the assemble-in portion, wherein the assemble-in portion passes through the guide hole and a first fixing hole provided at the first side of the supporting plate, and the abutting portion abuts against the fixing portion.

US Pat. No. 10,715,703

SELF-LEVELING CAMERA HEADS

SEESCAN, INC., San Diego...

1. A self-leveling video inspection camera system, comprising:
a camera head with a substantially cylindrical outer housing;
an illumination window positioned at the front end of the cylindrical housing assembly;
a lens assembly for focusing images or video from in front of the cylindrical housing assembly;
a high strength scratch resistant material window positioned in front of the lens assembly in a central aperture within the illumination window;
a camera module assembly, including an image sensor, supported inside an interior opening in the cylindrical housing assembly behind the lens assembly;
a plurality of lighting elements positioned within the outer housing to provide light through the illumination window to illuminate an area being imaged;
a slip ring assembly removably coupled to the camera module assembly, wherein the slip ring assembly includes a plurality of elongate conductive contact brushes having proximal ends electrically coupled to spaced apart opposing circuit board assemblies and a plurality of contact rings electrically coupled to corresponding ones of the contact brushes; and
a processing circuit for receiving the output signal from the image sensor and an orientation signal from an angular orientation sensor and providing an output image or video signal at a predetermined angular orientation.

US Pat. No. 10,715,702

METHOD OF AND CIRCUIT FOR PREDISTORTION FOR A CABLE TV AMPLIFIER

Xilinx, Inc., San Jose, ...

1. A digital predistortion (DPD) system, comprising:
an input configured to receive a DPD input signal;
a first non-linear datapath coupled to the input, wherein the first non-linear datapath is configured to add a non-linear mirror image component to the DPD input signal to provide a first non-linear signal that is used to generate a first predistortion signal;
a linear datapath coupled to the input in parallel with the first non-linear datapath to generate a second predistortion signal; and
a first combiner configured to combine the first predistortion signal and the second predistortion signal to generate a DPD output signal.

US Pat. No. 10,715,701

DATA GENERATING APPARATUS GENERATING CONTROL DATA FOR COMMON USE IN PRINTING PROCESSES PERFORMED ON A PLURALITY OF PRINT EXECUTION UNITS

BROTHER KOGYO KABUSHIKI K...

1. A data generating apparatus comprising a processor configured to perform:
acquiring offset information indicating a deviation in a colorant usage used by a specific print execution unit from a standard amount of usage, the specific print execution unit being one of a plurality of print execution units, the standard amount of usage being a standard quantity concerning colorant used by the plurality of print execution units;
controlling the specific print execution unit to print a first patch image based on first patch image data without using the offset information;
generating first control data using first read data obtained by optically reading the first patch image, the first control data being for common use in the printing processes performed on the plurality of print execution units, wherein the first control data is generated for use in a halftone process;
controlling the specific print execution unit to print a second patch image based on second patch image data by using the first control data without using the offset information;
generating second control data using second read data obtained by optically reading the second patch image, the second control data being for common use in printing processes performed on the plurality of print execution units;
generating corrected second control data by correcting the second control data using the offset information, the corrected second control data being for use in the specific print execution unit, wherein the second control data and the corrected second control data are generated for use in a tone correction process;
controlling the specific print execution unit to print a third patch image based on third patch image data using the corrected second control data, the third patch image being printed separately from the second patch image;
generating third control data using third read data obtained by optically reading the third patch image, the third control data being for common use in the printing processes performed on the plurality of print execution units, the optically reading the third patch image being executed separately from the optically reading the second patch image, and wherein the third control data is generated for use in a color conversion process;
wherein controlling the specific print execution unit to print the third patch image comprises:
generating the print data for the third patch image by performing the tone correction process on the third patch image data by using the corrected second control data;
outputting the generated print data to the specific print execution unit to print the third patch image;
wherein the print data for the third patch image is generated by using the corrected second control data and the first control data, and when controlling the specific print execution unit to print the third patch image, the processor is further configured to perform:
generating print data for the third patch image by performing the tone correction process on the third patch image data using the corrected second control data and by performing the halftone process using the first control data.

US Pat. No. 10,715,700

PROFILE ADJUSTMENT METHOD, PROFILE ADJUSTMENT APPARATUS, AND PROFILE ADJUSTMENT SYSTEM

Seiko Epson Corporation, ...

1. A profile adjustment method for causing a computer to execute a process of adjusting a to-be-adjusted profile by using a color conversion module performing a color conversion process of converting coordinate values in a color space with reference to a profile, the profile adjustment method comprising:setting a color conversion module included in the one or more color conversion modules as a target module that is a color conversion module to be used;
causing the target module to execute the color conversion process of converting input coordinate values in an input color space into output coordinate values in an output color space; and
executing a process of using a conversion result by the target module to adjust the to-be-adjusted profile,
wherein the profile includes an input profile representing a correspondence relationship between first coordinate values in a first device dependent color space and device independent coordinate values in a profile connection space, an output profile representing a correspondence relationship between the device independent coordinate values and second coordinate values in a second device dependent color space, and a device link profile representing a correspondence relationship between the first coordinate values and the second coordinate values,
the setting the color conversion module includes accepting a first conversion setting involving passage through the profile connection space or a second conversion setting avoiding the passage through the profile connection space when converting the input coordinate values in the first device dependent color space into the output coordinate values in the second device dependent color space, and
the causing the target module to execute the color conversion process includes,
in a case where the first conversion setting is accepted, referring to the input profile and the output profile to cause the target module to execute the color conversion process of converting the input coordinate values in the first device dependent color space into the output coordinate values in the second device dependent color space, and
in a case where the second conversion setting is accepted, referencing the device link profile to cause the target module to execute the color conversion process of converting the input coordinate values in the first device dependent color space into the output coordinate values in the second device dependent color space.

US Pat. No. 10,715,699

INFORMATION PROCESSING APPARATUS

KYOCERA DOCUMENT SOLUTION...

1. An information processing apparatus, comprising:a marking extraction circuit that extracts, from a script image including a marking superimposed on a character string, the marking;
a character string comparison circuit that extracts, from the script image, a character string on which a marking is not superimposed, the character string being the same as that on which a marking is superimposed;
a blank creation circuit that creates a plurality of blank images to be respectively superimposed on the plurality of character strings on which the markings are respectively superimposed, each of the plurality of blank images having a shape and a position with which the character string on which the marking is superimposed and edge portions of the marking sticking out from the character string are hidden by each of the plurality of blank images; and
an image synthesis circuit that creates a synthesis image by synthesizing the script image, the plurality of blank images, and a symbol image as an image of the allocated symbols.

US Pat. No. 10,715,697

IMAGE PROCESSING METHOD THAT DETERMINES A NUMBER OF BLOCKS FOR EMBEDDING INFORMATION BASED ON A SIZE OF AN IMAGE TO BE PRINTED, IMAGE PROCESSING APPARATUS, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image processing method, performed by at least one processor by executing a program stored in a memory, comprising:obtaining additional information;
obtaining size information related to a size of an image and a size of a print medium on which at least a portion of the image is to be printed;
determining positions of blocks in the image on the basis of the obtained size information; and
embedding the additional information in each of the blocks of the determined positions,
wherein in a case where it is determined on the basis of the obtained size information that the image is to be printed with a size larger than the print medium, positions of blocks are determined such that an edge of a block is overlaid on an edge of the print medium.

US Pat. No. 10,715,696

INFORMATION PROCESSING SYSTEM INCLUDING IMAGE FORMING APPARATUS THAT TRANSMITS IMAGE DATA TO INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING APPARATUS THAT DETERMINES STORAGE AREA TO STORE IMAGE DATA, INFORMATION PROCESSING APPARATUS THAT DETE

KYOCERA DOCUMENT SOLUTION...

1. An information processing system, comprising:an image forming apparatus capable of communicating via facsimile; and
an information processing apparatus capable of communicating with the image forming apparatus via a network,
the image forming apparatus including
an input unit that receives input information input by a user, and outputs the input information to a first controller circuit, and
the first controller circuit that transmits image data transmitted via facsimile, facsimile transmission information including information of one or more items about the facsimile transmission, and additional information including additional information of one or more items about the image data, in association with each other, to the information processing apparatus via the network,
the information processing apparatus including a second controller circuit that
receives the image data, the facsimile transmission information, and the additional information, in association with each other, from the image forming apparatus via the network, and
determines a storage area to store the image data based on information of at least one item out of the information of the one or more items included in the facsimile transmission information and/or based on information of at least one item out of the information of the one or more items included in the additional information, wherein
the first controller circuit of the image forming apparatus
reads the additional information from the image data by executing optical character recognition on at least a part of the image data,
generates the additional information based on the input information input in the input unit by the user, and
where additional information of a certain item included in the additional information read from the image data is different from additional information of the certain item included in the additional information generated based on the input information input in the input unit, transmits the additional information of the certain item included in the additional information read from the image data, with a higher priority, to the information processing apparatus.

US Pat. No. 10,715,695

IMAGE PROCESSING APPARATUS

Konica Minolta, Inc., To...

1. An image processing apparatus comprising:a volatile storage device;
a hardware processor that executes an information process using the volatile storage device; and
a nonvolatile storage device that stores data to be used for the information process, wherein
in response to a condition that makes it impossible to continue the information process being satisfied while the information process is being executed, the hardware processor generates a snapshot of the volatile storage device and stores the snapshot in an area of the nonvolatile storage device having a use other than snapshot storage,
in response to a condition that makes it possible to resume the information process being satisfied, the hardware processor reads the snapshot from the nonvolatile storage device and resumes the information process,
when the information process is a process of dividing and transmitting image data of a plurality of pages on a page basis, the hardware processor generates the snapshot after transmitting image data of any of the plurality of pages, and
when the information process is a process of collectively transmitting the image data of the plurality of pages, the hardware processor generates the snapshot after transmitting the image data of the plurality of pages.
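The two snapshot-timing branches in the last two limitations can be sketched as follows (the function names and the list-based stand-ins for transmission and snapshotting are hypothetical):

```python
def transmit_with_snapshots(pages, per_page, send, snapshot):
    """Snapshot timing per the claim: when dividing and transmitting the
    image data page by page, generate a snapshot after each page is sent;
    when transmitting the pages collectively, generate a single snapshot
    only after all pages have been sent."""
    if per_page:
        for page in pages:
            send(page)
            snapshot()   # process is resumable after every transmitted page
    else:
        for page in pages:
            send(page)
        snapshot()       # process is resumable only after the whole batch

sent, snapshots = [], []
transmit_with_snapshots(["page-1", "page-2"], True, sent.append,
                        lambda: snapshots.append(len(sent)))
print(snapshots)  # one snapshot per transmitted page
```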

US Pat. No. 10,715,694

IMAGE FORMING INSTRUCTION DEVICE, IMAGE FORMING INSTRUCTION METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR BORDERLESS AND NON-BORDERLESS PRINTING

FUJI XEROX CO., LTD., To...

1. An image forming instruction device comprising:a processor, configured to execute a selection module and an outer edge checking module,
wherein the selection module selects either a setting for borderless printing or a setting for non-borderless printing based on whether the print data is an image generated by taking a screenshot,
wherein if the print data is the image generated by taking the screenshot, the selection module selects the setting for borderless printing,
wherein if the print data is not the image generated by taking the screenshot, the outer edge checking module detects an image on an outer edge of print data for each edge of the print data and outputs a detection result to the selection module,
wherein the selection module selects either a setting for borderless printing or a setting for non-borderless printing, for each edge of the print data.
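A hedged reading of the per-edge selection logic, as a Python sketch (the edge names and the dictionary representation are assumptions for illustration, not from the patent):

```python
def select_print_setting(is_screenshot, edge_has_image):
    """Per-edge borderless selection: a screenshot prints borderless on all
    edges; otherwise each edge is borderless only when the outer edge
    checking detected an image on that edge (edge_has_image maps an edge
    name to that detection result)."""
    if is_screenshot:
        return {edge: "borderless" for edge in edge_has_image}
    return {edge: "borderless" if detected else "non-borderless"
            for edge, detected in edge_has_image.items()}

print(select_print_setting(False, {"top": True, "bottom": False}))
```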

US Pat. No. 10,715,692

SYSTEMS AND METHODS FOR LOCALIZING A USER INTERFACE BASED ON AN INPUT DOCUMENT

XEROX CORPORATION, Norwa...

1. A method for presenting a local language user interface on a multi-function device, the method comprising:receiving a hard copy document by the multi-function device, the multi-function device configured to scan the hard copy document, the hard copy document comprising text information in at least one local language of a user, wherein the at least one local language is a single local language or the at least one local language comprises multiple local languages, and the multi-function device comprises a user interface configured in a default mode to display one or more functions of the multi-function device in a default language and configured in a local mode to display the one or more functions of the multi-function device in the single local language or in a local language selected from the multiple local languages;
scanning a portion of the hard copy document to create an electronic document;
analyzing the electronic document to identify the at least one local language; and
upon identification of the single local language from the electronic document, automatically presenting the user interface in the local mode in the single local language, upon identification of the multiple local languages, displaying a local language option on the user interface, based on the analysis of the electronic document, for the user to select a local language from the multiple local languages identified from the electronic document as a selected local language, and upon selection of the selected local language, presenting the user interface in the local mode in the selected local language, wherein the user interface in the local mode enables the user to operate the multi-function device in the single local language or the selected local language.
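The branch on a single detected language versus multiple detected languages can be sketched as follows (the names are illustrative; `choose` stands in for the on-screen local language option):

```python
def localize_ui(detected_languages, choose):
    """Pick the UI language per the claim: present the local mode directly
    in a single detected language; when several languages are detected,
    display a selection option (modeled here by the `choose` callback)
    and use the language the user picks."""
    if len(detected_languages) == 1:
        return detected_languages[0]   # automatic local-mode presentation
    return choose(detected_languages)  # user-selected local language

print(localize_ui(["hi"], lambda langs: langs[0]))
print(localize_ui(["hi", "ta"], lambda langs: langs[1]))
```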

US Pat. No. 10,715,691

IMAGE FORMING APPARATUS INCLUDING AN ILLUMINATOR TO ILLUMINATE A WORK TARGET LOCATED IN A SPACE BETWEEN AN IMAGE READER AND AN IMAGE FORMER

KONICA MINOLTA, INC., To...

1. An image forming apparatus of an internal ejection type in which a sheet ejection space is provided between an image reader and an image former, the image forming apparatus comprising:an illuminator provided on a wall surface partitioning the sheet ejection space;
a detector that detects a position of a work target enabled to move in pulling-out and storing directions within the sheet ejection space, the work target being movable relative to the illuminator in the sheet ejection space; and
a controller that performs on/off control of the illuminator based on a detection by the detector,
wherein in a case in which it is detected that the work target is in a work position for performing human work, the controller performs on/off control of the illuminator to illuminate a work target portion of the work target that faces an interior of the sheet ejection space and is to be subjected to the human work,
wherein the work target is a post-processor that performs post-processing on an ejected sheet, and
wherein the image forming apparatus further comprises:
a mounting determinator that determines whether or not a post-processor is mounted within the sheet ejection space,
wherein in a case in which a determination by the mounting determinator is negative, the controller performs on/off control of the illuminator to illuminate a sheet ejected into the sheet ejection space.

US Pat. No. 10,715,690

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD FOR DESIGNATING COLOR AND MONOCHROME IMAGES

TOSHIBA TEC KABUSHIKI KAI...

11. A multifunctional peripheral device, comprising:a scanner for obtaining a document image;
a printer for printing color and monochrome images on recording media;
an operation panel for displaying information and receiving user inputs associated with operations of the scanner and the printer;
a storage device for storing, in association with a user ID, a color feature threshold value and a history file including previously determined color feature threshold values of the user associated with the user ID; and
a processor configured to:
set the user ID according to a log-in process;
acquire a plurality of page images;
designate each page image as a color image or a monochrome image based on analysis of a color feature quantity value of the page image and the color feature threshold value stored in the storage device in association with the user ID;
display the designation of a color image or a monochrome image for the plurality of page images on the operation panel;
receive user input, via the operation panel, for changing whether at least one page image among the plurality of page images is designated a color image or a monochrome image;
calculate a new color feature threshold value if the received user input changes whether at least one image among the plurality of page images is designated a color image or a monochrome image, the new color feature threshold value being calculated based on the color feature quantity value of the at least one image having a designation changed by the user input; and
store the new color feature threshold value in the storage device in association with the user ID.

US Pat. No. 10,715,689

IMAGE PROCESSING APPARATUS, METHOD FOR CONTROLLING IMAGE PROCESSING APPARATUS, AND RECORDING MEDIUM

CANON KABUSHIKI KAISHA, ...

1. An apparatus comprising:a memory;
at least one processor coupled to the memory to:
execute an application;
register history of settings of the application in a time order in which the application is executed according to the setting;
display a screen containing the registered history of setting of the application;
receive user instruction to register, as a first setting, a setting of the application contained in the history;
register, as the first setting, the setting of the application contained in the history according to the user instruction;
display a main screen containing an object to invoke the setting of the application registered as the first setting;
register, as a second setting, the setting of the application registered as the first setting, without the user instruction; and
display a screen for setting the application containing an object capable of invoking the setting of the application registered as the second setting.

US Pat. No. 10,715,688

DISPLAY DEVICE CAPABLE OF NOTIFYING DISPLAY OBJECT BY VOICE, IMAGE PROCESSING APPARATUS, NOTIFYING METHOD, PROCESS EXECUTING METHOD

KYOCERA Document Solution...

1. A display device, comprising:a display portion;
a display processing portion configured to display, on the display portion, a plurality of predetermined display objects including an operation icon corresponding to a predetermined process;
a first process executing portion configured to execute the predetermined process in response to a touch operation on the operation icon;
a detecting portion configured to detect, by using a plurality of sensors provided side by side along an edge portion of the display portion, a shielded area in the display portion shielded by a shielding object, the shielded area including at least a part of the edge portion, the detecting portion being provided independently of a touch panel that is configured to detect an operation performed on the display portion;
a notification processing portion configured to notify by voice, among the plurality of predetermined display objects, a display object that includes a display area partially or completely overlapped with the shielded area detected by the detecting portion; and
a storage portion configured to store a plurality of pieces of notification information which are preliminarily made to respectively correspond to the display objects and each indicate a piece of voice notification content, wherein
the display objects each include text information indicating content of the respective display objects,
the notification information includes the text information included in each of the display objects, and
the notification processing portion is configured to:
acquire, when the display object including the display area partially or completely overlapped with the shielded area detected by the detecting portion is displayed, the piece of notification information corresponding to the display object,
notify the display object by voice based on the acquired piece of notification information, and
notify, when there are multiple display objects to be notified by voice, the multiple display objects sequentially.

US Pat. No. 10,715,687

INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING METHOD FOR TESTING A SERIES OF PROCESSES OF ONE OR MORE APPLICATIONS

Ricoh Company, Ltd., Tok...

1. An information processing system including one or more information processing apparatuses each of which is configured to perform a plurality of programs to implement functions, the information processing system comprising:the one or more information processing apparatuses which form a service provision system;
a terminal device; and
at least one electronic apparatus,
wherein the service provision system includes
a memory configured to store application information associating flow information and application configuration information for each of one or more applications that performs, when executed, a series of processes using electronic data which utilize at least one function of the at least one electronic apparatus,
wherein, the flow information defines program identification information identifying each of one or more programs, which is executed to perform a process included in the series of processes, and an order of executing the one or more programs,
wherein, the application configuration information defines parameter information used to execute each of the one or more programs, and flow identification information identifying the corresponding flow information; and
circuitry configured to
receive, from the terminal device, a registration request including the application configuration information generated by the terminal device,
generate one or more test cases used to test the series of processes based on the parameter information defined in the application configuration information included in the registration request received, the application configuration information including the parameter information set from the terminal device, the set parameter information including a parameter condition that can be set from the at least one electronic apparatus using the application,
wherein each test case indicates a combination of pieces of the parameter information,
acquire the flow information identified with the flow identification information defined in the application configuration information included in the registration request, and
execute, according to the order of executing the one or more programs defined in the flow information, the one or more programs each of which is identified with the program identification information defined in the flow information acquired, to test the series of processes,
wherein each program is executed using the combination of the pieces of the parameter information indicated with each test case generated.

US Pat. No. 10,715,686

IMAGE FORMING APPARATUS WITH PUNCH DIE-SET LUBRICATION PROMPT, METHOD OF CONTROLLING THE SAME, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image forming apparatus connected to a sheet hole punching apparatus that applies a punch process to a sheet received from the image forming apparatus, the sheet hole punching apparatus replaceably mounting a punch die-set for performing the punch process, the punch die-set having a first memory storing identification information for identifying the punch die-set, the image forming apparatus comprising:a display configured to display information;
a controller configured to obtain the identification information from the first memory of the punch die-set mounted on the sheet hole punching apparatus; and
a second memory configured to store count information indicating a number of times of execution of the punch process using the punch die-set mounted on the sheet hole punching apparatus in association with the identification information,
wherein the controller causes the display to display a message for prompting a user to lubricate the punch die-set in accordance with the count information being greater than a predetermined value,
wherein the controller causes the display to display a first input screen for causing the user to input first lubrication completion information indicating a completion of lubrication of the punch die-set in accordance with a predetermined operation by the user after having displayed the message,
wherein in a case that a designation for displaying a second input screen for causing a user to input second lubrication completion information indicating a completion of lubrication of the punch die-set in accordance with an operation being different from the predetermined operation is input in a state that the count information is not greater than the predetermined value and a user has lubricated the punch die set before the message is displayed, the controller causes the display to display the second input screen,
wherein the controller initializes the count information stored in the second memory in association with the identification information of the punch die-set based on an input of the first lubrication completion information via the first input screen, and
wherein the controller initializes the count information stored in the second memory in association with the identification information of the punch die-set based on an input of the second lubrication completion information via the second input screen.

US Pat. No. 10,715,685

DETERMINING WHETHER SESSION IDS MATCH AND TAKING ACTION ACCORDINGLY

Canon Kabushiki Kaisha, ...

1. A communication apparatus that performs communication with a communication partner apparatus, comprising:a transmission unit configured to transmit request information for requesting information related to the communication partner apparatus, wherein the request information corresponds to first information for identifying communication with the communication partner apparatus, and the communication partner apparatus transmits second information related to response information corresponding to the request information;
a determination unit configured to determine whether the second information transmitted from the communication partner apparatus matches the first information; and
an instruction unit configured to transmit, if the first information and the second information do not match each other, to the communication partner apparatus, an instruction for discarding the response information related to the second information.
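The determination and instruction steps can be sketched in Python (the dictionary representation of the instruction and payload is an assumption for illustration):

```python
def handle_response(first_info, second_info, response_info):
    """Compare the two pieces of session-identifying information; on a
    mismatch, return a discard instruction for the response instead of
    accepting it (a simplified stand-in for the claimed determination
    unit and instruction unit)."""
    if second_info != first_info:
        return {"instruction": "discard", "session": second_info}
    return {"payload": response_info, "session": second_info}

print(handle_response("sess-1", "sess-1", b"device-info"))
print(handle_response("sess-1", "sess-2", b"device-info"))
```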

US Pat. No. 10,715,684

APPARATUS, SYSTEMS AND METHODS FOR GENERATING 3D MODEL DATA FROM A MEDIA CONTENT EVENT

DISH Technologies L.L.C.,...

1. A method that generates three dimensional (3D) models based on images in a media content event, the method comprising:receiving, at a media device, a user request that is associated with an interest by a user to obtain a 3D model of a physical object of interest that is being shown in a scene of a media content event that the user is currently viewing;
accessing a first video image frame of the scene of the media content event, wherein the first video image frame corresponds to a video image presented in the scene of the media content event at a time that the user request is generated;
identifying a candidate physical object that is shown in the accessed first video image frame;
generating an image of the candidate physical object using the accessed first video image frame;
generating a user confirmation image that is presentable to the user, wherein the user confirmation image includes the generated image of the candidate physical object;
receiving a user response,
wherein the user response indicates that the candidate physical object corresponds to the physical object of interest that the user is interested in, or
wherein the user response indicates that the candidate physical object does not correspond to the physical object of interest that the user is interested in;
defining the candidate physical object as the physical object of interest when the user response indicates that the candidate physical object corresponds to the physical object of interest that the user is interested in;
accessing a plurality of second video image frames, wherein the plurality of second video image frames are accessed from the scene of the media content event that was presented to the user at the time that the user request is generated;
generating an image of the physical object of interest from each one of the plurality of second video image frames when the physical object of interest is identified in the plurality of second video image frames; and
generating 3D model data based on at least the images of the physical object of interest that are generated from the first video image frame and the plurality of second video image frames that were accessed from the scene of the media content event that was presented to the user at the time that the user request is generated.

US Pat. No. 10,715,683

PRINT QUALITY DIAGNOSIS

Hewlett-Packard Developme...

1. A print quality diagnosis apparatus comprising:a processor; and
a memory storing machine readable instructions that when executed by the processor cause the processor to:
align a scanned image of a printed physical medium to a master image associated with generation of the printed physical medium;
align corresponding characters between the aligned scanned and master images to generate a common mask;
determine, for each character of the scanned image, an average value associated with pixels within the common mask, and determine, for each corresponding character of the master image, the average value associated with pixels within the common mask;
determine, for each character of the common mask, a metric between the average values associated with the corresponding characters in the scanned and master images, wherein the metric between the average values comprises a Euclidean distance between the average values associated with the corresponding characters in the scanned and master images; and
analyze a histogram of determined metrics for characters of the common mask to diagnose print quality of the printed physical medium.
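The per-character metric and histogram steps can be sketched with standard-library Python (the RGB pixel representation and the bin width are assumptions; the claim does not fix them):

```python
import math
from collections import Counter

def average_rgb(pixels):
    """Mean RGB value over the pixels a character occupies in the mask."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def character_metric(scanned_pixels, master_pixels):
    """Euclidean distance between the average values of corresponding
    characters in the scanned and master images."""
    return math.dist(average_rgb(scanned_pixels), average_rgb(master_pixels))

def metric_histogram(metrics, bin_width=5.0):
    """Bucket the per-character metrics; the shape of this distribution is
    what a diagnosis step would analyze."""
    return Counter(int(m // bin_width) for m in metrics)

# Two hypothetical characters: one printed faithfully, one badly faded.
scanned = [[(10, 10, 10), (12, 12, 12)], [(120, 120, 120)]]
master = [[(10, 10, 10), (10, 10, 10)], [(100, 100, 100)]]
metrics = [character_metric(s, m) for s, m in zip(scanned, master)]
print(metric_histogram(metrics))
```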

US Pat. No. 10,715,682

ELECTRONIC DEVICE, CONTROL METHOD AND NON-TRANSITORY STORAGE MEDIUM FOR EXECUTING A PROCESS RELATED TO AN INCOMING CALL ACCORDING TO A RESPONSE METHOD

KYOCERA CORPORATION, Kyo...

1. An electronic device, comprising:a display;
a communication unit;
a storage configured to store response method data in which pre-set keywords and response methods corresponding to incoming calls are associated with each other; and
a processor configured to, when an incoming call containing a text message comes in,
acquire the response method data that is stored in the storage, and
execute a process relating to the incoming call according to a response method that is defined in the acquired response method data,
wherein the response method contains
a mode in which a notification is made when an incoming call comes in, and
content of a response process performed when an incoming call comes in, and
wherein the processor is configured to, when another incoming call that does not contain the text message comes in and only a caller's phone number is displayed on the display, make a reply, to the caller's phone number, indicating that only incoming calls containing text messages are to be received.

US Pat. No. 10,715,681

NETWORK INTERFACE FOR TRACKING RADIO RESOURCE UTILIZATION

Verizon Patent and Licens...

1. A method, comprising:
obtaining, by one or more network devices, a plan code for a communication device connecting to a radio access network (RAN);
associating, by the one or more network devices and based on the plan code, the communication device with a RAN-usage-based plan;
reporting, by the one or more network devices and after the associating, a tracking instance of RAN usage by the communication device;
obtaining, by the one or more network devices, another plan code for a different communication device connecting to the RAN; and
not reporting, by the one or more network devices and based on the other plan code, other tracking instances of RAN usage by the different communication device.

US Pat. No. 10,715,679

SYSTEM AND METHOD FOR DETECTING INSTANCES OF MISSING DATA RECORDS IN USAGE DATA RECORDS GENERATED FOR WIRELESS SERVICES

Aeris Communications, Inc...

1. A computer-implemented method comprising:
recording a sequence of events related to transmission of data through at least one service element;
transmitting the sequence of events recorded by the at least one service element as usage data records from the at least one service element to a server for accounting, wherein the usage data records for the sequence of events without missing records comprise one START record followed by a number of CONTINUE records at regular intervals and one STOP record, wherein the number of CONTINUE records is any number from 0 to n; and
using an anomaly detection algorithm to detect instances of missing usage data records lost during transmission of the usage data records from the at least one service element to the server for accounting,
wherein the instances of missing usage data records comprise any of: at least one usage data record corresponding to an event in the sequence of events being missing, usage data records corresponding to the entire sequence of events being missing, or a combination thereof, and
wherein the anomaly detection algorithm comprises any of:
a pattern recognition algorithm when a usage data record corresponding to an event in the sequence of events in the recorded pattern of transmission of data through the at least one service element is missing, and
a pattern matching algorithm using the recorded pattern of the transmission of data through the at least one service element when usage data records corresponding to the entire sequence of events in the recorded pattern of transmission of data through the at least one service element are missing.
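Per the claim, a complete sequence is one START, any number of CONTINUE records at a regular interval, and one STOP; deviations from that pattern signal lost records. A minimal pattern-based check might look like the following sketch, where the `(timestamp, kind)` record shape is an illustrative assumption, not taken from the patent.

```python
# Minimal sketch of the pattern check implied by the claim: a complete
# sequence is START, CONTINUE * n at a regular interval, then STOP.
def find_missing_records(records, interval):
    """records: list of (timestamp, kind) tuples sorted by timestamp.
    Returns a list of detected anomalies (possible lost records)."""
    if not records:
        return ["entire sequence missing"]
    anomalies = []
    kinds = [kind for _, kind in records]
    if kinds[0] != "START":
        anomalies.append("START record missing")
    if kinds[-1] != "STOP":
        anomalies.append("STOP record missing")
    # A gap of roughly k * interval between consecutive records implies
    # that k - 1 intermediate CONTINUE records were lost in transit.
    for (t0, _), (t1, _) in zip(records, records[1:]):
        lost = round((t1 - t0) / interval) - 1
        if lost > 0:
            anomalies.append(f"{lost} CONTINUE record(s) missing between {t0} and {t1}")
    return anomalies
```

A production detector would also have to tolerate jitter in the reporting interval and distinguish a missing STOP from a session still in progress.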

US Pat. No. 10,715,678

MOBILE TERMINAL, EVENT INFORMATION DISPLAY METHOD, NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM STORING EVENT INFORMATION DISPLAY PROGRAM, AND INTERCOM SYSTEM

Panasonic Intellectual Pr...

1. A mobile terminal configured to communicate with a server, wherein the server performs call control between an intercom disposed in a lobby of a building and an indoor monitor disposed in a room of the building, the mobile terminal comprising:
a receiver which, in operation, receives event information from the server, wherein the event information includes information about one or more visits to the lobby and respective time stamps of the visits;
a display which, in response to receiving the event information, scroll-displays the event information in chronological order; and
a transmitter which, in response to receiving the event information, transmits a response to the server, wherein the response indicates that the response is from the mobile terminal;
wherein the receiver, in response to the transmitter transmitting the response to the server, receives, via the server, image information captured by the intercom and formatted in an image setting selected to be compatible for display on the display of the mobile terminal.

US Pat. No. 10,715,677

TECHNIQUES TO EXTEND A DOORBELL CHIME

Vivint, Inc., Provo, UT ...

1. A method of home automation communications, the method comprising:
identifying an event associated with an entry to a structure associated with a home automation system;
determining a location within the structure to broadcast a notification signal based at least in part on the event; and
transmitting, to a home automation component at the location, a command to broadcast the notification signal based at least in part on the determining.

US Pat. No. 10,715,676

DISTRIBUTED SENSOR SYSTEM

SAFE-COM WIRELESS, Holmd...

1. A sensor for a distributed antenna system (DAS), the sensor comprising:
a connection to DAS module (CtD) that includes:
an interface circuit that receives data and modulates the data at a predetermined out of band frequency to generate a modulated data signal, and
a bidirectional radio frequency (RF) coupler that injects the modulated data signal on to cabling of the DAS, wherein the modulated data signal at the out of band frequency is received by a head-end unit of the DAS; and
a remote sensor and control unit (RSC) communicatively coupled to the CtD that includes:
a probe that obtains measurements of one or more environmental conditions,
a memory,
a CtD interface communicatively coupled to the interface circuit of the CtD, and
a processor communicatively coupled to the probe, the memory and the CtD interface;
wherein the processor:
receives the measurements of the one or more environmental conditions from the probe, and
transmits, using the CtD interface, the measurements of the one or more environmental conditions to the head-end unit of the DAS via the interface circuit and the bidirectional RF coupler of the CtD.

US Pat. No. 10,715,675

TRANSMISSION APPARATUS, TRANSMISSION SYSTEM, TRANSMISSION METHOD, AND PROGRAM

ASAHI KASEI KABUSHIKI KAI...

1. A transmission apparatus which transmits a sound signal from a first microphone provided corresponding to a first position to a speaker provided corresponding to a second position, wherein the transmission apparatus comprises:
an evaluating unit which evaluates at least one of a level of a direct sound transmitted from the first position to the second position without intervention of the first microphone and the speaker, a noise level, and an operating state of a noise source; and
a delay setting unit which sets a transmission delay from the first microphone to the speaker based on an evaluation result of the evaluating unit.

US Pat. No. 10,715,674

METHOD FOR MANAGING THE RECEPTION OF A TELEPHONE CALL ON A CALLED COMMUNICATION TERMINAL

ONOFF TELECOM, Paris (FR...

1. A method for managing the reception of a telephone call signal on a called communication terminal, in particular a called mobile communication terminal, in order to establish a telephone connection with a calling communication terminal, in particular a calling mobile communication terminal, said method comprising:
configuring said called communication terminal to receive the telephone call signal according to at least two modes for the establishment of the telephone connection, wherein the at least two modes comprise a “voice” mode, wherein the telephone connection between the called communication terminal and at least one server is established through a telephone communication network, and a “data” mode, wherein the telephone connection between the called communication terminal and the at least one server is established through a data communication network using “voice over IP” technology;
directly selecting, by a user of the method through a specific action on said called communication terminal, the mode used to establish the telephone connection;
establishing said telephone connection using the selected mode through the at least one server connected to said called communication terminal and to said calling communication terminal;
after an emission of an initial telephone call signal from the calling communication terminal on a telephone communication network, towards a number matching a called communication terminal of a user of the method:
receiving, by a call server, the telephone call signal emitted by the calling communication terminal;
attributing, by the call server, a temporary number to the calling communication terminal;
sending a notification of the telephone call signal and of the temporary number to a voice over IP server and to an application server, said voice over IP server and application server being connected to the called communication terminal;
sending a notification of the telephone call signal and of the temporary number to the called communication terminal by the application server;
displaying, on the screen of the called communication terminal, a human-machine interface allowing the user to select the mode in which to establish the telephone connection;
selecting, by the user, through said human-machine interface, the mode in which the telephone connection is established;
if the user chooses to receive the telephone call signal in “voice” mode:
emitting, by the called communication terminal, a return telephone call signal sent to the temporary number allocated by the call server to the calling communication terminal, said return telephone call signal being emitted through the telephone communication network;
receiving, by the call server, the return telephone call signal;
reconstituting the telephone connection, by the call server, by connecting the telephone call signal initially emitted by the calling communication terminal to the return telephone call signal emitted by the called communication terminal; and
in response to the user choosing to receive the telephone call signal in “data” mode:
emitting, by the called communication terminal, a return telephone call signal sent to the temporary number allocated by the call server to the calling communication terminal, said return telephone call signal being emitted through the data communication network using the voice over IP server;
transmitting the return call signal by the voice over IP server to the call server;
receiving, by the call server, the return telephone call signal;
reconstituting the telephone connection, by the call server, by connecting the telephone call signal initially emitted by the calling communication terminal to the return telephone call signal emitted by the called communication terminal.

US Pat. No. 10,715,672

SOCIAL NETWORKING-BASED TELECONFERENCING SYSTEM AND METHOD

Tawlk, Inc., Santa Monic...

1. A system hosting a social networking website for providing a teleconferencing feature, the system comprising:
a teleconferencing page or window of the social networking website, the teleconferencing page or window being configured for scheduling a teleconference in response to user input to the social networking website via a communication network, and for generating an event for being represented on the social networking website, the event including information about the teleconference and being associated with one or more messages for sending to a plurality of users of the social networking website, each of the one or more messages having a link for accessing the teleconference by one or more invitees;
an interactive voice response system for hosting the teleconference to connect at least some of the one or more invitees to the teleconference; and
a client application running on a mobile device associated with each of the at least some of the one or more invitees, the client application receiving at least one of the one or more messages and the link, the client application including a control interface to enable an associated invitee to join the teleconference via the link to the teleconference.

US Pat. No. 10,715,671

APPARATUS, SYSTEM AND METHOD OF CALL NOTIFICATIONS TO ACTIVE COMMUNICATION DEVICES

BLACKBERRY LIMITED, Wate...

1. A server comprising:
a processor;
a communication interface; and
a switching element, the processor configured to:
receive, at the communications interface, a call for a first endpoint device;
send, using the communications interface, a query requesting activity states of each of a plurality of further endpoint devices, the plurality of endpoint devices identified based on a stored association between the first endpoint device and each of the plurality of endpoint devices;
receive, using the communications interface, responses to the query, each response indicating an activity state and an activity level of a respective one of the plurality of further endpoint devices;
select a second endpoint device, the second endpoint device having a highest priority of the further endpoint devices having responses indicating an active state, wherein the highest priority is dynamically generated based on activity levels of the plurality of further endpoint devices;
transmit a notification of the call to the second endpoint device;
receive a signal to redirect the call to the second endpoint device; and
in response to receiving the signal, transfer, using the switching element, the call to the second endpoint device.

US Pat. No. 10,715,669

SYSTEMS AND METHODS FOR CONTROLLING TRANSFER OF CONTACTS IN A CONTACT CENTER

Avaya Inc., Santa Clara,...

1. A method for managing transfer requests in a contact center, the method comprising:
receiving, by a processor of the contact center, a request to transfer a contact to a destination within the contact center;
determining, by the processor of the contact center, in real time, one or more first characteristics related to a source of the request and one or more second characteristics related to the destination of the request; and
based on the one or more first characteristics and the one or more second characteristics, performing, by the processor of the contact center, one of:
transferring the contact to the destination, or
preventing transfer of the contact to the destination.

US Pat. No. 10,715,667

CUSTOMER JOURNEY MANAGEMENT

1. A method of using a predictive model to manage customer journeys, the method comprising using one or more processors in a computer server:
receiving data defining a plurality of customer journeys, each customer journey comprising a succession of logged events representing a historical trail of previous actions performed by a particular customer in a computing system, each event corresponding to an interaction between a customer device and a server or other device;
for a particular customer in a customer journey:
retrieving from computer data storage a combination of independent variables relating to the customer;
using a predictive model to determine a variable representing a customer journey score for the particular customer based on the combination of independent variables relating to the customer;
determining if the variable representing the customer journey score is above a threshold; and
if the variable representing the customer journey score is above the threshold, sending information to be displayed.

US Pat. No. 10,715,666

COMMUNICATION TERMINAL, COMMUNICATION SYSTEM, COMMUNICATION MANAGEMENT METHOD, AND MEDIUM

Ricoh Company, Ltd., Tok...

1. A communication terminal configured to communicate with a management system via a network, the communication terminal comprising:
processing circuitry configured to
receive, as a reply to a request for a destination list, the request including a communication ID of a source of the request and being transmitted to the management system, destination list information in which destination candidates are associated with respective communication IDs and destination names that correspond to the communication ID of the source of the request;
classify the destination candidates included in the received destination list information into a plurality of groups;
receive selection of a group from among the plurality of classified groups;
determine a particular destination from among destination candidates represented by particular destination information included in the selected group; and
make a request for starting communication with the determined particular destination.

US Pat. No. 10,715,665

DYNAMIC RESOURCE ALLOCATION

United Services Automobil...

1. A computer-implemented method, comprising:
obtaining internal and external data relating to a call center;
providing the internal and external data to a trained machine learning system, the trained machine learning system configured to predict an expected call volume based on the internal and external data;
receiving a predicted expected call volume from the trained machine learning system; and
dynamically adjusting a number of customer service representatives based on the predicted expected call volume by:
raising a bid for customer service representatives if the predicted expected call volume exceeds a threshold volume, or lowering the bid for the customer service representatives if the predicted expected call volume is below the threshold volume, wherein the bid is a financial incentive offered by the call center for the customer service representatives to handle the expected call volume.
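The bid-adjustment rule in the final step reduces to a single comparison against the threshold volume. A minimal sketch, in which the 10% step size is an illustrative assumption:

```python
def adjust_bid(current_bid, predicted_volume, threshold_volume, step=0.10):
    """Raise the financial incentive when the predicted call volume exceeds
    the threshold; lower it (never below zero) when it falls short."""
    if predicted_volume > threshold_volume:
        return current_bid * (1 + step)
    return max(0.0, current_bid * (1 - step))
```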

US Pat. No. 10,715,664

DETECTION OF SENTIMENT SHIFT

1. A method comprising:
monitoring, by a processing system including a processor, interactions between a user and an agent;
detecting, by the processing system, a shift in an attitude of the user during at least one of the interactions;
identifying, by the processing system, either a positive shift or a negative shift, based on an analysis of the shift, wherein the positive shift occurs in a first subset of the interactions and the negative shift occurs in a second subset of the interactions, wherein the identifying comprises using pre-existing training materials to understand actions the agent is trained to use including utterances to be used by the agent responsive to the agent detecting some shift in the attitude of the user during an interaction;
tracking, by the processing system, an outcome of each of the interactions, wherein a first group of the interactions results in a positive outcome and a second group of the interactions results in a negative outcome;
analyzing, by the processing system, the second subset of the interactions that correspond to the first group of interactions; and
producing, by the processing system, training materials for use by the agent based on the analyzing the second subset of the interactions that correspond to the first group of the interactions, the training materials including identification and ranking of actions, words or phrases for highest and lowest probability of reversing the negative shift in future interactions, the training materials including a suggestion as to how to reverse the negative shift in the future interactions.

US Pat. No. 10,715,663

MANAGEMENT OF AGENT SESSIONS FOR OMNICHANNEL PREDICTIVE OUTBOUND

Avaya Inc., Santa Clara,...

1. A system, comprising:
a network interface; and
a processor, having access to data storage and the network interface; and
wherein the processor:
establishes, via the network interface, an agent half-connection, the agent half-connection being a connection with the agent endpoint and wherein the processor utilizes each of a plurality of media type-specific connections;
initiates, via the network interface, a customer half-connection via initiating contact with a customer device utilizing a customer address for one of the plurality of media type-specific connections;
upon establishing the customer half-connection for one of the plurality of media-specific connections, joins the agent half-connection with the customer half-connection comprising the one of the plurality of media type-specific connections; and
upon joining the agent half-connection with the customer half-connection, joins the agent half-connection with the customer half-connection comprising another one of the plurality of media type-specific connections different from the one of the plurality of media type-specific connections.

US Pat. No. 10,715,662

SYSTEM AND METHOD FOR ARTIFICIAL INTELLIGENCE ON HOLD CALL HANDLING

MOTOROLA SOLUTIONS, INC.,...

1. A method of on hold handling of calls comprising:
receiving a call, from a caller, at a public safety access point;
monitoring, with an Artificial Intelligence (AI) bot implemented by a processor, an incoming audio associated with the call;
placing the call on hold;
determining, by the AI bot, based on the incoming audio, that the call should be taken off hold;
providing a recommendation, by the AI bot, to a call taker that the call should be taken off hold, wherein the recommendation includes a reason why the call should be taken off hold;
taking the call off hold when the recommendation is accepted; and
providing an indication that the recommendation was correct or incorrect based on when the recommendation is accepted, wherein the AI bot updates historical data based on the indication in order to actively learn from the accuracy of the recommendation.
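The "actively learn" step amounts to a feedback loop over recommendation outcomes. The toy sketch below assumes the historical data is a per-reason accuracy table; that structure, and all names in it, are illustrative, not taken from the patent.

```python
# Hedged sketch of the feedback loop in the claim: record whether each
# off-hold recommendation was accepted, so future recommendations can be
# weighted by past accuracy for the same reason.
from collections import defaultdict

class OnHoldBot:
    def __init__(self):
        # reason -> {"correct": accepted count, "total": recommendation count}
        self.history = defaultdict(lambda: {"correct": 0, "total": 0})

    def record_outcome(self, reason, accepted):
        """Update historical data after the call taker acts on a recommendation."""
        stats = self.history[reason]
        stats["total"] += 1
        if accepted:
            stats["correct"] += 1

    def confidence(self, reason):
        """Fraction of past recommendations with this reason that were accepted;
        an uninformative prior of 0.5 is returned for unseen reasons."""
        stats = self.history[reason]
        return stats["correct"] / stats["total"] if stats["total"] else 0.5
```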

US Pat. No. 10,715,661

SYSTEM AND METHOD FOR SCALABLE AND EFFICIENT MULTI-CHANNEL COMMUNICATION

Micro Macro Assets, LLC, ...

1. A system for establishing a communication with a customer and a user of a rep computer in communication with a second computer, comprising:
the rep computer comprising a first processor and a first memory configured to store non-transitory instructions that when executed by the first processor performs the steps of:
receiving from the second computer a batch notification identifying a batch of one or more customer records in the rep computer, wherein the one or more customer records is associated with one or more customers that are being communicated with and have met a potential transfer criteria; and
receiving from the second computer a first connection transfer notification identifying a first transferred customer that is being transferred to the user, wherein at least one customer record associated with the first transferred customer is part of the batch,
wherein the communication with the customer by the user is via one or more communication modes, and a direction of the communication includes one or more of the group consisting of an inbound communication initiated by the customer and an outbound communication initiated to the customer.

US Pat. No. 10,715,660

CALLER ID VERIFICATION USING CALL IDENTIFICATION AND BLOCK LISTS

Pindrop Security, Inc., ...

1. A system comprising:
a non-transitory storage medium storing a plurality of computer program instructions; and
a processor configured to execute the plurality of computer program instructions to:
receive from a computer an indication of an outgoing phone call to a first phone number from a second phone number;
transmit a first request to a user device associated with the first phone number to add the second phone number to an identification list maintained at the user device;
transmit a confirmation to the computer that the outgoing phone call can be placed to the first phone number; and
in response to the processor receiving a confirmation message that the outgoing phone call has been initiated:
transmit a second request to the user device to remove the second phone number from the identification list.
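The add-confirm-remove flow of this claim can be sketched as a small server that manipulates a per-callee identification list. The in-memory dictionary stands in for the real user device's list, and all names here are illustrative assumptions.

```python
# Hedged sketch of the claim's flow: push the calling number onto the
# callee device's identification list before the call, confirm the call
# may be placed, then remove the entry once the call has been initiated.
class VerificationServer:
    def __init__(self):
        self.identification_lists = {}  # callee number -> set of listed callers

    def on_outgoing_call(self, callee, caller):
        """First request + confirmation: add the caller to the callee's
        identification list and allow the call to be placed."""
        self.identification_lists.setdefault(callee, set()).add(caller)
        return "confirmed"

    def on_call_initiated(self, callee, caller):
        """Second request: once the call is underway, remove the temporary
        entry so the list cannot be abused for later spoofed calls."""
        self.identification_lists.get(callee, set()).discard(caller)
```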

US Pat. No. 10,715,658

TELEPHONE CALL-BACK DEVICE

1. A telephone call-back device comprising:
an activation device coupled to a phone line and phone;
a call source utility coupled to the activation device, wherein the call source utility identifies a source phone number of a spam incoming phone call received by the phone, regardless of whether the spam incoming phone call is answered or not answered, in response to the activation device being activated;
a call-back utility coupled to the call source utility, wherein the call-back utility sends at least one robo call-back outgoing phone call to the source phone number in response to receipt of the spam incoming phone call by the phone.

US Pat. No. 10,715,657

CALLING AN UNREADY TERMINAL

Microsoft Technology Lice...

1. A system comprising:
one or more hardware processors; and
a memory comprising instructions that when executed by the one or more hardware processors, configure the one or more hardware processors to establish a call between a caller terminal and a called terminal where a client application to accept the call is not installed on the called terminal when the called terminal receives a call establishment request, by performing operations including:
receiving, by the called terminal, the call establishment request from the caller terminal to establish the call between the caller terminal and the called terminal, the call establishment request indicating an initiation of the establishment of the call at the caller terminal;
in response to receiving the call establishment request:
presenting a single prompt that requests whether a user wishes to answer the call request, and to allow installing of the client application in order to accept the call,
based on selection of the single prompt, installing the client application,
sending, via the installed client application, a reverse call establishment request to the caller terminal in response to a completion of the installing of the client application and the reception of the call establishment request from the caller terminal, the reverse call establishment request configured to cause the caller terminal to accept the reverse call establishment request, wherein the installation of the client application on the called terminal occurs during the establishment of the call at the caller terminal; and
receiving a call acceptance response to establish the call using a packet-switched network, the call acceptance response indicating a completion of the establishment of the call at the caller terminal.

US Pat. No. 10,715,656

METHOD AND APPARATUS FOR THREAT IDENTIFICATION THROUGH ANALYSIS OF COMMUNICATIONS SIGNALING EVENTS, AND PARTICIPANTS

PINDROP SECURITY, INC., ...

1. A computer-implemented method comprising:
receiving, by a computer, a first carrier signaling data from a first switching device in a telephone network, the first carrier signaling data utilized by the telephone network to route a phone call to a callee's phone number;
setting, by the computer, a destination routing address of the first carrier signaling data;
transmitting, by the computer, a first continue message to the first switching device with a parameter associated with the destination routing address, such that the first switching device routes the phone call to the destination routing address;
receiving, by the computer, a second carrier signaling data from a second switching device in the telephone network, the second carrier signaling data generated by the second switching device in response to receiving the phone call;
generating, by the computer, a threat score for the phone call based upon comparing the first carrier signaling data to the second carrier signaling data;
transmitting, by the computer, a second continue message to the second switching device with the destination routing address set to the callee's phone number; and
transmitting, by the computer, the threat score to a device associated with the callee.
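The scoring step of this claim compares the signaling data observed at the two switches. A simple field-by-field comparison is one way to sketch it; the field names, weights, and normalization below are illustrative assumptions, not the patented scoring method.

```python
# Hedged sketch: a mismatch between what the two switches report (e.g. a
# caller ID rewritten in transit) raises the threat score. Signaling data
# is modeled as a flat dict of fields; weighting is an assumption.
def threat_score(first_signaling, second_signaling, weights=None):
    """Return a score in [0, 1]: the weighted fraction of signaling fields
    that disagree between the first and second switching devices."""
    fields = set(first_signaling) | set(second_signaling)
    weights = weights or {}
    mismatched = sum(
        weights.get(field, 1.0)
        for field in fields
        if first_signaling.get(field) != second_signaling.get(field)
    )
    total = sum(weights.get(field, 1.0) for field in fields)
    return mismatched / total if total else 0.0
```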

US Pat. No. 10,715,655

STANDARD MOBILE COMMUNICATION DEVICE DISTRACTION PREVENTION AND SAFETY PROTOCOLS

CELL COMMAND, INC., Mari...

1. A transmitter for activating a behavior in a mobile device within a specified environment, comprising:
a memory configured to store software instructions; and
a processor configured to access the software instructions from the memory;
the processor configured to access the software instructions from the memory to:
broadcast a first trigger signal within the specified environment, the first trigger signal comprising discovery information corresponding to a modified universally unique identification (UUID) code of the transmitter, wherein the modified UUID code comprises a format formed through modification of at least one of a structure of a pre-existing UUID code format and a meaning of one or more bytes of the pre-existing UUID code format, and wherein at least a portion of the modified UUID code identifies at least one of:
the specified environment in which the transmitter operates; and
a specified working group information in the specified environment in which the transmitter operates; and
wherein the discovery information broadcast from the transmitter in the first trigger signal causes activation of the behavior in the mobile device within the specified environment.

US Pat. No. 10,715,651

MULTILAYER MOBILE APP INTERFACE

EMC IP Holding Company LL...

1. A method, comprising:
displaying an application page;
receiving, by one or more processors, an indication to provide a previously stored application page; and
in response to receiving the indication to provide the previously stored application page, providing the previously stored application page based at least in part on information associated with the previously stored application page including information sufficient to regenerate the previously stored application page without obtaining additional data for the previously stored application page from a server or application associated with the previously stored application page.

US Pat. No. 10,715,650

DUAL-TRANSCEIVER WIRELESS CALLING

Bose Corporation, Framin...

1. A wireless audio system comprising:
a first wireless transceiver configured to establish a wireless link with an audio gateway for receiving and sending call audio and exchanging call control data;
a second wireless transceiver configured to wirelessly communicate with the first wireless transceiver over a simple voice forward profile (SVFP) connection; and
a headphone system comprising a first headphone containing the first wireless transceiver and a second headphone containing the second wireless transceiver,
wherein the first wireless transceiver is configured to forward the call audio to the second wireless transceiver and exchange the call control data with the second wireless transceiver over the SVFP connection,
wherein the first headphone further comprises at least one first microphone and the second headphone further comprises at least one second microphone,
wherein the second wireless transceiver is configured to send call audio received at the at least one second microphone to the first wireless transceiver over a synchronous connection-oriented (SCO) link established by the SVFP connection, and
wherein the first wireless transceiver is configured to send the call audio received from the second wireless transceiver to the audio gateway over a synchronous connection-oriented (SCO) link established by a hands-free profile (HFP) connection.

US Pat. No. 10,715,648

USER INTERFACE FOR VIRTUAL ASSISTANT INTERACTIONS ON TELEPHONY DEVICE

Cisco Technology, Inc., ...

1. A method comprising:
displaying, through a user interface on a display of a telephony device, a plurality of lines associated with the telephony device, the displayed plurality of lines including at least a designated line for a user of the telephony device and a shared line for a virtual assistant, wherein the virtual assistant is integrated with the telephony device and is presented as the shared line of the telephony device;
in response to receiving an incoming call to the designated line, displaying, through the user interface, a plurality of features associated with handling the incoming call, the displayed plurality of features including an assistant feature to direct the incoming call to the virtual assistant;
in response to the incoming call being directed to the virtual assistant, displaying, through the user interface, a plurality of features associated with managing a conversation between the virtual assistant and a caller during the incoming call; and
after the incoming call has ended, displaying, through the user interface, a call history associated with the incoming call.

US Pat. No. 10,715,647

MOBILE TERMINAL

ZHEJIANG GEELY HOLDING GR...

1. A mobile terminal, comprising a terminal body and a control unit, wherein the terminal body is provided with a flash lamp, a zoom lens is provided in an advancing direction of a light of the flash lamp, the control unit is configured to control the flash lamp to switch between a flash mode and a flashlight mode according to an input command, and the zoom lens has a plurality of operating states and converges the light of the flash lamp in at least one operating state;
the terminal body is further provided with a laser lamp and a camera, the laser lamp and the camera are located on the same surface of the terminal body, the control unit is further configured to control the laser lamp to emit light according to an input command, control the camera to capture an image of an object irradiated by the laser lamp, and calculate a distance between the camera and the object according to a distance between a bright spot of the laser lamp in the captured image and an image center of the captured image.