US Pat. No. 10,433,027

METHOD AND APPARATUS FOR SIGNALING VIDEO ENHANCEMENT INFORMATION (VEI) FOR VIDEO QUALITY ENHANCEMENT OF FIXED/MOBILE BROADCASTING HYBRID 3DTV

Electronics and Telecommu...

1. An IP-based hybrid 3DTV content reception apparatus, comprising:
a receiving unit configured to receive multiplexed 3DTV content information;
an additional data obtaining unit configured to obtain service layer signaling information from the received 3DTV content information, and obtain additional data based on the obtained service layer signaling information;
a decoding unit configured to decode the received 3DTV content information by using the obtained additional data; and
a playing back unit configured to play back the decoded 3DTV content,
wherein the 3DTV content comprises either one or both of video data and audio data,
wherein the additional data comprises data required for either one or both of decoding and playing back the 3DTV content,
wherein the service layer signaling information comprises information about a standard for receiving media components (User Service Bundle Description (USBD) and/or User Service Description (USD)) of the 3DTV content, information of the media components (Media Presentation Description (MPD)), and information about a session through which the media components are transmitted (Service-based Transport Session Instance Description (S-TSID)), and
wherein the MPD comprises information corresponding to right video and information corresponding to left video within a same period, and
wherein the media components are received with reference to the MPD and the S-TSID based on information included in the USBD and/or the USD.

US Pat. No. 10,433,025

VIRTUAL REALITY RESOURCE SCHEDULING OF PROCESS IN A CLOUD-BASED VIRTUAL REALITY PROCESSING SYSTEM

1. A method comprising:
receiving a request for a render project that includes information specifying three-dimensional video data to be used to create a render, the information including a segment, a storage location of the segment, a storage location of initial video data, two or more cameras used to record the initial video data, a geometry of each of the two or more cameras used to record the initial video data, a format of the render, a geometry of the render, a time frame of the initial video data, a project identifier, and a priority of the render;
determining a plurality of jobs required to create the render from the three-dimensional video data;
determining an availability of a plurality of nodes across a network;
creating a render map that specifies a processing sequence of the plurality of jobs across the plurality of nodes to create the render, the render map being created based on at least the availability of the plurality of nodes; and
processing the plurality of jobs at the plurality of nodes to create three-dimensional content.
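The render-map step of the claim above (distributing render jobs across whichever nodes are available) can be pictured with a minimal sketch. Everything here is illustrative: the `Node` type, the `build_render_map` function, and the round-robin policy are assumptions, not the patented method.

```python
# Illustrative only: a toy "render map" that assigns jobs to available nodes.
# Node, build_render_map, and the round-robin policy are invented for this
# sketch; the patent does not disclose a concrete assignment algorithm.

from dataclasses import dataclass


@dataclass
class Node:
    name: str
    available: bool


def build_render_map(jobs, nodes):
    """Round-robin the jobs across available nodes: one possible render map."""
    free = [n for n in nodes if n.available]
    if not free:
        raise RuntimeError("no available nodes")
    return {job: free[i % len(free)].name for i, job in enumerate(jobs)}
```

A scheduler built this way would then dispatch each job to its mapped node and collect the results into the final render.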

US Pat. No. 10,433,019

SYSTEMS AND METHODS FOR ADAPTIVE STORAGE AND SCHEDULING OF MEDIA ASSETS

Rovi Guides, Inc., San J...

1. A method for adaptively retrieving and storing media assets relating to a participant of interest in an event in response to a user request received during the event, the method comprising:
receiving, at a first time, a first user command indicating a participant of interest in an event, wherein the event includes a plurality of instances transmitted as a plurality of scheduled media assets;
identifying, from the plurality of scheduled media assets, a first scheduled media asset involving the participant of interest that is scheduled for transmission at a second time later than the first time;
scheduling the first scheduled media asset for recording at the second time;
searching, a remote server, for a first set of previously stored media assets that involve the participant of interest and that correspond to a set of the plurality of instances that were transmitted as a set of the plurality of scheduled media assets before the first time;
retrieving outcomes for the set of the plurality of instances;
determining, based on the retrieved outcomes, a second set of previously stored media assets from the first set of previously stored media assets, each previously stored media asset in the second set corresponds to a given one of the plurality of instances associated with an outcome that corresponds to a threshold;
downloading, at the first time, from the remote server to a local storage device at a user device, the second set of previously stored media assets;
arranging the recorded first scheduled media asset and the downloaded second set of previously stored media assets based on a respective transmission time corresponding to each respective media asset;
generating, for display, the arranged media assets in a sequence;
retrieving, at a third time later than the first time and before the second time, updated outcomes for instances that occurred after the first time and before the second time;
determining whether the participant of interest remains in the event at the third time based on the updated outcomes and a set of event evolvement rules; and
updating a recording schedule depending on whether the participant of interest remains in the event at the third time.

US Pat. No. 10,433,017

SYSTEMS AND METHODS FOR INTEGRATED HTML5 SEARCHING AND CONTENT DELIVERY

Cable Television Laborato...

1. A method for authenticating and authorizing at least one user agent to access content selected through an MVPD HTML5 application of a multichannel video programming distributor, the method comprising the steps of:
verifying that the MVPD HTML5 application is accessing the selected content on behalf of a registered subscriber of the multichannel video programming distributor;
deep linking to the MVPD HTML5 application by selecting a unique URL within a user interface of an electronic device in which the MVPD HTML5 application is stored;
authorizing the registered subscriber based on the particular entitlements of the registered subscriber with the multichannel video programming distributor;
executing a first lightweight license between the at least one user agent and the multichannel video programming distributor;
validating the licensor as a legitimate business entity;
downloading a certificate attesting that the first lightweight license has been executed, the certificate including at least a public key and a private key pair; and
installing the private key from the public key and private key pair on the electronic device.

US Pat. No. 10,433,016

RECEPTION APPARATUS, RECEPTION METHOD, TRANSMISSION APPARATUS, AND TRANSMISSION METHOD

SONY CORPORATION, Tokyo ...

1. A reception apparatus comprising:
circuitry configured to
receive broadcast content provided as a pay broadcast service, the broadcast content being transmitted in a scrambled manner;
acquire a subscription check application for checking presence of a subscription to the pay broadcast service depending on information indicating presence of the subscription check application, the information being included in control information including information regarding a structure of the broadcast content;
control an operation of the subscription check application;
acquire a promotion application in response to a determination that the subscription to the pay broadcast service is not present; and
acquire a conjunction application in response to a determination that the subscription to the pay broadcast service is present;
wherein the subscription check application is transmitted in a non-scrambled manner, the promotion application is transmitted in a non-scrambled manner, and the conjunction application is transmitted in a scrambled manner.

US Pat. No. 10,433,007

METHOD OF ADAPTING A BIT RATE FOR A MOBILE DEVICE

MIMIK TECHNOLOGY INC., V...

1. A method of adapting a content transmission bit rate for a user device having Global Positioning System (GPS), the method comprising:
providing a client application on the user device to obtain GPS coordinates and differential coordinates of the user device;
transmitting the GPS coordinates and the differential coordinates to a serving node to which the user device is registered, the serving node associated with a user of the user device, the serving node located at a premises of the user;
calculating, by the serving node, a speed of the user device based on the GPS coordinates and the differential coordinates;
using, by the serving node, the speed of the user device to calculate a probable content transmission error rate and a probable packet loss rate;
determining, by the serving node, a closest server to the user device based on the GPS coordinates;
adjusting, by the serving node, the content transmission bit rate for content to be transmitted to the user device based on:
the calculated probable content transmission error rate, the calculated probable packet loss rate, and the determination of the closest server; and
a condition of the user device, wherein the condition of the user device includes supported content formats by the user device and transport layer protocol used by the user device;
adjusting dynamically, by the serving node, an expected packet arrival rate for the content to be transmitted to the user device, wherein the expected packet arrival rate is determined from an actual packet arrival rate measured over a time period; and
causing, by the serving node, the closest server to transmit the content to the user device, based on the adjusted content transmission bit rate and the adjusted expected packet arrival rate.
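The speed-calculation step in the claim above (deriving a device speed from successive GPS fixes, then adapting the bit rate) can be sketched numerically. The haversine distance and the three-tier bitrate table below are assumptions for illustration; the patent does not disclose these formulas.

```python
# Hedged sketch: speed from two GPS fixes, then a toy bitrate policy.
# The haversine formula and the bitrate thresholds are illustrative
# assumptions, not the serving node's disclosed algorithm.

import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def speed_mps(fix_a, fix_b, dt_s):
    """Speed from two (lat, lon) fixes taken dt_s seconds apart."""
    return haversine_m(*fix_a, *fix_b) / dt_s


def pick_bitrate_kbps(speed):
    """Toy policy: faster movement -> assume worse channel -> lower bit rate."""
    if speed < 2:    # roughly walking pace
        return 4000
    if speed < 15:   # roughly urban driving
        return 2000
    return 800       # highway / rail
```

In the claimed system the speed would feed error-rate and packet-loss estimates rather than a direct lookup table, but the control flow is the same shape.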

US Pat. No. 10,433,006

METHOD, APPARATUS, AND SYSTEM FOR ACCESSING DATA STORAGE WITH BIOMETRIC VERIFICATION

1. A computing device, comprising:
a wireless transceiver configured to communicate wirelessly with a data network to connect to a remote storage device storing a plurality of media assets;
a processor to exchange media processing capabilities with a media player device connected to the computing device, the computing device being removably connectable to the media player device, and to generate a user interface according to the received media processing capabilities of the media player device, the user interface being configured by the processor to be displayed at the media player device to display a list of the media assets stored in the storage device, and to receive a user input from the media player device selecting a media asset from the displayed list at the media player device, the processor being configured to access the selected media asset from the storage device; and
an external biometric characteristic reader configured to receive user biometric characteristic data to verify a user's identity after a wireless connection with the media player device is established,
wherein:
the processor is configured to access the selected media asset from the storage device via the wireless transceiver and the data network based on verification of the user's identity, for display at the media player device; and
the user interface is configured by the processor to filter the displayed list of stored media assets according to the received media format processing capabilities of the media player device to omit media assets that cannot be processed by the media player device.

US Pat. No. 10,433,000

TIME-SENSITIVE CONTENT UPDATE

Facebook, Inc., Menlo Pa...

1. A method comprising:
by one or more computing systems, determining available media content from one or more content sources;
by the one or more computing systems, accessing a graph comprising:
a plurality of first nodes that are each associated with a respective user;
a plurality of second nodes that are each associated with a respective show or movie; and
a plurality of edges connecting the first nodes and the second nodes, each particular edge indicating that a particular user corresponding to a particular first node previously watched, liked, shared, or commented on a particular show or movie corresponding to a particular second node;
by the one or more computing systems, providing at least a portion of the available media content in a programming guide on a display device, the programming guide comprising:
a plurality of entries, each entry associated with one of the available media content; and
social content as determined from the graph; and
by the one or more computing systems, updating, in response to selection of a user-selectable update option, the programming guide to display one or more new entries that are each associated with new available media content, the user-selectable update option comprising a graphical touch-input element displayed on a touch-sensitive portion of the display device, the touch-input element comprising a message indicating that updates to the programming guide are available;
by the one or more computing systems, automatically updating the programming guide a pre-defined amount of time after a previous update of the programming guide;
by the one or more computing systems, automatically updating the programming guide a pre-defined amount of time before each half hour; and
by the one or more computing systems, automatically updating the programming guide in response to a pre-defined amount of time of inactivity with respect to user interaction with the programming guide.

US Pat. No. 10,432,998

ONLINE BACKUP AND RESTORATION OF TELEVISION RECEIVER STORAGE AND CONFIGURATION DATA

DISH Technologies L.L.C.,...

1. A television receiver device, comprising:
one or more tuners configured to receive television signals from one or more television data sources;
one or more decoders configured to decode television signals received via the one or more tuners;
a digital video recorder (DVR) comprising a DVR database configured to receive and store recorded television programs;
one or more processors; and
memory communicatively coupled with and readable by the one or more processors and having stored therein processor-readable instructions which, when executed by the one or more processors, cause the television receiver device to:
receive, from a backup server operating separately and at a remote location from the television receiver device, television receiver backup data, the television receiver backup data including a listing of television program data identifying one or more television programs to be restored to the television receiver device;
receive electronic programming guide (EPG) data corresponding to a plurality of upcoming television program broadcasts from one or more television data sources;
determine whether at least one of the television programs to be restored to the television receiver device is identified within the received EPG data corresponding to the plurality of upcoming television program broadcasts, using the listing of television program data received from the backup server;
in response to determining that at least one television program to be restored to the television receiver device is identified within the received EPG data, set one or more timers on the television receiver device to record the identified at least one television program, based on the EPG data;
receive the identified at least one television program via the one or more television data sources; and
record and store the identified at least one television program in the DVR database of the digital video recorder.

US Pat. No. 10,432,997

INTERCHANGEABLE REAR SEAT INFOTAINMENT SYSTEM

VOXX INTERNATIONAL CORPOR...

1. A rear seat entertainment system, comprising:
a first access point, wherein the first access point includes a first display screen and a plurality of input/output ports, and wherein the first access point is included in a first housing; and
a second housing separate from the first housing, wherein the second housing includes a second display screen,
wherein the first access point is configured to display first content on the first display screen and stream the first content displayed on the first display screen to the second housing so that the first content is simultaneously displayed on the first and second display screens,
wherein the first access point is further configured to receive second content from a first mobile device, display the second content on the first display screen and stream the second content displayed on the first display screen to the second housing so that the second content is simultaneously displayed on the first and second display screens,
wherein the first access point includes a content caster configured in firmware on a system component, the content caster configured to stream the second content to the second housing in a first mode and a second mode,
wherein the content caster is configured to perform group communication using a real-time transport protocol and permit more than two devices to source content therefrom if the more than two devices are communicably coupled to the first access point through a wireless local area network.

US Pat. No. 10,432,996

MATCHING DATA OBJECTS TO VIDEO CONTENT

1. A method by an application server for matching information to video content for creation of a relatively large volume of information-to-video content matches, comprising:
providing, by an application processor and via a network access device, a list of videos to be reviewed for tagging with a tag that identifies a limited portion of the video content of a corresponding video listed in the list of videos;
receiving, from each of a plurality of devices each associated with a corresponding user account and via the network access device, a message including at least one vote indicating that the tag corresponds to a Graphical User Interface (GUI) object that includes information topically relevant to a subject of video content of one video of the list of videos;
validating, by the application processor, that the tag corresponds to the GUI enabled data object based at least in part on a number of received votes compared to a threshold number of votes; and
assigning, by the application processor, the video content to an authorized account in response to the number of received votes reaching or exceeding the threshold number of votes; and
receiving, from a device associated with the authorized account, data indicating that the tag corresponds to the GUI enabled data object;
wherein validating that the tag corresponds to the GUI enabled data object is further determined based on receiving the data indicating that the tag corresponds to the GUI enabled data object from the device associated with the authorized account.

US Pat. No. 10,432,990

APPARATUS AND METHODS FOR CARRIER ALLOCATION IN A COMMUNICATIONS NETWORK

Time Warner Cable Enterpr...

1. A method of operating a computerized network controller apparatus in a digital content network to allocate radio frequency (RF) spectrum using at least a modulator apparatus, the method comprising:
receiving, at a data communication interface of the computerized network controller apparatus, data identifying each of (i) a prioritized portion of the RF spectrum, and (ii) a non-prioritized portion of the RF spectrum;
receiving, at the data communication interface of the computerized network controller apparatus via the digital content network, data indicative of one or more user requests for content from one or more computerized user devices;
isolating the prioritized portion of the RF spectrum from a carrier selection algorithm operative to control the modulator apparatus, the carrier selection algorithm configured to allocate RF carriers for delivery of digital content based at least in part on the receipt of the data indicative of the one or more user requests for content;
causing utilization of the non-prioritized portion of the RF spectrum to dynamically service a first request of the one or more user requests for content, the first request corresponding to first digitally rendered content, the causing utilization of the non-prioritized portion of the spectrum based at least in part on a determination that the first digitally rendered content is not currently delivered on the prioritized portion of the RF spectrum; and
causing utilization of the prioritized portion of the RF spectrum to service a second request of the one or more user requests for content, the second request corresponding to second digitally rendered content, the prioritized portion of the spectrum not subject to allocation of RF carriers for delivery of the second digitally rendered content based on receipt of the data indicative of the one or more user requests for content.

US Pat. No. 10,432,989

TRANSMISSION APPARATUS, TRANSMISSION METHOD, RECEPTION APPARATUS, RECEIVING METHOD, AND PROGRAM

SATURN LICENSING LLC, Ne...

1. A transmission method comprising a step of delivering a Layered Coding Transport (LCT) packet including a portion and an LCT header, the portion being data including part of a fragment, wherein
the fragment includes:
a movie fragment (moof); and
a media data (mdat) including an mdat header and a sample group,
the moof includes BaseMediaDecodeTime representing a presentation time of a first sample of the mdat, and
the LCT header includes:
a sequence number representing a position of the fragment;
a version representing a position of the part of the fragment in the fragment;
a header extension portion including a Network Time Protocol (NTP) time representing the presentation time of the first sample of the mdat;
sample count start information representing a position of a first sample of the part of the fragment from a first sample of the fragment; and
a moof subset that is at least part of the moof.

US Pat. No. 10,432,988

LOW LATENCY WIRELESS VIRTUAL REALITY SYSTEMS AND METHODS

ATI TECHNOLOGIES ULC, Ma...

1. A method of processing Virtual Reality (VR) data, the method comprising:
receiving user feedback information;
using one or more server processors to:
predict, based on the user feedback information, a user viewpoint of a next frame of a sequence of frames of video data to be displayed;
render a portion of the next frame of video data to be displayed using the prediction; and
encode the portion of the next frame of video data to be displayed; and
transmit the encoded and formatted portion of the next frame of video data to be displayed.

US Pat. No. 10,432,986

RECALL AND TRIGGERING SYSTEM FOR CONTROL OF ON-AIR CONTENT AT REMOTE LOCATIONS

Disney Enterprises, Inc.,...

1. An affiliated station for communication with a television network, the affiliated station comprising:
a memory; and
a processor configured to:
receive a video from the television network, the video including a command inserted into the video using a data identifier (DID) and a secondary data identifier (SDID) and carried in an ancillary data area of the video;
store the video and the inserted command in the memory, the memory also storing a command detection module and a local content element insertion module;
after storing the video and the inserted command in the memory, detect, using the command detection module also stored in the memory, the inserted command by searching through the ancillary data area of the video stored in the memory;
load a database based on the inserted command detected by searching through the video;
identify a first content element, a second content element, and a third content element having a layering order in the video using image recognition performed by the command detection module and the local content element insertion module based on the inserted command detected by searching through the video;
retrieve data from the database based on the inserted command detected by searching through the video;
modify the video using the data based on the inserted command by changing the layering order of the first content element, the second content element, and the third content element to generate a modified video; and
transmit the modified video to a display device.

US Pat. No. 10,432,983

LIVE VIDEO CLASSIFICATION AND PREVIEW SELECTION

Twitter, Inc., San Franc...

1. A computing device comprising:
at least one processor; and
a non-transitory computer-readable medium having executable instructions stored thereon that, when executed by the at least one processor, are configured to:
for each of a plurality of live video streams available for viewing via a live video sharing platform:
obtain a portion of the live video stream, the portion being a segment generated by a streaming protocol,
assign the portion to a class using a video classifier, each class used in the video classifier having an associated tag indicating whether the class is preview-eligible or not preview-eligible, wherein a class with a preview-eligible tag has an associated percentage that represents rare occurrence within a statistically relevant sample of segments of live video streams, the percentage representing a quantity of segments in the sample that are classified into the class compared with a total quantity of segments in the sample,
determine, based on the tag for the class, whether the portion is preview-eligible, and
generate, responsive to determining that the portion is preview-eligible, a snippet of the live video stream using the portion, and
provide at least some of the snippets for display in a user interface, the snippets provided for display in the user interface being selectable and the user interface being configured to, responsive to a user selecting a first snippet of the snippets provided for display in the user interface, enable the user to join the live video stream corresponding to the first snippet.
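The rarity test in the claim above (a class is preview-eligible when its share of a sample of classified segments is small) reduces to a simple frequency check. The sketch below is illustrative: `tag_classes` and the 5% threshold are assumptions, not values from the patent.

```python
# Illustrative only: tag each video class as preview-eligible when its share
# of the sampled segments is below a rarity threshold. The function name and
# the 5% default are assumptions for this sketch.

from collections import Counter


def tag_classes(sample_labels, rarity_threshold=0.05):
    """Return {class: 'preview-eligible' | 'not preview-eligible'}."""
    counts = Counter(sample_labels)
    total = len(sample_labels)
    return {
        cls: ("preview-eligible" if n / total < rarity_threshold
              else "not preview-eligible")
        for cls, n in counts.items()
    }
```

The intuition is that common classes (e.g. a static talking head) make poor previews, while statistically rare classes are more likely to be interesting moments.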

US Pat. No. 10,432,982

ADAPTIVE BITRATE STREAMING LATENCY REDUCTION

ARRIS Enterprises LLC, S...

1. A method of transmitting media content, comprising:
receiving an adaptive transport stream description at an HTTP streamer from a media preparation unit, the adaptive transport stream description describing media content available from the media preparation unit as one or more adaptive transport streams, wherein each of said one or more adaptive transport streams are continuous streams comprising a plurality of switchable segments each comprising one or more delivery chunks, the switchable segments being marked with segment boundary points and the delivery chunks being marked with chunk boundary points, wherein positions between each of said plurality of switchable segments are positions at which a client device can switch to a different one of said one or more adaptive transport streams;
publishing a playlist with said HTTP streamer listing identifiers for one or more of said plurality of switchable segments, the switchable segments including delivery chunks;
receiving said one or more adaptive transport streams into a memory buffer at said HTTP streamer from said media preparation unit;
receiving a request at said HTTP streamer from said client device for a particular switchable segment identified on said playlist to be received at a requested bit rate;
responding to said request in the HTTP streamer by processing the received one or more adaptive transport streams on the fly in real time by:
continuing receipt of delivery chunks of a switchable segment prior to the particular switchable segment until the delivery chunks reach a segment boundary point;
identifying boundary marks of the one or more chunks in only the particular switchable segment; and
transmitting the one or more delivery chunks from said particular switchable segment at the requested bit rate to said client device using HTTP chunked transfer encoding until a terminating segment boundary point is reached,
wherein each of said one or more delivery chunks are portions of the particular switchable segment that are independently decodable by said client device, such that said HTTP streamer is configured to begin sending delivery chunks from a requested switchable segment so that said client device begins decoding and rendering received delivery chunks when said HTTP streamer has not yet received additional ones of the switchable segments from said media preparation unit.

US Pat. No. 10,432,977

SIGNAL RESHAPING FOR HIGH DYNAMIC RANGE SIGNALS

Dolby Laboratories Licens...

1. A method to improve backward compatible decoding, the method comprising:
accessing with a processor an image database;
computing first hue values in a first color space of the images in the image database;
computing second hue values in a second color space of the images in the database;
computing a hue rotation angle by minimizing a hue cost function, wherein the hue cost function is based on a difference measure of the first hue values and rotated second hue values;
generating based on the hue rotation angle a color-rotation matrix for color-rotating input images prior to encoding;
computing first saturation values of the images in the database in the first color space;
transforming the images in the database into the second color space to generate transformed images;
applying the color-rotation matrix to the transformed images to generate color-rotated images;
computing second saturation values of the color-rotated images;
computing a saturation scaler based on minimizing a saturation cost function, wherein the saturation cost function is based on a difference measure between the first saturation values and scaled second saturation values; and
generating a scaling vector based on the saturation scaler.
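The hue-rotation step in the claim above (find the rotation angle that minimizes a cost between hue values in the first color space and rotated hue values in the second) can be sketched numerically. The sum-of-squared circular differences cost and the grid search below are stand-ins: the patent does not specify this cost or optimizer, and the data here would be synthetic.

```python
# Illustrative sketch: pick the hue rotation angle minimizing a cost between
# first-space hues and rotated second-space hues. The circular squared-error
# cost and the coarse grid search are assumptions for this sketch.

import math


def hue_cost(first_hues, second_hues, angle):
    """Sum of squared circular differences after rotating the second hues."""
    cost = 0.0
    for h1, h2 in zip(first_hues, second_hues):
        # wrap the difference into (-pi, pi] so hue circularity is respected
        d = (h1 - (h2 + angle) + math.pi) % (2 * math.pi) - math.pi
        cost += d * d
    return cost


def best_rotation(first_hues, second_hues, steps=3600):
    """Grid-search the angle (radians) minimizing the hue cost function."""
    angles = [2 * math.pi * i / steps for i in range(steps)]
    return min(angles, key=lambda a: hue_cost(first_hues, second_hues, a))
```

Once the angle is found, it parameterizes the color-rotation matrix that the claim applies to input images before encoding.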

US Pat. No. 10,432,976

IMAGE PROCESSING APPARATUS AND METHOD

VELOS MEDIA, LLC, Plano,...

1. An image processing apparatus comprising:
a processor; and
a memory storing instructions that, when executed by the processor, cause the processor to:
decode, from encoded data, identification information indicating whether a non-compression mode has been selected in a coding unit, wherein the encoded data includes the coding unit and the identification information, the coding unit being formed by block partitioning a largest coding unit (LCU) into a plurality of coding units, wherein the block partitioning of the LCU includes recursively splitting the LCU into the plurality of coding units; and
decode the coding unit in the encoded data using the identification information by:
if the identification information indicates that the non-compression mode has not been selected in the coding unit, decoding the coding unit according to a first bit depth, and
if the identification information indicates that the non-compression mode has been selected in the coding unit, decoding the coding unit according to a second bit depth.

US Pat. No. 10,432,971

IMAGE DATA COMPRESSION AND DECOMPRESSION USING MINIMIZE SIZE MATRIX ALGORITHM

Sheffield Hallam Universi...

1. A data processing device comprising at least one data processor and a non-transitory computer readable medium coupled to the at least one data processor, the non-transitory computer readable medium storing instructions that when executed by the at least one data processor cause the at least one data processor to perform a process comprising:
applying a discrete cosine transform (DCT) to each of a plurality of non-overlapping pixel blocks which span a frame of image data to generate a set of DCT coefficients for each pixel block comprising a DC DCT coefficient and a plurality of AC DCT coefficients;
quantising each set of DCT coefficients to generate a set of quantised DC DCT coefficients and a set of quantised AC DCT coefficients;
forming a DC array from the set of quantised DC DCT coefficients;
forming an AC matrix from the set of quantised AC DCT coefficients;
forming a limited data array comprising elements having values corresponding only to each unique value of the elements of the AC matrix;
compressing the AC matrix by eliminating blocks of data of the AC matrix having only zero values and forming a reduced AC array from blocks of data of the AC matrix including non-zero values;
storing a position in the AC matrix of each block of data of the AC matrix including non-zero values in a location array;
generating a key using a maximum value of the elements of the reduced AC array, and wherein the key comprises a plurality of key components;
compressing the reduced AC array using the key to form a coded AC array, wherein a same number of elements of the reduced AC array as a number of key components are combined using the key to form a single element of the coded AC array;
arithmetically coding the DC array and the coded AC array to form arithmetically coded data; and
forming a compressed image file including the arithmetically coded data, storing the location array in a header of the compressed image file and storing the key and the limited data array.
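
The zero-block elimination step of this claim can be illustrated with a minimal sketch. It drops all-zero blocks of the AC matrix, concatenates the remaining blocks into a reduced AC array, and records each kept block's position in a location array; the block size and all function names below are our own illustrative choices, not the patent's.

```python
# Sketch of the claimed zero-block elimination: all-zero blocks of the AC
# matrix are discarded, blocks with non-zero values are concatenated into a
# reduced AC array, and the position of each kept block goes into a
# location array. Block size and names are illustrative assumptions.

def eliminate_zero_blocks(ac_matrix, block_size):
    """Split a flat AC array into fixed-size blocks and keep non-zero ones."""
    reduced_ac = []   # concatenation of blocks containing non-zero values
    locations = []    # index of each kept block within the AC matrix
    for i in range(0, len(ac_matrix), block_size):
        block = ac_matrix[i:i + block_size]
        if any(v != 0 for v in block):
            locations.append(i // block_size)
            reduced_ac.extend(block)
    return reduced_ac, locations

# blocks: [0,0] [3,-1] [0,0] [0,5] -> keep blocks 1 and 3
reduced, loc = eliminate_zero_blocks([0, 0, 3, -1, 0, 0, 0, 5], 2)
```

The location array would then be written into the header of the compressed file so the decoder can restore the dropped zero blocks at their original positions.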

US Pat. No. 10,432,970

SYSTEM AND METHOD FOR ENCODING 360° IMMERSIVE VIDEO

Telefonaktiebolaget LM Er...

1. A media preparation method, comprising:receiving a media input stream;
generating a plurality of bitrate representations of the media input stream, each bitrate representation having a separate video quality that is related to a quantization parameter (QP) value used for each bitrate representation;
encoding each bitrate representation into a first coded bitstream comprising a plurality of frames with a specific Group-of-Pictures (GOP) structure, wherein each GOP starts with an intra-coded (I) frame followed by a set of frames including at least one predictive-coded (P) frame; and
encoding each bitrate representation into a second coded bitstream comprising a plurality of frames with a GOP structure that has a size coextensive with a size of the GOP structure of the first coded bitstream, wherein each GOP of the second coded bitstream starts with an I-frame followed by a plurality of X-frames, each X-frame having a slice header of a P-frame and comprising blocks of only intra-coded data (I-blocks).
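
The two GOP layouts in this claim can be sketched as frame-type sequences: both bitstreams share the same GOP size, the first follows each I-frame with P-frames, and the second follows it with "X-frames" (frames carrying a P-slice header but only intra-coded blocks). The helper below is our own framing, not the patented encoder.

```python
# Illustrative sketch of the two coextensive GOP structures: the first
# stream is I followed by P-frames; in the second every frame after the
# I-frame is an X-frame (P-frame slice header, all-intra blocks).

def gop_frame_types(gop_size, second_stream=False):
    """Return the frame-type sequence for one GOP of either stream."""
    body = "X" if second_stream else "P"
    return ["I"] + [body] * (gop_size - 1)

print(gop_frame_types(4))                      # ['I', 'P', 'P', 'P']
print(gop_frame_types(4, second_stream=True))  # ['I', 'X', 'X', 'X']
```

Because an X-frame is intra-decodable yet syntactically a P-frame, a player can switch between bitrate representations mid-GOP without waiting for the next I-frame.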

US Pat. No. 10,432,969

3D VIDEO DATA STREAM COMPRISING A VALUE TABLE ASSOCIATING REFERENCE VIEWS AND CAMERA PARAMETER IN TEMPORAL UNIT SCOPE AND BEYOND TEMPORAL UNIT SCOPE PARAMETER SETS AND AN ENCODER FOR ENCODING AND A DECODER FOR DECODING THE 3D VIDEO DATA STREAM

Fraunhofer-Gesellschaft zu...

1. A non-transitory computer-readable storage medium storing a 3D video data stream comprising:a set of coded views coded in the 3D video data stream in temporal units; and
a parameter set comprising a first table comprising an entry for each coded view, which comprises
a value indicating a count of a number of reference views,
for each coded view for which the count of the number of reference views exceeds zero,
a flag indicating whether, for each of the number of reference views of the respective coded view, a relative camera parameter associated with the respective reference view is present in the first table or within portions of the 3D video data stream not exceeding the scope of the temporal units,
for each of the number of reference views of the respective coded view,
an index indexing the respective reference view, and
if the flag indicates that, for each of the number of reference views of the respective coded view, the relative camera parameter associated with the respective reference view is present in the first table, the relative camera parameter associated with the respective reference view of the respective coded view,
wherein a scope of the parameter set is beyond the scope of the temporal units,
wherein the 3D video data stream further comprises, for each coded view for which the flag indicates that, for each of the number of reference views of the respective coded view, the relative camera parameter associated with the respective reference view is present within the portions of the 3D video data stream not exceeding the scope of the temporal units,
within each temporal unit encompassed by the scope of the parameter set,
a temporal unit scope parameter set for the respective coded view, the temporal unit scope parameter set comprising a second table comprising, for each of the number of reference views of the respective coded view, an associated entry comprising
the relative camera parameter associated with the reference view of the respective coded view indexed by the index indexing the respective reference view,
wherein the relative camera parameter comprises
a pair of scale and offset values to convert depth values of the associated reference view to disparity values between the coded view for which the relative camera parameter is present in the 3D video data stream and the reference view with which the relative camera parameter is associated.
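
The purpose of the claimed scale/offset pair can be shown with a minimal sketch: it maps a depth value of the reference view to a disparity between the coded view and that reference view via a linear relation. The exact fixed-point formula used in 3D video coding standards differs; this is illustrative only.

```python
# Hedged sketch of depth-to-disparity conversion using the claimed
# (scale, offset) pair. Real codecs use a fixed-point variant with a
# shift; this linear form only illustrates the role of the two values.

def depth_to_disparity(depth, scale, offset):
    """Linear depth-to-disparity conversion for one reference view."""
    return scale * depth + offset

assert depth_to_disparity(10, 2, 3) == 23
```

One such pair is signaled per (coded view, reference view) entry, either once in the parameter set or per temporal unit, which is exactly the choice the claim's flag controls.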

US Pat. No. 10,432,965

VIDEO-ENCODING METHOD AND VIDEO-ENCODING APPARATUS BASED ON ENCODING UNITS DETERMINED IN ACCORDANCE WITH A TREE STRUCTURE, AND VIDEO-DECODING METHOD AND VIDEO-DECODING APPARATUS BASED ON ENCODING UNITS DETERMINED IN ACCORDANCE WITH A TREE STRUCTURE

SAMSUNG ELECTRONICS CO., ...

1. An apparatus for decoding a video, the apparatus comprising:a receiver configured to receive a bitstream with respect to an encoded video; and
a decoder configured to extract, from the received bitstream, information about a size of a coding unit that is a data unit for decoding of a picture of the encoded video, a variable depth of the coding unit, split information and an encoding mode with respect to coding units having a tree structure of the picture, determine a maximum size of the coding unit based on the information about the size and the variable depth, split the picture into one or more maximum coding units based on the determined maximum size, determine the coding units having the tree structure based on the split information, and decode and reconstruct the picture based on the determined coding units based on the encoding mode, wherein a maximum coding unit, among the one or more maximum coding units, is hierarchically split into one or more coding units of depths including at least one of a current depth and a lower depth according to the split information, when the split information indicates a split for the current depth, a coding unit of the current depth is split into four coding units of the lower depth, independently from neighboring coding units, and when the split information indicates a non-split for the current depth, the coding unit of the current depth is split into one or more prediction units.

US Pat. No. 10,432,962

ACCURACY AND LOCAL SMOOTHNESS OF MOTION VECTOR FIELDS USING MOTION-MODEL FITTING

PIXELWORKS, INC., Portla...

1. A method of producing video data, comprising:receiving, at a processor, a current frame of image data in a stream of frames of image data;
dividing a current frame of image data into blocks;
identifying a current block and defining a neighborhood of blocks for the current block;
generating at least one initial motion vector for each block;
using the initial motion vector for current block and an initial motion model to calculate a weight for each initial motion vector in the neighborhood based on a difference between initial motion vector for the current block and the initial motion vector for at least one other block from the current block in the neighborhood and differences in the image data between the current block and the other blocks in the neighborhood;
using the weights for each initial motion vector to generate coefficients for a refined motion model;
refining the initial motion vector for the current block according to the refined motion model to produce a refined motion vector;
using the refined motion vector and the pixels in the stream of frames to produce at least one of adjusted pixels and new pixels; and
displaying the at least one of adjusted pixels and new pixels on a display.
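
The weighting step of this claim can be sketched as follows: each neighbor's initial motion vector receives a weight that falls off with (a) the motion-vector difference from the current block and (b) the image-data difference between the two blocks. The exponential kernel and its parameters are assumptions for illustration, not the patented form.

```python
# Hedged sketch of per-neighbor weight computation: larger MV differences
# and larger image differences both shrink the weight. The exponential
# falloff and the sigma constants are our assumptions.
import math

def mv_weight(mv_cur, mv_nbr, img_diff, sigma_mv=4.0, sigma_img=8.0):
    """Weight for a neighbor's initial motion vector in the neighborhood."""
    dmv = math.hypot(mv_cur[0] - mv_nbr[0], mv_cur[1] - mv_nbr[1])
    return math.exp(-dmv / sigma_mv) * math.exp(-img_diff / sigma_img)

# An identical MV with identical image data gives the maximum weight 1.0.
assert mv_weight((2, 3), (2, 3), 0.0) == 1.0
```

These weights then drive a weighted fit of the refined motion model's coefficients, so outlier neighbors contribute little to the refined motion vector.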

US Pat. No. 10,432,961

VIDEO ENCODING OPTIMIZATION OF EXTENDED SPACES INCLUDING LAST STAGE PROCESSES

Apple Inc., Cupertino, C...

1. A video compression method, comprising:converting a video signal from a first format to a second format used by a video encoder;
combining a predicted video signal in the second format with the video signal in the second format to produce a residual video signal in the second format;
coding the residual video signal by the video encoder using selected coding parameters;
decoding the coded data that are output by the video encoder to produce decoded data in the second format;
filtering with a first input of the decoded data in the second format and a second input of the video signal in the first format, and producing filtered decoded data in the second format using both the first input and the second input;
storing the filtered decoded data in a decoded picture buffer; and
predicting the predicted video signal in the second format from the stored filtered decoded data in the decoded picture buffer.

US Pat. No. 10,432,960

OFFSET TEMPORAL MOTION VECTOR PREDICTOR (TMVP)

ARRIS Enterprises LLC, S...

1. A method of temporal motion vector prediction for inter block coding in High Efficiency Video Coding (HEVC) that relies on a block based translational model, the method comprising:designating a current prediction block as an area for motion compensation using HEVC where all the pixels inside the current prediction block perform identical translation temporally using either one or more motion vectors MVs;
deriving a coordinate offset for the current prediction block from the MVs of its spatially neighboring blocks;
defining an offset of a temporal motion vector predictor (TMVP) for the current prediction block as the MV of an offset block which is in the geometrical location of the current prediction block coordinate plus the coordinate offset in a specified temporal reference picture; and
using the offset TMVP to code MVs,
wherein the motion vectors of neighboring prediction blocks to the current prediction block are used to calculate the offset for the TMVP,
wherein the neighboring prediction blocks located in a first three positions in a merge candidate list for the current prediction block are used in calculating the offset for the TMVP, wherein the three neighboring prediction blocks comprise a left (L), an above (A), and an above-left (AL),
wherein with the three neighboring prediction blocks, the offset for the TMVP for the current prediction block is derived as median of motion vectors of these neighbors, as follows:
dx=median (Lx, ALx, Ax)
dy=median (Ly, ALy, Ay)
wherein Lx, ALx, Ax are the x component of motion vectors of Left neighbor, Above-left neighbor, and Above neighbor, respectively, and
wherein Ly, ALy, Ay are the y component of motion vectors of Left neighbor, Above-left neighbor, and Above neighbor, respectively.
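
The offset derivation in this claim is concrete enough to compute directly: the TMVP offset is the component-wise median of the L, AL, and A neighbors' motion vectors. Names mirror the claim; the helper itself is ours.

```python
# Direct computation of the claimed TMVP offset:
#   dx = median(Lx, ALx, Ax),  dy = median(Ly, ALy, Ay)
# where L, AL, A are the left, above-left, and above neighbors' MVs.

def tmvp_offset(L, AL, A):
    """Component-wise median of three neighbor motion vectors (x, y)."""
    med3 = lambda a, b, c: sorted((a, b, c))[1]
    dx = med3(L[0], AL[0], A[0])
    dy = med3(L[1], AL[1], A[1])
    return dx, dy

# x components 4, 1, 3 -> median 3; y components -2, 0, 5 -> median 0
assert tmvp_offset((4, -2), (1, 0), (3, 5)) == (3, 0)
```

The offset block at (current position + offset) in the reference picture then supplies the motion vector used as the offset TMVP.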

US Pat. No. 10,432,959

SIGNALING HIGH DYNAMIC RANGE AND WIDE COLOR GAMUT CONTENT IN TRANSPORT STREAMS

ARRIS Enterprises LLC, S...

1. An apparatus for generating or receiving a transport stream including a program map table, the apparatus comprising:one or more processors including computer-readable instructions for generating a program map table that includes the following:
an elementary stream identifier indicating a particular elementary stream within a transport stream;
a High Efficiency Video Coding (HEVC) video descriptor signaling a syntax element using two bits for combining signaling of a presence or absence of both a high dynamic range (HDR) content and a wide color gamut (WCG) content in a single syntax element,
wherein the high dynamic range content and/or wide color gamut content is associated with an elementary stream based on the elementary stream identifier signaled in the program map table.
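
Packing HDR and WCG presence into a single two-bit syntax element, as this claim describes for the HEVC video descriptor, can be sketched simply. The bit assignment below (HDR in the high bit, WCG in the low bit) is our assumption; the actual descriptor layout may differ.

```python
# Sketch of a combined two-bit HDR/WCG presence field. Bit positions are
# illustrative assumptions, not the standardized descriptor layout.

def pack_hdr_wcg(hdr_present, wcg_present):
    """Combine HDR and WCG presence flags into one 2-bit value."""
    return (int(hdr_present) << 1) | int(wcg_present)

def unpack_hdr_wcg(bits):
    """Recover (hdr_present, wcg_present) from the 2-bit value."""
    return bool(bits & 0b10), bool(bits & 0b01)

assert pack_hdr_wcg(True, False) == 0b10
assert unpack_hdr_wcg(0b11) == (True, True)
```

A receiver parsing the program map table reads this field from the HEVC descriptor of the elementary stream identified in the table, learning both properties from one syntax element.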

US Pat. No. 10,432,957

TRANSMISSION DEVICE, TRANSMITTING METHOD, RECEPTION DEVICE, AND RECEIVING METHOD

Saturn Licensing LLC, Ne...

1. A transmission device comprising:circuitry configured to
generate a container in a format including identifying information and a video stream separately provided in the container, the video stream including encoded image data;
insert, into the video stream, auxiliary information for downscaling a spatial and/or temporal resolution of the image data;
set the identifying information included in the container to indicate that the video stream includes the auxiliary information for a decoder that does not support the spatial and/or temporal resolution of the image data, and
transmit the container, wherein
downscaling processing of the spatial and/or temporal resolution of the image data is applied, by a reception device which receives the container having the video stream and the identifying information and extracts the auxiliary information inserted into the video stream, to the image data according to the extracted auxiliary information for downscaling the spatial and/or temporal resolution of the image data to generate display image data having a desired resolution.

US Pat. No. 10,432,956

IMAGE CODING DEVICE, IMAGE DECODING DEVICE, IMAGE CODING METHOD, AND IMAGE DECODING METHOD

Mitsubishi Electric Corpo...

1. An image decoding device comprising:a variable length decoder for performing a variable-length-decoding process on coded data multiplexed into a bitstream to obtain a coding mode for each of coding blocks; and
a prediction image generator that carries out a prediction process corresponding to said coding mode to generate a prediction image, said prediction image generator carrying out an intra prediction process for a current partition which is predicted by intra mode;
wherein said variable length decoder obtains from said bitstream an intra merge flag indicating whether or not an intra prediction parameter of said current partition is identical to that of an adjacent partition located above or to the left of said current partition,
wherein when there are two or more partitions adjacent to top or left of said current partition, a first partition in a direction away from a top left of said current partition is selected as said adjacent partition,
wherein when said intra merge flag indicates that said intra prediction parameter of said current partition is identical to that of said adjacent partition, said variable length decoder obtains from said bitstream an intra merge direction specifying, out of said adjacent partitions located above and to the left of said current partition, the adjacent partition whose intra prediction parameter is identical to that of said current partition, and
when said intra prediction parameter of said current partition is not identical to that of said adjacent partition, said variable length decoder obtains from said bitstream said intra prediction parameter for said current partition.

US Pat. No. 10,432,954

VIDEO ENCODER, VIDEO ENCODING SYSTEM AND VIDEO ENCODING METHOD

NVIDIA CORPORATION, Sant...

1. A video encoding system, comprising a controller, a first video encoder, a second video encoder, and a memory, where the video encoding system:divides a frame of an image into a predetermined number of predetermined portions, where the predetermined number of the predetermined portions is based on a number of a plurality of video encoders within the video encoding system;
sends, from the controller to a first video encoder of the plurality of video encoders, a command to encode a first predetermined portion of the frame of the image;
sends, from the controller to a second video encoder of the plurality of video encoders, a command to encode a second predetermined portion of the frame of the image separate from the first predetermined portion of the frame of the image;
retrieves from the memory, by the first video encoder, the first predetermined portion of the frame of the image;
retrieves from the memory, by the second video encoder, the second predetermined portion of the frame of the image;
encodes, by the first video encoder, the first predetermined portion of the frame of the image to create a first encoded portion of the frame of the image, wherein during the encoding of the first predetermined portion of the frame of the image by the first video encoder, a value of an image height and width register used by the first video encoder is set based on a height and width of the first predetermined portion of the frame of the image, and a value of a macro block (MB) position register used by the first video encoder is set based on a position of a macro block in the image;
encodes, by the second video encoder, the second predetermined portion of the frame of the image to create a second encoded portion of the frame of the image different from the first encoded portion of the frame of the image, wherein during the encoding of the second predetermined portion of the frame of the image by the second video encoder, a value of an image height and width register used by the second video encoder is set based on a height and width of the second predetermined portion of the frame of the image, and a value of a macro block (MB) position register used by the second video encoder is set based on the position of the macro block in the image;
writes, by the first video encoder, the first encoded portion of the frame of the image to the memory; and
writes, by the second video encoder, the second encoded portion of the frame of the image to the memory.
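
The partitioning step of this claim can be sketched as dividing a frame into as many strips as there are encoders, with each encoder's height/width register values taken from its own portion. Strip-wise horizontal splitting is an assumption; the claim only requires separate predetermined portions.

```python
# Illustrative frame partitioning for N parallel encoders: the frame is
# cut into N horizontal strips, and each strip supplies the height/width
# register values for its encoder. Strip layout is our assumption.

def partition_frame(width, height, num_encoders):
    """Split a frame into num_encoders horizontal strips."""
    base, extra = divmod(height, num_encoders)
    strips, y = [], 0
    for i in range(num_encoders):
        h = base + (1 if i < extra else 0)  # distribute remainder rows
        strips.append({"x": 0, "y": y, "width": width, "height": h})
        y += h
    return strips

strips = partition_frame(1920, 1080, 2)
assert [s["height"] for s in strips] == [540, 540]
```

Each encoder also needs the macroblock position register set from its strip's offset in the full frame, so the encoded portions reassemble correctly in memory.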

US Pat. No. 10,432,952

SYSTEM AND METHODS FOR FIXED-POINT APPROXIMATIONS IN DISPLAY STREAM COMPRESSION (DSC)

QUALCOMM Incorporated, S...

1. An apparatus for coding video data, comprising:a memory for storing the video data, the memory including a buffer; and
a hardware processor operationally coupled to the memory and configured to:
determine and store a scaling parameter based upon a total number of pixels within a slice of video data;
determine and store a data structure associating a plurality of input values with their reciprocal values;
receive the video data to be coded, the video data comprising at least one slice divided into a plurality of blocks;
determine a threshold value based upon the stored scaling parameter;
for a block of the slice to be coded, in response to a determination that a number of remaining pixels in the slice is less than the threshold value, update the scaling parameter and determine an updated threshold value based upon the scaling factor; and
perform one or more fixed-point approximation operations to determine a target rate for the block, based upon a reciprocal value associated with the number of remaining pixels scaled based upon the scaling factor, wherein the reciprocal value is determined using the stored data structure.
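
The reciprocal-table idea in this claim can be sketched briefly: the division by the number of remaining pixels is replaced by a lookup of a precomputed fixed-point reciprocal followed by a multiply and a shift. The 16-bit precision and table range below are our assumptions.

```python
# Hedged sketch of fixed-point division via a stored reciprocal table,
# as used for target-rate computation in DSC-style rate control.
# Precision and table size are illustrative assumptions.

PRECISION = 16
RECIP = {n: (1 << PRECISION) // n for n in range(1, 1025)}  # 1/n in Q16

def approx_divide(numerator, n):
    """Approximate numerator / n with a table lookup, multiply, and shift."""
    return (numerator * RECIP[n]) >> PRECISION

# Matches exact integer division here: 1000 / 8 = 125.
assert approx_divide(1000, 8) == 125
```

The claimed scaling parameter extends this idea: when the remaining-pixel count exceeds the table range, it is scaled down into range first, and the threshold test decides when to update that scaling.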

US Pat. No. 10,432,951

CONFORMANCE AND INOPERABILITY IMPROVEMENTS IN MULTI-LAYER VIDEO CODING

QUALCOMM Incorporated, S...

1. A method of processing video data comprising:receiving coded video data having a plurality of output operation points;
extracting a selected output operation point from the plurality of output operation points, the selected output operation point being a sub-bitstream of an entire bitstream;
performing a first bitstream conformance test on the selected output operation point when the selected output operation point corresponds to one of an entire bitstream with only the base layer to be output, and a temporal subset of the entire bitstream with only the base layer to be output, the first bitstream conformance test being based on a set of sequence-level hypothetical reference decoder (HRD) parameters in an active sequence parameter set (SPS) for a base layer, and one or more non-nested supplemental enhancement information (SEI) messages, wherein the non-nested SEI messages comprise one of decoding unit information (DUI), buffering period (BP), and picture timing (PT) SEI messages, and the non-nested SEI messages are directly included in an SEI network abstraction layer (NAL) unit,
performing a second bitstream conformance test on the selected output operation point when the selected output operation point corresponds to one of a layer set specified by a base video parameter set (VPS) of an active VPS and a temporal subset of the layer set with only the base layer to be output, the second bitstream conformance test being based on a set of sequence-level HRD parameters in the base VPS and directly nested SEI messages, and
performing a third bitstream conformance test on the selected output operation point when the selected output operation point corresponds to one of an output layer set (OLS) specified by a VPS extension of the active VPS and a temporal subset of the OLS, the third bitstream conformance test being based on a set of sequence-level HRD parameters in the active VPS and indirectly nested SEI messages; and
applying the indirectly nested SEI messages only when the selected output operation point corresponds to an OLS specified in the VPS extension, the indirectly nested SEI messages being one of BP, PT, and DUI SEI messages.

US Pat. No. 10,432,946

DE-JUDDERING TECHNIQUES FOR CODED VIDEO

Apple Inc., Cupertino, C...

1. A video coding method, comprising:coding a source video sequence as base layer coded video at a first frame rate;
estimating frame rate conversion operations of a decoding terminal,
applying estimated frame rate conversion operations of the decoding terminal on decoded base layer data,
identifying a portion of the decoded base layer data having judder following the application of frame rate conversion operations of the decoding terminal,
for the identified portion, including a skip hint in coded video data to indicate that a decoder should omit frame rate conversion operations for the respective portion of video data, and
for the identified portion, coding additional frames of the source video sequence corresponding to the identified portion as coded enhancement layer data, the coded enhancement layer data and the coded base layer data when decoded generating recovered video representing the identified portion at a higher frame rate than the coded based layer data when decoded by itself.

US Pat. No. 10,432,942

SIGNALING COLOR VALUES FOR 3D LOOKUP TABLE FOR COLOR GAMUT SCALABILITY IN MULTI-LAYER VIDEO CODING

QUALCOMM Incorporated, S...

1. A method of decoding video data, the method comprising:determining a number of octants for each of three color components of a three-dimensional (3D) lookup table for color gamut scalability;
for each of the octants for each of the color components, decoding color mapping coefficients for a linear color mapping function of color values in the 3D lookup table used to convert color data in a first color gamut for a lower layer of the video data to a second color gamut for a higher layer of the video data, wherein decoding the color mapping coefficients comprises, for a first one of the octants for each of the color components, decoding at least one coefficient of the color mapping coefficients based on a predicted value of the at least one coefficient of the color mapping coefficients, and wherein decoding the color mapping coefficients further comprises, for each remaining one of the octants for each of the color components, decoding the color mapping coefficients based on predicted values from at least one previously decoded octant;
generating the 3D lookup table based on the number of octants for each of the color components and color values associated with the color mapping coefficients for each of the octants;
decoding residual data of video blocks of the video data; and
reconstructing the video blocks of the video data based on the decoded residual data and at least one reference picture generated using the 3D lookup table.
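
The predictive decoding pattern in this claim can be shown with a minimal sketch: the first octant's coefficient is reconstructed as a residual plus a prediction, and each later octant is predicted from the previously decoded one. The chained prediction and all values below are illustrative assumptions.

```python
# Hedged sketch of octant-by-octant predictive coefficient decoding:
# value = prediction + residual, with each decoded value serving as the
# prediction for the next octant. Names and values are assumptions.

def decode_coeffs(residuals, first_prediction):
    """Reconstruct color mapping coefficients from residuals."""
    coeffs = []
    pred = first_prediction
    for r in residuals:
        value = pred + r
        coeffs.append(value)
        pred = value          # next octant predicted from this one
    return coeffs

# residuals [3, -1, 2] with initial prediction 64 -> [67, 66, 68]
assert decode_coeffs([3, -1, 2], 64) == [67, 66, 68]
```

Signaling small residuals against predicted values rather than raw coefficients is what makes the 3D lookup table cheap to transmit for color gamut scalability.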

US Pat. No. 10,432,941

SIGNALING COLOR VALUES FOR 3D LOOKUP TABLE FOR COLOR GAMUT SCALABILITY IN MULTI-LAYER VIDEO CODING

QUALCOMM Incorporated, S...

1. A method of decoding video data, the method comprising:determining a number of octants for each of three color components of a three-dimensional (3D) lookup table for color gamut scalability;
for each of the octants for each of the color components, decoding color mapping coefficients for a linear color mapping function of color values in the 3D lookup table used to convert color data in a first color gamut for a lower layer of the video data to a second color gamut for a higher layer of the video data, wherein decoding the color mapping coefficients comprises, for a first one of the octants for each of the color components, decoding at least one coefficient of the color mapping coefficients based on a predicted value of the at least one coefficient of the color mapping coefficients, and wherein the at least one coefficient of the color mapping coefficients comprises a key coefficient that defines a weighting factor for the linear color mapping function between a same color component of the lower layer of the video data and the higher layer of the video data;
generating the 3D lookup table based on the number of octants for each of the color components and color values associated with the color mapping coefficients for each of the octants;
decoding residual data of video blocks of the video data; and
reconstructing the video blocks of the video data based on the decoded residual data and at least one reference picture generated using the 3D lookup table.

US Pat. No. 10,432,939

ENTROPY CODING SUPPORTING MODE SWITCHING

GE VIDEO COMPRESSION, LLC...

1. A decoder for decoding a data stream including encoded data of a video, the decoder comprising:an entropy decoding engine configured to decode data from the data stream based on an entropy decoding scheme to obtain a sequence of symbols, wherein, with respect to at least one symbol of the sequence of symbols, the entropy decoding engine is configured to:
select a context corresponding to the at least one symbol, and
decode the at least one symbol using the selected context based on the entropy decoding scheme, wherein the entropy decoding includes updating a probability model associated with the selected context at one of a first update rate under a high-efficiency mode of entropy decoding and a second update rate, that is lower than the first update rate, under a low-complexity mode of entropy decoding;
a desymbolizer configured to desymbolize the sequence of symbols to obtain a sequence of syntax elements; and
a reconstructor configured to reconstruct at least a portion of the video based on the sequence of syntax elements.
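
The mode-dependent update rate described in this claim can be sketched as follows: the probability model of the selected context adapts quickly in the high-efficiency mode and more slowly in the low-complexity mode. The exponential update and the rate constants are illustrative, not the patented scheme.

```python
# Hedged sketch of a context probability update at two mode-dependent
# rates: fast under high-efficiency decoding, slower under low-complexity
# decoding. The 1/16 and 1/64 rates are our assumptions.

def update_probability(p_one, bit, high_efficiency):
    """Move P(bit == 1) toward the observed bit at a mode-dependent rate."""
    rate = 1 / 16 if high_efficiency else 1 / 64
    target = 1.0 if bit else 0.0
    return p_one + rate * (target - p_one)

p_fast = update_probability(0.5, 1, high_efficiency=True)
p_slow = update_probability(0.5, 1, high_efficiency=False)
assert p_fast > p_slow > 0.5
```

A slower update rate reduces per-symbol work and state churn at some coding-efficiency cost, which is the trade-off the mode switch exposes.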

US Pat. No. 10,432,937

ADAPTIVE PRECISION AND QUANTIFICATION OF A WAVELET TRANSFORMED MATRIX

Jean-Claude Colin, Versa...

1. A method for compressing a digital image, comprising:a step for reducing an entropy of a component of said image, represented in a form of an original matrix (X), wherein:
said original matrix is transformed into a transformed matrix (T) using a wavelet transformation;
a respective quantisation coefficient corresponds to each detail matrix for each of plural detail matrices;
said wavelet transformation is calculated in fixed decimal point using a first number (D) of digits, wherein D≥1, after the decimal point, for each wavelet level for which at least one of the quantisation coefficients corresponding to each of the detail matrices is strictly greater than 1, and
at the end of the processing of a wavelet level in fixed-decimal point numbers, values of an approximation matrix are transformed into integer numbers when each of the quantisation coefficients of each of the detail matrices of a subsequent wavelet level is equal to 1, and are kept in fixed-decimal point numbers in the contrary case.
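
The end-of-level decision in this claim can be sketched directly: after processing a wavelet level in fixed point, the approximation matrix is rounded to integers only if every quantisation coefficient of the next level's detail matrices equals 1; otherwise it stays in fixed-decimal-point form. The helper names are ours.

```python
# Sketch of the claimed precision decision at the end of a wavelet level:
# convert the approximation matrix to integers iff all quantisation
# coefficients of the subsequent level equal 1. Names are illustrative.

def finalize_approximation(approx, next_level_q_coeffs):
    """Round to integers iff every next-level quantisation coefficient is 1."""
    if all(q == 1 for q in next_level_q_coeffs):
        return [round(v) for v in approx]
    return approx  # keep fixed-decimal-point values for further levels

assert finalize_approximation([1.25, 2.75], [1, 1]) == [1, 3]
assert finalize_approximation([1.25, 2.75], [1, 2]) == [1.25, 2.75]
```

Keeping extra fractional digits only while a later level still quantises coarsely limits rounding-error accumulation without paying for fixed-point arithmetic everywhere.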

US Pat. No. 10,432,933

IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM

Sony Corporation, Tokyo ...

1. An encoder for encoding an image signal comprising: processing circuitry configured toset, as a block setting process and in a case in which an operating mode requires resource efficiency higher than that of a normal mode, a depth of block division with a more limited variety of pixel sizes of coding units than that of the normal mode;
perform cost calculation and divisional determination only with respect to coding units within the set depth; and
skip cost calculation and divisional determination with respect to coding units outside the set depth.

US Pat. No. 10,432,929

METHOD AND APPARATUS FOR MULTIPLE LINE INTRA PREDICTION IN VIDEO COMPRESSION

TENCENT AMERICA LLC, Pal...

1. A method of video decoding, comprising:determining, for a current block of a picture, one of a plurality of reference lines, each reference line being parallel to a side of the current block;
determining an intra prediction mode for the current block in accordance with the determined one of the plurality of reference lines; and
performing intra prediction for the current block based on the determined intra prediction mode and one or more samples included in the determined one of the plurality of reference lines, wherein
the plurality of reference lines includes an adjacent reference line that is adjacent to the current block and at least one non-adjacent reference line that is not adjacent to the current block,
a number of intra prediction modes associated with each of the at least one non-adjacent reference line is equal to or less than half a number of intra prediction modes associated with the adjacent reference line, and
each of the at least one non-adjacent reference line is associated with a same number of intra prediction modes.
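
The mode-count relation in this claim can be checked directly: every non-adjacent reference line shares one mode count, and that count is at most half the adjacent line's. The specific numbers below (35 modes for the adjacent line, integer halving for the rest) are assumptions for illustration.

```python
# Sketch of the claimed per-reference-line intra mode counts: the adjacent
# line (index 0) keeps the full mode set; every non-adjacent line gets the
# same reduced count, no more than half the full set. Numbers are assumed.

def mode_counts(num_lines, adjacent_modes=35):
    """Intra prediction modes available per reference line."""
    non_adjacent = adjacent_modes // 2   # equal to or less than half
    return [adjacent_modes] + [non_adjacent] * (num_lines - 1)

counts = mode_counts(4)
assert counts[0] == 35
assert counts[1] <= counts[0] // 2
```

Restricting non-adjacent lines to fewer modes bounds the signaling cost and search complexity added by multiple-line intra prediction.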

US Pat. No. 10,432,925

LIGHT FIELD DISPLAY CONTROL METHODS AND APPARATUS, AND LIGHT FIELD DISPLAY DEVICES

BEIJING ZHIGU TECH CO., L...

1. A light field display control method, comprising:determining target pixel density distribution information, wherein determining the target pixel density distribution information comprises:
determining a first region of a light field image;
determining a first display region of a display, wherein determining the first display region in the display includes:
determining light field sub-image information corresponding to the first region in the light field image, and determining the first display region that affects displaying of the light field sub-image information; or
determining the first display region according to relative location information of pixels in the first region with respect to a reference point of the light field image; and
determining the target pixel density distribution information according to the first display region, wherein in the target pixel density distribution information, a target pixel density corresponding to the first display region is different from a target pixel density corresponding to a second display region, and the second display region is a display region of the display other than the first display region;
adjusting display pixel density distribution of a display of a light field display device according to the target pixel density distribution information, so that at least two display regions in the display of the light field display device after the adjustment have different display pixel densities;
performing sampling processing on the light field image according to location information of display pixels of the display of the light field display device after the adjustment; and
displaying, by the light field display device after the adjustment, the light field image undergone the sampling processing.

US Pat. No. 10,432,923

3D DISPLAY SYSTEM

KOREA PHOTONICS TECHNOLOG...

1. A unit light source module of 3D display system, the unit light source module comprising:a light emitting unit including a plurality of point light sources corresponding to a number of viewpoints; and
a light collecting unit configured to output a light outputted from said plurality of point light sources by collecting the light while being spaced apart at a predetermined distance from said light emitting unit,
wherein said plurality of point light sources is arranged for implementing only one of a horizontal parallax and a vertical parallax or both of the horizontal parallax and the vertical parallax based on:
a relationship in size between a width of each point light source and a center distance between adjacent point light sources; and
a direction in which the plurality of point light sources is arranged.

US Pat. No. 10,432,922

MEDICAL DEVICES, SYSTEMS, AND METHODS USING EYE GAZE TRACKING FOR STEREO VIEWER

INTUITIVE SURGICAL OPERAT...

1. An eye tracking system, comprising:an image display comprising a first coordinate frame and configured to display an image of a surgical field comprising a second coordinate frame to a user, wherein the user is in a third coordinate frame, the image display configured to emit a light in a first wavelength range;
a right eye tracker configured to emit light in a second wavelength range and to measure data about a first gaze point of a right eye of the user;
a left eye tracker configured to emit light in the second wavelength range and to measure data about a second gaze point of a left eye of the user;
an optical assembly disposed between the image display and the right and left eyes of the user, the optical assembly configured to direct the light of the first and second wavelength ranges such that the light of the first and second wavelength ranges share at least a portion of a left optical path between the left eye and the image display and share at least a portion of a right optical path between the right eye and the image display, without the right and left eye trackers being visible to the user; and
at least one processor configured to process the data about the first gaze point and the second gaze point to determine a viewing location in the displayed image at which the first gaze point and the second gaze point of the user are directed.

US Pat. No. 10,432,921

AUTOMATED PANNING IN ROBOTIC SURGICAL SYSTEMS BASED ON TOOL TRACKING

Intuitive Surgical Operat...

1. A digital zoom and panning system, the system comprising:
an endoscopic camera device to capture digital video images of a surgical site;
an image buffer coupled to the endoscopic camera device, the image buffer to store one or more frames of the digital video images as source pixels;
a first display device having first pixels to display images;
a first user interface displayed on the first display device to accept a first user input to display a fovea, the first user input including selection of a first source pixel array of source pixels within a first frame of the digital video images with reference to the surgical site and selection of a first target pixel array of target pixels within a subset of the first pixels of the first display device;
a first digital mapping and filtering device coupled between the image buffer and the first display device, the first digital mapping and filtering device to selectively map and filter source pixels in a first region of interest from the image buffer into target pixels in a first destination rectangle for the first display device; and
a tracking system to track a position of at least one object to digitally pan and display the fovea in the first destination rectangle based on one or more tracked positions of the at least one object.
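
The mapping step claimed above, taking a source pixel array from a region of interest and producing target pixels in a destination rectangle, can be sketched as a nearest-neighbour resampler. The function name and the omission of the claimed filtering step are illustrative assumptions, not the patented implementation:

```python
def map_roi(source, roi, dest_w, dest_h):
    """Nearest-neighbour sketch of mapping source pixels in a region of
    interest (x0, y0, w, h) into a dest_w x dest_h destination rectangle.
    A real system would also filter (e.g. average) source pixels; that
    step is omitted here for brevity.
    """
    x0, y0, w, h = roi
    return [
        [source[y0 + (r * h) // dest_h][x0 + (c * w) // dest_w]
         for c in range(dest_w)]
        for r in range(dest_h)
    ]

# A 4x4 source frame downsampled into a 2x2 destination rectangle.
frame = [[r * 4 + c for c in range(4)] for r in range(4)]
fovea = map_roi(frame, (0, 0, 4, 4), 2, 2)
```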

US Pat. No. 10,432,920

IMMERSIVE COMPACT DISPLAY GLASSES

TESSELAND, LLC, Glendale...

1. A display device comprising:
a display, operable to generate a real image comprising a plurality of object pixels; and
an optical system, comprising an array of a plurality of lenslets, arranged to generate an immersive virtual image from the real image, the immersive virtual image comprising a plurality of image pixels, by each lenslet projecting light from the display to a respective pupil range, wherein the lenslets comprise at least two lenslets that cannot be made to coincide by a simple translation rigid motion;
wherein the pupil range comprises an area on the surface of an imaginary sphere of from 21 to 27 mm diameter, the pupil range including a circle subtending 15 degrees whole angle at the center of the sphere;
wherein the object pixels are grouped into clusters, each cluster associated with a lenslet, so that the lenslet produces from the object pixels a partial virtual image comprising image pixels, and the partial virtual images combine to form said immersive virtual image;
wherein imaging light rays falling on said pupil range through a given lenslet come from pixels of the associated cluster, and said imaging light rays falling on said pupil range from object pixels of a given cluster pass through the associated lenslet;
wherein said imaging light rays exiting a given lenslet towards the pupil range and virtually coming from any one image pixel of the immersive virtual image are generated from a single object pixel of the associated cluster.

US Pat. No. 10,432,919

SHUTTERED WAVEGUIDE LIGHT FIELD DISPLAY

1. A light field display device comprising at least one multiplexed light field display module, the multiplexed light field display module comprising a view image generator, a first waveguide, a set of first shutters spatially distributed in a two-dimensional array across the face of the first waveguide, and a second waveguide, the view image generator optically coupled to the second waveguide, the second waveguide optically coupled to one edge of the first waveguide, the first waveguide optically coupled to each first shutter, the view image generator operable to generate a set of beams of light from one of a set of view images, the second waveguide configured to transmit the set of beams from the view image generator into the first waveguide, the first waveguide configured to transmit the set of beams along its length via internal reflection, each first shutter operable to be switched between a closed state and an open state, the closed state of the first shutter configured to prevent the beams from escaping the first waveguide, the closed state of the first shutter configured to allow the beams to propagate along the first waveguide past the first shutter, the open state of the first shutter configured to allow the beams to escape the first waveguide, the module operable to generate, over time, the set of beams from a different one of the set of view images, and to open, over time, a different subset of the set of first shutters, thereby to allow the set of beams escaping from the subset of the set of first shutters to correspond to a different one of the set of view images.

US Pat. No. 10,432,918

THREE-DIMENSIONAL DISPLAY DEVICE AND METHOD FOR THE SAME

BOE TECHNOLOGY GROUP CO.,...

1. A three-dimensional display device, comprising:
a display panel comprising a plurality of pixels arranged as an array, wherein the plurality of pixels comprises left eye pixels and right eye pixels alternately arranged in each row or each column;
a light-splitting device disposed as being parallel to the display panel and configured for projecting lights emitted by the left eye pixels and right eye pixels to different view areas, wherein the light-splitting device is at a side facing a viewer of the display panel, and a distance h is presented between the light-splitting device and the display panel along a light emission direction of the display device;
a distance adjusting device configured to adjust the distance h between the display panel and the light-splitting device along the light emission direction of the display device according to a variation of a distance H between the viewer and the display panel along the light emission direction of the display device,
wherein a proportion of the distance h to the distance H is unchanged; and
wherein the proportion of the distance h to the distance H is identical to a/(a+I), wherein a is a width of each of the plurality of the pixels, and I is an interpupillary distance of the viewer.
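
The claimed relation h/H = a/(a+I) can be checked with a short numeric sketch; the pixel width and interpupillary distance below are illustrative values, not from the patent:

```python
def lens_panel_gap(H_mm: float, pixel_width_mm: float, ipd_mm: float) -> float:
    """Distance h between the light-splitting device and the display panel.

    From the claim: h / H = a / (a + I), so h = H * a / (a + I), where a is
    the pixel width and I is the viewer's interpupillary distance.
    """
    return H_mm * pixel_width_mm / (pixel_width_mm + ipd_mm)

# Illustrative values: 0.1 mm pixels, 65 mm IPD, viewer at 500 mm.
h = lens_panel_gap(500.0, 0.1, 65.0)
```

Because h scales linearly with H, the proportion h/H stays unchanged as the distance adjusting device tracks the viewer, which is exactly what the claim requires.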

US Pat. No. 10,432,917

3D IMAGE DISPLAY DEVICE

1. A 3D image display device, comprising:
a display module comprising a plurality of pixels and an image composed of the pixels, wherein the pixels are arranged in a first direction; and
a first lenticular array comprising a plurality of strip-shaped first lenticular lenses, and an angle between an extension direction of the first lenticular lens and the first direction is larger than or equal to 45 degrees;
wherein the image composed of the pixels is created by the steps of:
(a) providing a capture device, a subject to be captured, and a lenticular array, wherein the lenticular array comprises a plurality of strip-shaped lenticular lenses, a length of a bottom of each lenticular lens is 2L, and a center of the bottom is set as 0;
(b) placing the capture device to aim at a top of one of the lenticular lenses and a point between −xL to xL of a bottom coordinate of the lenticular lens, and capturing the subject until a capturing for a plurality of pixels corresponding to the range from −xL to xL of the bottom coordinate of the lenticular lens is finished, wherein the value of x is smaller than 1 but greater than 0;
(c) mapping pixels corresponding to the range from −xL to 0 and from xL to 0 captured by the capture device to pixels corresponding to a range from −L to −xL and from L to xL of the bottom coordinate; and
(d) repeating the steps (b) to (c) for the others of the lenticular lenses.
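
Step (c) above remaps captured bottom coordinates onto the outer bottom range (−xL → −L, xL → L, 0 → ±xL). Reading the remapping as linear is an assumption; the sketch below only illustrates that interpretation:

```python
def remap(u: float, L: float, x: float) -> float:
    """Map a captured bottom coordinate u in [-x*L, x*L] to the outer
    bottom range per step (c): -x*L -> -L, x*L -> L, and 0 -> +/-x*L,
    assuming the mapping is linear on each half of the range.
    """
    xL = x * L
    if not -xL <= u <= xL:
        raise ValueError("u outside the captured range [-xL, xL]")
    if u >= 0:
        return xL + (u / xL) * (L - xL)
    return -xL + (u / xL) * (L - xL)
```

For example, with L = 1 and x = 0.5, the captured point at 0.25 lands at 0.75, the midpoint of the outer range [0.5, 1].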

US Pat. No. 10,432,916

MEASUREMENT APPARATUS AND OPERATION METHOD OF MEASUREMENT APPARATUS

Olympus Corporation, Tok...

1. A measurement apparatus comprising:
an imaging unit configured to capture an image of an object which is a measurement target and generate a parallax image including two images having a parallax therebetween;
a display control unit configured to cause a display unit to display at least one of the two images, the two images displayed on the display unit being included in a first parallax image which is the parallax image generated when an image of the object is captured at a first timing;
a designation point setting unit configured to set a designation point in an image which is one of the two images included in the first parallax image and is displayed on the display unit;
a point detection unit configured to detect a corresponding point corresponding to the designation point in an image which is the other of the two images included in the first parallax image and differs from the image in which the designation point has been set;
a reliability determination unit configured to determine a measurement reliability on the basis of the designation point and the corresponding point;
a measurement unit configured to perform measurement of the object using the parallax image when the reliability determination unit determines that the measurement reliability is high; and
an imaging condition determination unit configured to determine whether imaging conditions of the imaging unit have changed from imaging conditions at the first timing after the reliability determination unit determines that the measurement reliability is low,
wherein, the imaging unit is configured to capture an image of the object at a second timing and generate a second parallax image which is the parallax image including the two images after the reliability determination unit determines that the measurement reliability is low,
the point detection unit is configured to detect a similar point in one of the two images included in the second parallax image after the imaging condition determination unit determines that the imaging conditions have changed, the similar point being similar to the designation point set in one of the two images included in the first parallax image,
the point detection unit is configured to detect a similar corresponding point corresponding to the similar point in an image which is the other of the two images included in the second parallax image and differs from the image in which the similar point is detected, and
the reliability determination unit is configured to determine the measurement reliability on the basis of the similar point and the similar corresponding point.

US Pat. No. 10,432,915

SYSTEMS, METHODS, AND DEVICES FOR GENERATING THREE-DIMENSIONAL MODELS

The Sanborn Map Company, ...

1. A system for determining a location of a point on a structure comprising:
a first imaging platform, comprising:
a first focal plane having a first sensor capable of detecting:
a first photon reflected off a first point on a structure, along a first path towards the first imaging platform, and onto the first sensor; and
a second photon reflected off the first point, along a second path towards the imaging platform, and onto the first sensor; and
an image processor configured to:
sense the first photon on the first sensor and capture a first image of the first point on the structure when the first imaging platform is at a first vantage point, the first image having a first array of one or more pixels; and
sense the second photon on the first sensor and capture a second image of the first point on the structure when the first imaging platform is at a second vantage point, the second image having a second array of one or more pixels; and
a computing system, comprising a processor
configured to determine a position of the first point on the structure by performing computer software operations comprising:
selecting a first reference system;
identifying a subset of the first array of one or more pixels from the first image as corresponding to the first point;
determining a first vector defining the first path;
determining a second vector defining the second path; and
determining the position of the first point on the structure relative to the first reference system based on the first vector and the second vector.
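
The final determining step, recovering a 3D position from two view vectors taken at two vantage points, can be sketched as the midpoint of closest approach of two rays. This is one standard construction for two-view triangulation, not necessarily the patented method:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(p1, d1, p2, d2):
    """Midpoint of closest approach of rays p1 + t*d1 and p2 + s*d2,
    a minimal stand-in for locating the first point on the structure
    from the first and second vectors.
    """
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; position is not determined")
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * x for p, x in zip(p1, d1)]
    q2 = [p + s * x for p, x in zip(p2, d2)]
    return [(u + v) / 2 for u, v in zip(q1, q2)]

# Two vantage points whose rays meet at (1, 1, 1).
point = triangulate([0, 0, 0], [1, 1, 1], [2, 0, 0], [-1, 1, 1])
```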

US Pat. No. 10,432,914

GRAPHICS PROCESSING SYSTEMS AND GRAPHICS PROCESSORS

Arm Limited, Cambridge (...

1. A method of operating a graphics processing system, the graphics processing system comprising a graphics processing pipeline comprising a primitive generation stage and a pixel processing stage, the method comprising:
processing input data in the primitive generation stage to produce first primitive data associated with a first view of a scene and second primitive data associated with a second view of the scene;
processing the first primitive data in the pixel processing stage to produce first pixel-processed data associated with the first view;
determining, for second pixel-processed data associated with the second view, whether to use the first pixel-processed data as the second pixel-processed data or whether to process the second primitive data in the pixel processing stage to produce the second pixel-processed data, wherein the determining is based on one or more geometric properties of the first primitive data and/or one or more geometric properties of the second primitive data, and wherein the one or more geometric properties include a predetermined parallax property;
identifying that the first primitive data and the second primitive data have the predetermined parallax property in response to determining that one or more offsets between one or more positions of one or more vertices of one or more primitives in the first primitive data and one or more positions of one or more corresponding vertices of one or more corresponding primitives in the second primitive data do not exceed one or more predetermined offset thresholds; and
performing additional processing in the graphics processing pipeline based on the determining.
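
The identifying step above reduces to a per-vertex offset test: if every corresponding vertex pair between the two views stays within the predetermined offset thresholds, the first view's pixel-processed data can be reused for the second view. A minimal sketch, with a single scalar threshold as a simplifying assumption:

```python
def has_parallax_property(verts_a, verts_b, max_offset):
    """True if every corresponding vertex pair is offset by no more than
    max_offset in each axis, i.e. the claimed predetermined parallax
    property holds and the second view may reuse the first view's
    pixel-processed data instead of re-running the pixel stage.
    """
    return all(
        all(abs(pa - pb) <= max_offset for pa, pb in zip(va, vb))
        for va, vb in zip(verts_a, verts_b)
    )

left  = [(10.0, 5.0), (12.0, 5.0), (11.0, 8.0)]
right = [(10.4, 5.0), (12.4, 5.0), (11.4, 8.0)]  # small horizontal shift
assert has_parallax_property(left, right, max_offset=0.5)
assert not has_parallax_property(left, right, max_offset=0.1)
```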

US Pat. No. 10,432,913

SYSTEMS AND METHODS FOR DETERMINING THREE DIMENSIONAL MEASUREMENTS IN TELEMEDICINE APPLICATION

PROXIMIE, INC., Bedford,...

1. A system for measuring a depth or length of an area of interest of a telemedicine patient, comprising:
a first image capturing device that captures a two-dimensional (2D) image or video of a region of interest of a patient;
a second image capturing device that captures a three-dimensional (3D) point cloud of the region of interest of the patient;
a rendering system that processes a unified view of a common field of view (FOV) for both the first and second image capturing devices where the 2D image or video and 3D point cloud are generated and calibration is performed as preprocessing between first and second image capturing devices;
an overlay system that generates and stores a 2D render point map image from the unified view, which includes an overlay of the 2D image and at least one 2D video frame;
a set of fiducial markers in the common FOV of the first and second image capturing devices used as reference points in the 2D render point map image and further is used to produce a set of first parameters; and
a remote measurement processing system that determines a depth or length between two points selected at a computer at a remote location than the region of interest from the 2D image or at least one 2D frame of the region of interest by (1) transforming coordinates of the two points selected at the computer at the remote location to associated coordinates of the 2D render point map image to identify adjusted coordinates of the two points, (2) determining second parameters including 2D to 2D transformation parameters by connecting a 2D image of the fiducial markers in a remote display to a local 2D high resolution camera, and (3) using a combination of the first and second parameters for identifying coordinates of the two points in the 3D point cloud of the region of interest of the patient.

US Pat. No. 10,432,912

TARGET, METHOD, AND SYSTEM FOR CAMERA CALIBRATION

Waymo LLC, Mountain View...

1. A target used for calibration, comprising:
a first pattern of fiducial markers;
a second pattern of fiducial markers,
wherein the first pattern is a scaled version of the second pattern, such that a calibration image captured of the target simulates multiple images of a single pattern captured at multiple calibration perspectives; and
one or more panel-identification fiducial markers that uniquely identify the target used for calibration.

US Pat. No. 10,432,911

METHOD FOR IDENTIFICATION OF CONTAMINATION UPON A LENS OF A STEREOSCOPIC CAMERA

1. A method for identifying contamination upon a lens of a stereoscopic camera, wherein said stereoscopic camera
is arranged such that a capturing area of said stereoscopic camera is predefined such that images from said stereoscopic camera have the same capturing area over time,
is provided with a first camera adapted to cover said capturing area by providing first images of said capturing area, and
is provided with a second camera adapted to cover said capturing area by providing second images of said capturing area, wherein
said first images are divided into at least one evaluation area and said second images are divided into at least one evaluation area, wherein the respective evaluation area of said first and said second images are correspondently located in respective image, wherein said method comprises the steps of:
forming historical image data for said evaluation areas, wherein said historical image data comprises an image parameter representing the respective evaluation area from a predetermined number of previously captured first and second images,
comparing said historical image data for the evaluation area of said first image with historical image data for the evaluation area of said second image, and
indicating that at least one lens of said stereoscopic camera is contaminated, if a deviation larger than a threshold value between the compared historical image data is identified.
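
The method above can be sketched as a rolling comparison of an image parameter per evaluation area. The class name, the choice of mean intensity as the parameter, and the fixed history length are illustrative assumptions:

```python
from collections import deque

class ContaminationDetector:
    """Sketch of the claimed method: keep an image parameter (here, mean
    intensity) for one corresponding evaluation area over the last
    n_frames images from each camera, and flag contamination when the
    two histories deviate by more than a threshold.
    """
    def __init__(self, n_frames=30, threshold=10.0):
        self.left = deque(maxlen=n_frames)   # history for first camera
        self.right = deque(maxlen=n_frames)  # history for second camera
        self.threshold = threshold

    def add_frames(self, left_area, right_area):
        """Record the mean intensity of the evaluation area in each image."""
        self.left.append(sum(left_area) / len(left_area))
        self.right.append(sum(right_area) / len(right_area))

    def contaminated(self):
        """Compare the historical data; a large deviation indicates that
        at least one lens is contaminated."""
        if not self.left:
            return False
        lm = sum(self.left) / len(self.left)
        rm = sum(self.right) / len(self.right)
        return abs(lm - rm) > self.threshold
```

Comparing histories rather than single frames is what makes the check robust: a passing object darkens one evaluation area briefly, while a smudge on one lens darkens it persistently.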

US Pat. No. 10,432,910

METHODS AND APPARATUS FOR CONTROLLING A VIEWING POSITION

NextVR Inc., Newport Bea...

1. A method of presenting content corresponding to an environment, the method comprising:
displaying to a user content corresponding to a first camera position in said environment;
determining if the user is viewing another camera position in said environment; and
monitoring, while said user is viewing said another camera position, to detect user input indicating a user selected switch to said another camera position.

US Pat. No. 10,432,909

DIGITAL SIGNAGE SYSTEM AND DATA PROCESSING METHOD IN THE SAME

LG ELECTRONICS INC., Seo...

1. A digital signage system comprising:
a first display device;
a second display device arranged a first spatial distance from the first display device, wherein the second display device has a display screen that at least partially overlaps a display screen on the first display device; and
a controller configured to:
control the first display device to display single image data at a first time, and
control the second display device to display the single image data at a second time based on the first spatial distance between the first display device and the second display device, wherein at least one of the first display device or the second display device is a transparent display, and
wherein the controller is further configured to:
detect that a third display device, being different from the first and second display devices and having a display screen that at least partially overlaps the display screen on the second display device, has been positioned between the first display device and the second display device,
determine a second spatial distance between the first display device and the third display device, and a third spatial distance between the third display device and the second display device,
control the first display device to display the single image data at the first time,
control the third display device to display the single image data at a third time based on the second spatial distance, and
control the second display device to display the single image data at a fourth time based on the third spatial distance.
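
The claim only requires each display time to be "based on" the spatial distance to the preceding display. One plausible reading, stated here as an assumption, is a delay proportional to that distance:

```python
def display_times(t0: float, distances_mm, delay_per_mm: float = 0.001):
    """Illustrative scheduling for a chain of overlapping displays: each
    subsequent display shows the single image data after a delay
    proportional to its spatial distance from the previous display.
    The proportional-delay rule and delay_per_mm value are assumptions.
    """
    times = [t0]
    for d in distances_mm:
        times.append(times[-1] + d * delay_per_mm)
    return times

# First display at t = 0 s, then 300 mm and 200 mm gaps to the next two.
schedule = display_times(0.0, [300, 200])
```

Inserting a third display between the first two then just splits one distance into two, which is exactly the recomputation the controller performs in the claim.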

US Pat. No. 10,432,908

REAL-TIME MULTIFOCAL DISPLAYS WITH GAZE-CONTINGENT RENDERING AND OPTIMIZATION

Facebook Technologies, LL...

1. A method comprising:
determining a first pixel intensity value for each pixel in at least a portion of a plurality of pixels of a plurality of displays by modifying an initial intensity value for the pixel using at least one numerical iteration applied on one or more correlation values related to at least the portion of the plurality of pixels;
determining a second pixel intensity value for each pixel in at least the portion of the plurality of pixels using the first intensity value and applying one or more numerical iterations;
determining information about a gaze direction for an eye relative to the plurality of displays;
modifying the second pixel intensity value based on the information about the gaze direction; and
determining a pixel intensity value for each pixel in at least the portion of the plurality of pixels to generate content for presentation, based on the modified second pixel intensity value.

US Pat. No. 10,432,907

ELECTRONIC SYSTEM FOR CREATING AN IMAGE AND A METHOD OF CREATING AN IMAGE

CITY UNIVERSITY OF HONG K...

1. A method for creating an image comprising the steps of:
displaying a plurality of two-dimensional representations on a display within a three-dimensional space, wherein the plurality of two-dimensional representations are arranged to each individually represent a unique portion of a three-dimensional object within the three-dimensional space;
moving the display with a robotic mounting structure arranged to move the display such that the plurality of two-dimensional representations are displayed in a plurality of predefined positions within the three-dimensional space;
recording the plurality of two-dimensional representations being displayed within the three-dimensional space with an imager arranged to capture the plurality of two-dimensional representations being displayed within the three-dimensional space; and
combining the plurality of two-dimensional representations recorded in said step of recording in a plurality of predefined positions to form a single image representative of the three-dimensional object within the three-dimensional space.

US Pat. No. 10,432,906

RECORDING MEDIUM, PLAYBACK DEVICE, AND PLAYBACK METHOD

PANASONIC INTELLECTUAL PR...

1. A playback method of a playback device that reads out from a recording medium and plays a plurality of High Dynamic Range (HDR) video streams that is encoded video information and that has a wider luminance range than Standard Dynamic Range (SDR), the playback device including:
a first register storing first information indicating, for each of a plurality of playback formats for the plurality of HDR video streams, whether the playback device corresponds to or not,
a second register storing second information indicating, for each of the plurality of playback formats for the plurality of HDR video streams, whether a display device connected to the playback device corresponds to or not, and
a third register storing third information indicating, for each of the plurality of playback formats for the plurality of HDR video streams, a priority of the playback formats based on a user's preference,
the playback method comprising:
playing the plurality of HDR video streams using a playback format with a highest priority among the priority indicated by the third information, in a case where the first information and the second information indicate that there are a plurality of playback formats corresponding to both the playback device and the display device for the plurality of HDR video streams.

US Pat. No. 10,432,905

METHOD AND APPARATUS FOR OBTAINING HIGH RESOLUTION IMAGE, AND ELECTRONIC DEVICE FOR SAME

GUANGDONG OPPO MOBILE TEL...

1. An image processing method, applied in an electronic device, wherein the electronic device comprises an imaging apparatus comprising an image sensor, the image sensor comprises an array of photosensitive pixel units and an array of filter units arranged on the array of photosensitive pixel units, each filter unit corresponds to one photosensitive pixel unit, and each photosensitive pixel unit comprises a plurality of photosensitive pixels, the image processing method comprises:
outputting a merged image by the image sensor, wherein, the merged image comprises an array of merged pixels, and the photosensitive pixels in a same photosensitive pixel unit are collectively output as one merged pixel;
determining whether there is a target object in the merged image; and
when there is the target object in the merged image, converting the merged image into a restoration image using a second interpolation algorithm, wherein the restoration image comprises restoration pixels arranged in an array, each photosensitive pixel corresponds to one restoration pixel; and converting the restoration image into a merged true-color image.
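
The merged-image step is essentially pixel binning: each photosensitive pixel unit collapses into one merged pixel. A minimal sketch, assuming square units and averaging as the collective output (the claim does not fix the combining rule):

```python
def merge_pixels(raw, unit=2):
    """Collapse each unit x unit block of photosensitive pixel values
    into one merged pixel (here, their average), producing the claimed
    merged image. Averaging is an illustrative choice.
    """
    h, w = len(raw), len(raw[0])
    merged = []
    for r in range(0, h, unit):
        row = []
        for c in range(0, w, unit):
            block = [raw[r + i][c + j]
                     for i in range(unit) for j in range(unit)]
            row.append(sum(block) / len(block))
        merged.append(row)
    return merged

# A 4x4 sensor readout with 2x2 photosensitive pixel units.
raw = [[10, 20, 30, 40],
       [10, 20, 30, 40],
       [50, 60, 70, 80],
       [50, 60, 70, 80]]
merged = merge_pixels(raw)
```

The restoration image of the claim runs the opposite direction, interpolating one restoration pixel back out per photosensitive pixel; that interpolation step is not shown here.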

US Pat. No. 10,432,904

IMAGE PROCESSING DEVICE AND OPERATIONAL METHOD THEREOF

Samsung Electronics Co., ...

1. An image processing device, comprising:
an image sensor module including a lens and an image sensor;
an actuator; and
a processor configured to:
obtain, using the image sensor module, a first image having first color information, the first image corresponding to an external object;
move at least one of the lens and the image sensor based on a designated pixel unit;
obtain, using the image sensor module with the moved at least one of the lens and the image sensor, a second image having second color information, the second image corresponding to the external object;
generate a third image having third color information based on the first color information and the second color information, the third image corresponding to the external object,
wherein the processor is further configured to:
obtain brightness information using the image sensor; and
obtain the second image based on the sensed brightness information.

US Pat. No. 10,432,903

3D LASER PROJECTION, SCANNING AND OBJECT TRACKING

FARO TECHNOLOGIES, INC., ...

1. A laser projection system comprising:
at least one laser projector having a laser source and a laser detector; and
one or more processors for executing non-transitory computer readable instructions, the one or more processors operatively coupled to the laser projector, the non-transitory computer readable instructions comprising:
determining the component is stationary based at least in part on the emitting of a first laser light toward at least one reference target on the component and receiving a portion of the first laser light by the laser detector;
in response to the determining the component is stationary:
determining a transformation for mapping a first coordinate system of the laser projector to a second coordinate system of the component; and
forming a visual pattern on the component based at least in part on the transformation; and
determining that the component is in motion based at least in part on receiving a portion of the first laser light by the laser detector and removing the visual pattern from the component.

US Pat. No. 10,432,902

INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD

SONY CORPORATION, Tokyo ...

1. An information processing device, comprising:
a projecting device;
a capturing device; and
circuitry configured to:
control the projecting device to project an image that includes a first pattern, wherein
the first pattern is inverted in phase in first cycles in a first direction,
the first pattern includes a second pattern repeated in second cycles in a second direction, and
the second direction is orthogonal to the first direction;
control the capturing device to capture the projected image that includes the first pattern; and
acquire, based on the captured image, a relationship between a pixel of the projecting device and a pixel of the capturing device.
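
The projected pattern above repeats in one direction and is phase-inverted in the orthogonal direction. A sketch of one such pattern, where the sine base and the cycle lengths are illustrative assumptions:

```python
import math

def pattern_value(x: int, y: int, phase_cycle: int, repeat_cycle: int) -> float:
    """Value of the claimed first pattern at projector pixel (x, y):
    a second pattern repeating every repeat_cycle pixels along y (here
    a sine, an illustrative choice), with its phase inverted every
    phase_cycle pixels along the orthogonal x direction.
    """
    base = math.sin(2 * math.pi * y / repeat_cycle)
    inverted = (x // phase_cycle) % 2 == 1
    return -base if inverted else base
```

Capturing this known pattern lets the circuitry solve for the projector pixel that illuminated each camera pixel, which is the relationship acquired in the last step of the claim.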

US Pat. No. 10,432,901

CONTENT PROJECTION CONTROL APPARATUS, CONTENT PROJECTION CONTROL METHOD AND PROGRAM

Rakuten, Inc., Tokyo (JP...

1. A content projection control apparatus comprising:
a central processing unit; and
a memory storing a program which causes the central processing unit to:
obtain content data indicating content including a plurality of objects, each object including one or more images or a plurality of characters;
obtain projection area data indicating a projection area;
determine a projection unsuitable area, which is not suitable for projection in the projection area, and a projection suitable area, which is suitable for projection in the projection area, based on the projection area data;
generate a plurality of arrangement units from the content indicated by the content data, each arrangement unit including one object or a plurality of adjacent objects;
determine a reference reduction ratio based on a size of the projection area and a size of the projection suitable area; and
reduce a projection size of each arrangement unit by applying a specific reduction ratio determined based on the reference reduction ratio to each arrangement unit, and changing a layout of the arrangement units by determining a projection position of each arrangement unit based on a position of the projection unsuitable area.
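
The claim says only that the reference reduction ratio is "based on" the size of the projection area and the size of the projection suitable area. One plausible reading, stated as an assumption, scales lengths by the square root of the usable-area fraction so that the reduced arrangement units fit the suitable area:

```python
import math

def reference_reduction_ratio(projection_area: float, suitable_area: float) -> float:
    """One plausible reference reduction ratio (an assumption, not the
    patented formula): scale linear dimensions by the square root of
    the usable-area fraction, so content sized for the full projection
    area fits within the projection suitable area.
    """
    if not 0 < suitable_area <= projection_area:
        raise ValueError("suitable area must be positive and within the projection area")
    return math.sqrt(suitable_area / projection_area)
```

For example, if only a quarter of the projection area is suitable, each arrangement unit's linear projection size is halved before its projection position is chosen.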

US Pat. No. 10,432,899

IMAGE DISPLAY DEVICE

PANASONIC INTELLECTUAL PR...

1. An image display device comprising:
a light source configured to emit laser light;
a screen configured to be two-dimensionally scanned with the laser light to draw an image on the screen;
a scanning unit configured to scan the screen with the laser light;
a drive unit configured to drive the scanning unit so that the laser light moves on the screen along a plurality of scan lines at predetermined intervals; and
an optical system configured to generate a virtual image of the image drawn on the screen,
wherein on the screen, a plurality of lens regions are arranged so as to line up in two directions different from each other, and
rows in one of the two directions of the lens regions are respectively inclined relatively at a predetermined inclination angle, between but not including 0 degrees and 90 degrees, with respect to main scan directions of the laser light to the screen.

US Pat. No. 10,432,898

PROJECTOR HAVING LIGHT SOURCE INCLUDING LASER DIODES

CASIO COMPUTER CO., LTD.,...

1. A projector comprising:
a light source device consisting of a plurality of laser diodes,
wherein each of pencils of light emitted by the light source device has a radiation angle differing along a major axis and a minor axis orthogonal to the major axis to define an elliptic section of the each of the pencils of light, and
wherein the major axis of the elliptic section of the each of the pencils of light is substantially parallel to one another;
a light guide configured to guide the pencils of light emitted by the light source device, wherein the light guide comprises:
a rectangular incident port on which the pencils of light emitted by the light source device are incident,
wherein the rectangular incident port is disposed so that the major axis of the elliptic section of the each of the pencils of light is substantially parallel to the longer sides of the rectangular incident port; and
a rectangular emerging port from which the pencils of light that enter from the rectangular incident port are caused to be emitted; and
a display device comprising a rectangular image forming surface onto which the pencils of light emitted from the rectangular emerging port of the light guide are incident,
wherein the rectangular image forming surface of the display device is disposed so that the major axis of the elliptic section of the each of the pencils of light emitted from the rectangular emerging port of the light guide is substantially parallel to the longer sides of the rectangular image forming surface of the display device.

US Pat. No. 10,432,897

VIDEO IDENTIFICATION AND ANALYTICAL RECOGNITION SYSTEM

1. An analytical recognition system, comprising:a video camera configured to capture video data including a video frame that includes an image of a person;
an antenna configured to capture mobile communication device data including an identifier of a mobile communication device associated with the person;
one or more processors; and
one or more memories storing instructions that, when executed by the one or more processors, cause the one or more processors to function as:
a data analytics module configured to:
correlate the video data and the mobile communication device data, wherein the correlating includes matching the video frame, which includes the image of the person, to the identifier of the mobile communication device associated with the person;
generate a profile of the person based on a result of the correlating, wherein the generated profile includes:
at least one of an indication of when the video frame was captured or an indication of when the mobile communication device data was captured; and
at least one of an indication of a location at which the video data was captured or an indication of a location at which the mobile communication device data was captured;
identify objects associated with the person from at least one of or a combination of the video data and the mobile communication device data; and
track the identified objects using at least one of or a combination of the video data and the mobile communication device data.
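The correlation step claimed above (matching a video frame to a device identifier captured around the same time) can be illustrated with a minimal sketch. The record fields and the 5-second matching window are assumptions for illustration, not part of the claim:

```python
from datetime import datetime, timedelta

def correlate(frames, pings, window=timedelta(seconds=5)):
    """Pair each video frame with device pings captured within `window`,
    producing profile records with capture times and locations."""
    profiles = []
    for frame in frames:
        for ping in pings:
            if abs(frame["time"] - ping["time"]) <= window:
                profiles.append({
                    "device_id": ping["device_id"],
                    "frame_time": frame["time"],
                    "ping_time": ping["time"],
                    "frame_location": frame["location"],
                    "ping_location": ping["location"],
                })
    return profiles

frames = [{"time": datetime(2019, 1, 1, 12, 0, 3), "location": "lobby"}]
pings = [{"time": datetime(2019, 1, 1, 12, 0, 1),
          "device_id": "aa:bb:cc:dd:ee:ff", "location": "lobby"}]
```

Each resulting record carries both capture times and both capture locations, mirroring the "at least one of" profile contents recited in the claim.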

US Pat. No. 10,432,896

SYSTEM AND METHOD FOR ACTIVITY MONITORING USING VIDEO DATA

Placemeter Inc., New Yor...

1. A system for activity monitoring using video data, the system comprising:at least one special purpose processor executing a plurality of video analysis worker processes;
a management subsystem coupled to the at least one special purpose processor that performs specific tasks, wherein the management subsystem comprises:
a feed management subsystem for managing video data from the plurality of video data sources;
a worker management subsystem for managing the video analysis worker processes;
a location management subsystem for managing geographic information related to the plurality of video sources;
a data storage subsystem for managing storage of data, including unprocessed and processed video data; and
a plurality of video data sources of multiple types producing video data of different types, wherein the plurality of video data sources comprises at least one mobile device executing a video sensing application that produces a video stream for processing by the plurality of video analysis worker processes;
wherein the specific tasks include line analysis comprising using video data of people standing in a line to determine how many people wait in line and how long it takes a person to go through the line.
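The claimed line analysis reduces to tracking when each detected person joins and leaves the queue. Assuming the video analysis workers emit per-person join/leave events (a hypothetical representation, not from the patent), the two claimed quantities fall out directly:

```python
def line_stats(events):
    """events: (timestamp_seconds, person_id, 'join' or 'leave') tuples in
    time order, a hypothetical output of the video analysis workers.
    Returns (people currently in line, mean time through the line)."""
    joined, waits, in_line = {}, [], set()
    for t, pid, kind in events:
        if kind == "join":
            joined[pid] = t
            in_line.add(pid)
        else:
            # Time through the line is leave time minus join time.
            waits.append(t - joined.pop(pid))
            in_line.discard(pid)
    mean_wait = sum(waits) / len(waits) if waits else 0.0
    return len(in_line), mean_wait
```

For example, with one person who waited five seconds and two still queued, the function returns a count of 2 and a mean transit time of 5.0 seconds.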

US Pat. No. 10,432,895

COMMUNICATION TERMINAL, IMAGE COMMUNICATION SYSTEM, COMMUNICATION METHOD, AND NON-TRANSITORY RECORDING MEDIUM

RICOH COMPANY, LTD., Tok...

1. A communication terminal, comprising:circuitry configured to
control a display to display a predetermined area image which is a part of a whole image, the whole image being shared with another communication terminal;
receive a request to display a destination setting screen;
receive, in response to a user input based on the destination setting screen, a request to change a destination terminal to be the another communication terminal and a destination terminal identifier that identifies the another communication terminal; and
transmit display change information to be received by the another communication terminal, wherein
the display change information includes an identifier of the communication terminal, the destination terminal identifier and predetermined area information, and
the predetermined area information indicates a predetermined area associated with the predetermined area image.

US Pat. No. 10,432,894

COMMUNICATION SYSTEM, COMMUNICATION METHOD, AND PROGRAM

OPTIM CORPORATION, Saga-...

1. A system for communication, the system being connected with a plurality of cameras through a network, comprising:an input unit that receives input of personal information on a person;
a search unit that searches the plurality of cameras that take images of the person based on the received personal information;
a display unit that displays images taken by the searched cameras;
an image selection unit that selects a desired image from the displayed images taken by the plurality of cameras; and
a communication unit that communicates with the person around a camera corresponding to the selected image through a device attached to the camera.

US Pat. No. 10,432,893

AD HOC ENDPOINT DEVICE ASSOCIATION FOR MULTIMEDIA CONFERENCING

Google LLC, Mountain Vie...

1. A computer-implemented method comprising:preparing, in a multimedia conference between a first, a second and a third participant device, a set of video streams for the third participant device, wherein the set of video streams is based on a first video stream associated with the first participant device and a second video stream associated with the second participant device;
determining that the first participant device and the second participant device are located within a geographical area based on the first participant device emitting an audio signal that is detected by the second participant device;
designating, in response to determining that the first participant device and the second participant device are located within the geographical area, the first participant device and the second participant device to operate as an ad hoc endpoint in the multimedia conference, such that the first participant device and the second participant device share at least one resource in the multimedia conference;
determining that a display area of the first participant device is larger than a display area of the third participant device; and
excluding, based on designating the first participant device and the second participant device to operate as the ad hoc endpoint and based on the display area of the first participant device being larger than the display area of the third participant device, the second video stream from the set of video streams for the third participant device,
wherein the first video stream includes video representations of a first user of the first participant device and a second user of the second participant device, and a video representation of the second user via the second video stream is excluded for the third participant device.

US Pat. No. 10,432,892

SYSTEMS AND METHODS FOR INTEGRATING AND CONDUCTING VIDEO SESSIONS

United Services Automobil...

19. At least one non-transitory computer-readable medium comprising a set of instructions that, when executed by one or more processors, cause a machine to perform the operations of:engaging, via a communications network, in an interaction with a user via a channel on a device associated with the user, wherein the device is a wearable;
determining whether the interaction is eligible for a video session with a representative;
actively monitoring activities of the user via the wearable to determine a potential need for the user to engage with the representative via the video session;
in response to the activities indicating a potential need to engage in the video session with the representative, sending, to the device, a link that routes the user directly to a uniquely skilled representative;
after the user selects the link requesting the video session, routing the video session to the uniquely skilled representative, the uniquely skilled representative determined based at least in part on the interaction and a location of the device; and
transferring the video session from the device associated with the user to a second device associated with the user in response to the user touching the device with the second device.

US Pat. No. 10,432,891

VEHICLE HEAD-UP DISPLAY SYSTEM

MAGNA ELECTRONICS INC., ...

1. A display system for a vehicle, said display system comprising:a head-up display unit disposed in a vehicle and operable to display information at a display area that is viewable by a driver of the vehicle when the driver is normally operating the vehicle;
wherein, when not displaying information at the display area but when said head-up display unit projects some light, there is a postcard effect at the display area;
wherein said head-up display unit comprises a display screen and projects light through said display screen for displaying information at the display area, and wherein said head-up display unit comprises a compensation film that attenuates light passing through said display screen to reduce the postcard effect at the display area of the vehicle; and
wherein, when said head-up display unit is deactivated and has a glow that is visible at the windshield, said compensation film diffuses edges of the display area to reduce sharp lines between dark grey, where the display area is located, and black, at areas surrounding the display area, so that the glow is not readily visible to and discernible by a person viewing the display area.

US Pat. No. 10,432,890

AUDIO-VISUAL SYSTEM AND METHOD FOR CONTROLLING THE SAME

SAMSUNG ELECTRONICS CO., ...

1. An audio-visual system comprising:a housing comprising an upper end portion having an opening and a storage space inside the housing;
a display configured to be stored in the storage space inside the housing and to be moved into and out of the housing through the opening of the upper end portion of the housing, the display having a display area for displaying video contents;
an actuator configured to move the display into and out of the housing through the opening of the upper end portion of the housing;
a speaker; and
a processor configured to:
output an audio content through the speaker while an entirety of the display area of the display is disposed inside the housing,
control the actuator to move the display out of the housing through the opening of the upper end portion of the housing such that the entirety of the display area of the display is disposed outside the housing, and
output a video content on the display area of the display while the entirety of the display area of the display is disposed outside the housing.

US Pat. No. 10,432,889

AUDIO-VISUAL SYSTEM AND METHOD FOR CONTROLLING THE SAME

SAMSUNG ELECTRONICS CO., ...

1. An audio-visual device comprising:a speaker;
a housing comprising an upper end portion having an opening and a storage space inside the housing;
a stand configured to support the housing such that the housing is spaced apart from a floor surface;
a display configured to be stored in the storage space inside the housing and to be moved through the opening of the upper end portion of the housing, the display having a display area for displaying contents;
an actuator configured to move the display into and out of the housing through the opening of the upper end portion of the housing; and
a processor configured to:
control the actuator to move the display out of the housing through the opening of the upper end portion of the housing such that a portion of the display area of the display is disposed outside the housing,
while the portion of the display area of the display is disposed outside the housing through the opening of the upper end portion of the housing, control the speaker to play back an audio content, and control the display to display information relating to the audio content by operating the portion of the display area of the display,
control the actuator to move the display out of the housing through the opening of the upper end portion of the housing such that an entirety of the display area of the display is disposed outside the housing, and
while the entirety of the display area of the display is disposed outside the housing, control the display to display a video content by operating the entirety of the display area of the display.

US Pat. No. 10,432,888

AUGMENTED REALITY DEVICE FOR LEVERAGING HIGH-ACCURACY GNSS DATA

Trimble Inc., Sunnyvale,...

1. A method for displaying images using an augmented reality (AR) device, the method comprising:receiving, from a global navigation satellite systems (GNSS) receiver, GNSS position data based on wireless signals received from a GNSS satellite;
determining, based on the GNSS position data, a first GNSS point within a geospatial frame at a first GNSS time within a first time range and a second GNSS point within the geospatial frame at a second GNSS time within a second time range, the first GNSS point and the second GNSS point forming a GNSS vector;
determining, based on camera position and orientation (POS) data generated by one or more cameras, a first AR point within an AR frame at a first AR time within the first time range and a second AR point within the AR frame at a second AR time within the second time range, the first AR point and the second AR point forming an AR vector;
shifting one of the geospatial frame and the AR frame such that the second GNSS point is aligned with the second AR point;
calculating an angle formed by the GNSS vector and the AR vector; and
rotating one of the geospatial frame and the AR frame to the other of the geospatial frame and the AR frame by the angle.
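The claimed alignment is a two-epoch translate-then-rotate: shift one frame so the second points coincide, then rotate by the angle between the GNSS vector and the AR vector. A 2-D sketch (the claim does not limit dimensionality; 2-D keeps the angle a scalar):

```python
import numpy as np

def align_frames(gnss1, gnss2, ar1, ar2):
    """Return the translation and rotation angle that map the AR frame
    onto the geospatial frame from two point pairs (2-D sketch)."""
    gnss1, gnss2 = np.asarray(gnss1, float), np.asarray(gnss2, float)
    ar1, ar2 = np.asarray(ar1, float), np.asarray(ar2, float)
    # Shift the AR frame so the second AR point lands on the second GNSS point.
    shift = gnss2 - ar2
    v_gnss = gnss1 - gnss2              # GNSS vector
    v_ar = (ar1 + shift) - gnss2        # AR vector after the shift
    # Angle to rotate the AR frame, about the shared second point,
    # so the AR vector aligns with the GNSS vector.
    angle = np.arctan2(v_gnss[1], v_gnss[0]) - np.arctan2(v_ar[1], v_ar[0])
    return shift, angle
```

Applying `shift` and then rotating the AR frame by `angle` about the shared second point brings the AR vector onto the GNSS vector, which is the geometric content of the final two claim steps.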

US Pat. No. 10,432,887

DISPLAY APPARATUS AND OPERATING CHANNEL SETTING METHOD FOR DISPLAY APPARATUS

SAMSUNG ELECTRONICS CO., ...

1. A display apparatus comprising:a communication interface configured to be wirelessly connected to at least one of a first access point and a portable device; and
a processor configured to receive device information of the portable device, which comprises first information about a wireless connection between the portable device and the first access point and second information about whether the portable device supports a real simultaneous dual band (RSDB) connection, from the portable device, and
determine a peer-to-peer (P2P) operating channel between the display apparatus and the portable device, based on device information of the display apparatus which comprises third information about a wireless connection between the display apparatus and the first access point and fourth information about whether the display apparatus supports a RSDB connection, and the received device information of the portable device,
wherein, the processor is further configured to:
determine the P2P operating channel by using the first information and the third information,
control the display apparatus to connect to the portable device through the determined P2P operating channel,
identify whether the display apparatus and the portable device support the RSDB connection based on the second information and the fourth information, and
determine the peer-to-peer (P2P) operating channel between the display apparatus and the portable device further based on identifying whether the display apparatus and the portable device support the RSDB connection.

US Pat. No. 10,432,886

DISPLAY SYSTEM, DISPLAY APPARATUS, AND CONTROLLING METHOD THEREOF

SAMSUNG ELECTRONICS CO., ...

1. A display system comprising:a display apparatus;
an image providing apparatus configured to provide an image to the display apparatus; and
a remote control device configured to transmit control signals for controlling the image providing apparatus,
wherein the display apparatus is configured to:
control the remote control device to transmit a first control signal included in a first control code set to the image providing apparatus,
based on the image provided by the image providing apparatus being changed in response to sensing of the first control signal, acquire apparatus information, which identifies the image providing apparatus, based on the changed image and the first control signal, and
based on the image not being changed in response to sensing of the first control signal, control the remote control device to transmit a second control signal included in a second control code set to the image providing apparatus,
wherein at least one of a manufacturer or a model corresponding to the second control code set is different from at least one of a manufacturer or a model corresponding to the first control code set.

US Pat. No. 10,432,884

IMAGING ELEMENT, IMAGING METHOD AND ELECTRONIC APPARATUS

Sony Corporation, Tokyo ...

1. An imaging device, comprising:a plurality of pixels including a first pixel and a second pixel;
a first signal line coupled to the first pixel;
a second signal line coupled to the second pixel;
a comparator coupled to the first and the second signal lines, the comparator including:
a first differential pair unit including:
a first differential transistor coupled to the first signal line;
a second differential transistor coupled to a reference signal generation circuit;
a first select transistor coupled to the first differential transistor; and
a second select transistor coupled to the second differential transistor; and
a second differential pair unit including:
a third differential transistor coupled to the second signal line;
a fourth differential transistor coupled to the reference signal generation circuit;
a third select transistor coupled to the third differential transistor; and
a fourth select transistor coupled to the fourth differential transistor.

US Pat. No. 10,432,883

BACKSIDE ILLUMINATED GLOBAL SHUTTER PIXELS

SEMICONDUCTOR COMPONENTS ...

1. An image sensor comprising:a substrate having a front surface and a back surface;
first and second photodiodes formed in the substrate;
a charge storage region formed in the substrate between the first and second photodiodes;
backside deep trench isolation having first and second portions between the first and second photodiodes;
a trench in between the first and second portions of backside deep trench isolation that extends from the back surface towards the front surface;
an opaque layer formed in the trench; and
an oxide layer formed over the opaque layer, wherein the oxide layer is formed in the trench.

US Pat. No. 10,432,882

IMAGING DEVICE AND ENDOSCOPE SYSTEM

OLYMPUS CORPORATION, Tok...

1. An imaging device, comprising:a plurality of pixels configured to output a pixel current according to incident light;
a reference current generation circuit configured to generate a reference current;
a differential current generation circuit to which the pixel current and the reference current are input, and configured to generate a differential current according to a difference between the pixel current and the reference current;
a reference voltage generation circuit configured to generate a first reference voltage and a second reference voltage;
a conversion circuit to which the differential current and the first reference voltage are input, and configured to convert the differential current into an output voltage on the basis of the first reference voltage; and
an output circuit to which the output voltage and the second reference voltage are input, and configured to output the output voltage and the second reference voltage,
wherein the second reference voltage is higher than the first reference voltage when the output voltage at the time of resetting of the plurality of pixels is higher than the output voltage at the time of exposure of the plurality of pixels, and
the second reference voltage is lower than the first reference voltage when the output voltage at the time of resetting of the plurality of pixels is lower than the output voltage at the time of exposure of the plurality of pixels.

US Pat. No. 10,432,881

INFRARED IMAGING DEVICE AND METHOD OF UPDATING FIXED PATTERN NOISE DATA

FUJIFILM Corporation, To...

1. An infrared imaging device comprising:an infrared detector including a plurality of detector elements that detect incident infrared rays;
a noise correction processing unit that subtracts fixed pattern noise data from a detection signal of the infrared rays detected by the plurality of detector elements, to thereby remove fixed pattern noise from the infrared detection signal; and
a noise data update processing unit including a signal component amount calculation unit that calculates an amount of a signal component dependent on the infrared rays incident on the infrared detector included in the infrared detection signal, on the basis of a plurality of infrared detection signals obtained by detecting multiple times of infrared rays by the infrared detector, a fixed pattern noise calculation unit that calculates an amount of a fixed pattern noise component on the basis of the infrared detection signal and the calculated amount of a signal component, and a data update unit that updates the fixed pattern noise data with the calculated amount of a fixed pattern noise component,
wherein the signal component amount calculation unit calculates dispersion or standard deviation of the plurality of the infrared detection signals detected by each detector element to be processed, and calculates the amount of a signal component on the basis of the calculated dispersion or standard deviation.
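The claim's key observation is that, over repeated detections, the fixed pattern is constant while the noise of the incident signal is not, so per-element temporal dispersion isolates the signal-dependent component. A minimal simulation of that separation (the Poisson-like noise model, array size, and all numeric levels are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
fpn_true = rng.normal(0.0, 4.0, size=(8, 8))   # per-element fixed offset
signal_true = 100.0                            # incident-IR level (arbitrary units)

# Repeated detections: the incident component fluctuates shot-to-shot
# (Poisson-like, variance equal to the signal level); the fixed pattern does not.
frames = np.stack([
    signal_true + rng.normal(0.0, np.sqrt(signal_true), (8, 8)) + fpn_true
    for _ in range(2000)
])

# Signal-component amount from the temporal variance of each detector element.
signal_amount = frames.var(axis=0)
# Fixed-pattern component: temporal mean minus the estimated signal amount.
fpn_est = frames.mean(axis=0) - signal_amount
```

With enough detections, `fpn_est` tracks `fpn_true` per element, which is the quantity the data update unit would write back into the correction table.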

US Pat. No. 10,432,879

DUAL CONVERSION GAIN HIGH DYNAMIC RANGE IMAGE SENSOR READOUT CIRCUIT MEMORY STORAGE STRUCTURE

OmniVision Technologies, ...

10. An image sensing system, comprising:a pixel array including a plurality of dual conversion gain (DCG) pixels arranged into a plurality of rows and a plurality of columns;
control circuitry coupled to the pixel array to control operation of the pixel array;
a plurality of readout circuits coupled to the pixel array to read out pixel data from the pixel array, wherein the pixel data includes low conversion gain (LCG) pixel data and high conversion gain (HCG) data, wherein each readout circuit includes:
a ramp generator coupled to output a ramp signal;
a comparator, wherein a first input of the comparator is coupled to receive the ramp signal from the ramp generator, wherein a second input of the comparator is coupled to a respective one of a plurality of column bitline outputs of the pixel array to receive an output signal from one of the plurality of DCG pixels, wherein the output signal is one of an LCG signal or an HCG signal;
a counter coupled to receive an output of the comparator;
a first memory circuit and a second memory circuit coupled to receive an output of the counter, wherein the counter is coupled to write to only one of the first and second memory circuits at a time in response to a memory write select signal;
a first multiplexor, wherein a first input of the first multiplexor is coupled to receive an initial value, wherein a second input of the first multiplexor is coupled to receive an initial memory value from the first memory circuit, wherein the counter is coupled to load either the initial value or the initial memory value from an output of the first multiplexor in response to an initial select signal;
a second multiplexor, wherein a first input of the second multiplexor is coupled to the first memory circuit, wherein a second input of the second multiplexor is coupled to the second memory circuit, wherein the second multiplexor is coupled to load either an LCG memory value from the first memory circuit or an HCG memory value from the second memory circuit from an output of the second multiplexor in response to a memory read select signal; and
a data transmitter circuit coupled to the output of the second multiplexor to transmit pixel data of the pixel array; and
a digital processor coupled to the readout circuits to receive the pixel data from the pixel array.

US Pat. No. 10,432,877

IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD AND PROGRAM STORAGE MEDIUM FOR PROTECTING PRIVACY

NEC CORPORATION, Tokyo (...

1. An imaging processing system comprising:a memory storing computer program instructions; and
at least one processor configured to execute the computer program instructions to:
generate a foreground image by extracting, from a first image frame, a difference area in which a difference from a background image is not less than a certain threshold;
extract an edge portion from the generated foreground image; and
generate a second image frame to be output by superimposing the edge portion and the background image,
wherein transparency of the second image frame is dependent on an intensity of the edge portion.
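The claimed pipeline (threshold the difference from a background image, take edges of the resulting foreground, and blend the edges back over the background with edge-intensity-dependent transparency) can be sketched for a grayscale frame. The gradient-magnitude edge filter and the threshold value are illustrative stand-ins, not the patented choices:

```python
import numpy as np

def privacy_frame(frame, background, threshold=25):
    """Output a frame showing only the edges of areas that differ from the
    background, blended so transparency tracks edge intensity (uint8 grayscale)."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    foreground = np.where(diff >= threshold, frame, 0)      # difference area
    gy, gx = np.gradient(foreground.astype(float))          # stand-in edge filter
    edges = np.clip(np.hypot(gx, gy), 0, 255)
    alpha = edges / 255.0            # stronger edge -> less transparent overlay
    out = (1.0 - alpha) * background + alpha * 255.0
    return out.astype(np.uint8)
```

The interior of a moving region is blanked back to the background; only its outline survives, so a person's silhouette is visible while appearance details are suppressed.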

US Pat. No. 10,432,876

IMAGING APPARATUS CAPABLE OF SWITCHING DISPLAY METHODS

MAXELL, LTD., Kyoto (JP)...

1. An imaging apparatus comprising:an imager configured to perform image pickup and obtain a pickup image;
an image display configured to display the pickup image;
a touch panel configured to enable a user to select a subject included in the pickup image displayed on the image display;
a controller configured to control the image display to display the pickup image or a superimposed image including the pickup image and a cutout image which is cut out and generated for displaying a part of an area of the pickup image with magnification;
an operation input interface configured to switch an image displayed on the image display from the pickup image to the superimposed image in accordance with an operation of a user; and
a recording button configured to operate a start and an end of a recording of the pickup image obtained by the imager when a moving picture is recorded,
wherein the controller is further configured to:
when the subject included in the pickup image displayed on the image display is selected via the touch panel, recognize and chase the selected subject in the pickup image,
in a case that a position of the selected subject that is recognized and chased in the pickup image is changed, automatically change the area which is cut out as the cutout image from the pickup image in accordance with the changed position of the subject selected in the pickup image, and
when the superimposed image including the pickup image and the cutout image including the selected subject is displayed on the image display in accordance with the operation via the operation input interface and an operation of the start of the recording for recording the moving picture by the recording button is detected, switch from displaying the superimposed image to displaying the pickup image on the image display.

US Pat. No. 10,432,875

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGE SENSOR

SONY CORPORATION, Tokyo ...

1. An image processing apparatus, comprising:a central processing unit (CPU) configured to:
integrate a plurality of pixel values of a first exposure image for each row of a plurality of rows of the first exposure image;
calculate a first row integration value for each row of the plurality of rows of the first exposure image based on the integration of the plurality of pixel values of the first exposure image, wherein the first exposure image is captured for a first exposure time by rolling shutter;
integrate a plurality of pixel values of a second exposure image for each row of a plurality of rows of the second exposure image, wherein the second exposure image is captured for a second exposure time by the rolling shutter;
calculate a second row integration value for each row of the plurality of rows of the second exposure image based on the integration of the plurality of pixel values of the second exposure image, wherein
the second exposure image includes a subject and an angle of view common to the first exposure image, and
the second exposure time includes a time period that overlaps with the first exposure time;
calculate a row integration ratio based on a division of the first row integration value by the second row integration value, wherein the row integration ratio is calculated for each row of the plurality of rows of the first exposure image, and for each row of the plurality of rows of the second exposure image;
generate a reference value for each row of the plurality of rows of the first exposure image and for each row of the plurality of rows of the second exposure image;
multiply the row integration ratio by the reference value for each row of the plurality of rows of the first exposure image, and for each row of the plurality of rows of the second exposure image;
integrate a plurality of multiplication results of the multiplication corresponding to the plurality of rows of the first exposure image and the plurality of rows of the second exposure image;
calculate a sum of the plurality of multiplication results based on the integration of the plurality of multiplication results;
estimate a flicker parameter of a flicker in the first exposure image based on the sum; and
correct the flicker in the first exposure image based on the estimated flicker parameter.
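In outline, the claimed estimation divides per-row sums of a flicker-affected exposure by those of a flicker-free exposure, projects the ratio onto per-row reference values, and corrects with the recovered parameters. A synthetic sketch with sinusoidal references (the 120-row flicker period, 0.2 amplitude, and flat scene are made-up values):

```python
import numpy as np

rows = np.arange(480)
flicker = 1.0 + 0.2 * np.sin(2 * np.pi * rows / 120.0)  # assumed row-wise banding

# Per-row pixel sums of two rolling-shutter exposures of the same scene:
# the first carries the flicker; the second, longer exposure averages it out.
scene = 1000.0
row_sum_first = scene * flicker
row_sum_second = np.full(rows.shape, scene)

ratio = row_sum_first / row_sum_second                  # row integration ratio

# Reference values: one sinusoid sample per row at the assumed flicker frequency.
ref_sin = np.sin(2 * np.pi * rows / 120.0)
ref_cos = np.cos(2 * np.pi * rows / 120.0)
# Multiply the ratio by the references and sum (here, mean) to get parameters.
a = 2.0 * np.mean((ratio - 1.0) * ref_sin)
b = 2.0 * np.mean((ratio - 1.0) * ref_cos)
amplitude = np.hypot(a, b)
corrected = row_sum_first / (1.0 + a * ref_sin + b * ref_cos)
```

Dividing by the reconstructed modulation flattens the row sums back to the scene level, which corresponds to the final correction step of the claim.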

US Pat. No. 10,432,873

IMAGING APPARATUS AND IMAGING METHOD FOR CONCURRENTLY ACQUIRING STILL IMAGES AND VIDEO

OLYMPUS CORPORATION, Tok...

1. An imaging apparatus comprising:an imager configured to generate an image using a set exposure time; and
at least one circuit configured to:
detect a specific photography situation based on an output from a sensor, wherein the sensor measures an orientation of the imager,
start a sequential shooting movie photography mode when the specific photography situation is detected, the sequential shooting movie photography mode including:
setting an exposure time for acquiring a movie frame based on a subject luminance of an entirety of a field angle and setting a first exposure time for acquiring a still image frame based on a subject luminance of a subject of interest at predetermined timings;
setting a difference between the exposure time for acquiring a movie frame and the first exposure time as a second exposure time for acquiring a complementary frame;
generating a still image, based on the still image frame generated by photography with the first exposure time; and
generating the movie frame constituting a movie, by composing the still image frame and the complementary frame generated by photography with the second exposure time;
determine whether the still image generated in the sequential shooting movie photography mode includes the subject of interest;
determine, when the subject of interest is detected, whether the still image includes a composition for photographing that is suitable for the subject of interest and suitable for playback or appreciation;
select the still image determined as including the composition for photographing that is suitable for the subject of interest and suitable for playback or appreciation;
determine whether the selected still image is suitable for a time lapse movie based on whether at least one of a characteristic of the still image, a position of the characteristic on the still image, the output from the sensor, and a pose or expression of the subject of interest if the subject of interest is a person, has changed; and
generate a time lapse movie frame constituting a time lapse movie from the still image that is determined to be suitable.

US Pat. No. 10,432,872

MOBILE DEVICE WITH DISPLAY OVERLAID WITH AT LEAST A LIGHT SENSOR

ESSENTIAL PRODUCTS, INC.,...

1. A computer-implemented method of operating a mobile device having a light sensor underneath a display, comprising:displaying a user interface on at least a portion of the display over the light sensor, the display comprising a plurality of pixels, wherein opaqueness of each pixel in the plurality of pixels is individually adjustable;
identifying, by a processor, a command to capture an image with the light sensor;
responsive to the command, adjusting, by the processor, opaqueness of at least a target portion of an opaqueness adjustable region of the display directly over the light sensor, wherein the opaqueness of each pixel in the opaqueness adjustable region is individually adjustable, creating a visual effect comprising a grating or a lens flare;
based on the visual effect, determining, by the processor, a gesture from the image; and
further responsive to the command, the processor capturing the image with the light sensor while the target portion of the opaqueness adjustable region is at least partially transparent.

US Pat. No. 10,432,871

SYSTEMS AND METHODS FOR IMAGING USING A RANDOM LASER

Yale University, New Hav...

1. An active interrogation imaging system, the imaging system comprising:
a complex laser having a mutual coherence of less than one tenth and a photon degeneracy of greater than 10² that produces a plurality of independent lasing modes with uncorrelated phase relationships and distinct spatial output patterns in response to being pumped; and
one or more detectors that detect an image of an object in response to interrogation of the object by the plurality of independent lasing modes with distinct spatial output patterns,
wherein the plurality of independent lasing modes with distinct spatial output patterns of the complex laser in response to being pumped has a controlled degree of spatial coherence and the image detected by the one or more detectors in response to the controlled degree of spatial coherence is free of speckle,
wherein the complex laser is adapted to enable (i) adjusting a mean free path by adjusting a refractive index of at least one of (1) a background material in an excitation medium or (2) scattering elements in an excitation medium or (ii) adjusting of a shape of the cavity to adjust a degree of cavity chaoticity.

US Pat. No. 10,432,870

ZOOM LENS AND IMAGING APPARATUS

Tamron Co., Ltd., Saitam...

1. A zoom lens, comprising, in order from an object side,
a first lens group having positive refractive power,
a second lens group having negative refractive power,
a composite positive lens group having positive refractive power as a whole, and
a composite negative lens group having negative refractive power as a whole, wherein
the composite positive lens group comprises, at a most object side, a third lens group having positive refractive power, and comprises one or more lens groups having positive refractive power,
the composite negative lens group comprises a negative A lens group having negative refractive power and being arranged at a most object side, and comprises a negative B lens group having negative refractive power,
changing focal length is performed by varying distances between the lens groups, and
the following conditional expressions are satisfied:
3.44 0.10 where
βrt: composite lateral magnification of the composite negative lens group at a telephoto end,
f3: focal length of the third lens group,
fw: focal length of the zoom lens at a wide angle end, and
ft: focal length of the zoom lens at the telephoto end.

US Pat. No. 10,432,869

METHOD OF UTILIZING WIDE-ANGLE IMAGE CAPTURING ELEMENT AND LONG-FOCUS IMAGE CAPTURING ELEMENT FOR ACHIEVING CLEAR AND PRECISE OPTICAL ZOOMING MECHANISM

MULTIMEDIA IMAGE SOLUTION...

1. A method of utilizing a wide-angle image capturing element and a long-focus image capturing element to achieve clear and precise optical zooming when controlling an electronic device to simultaneously and respectively capture a wide-angle image and a long-focus image of a same spot, comprising the steps of:
reading the wide-angle image and the long-focus image, and performing exposure and white balance adjustment processes to the wide-angle image and the long-focus image respectively, so as to enable the exposure and the white balance of the wide-angle image and the long-focus image to be consistent;
performing an image matching algorithm to the wide-angle image and the long-focus image respectively, so as to enable each pixel on the long-focus image to match with each corresponding pixel on the wide-angle image; wherein the image matching algorithm calculates and obtains an image ratio between the long-focus image and the wide-angle image according to hardware parameters of the long-focus image capturing element and the wide-angle image capturing element, retrieves a region from the wide-angle image that corresponds to the long-focus image according to the image ratio, and then calculates and obtains a dense optical flow field of an offset of each pixel on the long-focus image with respect to each corresponding pixel on the wide-angle image by using an optical flow estimation; and
performing an image morphing fusion process to the corresponding pixels on the long-focus image and the wide-angle image, so as to generate a new wide-angle image; wherein the image morphing fusion process performs an offset deformation to the pixels on the long-focus image T(x+u, y+v) according to the dense optical flow fields of the offsets (u, v) of the pixels on the long-focus images T(x+u, y+v), so as to enable the pixels on the long-focus image T(x+u, y+v) to be deformed to match with the corresponding pixels on the wide-angle image W(x, y), and then perform a fusion process to the long-focus image T(x+u, y+v) and the wide-angle image W(x, y), in accordance with the following formula, so as to generate the new wide-angle image WNEW:
WNEW=(1−α)T(x+u,y+v)+α*W(x,y), wherein α is a weight factor, and wherein, when pixels on a deformed long-focus image T(x+u, y+v) are clearer than the corresponding pixels on the wide-angle image W(x, y), taking α=0, otherwise, taking 0
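The fusion formula is a per-pixel weighted average: weight 0 passes the (sharper) warped long-focus pixel through unchanged, weight 1 keeps the wide-angle pixel. A minimal sketch (the nested-list image layout and names are ours, not the patent's):

```python
def fuse(tele_warped, wide, alpha):
    # WNEW = (1 - alpha) * T(x+u, y+v) + alpha * W(x, y), per pixel.
    # alpha = 0 keeps the warped long-focus pixel; alpha = 1 keeps
    # the wide-angle pixel.
    return [[(1 - a) * t + a * w
             for t, w, a in zip(t_row, w_row, a_row)]
            for t_row, w_row, a_row in zip(tele_warped, wide, alpha)]
```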

US Pat. No. 10,432,868

REMOVING AERIAL CAMERA DRONES FROM A PRIMARY CAMERA'S FIELD OF VIEW

International Business Ma...

1. A camera comprising:
a sensor for detecting a mobile airborne device within a field of view of the camera, wherein the camera is a primary camera, wherein a three-dimensional physical space is within the field of view of the primary camera;
a signal generator, wherein the signal generator generates a signal that, when received by the mobile airborne device, causes the mobile airborne device to exit the three-dimensional physical space;
a transmitter, wherein the transmitter transmits the signal to the mobile airborne device to cause the mobile airborne device to exit the three-dimensional physical space that is within the field of view of the primary camera;
a location device, wherein the location device detects that the mobile airborne device is at a distance from the primary camera such that an appearance of the mobile airborne device within the field of view of the primary camera has been predetermined to be smaller than a predetermined size; and
one or more processors, wherein the one or more processors determine that the appearance of the mobile airborne device within the field of view of the primary camera is smaller than the predetermined size, and in response to determining that the appearance of the mobile airborne device within the field of view of the primary camera is smaller than the predetermined size, override the signal to the mobile airborne device such that the mobile airborne device is no longer directed to exit the three-dimensional physical space.
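The claim does not say how the drone's apparent size is computed; one plausible reading is a pinhole-projection estimate from distance, sketched below (model, names, and numbers are our assumptions):

```python
def apparent_size(distance_m, focal_length_m, device_size_m):
    # Pinhole projection: image-plane size ~ f * S / d.
    return focal_length_m * device_size_m / distance_m

def should_override(distance_m, focal_length_m, device_size_m, threshold_m):
    # Override the exit signal once the drone's image falls below the
    # predetermined size, i.e. it is far enough away to be negligible.
    return apparent_size(distance_m, focal_length_m, device_size_m) < threshold_m
```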

US Pat. No. 10,432,866

CONTROLLING A LINE OF SIGHT ANGLE OF AN IMAGING PLATFORM

Planet Labs, Inc., Menlo...

1. A computer-implemented method of controlling an imaging platform, the method comprising:
determining, by one or more computing devices, a motion profile for a dual-axis steering mirror associated with an imaging platform;
determining, by the one or more computing devices, position information indicative of an orientation of the imaging platform at one or more points along a path on which the imaging platform travels;
determining, by the one or more computing devices, a plurality of integration time periods based at least in part on the motion profile;
capturing, by the one or more computing devices, a sequence of image frames of at least a portion of a region of interest during at least a subset of the plurality of integration time periods as the imaging platform travels along the path;
identifying, by the one or more computing devices, blur in at least one of the captured image frames; and
controlling, by the one or more computing devices, the motion of the steering mirror based at least in part on the motion profile, the position information, and the identified blur, wherein controlling the motion of the steering mirror comprises controlling the steering mirror to rotate about a first axis and a second axis.

US Pat. No. 10,432,865

DIGITAL PHOTOGRAPHING APPARATUS AND CONTROL METHOD

Samsung Electro-Mechanics...

1. A digital photographing apparatus comprising:
a position sensor configured to detect current position information of a lens unit for a driving amount; and
an optical driving processor configured to:
calculate a movement position variation of the lens unit based on the position information of the lens unit,
compare the movement position variation with a reference movement position variation among reference movement position variations, and
cause the movement position variation to converge on the reference movement position variation by adjusting the driving amount,
wherein the reference movement position variations are constants stored in the optical driving processor or in a separate memory, and
wherein the optical driving processor is configured to apply different bit resolutions to sections of the movement position variation based on differences to the reference movement position variation.
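The convergence step the claim describes is a feedback loop: the driving amount is adjusted until the measured movement position variation matches the stored reference variation. A minimal sketch against a toy first-order plant (gain, the proportional update, and the plant model are our assumptions, not the patent's):

```python
def adjust_drive(drive, measured_delta, reference_delta, gain=0.5):
    # Nudge the driving amount so the measured per-step position change
    # converges on the stored reference change.
    return drive + gain * (reference_delta - measured_delta)

def converge(drive, reference_delta, steps=20, gain=0.5, response=1.0):
    # Toy plant: the observed movement variation is taken to be
    # proportional to the driving amount.
    for _ in range(steps):
        measured = response * drive
        drive = adjust_drive(drive, measured, reference_delta, gain)
    return drive
```

With gain 0.5 the error halves each step, so the driving amount settles on the reference value after a few iterations.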

US Pat. No. 10,432,864

SYSTEMS AND METHODS FOR STABILIZING VIDEOS

GoPro, Inc., San Mateo, ...

1. An image capture system that stabilizes videos, the image capture system comprising:
a housing;
an optical element carried by the housing and configured to guide light within a field of view to an image sensor;
the image sensor carried by the housing and configured to generate a visual output signal conveying visual information based on light that becomes incident thereon during a capture duration, the visual information defining visual content having the field of view;
a position sensor carried by the housing and configured to generate a position output signal based on positions of the housing during the capture duration, the position output signal conveying position information that characterizes rotational positions of the housing at different moments within the capture duration; and
one or more physical processors configured by machine-readable instructions to:
determine a trajectory of the housing during the capture duration based on the position information, the trajectory reflecting the rotational positions of the housing at the different moments within the capture duration, the trajectory including a first portion corresponding to a first moment within the capture duration and a second portion corresponding to a second moment subsequent to the first moment within the capture duration;
determine a smoothed trajectory of the housing based on a subsequent portion of the trajectory such that a portion of the smoothed trajectory corresponding to the first portion of the trajectory is determined based on the second portion of the trajectory, the smoothed trajectory having smoother changes in the rotational positions of the housing than the trajectory;
determine a viewing window for the visual content based on the smoothed trajectory of the housing, the viewing window defining one or more extents of the visual content; and
generate stabilized visual content of a video based on the viewing window, the stabilized visual content including a punchout of the one or more extents of the visual content within the viewing window.

US Pat. No. 10,432,863

ROUTING OF TRANSMISSION MEDIA THROUGH ROTATABLE COMPONENTS

GoPro, Inc., San Mateo, ...

1. An image capturing system, comprising:
a digital image capturing device (DICD); and
a hand-held apparatus configured to support the DICD, the hand-held apparatus including:
a grip;
a transmission media extending through the grip, the transmission media configured to transmit data and/or power through the hand-held apparatus;
a first gimbal assembly positioned adjacent an upper end of the grip, the first gimbal assembly including:
first and second housings configured for relative movement; and
a motor assembly accommodated within one of the first and second housings;
a first arm including opposite first and second end portions, wherein the first gimbal assembly is positioned adjacent the first end portion of the first arm, and the transmission media extends from the grip, through the motor assembly of the first gimbal assembly, and into the first arm;
a second gimbal assembly positioned adjacent the second end portion of the first arm, the second gimbal assembly including:
first and second housings configured for relative movement; and
a motor assembly accommodated within one of the first and second housings;
a second arm including opposite first and second end portions, wherein the second gimbal assembly is positioned adjacent the first end portion of the second arm, and the transmission media extends from the first arm, through the motor assembly of the second gimbal assembly, and into the second arm; and
a third gimbal assembly positioned adjacent the second end portion of the second arm such that the third gimbal assembly is operatively connected to the DICD, the transmission media extending into the third gimbal assembly from the second arm.

US Pat. No. 10,432,862

IMAGING APPARATUS, IMAGE PROJECTOR APPARATUS, AND STAGE APPARATUS

RICOH IMAGING COMPANY, LT...

1. An imaging apparatus comprising:
a photographing optical system;
an image sensor onto which an object image is projected through said photographing optical system;
a movable stage to which said image sensor is fixed;
a base which holds said movable stage in a manner to allow said movable stage to move relative to said base; and
a thrust generator which generates thrust forces in different directions against said movable stage,
wherein said different directions include a first direction, a second direction and a third direction, said first direction being parallel with a direction of an optical axis of said photographing optical system, and
wherein, during operation of the thrust generator, an interaction of said thrust forces against said movable stage in at least one of said different directions holds said movable stage in a noncontact state with said base and causes said movable stage to at least one of:
translate relative to said base in said first direction,
translate relative to said base in said second direction,
translate relative to said base in said third direction,
rotate relative to said base about said first direction,
rotate relative to said base about said second direction, and
rotate relative to said base about said third direction.

US Pat. No. 10,432,860

CAMERA OPERATION MODE CONTROL

LENOVO (BEIJING) CO., LTD...

1. A controlling method comprising:
detecting a triggering event, the triggering event being one of a plurality of types of events, and the plurality of types of events including a first-type event and a second-type event that are different from each other; and
controlling, based on the triggering event, a camera of a laptop including a first face and a second face to be in an operation mode corresponding to the triggering event as detected, including:
determining an angle between the first face and the second face,
controlling, in response to the angle being smaller than a preset angle, the camera to be in an operation mode with a first operation mechanism, and
controlling, in response to the angle being larger than the preset angle, the camera to be in an operation mode with a second operation mechanism,
wherein:
a screen is arranged on the first face of the laptop,
a keyboard and the camera are arranged on the second face of the laptop,
the first-type event is a user authentication event corresponding to the angle being smaller than the preset angle and the operation mode with the first operation mechanism is a texture acquiring mode,
the second-type event is a picture capturing event corresponding to the angle being larger than the preset angle and the operation mode with the second operation mechanism is a picture acquiring mode, and
the picture acquiring mode is independent of the texture acquiring mode.

US Pat. No. 10,432,859

IMAGE PROCESSING APPARATUS AND METHOD FOR SETTING WIDE DYNAMIC RANGE MODE BASED ON LUMINANCE OF IMAGE

HANWHA TECHWIN CO., LTD.,...

1. An image processing apparatus comprising:
an image sensor configured to output image data having a plurality of image channels;
at least one processor configured to implement:
a mode setting unit which changes a photographing mode from a first mode in which a first number of sub-images are synthesized to a second mode in which a second number of sub-images are synthesized, based on luminance of the output image data; and
a register setting unit which changes a register value of the image sensor according to the second mode; and
an image signal processor (ISP) configured to generate a result image by synthesizing the second number of sub-images from the image data according to the changed register value,
wherein the second number of sub-images have different exposure times,
wherein the register value comprises a maximum shutter speed for each photographing mode and a shutter speed for each image channel, and
wherein the maximum shutter speed is set in inverse proportion to a number of sub-images to be synthesized for generating the result image.
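The inverse-proportion relationship in the last clause is a one-line computation: doubling the number of sub-images halves the per-mode shutter ceiling. A minimal sketch (the base speed and name are illustrative):

```python
def max_shutter_speed(base_speed_s, num_sub_images):
    # The per-mode shutter ceiling is inversely proportional to the
    # number of sub-images synthesized into one result image.
    return base_speed_s / num_sub_images
```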

US Pat. No. 10,432,857

SYSTEMS, METHODS, AND APPARATUSES FOR OPTIMIZING FIELD OF VIEW

LIFE TECHNOLOGIES HOLDING...

1. A system for maximizing a field of view for an image capturing device, the system comprising:
a surface configured to support an object;
a camera configured to capture an image of the object in a first position within the field of view;
a processor including instructions to calculate a rotational angle by virtually aligning an edge of the object with an edge of the field of view and to calculate a zoom factor to position the edge of the object along the edge of the field of view.

US Pat. No. 10,432,856

METHOD AND APPARATUS OF VIDEO COMPRESSION FOR PRE-STITCHED PANORAMIC CONTENTS

MEDIATEK INC., Hsin-Chu ...

1. A method of video encoding of pre-stitched pictures for a video encoding system, wherein each pre-stitched picture is formed from at least two images captured by a plurality of cameras of a panoramic video capture device, and wherein two neighboring images captured by two neighboring cameras include at least an overlapped image area, the method comprising:
receiving panoramic video source data comprising a current block in a current pre-stitched picture;
receiving stitching information associated with a stitching process to form the pre-stitched pictures, wherein the stitching information comprises calibration data, matching results, seam position, blending level, sensor data, or a combination thereof, wherein the stitching information corresponds to seam information associated with seam detection, and seam-based Inter prediction is used to encode the current block by utilizing the seam information; and
encoding the current block utilizing the stitching information associated with the stitching process.

US Pat. No. 10,432,855

SYSTEMS AND METHODS FOR DETERMINING KEY FRAME MOMENTS TO CONSTRUCT SPHERICAL IMAGES

GoPro, Inc., San Mateo, ...

1. A system for determining key frame moments to construct spherical images, the system comprising:
one or more physical computer processors configured by computer readable instructions to:
obtain multiple video segments, individual video segments including multiple frames, wherein the individual video segments are time synchronized for at least portions of the individual video segments captured during a synchronized time period;
set a first key frame moment within the synchronized time period;
determine a first set of stitching parameter values for the frames of the individual video segments captured during the first key frame moment, the first set of stitching parameter values including values of one or more of an image capture device parameter, a color parameter, a stabilization parameter, and/or an exposure parameter;
construct, using the first set of stitching parameter values, a first set of spherical images, the first set of spherical images including a first spherical image constructed from the frames of the individual video segments captured during the first key frame moment and at least one other spherical image from the frames of the individual video segments captured at a moment adjacent or near the first key frame moment;
detect a change in the stitching parameter values of the individual video segments relative to the first set of stitching parameter values at a moment in time subsequent to the first key frame moment;
identify, based on the detected change in the stitching parameter values of the individual video segments relative to the first set of stitching parameter values not being within a predefined threshold, the moment in time subsequent to the first key frame moment as a second key frame moment;
determine a second set of stitching parameter values for the frames of the individual video segments captured during the second key frame moment; and
construct, using the second set of stitching parameter values, a second set of spherical images, the second set of spherical images including a second spherical image constructed from the frames of the individual video segments captured during the second key frame moment and at least one other spherical image constructed from the frames of the individual video segments captured at a moment adjacent or near the second key frame moment and preceding the second key frame moment such that the second set of stitching parameter values are used to construct the at least one other spherical image from the frames of the individual video segments captured at a moment in time between the first key frame moment and the second key frame moment.
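The key-frame logic above is a threshold test on parameter drift: a new key frame moment is declared whenever the stitching parameters stray too far from the current key frame's values. A minimal sketch (parameter vectors and the max-absolute-change metric are our illustrative choices):

```python
def detect_key_frames(stitch_params, threshold):
    # stitch_params: one stitching-parameter vector per moment in time.
    # A new key frame moment is declared whenever any parameter drifts
    # beyond the threshold relative to the current key frame's values.
    key_moments = [0]
    ref = stitch_params[0]
    for t, cur in enumerate(stitch_params[1:], start=1):
        change = max(abs(c - r) for c, r in zip(cur, ref))
        if change > threshold:
            key_moments.append(t)
            ref = cur
    return key_moments
```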

US Pat. No. 10,432,853

IMAGE PROCESSING FOR AUTOMATIC DETECTION OF FOCUS AREA

SONY CORPORATION, Tokyo ...

1. A method for image processing, said method comprising:
extracting, by an imaging device, a plurality of object features of a plurality of objects,
wherein the plurality of objects are in a field-of-view (FOV) of said imaging device;
generating, by said imaging device, a plurality of confidence maps based on said extracted plurality of object features;
and determining, by said imaging device, a focus area corresponding to said FOV based on said generated plurality of confidence maps and a specific rule.

US Pat. No. 10,432,852

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING APPARATUS CONTROL METHOD, IMAGE PICKUP APPARATUS, AND IMAGE PICKUP APPARATUS CONTROL METHOD

Canon Kabushiki Kaisha, ...

1. An image processing apparatus, comprising:
one or more processors; and
a memory storing instructions which, when the instructions are executed by the one or more processors, cause the image processing apparatus to function as:
an acquiring unit configured to acquire a plurality of image signals picked up by an image pickup unit;
a determining unit configured to determine distance information of the object image by using the plurality of image signals; and
a conversion unit configured to perform, on the distance information, gradation conversion that has at least one conversion characteristic out of a plurality of different conversion characteristics, and to output information that is based on the distance information converted by the gradation conversion,
wherein the conversion unit selects the one conversion characteristic out of the plurality of different conversion characteristics based on an application to which the information is output, and
wherein the conversion unit determines a gradation resolution of the output of the conversion unit based on the conversion characteristic.

US Pat. No. 10,432,851

WEARABLE COMPUTING DEVICE FOR DETECTING PHOTOGRAPHY

1. A wearable computing device for detecting photography comprising:
an outer casing configured to be worn by a user;
a device camera coupled to the outer casing and configured to detect image data corresponding to a person holding a remote camera in an environment of the wearable computing device;
a mobile processor coupled to the device camera and configured to determine that a photograph will be taken based on the image data corresponding to the person holding the remote camera, and determine a direction of the remote camera relative to the wearable computing device based on the image data corresponding to the person holding the remote camera; and
an output device coupled to the mobile processor and configured to output data indicating that the photograph of the wearable computing device will be taken, the output device including at least one of a speaker configured to output audio data providing directions for the user to turn to face the remote camera or a pair of vibration units each positioned on one side of the outer casing and configured to output stereo haptic feedback in a pattern that provides directions for the user to turn to face the remote camera.

US Pat. No. 10,432,850

APPARATUS AND METHOD FOR SUPPLYING CONTENT AWARE PHOTO FILTERS

Snap Inc., Santa Monica,...

1. A server, comprising:
a photograph filter module with instructions executed by a processor to:
identify when a client device captures a photograph;
choose photograph filters based upon attributes of the photograph, wherein the attributes of the photograph include the physical environment captured in the photograph;
supply the chosen photograph filters to the client device, wherein the supplied photograph filters are configured to be independently selectable by a user in response to a gesture applied to the photograph as presented on a display of the client device, wherein each of the supplied photograph filters is an overlay on top of the photograph to augment the photograph and is not presented on the display of the client device until it is overlaid on the photograph in response to the gesture applied to the photograph presented on the display of the client device;
receive a selection of one or more of the supplied photograph filters; and
route the one or more selected photograph filters and the photograph to another client device.

US Pat. No. 10,432,849

IMAGE MODIFICATION BASED ON OBJECTS OF INTEREST

eBay Inc., San Jose, CA ...

1. A method, comprising:
receiving, by a processor of an image capture device, a user interface selection initiating an image capture;
detecting a first image capture parameter;
identifying an object of interest within a field of view of the image capture device responsive to identifying a set of object characteristics of the object of interest based on one or more publications corresponding to the object of interest;
based on one or more object characteristics of the set of object characteristics, generating a parameter notification indicating a suggested modification of the first image capture parameter, the parameter notification including a specified parameter range corresponding to the suggested modification and one or more user interface elements selectable to modify the first image capture parameter; and
causing presentation, at the image capture device, of the parameter notification.

US Pat. No. 10,432,847

SIGNAL PROCESSING APPARATUS AND IMAGING APPARATUS

Sony Corporation, Tokyo ...

1. An imaging apparatus, comprising:
two imaging devices that generate respective pieces of imaging data that differ in angle of view from each other; and
a composition unit that generates first composite imaging data, by adding together a low-frequency component of first imaging data, a high-frequency component of the first imaging data, and a high-frequency component of second imaging data, the first imaging data being the imaging data that has been generated by one of the imaging devices and has a relatively wide angle of view, and the second imaging data being the imaging data that has been generated by another of the imaging devices and has a relatively narrow angle of view.
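The composition the claim describes splits each image into frequency bands and keeps the wide image's low band plus both images' high bands. A minimal 1-D sketch, using a moving average as a stand-in low-pass filter (the filter choice and names are our assumptions):

```python
def lowpass(sig, k=3):
    # Moving average as a stand-in low-pass filter (1-D for brevity).
    n = len(sig)
    out = []
    for i in range(n):
        win = sig[max(0, i - k // 2): min(n, i + k // 2 + 1)]
        out.append(sum(win) / len(win))
    return out

def composite(wide, tele):
    # First composite = lowpass(wide) + highpass(wide) + highpass(tele):
    # the wide image's structure plus the telephoto image's fine detail.
    low_w = lowpass(wide)
    high_w = [w - l for w, l in zip(wide, low_w)]
    high_t = [t - l for t, l in zip(tele, lowpass(tele))]
    return [a + b + c for a, b, c in zip(low_w, high_w, high_t)]
```

Since low plus high of the wide image reconstructs the wide image, the composite is the wide image with the telephoto high-frequency detail added on top.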

US Pat. No. 10,432,846

ELECTRONIC DEVICE, IMAGE CAPTURING METHOD AND STORAGE MEDIUM

Chiun Mai Communication S...

1. A method for capturing an image using an electronic device, the electronic device comprising a camera device and a display device, the method comprising:
obtaining images directly from a predetermined device according to location coordinates of the electronic device, wherein location coordinates of each of the obtained images belong to a predetermined geographical range, the predetermined geographical range is a circular range that is defined by a centre and a predetermined radius, and the location coordinates of the electronic device are set to be the centre, and the predetermined radius is equal to a predetermined distance from the centre;
dividing a display area of the display device into a first display area and a second display area;
dividing a horizontal direction of the first display area into M value ranges, wherein the horizontal direction of the first display area represents a first parameter of related parameters of each of the obtained images, and each of the M value ranges represents a range of the first parameter;
dividing a vertical direction of the first display area into N value ranges, wherein the vertical direction of the first display area represents a second parameter of the related parameters of each of the obtained images, and each of the N value ranges represents a range of the second parameter, wherein the M and N are positive integers; wherein the first parameter is a horizontal azimuth angle of an image capturing device when the image capturing device captures the obtained image, and the second parameter is selected from a color temperature of the obtained image and a pitching angle of the image capturing device when the image capturing device captures the obtained image;
displaying the obtained images on the first display area according to a value range of the first parameter of each of the obtained images and a value range of the second parameter of each of the obtained images;
displaying a preview image of a current scene on the second display area;
setting one of the obtained images to be a reference image;
calculating one or more difference values using current orientation parameters of the electronic device and orientation parameters of the reference image;
adjusting the orientation parameters of the electronic device according to the one or more difference values; and
controlling the camera device to capture an image of the current scene based on the related parameters and/or orientation parameters of the reference image.
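The M-by-N layout in this claim is a two-axis bucketing of each image's parameters into display cells. A minimal sketch (the linear mapping and names are our illustrative choices):

```python
def grid_cell(value1, range1, m, value2, range2, n):
    # Bucket an image's two parameters (e.g. horizontal azimuth angle
    # and pitch angle or colour temperature) into one of M x N cells.
    lo1, hi1 = range1
    lo2, hi2 = range2
    col = min(int((value1 - lo1) / (hi1 - lo1) * m), m - 1)
    row = min(int((value2 - lo2) / (hi2 - lo2) * n), n - 1)
    return row, col
```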

US Pat. No. 10,432,845

METHOD AND APPARATUS FOR GENERATING BLURRED IMAGE, AND MOBILE TERMINAL

GUANGDONG OPPO MOBILE TEL...

1. A method for generating a blurred image, comprising:
determining, according to preview image data acquired via two rear cameras of a dual-camera device, first depth-of-field information for a foreground region and second depth-of-field information for a background region in a current preview image;
acquiring a basic value of a blurring degree according to the first depth-of-field information and the second depth-of-field information, the basic value of the blurring degree being a reference value of the blurring degree; and
performing a Gaussian blur process on the background region according to the basic value of the blurring degree to generate the blurred image;
wherein, performing the Gaussian blur process on the background region according to the basic value of the blurring degree to generate the blurred image comprises:
determining a blurring coefficient for each pixel in the background region according to the basic value of the blurring degree and the second depth-of-field information for the background region; and
performing the Gaussian blur process on the background region according to the blurring coefficient for each pixel in the background region to generate the blurred image;
wherein determining the blurring coefficient for each pixel in the background region according to the basic value of the blurring degree and the second depth-of-field information for the background region comprises:
calculating a multiplied value by multiplying the basic value of the blurring degree by the second depth-of-field information of each pixel in the background region, to obtain the blurring coefficient for each pixel in the background region.
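The coefficient step in this claim is a plain per-pixel product: the blurring coefficient equals the basic blurring value multiplied by that pixel's depth-of-field value. A minimal sketch, with the Gaussian kernel itself omitted and all value ranges assumed:

```python
# Sketch of the claimed blurring-coefficient computation. Each background
# pixel's coefficient = basic blurring value * per-pixel depth-of-field value.

def blur_coefficients(basic_value, background_depth):
    """background_depth: 2-D list of per-pixel depth-of-field values."""
    return [[basic_value * d for d in row] for row in background_depth]

depth = [[0.2, 0.4],
         [0.6, 0.8]]
coeffs = blur_coefficients(5.0, depth)
# coeffs == [[1.0, 2.0], [3.0, 4.0]]; each coefficient would then scale the
# Gaussian blur strength applied at that pixel.
```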

US Pat. No. 10,432,844

IMAGE PICKUP DEVICE AND ELECTRONIC APPARATUS WITH AN IMAGE PLANE PHASE DIFFERENCE DETECTION PIXEL

Sony Semiconductor Soluti...

1. An image pickup device comprising an image plane phase difference detection pixel for obtaining a phase difference signal for image plane phase difference AF, the image plane phase difference detection pixel comprising:
a first photoelectric conversion section that generates an electric charge in response to incident light;
a second photoelectric conversion section that generates an electric charge in response to light that passes through the first photoelectric conversion section and a lower electrode section;
an upper electrode section that is one of a number of electrodes disposed facing each other across the first photoelectric conversion section, the upper electrode section being formed on a light incident side of the first photoelectric conversion section; and
the lower electrode section that is another of the electrodes disposed facing each other across the first photoelectric conversion section, the lower electrode section being formed on a side opposite the light incident side of the first photoelectric conversion section, wherein the lower electrode section includes a first lower electrode section and a second lower electrode section that are unevenly two-divided at a position that avoids a center of the incident light, and a member that transmits the incident light.

US Pat. No. 10,432,843

IMAGING APPARATUS, CONTROL METHOD OF IMAGING APPARATUS, AND NON-TRANSITORY RECORDING MEDIUM FOR JUDGING AN INTERVAL BETWEEN JUDGEMENT TARGETS

OLYMPUS CORPORATION, Tok...

1. An imaging apparatus comprising:
an image pickup device which acquires a judgment image of a structure that includes judgment targets of nearly a same shape; and
a processor communicatively coupled to the image pickup device, wherein the processor:
calculates intervals between the judgment targets of the structure based on the judgment image, and
judges whether each of the intervals between the judgment targets is within a preset interval based on the calculated intervals;
wherein the processor further:
compares a figure of a reference target which is one of the judgment targets shown in the judgment image with figures of a pair of comparative targets which are the judgment targets adjacent to the reference target, to judge whether each of the intervals between the judgment targets is within the preset interval; and
judges that the intervals between the judgment targets are within the preset interval when a first ratio between a dimension of the figure of the reference target and a dimension of the figure of one of the pair of comparative targets and a second ratio between the dimension of the figure of the reference target and the dimension of the figure of a second of the pair of comparative targets have a preset relation.
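The ratio test in this claim compares the apparent size of the reference target against its two neighbours. The claim does not define the "preset relation", so the sketch below assumes (hypothetically) that both ratios must fall within a tolerance band around 1.0, i.e. nearly same-shaped targets appear nearly the same size when correctly spaced:

```python
# Hedged sketch of the claimed two-ratio interval judgment. The tolerance-band
# interpretation of the "preset relation" is an assumption.

def intervals_within_preset(ref_dim, left_dim, right_dim, tol=0.1):
    r1 = ref_dim / left_dim    # first ratio: reference vs. one neighbour
    r2 = ref_dim / right_dim   # second ratio: reference vs. the other
    return abs(r1 - 1.0) <= tol and abs(r2 - 1.0) <= tol

print(intervals_within_preset(10.0, 10.5, 9.8))   # similar sizes
print(intervals_within_preset(10.0, 14.0, 9.8))   # one neighbour too far/small
```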

US Pat. No. 10,432,842

FUSION OF INERTIAL AND DEPTH SENSORS FOR MOVEMENT MEASUREMENTS AND RECOGNITION

1. A movement recognition system, comprising:
an inertial sensor coupled to an object and configured to measure a first unit of inertia of the object;
a depth sensor configured to measure a three dimensional shape of the object using projected light patterns and a camera; and
a processor configured to receive a signal representative of the measured first unit of inertia from the inertial sensor and a signal representative of the measured shape from the depth sensor and to determine a type of movement of the object based on the measured first unit of inertia and the measured shape utilizing a classification model,
wherein the processor is configured to:
compare the type of movement with a predefined intended movement type;
issue a warning in response to the type of movement not matching the predefined intended movement type; and
count a number of correctly completed movements in response to the type of movement matching the predefined intended movement type.
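The compare/warn/count logic of this claim could be sketched as a simple supervision loop. The classifier itself (the claim's classification model over inertial and depth data) is stubbed out as already-labelled movements; all names are illustrative assumptions.

```python
# Sketch of the claimed supervision logic: warn on a mismatch with the
# intended movement type, count correct completions.

def supervise(movements, intended):
    correct = 0
    warnings = []
    for m in movements:
        if m == intended:
            correct += 1
        else:
            warnings.append(f"warning: expected {intended}, got {m}")
    return correct, warnings

correct, warnings = supervise(["squat", "lunge", "squat"], "squat")
# correct == 2, with one warning issued for the mismatched "lunge"
```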

US Pat. No. 10,432,841

WEARABLE APPARATUS AND METHOD FOR SELECTIVELY CATEGORIZING INFORMATION DERIVED FROM IMAGES

OrCam Technologies, Ltd.,...

1. A wearable apparatus for collecting information related to activities of a user, the wearable apparatus comprising:
an image sensor configured to capture a plurality of images from an environment of the user;
a communications interface; and
at least one processing device programmed to:
process the plurality of images to identify an activity occurring in the environment of the user;
associate the activity with an activity category;
determine, based on the plurality of images, a level of interest of the user in the activity category, wherein the level of interest is based, at least in part, on a duration of the activity;
cause transmission of at least the activity category to a remotely located computing device via the communications interface; and
cause a life log to be stored in memory, the life log including information comprising at least part of at least one of the plurality of images depicting the activity and the activity category associated with the at least one of the plurality of images, wherein the information stored in the life log is selectively included based on at least the level of interest of the user in the activity category exceeding a predetermined threshold.
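The selective-inclusion rule at the end of this claim (store a life-log entry only when the interest level exceeds a predetermined threshold, with interest based at least in part on duration) could be sketched as below. The duration-to-interest model and all names are assumptions.

```python
# Hypothetical sketch of the claimed selective life-log storage.

INTEREST_THRESHOLD = 0.5

def interest_level(duration_s, max_duration_s=600.0):
    """Interest grows with activity duration, capped at 1.0 (assumed model)."""
    return min(duration_s / max_duration_s, 1.0)

def maybe_log(life_log, image_ref, category, duration_s):
    if interest_level(duration_s) > INTEREST_THRESHOLD:
        life_log.append({"image": image_ref, "category": category})

log = []
maybe_log(log, "img_001", "cooking", duration_s=420)   # interest 0.7 -> stored
maybe_log(log, "img_002", "waiting", duration_s=60)    # interest 0.1 -> skipped
# log now holds only the cooking entry
```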

US Pat. No. 10,432,840

FUSION NIGHT VISION SYSTEM

L-3 Communication-Insight...

14. A vision system, comprising:
a first housing having a first optical axis, a display, an image combiner, and a first eye piece; and
a second housing having a second optical axis and a second eye piece, the first housing coupled to the second housing through a first coupler, the first coupler having a first hinged joint rotatable about a first axis and a second hinged joint rotatable about a second axis, the first axis spaced a first fixed distance from the second axis, the second housing coupled to the first housing through a second coupler, the second coupler having a third hinged joint rotatable about a third axis, the first housing and the second housing coupled through the first coupler and the second coupler such that a row of pixels in the display when viewed through the first eye piece can be maintained in a position relative to an imaginary line going through the first optical axis and the second optical axis as the distance between the first optical axis of the first housing and the second optical axis of the second housing is varied.

US Pat. No. 10,432,838

LIGHTING FOR INDUSTRIAL IMAGE PROCESSING

1. A light source for industrial image processing for illuminating an image area, comprising:
at least one sensor configured for capturing an actual relative position of the light source in relation to at least one defined reference, which is at least one of: at least one defined reference point or at least one defined reference plane; and
a position comparer, which is connected to the at least one sensor, being configured for comparing the actual relative position of the light source with a defined relative target position.

US Pat. No. 10,432,837

PLASTIC BARREL, CAMERA MODULE, AND ELECTRONIC DEVICE

LARGAN PRECISION CO., LTD...

1. A plastic barrel, comprising:
an object-end portion, comprising:
an outer object-end surface;
an object-end hole; and
an inner annular object-end surface, wherein a part of the inner annular object-end surface is connected with the outer object-end surface and surrounds the object-end hole;
a holder portion, comprising:
a bottom surface;
a bottom hole; and
an outer bottom side, wherein the bottom surface surrounds the bottom hole and is connected with the outer bottom side, and the holder portion further comprises at least three cut traces obtained by partially removing at least three gate portions; and
a tube portion, connecting the object-end portion with the holder portion and comprising a plurality of inner annular surfaces;
wherein a diameter of the object-end hole is φd, a height of the plastic barrel parallel to a central axis is H, and the following condition is satisfied:
1.02

US Pat. No. 10,432,836

CAMERA DRIVING MODULE, IMAGE CAPTURING UNIT AND SMARTPHONE

LARGAN DIGITAL CO., LTD.,...

1. A camera driving module, comprising:
a base comprising an opening;
a casing disposed on the base, the casing comprising a through hole and at least one broadwise notch structure, the through hole corresponding to the opening of the base, the at least one broadwise notch structure being located nearby a periphery of the through hole;
a lens unit movably disposed on the casing, the lens unit comprising at least one protruding structure corresponding to the at least one broadwise notch structure, the at least one protruding structure being located at a periphery of the lens unit;
a magnetic element fixed to the casing and located at an inside of the casing;
a coil fixed to the lens unit and located at an outside of the lens unit, the coil facing toward the magnetic element;
at least one spring disposed on the lens unit; and
at least one damper agent configured to reduce movement of the lens unit, and the at least one damper agent being disposed between the at least one broadwise notch structure and the at least one protruding structure.

US Pat. No. 10,432,835

OPTICAL IMAGE CAPTURING SYSTEM, IMAGE CAPTURING DEVICE AND ELECTRONIC DEVICE

LARGAN PRECISION CO., LTD...

1. An optical image capturing system comprising seven lens elements, the seven lens elements being, in order from an object side to an image side: a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element, a sixth lens element, and a seventh lens element;
wherein the first lens element has negative refractive power, the third lens element has an image-side surface being convex in a paraxial region thereof, the seventh lens element with negative refractive power has an image-side surface being concave in a paraxial region thereof and comprises at least one convex shape in an off-axis region thereof, and an object-side surface and the image-side surface of the seventh lens element are aspheric;
wherein the optical image capturing system has a total of seven lens elements, there is no relative movement among the seven lens elements, and an absolute value of a focal length of the first lens element is smaller than an absolute value of a focal length of the second lens element.

US Pat. No. 10,432,834

LENS DRIVING MODULE WITH CASING HAVING PLASTIC MATERIAL AND ASSEMBLY METHOD THEREOF

TDK Taiwan Corp., Yangme...

1. A lens driving module, configured to drive an optical lens, comprising:
a holder, having a receiving space for the optical lens to be disposed therein;
a casing, having a plastic material;
a base, having a plurality of protrusions extending toward the casing and a main body from which the protrusions protrude, and each protrusion has a side surface and a positioning bump, wherein the holder is disposed between the casing and the base;
a first elastic element connecting the holder to the main body;
a second elastic element connecting the protrusions to the holder and having a plurality of locating holes, wherein the positioning bumps are correspondingly incorporated within the locating holes;
an electromagnetic driving assembly, disposed between the holder and the casing and configured to force the holder and the optical lens to move relative to the base; and
a glue, disposed between the side surfaces and the casing, wherein the side surfaces are parallel to a central axis of the optical lens.

US Pat. No. 10,432,833

INTERCHANGEABLE LENS CAMERA

SONY CORPORATION, Tokyo ...

1. A camera, comprising:
a body; and
a plurality of groups of contacts on the body, wherein
the plurality of groups of contacts comprises a first group of contacts and a second group of contacts different from the first group of contacts,
the plurality of groups of contacts is configured to be coupled to a plurality of lenses,
the plurality of lenses includes a plurality of lens-side mounts, and
a first length of each of the first group of contacts in a direction of an optical axis of the camera is different from a second length of each of the second group of contacts in the direction of the optical axis of the camera,
the difference between the first length and the second length is based on a difference between flange back distances of respective lens-side mounts of a first lens of the plurality of lenses and a second lens of the plurality of lenses, the flange back distances are with respect to an imaging plane, and
the first group of contacts having the first length is configured to couple with the first lens and the second group of contacts having the second length is configured to couple with the second lens.

US Pat. No. 10,432,831

IMAGE SENSOR

SK hynix Inc., Icheon-si...

1. An image sensor device comprising:
a pixel array in which a plurality of pixel blocks are arranged,
wherein each of the pixel blocks comprises:
a light receiver comprising a floating diffusion and a plurality of unit pixels and configured to receive incident light and generate photo charges in response to the received incident light, the plurality of unit pixels sharing the floating diffusion;
a first driver located at a first side of the light receiver and comprising a driver transistor;
a second driver located at a second side of the light receiver and comprising a reset transistor; and
a conductive line having a first region coupling the driver transistor to the floating diffusion and a second region coupling the floating diffusion to the reset transistor,
wherein the driver transistor and the reset transistor are respectively located at the first side and the second side of the light receiver in a diagonal direction.

US Pat. No. 10,432,829

OPTICAL DEVICE AND IMAGING DEVICE WITH MECHANISM FOR REDUCING CONDENSATION

DENSO CORPORATION, Kariy...

1. An optical device comprising:
a lens assembly comprising:
at least one lens having an optical axis for receiving light from an object located at a first side of the optical axis; and
a holder for holding the at least one lens;
a circuit board for performing at least one process based on the received light; and
a housing having an opening and configured to house the lens assembly and the circuit board therein, the lens assembly being exposed to an outside of the housing via the opening,
wherein:
the housing comprises a top wall;
the opening comprises a recess formed in the top wall of the housing;
the recess communicates with an inside of the housing and concavely extends in a second side of the optical axis, the second side of the optical axis being opposite to the first side thereof; and
the lens assembly is arranged below the recess.

US Pat. No. 10,432,828

CAMERA MODULE HAVING A SHIELD MEMBER

LG INNOTEK CO., LTD., Se...

1. A camera module comprising:
a lens unit;
a first casing coupled with the lens unit;
a printed circuit board disposed behind the lens unit so as to be spaced apart from the lens unit and to face the lens unit in an optical-axis direction of the lens unit;
a second casing disposed behind the first casing and coupled at a front portion thereof to a rear portion of the first casing, the second casing being configured to accommodate the printed circuit board therein; and
a shield member electrically connected to the lens unit and grounded to the second casing,
wherein the shield member is disposed between the lens unit and the printed circuit board, and
wherein the shield member comprises:
an upper surface to which an optical axis of the camera module is perpendicular;
a hollow region within the upper surface;
a first terminal protruding from the upper surface in a radial direction into the hollow region; and
a second terminal that comprises a first portion bent from the upper surface and extending downwardly therefrom, a second portion bent from the first portion and extending laterally therefrom, and a third portion bent from the second portion and extending downwardly therefrom, wherein the third portion of the second terminal is configured so as to be brought into contact with the second casing and to allow the shield member to be grounded to the second casing.

US Pat. No. 10,432,827

INTEGRATED AUTOMOTIVE SYSTEM, NOZZLE ASSEMBLY AND REMOTE CONTROL METHOD FOR CLEANING AN IMAGE SENSOR'S EXTERIOR OR OBJECTIVE LENS SURFACE

DLHBOWLES, INC., Canton,...

1. An external lens washing system, comprising:
a substantially rigid aiming fixture having a distal side and a proximal side and being configured to support and constrain an external lens exposed toward said distal side;
said external lens having an external lens surface with a lens perimeter and a lens central axis projecting from said lens surface, wherein a lens field of view is defined as a distally projecting solid angle including said lens central axis and originating within said lens perimeter;
a first nozzle assembly configured to be supported and aimed toward said external lens by said aiming fixture;
said first nozzle assembly including a fluid inlet in fluid communication with a first laterally offset washing nozzle projecting from said aiming fixture's distal side;
said first nozzle assembly being configured and aimed to spray washing fluid toward said external lens surface and across said field of view, spraying at a first selected spray aiming angle in relation to the lens external surface; and
said first spray aiming angle being within the range bounded by 1° and 20° in relation to the lens external surface,
wherein said first nozzle assembly is aimed to spray along a first selected spray azimuth angle in relation to a fixed datum on said lens perimeter,
said external lens washing system further comprising a second nozzle assembly configured to be supported and aimed by said aiming fixture;
said second nozzle assembly including a fluid inlet in fluid communication with a second laterally offset washing nozzle projecting from said aiming fixture's distal side;
said second nozzle assembly being configured and aimed to spray washing fluid toward said external lens surface and across said field of view, spraying at a second selected spray aiming angle in relation to the lens external surface;
said second spray aiming angle being within the range bounded by 1° and 20° in relation to the lens external surface; and
said second nozzle assembly being configured and aimed to spray along a selected spray azimuth angle being radially spaced at a selected inter-spray angle from said first nozzle assembly's spray azimuth angle.

US Pat. No. 10,432,826

AUTOMATIC SUPPRESSION OF UNRECOGNIZED SPOT COLORS FROM A RASTER IMAGE

Xerox Corporation, Norwa...

1. A method of suppressing unrecognized spot colors, the method comprising:
receiving a print job into a computerized device comprising a marking device having a programmed color space, said print job comprising an electronic document and print job attributes for use in rendering said print job;
performing raster processing of said electronic document;
identifying an object in said electronic document, wherein said object calls for a spot color, and wherein said spot color is not defined in said programmed color space of said marking device; and
responsive to said spot color not being defined in said programmed color space of said marking device,
suppressing said spot color for said object by creating an alternate color space for said marking device, said alternate color space having tint values of zero for every color in said alternate color space, and
assigning said object to said alternate color space.
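The suppression mechanism in this claim, mapping an unrecognized spot color to an alternate color space whose tint values are all zero so the object marks nothing, could be sketched as a lookup with a zero-tint fallback. The CMYK-tuple data model and names here are illustrative assumptions.

```python
# Sketch of the claimed suppression: unknown spot colors resolve to an
# alternate color space with zero tint for every colorant.

programmed_colors = {"PANTONE 185 C": (0.0, 0.9, 0.8, 0.0)}  # known CMYK tints

def resolve_spot_color(name):
    if name in programmed_colors:
        return programmed_colors[name]
    # alternate color space: zero tint values -> object is suppressed
    return (0.0, 0.0, 0.0, 0.0)

print(resolve_spot_color("PANTONE 185 C"))   # defined -> its programmed tints
print(resolve_spot_color("CustomGold"))      # undefined -> all-zero tints
```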

US Pat. No. 10,432,825

COLOR MAP FROM COLOR DATA COLLECTED FROM PLURALITY OF SOURCES

Hewlett-Packard Developme...

1. An output device, comprising:
a processor; and
a machine-readable storage medium on which is stored instructions that when executed by the processor, cause the processor to:
collect color data from a plurality of sources including at least one of a printer cartridge, a peripheral device, a source local to the output device and a remote source accessed over a network;
store a locator table, the locator table to index a plurality of color map selection fields to a plurality of color maps, the plurality of color maps to be based on the collected color data, wherein each entry of the locator table includes an identifier to match one of the plurality of color maps to at least one permutation of the color map selection fields;
select one of the plurality of color maps based on a condition and the identifier; and
generate an output using the selected color map.
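The locator table in this claim indexes permutations of color-map selection fields to color-map identifiers. A minimal sketch, in which the selection fields (media type, color-data source) and the default behaviour are assumptions:

```python
# Hedged sketch of the claimed locator table: each entry's identifier matches
# one color map to a permutation of the selection fields.

locator_table = {
    ("glossy", "cartridge"): "map_A",
    ("plain",  "network"):   "map_B",
}
color_maps = {"map_A": {"red": (255, 20, 20)},
              "map_B": {"red": (240, 30, 30)}}

def select_color_map(media, source):
    map_id = locator_table.get((media, source), "map_B")  # assumed default
    return color_maps[map_id]

selected = select_color_map("glossy", "cartridge")
# selected == {"red": (255, 20, 20)}; the output is then generated with it
```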

US Pat. No. 10,432,824

IMAGE FORMING APPARATUS, IMAGE FORMING SYSTEM, METHOD OF CALIBRATING IMAGE DETECTION UNIT, AND POST-PROCESSING APPARATUS

KONICA MINOLTA, INC., To...

1. An image forming apparatus comprising:
an image forming unit that forms an image on a recording medium;
a conveying unit that conveys the recording medium through a conveying path;
a first image detection unit being a line sensor configured to detect the image formed on the recording medium; and
a control unit that controls the forming of the image and the conveying of the recording medium, and is configured to receive detection results of the image from the first image detection unit and a detection result of the image from a second image detection unit, the second image detection unit being a spectral colorimeter, wherein
the control unit has a detection-unit calibration mode for determining a calibration parameter with use of the detection result of the image from the second image detection unit, and
the control unit determines, on the detection-unit calibration mode, whether the image has an image quality higher than or equal to predetermined quality based on the detection result of the image from the first image detection unit, and only when the image quality is determined to be higher than or equal to the predetermined quality, the control unit determines the calibration parameter based on the detection result of the image from the second image detection unit.

US Pat. No. 10,432,821

DETECTING NOISE IN IMAGE DATA THAT PREVENTS AN IMAGE PROCESSING APPARATUS FROM CORRECTLY DETECTING A PRINTING AREA DUE TO THE NOISE IN THE IMAGE DATA

KABUSHIKI KAISHA TOSHIBA,...

9. An image processing method comprising:
acquiring, by an arithmetic element, a first image data read by a scanner from a manuscript containing a printing area;
generating, by the arithmetic element, second image data by executing an image processing on the first image data;
recognizing, by the arithmetic element, the printing area in the first image data based on the second image data;
cutting, by the arithmetic element, an image in the printing area from the first image data as a third image data;
wherein the second image data is generated by reducing a contrast of the first image data, and the second image data is generated by extracting a contour line image from the first image data, painting the inside of the extracted contour line image as a function of a binarized lightness value and a binarized saturation value associated with a hue, saturation, and lightness color space, and reducing the contrast.

US Pat. No. 10,432,819

IMAGE FORMING APPARATUS FOR MANAGING SUBSTANTIALLY SIMULTANEOUS IMAGE PROCESSING REQUESTS

Ricoh Company, Ltd., Tok...

1. An image forming apparatus, comprising:
a communication interface configured to communicate with a plurality of control terminals that are operated by different users; and
circuitry configured to
authenticate the plurality of control terminals in an order that an authentication request is received from the control terminals;
send an operational screen to each one of the plurality of control terminals that have been successfully authenticated for display at each control terminal, the operational screen being configured to accept a process request for requesting the image forming apparatus to execute an image forming process;
receive a plurality of process requests from the plurality of control terminals that have been authenticated in an order that the process request is accepted at the control terminals; and
control an image forming device to execute a plurality of image forming processes according to the plurality of process requests in the order that the process request is accepted at the control terminals, wherein the image forming apparatus further includes
the image forming device including reading circuitry to execute a scan process, and printing circuitry to execute a print process, the reading circuitry and the printing circuitry being configured to operate independently from each other,
wherein, when a process request for executing the print process and a process request for executing the scan process are received at substantially a same time, the circuitry is further configured to control the reading circuitry to execute the scan process, and control the printing circuitry to execute the print process, concurrently.

US Pat. No. 10,432,818

SPARSE MODULATION FOR ROBUST SIGNALING AND SYNCHRONIZATION

Digimarc Corporation, Be...

1. An image processing method for updating a design file for a retail product package or for a product label, said image processing method comprising:
obtaining a design file comprising a first area and a second area, the first area comprising area that is devoid of image or color variability, the second area comprising text;
using one or more multi-core processors, generating a sparse pattern of elements at spatial coordinates within the first area, the sparse pattern of elements conveying a machine-readable variable data signal, the sparse pattern of elements comprising a plurality of binary one and binary zero elements, which correspond to ink elements and no-ink elements of the sparse pattern of elements;
selecting an ink color to use for the ink elements through a perceptual analysis that minimizes a visual difference between the ink color and the first area, in which said selecting yields a selected ink color; and
providing the sparse pattern of elements and the selected ink color to update the design file.
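One step of this claim, turning a machine-readable variable data signal into a sparse pattern of binary ink/no-ink elements at spatial coordinates, could be sketched as below. The bit-to-coordinate mapping is an assumption for illustration; the claim's perceptual ink-color analysis is omitted.

```python
# Sketch of placing sparse binary elements: an ink element (binary one) at the
# coordinate of each set bit of the variable data signal, row-major order.

def sparse_pattern(bits, width):
    """Return (x, y) coordinates of ink elements inside the flat area."""
    return [(i % width, i // width) for i, b in enumerate(bits) if b == 1]

signal = [1, 0, 0, 1, 0, 1]          # machine-readable variable data signal
elements = sparse_pattern(signal, width=3)
# elements == [(0, 0), (0, 1), (2, 1)]
```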

US Pat. No. 10,432,817

SYSTEM, APPARATUS AND METHOD FOR ENHANCING METADATA REGISTRATION WORKFLOW

RICOH COMPANY, LTD., Tok...

1. A multi-function output apparatus comprising an operational display, a document scanner to scan a document in a scanning session of an authenticated apparatus user, a processor and a non-transitory medium embodying a program of instructions executable by the processor to configure said multi-function output apparatus having the document scanner to perform a method comprising:
(a) providing, on the operational display of the multi-function output apparatus having the document scanner and via a metadata interface, a series of plural metadata entry screens, associated with a selected workflow, for user entry of metadata to be associated with a scanned document image, each of the plural metadata entry screens provided in the series depending on metadata entered in one or more previous metadata entry screens in the series;
(b) capturing the metadata entered through the metadata entry screens provided on the operational display of the multi-function output apparatus having the document scanner;
(c) causing, upon user confirmation by the apparatus user on the operational display of the multi-function output apparatus of the metadata entered through the metadata entry screens provided on the operational display of the multi-function output apparatus having the document scanner, the metadata entry screens to be customized based on the captured metadata, which was user-entered via the operational display of the multi-function output apparatus, and causing the customized metadata entry screens to be registered as a customized workflow including plural sequential metadata entry screens for the authenticated apparatus user; and
(d) providing, in another scanning session of the authenticated apparatus user after the customized workflow has been registered, the customized workflow, including the plural sequential metadata entry screens capturing the metadata entered by the apparatus user, through the metadata interface, for user selection in connection with one or more additional documents scanned or to be scanned by the document scanner of the multi-function output apparatus.

US Pat. No. 10,432,816

FINISHING LINE CONTROLLERS TO OPTIMIZE THE INITIALIZATION OF A PRINTING DEVICE

Hewlett-Packard Developme...

14. A print device comprising:
a first mechanism movable by a first servo;
a second mechanism movable by a second servo; and
a processor to:
cause the first mechanism and the second mechanism to perform a calibration; and store reference positions of the first mechanism and the second mechanism in a memory;
receive a signal indicative of a transition to a power on state;
cause the first mechanism to move to a first stop position and determine whether the first mechanism is within a threshold of an expected position for the first mechanism;
cause the second mechanism to move to a second stop position and determine whether the second mechanism is within a threshold of an expected position for the second mechanism;
responsive to a determination that both the first mechanism and the second mechanism are within respective thresholds of the respective expected positions, fetch the stored reference positions; and
responsive to a determination that one or both the first mechanism and the second mechanism are not within respective thresholds, cause the first mechanism and the second mechanism to perform the calibration.
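The power-on branch in this claim (reuse stored reference positions only when both mechanisms land within a threshold of their expected positions, otherwise recalibrate) could be sketched as follows. The threshold value, units, and names are illustrative assumptions.

```python
# Sketch of the claimed initialization decision after a power-on transition.

THRESHOLD = 2.0  # assumed position tolerance, arbitrary units

def power_on(measured_positions, expected_positions, stored_references):
    in_range = all(abs(m - e) <= THRESHOLD
                   for m, e in zip(measured_positions, expected_positions))
    if in_range:
        return "fetch", stored_references   # fast path: reuse calibration
    return "calibrate", None                # fall back to full calibration

action, refs = power_on([100.4, 50.1], [100.0, 50.0], [1.0, 2.0])
# action == "fetch": both mechanisms are within the threshold
action, refs = power_on([110.0, 50.1], [100.0, 50.0], [1.0, 2.0])
# action == "calibrate": the first mechanism is out of range
```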

US Pat. No. 10,432,814

IMAGE FORMING APPARATUS CAPABLE OF PERFORMING A HIGH-SPEED STARTUP PROCESS IN RESPONSE TO A POWER-ON OPERATION, AND RECORDING MEDIUM

KONICA MINOLTA, INC., To...

1. An image forming apparatus comprising:
a volatile storage;
a nonvolatile storage;
a main power switch;
a hardware processor; and
a display,
wherein: the hardware processor tries to obtain, from the volatile storage, saving target information related to the image forming apparatus in a power supply continuation period from a time of a power-off operation to a time of power supply interruption, and to store the saving target information in the nonvolatile storage as first snapshot data for restoring a state at a predetermined time after firmware of the image forming apparatus is activated,
the hardware processor determines, when a power-on operation is performed in response to operation of the main power switch after the time of the power-off operation, whether to perform a first high-speed startup process using the first snapshot data as an apparatus startup process with respect to the image forming apparatus, and
when a determination is made to perform the first high-speed startup process using the first snapshot data, the hardware processor causes the display to display, in a period in which a hardware initialization process in response to the power-on operation is being performed or immediately after the hardware initialization process is completed, an advance notice screen to be displayed at a predetermined time before completion of startup in response to the power-on operation, the advance notice screen including at least one of a logo and a message and giving an advance notice that a transition from a power-off state to a user operable state following the power-on operation will be completed.

US Pat. No. 10,432,813

IMAGE READING DEVICE THAT READS DOCUMENT IMAGE

KYOCERA Document Solution...

1. An image reading device comprising:
a contact plate on which a document to be read is placed;
a box-shaped frame that supports the contact plate;
a carriage that includes a reading mechanism extending in a main-scanning direction and reciprocally moves in a sub-scanning direction at a side of a bottom surface of the contact plate in the frame, the bottom surface being an opposite side from a top surface on which the document is placed;
a flexible flat cable that transmits an electric signal of the reading mechanism, the flexible flat cable being connected to one side surface of the carriage so that a width direction thereof matches the main-scanning direction, and being arranged so that a part continued from a portion where the flexible flat cable connects with the one side surface is curved in a U-shape to go around toward an underside of the carriage; and
a cable guide unit that guides a U-shaped curved portion of the flexible flat cable and avoids the flexible flat cable from coming into contact with the contact plate while reciprocally moving in the sub-scanning direction by following the carriage at a speed slower than a moving speed of the carriage, the U-shaped curved portion moving along with a move of the carriage.

US Pat. No. 10,432,810

SCANNER AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM FOR IMAGE PROCESSING DEVICE

Brother Kogyo Kabushiki K...

1. A scanner, comprising:
a conveyer configured to sequentially convey multiple original sheets along a conveying passage,
an image sensor arranged on the conveying passage and configured to optically read the original sheet being conveyed along the conveying passage,
a detection sensor configured to detect physical information regarding the original sheet being conveyed,
a memory configured to store particular image information regarding a particular image, the particular image being an image indicated on a particular fixed object which is fixed to the original sheet in an overlapped state,
a controller configured to perform:
controlling the conveyer to convey original sheets one by one along the conveying passage;
controlling the image sensor to optically read the original sheet;
determining a detection position of the original sheet based on an output signal of the detection sensor;
obtaining target image data containing at least a partial image of the original sheet at a detection area including the detection position;
determining whether the particular fixed object including the particular image exists at the detection area of the original sheet by analyzing the target image data with use of the particular image information in the memory;
interrupting conveyance of the original sheet by the conveyer when the detection position is determined, based on an output signal of the detection sensor, to be an overlapped position and when it is detected, by analyzing the target image data, that the particular fixed object does not exist at the detection area; and
outputting image data representing an image of the original sheet when the detection position is determined, based on the output signal of the detection sensor, to be the overlapped position and when it is detected, by analyzing the target image data, that the particular fixed object exists at the detection area.

US Pat. No. 10,432,809

READING APPARATUS AND IMAGE GENERATION METHOD

Seiko Epson Corporation, ...

1. A reading apparatus configured to read an original document, the reading apparatus comprising:
a background board placed behind the original document;
a sensor configured to obtain a read image by repeatedly reading a line-image at a prescribed frequency; and
a clipping processor configured to estimate a background pixel value which is a read value of the background board and to clip an image of the original document from the read image based on the background pixel value which is estimated, the read value changing due to an increase in distance between the sensor and the background board caused by the original document, the read image being a result of reading an area including the original document and the background board.

US Pat. No. 10,432,808

USER INPUT BASED PRINT TRAY CONTROL

Hewlett-Packard Developme...

1. A printing device comprising:
a plurality of feeder print trays to hold print media to be used for printing;
a user input sensor in proximity to a corresponding feeder print tray to:
sense a user input on an outer surface of a casing of the corresponding feeder print tray;
generate user input data based on the user input; and
input tray processor circuitry communicatively coupled to the user input sensor, wherein the input tray processor circuitry is to:
correlate the user input data as selection of the corresponding feeder print tray;
determine a selection attribute corresponding to the user input data;
determine a tray control action to be initiated for the corresponding feeder print tray based on the user input data and the selection attribute; and
initiate the tray control action.

US Pat. No. 10,432,807

REMOTE POST-SCANNING WORKFLOW USING SCAN JOB IN SCAN JOB QUEUE

Xerox Corporation, Norwa...

1. A system comprising:
an application configured to control a computerized device having a computer interface controlled by the application to display a remote scan job menu for at least post-scanning processing options for unexecuted scan jobs within unexecuted scan job queues,
wherein the computerized device is in communications with a scanning device,
wherein the application is configured to control the scanning device to receive at least one of the unexecuted scan jobs in at least one of the unexecuted scan job queues,
wherein the scanning device comprises a scanner interface, wherein the scanner interface is controlled by the application to display at least one of the unexecuted scan job queues and is configured to receive selection of one of the unexecuted scan jobs from one of the unexecuted scan job queues to identify a selected scan job,
wherein the scanning device is controlled by the application to execute the selected scan job in response to the selection of one of the unexecuted scan jobs by scanning items provided to the scanning device to produce an electronic image file, and
wherein the scanning device is controlled by the application to process the electronic image file by performing the post-scanning processing options within the selected scan job on the electronic image file.

US Pat. No. 10,432,806

INFORMATION PROCESSING APPARATUS AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR SETTING FUNCTION FOR ENTITY IN REAL SPACE

FUJI XEROX CO., LTD., To...

1. An information processing apparatus comprising:
a registration unit that registers an entity and an executable function in association with each other, the entity being an entity in real space identified by sensing, the executable function being a function executable in response to the entity being identified again,
wherein the registration unit associates, as the executable function, a coordinated function with a plurality of entities in real space identified by sensing, the coordinated function being a function executable by use of the plurality of entities in response to the plurality of entities being identified again.

US Pat. No. 10,432,805

OPERATION OF A PANEL FOR A PRINTER USING A DIAL CONTROL

KYOCERA Document Solution...

1. A method for operation of a multi-function printer panel, comprising:
displaying a main menu on a display screen of the multi-function printer panel;
navigating a plurality of primary image items of the main menu displayed on the display screen responsive to a first movement of a dial control;
pressing the dial control to select a primary image item of the plurality of primary image items;
navigating a plurality of secondary image items displayed on the display screen associated with the primary image item selected responsive to a second movement of the dial control;
pressing the dial control to select a secondary image item of the plurality of secondary image items;
navigating to a height arrow image displayed on the display screen;
pressing the dial control to select the height arrow image;
shifting the dial control up or down to adjust a y-axis boundary;
navigating to a width arrow image displayed on the display screen by rotating the dial control;
pressing the dial control to select the width arrow; and
shifting the dial control right or left to adjust an x-axis boundary.

US Pat. No. 10,432,804

DISPLAY CONTROL FOR AN IMAGE PROCESSING APPARATUS

Canon Kabushiki Kaisha, ...

1. An image processing apparatus having a plurality of applications for using functions of the image processing apparatus which arranges, in a first area displayed on an operation unit of the image processing apparatus, a first software key for executing an application among the plurality of applications, the image processing apparatus comprising:
at least one memory storing instructions; and
at least one processor that, upon execution of the instructions, configures the at least one processor to
display, on the operation unit, a second area where a second software key generated by executing a job corresponding to the application is arranged, wherein the second software key is for re-executing the application by a first user operation according to a setting content of the executed job; and
display, on the operation unit, an item for arranging, in the first area, the first software key for executing a job according to a setting content of the second software key, and a menu screen including the item when the second software key is selected by a second user operation different from the first user operation.

US Pat. No. 10,432,803

IMAGE FORMATION SYSTEM INCLUDING ENCODED IMAGE GENERATION DEVICE AND IMAGE FORMATION DEVICE

RISO KAGAKU CORPORATION, ...

1. An image formation system comprising:
an encoded image generation device including a first processor that performs:
generating print data from manuscript data;
setting security information for controlling a print mode of the print data; and
generating an encoded image from the print data and the security information; and
an image formation device including a second processor that performs:
inputting the encoded image;
decoding the print data and the security information from the input encoded image;
determining whether an output of the decoded print data is available in accordance with the decoded security information; and
outputting the decoded print data when it is determined that the output of the print data is available,
wherein:
the security information includes manuscript identification information that identifies the manuscript data, and number-of-times-of-printing threshold information that sets a threshold of a number of times of printing of the manuscript data, and
the second processor performs:
recording print history information including the manuscript identification information included in the decoded security information and information relating to an accumulated number of outputs of the decoded print data, every time the decoded print data is output; and
determining that the output of the print data is available, when the print history information that corresponds to the manuscript identification information included in the decoded security information has not been recorded, or when the print history information that corresponds to the manuscript identification information has been recorded, and the information relating to the accumulated number of outputs included in the print history information is compared with the number-of-times-of-printing threshold information included in the decoded security information so as to find out that printing is available.
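The availability determination in the last limitation reduces to a simple check of recorded print history against the number-of-times-of-printing threshold. A minimal sketch, illustrative only and not the patented implementation (the names `history`, `manuscript_id`, and `threshold` are assumptions):

```python
def output_available(history, manuscript_id, threshold):
    """Decide whether decoded print data may be output.

    `history` maps manuscript identification information to the
    accumulated number of outputs of the decoded print data.
    """
    if manuscript_id not in history:
        return True  # no print history recorded: output is available
    # History exists: output is available only while the accumulated
    # number of outputs is below the printing threshold.
    return history[manuscript_id] < threshold
```

The threshold comparison is strict here; whether the limit is inclusive is a design choice the claim leaves open.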

US Pat. No. 10,432,802

TERMINAL DEVICE, AND NON-TRANSITORY COMPUTER READABLE MEDIUM STORING PROGRAM FOR TERMINAL DEVICE

FUJI XEROX CO., LTD., To...

1. A terminal device comprising:
a photographing apparatus configured to capture a photograph in a specific direction in which a data processing apparatus is installed;
a display configured to display a request screen for designating the data processing apparatus to process data, the request screen being configured to display the photograph captured by the photographing apparatus and at least one of:
(i) data images indicating data that is requested to be processed, and
(ii) when an image of the data processing apparatus arranged in the specific direction is not captured in the photograph due to an obstacle positioned between the photographing apparatus and the data processing apparatus, a processing apparatus image indicating the data processing apparatus acquired over a network, the processing apparatus image being a graphical representation of a silhouette of the data processing apparatus, the processing apparatus being superimposed on the photograph captured by the photographing apparatus such that the processing apparatus image and the photograph are entirely visible except a portion of the photograph covered by the silhouette of the data processing apparatus; and
a control unit configured to:
acquire position information of the data processing apparatus;
acquire the processing apparatus image indicating the data processing apparatus over the network;
acquire an address of the data processing apparatus on the network; and
in response to an operation designating the processing apparatus image and one of the data images on the request screen, transmit a request for processing the data indicated by the designated data image to the data processing apparatus indicated by the designated processing apparatus image.

US Pat. No. 10,432,801

DEVICE MANAGEMENT SYSTEM, DEVICE MANAGEMENT METHOD, AND RECORDING MEDIUM

Ricoh Company, Ltd., Tok...

1. A device management system for communicating with a relay device connected with one or more devices in a local network via a firewall, the device management system comprising:
circuitry configured to:
receive status information indicating a status of the relay device from the relay device,
based on a determination that the received status information satisfies a predetermined condition, obtain instruction information associated with the predetermined condition, the instruction information indicating a predetermined process to be executed by the relay device, the predetermined process corresponding to a reboot process when the status information of the relay device indicates the relay device is low on memory, and
transmit the obtained instruction information to the relay device to cause the relay device to execute the predetermined process, which corresponds to the reboot process when the status information of the relay device indicates the relay device is low on memory.

US Pat. No. 10,432,800

APPARATUS AND METHOD FOR MANAGING THREE-DIMENSIONAL PRINTING

ELECTRONICS AND TELECOMMU...

1. A method of managing three-dimensional (3D) printing, the method comprising:
receiving a video of a product being output from a 3D printer;
acquiring first output information by comparing a first frame of the video and a second frame subsequent to the first frame;
acquiring second output information by extracting output layer-specific trace information from a G-code of the product being output acquired from the 3D printer; and
acquiring quality information of the product being output based on the first output information and the second output information,
wherein the acquiring of the first output information by comparing the first frame of the video and the second frame subsequent to the first frame comprises acquiring first output information by calculating an area change rate of the product being output with respect to a heating bed between a first frame and a second frame of a video looking down from an upper end of the 3D printer on the product being output.
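The first output information in the last limitation is an area change rate computed between two top-down frames. A minimal sketch under assumed inputs (frames as 2D lists of grayscale intensities, with darker pixels taken to be the product against a brighter heating bed; the threshold rule and function names are illustrative assumptions):

```python
def frame_area(frame, threshold=128):
    """Count pixels classified as product in a top-down grayscale frame.

    Pixels below `threshold` are treated as the product being output,
    against a brighter heating bed (an illustrative assumption).
    """
    return sum(1 for row in frame for px in row if px < threshold)

def area_change_rate(first_frame, second_frame, threshold=128):
    """Rate of change of the product's area between two video frames."""
    a1 = frame_area(first_frame, threshold)
    a2 = frame_area(second_frame, threshold)
    if a1 == 0:
        return float("inf") if a2 else 0.0
    return (a2 - a1) / a1
```

Comparing the rate against the layer trace extracted from the G-code would then yield the claimed quality information.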

US Pat. No. 10,432,799

RING AND TEXT TONE NOTIFIER (RTTN)

1. A system of notification of a Call and Text by the “Ring and Text Tone Notifier” (RTTN) to emulate by duplicating the ring and text tone received from a mobile device, comprising:
a two-prong adapter for plugging into any standard 110-volt power outlet;
an on/off power switch to turn the RTTN on/off;
Bluetooth search capabilities module denoted by (130) that includes a Bluetooth processor (CPU) to identify one or more than one available mobile device that are ready to pair with the RTTN;
a touch screen display to allow the user to select the detected devices to pair with;
to select an audible tone, or ring, and a volume control;
a red LED and a green LED to distinguish between an incoming phone call and a text message; the green LED also turns solid green when the RTTN is plugged into a 110-volt outlet, indicating the “Ready” state of the RTTN;
a USB port input providing a means for the mobile device to recharge or transfer data;
an AC-DC Converter;
a speaker with a Piezo buzzer with a built-in amplified circuit to broadcast the selected tone or ring;
the RTTN may contain an extra 3-prong grounded outlet adaptor in order to compensate for the one already taken by plugging in the RTTN.

US Pat. No. 10,432,798

SYSTEM, METHOD, AND APPARATUS FOR SERVICE GROUPING OF USERS TO DIFFERENT SPEED TIERS FOR WIRELESS COMMUNICATION

1. A system, comprising:
a processor; and
a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations, comprising:
determining speed tier data indicative of respective speed tiers assigned to user equipment served by an access point device of a communication network, wherein the respective speed tiers specify target data rates for communication between the user equipment and the access point device; and
in response to determining that an observed data rate, associated with a first user equipment of the user equipment, is not less than a first target data rate, of the target data rates, that corresponds to a first speed tier of the speed tiers assigned to the first user equipment, assigning a first priority to a first non-guaranteed bit rate bearer that is associated with the first user equipment, wherein the first priority is lower than a second priority that is to be assigned to a second non-guaranteed bit rate bearer that is associated with a second user equipment, of the user equipment, that has been assigned a second speed tier of the speed tiers, wherein the first speed tier is higher than the second speed tier, wherein the second user equipment is determined to not have exceeded a second target data rate of the target data rates that corresponds to the second speed tier, and wherein the first non-guaranteed bit rate bearer and the second non-guaranteed bit rate bearer belong to a common quality of service class.

US Pat. No. 10,432,797

PRE-DISTORTION SYSTEM FOR CANCELLATION OF NONLINEAR DISTORTION IN MOBILE DEVICES

1. A system, comprising:
a memory that stores instructions;
a processor that executes the instructions to perform operations, the operations comprising:
canceling, via a nonlinear cancellation signal included within an acoustic signal transmitted to a device, a portion of nonlinear distortions created once a component of the device emits a linear signal, wherein the acoustic signal includes the linear signal and the nonlinear cancellation signal.

US Pat. No. 10,432,796

METHODS AND APPARATUS TO ASSIST LISTENERS IN DISTINGUISHING BETWEEN ELECTRONICALLY GENERATED BINAURAL SOUND AND PHYSICAL ENVIRONMENT SOUND

1. A method comprising:
enabling a listener to distinguish between electronically generated binaural sound and physical environment sound during a telephone call with a person by:
playing, with a wearable electronic device worn by the listener and during the telephone call, an audio alert that signifies that the electronically generated binaural sound is a voice of the person to assist the listener to distinguish between the voice of the person and the physical environment sound; and
repeating, with the wearable electronic device worn by the listener, the audio alert during the telephone call to remind the listener that the electronically generated binaural sound is the voice of the person and not the physical environment sound.

US Pat. No. 10,432,795

PERFORMING AUTOMATED EVENT SERVICES TO REGISTERED END USERS

West Corporation, Omaha,...

1. A method, comprising:
transmitting an initial event notification message to an end user communication device based on a primary communication preference of:
at least one primary communication contact preference and at least one secondary communication preference; and
at least two of a mobile device preference, a computer device preference, a voice call preference, a text message preference and an email preference;
transmitting another event notification to the end user communication device via a different communication medium, based on the at least one secondary communication preference; and
joining, from the end user communication device, an event via the different communication medium while the initial communication medium is currently being occupied via a current event in progress on the end user communication device.

US Pat. No. 10,432,794

SYSTEM AND METHOD FOR DISTRIBUTED DYNAMIC RESOURCE COMMITMENT

1. A managed resource device, configured for use in a contact center, the device comprising:
a processor; and
a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the processor to:
receive a signal corresponding to an initialization of a raise round according to a request over a shared data communications channel to a plurality of non-committed resources registered to communicate on the shared data communications channel,
determine, by hosted logic, whether the managed resource device should volunteer for an activity type during the raise round,
automatically transmit a volunteer signal over the shared data communications channel in response to determining that the managed resource device should volunteer for the type of activity according to the hosted logic indicating selected volunteering resources,
receive a message for committing the selected volunteering resources to the request, wherein the committed resources are selected for routing an activity having the activity type; and
an electronic routing device coupled to the processor for routing the activity having the activity type, to the committed resources, for handling by the committed resources.

US Pat. No. 10,432,793

SYSTEMS AND METHODS TO ENROLL USERS FOR REAL TIME COMMUNICATIONS CONNECTIONS

INGENIO, LLC., San Franc...

1. A method, comprising:
providing a web server coupled with a connection server configured to establish real time communication connections between telephonic devices, wherein the web server is configured to present information about first users of a first set of telephonic devices to second users of a second set of telephonic devices; and
in response to a user of a web browser visiting the web server:
presenting, by the web server to the web browser, a user interface that includes a plurality of questions;
determining, by a computing apparatus, personalized ranks of the information of first users of the first set of telephonic devices based on answers from the user;
selecting, by the computing apparatus, a subset of the first users based on the personalized ranks determined from the answers;
presenting, by the web server to the web browser, the information about the subset of the first users;
receiving, in the web server, a user selection of a particular one of the first users from the subset presented in the web browser; and
in response to the user selection, the connection server
establishing a real time communication connection between a telephonic device of the user and a telephonic device of the particular one of the first users.

US Pat. No. 10,432,792

SYSTEM AND METHOD OF INTEGRATING TO AN EXTERNAL SEARCH APPLICATION IN AN EMPLOYEE DESKTOP WEB CLIENT

Verint Systems UK Limited...

1. A method of integrating to an external application for an agent in a web client application, the method comprising:
logging into the web client application by the client;
starting an interaction with a client with the web client application by the agent;
searching for relevant knowledge content through a third-party integration module using a graphical user interface, wherein the third-party integration module integrates with other systems and applications outside of a current system in order to search for the relevant knowledge content, wherein the other systems and applications outside of the current system are integrated into the graphical user interface;
completing the interaction with the client using the graphical user interface with enhanced input from the search step;
completing the interaction with the client without the knowledge search if the knowledge search feature is not configured.

US Pat. No. 10,432,791

METHOD AND SYSTEM FOR A SCALABLE COMPUTER-TELEPHONY INTEGRATION SYSTEM

State Farm Mutual Automob...

1. A computer-implemented method for presenting a contact center directory in a computer-telephony integration system, the method executed by one or more processors programmed to perform the method, the method comprising:
presenting, via one or more processors, a plurality of contact center service categories to a system administrator, each contact center service category corresponding to a particular type of contact center service in the computer-telephony integration system;
for each of the plurality of contact center service categories, presenting, via the one or more processors, one or more sets of contact information for communicating with call agents assigned to the contact center service category; and
presenting to the system administrator, via the one or more processors, one or more user controls for editing at least one of: (i) the plurality of contact center service categories, or (ii) the one or more sets of contact information for communicating with the call agents assigned to the plurality of contact center service categories.

US Pat. No. 10,432,790

AUTOMATED TELEPHONE HOST SYSTEM INTERACTION

REPNOW INC., San Diego, ...

1. A system comprising:
one or more client applications executable by respective communication devices, each communication device configured to receive, from a user of the communication device, a request for a customer service call, and transmit request information associated with the request via a network; and
a server comprising one or more processors configured with processor-executable instructions to perform operations comprising:
causing a telephony service to initiate a first pre-queued call center call to a call center associated with a service provider;
determining that a live agent has answered the first pre-queued call center call;
receiving, from the one or more communication devices, the request information;
determining, based on the received request information, that the service provider corresponds to a pending request for a customer service call;
causing initiation of a user call from the telephony service to the communication device associated with the pending request; and
causing the telephony service to bridge the user call and the first pre-queued call center call such that audio can be transmitted between the communication device and the call center.

US Pat. No. 10,432,789

CLASSIFICATION OF TRANSCRIPTS BY SENTIMENT

VERINT SYSTEMS LTD., Her...

1. A method for classifying a sentiment of a dialog transcript, the method comprising:
training a lexicon, wherein the training comprises:
receiving a training set of dialog transcripts;
splitting the training set into a negative set and a non-negative set based on a seed;
identifying n-grams in the dialog transcripts;
computing, for each n-gram, a polarity score that corresponds to the likelihood of the n-gram having either a negative or a non-negative sentiment, wherein the computing the polarity score for a particular n-gram comprises comparing the frequency of the particular n-gram in the negative set to the frequency of the particular n-gram in the non-negative set;
identifying prominent n-grams based on each n-gram's polarity score;
expanding the lexicon by adding the prominent n-grams, which are not already in the lexicon, to the lexicon; and
repeating the splitting, computing, identifying, and expanding for a plurality of iterations to obtain a trained lexicon, wherein the splitting for each iteration uses the expanded lexicon from the previous iteration; and
classifying the sentiment of the dialog transcript using the trained lexicon wherein the classifying comprises:
receiving a dialog transcript;
selecting an utterance in the dialog transcript;
identifying n-grams in the utterance;
obtaining a polarity score for each n-gram using the trained lexicon;
determining the utterance is negative or non-negative based, at least, on the polarity scores for each n-gram;
repeating the selecting, identifying, obtaining, and determining for other utterances in the dialog transcript; and
distinguishing the sentiment of the dialog transcript as negative or non-negative based on the negative or non-negative utterances determined in the dialog transcript.
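One training iteration of the claimed lexicon can be sketched by comparing each n-gram's frequency in the negative set against its frequency in the non-negative set. The Laplace smoothing and the ratio-based polarity score below are illustrative choices, as are all names; the claim only requires that the two frequencies be compared:

```python
from collections import Counter

def ngrams(tokens, n=2):
    """All n-grams (as tuples) of a token list, for sizes 1..n."""
    out = []
    for k in range(1, n + 1):
        out += [tuple(tokens[i:i + k]) for i in range(len(tokens) - k + 1)]
    return out

def polarity_scores(neg_set, non_neg_set, n=2):
    """Score each n-gram by its relative frequency in the negative vs.
    non-negative transcript sets; scores above 1 lean negative."""
    neg = Counter(g for t in neg_set for g in ngrams(t.split(), n))
    non = Counter(g for t in non_neg_set for g in ngrams(t.split(), n))
    scores = {}
    for g in set(neg) | set(non):
        # Laplace-smoothed frequency ratio (an illustrative polarity score).
        p_neg = (neg[g] + 1) / (sum(neg.values()) + 1)
        p_non = (non[g] + 1) / (sum(non.values()) + 1)
        scores[g] = p_neg / p_non
    return scores

def prominent(scores, hi=2.0, lo=0.5):
    """N-grams whose polarity is pronounced in either direction."""
    return {g: s for g, s in scores.items() if s >= hi or s <= lo}
```

Adding the prominent n-grams to the lexicon and re-splitting the training set with the expanded lexicon gives the claimed iteration loop.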

US Pat. No. 10,432,788

COACHING IN AN AUTOMATED COMMUNICATION LINK ESTABLISHMENT AND MANAGEMENT SYSTEM

ConnectAndSell, Inc., Lo...

1. A system for a closed loop process in connection with an automatic calling system, comprising:
a communication link establishment and management system configured to selectively establish a communication channel between a user and a target, the communication link establishment and management system having a data store for storing contextual lead information associated with targets;
the communication link establishment and management system being programmed to define a market signal;
the communication link establishment and management system being programmed to generate a query based on the market signal;
the communication link establishment and management system being programmed to execute the query on a website of a company to identify a publication of the market signal for the company;
the communication link establishment and management system being programmed to select a plurality of targets based on the market signal;
the communication link establishment and management system being operative to initiate a plurality of outbound calling attempts to establish communication with the plurality of targets, and to store information including the market signal associated with each target;
the communication link establishment and management system being configured to automatically assign an agent to an outbound calling attempt of the plurality of outbound calling attempts, and connect the agent to a call resulting from the outbound calling attempt in response to the outbound calling attempt being answered;
the communication link establishment and management system being configured to transfer the call to the user in response to the call being connected with the target associated with the outbound calling attempt, wherein the user and the agent are different;
the communication link establishment and management system being programmed to automatically interface with a customer relationship management system to transfer said information including the market signal to the customer relationship management system to update a database associated with the customer relationship management system;
the communication link establishment and management system being programmed to score a completed call in the plurality of outbound calling attempts for the user;
the communication link establishment and management system being programmed to identify a weakness of the user based on the score;
the communication link establishment and management system being programmed to determine that the user is idle between the first completed call and a subsequent call;
the communication link establishment and management system being programmed to provide video training content in response to the user being idle between the first completed call and the subsequent call, wherein the video training content is provided within the communication link establishment and management system to the user in real time based on the weakness; and
the communication link establishment and management system being programmed to provide a field for notes for future conversations on a screen simultaneously with the video training content.

US Pat. No. 10,432,787

TELECOMMUNICATION SYSTEM AND METHOD FOR FLEXIBLE CONTROL OF THE TELECOMMUNICATION SYSTEM USING A SWITCHING COMMAND ISSUED BY AN APPLICATION TO A PLATFORM

1. A telecommunication apparatus for flexible execution of a switching command, comprising:a computer device that is configured to execute switching commands received from a terminal, the computer device comprising a non-transitory computer readable medium communicatively connected to at least one processor;
the computer device configured to execute the switching commands received from the terminal via an application being run on the terminal; and
the computer device configured to test the switching commands on a case-by-case basis to decide whether a first execution mode or a second execution mode should be utilized for execution of the switching command before the executing of the switching command;
wherein the first execution mode is a mode in which the switching command is executed simultaneous with switching and verification and the second execution mode is a mode in which the switching command is executed immediately and without verification.

US Pat. No. 10,432,786

TELEPHONE NUMBER SELECTION

TEXTNOW, INC., Waterloo,...

9. A method of assigning a telephone number to a user account in a communications system, the method comprising:obtaining a mnemonic seed;
obtaining a subset of telephone numbers filtered from a global telephone number pool based on the mnemonic seed, wherein each alphanumeric representation of respective telephone numbers in the subset is within a threshold Hamming distance of the mnemonic seed;
receiving an indication of a selected telephone number from the subset; and
assigning the selected telephone number to the user account.
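The filtering step of this claim can be sketched as follows. This is a minimal illustration only: the claim compares alphanumeric representations of numbers to the seed, while this sketch performs the equivalent comparison in digit space by first mapping the mnemonic seed through a standard telephone keypad; the keypad mapping, function names, and threshold value are all assumptions not fixed by the claim.

```python
# Illustrative sketch: filter a telephone number pool by Hamming
# distance to a mnemonic seed. The keypad letter mapping, names,
# and threshold are assumptions; the claim does not specify them.

KEYPAD = {  # conventional telephone keypad mapping (assumed)
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}
LETTER_TO_DIGIT = {c: d for d, letters in KEYPAD.items() for c in letters}

def to_digits(mnemonic: str) -> str:
    """Map a mnemonic seed to its keypad-digit representation."""
    return ''.join(LETTER_TO_DIGIT.get(c.lower(), c) for c in mnemonic)

def hamming(a: str, b: str) -> int:
    """Count positional mismatches; defined only for equal-length strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def filter_pool(pool, mnemonic, threshold=2):
    """Return the subset of the pool within the threshold Hamming distance."""
    seed_digits = to_digits(mnemonic)
    return [n for n in pool
            if len(n) == len(seed_digits)
            and hamming(n, seed_digits) <= threshold]

pool = ['5551234', '2284377', '2285377', '9998888']
subset = filter_pool(pool, 'catherp', threshold=2)  # -> ['2284377', '2285377']
```

The user would then pick one number from `subset`, and the system would assign it to the account, completing the receiving and assigning steps of the claim.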

US Pat. No. 10,432,785

OPTIMIZED SYSTEM AND METHOD FOR NOTIFYING A CALLED PARTY OF A CALL ATTEMPT

COMVIVA TECHNOLOGIES LTD....

1. A method of notifying a called party of a call attempt by a calling party, the method comprising:receiving a communication request on occurrence of a nonstandard trigger event, the nonstandard trigger event being one of protocol error, internal error and calling party insufficient balance;
processing the communication request, wherein the processing comprises steps of:
monitoring number of call attempts performed by the calling party;
defining a threshold duration to act as a delay in sending a notification to the called party;
defining a threshold count for the number of call attempts by the calling party during the threshold duration;
monitoring availability of the called party; and
sending the notification to the called party about the call attempt after completion of a threshold duration and upon determining the called party is available; or
sending the notification to the called party upon determining communication requests exceeds the threshold count during the threshold duration and upon determining the called party is available.
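The two alternative notification branches of this claim can be sketched as a single decision function. This is an illustrative sketch only: the function name, parameter names, and the default threshold values are assumptions, as the claim leaves the thresholds to be defined by the method.

```python
# Illustrative sketch of the two notification conditions:
#   1) notify after the threshold duration elapses, or
#   2) notify earlier if attempts exceed the threshold count
#      within that duration,
# in either case only if the called party is available.
# Names and default threshold values are assumptions.

def should_notify(attempts: int, elapsed_s: float,
                  called_party_available: bool,
                  threshold_count: int = 3,
                  threshold_duration_s: float = 60.0) -> bool:
    if not called_party_available:
        return False
    # Branch 1: the delay (threshold duration) has fully elapsed.
    if elapsed_s >= threshold_duration_s:
        return True
    # Branch 2: attempts exceeded the threshold count within the window.
    return attempts > threshold_count
```

Monitoring of call attempts and called-party availability would feed this check each time a nonstandard trigger event produces a communication request.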

US Pat. No. 10,432,784

SYSTEM AND METHOD FOR EFFECTUATING REAL-TIME SHAPED DATA TRANSFER DURING CALL SETUP PROCEDURE IN A TELECOMMUNICATION NETWORK

1. A method for providing a telecommunication service conveying data from a called party telecommunication device to a calling party telecommunication device during a call setup procedure comprising the steps of:said calling party telecommunication device placing a call to said called party telecommunication device over an operator network to create a connection therebetween,
receiving a rejection of the call from the called party telecommunication device and disconnecting the connection to the called party telecommunication device while keeping the connection to the calling party telecommunication device intact, wherein disconnecting the connection ends limited use of the called party telecommunication device experienced while the called party telecommunication device is ringing,
forwarding the call to a dynamic player by the operator network within the call setup procedure,
triggering the dynamic player, after the connection to the called party telecommunication device is disconnected and in response to receipt of the rejection, to communicate dynamic tone selection options to the called party and receiving a selection in a manner that the called party telecommunication device presents and inputs data during the call setup procedure after the connection to the called party telecommunication device is disconnected and the call is listed on a missed or rejected call list of the called party telecommunication device,
presenting, by the dynamic player a dynamic tone selection of the called party to the calling party in a manner that, within the call setup procedure, a real time determined dynamic tone is played to said calling party by said dynamic player via said calling party telecommunication device, where the dynamic tone selection options include dynamic busy tone selection options.

US Pat. No. 10,432,783

METHOD FOR HANDLING A CALL, SOFTWARE PRODUCT, AND DEVICE

INCONTACT, INC., Sandy, ...

1. An apparatus for handling a call that is moveable within a system between at least one unmonitored domain of the system and at least one monitored domain of the system, said apparatus comprising:a first communication device having a processor and non-transitory memory, wherein the first communication device is within a monitored domain of the system; and
the first communication device configured to determine a call context of a first call upon first arrival of said first call into said system and generate a first unique number to be temporarily used for a first movement of said first call between an unmonitored domain of the system and the monitored domain of the system based on said call context, the first unique number generated to avoid use of the first unique number in a move of any other call within the system at a same time as the first movement of the first call.

US Pat. No. 10,432,782

COMMUNICATION MODULE

1. A communication module, comprising:a communication circuit having a communication interface for transmitting data on an electrical data line;
an overvoltage protection module which is integrated into the communication module in a pluggable manner for protecting the communication circuit from overvoltage on the electrical data line;
a data interface for transferring data between the overvoltage protection module and the communication circuit; and
wherein the overvoltage protection module is configured to transmit a status of the overvoltage protection module to the communication circuit during operation, wherein the communication circuit comprises a telecommunication interface for transmitting the status of the overvoltage protection module to an external data network, and wherein the communication module is configured to automatically send an alarm over the external data network indicating that the overvoltage protection module is damaged.

US Pat. No. 10,432,781

SYSTEMS AND METHODS FOR PRESENTING CONTENT BASED ON USER BEHAVIOR

Massachusetts Mutual Life...

1. A computer-implemented method comprising:periodically monitoring, by a server via a GPS sensor in communication with an electronic device, a speed of the electronic device;
when the speed of the electronic device satisfies a speed threshold, activating, by the server, a tracking sensor associated with the electronic device, the tracking sensor being configured to monitor behavior data of a user operating the electronic device;
periodically retrieving, by the server via the activated tracking sensor, behavior data associated with the user;
when the behavior data satisfies a behavior data threshold indicating that the user is facing a display of the electronic device and the speed of the electronic device satisfies the speed threshold:
dynamically generating, by the server, modified electronic content data configured to be presented by the electronic device, the modified electronic content data corresponding to the electronic content data, and
transmitting, by the server, the modified electronic content data to the electronic device;
when at least one of the behavior data fails to satisfy the behavior data threshold and the speed associated with the electronic device fails to satisfy the speed threshold:
resuming, by the server, presentation of the electronic content data on the electronic device.
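The content-gating condition of this claim can be sketched as follows. This is a minimal illustration: the claim does not fix the threshold values or how the "facing the display" behavior data is scored, so the names and constants below are assumptions.

```python
# Illustrative sketch: serve modified content only when BOTH the
# device speed and the user-facing behavior signal satisfy their
# thresholds; otherwise resume the original content.
# Threshold values and the facing-score scale are assumptions.

SPEED_THRESHOLD_MPH = 5.0      # assumed value
FACING_SCORE_THRESHOLD = 0.8   # assumed 0..1 "facing display" score

def select_content(speed_mph: float, facing_score: float) -> str:
    """Return which content stream the server should transmit."""
    if speed_mph >= SPEED_THRESHOLD_MPH and facing_score >= FACING_SCORE_THRESHOLD:
        return 'modified'
    return 'original'
```

In the claimed flow, the server would only activate the tracking sensor (and thus obtain `facing_score`) after the speed threshold is first satisfied; the sketch collapses that sequencing into a single check.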

US Pat. No. 10,432,780

AUTOMATIC VEHICLE OPERATOR DETECTION FOR IN-VEHICLE RISK-FREE MOBILE DEVICE USE

1. A wireless terminal having a user interface comprising:a cellular transceiver configured to communicate with a cellular network and to receive therefrom an incoming text message from a calling party;
a personal area network (PAN) transceiver configured to communicate with a vehicle PAN transceiver associated with a vehicle; and
a controller configured to receive a message from the vehicle PAN transceiver indicating that the vehicle has been shifted out of PARK and, in response, to place the wireless terminal in a drive mode, wherein, during drive mode, the controller detects receipt of the incoming text message by the cellular transceiver and is further configured to prevent the text message from being displayed to the driver on the user interface of the wireless terminal, and wherein the controller is further configured to determine if the calling party that sent the incoming text message is listed in a Contacts list associated with the wireless terminal, in response to a determination that the calling party is listed in the Contacts list, to at least one of:
notify the driver that a text message has been received from the calling party; or
display a message on a user interface of the vehicle that a text message has been received from the calling party.

US Pat. No. 10,432,779

COMMUNICATION SESSION MODIFICATIONS BASED ON A PROXIMITY CONTEXT

Motorola Mobility LLC, C...

9. A local communication device comprising:one or more sensors;
one or more processors; and
one or more processor-executable instructions that, responsive to execution by the one or more processors, enable the communication device to perform operations comprising:
establishing, at the local communication device, a communication session with a remote communication device, the local communication device being capable of operating in a private mode or a speaker mode;
determining the local communication device is operating in the private mode while the communication session is in progress;
determining, using the local communication device and while the communication session is in progress, a proximity context associated with an area surrounding the local communication device to identify a presence of a non-call participant;
analyzing the proximity context to determine an identity of the non-call participant;
in response to determining the identity of the non-call participant, scanning communications from the remote computing device for a keyword associated with the identity; and
in response to identifying the keyword in the communications from the remote computing device, automatically, and without human intervention, forwarding an audible alert to the remote computing device that indicates that the communication session is not private and indicates the presence of the non-call participant.

US Pat. No. 10,432,778

PORTABLE COMMUNICATIONS DEVICES

ELLIPTIC LABORATORIES AS,...

1. A portable communications device comprising:a) an interactive touchscreen display having a transparent outer surface member occupying substantially all of a front surface of said communications device;
b) a casing providing a rear surface of said communications device;
c) an audible sound transmitter;
d) an audible sound receiver arranged so as to receive vocal sounds produced by a user when the device is placed against the user's head;
e) an ultrasonic transmitter, separate from said audible sound transmitter;
f) an elongate aperture having a minimum dimension less than 100 μm located between said transparent outer surface member and said casing;
g) a channel connecting said elongate aperture and said ultrasonic transmitter so as to permit ultrasonic signals to pass out of the elongate aperture.

US Pat. No. 10,432,776

MANAGING UNANSWERED DIGITAL COMMUNICATIONS

PROJECT AFFINITY, INC., ...

1. A computer-implemented method of managing unanswered digital communications, comprising:identifying, by a processor, a list of digital communications associated with a user account as expecting a response or a follow-up;
determining, by the processor, a group of unanswered digital communications, from the list of digital communications, for which a response or a follow-up is still expected;
assigning priority values to the group of unanswered digital communications,
the priority values indicating an order of responding to the group of unanswered digital communications,
the assigning comprising prioritizing an unanswered digital communication sent to the user account over an unanswered digital communication sent from the user account;
transmitting a notification regarding one or more of the group of unanswered digital communications to the user account, the notification including information related to the priority values.
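The priority-assignment step of this claim can be sketched as follows. This is an illustrative sketch only: the data fields and the tie-break by age are assumptions; the claim requires only that communications sent to the user account be prioritized over those sent from it.

```python
# Illustrative sketch: assign priority values (1 = respond first) to
# unanswered communications, ranking inbound (sent to the user account)
# above outbound (sent from it). Field names and the age tie-break
# are assumptions.

from dataclasses import dataclass

@dataclass
class Communication:
    id: str
    direction: str   # 'inbound' (to user account) or 'outbound' (from it)
    age_days: int    # days the communication has gone unanswered

def assign_priorities(unanswered):
    """Return {communication id: priority value}."""
    ordered = sorted(
        unanswered,
        key=lambda c: (0 if c.direction == 'inbound' else 1,  # inbound first
                       -c.age_days),                          # then oldest first
    )
    return {c.id: rank for rank, c in enumerate(ordered, start=1)}
```

A notification to the user account would then carry the top-ranked entries of this mapping, as the final transmitting step requires.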

US Pat. No. 10,432,775

EMERGENCY CALL CIRCUIT FOR ELECTRONIC DEVICE

Wistron Corporation, New...

1. An emergency call circuit for an electronic device, comprising:at least one dial button configured to generate at least one dial signal according to at least one dial input signal, wherein the at least one dial signal corresponds to at least one emergency call;
a wireless communication module coupled to the at least one dial button and a power source circuit of the electronic device, and configured to dial the at least one emergency call according to the at least one dial signal when the electronic device is shut down;
an emergency call button configured to generate an emergency call detecting signal according to an emergency call input signal, wherein when a user presses the emergency call button, the emergency call input signal is generated to activate an emergency call function of the electronic device to shut down the electronic device and keep the wireless communication module turning on; and
a logic circuit coupled to the emergency call button, the wireless communication module and the power source circuit, and configured to transmit the emergency call detecting signal to the wireless communication module and the power source circuit, and receive a first power source according to the emergency call detecting signal;
wherein the power source circuit supplies the first power source to the emergency call circuit and the wireless communication module when the electronic device is shut down;
wherein the logic circuit comprises:
a first transistor comprising a source coupled to the first power source, a gate coupled to a first node, and a drain coupled to a second node, wherein the first transistor is a P-type MOSFET (Metal-Oxide-Semiconductor Field-Effect Transistor);
a first diode comprising an anode coupled to the first node, and a cathode coupled to the third node;
a second diode comprising an anode coupled to a fourth node, and a cathode coupled to the third node;
a third diode comprising an anode coupled to the power source circuit, and a cathode coupled to the wireless communication module;
a fourth diode comprising an anode coupled to the second node, and a cathode coupled to the wireless communication module;
a first resistor coupled between the first power source and the first node;
a second resistor coupled between the first power source and the third node;
a third resistor coupled between a fifth node and a ground;
a fourth resistor coupled between the second node and a light emitting diode;
the light emitting diode, comprising an anode coupled to the fourth resistor, and a cathode coupled to the wireless communication module; and
a second transistor comprising a source coupled to the third resistor and the ground, a drain coupled to the first node, and a gate coupled to the fifth node, wherein the second transistor is an N-type MOSFET.

US Pat. No. 10,432,774

PERSONAL ALARM SYSTEM AND METHOD

Sfara, Inc., Hoboken, NJ...

1. A communication device comprising:an arming component operable to generate an arming signal based on arming activation at an arming time;
an activation component operable to generate an activation signal based on an activation at an activation time;
a parameter establishing component operable to establish a safety parameter and to generate a safety signal based on a determination of the safety parameter;
a communication component operable to transmit a data communication;
a disarming component operable to generate a disarming signal based on a disarming activation at a disarming time;
a memory having a parameter signature stored therein; and
a comparator,
wherein said communication component is operable to transmit the data communication based on the safety signal,
wherein said communication component is operable to transmit the data communication based on the activation signal,
wherein said communication component is operable to prevent transmission of the data communication based on the disarming signal,
wherein said parameter establishing component is operable to generate a second parameter signature based on a detected parameter,
wherein said comparator is operable to generate the activation signal based on a comparison of the second parameter signature and the parameter signature, and
wherein the detected parameter comprises at least one of a sound and an acceleration.

US Pat. No. 10,432,773

WIRELESS AUDIO TRANSCEIVERS

BESTECHNIC (SHANGHAI) CO....

1. A wireless audio system comprising:a primary wireless audio transceiver comprising:
a first radio frequency module configured to transmit audio information at a first frequency; and
a second radio frequency module configured to transmit the audio information at a second frequency; and
a secondary wireless audio transceiver comprising:
a third radio frequency module configured to receive the audio information at the first frequency; and
a fourth radio frequency module configured to receive the audio information at the second frequency;
wherein the first radio frequency module of the primary wireless audio transceiver transmits the audio information to the third radio frequency module of the secondary wireless audio transceiver, and the second radio frequency module of the primary wireless audio transceiver transmits the audio information to the fourth radio frequency module of the secondary wireless audio transceiver; and
at least one of the primary wireless transceiver and the secondary wireless transceiver comprises a control module configured to:
determine that a criterion corresponding to switching a transmission frequency at which the audio information is transmitted from the primary wireless audio transceiver to the secondary wireless audio transceiver between the first frequency and the second frequency is met; and
in response to the determination, switch the transmission frequency between the first frequency and the second frequency at a prearranged time slot based on a prearranged agreement.

US Pat. No. 10,432,772

DUAL-MODE EYEGLASSES

1. An entertainment system comprising:wearable dual-mode eyeglasses including a translucent display, a low latency transmitter that is communicatively coupled to an entertainment console, a high latency transceiver that is communicatively coupled to the entertainment console, and a processing module, wherein the display, the low latency transmitter, the high latency transceiver, and processing module are integrated into the dual-mode eyeglasses, and wherein the entertainment console is a gaming console, personal computer, or a set top box communicatively coupled to a display and wherein the eyeglasses are configured to receive video data from the console via the high latency transceiver, the processing module configured to:
transmit game control messages to the entertainment console via the low latency transmitter; and
communicate game data with the entertainment console via the high latency transceiver;
wherein a target or menu item is selected by translating and transmitting head and/or eye movements via the low latency transmitter to the entertainment console and wherein game data transferred to the eyeglasses via the high latency transceiver is displayed on a secondary display of the eyeglasses.

US Pat. No. 10,432,771

COMMUNICATION DEVICE, COMMUNICATION METHOD, AND STORAGE MEDIUM

CASIO COMPUTER CO., LTD.,...

1. A device capable of wireless communication comprising:a communicator configured to transmit and receive communication packets to and from other device;
a memory configured to store one or more kinds of setting information; and
a processor configured to determine whether or not at least one of the one or more kinds of setting information has been changed after the last communication with the other device, generate discrimination information indicating whether or not each of the one or more kinds of setting information has been changed, and control communication with the other device based on the discrimination information, wherein the discrimination information is a bitmap, and a number of bits of the bitmap is the same as the number of the kinds of the setting information and wherein each bit of the bitmap is set to one in the case that one kind of setting information corresponding to the bit has been changed after the last communication with the other device, and set to zero in the case that the one kind of setting information has not been changed after the last communication with the other device.

US Pat. No. 10,432,770

METHOD FOR PROVIDING MULTI-FUNCTION BACK COVER TO MOBILE TERMINAL AND MOBILE TERMINAL THEREOF

JRD COMMUNICATION (SHENZH...

1. A method for providing a multi-function back cover for a mobile terminal, comprising:providing a plurality of test points on a PCB of a main body of the mobile terminal in advance;
providing a plurality of pogo pins at positions corresponding to the test points on the back cover of the mobile terminal; and
installing the back cover of the mobile terminal on the main body of the mobile terminal, such that the pogo pins connect with the test points to provide a corresponding function of the back cover of the mobile terminal;
wherein the main body of the mobile terminal is provided with a first battery management chip or a baseband chip;
wherein the plurality of test points at least comprise a first identification pin and a second identification pin, and levels of the first identification pin and the second identification pin are detected so as to determine one or more functions of the back cover of the mobile terminal installed on the main body of the mobile terminal according to a result of the determination;
wherein the providing the plurality of test points on the PCB of the main body of the mobile terminal in advance comprises:
providing twelve test points on the PCB of the main body of the mobile terminal in advance;
wherein the twelve test points comprise the first identification pin, the second identification pin, a detection pin, a ground pin, a charge input pin, a charge output pin, a power supply pin, a first I2C control pin, a second I2C control pin, a left audio input pin, a right audio input pin, and a reset pin.

US Pat. No. 10,432,769

ASSEMBLY, SYSTEM AND METHOD FOR REVERSIBLY COMBINING HANDHELD ARTICLES

1. An assembly, of components, adapted for combining two handheld articles, a first handheld article and a second handheld article, each chosen from at least one of the following categories; electronic devices, electrical devices, mechanical devices and items for personal use including medication, cosmetics and food products, said assembly comprising:a first receptacle means for receiving and reversibly engaging at least a portion of said first handheld article, said first receptacle means defining first and second planes and includes four sidewalls, a first two having lengths longer than a second two;
a second receptacle means for receiving and reversibly engaging at least a portion of said second handheld article, said second receptacle means defining third and fourth planes and includes four sidewalls, a first two having lengths longer than a second two;
a hinge means for hingedly joining said first and second receptacle means' along one of said first sidewalls of each of said first and second receptacle means', said hinge means adapted for rotation of said first and second receptacle means' between a first position and a second position, wherein
rotation between said first position and said second position includes 360 degrees;
said assembly components adapted for multiple configurations, wherein a first configuration comprises:
one face each of said first receptacle means and said second receptacle means oppositely exposed in the first position and the faces are now adjacently concealed in the second position upon rotation; and
wherein a second configuration comprises:
only one of said faces is exposed in the first position and the other of said faces is exposed in the second position resulting in said exposed face of said first position now being concealed upon rotation.