US Pat. No. 10,433,122

EVENT TRIGGERED MESSAGING

Vivint, Inc., Provo, UT ...

1. A method for wireless communication, comprising:receiving, via a transceiver of a camera, an electronic message at a first location;
determining, via the camera, that the electronic message comprises an audio portion;
analyzing, via the camera, the audio portion of the electronic message to determine a sender of the electronic message and information associated with an intended recipient of the electronic message;
comparing, via the camera, the determined information with information stored in memory associated with the camera;
identifying, via the camera, the intended recipient of the electronic message based at least in part on the comparing and the determined sender; and
broadcasting a portion of the electronic message based at least in part on identifying the intended recipient.
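The claim describes a camera that checks an incoming message for an audio portion, derives sender and recipient information from it, matches the recipient against contacts stored in local memory, and then broadcasts part of the message. A minimal sketch of that decision flow, assuming a hypothetical analyze_audio() helper and a small contact list (neither is specified in the claim):

# Sketch of the claimed flow: receive a message, confirm it has an audio portion,
# derive sender/recipient information, compare against stored contacts, broadcast.
# analyze_audio() is a hypothetical stand-in for the claim's audio analysis step.

STORED_CONTACTS = {"alice", "bob"}          # assumed contents of the camera's memory

def analyze_audio(audio: bytes) -> tuple[str, str]:
    """Return (sender, recipient_hint); a real device would use speech analysis."""
    return "unknown sender", "alice"        # placeholder result for illustration

def handle_message(message: dict) -> None:
    audio = message.get("audio")
    if audio is None:                        # claim: message must comprise audio
        return
    sender, recipient_hint = analyze_audio(audio)
    matches = [c for c in STORED_CONTACTS if c in recipient_hint.lower()]
    if matches:                              # comparing and identifying the recipient
        print(f"Broadcasting to {matches[0]}: message from {sender}")

handle_message({"audio": b"...", "sender": "front door"})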

US Pat. No. 10,433,121

METHOD FOR EQUIPMENT NETWORKING AND OUTPUTTING BY EQUIPMENT, AND EQUIPMENT

Beijing Xiaoniao Tingting...

1. A method for outputting by equipment, comprising:detecting, by equipment, first data characterizing a signal transmitted by wireless equipment when the wireless equipment approaches the equipment;
determining, by the equipment, whether a second pre-set condition is met by the first data characterizing the signal; and
in response to determining that the second pre-set condition is met by the first data characterizing the signal, generating, by the equipment, an output instruction, and outputting data according to the output instruction, detecting second data characterizing the signal transmitted by the wireless equipment, and determining whether a third pre-set condition is met by the second data characterizing the signal; and
in response to determining that the third pre-set condition is not met by the second data characterizing the signal, generating an end instruction, and stopping outputting the data according to the end instruction,
wherein the data characterizing the signal indicates a frequency of the signal,
wherein the determining that the second pre-set condition is met by the first data characterizing the signal comprises: determining that the second pre-set condition is met by the first data characterizing the signal in response to determining that a first frequency of the signal meets a pre-set frequency condition, and
wherein the determining that the third pre-set condition is not met by the second data characterizing the signal comprises: determining that the third pre-set condition is not met by the second data characterizing the signal in response to determining that a second frequency of the signal does not meet the pre-set frequency condition.
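Stripped of claim language, the method starts outputting when a detected signal frequency satisfies a preset condition and stops when a later measurement no longer does. A minimal sketch under that reading, with a hypothetical frequency window standing in for the preset condition:

# Start/stop output based on whether the detected signal frequency meets a
# preset condition (modelled here as a hypothetical frequency window in Hz).

FREQ_MIN, FREQ_MAX = 2_000.0, 2_500.0       # assumed pre-set frequency condition

def meets_condition(freq_hz: float) -> bool:
    return FREQ_MIN <= freq_hz <= FREQ_MAX

def run(detected_frequencies):
    outputting = False
    for freq in detected_frequencies:        # data characterizing the signal
        if not outputting and meets_condition(freq):
            outputting = True                # "output instruction"
            print(f"start output (f = {freq} Hz)")
        elif outputting and not meets_condition(freq):
            outputting = False               # "end instruction"
            print(f"stop output (f = {freq} Hz)")

run([1900.0, 2100.0, 2200.0, 2600.0])        # starts at 2100 Hz, stops at 2600 Hz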

US Pat. No. 10,433,116

SYSTEM AND METHOD FOR GENERATING AN ADDRESS INDICATOR FOR AN OBJECT BASED ON A LOCATION OF THE OBJECT

YOGEO, INC., Cupertino, ...

1. A method for associating an address for an object, comprising:providing a smart address server, with a processor and memory, the smart address server configured to:
receiving geo location for the object, the geo location indicative of the latitude and longitude of the location of the object;
converting the geo location for the object into a zone, sector and target portion to generate a ZST location indicator for the object;
appending an elevation portion to the ZST location, to generate a ZSTE location indicator for the object;
assigning an object ID for the object and appending the object ID to the ZSTE location indicator to generate a smart address for the object; and
storing the generated smart address for the object in a data store.
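The claim composes a smart address by converting latitude/longitude into zone, sector and target portions, appending an elevation portion, and then appending an object ID. The exact encoding is not given; the sketch below assumes a simple grid quantization and string layout purely for illustration:

# Compose a "smart address" as ZST + elevation + object ID. The grid sizes and
# string layout below are assumptions; the claim does not define the encoding.

def to_zst(lat: float, lon: float) -> str:
    zone   = f"Z{int(lat // 10):+03d}{int(lon // 10):+03d}"    # coarse grid
    sector = f"S{int(lat // 1) % 10}{int(lon // 1) % 10}"      # finer grid
    target = f"T{int(lat * 100) % 100:02d}{int(lon * 100) % 100:02d}"
    return f"{zone}-{sector}-{target}"

def smart_address(lat: float, lon: float, elevation_m: float, object_id: str) -> str:
    zste = f"{to_zst(lat, lon)}-E{int(elevation_m)}"           # append elevation
    return f"{zste}-{object_id}"                               # append object ID

store = {}                                                     # stand-in data store
addr = smart_address(37.3229, -122.0322, 72.0, "OBJ0001")
store["OBJ0001"] = addr
print(addr)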

US Pat. No. 10,433,109

APPARATUS, SYSTEM AND METHOD OF PERFORMING A POSITION MEASUREMENT

INTEL CORPORATION, Santa...

1. An apparatus comprising a memory, and a processor, the processor configured to cause a first Neighbor Awareness Networking (NAN) device to:identify a value of a cluster Time Synchronization Function (TSF) at a time of a last detected movement of the first NAN device prior to sending a NAN frame by the first NAN device;
include the value of the cluster TSF into a last movement indication field;
transmit the NAN frame to a second NAN device, the NAN frame comprising a ranging attribute comprising a plurality of fields, the plurality of fields comprising the last movement indication field to assist the second NAN device in a determination to perform a ranging measurement procedure with the first device; and
communicate messages of the ranging measurement procedure with the second NAN device.

US Pat. No. 10,433,108

PROACTIVE DOWNLOADING OF MAPS

Apple Inc., Cupertino, C...

1. A method comprising:receiving, by a map service of a computing device, data defining a relevant location from a user computing device;
generating, by the map service, an offline map area definition containing the relevant location, the offline map area definition being represented by a polygon with each side of the polygon being a predetermined distance from the relevant location;
sending, by the map service, the offline map area definition to the user computing device;
receiving, by the map service from the user computing device, a request for an offline map that includes an offline map area corresponding to the offline map area definition; and
sending, by the map service, the offline map based on the offline map area definition to the user computing device.
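The offline map area definition is a polygon whose sides sit a predetermined distance from the relevant location. A minimal sketch of generating such a definition as a square bounding box, using an approximate metre-to-degree conversion (the claim fixes only "a polygon", not the geometry or units):

# Build an offline map area definition: a square polygon whose sides are a
# predetermined distance from the relevant location. The metre-to-degree
# conversion below is an approximation for illustration only.

import math

def offline_map_area(lat: float, lon: float, distance_m: float = 2_000.0):
    dlat = distance_m / 111_320.0                         # ~metres per degree latitude
    dlon = distance_m / (111_320.0 * math.cos(math.radians(lat)))
    return [                                              # polygon vertices (lat, lon)
        (lat - dlat, lon - dlon),
        (lat - dlat, lon + dlon),
        (lat + dlat, lon + dlon),
        (lat + dlat, lon - dlon),
    ]

definition = offline_map_area(37.3349, -122.0090)         # relevant location
print(definition)                                         # definition sent to the user device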

US Pat. No. 10,433,103

WI-FI PROCESS

Apple Inc., Cupertino, C...

1. A method, comprising:determining a position of a wireless device based at least in part on location signals;
determining whether the position of the wireless device has changed;
in response to determining that the position of the wireless device has changed, comparing the change of the position of the wireless device with a first predetermined distance; and
in response to the change of the position of the wireless device being greater than the first predetermined distance, decreasing a period of time between successive roam scans,
wherein the first predetermined distance is based at least in part on a coverage range of an access point.
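In plain terms, if the device has moved farther than a threshold tied to the access point's coverage range, the interval between roam scans is shortened. A minimal sketch under that reading, with the scaling of the threshold and the halving step assumed for illustration:

# Shorten the interval between roam scans when the device has moved farther
# than a threshold derived from the access point's coverage range.

COVERAGE_RANGE_M = 50.0                      # assumed AP coverage range
THRESHOLD_M = 0.5 * COVERAGE_RANGE_M         # assumed relation to the coverage range

def update_scan_period(moved_m: float, scan_period_s: float,
                       min_period_s: float = 5.0) -> float:
    if moved_m > THRESHOLD_M:                # position change exceeds the distance
        return max(min_period_s, scan_period_s / 2.0)   # decrease time between scans
    return scan_period_s

period = 60.0
for moved in (3.0, 30.0, 40.0):              # metres moved since the last check
    period = update_scan_period(moved, period)
    print(f"moved {moved:>4} m -> roam-scan period {period} s")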

US Pat. No. 10,433,100

SYSTEMS AND METHODS FOR CORRECTION OF GEOGRAPHIC INCONSISTENCY IN ITEM PROPERTIES

Walmart Apollo, LLC, Ben...

1. A system comprising:a mobile device, the mobile device including:
a global positioning system (GPS) receiver,
a communication device,
a data reader,
a user interface,
a memory, and
a processor;
a secure network having an authentication server, the secure network in communication with the mobile device via a mobile network; and
a secure item database server in selective communication with the mobile device via the secure network based on authenticating by the authentication server,
wherein the memory of the mobile device includes instructions that when executed by the processor, cause the mobile device to:
determine a current geographic location of the mobile device in response to satellite broadcasts received by a global positioning system (GPS) receiver of the mobile device;
programmatically transmit, by the communication device, the current geographic location to a location service hosted on a location server in communication with the mobile device via the mobile network;
receive, from the location service, an identification of a plurality of geographic locations of interest, the geographic locations of interest being identified based, at least in part, on a distance between the current geographic location and the geographic locations of interest;
receive, via a first firewall, user-identifying information from the mobile device;
programmatically transmit, via a secure network, the user-identifying information to a threat management server;
in response to a determination that the user-identifying information is not a threat, programmatically transmit, via a second firewall, the user-identifying information to an authentication server of the secure network for authentication;
in response to receiving an authentication, transmit a message to the mobile device authorizing relocation;
detect, via the GPS receiver, a relocation of the mobile device from the current geographic location to a selected one of the geographic locations of interest,
acquire, by the data reader, item-descriptive data associated with an item at the selected one of the geographic locations of interest,
receive, from the user interface, item-property data associated with the item at the selected one of the geographic locations of interest,
connect the mobile device to the secure network by transmitting, from the communication device and via the mobile network, user-identifying information to the security authentication server of the secure network through the first firewall and the second firewall,
programmatically transmit, via the secure network and in response to receiving an authentication from the authentication server, the item-descriptive data to the secure item database server through the first firewall and the second firewall, wherein a load balancer transmits the item-descriptive data to at least one of a plurality of geographically distributed servers and the secure item database server,
receive, from the secure item database server in response to the transmission of the item-descriptive data, (i) an identification of a corresponding item located at a different geographic location from the selected one of the geographic locations of interest, and (ii) corresponding item-property data associated with the corresponding item at the different geographic location; and
programmatically determine a difference between the item-property data and the corresponding item-property data, the difference effecting a change in corresponding item-property data in the secure item database server.

US Pat. No. 10,433,096

AUDIO OBJECT MODIFICATION IN FREE-VIEWPOINT RENDERING

Nokia Technologies Oy, E...

1. An apparatus comprising:at least one processor; and
at least one non-transitory memory including computer program code, the at least one non-transitory memory and the computer program code configured to, with the at least one processor, cause the apparatus to:
define audio object metadata associated with at least one audio object, wherein the audio object metadata includes at least one instruction for rendering the at least one audio object in a free-viewpoint rendering in response to detection of a locational conflict abnormality based upon at least one predetermined abnormality modification parameter; and
transmit the audio object metadata, wherein the audio object metadata is configured to modify rendering of the at least one audio object.

US Pat. No. 10,433,092

MANIPULATION OF PLAYBACK DEVICE RESPONSE USING SIGNAL PROCESSING

Sonos, Inc., Santa Barba...

1. A method of operating a playback device, the method comprising:receiving, at the playback device, left and right channels of audio content;
generating a center channel of the audio content, wherein generating the center channel comprises combining at least a portion of the left channel and at least a portion of the right channel;
generating first and second side channels of the audio content,
wherein generating the first side channel comprises combining (i) the center channel and (ii) a difference of the left channel and the right channel, and
wherein generating the second side channel comprises combining (i) the center channel and (ii) an inverse of the difference of the left channel and the right channel;
playing back the audio content, wherein playing back the audio content comprises:
applying one or more filters to attenuate portions of the first side channel having frequencies less than a cutoff frequency;
playing back the first side channel according to a first radiation pattern having a maximum aligned with a first direction;
playing back the second side channel according to a second radiation pattern having a maximum aligned with a second direction;
amplifying the center channel in proportion to the attenuation of the first side channel; and
playing back the center channel of the audio content according to a third radiation pattern having a maximum aligned with a third direction.
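The channel derivation in the claim is simple arithmetic on the left and right inputs: a center built from both channels, a first side channel equal to center plus (L minus R), and a second side channel equal to center plus the inverse of that difference. A minimal numpy sketch, assuming the center is the plain sum L + R and using a one-pole high-pass as a stand-in for the claimed filtering (neither detail is fixed by the claim):

# Derive center and side channels as described in the claim:
#   center = combination of L and R (assumed here: L + R)
#   side1  = center + (L - R)
#   side2  = center + inverse of (L - R) = center - (L - R)
# A one-pole high-pass stands in for attenuating side1 below the cutoff frequency.

import numpy as np

def derive_channels(left: np.ndarray, right: np.ndarray,
                    fs: float = 48_000.0, cutoff_hz: float = 200.0):
    center = left + right
    diff = left - right
    side1 = center + diff
    side2 = center - diff

    rc = 1.0 / (2.0 * np.pi * cutoff_hz)        # one-pole high-pass on side1
    alpha = rc / (rc + 1.0 / fs)
    hp = np.zeros_like(side1)
    for n in range(1, len(side1)):
        hp[n] = alpha * (hp[n - 1] + side1[n] - side1[n - 1])

    center_gain = 1.0 + (1.0 - alpha)           # assumed proportional boost of the center
    return center * center_gain, hp, side2

t = np.arange(0, 0.01, 1 / 48_000.0)
L = np.sin(2 * np.pi * 100 * t)
R = np.sin(2 * np.pi * 1_000 * t)
c, s1, s2 = derive_channels(L, R)
print(c.shape, s1.shape, s2.shape)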

US Pat. No. 10,433,091

COMPATIBLE MULTI-CHANNEL CODING-DECODING

1. An audio decoder for decoding an encoded audio signal to obtain a decoded audio signal, the audio decoder comprising:an input data reader configured for reading the encoded audio signal, the encoded audio signal comprising channel side information, a left downmix channel and a right downmix channel, wherein the channel side information is calculated such that the left or the right downmix channel, when weighted using the channel side information, results in an approximation of a selected original channel, wherein the input data reader is configured to obtain the left downmix channel and the right downmix channel and the channel side information; and
a channel reconstructor configured for reconstructing the approximation of the selected original channel using the channel side information and the left downmix channel or the right downmix channel to obtain the approximation of the selected original channel,
wherein the approximation of the selected original channel represents the decoded signal and comprises at least three of an approximated left channel, an approximated left surround channel, an approximated right channel, and an approximated right surround channel,
wherein the input data include channel side information for at least three of the approximated left channel, the approximated left surround channel, the approximated right channel, and the approximated right surround channel,
wherein the channel reconstructor is operative to perform at least three of the following reconstructing operations:
reconstructing the approximated left channel using channel side information for the left channel and using the left downmix channel,
reconstructing the approximated left surround channel using channel side information for the left surround channel and using the left downmix channel,
reconstructing the approximated right channel using channel side information for the right channel and using the right downmix channel, and
reconstructing the approximated right surround channel using channel side information for the right surround channel and using the right downmix channel,
wherein the audio decoder further comprises a perceptual decoder configured for decoding the left downmix channel to obtain a decoded version of the left downmix channel and configured for decoding the right downmix channel to obtain a decoded version of the right downmix channel, wherein the perceptual decoder comprises an entropy decoder and an inverse quantizer, and
wherein at least one of the input data reader, the channel reconstructor, the perceptual decoder, the entropy decoder, and the inverse quantizer comprises a hardware implementation.

US Pat. No. 10,433,090

METHOD AND APPARATUS FOR DECODING STEREO LOUDSPEAKER SIGNALS FROM A HIGHER-ORDER AMBISONICS AUDIO SIGNAL

Dolby International AB, ...

1. A method for decoding an encoded Higher Order Ambisonics (HOA) audio signal, the method comprising:receiving the encoded HOA audio signal;
determining a decoding matrix D for loudspeakers having positions defined by azimuth angle values; and
decoding and rendering, by at least one processor, the encoded HOA audio signal based on the decoding matrix D,
wherein the decoding matrix D is based on a first matrix G and a second matrix Ξ+,
wherein the first matrix G contains desired panning function values for all virtual sampling points and is based on an order N of the encoded HOA audio signal and on the azimuth angle values and a number S of virtual sampling points on a sphere, wherein said panning function values are determined by panning functions, the panning functions include panning functions for segments on the sphere, and the panning functions for segments on the sphere include, for at least one of the loudspeakers, different panning functions for different ones of the segments,
wherein the second matrix Ξ+ is based on the number S and the order N of the encoded HOA audio signal.

US Pat. No. 10,433,087

SYSTEMS AND METHODS FOR REDUCING VIBRATION NOISE

Qualcomm Incorporated, S...

1. A method for reducing vibration noise by an electronic device, comprising:obtaining an audio signal, wherein the audio signal comprises vibration noise;
obtaining vibration data from one or more motion sensor signals, wherein a 6-axis motion sensor that produces the one or more motion sensor signals is alignment independent from a microphone that produces the audio signal;
processing the vibration data to produce microphone-response matched vibration data based on a transfer function, wherein processing the vibration data comprises adding vibration levels from a plurality of axes to produce a total microphone vibration response; and
reducing the vibration noise based on the microphone-response matched vibration data.
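The processing path is: sum the per-axis vibration levels, map the total through a transfer function to match the microphone's response, and use the matched estimate to reduce the noise. The short FIR transfer function and the plain subtraction below are assumptions; the claim only requires reducing the noise "based on" the matched data:

# Sum vibration levels across sensor axes, match them to the microphone response
# via a transfer function (modelled here as a short FIR filter), and subtract the
# matched estimate from the audio signal. Filter taps are illustrative only.

import numpy as np

def reduce_vibration_noise(audio: np.ndarray, axes: np.ndarray,
                           transfer_fir: np.ndarray) -> np.ndarray:
    total_vibration = axes.sum(axis=0)                   # add levels from all axes
    matched = np.convolve(total_vibration, transfer_fir)[: len(audio)]
    return audio - matched                               # reduce the vibration noise

fs = 8_000
t = np.arange(fs) / fs
vibration = 0.2 * np.sin(2 * np.pi * 120 * t)            # shared vibration source
audio = np.sin(2 * np.pi * 440 * t) + vibration          # microphone picks it up
axes = np.stack([0.5 * vibration, 0.3 * vibration, 0.2 * vibration,
                 np.zeros_like(t), np.zeros_like(t), np.zeros_like(t)])
cleaned = reduce_vibration_noise(audio, axes, transfer_fir=np.array([1.0]))
print(float(np.abs(cleaned - np.sin(2 * np.pi * 440 * t)).max()))   # ~0: noise removed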

US Pat. No. 10,433,085

METHOD AND APPARATUS FOR EVALUATING AUDIO DEVICE, AUDIO DEVICE AND SPEAKER DEVICE

1. An audio device comprising:an amplification device part which an audio signal from a sound source is input into and performs required processing and amplification of the audio signal; and
a speaker device which is connected to the amplification device part and the processed and amplified audio signal is input into and emits this audio signal;
wherein
the amplification device part includes a correction device that performs a processing correcting a group delay characteristic and a frequency characteristic that the audio device has;
the correction device includes an FIR digital filter, and incorporates a computer device including a program for reproducing a signal to be measured for measuring a group delay characteristic and a frequency characteristic of the audio device, detecting a reproduction output of the signal by the audio device with a microphone, analyzing the reproduction output to measure the group delay characteristic and the frequency characteristic, preparing an acoustic transfer function for a reverse correction from the measured group delay characteristic and frequency characteristic, and performing a processing for correcting the group delay characteristic and the frequency characteristic of the audio device including the speaker device by using the acoustic transfer function; and
the speaker device comprises:
a speaker box,
a speaker unit which is equipped to the speaker box, the speaker unit including a vibrating body having a front surface facing a viewing direction of the speaker device and a back surface opposing the front surface, wherein a sound emitted from the front surface of the vibrating body when the vibrating body is vibrated is regarded as a signal sound, and a sound other than the signal sound is regarded as a noise, the sound other than the signal sound including a sound which is emitted from the back surface of the vibrating body and a sound which is generated by an object contacting the speaker unit being vibrated by the vibration of the vibrating body, and
a sound absorbing member, the sound absorbing member configured to cover a portion other than the front surface of the vibrating body, and a portion of the object contacting the speaker unit, so that only the signal sound is emitted by the speaker device and the noise is not emitted by the speaker device.

US Pat. No. 10,433,083

AUDIO PROCESSING DEVICE AND METHOD OF PROVIDING INFORMATION

YAMAHA CORPORATION, Hama...

1. An audio processing device communicable with a communication device via sound waves, the audio processing device comprising:a microphone that captures first audio sound, output by a first sound emitter, as sound waves, that includes:
a first audio component of sound of guidance voice; and
a second audio component of notification sound associated with the guidance voice,
wherein the microphone outputs a first audio signal representing the captured first audio sound;
an information extractor configured to extract the second audio component, in a first frequency band, from the first audio signal;
an audio signal processor configured to generate a second audio signal representing the second audio component extracted by the information extractor, in a second frequency band, an upper limit of the second frequency band being higher than an upper limit of the first frequency band; and
a second sound emitter configured to output second audio sound, as sound waves, representing the second audio signal, while the first sound emitter is outputting, as sound waves, the first audio sound representing the first audio component, to communicate with the communication device, without disrupting recipient's ability to hear the first audio sound representing the first audio component output by the first sound emitter, as sound waves.

US Pat. No. 10,433,082

FITTING METHOD FOR A BINAURAL HEARING SYSTEM

Sonova AG, Staefa (CH)

1. A method for fitting of a binaural hearing system comprising a first hearing device and a second hearing device to a patient suffering from an asymmetric hearing loss for first time use, wherein the difference in hearing loss between the two ears is at least 5 dB on average in a main frequency range between 500 Hz and 4 kHz, the method comprising:providing audiogram data representative of the hearing loss of each of the ears of the patient,
determining, from the audiogram data, for each of the hearing devices an initial gain setting and a final gain setting,
applying, for an initial time period, the respective initial gain setting to each of the hearing devices,
applying, for an acclimatization time period, gain settings to each of the hearing devices which are gradually changed, as a function of time, from the respective initial gain setting to the respective final gain setting,
applying, after lapse of the acclimatization time period, the respective final gain setting in each of the hearing devices,
wherein the initial gain setting of the hearing device used at the ear having the stronger hearing loss is, on average in the main frequency range, lower than the respective final gain setting by an amount depending on the hearing loss of the ear having the weaker hearing loss, wherein the initial gain setting of the hearing device used at the ear having the weaker hearing loss deviates, on average in the main frequency range, from the respective final gain setting by an amount depending on the hearing loss of the ear having the stronger hearing loss, and wherein, on average in the main frequency range, the binaural difference between the initial gain settings is lower than that of the final gain settings.
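The fitting rule amounts to interpolating each device's gain from an initial to a final setting over an acclimatization period, with each initial offset depending on the opposite ear's hearing loss and a smaller binaural gain difference at the start than at the end. A minimal sketch of such a gain schedule; the specific offset formulas below are assumptions, since the claim fixes only what they depend on, not their form:

# Gain schedule for a binaural fitting: start each hearing device at an initial
# gain, then move linearly to its final gain over an acclimatization period.
# The offset formulas (how far each initial gain sits below the final gain) are
# illustrative assumptions.

def initial_gains(final_strong_db: float, final_weak_db: float,
                  loss_strong_db: float, loss_weak_db: float):
    strong_init = final_strong_db - 0.2 * loss_weak_db    # depends on the weaker-loss ear
    weak_init   = final_weak_db   - 0.1 * loss_strong_db  # depends on the stronger-loss ear
    return strong_init, weak_init

def gain_at(day: int, accl_days: int, initial_db: float, final_db: float) -> float:
    frac = min(max(day / accl_days, 0.0), 1.0)            # 0 at start, 1 after the period
    return initial_db + frac * (final_db - initial_db)

strong0, weak0 = initial_gains(final_strong_db=35.0, final_weak_db=20.0,
                               loss_strong_db=60.0, loss_weak_db=40.0)
for day in (0, 15, 30, 45):                               # 30-day acclimatization period
    print(day, round(gain_at(day, 30, strong0, 35.0), 1),
               round(gain_at(day, 30, weak0, 20.0), 1))

With these example numbers the binaural gain difference is 13 dB at the start and 15 dB after acclimatization, consistent with the claim's requirement that the initial difference be lower than the final one.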

US Pat. No. 10,433,081

CONSUMER ELECTRONICS DEVICE ADAPTED FOR HEARING LOSS COMPENSATION

JACOTI BVBA, Boechout (B...

1. Method for allowing a consumer electronics device to output a hearing loss compensated signal, the consumer electronics device comprising an operating system whereon at least one application can be run that yields a sound output signal prepared for output to a sound device without substantially being altered, comprising the steps of:providing said consumer electronics device with a first software module adapted for rerouting said sound output signal, said first software module being a virtual sound device capable of being registered and deregistered with said operating system as a currently selected sound output device, said first software module being written and installed in said operating system;
providing said consumer electronics device with a second software module adapted for receiving from said first software module said rerouted sound output signal and for performing hearing loss compensation on said rerouted sound output signal and for outputting said hearing loss compensated signal,
wherein the first software module and second software modules are separate and communicate through mechanisms of the operating system, and the operating system restricts the first software module from performing hearing loss compensation,
wherein said first software module is restricted from performing floating point operations, capable of performing integer operations only, incapable of allowing any graphical user interfaces and incapable of accessing the Internet, and
wherein said second software module is not restricted from performing floating point operations, capable of allowing graphical user interfaces and capable of accessing the Internet.

US Pat. No. 10,433,080

HEARING AID COMPRISING AN INDICATOR UNIT

1. A hearing aid comprising a housing comprising electronic components configured to generate a perceivable modulated signal in response to a received sound, the perceivable modulated signal being configured to produce a hearing sensation to a user of the hearing aid;
a light guide member arranged proximal to and/or within the housing;
a first light source configured to emit a first colored light and a second light source configured to emit a second colored light, the first light source and the second light source being positioned within the housing and arranged in relation to the light guide member such that a substantial amount of the first colored light and a substantial amount of the second colored light are adapted to travel through the light guide member; and
the light guide member comprises dimensions such that the substantial amount of the first colored light and the substantial amount of second colored light are mixed within the light guide member to generate a light of a third color.

US Pat. No. 10,433,079

HEARING PROSTHESIS ACCESSORY

Cochlear Limited, Macqua...

1. A protective sleeve for a hearing prosthesis sound processor, comprising:a flexible main body shaped to be substantially form fitting to the sound processor;
an opening disposed at an end of the main body configured to receive the sound processor; and
a flexible second component configured to mechanically mate with the opening in the main body to close the opening and seal the sound processor within the main body and to prevent the ingress of fluids through the opening.

US Pat. No. 10,433,078

EYE-MOUNTED HEARING AID

INTERNATIONAL BUSINESS MA...

1. A method of stimulating a cornea, the method comprising:providing a plurality of piezo-electric elements positioned upon an eye lens, each of the plurality of piezo-electric elements comprising a piezoelectric structure that vibrates at a different resonant frequency;
capturing a sound with a microphone;
providing a microprocessor in electrical communication with the plurality of piezo-electric elements;
identifying, with the microprocessor, a frequency in the sound;
determining, with the microprocessor, a resonant frequency associated with the frequency;
selectively stimulating a first piezo-electric element of the plurality of piezo-electric elements based on the resonant frequency, the first piezo-electric element adjacent to a receptor of the cornea; and
mechanically stimulating the receptor of the cornea with the stimulated first piezo-electric element.

US Pat. No. 10,433,076

AUDIO PROCESSING DEVICE AND A METHOD FOR ESTIMATING A SIGNAL-TO-NOISE-RATIO OF A SOUND SIGNAL

1. An audio processing device, comprising at least one input unit for providing a time-frequency representation Y(k,n) of an electric input signal representing a time variant sound signal consisting of target speech signal components S(k,n) from a target sound source TS and noise signal components N(k,n) from other sources than the target sound source, where k and n are frequency band and time frame indices, respectively,
a noise reduction system configured to
determine a first signal to noise ratio estimate ?(k,n) of said electric input signal,
determine second signal to noise ratio estimate ?(k,n) of said electric input signal from said first signal to noise ratio estimate ?(k,n) based on a recursive algorithm comprising a recursive loop, and to
determine said second signal to noise ratio estimate ?(k,n) by non-linear smoothing of said first signal to noise ratio estimate ?(k,n), or a parameter derived therefrom, and wherein said non-linear smoothing is controlled by one or more bias and/or smoothing parameters; and
a selector located in the recursive loop, wherein said selector is configured to select an input to determine said one or more bias and/or smoothing parameters based on a select control parameter; and wherein said select control parameter for a given frequency index k is determined in dependence of the first and/or the second signal to noise ratio estimates corresponding to a multitude of frequency indices.

US Pat. No. 10,433,075

LOW LATENCY AUDIO ENHANCEMENT

Whisper.ai, Inc., San Fr...

1. A method for providing enhanced audio at an earpiece, the earpiece comprising a set of microphones and being configured to implement an audio filter for audio playback, the method comprising:collecting, at the set of microphones, audio datasets;
processing, at the earpiece, the audio datasets to obtain target audio data;
wirelessly transmitting, at one or more first selected time intervals, data representing the target audio data from the earpiece to an auxiliary processing unit;
determining, at the auxiliary processing unit, a set of filter parameters based on the data representing the target audio data and wirelessly transmitting the set of filter parameters from the auxiliary processing unit to the earpiece;
updating the audio filter at the earpiece based on the set of filter parameters to provide an updated audio filter wherein filter parameters are: determined at the auxiliary processing unit, wirelessly transmitted from the auxiliary processing unit to the earpiece, and used to update the audio filter at the earpiece at an update rate that is greater than once every 500 milliseconds during a time period when voice activity is detected to be present;
using the updated audio filter to produce enhanced audio; and
playing the enhanced audio at the earpiece.

US Pat. No. 10,433,074

HEARING AUGMENTATION SYSTEMS AND METHODS

1. A hearing system comprising:a speaker disposed in or on a hearing device located in or adjacent to at least one ear of a user, said speaker configured to direct sound into said at least one ear of said user;
a microphone disposed in or on said hearing device that is configured to detect ambient sound and to generate audio data representing said ambient sound;
a location detector that generates location data representative of a specific location of said hearing device;
a signal processor disposed in said hearing device that processes said audio data and associates said audio data with said location data;
a computing device, located remotely from said hearing device, that generates hearing test signals, receives hearing test feedback data, generates initial sound settings based upon said hearing test feedback data, receives said audio data, analyzes said audio data, generates commands based upon analysis of said audio data and wirelessly transmits said commands and said initial sound settings to said hearing device;
a memory disposed in said hearing device that is configured to store said commands and said initial sound settings;
a control system disposed in said hearing device and coupled to said memory, that applies said initial sound settings to said hearing device, receives said commands from said memory, executes said commands to modify said initial sound settings to create localized sound settings whenever said location data matches said specific location and sends said localized sound settings for said specific location to said signal processor which modifies said audio data associated with said location data for said specific location.

US Pat. No. 10,433,073

ELECTROACOUSTIC TRANSDUCER

TAIYO YUDEN CO., LTD., T...

5. An electroacoustic transducer comprising:a dynamic speaker that generates a first acoustic sound; and
a piezoelectric speaker that generates a second acoustic sound;
wherein a reproduced sound of the first acoustic sound and a reproduced sound of the second acoustic sound have a crossover frequency range, and the reproduced sound of the first acoustic sound has a phase (φ1) and the reproduced sound of the second acoustic sound has a phase (φ2) in the crossover frequency range, the phase (φ1) and the phase (φ2) being such that index α is 0.5 or greater where
α ≡ {(cos φ1 + cos φ2)² + (sin φ1 + sin φ2)²}^(1/2)
wherein α = 2 when φ1 = φ2, and α = 0 when φ1 = φ2 + π.
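The claimed index is simply the magnitude of the sum of two unit phasors, so it equals 2 for in-phase reproduction and 0 for anti-phase reproduction. A short numeric check of those limits (symbol names above and below are reconstructions where the original characters did not survive):

# The index is the magnitude of the sum of two unit phasors with phases phi1 and
# phi2: 2 when the phases match, 0 when they differ by pi; the claim requires it
# to stay at 0.5 or greater across the crossover range.

import math

def phase_index(phi1: float, phi2: float) -> float:
    return math.sqrt((math.cos(phi1) + math.cos(phi2)) ** 2
                     + (math.sin(phi1) + math.sin(phi2)) ** 2)

print(phase_index(0.3, 0.3))                  # in phase       -> 2.0
print(phase_index(0.3, 0.3 + math.pi))        # anti-phase     -> ~0.0
print(phase_index(0.0, 2 * math.pi / 3))      # 120 deg apart  -> 1.0 (still >= 0.5)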

US Pat. No. 10,433,072

MICRO-SOUND DETECTION ANALYSIS DEVICE AND ARRAY AUDIO SIGNAL PROCESSING METHOD BASED ON SAME

HARBIN INSTITUTE OF TECHN...

5. A device comprising multiple micro-acoustic sensors, wherein the micro-acoustic sensors comprise a graphene membrane or graphene oxide membrane configured to deform under sound pressure;
wherein the micro-acoustic sensors further comprise a laser, a processor and a photo-sensitive cell;
wherein the laser is configured to direct a laser beam toward the graphene membrane or graphene oxide membrane;
wherein the photo-sensitive cell is configured to detect a portion of the laser beam scattered by the graphene membrane or graphene oxide membrane; and
wherein the processor is configured to detect a movement of position of a facula on the graphene membrane or graphene oxide membrane based on the portion of the laser beam, and determine audio features from deformation of the graphene membrane or graphene oxide membrane.

US Pat. No. 10,433,071

MICROPHONE WITH HYDROPHOBIC INGRESS PROTECTION

Knowles Electronics, LLC,...

1. A microphone, the microphone comprising:a base, the base having a port extending therethrough;
a microelectromechanical system (MEMS) device coupled to the base, the MEMS device including a diaphragm, a back plate, and a substrate, the substrate forming a back-hole; and
a capillary structure disposed in the back-hole of the substrate and integral to the substrate, the capillary structure including a plurality of capillaries extending through the capillary structure, the capillary structure configured to inhibit contaminants from outside the microphone from reaching the diaphragm via the port.

US Pat. No. 10,433,070

SENSITIVITY COMPENSATION FOR CAPACITIVE MEMS DEVICE

Infineon Technologies AG,...

8. A MEMS device comprising:a first movable electrode structure;
a second movable electrode structure spaced apart from the first movable electrode structure, the first movable electrode structure and the second movable electrode structure enclosing a gap between the first movable electrode structure and the second movable electrode structure, the gap having a gas pressure lower than an ambient pressure;
a static electrode structure within the gap and interposed between the first movable electrode structure and the second movable electrode structure; and
a plurality of pillars extending through the gap and connecting the first movable electrode structure and the second movable electrode structure, the plurality of pillars extending through the static electrode structure, the plurality of pillars dividing the gap into a plurality of gap regions, different gap regions having different thicknesses, wherein the plurality of pillars have a non-uniform pitch.

US Pat. No. 10,433,069

CHARGE PUMP ASSEMBLY

TDK Corporation, Tokyo (...

1. Charge pump assembly, comprising a charge pump with an input port and an output port,
a bias circuit electrically connected to the input port and provided for creating a bias voltage Vbias,
where
the bias voltage Vbias has a temperature dependence.

US Pat. No. 10,433,068

MEMS ACOUSTIC TRANSDUCER WITH COMBFINGERED ELECTRODES AND CORRESPONDING MANUFACTURING PROCESS

STMicroelectronics S.r.l....

1. A MEMS acoustic transducer, comprising:a substrate of semiconductor material having a back surface and a front surface opposite with respect to a vertical direction;
a first cavity in the substrate, the first cavity extending from the back surface to the front surface;
a membrane at the front surface and suspended over the first cavity, a perimeter of the membrane being anchored to the substrate; and
a combfingered electrode arrangement including a plurality of mobile electrodes coupled to the membrane and a plurality of fixed electrodes coupled to the substrate and facing the plurality of mobile electrodes for forming a sensing capacitor, a deformation of the membrane as a result of incident acoustic pressure waves being configured to cause a capacitive variation of the sensing capacitor,
wherein the combfingered electrode arrangement is arranged above the membrane and extends parallel to the membrane, the plurality of mobile electrodes and the plurality of fixed electrodes being suspended above the membrane.

US Pat. No. 10,433,065

SPEAKER DEVICE

PIONEER CORPORATION, Kaw...

1. A speaker device comprising:a diaphragm that radiates sound;
an edge arranged in an outer periphery of the diaphragm;
an attachment part for attaching the diaphragm to the speaker device via the edge positioned on a sound radiating side of an outer peripheral region of the edge and arranged apart from the edge in a sound radiating direction; and
a connecting member held between the outer peripheral region of the edge and the attachment part and adhered to both the outer peripheral region of the edge and the attachment part,
wherein the edge is made of resin, and the attachment part is made of metal.

US Pat. No. 10,433,064

ACOUSTIC DEVICE CONFIGURATION AND METHOD

BOSE CORPORATION, Framin...

1. An acoustic device comprising:a frame;
a diaphragm; and
a suspension element that couples the diaphragm to the frame such that the diaphragm is movable in a reciprocating manner relative to the frame, the suspension element comprising a first surround element and a second surround element that are located adjacent to each other, wherein the first surround element and the second surround element each comprise an edge that together form a plurality of openings disposed along an outer perimeter, and wherein at least one opening of the plurality of openings enables airflow from an interior chamber of the acoustic device to an outside environment.

US Pat. No. 10,433,063

MEMS CIRCUIT BOARD MODULE HAVING AN INTEGRATED PIEZOELECTRIC STRUCTURE, AND ELECTROACOUSTIC TRANSDUCER ARRANGEMENT

USound GmbH, Graz (AT)

1. MEMS printed circuit board module for a sound transducer assembly for generating in a membrane and/or detecting from the membrane, sound waves in the audible wavelength spectrum, the MEMS printed circuit board module comprising:a printed circuit board; and
a multi-layer piezoelectric structure that is configured for setting the membrane into oscillation to generate oscillations that can be detected by the printed circuit board; and
wherein the multi-layer piezoelectric structure defines an anchoring area facing towards the printed circuit board, and the printed circuit board is laminated to the anchoring area of the multi-layer piezoelectric structure; and
wherein the printed circuit board defines a recess within which the multi-layer piezoelectric structure is disposed.

US Pat. No. 10,433,062

STEREO AUDIO SYSTEM AND METHOD

DIODES INCORPORATED, Pla...

1. A circuit, comprising:a first output wire, a second output wire, and a third output wire, wherein the circuit is configured to:
receive a first input signal R and a second input signal L;
provide a first driving signal to the first output wire, the first driving signal being a linear function of the difference between the input signal R and the input signal L;
provide a second driving signal to the second output wire, the second driving signal being a linear function of the sum of the input signal R and L; and
provide a third driving signal to the third output wire, the third driving signal having a magnitude of the first driving signal and having an opposite polarity with respect to the first driving signal;
provide a first output signal between the first output wire and the second output wire, the first output signal being a linear function of the input signal L and not a function of the input signal R; and
provide a second output signal between the third output wire and the second output wire, the second output signal being a linear function of the input signal R and not a function of the input signal L;
wherein the circuit is configured to:
provide a first driving signal −a(R−L) to the first output wire;
provide a second driving signal −a(R+L) to the second output wire; and
provide a third driving signal −a(L−R) to the third output wire, where a is a constant;
wherein the circuit is configured to:
provide a first output signal 2bL between the first output wire and the second output wire; and
provide a second output signal 2bR between the third output wire and the second output wire, where b is a constant;
wherein the circuit comprises:
a first inverting amplifier for receiving the first input signal R and providing a signal −R;
a second inverting amplifier for receiving the second input signal L and providing a signal −L;
a first summing amplifier for receiving −R and L signals and producing a signal a(R−L);
a second summing amplifier for receiving −R and −L signals and producing a signal a(R+L); and
a third summing amplifier for receiving −L and R signals and producing a signal a(L−R).
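With driving signals −a(R−L), −a(R+L) and −a(L−R) on the three wires, the difference between the first and second wires reduces to 2aL and the difference between the third and second wires reduces to 2aR, matching the claimed output signals with b = a. A short numeric check:

# Three-wire stereo bridge: the difference between wire 1 and wire 2 carries only L,
# and the difference between wire 3 and wire 2 carries only R (both scaled by 2a).

def drive(R: float, L: float, a: float = 1.0):
    w1 = -a * (R - L)            # first driving signal
    w2 = -a * (R + L)            # second driving signal
    w3 = -a * (L - R)            # third driving signal (opposite polarity to w1)
    return w1, w2, w3

a = 0.8
for R, L in [(1.0, 0.0), (0.0, 1.0), (0.3, -0.7)]:
    w1, w2, w3 = drive(R, L, a)
    print(w1 - w2, "==", 2 * a * L, "|", w3 - w2, "==", 2 * a * R)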

US Pat. No. 10,433,061

EAR UNIT AND PORTABLE SOUND DEVICE

LG ELECTRONICS INC., Seo...

1. An ear unit comprising:an ear housing including a bass hole and a flat hole formed in a rear surface of the ear housing, the bass hole being larger than the flat hole;
a driver unit mounted in the ear housing; and
a rotator rotatably coupled to the ear housing, the rotator including an inner bracket opposite the rear surface of the ear housing, the inner bracket having an opening/closing hole configured to open or close the bass hole or the flat hole according to a rotational position of the rotator,
wherein the ear housing further includes:
an inner case having a damper layer and an inner hole formed at the damper layer;
an outer case coupled to an outer surface of the inner case, the outer case having a hole plate, and the bass hole and the flat hole are formed at the hole plate; and
a duct groove formed in an inner surface of the hole plate so as to connect the inner hole and the flat hole.

US Pat. No. 10,433,060

AUDIO HUB AND A SYSTEM HAVING ONE OR MORE AUDIO HUBS

DSP Group Ltd, Herzliya ...

1. A system comprising a first processor and an audio hub; wherein the audio hub comprises first communication interfaces, a second communication interface, a second processor, and a memory; wherein the second processor is a digital signal processor and wherein the first processor is a host processor or an application processor;
wherein the first communication interfaces are configured to exchange audio signals with a group of audio components of different types; wherein an aggregate number of first communication interface bits exceeds a number of second communication interface bits;
wherein the audio signals comprise input audio signals received from the group and output audio signals transmitted to the group;
wherein the second processor is configured to generate an input multiplex of input audio signals; and
wherein the second communication interface is configured to transmit the input multiplex to the first processor and to receive an output multiplex from the first processor.

US Pat. No. 10,433,059

PUBLIC ADDRESS SYSTEM FOR THE SONICATION OF A SONICATION REGION, METHOD FOR THE SONICATION OF A SONICATION REGION AND COMPUTER PROGRAM FOR CARRYING OUT THE METHOD

Robert Bosch GmbH, Stutt...

1. A public address system (1) for the sonication of a sonication region (2), the public address system comprising:a multiplicity of loudspeakers (3a,b) arranged in the sonication region (2) for outputting an input signal (4a,b) as an audio signal,
a measuring device (5) for the detection of the audio signal at a measurement point (6) in the sonication region (2), wherein the measuring device (5) is designed to determine a propagation time difference (Δtn) of the audio signals between each of two speakers (3a,b) to the measurement point (6) and to provide the propagation time difference (Δtn) as propagation time difference data,
a modelling module, wherein the modelling module comprises a model of the sonication region (2), positions of the loudspeakers (3a,b) in the sonication region (2) and the measurement point (6), wherein by means of the modelling module a user can define a target point in the sonication region (2), a model in the sonication region (2), or both and the modelling module is designed to determine the propagation time differences (Δtn) such that the propagation time difference (Δtn) at the target point is compensated, and
a delay module (10), wherein the delay module (10) is designed to add a time delay to the input signals (4a,b) based on the propagation time differences (Δtn), in order to compensate for the propagation time difference (Δtn) between the audio signals at the measurement point (6),
wherein
the delay module (10) is connected to the measuring device (5) via a data link and the propagation time difference data is provided to the delay module (10) via a data link.

US Pat. No. 10,433,058

CONTENT RULES ENGINES FOR AUDIO PLAYBACK DEVICES

Sonos, Inc., Santa Barba...

15. A playback device comprising:one or more processors; and
a computer-readable medium storing instructions that, when executed by the one or more processors, cause the one or more processors to perform the following operations:
receiving, via a media playback system, a first command to form a synchrony group comprising a plurality of playback devices, wherein each playback device of the synchrony group is configured to play back audio content in synchrony with one another;
receiving, via the media playback system, a second command for the synchrony group to play back first audio content;
in response to the second command, playing back, via the synchrony group, the first audio content;
receiving, via the media playback system, (i) second audio content to be played back by one or more of the playback devices of the plurality of playback devices and (ii) content source properties associated with an audio source of the second audio content;
accessing a rules engine to determine playback restrictions, the playback restrictions based on the content source properties, wherein the media playback system further comprises a remote computing device, and wherein the accessing comprises, via the playback device:
transmitting a request to the remote computing device, wherein the request comprises the content source properties; and
in response to the request, receiving state information corresponding to the playback restrictions; and
based at least in part on the determination of the playback restrictions, restricting, via the media playback system, operation of the one or more playback devices, the restricting comprising precluding formation of the synchrony group.

US Pat. No. 10,433,057

WIRELESS AUDIO SYNCHRONIZATION

Bose Corporation, Framin...

1. An audio distribution system comprising:an audio source;
an access point; and
a plurality of audio playback devices in communication with each other and with the audio source,
wherein a group of the audio playback devices are arranged to render audio content provided by the audio source in synchrony,
wherein one of the audio playback devices within the group is configured as an audio master which distributes audio content from the audio source to the other audio playback devices within the group, and wherein one of the plurality of audio playback devices, other than the audio master, is configured as a clock master, which distributes clock information that the group of audio playback devices synchronizes to, and
wherein the system is configured to select the clock master from among the plurality of audio playback devices based on ping times between the audio playback devices of the plurality of audio playback devices and the access point.
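The system picks a clock master, separately from the audio master, based on ping times between the playback devices and the access point. A minimal sketch, assuming "based on ping times" means choosing the device (other than the audio master) with the lowest mean ping to the access point; that interpretation is not fixed by the claim:

# Select the clock master as the playback device, other than the audio master,
# with the lowest average ping time to the access point.

from statistics import mean

ping_times_ms = {                       # measured pings, device -> access point
    "kitchen":     [4.1, 3.9, 4.3],
    "living_room": [2.0, 2.2, 1.9],
    "bedroom":     [6.5, 7.0, 6.8],
}

audio_master = "kitchen"                # already chosen to distribute audio content
clock_master = min((d for d in ping_times_ms if d != audio_master),
                   key=lambda d: mean(ping_times_ms[d]))
print("clock master:", clock_master)    # living_room; distributes clock information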

US Pat. No. 10,433,056

AUDIO SIGNAL PROCESSING STAGE, AUDIO SIGNAL PROCESSING APPARATUS, AUDIO SIGNAL PROCESSING METHOD, AND COMPUTER-READABLE STORAGE MEDIUM

Huawei Technologies Co., ...

1. An audio signal processing stage for processing an input audio signal into an output audio signal, wherein the audio signal processing stage comprises:a filter bank defining two or more frequency bands, the filter bank being configured to separate the input audio signal into two or more input audio signal components, each of the two or more input audio signal components being limited to a respective one of the two or more frequency bands;
a set of two or more band branches configured to provide two or more output audio signal components, wherein each of the two or more band branches is configured to process a respective one of the two or more input audio signal components to provide a respective one of the two or more output audio signal components, wherein the set of two or more band branches comprises one or more compressor branches, each of the one or more compressor branches comprising a compressor configured to compress the input audio signal component of the respective compressor branch to provide the output audio signal component of the respective compressor branch;
an inverse filter bank configured to generate a summed audio signal by summing the two or more output audio signal components;
a residual audio signal generating unit configured to generate a residual audio signal, the residual audio signal being a difference between the input audio signal and the summed audio signal;
a virtual bass unit configured to generate a virtual bass signal which comprises one or more harmonics of the residual audio signal, the virtual bass unit comprising a harmonics generator configured to generate the one or more harmonics on the basis of the residual audio signal; and
a summation unit configured to generate the output audio signal by summing the summed audio signal and the virtual bass signal.
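The processing stage splits the input into bands, compresses at least one band, sums the bands back, forms a residual (input minus the summed signal), synthesizes harmonics of that residual as a virtual bass signal, and adds it to the summed signal. A compact numpy sketch with a two-band split and a simple waveshaping harmonics generator, both assumed for illustration:

# Two-band sketch of the claimed stage: low band is compressed, bands are summed,
# the residual (input - summed) feeds a harmonics generator, and the virtual bass
# is added to the summed signal. The crossover and harmonic generator are assumed.

import numpy as np

def one_pole_lowpass(x: np.ndarray, fs: float, fc: float) -> np.ndarray:
    a = np.exp(-2.0 * np.pi * fc / fs)
    y = np.zeros_like(x)
    for n in range(len(x)):
        y[n] = (1 - a) * x[n] + a * (y[n - 1] if n else 0.0)
    return y

def process(x: np.ndarray, fs: float = 48_000.0, fc: float = 120.0) -> np.ndarray:
    low = one_pole_lowpass(x, fs, fc)               # filter bank: low band
    high = x - low                                  # filter bank: high band
    low_compressed = np.tanh(2.0 * low) / 2.0       # simple static compressor
    summed = low_compressed + high                  # inverse filter bank (sum)
    residual = x - summed                           # residual audio signal
    virtual_bass = 0.5 * residual ** 2 * np.sign(residual)   # crude harmonics generator
    return summed + virtual_bass                    # summation unit

t = np.arange(0, 0.02, 1 / 48_000.0)
x = 0.8 * np.sin(2 * np.pi * 60 * t) + 0.2 * np.sin(2 * np.pi * 1_000 * t)
print(process(x).shape)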

US Pat. No. 10,433,055

SPEAKER SYSTEM WITH MOVING VOICE COIL

GOERTEK, INC., Weifang (...

1. A speaker system with a moving voice coil, sequentially comprising a front cover, a vibrating component and a magnetic circuit component from top to bottom, wherein the front cover is provided with a conductive material layer which constitutes a fixed electrode plate;
the vibrating component comprises a vibrating diaphragm and a voice coil secured to the lower surface of the vibrating diaphragm and disposed in a magnetic gap formed by the magnetic circuit component;
the vibrating component further comprises a flexible circuit board secured to one side of the vibrating diaphragm, wherein a metal layer is disposed in the middle of the flexible circuit board and constitutes a movable electrode plate; a front cover solder pad is disposed on the front cover; an external terminal is disposed on the front cover solder pad; the fixed electrode plate is electrically connected to an external circuit by means of the external terminal; the flexible circuit board extends to the outside of the speaker system and is provided with a solder pad electrically connected to the external circuit; the fixed electrode plate and the movable electrode plate are opposite to each other; charges accumulate on the fixed electrode plate and the movable electrode to constitute a capacitor after the front cover and the flexible circuit board are electrified;
a capacitance monitoring chip is further disposed in the speaker system with the moving voice coil and is electrically connected to the capacitor;
the flexible circuit board is secured between the vibrating diaphragm and the voice coil; and
the vibrating component further comprises a vibrating diaphragm reinforcing portion secured to the upper surface of the vibrating diaphragm; a hole is formed in the center of each of the vibrating diaphragm and the vibrating diaphragm reinforcing portion; and the capacitance monitoring chip is disposed in the hole.

US Pat. No. 10,433,054

MEMS DEVICES

Cirrus Logic, Inc., Aust...

1. A protection system for protecting a MEMS transducer of a MEMS device from electrostatic capture, wherein the MEMS transducer is operable in a normal-sensitivity mode and in a reduced-sensitivity mode, wherein the protection system comprises:an overload detector for detecting an overload condition arising as a result of an excessive sound pressure level at the MEMS transducer;
a signal estimator configured to generate an estimate of a sound pressure level at the MEMS transducer; and
a controller configured, in response to detection by the overload detector of an overload condition, to:
disable an output of the MEMS transducer; and
after a delay of a first predetermined period of time:
cause the MEMS transducer to operate in the reduced-sensitivity mode;
enable the output of the MEMS transducer; and
cause the MEMS transducer to return to the normal-sensitivity mode if the estimate of the sound pressure level generated by the signal estimator while the MEMS transducer is operating in the reduced-sensitivity mode is below a safe sound pressure level threshold for a second predetermined period of time.

US Pat. No. 10,433,052

SYSTEM AND METHOD FOR IDENTIFYING SPEECH PROSODY

1. A system for processing audio for identifying loudness levels of speakers, the system comprising:one or more audio sensors included in a wearable apparatus and configured to capture audio data from an environment of a wearer of the wearable apparatus; and
at least one processing unit configured to:
analyze the audio data to determine that the wearer and a second speaker are engaged in conversation;
analyze the audio data to obtain prosodic information associated with the wearer and at least part of the conversation, the prosodic information comprises an indication of a loudness level;
estimate a distance between the wearer and the second speaker;
use the estimated distance between the wearer and the second speaker and the indication of the loudness level to determine that the loudness level is inappropriate for the estimated distance;
in response to the determination that the loudness level is inappropriate for the estimated distance and the determination that the wearer and the second speaker are engaged in conversation, provide a feedback to the wearer;
obtain additional audio data captured by the one or more audio sensors after obtaining the prosodic information, the additional audio data comprising an additional part of the conversation;
estimate a second distance between the wearer and the second speaker;
analyze the additional audio data to obtain additional prosodic information associated with the wearer, the additional prosodic information comprises an additional indication of an additional loudness level;
use the estimated second distance between the wearer and the second speaker and the additional indication of the additional loudness level to determine that the additional loudness level is inappropriate for the estimated second distance;
determine an elapsed time since the feedback was provided to the wearer;
when the elapsed time is longer than a selected time duration, in response to the determination that the additional loudness level is inappropriate for the estimated second distance, provide an additional feedback to the wearer; and
when the elapsed time is shorter than the selected time duration, forgo the additional feedback to the wearer.
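Functionally, the system compares the wearer's loudness against what is appropriate for the estimated conversational distance and throttles repeated feedback with an elapsed-time check. A minimal sketch, with the loudness/distance rule and the cooldown length as illustrative assumptions:

# Give the wearer feedback when speech loudness is inappropriate for the estimated
# distance to the other speaker, but not more often than once per cooldown period.
# The loudness/distance rule and the cooldown length are illustrative assumptions.

COOLDOWN_S = 60.0                         # "selected time duration"

def loudness_ok(loudness_db: float, distance_m: float) -> bool:
    target_db = 55.0 + 20.0 * max(0.0, distance_m - 1.0)   # assumed rule of thumb
    return abs(loudness_db - target_db) < 10.0

class FeedbackThrottle:
    def __init__(self):
        self.last_feedback = None

    def check(self, loudness_db: float, distance_m: float, now: float) -> None:
        if loudness_ok(loudness_db, distance_m):
            return
        if self.last_feedback is not None and now - self.last_feedback < COOLDOWN_S:
            return                        # elapsed time too short: forgo feedback
        self.last_feedback = now
        print(f"feedback: {loudness_db:.0f} dB is inappropriate at {distance_m:.1f} m")

fb = FeedbackThrottle()
fb.check(80.0, 1.0, now=0.0)              # too loud -> feedback
fb.check(80.0, 1.0, now=30.0)             # within cooldown -> no feedback
fb.check(80.0, 1.0, now=90.0)             # after cooldown -> feedback again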

US Pat. No. 10,433,050

SPEAKER MODULE FOR MOBILE DEVICE AND MOBILE DEVICE HAVING DUCT RADIATION STRUCTURE

Samsung Electronics Co., ...

1. A mobile device comprising:a speaker module comprising:
a speaker configured to output a sound signal, and
a duct configured to provide a radiation path of the sound signal, the duct comprising one side end, at which the sound signal is emitted from the speaker module, and a first opening having a tilted configuration at the one side end;
a body in which the speaker module is mounted, the body comprising a connecting portion comprising a second opening opposite the first opening, the second opening being configured to couple with the first opening; and
a protection member disposed between the first opening and the second opening,
wherein the duct and the connecting portion are configured such that the first opening is coupled to the second opening to radiate the sound signal via the connecting portion to an outside of the mobile device without leakage,
wherein the protection member covers only the first opening having the tilted configuration or the second opening,
wherein the entire protection member disposed at one of the first opening or the second opening is tilted, and
wherein the speaker module is configured to be an essentially soundproof enclosure except for the first opening.

US Pat. No. 10,433,046

DETERMINATION OF ENVIRONMENTAL EFFECTS ON ELECTRICAL LOAD DEVICES

ESS Technology, Inc., Mi...

1. A circuit for determining a signal that is representative of an environmental effect on an electrical load while the electrical load is operating based upon an input signal, comprising:a first differential amplifier having a first input receiving the input signal and a second input receiving an output of the first differential amplifier, the output of the first differential amplifier driving the electrical load thereby causing the second input to receive a signal representing the input signal and including the environmental effect on the electrical load;
a second differential amplifier having a first input receiving the input signal and a second input receiving an output of the second differential amplifier, the output of the second differential amplifier driving a load having an impedance equal to an impedance of the electrical load, thereby causing the second input to receive a signal representing the input signal; and
a third differential amplifier having a first input receiving the output of the first differential amplifier and a second input receiving the output of the second differential amplifier, thereby producing as an output of the third differential amplifier a signal which is a difference between the input signal and the signal applied to the electrical load by both the input signal and the environmental effect.

US Pat. No. 10,433,045

EARBUD STABILITY ANCHOR FEATURE

Apple Inc., Cupertino, C...

1. An earbud, comprising:an anchoring feature defining a channel extending from a first end of the anchoring feature to a second end of the anchoring feature, the anchoring feature being sized to be secured within an ear of a user;
an earbud housing comprising: a central section disposed between opposing first and second ends of the earbud housing, the central section being disposed within the channel and the first and second ends being disposed outside the channel, the first end of the earbud housing protruding from the anchoring feature at an angle of between 100 and 145 degrees with respect to an outer surface of the first end of the anchoring feature; and
an audio driver disposed within the earbud housing and aligned with an audio exit opening defined by the first end of the earbud housing through which audio generated by the audio driver leaves the earbud housing.

US Pat. No. 10,433,044

HEADPHONES WITH INTERACTIVE DISPLAY

1. A headphone system comprising:a frame having a first region, a second region, and a headband extending between said first region and said second region, said first region and said second region configured to be positioned near a first ear and a second ear of a user, respectively;
a first speaker assembly coupled to said first region of said frame, said frame configured to position said first speaker assembly near a first ear of a user;
a transparent display coupled to said frame and configured to be positioned in an optical path of said user when said first and said second regions of said frame are positioned near said first and said second ears of said user;
a user interface operative to receive input from said user;
memory for storing data and code; and
a controller coupled to said frame, said controller being responsive to said input from said user and operative to execute said code and display images on said transparent display for viewing based at least in part on said input from said user;
and wherein said transparent display is rotatably coupled to said frame;
said transparent display is configured to rotate between at least a first position and a second position;
said transparent display is disposed in said optical path of said user when said transparent display is in said first position;
said transparent display is disposed over said headband when said transparent display is in said second position;
said transparent display is further configured to rotate from said second position to a third position; and
said transparent display is configured to be disposed around a rear region of a head of said user when said transparent display is in said third position.

US Pat. No. 10,433,041

OUTDOOR LOUDSPEAKER WITH INTEGRAL LIGHTING

Harman International Indu...

1. An outdoor loudspeaker system, comprising:a plurality of outdoor loudspeakers each including an audio driver, a top arranged on a base and a cap arranged on the top, a lamp mounted beneath the cap and outside the top, wherein the cap includes a centrally mounted lens above the lamp, the lens mounted in webs in the cap above the audio driver, wherein the audio driver is configured to output sound upwardly around the lens and through interstices between the webs;
a lighting controller to output lighting signals to the lamp of the one or more than one outdoor loudspeakers;
an audio controller to output audio signals to the audio driver to the one or more than one outdoor loudspeakers;
a lighting conduit housing lighting wiring from the lighting controller to the one or more than one outdoor loudspeakers; and
an audio conduit housing audio wiring from the audio controller to the one or more than one outdoor loudspeakers, the audio conduit being separate from the lighting conduit.

US Pat. No. 10,433,039

HOUSING FOR TERMINAL, TERMINAL AND MOBILE PHONE

GUANGDONG OPPO MOBILE TEL...

14. A mobile phone, comprising:a housing;
a printed circuit board (PCB);
a memory;
an input unit;
a sensor; and
an electroacoustic component;
the housing defining a chamber, the PCB, the memory, the input unit and the sensor being arranged in the chamber, and the memory being arranged on and electrically connected to the PCB,
the housing comprising:
a front cover having an inner top wall and inner side walls, the inner top wall and the inner side walls cooperatively defining a mounting chamber configured to accommodate the electroacoustic component, the inner top wall defining a sound output hole in communication with the mounting chamber, the sound output hole having a first end and a second end, the first end being adjacent to the mounting chamber and defining a first opening, the second end being away from the mounting chamber and defining a second opening, and the first opening being larger than the second opening;
a display screen cover plate defining a through hole and embedded in the front cover, and the sound output hole being opposite to and in communication with the through hole; and
a dustproof mesh having a protruding portion extending into the through hole and a flange sandwiched between the front cover and the display screen cover plate.

US Pat. No. 10,433,036

DATA LOGGER SYSTEM AND RELATED METHODS

Arizona Board of Regents,...

1. A data logger system comprising:a data logger coupled with an energy storage device, the data logger configured to couple to a living subject and record data related to one of the living subject's activities and related to a vital parameter of the living subject;
an energy harvester electrically coupled with the energy storage device, where the energy harvester is configured to harvest energy from one of an external environment, the living subject, and both the external environment and the living subject; and
a processor and a memory;
wherein the data logger is configured to communicate with a remote communication system while coupled to the living subject;
wherein an energy usage of the data logger is monitored and adjusted using the processor and the memory using an electronic Hardware Abstraction Layer (eHAL);
wherein the eHAL is directly coupled to and configured to directly alter hardware settings or alter the hardware settings through a hypervisor.

US Pat. No. 10,433,035

PROFILES FOR COLLECTING TELEMETRY DATA

Intel Corporation, Santa...

1. An apparatus comprising:a plurality of telemetry registers;
a memory to store a plurality of telemetry profiles and a plurality of message profiles, the plurality of telemetry profiles including a first telemetry profile specifying a first collection trigger, a first set of telemetry registers, and a first telemetry data destination; and
a virtualized telemetry controller to:
detect a first condition satisfying the first collection trigger specified in the first telemetry profile;
in response to a detection of the first condition, read a first plurality of telemetry values from the first set of telemetry registers specified in the first telemetry profile;
generate a first telemetry container including the first plurality of telemetry values;
identify a first message profile associated with the first telemetry data destination;
encapsulate the first telemetry container using the identified first message profile to generate a telemetry message; and
send the telemetry message to the first telemetry data destination specified in the first telemetry profile.
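
A minimal Python sketch of the claimed profile-driven collection flow; the profile fields, the register-file representation, and the message format are assumptions made for illustration:

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class TelemetryProfile:
    trigger: Callable[[Dict[str, int]], bool]   # collection trigger predicate
    registers: List[str]                        # set of telemetry registers to read
    destination: str                            # telemetry data destination

def collect_and_send(profile, register_file, message_profiles, send):
    # register_file: dict of register name -> current value (the telemetry registers).
    # message_profiles: dict of destination -> header fields used for encapsulation.
    if not profile.trigger(register_file):
        return None                                           # trigger not satisfied
    container = {name: register_file[name] for name in profile.registers}
    message = {"header": message_profiles[profile.destination],
               "payload": container}                          # encapsulated telemetry message
    send(profile.destination, message)
    return message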

US Pat. No. 10,433,032

DYNAMIC DISTRIBUTED-SENSOR NETWORK FOR CROWDSOURCED EVENT DETECTION

Google LLC, Mountain Vie...

1. A crowdsourced event detection network based on devices associated with the consumption of one or more resources, the crowdsourced event detection network comprising:a plurality of smart-home devices that are distributed among a plurality of homes within a geographic region, the plurality of smart-home devices having processing and sensing capabilities and being configured to communicate data, the plurality of smart-home devices including: an irrigation device on the premises of a first home and at least one additional device located within a second home, the at least one additional device selected from the group consisting of: a thermostat, a smoke detector, and a camera, wherein:
the irrigation device has a primary function associated with property irrigation and includes a primary sensor that senses conditions related to property irrigation and also includes a secondary sensor that senses conditions that are not related to property irrigation, the secondary sensor of the irrigation device being one or more sensors selected from the group consisting of: a seismic sensor, an audio sensor, an acceleration sensor, a temperature sensor, and a radiation sensor;
the thermostat has a primary function of temperature control and includes a primary sensor that senses conditions related to temperature control and also includes a secondary sensor that senses conditions that are not related to temperature control, the secondary sensor of the thermostat being one or more sensors selected from the group consisting of: a seismic sensor, an audio sensor, an acceleration sensor, and a radiation sensor;
the smoke detector has a primary function of smoke detection and includes a primary sensor that senses conditions related to smoke detection and also includes a secondary sensor that senses conditions that are not related to smoke detection, the secondary sensor of the smoke detector being one or more sensors selected from the group consisting of: a seismic sensor, a temperature sensor, an audio sensor, and a humidity sensor; and
the camera has a primary function of video monitoring and includes a primary sensor that senses conditions related to video monitoring and also includes a secondary sensor that senses conditions that are not related to video monitoring, the secondary sensor of the camera being one or more sensors selected from the group consisting of: a temperature sensor, a seismic sensor, an acceleration sensor, and a radiation sensor; and
an event detection processor programmed and configured to detect an event from sensor measurements gathered within the geographic region from the secondary sensors of the irrigation device and at least one of the thermostat, the smoke detector, or the camera according to the steps of:
receiving said sensor measurements from said secondary sensors;
processing said sensor measurements from said secondary sensors to determine whether an event has occurred, wherein the event is selected from the group consisting of: an earthquake, a tornado, a power outage, and a local weather event; and
generating at least one alert identifying the detected event based on said processing of said sensor measurements from said secondary sensors.
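
A minimal Python sketch of the claimed crowdsourced detection step, assuming a simple count-over-threshold rule and hypothetical sensor readings:

def detect_event(readings, min_devices=5, threshold=0.3):
    # readings: list of dicts such as
    #   {"device": "thermostat-17", "sensor": "seismic", "value": 0.42}
    # gathered from the secondary sensors of smart-home devices in one region.
    triggered = {r["device"] for r in readings
                 if r["sensor"] == "seismic" and r["value"] >= threshold}
    if len(triggered) >= min_devices:
        return "earthquake suspected: %d devices over threshold" % len(triggered)
    return None                            # no alert generated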

US Pat. No. 10,433,028

APPARATUS AND METHOD FOR TRACKING TEMPORAL VARIATION OF VIDEO CONTENT CONTEXT USING DYNAMICALLY GENERATED METADATA

Electronics and Telecommu...

1. An apparatus for tracking temporal variation of a video content context, the apparatus comprising:a processor configured to:
generate static metadata based on internal data held during an initial publication of video content,
tag the generated static metadata to the video content,
collect external data related to the video content generated after the video content is published,
generate dynamic metadata related to the video content based on the collected external data,
tag the generated dynamic metadata to the video content,
regenerate the dynamic metadata with an elapse of time,
tag the regenerated dynamic metadata to the video content,
track a temporal change in context of the regenerated dynamic metadata from the generated dynamic metadata, and
generate and provide a trend analysis report of a context of the video based on the tracked temporal change in the context of the regenerated dynamic metadata,
wherein the processor comprises:
a first metadata generator/tagger configured to generate the static metadata based on the internal data held during the initial publication of the video content, and tag the generated static metadata to the video content,
an external data collector configured to collect the external data related to the video content generated after the video content is published,
a second metadata generator/tagger configured to generate dynamic metadata related to the video content based on the collected external data, and tag the generated dynamic metadata to the video content, and
a dynamic metadata-based video context temporal variation tracker/analyzer configured to regenerate the dynamic metadata with the elapse of time, tag the regenerated dynamic metadata to the video content, track the temporal change in context of the regenerated dynamic metadata from the generated dynamic metadata, and generate and provide the trend analysis report of the context of the video based on the tracked temporal change in the context of the regenerated dynamic metadata,
wherein the second metadata generator/tagger is further configured to expand and update topics and keywords of the static metadata using the collected external data and publication time information of the collected external data, and tag other metadata, along with publication time information of the other metadata, to the video content.

US Pat. No. 10,433,026

SYSTEMS AND METHODS FOR CUSTOMIZED LIVE-STREAMING COMMENTARY

MyTeamsCalls LLC, Pacifi...

1. A method for transmitting a plurality of fingerprints that have been autonomously identified from a broadcast stream such that an alternative media stream can be synchronized across multiple receivers of the broadcast stream, the method comprising:receiving, at a broadcast server, a broadcast stream containing audiovisual data;
receiving, at the broadcast server, an alternative media stream to be associated with the broadcast stream;
receiving, at the broadcast server, time information of the audiovisual data from a common time source that is independent of the broadcast server;
determining the plurality of fingerprints associated with the broadcast stream that will be used to synchronize the alternative media stream with the broadcast stream based at least upon the content of the broadcast stream, the time information, and the alternative media stream, each of the plurality of fingerprints corresponding to a fixed time point in the content of the broadcast stream; and
transmitting the plurality of fingerprints to one or more recipients to facilitate playback of the alternative media stream in synchronization with the broadcast stream.
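
A minimal Python sketch of how a receiver might use the transmitted fingerprints to synchronize the alternative media stream; the matching and offset arithmetic are assumptions made for illustration:

def alternative_stream_offset(fingerprints, observed_fingerprint, local_play_time_s):
    # fingerprints: list of (fingerprint, broadcast_time_s) pairs transmitted by
    # the broadcast server; each pair marks a fixed time point in the broadcast.
    # observed_fingerprint: fingerprint the receiver computes from its own feed.
    # Returns the offset (seconds) to apply to the alternative media stream so it
    # plays in synchronization with the broadcast, or None if nothing matches.
    for fp, broadcast_time_s in fingerprints:
        if fp == observed_fingerprint:
            return broadcast_time_s - local_play_time_s
    return None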

US Pat. No. 10,433,025

VIRTUAL REALITY RESOURCE SCHEDULING OF PROCESS IN A CLOUD-BASED VIRTUAL REALITY PROCESSING SYSTEM

1. A method comprising:receiving a request for a render project that includes information specifying three-dimensional video data to be used to create a render, the information including a segment, a storage location of the segment, a storage location of initial video data, two or more cameras used to record the initial video data, a geometry of each of the two or more cameras used to record the initial video data, a format of the render, a geometry of the render, a time frame of the initial video data, a project identifier, and a priority of the render;
determining a plurality of jobs required to create the render from the three-dimensional video data;
determining an availability of a plurality of nodes across a network;
creating a render map that specifies a processing sequence of the plurality of jobs across the plurality of nodes to create the render, the render map being created based on at least the availability of the plurality of nodes; and
processing the plurality of jobs at the plurality of nodes to create three-dimensional content.

US Pat. No. 10,433,024

METHOD AND APPARATUS FOR CONFIGURING CONTENT IN A BROADCAST SYSTEM

Samsung Electronics Co., ...

1. A method of transmitting media data, the method comprising:processing a media processing unit (MPU) including a data part and a control part, the MPU being processed independently, wherein the data part includes media data and the control part includes parameters related to the media data; and
transmitting, by a transmitter, the MPU,
wherein the MPU includes at least one fragmentation unit,
wherein the parameters include a first parameter and a second parameter, based on the first parameter having a first value, the first parameter indicates that the at least one fragmentation unit comprises timed data including timeline information for decoding and/or presentation of content of the timed data, and based on the first parameter having a second value, the first parameter indicates that the at least one fragmentation unit comprises non-timed data, which does not include the timeline information for decoding and/or presentation of the content of the non-timed data,
wherein the second parameter indicates a sequence number of the MPU,
wherein the MPU is transmitted in at least one packet and the at least one packet includes information indicating if the at least one packet includes at least one random access point (RAP), and
wherein the at least one fragmentation unit in the MPU is grouped into a first group including one or more fragmentation units of an I-picture type and one or more fragmentation units of a P-picture type and a second group including one or more fragmentation units of a B-picture type, according to a picture type for each of the fragmentation units, the first group is disposed ahead of the second group, and application layer forward error correction (AL-FEC) is applied only to the first group.
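
A minimal Python sketch of the MPU structure described by the claim; the field names, the numeric encodings of the first and second parameters, and the types are assumptions:

from dataclasses import dataclass, field
from typing import List, Optional

TIMED, NON_TIMED = 0, 1                     # assumed encodings of the first parameter

@dataclass
class FragmentationUnit:
    payload: bytes
    picture_type: Optional[str] = None      # "I", "P" or "B" when the data is timed video

@dataclass
class MPU:
    sequence_number: int                    # second parameter: sequence number of the MPU
    timing_mode: int                        # first parameter: TIMED or NON_TIMED
    timeline_info: Optional[float]          # decode/presentation timeline, only when timed
    fragmentation_units: List[FragmentationUnit] = field(default_factory=list)

def is_timed(mpu):
    return mpu.timing_mode == TIMED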

US Pat. No. 10,433,020

SHARING MOBILE SUBSCRIBER CONTENT IN A PUBLICALLY VIEWABLE CONTENT DISTRIBUTION NETWORK

1. A method, comprising:associating, by a processing system including a processor, mobile video content obtained from a first mobile device, with a television channel of a television distribution service comprising a plurality of television channels to generate an updated television channel that includes the mobile video content, wherein equipment of a television receiver, when tuned to the television channel, processes a television signal of the television distribution service to obtain the mobile video content for presentation at a display device; and
providing, by the processing system, a notification to a second mobile device, wherein the notification identifies an association of the mobile video content and the television channel, and wherein, responsive to the notification, equipment of a viewer accesses a message from the second mobile device indicating availability of the mobile video content.

US Pat. No. 10,433,018

VIDEO PRODUCTION SYSTEM WITH DYNAMIC CHARACTER GENERATOR OUTPUT

Tribune Broadcasting Comp...

1. A method comprising:accessing, by a computing system, a first set of ordered content items and a second set of active/inactive status attributes, wherein each content item of the first set corresponds to a respective active/inactive status attribute of the second set, wherein the first set comprises (i) a particular content item that corresponds to a particular active/inactive status attribute of the second set and (ii) other content items that correspond to other active/inactive status attributes of the second set;
identifying, by the computing system, a subset of the first set based on each content item of the subset corresponding to an active status attribute in the second set;
using, by the computing system, the content items of the identified subset to generate video content that includes the content items of the identified subset, as ordered in the first set;
while generating the video content, modifying, by the computing system, the particular active/inactive status attribute based on at least one of (i) a threshold number of the other active/inactive status attributes or (ii) a position of the other active/inactive status attributes relative to the particular active/inactive status attribute; and
after modifying the particular active/inactive status attribute, repeating, by the computing system, the identifying and using steps, thereby causing modification of the generated video content.
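
A minimal Python sketch of the active/inactive selection and the attribute-modification step; the particular modification rule shown is an assumed example, since the claim also covers position-based rules:

def active_items(ordered_items, status):
    # ordered_items: content item ids in their first-set order.
    # status: dict of item id -> True (active) / False (inactive).
    return [item for item in ordered_items if status.get(item, False)]

def maybe_deactivate(ordered_items, status, target, max_other_active=3):
    # Assumed rule: when the number of *other* active items reaches a threshold,
    # flip the target item's status so the regenerated video drops it.
    others_active = sum(1 for item in ordered_items
                        if item != target and status.get(item, False))
    if others_active >= max_other_active:
        status[target] = False
    return status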

US Pat. No. 10,433,016

RECEPTION APPARATUS, RECEPTION METHOD, TRANSMISSION APPARATUS, AND TRANSMISSION METHOD

SONY CORPORATION, Tokyo ...

1. A reception apparatus comprising:circuitry configured to
receive broadcast content provided as a pay broadcast service, the broadcast content being transmitted in a scrambled manner;
acquire a subscription check application for checking presence of a subscription to the pay broadcast service depending on information indicating presence of the subscription check application, the information being included in control information including information regarding a structure of the broadcast content;
control an operation of the subscription check application;
acquire a promotion application in response to a determination that the subscription to the pay broadcast service is not present; and
acquire a conjunction application in response to a determination that the subscription to the pay broadcast service is present;
wherein the subscription check application is transmitted in a non-scrambled manner, the promotion application is transmitted in a non-scrambled manner, and the conjunction application is transmitted in a scrambled manner.

US Pat. No. 10,433,015

SYSTEMS AND METHODS FOR PROVIDING RECOMMENDATIONS BASED ON SHORT-MEDIA VIEWING PROFILE AND LONG-MEDIA VIEWING PROFILE

ROVI GUIDES, INC., San J...

1. A method for providing recommendations to a user, the method comprising:tracking behavior of the user while the user is engaged in watching short-length media content;
generating a short-media viewing profile based on the user's behavior tracked when the user was engaged in watching short-length media content, the short-media viewing profile comprising a plurality of recommendation metadata;
tracking behavior of the user while the user is engaged in watching long-length media content;
generating a long-media viewing profile based on the user's behavior tracked when the user was engaged in watching long-length media content, the long-media viewing profile comprising a plurality of recommendation metadata;
providing media content to a user device;
comparing the length of the media content to a length threshold;
in response to the determining that the length of the media content exceeds the length threshold:
activating the long-media viewing profile;
selecting a recommendation item, wherein the metadata of the recommendation item matches at least one recommendation metadata of the long-media viewing profile; and
providing the recommendation item to the user device; and
in response to the determining that the length of the media content does not exceed the length threshold:
activating the short-media viewing profile;
selecting a recommendation item, wherein the metadata of the recommendation item matches at least one recommendation metadata of the short-media viewing profile; and
providing the recommendation item to the user device.
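
A minimal Python sketch of the length-threshold profile selection; the threshold value, the tag-set representation, and the catalog format are assumptions:

def recommend(content_length_s, long_profile, short_profile, catalog, threshold_s=600):
    # long_profile / short_profile: sets of recommendation metadata tags built from
    # the user's long-media and short-media viewing behavior.
    # catalog: list of (item_id, tag_set) pairs available for recommendation.
    profile = long_profile if content_length_s > threshold_s else short_profile
    for item_id, tags in catalog:
        if profile & tags:                  # item metadata matches at least one profile tag
            return item_id
    return None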

US Pat. No. 10,433,014

MEDIA CONTENT DOWNLOAD TIME

Hewlett Packard Enterpris...

1. A system comprising:a memory to store machine readable instructions; and
a processing unit to access the memory, and execute the machine readable instructions, the machine readable instructions comprising:
a scheduling agent to continuously monitor a consumer network usage pattern on a network at a consumer device for downloading selected media content and continuously monitor a provider network usage pattern on a network at the content provider to determine a download time for selected media content from a content provider, wherein the download time is determined by machine learning based on:
a viewing time of the selected media content;
an amount of network bandwidth available to a consumer device for downloading the selected media content;
a provider network usage pattern; and
a consumer network usage pattern; and
a download agent to initiate a download of the selected media content at the determined download time, wherein the selected media content contains digital rights management features.

US Pat. No. 10,433,013

DYNAMIC MANAGEMENT OF AUDIOVISUAL AND DATA COMMUNICATIONS

Comcast Cable Communicati...

1. A method comprising:monitoring, by a computing device, data corresponding to content requests from one or more population pools;
determining, based on at least a first portion of the content requests, a first number of users requesting a content asset from a first server, wherein the content asset is initially assigned, based on historical viewership data, to a first content lineup available via the first server;
reassigning, based on the first number of users satisfying a first threshold, the content asset from the first content lineup available via the first server to a second content lineup available via a second server different from the first server, wherein the computing device causes the content asset to be available via the second server;
determining, based on at least a second portion of the content requests, a second number of users requesting the content asset from the second server; and
reassigning, based on the second number of users satisfying a second threshold, the content asset from the second content lineup available via the second server to the first content lineup available via the first server, wherein the computing device causes the content asset to be available via the first server.
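
A minimal Python sketch of the demand-driven lineup reassignment; the threshold values and the direction of the second threshold are assumptions:

def rebalance(current_lineup, observed_requests, promote_at=1000, demote_at=200):
    # current_lineup: "first" (initial, history-based) or "second" (high-demand).
    # observed_requests: number of users requesting the asset from its current server.
    if current_lineup == "first" and observed_requests >= promote_at:
        return "second"                     # make the asset available via the second server
    if current_lineup == "second" and observed_requests <= demote_at:
        return "first"                      # move the asset back to the first server
    return current_lineup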

US Pat. No. 10,433,012

ELECTRONIC DEVICE AND CONTENT PROVIDING METHOD THEREOF

Samsung Electronics Co., ...

1. A content providing method of an electronic device, the method comprising operations of:downloading an organization schedule including highlight organization information from a server, wherein the organization schedule comprises at least a first channel and a second channel;
detecting a channel selection of the first channel after downloading the organization schedule;
sequentially downloading respective first channel highlight images of a plurality of contents in a first sequence for the first channel from the server, based on the highlight organization information;
sequentially playing the downloaded first channel highlight images of the plurality of contents in the first sequence while downloading remaining undownloaded highlight images of the plurality of contents in the first sequence, based on the highlight organization information;
detecting a channel selection of the second channel after the channel selection of the first channel;
sequentially downloading respective second channel highlight images of a plurality of contents in a second sequence for the second channel from the server, based on the highlight organization information; and
sequentially playing downloaded second channel highlight images and at least one of the first channel highlight images previously downloaded in the second sequence based on the highlight organization information.

US Pat. No. 10,433,011

APPARATUS AND METHOD FOR PROVIDING PROGRAMMING INFORMATION FOR MEDIA CONTENT TO A WEARABLE DEVICE

The DIRECTV Group, Inc.,...

1. A device, comprising:a processor; and
a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations, comprising:
detecting a first smart watch and a second smart watch in proximity of a display device, wherein each of the first smart watch and the second smart watch is configured to wrap around a wrist of a person;
receiving a first user input from the first smart watch indicating to present a first selected media content and receiving a second user input from the second smart watch indicating to present a second selected media content;
causing a presentation of the first selected media content at the display device and storing of the second selected media content in a digital video recorder in response to a determination of a first user profile having a higher priority than a second user profile, wherein the first user profile is associated with the first smart watch and the second user profile is associated with the second smart watch;
detecting the first smart watch is in motion; and
causing a presentation of the second selected media content at the display device and storing at least a portion of the first selected media content in the digital video recorder in response to detecting that the first smart watch is in motion and the second smart watch is stationary.

US Pat. No. 10,433,009

SYSTEMS AND METHODS FOR MANAGING SERIES RECORDINGS AS A FUNCTION OF STORAGE

Rovi Guides, Inc., San J...

1. A method for modifying scheduled storage of a series as a function of available storage, the method comprising:receiving a user request to store the series;
storing an instruction to store episodes of the series in a scheduling data structure;
based on the instruction to store episodes of the series, storing copies of a plurality of episodes of the series on a storage device;
storing viewing progress representing, for each respective copy of the copies of the plurality of episodes, an amount of the respective copy that the user has played back in a user profile corresponding to the user;
determining an amount of available storage remaining on the storage device;
calculating a threshold viewing progress by processing the amount of available storage with a series viewing function;
computing an aggregated viewing progress representing a collective viewing progress among the plurality of stored episodes by summing each amount of the respective copy;
determining whether the aggregated viewing progress exceeds the threshold viewing progress; and
in response to determining the aggregated viewing progress does not exceed the threshold viewing progress, deleting the instruction to store episodes of the series thereby canceling the recording of future episodes of the series.
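
A minimal Python sketch of the storage-dependent cancellation decision; the particular series viewing function used to derive the threshold is an assumption:

def should_cancel_series(viewed_fractions, free_bytes, total_bytes):
    # viewed_fractions: played-back fraction of each stored episode copy (0.0-1.0).
    # Assumed series viewing function: the less storage remains, the more collective
    # viewing is required to keep recording future episodes.
    aggregated = sum(viewed_fractions)                     # aggregated viewing progress
    threshold = (1.0 - free_bytes / total_bytes) * len(viewed_fractions)
    return aggregated <= threshold         # True: delete the store instruction (cancel series)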

US Pat. No. 10,433,007

METHOD OF ADAPTING A BIT RATE FOR A MOBILE DEVICE

MIMIK TECHNOLOGY INC., V...

1. A method of adapting a content transmission bit rate for a user device having Global Positioning System (GPS), the method comprising:providing a client application on the user device to obtain GPS coordinates and differential coordinates of the user device;
transmitting the GPS coordinates and the differential coordinates to a serving node to which the user device is registered, the serving node associated with a user of the user device, the serving node located at a premises of the user;
calculating, by the serving node, a speed of the user device based on the GPS coordinates and the differential coordinates;
using, by the serving node, the speed of the user device to calculate a probable content transmission error rate and a probable packet loss rate;
determining, by the serving node, a closest server to the user device based on the GPS coordinates;
adjusting, by the serving node, the content transmission bit rate for content to be transmitted to the user device based on:
the calculated probable content transmission error rate, the calculated probable packet loss rate, and the determination of the closest server; and
a condition of the user device, wherein the condition of the user device includes supported content formats by the user device and transport layer protocol used by the user device;
adjusting dynamically, by the serving node, an expected packet arrival rate for the content to be transmitted to the user device, wherein the expected packet arrival rate is determined from an actual packet arrival rate measured over a time period; and
causing, by the serving node, the closest server to transmit the content to the user device, based on the adjusted content transmission bit rate and the adjusted expected packet arrival rate.
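
A minimal Python sketch of the speed-to-bit-rate flow; the speed estimate and the error/loss models are invented placeholders, illustrating only the claimed chain from GPS coordinates to an adjusted bit rate:

import math

def adapt_bitrate(coords, dt_s, base_bitrate_kbps=4000):
    # coords: two (lat, lon) samples in degrees taken dt_s seconds apart
    # (GPS plus differential coordinates reported by the client application).
    (lat1, lon1), (lat2, lon2) = coords
    metres_per_degree = 111320.0           # equirectangular approximation
    dx = (lon2 - lon1) * metres_per_degree * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * metres_per_degree
    speed_mps = math.hypot(dx, dy) / dt_s
    probable_error_rate = min(0.5, 0.01 + 0.002 * speed_mps)     # assumed model
    probable_packet_loss = min(0.5, 0.005 + 0.003 * speed_mps)   # assumed model
    factor = (1.0 - probable_error_rate) * (1.0 - probable_packet_loss)
    return int(base_bitrate_kbps * factor) # adjusted content transmission bit rate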

US Pat. No. 10,433,005

METHODS AND SYSTEMS FOR PRESENTING INFORMATION ABOUT MULTIPLE MEDIA ASSETS

ROVI GUIDES, INC., San J...

1. A method for presenting information about programs, the method comprising:generating for simultaneous display a first listing for a first program and a second listing for a second program;
determining a user's progress in the first program and the user's progress in the second program;
detecting, with control circuitry, a first plot event corresponding to a first portion of the first program;
determining, with the control circuitry, a first type associated with the first plot event;
selecting, with the control circuitry, a first graphical property from a plurality of graphical properties that corresponds to the first type;
detecting, with the control circuitry, a second plot event corresponding to a second portion of the second program;
determining, with the control circuitry, a second type associated with the second plot event, wherein the second type is different from the first type;
selecting, with the control circuitry, a second graphical property from the plurality of graphical properties that corresponds to the second type; and
generating for simultaneous display a first progress bar with the first graphical property for the first program and a second progress bar with the second graphical property for the second program.

US Pat. No. 10,433,004

RECEPTION APPARATUS, RECEIVING METHOD, AND PROGRAM

Sony Semiconductor Soluti...

1. A reception apparatus comprising:a processor; and
a memory, the memory storing program code executable by the processor to perform operations comprising:
receiving a data stream formed of a series of packets;
obtaining a transmission parameter from the received data stream;
specifying a time stamp increment value corresponding to the obtained transmission parameter to provide a specified time stamp increment value;
updating a time stamp by adding the specified time stamp increment value to a previous time stamp;
adding the updated time stamp to the packets configuring the data stream; and
selectively outputting the packets to which the updated time stamp has been added,
wherein specifying the time stamp increment value includes calculating a transmission rate on the basis of the obtained transmission parameter and determining the time stamp increment value on the basis of the calculated transmission rate.
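
A minimal Python sketch of deriving a time stamp increment from a transmission parameter and re-stamping packets; the rate lookup table and the 27 MHz clock are assumptions:

def timestamp_increment(transmission_parameter, packet_bits=188 * 8, clock_hz=27000000):
    # Assumed lookup from transmission parameter to transmission rate (bits/s),
    # then conversion to a per-packet increment in clock ticks.
    rate_bps = {0: 1000000, 1: 2000000, 2: 4000000}[transmission_parameter]
    return round(packet_bits / rate_bps * clock_hz)

def restamp(packets, transmission_parameter, start_ticks=0):
    inc = timestamp_increment(transmission_parameter)
    ts = start_ticks
    for pkt in packets:
        ts += inc                          # update the time stamp by the increment
        yield ts, pkt                      # attach the updated time stamp to the packet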

US Pat. No. 10,433,001

BROADCAST RECEIVING APPARATUS AND CONTROL METHOD THEREOF

SAMSUNG ELECTRONICS CO., ...

1. A broadcast receiving apparatus comprising:a plurality of tuners, each tuner of the plurality of tuners configured to be tuned to receive a broadcast signal; and
at least one processor configured to:
assign a plurality of operations to be performed in the plurality of tuners, the plurality of operations being different from one another and being performed in sequence to receive content from the broadcast signal,
vary the assignment of the plurality of operations to be performed in the plurality of tuners depending on a progress of the plurality of operations, and
receive the content from the broadcast signal.

US Pat. No. 10,432,996

MATCHING DATA OBJECTS TO VIDEO CONTENT

1. A method by an application server for matching information to video content for creation of a relatively large volume of information-to-video content matches, comprising:providing, by an application processor and via a network access device, a list of videos to be reviewed for tagging with a tag that identifies a limited portion of the video content of a corresponding video listed in the list of videos;
receiving, from each of a plurality of devices each associated with a corresponding user account and via the network access device, a message including at least one vote indicating that the tag corresponds to a Graphical User Interface (GUI) object that includes information topically relevant to a subject of video content of one video of the list of videos;
validating, by the application processor, that the tag corresponds to the GUI enabled data object based at least in part on a number of received votes compared to a threshold number of votes; and
assigning, by the application processor, the video content to an authorized account in response to the number of received votes reaching or exceeding the threshold number of votes; and
receiving, from a device associated with the authorized account, data indicating that the tag corresponds to the GUI enabled data object;
wherein validating that the tag corresponds to the GUI enabled data object is further determined based on receiving the data indicating that the tag corresponds to the GUI enabled data object from the device associated with the authorized account.
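
A minimal Python sketch of the vote-threshold validation flow; the threshold value and the return conventions are assumptions:

def process_votes(vote_count, threshold=10):
    # Reaching the threshold assigns the video content to an authorized account.
    return "assign_to_authorized_account" if vote_count >= threshold else "keep_collecting"

def validate_tag(vote_count, authorized_confirmation, threshold=10):
    # Validation requires both the crowd vote count and the confirming data
    # received from the device associated with the authorized account.
    return vote_count >= threshold and authorized_confirmation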

US Pat. No. 10,432,995

VIDEO DISPLAY DEVICE, TERMINAL DEVICE, AND METHOD THEREOF

LG ELECTRONICS INC., Seo...

1. A method for communicating with at least one server and at least one mobile device in a video display apparatus connected to a set top box (STB), comprising:receiving uncompressed audio or video frames through an external input port from the STB;
extracting fingerprints from the uncompressed audio or video frames;
transmitting the fingerprints to a first server;
receiving a first URL from the first server, the first URL being used to retrieve supplementary content from a second server different from the first server;
retrieving the supplementary content from the second server based on the first URL received from the first server;
performing a bidirectional communication with the at least one mobile device;
transmitting a second URL for accessing information related to an application to the at least one mobile device, the second URL being different from the first URL;
receiving a subscription message from the at least one mobile device, the subscription message comprising service information for identifying a service between the video display apparatus and the at least one mobile device; and
transmitting a notification message to the at least one mobile device when the at least one mobile device is subscribed with the video display apparatus for the identified service between the video display apparatus and the at least one mobile device.

US Pat. No. 10,432,993

OUTPUT AND PLAYBACK CONTROL DELIVERY FOR ADAPTIVE BITRATE STREAMING

ARRIS Enterprises LLC, S...

1. A method of customizing an adaptive bitrate streaming session, comprising:establishing a network connection session between a client device and a manifest delivery controller;
receiving a manifest request at said manifest delivery controller from said client device, wherein said manifest request identifies a video;
determining whether one or more playback control rules and/or output control rules have been set at said manifest delivery controller that apply to said video and/or said client device; and
sending a manifest and control tags from said manifest delivery controller to said client device, wherein the control tags are provided separate from the manifest,
wherein said manifest identifies locations of a plurality of adaptive bitrate chunks of said video,
wherein said control tags are associated with the playback control rules and/or output control rules that said manifest delivery controller determined applied to said video and/or said client device, and
wherein the manifest directly identifies the locations of the adaptive bitrate chunks rather than rely on a link and the control tags directly deliver the playback control rule information rather than rely on a link.

US Pat. No. 10,432,990

APPARATUS AND METHODS FOR CARRIER ALLOCATION IN A COMMUNICATIONS NETWORK

Time Warner Cable Enterpr...

1. A method of operating a computerized network controller apparatus in a digital content network to allocate radio frequency (RF) spectrum using at least a modulator apparatus, the method comprising:receiving, at a data communication interface of the computerized network controller apparatus, data identifying each of (i) a prioritized portion of the RF spectrum, and (ii) a non-prioritized portion of the RF spectrum;
receiving, at the data communication interface of the computerized network controller apparatus via the digital content network, data indicative of one or more user requests for content from one or more computerized user devices;
isolating the prioritized portion of the RF spectrum from a carrier selection algorithm operative to control the modulator apparatus, the carrier selection algorithm configured to allocate RF carriers for delivery of digital content based at least in part on the receipt of the data indicative of the one or more user requests for content;
causing utilization of the non-prioritized portion of the RF spectrum to dynamically service a first request of the one or more user requests for content, the first request corresponding to first digitally rendered content, the causing utilization of the non-prioritized portion of the spectrum based at least in part on a determination that the first digitally rendered content is not currently delivered on the prioritized portion of the RF spectrum; and
causing utilization of the prioritized portion of the RF spectrum to service a second request of the one or more user requests for content, the second request corresponding to second digitally rendered content, the prioritized portion of the spectrum not subject to allocation of RF carriers for delivery of the second digitally rendered content based on receipt of the data indicative of the one or more user requests for content.

US Pat. No. 10,432,989

TRANSMISSION APPARATUS, TRANSMISSION METHOD, RECEPTION APPARATUS, RECEIVING METHOD, AND PROGRAM

SATURN LICENSING LLC, Ne...

1. A transmission method comprising a step of delivering a Layered Coding Transport (LCT) packet including a portion and an LCT header, the portion being data including part of a fragment, wherein the fragment includes:
a movie fragment (moof); and
a media data (mdat) including an mdat header and a sample group,
the moof includes BaseMediaDecodeTime representing a presentation time of a first sample of the mdat, and
the LCT header includes:
a sequence number representing a position of the fragment;
a version representing a position of the part of the fragment in the fragment;
a header extension portion including a Network Time Protocol (NTP) time representing the presentation time of the first sample of the mdat;
sample count start information representing a position of a first sample of the part of the fragment from a first sample of the fragment; and
a moof subset that is at least part of the moof.

US Pat. No. 10,432,987

VIRTUALIZED AND AUTOMATED REAL TIME VIDEO PRODUCTION SYSTEM

Cisco Technology, Inc., ...

15. An apparatus comprising:at least one processor; and
at least one memory element storing data, which, when executed on the processor, performs an operation comprising:
receiving real-time metadata about a plurality of video streams;
receiving information associated with a directed stream, the information comprising a plurality of time segments defined by a director, wherein each respective time segment specifies a respective begin time and a respective end time, and identifies a respective video stream of the plurality of video streams selected, wherein the respective video stream was selected by the director for display in the directed stream during the respective time segment;
determining, based on the information, that the directed stream will display a first video stream during a first time segment;
generating a subsidiary stream for a first group of users by:
generating a first score for content included in the first video stream during the first time segment, based on real-time metadata associated with the first video stream and further based on a first set of rules associated with the first group of users;
generating a second score for content included in a second video stream during the first time segment, based on real-time metadata associated with the second video stream and further based on the first set of rules; and
upon determining that the second score is greater than the first score, outputting the second video stream in the subsidiary stream during the first time segment.
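
A minimal Python sketch of the per-group scoring and override decision; the rule representation and the linear scoring are assumptions:

def choose_stream(directed_id, candidate_ids, rules, metadata):
    # candidate_ids: video streams available during the time segment.
    # rules: dict of metadata key -> weight for one group of users (assumed form).
    # metadata: dict of stream id -> dict of real-time metadata values.
    def score(stream_id):
        values = metadata.get(stream_id, {})
        return sum(weight * values.get(key, 0.0) for key, weight in rules.items())

    best = max(candidate_ids, key=score)
    # Output the director's selection unless another stream scores strictly higher.
    return best if score(best) > score(directed_id) else directed_id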

US Pat. No. 10,432,985

METHOD AND APPARATUS FOR GENERATING QUALITY ESTIMATORS

1. A method comprising:detecting, by a processing system comprising a processor, from an image pair comprising a first image and a second image, a first distortion type associated with the first image and a second distortion type associated with the second image, wherein the image pair is received from a server;
generating, by the processing system, a preference model according to the first distortion type and the second distortion type, wherein the preference model corresponds to a probability that the first image is preferred over the second image; and
providing, by the processing system, the preference model to the server, wherein the server distributes media content to viewer equipment, and wherein the media content is assigned distortion effects utilizing a selected distortion based on the preference model.

US Pat. No. 10,432,983

LIVE VIDEO CLASSIFICATION AND PREVIEW SELECTION

Twitter, Inc., San Franc...

1. A computing device comprising:at least one processor; and
a non-transitory computer-readable medium having executable instructions stored thereon that, when executed by the at least one processor, are configured to:
for each of a plurality of live video streams available for viewing via a live video sharing platform:
obtain a portion of the live video stream, the portion being a segment generated by a streaming protocol,
assign the portion to a class using a video classifier, each class used in the video classifier having an associated tag indicating whether the class is preview-eligible or not preview-eligible, wherein a class with a preview-eligible tag has an associated percentage that represents rare occurrence within a statistically relevant sample of segments of live video streams, the percentage representing a quantity of segments in the sample that are classified into the class compared with a total quantity of segments in the sample,
determine, based on the tag for the class, whether the portion is preview-eligible, and
generate, responsive to determining that the portion is preview-eligible, a snippet of the live video stream using the portion, and
provide at least some of the snippets for display in a user interface, the snippets provided for display in the user interface being selectable and the user interface being configured to, responsive to a user selecting a first snippet of the snippets provided for display in the user interface, enable the user to join the live video stream corresponding to the first snippet.
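
A minimal Python sketch of preview-eligibility filtering; the classifier and the tag table are assumed inputs:

def preview_snippets(segments, classify, preview_eligible):
    # segments: list of (stream_id, segment) pairs, one segment per live stream,
    #           where each segment was produced by the streaming protocol.
    # classify: video classifier mapping a segment to a class label.
    # preview_eligible: dict of class label -> True if the class carries a
    #                   preview-eligible tag (e.g. a rarely occurring class).
    snippets = []
    for stream_id, segment in segments:
        if preview_eligible.get(classify(segment), False):
            snippets.append((stream_id, segment))   # use this portion as the snippet
    return snippets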

US Pat. No. 10,432,982

ADAPTIVE BITRATE STREAMING LATENCY REDUCTION

ARRIS Enterprises LLC, S...

1. A method of transmitting media content, comprising:receiving an adaptive transport stream description at an HTTP streamer from a media preparation unit, the adaptive transport stream description describing media content available from the media preparation unit as one or more adaptive transport streams, wherein each of said one or more adaptive transport streams are continuous streams comprising a plurality of switchable segments each comprising one or more delivery chunks, the switchable segments being marked with segment boundary points and the delivery chunks being marked with chunk boundary points, wherein positions between each of said plurality of switchable segments are positions at which a client device can switch to a different one of said one or more adaptive transport streams;
publishing a playlist with said HTTP streamer listing identifiers for one or more of said plurality of switchable segments, the switchable segments including delivery chunks;
receiving said one or more adaptive transport streams into a memory buffer at said HTTP streamer from said media preparation unit;
receiving a request at said HTTP streamer from said client device for a particular switchable segment identified on said playlist to be received at a requested bit rate;
responding to said request in the HTTP streamer by processing the received one or more adaptive transport streams on the fly in real time by:
continuing receipt of delivery chunks of a switchable segment prior to the particular switchable segment until the delivery chunks reach a segment boundary point;
identifying boundary marks of the one or more chunks in only the particular switchable segment; and
transmitting the one or more delivery chunks from said particular switchable segment at the requested bit rate to said client device using HTTP chunked transfer encoding until a terminating segment boundary point is reached,
wherein each of said one or more delivery chunks are portions of the particular switchable segment that are independently decodable by said client device, such that said HTTP streamer is configured to begin sending delivery chunks from a requested switchable segment so that said client device begins decoding and rendering received delivery chunks when said HTTP streamer has not yet received additional ones of the switchable segments from said media preparation unit.

US Pat. No. 10,432,980

INHERITANCE IN SAMPLE ARRAY MULTITREE SUBDIVISION

GE VIDEO COMPRESSION, LLC...

1. A decoder for reconstructing an array of information samples encoded in a data stream and representing video information, the decoder comprising:an extractor configured for:
extracting, from the data stream, inheritance information associated with an inheritance coding block of the array of information samples, the inheritance information indicating as to whether inheritance is used, wherein the inheritance coding block corresponds to a first hierarchy level of a sequence of hierarchy levels and is composed of a set of coding sub-blocks, each of which corresponds to a second hierarchy level of the sequence of hierarchy levels, the first hierarchy level being indicated with a lower value than that of the second hierarchy level,
extracting, from the data stream if the inheritance is used with respect to the inheritance coding block, an inheritance subset associated with the inheritance coding block, the inheritance subset including at least one syntax element of a predetermined syntax element type, and
extracting, from the data stream, respective residual information associated with each of the set of coding sub-blocks; and
a predictor configured for:
copying the inheritance subset including the at least one syntax element into a set of syntax elements representing coding parameters used in an inter coding process corresponding to each of the set of coding sub-blocks,
determining, for each of the set of coding sub-blocks, a coding parameter used in the inter coding process associated with the corresponding coding sub-block based on the at least one syntax element, and
predicting a respective prediction signal for each of the set of coding sub-blocks based on the coding parameter determined for the coding sub-block,
wherein each of the set of coding sub-blocks is reconstructed based on the respective prediction signal and the respective residual information.

US Pat. No. 10,432,976

IMAGE PROCESSING APPARATUS AND METHOD

VELOS MEDIA, LLC, Plano,...

1. An image processing apparatus comprising:a processor; and
a memory storing instructions that, when executed by the processor, cause the processor to:
decode, from encoded data, identification information indicating whether a non-compression mode has been selected in a coding unit, wherein the encoded data includes the coding unit and the identification information, the coding unit being formed by block partitioning a largest coding unit (LCU) into a plurality of coding units, wherein the block partitioning of the LCU includes recursively splitting the LCU into the plurality of coding units; and
decode the coding unit in the encoded data using the identification information by:
if the identification information indicates that the non-compression mode has not been selected in the coding unit, decoding the coding unit according to a first bit depth, and
if the identification information indicates that the non-compression mode has been selected in the coding unit, decoding the coding unit according to a second bit depth.

US Pat. No. 10,432,974

METHODS AND APPARATUS TO PERFORM FRACTIONAL-PIXEL INTERPOLATION FILTERING FOR MEDIA CODING

Intel Corporation, Santa...

1. A method, comprising:applying a finite impulse response (FIR) filter to samples of a source signal to generate an array of values;
after applying the FIR filter, applying an infinite impulse response (IIR) filter to the array of the values to generate fractional-pixel interpolated values; and
at least one of storing the fractional-pixel interpolated values in an encoded video data structure, outputting the fractional-pixel interpolated values to a display interface, or using the fractional-pixel interpolated values as prediction data to encode a future frame.
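
A minimal Python sketch of the FIR-then-IIR ordering recited in the claim; the tap values and the single-pole IIR are assumptions, not the patented filters:

def fir_then_iir(samples, fir_taps=(0.25, 0.5, 0.25), iir_pole=0.5):
    # Stage 1: FIR filter over the source samples.
    n, k = len(samples), len(fir_taps)
    fir_out = [sum(fir_taps[j] * samples[i + j] for j in range(k))
               for i in range(n - k + 1)]
    # Stage 2: first-order IIR filter applied to the FIR output,
    # y[i] = x[i] + pole * y[i - 1], producing fractional-pixel interpolated values.
    out, prev = [], 0.0
    for x in fir_out:
        prev = x + iir_pole * prev
        out.append(prev)
    return out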

US Pat. No. 10,432,973

CONSTRAINTS AND UNIT TYPES TO SIMPLIFY VIDEO RANDOM ACCESS

Microsoft Technology Lice...

1. A method, comprising:encoding a broken link access (BLA) picture;
encoding one or more leading pictures associated with the BLA picture; and
generating a bitstream that comprises the encoded BLA picture and the one or more encoded leading pictures, wherein the generating the bitstream further comprises generating in the bitstream explicit indications for each of the one or more encoded leading pictures indicating whether a respective leading picture is decodable or not decodable when pictures from before the BLA picture in decoding order are unavailable to a decoder.

US Pat. No. 10,432,972

GUIDED OFFSET CORRECTION FOR LOOP RESTORATION IN VIDEO CODING

GOOGLE LLC, Mountain Vie...

1. A method of reducing error in a reconstructed frame comprising pixels, the method comprising:classifying the pixels into available offset classes based on a classification scheme, wherein the classification scheme includes multiple classifications associated with respective pixel characteristics, and for a classification of the multiple classifications:
the classification has a respective plurality of classification classes; and
each of the plurality of classification classes of the classification is defined by respective ranges of values of a pixel characteristic associated with the classification, wherein the available offset classes into which the pixels may be classified are determined as respective combinations of classification classes of the multiple classifications, and
wherein classifying the pixels comprises:
assigning a pixel of the pixels to a respective classification class of at least two classifications of the multiple classifications based on values of the pixel and the respective ranges of values defining the plurality of classification classes of each of the at least two classifications; and
assigning the pixel to a single offset class of the available offset classes based on a combination of the classification classes of the at least two classifications to which the pixel is assigned;
for each offset class of those of the available offset classes that include pixels after the classifying:
determining an offset value for the offset class;
applying the offset value for the offset class to each pixel of the offset class resulting in offset-adjusted pixels of the offset class; and
determining, for the offset class, an error reduction in using the offset value for the offset class as compared to omitting the offset value for the offset class, the error reduction based on the pixels of the offset class in the reconstructed frame, the offset-adjusted pixels of the offset class, and co-located source pixels in a source frame decoded to generate the reconstructed frame; and
selecting, for reducing error in the reconstructed frame, a subset of those of the available offset classes that include pixels after the classifying based on the error reductions.
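To make the classification-and-offset idea concrete, the sketch below classifies reconstructed pixels by two toy characteristics (an intensity band and a horizontal-gradient band, both chosen arbitrarily), derives a per-class offset from co-located source pixels, and measures the squared-error reduction per class. It is a simplified stand-in, not the patent's classification scheme, and the band boundaries are assumptions.

```python
import numpy as np

def classify(recon):
    """Combine two toy classifications (intensity band, gradient band) into one offset class."""
    intensity_class = np.minimum(recon // 64, 3)                  # 4 intensity bands
    grad = np.abs(np.diff(recon, axis=1, prepend=recon[:, :1]))
    gradient_class = np.minimum(grad // 32, 3)                    # 4 gradient bands
    return intensity_class * 4 + gradient_class                   # 16 combined offset classes

def per_class_offsets(recon, source):
    classes = classify(recon)
    results = {}
    for c in np.unique(classes):
        mask = classes == c
        offset = np.round(np.mean(source[mask] - recon[mask]))
        before = np.sum((source[mask] - recon[mask]) ** 2)
        after = np.sum((source[mask] - (recon[mask] + offset)) ** 2)
        results[int(c)] = (float(offset), float(before - after))  # (offset, error reduction)
    return results

rng = np.random.default_rng(0)
source = rng.integers(0, 256, size=(16, 16)).astype(np.int64)
recon = np.clip(source + rng.integers(-8, 9, size=source.shape), 0, 255)
# Keep only the classes whose offset actually reduces error (the "selected subset").
selected = {c: v for c, v in per_class_offsets(recon, source).items() if v[1] > 0}
print(selected)
```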

US Pat. No. 10,432,971

IMAGE DATA COMPRESSION AND DECOMPRESSION USING MINIMIZE SIZE MATRIX ALGORITHM

Sheffield Hallam Universi...

1. A data processing device comprising at least one data processor and a non-transitory computer readable medium coupled to the at least one data processor, the non-transitory computer readable medium storing instructions that when executed by the at least one data processor cause the at least one data processor to perform a process comprising:applying a discrete cosine (DCT) transformation to each of a plurality of non-overlapping pixel blocks which span a frame of image data to generate a set of DCT coefficients for each pixel block comprising a DC DCT coefficient and a plurality of AC DCT coefficients;
quantising each set of DCT coefficients to generate a set of quantised DC DCT coefficients and a set of quantised AC DCT coefficients;
forming a DC array from the set of quantised DC DCT coefficients;
forming an AC matrix from the set of quantised AC DCT coefficients;
forming a limited data array comprising elements having values corresponding only to each unique value of the elements of the AC matrix;
compressing the AC matrix by eliminating blocks of data of the AC matrix having only zero values and forming a reduced AC array from blocks of data of the AC matrix including non-zero values;
storing a position in the AC matrix of each block of data of the AC matrix including non-zero values in a location array;
generating a key using a maximum value of the elements of the reduced AC array, and wherein the key comprises a plurality of key components;
compressing the reduced AC array using the key to form a coded AC array, wherein a same number of elements of the reduced AC array as a number of key components are combined using the key to form a single element of the coded AC array;
arithmetically coding the DC array and the coded AC array to form arithmetically coded data; and
forming a compressed image file including the arithmetically coded data, storing the location array in a header of the compressed image file and storing the key and the limited data array.
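The sketch below illustrates a few of the steps listed above under simplifying assumptions: block DCT, quantisation, separation of DC and AC coefficients, and elimination of all-zero AC blocks while recording their positions. It assumes SciPy is available for the 2-D DCT, uses a single illustrative quantisation step, and omits the key-based coding and arithmetic-coding stages entirely.

```python
import numpy as np
from scipy.fft import dctn   # 2-D DCT; assumes SciPy >= 1.4 is installed

BLOCK, Q = 8, 16  # block size and a single illustrative quantisation step

def compress_frame(frame):
    h, w = frame.shape
    dc_array, reduced_ac, location_array = [], [], []
    for by in range(0, h, BLOCK):
        for bx in range(0, w, BLOCK):
            block = frame[by:by + BLOCK, bx:bx + BLOCK].astype(float)
            coeffs = np.round(dctn(block, norm="ortho") / Q).astype(int)  # DCT + quantise
            dc_array.append(coeffs[0, 0])            # quantised DC coefficient -> DC array
            ac = coeffs.flatten()[1:]                # quantised AC coefficients
            if np.any(ac != 0):                      # keep only blocks with non-zero AC values
                reduced_ac.append(ac)
                location_array.append((by, bx))      # remember where the kept block came from
    return np.array(dc_array), reduced_ac, location_array

frame = np.zeros((32, 32))
frame[5:21, 5:21] = 200          # a simple synthetic image with one bright square
dc, ac_blocks, locations = compress_frame(frame)
print(len(dc), "DC values,", len(ac_blocks), "non-zero AC blocks at", locations)
```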

US Pat. No. 10,432,967

MULTIPLEX METHOD AND ASSOCIATED FUNCTIONAL DATA STRUCTURE FOR COMBINING DIGITAL VIDEO SIGNALS

1. A method for coding a set of at least two compressed digital images received from at least two video participants of a video conference in a multipoint control unit, wherein the compressed digital images are chronologically synchronous images of different chronological image sequences in different data sets of the at least two video participants and are divided into macroblocks of pixels coded with color value statements, including intraprediction macroblocks; wherein the coding occurs in an area which is divided into first areas, each of which is occupied by the said macroblocks of one of the compressed digital images, and a second area by which the first areas are spaced in parallel from each other, wherein the second area is occupied by pixels with a color value default for the intraprediction, to avoid decompression errors during intrapredictions, wherein all the pixels of the second area have this color value default and the second area in each case spaces apart two of the first areas in parallel by a distance corresponding to at least one of the quadratic macroblocks; the method comprising:compressing each of the images into at least a first data stream portion which comprises at least one portion of the macroblocks, said portion being reduced by at least physical redundancies for transmission of the compressed images to video conference participants as a compilation of the different chronologically synchronous images received from the at least two video participants, and a second data stream portion assigned to the first data stream portion, said second data stream portion describing the reduced physical redundancies,
wherein for each of the intraprediction macroblocks, the first data stream portion is reduced by color value statements with correlations to color values from at least one line of pixels which are arranged outside and on an edge of the intraprediction macroblock and for which the color value default is used in the case of pixels outside the compressed image, and the second data stream portion comprises intrapredictors to describe the correlations to the color values; and
wherein the color value default is a pre-selected color value that separates the first areas from the second area so that all pixels in the second area have a same color value for separation of the different digital images received from the at least two video participants that are compressed into the first data stream portion during the compressing of each of the images so that the first data stream portion is decompressible and decodable for displaying the at least two digital images of the compilation of the different chronologically synchronous images.

US Pat. No. 10,432,963

BIT DEPTH VARIABLE FOR HIGH PRECISION DATA IN WEIGHTED PREDICTION SYNTAX AND SEMANTICS

ARRIS Enterprises LLC, S...

1. A method for decoding a bitstream, the method comprising:identifying one or more weight flags signaled in the bitstream that indicates presence of weighting factors for at least one of a luma component and/or a chroma component;
determining a first weighting factor for performing weighted prediction for a current unit of a current picture the first weighting factor for weighting pixels of a first reference unit of a first reference picture when performing motion compensation for the current unit;
determining a second weighting factor for weighting pixels of a second reference unit of a second reference picture when performing motion compensation for the current unit,
wherein when weighting factors for a luma component is present:
determining from a signaled delta_luma_weight_l0 syntax a difference of the first weighting factor and the second weighting factor applied to a luma prediction value for list 0 prediction using a variable RefPicList0[i] for a first luma component, and
deriving a variable LumaWeightL0 associated with the luma component weighting factors, wherein when the one or more weight flags indicates presence of the weighting factor for a luma component, LumaWeightL0 is derived to be equal to (1<<luma_log2_weight_denom)+delta_luma_weight_l0 in a range of −(1<<(BitDepthY−1)), (1<<(BitDepthY−1))−1, inclusive,
wherein luma_log2_weight_denom is a base 2 logarithm of a denominator for all luma weighting factors, and BitDepthY is a bit depth for the luma component of the respective reference picture; and
wherein when weighting factors for a chroma component is present:
determining from a delta_chroma_weight_l0[i][j] syntax a difference of the first weighting factor and the second weighting factor applied to a chroma prediction value for list 0 prediction using a variable RefPicList0[i] with j equal to 0 for Cb or j equal to 1 for Cr for a second component; and
deriving a variable ChromaWeightL0 associated with the chroma component weighting factor, wherein when the one or more weight flags indicates presence of the weighting factor for a chroma component, ChromaWeightL0 is derived to be equal to (1<<(luma_log2_weight_denom+delta_chroma_log2_weight_denom))+delta_chroma_weight_l0, delta_chroma_weight_l0 in a range of −(1<<(BitDepthC−1)), (1<<(BitDepthC−1))−1, inclusive,
wherein delta_chroma_log2_weight_denom is a difference of a base 2 logarithm of a denominator for all chroma weighting factors, and BitDepthC is a bit depth for the chroma component of the respective reference picture;
wherein the delta_chroma_weight_l0[i][j] syntax is within the range set by the first value, and
wherein the second component comprises a chroma component of the first reference unit or the second reference unit.
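A small sketch of the luma weighting-factor derivation and the bit-depth-dependent range described above, with illustrative input values. Treating the range as a validity check on the signalled delta (mirroring the chroma clause) is an assumption made for demonstration.

```python
def derive_luma_weight(luma_log2_weight_denom, delta_luma_weight_l0, bit_depth_y):
    """LumaWeightL0 = (1 << luma_log2_weight_denom) + delta_luma_weight_l0,
    with the delta assumed to lie in [-(1 << (BitDepthY-1)), (1 << (BitDepthY-1)) - 1]."""
    lo = -(1 << (bit_depth_y - 1))
    hi = (1 << (bit_depth_y - 1)) - 1
    if not (lo <= delta_luma_weight_l0 <= hi):
        raise ValueError(f"delta_luma_weight_l0 out of range [{lo}, {hi}]")
    return (1 << luma_log2_weight_denom) + delta_luma_weight_l0

# Example: 10-bit luma, denominator 2^6, signalled delta of -3.
print(derive_luma_weight(luma_log2_weight_denom=6, delta_luma_weight_l0=-3, bit_depth_y=10))
```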

US Pat. No. 10,432,962

ACCURACY AND LOCAL SMOOTHNESS OF MOTION VECTOR FIELDS USING MOTION-MODEL FITTING

PIXELWORKS, INC., Portla...

1. A method of producing video data, comprising:receiving, at a processor, a current frame of image data in a stream of frames of image data;
dividing a current frame of image data into blocks;
identifying a current block and defining a neighborhood of blocks for the current block;
generating at least one initial motion vector for each block;
using the initial motion vector for current block and an initial motion model to calculate a weight for each initial motion vector in the neighborhood based on a difference between initial motion vector for the current block and the initial motion vector for at least one other block from the current block in the neighborhood and differences in the image data between the current block and the other blocks in the neighborhood;
using the weights for each initial motion vector to generate coefficients for a refined motion model;
refining the initial motion vector for the current block according to the refined motion model to produce a refined motion vector;
using the refined motion vector and the pixels in the stream of frames to produce at least one of adjusted pixels and new pixels; and
displaying the at least one of adjusted pixels and new pixels on a display.
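To make the weighting step more tangible, the sketch below computes a weight for each neighbouring block's initial motion vector from (a) its motion-vector difference to the current block and (b) an image-data difference, then refines the current vector as the weighted mean. The Gaussian-style weight formula, the sigma values, and the use of a weighted mean in place of a fitted motion model are all assumptions for illustration.

```python
import math

def refine_motion_vector(current_mv, neighbor_mvs, data_diffs,
                         sigma_mv=4.0, sigma_data=16.0):
    """Weight each neighbouring MV by its similarity to the current MV and by the
    image-data difference of its block, then return the weighted mean vector."""
    weights, wx, wy = [], 0.0, 0.0
    for (mvx, mvy), d in zip(neighbor_mvs, data_diffs):
        mv_dist = math.hypot(mvx - current_mv[0], mvy - current_mv[1])
        w = math.exp(-(mv_dist / sigma_mv) ** 2) * math.exp(-(d / sigma_data) ** 2)
        weights.append(w)
        wx += w * mvx
        wy += w * mvy
    total = sum(weights)
    return (wx / total, wy / total) if total > 0 else current_mv

current = (2.0, 1.0)
neighbors = [(2.5, 1.0), (2.0, 0.5), (9.0, -7.0)]   # the last neighbour is an outlier
data_diffs = [3.0, 5.0, 40.0]                        # SAD-like differences to the current block
print(refine_motion_vector(current, neighbors, data_diffs))
```

Because the outlier neighbour differs strongly in both motion and image data, its weight is nearly zero, so the refined vector stays close to the locally consistent motion.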

US Pat. No. 10,432,956

IMAGE CODING DEVICE, IMAGE DECODING DEVICE, IMAGE CODING METHOD, AND IMAGE DECODING METHOD

Mitsubishi Electric Corpo...

1. An image decoding device comprising:a variable length decoder for performing a variable-length-decoding process on a coded data multiplexed into a bitstream to obtain a coding mode for each of coding blocks; and
a prediction image generator that carries out a prediction process corresponding to said coding mode to generate a prediction image, said prediction image generator carrying out an intra prediction process for a current partition which is predicted by intra mode;
wherein said variable length decoder obtains from said bitstream an intra merge flag indicating whether or not an intra prediction parameter of said current partition is identical to that of an adjacent partition located above or to the left of said current partition,
wherein when there are two or more partitions adjacent to top or left of said current partition, a first partition in a direction away from a top left of said current partition is selected as said adjacent partition,
wherein when said intra merge flag indicates that said intra prediction parameter of said current partition is identical to that of said adjacent partition, said variable length decoder obtains from said bitstream an intra merge direction specifying, out of said adjacent partitions located above and to the left of said current partition, the adjacent partition whose intra prediction parameter is identical to that of said current partition, and
when said intra prediction parameter of said current partition is not identical to that of said adjacent partition, said variable length decoder obtains from said bitstream said intra prediction parameter for said current partition.

US Pat. No. 10,432,954

VIDEO ENCODER, VIDEO ENCODING SYSTEM AND VIDEO ENCODING METHOD

NVIDIA CORPORATION, Sant...

1. A video encoding system, comprising a controller, a first video encoder, a second video encoder, and a memory, where the video encoding system:divides a frame of an image into a predetermined number of predetermined portions, where the predetermined number of the predetermined portions is based on a number of a plurality of video encoders within the video encoding system;
sends, from the controller to a first video encoder of the plurality of video encoders, a command to encode a first predetermined portion of the frame of the image;
sends, from the controller to a second video encoder of the plurality of video encoders, a command to encode a second predetermined portion of the frame of the image separate from the first predetermined portion of the frame of the image;
retrieves from the memory, by the first video encoder, the first predetermined portion of the frame of the image;
retrieves from the memory, by the second video encoder, the second predetermined portion of the frame of the image;
encodes, by the first video encoder, the first predetermined portion of the frame of the image to create a first encoded portion of the frame of the image, wherein during the encoding of the first predetermined portion of the frame of the image by the first video encoder, a value of an image height and width register used by the first video encoder is set based on a height and width of the first predetermined portion of the frame of the image, and a value of a macro block (MB) position register used by the first video encoder is set based on a position of a macro block in the image;
encodes, by the second video encoder, the second predetermined portion of the frame of the image to create a second encoded portion of the frame of the image different from the first encoded portion of the frame of the image, wherein during the encoding of the second predetermined portion of the frame of the image by the second video encoder, a value of an image height and width register used by the second video encoder is set based on a height and width of the second predetermined portion of the frame of the image, and a value of a macro block (MB) position register used by the second video encoder is set based on the position of the macro block in the image;
writes, by the first video encoder, the first encoded portion of the frame of the image to the memory; and
writes, by the second video encoder, the second encoded portion of the frame of the image to the memory.

US Pat. No. 10,432,951

CONFORMANCE AND INOPERABILITY IMPROVEMENTS IN MULTI-LAYER VIDEO CODING

QUALCOMM Incorporated, S...

1. A method of processing video data comprising:receiving coded video data having a plurality of output operation points;
extracting a selected output operation point from the plurality of output operation points, the selected output operation point being a sub-bitstream of an entire bitstream;
performing a first bitstream conformance test on the selected output operation point when the selected output operation point corresponds to one of an entire bitstream with only the base layer to be output, and a temporal subset of the entire bitstream with only the base layer to be output, the first bitstream conformance test being based on a set of sequence-level hypothetical reference decoder (HRD) parameters in an active sequence parameter set (SPS) for a base layer, and one or more non-nested supplemental enhancement information (SEI) messages, wherein the non-nested SEI messages comprise one of decoding unit information (DUI), buffering period (BP), and picture timing (PT) SEI messages, and the non-nested SEI messages are directly included in an SEI network abstraction layer (NAL) unit,
performing a second bitstream conformance test on the selected output operation point when the selected output operation point corresponds to one of a layer set specified by a base video parameter set (VPS) of an active VPS and a temporal subset of the layer set with only the base layer to be output, the second bitstream conformance test being based on a set of sequence-level HRD parameters in the base VPS and directly nested SEI messages, and
performing a third bitstream conformance test on the selected output operation point when the selected output operation point corresponds to one of an output layer set (OLS) specified by a VPS extension of the active VPS and a temporal subset of the OLS, the third bitstream conformance test being based on a set of sequence-level HRD parameters in the active VPS and indirectly nested SEI messages; and
applying the indirectly nested SEI messages only when the selected output operation point corresponds to an OLS specified in the VPS extension, the indirectly nested SEI messages being one of BP, PT, and DUI SEI messages.

US Pat. No. 10,432,948

DETERMINING INTRA PREDICTION MODE OF IMAGE CODING UNIT AND IMAGE DECODING UNIT

SAMSUNG ELECTRONICS CO., ...

1. A method of decoding an image, the method comprising:obtaining first information that indicates an intra prediction mode of a luminance block from a bitstream;
obtaining second information that indicates an intra prediction mode of a chrominance block corresponding to the luminance block from the bitstream;
performing intra prediction on the luminance block based on the intra prediction mode of the luminance block; and
performing intra prediction on the chrominance block based on the intra prediction mode of the chrominance block,
wherein the intra prediction mode of the luminance block includes a particular direction among a plurality of directions and the particular direction is indicated by one of (i) dx number in a horizontal direction and a fixed number in a vertical direction, and (ii) dy number in the vertical direction and a fixed number in the horizontal direction,
wherein the dx number and the dy number are determined among {26, 21, 17, 13, 9, 5, 2, −2, −5, −9, −13, −17, −21, −26} according to the intra prediction mode of the luminance block,
wherein the fixed number in the vertical direction and the fixed number in the horizontal direction are 2^5,
wherein the performing intra prediction on the luminance block comprising:
determining one of (i) a left neighboring pixel of a first previous luminance block adjacent to a left side of the luminance block and decoded prior to the luminance block and (ii) an up neighboring pixel of a second previous luminance block adjacent to an upper side of the luminance block and decoded prior to the current luminance block, the left neighboring pixel is determined based on j*dy>>5 and the up neighboring pixel is determined based on i*dx>>5, where a location of a current pixel of the luminance block is (j,i), where j and i are integers,
wherein, when the second information indicates that the intra prediction mode of the chrominance block is equal to the intra prediction mode of the luminance block, the intra prediction mode of the chrominance block is determined to be equal to the intra prediction mode of the luminance block,
wherein the image is split into a plurality of maximum coding units according to information about maximum size of a coding unit,
a maximum coding unit, of the plurality of maximum coding units, is hierarchically split into one or more coding units of depths including at least one of a current depth and a lower depth according to split information,
when the split information indicates a split for the current depth, a coding unit of the current depth is split into four coding units of the lower depth, independently from neighboring coding units, and
when the split information indicates a non-split for the current depth, one or more prediction units are obtained from the coding unit of the current depth based on a partition type of the coding unit.
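A brief sketch of the angular-reference arithmetic the claim uses: for a pixel at (j, i), the up neighbouring pixel is located via (i*dx)>>5 and the left neighbouring pixel via (j*dy)>>5, the fixed denominator being 2^5 = 32. How the offsets are combined with the block position and how reference samples are fetched is simplified away here, so the helper below only computes the offsets.

```python
def reference_offsets(j, i, dx, dy):
    """Offsets used to pick the reference sample for the pixel at (j, i):
    the up neighbouring pixel uses (i * dx) >> 5 and the left neighbouring pixel
    uses (j * dy) >> 5, with the fixed denominator 2^5 = 32 (arithmetic shift)."""
    up_offset = (i * dx) >> 5
    left_offset = (j * dy) >> 5
    return up_offset, left_offset

# dx and dy are drawn from {26, 21, 17, 13, 9, 5, 2, -2, -5, -9, -13, -17, -21, -26}.
print(reference_offsets(j=3, i=5, dx=13, dy=-9))
```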

US Pat. No. 10,432,947

METHOD AND APPARATUS FOR DECODING VIDEO, AND METHOD AND APPARATUS FOR CODING VIDEO

SAMSUNG ELECTRONICS CO., ...

1. A video decoding method comprising:determining neighboring pixels of a current block to be used for performing intra prediction on the current block;
acquiring, from a bitstream, information indicating one of a plurality of filtering methods used on the neighboring pixels;
selecting one of the plurality of filtering methods according to the acquired information;
filtering the neighboring pixels by using the selected filtering method; and
performing the intra prediction on the current block by using the filtered neighboring pixels,
wherein the plurality of filtering methods comprise a spatial domain filtering method and a frequency domain filtering method, wherein the spatial domain filtering method filters the neighboring pixels in a spatial domain, and the frequency domain filtering method filters the neighboring pixels in a frequency domain.

US Pat. No. 10,432,946

DE-JUDDERING TECHNIQUES FOR CODED VIDEO

Apple Inc., Cupertino, C...

1. A video coding method, comprising:coding a source video sequence as base layer coded video at a first frame rate;
estimating frame rate conversion operations of a decoding terminal,
applying estimated frame rate conversion operations of the decoding terminal on decoded base layer data,
identifying a portion of the decoded base layer data having judder following the application of frame rate conversion operations of the decoding terminal,
for the identified portion, including a skip hint in coded video data to indicate that a decoder should omit frame rate conversion operations for the respective portion of video data, and
for the identified portion, coding additional frames of the source video sequence corresponding to the identified portion as coded enhancement layer data, the coded enhancement layer data and the coded base layer data when decoded generating recovered video representing the identified portion at a higher frame rate than the coded base layer data when decoded by itself.

US Pat. No. 10,432,943

SIGNALING COLOR VALUES FOR 3D LOOKUP TABLE FOR COLOR GAMUT SCALABILITY IN MULTI-LAYER VIDEO CODING

QUALCOMM Incorporated, S...

1. A method of decoding video data, the method comprising:determining a number of octants for each of three color components of a three-dimensional (3D) lookup table for color gamut scalability;
determining a quantization value for residual values of the color mapping coefficients;
for each of the octants for each of the color components, decoding color mapping coefficients for a linear color mapping function of color values in the 3D lookup table used to convert color data in a first color gamut for a lower layer of the video data to a second color gamut for a higher layer of the video data, wherein decoding the color mapping coefficients further comprises:
for each of the octants for each of the color components, decoding residual values of the color mapping coefficients;
inverse quantizing the residual values of the color mapping coefficients based on the determined quantization value; and
reconstructing the color mapping coefficients based on the decoded residual values and predicted values of the color mapping coefficients;
generating the 3D lookup table based on the number of octants for each of the color components and color values associated with the color mapping coefficients for each of the octants;
decoding residual data of video blocks of the video data; and
reconstructing the video blocks of the video data based on the decoded residual data and at least one reference picture generated using the 3D lookup table.
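A compact sketch of the coefficient-reconstruction step above: each decoded residual is inverse-quantised and added to its predicted coefficient. The shift-based inverse quantisation, the predictor values, and the flattening of the per-octant, per-component loop into one list are assumptions made for illustration.

```python
def reconstruct_coefficients(predicted, residuals, quant_value):
    """coefficient = predicted + (residual << quant_value), one entry per mapping coefficient."""
    return [p + (r << quant_value) for p, r in zip(predicted, residuals)]

# Illustrative values: 4 colour-mapping coefficients of one octant of one colour component.
predicted = [64, 0, 0, 128]
residuals = [3, -1, 0, 2]
print(reconstruct_coefficients(predicted, residuals, quant_value=2))   # -> [76, -4, 0, 136]
```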

US Pat. No. 10,432,936

APPARATUS AND METHODS FOR PERCEPTUAL QUANTIZATION PARAMETER (QP) WEIGHTING FOR DISPLAY STREAM COMPRESSION

QUALCOMM Incorporated, S...

1. An apparatus for coding video data using display stream compression, comprising:an encoder configured to code a current block of video data using the YCoCg color space comprising a luma channel, a chrominance orange (Co) channel, and a chrominance green (Cg) channel; and
a rate controller comprising a hardware processor, the rate controller configured to:
determine a luma quantization parameter (QP) for quantizing the luma channel of the current block of video data; and
based upon the determined luma QP, determine a Cg QP for quantizing the Cg channel of the current block of video data and a Co QP for quantizing the Co channel of the current block of video data, wherein the Cg QP and the Co QP are greater than the luma QP, and wherein the rate controller is configured to determine the Co QP such that the Co QP will always be greater than the Cg QP;
wherein the encoder is configured to encode the current block of video data based upon the determined luma QP, Co QP, and Cg QP to form a video data bitstream for display or transmission.
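A toy sketch of the chroma-QP ordering constraint recited above (Co QP greater than Cg QP, both greater than the luma QP). The fixed offsets are arbitrary placeholders, not the weighting actually used in display stream compression.

```python
def derive_channel_qps(luma_qp, cg_offset=2, co_offset=4):
    """Derive Cg and Co QPs from the luma QP so that co_qp > cg_qp > luma_qp always holds."""
    assert co_offset > cg_offset > 0, "offsets must preserve the required ordering"
    cg_qp = luma_qp + cg_offset
    co_qp = luma_qp + co_offset
    return cg_qp, co_qp

print(derive_channel_qps(luma_qp=20))   # -> (22, 24)
```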

US Pat. No. 10,432,934

VIDEO ENCODING DEVICE AND VIDEO DECODING DEVICE

NEC Corporation, Tokyo (...

1. A video encoding device for dividing input video data into blocks of a predetermined size and applying quantization to each image block obtained by division, to perform a compression-encoding process, comprising:at least one processor configured to execute machine-readable instructions to implement:
a quantization step size encoding unit configured to encode a quantization step size for controlling granularity of the quantization;
a quantization step size downsampling unit configured to downsample one or more encoded quantization step sizes to generate a quantization step size representative value;
a quantization step size representative value storing unit configured to store the quantization step size representative values generated by the quantization step size downsampling unit;
a quantization step size downsampling control unit configured to control an operation of the quantization step size downsampling unit based on a predetermined operation parameter including at least one of a downsampling scale factor or information indicating a type of computation when generating the quantization step size representative value; and
a multiplexer configured to multiplex at least the operation parameter of the quantization step size downsampling unit, in a compression-encoded video bitstream,
wherein the quantization step size encoding unit is configured to predict the quantization step size using the quantization step size representative value.
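The sketch below illustrates the downsampling-and-prediction idea in the claim above: quantisation step sizes of already-encoded blocks are downsampled by a scale factor (here, by averaging) into representative values, and the latest representative is used as the predictor so that only a difference needs to be encoded. The averaging computation and the scale factor are assumptions standing in for the signalled operation parameter.

```python
def downsample_step_sizes(step_sizes, scale_factor):
    """Average each run of `scale_factor` encoded quantisation step sizes into one representative."""
    reps = []
    for start in range(0, len(step_sizes), scale_factor):
        group = step_sizes[start:start + scale_factor]
        reps.append(round(sum(group) / len(group)))
    return reps

encoded_steps = [24, 26, 25, 27, 30, 31, 29, 28]
representatives = downsample_step_sizes(encoded_steps, scale_factor=4)   # -> [26, 30]
next_step = 32
residual_to_encode = next_step - representatives[-1]   # predict from the latest representative
print(representatives, residual_to_encode)
```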

US Pat. No. 10,432,923

3D DISPLAY SYSTEM

KOREA PHOTONICS TECHNOLOG...

1. A unit light source module of 3D display system, the unit light source module comprising:a light emitting unit including a plurality of point light sources corresponding to a number of viewpoints; and
a light collecting unit configured to output a light outputted from said plurality of point light sources by collecting the light while being spaced apart at a predetermined distance from said light emitting unit,
wherein said plurality of point light sources is arranged for implementing only one of a horizontal parallax and a vertical parallax or both of the horizontal parallax and the vertical parallax based on:
a relationship in size between a width of each point light source and a center distance between adjacent point light sources; and
a direction in which the plurality of point light sources is arranged.

US Pat. No. 10,432,922

MEDICAL DEVICES, SYSTEMS, AND METHODS USING EYE GAZE TRACKING FOR STEREO VIEWER

INTUITIVE SURGICAL OPERAT...

1. An eye tracking system, comprising:an image display comprising a first coordinate frame and configured to display an image of a surgical field comprising a second coordinate frame to a user, wherein the user is in a third coordinate frame, the image display configured to emit a light in a first wavelength range;
a right eye tracker configured to emit light in a second wavelength range and to measure data about a first gaze point of a right eye of the user;
a left eye tracker configured to emit light in the second wavelength range and to measure data about a second gaze point of a left eye of the user;
an optical assembly disposed between the image display and the right and left eyes of the user, the optical assembly configured to direct the light of the first and second wavelength ranges such that the light of the first and second wavelength ranges share at least a portion of a left optical path between the left eye and the image display and share at least a portion of a right optical path between the right eye and the image display, without the right and left eye trackers being visible to the user; and
at least one processor configured to process the data about the first gaze point and the second gaze point to determine a viewing location in the displayed image at which the first gaze point and the second gaze point of the user is directed.

US Pat. No. 10,432,921

AUTOMATED PANNING IN ROBOTIC SURGICAL SYSTEMS BASED ON TOOL TRACKING

Intuitive Surgical Operat...

1. A digital zoom and panning system, the system comprising:an endoscopic camera device to capture digital video images of a surgical site;
an image buffer coupled to the endoscopic camera device, the image buffer to store one or more frames of the digital video images as source pixels;
a first display device having first pixels to display images;
a first user interface displayed on the first display device to accept a first user input to display a fovea, the first user input including selection of a first source pixel array of source pixels within a first frame of the digital video images with reference to the surgical site and selection of a first target pixel array of target pixels within a subset of the first pixels of the first display device;
a first digital mapping and filtering device coupled between the image buffer and the first display device, the first digital mapping and filtering device to selectively map and filter source pixels in a first region of interest from the image buffer into target pixels in a first destination rectangle for the first display device; and
a tracking system to track a position of at least one object to digitally pan and display the fovea in the first destination rectangle based on one or more tracked positions of the at least one object.

US Pat. No. 10,432,920

IMMERSIVE COMPACT DISPLAY GLASSES

TESSELAND, LLC, Glendale...

1. A display device comprising:a display, operable to generate a real image comprising a plurality of object pixels; and
an optical system, comprising an array of a plurality of lenslets, arranged to generate an immersive virtual image from the real image, the immersive virtual image comprising a plurality of image pixels, by each lenslet projecting light from the display to a respective pupil range, wherein the lenslets comprise at least two lenslets that cannot be made to coincide by a simple translation rigid motion;
wherein the pupil range comprises an area on the surface of an imaginary sphere of from 21 to 27 mm diameter, the pupil range including a circle subtending 15 degrees whole angle at the center of the sphere;
wherein the object pixels are grouped into clusters, each cluster associated with a lenslet, so that the lenslet produces from the object pixels a partial virtual image comprising image pixels, and the partial virtual images combine to form said immersive virtual image;
wherein imaging light rays falling on said pupil range through a given lenslet come from pixels of the associated cluster, and said imaging light rays falling on said pupil range from object pixels of a given cluster pass through the associated lenslet;
wherein said imaging light rays exiting a given lenslet towards the pupil range and virtually coming from any one image pixel of the immersive virtual image are generated from a single object pixel of the associated cluster.

US Pat. No. 10,432,918

THREE-DIMENSIONAL DISPLAY DEVICE AND METHOD FOR THE SAME

BOE TECHNOLOGY GROUP CO.,...

1. A three-dimensional display device, comprising:a display panel comprising a plurality of pixels arranged as an array, wherein the plurality of pixels comprises left eye pixels and right eye pixels alternately arranged in each row or each column;
a light-splitting device disposed as being parallel to the display panel and configured for projecting lights emitted by the left eye pixels and right eye pixels to different view areas, wherein the light-splitting device is at a side facing a viewer of the display panel, and a distance h is presented between the light-splitting device and the display panel along a light emission direction of the display device;
a distance adjusting device configured to adjust the distance h between the display panel and the light-splitting device along the light emission direction of the display device according to a variation of a distance H between the viewer and the display panel along the light emission direction of the display device,
wherein a proportion of the distance h to the distance H is unchanged; and
wherein the proportion of the distance h to the distance H is identical to a/(a+I), wherein a is a width of each of the plurality of the pixels, and I is an interpupillary distance of the viewer.
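A worked example of the proportion in the claim above: since h/H = a/(a + I), the adjusting device can recompute the panel-to-splitter distance h for any viewer distance H. The pixel width and interpupillary distance below are illustrative numbers.

```python
def splitter_distance(viewer_distance_mm, pixel_width_mm, interpupillary_mm):
    """h = H * a / (a + I), keeping the proportion h/H fixed at a/(a + I)."""
    return viewer_distance_mm * pixel_width_mm / (pixel_width_mm + interpupillary_mm)

# Example: a = 0.06 mm pixel width, I = 65 mm interpupillary distance.
for H in (500.0, 800.0, 1200.0):          # viewer distances in mm
    print(H, "->", round(splitter_distance(H, 0.06, 65.0), 4), "mm")
```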

US Pat. No. 10,432,917

3D IMAGE DISPLAY DEVICE

1. A 3D image display device, comprising:a display module comprising a plurality of pixels and an image composed of the pixels, wherein the pixels are arranged in a first direction; and
a first lenticular array comprising a plurality of strip-shaped first lenticular lenses, wherein an angle between an extension direction of the first lenticular lens and the first direction is larger than or equal to 45 degrees;
wherein the image composed of the pixels is created by the steps of:
(a) providing a capture device, a subject to be captured, and a lenticular array, wherein the lenticular array comprises a plurality of strip-shaped lenticular lenses, a length of a bottom of each lenticular lens is 2L, and a center of the bottom is set as 0;
(b) placing the capture device to aim at a top of one of the lenticular lenses and a point between −xL to xL of a bottom coordinate of the lenticular lens, and capturing the subject until a capturing for a plurality of pixels corresponding to the range from −xL to xL of the bottom coordinate of the lenticular lens is finished, wherein the value of x is smaller than 1 but greater than 0;
(c) mapping pixels corresponding to the range from −xL to 0 and from xL to 0 captured by the capture device to pixels corresponding to a range from −L to −xL and from L to xL of the bottom coordinate; and
(d) repeating the steps (b) to (c) for the others of the lenticular lenses.

US Pat. No. 10,432,914

GRAPHICS PROCESSING SYSTEMS AND GRAPHICS PROCESSORS

Arm Limited, Cambridge (...

1. A method of operating a graphics processing system, the graphics processing system comprising a graphics processing pipeline comprising a primitive generation stage and a pixel processing stage, the method comprising:processing input data in the primitive generation stage to produce first primitive data associated with a first view of a scene and second primitive data associated with a second view of the scene;
processing the first primitive data in the pixel processing stage to produce first pixel-processed data associated with the first view;
determining, for second pixel-processed data associated with the second view, whether to use the first pixel-processed data as the second pixel-processed data or whether to process the second primitive data in the pixel processing stage to produce the second pixel-processed data, wherein the determining is based on one or more geometric properties of the first primitive data and/or one or more geometric properties of the second primitive data, and wherein the one or more geometric properties include a predetermined parallax property;
identifying that the first primitive data and the second primitive data have the predetermined parallax property in response to determining that one or more offsets between one or more positions of one or more vertices of one or more primitives in the first primitive data and one or more positions of one or more corresponding vertices of one or more corresponding primitives in the second primitive data do not exceed one or more predetermined offset thresholds; and
performing additional processing in the graphics processing pipeline based on the determining.
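A small sketch of the offset test in the claim above: corresponding vertex positions in the two views are compared, and if no offset exceeds a threshold the first view's pixel-processed data can simply be reused for the second view. The threshold value and the vertex layout are illustrative assumptions.

```python
def has_predetermined_parallax(verts_view1, verts_view2, max_offset=0.5):
    """True if every corresponding vertex pair is offset by at most max_offset in
    both x and y; in that case the second view can reuse the first view's
    pixel-processed data instead of re-running the pixel processing stage."""
    return all(abs(x1 - x2) <= max_offset and abs(y1 - y2) <= max_offset
               for (x1, y1), (x2, y2) in zip(verts_view1, verts_view2))

left_view = [(10.0, 4.0), (12.0, 9.5), (7.5, 6.0)]
right_view = [(10.3, 4.0), (12.2, 9.5), (7.7, 6.1)]
print(has_predetermined_parallax(left_view, right_view))   # True -> reuse first view's data
```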

US Pat. No. 10,432,912

TARGET, METHOD, AND SYSTEM FOR CAMERA CALIBRATION

Waymo LLC, Mountain View...

1. A target used for calibration, comprising:a first pattern of fiducial markers;
a second pattern of fiducial markers,
wherein the first pattern is a scaled version of the second pattern, such that a calibration image captured of the target simulates multiple images of a single pattern captured at multiple calibration perspectives; and
one or more panel-identification fiducial markers that uniquely identify the target used for calibration.

US Pat. No. 10,432,906

RECORDING MEDIUM, PLAYBACK DEVICE, AND PLAYBACK METHOD

PANASONIC INTELLECTUAL PR...

1. A playback method of a playback device that reads out from a recording medium and plays a plurality of High Dynamic Range (HDR) video streams that is encoded video information and that has a wider luminance range than Standard Dynamic Range (SDR), the playback device including:a first register storing first information indicating, for each of a plurality of playback formats for the plurality of HDR video streams, whether the playback device corresponds to or not,
a second register storing second information indicating, for each of the plurality of playback formats for the plurality of HDR video streams, whether a display device connected to the playback device corresponds to or not, and
a third register storing third information indicating, for each of the plurality of playback formats for the plurality of HDR video streams, a priority of the playback formats based on a user's preference,
the playback method comprising:
playing the plurality of HDR video streams using a playback format with a highest priority among the priority indicated by the third information, in a case where the first information and the second information indicate that there are a plurality of playback formats corresponding to both the playback device and the display device for the plurality of HDR video streams.
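A minimal sketch of the selection rule above: the formats supported by both the playback device (first register) and the connected display (second register) are intersected, and the supported format with the highest user priority (third register) is chosen. Encoding the registers as Python sets and an ordered list is a simplification, and the format names are placeholders.

```python
def choose_playback_format(device_supported, display_supported, priority_order):
    """Pick the highest-priority HDR playback format supported by both the playback
    device and the connected display; return None if no common format exists."""
    usable = device_supported & display_supported
    for fmt in priority_order:            # priority_order lists the highest priority first
        if fmt in usable:
            return fmt
    return None

device_supported = {"HDR10", "HLG", "DolbyVision"}
display_supported = {"HDR10", "HLG"}
priority_order = ["DolbyVision", "HLG", "HDR10"]   # user preference, highest first
print(choose_playback_format(device_supported, display_supported, priority_order))  # "HLG"
```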

US Pat. No. 10,432,905

METHOD AND APPARATUS FOR OBTAINING HIGH RESOLUTION IMAGE, AND ELECTRONIC DEVICE FOR SAME

GUANGDONG OPPO MOBILE TEL...

1. An image processing method, applied in an electronic device, wherein the electronic device comprises an imaging apparatus comprising an image sensor, the image sensor comprises an array of photosensitive pixel units and an array of filter units arranged on the array of photosensitive pixel units, each filter unit corresponds to one photosensitive pixel unit, and each photosensitive pixel unit comprises a plurality of photosensitive pixels, the image processing method comprises:outputting a merged image by the image sensor, wherein, the merged image comprises an array of merged pixels, and the photosensitive pixels in a same photosensitive pixel unit are collectively output as one merged pixel;
determining whether there is a target object in the merged image; and
when there is the target object in the merged image, converting the merged image into a restoration image using a second interpolation algorithm, wherein the restoration image comprises restoration pixels arranged in an array, each photosensitive pixel corresponds to one restoration pixel; and converting the restoration image into a merged true-color image.

US Pat. No. 10,432,904

IMAGE PROCESSING DEVICE AND OPERATIONAL METHOD THEREOF

Samsung Electronics Co., ...

1. An image processing device, comprising:an image sensor module including a lens and an image sensor;
an actuator; and
a processor configured to:
obtain, using the image sensor module, a first image having first color information, the first image corresponding to an external object;
move at least one of the lens and the image sensor based on a designated pixel unit;
obtain, using the image sensor module with the moved at least one of the lens and the image sensor, a second image having second color information, the second image corresponding to the external object;
generate a third image having third color information based on the first color information and the second color information, the third image corresponding to the external object,
wherein the processor is further configured to:
obtain brightness information using the image sensor; and
obtain the second image based on the sensed brightness information.

US Pat. No. 10,432,903

3D LASER PROJECTION, SCANNING AND OBJECT TRACKING

FARO TECHNOLOGIES, INC., ...

1. A laser projection system comprising:at least one laser projector having a laser source and a laser detector; and
one or more processors for executing non-transitory computer readable instructions, the one or more processors operatively coupled to the laser projector, the non-transitory computer readable instructions comprising:
determining the component is stationary based at least in part on the emitting of a first laser light toward at least one reference target on the component and receiving a portion of the first laser light by the laser detector;
in response to the determining the component is stationary:
determining a transformation for mapping a first coordinate system of the laser projector to a second coordinate system of the component; and
forming a visual pattern on the component based at least in part on the transformation; and
determining that the component is in motion based at least in part on receiving a portion of the first laser light by the laser detector and removing the visual pattern from the component.

US Pat. No. 10,432,899

IMAGE DISPLAY DEVICE

PANASONIC INTELLECTUAL PR...

1. An image display device comprising:a light source configured to emit laser light;
a screen configured to be two-dimensionally scanned with the laser light to draw an image on the screen;
a scanning unit configured to scan the screen with the laser light;
a drive unit configured to drive the scanning unit so that the laser light moves on the screen along a plurality of scan lines at predetermined intervals; and
an optical system configured to generate a virtual image of the image drawn on the screen,
wherein on the screen, a plurality of lens regions are arranged so as to line up in two directions different from each other, and
rows in one of the two directions of the lens regions are respectively inclined relatively at a predetermined inclination angle, between but not including 0 degrees and 90 degrees, with respect to main scan directions of the laser light to the screen.

US Pat. No. 10,432,895

COMMUNICATION TERMINAL, IMAGE COMMUNICATION SYSTEM, COMMUNICATION METHOD, AND NON-TRANSITORY RECORDING MEDIUM

RICOH COMPANY, LTD., Tok...

1. A communication terminal, comprising:circuitry configured to
control a display to display a predetermined area image which is a part of a whole image, the whole image being shared with another communication terminal;
receive a request to display a destination setting screen;
receive, in response to a user input based on the destination setting screen, a request to change a destination terminal to be the another communication terminal and a destination terminal identifier that identifies the another communication terminal; and
transmit display change information to be received by the another communication terminal, wherein
the display change information includes an identifier of the communication terminal, the destination terminal identifier and predetermined area information, and
the predetermined area information indicates a predetermined area associated with the predetermined area image.

US Pat. No. 10,432,893

AD HOC ENDPOINT DEVICE ASSOCIATION FOR MULTIMEDIA CONFERENCING

Google LLC, Mountain Vie...

1. A computer-implemented method comprising:preparing, in a multimedia conference between a first, a second and a third participant device, a set of video streams for the third participant device, wherein the set of video streams is based on a first video stream associated with the first participant device and a second video stream associated with the second participant device;
determining that the first participant device and the second participant device are located within a geographical area based on the first participant device emitting an audio signal that is detected by the second participant device;
designating, in response to determining that the first participant device and the second participant device are located within the geographical area, the first participant device and the second participant device to operate as an ad hoc endpoint in the multimedia conference, such that the first participant device and the second participant device share at least one resource in the multimedia conference;
determining that a display area of the first participant device is larger than a display area of the third participant device; and
excluding, based on designating the first participant device and the second participant device to operate as the ad hoc endpoint and based on the display area of the first participant device being larger than the display area of the third participant device, the second video stream from the set of video streams for the third participant device,
wherein the first video stream includes video representations of a first user of the first participant device and a second user of the second participant device, and a video representation of the second user via the second video stream is excluded for the third participant device.

US Pat. No. 10,432,892

SYSTEMS AND METHODS FOR INTEGRATING AND CONDUCTING VIDEO SESSIONS

United Services Automobil...

19. At least one non-transitory computer-readable medium comprising a set of instructions that, when executed by one or more processors, cause a machine to perform the operations of:engaging, via a communications network, in an interaction with a user via a channel on a device associated with the user, wherein the device is a wearable;
determining whether the interaction is eligible for a video session with a representative;
actively monitoring activities of the user via the wearable to determine a potential need for the user to engage with the representative via the video session;
in response to the activities indicating a potential need to engage in the video session with the representative, sending, to the device, a link that routes the user directly to a uniquely skilled representative;
after the user selects the link requesting the video session, routing the video session to the uniquely skilled representative, the uniquely skilled representative determined based at least in part on the interaction and a location of the device; and
transferring the video session from the device associated with the user to a second device associated with the user in response to the user touching the device with the second device.

US Pat. No. 10,432,891

VEHICLE HEAD-UP DISPLAY SYSTEM

MAGNA ELECTRONICS INC., ...

1. A display system for a vehicle, said display system comprising:a head-up display unit disposed in a vehicle and operable to display information at a display area that is viewable by a driver of the vehicle when the driver is normally operating the vehicle;
wherein, when not displaying information at the display area but when said head-up display unit projects some light, there is a postcard effect at the display area;
wherein said head-up display unit comprises a display screen and projects light through said display screen for displaying information at the display area, and wherein said head-up display unit comprises a compensation film that attenuates light passing through said display screen to reduce the postcard effect at the display area of the vehicle; and
wherein, when said head-up display unit is deactivated and has a glow that is visible at the windshield, said compensation film diffuses edges of the display area to reduce sharp lines between dark grey, where the display area is located, and black, at areas surrounding the display area, so that the glow is not readily visible to and discernible by a person viewing the display area.

US Pat. No. 10,432,890

AUDIO-VISUAL SYSTEM AND METHOD FOR CONTROLLING THE SAME

SAMSUNG ELECTRONICS CO., ...

1. An audio-visual system comprising:a housing comprising an upper end portion having an opening and a storage space inside the housing;
a display configured to be stored in the storage space inside the housing and to be moved into and out of the housing through the opening of the upper end portion of the housing, the display having a display area for displaying video contents;
an actuator configured to move the display into and out of the housing through the opening of the upper end portion of the housing;
a speaker; and
a processor configured to:
output an audio content through the speaker while an entirety of the display area of the display is disposed inside the housing,
control the actuator to move the display out of the housing through the opening of the upper end portion of the housing such that the entirety of the display area of the display is disposed outside the housing, and
output a video content on the display area of the display while the entirety of the display area of the display is disposed outside the housing.

US Pat. No. 10,432,887

DISPLAY APPARATUS AND OPERATING CHANNEL SETTING METHOD FOR DISPLAY APPARATUS

SAMSUNG ELECTRONICS CO., ...

1. A display apparatus comprising:a communication interface configured to be wirelessly connected to at least one of a first access point and a portable device; and
a processor configured to receive device information of the portable device, which comprises first information about a wireless connection between the portable device and the first access point and second information about whether the portable device supports a real simultaneous dual band (RSDB) connection, from the portable device, and
determine a peer-to-peer (P2P) operating channel between the display apparatus and the portable device, based on device information of the display apparatus which comprises third information about a wireless connection between the display apparatus and the first access point and fourth information about whether the display apparatus supports a RSDB connection, and the received device information of the portable device,
wherein, the processor is further configured to:
determine the P2P operating channel by using the first information and the third information,
control the display apparatus to connect to the portable device through the determined P2P operating channel,
identify whether the display apparatus and the portable device support the RSDB connection based on the second information and the fourth information, and
determine the peer-to-peer (P2P) operating channel between the display apparatus and the portable device further based on identifying whether the display apparatus and the portable device support the RSDB connection.
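A toy sketch of one possible channel-selection policy consistent with the claim above: if both devices support RSDB, the P2P link can run on a band other than the access-point connections; otherwise a channel already in use with the access point is reused. The concrete policy, channel numbers, and inputs below are assumptions for illustration only.

```python
def choose_p2p_channel(tv_ap_channel, phone_ap_channel, tv_rsdb, phone_rsdb):
    """Pick a Wi-Fi Direct operating channel for the display apparatus and the portable device."""
    if tv_rsdb and phone_rsdb:
        # Both support real simultaneous dual band: run P2P on a 5 GHz channel (e.g. 36)
        # while the access-point links stay on their current channels.
        return 36
    if tv_ap_channel == phone_ap_channel:
        return tv_ap_channel              # share the common AP channel to avoid switching
    return phone_ap_channel               # otherwise follow the portable device's AP channel

print(choose_p2p_channel(tv_ap_channel=6, phone_ap_channel=6, tv_rsdb=False, phone_rsdb=True))
```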

US Pat. No. 10,432,886

DISPLAY SYSTEM, DISPLAY APPARATUS, AND CONTROLLING METHOD THEREOF

SAMSUNG ELECTRONICS CO., ...

1. A display system comprising:a display apparatus;
an image providing apparatus configured to provide an image to the display apparatus; and
a remote control device configured to transmit control signals for controlling the image providing apparatus,
wherein the display apparatus is configured to:
control the remote control device to transmit a first control signal included in a first control code set to the image providing apparatus,
based on the image provided by the image providing apparatus being changed in response to sensing of the first control signal, acquire apparatus information, which identifies the image providing apparatus, based on the changed image and the first control signal, and
based on the image not being changed in response to sensing of the first control signal, control the remote control device to transmit a second control signal included in a second control code set to the image providing apparatus,
wherein at least one of a manufacturer or a model corresponding to the second control code set is different from at least one of a manufacturer or a model corresponding to the first control code set.

US Pat. No. 10,432,884

IMAGING ELEMENT, IMAGING METHOD AND ELECTRONIC APPARATUS

Sony Corporation, Tokyo ...

1. An imaging device, comprising:a plurality of pixels including a first pixel and a second pixel;
a first signal line coupled to the first pixel;
a second signal line coupled to the second pixel;
a comparator coupled to the first and the second signal lines, the comparator including:
a first differential pair unit including:
a first differential transistor coupled to the first signal line;
a second differential transistor coupled to a reference signal generation circuit;
a first select transistor coupled to the first differential transistor; and
a second select transistor coupled to the second differential transistor; and
a second differential pair unit including:
a third differential transistor coupled to the second signal line;
a fourth differential transistor coupled to the reference signal generation circuit;
a third select transistor coupled to the third differential transistor; and
a fourth select transistor coupled to the fourth differential transistor.

US Pat. No. 10,432,882

IMAGING DEVICE AND ENDOSCOPE SYSTEM

OLYMPUS CORPORATION, Tok...

1. An imaging device, comprising:a plurality of pixels configured to output a pixel current according to incident light;
a reference current generation circuit configured to generate a reference current;
a differential current generation circuit to which the pixel current and the reference current are input, and configured to generate a differential current according to a difference between the pixel current and the reference current;
a reference voltage generation circuit configured to generate a first reference voltage and a second reference voltage;
a conversion circuit to which the differential current and the first reference voltage are input, and configured to convert the differential current into an output voltage on the basis of the first reference voltage; and
an output circuit to which the output voltage and the second reference voltage are input, and configured to output the output voltage and the second reference voltage,
wherein the second reference voltage is higher than the first reference voltage when the output voltage at the time of resetting of the plurality of pixels is higher than the output voltage at the time of exposure of the plurality of pixels, and
the second reference voltage is lower than the first reference voltage when the output voltage at the time of resetting of the plurality of pixels is lower than the output voltage at the time of exposure of the plurality of pixels.

US Pat. No. 10,432,879

DUAL CONVERSION GAIN HIGH DYNAMIC RANGE IMAGE SENSOR READOUT CIRCUIT MEMORY STORAGE STRUCTURE

OmniVision Technologies, ...

10. An image sensing system, comprising:a pixel array including a plurality of dual conversion gain (DCG) pixels arranged into a plurality of rows and a plurality of columns;
control circuitry coupled to the pixel array to control operation of the pixel array;
a plurality of readout circuits coupled to the pixel array to read out pixel data from the pixel array, wherein the pixel data includes low conversion gain (LCG) pixel data and high conversion gain (HCG) data, wherein each readout circuit includes:
a ramp generator coupled to output a ramp signal;
a comparator, wherein a first input of the comparator is coupled to receive the ramp signal from the ramp generator, wherein a second input of the comparator is coupled to a respective one of a plurality of column bitline outputs of the pixel array to receive an output signal from one of the plurality of DCG pixels, wherein the output signal is one of an LCG signal or an HCG signal;
a counter coupled to receive an output of the comparator;
a first memory circuit and a second memory circuit coupled to receive an output of the counter, wherein the counter is coupled to write to only one of the first and second memory circuits at a time in response to a memory write select signal;
a first multiplexor, wherein a first input of the first multiplexor is coupled to receive an initial value, wherein a second input of the first multiplexor is coupled to receive an initial memory value from the first memory circuit, wherein the counter is coupled to load either the initial value or the initial memory value from an output of the first multiplexor in response to an initial select signal;
a second multiplexor, wherein a first input of the second multiplexor is coupled to the first memory circuit, wherein a second input of the second multiplexor is coupled to the second memory circuit, wherein the second multiplexor is coupled to load either an LCG memory value from the first memory circuit or an HCG memory value from the second memory circuit from an output of the second multiplexor in response to a memory read select signal; and
a data transmitter circuit coupled to the output of the second multiplexor to transmit pixel data of the pixel array; anda digital processor coupled to the readout circuits to receive the pixel data from the pixel array.

US Pat. No. 10,432,877

IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD AND PROGRAM STORAGE MEDIUM FOR PROTECTING PRIVACY

NEC CORPORATION, Tokyo (...

1. An imaging processing system comprising:a memory storing computer program instructions; and
at least one processor configured to execute the computer program instructions to:
generate a foreground image by extracting, from a first image frame, a difference area in which a difference from a background image is not less than a certain threshold;
extract an edge portion from the generated foreground image; and
generate a second image frame to be output by superimposing the edge portion and the background image,
wherein transparency of the second image frame is dependent on an intensity of the edge portion.
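Claim 1's three steps map directly onto array operations. A minimal numpy sketch, assuming grayscale frames, a simple gradient-magnitude edge extractor, and an illustrative threshold and blend rule:

    import numpy as np

    def privacy_frame(frame, background, diff_threshold=30.0, edge_gain=1.0):
        # Inputs: grayscale arrays of the same shape.
        frame = frame.astype(float)
        background = background.astype(float)

        # 1. Foreground image: keep pixels whose difference from the background
        #    image is not less than the threshold.
        diff = np.abs(frame - background)
        foreground = np.where(diff >= diff_threshold, frame, 0.0)

        # 2. Edge portion: finite-difference gradient magnitude, an illustrative
        #    stand-in for whatever edge extractor is actually used.
        gy, gx = np.gradient(foreground)
        edge = np.hypot(gx, gy)

        # 3. Second frame: superimpose the edge portion on the background image,
        #    with the edge contribution scaled by its intensity (the claimed
        #    intensity-dependent transparency).
        alpha = np.clip(edge_gain * edge / (edge.max() + 1e-9), 0.0, 1.0)
        return (1.0 - alpha) * background + alpha * 255.0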

US Pat. No. 10,432,876

IMAGING APPARATUS CAPABLE OF SWITCHING DISPLAY METHODS

MAXELL, LTD., Kyoto (JP)...

1. An imaging apparatus comprising:an imager configured to perform image pickup and obtain a pickup image;
an image display configured to display the pickup image;
a touch panel configured to enable a user to select a subject included in the pickup image displayed on the image display;
a controller configured to control the image display to display the pickup image or a superimposed image including the pickup image and a cutout image which is cut out and generated for displaying a part of an area of the pickup image with magnification;
an operation input interface configured to switch an image displayed on the image display from the pickup image to the superimposed image in accordance with an operation of a user; and
a recording button configured to operate a start and an end of a recording of the pickup image obtained by the imager when a moving picture is recorded,
wherein the controller is further configured to:
when the subject included in the pickup image displayed on the image display is selected via the touch panel, recognize and chase the selected subject in the pickup image,
in a case that a position of the selected subject that is recognized and chased in the pickup image is changed, automatically change the area which is cut out as the cutout image from the pickup image in accordance with the changed position of the subject selected in the pickup image, and
when the superimposed image including the pickup image and the cutout image including the selected subject is displayed on the image display in accordance with the operation via the operation input interface and an operation of the start of the recording for recording the moving picture by the recording button is detected, switch from displaying the superimposed image to displaying the pickup image on the image display.

US Pat. No. 10,432,871

SYSTEMS AND METHODS FOR IMAGING USING A RANDOM LASER

Yale University, New Hav...

1. An active interrogation imaging system, the imaging system comprising:a complex laser having a mutual coherence of less than one tenth and a photon degeneracy of greater than 10² that produces a plurality of independent lasing modes with uncorrelated phase relationships and distinct spatial output patterns in response to being pumped; and
one or more detectors that detect an image of an object in response to interrogation of the object by the plurality of independent lasing modes with distinct spatial output patterns,
wherein the plurality of independent lasing modes with distinct spatial output patterns of the complex laser in response to being pumped has a controlled degree of spatial coherence and the image detected by the one or more detectors in response to the controlled degree of spatial coherence is free of speckle,
wherein the complex laser is adapted to enable (i) adjusting a mean free path by adjusting a refractive index of at least one of (1) a background material in an excitation medium or (2) scattering elements in an excitation medium or (ii) adjusting of a shape of the cavity to adjust a degree of cavity chaoticity.

US Pat. No. 10,432,869

METHOD OF UTILIZING WIDE-ANGLE IMAGE CAPTURING ELEMENT AND LONG-FOCUS IMAGE CAPTURING ELEMENT FOR ACHIEVING CLEAR AND PRECISE OPTICAL ZOOMING MECHANISM

MULTIMEDIA IMAGE SOLUTION...

1. A method of utilizing a wide-angle image capturing element and a long-focus image capturing element to achieve clear and precise optical zooming when controlling an electronic device to simultaneously and respectively capture a wide-angle image and a long-focus image of a same spot comprising the steps of:reading the wide-angle image and the long-focus image, and performing exposure and white balance adjustment processes to the wide-angle image and the long-focus image respectively, so as to enable the exposure and the white balance of the wide-angle image and the long-focus image to be in consistency;
performing an image matching algorithm to the wide-angle image and the long-focus image respectively, so as to enable each pixel on the long-focus image to match with each corresponding pixel on the wide-angle image; wherein the image matching algorithm calculates and obtains an image ratio between the long-focus image and the wide-angle image according to hardware parameters of the long-focus image capturing element and the wide-angle image capturing element, retrieves a region from the wide-angle image that corresponds to the long-focus image according to the image ratio, and then calculates and obtains a dense optical flow field of an offset of each pixel on the long-focus image with respect to each corresponding pixel on the wide-angle image by using an optical flow estimation; and
performing an image morphing fusion process to the corresponding pixels on the long-focus image and the wide-angle image, so as to generate a new wide-angle image; wherein the image morphing fusion process performs an offset deformation to the pixels on the long-focus image T(x+u, y+v) according to the dense optical flow fields of the offsets (u, v) of the pixels on the long-focus image T(x+u, y+v), so as to enable the pixels on the long-focus image T(x+u, y+v) to be deformed to match with the corresponding pixels on the wide-angle image W(x, y), and then performs a fusion process to the long-focus image T(x+u, y+v) and the wide-angle image W(x, y), in accordance with the following formula, so as to generate the new wide-angle image WNEW:
WNEW = (1 − α)·T(x+u, y+v) + α·W(x, y), wherein α is a weight factor, and wherein, when pixels on a deformed long-focus image T(x+u, y+v) are clearer than the corresponding pixels on the wide-angle image W(x, y), taking α=0, otherwise, taking 0
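Rendering the claim's mis-encoded weight symbol as α, the blending rule selects per pixel between the warped long-focus image and the wide-angle image. A minimal numpy sketch, assuming T(x+u, y+v) has already been resampled onto the wide-angle grid and that a local sharpness comparison drives the weight; the fallback weight of 0.5 is an assumption:

    import numpy as np

    def fuse(warped_tele, wide, alpha):
        # WNEW = (1 - alpha) * T(x+u, y+v) + alpha * W(x, y), applied per pixel.
        return (1.0 - alpha) * warped_tele + alpha * wide

    def blend_weight(warped_tele, wide, fallback=0.5):
        # Illustrative weight map: 0 where the warped long-focus image is locally
        # sharper (so its pixel is used unchanged), an assumed fallback elsewhere.
        def sharpness(img):
            gy, gx = np.gradient(img.astype(float))
            return np.hypot(gx, gy)
        tele_sharper = sharpness(warped_tele) > sharpness(wide)
        return np.where(tele_sharper, 0.0, fallback)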

US Pat. No. 10,432,866

CONTROLLING A LINE OF SIGHT ANGLE OF AN IMAGING PLATFORM

Planet Labs, Inc., Menlo...

1. A computer-implemented method of controlling an imaging platform, the method comprising:determining, by one or more computing devices, a motion profile for a dual-axis steering mirror associated with an imaging platform;
determining, by the one or more computing devices, position information indicative of an orientation of the imaging platform at one or more points along a path on which the imaging platform travels;
determining, by the one or more computing devices, a plurality of integration time periods based at least in part on the motion profile;
capturing, by the one or more computing devices, a sequence of image frames of at least a portion of a region of interest during at least a subset of the plurality of integration time periods as the imaging platform travels along the path;
identifying, by the one or more computing devices, blur in at least one of the captured image frames; and
controlling, by the one or more computing devices, the motion of the steering mirror based at least in part on the motion profile, the position information, and the identified blur, wherein controlling the motion of the steering mirror comprises controlling the steering mirror to rotate about a first axis and a second axis.

US Pat. No. 10,432,863

ROUTING OF TRANSMISSION MEDIA THROUGH ROTATABLE COMPONENTS

GoPro, Inc., San Mateo, ...

1. An image capturing system, comprising:a digital image capturing device (DICD); and
a hand-held apparatus configured to support the DICD, the hand-held apparatus including:
a grip;
a transmission media extending through the grip, the transmission media configured to transmit data and/or power through the hand-held apparatus;
a first gimbal assembly positioned adjacent an upper end of the grip, the first gimbal assembly including:
first and second housings configured for relative movement; and
a motor assembly accommodated within one of the first and second housings;
a first arm including opposite first and second end portions, wherein the first gimbal assembly is positioned adjacent the first end portion of the first arm, and the transmission media extends from the grip, through the motor assembly of the first gimbal assembly, and into the first arm;
a second gimbal assembly positioned adjacent the second end portion of the first arm, the second gimbal assembly including:
first and second housings configured for relative movement; and
a motor assembly accommodated within one of the first and second housings;
a second arm including opposite first and second end portions, wherein the second gimbal assembly is positioned adjacent the first end portion of the second arm, and the transmission media extends from the first arm, through the motor assembly of the second gimbal assembly, and into the second arm; and
a third gimbal assembly positioned adjacent the second end portion of the second arm such that the third gimbal assembly is operatively connected to the DICD, the transmission media extending into the third gimbal assembly from the second arm.

US Pat. No. 10,432,860

CAMERA OPERATION MODE CONTROL

LENOVO (BEIJING) CO., LTD...

1. A controlling method comprising:detecting a triggering event, the triggering event being one of a plurality of types of events, and the plurality of types of events including a first-type event and a second-type event that are different from each other; and
controlling, based on the triggering event, a camera of a laptop including a first face and a second face to be in an operation mode corresponding to the triggering event as detected, including:
determining an angle between the first face and the second face,
controlling, in response to the angle being smaller than a preset angle, the camera to be in an operation mode with a first operation mechanism, and
controlling, in response to the angle being larger than the preset angle, the camera to be in an operation mode with a second operation mechanism,
wherein:
a screen is arranged on the first face of the laptop,
a keyboard and the camera are arranged on the second face of the laptop,
the first-type event is a user authentication event corresponding to the angle being smaller than the preset angle and the operation mode with the first operation mechanism is a texture acquiring mode,
the second-type event is a picture capturing event corresponding to the angle being larger than the preset angle and the operation mode with the second operation mechanism is a picture acquiring mode, and
the picture acquiring mode is independent of the texture acquiring mode.

US Pat. No. 10,432,853

IMAGE PROCESSING FOR AUTOMATIC DETECTION OF FOCUS AREA

SONY CORPORATION, Tokyo ...

1. A method for image processing, said method comprising:extracting, by an imaging device, a plurality of object features of a plurality of objects,
wherein the plurality of objects are in a field-of-view (FOV) of said imaging device;
generating, by said imaging device, a plurality of confidence maps based on said extracted plurality of object features;
determining, by said imaging device, a focus area corresponding to said FOV based on said generated plurality of confidence maps and a specific rule.
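The claim leaves the "specific rule" open. A minimal numpy sketch, assuming the rule is to average the confidence maps and take the bounding box of the top few percent of values; both of those choices are assumptions:

    import numpy as np

    def focus_area(confidence_maps, keep_fraction=0.05):
        # Combine the per-feature confidence maps (here by averaging) and return
        # a focus-area bounding box (row_min, row_max, col_min, col_max).
        combined = np.mean(np.stack(confidence_maps), axis=0)
        threshold = np.quantile(combined, 1.0 - keep_fraction)
        rows, cols = np.nonzero(combined >= threshold)
        return rows.min(), rows.max(), cols.min(), cols.max()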

US Pat. No. 10,432,851

WEARABLE COMPUTING DEVICE FOR DETECTING PHOTOGRAPHY

1. A wearable computing device for detecting photography comprising:an outer casing configured to be worn by a user;
a device camera coupled to the outer casing and configured to detect image data corresponding to a person holding a remote camera in an environment of the wearable computing device;
a mobile processor coupled to the device camera and configured to determine that a photograph will be taken based on the image data corresponding to the person holding the remote camera, and determine a direction of the remote camera relative to the wearable computing device based on the image data corresponding to the person holding the remote camera; and
an output device coupled to the mobile processor and configured to output data indicating that the photograph of the wearable computing device will be taken, the output device including at least one of a speaker configured to output audio data providing directions for the user to turn to face the remote camera or a pair of vibration units each positioned on one side of the outer casing and configured to output stereo haptic feedback in a pattern that provides directions for the user to turn to face the remote camera.

US Pat. No. 10,432,849

IMAGE MODIFICATION BASED ON OBJECTS OF INTEREST

eBay Inc., San Jose, CA ...

1. A method, comprising:receiving, by a processor of an image capture device, a user interface selection initiating an image capture;
detecting a first image capture parameter;
identifying an object of interest within a field of view of the image capture device responsive to identifying a set of object characteristics of the object of interest based on one or more publications corresponding to the object of interest;
based on one or more object characteristics of the set of object characteristics, generating a parameter notification indicating a suggested modification of the first image capture parameter, the parameter notification including a specified parameter range corresponding to the suggested modification and one or more user interface elements selectable to modify the first image capture parameter; and
causing presentation, at the image capture device, of the parameter notification.

US Pat. No. 10,432,847

SIGNAL PROCESSING APPARATUS AND IMAGING APPARATUS

Sony Corporation, Tokyo ...

1. An imaging apparatus, comprising:two imaging devices that generate respective pieces of imaging data that differ in angle of view from each other; and
a composition unit that generates first composite imaging data, by adding together a low-frequency component of first imaging data, a high-frequency component of the first imaging data, and a high-frequency component of second imaging data, the first imaging data being the imaging data that has been generated by one of the imaging devices and has a relatively wide angle of view, and the second imaging data being the imaging data that has been generated by another of the imaging devices and has a relatively narrow angle of view.
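The composition in claim 1 is a sum of three frequency bands. A minimal numpy sketch, assuming the two images are already registered to the same grid and using a box filter as an illustrative low-pass:

    import numpy as np

    def low_pass(img, k=9):
        # Separable box filter used as an illustrative low-pass.
        kernel = np.ones(k) / k
        tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
        return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, tmp)

    def compose(wide, tele):
        # First composite imaging data = LF(wide) + HF(wide) + HF(tele).
        wide = wide.astype(float)
        tele = tele.astype(float)
        return low_pass(wide) + (wide - low_pass(wide)) + (tele - low_pass(tele))

Since the low- and high-frequency components of the wide-angle data sum back to the wide-angle image itself, the composite is effectively the wide-angle image plus the narrow-angle image's high-frequency detail.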

US Pat. No. 10,432,846

ELECTRONIC DEVICE, IMAGE CAPTURING METHOD AND STORAGE MEDIUM

Chiun Mai Communication S...

1. A method for capturing an image using an electronic device, the electronic device comprising a camera device and a display device, the method comprising:obtaining images directly from a predetermined device according to location coordinates of the electronic device, wherein location coordinates of each of the obtained images belongs to a predetermined geographical range, the predetermined geographical range is a circular range that is defined by a centre and a predetermined radius, and the location coordinates of the electronic device are set to be the centre, and the predetermined radius is equal to a predetermined distance from the centre;
dividing a display area of the display device into a first display area and a second display area;
dividing a horizontal direction of the first display area into M value ranges, wherein the horizontal direction of the first display area represents a first parameter of related parameters of each of the obtained images, and each of the M value ranges represents a range of the first parameter;
dividing a vertical direction of the first display area into N value ranges, wherein the vertical direction of the first display area represents a second parameter of the related parameters of each of the obtained images, and each of the N value ranges represents a range of the second parameter, wherein the M and N are positive integers; wherein the first parameter is a horizontal azimuth angle of an image capturing device when the image capturing device captures the obtained image, and the second parameter is selected from a color temperature of the obtained image and a pitching angle of the image capturing device when the image capturing device captures the obtained image;
displaying the obtained images on the first display area according to a value range of the first parameter of each of the obtained images and a value range of the second parameter of each of the obtained images;
displaying a preview image of a current scene on the second display area;
setting one of the obtained images to be a reference image;
calculating one or more difference values using current orientation parameters of the electronic device and orientation parameters of the reference image;
adjusting the orientation parameters of the electronic device according to the one or more difference values; and
controlling the camera device to capture an image of the current scene based on the related parameters and/or orientation parameters of the reference image.

US Pat. No. 10,432,845

METHOD AND APPARATUS FOR GENERATING BLURRED IMAGE, AND MOBILE TERMINAL

GUANGDONG OPPO MOBILE TEL...

1. A method for generating a blurred image, comprising:determining, according to preview image data acquired via two rear cameras of a dual-camera device, first depth-of-field information for a foreground region and second depth-of-field information for a background region in a current preview image;
acquiring a basic value of a blurring degree according to the first depth-of-field information and the second depth-of-field information, the basic value of the blurring degree being a reference value of the blurring degree; and
performing Gaussian blur process on the background region according to the basic value of the blurring degree to generate the blurred image;
wherein, performing the Gaussian blur process on the background region according to the basic value of the blurring degree to generate the blurred image comprises:
determining a blurring coefficient for each pixel in the background region according to the basic value of the blurring degree and the second depth-of-field information for the background region; and
performing the Gaussian blur process on the background region according to the blurring coefficient for each pixel in the background region to generate the blurred image;
wherein determining the blurring coefficient for each pixel in the background region according to the basic value of the blurring degree and the second depth-of-field information for the background region comprises:
calculating a multiplied value by multiplying the basic value of the blurring degree by the second depth-of-field information of each pixel in the background region, to obtain the blurring coefficient for each pixel in the background region.
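The claimed coefficient is a straight per-pixel product. A minimal numpy sketch, assuming the Gaussian-blurred copy of the background is computed elsewhere and that the coefficient is applied as a normalised blend weight; that application step is an assumption:

    import numpy as np

    def blur_background(background, blurred, depth_of_field, basic_value):
        # background: float image; blurred: a Gaussian-blurred copy of it
        # (computed elsewhere); depth_of_field: per-pixel second depth-of-field
        # values for the background region.
        # Claimed step: coefficient = basic blurring value * per-pixel
        # depth-of-field value.
        coeff = basic_value * depth_of_field.astype(float)

        # Assumed use of the coefficient: normalise it to [0, 1] and blend
        # toward the blurred copy, so larger coefficients give a stronger blur.
        weight = np.clip(coeff / (coeff.max() + 1e-9), 0.0, 1.0)
        return (1.0 - weight) * background.astype(float) + weight * blurred.astype(float)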

US Pat. No. 10,432,844

IMAGE PICKUP DEVICE AND ELECTRONIC APPARATUS WITH AN IMAGE PLANE PHASE DIFFERENCE DETECTION PIXEL

Sony Semiconductor Soluti...

1. An image pickup device comprising an image plane phase difference detection pixel for obtaining a phase difference signal for image plane phase difference AF, the image plane phase difference detection pixel comprising:a first photoelectric conversion section that generates an electric charge in response to incident light;
a second photoelectric conversion section that generates an electric charge in response to light that passes through the first photoelectric conversion section and a lower electrode section;
an upper electrode section that is one of a number of electrodes disposed facing each other across the first photoelectric conversion section, the upper electrode section being formed on a light incident side of the first photoelectric conversion section; and
the lower electrode section that is another of the electrodes disposed facing each other across the first photoelectric conversion section, the lower electrode section being formed on a side opposite the light incident side of the first photoelectric conversion section, wherein the lower electrode section includes a first lower electrode section and a second lower electrode section that are unevenly two-divided at a position that avoids a center of the incident light, and a member that transmits the incident light.

US Pat. No. 10,432,843

IMAGING APPARATUS, CONTROL METHOD OF IMAGING APPARATUS, AND NON-TRANSITORY RECORDING MEDIUM FOR JUDGING AN INTERVAL BETWEEN JUDGEMENT TARGETS

OLYMPUS CORPORATION, Tok...

1. An imaging apparatus comprising:an image pickup device which acquires a judgment image of a structure that includes judgment targets of nearly a same shape; and
a processor communicatively coupled to the image pickup device, wherein the processor:
calculates intervals between the judgment targets of the structure based on the judgment image, and
judges whether each of the intervals between the judgment targets is within a preset interval based on the calculated intervals;
wherein the processor further:
compares a figure of a reference target which is one of the judgment targets shown in the judgment image with figures of a pair of comparative targets which are the judgment targets adjacent to the reference target, to judge whether each of the intervals between the judgment targets is within the preset interval; and
judges that the intervals between the judgment targets are within the preset interval when a first ratio between a dimension of the figure of the reference target and a dimension of the figure of one of the pair of comparative targets and a second ratio between the dimension of the figure of the reference target and the dimension of the figure of a second of the pair of comparative targets have a preset relation.
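A minimal sketch of the ratio test in claim 1, assuming "dimension of the figure" is an apparent width in pixels and that the "preset relation" means both ratios stay within a tolerance band around 1.0; both readings are assumptions:

    def intervals_within_preset(ref_width, left_width, right_width, tolerance=0.15):
        # First ratio: reference target vs one comparative target;
        # second ratio: reference target vs the other comparative target.
        first_ratio = ref_width / left_width
        second_ratio = ref_width / right_width
        return all(abs(r - 1.0) <= tolerance for r in (first_ratio, second_ratio))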

US Pat. No. 10,432,842

FUSION OF INERTIAL AND DEPTH SENSORS FOR MOVEMENT MEASUREMENTS AND RECOGNITION

1. A movement recognition system, comprising:an inertial sensor coupled to an object and configured to measure a first unit of inertia of the object;
a depth sensor configured to measure a three dimensional shape of the object using projected light patterns and a camera; and
a processor configured to receive a signal representative of the measured first unit of inertia from the inertial sensor and a signal representative of the measured shape from the depth sensor and to determine a type of movement of the object based on the measured first unit of inertia and the measured shape utilizing a classification model,
wherein the processor is configured to:
compare the type of movement with a predefined intended movement type;
issue a warning in response to the type of movement not matching the predefined intended movement type; and
count a number of correctly completed movements in response to the type of movement matching the predefined intended movement type.
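The supervision logic in claim 1 reduces to compare, warn, count. A minimal Python sketch, assuming the classification model is supplied externally as an object exposing a predict method; the feature layout is illustrative:

    class MovementMonitor:
        # Compares each classified movement against a predefined intended
        # movement type, warning on mismatches and counting correct repetitions.

        def __init__(self, classifier, intended_type):
            self.classifier = classifier      # any object with .predict(features)
            self.intended_type = intended_type
            self.correct_count = 0

        def update(self, inertia_sample, shape_sample):
            features = list(inertia_sample) + list(shape_sample)
            movement_type = self.classifier.predict(features)
            if movement_type != self.intended_type:
                print(f"warning: detected '{movement_type}', "
                      f"expected '{self.intended_type}'")
            else:
                self.correct_count += 1
            return movement_type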

US Pat. No. 10,432,841

WEARABLE APPARATUS AND METHOD FOR SELECTIVELY CATEGORIZING INFORMATION DERIVED FROM IMAGES

OrCam Technologies, Ltd.,...

1. A wearable apparatus for collecting information related to activities of a user, the wearable apparatus comprising:an image sensor configured to capture a plurality of images from an environment of the user;
a communications interface; and
at least one processing device programmed to:
process the plurality of images to identify an activity occurring in the environment of the user;
associate the activity with an activity category;
determine, based on the plurality of images, a level of interest of the user in the activity category, wherein the level of interest is based, at least in part, on a duration of the activity;
cause transmission of at least the activity category to a remotely located computing device via the communications interface; and
cause a life log to be stored in memory, the life log including information comprising at least part of at least one of the plurality of images depicting the activity and the activity category associated with the at least one of the plurality of images, wherein the information stored in the life log is selectively included based on at least the level of interest of the user in the activity category exceeding a predetermined threshold.

US Pat. No. 10,432,840

FUSION NIGHT VISION SYSTEM

L-3 Communication-Insight...

14. A vision system, comprising:a first housing having a first optical axis, a display, an image combiner; and a first eye piece; and
a second housing having a second optical axis and a second eye piece, the first housing coupled to the second housing through a first coupler, the first coupler having a first hinged joint rotatable about a first axis and a second hinged joint rotatable about a second axis, the first axis spaced a first fixed distance from the second axis, the second housing coupled to the first housing through a second coupler, the second coupler having a third hinged joint rotatable about a third axis, the first housing and the second housing coupled through the first coupler and the second coupler such that a row of pixels in the display when viewed through the first eye piece can be maintained in a position relative to an imaginary line going through the first optical axis and the second optical axis as the distance between the first optical axis of the first housing and the second optical axis of the second housing is varied.

US Pat. No. 10,432,838

LIGHTING FOR INDUSTRIAL IMAGE PROCESSING

1. A light source for industrial image processing for illuminating an image area, comprising:at least one sensor configured for capturing an actual relative position of the light source in relation to at least one defined reference, which is at least one of: at least one defined reference point or at least one defined reference plane; and
a position comparer, which is connected to the at least one sensor, being configured for comparing the actual relative position of the light source with a defined relative target position.

US Pat. No. 10,432,837

PLASTIC BARREL, CAMERA MODULE, AND ELECTRONIC DEVICE

LARGAN PRECISION CO., LTD...

1. A plastic barrel, comprising:an object-end portion, comprising:
an outer object-end surface;
an object-end hole; and
an inner annular object-end surface, wherein a part of the inner annular object-end surface is connected with the outer object-end surface and surrounds the object-end hole;
a holder portion, comprising:
a bottom surface;
a bottom hole; and
an outer bottom side, wherein the bottom surface surrounds the bottom hole and is connected with the outer bottom side, and the holder portion further comprises at least three cut traces obtained by partially removing at least three gate portions; and
a tube portion, connecting the object-end portion with the holder portion and comprising a plurality of inner annular surfaces;
wherein a diameter of the object-end hole is φd, a height of the plastic barrel parallel to a central axis is H, and the following condition is satisfied:
1.02

US Pat. No. 10,432,835

OPTICAL IMAGE CAPTURING SYSTEM, IMAGE CAPTURING DEVICE AND ELECTRONIC DEVICE

LARGAN PRECISION CO., LTD...

1. An optical image capturing system comprising seven lens elements, the seven lens elements being, in order from an object side to an image side: a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element, a sixth lens element, and a seventh lens element;wherein the first lens element has negative refractive power, the third lens element has an image-side surface being convex in a paraxial region thereof, the seventh lens element with negative refractive power has an image-side surface being concave in a paraxial region thereof and comprises at least one convex shape in an off-axis region thereof, and an object-side surface and the image-side surface of the seventh lens element are aspheric;
wherein the optical image capturing system has a total of seven lens elements, there is no relative movement among the seven lens elements, and an absolute value of a focal length of the first lens element is smaller than an absolute value of a focal length of the second lens element.

US Pat. No. 10,432,834

LENS DRIVING MODULE WITH CASING HAVING PLASTIC MATERIAL AND ASSEMBLY METHOD THEREOF

TDK Taiwan Corp., Yangme...

1. A lens driving module, configured to drive an optical lens, comprising:a holder, having a receiving space for the optical lens to be disposed therein;
a casing, having a plastic material;
a base, having a plurality of protrusions extending toward the casing and a main body from which the protrusions protrude, and each protrusion has a side surface and a positioning bump, wherein the holder is disposed between the casing and the base;
a first elastic element connecting the holder to the main body;
a second elastic element connecting the protrusions to the holder and having a plurality of locating holes, wherein the positioning bumps are correspondingly incorporated within the locating holes;
an electromagnetic driving assembly, disposed between the holder and the casing and configured to force the holder and the optical lens to move relative to the base; and
a glue, disposed between the side surfaces and the casing, wherein the side surfaces are parallel to a central axis of the optical lens.

US Pat. No. 10,432,833

INTERCHANGEABLE LENS CAMERA

SONY CORPORATION, Tokyo ...

1. A camera, comprising:a body; and
a plurality of groups of contacts on the body, wherein
the plurality of groups of contacts comprises a first group of contacts and a second group of contacts different from the first group of contacts,
the plurality of groups of contacts is configured to be coupled to a plurality of lenses,
the plurality of lenses includes a plurality of lens-side mounts, and
a first length of each of the first group of contacts in a direction of an optical axis of the camera is different from a second length of each of the second group of contacts in the direction of the optical axis of the camera,
the difference between the first length and the second length is based on a difference between flange back distances of respective lens-side mounts of a first lens of the plurality of lenses and a second lens of the plurality of lenses, the flange back distances are with respect to an imaging plane, and
the first group of contacts having the first length is configured to couple with the first lens and the second group of contacts having the second length is configured to couple with the second lens.

US Pat. No. 10,432,831

IMAGE SENSOR

SK hynix Inc., Icheon-si...

1. An image sensor device comprising:a pixel array in which a plurality of pixel blocks are arranged,
wherein each of the pixel blocks comprises:
a light receiver comprising a floating diffusion and a plurality of unit pixels and configured to receive incident light and generate photo charges in response to the received incident light, the plurality of unit pixels sharing the floating diffusion;
a first driver located at a first side of the light receiver and comprising a driver transistor;
a second driver located at a second side of the light receiver and comprising a reset transistor; and
a conductive line having a first region coupling the driver transistor to the floating diffusion and a second region coupling the floating diffusion to the reset transistor,
wherein the driver transistor and the reset transistor are respectively located at the first side and the second side of the light receiver in a diagonal direction.

US Pat. No. 10,432,829

OPTICAL DEVICE AND IMAGING DEVICE WITH MECHANISM FOR REDUCING CONDENSATION

DENSO CORPORATION, Kariy...

1. An optical device comprising:a lens assembly comprising:
at least one lens having an optical axis for receiving light from an object located at a first side of the optical axis; and
a holder for holding the at least one lens;
a circuit board for performing at least one process based on the received light; and
a housing having an opening and configured to house the lens assembly and the circuit board therein, the lens assembly being exposed to an outside of the housing via the opening,
wherein:
the housing comprises a top wall;
the opening comprises a recess formed in the top wall of the housing;
the recess communicates with an inside of the housing and concavely extends in a second side of the optical axis, the second side of the optical axis being opposite to the first side thereof; and
the lens assembly is arranged below the recess.

US Pat. No. 10,432,828

CAMERA MODULE HAVING A SHIELD MEMBER

LG INNOTEK CO., LTD., Se...

1. A camera module comprising:a lens unit;
a first casing coupled with the lens unit;
a printed circuit board disposed behind the lens unit so as to be spaced apart from the lens unit and to face the lens unit in an optical-axis direction of the lens unit;
a second casing disposed behind the first casing and coupled at a front portion thereof to a rear portion of the first casing, the second casing being configured to accommodate the printed circuit board therein; and
a shield member electrically connected to the lens unit and grounded to the second casing,
wherein the shield member is disposed between the lens unit and the printed circuit board, and
wherein the shield member comprises:
an upper surface to which an optical axis of the camera module is perpendicular;
a hollow region within the upper surface;
a first terminal protruding from the upper surface in a radial direction into the hollow region; and
a second terminal that comprises a first portion bent from the upper surface and extending downwardly therefrom, a second portion bent from the first portion and extending laterally therefrom, and a third portion bent from the second portion and extending downwardly therefrom, wherein the third portion of the second terminal is configured so as to be brought into contact with the second casing and to allow the shield member to be grounded to the second casing.

US Pat. No. 10,432,827

INTEGRATED AUTOMOTIVE SYSTEM, NOZZLE ASSEMBLY AND REMOTE CONTROL METHOD FOR CLEANING AN IMAGE SENSORS EXTERIOR OR OBJECTIVE LENS SURFACE

DLHBOWLES, INC., Canton,...

1. An external lens washing system, comprising:a substantially rigid aiming fixture having a distal side and a proximal side and being configured to support and constrain an external lens exposed toward said distal side;
said external lens having an external lens surface with a lens perimeter and a lens central axis projecting from said lens surface, wherein a lens field of view is defined as a distally projecting solid angle including said lens central axis and originating within said lens perimeter;
a first nozzle assembly configured to be supported and aimed toward said external lens by said aiming fixture;
said first nozzle assembly including a fluid inlet in fluid communication with a first laterally offset washing nozzle projecting from said aiming fixture's distal side;
said first nozzle assembly being configured and aimed to spray washing fluid toward said external lens surface and across said field of view, spraying at a first selected spray aiming angle in relation to the lens external surface; and
said first spray aiming angle being within the range bounded by 1° and 20° in relation to the lens external surface,
wherein said first nozzle assembly is aimed to spray along a first selected spray azimuth angle in relation to a fixed datum on said lens perimeter,
said external lens washing system further comprising a second nozzle assembly configured to be supported and aimed by said aiming fixture;
said second nozzle assembly including a fluid inlet in fluid communication with a second laterally offset washing nozzle projecting from said aiming fixture's distal side;
said second nozzle assembly being configured and aimed to spray washing fluid toward said external lens surface and across said field of view, spraying at a second selected spray aiming angle in relation to the lens external surface;
said second spray aiming angle being within the range bounded by 1° and 20° in relation to the lens external surface; and
said second nozzle assembly being configured and aimed to spray along a selected spray azimuth angle being radially spaced at a selected inter-spray angle from said first nozzle assembly's spray azimuth angle.

US Pat. No. 10,432,826

AUTOMATIC SUPPRESSION OF UNRECOGNIZED SPOT COLORS FROM A RASTER IMAGE

Xerox Corporation, Norwa...

1. A method of suppressing unrecognized spot colors, the method comprising:receiving a print job into a computerized device comprising a marking device having a programmed color space, said print job comprising an electronic document and print job attributes for use in rendering said print job;
performing raster processing of said electronic document;
identifying an object in said electronic document, wherein said object calls for a spot color, and wherein said spot color is not defined in said programmed color space of said marking device; and
responsive to said spot color not being defined in said programmed color space of said marking device,
suppressing said spot color for said object by creating an alternate color space for said marking device, said alternate color space having tint values of zero for every color in said alternate color space, and
assigning said object to said alternate color space.
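A minimal Python sketch of the suppression step in claim 1, assuming the programmed color space is a collection of known spot-color names and representing the alternate color space as zero tints for each colorant; the data layout is illustrative:

    def suppress_unrecognized_spot_colors(objects, programmed_space,
                                          colorants=("C", "M", "Y", "K")):
        # objects: list of dicts, each optionally carrying a 'spot_color' name.
        # An object calling for a spot color that the marking device's programmed
        # color space does not define is reassigned to an alternate color space
        # whose tint values are all zero, so it renders with no ink.
        alternate_space = {c: 0.0 for c in colorants}
        for obj in objects:
            spot = obj.get("spot_color")
            if spot is not None and spot not in programmed_space:
                obj["color_space"] = alternate_space
        return objects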

US Pat. No. 10,432,825

COLOR MAP FROM COLOR DATA COLLECTED FROM PLURALITY OF SOURCES

Hewlett-Packard Developme...

1. An output device, comprising:a processor; and
a machine-readable storage medium on which is stored instructions that when executed by the processor, cause the processor to:
collect color data from a plurality of sources including at least one of a printer cartridge, a peripheral device, a source local to the output device and a remote source accessed over a network;
store a locator table, the locator table to index a plurality of color map selection fields to a plurality of color maps, the plurality of color maps to be based on the collected color data, wherein each entry of the locator table includes an identifier to match one of the plurality of color maps to at least one permutation of the color map selection fields;
select one of the plurality of color maps based on a condition and the identifier; and
generate an output using the selected color map.
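A minimal Python sketch of the locator-table lookup in claim 1, assuming the color map selection fields are a (source, media type) pair; all field names, identifiers and maps are illustrative:

    # Each permutation of the color map selection fields indexes an identifier,
    # and the identifier names one of the stored color maps.
    color_maps = {
        "cartridge_v2": {"source": "printer cartridge color data"},
        "network_default": {"source": "remote color data fetched over the network"},
    }

    locator_table = {
        ("printer_cartridge", "glossy"): "cartridge_v2",
        ("remote_source", "plain"): "network_default",
    }

    def select_color_map(source, media_type):
        identifier = locator_table.get((source, media_type), "network_default")
        return color_maps[identifier]

    print(select_color_map("printer_cartridge", "glossy")["source"])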

US Pat. No. 10,432,824

IMAGE FORMING APPARATUS, IMAGE FORMING SYSTEM, METHOD OF CALIBRATING IMAGE DETECTION UNIT, AND POST-PROCESSING APPARATUS

KONICA MINOLTA, INC., To...

1. An image forming apparatus comprising:an image forming unit that forms an image on a recording medium;
a conveying unit that conveys the recording medium through a conveying path;
a first image detection unit being a line sensor configured to detect the image formed on the recording medium; and
a control unit that controls the forming of the image and the conveying of the recording medium, and is configured to receive detection results of the image from the first image detection unit and a detection result of the image from a second image detection unit, the second image detection unit being a spectral colorimeter, wherein
the control unit has a detection-unit calibration mode for determining a calibration parameter with use of the detection result of the image from the second image detection unit, and
the control unit determines, on the detection-unit calibration mode, whether the image has an image quality higher than or equal to predetermined quality based on the detection result of the image from the first image detection unit, and only when the image quality is determined to be higher than or equal to the predetermined quality, the control unit determines the calibration parameter based on the detection result of the image from the second image detection unit.

US Pat. No. 10,432,821

DETECTING NOISE IN IMAGE DATA THAT PREVENTS AN IMAGE PROCESSING APPARATUS FROM CORRECTLY DETECTING A PRINTING AREA DUE TO THE NOISE IN THE IMAGE DATA

KABUSHIKI KAISHA TOSHIBA,...

9. An image processing method comprising:acquiring, by an arithmetic element, a first image data read by a scanner from a manuscript containing a printing area;
generating, by the arithmetic element, second image data by executing an image processing on the first image data;
recognizing, by the arithmetic element, the printing area in the first image data based on the second image data;
cutting, by the arithmetic element, an image in the printing area from the first image data as a third image data;
wherein the second image data is generated by reducing a contrast of the first image data, the second image data is generated by extracting a contour line image from the first image data, painting the inside of the extracted contour line image as a function of a binarized lightness value and a binarized saturation value associated with a hue, saturation, and lightness color space, and reducing the contrast.

US Pat. No. 10,432,819

IMAGE FORMING APPARATUS FOR MANAGING SUBSTANTIALLY SIMULTANEOUS IMAGE PROCESSING REQUESTS

Ricoh Company, Ltd., Tok...

1. An image forming apparatus, comprising:a communication interface configured to communicate with a plurality of control terminals that are operated by different users; and
circuitry configured to
authenticate the plurality of control terminals in an order that an authentication request is received from the control terminals;
send an operational screen to each one of the plurality of control terminals that have been successfully authenticated for display at each control terminal, the operational screen being configured to accept a process request for requesting the image forming apparatus to execute an image forming process;
receive a plurality of process requests from the plurality of control terminals that have been authenticated in an order that the process request is accepted at the control terminals; and
control an image forming device to execute a plurality of image forming processes according to the plurality of process requests in the order that the process request is accepted at the control terminals, wherein the image forming apparatus further includes
the image forming device including reading circuitry to execute a scan process, and printing circuitry to execute a print process, the reading circuitry and the printing circuitry being configured to operate independently from each other,
wherein, when a process request for executing the print process and a process request for executing the scan process are received at substantially a same time, the circuitry is further configured to control the reading circuitry to execute the scan process, and control the printing circuitry to execute the print process, concurrently.

US Pat. No. 10,432,818

SPARSE MODULATION FOR ROBUST SIGNALING AND SYNCHRONIZATION

Digimarc Corporation, Be...

1. An image processing method for updating a design file for a retail product package or for a product label, said image processing method comprising:obtaining a design file comprising a first area and a second area, the first area comprising area that is devoid of image or color variability, the second area comprising text;
using one or more multi-core processors, generating a sparse pattern of elements at spatial coordinates within the first area, the sparse pattern of elements conveying a machine-readable variable data signal, the sparse pattern of elements comprising a plurality of binary one and binary zero elements, which correspond to ink elements and no-ink elements of the sparse pattern of elements;
selecting an ink color to use for the ink elements through a perceptual analysis that minimizes a visual difference between the ink color and the first area, in which said selecting yields a selected ink color; and
providing the sparse pattern of elements and the selected ink color to update the design file.

US Pat. No. 10,432,817

SYSTEM, APPARATUS AND METHOD FOR ENHANCING METADATA REGISTRATION WORKFLOW

RICOH COMPANY, LTD., Tok...

1. A multi-function output apparatus comprising an operational display, a document scanner to scan a document in a scanning session of an authenticated apparatus user, a processor and a non-transitory medium embodying a program of instructions executable by the processor to configure said multi-function output apparatus having the document scanner to perform a method comprising:(a) providing, on the operational display of the multi-function output apparatus having the document scanner and via a metadata interface, a series of plural metadata entry screens, associated with a selected workflow, for user entry of metadata to be associated with a scanned document image, each of the plural metadata entry screens provided in the series depending on metadata entered in one or more previous metadata entry screens in the series;
(b) capturing the metadata entered through the metadata entry screens provided on the operational display of the multi-function output apparatus having the document scanner;
(c) causing, upon user confirmation by the apparatus user on the operational display of the multi-function output apparatus of the metadata entered through the metadata entry screens provided on the operational display of the multi-function output apparatus having the document scanner, the metadata entry screens to be customized based on the captured metadata, which was user-entered via the operational display of the multi-function output apparatus, and causing the customized metadata entry screens to be registered as a customized workflow including plural sequential metadata entry screens for the authenticated apparatus user; and
(d) providing, in another scanning session of the authenticated apparatus user after the customized workflow has been registered, the customized workflow, including the plural sequential metadata entry screens capturing the metadata entered by the apparatus user, through the metadata interface, for user selection in connection with one or more additional documents scanned or to be scanned by the document scanner of the multi-function output apparatus.

US Pat. No. 10,432,816

FINISHING LINE CONTROLLERS TO OPTIMIZE THE INITIALIZATION OF A PRINTING DEVICE

Hewlett-Packard Developme...

14. A print device comprising:a first mechanism movable by a first servo;
a second mechanism movable by a second servo; and
a processor to:
cause the first mechanism and the second mechanism to perform a calibration; and store reference positions of the first mechanism and the second mechanism in a memory;
receive a signal indicative of a transition to a power on state;
cause the first mechanism to move to a first stop position and determine whether the first mechanism is within a threshold of an expected position for the first mechanism;
cause the second mechanism to move to a second stop position and determine whether the second mechanism is within a threshold of an expected position for the second mechanism;
responsive to a determination that both the first mechanism and the second mechanism are within respective thresholds of the respective expected positions; fetch the stored reference positions; and
responsive to a determination that one or both the first mechanism and the second mechanism are not within respective thresholds, cause the first mechanism and the second mechanism to perform the calibration.
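The power-on branch in claim 14 is a single threshold test over both mechanisms. A minimal Python sketch with illustrative data structures:

    def power_on(mechanisms, stored_reference, threshold=0.5):
        # mechanisms: dict of name -> (measured_stop_position, expected_position).
        # If every mechanism stops within the threshold of its expected position,
        # the stored reference positions are fetched; otherwise the calibration
        # is performed again.
        within = all(abs(measured - expected) <= threshold
                     for measured, expected in mechanisms.values())
        if within:
            return {"action": "fetch_stored_reference", "positions": stored_reference}
        return {"action": "recalibrate"}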

US Pat. No. 10,432,814

IMAGE FORMING APPARATUS CAPABLE OF PERFORMING A HIGH-SPEED STARTUP PROCESS IN RESPONSE TO A POWER-ON OPERATION, AND RECORDING MEDIUM

KONICA MINOLTA, INC., To...

1. An image forming apparatus comprising:a volatile storage;
a nonvolatile storage;
a main power switch;
a hardware processor; and
a display,
wherein: the hardware processor tries to obtain, from the volatile storage, saving target information related to the image forming apparatus in a power supply continuation period from a time of a power-off operation to a time of power supply interruption, and to store the saving target information in the nonvolatile storage as first snapshot data for restoring a state at a predetermined time after firmware of the image forming apparatus is activated,
the hardware processor determines, when a power-on operation is performed in response to operation of the main power switch after the time of the power-off operation, whether to perform a first high-speed startup process using the first snapshot data as an apparatus startup process with respect to the image forming apparatus, and
when a determination is made to perform the first high-speed startup process using the first snapshot data, the hardware processor causes the display to display, in a period in which a hardware initialization process in response to the power-on operation is being performed or immediately after the hardware initialization process is completed, an advance notice screen to be displayed at a predetermined time before completion of startup in response to the power-on operation, the advance notice screen including at least one of a logo and a message and giving an advance notice that a transition from a power-off state to a user operable state following the power-on operation will be completed.

US Pat. No. 10,432,813

IMAGE READING DEVICE THAT READS DOCUMENT IMAGE

KYOCERA Document Solution...

1. An image reading device comprising:a contact plate on which a document to be read is placed;
a box-shaped frame that supports the contact plate;
a carriage that includes a reading mechanism extending in a main-scanning direction and reciprocally moves in a sub-scanning direction at a side of a bottom surface of the contact plate in the frame, the bottom surface being an opposite side from a top surface on which the document is placed;
a flexible flat cable that transmits an electric signal of the reading mechanism, the flexible flat cable being connected to one side surface of the carriage so that a width direction thereof matches the main-scanning direction, and being arranged so that a part continued from a portion where the flexible flat cable connects with the one side surface is curved in a U-shape to go around toward an underside of the carriage; and
a cable guide unit that guides a U-shaped curved portion of the flexible flat cable and avoids the flexible flat cable from coming into contact with the contact plate while reciprocally moving in the sub-scanning direction by following the carriage at a speed slower than a moving speed of the carriage, the U-shaped curved portion moving along with a move of the carriage.

US Pat. No. 10,432,810

SCANNER AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM FOR IMAGE PROCESSING DEVICE

Brother Kogyo Kabushiki K...

1. A scanner, comprising:a conveyer configured to sequentially convey multiple original sheets along a conveying passage,
an image sensor arranged on the conveying passage and configured to optically read the original sheet being conveyed along the conveying passage,
a detection sensor configured to detect physical information regarding the original sheet being conveyed,
a memory configured to store particular image information regarding a particular image, the particular image being an image indicated on a particular fixed object which is fixed to the original sheet in an overlapped state,
a controller configured to perform:
controlling the conveyer to convey original sheets one by one along the conveying passage;
controlling the image sensor to optically read the original sheet;
determining a detection position of the original sheet based on an output signal of the detection sensor;
obtaining target image data containing at least a partial image of the original sheet at a detection area including the detection position;
determining whether the particular fixed object including the particular image exists at the detection area of the original sheet by analyzing the target image data with use of the particular image information in the memory;
interrupting conveyance of the original sheet by the conveyer when the detection position is determined, based on an output signal of the detection sensor, to be an overlapped position and when it is detected, by analyzing the target image data, that the particular fixed object does not exist at the detection area; and
outputting image data representing an image of the original sheet when the detection position is determined, based on the output signal of the detection sensor, to be the overlapped position and when it is detected, by analyzing the target image data, that the particular fixed object exists at the detection area.

US Pat. No. 10,432,809

READING APPARATUS AND IMAGE GENERATION METHOD

Seiko Epson Corporation, ...

1. A reading apparatus configured to read an original document, the reading apparatus comprising:a background board placed behind the original document;
a sensor configured to obtain a read image by repeatedly reading a line-image at a prescribed frequency; and
a clipping processor configured to estimate a background pixel value which is a read value of the background board and to clip an image of the original document from the read image based on the background pixel value which is estimated, the read value changing due to an increase in distance between the sensor and the background board caused by the original document, the read image being a result of reading an area including the original document and the background board.

US Pat. No. 10,432,808

USER INPUT BASED PRINT TRAY CONTROL

Hewlett-Packard Developme...

1. A printing device comprising:a plurality of feeder print trays to hold print media to be used for printing;
a user input sensor in proximity to a corresponding feeder print tray to:
sense a user input on an outer surface of a casing of the corresponding feeder print tray;
generate user input data based on the user input; and
input tray processor circuitry communicatively coupled to the user input sensor, wherein the input tray processor circuitry is to:
correlate the user input data as selection of the corresponding feeder print tray;
determine a selection attribute corresponding to the user input data;
determine a tray control action to be initiated for the corresponding feeder print tray based on the user input data and the selection attribute; and
initiate the tray control action.

US Pat. No. 10,432,807

REMOTE POST-SCANNING WORKFLOW USING SCAN JOB IN SCAN JOB QUEUE

Xerox Corporation, Norwa...

1. A system comprising:an application configured to control a computerized device having a computer interface controlled by the application to display a remote scan job menu for at least post-scanning processing options for unexecuted scan jobs within unexecuted scan job queues,
wherein the computerized device is in communications with a scanning device,
wherein the application is configured to control the scanning device to receive at least one of the unexecuted scan jobs in at least one of the unexecuted scan job queues,
wherein the scanning device comprises a scanner interface, wherein the scanner interface is controlled by the application to display at least one of the unexecuted scan job queues and is configured to receive selection of one of the unexecuted scan jobs from one of the unexecuted scan job queues to identify a selected scan job,
wherein the scanning device is controlled by the application to execute the selected scan job in response to the selection of one of the unexecuted scan jobs by scanning items provided to the scanning device to produce an electronic image file, and
wherein the scanning device is controlled by the application to process the electronic image file by performing the post-scanning processing options within the selected scan job on the electronic image file.

US Pat. No. 10,432,806

INFORMATION PROCESSING APPARATUS AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR SETTING FUNCTION FOR ENTITY IN REAL SPACE

FUJI XEROX CO., LTD., To...

1. An information processing apparatus comprisinga registration unit that registers an entity and an executable function in association with each other, the entity being an entity in real space identified by sensing, the executable function being a function executable in response to the entity being identified again,
wherein the registration unit associates, as the executable function, a coordinated function with a plurality of entities in real space identified by sensing, the coordinated function being a function executable by use of the plurality of entities in response to the plurality of entities being identified again.

US Pat. No. 10,432,805

OPERATION OF A PANEL FOR A PRINTER USING A DIAL CONTROL

KYOCERA Document Solution...

1. A method for operation of a multi-function printer panel, comprising:displaying a main menu on a display screen of the multi-function printer panel;
navigating a plurality of primary image items of the main menu displayed on the display screen responsive to a first movement of a dial control;
pressing the dial control to select a primary image item of the plurality of primary image items;
navigating a plurality of secondary image items displayed on the display screen associated with the primary image item selected responsive to a second movement of the dial control;
pressing the dial control to select a secondary image item of the plurality of secondary image items;
navigating to a height arrow image displayed on the display screen;
pressing the dial control to select the height arrow image;
shifting the dial control up or down to adjust a y-axis boundary;
navigating to a width arrow image displayed on the display screen by rotating the dial control;
pressing the dial control to select the width arrow; and
shifting the dial control right or left to adjust an x-axis boundary.

US Pat. No. 10,432,804

DISPLAY CONTROL FOR AN IMAGE PROCESSING APPARATUS

Canon Kabushiki Kaisha, ...

1. An image processing apparatus having a plurality of applications for using functions of the image processing apparatus which arranges, in a first area displayed on an operation unit of the image processing apparatus, a first software key for executing an application among the plurality of applications, the image processing apparatus comprising:at least one memory storing instructions; and
at least one processor that, upon execution of the instructions, configures the at least one processor to
display, on the operation unit, a second area where a second software key generated by executing a job corresponding to the application is arranged, wherein the second software key is for re-executing the application by a first user operation according to a setting content of the executed job; and
display, on the operation unit, an item for arranging, in the first area, the first software key for executing a job according to a setting content of the second software key, and a menu screen including the item when the second software key is selected by a second user operation different from the first user operation.

US Pat. No. 10,432,803

IMAGE FORMATION SYSTEM INCLUDING ENCODED IMAGE GENERATION DEVICE AND IMAGE FORMATION DEVICE

RISO KAGAKU CORPORATION, ...

1. An image formation system comprising:an encoded image generation device including a first processor that performs:
generating print data from manuscript data;
setting security information for controlling a print mode of the print data; and
generating an encoded image from the print data and the security information; and
an image formation device including a second processor that performs:
inputting the encoded image;
decoding the print data and the security information from the input encoded image;
determining whether an output of the decoded print data is available in accordance with the decoded security information; and
outputting the decoded print data when it is determined that the output of the print data is available,
wherein:
the security information includes manuscript identification information that identifies the manuscript data, and number-of-times-of-printing threshold information that sets a threshold of a number of times of printing of the manuscript data, and
the second processor performs:
recording print history information including the manuscript identification information included in the decoded security information and information relating to an accumulated number of outputs of the decoded print data, every time the decoded print data is output; and
determining that the output of the print data is available, when the print history information that corresponds to the manuscript identification information included in the decoded security information has not been recorded, or when the print history information that corresponds to the manuscript identification information has been recorded, and the information relating to the accumulated number of outputs included in the print history information is compared with the number-of-times-of-printing threshold information included in the decoded security information so as to find out that printing is available.
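
The availability determination in the last clause reduces to a comparison between recorded print history and the threshold carried in the security information. A sketch follows, assuming a simple in-memory history store (the data structure is an assumption of the example):

```python
# Illustrative sketch, not the patented system. The history store is a plain dict from
# manuscript identification information to the accumulated number of outputs.
print_history: dict[str, int] = {}

def output_available(manuscript_id: str, printing_threshold: int) -> bool:
    """True when output of the decoded print data is still permitted."""
    accumulated = print_history.get(manuscript_id)
    if accumulated is None:
        return True  # no print history recorded yet for this manuscript
    return accumulated < printing_threshold

def record_output(manuscript_id: str) -> None:
    """Record print history every time the decoded print data is output."""
    print_history[manuscript_id] = print_history.get(manuscript_id, 0) + 1

if output_available("doc-001", printing_threshold=3):
    record_output("doc-001")
```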

US Pat. No. 10,432,802

TERMINAL DEVICE, AND NON-TRANSITORY COMPUTER READABLE MEDIUM STORING PROGRAM FOR TERMINAL DEVICE

FUJI XEROX CO., LTD., To...

1. A terminal device comprising:a photographing apparatus configured to capture a photograph in a specific direction in which a data processing apparatus is installed;
a display configured to display a request screen for designating the data processing apparatus to process data, the request screen being configured to display the photograph captured by the photographing apparatus and at least one of:
(i) data images indicating data that is requested to be processed, and
(ii) when an image of the data processing apparatus arranged in the specific direction is not captured in the photograph due to an obstacle positioned between the photographing apparatus and the data processing apparatus, a processing apparatus image indicating the data processing apparatus acquired over a network, the processing apparatus image being a graphical representation of a silhouette of the data processing apparatus, the processing apparatus image being superimposed on the photograph captured by the photographing apparatus such that the processing apparatus image and the photograph are entirely visible except a portion of the photograph covered by the silhouette of the data processing apparatus; and
a control unit configured to:
acquire position information of the data processing apparatus;
acquire the processing apparatus image indicating the data processing apparatus over the network;
acquire an address of the data processing apparatus on the network; and
in response to an operation designating the processing apparatus image and one of the data images on the request screen, transmit a request for processing the data indicated by the designated data image to the data processing apparatus indicated by the designated processing apparatus image.
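
The superimposition described in element (ii) — the photograph stays visible everywhere except where the silhouette covers it — can be sketched as a masked overlay; the array shapes and placement are assumptions of the example:

```python
# Illustrative sketch, not the patented terminal device. Assumes the photograph and
# silhouette are NumPy arrays and that a boolean mask marks the silhouette's own pixels.
import numpy as np

def superimpose(photo: np.ndarray, silhouette: np.ndarray,
                mask: np.ndarray, top: int, left: int) -> np.ndarray:
    """Overlay the silhouette on the photo at (top, left), covering only masked pixels."""
    result = photo.copy()
    h, w = mask.shape
    region = result[top:top + h, left:left + w]  # view into the copied photograph
    region[mask] = silhouette[mask]              # photo remains visible everywhere else
    return result
```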

US Pat. No. 10,432,801

DEVICE MANAGEMENT SYSTEM, DEVICE MANAGEMENT METHOD, AND RECORDING MEDIUM

Ricoh Company, Ltd., Tok...

1. A device management system for communicating with a relay device connected with one or more devices in a local network via a firewall, the device management system comprising:circuitry configured to:
receive status information indicating a status of the relay device from the relay device,
based on a determination that the received status information satisfies a predetermined condition, obtain instruction information associated with the predetermined condition, the instruction information indicating a predetermined process to be executed by the relay device, the predetermined process corresponding to a reboot process when the status information of the relay device indicates the relay device is low on memory, and
transmit the obtained instruction information to the relay device to cause the relay device to execute the predetermined process, which corresponds to the reboot process when the status information of the relay device indicates the relay device is low on memory.
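
The status-to-instruction mapping amounts to a condition check on the received status; a sketch follows, with the field name and the memory threshold assumed for illustration:

```python
# Illustrative sketch, not the patented system. The status field name and the 64 MB
# threshold are assumptions standing in for "the relay device is low on memory".
def instruction_for_status(status: dict) -> dict | None:
    """Return instruction information when the reported status satisfies the condition."""
    if status.get("free_memory_mb", float("inf")) < 64:
        return {"process": "reboot"}  # predetermined process associated with the condition
    return None

instruction = instruction_for_status({"device_id": "relay-7", "free_memory_mb": 41})
if instruction is not None:
    pass  # transmit the instruction to the relay device so it executes the reboot process
```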

US Pat. No. 10,432,800

APPARATUS AND METHOD FOR MANAGING THREE-DIMENSIONAL PRINTING

ELECTRONICS AND TELECOMMU...

1. A method of managing three-dimensional (3D) printing, the method comprising:receiving a video of a product being output from a 3D printer;
acquiring first output information by comparing a first frame of the video and a second frame subsequent to the first frame;
acquiring second output information by extracting output layer-specific trace information from a G-code of the product being output acquired from the 3D printer; and
acquiring quality information of the product being output based on the first output information and the second output information,
wherein the acquiring of the first output information by comparing the first frame of the video and the second frame subsequent to the first frame comprises acquiring first output information by calculating an area change rate of the product being output with respect to a heating bed between a first frame and a second frame of a video looking down from an upper end of the 3D printer on the product being output.
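
The first-output-information step boils down to an area change rate between two top-down frames, which is then checked against what the G-code layer trace predicts. A sketch, with the foreground threshold and tolerance as assumptions:

```python
# Illustrative sketch, not the patented method. Frames are grayscale NumPy arrays seen
# from above the heating bed; the foreground threshold and tolerance are hypothetical.
import numpy as np

def area_change_rate(first_frame: np.ndarray, second_frame: np.ndarray,
                     foreground_threshold: int = 128) -> float:
    """Rate of change of the product's area over the heating bed between two frames."""
    first_area = np.count_nonzero(first_frame > foreground_threshold)
    second_area = np.count_nonzero(second_frame > foreground_threshold)
    if first_area == 0:
        return 0.0
    return (second_area - first_area) / first_area

def quality_ok(observed_rate: float, expected_rate_from_gcode: float,
               tolerance: float = 0.1) -> bool:
    """Compare the camera-derived rate with the rate expected from the G-code trace."""
    return abs(observed_rate - expected_rate_from_gcode) <= tolerance
```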

US Pat. No. 10,432,799

RING AND TEXT TONE NOTIFIER (RTTN)

1. A system of notification of a Call and Text by the "Ring and Text Tone Notifier" to emulate by duplicating the ring and text tone received from a mobile device, comprising:a two-prong adapter for plugging into any standard 110-volt power outlet;
an on/off power switch to turn the RTTN on/off;
a Bluetooth search capabilities module, denoted by (130), that includes a Bluetooth processor (CPU) to identify one or more available mobile devices that are ready to pair with the RTTN;
a touch screen display to allow the user to select the detected devices to pair with;
to select an audible tone, or ring, and a volume control;
a red LED and a Green LED to distinguish between an incoming phone call and a text message; the Green LED also turns solid green when the RTTN is plugged into a 110-volt outlet, indicating the "Ready" state of the RTTN;
a USB port input that provides a means for the mobile device to recharge or transfer data;
an AC-DC Converter;
a speaker with a Piezo buzzer with a built-in amplified circuit to broadcast the selected tone or ring;
the RTTN may contain an extra 3-prong grounded outlet adaptor in order to compensate for the one already taken by plugging in the RTTN.

US Pat. No. 10,432,798

SYSTEM, METHOD, AND APPARATUS FOR SERVICE GROUPING OF USERS TO DIFFERENT SPEED TIERS FOR WIRELESS COMMUNICATION

1. A system, comprising:a processor; and
a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations, comprising:
determining speed tier data indicative of respective speed tiers assigned to user equipment served by an access point device of a communication network, wherein the respective speed tiers specify target data rates for communication between the user equipment and the access point device; and
in response to determining that an observed data rate, associated with a first user equipment of the user equipment, is not less than a first target data rate, of the target data rates, that corresponds to a first speed tier of the speed tiers assigned to the first user equipment, assigning a first priority to a first non-guaranteed bit rate bearer that is associated with the first user equipment, wherein the first priority is lower than a second priority that is to be assigned to a second non-guaranteed bit rate bearer that is associated with a second user equipment, of the user equipment, that has been assigned a second speed tier of the speed tiers, wherein the first speed tier is higher than the second speed tier, wherein the second user equipment is determined to not have exceeded a second target data rate of the target data rates that corresponds to the second speed tier, and wherein the first non-guaranteed bit rate bearer and the second non-guaranteed bit rate bearer belong to a common quality of service class.
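
The priority rule can be read as: a device already at or above its tier's target rate has its non-guaranteed bit rate bearer demoted below bearers of devices still under their targets, even if its tier is higher. A sketch, with the numeric priority scale assumed (lower value = higher priority):

```python
# Illustrative sketch, not the patented system. The numeric priority scale (lower value
# means higher scheduling priority) and the field names are assumptions of the example.
def bearer_priority(observed_rate_mbps: float, target_rate_mbps: float,
                    base_priority: int = 5, demoted_priority: int = 8) -> int:
    """Priority to assign to a device's non-guaranteed bit rate bearer."""
    if observed_rate_mbps >= target_rate_mbps:
        # The device already meets its speed tier's target: demote its bearer below
        # those of devices that have not yet reached their targets.
        return demoted_priority
    return base_priority

# A higher-tier device meeting its 50 Mbps target vs. a lower-tier device below 10 Mbps.
print(bearer_priority(55.0, 50.0))  # 8: demoted (lower priority)
print(bearer_priority(8.0, 10.0))   # 5: retains the higher priority
```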

US Pat. No. 10,432,797

PRE-DISTORTION SYSTEM FOR CANCELLATION OF NONLINEAR DISTORTION IN MOBILE DEVICES

1. A system, comprising:a memory that stores instructions;
a processor that executes the instructions to perform operations, the operations comprising:
canceling, via a nonlinear cancellation signal included within an acoustic signal transmitted to a device, a portion of nonlinear distortions created once a component of the device emits a linear signal, wherein the acoustic signal includes the linear signal and the nonlinear cancellation signal.

US Pat. No. 10,432,796

METHODS AND APPARATUS TO ASSIST LISTENERS IN DISTINGUISHING BETWEEN ELECTRONICALLY GENERATED BINAURAL SOUND AND PHYSICAL ENVIRONMENT SOUND

1. A method comprising:enabling a listener to distinguish between electronically generated binaural sound and physical environment sound during a telephone call with a person by:
playing, with a wearable electronic device worn by the listener and during the telephone call, an audio alert that signifies that the electronically generated binaural sound is a voice of the person to assist the listener to distinguish between the voice of the person and the physical environment sound; and
repeating, with the wearable electronic device worn by the listener, the audio alert during the telephone call to remind the listener that the electronically generated binaural sound is the voice of the person and not the physical environment sound.

US Pat. No. 10,432,795

PERFORMING AUTOMATED EVENT SERVICES TO REGISTERED END USERS

West Corporation, Omaha,...

1. A method, comprising:transmitting an initial event notification message to an end user communication device based on a primary communication preference of:
at least one primary communication contact preference and at least one secondary communication preference; and
at least two of a mobile device preference, a computer device preference, a voice call preference, a text message preference and an email preference;
transmitting another event notification to the end user communication device via a different communication medium, based on the at least one secondary communication preference; and
joining, from the end user communication device, an event via the different communication medium while the initial communication medium is currently being occupied via a current event in progress on the end user communication device.

US Pat. No. 10,432,794

SYSTEM AND METHOD FOR DISTRIBUTED DYNAMIC RESOURCE COMMITMENT

1. A managed resource device, configured for use in a contact center, the device comprising:a processor; and
a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the processor to:
receive a signal corresponding to an initialization of a raise round according to a request over a shared data communications channel to a plurality of non-committed resources registered to communicate on the shared data communications channel,
determine, by hosted logic, whether the managed resource device should volunteer for an activity type during the raise round,
automatically transmit a volunteer signal over the shared data communications channel in response to determining that the managed resource device should volunteer for the type of activity according to the hosted logic indicating selected volunteering resources,
receive a message for committing the selected volunteering resources to the request, wherein the committed resources are selected for routing an activity having the activity type; and
an electronic routing device coupled to the processor for routing the activity having the activity type, to the committed resources, for handling by the committed resources.

US Pat. No. 10,432,793

SYSTEMS AND METHODS TO ENROLL USERS FOR REAL TIME COMMUNICATIONS CONNECTIONS

INGENIO, LLC., San Franc...

1. A method, comprising:providing a web server coupled with a connection server configured to establish real time communication connections between telephonic devices, wherein the web server is configured to present information about first users of a first set of telephonic devices to second users of a second set of telephonic devices; and
in response to a user of a web browser visiting the web server:
presenting, by the web server to the web browser, a user interface that includes a plurality of questions;
determining, by a computing apparatus, personalized ranks of the information of first users of the first set of telephonic devices based on answers from the user;
selecting, by the computing apparatus, a subset of the first users based on the personalized ranks determined from the answers;
presenting, by the web server to the web browser, the information about the subset of the first users;
receiving, in the web server, a user selection of a particular one of the first users from the subset presented in the web browser; and
in response to the user selection, the connection server
establishing a real time communication connection between a telephonic device of the user and a telephonic device of the particular one of the first users.
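
The ranking and subset-selection steps can be pictured as scoring listed users' profiles against the visitor's questionnaire answers; the profile fields and scoring rule below are assumptions for illustration only:

```python
# Illustrative sketch, not the patented system. Profiles and answers are plain dicts,
# and the score is simply the number of matching attributes.
def personalized_rank(answers: dict[str, str], profiles: list[dict]) -> list[dict]:
    """Order the first users' profiles by how well they match the visitor's answers."""
    def score(profile: dict) -> int:
        return sum(1 for key, value in answers.items() if profile.get(key) == value)
    return sorted(profiles, key=score, reverse=True)

def select_subset(ranked_profiles: list[dict], size: int = 3) -> list[dict]:
    """Keep only the top-ranked profiles for presentation in the web browser."""
    return ranked_profiles[:size]

profiles = [
    {"name": "A", "language": "English", "specialty": "tax"},
    {"name": "B", "language": "Spanish", "specialty": "tax"},
    {"name": "C", "language": "English", "specialty": "family"},
]
ranked = personalized_rank({"language": "English", "specialty": "tax"}, profiles)
print([p["name"] for p in select_subset(ranked, size=2)])  # ['A', 'B']
```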

US Pat. No. 10,432,792

SYSTEM AND METHOD OF INTEGRATING TO AN EXTERNAL SEARCH APPLICATION IN AN EMPLOYEE DESKTOP WEB CLIENT

Verint Systems UK Limited...

1. A method of integrating to an external application for an agent in a web client application, the method comprising:logging into the web client application by the client;
starting an interaction with a client with the web client application by the agent;
searching for relevant knowledge content through a third-party integration module using a graphical user interface, wherein the third-party integration module integrates with other systems and applications outside of a current system in order to search for the relevant knowledge content, wherein the other systems and applications outside of the current system are integrated into the graphical user interface;
completing the interaction with the client using the graphical user interface with enhanced input from the search step;
completing the interaction with the client without the knowledge search if the knowledge search feature is not configured.

US Pat. No. 10,432,791

METHOD AND SYSTEM FOR A SCALABLE COMPUTER-TELEPHONY INTEGRATION SYSTEM

State Farm Mutual Automob...

1. A computer-implemented method for presenting a contact center directory in a computer-telephony integration system, the method executed by one or more processors programmed to perform the method, the method comprising:presenting, via one or more processors, a plurality of contact center service categories to a system administrator, each contact center service category corresponding to a particular type of contact center service in the computer-telephony integration system;
for each of the plurality of contact center service categories, presenting, via the one or more processors, one or more sets of contact information for communicating with call agents assigned to the contact center service category; and
presenting to the system administrator, via the one or more processors, one or more user controls for editing at least one of: (i) the plurality of contact center service categories, or (ii) the one or more sets of contact information for communicating with the call agents assigned to the plurality of contact center service categories.

US Pat. No. 10,432,789

CLASSIFICATION OF TRANSCRIPTS BY SENTIMENT

VERINT SYSTEMS LTD., Her...

1. A method for classifying a sentiment of a dialog transcript, the method comprising:training a lexicon, wherein the training comprises:
receiving a training set of dialog transcripts;
splitting the training set into a negative set and a non-negative set based on a seed;
identifying n-grams in the dialog transcripts;
computing, for each n-gram, a polarity score that corresponds to the likelihood of the n-gram having either a negative or a non-negative sentiment, wherein the computing the polarity score for a particular n-gram comprises comparing the frequency of the particular n-gram in the negative set to the frequency of the particular n-gram in the non-negative set;
identifying prominent n-grams based on each n-gram's polarity score;
expanding the lexicon by adding the prominent n-grams, which are not already in the lexicon, to the lexicon; and
repeating the splitting, computing, identifying, and expanding for a plurality of iterations to obtain a trained lexicon, wherein the splitting for each iteration uses the expanded lexicon from the previous iteration; and
classifying the sentiment of the dialog transcript using the trained lexicon wherein the classifying comprises:
receiving a dialog transcript;
selecting an utterance in the dialog transcript;
identifying n-grams in the utterance;
obtaining a polarity score for each n-gram using the trained lexicon;
determining the utterance is negative or non-negative based, at least, on the polarity scores for each n-gram;
repeating the selecting, identifying, computing, and determining for other utterances in the dialog transcript; and
distinguishing the sentiment of the dialog transcript as negative or non-negative based on the negative or non-negative utterances determined in the dialog transcript.
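
Because the claim lays out a complete training-and-classification loop, a compact sketch may help readers follow it: split the training set on a seed, score n-grams by how skewed their frequencies are between the two halves, fold prominent n-grams back into the lexicon, repeat, then classify utterance by utterance. The seed words, the log-ratio score, the prominence threshold, and the sentence splitting below are all assumptions of the example, not the patent's specifics:

```python
# Illustrative sketch, not the patented method. Uses unigrams and bigrams, a log-ratio
# polarity score with add-one smoothing, and a hypothetical prominence threshold of 0.5.
import math
from collections import Counter

def ngrams(text: str, n_values=(1, 2)) -> list[str]:
    tokens = text.lower().split()
    grams: list[str] = []
    for n in n_values:
        grams += [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return grams

def train_lexicon(transcripts: list[str], seed: set[str], iterations: int = 3) -> dict[str, float]:
    lexicon = {term: -1.0 for term in seed}  # negative seed terms start with negative scores
    for _ in range(iterations):
        # Split the training set into negative / non-negative using the current lexicon.
        negative, non_negative = [], []
        for transcript in transcripts:
            grams = ngrams(transcript)
            total = sum(lexicon.get(g, 0.0) for g in grams)
            (negative if total < 0 else non_negative).append(grams)
        neg_counts = Counter(g for grams in negative for g in grams)
        pos_counts = Counter(g for grams in non_negative for g in grams)
        # Polarity score: compare each n-gram's frequency in the two sets.
        for gram in set(neg_counts) | set(pos_counts):
            polarity = math.log((neg_counts[gram] + 1) / (pos_counts[gram] + 1))
            if abs(polarity) > 0.5 and gram not in lexicon:  # keep only prominent n-grams
                lexicon[gram] = -polarity  # negative-leaning n-grams get negative scores
    return lexicon

def classify_transcript(transcript: str, lexicon: dict[str, float]) -> str:
    """Label each utterance, then the whole transcript, as negative or non-negative."""
    utterances = [u.strip() for u in transcript.split(".") if u.strip()]
    negatives = sum(
        1 for u in utterances
        if sum(lexicon.get(g, 0.0) for g in ngrams(u)) < 0
    )
    return "negative" if negatives > len(utterances) / 2 else "non-negative"

lexicon = train_lexicon(
    ["this is terrible . awful service", "thanks . great help"], seed={"terrible"}
)
print(classify_transcript("awful service . still terrible", lexicon))  # negative
```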