US Pat. No. 10,397,597

MODE IDENTIFICATION DATA REDUCING METHOD FOR INTRA-PREDICTION CODING

NTT DOCOMO, INC., Tokyo ...

1. An image predictive decoding device comprising:
a processor;
an input unit executable with the processor to accept input of compressed picture data containing a residual signal and encoded information, the residual signal generated by dividing a picture into a plurality of blocks and performing predictive encoding of a target block, and the encoded information about a prediction mode indicative of a generation method of a predicted signal of the target block;
a restoration unit executable with the processor to extract the residual signal of the target block from the compressed picture data to restore a reproduced residual signal;
a prediction mode decoding unit executable with the processor to restore the encoded information about the prediction mode to generate a prediction mode;
a predicted signal generation unit executable with the processor to generate the predicted signal of the target block based on the prediction mode;
a picture restoration unit executable with the processor to add the predicted signal to the reproduced residual signal to restore a pixel signal of the target block; and
a storage unit executable with the processor to store the restored pixel signal as a reproduced pixel signal,
wherein the prediction mode decoding unit is executable with the processor to generate a candidate prediction mode list containing elements of prediction modes of a plurality of previously-reproduced blocks neighboring the target block;
wherein the prediction mode decoding unit is further executable with the processor to decode a flag that indicates whether or not the candidate prediction mode list contains an element corresponding to the prediction mode;
when the flag indicates that the candidate prediction mode list contains the corresponding element, the prediction mode decoding unit is further executable with the processor to decode an index indexing the candidate prediction mode list to obtain an element indicated by the index as the prediction mode; and
when the flag indicates that the candidate prediction mode list does not contain the corresponding element, the prediction mode decoding unit is further executable with the processor to:
decode a REM (remaining) mode,
set a variable for the intra-frame prediction mode of the target block equal to a decoded value of the REM mode,
repeatedly perform a round of comparison between the variable and one of elements on the candidate prediction list that starts from a smallest element on the candidate list in an initial round of comparison and moves in a next round of comparison to a next larger element on the candidate list until reaching a highest element thereon, and in every round of comparison, incrementing a value of the variable for use of the incremented value of the variable in a next round of comparison when the variable is larger than or equal to the one of the elements, whereas keeping the value of the variable unchanged for use of the unchanged value of the variable in a next round of comparison when the variable is smaller than the one of the elements, and
obtaining a final value of the variable as the prediction mode.
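The REM-mode remapping in the last four steps can be sketched in Python (an illustrative sketch; the function and variable names are not from the patent):

```python
def restore_intra_mode_from_rem(rem_value, candidate_list):
    # Walk the candidates in ascending order; whenever the running value
    # is >= a candidate, bump it by one. The final value thus skips every
    # mode already representable by an index into the candidate list.
    mode = rem_value
    for candidate in sorted(candidate_list):
        if mode >= candidate:
            mode += 1
    return mode
```

For example, with candidates {2, 5} and a decoded REM value of 3, the first round bumps 3 to 4 (3 >= 2) and the second leaves it unchanged (4 < 5), so mode 4 is restored.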

US Pat. No. 10,397,596

SELF-SIMILAR REFERENCE MASKS FOR PERSISTENCY IN A VIDEO STREAM

Cisco Technology, Inc., ...

1. A method comprising:
dividing a reference mask into a plurality of reference mask divisions;
determining a plurality of motion vectors respectively associated with a plurality of slice divisions, wherein the plurality of reference mask divisions respectively correspond to the plurality of slice divisions;
modifying a blurring kernel in accordance with the plurality of motion vectors, yielding a plurality of modified blurring kernels that are respectively associated with the plurality of slice divisions; and
performing at least one action to yield an altered reference mask, including for the plurality of reference mask divisions and the plurality of modified blurring kernels: convolving a reference mask division with a weighted function of at least a modified blurring kernel associated with a slice division, of the plurality of slice divisions, to which the reference mask division corresponds.
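One way the per-division blurring could look is sketched below (a hypothetical NumPy sketch under assumed semantics: the kernel is widened along each motion axis in proportion to the motion vector, normalised, weighted, and convolved with the mask division at 'same' output size; none of these specifics come from the claim):

```python
import numpy as np

def motion_adapted_blur(mask_division, base_kernel, motion_vector, weight=1.0):
    # Widen the base kernel along each motion axis (larger motion ->
    # wider blur), normalise, then convolve the reference-mask division
    # with the weighted kernel, keeping the division's size.
    mx, my = motion_vector
    kernel = np.kron(np.asarray(base_kernel, dtype=float),
                     np.ones((1 + abs(int(my)), 1 + abs(int(mx)))))
    kernel = weight * kernel / kernel.sum()
    ph, pw = kernel.shape[0] // 2, kernel.shape[1] // 2
    padded = np.pad(np.asarray(mask_division, dtype=float),
                    ((ph, ph), (pw, pw)), mode='edge')
    h, w = np.asarray(mask_division).shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + kernel.shape[0],
                                      j:j + kernel.shape[1]] * kernel)
    return out
```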

US Pat. No. 10,397,595

PROVISION OF SUPPLEMENTAL PROCESSING INFORMATION

Telefonaktiebolaget LM Er...

1. A method of providing supplemental processing information relating to encoded media content, the method comprising:
generating media content comprising:
an alias for a media codec employed to encode the media content; and
supplemental processing information defining post-decoding instructions for processing the media content; and
sending the media content to a receiving device, wherein the alias triggers the receiving device to abort decoding the media content when the receiving device does not recognize the alias.

US Pat. No. 10,397,594

REAL-TIME PROCESSING OF IOT DATA

Hewlett Packard Enterpris...

1. A method for real-time processing of IoT data, the method comprising:
receiving, by a first physical processor of an edge computing device, a set of data from a first IoT device communicably coupled to the edge computing device, the set of data comprising a set of video data;
splitting, by the first physical processor, the set of data into a set of individual data packets;
determining, by a second physical processor of the edge computing device, a number of a plurality of instances of the second physical processor for processing the set of individual data packets based on a frame rate of the set of video data and at least one of a number of processor cores available and a bandwidth rate at which the data packets are being received; and
processing, by the second physical processor of the edge device, the set of individual data packets by:
concurrently applying, by the plurality of instances of the second physical processor of the edge computing device, a learning model to each of a corresponding plurality of data packets from the set of individual data packets;
annotating, by a subset of the plurality of instances of the second physical processor, a corresponding subset of the plurality of data packets with a corresponding output from the concurrent application of the learning model; and
processing the annotated subset of the plurality of data packets by performing at least one of: object counting, visual click-through analysis, augmented reality facilitation, prediction, and estimation.
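The instance-count determination could be sketched as follows (illustrative only; the per-instance capacities are assumptions, not values from the patent):

```python
import math

def instance_count(frame_rate_fps, cores_available, ingest_mbps,
                   per_instance_fps=10, per_instance_mbps=5):
    # Scale instances to cover both the video frame rate and the rate at
    # which packets arrive, capped by the cores actually available.
    need_for_rate = math.ceil(frame_rate_fps / per_instance_fps)
    need_for_bandwidth = math.ceil(ingest_mbps / per_instance_mbps)
    return min(max(need_for_rate, need_for_bandwidth), cores_available)
```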

US Pat. No. 10,397,592

METHOD AND APPARATUS FOR MULTI-THREADED VIDEO DECODING

COREL SOFTWARE LLC, Wilm...

1. A method for performing video decoding, comprising:
accessing a plurality of pictures associated with a video, wherein the plurality of pictures have been received over a network, each picture consisting essentially of a complete field or frame having one or more slices of a plurality of macroblocks, each picture being encoded in accordance with one of MPEG-1, MPEG-2, MPEG-4 and H.264;
parsing input bits of a first one of the pictures into a plurality of syntax elements of the first picture; and
simultaneously decoding the plurality of syntax elements of the first picture into pixel values and parsing input bits of a second different one of the pictures into a plurality of syntax elements of the second picture.
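The parse/decode overlap can be sketched with two worker threads (an illustrative sketch; `parse` and `decode` are stand-ins for the real stages):

```python
from concurrent.futures import ThreadPoolExecutor

def pipelined_decode(pictures, parse, decode):
    # Stage overlap: while picture i's syntax elements are decoded into
    # pixel values, picture i+1's input bits are parsed on another thread.
    results = []
    with ThreadPoolExecutor(max_workers=2) as pool:
        parsed = parse(pictures[0])
        for nxt in pictures[1:]:
            parse_future = pool.submit(parse, nxt)    # runs concurrently
            decode_future = pool.submit(decode, parsed)
            results.append(decode_future.result())
            parsed = parse_future.result()
        results.append(decode(parsed))                # last picture
    return results
```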

US Pat. No. 10,397,591

PROCESSOR INSTRUCTIONS FOR ACCELERATING VIDEO CODING

TEXAS INSTRUMENTS INCORPO...

1. A processor for video decoding with video specific instructions, the processor to receive an encoded video bit stream, the processor comprising:
an instruction memory storing an instruction set comprising a first instruction;
an instruction fetch stage coupled to the instruction memory, the instruction fetch stage to fetch the first instruction from the instruction memory; and
a functional unit coupled to the instruction fetch stage, the functional unit to implement datapath logic for the first instruction to decode the encoded video bit stream, to produce a video sequence instruction to search for a next start code in the encoded video bit stream starting from a current position in the encoded video bit stream, the next start code corresponding to a code of two or more bytes that is prefixed to a network abstraction layer (NAL) unit.

US Pat. No. 10,397,589

METHOD AND APPARATUS FOR PREDICTING INTER-LAYER BASED ON TEMPORAL SUB-LAYER INFORMATION

Electronics and Telecommu...

1. An inter-layer prediction method of an image including a plurality of layers, each layer having at least one sub-layer, the method comprising:
generating a residual block by decoding a bitstream;
decoding an indicator indicating whether information on a maximum temporal identifier for a first layer which may be used for inter-layer prediction of a second layer is included in the bitstream;
deriving a maximum temporal identifier based on at least one of the indicator and the information on the maximum temporal identifier;
deriving a reference picture from the first layer to be used for inter-layer prediction of a current picture of the second layer based on the maximum temporal identifier;
performing the inter-layer prediction of the current picture based on the reference picture to generate a prediction block; and
generating a reconstructed block based on the residual block and the prediction block,
wherein a picture of the first layer and having a temporal identifier greater than the maximum temporal identifier is not used for the inter-layer prediction of the current picture of the second layer.
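The final constraint amounts to a simple eligibility filter (illustrative Python; the dictionary representation is an assumption):

```python
def select_inter_layer_references(first_layer_pictures, max_temporal_id):
    # A first-layer picture whose temporal identifier exceeds the
    # signalled maximum must not be used for inter-layer prediction
    # of the second layer's current picture.
    return [pic for pic in first_layer_pictures
            if pic['temporal_id'] <= max_temporal_id]
```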

US Pat. No. 10,397,588

METHOD AND APPARATUS FOR RESOURCE SHARING BETWEEN INTRA BLOCK COPY MODE AND INTER PREDICTION MODE IN VIDEO CODING SYSTEMS

MEDIATEK INC., Hsinchu (...

1. A method of configuring an on-chip buffer or cache memory for a video coding system using coding modes including an Intra Block Copy (IntraBC) mode and an Inter prediction mode, comprising:
storing at least partial Inter reference video data from a previous picture in the on-chip buffer or cache memory;
storing at least partial pre-deblocking reconstructed video data of a current picture in the on-chip buffer or cache memory;
receiving input data associated with a current block in the current picture;
determining which one of the coding modes including the IntraBC mode and the Inter prediction mode was used to code the current block;
when it is determined that the current block is coded using the IntraBC mode, using the at least partial pre-deblocking reconstructed video data of the current picture stored in the on-chip buffer or cache memory to derive IntraBC prediction for the current block; and
when it is determined that the current block is coded using the Inter prediction mode, using the at least partial Inter reference video data from the previous picture to derive Inter prediction for the current block,
wherein a total cache line number of at least one cache line group of the on-chip buffer or cache memory is compared to a threshold for determining whether to flush at least one cache line of the at least one cache line group, and the threshold is set to zero for the Inter prediction mode if blocks encoded or decoded are in an Intra slice.
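The flush decision in the final clause can be sketched as follows (illustrative; the default threshold of 4 is an assumption, only the zero-for-Inter-in-Intra-slice rule comes from the claim):

```python
def should_flush(cache_line_count, mode, in_intra_slice, threshold=4):
    # The flush threshold is forced to zero for Inter prediction inside
    # an Intra slice, so Inter reference lines never accumulate there.
    if mode == 'inter' and in_intra_slice:
        threshold = 0
    return cache_line_count > threshold
```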

US Pat. No. 10,397,587

IMAGE PROCESSING APPARATUS AND CONTROL METHOD THEREOF

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus, comprising:
an obtaining unit configured to obtain image data of a first data format, wherein the image data of the first data format comprises a plurality of pixel data each having a value of a single color component among a predetermined plurality of color components, and wherein the plurality of pixel data in the image data of the first data format are arranged according to a predetermined pixel arrangement;
a first conversion unit configured to convert the image data of the first data format that was obtained by the obtaining unit to a second data format, and output the image data of the second data format, wherein the image data of the second data format comprises a plurality of pixel data each having values of the plurality of color components;
a resizing unit configured to reduce the number of pixel data of the image data of the second data format that was output from the first conversion unit, and output reduced image data of the second data format; and
a second conversion unit configured to convert the reduced image data of the second format to the first data format, and output the converted image data as the reduced image data of the first data format.

US Pat. No. 10,397,586

CHROMA RESHAPING

Dolby Laboratories Licens...

1. A method to reshape a high-dynamic range video signal, the method comprising:
obtaining a first video signal representing the high-dynamic range video signal in a first color format comprising a first chroma component;
analyzing, by a processor, a first set of statistical parameters of said first video signal;
determining, by the processor, based on the first set of statistical parameters whether the first video signal is a reference signal by comparing at least one of the first set of statistical parameters to a corresponding predetermined threshold value;
transforming, by the processor, the first video signal to a second video signal in a second color format based on the determining, wherein the transforming transforms the first video signal to have a reference color format if the first video signal was determined to not be a reference signal and transforms the first video signal to have a non-reference color format if the first video signal was determined to be a reference signal;
if the first video signal is determined to be a reference signal, applying, by the processor, a chroma reshaping function to the second chroma component of the second video signal to generate a reshaped chroma component and if the first video signal is determined to not be a reference signal, applying, by the processor, a chroma reshaping function to the first chroma component of the first video signal to generate a reshaped chroma component, wherein the chroma reshaping function maps pixel values of the second chroma component of the second color format to pixel values of the first chroma component of the first color format, wherein at least one parameter of the chroma reshaping function is determined by fitting the chroma reshaping function to pixel values (vji) of the second chroma component of the second video signal and corresponding pixel values (sji) of the first chroma component of the first video signal;
generating, by the processor, a third video signal comprising the reshaped chroma component.
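The fitting step can be sketched with an ordinary least-squares fit (a minimal sketch assuming a linear reshaping model s ≈ a*v + b; the claim only requires the function to be fitted to corresponding pixel values v_ji and s_ji, and does not fix the model):

```python
def fit_chroma_reshaping(v, s):
    # Closed-form least-squares fit of slope a and intercept b for the
    # assumed linear reshaping s ≈ a*v + b.
    n = len(v)
    mean_v, mean_s = sum(v) / n, sum(s) / n
    a = (sum((vi - mean_v) * (si - mean_s) for vi, si in zip(v, s))
         / sum((vi - mean_v) ** 2 for vi in v))
    return a, mean_s - a * mean_v
```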

US Pat. No. 10,397,585

PROCESSING HIGH DYNAMIC RANGE AND WIDE COLOR GAMUT VIDEO DATA FOR VIDEO CODING

QUALCOMM Incorporated, S...

9. A method of processing video data, the method comprising:
converting a chromaticity component of High Dynamic Range (HDR) and Wide Color Gamut (WCG) video data between a color representation format and fractional chromaticity coordinate (FCC) format to produce a chromaticity component of FCC formatted video data;
scaling the chromaticity component of the FCC formatted video data based on a corresponding luminance component of the FCC formatted video data to obtain adjusted FCC formatted video data;
applying a log-like transfer function (TF) to only a luminance component of the adjusted FCC formatted video data to obtain a luminance component of compacted FCC formatted video data; and
applying, depending on a level of the luminance component of the compacted FCC formatted video data, adaptive invertible linear TFs to the chromaticity component of the FCC formatted video data to obtain a chromaticity component of the compacted FCC formatted video data.
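The three processing steps could be sketched per pixel as follows (a speculative sketch: luminance division as the scaling, log2(1+y) as the log-like TF, and a two-level slope as the luma-dependent linear TF are all illustrative assumptions, not the patent's actual transforms):

```python
import math

def compact_fcc(y, u, v, eps=1e-6):
    # 1. Scale chromaticity based on luminance.
    u_adj, v_adj = u / max(y, eps), v / max(y, eps)
    # 2. Log-like TF applied to the luminance component only.
    y_compact = math.log2(1.0 + y)
    # 3. Linear TF slope for chroma chosen by the compacted luma level.
    slope = 0.5 if y_compact < 1.0 else 1.0
    return y_compact, slope * u_adj, slope * v_adj
```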

US Pat. No. 10,397,584

METHOD FOR DECODING IMAGE AND APPARATUS USING SAME

Electronics and Telecommu...

1. A method of decoding a video supporting a plurality of layers performed by a decoding apparatus, the method comprising:
receiving information on a reference layer used for decoding a current picture for interlayer prediction;
deriving a number of active reference layer pictures used for decoding the current picture based on the information on the reference layer;
performing interlayer prediction based on the number of active reference layer pictures to generate a prediction block for a current block;
generating a residual block for the current block; and
reconstructing the current block based on the prediction block and the residual block,
wherein the generating the residual block comprises entropy-decoding a bitstream to generate a quantized transformed coefficient, inverse-quantizing the quantized transformed coefficient to generate a transformed coefficient and inverse-transforming the transformed coefficient,
wherein when a layer identifier of a current layer comprising the current picture is not 0 and when a number of reference layer pictures available for interlayer prediction in the same access unit as that of the current picture is not 0, and when a reference layer picture available for interlayer prediction specified by values of maximum temporal sub-layer information on each layer and information on maximum allowed value of temporal sub-layer allowing inter-layer prediction in each layer among direct reference layer pictures comprised in all the direct reference layers of the current layer is present in the same access unit as that of the current picture and included in an interlayer reference picture set of the current picture, the number of active reference layer pictures is derived to be equal to the number of reference layer pictures, and
wherein the number of reference layer pictures is derived based on information indicating a number of direct reference layers of the current layer, the maximum temporal sub-layer information on each layer, the information on maximum allowed value of temporal sub-layer allowing inter-layer prediction in each layer and a temporal identifier of the current picture, and among the pictures in the direct reference layers of the current layer comprising the current picture, a picture in a reference layer is considered as a reference layer picture available for decoding the current picture for interlayer prediction when maximum temporal sub-layer information of the reference layer is greater than or equal to the temporal identifier of the current picture and either when information on maximum allowed value of temporal sub-layer allowing inter-layer prediction in the reference layer for the current layer is greater than the temporal identifier of the current picture or when the temporal identifier of the current picture is 0.
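The availability test in the final clause can be sketched as a counting loop (illustrative Python; the per-layer dictionary keys are assumed names for the two signalled maxima):

```python
def count_available_reference_layer_pictures(direct_ref_layers, current_tid):
    # A direct-reference-layer picture counts as available when its
    # maximum temporal sub-layer id covers the current temporal id, and
    # either inter-layer prediction is allowed above the current
    # temporal id or the current picture sits at temporal id 0.
    count = 0
    for layer in direct_ref_layers:
        if layer['max_tid'] >= current_tid and (
                layer['max_il_tid'] > current_tid or current_tid == 0):
            count += 1
    return count
```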

US Pat. No. 10,397,583

IMAGE PROCESSING APPARATUS AND METHOD

SONY CORPORATION, Tokyo ...

1. An image processing apparatus, comprising:
a control information obtaining unit configured to obtain control information to control a smallest size of an inter prediction unit corresponding to a first processing unit of an inter prediction, wherein the control information is obtained based on:
a coding unit which comprises a multilayer structure in which a picture is divided into a plurality of pieces, wherein the coding unit is set as a second processing unit before the inter prediction, and
an encoded picture of the picture, wherein
the control information indicates a permission status for an N×N block size with respect to a 2N×2N block size of a smallest coding unit;
a setting unit configured to:
set one of a 2N×N block size or an N×2N block size as the smallest size of the inter prediction unit based on the permission status that indicates the N×N block size is prohibited as the smallest size of the inter prediction unit; and
set the N×N block size as the smallest size of the inter prediction unit based on the permission status that indicates the N×N block size is permitted as the smallest size of the inter prediction unit; and
a motion prediction and compensation unit configured to set the inter prediction unit to a size larger than the set smallest size for execution of the inter prediction.

US Pat. No. 10,397,582

IMAGE ENCODING DEVICE AND IMAGE ENCODING METHOD

FUJITSU LIMITED, Kawasak...

1. An image encoding device that encodes an image to be encoded by using a palette, the image encoding device comprising:
an association circuit that determines whether a first pixel included in the image to be encoded will be associated with any entry in the palette, on the basis of a threshold for a difference between a pixel value of the first pixel and a palette value;
an addition circuit that adds an entry for the first pixel to the palette when the association circuit does not associate the first pixel with any of the entries in the palette;
a fusion circuit that fuses two entries in the palette so as to generate a fused entry when an encoding result of the image to be encoded does not satisfy a condition of a target amount of information by adding an entry to the palette;
a control circuit that changes the threshold when the association circuit associates the first pixel with any of the entries in the palette, when the addition circuit adds the entry for the first pixel to the palette, or when the fusion circuit generates the fused entry and the addition circuit adds the entry for the first pixel to the palette; and
a palette encoding circuit that encodes the image to be encoded by using the palette, wherein
the association circuit determines whether a second pixel included in the image to be encoded will be associated with any of the entries in the palette, on the basis of the changed threshold for a difference between a pixel value of the second pixel and a palette value.
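The association circuit's threshold test can be sketched as follows (illustrative; a None result corresponds to the case where the addition circuit must create a new entry):

```python
def map_to_palette(pixel_value, palette, threshold):
    # Associate the pixel with the closest palette entry only when the
    # value difference stays within the threshold; None means no entry
    # matches and a new one should be added for this pixel.
    if not palette:
        return None
    best = min(palette, key=lambda entry: abs(entry - pixel_value))
    return best if abs(best - pixel_value) <= threshold else None
```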

US Pat. No. 10,397,581

METHOD AND RATE CONTROLLER FOR CONTROLLING OUTPUT BITRATE OF A VIDEO ENCODER

Axis AB, Lund (SE)

1. A method of controlling output bitrate of a video encoder encoding a video sequence, the method comprising:
setting a long-term bit budget for a time period of at least one day for output of the video encoder;
determining a first allowable bitrate based on the long-term bit budget;
determining an instantaneous bit restriction for output of the video encoder;
determining a second allowable bitrate based on the instantaneous bit restriction; and
controlling output bitrate based on the first allowable bitrate and the second allowable bitrate, such that the long-term bit budget, the first allowable bitrate and the second allowable bitrate are complied with.
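One way to combine the two allowable bitrates is sketched below (illustrative; spreading the remaining budget over the remaining time is an assumed policy for the first allowable rate):

```python
def output_bitrate(long_term_budget_bits, seconds_remaining,
                   bits_spent, instantaneous_cap_bps):
    # First allowable rate: what is left of the long-term budget spread
    # over the remaining time; second: the instantaneous restriction.
    # Taking the minimum complies with both simultaneously.
    first_allowable = max(long_term_budget_bits - bits_spent, 0) / seconds_remaining
    return min(first_allowable, instantaneous_cap_bps)
```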

US Pat. No. 10,397,580

METHOD AND APPARATUS FOR PROCESSING A VIDEO SIGNAL

LG Electronics Inc., Seo...

1. A method for encoding a video signal comprising at least one coding block by an encoding apparatus, the method comprising:
obtaining, by the encoding apparatus, a plurality of prediction blocks being partitioned from the coding block;
obtaining, by the encoding apparatus, merge flag information indicating whether prediction information of a current prediction block of the plurality of prediction blocks is determined by prediction information of a merging candidate selected from merging candidates, the merging candidates comprising at least one neighboring prediction block adjacent to the current prediction block; and
encoding, by the encoding apparatus, the current prediction block based on the prediction information of the selected merging candidate when the merge flag information indicates that the prediction information of the current prediction block is determined by the prediction information of the selected merging candidate,
wherein, when the coding block is partitioned into two prediction blocks each of which has a horizontal size greater than a vertical size, and the current prediction block is a bottom prediction block of the two prediction blocks, a top prediction block of the two prediction blocks is not included in the merging candidates.

US Pat. No. 10,397,579

SAMPLING RATE CONVERTER

NTT ELECTRONICS CORPORATI...

1. A sampling rate converter converting input digital signals Xki sampled at an input sampling rate into a plurality of output digital signals Yki each sampled at output sampling rates different from each other, comprising:
a position coordinate difference computing unit calculating position coordinate differences Dki between position coordinates Tki of the plurality of output digital signals Yki and position coordinates Zki of the input digital signals Xki adjacent to the position coordinates Tki;
an FIR coefficient memory storing a plurality of FIR coefficients F(z), with z being a variable number about a position coordinate, of a finite impulse response low pass filter (FIR-LPF), and outputting the plurality of FIR coefficients F(z) corresponding to the position coordinate differences Dki, and outputting the plurality of FIR coefficients, where F(z) is represented as F(Tki−Zki(1)), . . . , F(Tki−Zki(p)) corresponding to a plurality of position coordinates z=Tki−Zki(1), . . . , Tki−Zki(p), respectively, when the position coordinate differences Dki are input, wherein the FIR-LPF has a characteristic of blocking frequency components which are ½ or more of the output sampling rate and the position coordinates Zki(1), . . . , Zki(p) represent position coordinates of a fixed number of the input digital signals Xki(1), . . . , Xki(p) present in the vicinity of a periphery of the position coordinates Tki of the output digital signals Yki;
a common parallel FIR calculator calculating, in parallel, Yki=F(Tki−Zki(1))*Xki(1)+ . . . +F(Tki−Zki(p))*Xki(p) to obtain the plurality of output digital signals Yki, wherein the plurality of output digital signals Yki are temporarily stored, before being output in synchronization with an output clock, in a plurality of output memories selected by a selector based on a corresponding identification of the plurality of output digital signals Yki, and wherein the plurality of output digital signals Yki are output in parallel at different sampling rates;
a control unit supplying a group of the FIR coefficients and a group of the input digital signals corresponding to the respective position coordinate differences to the common parallel FIR calculator in a predetermined order when the position coordinate differences corresponding to two or more different output digital signals are concurrently computed;
a shift register storing the input digital signals sequentially; and
a counter counting in synchronization with the shift register, wherein the control unit comprises:
a parallel serial converter configured to store the position coordinate differences and a first count value indicating a value of the counter when each of the position coordinate differences is computed in an FIR computation parameter memory in the predetermined order when the position coordinate differences respectively corresponding to two or more different output digital signals are concurrently computed, and
a data selector reading out a group of the input digital signals corresponding to the first count value from the shift register based on a difference between a second count value which is a current value of the counter and the first count value and supplying the group of the input digital signals to the parallel FIR calculator.
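The core FIR computation Yki = F(Tki−Zki(1))*Xki(1) + ... + F(Tki−Zki(p))*Xki(p) can be sketched for a single output position (illustrative only; unit-spaced integer input positions and the triangular kernel are assumptions made to keep the sketch small):

```python
import math

def resample_point(t_out, input_samples, fir, taps=4):
    # For one output position t_out, gather the `taps` input samples
    # nearest t_out (unit-spaced at integer positions z), evaluate the
    # FIR-LPF at each position difference t_out - z, and accumulate.
    z0 = int(math.floor(t_out)) - taps // 2 + 1
    total = 0.0
    for k in range(taps):
        z = z0 + k
        if 0 <= z < len(input_samples):
            total += fir(t_out - z) * input_samples[z]
    return total

def tri(d):
    # Triangular (linear-interpolation) kernel as a toy FIR-LPF.
    return max(0.0, 1.0 - abs(d))
```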

US Pat. No. 10,397,578

NESTED ENTROPY ENCODING

Dolby International AB, ...

1. A method for decoding a motion vector predictor of a current block in a picture of a sequence of pictures, the method comprising:
accessing a current block in the picture;
identifying a first block and a second block that are each adjacent to the current block in the picture;
conditioned on determining that a motion vector of the first block is not equal to a motion vector of the second block, including, in a motion vector predictor candidate set for the current block, the motion vector of the first block and the motion vector of the second block;
conditioned on determining that the motion vector of the first block is equal to the motion vector of the second block, including, in the motion vector predictor candidate set for the current block, one of the motion vector of the first block or the motion vector of the second block;
receiving a flag from a bitstream, the flag indicating whether a temporally-located motion vector can be used as a motion vector predictor;
conditioned on determining that the flag indicates that a temporally-located motion vector can be used as a motion vector predictor, including a motion vector of a block in another picture in the motion vector predictor candidate set for the current block;
conditioned on determining that the flag indicates that a temporally-located motion vector cannot be used as a motion vector predictor, excluding, from the motion vector predictor candidate set for the current block, the motion vector of the block in the other picture;
selecting a motion vector from the motion vector predictor candidate set as the motion vector predictor of the current block;
deriving a motion vector for the current block based on the selected motion vector predictor and a motion vector differential; and
generating a residual block as a difference between the current block and a reference block identified by the motion vector for the current block.
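The candidate-set construction rules can be sketched as follows (illustrative Python; motion vectors are represented as tuples):

```python
def build_mvp_candidate_set(mv_first, mv_second, temporal_mv, temporal_flag):
    # De-duplicate the two spatially adjacent motion vectors, then
    # append the temporally co-located vector only when the bitstream
    # flag permits temporal motion vector prediction.
    candidates = [mv_first] if mv_first == mv_second else [mv_first, mv_second]
    if temporal_flag:
        candidates.append(temporal_mv)
    return candidates
```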

US Pat. No. 10,397,577

INVERSE SCAN ORDER FOR SIGNIFICANCE MAP CODING OF TRANSFORM COEFFICIENTS IN VIDEO CODING

Velos Media, LLC, Plano,...

1. A method of coding transform coefficients associated with residual video data in a video coding process, the method comprising:
coding a respective syntax element for each transform coefficient in a block of transform coefficients with a first scan pass, wherein each respective syntax element indicates whether or not a corresponding transform coefficient in the block of transform coefficients is a significant coefficient having an absolute value greater than zero, and wherein the first scan pass proceeds in an inverse scan direction from higher frequency coefficients in the block of transform coefficients to lower frequency coefficients in the block of transform coefficients; and
coding information indicating levels of the significant transform coefficients in the block of transform coefficients with a second scan pass.
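The two scan passes can be sketched as follows (illustrative; `scan_order` lists coefficient positions from lowest to highest frequency, so the inverse scan simply reverses it):

```python
def code_significance_and_levels(coefficients, scan_order):
    # First pass, inverse scan direction (highest frequency back toward
    # DC): one significance flag per coefficient. Second pass: the
    # absolute levels of the significant coefficients only.
    inverse_scan = list(reversed(scan_order))
    significance = [1 if coefficients[pos] != 0 else 0 for pos in inverse_scan]
    levels = [abs(coefficients[pos]) for pos in inverse_scan
              if coefficients[pos] != 0]
    return significance, levels
```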

US Pat. No. 10,397,576

RESHAPING CURVE OPTIMIZATION IN HDR CODING

Dolby Laboratories Licens...

1. A method for generating a reshaping function and encoding video frames with a processor, the method comprising:
receiving with the processor a first video frame in a first dynamic range;
receiving a second video frame in a second dynamic range, wherein the first and the second frames represent the same scene;
receiving a tone-mapping function mapping luminance pixel values from the first dynamic range to luminance pixel values in the second dynamic range;
receiving a first cumulative density function (CDF) matching curve which maps luminance values from the first dynamic range to reshaped luminance values in the second dynamic range;
generating a first histogram of luminance values based on luminance pixel values in the first video frame;
generating a second histogram of luminance values based on luminance pixel values in the second video frame;
generating one or more histogram peaks based on the first and second histograms of luminance values and the first CDF matching curve, wherein generating the one or more histogram peaks comprises:
smoothing the second histogram with a smoothing filter to generate a smoothed histogram;
computing a histogram of element-wise differences between corresponding codeword values in the second histogram and the smoothed histogram;
normalizing the element-wise differences to be between 0 and 1; and
determining there is a histogram peak if a normalized element-wise difference is larger than a peak-detection threshold;
generating a first luminance pixel range in the first dynamic range based on the one or more histogram peaks and the first CDF matching curve;
generating a second luminance pixel range in the first dynamic range based on the first luminance pixel range, the first CDF matching curve, and the tone-mapping function;
generating a second CDF matching curve based at least on the second luminance pixel range, the first CDF matching curve, and the tone-mapping function;
generating a forward reshaping function based on the second CDF matching curve;
applying the forward reshaping function to the first video frame to generate a reshaped video frame;
generating a backward reshaping function based on the second CDF matching curve;
compressing the reshaped video frame to generate a compressed video frame; and
combining the compressed video frame and the backward reshaping function to generate a coded bitstream.
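The peak-detection steps recited above (smooth, take element-wise differences, normalize, threshold) can be sketched in Python. The 3-tap moving average, the default threshold, and all names are assumptions; the claim only requires "a smoothing filter" and "a peak-detection threshold".

```python
def detect_histogram_peaks(hist, threshold=0.5):
    """Flag histogram bins that stand out from a smoothed baseline."""
    n = len(hist)
    # Smooth with a 3-tap moving average, zero-padded at the edges
    # (the claim only says "a smoothing filter"; this kernel is assumed).
    padded = [0] + list(hist) + [0]
    smoothed = [(padded[i] + padded[i + 1] + padded[i + 2]) / 3 for i in range(n)]
    # Element-wise differences between the histogram and its smoothed version.
    diff = [h - s for h, s in zip(hist, smoothed)]
    # Normalize the differences to lie between 0 and 1.
    lo, hi = min(diff), max(diff)
    span = hi - lo
    norm = [(d - lo) / span if span else 0.0 for d in diff]
    # Declare a peak wherever the normalized difference exceeds the threshold.
    return [i for i, v in enumerate(norm) if v > threshold]
```

For a histogram with a single spike at bin 3, `detect_histogram_peaks([1, 1, 1, 10, 1, 1, 1])` returns `[3]`.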

US Pat. No. 10,397,575

IMAGE CODING APPARATUS, IMAGE CODING METHOD, AND PROGRAM, AND IMAGE DECODING APPARATUS, IMAGE DECODING METHOD, AND PROGRAM

CANON KABUSHIKI KAISHA, ...

1. An image encoding apparatus comprising:a determination unit configured to determine a block size of a block included in an image;
an information encoding unit configured to encode information on a minimum area size related to a quantization parameter; and
an encoding unit configured to encode a difference value for a quantization parameter capable of being shared in encoding processes for a block group including a plurality of blocks,
wherein a size of each of the plurality of blocks is smaller than the minimum area size,
wherein the plurality of blocks includes at least both of a first block and a second block processed subsequent to the first block,
wherein the encoding unit is configured to encode the difference value in processing of the first block, in a case where the first block contains a coefficient value not equal to 0 and a split flag for the first block indicates that the first block is not split into blocks with half horizontal size and half vertical size, and
wherein the encoding unit is configured to encode the difference value in processing of the second block, in a case where the second block contains a coefficient value not equal to 0 and the difference value has not been encoded in the processing of the first block and a split flag for the second block indicates that the second block is not split into blocks with half horizontal size and half vertical size.
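The two conditional clauses above reduce to one rule: the shared QP difference value is encoded at the first not-further-split block in the group that contains a non-zero coefficient. A minimal sketch, with hypothetical dict keys standing in for the parsed flags and `encode` standing in for the entropy coder:

```python
def encode_qp_delta_for_group(blocks, encode):
    """Encode the shared QP difference at most once per block group, at the
    first block with a non-zero coefficient whose split flag is not set.
    Key names and the callback are illustrative, not from the patent."""
    delta_encoded = False
    for block in blocks:
        if (not delta_encoded
                and block["has_nonzero_coeff"]
                and not block["split_flag"]):
            encode(block)          # difference value signalled here
            delta_encoded = True   # later blocks in the group reuse it
    return delta_encoded
```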

US Pat. No. 10,397,573

METHOD AND SYSTEM FOR GENERATING A TRANSFORM SIZE SYNTAX ELEMENT FOR VIDEO DECODING

Dolby Laboratories Licens...

1. A method for video signal processing, the method comprising:receiving an intra-predicted macroblock;
receiving a macroblock type of the intra-predicted macroblock indicating a transform size;
receiving a transform syntax element generated based on the transform size that indicates an inverse transform size for use with the intra-predicted macroblock,
deriving, based on the macroblock type, a flag indicating whether a sub-partition of the intra-predicted macroblock uses a size other than 8×8;
selecting a transform size based on the macroblock type, transform syntax element and the flag;
re-scaling and inverse transforming the received macroblock based on the selected inverse transform size to generate an inverse transformed macroblock;
reconstructing a reconstructed macroblock based on the inverse transformed macroblock; and
deblock filtering the reconstructed macroblock,
wherein selecting the transform size includes selecting an N×N transform size when the macroblock type is an N×N macroblock type and selecting an M×M transform size when the macroblock type is an M×M macroblock type, wherein N and M are integer values and M is greater than N.
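The closing "wherein" clause ties the transform size directly to the macroblock type. A minimal sketch of just that mapping, assuming the type arrives as an "NxN" string; the flag and syntax-element interactions recited earlier in the claim are omitted:

```python
def select_transform_size(mb_type):
    """N x N macroblock type -> N x N transform; M x M type -> M x M
    transform, with M > N (e.g. N = 8, M = 16)."""
    n = int(mb_type.split("x")[0])
    return (n, n)
```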

US Pat. No. 10,397,572

METHOD AND SYSTEM FOR GENERATING A TRANSFORM SIZE SYNTAX ELEMENT FOR VIDEO DECODING

Dolby Laboratories Licens...

1. A method for decoding a coded picture, the method comprising:receiving an intra-predicted macroblock;
receiving a macroblock type of the intra-predicted macroblock indicating a size;
selecting a transform size having a same size as the size of the macroblock type;
receiving a transform syntax element generated based on the transform size that indicates an inverse transform size for use with the intra-predicted macroblock;
re-scaling and inverse transforming the received macroblock based on the selected inverse transform size to generate an inverse transformed macroblock; and
reconstructing a reconstructed macroblock based on the inverse transformed macroblock, wherein selecting the transform size includes selecting an N×N transform size when the macroblock type is an N×N macroblock type and selecting an M×M transform size when the macroblock type is an M×M macroblock type, wherein N and M are integer values and M is greater than N.

US Pat. No. 10,397,570

METHOD FOR ENCODING AND DECODING IMAGES, DEVICE FOR ENCODING AND DECODING IMAGES AND CORRESPONDING COMPUTER PROGRAMS

ORANGE, Paris (FR)

1. A coding method for coding at least one image subdivided into blocks, implementing, for a current block of N×N pixels to be coded, where N≥1, the following acts performed by a coding device:predicting the current block in accordance with a prediction mode selected from a plurality of predetermined prediction modes,
obtaining a predictor block,
determining a set of N×N residual data representative of a difference between the predictor block obtained and the current block, said residual data being likely to have an amplitude and a sign,
computing a set of N×N coefficients from said set of residual data,
scanning the coefficients of said set of N×N coefficients according to a given order, delivering a set of N² coefficients,
coding the amplitude of each coefficient of said set of N² coefficients, wherein for the coefficients of said set of N² coefficients that have a sign:selecting, from said set of N² coefficients, a subset of coefficients whose sign is the most predictable, said subset containing the first K coefficients assigned a non-zero amplitude and a sign, using a predetermined criterion which is a function of the selected prediction mode,
obtaining K predicted values of the sign of respectively said first K coefficients,
computing an information item representative of the difference between said K predicted values of sign and respectively said K values of sign,
coding said computed information item,
coding the signs of the coefficients that have not been selected.
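The sign-prediction scheme above can be sketched as follows. The mode-dependent criterion for picking the "most predictable" signs is elided; this sketch simply takes the first K non-zero coefficients in scan order and represents the coded information item as a binary residual (0 where the predicted sign matched). Both choices, and all names, are assumptions:

```python
def code_coefficient_signs(coeffs, predicted_signs, k):
    """Split signs into a prediction residual for the first K non-zero
    coefficients and directly coded signs for the rest.
    predicted_signs[i] is 1 if coefficient i is predicted negative."""
    # Indices of the first K non-zero coefficients in scan order.
    selected = [i for i, c in enumerate(coeffs) if c != 0][:k]
    # Residual: 0 where the predicted sign matches the actual sign.
    residual = [int((coeffs[i] < 0) != predicted_signs[i]) for i in selected]
    # Signs of non-selected non-zero coefficients are coded as-is.
    rest = [int(c < 0) for i, c in enumerate(coeffs)
            if c != 0 and i not in selected]
    return residual, rest
```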

US Pat. No. 10,397,569

METHOD AND APPARATUS FOR TEMPLATE-BASED INTRA PREDICTION IN IMAGE AND VIDEO CODING

MEDIATEK INC., Hsin-Chu ...

1. A method of video encoding and decoding used by a video encoding system and video decoding system respectively, the method comprising:receiving input data associated with a current block in a current image, wherein template-based Intra prediction is enabled for the current image;
receiving a plurality of neighboring reconstructed samples of the current block;
determining one or more size-reduced templates from said plurality of neighboring reconstructed samples, wherein determining the one or more size-reduced templates comprises at least one of:
identifying, from said plurality of neighboring reconstructed samples, a first set of neighboring reconstructed samples as a top template located on a top side of the current block with a template width less than a width of the current block; and
identifying, from said plurality of neighboring reconstructed samples, a second set of neighboring reconstructed samples as a left template located on a left side of the current block with a template height less than a height of the current block;
determining a target Intra mode or an Intra-mode candidate set using the template-based Intra prediction according to said one or more size-reduced templates, wherein determining the target Intra mode or the Intra-mode candidate set comprises:
generating prediction samples for said one or more size-reduced templates according to a corresponding available Intra mode; and
determining whether the corresponding available Intra mode is selected as the target Intra mode or one of the Intra-mode candidate set based on a comparison between the prediction samples and neighboring reconstructed samples in said one or more size-reduced templates; and
encoding or decoding the current block using Intra prediction with a current Intra mode selected from an Intra mode group comprising the target Intra mode or the Intra-mode candidate set.
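The template-matching loop in this claim can be sketched directly: generate prediction samples for the size-reduced template under each available mode and keep the best match. `predict_fn` stands in for the codec's intra predictor, and SAD is an assumed matching cost (the claim only says "a comparison"):

```python
def select_intra_mode(template_samples, predict_fn, candidate_modes):
    """Return the mode whose template prediction best matches the
    neighboring reconstructed samples (lowest sum of absolute differences)."""
    best_mode, best_cost = None, float("inf")
    for mode in candidate_modes:
        pred = predict_fn(mode)
        # Compare predictions against the reconstructed template samples.
        cost = sum(abs(p - t) for p, t in zip(pred, template_samples))
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode
```

The same loop, with a threshold instead of an argmin, would populate an Intra-mode candidate set rather than a single target mode.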

US Pat. No. 10,397,568

METHOD AND APPARATUS FOR PALETTE CODING OF MONOCHROME CONTENTS IN VIDEO AND IMAGE COMPRESSION

HFI Innovation Inc., Zhu...

1. A method of decoding video data using palette coding for a video coding system, comprising:receiving a video bitstream associated with the video data;
obtaining a color-format syntax element from the video bitstream;
determining whether a picture of the video data is monochrome or non-monochrome based on the color-format syntax element;
in response to determining that the picture of the video data is monochrome:
from the video bitstream, obtaining first palette coding parameters for a particular color component of the picture; and
generating, by circuitry of the video coding system, a first palette table in a monochrome format for palette decoding the picture using the first palette coding parameters, wherein a variable indicating a number of color component values for each palette entry in the first palette table is set to a first value in response to the color-format syntax element indicating that the picture of the video data is monochrome;
in response to determining that the picture of the video data is non-monochrome:
from the video bitstream, obtaining second palette coding parameters for multiple color components of the picture; and
generating, by the circuitry of the video coding system, a second palette table in a multiple-component format for palette decoding the picture using the second palette coding parameters, wherein a variable indicating a number of color component values for each palette entry in the second palette table is set to a second value that is different from the first value in response to the color-format syntax element indicating that the picture of the video data is non-monochrome; and
outputting a decoded presentation of the picture that is decoded using the first palette table or the second palette table.
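The monochrome/non-monochrome split above hinges on one variable: how many color-component values each palette entry keeps. A sketch assuming the color-format syntax element behaves like HEVC's chroma_format_idc (0 meaning monochrome) and that non-monochrome entries carry three components:

```python
def build_palette_table(chroma_format_idc, palette_entries):
    """Keep one component value per entry for monochrome content,
    three otherwise (the claimed first/second values of the variable)."""
    num_comps = 1 if chroma_format_idc == 0 else 3
    return [entry[:num_comps] for entry in palette_entries]
```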

US Pat. No. 10,397,567

MOTION VECTOR CODING APPARATUS, METHOD AND PROGRAM FOR CODING MOTION VECTOR, MOTION VECTOR DECODING APPARATUS, AND METHOD AND PROGRAM FOR DECODING MOTION VECTOR

CANON KABUSHIKI KAISHA, ...

1. A method for determining a predicted motion vector for decoding at least one block included in a picture of a video from a bit stream which is hierarchically coded using a plurality of layers, the method comprising:obtaining, for generation of the predicted motion vector in a target block of a current picture on one of a plurality of layers, at least one of
(i) a first motion vector for a block at a position corresponding to the target block to be decoded of a previous picture before the current picture including the target block,
(ii) a second motion vector for a block, at a position corresponding to the target block to be decoded, of a picture in another layer of the plurality of layers, and
(iii) a third motion vector for one of neighboring blocks of the target block to be decoded of the current picture on one of the plurality of layers;
generating a candidate group of motion vectors used for prediction of a motion vector for the target block, the candidate group having the second motion vector and the third motion vector excluding the first motion vector if the second motion vector is in the candidate group, and the candidate group having the first motion vector and the third motion vector if the second motion vector is not in the candidate group;
determining the predicted motion vector from the generated candidate group for the target block in accordance with information about a predicted motion vector for the target block received from a bit stream; and
decoding the target block on the one of the plurality of layers using the determined predicted motion vector,
wherein the generated candidate group has a fixed number of motion vectors for prediction of a motion vector for the target block.
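The candidate-group rule above (inter-layer vector displaces the temporal one when available, spatial vector always kept, fixed group size) can be sketched as follows. Modelling availability as `second_mv is not None` and padding with a zero vector are assumptions; the claim only fixes the group size:

```python
def build_mv_candidate_group(first_mv, second_mv, third_mv, fixed_size=2):
    """first_mv: temporal (previous picture), second_mv: inter-layer,
    third_mv: spatial neighbor. Returns a fixed-size candidate group."""
    if second_mv is not None:
        # Inter-layer candidate present: exclude the temporal vector.
        group = [second_mv, third_mv]
    else:
        group = [first_mv, third_mv]
    # Pad (assumed zero-vector convention) and truncate to the fixed size.
    while len(group) < fixed_size:
        group.append((0, 0))
    return group[:fixed_size]
```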

US Pat. No. 10,397,566

IMAGE CODING APPARATUS, IMAGE CODING METHOD, AND PROGRAM

Canon Kabushiki Kaisha, ...

1. An image coding apparatus which codes an image using intra prediction, the image coding apparatus comprising:a size determination unit configured to determine a size of a prediction block to be a unit of processing for performing intra prediction;
an intra prediction unit configured to derive a prediction error by performing intra prediction on the prediction block;
a cost determination unit configured to derive a coding cost of a prediction mode in a case where the intra prediction is performed using a reference according to the size determined by the size determination unit and determine the coding cost of the prediction mode based on the coding cost of the prediction mode and the prediction error; and
a mode determination unit configured to determine a prediction mode of the prediction block from among a plurality of prediction modes based on the coding cost.

US Pat. No. 10,397,565

IMAGING DEVICE WITH ALIGNMENT ANALYSIS

Fluke Corporation, Evere...

1. A method for determining misalignment of an imaging device, the method comprising:capturing first visible light image data at a first time using the imaging device, the first visible light image data including a target and a visual indicator, the visual indicator included in the first visible light image data as a function of a location of a measurement zone of an infrared sensor of the imaging device;
determining a first position of the visual indicator relative to the target in the first visible light image data;
capturing second visible light image data at a second time using the imaging device, the second time being subsequent to the first time, the second visible light image data including the target and the visual indicator, the visual indicator included in the second visible light image data as a function of a location of the measurement zone of the infrared sensor of the imaging device;
determining a second position of the visual indicator relative to the target in the second visible light image data; and
comparing the first position of the visual indicator to the second position of the visual indicator.

US Pat. No. 10,397,563

CONTEXT AWARE MIDAIR PROJECTION DISPLAY

International Business Ma...

1. A method comprising:receiving an audio data set corresponding to audio from a set of microphone(s) set up to capture sound in a three dimensional space;
receiving a visual data set corresponding to visual images from a plurality of camera(s) set up to capture visual images in the three dimensional space;
analyzing the visual data set and the audio data set to detect a plurality of persons in the three dimensional space;
further analyzing the visual data set and the audio data set to determine, for each person of the plurality of persons, a respectively corresponding zone of attention within the three dimensional space;
determining a projection zone within the three dimensional space, with the determination being made so that the projection zone avoids blocking each person of the plurality of persons from viewing that person's respectively corresponding zone of attention; and
projecting a midair projection display at the projection zone.

US Pat. No. 10,397,562

4D PLATFORM FOR HOME USE AND 4D SYSTEM FOR HOME USE

FOURREAL CO., LTD., Seou...

1. A home four-dimensional (4D) platform comprising:an outer casing;
a physical feeling effect execution unit disposed in the outer casing;
a support, fixed to an inner bottom surface of the outer casing, for supporting the physical feeling effect execution unit; and
a control unit, electrically connected to the physical feeling effect executing unit, for controlling operation of the physical feeling effect executing unit,
wherein the physical feeling effect executing unit comprises:
a blower unit for blowing wind to a front of the outer casing;
a spray unit for spraying water as mist to a front of the outer casing in sync with each scene of video images; and
a light unit for flickering light to a rear of the outer casing,
wherein the outer casing comprises a track-shaped front side portion, and
wherein a plurality of mist holes for spraying water as mist are formed in a central portion of the front side portion.

US Pat. No. 10,397,561

AUTOSTEREOSCOPIC 3D DISPLAY DEVICE USING HOLOGRAPHIC OPTICAL ELEMENTS

Covestro Deutschland AG, ...

1. An autostereoscopic 3D display device comprising: a first light source and a second light source configured to alternately generate light;
a light guide panel arranged to guide the light from the first light source in a first direction as a first light beam and the light from the second light source in a second direction as a second light beam; and
a stack of a plurality of holographic optical elements configured to converge the first light beam and the second light beam from the light guide panel,
wherein the light guide panel includes a prism structure configured to reflect the light from the first and second light sources toward a surface of the light guide panel adjoining the stack of the plurality of holographic optical elements;
wherein the light guide panel and the prism structure are further configured to refract the light from the first and second light sources out of the light guide panel through an air gap towards the stack of the plurality of holographic optical elements;
wherein the prism structure includes at least one first inclined surface having a first inclined angle and at least one second inclined surface having a second inclined angle;
wherein the first and second inclined angles are less than 10°; and
wherein each of the plurality of holographic optical elements includes one or more interference patterns recorded by illuminating a reference beam on the holographic optical element based on one or more first incident angles and an object beam on the holographic optical element based on one or more second incident angles.

US Pat. No. 10,397,560

TRANSMISSION-TYPE DISPLAY

SEIKO EPSON CORPORATION, ...

1. A transmission-type display that is capable of transmitting an external light while forming a virtual image, the transmission-type display comprising:a first display device on a left side of the transmission-type display, the first display device being arranged along a first line; and
a second display device on a right side of the transmission-type display, the second display device being arranged along a second line, the first line and the second line meeting at a predetermined angle, and each of the first display device and the second display device guiding image light to form the virtual image and causing the image light and the external light to be superimposed on each other and viewed, wherein:
the predetermined angle at which the first line and the second line meet defines a parallax of the virtual image by tilting a principal ray of the image light which is emitted from each of the first and second display devices; and
the predetermined angle is adjustable to change the parallax.

US Pat. No. 10,397,559

DISPLAY METHOD, DISPLAY APPARATUS, AND DISPLAY SYSTEM

BOE TECHNOLOGY GROUP CO.,...

1. A display method, comprising:processing a plurality of images to form base compositions; and
presenting the base compositions with different images in a spatial arrangement which as a whole form a composite image viewable to a naked-eye viewer;
wherein:
a subset of the plurality of images are selectively viewable as a modulated view to a user with an optical modulation device;
wherein the presenting base compositions with different images in the spatial arrangement comprises presenting the base composition containing different images according to a display pixel row arrangement;
the presenting the base compositions containing different images according to a display pixel row arrangement comprises:
presenting a first group of the base compositions at odd-numbered rows of pixels; and
presenting a second group of the base compositions at even-numbered rows of pixels;
luminance of the even-numbered rows of pixels correspond to the modulated view; and
a sum of the luminance of the even-numbered rows of pixels and luminance of odd-numbered rows of pixels adjacent to the even-numbered rows of pixels correspond to the composite image;
the method further comprising:
obtaining pixel gray scale values X1 corresponding to the even-numbered rows of pixels;
mapping the X1 into X1′ in a luminance domain;
obtaining pixel gray scale values Y0 of the composite image corresponding to the odd-numbered rows of pixels and the even-numbered rows of pixels in luminance modulation regions; and
mapping Y0 into C in the luminance domain; wherein C is a constant.
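The closing constraint (adjacent odd-row and even-row luminances summing to the constant C of the composite image) can be sketched as follows. The gamma curve used to map gray value X1 into luminance X1′, the 8-bit range, and C = 1.0 are all assumptions; the claim only requires some mapping into the luminance domain:

```python
def odd_row_luminance(x1_gray, gamma=2.2, c=1.0):
    """Given an even-row gray value X1, return the odd-row luminance that
    makes the per-pair luminance sum equal the constant C."""
    l_even = (x1_gray / 255.0) ** gamma   # X1 -> X1' in the luminance domain
    # Odd + even row luminance must sum to C; clamp at zero.
    return max(c - l_even, 0.0)
```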

US Pat. No. 10,397,558

FORMING SPECTRAL FILTERS

Dolby Laboratories Licens...

1. Glasses for viewing a 3D stereoscopic image display comprising:a first lens of the glasses, the first lens comprising a first optical filter, wherein the first optical filter transmits light in a first plurality of bands of wavelengths and blocks light in a second plurality of bands of wavelengths separated in frequency from the first plurality;
a second lens of the glasses, the second lens comprising a second optical filter, wherein the second optical filter transmits light in a third plurality of bands of wavelengths and blocks light in a fourth plurality of bands of wavelengths,
wherein the first and third pluralities of bands of wavelengths are complementary, and the first and second lenses have a curvature in at least one of the horizontal or vertical planes such that incident light reaching the eyes of the viewer comes through the first and second lenses at a low angle of incidence, regardless of the direction of said incident light in the horizontal or vertical plane in which the lenses of the glasses are curved, and
wherein each of the first and second optical filters of the lenses of the glasses comprises multiple layers of optical material coated onto a corresponding substrate material having a thickness of between 0.005″ and 0.015″.

US Pat. No. 10,397,557

DISPLAY DEVICE WITH DIRECTIONAL CONTROL OF THE OUTPUT, AND A BACKLIGHT FOR SUCH A DISPLAY DEVICE AND A LIGHT DIRECTION METHOD

KONINKLIJKE PHILIPS N.V.,...

1. A device, comprising:a lightguide having a top face and a bottom face, and opposite first and second side edges extending between the top face and the bottom face,
wherein the top face is parallel to the bottom face, and
wherein light is output from the top face;
a light source arranged to provide the light into the lightguide at one or both of the opposite side edges; and
an array of light out-coupling structures formed at the top face or the bottom face of the lightguide to redirect the light so that the light escapes from the lightguide at the location of the light out-coupling structures,
wherein the light source is controllable to provide a selected one of at least a first light output and second light output into the lightguide,
wherein the first light output and the second light output are each at least partially collimated,
wherein an angular direction of elevation is defined with respect to a plane of the top face and across a direction from the first side edge of the lightguide to the second side edge of the lightguide,
wherein the light source is arranged to provide the first light output and the second light output into the lightguide with different angular directions of elevation than each other, and
wherein the first light output entering into the lightguide results in the light exiting the top face of the lightguide at a first range of angles, and the second light output entering the lightguide results in the light exiting the top face of the lightguide at a second range of angles, wherein the first range of angles is different than the second range of angles.

US Pat. No. 10,397,555

DYNAMIC IMAGE GENERATION SYSTEM

Fourth Wave LLC, Washing...

1. An electronic communications method, comprising:receiving, by a device, electronic information associated with a two-dimensional image;
analyzing, by the device, the electronic information,
where the analyzing the electronic information includes analyzing color information within the two-dimensional image, where the color information determines a shape dimension of the three-dimensional electronic image;
generating, by the device, a three-dimensional electronic image based on the electronic information, where analyzing a line thickness in the two-dimensional image determines a height dimension of a three-dimensional shape that is generated within the three-dimensional electronic image.

US Pat. No. 10,397,554

TIME-RESOLVING SENSOR USING SHARED PPD+SPAD PIXEL AND SPATIAL-TEMPORAL CORRELATION FOR RANGE MEASUREMENT

SAMSUNG ELECTRONICS CO., ...

1. An image sensor, comprising:a plurality of a first type of diodes that each detect one or more incident photons; and
a time-resolving sensor that outputs a first reset signal, a second reset signal, a first measurement signal and a second measurement signal, the first reset signal representing a reset-charge level of a first floating diffusion and the second reset signal representing a reset-charge level of a second floating diffusion, the first reset signal and the second reset signal being output in response to a reset condition, the first measurement signal and second measurement signal being output in response to detecting by the at least two of the first type of diodes one or more incident photons that have been reflected from an object corresponding to a light pulse projected toward the object, a first signal being formed by subtracting the first reset signal from the first measurement signal and a second signal being formed by subtracting the second reset signal from the second measurement signal, a first ratio of a magnitude of the first signal to a sum of the magnitude of the first signal and a magnitude of the second signal being proportional to a time of flight of the one or more detected incident photons, and a second ratio of the magnitude of the second signal to the sum of the magnitude of the first signal and the magnitude of the second signal being proportional to the time of flight of the one or more detected incident photons.

US Pat. No. 10,397,553

TIME-RESOLVING SENSOR USING SHARED PPD+SPAD PIXEL AND SPATIAL-TEMPORAL CORRELATION FOR RANGE MEASUREMENT

SAMSUNG ELECTRONICS CO., ...

1. A pixel in an image sensor, the pixel comprising:a plurality of a first type of diodes that each convert a received luminance into a corresponding electrical signal;
a second type of diode operable to store a charge; and
a control circuit coupled to the plurality of the first type of diodes, the control circuit initiating transfer of a first portion of the charge from the second type of diode, and terminating the transfer in response to at least two first type of diodes converting received luminance into the corresponding electrical signals within a pre-defined time interval.

US Pat. No. 10,397,552

TIME-OF-FLIGHT CAMERA SYSTEM

1. A Time-Of-Flight (TOF) camera system comprising:a plurality of sensors, the plurality of sensors including a TOF sensor and a color image sensor, wherein the TOF sensor and the color image sensor are assembled on a common substrate and are configured to image a same scene simultaneously; and
circuitry configured to drive a first sensor of the plurality of sensors with first driving parameters and to drive a second sensor of the plurality of sensors with second driving parameters different than the first driving parameters.

US Pat. No. 10,397,551

TEMPERATURE COMPENSATION FOR IMAGE ACQUISITION AND PROCESSING APPARATUS AND METHODS

Symbol Technologies, LLC,...

1. A data capture device, comprising:an imaging module including an image sensor and a lens;
a temperature sensor associated with the imaging module;
a memory storing, for a plurality of temperatures, respective imaging module response parameters corresponding to the temperatures;
an imaging controller connected to the image sensor and the temperature sensor, the imaging controller including a frame generator configured to:
receive (i) image data from the image sensor and (ii) a temperature measurement from the temperature sensor;
generate a raw image frame containing the temperature measurement and the image data; and
provide the raw image frame to a calibrator configured to generate a calibrated image frame based on the raw image frame and one of the imaging module response parameters corresponding to the temperature matching the temperature measurement;
wherein the imaging controller is further configured to:
retrieve a dimensioning error value from the parameters corresponding to the temperature measurement;
compare the dimensioning error value to a threshold; and
when the dimensioning error value exceeds the threshold, apply a restriction to a determination of object dimensions by:
imposing minimum object dimensions when the dimensioning error value exceeds a first threshold, and
aborting the determination of object dimensions when the dimensioning error value exceeds a second threshold.
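The two-threshold restriction above reduces to a small decision: impose minimum object dimensions past the first threshold, abort dimensioning past the second. A sketch with illustrative return labels (not from the patent):

```python
def apply_dimensioning_restriction(error_value, first_threshold, second_threshold):
    """Map a temperature-dependent dimensioning error onto the claimed
    restriction levels; first_threshold < second_threshold is assumed."""
    if error_value > second_threshold:
        return "abort"                       # give up on dimensioning
    if error_value > first_threshold:
        return "impose_minimum_dimensions"   # restrict, but proceed
    return "unrestricted"
```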

US Pat. No. 10,397,550

APPARATUS AND METHOD FOR THREE DIMENSIONAL SURFACE MEASUREMENT

BP Corporation North Amer...

1. A measurement system for three-dimensional measurement of a subsea structure, comprising:a laser projector and a first camera packaged for subsea operation;
wherein the laser projector is configured to emit a laser projection onto a surface of the subsea structure for laser triangulation;
wherein the first camera is configured to provide images of the surface, and is disposed at an oblique angle with respect to the laser projector;
a processor configured to:
apply photogrammetric processing to the images captured subsea;
compute calibrations for laser triangulation based on a result of the photogrammetric processing of the images captured subsea, wherein the calibrations include determination of:
orientation of a laser plane projected by the laser projector with respect to the first camera; and
opto-mechanical parameters of the first camera, the parameters comprising principal distance, fiducial center, radial distortion, decentering distortion, and distortion relating to orthogonality of pixels of an image sensor in the first camera; and
compute, based on the calibrations, coordinates of points of the surface illuminated by the laser projection via laser triangulation; and
a second camera configured to capture images of the surface, and disposed such that an optical axis of the second camera is approximately parallel to emission of the laser projector.
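The triangulation step this claim builds on, intersecting the camera ray through a laser-lit pixel with the calibrated laser plane, can be sketched directly. The pinhole ray model and the plane form n·X = d are assumptions, and the claim's full photogrammetric calibration machinery is elided:

```python
def triangulate_point(pixel_ray, plane_normal, plane_d):
    """Intersect the camera ray X(t) = t * pixel_ray with the laser plane
    n . X = d and return the 3-D surface point."""
    # Solve n . (t * ray) = d for the ray parameter t.
    denom = sum(n * r for n, r in zip(plane_normal, pixel_ray))
    t = plane_d / denom
    return tuple(t * r for r in pixel_ray)
```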

US Pat. No. 10,397,549

COMPACT ARRAY OF IMAGING DEVICES WITH SUPPLEMENTAL IMAGING UNIT

GoPro, Inc., San Mateo, ...

1. A computer-implemented method for capturing images from a camera array, comprising:capturing a set of images from a 2×2 array of cameras, each camera of the array of cameras having an overlapping field of view (FOV) with an adjacent camera of the array of cameras;
synchronously capturing a supplemental image from a fifth camera, the fifth camera having an at least partially overlapping FOV with every camera of the array of cameras;
extracting supplemental information by comparing the supplemental image with the set of four images; and
stitching portions of the set of images based in part on the supplemental information to produce a combined stitched image, the combined stitched image having a higher resolution than each image of the set of images.

US Pat. No. 10,397,548

CAMERA MODULE

TDK TAIWAN CORP., Taoyua...

1. A camera module, comprising:a first light guiding hole, wherein a first light passes through the first light guiding hole;a second light guiding hole, wherein a second light passes through the second light guiding hole along a first direction, and the first light is parallel to the second light;a first camera assembly, having a first lens unit and an anti-vibration electromagnetic driving unit for driving the first lens unit to move along a second direction relative to a first optical sensor, wherein the second direction is perpendicular to the first direction, and the first optical sensor corresponds to and faces the first light guiding hole;
a second camera assembly, having a second lens unit and an electromagnetic driving unit for driving the second lens unit to move along the second direction relative to a second optical sensor, wherein the first light guiding hole and the second light guiding hole are arranged along the second direction; and
a light guiding assembly, disposed between the first camera assembly and the second camera assembly, and comprising a light path control unit and a switching unit, wherein the switching unit drives the light path control unit to rotate around a rotation axis perpendicular to the first direction and the second direction, wherein the distance between the center of a first magnetic element of the anti-vibration electromagnetic driving unit and the center of a second magnetic element of the electromagnetic driving unit is greater than the distance between the center of the first light guiding hole and the center of the second light guiding hole.


US Pat. No. 10,397,547

STEREOSCOPIC IMAGE PICKUP UNIT, IMAGE PICKUP DEVICE, PICTURE PROCESSING METHOD, CONTROL METHOD, AND PROGRAM UTILIZING DIAPHRAGM TO FORM PAIR OF APERTURES

Sony Corporation, Tokyo ...

1. An image pickup unit comprising:a diaphragm configured to form an aperture;
an image pickup device including a first plurality of pixel circuits configured to receive object light that passes through the aperture, the image pickup device including:
picture generation pixels, respective ones of the picture generation pixels including one of the pixel circuits, and
parallax detection pixels arranged along a first direction, respective ones of the parallax detection pixels including a second plurality of the pixel circuits, the second plurality being smaller in number than the first plurality; and
a control section configured to control the diaphragm to cause a length of the aperture in the first direction to be larger than a length of the aperture in a second direction orthogonal to the first direction.

US Pat. No. 10,397,546

RANGE IMAGING

Microsoft Technology Lice...

1. Apparatus for controlling operation of a plurality of active illumination range cameras, the apparatus comprising:an optical detector configured to detect light that each of the plurality of active illumination range cameras transmits to illuminate a scene that each of the plurality of active illumination range cameras images;
a transmitter operable to transmit signals to the plurality of active illumination range cameras; and
a controller configured to:
identify an additional active illumination range camera based on a signal associated with the light transmitted by the active illumination range camera that the optical detector detects;
based at least on identifying the additional active illumination range camera, send an instruction to the plurality of active illumination range cameras to cease any range imaging in progress;
determine start times for imaging time slots during which the active illumination range cameras and the additional active illumination range camera may image a scene; and
control the transmitter to transmit signals to the active illumination range cameras and the additional active illumination range camera that provide the start times of the imaging time slots.
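The controller's scheduling step (determining start times so the cameras' active illumination does not interfere) can be sketched as back-to-back time slots with an optional guard interval. This is a minimal hypothetical sketch; the claim does not specify how start times are computed, and all names are invented.

```python
def assign_time_slots(camera_ids, slot_duration, guard_interval=0.0, start=0.0):
    """Assign each range camera a non-overlapping imaging time slot.

    Returns {camera_id: start_time}; consecutive slots are separated by
    `slot_duration` plus an optional `guard_interval` so that no two
    cameras illuminate the scene at the same time.
    """
    starts = {}
    t = start
    for cam in camera_ids:
        starts[cam] = t
        t += slot_duration + guard_interval
    return starts
```

The transmitter would then broadcast each entry of the returned mapping to the corresponding camera.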

US Pat. No. 10,397,545

3-D LIGHT FIELD CAMERA AND PHOTOGRAPHY METHOD

University of Delaware, ...

1. A method of generating an image of a scene, the method comprising:directing light representing the scene through a lens module coupled to an imaging sensor, the lens module including: a surface having a slit-shaped aperture and a cylindrical lens array positioned along an optical axis of the imaging sensor, a longitudinal direction of the slit-shaped aperture being arranged orthogonal to a cylindrical axis of the cylindrical lens array;
capturing, by the imaging sensor, the light directed through the lens module to form a three-dimensional (3D) light field (LF) image; and
processing, by a processor, the 3D LF image to form a rendered image based on pitch size, focal length and an image center of each lens in the cylindrical lens array, wherein the processing includes at least one of: (a) refocusing the 3D LF image to a predetermined focus depth, including prior to refocusing the 3D LF image, locating the image center of each lens in the cylindrical lens array, based on a reference image, (b) adjusting a perspective of the 3D LF image based on a predetermined viewpoint, or (c) generating a 3D stereoscopic view image from the 3D LF image.
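The refocusing operation in step (a) is commonly realised as shift-and-add over the lenslet views; because the claim's lens array is cylindrical, the light field is parameterised along a single axis. The sketch below is a simplified 1-D illustration under that assumption, with a hypothetical focus parameter `alpha`; it is not asserted to be the patent's exact method.

```python
import numpy as np

def refocus_1d(light_field, alpha):
    """Shift-and-add refocusing for a 1-D light field.

    `light_field` has shape (num_views, width): one row per view along
    the cylindrical-lens axis. Each view is shifted in proportion to its
    offset from the centre view (scaled by `alpha`, the focus-depth
    parameter) and the shifted views are averaged.
    """
    num_views, width = light_field.shape
    centre = (num_views - 1) / 2.0
    out = np.zeros(width)
    for v in range(num_views):
        shift = int(round(alpha * (v - centre)))
        out += np.roll(light_field[v], shift)
    return out / num_views
```

With `alpha = 0` the views are averaged in place, i.e. refocused at the plane where all views already agree.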

US Pat. No. 10,397,544

THREE-DIMENSIONAL OBJECT DETECTION DEVICE AND THREE-DIMENSIONAL OBJECT DETECTION METHOD

NISSAN MOTOR CO., LTD., ...

1. A three-dimensional object detection device comprising:a camera that captures a single image of a predetermined area;
a computer programmed to
perform viewpoint conversion processing on the image captured by the camera to create a conversion image viewed from above,
set a line segment extending in a vertical direction in an actual space as a vertical imaginary line in the conversion image, set two pixels at the same height in the actual space for each of a plurality of positions along the vertical imaginary line, and calculate a luminance difference in the conversion image between the two pixels set for each of the plurality of positions along the vertical imaginary line,
detect an edge line on the basis of continuities of the luminance differences among the plurality of positions, and detect a three-dimensional object on the basis of the edge line.
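The per-line detection logic (luminance differences between pixel pairs along a vertical imaginary line in the top view, then a continuity check) can be sketched as below. Thresholds, the pixel-pair offset, and the run-length rule are hypothetical placeholders; the claim leaves them unspecified.

```python
import numpy as np

def detect_edge_line(top_view, line_x, offset=1,
                     diff_threshold=30, min_run=5):
    """Detect an edge along one vertical imaginary line in a top view.

    For each position along the line, compare the pixel on the line with
    a pixel `offset` columns away (the claim's two pixels at the same
    height), then require a continuous run of large luminance
    differences before declaring an edge line.
    """
    diffs = np.abs(top_view[:, line_x].astype(int)
                   - top_view[:, line_x + offset].astype(int))
    run = 0
    for d in diffs:
        run = run + 1 if d >= diff_threshold else 0
        if run >= min_run:
            return True
    return False
```

A three-dimensional object would then be reported when enough such edge lines are found across the vertical imaginary lines.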

US Pat. No. 10,397,543

METHODS AND APPARATUS FOR CAPTURING, STREAMING AND/OR PLAYING BACK CONTENT

NextVR Inc., Newport Bea...

1. A method of providing stereoscopic content, the method comprising:storing, in a content delivery system,
i) first stereoscopic camera pair correction information corresponding to a first stereoscopic camera pair, said correction information corresponding to the first stereoscopic camera pair including:
first correction information for a first camera of the first stereoscopic camera pair, said first correction information being first camera dependent mesh correction information, indicating corrections to be made to node positions in a UV map when rendering an image captured by said first camera; and second correction information for a second camera of the first stereoscopic camera pair, said second correction information being second camera dependent mesh correction information, indicating corrections to be made to node positions in the UV map when rendering an image captured by said second camera;
ii) second stereoscopic camera pair correction information corresponding to a second stereoscopic camera pair, said second stereoscopic camera pair including a third camera and a fourth camera, said correction information corresponding to the second stereoscopic camera pair including third correction information for images captured by the third camera and fourth correction information for images captured by the fourth camera; and
iii) said UV map;
operating the content delivery system to provide said UV map, said first correction information corresponding to the first stereoscopic camera pair and said second correction information corresponding to the second stereoscopic camera pair to a playback device prior to detecting a network controlled switch from streaming content from the first stereoscopic camera pair to streaming content corresponding to the second stereoscopic camera pair;
detecting a network controlled switch from streaming content from the first stereoscopic camera pair to streaming content corresponding to the second stereoscopic camera pair; and
in response to detecting the network controlled switch from streaming content from said first stereoscopic camera pair to said second stereoscopic pair, operating the content delivery system to indicate to the playback device that the second stereoscopic camera pair correction information should be used for rendering operations.

US Pat. No. 10,397,542

FACILITATING QUANTIZATION AND COMPRESSION OF THREE-DIMENSIONAL GRAPHICS DATA USING SCREEN SPACE METRICS AT COMPUTING DEVICES

INTEL CORPORATION, Santa...

1. A computing device comprising:a memory to store a metric application; and
a graphics processor coupled to the memory and to execute the metric application in computing a rate of change of a geometry size of an object at a plurality of distances in screen space with respect to a location of a camera at an origin as a computed metric used to compress geometry data of the object.

US Pat. No. 10,397,541

METHOD AND APPARATUS OF LIGHT FIELD RENDERING FOR PLURALITY OF USERS

Samsung Electronics Co., ...

1. A rendering method for a plurality of users, the method comprising:mapping positions of a left eye and a right eye of a first user and positions of a left eye and a right eye of a second user to a desired viewing area of a three-dimensional (3D) display device using a sensor;
detecting an order of each of the left eye and the right eye of the first user and the left eye and the right eye of the second user relative to one another; and
determining an image value for the desired viewing area based on the detected order of the left eye and the right eye of the first user and the left eye and the right eye of the second user, and a distance between the mapped positions of the left eye and the right eye of the first user and the mapped positions of the left eye and the right eye of the second user, the determining including,
determining, for the first user, pixel resources of the image value based on the mapped positions of the left eye and the right eye of the first user so that the first user views a 3D image, and
determining, for the second user whose priority is lower than a priority of the first user, remaining pixel resources of the image value based on the detected order and the distance between the mapped positions.

US Pat. No. 10,397,540

METHOD FOR OBTAINING AND MERGING MULTI-RESOLUTION DATA

1. A method for generating an image, comprising the steps of:(i) providing information that relates to an image of a single target that was captured by at least two image capturing devices;
(ii) storing information provided in at least two input buffers;
(iii) sampling the stored information to obtain sampled information at at least two different image resolutions, and storing the sampled information at at least two output line buffers, each configured to store sampled information at a different image resolution;
(iv) processing the sampled information that had been stored at each of the at least two output line buffers in accordance with pre-defined disparity related information, wherein the pre-defined disparity related information is associated with a respective one of the at least two image capturing devices that had captured the information being currently processed;
(v) retrieving information from the at least two output line buffers and storing the retrieved information at a hybrid row buffer, for generating said image; and
(vi) repeating steps (i) to (v) in the process of generating said image.

US Pat. No. 10,397,539

COMPENSATING 3D STEREOSCOPIC IMAGERY

Schepens Eye Research Ins...

1. A three-dimensional stereoscopic imaging system, comprising:two image capture devices configured to obtain images of a scene;
at least one display screen configured to pivot about an axis; and
an electronic processor configured to:
receive one or more images of the scene from the capture devices, wherein each of the one or more images comprises a lateral dimension and a height;
warp the one or more images along their respective lateral dimensions to at least partially correct for disparity distortions associated with displaying the one or more images on the at least one display screen;
correct the one or more images by scaling the respective heights of the images along the lateral dimensions of the images so that the height of each image is the same along the image's lateral dimension;
synchronously adjust an orientation of the at least one display screen about the pivot axis based on viewing directions associated with the two image capture devices; and
display the warped and corrected one or more images on the at least one display screen.

US Pat. No. 10,397,538

METHODS AND APPARATUS FOR SUPPORTING CONTENT GENERATION, TRANSMISSION AND/OR PLAYBACK

NEXTVR Inc., Newport Bea...

1. A method of operating an image processing system, the method comprising:receiving a first image corresponding to a portion of an environment, said first image including a non-occluded image portion corresponding to a portion of the environment visible from a first location, wherein said first image is an image that was captured by a first camera of a stereoscopic camera pair;
generating a first frame including image content from said non-occluded image portion of said first image;
receiving an additional image of the environment including at least a first occluded image portion corresponding to a portion of the environment occluded from view from said first location, wherein said additional image is captured by an additional camera which is in addition to said stereoscopic camera pair;
generating an auxiliary frame including image content from said first occluded image portion of the additional image; and
storing said first frame and said auxiliary frame in a storage device or transmitting said first frame to another device.

US Pat. No. 10,397,537

VIDEO SIGNAL TRANSMISSION DEVICE, VIDEO SIGNAL RECEPTION DEVICE AND VIDEO SIGNAL TRANSFERRING SYSTEM

THINE ELECTRONICS, INC., ...

1. A video signal transmission device, comprising:a packer unit configured to capture a data enable signal and a video signal constituted by one or more pixel signals, each of which corresponds to one pixel and includes a color signal and a sync signal, and apply packetizing processing to the video signal in accordance with the data enable signal so that the video signal has a packet configuration of size corresponding to the number of pixels per video signal and the number of tone bits of the color signal to generate a plurality of block signals;
an encoder unit configured to apply encoding processing to the plurality of block signals to generate a plurality of encoded block signals; and
a serializer configured to apply parallel-serial conversion to the plurality of encoded block signals to generate a serial signal, wherein
the packer unit generates a control signal including a pulse having a pulse width corresponding to the number of pixels and the number of tone bits of the color signal, and
the encoder unit applies encoding processing of encoding efficiencies different between a first period of the control signal in which the pulse exists and a second period of the control signal distinguished from the first period depending on existence or non-existence of the pulse.

US Pat. No. 10,397,536

PIXEL PRE-PROCESSING AND ENCODING

Telefonaktiebolaget LM Er...

1. A method of pre-processing a pixel in a picture, said method comprising:obtaining an original linear luminance component value of said pixel in a third linear color space, the third linear color space comprising a linear XYZ color space, the original linear luminance component value being determined based on a linear color of said pixel in a first linear color space; and
deriving a non-linear luma component value in a second non-linear color space for said pixel based on a first non-linear chroma component value in said second non-linear color space, a second non-linear chroma component value in said second non-linear color space and said original linear luminance component value in said third linear color space;
wherein deriving said non-linear luma component value comprises deriving a non-linear luma component value in said second color space that minimizes a difference between said original linear luminance component value in said third color space and a linear luminance component value in said third color space determined based on said non-linear luma component value in said second color space, said first non-linear chroma component value in said second color space and said second non-linear chroma component value in said second color space.
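Because the reconstructed linear luminance is monotone in the luma value when the chroma components are held fixed, the minimisation recited in the claim can be solved by bisection. The sketch below illustrates that idea with deliberately simplified stand-ins: BT.709 Y'CbCr coefficients and a pure gamma-2.2 transfer function in place of a real HDR EOTF. It is an illustrative sketch, not the patented method's exact colour pipeline.

```python
def derive_luma(target_luminance, cb, cr, iterations=30):
    """Find the luma Y' whose reconstructed linear luminance best
    matches `target_luminance`, with chroma Cb/Cr held fixed.

    Bisection works because luminance increases monotonically in Y'.
    """
    def linear_luminance(y_prime):
        # Y'CbCr -> R'G'B' (BT.709 coefficients), clipped to [0, 1],
        # then a gamma-2.2 EOTF and a luminance weighting.
        r = min(max(y_prime + 1.5748 * cr, 0.0), 1.0) ** 2.2
        g = min(max(y_prime - 0.1873 * cb - 0.4681 * cr, 0.0), 1.0) ** 2.2
        b = min(max(y_prime + 1.8556 * cb, 0.0), 1.0) ** 2.2
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    lo, hi = 0.0, 1.0
    for _ in range(iterations):
        mid = (lo + hi) / 2.0
        if linear_luminance(mid) < target_luminance:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For a grey pixel (Cb = Cr = 0) the recovered luma is simply the gamma-encoded luminance.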

US Pat. No. 10,397,534

PROJECTION DISPLAY AND IMAGE CORRECTION METHOD

Sony Corporation, Tokyo ...

1. A projection display comprising:an image display device that displays an image;
a projection unit that projects the image displayed by the image display device to a projection surface;
a light irradiator that irradiates the projection surface with linear light at an incident angle shallower than an incident angle of projection light, the linear light extending along a first direction within the projection surface;
an imaging unit that has an optical axis different from an optical axis of the light irradiator, and performs capturing of the projection surface; and
a signal processor that performs signal processing on an imaging signal outputted from the imaging unit,
the imaging unit performing capturing of the linear light with which the projection surface is irradiated, and
the signal processor correcting, on a basis of a captured image of the linear light, a distortion of the projected image.

US Pat. No. 10,397,533

PROJECTION SYSTEM AND METHOD FOR ADJUSTING PROJECTION SYSTEM

SEIKO EPSON CORPORATION, ...

1. A projection system comprising:three or more projectors; and
an image processor including:
an arrangement determining section that is configured to determine an arrangement of the projectors based on characteristics of each of the projectors;
a guidance display control section that is configured to cause the projectors to project images showing an installation state of the projectors in accordance with the arrangement determined by the arrangement determining section;
a divider section that is configured to divide an image to be projected to generate divided images to be projected by the projectors;
a storage section that is configured to store correction data;
a correction section that is configured to correct the divided images generated by the divider section based on the correction data stored in the storage section; and
an output section that is configured to output the divided images corrected by the correction section to the projectors, wherein:
each of the projectors includes a projection section that projects image light to form a projection image on a projection surface;
the projectors are arranged such that the three or more projection images projected by the three or more projectors form a tiled image;
the projection image projected by one of the projectors and the projection image projected by another one of the projectors adjacent to the one of the projectors form an overlapping area where the projection images overlap with each other,
one of the overlapping areas differs from the other overlapping areas in terms of size;
the tiled image includes a first overlapping area as one of the overlapping areas;
the arrangement determining section is configured to carry out a size determination process of determining a size of the first overlapping area based on luminance provided by a first one of the projectors and a second one of the projectors that project images that form the first overlapping area; and
in the size determination process, in each of the image projected by the first one of the projectors and the image projected by the second one of the projectors, the arrangement determining section is configured to find a ratio of luminance at a boundary of the first overlapping area to peak luminance of the projection image and determine the size of the first overlapping area based on the obtained ratios.
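The size-determination step above turns two boundary-to-peak luminance ratios into an overlap width. The claim does not give the mapping, so the rule below, widening the blend region when either projector is dim at the boundary, is purely hypothetical, as are all names.

```python
def overlap_size(boundary_lum_a, peak_lum_a, boundary_lum_b, peak_lum_b,
                 base_width, min_ratio=0.5):
    """Determine an overlap width from boundary/peak luminance ratios.

    Hypothetical rule: keep `base_width` while both projectors reach at
    least `min_ratio` of peak luminance at the boundary, and scale the
    width up in proportion to the shortfall otherwise.
    """
    ratio_a = boundary_lum_a / peak_lum_a
    ratio_b = boundary_lum_b / peak_lum_b
    worst = min(ratio_a, ratio_b)
    return base_width * max(1.0, min_ratio / worst)
```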

US Pat. No. 10,397,531

PROJECTOR, DISPLAY DEVICE, AND DISPLAY METHOD

Seiko Epson Corporation, ...

1. A projector comprising:an input section adapted to input a video signal;
a storage section which stores a portal screen, the portal screen including
(i) a first region in which a service set identifier (“SSID”) of the projector and information related to connection between the projector and an external device are displayed,
(ii) a second region in which a plurality of selection items for changing the operation of the projector is displayed, and
(iii) a third region in which information representing a plurality of input sources with which the projector is compliant is displayed, the plurality of input sources including a local area network (“LAN”) source;
a processor configured to perform operations of a determination section and a selection section, wherein:
the determination section is adapted to determine whether the video signal is input by the input section and whether the portal screen needs to be displayed based on presence or absence of the video signal input by the input section at the time of startup; and
the selection section is adapted to select, in accordance with an operation by the user, one of the plurality of input sources from the plurality of input sources having been displayed in the portal screen; and
a projection section adapted to project either of a picture corresponding to the video signal and the portal screen in accordance with a determination result by the determination section,
wherein
in response to a determination by the determination section that no video signal is input when the portal screen is not currently being displayed, the determination section determines that the portal screen is to be displayed and makes the projection section display the portal screen, and
in a case in which the operation by the user has not been performed for a predetermined time, the projection section projects thumbnails obtained from video signals supplied from the plurality of input sources.

US Pat. No. 10,397,530

METHOD OF DETERMINING A TOUCH POSITION AND TOUCH PROJECTION SYSTEM USING THE METHOD

Coretronic Corporation, ...

1. A method of determining a touch position to be used for a touch projection system including at least two interactive projection apparatuses and a computer apparatus connected to the interactive projection apparatuses, wherein each of the interactive projection apparatuses comprises a projection unit and an image capture unit with a viewing angle range, the method comprising:projecting projection sub-images to a projection touch area, each projection sub-image is partially overlapped with an adjacent one of the other projection sub-images to form a blending projection image;
capturing the blending projection image within the viewing angle ranges, wherein the blending projection image has an overlap area;
providing a portion of information of a positioning pattern to each projection unit;
projecting each projection sub-image with the portion of information of the positioning pattern onto the projection touch area;
capturing each projection sub-image with the portion of information of the positioning pattern so as to form a positioning image respectively; and
confirming a coordinate range of the overlap area located within the blending projection image according to each of the positioning images.

US Pat. No. 10,397,529

TRANSPARENT OPTICAL COUPLER ACTIVE MATRIX ARRAY

Palo Alto Research Center...

1. A backplane, comprising:an array of output terminals arranged on an output surface of a backplane; and
an array of solid state optical switches, each optical switch corresponding to one of the output terminals, wherein the solid state optical switches are responsive to light of a control wavelength, wherein the control wavelength actuates the optical switches, and transparent to light of a sensing wavelength, wherein the sensing wavelength forms images, different from the control wavelength, wherein the backplane is of a material having at least an area that is transparent to light of the sensing wavelength.

US Pat. No. 10,397,528

PROVIDING STATUS INFORMATION FOR SECONDARY DEVICES WITH VIDEO FOOTAGE FROM AUDIO/VIDEO RECORDING AND COMMUNICATION DEVICES

Amazon Technologies, Inc....

1. An audio/video recording and communication device (A/V device) comprising:a camera configured to capture image data;
a communication module; and
a processing module operatively connected to the camera and the communication module, wherein the processing module is in network communication with at least one secondary device via the communication module, the processing module comprising:
a processor; and
one or more computer-readable media storing a device status check application comprising instructions that, when executed by the processor, cause the processor to perform operations including:
receiving a secondary device state request signal from a server via the communication module, wherein the server is in network communication with the A/V device;
checking a status of the at least one secondary device in network communication with the processing module;
generating a secondary device status update signal, wherein the secondary device status update signal provides the status of the at least one secondary device based on the checked status; and
transmitting the secondary device status update signal to the server using the communication module.

US Pat. No. 10,397,525

MONITORING SYSTEM AND MONITORING METHOD

PANASONIC INTELLECTUAL PR...

1. A monitoring system comprising:a camera, which, in operation, captures images of an imaging area;
a microphone array, which, in operation, collects audio from the imaging area;
a monitor, which, in operation, displays a captured image of the imaging area which is captured by the camera;
a processor; and
a memory including instructions that, when executed by the processor, cause the processor to perform operations including:
using the audio collected by the microphone array to set a masking area to be excluded from detection of a pilotless flying object which appears in the captured image of the imaging area;
detecting the pilotless flying object based on the audio collected by the microphone array from outside the masking area that has been set; and
superimposing a sound source visual image on the captured image and around the pilotless flying object detected in the captured image, the sound source visual image indicating the volume of a sound at a sound source position, and
when the pilotless flying object is detected in an area other than the masking area, displaying the captured image, the pilotless flying object detected in the captured image and the sound source visual image on the monitor.
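The masking-area logic above can be sketched as a detection pass over a per-pixel acoustic power map (such as one produced by delay-and-sum beamforming on the microphone array) with the masked region excluded. The map construction itself is out of scope here; the function and threshold are hypothetical.

```python
import numpy as np

def detect_outside_mask(sound_power_map, masking_area, threshold):
    """Locate the loudest sound source in the image plane, ignoring a
    masking area.

    `sound_power_map` is per-pixel acoustic power; `masking_area` is a
    boolean array marking pixels excluded from detection. Returns the
    (row, col) of the detected source, or None if nothing outside the
    mask reaches `threshold`.
    """
    candidate = np.where(masking_area, -np.inf, sound_power_map)
    idx = np.unravel_index(np.argmax(candidate), candidate.shape)
    if candidate[idx] >= threshold:
        return idx
    return None
```

A loud source inside the masking area is thus ignored, while a quieter source outside it can still trigger detection.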

US Pat. No. 10,397,524

THREE-DIMENSIONAL AROUND VIEW MONITORING SYSTEM OF VEHICLE AND METHOD THEREOF

UL See Inc., Tortola (VG...

1. A three-dimensional around view monitoring method of a vehicle, comprising the following steps:receiving a plurality of pieces of fisheye image data generated by photographing a correction plate, wherein the correction plate comprises a plurality of horizontal reference lines presented as straight lines and a plurality of radiate vertical reference lines presented as straight lines, and the horizontal reference lines and the radiate vertical reference lines are presented as circular arc lines in the pieces of fisheye image data;
correcting the pieces of fisheye image data comprising the horizontal reference lines presented as circular arc lines and the radiate vertical reference lines presented as circular arc lines into a plurality of pieces of corrected image data comprising the horizontal reference lines presented as straight lines and the radiate vertical reference lines presented as straight lines to generate a fisheye correction lookup table, and acquiring a coordinate position of each pixel, corresponding to each pixel in the pieces of fisheye image data, in the pieces of corrected image data according to the fisheye correction lookup table;
rotating and translating the radiate vertical reference lines presented as straight lines in the pieces of corrected image data in a neighborhood into overlaps having a same radiation angle to generate a rotation and translation lookup table and acquiring, according to the rotation and translation lookup table and the fisheye correction lookup table, a coordinate position of each pixel, corresponding to each pixel in the pieces of fisheye image data, in the pieces of corrected image data that have been rotated and translated; and
generating a stitched image lookup table according to the rotation and translation lookup table and the fisheye correction lookup table, acquiring, according to the stitched image lookup table, a coordinate position of each pixel, corresponding to each pixel in the pieces of fisheye image data, in the pieces of corrected image data, calculating a pixel value of each pixel in the pieces of corrected image data by using the pixel value of each pixel in the pieces of fisheye image data, and performing image stitching on the neighboring pieces of corrected image data to generate a piece of static stitched image data.
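All three tables in the claim (fisheye correction, rotation/translation, stitched image) are consumed the same way: each corrected-image pixel looks up its source coordinate in the fisheye data. A minimal sketch of that remapping, with a hypothetical LUT layout of shape (H, W, 2) storing (row, col) source coordinates:

```python
import numpy as np

def apply_lookup_table(fisheye, lut):
    """Remap a fisheye image through a correction lookup table.

    `lut` has shape (H, W, 2): for each output pixel it stores the
    (row, col) of the source pixel in `fisheye`. Nearest-neighbour
    sampling is used here for simplicity.
    """
    rows = lut[..., 0]
    cols = lut[..., 1]
    return fisheye[rows, cols]
```

Composing the tables (fisheye correction, then rotation/translation, then stitching) amounts to composing such lookups into the single stitched-image table the claim generates.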

US Pat. No. 10,397,523

SYSTEM AND METHOD FOR CONTROLLING AND SELECTING SOURCES IN A ROOM ON A NETWORK

Image Stream Medical, Inc...

13. The method of claim 11, further comprising analyzing data contained in a message transmitted over a network at the medical facility to determine the status.

US Pat. No. 10,397,522

IDENTIFYING POPULAR NETWORK VIDEO SEGMENTS

INTERNATIONAL BUSINESS MA...

1. A computer-implemented method, comprising:receiving video player operation information for a plurality of video players operated by users accessing a network video having a plurality of video segments;
evaluating, using a hardware processor and for each of the plurality of video segments, a popularity measure using the received video player operation information; and
creating a condensed version of the network video, wherein
the video player operation information for each of the plurality of video players is information that regards a plurality of media controls being activated during a single playback of the network video by a particular video player,
the plurality of media controls respectively adjust a process of watching the network video by the particular video player,
the condensed version of the network video consists of a subset of video segments of the network video,
the subset of video segments of the network video are based upon the evaluating,
the condensed version of the network video is smoothed by using an expansion method that includes a predetermined length adjacent each of the video segments in the subset of video segments, and
the predetermined length varies depending upon a popularity of adjacent segments.
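The condensation described above can be sketched as: rank segments by popularity, keep a subset, pad each kept segment by an expansion length, and merge pads that touch. For simplicity the sketch uses one fixed `pad` value, whereas the claim has the length vary with adjacent-segment popularity; all names are invented.

```python
def condense(segments, popularity, keep, pad):
    """Build a condensed video from its most popular segments.

    `segments` are (start, end) times, `popularity` the per-segment
    scores from the player-operation evaluation. The `keep` highest
    scoring segments are padded by `pad` on both sides and overlapping
    padded segments are merged.
    """
    ranked = sorted(range(len(segments)), key=lambda i: -popularity[i])[:keep]
    padded = sorted((max(0.0, segments[i][0] - pad), segments[i][1] + pad)
                    for i in ranked)
    merged = []
    for start, end in padded:
        if merged and start <= merged[-1][1]:
            # Pads touch or overlap: extend the previous interval.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```

Two adjacent popular segments thus collapse into a single smoothed interval of the condensed video.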

US Pat. No. 10,397,521

SECURE TELECONFERENCE MANAGEMENT

York Telecom Corporation,...

1. A method, comprising:receiving, from a clinician application, a first request, the first request comprising an indication to initiate a teleconference and an indication of a treatment room;
identifying a virtual meeting room for the teleconference;
sending, to the treatment room, an identifier of the virtual meeting room;
receiving, from the treatment room, a second request, the second request comprising an indication to join the teleconference in the virtual meeting room;
joining the treatment room to the teleconference in the virtual meeting room; and
joining the clinician application to the teleconference in the virtual meeting room.

US Pat. No. 10,397,520

INTEGRATION OF VIDEOCONFERENCING WITH INTERACTIVE ELECTRONIC WHITEBOARD APPLIANCES

Ricoh Company, Ltd., Tok...

15. A computer-implemented method comprising:a manager component executing on an interactive whiteboard appliance:
causing a collaboration menu to be displayed on a display of the interactive whiteboard appliance, wherein the collaboration menu includes one or more controls for switching between an interactive whiteboard session and a videoconferencing session that is displayed by a collaboration client executing on the interactive whiteboard appliance, controlling sharing of content between the interactive whiteboard appliance and one or more client devices that are external to the interactive whiteboard appliance, and controlling the videoconferencing session, wherein the interactive whiteboard session includes an interactive whiteboard content area for displaying content that is shared with the one or more client devices that are external to the interactive whiteboard appliance, and wherein at least a portion of the videoconferencing session is arranged within the interactive whiteboard session, such that both the content of the interactive whiteboard session and the videoconferencing session are displayed simultaneously,
detecting a user selection of a particular control from the one or more controls for switching between the interactive whiteboard session and the videoconferencing session, controlling sharing of the content with the one or more client devices, and controlling the videoconferencing session,
managing display of the interactive whiteboard session and the videoconferencing session in response to detecting the user selection of the particular control from the one or more controls for switching between the interactive whiteboard session and the videoconferencing session, controlling sharing of the content with the one or more client devices, and controlling the videoconferencing session.

US Pat. No. 10,397,519

DEFINING CONTENT OF INTEREST FOR VIDEO CONFERENCE ENDPOINTS WITH MULTIPLE PIECES OF CONTENT

Cisco Technology, Inc., ...

1. A method comprising:detecting, by a primary video conference system participating in a collaboration session with one or more secondary video conference systems, one or more participants within a field of view of one or more cameras of the primary video conference system, the primary video conference system including at least a first video conference endpoint and a second video conference endpoint, the one or more cameras being associated with the first video conference endpoint or the second video conference endpoint of the primary video conference system;
determining, by the primary video conference system, a first attention score for a first content displayed at the first video conference endpoint based on the one or more participants;
determining, by the primary video conference system, a second attention score for a second content displayed at the second video conference endpoint based on the one or more participants;
determining, by the primary video conference system, whether the first content, the second content, or both the first content and the second content are active content based on whether the first attention score exceeds a predetermined threshold value and whether the second attention score exceeds the predetermined threshold value; and
sending, by the primary video conference system, to the one or more secondary video conference systems an indication of the active content to enable the one or more secondary video conference systems to display the active content.
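The active-content decision in this claim reduces to comparing each endpoint's attention score against a single predetermined threshold. A minimal sketch (function name and score values are illustrative; the claim does not specify how attention scores are computed):

```python
def classify_active_content(first_score, second_score, threshold):
    """Return which displayed contents count as active, per the claim:
    content is active when its attention score exceeds the threshold."""
    active = []
    if first_score > threshold:
        active.append("first")
    if second_score > threshold:
        active.append("second")
    return active  # may name one, both, or neither content
```

The primary system would then send this indication to the secondary systems so they display only the active content.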

US Pat. No. 10,397,518

COMBINING ENCODED VIDEO STREAMS

Amazon Technologies, Inc....

1. A computing device, comprising:one or more cameras;
one or more displays;
memory; and
one or more processors configured to:
initiate connection to a multi-party communication session including a plurality of participants;
generate, using the one or more cameras, first video content of a first participant associated with the computing device;
encode the first video content according to a video encoding standard, including generating first stream metadata for the encoded first video content;
transmit a transmitted video stream of the encoded first video content to each of a plurality of remote devices associated with the other participants;
receive a plurality of received video streams, each received video stream originating with a corresponding one of the remote devices, each of the received video streams being independently encoded according to the video encoding standard and having individual stream metadata associated therewith;
generate combined stream metadata for generating a combined video stream, the combined stream metadata being derived from the individual stream metadata associated with each of the received video streams, wherein the combined stream metadata defines a plurality of regions of a frame of the combined video stream, and also maps a corresponding frame of one of the received video streams for each of the plurality of regions of the frame of the combined video stream, and further wherein the combined stream metadata is not configured for display in the combined video stream to be generated;
generate, using the combined stream metadata, the combined video stream conforming to the video encoding standard;
decode the combined video stream according to the video encoding standard to generate decoded video content; and
present the decoded video content on at least one of the one or more displays.
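The combined stream metadata in this claim maps each received stream's frames to a region of the combined frame. A sketch of one way to derive such a region map, assuming a simple grid layout (the grid policy, tile sizes, and dictionary shape are illustrative; the claim fixes none of them):

```python
def build_combined_metadata(stream_ids, tile_w, tile_h, columns):
    """Assign each received stream a rectangular region of the combined
    frame, laid out left-to-right, top-to-bottom on a grid."""
    regions = {}
    for i, sid in enumerate(stream_ids):
        row, col = divmod(i, columns)
        regions[sid] = {"x": col * tile_w, "y": row * tile_h,
                        "w": tile_w, "h": tile_h}
    return regions
```

Per the claim, this metadata drives generation of the combined stream but is not itself displayed.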

US Pat. No. 10,397,517

MATRIX SWITCHER

LONTIUM SEMICONDUCTOR COR...

1. A matrix switcher, comprising: M transmitting side chips, K matrix switch chips and N receiving side chips, wherein the M transmitting side chips are connected to the N receiving side chips via the K matrix switch chips, the K matrix switch chips are configured to forward at least one group of input signals transmitted from the M transmitting side chips to the N receiving side chips, each of M, K and N is an integer greater than or equal to 1, and the value of each of M and N is greater than the value of K;
each of the transmitting side chips comprises:
a signal receiving circuit, configured to receive an ultra-high-definition video signal and unpack the ultra-high-definition video signal;
a logic processor, configured to perform at least one of a Color Space Conversion (CSC) process and a Digital Stream Compression (DSC) process on the unpacked ultra-high-definition video signal to generate a compressed video signal; and
a signal transmitting circuit, configured to transmit the compressed video signal to the K matrix switch chips through four universal high-speed differential serial ports SERializer/DESerializer (SERDESs), and
each of the receiving side chips comprises:
a signal receiving circuit, configured to receive the compressed video signal outputted from the K matrix switch chips through the four SERDESs;
a logic processor, configured to perform at least one of a DSC data decompression process and a CSC process on the compressed video signal to generate a decompressed video signal; and
a signal transmitting circuit, configured to recover the decompressed video signal to generate an ultra-high-definition video signal and transmit the recovered ultra-high-definition video signal to an external device;
wherein the four SERDESs of the signal transmitting circuit of the transmitting side chip are configured to transmit four channels of data signals, in a case that the ultra-high-definition video signal is an ultra-high-definition video signal adopting a Display Port (DP) protocol; and
wherein the four SERDESs of the signal transmitting circuit of the transmitting side chip are configured to transmit three channels of data signals and one channel of a clock signal, in a case that the ultra-high-definition video signal is an ultra-high-definition video signal adopting a High Definition Multimedia Interface (HDMI) protocol.

US Pat. No. 10,397,516

SYSTEMS, METHODS, AND DEVICES FOR SYNCHRONIZATION OF VEHICLE DATA WITH RECORDED AUDIO

FORD GLOBAL TECHNOLOGIES,...

1. A method comprising:determining an engine speed based on time-series vehicle data;
determining recorded audio data comprising a vehicle noise having non-zero volume;
generating proxy sound data comprising a proxy vehicle noise synthesized based on the engine speed;
determining an offset that maximizes cross-correlation between the proxy sound data and the recorded audio data; and
shifting one or more of the time-series vehicle data or the recorded audio data relative to each other in time based on the offset to generate a synchronized set of time-series vehicle data and recorded audio data.
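The key step in this claim is finding the time offset that maximizes cross-correlation between the synthesized proxy sound and the recorded audio. A minimal brute-force sketch over a bounded lag range (sample data and the lag bound are illustrative):

```python
def best_offset(proxy, recorded, max_lag):
    """Return the lag in [-max_lag, max_lag] that maximizes the
    cross-correlation between the proxy sound and the recorded audio."""
    def xcorr(lag):
        total = 0.0
        for i, p in enumerate(proxy):
            j = i + lag
            if 0 <= j < len(recorded):
                total += p * recorded[j]
        return total
    return max(range(-max_lag, max_lag + 1), key=xcorr)
```

The returned offset would then be used to shift the time-series vehicle data or the recorded audio into alignment.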

US Pat. No. 10,397,515

PROTECTING PERSONAL PRIVACY IN A VIDEO MONITORING SYSTEM

Nokia of America Corporat...

1. A method comprising:compressing, at an image acquisition device, a media signal representative of a scene based on a sensing matrix that is determined by a sensing matrix template and a set of template parameters;
providing, from the image acquisition device, the compressed media signal to a receiver; and
selectively providing a specification of a subset of the set of template parameters to the receiver so that the media signal cannot be reconstructed from the provided compressed media signal using the provided specification of the subset of the set of template parameters.
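The privacy mechanism here rests on the sensing matrix being determined jointly by a shared template and a set of template parameters: a receiver holding only a subset of the parameters cannot regenerate the matrix and thus cannot reconstruct the signal. A sketch using a seeded pseudo-random ±1 matrix as the template (the seeding scheme and matrix entries are illustrative assumptions):

```python
import random

def sensing_matrix(template_seed, params, rows, cols):
    """Instantiate a sensing matrix from a shared template (seed) plus
    template parameters; any change to the parameters yields a
    different matrix, so withheld parameters block reconstruction."""
    rng = random.Random(f"{template_seed}|{sorted(params.items())}")
    return [[rng.choice((-1, 1)) for _ in range(cols)] for _ in range(rows)]
```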

US Pat. No. 10,397,514

PROJECTION APPARATUS, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD

Canon Kabushiki Kaisha, ...

1. An image processing apparatus for deforming a first image into a second image and generating a third image which contains the second image, the image processing apparatus comprising:a memory unit configured to store coordinate values indicating a position of each of a plurality of grid points after deformation disposed in the second image; and
a determination unit configured to determine whether a pixel of interest in the third image is located in the second image or the pixel of interest in the third image is located outside of the second image, based on the coordinate values stored in the memory unit,
wherein the determination unit further decides a set of grid points after deformation corresponding to the pixel of interest from the plurality of grid points after deformation.

US Pat. No. 10,397,513

METHOD FOR DRIVING DISPLAY INCLUDING CURVED DISPLAY AREA, DISPLAY DRIVING CIRCUIT SUPPORTING THE SAME, AND ELECTRONIC DEVICE INCLUDING THE SAME

Samsung Electronics Co., ...

1. An electronic device comprising:a display;
a processor operatively connected to the display and configured to generate display data to be output on the display;
a display driver integrated circuit configured to output the display data received from the processor,
wherein the display driver integrated circuit is configured to output a screen including:
an image displaying area having at least a partially round shape,
a blank area partially surrounding the image displaying area,
a boundary area disposed between the image displaying area and the blank area,
wherein the image is displayed in the image displaying area according to execution of an application,
a black image, to be generated by the display driver integrated circuit, displayed in the blank area, and
a borderline, to be generated by the display driver integrated circuit, displayed in the boundary area, and
wherein the display driver integrated circuit is further configured to apply a plurality of color transformation values to at least some pixels of the borderline in the boundary area based on a plurality of distances from a specified point in the image displaying area to each of the at least some pixels, in the boundary area, where the color transformation value is applied, so that a color of the borderline becomes darker from the image displaying area toward the blank area based on the plurality of color transformation values.
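The borderline treatment in this claim applies a distance-dependent color transformation so the border darkens from the image displaying area toward the blank area. A minimal sketch with a linear darkening ramp (the linear falloff and RGB representation are illustrative; the claim only requires the color to darken with distance):

```python
def borderline_shade(base_color, distance, max_distance):
    """Scale an RGB pixel toward black in proportion to its distance
    from the specified point in the image displaying area."""
    factor = max(0.0, 1.0 - distance / max_distance)
    return tuple(int(c * factor) for c in base_color)
```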

US Pat. No. 10,397,512

DISPLAY DEVICE WITH SLIMMER BORDER PORTION

FUNAI ELECTRIC CO., LTD.,...

1. A display device comprising:a display panel;
a light source disposed on a rear side relative to the display panel;
an optical member disposed on the rear side relative to the display panel;
a rear chassis made of a piece of sheet metal and integrated as a one-piece, unitary member, the rear chassis including
a light source housing component that houses the light source,
a first flat component that extends outward from an opening edge of the light source housing component, and extends substantially parallel to a rear face of the display panel, and
a second flat component that extends outward relative to the first flat component and extends substantially parallel to the rear face of the display panel, the second flat component being disposed closer to the display panel than the first flat component, the second flat component supporting the optical member;
a front face housing covering an outer peripheral part of a front face of the display panel;
a rear face housing covering a rear face of the light source housing component from a rear of the display device, the rear face of the light source housing component forming a rear face of the rear chassis, the rear face housing being made of plastic; and
a fastening member disposed through a bore of the rear face housing and fastened to the first flat component of the rear chassis to fasten the rear face housing to the first flat component of the rear chassis.

US Pat. No. 10,397,511

METHOD FOR TELEVISION REMOTE KEYPRESS RESPONSE BASED ON ANDROID OPERATING SYSTEM AND TELEVISION THEREOF

HISENSE ELECTRIC CO., LTD...

1. A method for television (TV) remote keypress response based on Android operating system (OS), comprising:creating a first process when the Android OS is being initialized; and
executing the first process to acquire a key value of a remote keypress event; and performing a remote keypress operation according to the key value;
wherein the acquiring a key value of a remote keypress event comprises:
acquiring a device list that comprises location information of at least one device node, wherein each device node is stored under a first directory of the Android OS and is configured to store a keypress event corresponding to a key device, wherein the keypress event comprises a key value;
acquiring location information of a device node from the device list;
accessing the device node according to the location information of the device node; and
acquiring the key value of the remote keypress event when the remote keypress event corresponding to a remote keypress device is stored in the device node.

US Pat. No. 10,397,510

ELECTRONIC DEVICES COMPRISING IMAGE SIGNAL PROCESSORS

Samsung Electronics Co., ...

1. An electronic device comprising:an image signal processor configured to:
receive a signal of a first code value corresponding to an active pixel included in an active area;
calculate a correction value based on a second code value associated with a first area and a third code value associated with a second area; and
calculate an output code value based on the first code value and the correction value,
wherein the first area and the second area are different from the active area, and
wherein the correction value includes a component of the third code value in proportion to a distance between the active pixel and the first area and includes a component of the second code value in proportion to a distance between the active pixel and the second area.
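Reading the claim closely, each reference code value is weighted by the pixel's distance to the *other* area (the third code's share grows with distance to the first area, and vice versa), which is ordinary linear interpolation between the two reference areas. A sketch under that reading (function names and sample values are illustrative):

```python
def correction_value(second_code, third_code, dist_to_first, dist_to_second):
    """Blend the two reference code values: the third code value is
    weighted by the distance to the first area, the second code value
    by the distance to the second area (assumes the two areas are
    distinct, so the distances never both equal zero)."""
    total = dist_to_first + dist_to_second
    return (third_code * dist_to_first + second_code * dist_to_second) / total

def output_code(first_code, correction):
    """Output code value derived from the active pixel's code value
    and the correction value; a simple sum is assumed here."""
    return first_code + correction
```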

US Pat. No. 10,397,509

SOLID-STATE IMAGING DEVICE AND ELECTRONIC APPARATUS

Sony Corporation, Tokyo ...

1. An imaging device comprising:a first unit including:
a first plurality of photoelectric conversion portions, and
a first floating diffusion coupled to the first plurality of photoelectric conversion portions;
a first set of transistors, including a first reset transistor coupled to the first floating diffusion, a first amplification transistor coupled to the first floating diffusion, and a first selection transistor coupled to the first amplification transistor;
a second unit disposed adjacent to the first unit in a row direction, the second unit including:
a second plurality of photoelectric conversion portions, and
a second floating diffusion coupled to the second plurality of photoelectric conversion portions; and
a second set of transistors, including a second reset transistor coupled to the second floating diffusion and disposed adjacent to the first amplification transistor in a row direction in a plan view perspective, a second amplification transistor coupled to the second floating diffusion, and a second selection transistor coupled to the second amplification transistor,
wherein a first set of gate terminals and a second set of gate terminals are symmetrically arranged in a first row, the first set of gate terminals includes a gate terminal of the first amplification transistor and a gate terminal of the first selection transistor, and the second set of gate terminals includes a gate terminal of the second amplification transistor and a gate terminal of the second selection transistor,
wherein the first amplification transistor and the second amplification transistor are disposed between the first selection transistor and the second selection transistor, and
wherein a gate terminal of the first reset transistor and a gate terminal of the second reset transistor are disposed in a second row.

US Pat. No. 10,397,508

IMAGE SENSOR HAVING LED FLICKER MITIGATION FUNCTION AND IMAGE PROCESSING SYSTEM INCLUDING THE IMAGE SENSOR

SAMSUNG ELECTRONICS CO., ...

1. An image sensor comprising:a pixel array including a plurality of pixels, the plurality of pixels configured to respectively generate pixel signals corresponding to photocharges; and
an analog-to-digital conversion (ADC) circuit configured to convert the pixel signals into digital pixel signals, the pixel array further including,
a floating diffusion node,
a first photoelectric conversion element operably connected to the floating diffusion node, and
a second photoelectric conversion element operably connected to the floating diffusion node via the first photoelectric conversion element, and
the ADC circuit is configured to perform ADC on a reset signal of the floating diffusion node, perform ADC on a first pixel signal based on a first photocharge accumulated in the first photoelectric conversion element, and perform ADC on a second pixel signal based on a second photocharge accumulated in the second photoelectric conversion element, wherein the pixel array is configured to send the first photocharge accumulated in the first photoelectric conversion element to the floating diffusion node after the reset signal of the floating diffusion node is generated by the pixel array, wherein the second pixel signal is based on a sum of the second photocharge and a third photocharge, the third photocharge being accumulated in the first photoelectric conversion element.

US Pat. No. 10,397,507

ELECTRIC CAMERA

Maxell, Ltd., Kyoto (JP)...

1. A camera comprising:an image sensor having an array of pixels arranged vertically and horizontally in a grid pattern;
a processor to form a plurality of image signals by using the pixels of the image sensor in a static image mode and a moving video mode, the processor configured to:
form the image signals by using a first set of the pixels of the image sensor corresponding to a predetermined first view angle during recording in the static image mode, and
form the image signals by using a second set of the pixels of the image sensor corresponding to a predetermined second view angle during recording in the moving video mode, wherein the predetermined second view angle is different from the predetermined first view angle;
an image-instability detector configured to detect an amount of image-instability of the camera and configured to change a position of the second set of the pixels according to the amount of image-instability detected by the image-instability detector, in order to correct the image-instability; and
a display configured to:
display a still image corresponding to the image signals formed based on the first set of the pixels, and
display a moving image corresponding to the image signals formed based on the second set of the pixels.

US Pat. No. 10,397,506

IMAGING DEVICE AND ELECTRONIC APPARATUS WITH UPPER AND LOWER SUBSTRATES

Sony Corporation, Tokyo ...

1. An imaging device comprising:an upper substrate connected to a power supply that provides a power-supply voltage for the upper substrate, wherein the power-supply voltage for the upper substrate is higher than a power-supply voltage of a power supply of a circuit of a later stage, the upper substrate including:
a pixel that receives incident light and that generates a signal in response to the received incident light, the pixel including a photodiode, a transfer transistor and a reset transistor; and
an amplification transistor, wherein a gate of the amplification transistor is configured to receive the signal generated by the pixel, wherein a source of the amplification transistor is configured to receive a reference voltage, and wherein a drain of the amplification transistor is connected to a buffer; and
a lower substrate, including:
a latch circuit, wherein the latch circuit stores a signal based on an output of the amplification transistor.

US Pat. No. 10,397,505

AUTOMATED NON-CONFORMING PIXEL MASKING

KROMEK GROUP, PLC, Sedge...

1. A method, comprising:receiving a plurality of communication events associated with a pixel of an imaging device;
identifying a frequency associated with the communication events, wherein the identifying a frequency comprises determining a number of communication events occurring within a predetermined time interval or determining a mean time interval between the communication events;
determining, from a plurality of pixels neighboring the pixel, a frequency range comprising an upper frequency limit and a lower frequency limit;
resolving, from the identified frequency and the determined frequency range, whether the pixel comprises a non-conforming pixel, wherein the resolving comprises updating a score associated with the pixel, and wherein the updating a score comprises increasing the score if the identified frequency is greater than the upper frequency limit and wherein the updating a score comprises decreasing the score if the identified frequency is less than the lower frequency limit; and
masking, if the pixel comprises a non-conforming pixel, subsequent communication events from the non-conforming pixel.
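The scoring rule in this claim is explicit: increase the pixel's score when its event frequency exceeds the neighborhood-derived upper limit, decrease it when below the lower limit. A sketch of that update, with a masking threshold added as an illustrative assumption (the claim does not say how the score resolves into a masking decision):

```python
def update_score(score, freq, lower, upper, step=1):
    """Adjust a pixel's score against its neighbors' frequency range:
    up when too frequent, down when too infrequent, else unchanged."""
    if freq > upper:
        return score + step
    if freq < lower:
        return score - step
    return score

def should_mask(score, mask_threshold):
    """Hypothetical resolution rule: mask once the score's magnitude
    reaches a threshold, i.e. the pixel is persistently out of range."""
    return abs(score) >= mask_threshold
```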

US Pat. No. 10,397,504

CORRECTING LAG IN IMAGING DEVICES

Kidde Technologies, Inc.,...

1. A method of correcting image lag in an imaging pixel, comprising:converting electromagnetic radiation in an infrared waveband into a first pixel value;
receiving the first pixel value from a current frame;
converting electromagnetic radiation in the infrared waveband into at least one second pixel value;
receiving the at least one second pixel value from at least one of a prior frame and a subsequent frame;
determining a first filter coefficient for the first pixel value;
determining a second filter coefficient for the at least one second pixel value;
calculating a pixel output by adding (a) a product of the first pixel value and the first filter coefficient to (b) a product of the at least one second pixel value and the corresponding second filter coefficient to compensate for image lag in the first pixel value; and
displaying an image on a display device using the calculated pixel output.
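The lag compensation in this claim is a weighted sum: the current-frame pixel value times its filter coefficient, plus each prior/subsequent-frame value times its coefficient. A direct sketch (the coefficient values in the test are illustrative; the claim does not say how coefficients are determined):

```python
def corrected_pixel(current, others, w_current, w_others):
    """Compensate image lag by summing (a) the current-frame value
    times its coefficient and (b) each prior/subsequent-frame value
    times its corresponding coefficient."""
    out = current * w_current
    for value, weight in zip(others, w_others):
        out += value * weight
    return out
```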

US Pat. No. 10,397,503

IMAGE SENSOR WITH HIGH DYNAMIC RANGE

STMicroelectronics (Croll...

1. An image sensor pixel circuit, comprising:a photodiode configured to produce photogenerated charges in response to exposure to light for integration at a charge collection node;
a transfer gate transistor circuit coupled to the charge collection node and configured to pass a first portion of the integrated photogenerated charges to a sense node;
an overflow transistor coupled to the charge collection node and configured to pass a second portion of the integrated photogenerated charges to an overflow sense node; and
read circuitry coupled to the sense node and overflow sense node and configured to read out a first signal representing the first portion from the sense node and read out a second signal representing the second portion from the overflow sense node.

US Pat. No. 10,397,502

METHOD AND APPARATUS FOR IMAGING AN OBJECT

RICOH IMAGING COMPANY, LT...

1. An apparatus for imaging an object, comprising:an image sensor comprising an effective pixel area having a plurality of pixels;
a charge-reading processor that reads out accumulated charges in a given pixel of the effective pixel area via a pixel circuit of the charge accumulated pixel; and
a noise-reading processor that reads out noise signals from a pixel circuit corresponding to a pixel that is within the effective pixel area and is an object of noise acquisition, in parallel with the reading of accumulated charges by said charge-reading processor,
wherein the noise signals represent noise from the same frame as the accumulated charges.

US Pat. No. 10,397,501

SOLID-STATE IMAGE SENSOR AND IMAGING APPARATUS

Ricoh Company, Ltd., Tok...

1. A solid-state image sensor comprising:a pixel array including a plurality of pixel sub-arrays arranged in a main scanning direction, each of the pixel sub-arrays having a plurality of pixels that are two-dimensionally arranged to form a plurality of rows along the main scanning direction and a plurality of columns along a sub-scanning direction, each of the plurality of pixels to generate a pixel signal according to light being input;
a plurality of control lines connected with respective ones of the plurality of pixel sub-arrays such that one of the plurality of control lines is connected with all pixels of at least one of the plurality of rows in each of the plurality of pixel sub-arrays;
a plurality of signal lines individually connected with all pixels in each of the plurality of pixel sub-arrays;
a pixel control circuit to apply a control signal to each pixel of each of the plurality of pixel sub-arrays through each of the plurality of signal lines, so as to cause each pixel to generate a pixel signal having a phase difference between the plurality of pixel sub-arrays; and
a readout circuit to read the pixel signal from each pixel of each of the plurality of pixel sub-arrays such that the pixel signal has a phase difference between the plurality of pixel sub-arrays.

US Pat. No. 10,397,499

IMAGING DEVICE, IMAGING SYSTEM, AND METHOD FOR DRIVING IMAGING DEVICE FOR CAPTURING AN IMAGE WITH A RESTRAINED INFLUENCE OF A BLINKING STATE OF A LIGHT SOURCE

CANON KABUSHIKI KAISHA, ...

1. An imaging device comprising:a plurality of pixels arranged in rows and columns, each of the plurality of pixels including a photoelectric conversion unit configured to accumulate electric charges generated by incident light, a holding unit configured to hold the electric charges, an amplifying unit configured to output a signal based on the electric charges, a first transfer switch configured to transfer the electric charges from the photoelectric conversion unit to the holding unit, and a second transfer switch configured to transfer the electric charges from the holding unit to the amplifying unit;
output lines configured to output signals from the amplifying unit of the plurality of pixels; and
a control unit configured to execute the following operations:
a first transfer operation of transferring charges accumulated in a first charge accumulation period from the photoelectric conversion unit to the holding unit by simultaneously turning on the first transfer switches of a first pixel row and a second pixel row different from the first pixel row while maintaining the second transfer switches of the first pixel row and the second pixel row in the OFF state,
a first reading out operation of transferring the charges accumulated in the first charge accumulation period from the holding unit to the amplifying unit by turning on the second transfer switch of the first pixel row, after the first transfer operation,
a second transfer operation of transferring charges accumulated in a second charge accumulation period from the photoelectric conversion unit to the holding unit by simultaneously turning on the first transfer switches of the first pixel row and the second pixel row while maintaining the second transfer switches of the first pixel row and the second pixel row in the OFF state, after the first reading out operation, and
a second reading out operation of transferring the charges accumulated in the second charge accumulation period from the holding unit to the amplifying unit by turning on the second transfer switch of the second pixel row, after the second transfer operation,
wherein the second transfer switch of the second pixel row is maintained in the OFF state from the first reading out operation to the second reading out operation.

US Pat. No. 10,397,498

COMPRESSIVE SENSING CAPTURING DEVICE AND METHOD

SONY CORPORATION, Tokyo ...

1. A compressive sensing capturing device, comprising circuitry configured to:obtain compressive sensing image data; and
set a device attribute based on image attribute data, wherein the image attribute data are based on a machine learning algorithm performed in the compressing domain on the obtained compressive sensing image data.

US Pat. No. 10,397,497

SOLAR INVARIANT IMAGING SYSTEM FOR OBJECT DETECTION

Apple Inc., Cupertino, C...

1. An object detection system, comprising:an imaging system that:
generates a first image based on incident light captured in a visible spectrum; and
generates a second image based on incident light captured in an infrared spectrum; and
a control system that:
receives a disparity indication associated with object detection, wherein the disparity indication includes information that an object is not detected within the first image and that the object is detected within the second image; and
sends a command to one or more vehicle systems to implement a disparity response based on the disparity indication.
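The disparity indication in this claim carries one specific fact: an object detected in the infrared image but not in the visible image. A minimal sketch treating each image's detections as a set of labels (the label representation is an illustrative assumption):

```python
def disparity_indication(visible_detections, infrared_detections):
    """Return objects seen in the infrared-spectrum image that were
    missed in the visible-spectrum image; a non-empty result would
    trigger a disparity response in the vehicle systems."""
    return sorted(set(infrared_detections) - set(visible_detections))
```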

US Pat. No. 10,397,496

IMAGING DEVICE PROVIDED WITH LIGHT SOURCE, IMAGE SENSOR INCLUDING FIRST ACCUMULATOR AND SECOND ACCUMULATOR, AND CONTROLLER

PANASONIC INTELLECTUAL PR...

1. An imaging device for capturing an image of a target object, comprising:a first light source that, in operation, emits pulsed light that is radiated onto the target object;
an image sensor that includes pixels, each of the pixels including
a photoelectric converter that, in operation, converts incident light into signal charge,
a first accumulator that, in operation, accumulates the signal charge,
a second accumulator that, in operation, accumulates the signal charge, and
a discharger that, in operation, discharges the signal charge; and
a controller that, in operation, controls the first light source and the image sensor,
wherein the controller, in a first frame period that is a period in which the image of one frame is captured,
causes the first light source to emit the pulsed light,
in a first period that includes a period in which, from within the pulsed light, a surface reflected component reflected by a surface of the target object is incident on the image sensor,
causes the first accumulator and the second accumulator to accumulate, from within the signal charge, a portion that has not been discharged to the discharger, by setting a state of the image sensor to a state in which the signal charge is discharged to the discharger,
in a second period in which, from within the pulsed light, an internally scattered component that has scattered inside the target object is incident on the image sensor,
causes the first accumulator to accumulate the signal charge by setting the state of the image sensor to a state in which the signal charge is not discharged to the discharger and the signal charge is accumulated in the first accumulator, and,
after the first period and the second period, causes the image sensor to generate a first signal that is based on the signal charge accumulated in the first accumulator, and a second signal that is based on the signal charge accumulated in the second accumulator.

US Pat. No. 10,397,495

SELF-CONTAINED MOBILE SENSOR CALIBRATION STRUCTURE

Amazon Technologies, Inc....

1. A mobile calibration system for unmanned aerial vehicles (UAVs), the system comprising:a trailer to enable the mobile calibration system to be moved using a vehicle; and
a collapsible calibration room comprising:
a floor;
three or more walls configured to be coupled to the floor and moveable between a stowed position and a deployed position;
a ceiling configured to be coupled to at least one of the three or more walls and moveable between the stowed position and the deployed position; and
one or more targets disposed on one or more of the three or more walls or the floor;
one or more light sources disposed on the three or more walls or the ceiling to provide illumination inside the collapsible calibration room; and
a turntable supported by the floor to rotate a UAV about a first axis of rotation to enable calibration of one or more cameras on the UAV.

US Pat. No. 10,397,494

SEAMLESS SETUP AND CONTROL FOR HOME ENTERTAINMENT DEVICES AND CONTENT

Caavo Inc, Milpitas, CA ...

1. A method performed by a switching device comprising a plurality of audio/video (AV) ports and a switch circuit that is operable to selectively connect any one of a plurality of source devices, each of which is connected to a corresponding one of the plurality of AV ports, to a sink device that is connected to another one of the plurality of AV ports, the method comprising:detecting, by a control signal detector of the switching device, which is operable to sniff wireless control signals that have been sent from different remote control devices to different source devices, that a wireless control signal has been sent from a remote control device to a source device of the plurality of source devices so that the remote control device can wirelessly control the source device;
determining an identifier that identifies the source device to which the control signal was sent;
identifying a first AV port from among the plurality of AV ports to which the identified source device is connected using a data structure that comprises a device-to-port mapping that identifies the first AV port to which the identified source device is connected based on the determined identifier; and
automatically connecting the first AV port to the AV port to which the sink device is connected so that content can be provided from the identified source device to the sink device.
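The device-to-port mapping this claim describes is, at its core, a lookup table keyed by the identifier of the device the sniffed control signal was sent to. A minimal sketch, assuming nothing beyond the claim (class, method, and port names are illustrative, not from the patent):

```python
class SwitchingDevice:
    """Illustrative model of the claimed device-to-port mapping."""

    def __init__(self, sink_port="OUT"):
        self.device_to_port = {}   # device identifier -> AV input port
        self.sink_port = sink_port # port the sink (e.g. a TV) is connected to
        self.active_port = None

    def register(self, device_id, port):
        """Record which AV port a source device is connected to."""
        self.device_to_port[device_id] = port

    def on_control_signal(self, device_id):
        """A sniffed remote-control signal identifies its target device;
        look up that device's AV port and route it to the sink."""
        port = self.device_to_port.get(device_id)
        if port is not None:
            self.active_port = port  # connect this input port to the sink port
        return self.active_port


sw = SwitchingDevice()
sw.register("bluray", "HDMI1")
sw.register("console", "HDMI2")
print(sw.on_control_signal("console"))  # HDMI2
```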

US Pat. No. 10,397,493

DUAL LENS SYSTEM HAVING A LIGHT SPLITTER

SZ DJI TECHNOLOGY CO., LT...

1. A system for capturing images, said system comprising:an optical element configured to separate light into a first light beam and a second light beam;
a first lens module configured to focus the first light beam;
a second lens module configured to focus the second light beam;
a first sensor having a first sensor size and configured to capture a first image from the first light beam focused by the first lens module onto the first sensor;
a second sensor having a second sensor size and configured to capture a second image from the second light beam focused by the second lens module onto the second sensor, wherein the second sensor size is different from the first sensor size; and
one or more processors configured to:
modify the first image or the second image based on the first sensor size and the second sensor size to generate a modified image; and
generate a combined image based on the modified image, wherein the first sensor size is a first pixel size and the second sensor size is a second pixel size, and wherein modifying the first image comprises scaling the first image by

 and modifying the second image comprises scaling the second image by
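The two scale factors in this claim were rendered as images and did not survive into this text, so they are left blank above. A plausible reading is that each image is scaled by a ratio of the two pixel sizes so that both sensors' outputs land on a common spatial grid; the sketch below (pure Python, 1-D pixel rows for brevity) uses that assumed ratio:

```python
def resample(row, factor):
    """Nearest-neighbour resample of a 1-D pixel row by `factor`."""
    n = max(1, round(len(row) * factor))
    return [row[min(int(i / factor), len(row) - 1)] for i in range(n)]


def match_scales(row1, row2, pixel_size1, pixel_size2):
    """Bring two sensor readouts to a common scale before combining.

    A pixel of size p spans p units of the scene, so scaling image 1 by
    pixel_size1 / pixel_size2 expresses it on image 2's grid.  The exact
    factors claimed did not survive extraction; this ratio is an assumption.
    """
    return resample(row1, pixel_size1 / pixel_size2), row2


# Sensor 1 has 2x larger pixels, so its 4-pixel row maps onto 8 pixels
# of sensor 2's grid before the combined image is formed.
r1, r2 = match_scales([1, 2, 3, 4], [5, 6, 7, 8, 9, 10, 11, 12], 2.0, 1.0)
print(len(r1), len(r2))  # 8 8
```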

US Pat. No. 10,397,492

IMAGING DEVICE

PANASONIC INTELLECTUAL PR...

1. An imaging device comprising:a camera body;
a first imaging unit that is fixedly disposed on the camera body and generates a first image data by capturing a first subject image;
a display unit that is rotatable with respect to the camera body via a hinge unit and includes a display face;
a second imaging unit that (i) is disposed at one side end of the display unit facing an other side end of the display unit with the hinge unit, across the display face, and (ii) generates a second image data by capturing a second subject image; and
a controller that causes the display unit to display a superimposed image obtained by superimposing an image represented by the second image data on an image represented by the first image data,
wherein the image represented by the second image data is superimposed and displayed so as to be always positioned closer to a side of the second imaging unit than to a side of the hinge unit, regardless of the degree of user's interest of the image represented by the first image data, on the image represented by the first image data, when the first imaging unit captures the first subject image for the first image data and the second imaging unit captures the second subject image for the second image data,
wherein the second imaging unit is disposed at the one side end of the display unit via a rotatable shaft mechanism, a photographing direction of the second imaging unit being changeable between a plurality of photographing directions by manually rotating the second imaging unit around a rotation axis of the rotatable shaft mechanism,
a shape of the display unit including the second imaging unit in the state in which the display unit is opened 90 degrees with respect to the camera body, is substantially rectangular in shape from the perspective of a photographer, and
a rotation axis of the hinge unit and the rotation axis of the rotatable shaft mechanism are in the same direction.

US Pat. No. 10,397,491

PHOTOGRAMMETRY SYSTEM AND PHOTOGRAMMETRY METHOD

NUCTECH COMPANY LIMITED, ...

1. A photogrammetry method, comprising the following steps:(a) in a direction perpendicular to a direction of movement of an object to be measured with respect to photographing devices, photographing, by the photographing devices, the object to be measured two times at a predetermined time interval to obtain two images, respectively; and
(b) according to a length Lp of the object to be measured or at least one portion of the object to be measured in the two images obtained by the photographing devices, a transverse movement distance Dp of the object to be measured or the at least one portion of the object to be measured in the two images, and a speed of the object to be measured, calculating an actual length L of the object to be measured or the at least one portion of the object to be measured;
the object to be measured moves at a speed V, and, in step (a), the photographing devices photograph the object to be measured two times at a time interval t, or the object to be measured is stationary, and, in step (a), the photographing devices move at a speed V in a direction perpendicular to a photographing direction and photograph the object to be measured two times at a time interval t; and
in step (b), the transverse movement distance Dp of the object to be measured or the at least one portion of the object to be measured in the two images obtained by the two times of photographing and the length Lp of the object to be measured or the at least one portion of the object to be measured in the images are obtained, and then the actual length L of the object to be measured or the at least one portion of the object to be measured is obtained by the following formula:

wherein Dp is a non-zero real number.
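The formula itself was rendered as an image and did not survive into this text. From the quantities the claim defines, a consistent reading is L = Lp·V·t/Dp: the actual displacement V·t between the two shots and its image-plane counterpart Dp fix the pixels-to-world scale. A sketch under that assumption:

```python
def actual_length(lp, dp, v, t):
    """Recover the actual length from two images taken a time t apart.

    lp: length of the object (or portion) in the image [pixels]
    dp: transverse movement of the object between the two images [pixels]
    v:  relative speed between object and photographing devices
    t:  time interval between the two shots

    Assumed form L = lp * v * t / dp: the real displacement v*t and its
    image-plane counterpart dp give the pixels-to-world scale factor.
    """
    if dp == 0:
        raise ValueError("Dp must be a non-zero real number")
    return lp * v * t / dp


# Object moving at 2 m/s, shots 0.5 s apart: it moved 1 m in the world
# and 100 px in the image, so a 300 px object measures 3 m.
print(actual_length(lp=300, dp=100, v=2.0, t=0.5))  # 3.0
```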

US Pat. No. 10,397,490

CAMERA ILLUMINATION

GOOGLE LLC, Mountain Vie...

1. A camera, comprising:a camera lens configured to capture visual data of a field of view;
a plurality of light sources configured to illuminate the field of view; and
a light source driver coupled to the plurality of light sources and configured to drive the plurality of light sources with a first drive voltage at a first mode and with a second drive voltage at a second mode, the second drive voltage being lower than the first drive voltage;
wherein:
the plurality of light sources is configurable to a plurality of light source subsets;
at least two of the plurality of light source subsets include distinct light source members and are configured to illuminate different regions of the field of view of the camera;
in the first mode, the plurality of light sources are electrically coupled to each other and driven by the first drive voltage; and
in the second mode, one of the plurality of light source subsets is selected and driven by the second drive voltage.

US Pat. No. 10,397,489

LIGHT SOURCE CONTROL DEVICE, METHOD OF CONTROLLING LIGHT SOURCE, AND IMAGE CAPTURE SYSTEM

SONY CORPORATION, Tokyo ...

1. A light source control device comprising processing circuitry configured to:cause a plurality of narrowband light sources including at least a red light source, a green light source, and a blue light source to emit light on a time division basis; and
set an output of each of the narrowband light sources on the basis of a ratio of a number of luminance values that are smaller than or equal to a current control luminance value to a total number of the luminance values in a luminance value histogram as equal to a predetermined ratio corresponding to the respective narrowband light source, the luminance value histogram generated on the basis of image information that is frame-sequentially detected by a monochrome single-plate image sensor configured to capture a reflected image of a subject illuminated with light emitted from the respective narrowband light source.
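The control criterion here is a percentile condition on a luminance histogram: the fraction of pixels at or below the current control luminance should match a predetermined ratio for that light source. A minimal sketch with a simple proportional update (the control law itself is an assumption; the claim only fixes the criterion):

```python
def cumulative_ratio(hist, control_value):
    """Fraction of pixels whose luminance is <= control_value.

    hist[i] = number of pixels with luminance value i.
    """
    total = sum(hist)
    return sum(hist[: control_value + 1]) / total


def adjust_output(output, hist, control_value, target_ratio, gain=0.5):
    """Nudge one narrowband source's output so the cumulative ratio at
    the control luminance approaches the predetermined target ratio.
    (Proportional update chosen for illustration only.)
    """
    ratio = cumulative_ratio(hist, control_value)
    # Too many dark pixels (ratio above target) -> raise output, and vice versa.
    return output * (1.0 + gain * (ratio - target_ratio))


# Toy 8-bin histogram: 60 of 100 pixels at or below luminance bin 3.
hist = [10, 20, 10, 20, 15, 10, 10, 5]
print(round(cumulative_ratio(hist, 3), 2))  # 0.6
```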

US Pat. No. 10,397,488

SMART SHUTTER IN LOW LIGHT

GoPro, Inc., San Mateo, ...

1. A method for controlling a digital camera, comprising:determining if motion meeting a predefined motion criteria is present in image frames captured by the digital camera;
responsive to determining that the motion meeting the predefined motion criteria is not present in the image frames, controlling the digital camera to operate with a default shutter speed and a default digital gain; and
responsive to determining that the motion meeting the predefined motion criteria is present in the image frames, controlling the digital camera to operate with an adjusted shutter speed and an adjusted digital gain, the adjusted shutter speed and the adjusted digital gain resulting in an exposure value that corresponds to the default shutter speed and the default digital gain.
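The core constraint of this claim is that the adjusted pair must land on the same exposure value as the default pair. Under a linear exposure model (an assumption; exposure ~ shutter time x gain), shortening the shutter by some factor means raising the digital gain by the same factor:

```python
def adjusted_gain(default_shutter_s, default_gain, adjusted_shutter_s):
    """Digital gain that keeps the exposure value unchanged after a
    shutter-speed adjustment, assuming exposure ~ shutter_time * gain.
    """
    return default_gain * (default_shutter_s / adjusted_shutter_s)


# Motion detected: halve the 1/30 s shutter to 1/60 s to reduce blur,
# and double the gain so overall exposure is unchanged.
print(adjusted_gain(1 / 30, 1.0, 1 / 60))  # 2.0
```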

US Pat. No. 10,397,487

SIGNAL PROCESSING APPARATUS, SIGNAL PROCESSING METHOD, AND IMAGE CAPTURING APPARATUS

Canon Kabushiki Kaisha, ...

1. A signal processing apparatus that processes a video signal, comprising:a signal conversion unit configured to convert a first video signal quantized according to a first characteristic to represent a video image with a relative luminance, into a second video signal quantized according to a second characteristic to represent the video image with an absolute luminance in an output of a display device based on a predetermined conversion correspondence relationship; and
a signal output unit configured to output information representing the conversion correspondence relationship in association with at least one of the first video signal and the second video signal.

US Pat. No. 10,397,486

IMAGE CAPTURE APPARATUS AND METHOD EXECUTED BY IMAGE CAPTURE APPARATUS

Canon Kabushiki Kaisha, ...

1. An image processing apparatus, comprising:a processor; and
a memory storing a program, wherein when the program is executed by the processor, the image processing apparatus is configured to function as:
an image capture unit that performs predetermined image capture process using a physical auxiliary light source and a virtual light source, wherein the image capture unit performs the predetermined image capture process using the physical auxiliary light source and the virtual light source by:
obtaining a first image by capturing a subject with the physical auxiliary light source emitted;
generating a second image by applying image processing that reduces brightness to the first image; and
applying correction process that adds an effect of a virtual light of the virtual light source to the second image.

US Pat. No. 10,397,485

MONITORING CAMERA DIRECTION CONTROL

Axis AB, Lund (SE)

1. A method performed by a camera controller, said camera controller being configured to adjust, from a current pan and tilt position, at least pan and tilt of a camera that is monitoring a scene in a monitoring direction, the method comprising:detecting an adjustment signal from a user input device, said adjustment signal representing an angular adjustment of the monitoring direction received by the user input device,
obtaining a representative movement vector that is representative of movement of objects in the scene,
calculating, based on the representative movement vector and based on the angular adjustment of the monitoring direction received by the user input device, a corrected adjustment signal, and
adjusting, using the corrected adjustment signal, at least pan and tilt of the camera.
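One way to read the correction step: when the scene's objects are drifting, the user's angular adjustment should be biased by the representative movement vector so the camera lands where the objects will be. The additive model and latency term below are assumptions; the claim requires only that the corrected signal be based on the movement vector and the user's angular adjustment:

```python
def corrected_adjustment(user_pan, user_tilt, movement, latency_s):
    """Combine the user's angular adjustment (degrees) with the scene's
    representative movement vector (degrees/second), compensating for
    motion that occurs during the command latency.
    """
    mv_pan, mv_tilt = movement
    return (user_pan + mv_pan * latency_s, user_tilt + mv_tilt * latency_s)


# Objects drift 4 deg/s to the right; with 0.5 s latency, a 10-degree
# pan request is corrected to 12 degrees.
print(corrected_adjustment(10.0, 0.0, (4.0, 0.0), 0.5))  # (12.0, 0.0)
```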

US Pat. No. 10,397,484

CAMERA ZOOM BASED ON SENSOR DATA

QUALCOMM Incorporated, S...

1. A method of operating a camera, comprising:associating a camera zoom with at least one sensor;
obtaining a reference location and a reference angle of the camera, the reference location and the reference angle of the camera being determined at a previous time for determining a previous zoom amount;
detecting, with the at least one sensor, a change in a location of the camera and an angle of rotation of the camera relative to the reference location and the reference angle of the camera, the change in the location and the angle being due to a movement of the camera relative to the reference location and the reference angle of the camera, wherein the location is based on a geographical location of the camera; and
determining a current zoom amount, wherein determining the current zoom amount includes adjusting the camera zoom based on the change in the location and the angle of rotation of the camera and a change in distance to an object, the change in distance to the object being determined using the movement of the camera detected by the sensor and an image of the object captured by the camera.

US Pat. No. 10,397,483

IMAGE PROCESSING DEVICE, IMAGE PROCESSING SYSTEM AND NON-TRANSITORY COMPUTER READABLE MEDIUM STORING PROGRAM

FUJI XEROX CO., LTD., Mi...

1. An image processing device comprising:a computer-readable memory having stored therein program instructions; and
a processor configured to execute the program instructions, that when executed implement:
an acceptance unit that accepts an image information pair composed of image information before color conversion and image information after color conversion;
an accuracy output unit that outputs accuracy of a color conversion property from a plurality of the image information pairs accepted by the acceptance unit;
a color conversion property creation unit that creates the color conversion property from a plurality of the image information pairs accepted by the acceptance unit; and
a display control unit that, when the acceptance unit accepts a new image information pair, controls to display, on a display device, at least image information created by color conversion of image information of the new image information pair before color conversion based on the color conversion property created by the color conversion property creation unit from the image information pair that has already been accepted by the acceptance unit and image information of the new image information pair after color conversion.

US Pat. No. 10,397,482

IMAGING CONTROL APPARATUS AND METHOD FOR CONTROLLING THE SAME

Canon Kabushiki Kaisha, ...

1. An imaging control apparatus comprising:at least one memory and at least one processor which function as:
a display control unit configured to perform control to present a 2-area enlargement display of displaying live view images captured at two imaging regions in an imaging unit that are separately arranged in a width direction or a height direction on a display unit;
a setting unit configured to set an autofocus method;
a control unit configured to perform control to conduct, in response to an autofocus instruction operation on a first operation unit, an autofocus operation inside of a range displayed in the 2-area enlargement display while maintaining the 2-area enlargement display in a state where the 2-area enlargement display is presented, and
to end, in response to an autofocus instruction operation on a second operation unit, the 2-area enlargement display and conduct the autofocus operation in a range independent of the inside of the range displayed in the 2-area enlargement display in a state where the 2-area enlargement display is presented,
wherein the control unit performs control to end the 2-area enlargement display and conduct the autofocus operation in the range independent of the inside of the range displayed in the 2-area enlargement display in response to the autofocus instruction operation on the second operation unit in a state where the 2-area enlargement display is presented when a first autofocus method is set by the setting unit, and
wherein the control unit performs control to conduct the autofocus inside the range displayed in the 2-area enlargement display while maintaining the 2-area enlargement display in response to the autofocus instruction operation on the second operation unit when a second autofocus method is set by the setting unit.

US Pat. No. 10,397,481

STABILIZATION AND ROLLING SHUTTER CORRECTION FOR OMNIDIRECTIONAL IMAGE CONTENT

QUALCOMM Incorporated, S...

1. A method of generating image content, the method comprising:receiving image content for a first set of bands of a first overlapping portion of a first image, and a second set of bands of a second overlapping portion of a second image, wherein the first overlapping portion and the second overlapping portion include overlapping image content, and wherein the first image includes a first non-overlapping portion;
adjusting image content within the first set of bands until the image content within the first set of bands overlaps with image content within the second set of bands to generate a set of overlapping bands;
receiving information indicative of deviation from a common reference;
determining coordinates for where image content within the set of overlapping bands and the first non-overlapping portion is to be mapped on a rectangular mesh based on the deviation from the common reference to compensate for at least one of device movement or rolling shutter delay during capture of the image content; and
generating an equirectangular image based at least in part on the determined coordinates, the set of overlapping bands, and the first non-overlapping portion.

US Pat. No. 10,397,480

IMAGE STABILIZATION APPARATUS, AND CONTROL METHOD AND STORAGE MEDIUM THEREOF

Canon Kabushiki Kaisha, ...

1. An image stabilization apparatus, comprising:at least one processor and/or circuitry;
a first calculation unit configured to calculate an angular velocity of movement of an image capturing apparatus;
a second calculation unit configured to calculate a moving amount of an object from a plurality of images captured by an image capturing device of the image capturing apparatus;
a third calculation unit configured to calculate an angular velocity of movement of the object based on output of the first and second calculation units;
a determination unit configured to determine whether or not the object is walking; and
a selection unit configured to select, based on a determination result of the determination unit, whether to perform control such that an image stabilization device corrects image blur of the object based on the angular velocity of movement of the image capturing apparatus calculated by the first calculation unit and the angular velocity of movement of the object calculated by the third calculation unit, or such that the image stabilization device corrects image blur of the object based on the angular velocity of movement of the image capturing apparatus calculated by the first calculation unit,
wherein the first calculation unit, the second calculation unit, the third calculation unit, the determination unit and the selection unit are implemented by the at least one processor or the circuitry or a combination thereof.

US Pat. No. 10,397,479

SYSTEM AND METHOD FOR MOTION COMPENSATION IN IMAGES

Kabushiki Kaisha Toshiba,...

1. A method of compensating for camera motion during capture of an image in a rolling shutter camera, the method comprising:receiving an image of a scene captured by a camera with a rolling shutter;
extracting line segments in said image;
estimating movement of the camera during the capturing of the image from the received image; and
producing an image compensated for the movement during the capture of the image,
wherein the scene is approximated by a horizontal plane and two vertical planes that intersect at a line at infinity and estimating the movement of the camera during the capture of the image comprises assuming that the extracted line segments are vertical and lie on the vertical planes.

US Pat. No. 10,397,478

BEARING LIMITER STRUCTURES IN OPTICAL IMAGE STABILIZATION SUSPENSIONS

Hutchinson Technology Inc...

1. A suspension assembly, comprising:a support member;
a moving member;
one or more bearings between the support member and the moving member to space the support member and moving member by a bearing distance about the z axis; and
one or more bearing limiters between the support member and the moving member to limit movement of the support member and moving member about the z axis to a gap distance that is less than the bearing distance.

US Pat. No. 10,397,477

THREE-DIMENSIONAL STUDIO SYSTEM

1. A three-dimensional studio system comprising:a booth including camera modules each including a camera which photographs a subject in response to a camera control signal and a driving device which adjusts a position and a direction of the camera in response to a driving control signal and sensor modules each of which provides a sensing signal obtained by sensing the subject, the camera modules and the sensor modules being disposed by being spatially dispersed around the subject;
a database configured to store camera setting data for controlling the camera and driving setting data for adjusting the position and the direction of the camera, to have a correlation with the sensing signal;
a booth control unit configured to transfer a photographed image of the camera to an external image processing device, select the camera setting data and the driving setting data for the sensing signal of the sensor module in the database, and provide the camera control signal and the driving control signal corresponding to the camera setting data and the driving setting data, to the camera and the driving device, respectively; and
a statistical analysis unit configured to receive the sensing signal and the camera setting data and the driving setting data selected for the sensing signal, from the booth control unit, receive a rendering correction value obtained in a course of performing an image process, from the image processing device, and back up the camera setting data and the driving setting data in the database by reflecting the rendering correction value, to have a correlation with the sensing signal.

US Pat. No. 10,397,476

PANORAMIC CAMERA WITH MULTIPLE IMAGE SENSORS USING TIMED SHUTTERS

Google LLC, Mountain Vie...

1. A system for capturing images, comprising:a camera apparatus, the camera apparatus including:
a first camera configured to capture a first image; and
a second camera configured to capture a second image; and
a controller in communication with the camera apparatus configured to:
receive a velocity of a movable structure, the velocity of the movable structure corresponding to a movement of the camera apparatus at a given speed; and
while moving the camera apparatus at the given speed:
start to expose the first image using the first camera at a time T1;
start to expose the second image using the second camera at a time T2, wherein T2 is after T1; and
complete exposure of the first image at a time T3 after the time T2 such that the first image and the second image include overlapping fields of view captured by the camera apparatus between time T2 and T3,
wherein a same object at a same location within the first and second images is captured by each of the first and second cameras at different times due to the movement of the camera apparatus.

US Pat. No. 10,397,475

CAPTURING CONTROL APPARATUS AND METHOD OF CONTROLLING THE SAME

CANON KABUSHIKI KAISHA, ...

1. A capturing control apparatus that generates a high-dynamic-range (HDR) moving image by combining a first moving image captured by a first capturing unit and a second moving image captured by a second capturing unit, the capturing control apparatus comprising:a memory configured to store instructions; and
a processor configured to execute the instructions stored in memory to provide:
a first setting unit configured to set a first capturing condition in the first capturing unit and the second capturing unit when capturing a first frame image pair to be used in HDR composition;
a second setting unit configured to set a second capturing condition, different from the first capturing condition, in the first capturing unit and the second capturing unit when capturing a second frame image pair to be used in deriving a geometric correction parameter;
a capturing control unit configured to control capturing of the first moving image by the first capturing unit and the second moving image by the second capturing unit by capturing at least one first frame image pair using the first capturing condition and at least one second frame image pair using the second capturing condition;
a deriving unit configured to derive the geometric correction parameter used to correct a position shift between a frame image of the first frame image pair included in the first moving image and another frame image of the first frame image pair included in the second moving image based on frame images of the second frame image pair captured under the second capturing condition, one frame image of the second frame image pair being included in each of the first moving image and the second moving image; and
a combining unit configured to perform geometric correction processing for at least one of the frame image of the first frame image pair included in the first moving image and the another frame image of the first frame image pair included in the second moving image using the derived geometric correction parameter based on the second frame image pair, and after the geometric correction processing, combine at least the first frame image pair included in the first moving image and the second moving image to generate the HDR moving image,
wherein each of the first capturing condition and the second capturing condition include a first exposure condition for the first capturing unit and a second exposure condition for the second capturing unit,
wherein, in the first capturing condition, the first exposure condition for the first capturing unit is different from the second exposure condition for the second capturing unit, and
wherein, in the second capturing condition, at least one of the first exposure condition or the second exposure condition is higher than in the first capturing condition when capturing a scene that has shadow-detail loss regions or lower than in the first capturing condition when capturing a scene that has highlight-detail loss regions.

US Pat. No. 10,397,474

SYSTEM AND METHOD FOR REMOTE MONITORING AT LEAST ONE OBSERVATION AREA

1. A system (100) for remote monitoring of at least one observation area, comprising at least one first camera (110) arranged on a rotating platform (120), said first camera (110) is a line scanning camera for providing high definition panorama pictures of said observation area, the system further comprising:at least one second camera (130) arranged on a stationary platform (140), said second camera (130) is a video camera with Pan/Tilt/Zoom (PTZ) functionality for providing real time video from a selected scene within said observation area;
an angular position sensor device arranged for triggering the read-out of each line in said first line scanning camera (110) for providing each pixel in the panorama image with a corresponding azimuth and elevation angle, and for providing synchronization pulses enabling adaptation of rotation velocity of the rotating platform, and
a processing device (150) connected to the angular position sensor and said first and second cameras (110, 130) for capturing and processing video and images and coordinating azimuth and elevation signals received from said first and second cameras (110, 130).

US Pat. No. 10,397,473

IMAGE PROCESSING APPARATUS HAVING AN IMAGE SYNTHESIS UNIT THAT GENERATES SYNTHESIZED IMAGE DATA DEPENDING ON AN OBJECT BRIGHTNESS, AND RELATED IMAGE-PICKUP APPARATUS

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:(A) a memory that stores instructions; and
(B) one or more processors that execute the instructions to cause the image processing apparatus to function as:
(a) a first acquisition unit configured to acquire first image data from one of a first pixel and a second pixel, of an image sensor, that share a single micro lens and receive light passing through different pupil areas of an optical system;
(b) a second acquisition unit configured to acquire second image data as a sum of the first image data obtained from the first pixel and the first image data obtained from the second pixel; and
(c) an image synthesis unit configured (i) to synthesize the first image data acquired by the first acquisition unit and the second image data acquired by the second acquisition unit based on an object brightness, so that the first image data is selected in an area in which the object brightness is a first value that is greater than a threshold value, and the second image data is selected in an area in which the object brightness is a second value that is less than the threshold value, and (ii) to generate synthesized image data of the first image data and the second image data.

US Pat. No. 10,397,472

AUTOMATIC DETECTION OF PANORAMIC GESTURES

Google LLC, Mountain Vie...

1. A method for capturing panoramic images comprising:recording, with one or more processing devices, a set of video frames;
determining, with the one or more processing devices, tracking features each including one or more features that appear in two or more recorded video frames within the set of video frames;
determining, with the one or more processing devices, a set of frame-based features based on a displacement of the tracking features between the two or more recorded video frames of the set of video frames;
determining, with the one or more processing devices, a set of historical feature values based on the set of frame-based features based on the displacement of the tracking features;
determining, with the one or more processing devices, whether a user is attempting to capture a panoramic image based on the set of historical feature values; and
capturing, with the one or more processing devices, a panoramic image in response to determining that the user is attempting to capture a panoramic image.

US Pat. No. 10,397,471

IMAGE PROCESSING APPARATUS, LOCATION INFORMATION ADDING METHOD

SONY CORPORATION, Tokyo ...

1. A first information processing apparatus, comprising:a control unit configured to:
determine a condition that first information of the first information processing apparatus is undetected;
acquire captured image data;
control a display screen to display an information input image based on the acquired captured image data,
wherein the control of the display screen is based on the determination that the first information of the first information processing apparatus is undetected;
receive an input based on an instruction related to the information input image; and
extract the first information based on the received input,
wherein the display screen is controlled to set a timing of the input to one of a timing before an imaging operation of the captured image data, a timing during the imaging operation of the captured image data, or a timing after the imaging operation of the captured image data,
wherein a setting screen, displayed on the display screen, comprises a plurality of user selectable items to set the timing of the input, and
wherein the plurality of user selectable items comprise a user selectable item to set the timing of the input to the timing during the imaging operation of the captured image data.

US Pat. No. 10,397,470

IMAGE CAPTURE USING DISPLAY DEVICE AS LIGHT SOURCE

Apple Inc., Cupertino, C...

1. A method of capturing digital images, comprising:determining a frame latency for an image processing pipeline of an electronic device;
measuring, using the electronic device, ambient light in an environment during a pre-flash phase;
determining, using the electronic device, a flash intensity based on the measured ambient light during the pre-flash phase;
determining, using the electronic device, a sustain time for a flash phase during the pre-flash phase;
determining an image capture time based on the frame latency and the sustain time;
emitting light in the environment during the flash phase at the determined flash intensity for a period of time corresponding to the sustain time; and
capturing a digital image within the environment based on the determined image capture time.
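The timing arithmetic in this claim (flash intensity from pre-flash ambient light; capture time from frame latency plus sustain time) admits a simple sketch. This is not Apple's implementation; the linear intensity model, the 1000-lux scale, and the mid-sustain capture offset are illustrative assumptions.

```python
# Illustrative flash-scheduling sketch (assumed model, not the patent's).

def flash_intensity(ambient_lux, max_intensity=1.0):
    """Darker scenes get a stronger flash; clamp to the device maximum.
    The 1000-lux linear roll-off is an assumed placeholder."""
    return max(0.0, min(max_intensity, 1.0 - ambient_lux / 1000.0))

def capture_time(t_flash_start_ms, frame_latency_ms, sustain_ms):
    """Schedule capture so a frame traversing the pipeline is fully exposed
    inside the sustain window: start + latency + half the sustain time."""
    return t_flash_start_ms + frame_latency_ms + sustain_ms / 2.0
```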

US Pat. No. 10,397,469

DYNAMIC IMAGE-BASED ADJUSTMENT OF IMAGE CAPTURE PARAMETERS

Snap Inc., Santa Monica,...

1. A device comprising:a frame;
a camera mounted on the frame and configured to capture and process image data according to a plurality of image capture parameters;
an input mechanism operable by the user to activate image-capture by the camera; and
a camera controller incorporated in the frame and configured to perform operations comprising:
extracting a plurality of video frames from video content captured by the camera responsive to user engagement with the input mechanism to activate image-capture by the camera;
determining a count of video frames in the plurality of video frames for which an image brightness metric exceeds a predefined brightness threshold; and
in response to identifying that the count of threshold-transgressing video frames exceeds a pre-defined threshold number, performing an automated adjustment action with respect to one or more of the plurality of image capture parameters of the camera, the automated adjustment action comprising automatically modifying the one or more image capture parameters.
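The brightness-count heuristic above can be sketched as follows. The exposure-halving adjustment and both thresholds are illustrative assumptions; the claim only requires counting threshold-exceeding frames and then performing some automated parameter adjustment.

```python
# Minimal sketch of the brightness-count adjustment (assumed specifics).

def mean_brightness(frame):
    """Average pixel value of one extracted video frame."""
    return sum(frame) / len(frame)

def count_bright_frames(frames, brightness_threshold):
    """Count frames whose image brightness metric exceeds the threshold."""
    return sum(1 for f in frames if mean_brightness(f) > brightness_threshold)

def adjust_parameters(params, frames, brightness_threshold=180,
                      count_threshold=2):
    """When the count of threshold-exceeding frames itself exceeds a
    predefined number, modify a capture parameter (here: halve exposure)."""
    if count_bright_frames(frames, brightness_threshold) > count_threshold:
        params = dict(params, exposure=params["exposure"] * 0.5)
    return params
```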

US Pat. No. 10,397,468

RECORDED IMAGE SHARING SYSTEM, METHOD, AND PROGRAM

OPTIM CORPORATION, Saga-...

1. A recorded image sharing system that shares a recorded image of a captured image captured by a wearable terminal with a connected operator terminal, the recorded image sharing system comprising:a storage unit that stores trigger data in advance, trigger IDs indicating predetermined triggers being associated with reference data for detecting the predetermined triggers and predetermined times each being a time for displaying a recorded image including a predetermined trigger when the predetermined trigger is detected, in the trigger data;
a recording unit that records captured images captured by the wearable terminal;
a detecting unit that detects a predetermined trigger by referring to the reference data;
a time changing unit that changes a predetermined time which is a time for displaying a recorded image including the detected predetermined trigger by referring to the trigger data; and
a display control unit that displays, on a display unit of the operator terminal, the recorded image including the detected predetermined trigger among the recorded images, for the predetermined time which is changed by the time changing unit.

US Pat. No. 10,397,467

IMAGING APPARATUS, IMAGE PROCESSING DEVICE, IMAGING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM

OLYMPUS CORPORATION, Tok...

1. An imaging apparatus comprising:an imaging unit configured to continuously capture images to sequentially generate image data;
a combining unit configured to combine a plurality of sets of the image data generated by the imaging unit to generate composite image data;
a display unit configured to display a composite image corresponding to the composite image data generated by the combining unit;
an operating unit configured to receive an operation for the image data to be left in the composite image selected from among a plurality of sets of the image data combined into the composite image displayed by the display unit;
a control unit configured to cause the combining unit to combine at least two sets of the image data selected in accordance with the operation of the operating unit to generate a new set of the composite image data; and
a display control unit configured to cause the display unit to display a last image overlaid on the composite image, whenever the imaging unit generates the image data, the last image corresponding to a last set of the image data generated by the imaging unit, the composite image being generated by the combining unit,
wherein the display control unit causes the display unit to display the composite image and the last image, in slow motion.

US Pat. No. 10,397,466

FOCUS ADJUSTMENT DEVICE, FOCUS ADJUSTMENT METHOD, AND NON-TRANSITORY STORAGE MEDIUM FOR STORING FOCUS ADJUSTMENT PROGRAMS

Olympus Corporation, Tok...

1. A focus adjustment device including an image pickup device which receives object light via an image-acquiring lens including a focus lens and generates image pickup signals, the focus adjustment device allowing a defocus amount to be detected by a phase difference detection system based on the image pickup signals and allowing contrast to be detected by a contrast detection system based on the image pickup signals, the focus adjustment device comprising:an extreme value detection section configured to acquire the contrast repeatedly while performing scan driving of the focus lens, based on a direction indicated by the defocus amount detected by the phase difference detection system, and perform a scan operation to detect and store an extreme value of the contrast and a direction of change in the contrast;
a determination start position calculation section configured to calculate a determination start position corresponding to a position of the focus lens where a determination whether the scan operation is stopped is started, as a position precedent to an in-focus position based on the defocus amount; and
a control section configured to stop the scan operation if the extreme value detection section detects a maximum value during the scan operation before the focus lens reaches the determination start position calculated by the determination start position calculation section and the control section determines that the contrast decreases as a latest change during the scan operation after the focus lens reaches the determination start position.

US Pat. No. 10,397,465

EXTENDED OR FULL-DENSITY PHASE-DETECTION AUTOFOCUS CONTROL

QUALCOMM Incorporated, S...

1. A method for performing phase-detection autofocus control, comprising:receiving, by one or more processors, luminance values measured by a plurality of sensing elements in a sensor array, the sensing elements comprising imaging pixels and phase-detection pixels;
comparing, by the one or more processors, luminance values measured by at least one of the phase-detection pixels to luminance values associated with a subset of the imaging pixels including two or more imaging pixels, the comparing being performed at extended horizontal density or full horizontal density along a first sensor-array row that includes the at least one phase-detection pixel and the two or more imaging pixels; and
performing, by the one or more processors, a phase-detection autofocus operation based on an outcome of the comparison.
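The per-row comparison in this claim can be sketched minimally: a phase-detection pixel's luminance is compared against a value interpolated from neighboring imaging pixels in the same sensor-array row. The two-neighbor mean below is an assumed interpolation, not QUALCOMM's; `pd_comparison` is a hypothetical name.

```python
# Hedged sketch of a full-horizontal-density phase-detection comparison.

def pd_comparison(row_luma, pd_index):
    """Difference between a phase-detection pixel's luminance and the mean
    of its two adjacent imaging pixels in the same row (assumed model)."""
    left, right = row_luma[pd_index - 1], row_luma[pd_index + 1]
    return row_luma[pd_index] - (left + right) / 2.0
```

An autofocus operation would aggregate such differences across the row to estimate phase disparity.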

US Pat. No. 10,397,464

CONTROL APPARATUS, IMAGE CAPTURING APPARATUS, CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A control apparatus comprising:a controller having a processor which executes instructions stored in a memory or having circuitry, the controller being configured to function as:
a normalizer configured to perform normalization processing on a first signal and a second signal by using normalization coefficients related to the first signal and the second signal, respectively,
a correlation calculator configured to perform correlation calculation with respect to the normalized first and second signals, and
a corrector configured to correct correlation data to cancel the normalization processing, the correlation data being obtained by adding a plurality of output signals generated by the correlation calculation.

US Pat. No. 10,397,463

IMAGING DEVICE, ELECTRONIC DEVICE, AND METHOD FOR OBTAINING IMAGE BY THE SAME

Samsung Electronics Co., ...

1. An imaging device, comprising:a first image sensor comprising a plurality of first micro lenses and a plurality of first light receiving sensors, each of the plurality of first light receiving sensors being configured to detect light passing through a corresponding one of the plurality of first micro lenses and convert the light into an electrical signal; and
a second image sensor comprising a plurality of second micro lenses, a plurality of second light receiving sensors, and color filters provided between one or more micro lenses of the plurality of second micro lenses and one or more light receiving sensors of the plurality of second light receiving sensors, each of the plurality of second light receiving sensors being configured to detect light passing through a corresponding one of the plurality of second micro lenses and convert the light into an electrical signal,
wherein at least some of the plurality of the first light receiving sensors or at least some of the plurality of second light receiving sensors are set to be phase difference pixels that detect phase difference information for an object,
wherein the plurality of first light receiving sensors comprises a first sensor group and a second sensor group, and wherein a first exposure time of the first sensor group is set to be different from a second exposure time of the second sensor group, and
wherein the phase difference pixels are configured to be arranged in a sensor group set to have a longer exposure time from among the first sensor group and the second sensor group.

US Pat. No. 10,397,462

IMAGING CONTROL APPARATUS AND IMAGING APPARATUS FOR SYNCHRONOUS SHOOTING

CASIO COMPUTER CO., LTD.,...

1. An imaging control apparatus comprising:a processor; and
a radio communication unit comprising a radio controller and an antenna, the radio communication unit being operable in a first transmission mode and a second transmission mode, the first transmission mode involving wireless data transmission to each of a plurality of imaging apparatuses with reception of an acknowledgement from the imaging apparatuses, and the second transmission mode involving simultaneous wireless data transmission to the imaging apparatuses without reception of the acknowledgement from the imaging apparatuses,
wherein the processor performs a communication controlling process comprising, in a case of instructing at least one simultaneous shooting process to the imaging apparatuses, separately transmitting a piece of shooting preparation instruction data instructing shooting preparation to each of the imaging apparatuses in the first transmission mode, receiving acknowledgements of completion of the shooting preparation from all of the imaging apparatuses having received the shooting preparation instruction data, and then simultaneously transmitting shooting instruction data requesting a simultaneous shooting start to the imaging apparatuses in the second transmission mode.

US Pat. No. 10,397,461

CONTROL APPARATUS, CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A control apparatus of an image capturing device, comprising:an obtain unit configured to obtain first information from a first control apparatus for controlling a first image capturing device on an upstream side of said image capturing device;
a generation unit configured to generate, based on a captured image obtained by said image capturing device and the first information, second information used to generate a 3D model of an object in the captured image;
a transmit unit configured to transmit transmission information being based on the first information and the second information to a second control apparatus for controlling a second image capturing device on a downstream side, and
a control unit configured to control generation of the transmission information in accordance with a size of the object in the captured image or the size and a position of the object in the captured image in a case in which the captured image obtained by said image capturing device includes an invalid object that is not an object corresponding to information registered in advance.

US Pat. No. 10,397,460

LIGHTING AND POWER DEVICES AND MODULES

Windy Place, Inc.

1. A case configured to be removably attached to a smartphone, comprising: a back that substantially covers a back face of the smartphone when the case is attached to the smartphone, the case defining a first opening configured and arranged so that when the case is attached to the smartphone, at least part of a display of the smartphone is visible by way of the first opening, and the case further comprising a plurality of light sources positioned along at least a side region of the case, and the plurality of light sources are operable to illuminate part of an area to be imaged by a front-facing camera of the smartphone, wherein the back comprises a second opening that is aligned with a rear-facing camera of the smartphone when the case is attached to the smartphone, and wherein the case further comprises an electrical connector that is configured to receive power from a power source external to the case.

US Pat. No. 10,397,459

AUTONOMOUS INTELLIGENT VEHICLE

HON HAI PRECISION INDUSTR...

1. An autonomous intelligent vehicle, comprising:a vehicle body;
an advanced driver assistance system located on the vehicle body; and
two eye lamps located in a front of the vehicle body and spaced apart from each other, wherein each of the two eye lamps comprises a light emitting device and an image acquiring device spaced apart from the light emitting device; the light emitting device emits light to illuminate an object on a road and in front of the vehicle body, and the image acquiring device acquires an image of the object, processes the image to obtain image information, and sends the image information to the advanced driver assistance system; wherein the two eye lamps comprise a left eye lamp and a right eye lamp;
wherein the left eye lamp comprises a first light emitting device and a first image acquiring device; the first light emitting device comprises a first heat sink, a first light emitting diode, a first controller, and a first reflective concave mirror; and the first image acquiring device comprises a first adjusting device, a first optical sensor, and a first image processor;
wherein the right eye lamp comprises a second light emitting device and a second image acquiring device; the second light emitting device comprises a second heat sink, a second light emitting diode, a second controller, and a second reflective concave mirror; and the second image acquiring device comprises a second adjusting device, a second optical sensor, and a second image processor; and
wherein the advanced driver assistance system is respectively connected to the first controller, the first adjusting device, the first image processor, the second controller, the second adjusting device, and the second image processor.

US Pat. No. 10,397,458

TELECENTRIC ILLUMINATION AND PHOTOGRAPHING SYSTEM FOR DETECTION OF MARINE MICROSCOPIC ORGANISMS

GRADUATE SCHOOL AT SHENZH...

1. A telecentric illumination and photographing system for detection of marine microscopic organisms, comprising an optical path module and an illumination drive module, wherein the optical path module comprises:a light emitting diode (LED) light source, configured to provide an illumination light source;
a light homogenizing rod, configured to mix light beams emitted from the LED light source to obtain uniform light intensity distribution;
a decoherence light homogenizing sheet, configured to perform secondary light homogenization on the light beams emitted from an end surface of the light homogenizing rod and perform decoherence processing on the light beams, to obtain an incoherent and uniform light source surface;
a diaphragm, configured to determine a corresponding diaphragm aperture according to a requirement for a collimation degree of an illumination beam;
a telecentric collimation camera, wherein a light beam emitted from the diaphragm is incident into a microscopic organism area with uniform illuminance after passing through the telecentric collimation camera; and
a telecentric imaging camera, matching the telecentric collimation camera and configured to cooperate with the telecentric collimation camera to receive an illumination beam passing through the microscopic organism area and output the illumination beam to an imager, to obtain an imaging result of uniform illuminance;
wherein the illumination drive module comprises:
a pulse width modulation (PWM) synchronizer, configured to provide a PWM synchronization signal of an imaging system;
a processor, configured to receive the PWM synchronization signal and generate a digital signal after performing quantization processing on the PWM synchronization signal;
a digital-to-analog converter, configured to receive the digital signal and output an analog voltage signal after performing digital-to-analog conversion on the digital signal;
an analog signal amplifier, configured to receive the analog voltage signal and output an analog quantization voltage after performing synchronous following and amplification on the analog voltage signal, wherein the LED light source is driven by the analog quantization voltage; and
a synchronous current detector, configured to perform real-time sampling on a working current of the LED light source and transmit sampled current information to the processor, wherein the processor performs feedback control according to the sampled current information, so that the LED light source works in a stable state in which light emitting intensity is constant;
wherein the processor performs light intensity or light brightness attenuation compensation by using a single-output proportional-integral-derivative neural network (SPIDNN) and according to the sampled current information, the SPIDNN comprises an input layer, a hidden layer, and an output layer, the input layer has two proportional neurons, the hidden layer has one proportional neuron, one integral neuron, and one derivative neuron, the output layer has one proportional neuron, one proportional neuron of the input layer inputs a preset ideal working current Ref_I, and the other proportional neuron of the input layer inputs a sampled actual working current Real_I, the SPIDNN outputs a control signal Out_pwm with a pulse width after processing, and a drive signal of the LED light source is based on the control signal Out_pwm.
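The SPIDNN in this claim is a PID controller expressed as a small neural network (proportional input neurons for Ref_I and Real_I, proportional/integral/derivative hidden neurons, one proportional output). The sketch below shows only the underlying discrete PID feedback on the sampled LED current; the gains and the class name are illustrative assumptions, not the patented network weights.

```python
# Hedged sketch of the PID feedback underlying the claim's SPIDNN.

class LedCurrentPid:
    def __init__(self, kp=0.5, ki=0.1, kd=0.05):
        # Illustrative gains; the patent trains these as network weights.
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, ref_i, real_i):
        """One control step: the error between the ideal working current
        Ref_I and the sampled current Real_I drives a correction to the
        PWM drive (the analogue of the claim's Out_pwm signal)."""
        error = ref_i - real_i
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

With the sampled current below the reference the controller raises the drive, and lowers it when above, holding emission intensity constant in steady state.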

US Pat. No. 10,397,457

CAMERA MODULE

SAE MAGNETICS (H.K.) LTD....

1. A camera module, comprising a lens assembly, a voice coil motor assembly, an image sensor assembly and a flash assembly, the lens assembly being installed in the voice coil motor assembly along an optical axis direction, the image sensor assembly being located below the lens assembly, the flash assembly being mounted above the voice coil motor assembly and comprising at least one light source located around a lens of the lens assembly, and the light source being adapted for providing a flash for the lens, wherein the flash assembly comprises multiple light sources, a mounting plate for supporting and mounting the light sources, and a cover covering the light sources.

US Pat. No. 10,397,456

APPARATUS FOR INSPECTING EDGE OF SUBSTRATE

Corning Precision Materia...

1. An apparatus for inspecting an edge portion of a substrate, comprising:a first right-angled prism disposed above an edge portion of a substrate such that an inclined surface thereof is directed toward an upper surface of the edge portion of the substrate;
a second right-angled prism disposed below the edge portion of the substrate such that an inclined surface thereof is directed toward a lower surface of the edge portion of the substrate;
a lighting part directly irradiating the edge portion of the substrate with light; and
a photographing part disposed adjacent to the edge portion of the substrate, wherein the photographing part takes an image of the upper surface of the edge portion of the substrate from light that has passed through the first right-angled prism, an image of the lower surface of the edge portion of the substrate from light that has passed through the second right-angled prism, and an image of an end surface of the edge portion of the substrate;
wherein the lighting part is disposed on a side of the edge portion of the substrate.

US Pat. No. 10,397,455

IMAGING LENS, AND ELECTRONIC APPARATUS INCLUDING THE SAME

Genius Electronic Optical...

1. An imaging lens comprising a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element and a sixth lens element arranged in order from an object side to an image side along an optical axis of the imaging lens, each of the first lens element, the second lens element, the third lens element, the fourth lens element, the fifth lens element and the sixth lens element having a refractive power, and having an object-side surface facing toward the object side and an image-side surface facing toward the image side, wherein:the object-side surface of the first lens element has a convex portion in a vicinity of the optical axis;
the object-side surface of the second lens element has a concave portion in a vicinity of the optical axis;
the refractive power of the second lens element is positive;
the image-side surface of the third lens element has a concave portion in the vicinity of the optical axis;
the refractive power of the fourth lens element is positive; and
the refractive power of the sixth lens element is negative,
wherein the imaging lens as a whole has only six lens elements having refractive power, and
satisfying 0.765≤BFL/(G34+G56)≤1.858, where BFL represents a distance between the image-side surface of the sixth lens element and an image plane along the optical axis; G34 represents a distance between the image-side surface of the third lens element and the object-side surface of the fourth lens element at the optical axis; and G56 represents a distance between the image-side surface of the fifth lens element and the object-side surface of the sixth lens element at the optical axis, and
satisfying 6.982≤TL/(G12+G45)≤31.561, where TL represents a distance between the object-side surface of the first lens element and the image-side surface of the sixth lens element at the optical axis; G12 represents a distance between the image-side surface of the first lens element and the object-side surface of the second lens element at the optical axis; and G45 represents a distance between the image-side surface of the fourth lens element and the object-side surface of the fifth lens element at the optical axis.
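The two conditional expressions in this claim are straightforward ratio checks; a small helper (illustrative name and example values, not a real lens prescription) makes the arithmetic concrete.

```python
# Check the claim's two conditional expressions for a candidate design:
#   0.765 <= BFL/(G34+G56) <= 1.858
#   6.982 <= TL/(G12+G45)  <= 31.561

def satisfies_conditions(bfl, g34, g56, tl, g12, g45):
    """All distances are along/at the optical axis, per the claim."""
    r1 = bfl / (g34 + g56)
    r2 = tl / (g12 + g45)
    return 0.765 <= r1 <= 1.858 and 6.982 <= r2 <= 31.561
```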

US Pat. No. 10,397,454

MULTIPOLARIZED SIGNAL TERMINAL CONNECTOR FOR ACCESSORY, ACCESSORY SHOE DEVICE, IMAGE PICKUP APPARATUS, AND ACCESSORY

Canon Kabushiki Kaisha, ...

1. A signal terminal connector for an accessory, comprising:a first surface opposed to an attaching direction of an accessory;
a first terminal row provided in the first surface;
a second surface that is inclined with respect to the first surface by a predetermined angle; and
a second terminal row provided in the second surface,
wherein the second surface is inclined with respect to the first surface, in a predetermined direction orthogonal to the attaching direction of the accessory and a direction in which terminals of the second terminal row are arranged, and is displaced with respect to the first surface in the predetermined direction,
wherein terminals of the first terminal row and the terminals of the second terminal row are arranged in a staggered relation to each other,
wherein each terminal of the first terminal row is a movable terminal having a contact portion which is displaced, and
wherein each terminal of the second terminal row is a fixed terminal having a contact portion which is not displaced.

US Pat. No. 10,397,453

ELECTRONIC DEVICE INCLUDING CAMERA

Samsung Electronics Co., ...

1. An electronic device comprising:an upper cover unit including at least one first camera that faces a first direction;
a plurality of second camera pairs disposed to face a second direction, and second cameras included in each of the second camera pairs being arranged to face directions that intersect each other;
a housing including a plurality of first openings, to which the plurality of second camera pairs are coupled to be exposed to an outside;
a first support member disposed in an accommodation space inside the housing, and providing a seating space for a printed circuit unit electrically connected with a connector of the upper cover unit; and
a bracket coupled to a first opening in the housing to be at least partially exposed to the outside, and including at least one pair of second openings in each of which a pair of second cameras are seated; and
a first sealing member disposed between the housing and the bracket.

US Pat. No. 10,397,452

VEHICULAR CAMERA APPARATUS

DENSO CORPORATION, Kariy...

1. A vehicular camera apparatus to be fixed to a windshield of a vehicle from a passenger compartment side, the camera apparatus comprising:a camera module including a lens and an imaging element;
a camera case to be fixed to the windshield of the vehicle, the camera case having the camera module received therein; and
a hood provided below the lens,
wherein
the hood includes both a rib structure and a hole structure,
the rib structure comprises a plurality of ribs that each protrude upward from a bottom wall of the hood and are arrayed in an optical axis direction of the lens, each of the ribs having a front surface on an opposite side to the lens and a rear surface on the lens side, the front surface making an acute angle with an imaginary plane that contains an upper surface of the bottom wall of the hood, the rear surface making an obtuse angle with the imaginary plane, and
the hole structure comprises a plurality of holes that are formed in the bottom wall of the hood and extend through the bottom wall of the hood in a direction perpendicular to the imaginary plane, each of the holes being formed along a corresponding one of the ribs and including, at least, a projection of the front surface of the corresponding rib on the imaginary plane that contains the upper surface of the bottom wall.

US Pat. No. 10,397,451

VEHICLE VISION SYSTEM WITH LENS POLLUTION DETECTION

MAGNA ELECTRONICS INC., ...

1. A vision system for a vehicle, said vision system comprising:a camera disposed at a vehicle and having a field of view exterior of the vehicle, said camera operable to capture frames of image data;
wherein said camera comprises a lens and an imager comprising an array of photosensing elements having multiple columns of photosensing elements and multiple rows of photosensing elements;
a processor operable to process image data captured by said camera;
wherein, with the vehicle moving, and with said camera capturing image data, said processor models outputs of photosensing elements as Gaussian distributions;
wherein, with the vehicle moving, and with said camera capturing frames of image data, said processor determines an output of respective ones of said modeled photosensing elements over multiple frames of captured image data;
wherein, with the vehicle moving, and with said camera capturing frames of image data, said processor determines whether the output of a modeled photosensing element fits the respective Gaussian distribution for that photosensing element;
wherein, responsive to determination that the output of the modeled photosensing element fits within the respective Gaussian distribution for that photosensing element, said vision system classifies that photosensing element as a blocked element;
wherein, responsive to determination that the output of the modeled photosensing element does not fit within the respective Gaussian distribution for that photosensing element, said vision system classifies that photosensing element as a not blocked element; and
wherein, responsive to determination over multiple frames of captured image data that the ratio of the number of photosensing elements classified as a blocked element to the number of photosensing elements classified as a not blocked element is greater than a threshold ratio, a blockage condition is determined.
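The claim's logic is counter-intuitive but sound: with the vehicle moving, scene content changes, so a photosensing element whose output keeps fitting its own Gaussian model is seeing something static, i.e. likely dirt on the lens. A minimal sketch (not MAGNA's implementation; the 2-sigma fit test and function names are assumptions):

```python
# Hedged sketch of per-element Gaussian blockage classification.

def fits_gaussian(value, mean, std, k=2.0):
    """The output fits the model when within k standard deviations."""
    return abs(value - mean) <= k * std

def blockage_detected(pixel_models, frame, threshold_ratio=1.0):
    """Classify each element as blocked (fits its Gaussian) or not blocked,
    then compare the blocked/not-blocked ratio against a threshold."""
    blocked = sum(1 for (m, s), v in zip(pixel_models, frame)
                  if fits_gaussian(v, m, s))
    not_blocked = len(frame) - blocked
    if not_blocked == 0:
        return True
    return blocked / not_blocked > threshold_ratio
```

In practice the per-element means and standard deviations would be updated online over multiple frames of captured image data.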

US Pat. No. 10,397,450

TWO DIMENSIONAL SHIFT ARRAY FOR IMAGE PROCESSOR

Google LLC, Mountain Vie...

1. A processor comprising:a plurality of execution lanes, each execution lane comprising logic circuitry capable of executing instructions; and
a two-dimensional shift-register array comprising a plurality of electrically coupled shift registers, wherein each execution lane is configured to shift data from a respective shift register dedicated to the execution lane to one or more adjacent shift registers in the two-dimensional shift-register array, each of the one or more adjacent shift registers being a shift register dedicated to a different execution lane,
wherein the processor is configured to execute instructions of a stencil function over a plurality of overlapping stencils by the plurality of execution lanes, wherein executing, by each execution lane, the instructions causes the execution lanes to shift data in the shift registers of the two-dimensional shift-register array multiple times in two dimensions in order for each execution lane to read multiple values of a respective stencil defined by the stencil function.
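The shift-register idea in this claim is that every execution lane reads its entire stencil neighborhood without random memory access: the whole 2D grid is shifted repeatedly, and each lane only ever reads its own dedicated register. An illustrative software model (toroidal wrap-around and a 3x3 sum stencil are assumptions for the demo):

```python
# Software model of a 2D shift-register array computing a stencil.

def shift(grid, dy, dx):
    """Shift the whole 2D grid by (dy, dx); vacated cells wrap around."""
    h, w = len(grid), len(grid[0])
    return [[grid[(y - dy) % h][(x - dx) % w] for x in range(w)]
            for y in range(h)]

def stencil_sum_3x3(grid):
    """Each 'lane' (y, x) accumulates its 3x3 neighborhood purely by
    reading its own cell of successively shifted copies of the grid."""
    h, w = len(grid), len(grid[0])
    acc = [[0] * w for _ in range(h)]
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            shifted = shift(grid, dy, dx)
            for y in range(h):
                for x in range(w):
                    acc[y][x] += shifted[y][x]
    return acc
```

Nine shifts thus deliver all nine values of each lane's overlapping stencil, which is the access pattern the claim's execution lanes exploit in hardware.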

US Pat. No. 10,397,449

IMAGE CAPTURING APPARATUS FOR STORING, AS IMAGE DATA, SIGNAL FROM PIXEL SENSIBLE TO LIGHT, IMAGE PROCESSING APPARATUS FOR PROCESSING SUCH IMAGE DATA, AND CONTROL METHOD THEREFOR

CANON KABUSHIKI KAISHA, ...

1. An image capturing apparatus comprising:an image sensing device including a plurality of groups of pixels, each pixel of each group of pixels including a plurality of photoelectric conversion elements;
one or more processors; and
a memory storing instructions which, when the instructions are executed by the one or more processors, cause the image capturing apparatus to function as:
a reading unit configured to perform, on a plurality of groups of pixels, a reading-out operation for reading out a signal as a first signal from part of the plurality of photoelectric conversion elements and a reading-out operation for mixing signals from the plurality of photoelectric conversion elements and reading out a resultant mixed signal as an image signal;
a correction unit configured to make a correction based on defect data which is data indicating a group of pixels for which when signals are read out from the photoelectric conversion elements by the reading unit, the first signal read from this group of pixels is defective while the image signal read out from this group of pixels is not defective, the correction being made on the first signal read out from the group of pixels indicated by the defect data; and
a generation unit configured to generate one image file including the first signal corrected by the correction unit and the image signal.

US Pat. No. 10,397,448

INTRODUCING VISUAL NOISE IN A FLAT TINT AREA OF A PRINTED IMAGE

Hewlett-Packard Developme...

1. An apparatus, comprising:a color conversion module for converting page description language describing an image into rasterized image data; and
a visual noise module for recalculating pixel values in an area of flat tint detected in the rasterized image data to introduce visual noise in the area of flat tint.
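A minimal sketch of such a visual noise module, assuming a 3x3 flatness test, a small jitter amplitude, and 8-bit pixel values (none of which the claim specifies):

```python
import random

def add_flat_tint_noise(pixels, amplitude=2, seed=0):
    """Jitter pixel values wherever a 3x3 window is perfectly flat.

    The 3x3 window, the +/-2 amplitude and the clamping to 0..255 are
    illustrative assumptions; the claim only requires that visual noise
    be introduced into detected flat-tint areas."""
    rng = random.Random(seed)
    h, w = len(pixels), len(pixels[0])
    out = [row[:] for row in pixels]
    for y in range(h - 2):
        for x in range(w - 2):
            window = [pixels[y + j][x + i] for j in range(3) for i in range(3)]
            if len(set(window)) == 1:  # every value identical: flat tint
                for j in range(3):
                    for i in range(3):
                        jitter = rng.randint(-amplitude, amplitude)
                        out[y + j][x + i] = max(0, min(255, window[0] + jitter))
    return out
```

Areas with any pixel-to-pixel variation are left untouched, so the noise lands only where banding in a flat tint would otherwise be visible.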

US Pat. No. 10,397,447

IMAGE PROCESSING APPARATUS FOR MEASURING COLORIMETRIC VALUES OF A COLOR IMAGE HAVING GLITTERING CHARACTERISTICS FROM A PLURALITY OF DIRECTIONS

FUJI XEROX CO., LTD., Mi...

1. An image processing apparatus comprising:a storage unit in which a plurality of colorimetric values obtained by measuring a target-color image having a glittering characteristic from a plurality of directions are stored for each target color;
a color information receiving unit that receives color information for specifying a target color to be printed;
a display controller that performs a control operation for reading out a plurality of colorimetric values stored corresponding to the target color of the color information received by the color information receiving unit from the storage unit and causing the plurality of colorimetric values to be displayed on a display;
a change receiving unit that receives a change of the plurality of colorimetric values displayed on the display by the display controller; and
a converter that converts the plurality of colorimetric values reflecting the change received by the change receiving unit into color values including a value indicative of an amount of a glittering color material and a value indicative of an amount of a color material other than the glittering color material.

US Pat. No. 10,397,446

CORRECTING COLOR DIFFERENCES BETWEEN SCANNER MODULES

Hewlett-Packard Developme...

1. A method performed by a system comprising a hardware processor, comprising:analyzing a first scan of a first target scanned using a first scanner module of a scanner;
analyzing a second scan of a second target scanned using a second scanner module of the scanner, the first target and the second target being on a medium;
identifying a color difference greater than a threshold color difference between the first scan and the second scan; and
adjusting color settings for the first scanner module to correct the color difference to less than the threshold color difference, the adjusting of the color settings for the first scanner module reducing a color difference between respective scans obtained by the first scanner module and the second scanner module.
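The claimed adjustment can be sketched as follows, using per-channel mean matching as an assumed stand-in for the unspecified color-setting correction:

```python
def channel_means(scan):
    """Per-channel mean of a scan given as a list of (r, g, b) tuples."""
    n = len(scan)
    return tuple(sum(pixel[c] for pixel in scan) / n for c in range(3))

def correct_module_offsets(scan_a, scan_b, threshold=2.0):
    """Return per-channel offsets to add to module A's output so that its
    channel means match module B's, but only when some channel differs by
    more than the threshold. Matching channel means is an illustrative
    assumption; the claim leaves the adjustment method open."""
    mean_a = channel_means(scan_a)
    mean_b = channel_means(scan_b)
    diffs = tuple(b - a for a, b in zip(mean_a, mean_b))
    if max(abs(d) for d in diffs) <= threshold:
        return (0.0, 0.0, 0.0)  # modules already agree within tolerance
    return diffs
```

Applying the returned offsets to the first module's output drives the inter-module difference below the threshold, matching the claim's stated effect.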

US Pat. No. 10,397,445

SIGNAL PROCESSING DEVICE, PHOTOELECTRIC CONVERSION ELEMENT, IMAGE SCANNING DEVICE, IMAGE FORMING APPARATUS, AND METHOD OF PROCESSING SIGNAL

RICOH COMPANY, LTD., Tok...

1. A photoelectric conversion element, comprising:a plurality of pixel groups, each pixel group including a plurality of pixels,
a plurality of amplifiers corresponding to the plurality of pixel groups, respectively, wherein each amplifier amongst the plurality of amplifiers is configured to amplify or attenuate an output range of a signal to be input from a corresponding pixel group amongst the plurality of pixel groups, to coarsely adjust the signal and output a coarsely adjusted signal;
a plurality of analog-to-digital (A/D) convertors corresponding to the plurality of amplifiers, respectively, wherein each A/D converter amongst the plurality of A/D converters is configured to perform A/D conversion on the coarsely adjusted signal output from a corresponding amplifier amongst the plurality of amplifiers and output a digital signal; and
a reference voltage generator configured to adjust, for each A/D converter amongst the plurality of A/D converters, an output range of the digital signal output from the A/D converter, more finely than the signal has been adjusted by the amplifier corresponding to the A/D converter.
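The coarse-then-fine split can be modeled numerically: a coarse gain in power-of-two steps (standing in for the amplifier stage) followed by an exact multiplicative trim (standing in for the reference-voltage stage). The power-of-two step policy and the full-scale target are assumptions for illustration only.

```python
def two_stage_gain(signal, full_scale=1.0):
    """Coarsely amplify in power-of-two steps, then finely trim so the
    peak lands exactly at full scale.

    Returns (scaled signal, coarse gain, fine gain). Both step policies
    are illustrative assumptions, not values from the claim."""
    peak = max(abs(s) for s in signal)
    coarse = 1.0
    while peak * coarse * 2 <= full_scale:
        coarse *= 2  # coarse stage: double until the next step would clip
    fine = full_scale / (peak * coarse)  # fine stage: exact residual trim
    return [s * coarse * fine for s in signal], coarse, fine
```

The fine stage only ever has to cover the residual between one coarse step and the next, mirroring how the reference voltage generator adjusts the output range "more finely" than the amplifier.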

US Pat. No. 10,397,444

IMAGE DISPLAY APPARATUS AND DISPLAY CORRECTION METHOD

MITSUBISHI ELECTRIC CORPO...

1. An image display apparatus comprising:a light source unit including three or more light sources that emit lights of different wavelengths, the light source unit combining the lights emitted from the three or more light sources and emitting the combined light;
a wavelength detector that detects, for each of the light sources, wavelength information indicating the wavelength of the light emitted from the light source;
a color value determiner that determines, for each of the light sources, from the wavelength information of the light source detected by the wavelength detector, a color value indicating a color of the light from the light source in a predetermined color space;
a correction value determiner that determines, based on the color values of the lights from the respective light sources determined by the color value determiner, a correction value for correcting a ratio between the intensities of the lights from the respective light sources so that the color of the light obtained by combining the lights from the respective light sources is a color to be displayed; and
a driver that drives the light sources so that the ratio between the intensities of the lights from the respective light sources is a ratio corrected based on the correction value determined by the correction value determiner,
wherein the correction value determiner:
determines a ratio between the intensities of the lights from the respective light sources when the color values of the lights from the respective light sources are the color values determined by the color value determiner and the color of the light obtained by combining the lights from the respective light sources is a reference white; and
determines the correction value from the determined ratio between the intensities of the lights and a ratio between the intensities of the lights from the respective light sources when the wavelengths of the lights from the respective light sources are reference wavelengths of the respective light sources and the color of the light obtained by combining the lights emitted from the respective light sources is the reference white.
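Determining intensities that combine to a reference white amounts to solving a 3x3 linear system: find intensities x such that the sum of x_i times the tristimulus values of source i equals the white point. A sketch with Cramer's rule, using illustrative XYZ values that are not taken from the patent:

```python
def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(a, b):
    """Solve the 3x3 system a @ x = b by Cramer's rule."""
    d = det3(a)
    x = []
    for col in range(3):
        m = [[b[r] if c == col else a[r][c] for c in range(3)]
             for r in range(3)]
        x.append(det3(m) / d)
    return x

# Columns: assumed XYZ tristimulus values of the three light sources.
primaries = [[0.49, 0.31, 0.20],
             [0.18, 0.81, 0.01],
             [0.00, 0.01, 0.99]]
white = [0.95, 1.00, 1.09]  # illustrative reference-white tristimulus values
intensities = solve3(primaries, white)
```

Solving the same system twice, once with the detected wavelengths' color values and once with the reference wavelengths' color values, yields the two intensity ratios from which the claim's correction value is formed.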

US Pat. No. 10,397,443

METHODS AND SYSTEMS FOR GENERATING COLOR REMAPPING INFORMATION SUPPLEMENTAL ENHANCEMENT INFORMATION MESSAGES FOR VIDEO

QUALCOMM Incorporated, S...

1. A method of processing video data, the method comprising:obtaining a video bitstream, the video bitstream including a plurality of pictures having a first color characteristic, wherein a chroma format of the plurality of pictures comprises a subsampling format;
identifying, from the video bitstream, a color remapping information (CRI) supplemental enhancement information (SEI) message, wherein one or more values of at least one syntax element of the CRI SEI message are restricted based on a condition; and
remapping one or more samples of the plurality of pictures from the first color characteristic to a second color characteristic using a color remapping model of the CRI SEI message without upsampling the one or more samples.
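The building block of such a color remapping model is a piecewise-linear look-up table applied per sample. A simplified stand-in for the CRI model's pre/post look-up tables (not the exact HEVC SEI syntax):

```python
def apply_piecewise_lut(value, points):
    """Apply a piecewise-linear remapping curve given as sorted
    (input, output) breakpoints, clamping outside the covered range.

    A simplified illustration of one CRI look-up-table stage; the real
    SEI message also carries a matrix and a second table."""
    if value <= points[0][0]:
        return points[0][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if value <= x1:
            return y0 + (y1 - y0) * (value - x0) / (x1 - x0)
    return points[-1][1]

curve = [(0, 0), (100, 50), (200, 250)]  # illustrative breakpoints
```

Because the curve is applied to each sample independently, chroma samples can be remapped at their subsampled resolution, without the upsampling the claim excludes.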

US Pat. No. 10,397,442

IMAGE PROCESSING APPARATUS PERFORMING EDGE CORRECTION PROCESS ON SCAN DATA AND ACQUIRING CHARACTERISTIC INFORMATION

Brother Kogyo Kabushiki K...

1. A non-transitory computer readable storage medium storing a set of program instructions for an information processing apparatus including an interface and a processor, the set of program instructions, when executed by the processor, causing the information processing apparatus to:receive a setting value via the interface;
acquire first scan data;
determine, on the basis of the setting value, at least one image process to be performed on the first scan data from among a plurality of image processes, wherein after all of the at least one image process is performed on the first scan data to generate resultant data, the resultant data is to be outputted, wherein the plurality of image processes includes a clipping process, an edge correction process, and a characteristic acquisition process;
perform the clipping process on the first scan data to generate second scan data including original scan data and not including outer scan data, the original scan data being determined as a part of the first scan data representing an original image, the outer scan data being determined as another part of the first scan data representing an outer image around the original image, the original scan data including edge data representing an edge of the original image;
perform the edge correction process on the second scan data to generate third scan data by correcting the edge data;
perform the characteristic acquisition process on the third scan data to acquire characteristic information indicating a characteristic of the third scan data; and
perform a specific process on the second scan data by using the characteristic information which is acquired by using the third scan data in a case where the determining determines both the clipping process and the characteristic acquisition process are to be performed and the edge correction process is not to be performed.

US Pat. No. 10,397,441

INFORMATION EQUIPMENT MANAGEMENT SYSTEM FOR MANAGING USE APPROVAL/DISAPPROVAL INFORMATION, INFORMATION EQUIPMENT, PERSONAL IDENTIFICATION APPARATUS, AND RECORDING MEDIUM

Konica Minolta, Inc., Ch...

1. An information equipment management system, comprising:a personal identification apparatus which is portable and configured to acquire biometric information of a carrying person thereof to identify said carrying person;
an information equipment; and
an information equipment management server for managing use approval/disapproval information which is management information on approval or disapproval of use of said information equipment,
wherein said personal identification apparatus comprises:
an acquisition part for acquiring use approval/disapproval information of said information equipment relating to said carrying person, which is use approval/disapproval information managed by said information equipment management server, from said information equipment management server; and
a storage part for storing therein said use approval/disapproval information acquired from said information equipment management server, and wherein said information equipment controls an operation of said information equipment by using said use approval/disapproval information previously stored in said storage part of said personal identification apparatus.

US Pat. No. 10,397,439

SERVER SYSTEM TRANSMITTING JOB TO PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND CONTROL METHOD FOR SERVER SYSTEM

CANON KABUSHIKI KAISHA, ...

1. A server system which transmits, via a network, a job received via the network to a processing apparatus that executes the job, the server system comprising:a transmission unit configured to transmit, in a case where the job for the processing apparatus is held by a holding unit at the time at which a connection of communication with the processing apparatus has been established, a notification message indicating an occurrence of the job to the processing apparatus through the connection-established communication even in a case where no inquiry associated with the job is made from the processing apparatus through the connection-established communication, wherein
in a case where the job is not held at the time of establishment of the connection, the transmission unit does not transmit the notification message.

US Pat. No. 10,397,438

SYSTEMS AND METHODS FOR CAUSING EXECUTION OF AN ACTION BASED ON PHYSICAL PRESENCE OF A DETECTED PERSON

ORCAM TECHNOLOGIES LTD., ...

1. A wearable apparatus for causing an action to be executed based on whether a person is physically present in an environment of a user of the wearable apparatus, the wearable apparatus comprising:a wearable image sensor configured to capture a plurality of images from the environment of the user of the wearable apparatus; and
at least one processing device programmed to:
analyze at least one of the plurality of images to detect the person;
analyze at least one of the plurality of images to determine whether an image of the detected person appears on a display of a device in the environment of the user;
select at least one action based on whether an image of the detected person appears on the display of the device in the environment of the user; and
cause the selected at least one action to be executed.

US Pat. No. 10,397,437

IMAGE FORMING APPARATUS

CANON KABUSHIKI KAISHA, ...

1. An image forming apparatus, comprising:a light source, which includes a plurality of light emitting points, and is configured to emit light beams based on an image data;
a photosensitive member configured to rotate in a rotation direction so that a latent image is formed on the photosensitive member with the light beams emitted from the light source;
a rotary polygon mirror, which is configured to rotate around a rotation axis, and has a plurality of mirror faces each configured to deflect the light beams emitted from the light source so that the photosensitive member is scanned with the light beams;
a detector configured to detect temperature;
a storage unit configured to store scan position error data for a scan position error corresponding to each of the plurality of mirror faces, wherein the scan position error data comprises a data based on a scan position in the rotation direction of the light beams deflected by each of the plurality of mirror faces; and
a correction unit configured to correct the scan position error data based on a temperature detection result of the detector to generate a correction data, and configured to correct the image data by using the correction data, in the rotation direction of the photosensitive member, of the light beams deflected by each of the plurality of mirror faces.

US Pat. No. 10,397,436

IMAGE PICKUP APPARATUS AND IMAGE PICKUP METHOD OF IMAGE PICKUP APPARATUS

Olympus Corporation, Toky...

1. An image pickup apparatus comprising:a switch configured to output a first signal according to first operation by an operator and output a second signal according to a further second operation following the first operation;
an imager configured to continue to receive the first signal and continuously perform image pickup of a plurality of images until receiving the second signal, as first image pickup and configured to receive the second signal outputted according to the second operation and perform image pickup, as second image pickup, following the first image pickup;
a processor configured to select, out of the plurality of images picked up by the first image pickup, a desired image different from an image immediately after the reception of the second signal among images picked up by the second image pickup and configured to measure a time period or a number of images from the image immediately after the reception of the second signal to the desired image.

US Pat. No. 10,397,435

ELECTRONIC DEVICE

KYOCERA Document Solution...

1. An electronic device comprising:an operation section that receives an instruction from a user;
first storage storing therein data of a file including contents that can be printed;
second storage;
a processing section that transfers the data of the file from the first storage to the second storage when the operation section receives an instruction for erasing the file; and
a display section that displays an inquiry screen for inquiring whether or not to reproduce the file when the data of the file is transferred from the first storage to the second storage, wherein
when the operation section receives an instruction for reproducing the file while the inquiry screen is displayed, the processing section executes print processing for printing the contents of the file, transmission processing for transmitting an email with the data of the file attached thereto, or transfer processing for transferring the data of the file from the second storage to the first storage, and
when executing the print processing or the transmission processing, the processing section erases the data of the file from the second storage after the execution of the print processing or the transmission processing.

US Pat. No. 10,397,434

SCANNER THAT COMBINES IMAGES READ BY FIRST AND SECOND SENSOR ARRAYS, SCAN PROGRAM, AND METHOD OF PRODUCING SCAN DATA

Seiko Epson Corporation, ...

1. A scanner comprising:a first sensor array and a second sensor array having read regions which are overlapped partially; and
a processor constituting
a combining section configured to combine a first read image read by the first sensor array and a second read image read by the second sensor array,
an acquisition section configured to acquire a degree of relative deviation between the first read image and the second read image in a main scanning direction in a region read by the first sensor array and the second sensor array in an overlapped manner, and
a correction section configured to correct an image based on the degree of the relative deviation acquired by the acquisition section,
wherein the correction section outputs a combined image, whose brightness is made brighter and/or whose sharpness is made higher, in a case where the degree is large than in a case where the degree is small.

US Pat. No. 10,397,433

IMAGE FORMING APPARATUS FOR EMBEDDING ERROR DETECTION INFORMATION WITHIN A SCANNED IMAGE

KONICA MINOLTA, INC., Ch...

1. An image forming apparatus comprising:an image forming unit configured to form a reproduced image on a sheet;
an image reading unit configured to read a sheet face on which the reproduced image is formed and generate a scan image based on the read sheet face;
an image inspecting unit configured to compare the generated scan image with a normal image comprising an original image from which the reproduced image is formed to detect an error in the scan image; and
a history generating unit configured to convert the scan image file format to one of Portable Document Format (PDF), Hypertext Markup Language (HTML), and Office Open eXtensible Markup Language (OOXML), the history generating unit being further configured to, when the image inspecting unit detects the error, generate an error name text and embed the generated error name text in the scan image after the scan image has been converted to the PDF, HTML, or OOXML, thereby generating a history image,
wherein the error name text is text that identifies the error in the scan image.

US Pat. No. 10,397,432

SYSTEM AND METHOD FOR EXTRACTING PRESCRIPTION INFORMATION AND INSTRUCTIONS FROM A LABEL

1. A system for extracting prescription information from a medicine bottle, the system comprising:a platform that supports and rotates the medicine bottle;
a mirror that reflects an outer surface of the medicine bottle as the medicine bottle is rotated;
an imaging device that captures a plurality of snapshots of the reflected outer surface of the medicine bottle as the medicine bottle is rotated;
a microprocessor that:
assembles the captured plurality of snapshots into a single image;
detects text in the assembled single image, and
parses the detected text to identify a patient name and a medication name;
a keypad used to enter a digital pharmacy code for a dispensing pharmacy of the medicine bottle; and
a microphone that records an audio reading of information to be associated with the medicine bottle.

US Pat. No. 10,397,431

GRAPHICS PROCESSING DEVICE, IMAGE PROCESSING APPARATUS, GRAPHICS PROCESSING METHOD, AND RECORDING MEDIUM

Konica Minolta, Inc., Ch...

1. A graphics processing device comprising:a CPU corresponding to a general-purpose processor; and
a GPU corresponding to a special-purpose processor for graphics processing, the GPU being configured to necessitate draw call batching before the GPU starts its operation, wherein:
the CPU judges which device, the CPU itself or the GPU, should perform graphics processing to generate a screen image to be displayed on a display, with reference to a product of a number of screen elements composing the screen image and a pixel count of the display; and
the CPU performs graphics processing to generate the screen image if the CPU judges that the CPU itself should perform graphics processing, and the CPU performs draw call batching then makes the GPU perform graphics processing to generate the screen image if the CPU judges that the GPU should perform graphics processing.
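The claimed judgment reduces to a threshold test on the product of the screen-element count and the display's pixel count. A sketch, where the threshold is an assumed tuning constant (the patent does not give one):

```python
def choose_renderer(num_elements, pixel_count, threshold=1_000_000):
    """Pick the rendering device from the product of screen-element count
    and display pixel count. The threshold value is an illustrative
    assumption, not a figure from the claim."""
    product = num_elements * pixel_count
    if product < threshold:
        return "cpu"  # light scene: render directly, skip batching overhead
    return "gpu"      # heavy scene: batch draw calls first, then use the GPU
```

The design rationale is that draw call batching has a fixed setup cost, so the GPU only pays off once the workload (elements times pixels) is large enough to amortize it.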

US Pat. No. 10,397,430

INFORMATION PROCESSING APPARATUS THAT INSTALLS APPLICATION MODULES, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM

CANON KABUSHIKI KAISHA, ...

1. An information processing apparatus that executes an installed application in accordance with associated URL information for the installed application, comprising:a memory configured to store instructions; and
a processor configured to execute the stored instructions to:
receive an instruction to perform installation of an additional application, the additional application being different from the installed application on the information processing apparatus;
install the additional application according to the received instruction;
determine whether associated URL information for the additional application corresponds to associated URL information for the installed application; and
in a case where the associated URL information for the additional application corresponds to the associated URL information for the installed application:
display a screen for prompting a user to select an application to be set to an enabled state from among applications including the additional application and the installed application;
receive a selection of an application set to the enabled state via the screen;
set the selected application to an enabled state; and
set an application not selected via the screen to a disabled state.

US Pat. No. 10,397,429

PRINTING SOFTWARE SYSTEM WITH DYNAMIC CUSTOMIZATION CAPABILITY

KABUSHIKI KAISHA TOSHIBA,...

1. A printing software customization system comprising:a computing device including:
a processor;
a memory coupled to the processor, the memory storing program instructions that, when executed, cause the computer to perform actions, comprising:
requesting installation of printing software from a printing device to enable the computing device to print using the printing device;
receiving the printing software from the printing device;
installing the printing software;
requesting information related to a customized version of the printing software from a customization server, distinct from the printing device, operated by a printer administrator, the customized version of the printing software incorporating at least one customized element specifically designed by the printer administrator for the organization by which the printing device is used;
requesting installation of the customized version of the printing software;
installing the customized version of the printing software without disturbing the printing software;
performing a print operation to execute software components from both the printing software and from the customized version of the printing software; and
wherein at least a portion of the customized version of the printing software is installed in a secondary installation folder rather than a primary installation folder for the printing software and further wherein the printing software operates first upon software components in the secondary installation folder before operating upon software components in the primary installation folder.

US Pat. No. 10,397,428

IMAGE FORMING APPARATUS, CONTROL METHOD OF IMAGE FORMING APPARATUS, AND PROGRAM

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus, having at least a first power state, and a second power state which consumes less power than the first power state, the image processing apparatus comprising:an accepting unit configured to accept, from a user, a transition instruction for a transition of the image processing apparatus to the second power state;
a detecting unit configured to detect a human;
a power control unit configured to transition the image processing apparatus from the second power state to the first power state based on detection of a human by the detecting unit; and
a prohibition unit configured to prohibit the image processing apparatus from transitioning from the second power state to the first power state based on detection of a human by the detecting unit, for a predetermined period in accordance with the transition instruction accepted by the accepting unit.

US Pat. No. 10,397,427

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus that supplies roughness shape data to an image forming apparatus that forms a roughness shape based on a roughness shape of an object to be reproduced, the image processing apparatus comprising:one or more processors; and
one or more programs stored on the image processing apparatus, wherein the one or more programs cause the one or more processors to:
receive an input of information representing the roughness shape of the object to be reproduced;
acquire output characteristics relating to a roughness shape that the image forming apparatus can output; and
generate the roughness shape data that is supplied to the image forming apparatus based on the information representing the roughness shape of the object to be reproduced and the output characteristics,
wherein the roughness shape data is generated so as to give more weight to at least one of a difference of elevation, a height, and sharpness of a convex portion of the roughness shape of the object to be reproduced.

US Pat. No. 10,397,426

INFORMATION PROCESSING SYSTEM

KABUSHIKI KAISHA TOSHIBA,...

1. An information processing system, comprising:a processor; and
a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations, comprising:
storing a copy template comprising a plurality of settings for an image forming apparatus to execute a functionality associated with the image forming apparatus, the plurality of settings comprising a first setting representing a scanning function and a second setting representing a printing function;
facilitating receiving, by a user computing device, of the copy template, wherein the user computing device identifies the image forming apparatus based on use of a simple network management protocol broadcast message and an identifier associated with the image forming apparatus stored to a data readable and writable nonvolatile memory associated with the user computing device;
facilitating selection, by a user interface operational on the user computing device, of a scanning protocol supported by the image forming device;
in response to receiving the scanning protocol, converting the first setting to comply with the scanning protocol;
facilitating selection, by the user interface, of a printing protocol supported by the image forming device;
in response to receiving the printing protocol, converting the second setting to comply with the printing protocol; and
in response to receiving input from the user computing device, performing the scanning function as modified by the scanning protocol and the printing function as modified by the printing protocol.

US Pat. No. 10,397,425

IMAGE READING DEVICE AND IMAGE FORMING APPARATUS

FUJI XEROX CO., LTD., Mi...

1. An image reading device comprising:a light applier configured to apply light from a light source to a recording medium;
a detector configured to detect the light reflected by the recording medium;
a holding member that is movable in a first scanning direction of the light applier and configured to hold a calibration member to which the light from the light applier is applied;
a shaft member extending in the first scanning direction through the holding member and configured to guide the movement of the holding member in the first scanning direction; and
a restricting member that extends in the first scanning direction and configured to restrict a rotation of the holding member about the shaft member.

US Pat. No. 10,397,424

IMAGE FORMING APPARATUS AND METHOD FOR CONTROLLING IMAGE FORMING APPARATUS

CANON KABUSHIKI KAISHA, ...

1. An image forming apparatus comprising:a reading unit, which comprises a carriage including a sensor and a platen for placing an original thereon, configured to read an original placed on the platen using the sensor while moving the carriage in a predetermined direction;
an image forming unit configured to form an image on a sheet based on a read result of the original obtained by the reading unit; and
a controller configured to:
control the image forming unit to form a plurality of measurement images corresponding to four corners of the sheet on a predetermined surface of the sheet, wherein the plurality of measurement images includes a first measurement image, a second measurement image, a third measurement image, and a fourth measurement image,
control the reading unit to read a first area on the predetermined surface of the sheet, wherein the first area includes an area in which the first measurement image is formed and an area in which the second measurement image is formed,
control the reading unit to read a second area on the predetermined surface of the sheet, wherein the second area includes an area in which the third measurement image is formed and an area in which the fourth measurement image is formed,
generate a read image relating to the predetermined surface of the sheet based on first read data relating to the first area read by the reading unit and second read data relating to the second area read by the reading unit,
obtain position information of the plurality of measurement images based on the read image,
adjust the position information of the plurality of measurement images based on a shape of the read image, and
control a position of an image to be formed on the predetermined surface of the sheet by the image forming unit based on the adjusted position information of the plurality of measurement images.

US Pat. No. 10,397,423

PRINTING APPARATUS, CONTROL METHOD FOR THE SAME, AND STORAGE MEDIUM TO TRANSFER SPECIFIC DATA UPON OCCURRENCE OF INTERRUPTION

CANON KABUSHIKI KAISHA, ...

1. A copying apparatus comprising:a reader configured to read a document;
a storage configured to store image data of the document read by the reader;
a printer; and
a controller configured to read the image data of the document from the storage before image data for one page of the document has been stored in the storage and cause the printer to perform printing based on the read image data,
wherein the controller performs, in a case where an interruption factor of reading by the reader occurs before image data for one page of the document has been stored in the storage, control to transfer specific data to the printer instead of remaining image data of the document which is to be transferred to the printer, and
wherein, in a case where the interruption factor of reading by the reader occurs before image data for one page of the document has been stored in the storage, the printer performs printing based on the image data of the document stored in the storage and the specific data transferred by the controller.

US Pat. No. 10,397,422

CONTROL APPARATUS AND NON-TRANSITORY COMPUTER-READABLE MEDIUM STORING PROGRAM

Seiko Epson Corporation, ...

1. A control apparatus comprising:a processor that causes a preview to be displayed for an image to be printed on each of two sides of a print medium by a printing apparatus; and
a display that displays the preview, wherein
the processor
causes the display to display the preview, the preview including a front side display area and a rear side display area that is next to the front side display area along a transverse direction in the preview, the front side display area including at least a first and second front side page image for printing on a front side of the print medium, the rear side display area including at least a first and second rear side page image for printing on a rear side of the print medium, such that the preview shows a positional relationship along a print direction between a position of each of the front side page images and a position of each of the rear side page images, the print direction of an image in the preview corresponding to a print position of the image on the print medium, the print position in the preview being orthogonal to the transverse direction; and
adjusts the position in the print direction of at least one of the page images such that a position along the print direction of a blank area between the first and second front side page images corresponds to a position along the print direction of a blank area between the first and second rear side page images.
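The adjustment step above can be sketched in one dimension. All names and the coordinate model are assumptions for illustration: each page image is a `(start, end)` interval along the print direction, and the rear-side images are shifted so the blank gap between them lines up with the front-side gap.

```python
# Illustrative sketch of the gap-alignment adjustment: shift the rear side
# page images along the print direction so that the blank area between the
# first and second rear side images starts where the front side gap starts.
def align_gaps(front, rear):
    """front, rear: [(start, end), (start, end)] intervals along the print
    direction for the first and second page images on each side.
    Returns the rear intervals shifted so both blank gaps coincide."""
    # The blank area begins where the first page image ends.
    shift = front[0][1] - rear[0][1]
    return [(s + shift, e + shift) for s, e in rear]
```

With the gaps aligned in the print direction, a fold or cut through the blank area lands on blank space on both sides of the medium.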

US Pat. No. 10,397,421

IMAGE FORMING DEVICE, IMAGE FORMING SYSTEM, AND COMPUTER-READABLE NON-TRANSITORY STORAGE MEDIUM STORING CONTROL PROGRAM EXECUTED BY COMPUTER FOR CONTROLLING ADDITIONAL PRINTING

Konica Minolta, Inc., Ch...

1. An image forming device that performs printing, comprising:a storage that stores information about a plurality of alignment images printed on a transfer medium;
an image forming unit that prints a first additional image and a second additional image on the transfer medium, the first additional image being different from the second additional image;
a reader unit that reads the alignment image printed on the transfer medium on an upstream side of the image forming unit along the transfer direction of the transfer medium; and
a control unit that distinguishes multiple alignment images upon reception of reading results from the reader unit, wherein
the control unit performs control of storing adjustment information for adjusting the positions of the first and second additional images to be formed, according to multiple alignment images, in the storage, and
the control unit performs control of associating, with each alignment image, a respective job for printing to cause the image forming unit to print the first additional image a plurality of times and then to print the second additional image a plurality of times.

US Pat. No. 10,397,420

MEDIUM PROCESSING DEVICE

Seiko Epson Corporation, ...

1. A medium processing device comprising:a processor that performs processing on a medium;
a feed roller that has a shaft and feeds the medium to the processor;
an electric motor that is a drive source of the feed roller;
a motive force transmission gear that is disposed at an end of the shaft of the feed roller and that is one gear to transmit motive force of the electric motor to the feed roller;
a motive force switch that includes a main gear driven by the electric motor and a sub gear that meshes with the main gear and moves in planetary motion around the main gear, and that is configured to switch accompanying rotation direction switching of the main gear between a meshed state in which the sub gear is meshed with the motive force transmission gear and a disengaged state in which the sub gear is disengaged from the motive force transmission gear;
an electronic load detector that detects load of the electric motor; and
an electronic controller that controls the electric motor and the processor, the electronic controller controlling at least one out of the electric motor or the processor according to the load of the electric motor detected by the electronic load detector at switching of the sub gear from the disengaged state to the meshed state,
the electronic controller transitioning to control performed when the sub gear has switched to the meshed state, at, as a starting point, a timing when the electronic controller determines that the load of the electric motor has exceeded a threshold value.
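The threshold logic in the last limitation can be sketched as follows; the threshold value and the sampled-load representation are assumptions, not values from the patent.

```python
# Sketch of the load-threshold transition: poll the detected motor load and
# treat the first sample exceeding the threshold as the starting point at
# which the sub gear has switched to the meshed state.
THRESHOLD = 0.75  # hypothetical normalized load threshold

def detect_mesh(load_samples):
    """Return the index of the first load sample exceeding the threshold,
    i.e., the timing from which post-mesh control begins, or None if the
    sub gear never meshed within the sampled window."""
    for i, load in enumerate(load_samples):
        if load > THRESHOLD:
            return i
    return None
```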

US Pat. No. 10,397,419

HINGE MECHANISM AND IMAGE FORMING APPARATUS

KYOCERA Document Solution...

1. A hinge mechanism for installation between a first cover pivotably supported by a housing and a second cover movable to be placed over the first cover, the hinge mechanism comprising:a first member configured to be supported by the first cover;
a second member turnably attached to the first member and configured to be fixed to the second cover; and
a stopper mechanism configured to restrict turning of the second member relative to the first member, wherein
the stopper mechanism has a path and a moving member provided in the path,
the path includes a first path part located in the first member and a second path part located in the second member,
the first member and the second member are changeable between a communication state and a non-communication state depending on a turning angle of the second member relative to the first member,
the first member and the second member are in the communication state when the first path part and the second path part are in communication with each other, and
the first member and the second member are in the non-communication state when the first path part and the second path part are not in communication with each other.

US Pat. No. 10,397,418

PROFILE CREATION DEVICE, PROFILE CREATION METHOD, AND RECORDING MEDIUM

Seiko Epson Corporation, ...

1. A profile creation device comprising:a temporary selection unit configured to accept an input of a temporary condition, which is an illumination condition temporarily selected by a user from a plurality of illumination conditions;
a first calculation unit configured to use spectral colorimetric data for a specific patch printed as a specific color to calculate a color value under each of the plurality of illumination conditions;
a second calculation unit configured to calculate, by using the color value calculated by the first calculation unit, a color value indicating an appearance of a printed matter printed to appear as the specific color under the temporary condition in a case where it is assumed that the printed matter is observed under a comparison condition, which is an illumination condition other than the temporary condition;
a display unit configured to display color information indicated by the color value calculated by the second calculation unit;
a permanent selection unit configured to accept an input of an illumination condition permanently selected by the user from the plurality of illumination conditions; and
a creation unit configured to create a profile such that the specific patch is observed as the specific color under the illumination condition permanently selected.
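The first calculation unit's step can be sketched crudely. This is a deliberately simplified model, not real colorimetry: a color value is taken as the spectral reflectance of the patch weighted by the illuminant's spectral power distribution, whereas an actual implementation would also integrate against CIE standard observer functions per wavelength.

```python
# Crude sketch of calculating a color value for a patch under a given
# illumination condition: weight the patch's spectral colorimetric data
# (reflectance per wavelength band) by the illuminant's spectral power.
def color_value(reflectance, illuminant):
    """reflectance, illuminant: per-wavelength-band samples of equal length.
    Returns a single scalar color value (a tristimulus-style sum)."""
    if len(reflectance) != len(illuminant):
        raise ValueError("spectral sample counts must match")
    return sum(r * s for r, s in zip(reflectance, illuminant))
```

Evaluating this for every illumination condition in the list gives the per-condition color values that the second calculation unit then compares against the temporary condition.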

US Pat. No. 10,397,417

METHODS AND SYSTEMS FOR PRINTING SELECTIVE PORTIONS OF DOCUMENTS

Xerox Corporation, Norwa...

1. A method, comprising:receiving a non-transitory removable storage device by a multi-function device, wherein the non-transitory removable storage device comprises a document having a table of content listing a plurality of topics and content corresponding to each topic;
storing the document in a temporary memory of the multi-function device;
identifying the table of content in the document based on one or more text recognition techniques;
printing only the table of content, wherein one or more topics of the plurality of topics are highlighted by a user in the printed version of the table of content;
receiving the printed version of the table of content with the one or more highlighted topics, for scanning;
identifying the one or more highlighted topics from the printed version of the table of content;
identifying content corresponding to the identified one or more highlighted topics; and
printing the identified content, wherein the identified content corresponds to the one or more highlighted topics.
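The selection step of this method can be sketched as follows; the document model (a mapping from topic to content) and the set of highlighted topics recognized from the scanned table of content are assumed interfaces for illustration.

```python
# Workflow sketch: after scanning the marked-up table of content and
# recognizing which topics the user highlighted, keep only the content
# corresponding to those topics, in table-of-content order.
def select_content(document, highlighted_topics):
    """document: ordered mapping of topic -> content (table-of-content order).
    highlighted_topics: set of topic strings recognized as highlighted.
    Returns only the content to be printed."""
    return {t: document[t] for t in document if t in highlighted_topics}
```

Only the selected entries are then sent to the print engine, which is the point of the method: the full document on the removable storage device is never printed in its entirety.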

US Pat. No. 10,397,416

DOCUMENT READING DEVICE GENERATING SEPARATE FILES BASED ON CHARACTER STRINGS RECOGNIZED ON DIFFERENT PAGES

KYOCERA Document Solution...

1. A document reading device comprising:a document reading unit that optically reads an image of a source document;
a designation reception unit that receives designation of a region in an optionally selected position in the source image, acquired through reading by the document reading unit;
a character string recognizer that recognizes a character string included in the region, the designation of which has been received by the designation reception unit, in the source image acquired through reading by the document reading unit;
a storage device used to store the source image read by the document reading unit; and
an image storage controller that stores the source image read by the document reading unit in the storage device,
wherein the image storage controller performs a file dividing operation including generating, when a character string recognized by the character string recognizer in the source image of a preceding page of the source document, and a character string recognized by the character string recognizer in the source image of a current page are different from each other, one file including the source images up to the preceding page, and a separate file including the current and subsequent pages, and storing the file and the separate file in the storage device.
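The file dividing operation can be sketched directly from the claim; the page representation (a recognized string paired with a page image) is an assumption for illustration.

```python
# Sketch of the file dividing operation: whenever the character string
# recognized in the designated region of the current page differs from the
# string recognized on the preceding page, close the current file and start
# a separate file beginning with the current page.
def divide_into_files(pages):
    """pages: list of (recognized_string, page_image) tuples in read order.
    Returns a list of files, each a list of page images sharing one string."""
    files = []  # list of (string, [images]) runs
    for string, image in pages:
        if files and files[-1][0] == string:
            files[-1][1].append(image)  # same string: same file
        else:
            files.append((string, [image]))  # string changed: new file
    return [images for _, images in files]
```

A typical use is splitting a batch scan of multiple invoices into one file per invoice number recognized in the designated region.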

US Pat. No. 10,397,415

SYSTEMS AND METHODS FOR AUTOMATICALLY TRANSFERRING AUDIOVISUAL CONTENT

GoPro, Inc., San Mateo, ...

1. A system configured for automatically transferring audiovisual content, the system comprising:a camera device configured to:
obtain configuration information, wherein the configuration information supports establishment of a communication channel between the camera device and the computing device;
capture a first item of audiovisual content;
generate a notification indicating that the first item of audiovisual content has been captured by the camera device, the notification including a first identifier that identifies the first item of audiovisual content, the first identifier derived from content of the first item of audiovisual content;
transfer the notification to the computing device;
receive a request for the first item of audiovisual content from the computing device; and
responsive to the request, transfer the first item of audiovisual content to the computing device; and
a computing device configured to:
communicate with the camera device;
manage electronic access to and storage of items of audiovisual content;
receive the notification from the camera device prior to reception of the first item of audiovisual content;
effectuate registration of the first item of audiovisual content, the registration of the first item of audiovisual content including an association between the first item of audiovisual content and the first identifier, wherein a registration status of the first item of audiovisual content indicates whether the first item of audiovisual content has been transferred to the computing device;
transfer the request for the first item of audiovisual content to the camera device; and
receive the first item of audiovisual content from the camera device.
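The identifier and registration scheme can be sketched as follows. This is not GoPro's actual implementation; it only illustrates the claimed idea of an identifier derived from the content itself (here, a SHA-256 digest), which lets the computing device register an item on notification, before the bytes arrive, and confirm the match on transfer.

```python
import hashlib

# Sketch: the first identifier is derived from the content of the item, so
# the same bytes always produce the same identifier on both devices.
def content_identifier(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

class Registry:
    """Tracks registration status on the computing device: an item is
    'registered' when its notification arrives and 'transferred' once the
    matching content has been received."""
    def __init__(self):
        self.status = {}  # identifier -> "registered" | "transferred"

    def register(self, identifier):
        # Called on receiving the camera's notification.
        self.status.setdefault(identifier, "registered")

    def receive(self, content):
        # Called on receiving the item itself; re-derive the identifier to
        # associate the bytes with the registered entry.
        ident = content_identifier(content)
        if ident in self.status:
            self.status[ident] = "transferred"
        return ident
```

Deriving the identifier from content rather than from a filename means the association survives renames and retransfers.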

US Pat. No. 10,397,414

INFORMATION PROCESSING APPARATUS THAT HAS AN ELECTRONIC MAIL FUNCTION AND IS CAPABLE OF OPERATING IN COOPERATION WITH A PORTABLE TERMINAL AND PROGRAM THEREOF

Konica Minolta, Inc., Ch...

1. An information processing apparatus that has an electronic mail function and is capable of operating in cooperation with a portable terminal, the information processing apparatus comprising:a hardware processor configured to:
perform a communication with the portable terminal;
transmit a command of activating an address book application to the portable terminal in a case where the electronic mail function is selected, and to acquire a transmission destination address and a user address from the portable terminal;
create an electronic mail in which the transmission destination address acquired by the hardware processor is set to a transmission destination of the electronic mail, and the user address acquired by the hardware processor is set to a transmission source of the electronic mail;
acquire an image to be transmitted with the electronic mail;
attach the acquired image to the created electronic mail, and
transmit the electronic mail.

US Pat. No. 10,397,413

METHOD FOR CARRYING OUT A PRINTING OPERATION ON AN INKJET PRINTING MACHINE

Heidelberger Druckmaschin...

1. A method for carrying out a printing operation on an inkjet printing machine including a color space transformation between a target color space and a process color space by using a computer, the method comprising the following steps:calibrating the inkjet printing machine by printing and colorimetrically measuring a process color space test chart suitable for the printing operation in the target color space including printing-operation-related limitations in an amount of applied ink;
using the test chart to generate measured values corresponding to sampling points in a measured target color space;
interpolating between the sampling points to define further sampling points;
carrying out the color space transformation by using the sampling points in the target color space and input values known from the test chart in the process color space corresponding to the sampling points in the target color space;
directly using a physical variable of an ink drop volume for printing-operation-related ink application limitations as well as for the input values in the process color space;
adapting the calibration of the inkjet printing machine based on the ink application limitations and the input values in the process color space; and
carrying out the printing operation based on the adapted calibration.
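The interpolation step of this method can be sketched in one dimension for clarity; real profiling interpolates in a multidimensional color space, and the pairing of ink-drop-volume input values with measured colorimetric values is the only structure taken from the claim.

```python
# Sketch of defining further sampling points between measured ones: each
# measured point pairs a process-color input value (here, an ink drop
# volume) with a colorimetrically measured value; linear interpolation
# yields values between adjacent sampling points.
def interpolate(points, x):
    """points: sorted list of (input_value, measured_value) sampling points.
    Returns the linearly interpolated measured value at input x."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("x outside the calibrated range")
```

Because the input axis is a physical drop volume, the same values can serve both the ink application limit and the process-color input, which is the point of the "directly using a physical variable" step.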

US Pat. No. 10,397,412

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD FOR ESTIMATING CAUSE OF ABNORMALITY HAVING OCCURRED IN IMAGE PROCESSING APPARATUS

CANON KABUSHIKI KAISHA, ...

1. An image processing system comprising an image processing apparatus and a server connected to the image processing apparatus via a network, the image processing apparatus comprising:a printer;
a controller having a memory storing instructions and a processor which executes the instructions, the controller being configured to function as:
a forming unit configured to form a chart by the printer;
a reading unit configured to read the chart formed by the forming unit and white paper by a scanner; and
a transmitting unit configured to transmit first image data obtained by reading the chart and second image data obtained by reading the white paper to the server to estimate a cause of an abnormality that has occurred in the image processing apparatus,
and the server comprising:
a first estimating unit configured to estimate a cause of an abnormality that has occurred in the reading unit using a second feature amount of the second image data; and
a second estimating unit configured to
in a case where the cause of the abnormality estimated by the first estimating unit is a cause of an abnormality,
estimate a cause of an abnormality that has occurred in the printer using a first feature amount of the first image data adjusted by the second feature amount,
wherein the second estimating unit excludes, from the first image data, a pixel corresponding to an abnormal pixel included in the second image data, and estimates the cause of the abnormality using a feature amount obtained by analyzing the first image data from which the pixel is excluded.
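The exclusion step in the last limitation can be sketched as follows; the flat pixel lists, the abnormality criterion on the white-paper scan, and the mean as the feature amount are all assumptions for illustration.

```python
# Sketch of the second estimating unit's exclusion: pixels of the chart
# image (first image data) that coincide with abnormal pixels found in the
# white-paper image (second image data) are dropped before the feature
# amount is computed, so scanner defects are not mistaken for printer ones.
def feature_excluding_abnormal(chart_pixels, white_pixels, white_threshold=250):
    """A white-paper pixel darker than the threshold is treated as abnormal
    (e.g., dust on the scanner glass). Returns the mean of the remaining
    chart pixels as a stand-in feature amount."""
    kept = [c for c, w in zip(chart_pixels, white_pixels) if w >= white_threshold]
    if not kept:
        raise ValueError("all pixels excluded as abnormal")
    return sum(kept) / len(kept)
```

Reading both a chart and blank white paper gives the system a reference with which to separate reading-unit abnormalities from printing abnormalities.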

US Pat. No. 10,397,411

ACCESS NODE AND METHOD

Boyce Technologies Inc., ...

1. An access node for a communication system, comprising:a housing having:
a chassis configured for mounting electronic modules; and
a door mounted on the chassis by a hinge for movement between an open position and a closed position;
the door defining a plurality of pockets;
a sliding strip mounted in the chassis and operable to move in a sliding motion between a first sliding strip position and a second sliding strip position;
a plurality of latch hooks mounted on the sliding strip; the latch hooks each including a hook portion shaped to engage into respective ones of the plurality of pockets when the door is in the closed position and the sliding strip is moved to the second sliding strip position; and
a roller bearing mounted on the hook portion of each of the plurality of latch hooks, the roller bearing engaging a respective one of the pockets when the door is in the closed position and the sliding strip is being moved toward the second sliding strip position.