US Pat. No. 10,341,679

ENCODING SYSTEM USING MOTION ESTIMATION AND ENCODING METHOD USING MOTION ESTIMATION

SK PLANET CO., LTD., Seo...

1. An encoding method using motion estimation with an encoding apparatus, the encoding method comprising:
determining an image unit in a frame for processing a plurality of image blocks included in the image unit independently or in parallel;
obtaining information on candidate motion vectors relating to a first image block which is one of the plurality of image blocks included in the image unit;
determining a motion vector relating to the first image block based on the information on candidate motion vectors;
generating a prediction signal relating to the first image block by performing an inter prediction based on the determined motion vector; and
encoding a residual signal relating to the first image block by performing a quantization on the residual signal, the residual signal being a difference between an original signal relating to the first image block and the prediction signal,
wherein the image unit comprises the plurality of image blocks, the information on candidate motion vectors for the first image block being generated without using motion information of other image blocks included in the image unit,
wherein the information on candidate motion vectors for the first image block is generated by using motion information of at least a second image block included in another image unit,
wherein both the image unit and the other image unit are at different locations within the same time frame,
wherein both the first image block and the second image block are encoded by inter prediction, and
wherein when the candidate motion vectors for the first image block include a fixed value, candidate motion vectors of other image blocks included in the image unit also include a value identical to the fixed value.
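The claim's core constraint, that a block's candidate motion vectors are built without motion information from blocks in the same image unit, can be illustrated with a minimal sketch. All names here (the unit map, neighbor table) are hypothetical stand-ins, not the patented implementation.

```python
def candidate_motion_vectors(block_id, unit_of, motion_vectors, neighbors):
    """Collect candidate MVs for `block_id` using only neighbors in a
    *different* image unit of the same frame, so all blocks inside one
    unit can be processed independently or in parallel."""
    candidates = []
    for nb in neighbors[block_id]:
        if unit_of[nb] == unit_of[block_id]:
            continue  # skip same-unit blocks: no intra-unit dependency
        mv = motion_vectors.get(nb)
        if mv is not None and mv not in candidates:
            candidates.append(mv)
    return candidates

# Blocks 0 and 1 share unit "A"; block 2 lies in unit "B" of the same frame.
unit_of = {0: "A", 1: "A", 2: "B"}
motion_vectors = {1: (3, -1), 2: (5, 2)}
neighbors = {0: [1, 2]}
cands = candidate_motion_vectors(0, unit_of, motion_vectors, neighbors)
```

Block 1's motion vector is ignored despite being a neighbor, because it sits in the same unit as block 0; only block 2's vector survives.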

US Pat. No. 10,341,678

SYSTEMS AND METHODS FOR PLAYER INPUT MOTION COMPENSATION BY ANTICIPATING MOTION VECTORS AND/OR CACHING REPETITIVE MOTION VECTORS

ZeniMax Media Inc., Rock...

1. A computer-implemented method for caching motion vectors comprising:
transmitting a previously generated motion vector library from a server to a client, wherein the motion vector library is configured to be stored at the client;
transmitting an instruction to the client to monitor for input data from a user;
transmitting an instruction to the client to calculate a motion estimate from the input data;
transmitting an instruction to the client to update the stored motion vector library based on the input data, wherein the client is configured to apply the stored motion vector library to initiate motion in a graphic interface prior to receiving actual motion vector data from the server; and
transmitting an instruction to apply one or more scaling factors to the motion vector library, wherein the scaling factor is calculated based on the general equation:

US Pat. No. 10,341,677

METHOD AND APPARATUS FOR PROCESSING VIDEO SIGNALS USING INTER-VIEW INTER-PREDICTION

LG ELECTRONICS INC., Seo...

1. A method for decoding a video signal by a decoding apparatus, the method comprising:
deriving, by the decoding apparatus, an inter-view motion vector of a current texture block by searching inter-view motion vector candidates in an order of an inter-view motion vector of a temporal neighboring block of the current texture block, an inter-view motion vector of a spatial neighboring block of the current texture block, and a disparity vector derived by the decoding apparatus using depth data of a depth block; and
performing, by the decoding apparatus, inter-view inter-prediction for the current texture block using the derived inter-view motion vector of the current texture block,
wherein deriving the inter-view motion vector of the current texture block includes:
determining, by the decoding apparatus, whether the temporal neighboring block of the current texture block is coded using inter-view inter-prediction, wherein the inter-view motion vector of the temporal neighboring block is derived as the inter-view motion vector of the current texture block when the temporal neighboring block is coded using inter-view inter-prediction,
determining, by the decoding apparatus, whether the spatial neighboring block of the current texture block is coded using inter-view inter-prediction, wherein the inter-view motion vector of the spatial neighboring block is derived as the inter-view motion vector of the current texture block when the temporal neighboring block is not coded using inter-view inter-prediction and the spatial neighboring block is coded using inter-view inter-prediction, and
deriving, by the decoding apparatus, the disparity vector as the inter-view motion vector of the current texture block when the temporal neighboring block of the current texture block and the spatial neighboring block of the current texture block are not coded using inter-view inter-prediction.
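The derivation order in this claim is a simple priority fallback: temporal neighbor first, then spatial neighbor, then the depth-derived disparity vector. A hedged sketch, where "coded using inter-view inter-prediction" is modeled as the neighbor's inter-view MV being present:

```python
def derive_interview_mv(temporal_mv, spatial_mv, disparity_vector):
    """Return the inter-view motion vector of the current texture block.
    A neighbor contributes only if it was coded with inter-view
    inter-prediction (modeled here as its MV not being None)."""
    if temporal_mv is not None:   # temporal neighboring block checked first
        return temporal_mv
    if spatial_mv is not None:    # spatial neighboring block is the fallback
        return spatial_mv
    return disparity_vector       # last resort: disparity from depth data
```

The function names and the None-based availability model are illustrative assumptions; the claim itself only fixes the search order.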

US Pat. No. 10,341,676

VIDEO ENCODING APPARATUS AND A VIDEO DECODING APPARATUS

KABUSHIKI KAISHA TOSHIBA,...

1. A video decoding apparatus, comprising:
processing circuitry configured to:
acquire available blocks of blocks having motion vectors from decoded blocks adjacent to a to-be-decoded block and a number of the available blocks, wherein acquiring the available blocks comprises searching for available block candidates from decoded blocks neighboring the to-be-decoded block and having motion vectors, determining a block size of motion compensated prediction of the available block candidates, determining whether the available block candidates are in a prediction mode of a unidirection prediction or a bidirectional prediction, and extracting available blocks from the available block candidates based on the determination of the block size and the determination of the prediction mode;
select one selection code table from a plurality of code tables depending on the number of the available blocks and decode selection information for specifying one selection block using the one selection code table;
select the one selection block from the available blocks; and
subject the to-be-decoded block to motion compensated prediction coding using a motion vector of the one selection block as a motion vector of the to-be-decoded block.

US Pat. No. 10,341,675

VIDEO ENCODING METHOD AND VIDEO ENCODER SYSTEM

AXIS AB, Lund (SE)

1. A method of encoding digital video data corresponding to a sequence of input video frames, wherein said input video frames are encoded into a sequence of output video frames by a video encoding apparatus, the method comprising:
encoding a first input video frame of the sequence of input video frames, in a first encoder instance of the video encoding apparatus, using intra-frame encoding to produce a first intra-frame, the first encoder instance including an output that discards intra-frames including the first intra-frame, instead of transmitting the intra-frames to a buffer for subsequent storage or display,
decoding said first intra-frame to produce a first decoded frame by a decoder of the video encoding apparatus, the decoder providing decoded intra-frames to a second encoder instance,
encoding said first decoded frame in the second encoder instance to produce a first output video frame, wherein the second encoder instance outputs a video stream using intra mode and inter mode prediction, wherein the video stream includes the first output video frame.

US Pat. No. 10,341,674

METHOD AND DEVICE FOR DISTRIBUTING LOAD ACCORDING TO CHARACTERISTIC OF FRAME

SAMSUNG ELECTRONICS CO., ...

1. A method for distributing a load, the method comprising:
identifying characteristics of each of frames included in a received bit stream; and
distributing loads of a plurality of cores based on the characteristics of each of the frames whenever the frames are decoded,
wherein the identifying comprises identifying a reference relationship between the frames included in the bit stream, and
wherein the reference relationship is distance information indicating a relative distance between a first frame to be encoded and a second frame, which is a reference frame, in a sequential picture order of the frames included in the received bit stream.

US Pat. No. 10,341,673

APPARATUSES, METHODS, AND CONTENT DISTRIBUTION SYSTEM FOR TRANSCODING BITSTREAMS USING FIRST AND SECOND TRANSCODERS

INTEGRATED DEVICE TECHNOL...

1. An apparatus comprising:
an interconnect configured to (i) provide encoded video data from an encoder to a decoder and (ii) receive an input bitstream including the encoded video data from the encoder, the interconnect comprising:
a communication link;
a first transcoder configured to (i) detect a type of lossless coding methodology used to generate the input bitstream, (ii) generate intermediate video data by (a) transcoding the encoded video data using a second lossless coding methodology responsive to detecting that the input bitstream is generated using a first lossless coding methodology, (b) copying the encoded video data responsive to detecting that the input bitstream is generated using the second lossless coding methodology and (c) copying the encoded video data responsive to detecting that the input bitstream is generated using a third coding methodology and (iii) transmit the intermediate video data in the communication link, wherein the second lossless coding methodology is different than the third coding methodology; and
a second transcoder located proximate to the decoder and in a different facility remote from the first transcoder, and configured to (i) receive the intermediate video data from the communication link, (ii) receive a signal from the decoder, (iii) detect a type of lossless coding methodology used to generate the intermediate video data and (iv) generate output video data by (a) transcoding the intermediate video data to recreate the input bitstream with the first lossless coding methodology responsive to the signal indicating that the decoder is capable of decoding with the first lossless coding methodology, (b) copying the intermediate video data with the second lossless coding methodology responsive to the signal indicating that the decoder is capable of decoding with the second lossless coding methodology and (c) skipping the transcoding of the intermediate video data responsive to determining that both (1) the intermediate video data is generated using the third coding methodology and (2) the signal indicates that the decoder is capable of decoding with the third coding methodology.

US Pat. No. 10,341,672

METHOD AND SYSTEM FOR MEDIA SYNCHRONIZATION

KOREA ADVANCED INSTITUTE ...

1. A method for media synchronization, comprising:
collecting stream source information;
generating network delay information between stream sources by performing a delay test between the stream sources;
setting synchronization information of a stream source corresponding to a specific channel based on the collected stream source information and the network delay information;
measuring network delay with at least one user terminal to receive the stream source;
updating the synchronization information based on the measured network delay; and
performing time synchronization with the at least one user terminal based on a time clock of the at least one user terminal, comprising:
requesting the time clock of a corresponding terminal from each of a plurality of user terminals when the plurality of user terminals requests to provide the stream source;
receiving the time clock of a corresponding terminal from each of the plurality of user terminals in response to the requesting of the time clock; and
performing time synchronization between the plurality of user terminals based on the received time clock, comprising:
generating a common time stamp based on the time clock of the corresponding terminal and identifier information of the corresponding terminal; and
providing a stream inserted with the generated common time stamp to each of the plurality of user terminals.

US Pat. No. 10,341,671

METHOD AND SYSTEM FOR IMAGE COMPRESSION

XEROX Corporation, Norwa...

1. An image compression method comprising:
compressing an input image with a first compression method to generate a first compressed image;
compressing the same input image with a second compression method to generate a second compressed image;
comparing the generated first compressed image and the generated second compressed image and, based on the comparison, generating:
a first residual layer comprising first pixels corresponding to foreground pixels that are present in only the second of the first and second compressed images, and
a second residual layer comprising second pixels corresponding to foreground pixels that are present in only the first of the first and second compressed images;
identifying connected components, where present, in at least one of the first and second residual layers, each connected component comprising a group of first or second pixels in the respective first or second residual layer that, when mapped to the second compressed image, is connected, in at least one of first and second directions, to foreground pixels in the second compressed image;
generating an output compressed image comprising at least one of:
for a connected component identified in the first residual layer, removing at least one corresponding foreground pixel from the second compressed image, and
for a connected component identified in the second residual layer, adding at least one corresponding foreground pixel to the second compressed image.

US Pat. No. 10,341,670

VIDEO ENCODER BIT RATE STABILIZATION

AMAZON TECHNOLOGIES, INC....

1. A method of adjusting a bit rate of a portion of a video stream, the method comprising:
determining a first frame of the video stream to be encoded and sent over a network to a recipient computing device;
determining a first quantization value of an encoder, wherein the first quantization value was used to encode a previous frame of the video stream, prior to the first frame;
determining a first estimated compressed frame size of the first frame when encoded with the first quantization value;
determining that the first estimated compressed frame size is less than a target frame size tolerance band, wherein the target frame size tolerance band represents a range of frame sizes suitable to maintain a target bit rate of the video stream;
determining a second quantization value, wherein the second quantization value is less than the first quantization value;
determining a second estimated compressed frame size of the first frame when encoded with the second quantization value;
determining that the second estimated compressed frame size is within the target frame size tolerance band;
generating a compressed first frame by encoding the first frame of the video stream with the second quantization value; and
sending the compressed first frame over the network to the recipient computing device.
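The rate-stabilization step in this claim amounts to: if the previous quantization value would yield a frame below the target size tolerance band, lower the quantization value until the estimated size lands in (or reaches) the band. A minimal sketch; the toy size estimator is a stand-in assumption, not the patent's size model.

```python
def choose_quantization(prev_q, estimate_size, band_low, band_high):
    """Lower the quantization value while the estimated compressed frame
    size is below the tolerance band (a smaller Q spends more bits and
    produces a larger frame)."""
    q = prev_q
    while estimate_size(q) < band_low and q > 1:
        q -= 1
    return q

# Toy estimator: frame size inversely proportional to Q (an assumption).
new_q = choose_quantization(40, lambda q: 12000 // q, 500, 1000)
```

With the toy estimator, Q drops from 40 until the estimated size of 12000 // Q first reaches the 500-byte floor of the band.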

US Pat. No. 10,341,669

TEMPORALLY ENCODING A STATIC SPATIAL IMAGE

Intel Corporation, Santa...

1. A system for temporally encoding static spatial images, the system comprising:
electronic circuitry; and
a memory including instructions that, when executed by the electronic circuitry, cause the electronic circuitry to:
obtain a static spatial image, the static spatial image defining pixel values over an area;
select a scan path, the scan path defining:
a path across the area of the static spatial image; and
a duration path, the duration path being a non-linear function that defines progression of the scan path in time;
scan a window in accordance with the scan path on the static spatial image to produce changes in a portion of the window over time; and
record the changes in the portion of the window with respective times of the changes.
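The scan-path idea can be sketched as follows: a spatial path maps a progress value to a position, a non-linear duration function maps time to progress, and only changes in the window are recorded with their times. The one-row image, the quadratic duration function, and the left-to-right path are all illustrative assumptions.

```python
def scan_static_image(image, path, duration, steps):
    """`path` maps progress in [0, 1] to an (x, y) position; `duration`
    is the non-linear time -> progress function that paces the scan."""
    events = []
    prev = None
    for k in range(steps + 1):
        t = k / steps                   # time advances linearly
        x, y = path(duration(t))        # progress along the path does not
        value = image[y][x]
        if value != prev:               # record only changes, with time
            events.append((t, value))
            prev = value
    return events

image = [[0, 0, 1, 1]]                      # pixel values over a 1x4 area
path = lambda p: (min(3, int(p * 4)), 0)    # left-to-right across row 0
events = scan_static_image(image, path, lambda t: t * t, 4)
```

Because the quadratic duration function starts slowly, the window lingers over the left pixels and the 0-to-1 transition is recorded late in the scan rather than at its spatial midpoint.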

US Pat. No. 10,341,668

CODING OF SIGNIFICANCE MAPS AND TRANSFORM COEFFICIENT BLOCKS

GE VIDEO COMPRESSION, LLC...

1. An apparatus for decoding a transform coefficient block encoded in a data stream, comprising:
a decoder configured to extract, from the data stream, syntax elements via context-based entropy decoding, wherein each of the syntax elements indicates whether a significant transform coefficient is present at a corresponding position within the transform coefficient block, and extract information indicating values of the significant transform coefficients within the transform coefficient block; and
an associator configured to associate each of the syntax elements with the corresponding position within the transform coefficient block in a scan order,
wherein the decoder is configured to use, for context-based entropy decoding of at least one syntax element of the syntax elements, a context which is selected for the at least one syntax element based on a size of the transform coefficient block, a position of the at least one syntax element within the transform coefficient block, and information regarding prior syntax elements previously extracted from a neighborhood of the position of the at least one syntax element, wherein contexts for different syntax elements are selected based on different combinations of the size of the transform coefficient block, the position of the respective syntax element, and the information regarding the respective prior syntax element previously extracted.

US Pat. No. 10,341,667

ADAPTIVE PARTITION CODING

GE VIDEO COMPRESSION, LLC...

1. A non-transitory computer-readable medium for storing data associated with a video, comprising:
a data stream stored in the non-transitory computer-readable medium, the data stream comprising encoded information of a reference block of a texture picture of the video, wherein the reference block is co-located to a block of a depth map of the video, wherein the block of the depth map is decoded using a plurality of operations including:
reconstructing the reference block of the texture picture based on the encoded information from the data stream;
determining a texture threshold based on sample values of the reconstructed reference block of the texture picture;
segmenting the reference block of the texture picture by thresholding the texture picture within the reference block using the texture threshold to acquire a bi-segmentation of the reference block into first and second portions of the reference block;
spatially transferring the bi-segmentation of the reference block of the texture picture onto the block of the depth map so as to acquire first and second portions of the block of the depth map; and
decoding the block of the depth map in units of the first and second portions.
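The bi-segmentation step can be sketched in a few lines: derive a threshold from the reconstructed texture samples, split the texture block into two portions, and reuse that mask for the co-located depth block. Using the sample mean as the threshold is an illustrative assumption; the claim only requires that the threshold be based on the reconstructed samples.

```python
def bisegment(texture_block):
    """Threshold a reconstructed texture block by its sample mean to get
    a bi-segmentation mask (True = first portion, False = second)."""
    flat = [s for row in texture_block for s in row]
    threshold = sum(flat) / len(flat)   # texture threshold from samples
    return [[s >= threshold for s in row] for row in texture_block]

texture = [[10, 10, 200, 200],
           [10, 10, 200, 200]]
mask = bisegment(texture)
# The same mask, spatially transferred, partitions the co-located depth
# block into first and second portions, decoded separately:
portion_sizes = (sum(v for row in mask for v in row),
                 sum(not v for row in mask for v in row))
```

Here the mean (105) cleanly separates the dark and bright halves, so each portion of the depth block covers four samples.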

US Pat. No. 10,341,666

METHOD AND DEVICE FOR SHARING A CANDIDATE LIST

Electronics and Telecommu...

1. A method for encoding a video signal comprising:
deriving, based on a position and a size of a coding block, at least one merging candidate relating to a first prediction block, the coding block comprising the first prediction block and a second prediction block when a size of a block to which parallel merge processing is applicable is equal to or greater than a first size and a size of the coding block is equal to a second size;
generating a first merging candidate list for the first prediction block based on the merging candidate; and
encoding motion information of the first prediction block based on the generated first merging candidate list,
wherein the first merging candidate list for the first prediction block is equivalent to a second merging candidate list for the second prediction block included in the coding block with the first prediction block.

US Pat. No. 10,341,665

METHOD OF PROVIDING RANDOM ACCESS FOR VIDEO DATA BASED ON RANDOM ACCESSIBLE P-FRAME

INNODEP Co., LTD., Seoul...

1. A method of providing random access for video data based on random accessible P-frame, wherein the video data includes a series of frame data, the method comprising:
generating at least one I-frame by performing intraframe encoding on specific frame data out of the series of frame data;
generating a series of P-frames by performing interframe encoding on the remaining frame data of the series of frame data with reference to each corresponding previous frame data;
identifying a P-frame out of the series of P-frames as a random accessible P-frame of the video data;
identifying the closest preceding I-frame for the random accessible P-frame;
identifying a reference frame data as a frame data corresponding to a predetermined spacing step from the random accessible P-frame;
generating a random access reference frame for the random accessible P-frame by performing interframe encoding on the reference frame data with reference to the closest preceding I-frame; and
inserting the random access reference frame in user defined fields of header area of video data packets, wherein the video data packets are prepared for transmitting the I-frame and the series of P-frames.

US Pat. No. 10,341,664

CONFIGURABLE INTRA CODING PERFORMANCE ENHANCEMENTS

Intel Corporation, Santa...

1. A computer-implemented method for video coding comprising:
determining, for a current block of video data, processing performance costs for a plurality of intra modes, wherein the processing performance costs are based on one or more reference blocks associated with the plurality of intra modes and a processing order of the one or more reference blocks with respect to the current block;
selecting an intra coding mode for the current block based at least in part on the processing performance costs for the plurality of intra modes; and
encoding the current block into a bitstream based at least in part on the selected intra coding mode.

US Pat. No. 10,341,663

COEFFICIENT CODING HARMONIZATION IN HEVC

SONY CORPORATION, Tokyo ...

1. A decoding device, comprising:
circuitry configured to:
apply, as a condition that diagonal scan is applied to a first transform block of a plurality of transform blocks and a second transform block of the plurality of transform blocks, the diagonal scan to the plurality of transform blocks of a plurality of variable block sizes,
wherein the first transform block is of a first block size of the plurality of variable block sizes and the second transform block is of a second block size of the plurality of variable block sizes, and
wherein 4×4 sub-blocks of both the first transform block and the second transform block are diagonally scanned, and the diagonal scan is applied inside each of the 4×4 sub-blocks; and
apply same multi-level significance map decoding to the first transform block of the first block size and the second transform block of the second block size.

US Pat. No. 10,341,662

METHODS AND SYSTEMS FOR ENTROPY CODER INITIALIZATION

Velos Media, LLC, Dallas...

1. A method for decoding a video bitstream comprising:
decoding, in a slice header associated with a picture, a first syntax element with an integer value indicating a number of a plurality of entropy slices defining a first slice, wherein each of the entropy slices contains a plurality of largest coding units (LCUs);
decoding a second syntax element in the slice header indicating an offset with an index i, wherein the index i has a range from 0 to the integer value of the first syntax element minus 1 and the offset indicates, in a unit of bytes, a distance between (i) one of the plurality of the entropy slices in the first slice in the video bitstream and (ii) an entropy slice preceding the one of the plurality of the entropy slices in the video bitstream;
decoding a third syntax element in the slice header indicating a slice type of the first slice;
in circumstances where the third syntax element indicates the slice type of the first slice is a B slice, decoding a flag in the slice header indicating an initialization method of a Context-Adaptive Binary Arithmetic Coding (CABAC) context;
in circumstances where the decoded flag indicates a first value, initializing the CABAC context using a first initialization method at the first LCU of each of the plurality of entropy slices in the B slice;
in circumstances where the decoded flag indicates a second value, initializing the CABAC context using a second initialization method at the first LCU of each of the plurality of entropy slices in the B slice; and
initializing the CABAC context of a P slice using at least one of the first initialization method and the second initialization method.
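The initialization logic of this claim reduces to a small decision table: for a B slice, a slice-header flag selects one of two CABAC context initialization methods, applied at the first LCU of each entropy slice; a P slice uses one of the same two methods. A hedged sketch; the method identifiers and the choice of which method a P slice reuses are illustrative assumptions.

```python
def cabac_init_method(slice_type, cabac_init_flag):
    """Pick the CABAC context initialization method for a slice."""
    if slice_type == "B":
        # the decoded flag selects first or second initialization method
        return "init_method_1" if cabac_init_flag == 0 else "init_method_2"
    if slice_type == "P":
        return "init_method_1"   # P slice reuses one of the two methods
    return "init_intra"          # I slices are outside the claim

def init_entropy_slices(slice_type, cabac_init_flag, num_entropy_slices):
    """One initialization at the first LCU of each entropy slice."""
    method = cabac_init_method(slice_type, cabac_init_flag)
    return [method] * num_entropy_slices
```

Every entropy slice in the B slice is initialized the same way; the flag changes which way that is, not how many initializations occur.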

US Pat. No. 10,341,661

METHOD AND DEVICE FOR ENCODING/DECODING IMAGES

Electronics and Telecommu...

1. A method of encoding a video using an encoding apparatus, the method comprising:
obtaining, using the encoding apparatus, transform coefficients of a current block by performing inverse-quantization on quantized transform coefficients of the current block;
obtaining, using the encoding apparatus, a residual sample of the current block by performing an inverse-transform on the transform coefficients of the current block based on a transform type of the current block;
obtaining, using the encoding apparatus, a prediction sample of the current block;
reconstructing, using the encoding apparatus, a reconstructed sample of the current block using the residual sample and the prediction sample; and
encoding, using the encoding apparatus, a first information indicating whether the residual sample for the current block is present and a second information indicating whether the inverse-transform is performed on the residual sample of the current block,
wherein the transform type is determined to be a Discrete Cosine Transform (DCT) or a Discrete Sine Transform (DST),
wherein in response to a size of the current block not being equal to 4×4, the transform type is determined to be the DCT, and
wherein the transform type is determined independently of an intra prediction mode of the current block.

US Pat. No. 10,341,660

VIDEO COMPRESSION APPARATUS AND VIDEO PLAYBACK APPARATUS

Kabushiki Kaisha Toshiba,...

1. A video compression apparatus comprising:
a first compressor configured to compress a first video in accordance with a first target bit rate to generate a first bitstream;
a second compressor configured to set regions in a second video and compress the regions in accordance with a second target bit rate larger than the first target bit rate so as to enable each region to be independently decoded, to generate a second bitstream;
a partitioner configured to partition the second bitstream according to the set regions to obtain a partitioned second bitstream; and
a communicator configured to receive region information indicating a specific region that corresponds to one or more regions and select and transmit a bitstream corresponding to the specific region from the partitioned second bitstream, wherein the region information is generated so that the specific region is selected in descending order of priority in each region, a first priority in a first region is higher than a second priority in a second region, and a first distance from the first region to one or more request regions requested by a user is smaller than a second distance from the second region to the request regions.

US Pat. No. 10,341,659

SYSTEMS AND METHODS OF SWITCHING INTERPOLATION FILTERS

QUALCOMM Incorporated, S...

14. An apparatus, comprising:
a memory configured to store video data; and
a processor configured to:
obtain the video data;
determine, for a coding unit, a subset of interpolation filters from a set of interpolation filters, the subset of interpolation filters including a plurality of interpolation filters, wherein the subset of interpolation filters is determined based on information, in the video data, associated with the coding unit;
encode the coding unit, wherein the encoding includes selecting an interpolation filter for motion estimation and motion compensation, and wherein the interpolation filter is selected from the subset of interpolation filters; and
generate an encoded video bitstream, wherein the encoded video bitstream includes the encoded coding unit.

US Pat. No. 10,341,658

MOTION, CODING, AND APPLICATION AWARE TEMPORAL AND SPATIAL FILTERING FOR VIDEO PRE-PROCESSING

Intel Corporation, Santa...

1. A computer-implemented method for video coding comprising:
applying adaptive temporal and spatial filtering to pixel values of video frames of input video to generate pre-processed video, wherein the adaptive temporal and spatial filtering comprises, for an individual pixel value of a block of pixels of an individual video frame of the input video:
spatial-only filtering the individual pixel value when the block is a motion block; and
blending spatial and temporal filtering of the individual pixel value when the block is a non-motion block by determining a spatial filtering output value and a temporal filtering output value for the individual pixel and generating a weighted average of the spatial and temporal filtering output values, wherein determining the temporal filtering output value comprises determining a previous pixel filtering weight for the individual pixel value and generating a weighted average of a previous pixel value from a second video frame and the individual pixel value based on the previous pixel filtering weight, wherein the previous pixel value is co-located with the individual pixel value, and wherein the previous pixel filtering weight is determined using a monotonic increasing function of a quantization parameter, a global noise level, and a visual index corresponding to the individual pixel;
encoding the pre-processed video to generate a video bitstream; and
storing the video bitstream.
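The per-pixel filtering rule above can be condensed into a short sketch: motion blocks get spatial-only filtering, while non-motion blocks blend a spatial output with a temporal output that is itself a weighted average of the co-located previous pixel and the current pixel. The specific weight function and the 50/50 blend below are illustrative assumptions; the claim only requires the previous-pixel weight to increase monotonically with the quantization parameter, global noise level, and visual index.

```python
def temporal_output(curr, prev, weight):
    """Weighted average of the co-located previous pixel and current pixel."""
    return weight * prev + (1.0 - weight) * curr

def filter_pixel(curr, prev, spatial, is_motion, qp, noise, visual_idx):
    if is_motion:
        return spatial                      # motion block: spatial-only
    # previous-pixel weight: a monotonic increasing function of QP,
    # global noise level, and visual index (form chosen for illustration)
    w = min(0.9, 0.01 * (qp + noise + visual_idx))
    t = temporal_output(curr, prev, w)
    return 0.5 * spatial + 0.5 * t          # blend spatial and temporal
```

Raising any of the three inputs pulls the temporal output closer to the previous frame's pixel, which is exactly the noise-suppressing behavior the claim describes for static content.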

US Pat. No. 10,341,657

SYSTEM AND METHOD FOR MITIGATING MOTION ARTIFACTS IN A MEDIA STREAMING NETWORK

Telefonaktiebolaget LM Er...

1. A media processing method operative at a network node, the method comprising:
separating a video component and an audio component from an incoming source media input;
determining static object grid (SOG) coordinate information for still areas identified in the video component;
encoding the video component at different bitrates to generate a plurality of adaptive bitrate (ABR) representations of the video component;
scaling the SOG coordinate information with respect to each of the bitrate representations of the video component;
encoding the audio component to generate an encoded audio stream; and
multiplexing each bitrate representation of the video component with corresponding scaled SOG coordinate information and the encoded audio stream to generate a plurality of multiplexed media outputs for distribution to one or more subscriber stations.

US Pat. No. 10,341,656

IMAGE DECODING METHOD USING INTRA PREDICTION MODE

INFOBRIDGE PTE. LTD., Si...

1. A method of image decoding, the method comprising:
generating a quantization block by inversely scanning quantization coefficient information;
generating a transform block by inversely quantizing the quantization block using a quantization step size;
generating a residual block by inversely transforming the transform block;
reconstructing an intra prediction mode group indicator and a prediction mode index of a current block;
constructing a first group including three intra prediction modes using valid intra prediction modes of left and top blocks of the current block;
determining an intra prediction mode corresponding to the prediction mode index in the first group as an intra prediction mode of the current block when the intra prediction mode group indicator indicates the first group;
generating a prediction block on the basis of the determined intra prediction mode of the current block; and
generating a reconstructed block using the residual block and the prediction block,
wherein when only one of the intra prediction modes of the left and top blocks of the current block is available, two intra prediction modes are added to the first group,
wherein when the intra prediction mode of the left block is not equal to the intra prediction mode of the top block, and the intra prediction mode of the left block and the intra prediction mode of the top block are planar mode and DC mode, the first group includes the intra prediction modes of the left and top blocks and a vertical mode, and
wherein a lowest prediction mode index is assigned to the planar mode when the intra prediction mode of the current block does not belong to the first group and the planar mode is not included in the first group.
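The first-group construction in this claim resembles HEVC's most-probable-mode derivation. A minimal Python sketch of the cases the claim states follows; the numeric mode values and the padding rule for cases the claim leaves unstated are illustrative assumptions, not the patented derivation:

```python
PLANAR, DC, VERTICAL = 0, 1, 26  # HEVC-style mode numbers (assumed)

def build_first_group(left_mode, top_mode):
    """Construct a three-mode group from the valid left/top intra modes.

    Only the cases stated in the claim are modeled; remaining slots are
    filled with default modes as a placeholder assumption.
    """
    modes = []
    if left_mode is not None and top_mode is not None:
        if left_mode != top_mode and {left_mode, top_mode} == {PLANAR, DC}:
            # Claim: group is {left mode, top mode, vertical} in this case.
            modes = [left_mode, top_mode, VERTICAL]
        elif left_mode == top_mode:
            modes = [left_mode]
        else:
            modes = [left_mode, top_mode]
    elif left_mode is not None or top_mode is not None:
        # Claim: when only one neighbor mode is available,
        # two more modes are added to reach three.
        modes = [left_mode if left_mode is not None else top_mode]
    # Placeholder fill (assumption): pad with planar, DC, vertical.
    for m in (PLANAR, DC, VERTICAL):
        if len(modes) == 3:
            break
        if m not in modes:
            modes.append(m)
    return modes
```

For example, a planar left mode and a DC top mode yield a group of planar, DC, and vertical, exactly as the claim requires.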

US Pat. No. 10,341,655

HEVC ENCODING DEVICE AND METHOD FOR DETERMINING INTRA-PREDICTION MODE USING THE SAME

AJOU UNIVERSITY INDUSTRY-...

1. A high efficiency video coding (HEVC) encoding device for determining an intra-prediction mode of an image, the HEVC encoding device comprising:a candidate group updater configured to select a plurality of representative modes as a candidate group from among intra-prediction modes and update the candidate group using a plurality of minimum modes selected from the candidate group, the plurality of representative modes representing a range where there is an optimal mode; and
an optimal mode selector configured to select any one mode as an optimal mode from among a plurality of minimum modes selected from the updated candidate group,
wherein, upon the penultimate update of the candidate group, the candidate group updater updates a candidate group before the penultimate update by using a DC mode and a planar mode in addition to a plurality of minimum modes selected from the candidate group before the penultimate update,
wherein the candidate group updater updates the candidate group by adding and subtracting variable mode values to/from each of the plurality of minimum modes, while proceeding from a second update of the candidate group to a penultimate update of the candidate group,
wherein the variable mode value is decreased by a predetermined ratio as a number of update repetitions increases.
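The iterative candidate-group update described in this claim amounts to a coarse-to-fine search over angular modes: keep the minimum-cost modes, add modes offset by a variable value, and shrink that value each round. The cost model, mode range, and shrink schedule in this Python sketch are illustrative assumptions:

```python
def refine_intra_mode(costs, initial_candidates, delta=8, ratio=0.5, num_min=2):
    """Iteratively narrow an intra-mode candidate group (illustrative only).

    costs: dict mapping mode -> rate-distortion cost (assumed precomputed).
    Each round keeps the `num_min` cheapest modes, then adds modes at
    +/- delta around them; delta shrinks by `ratio` each round, matching
    the claim's variable mode value decreased by a predetermined ratio.
    """
    candidates = set(initial_candidates)
    while delta >= 1:
        # Keep the minimum-cost modes from the current candidate group.
        mins = sorted(candidates, key=lambda m: costs.get(m, float("inf")))[:num_min]
        candidates = set(mins)
        for m in mins:
            for neighbor in (m - int(delta), m + int(delta)):
                if 2 <= neighbor <= 34:  # HEVC angular mode range (assumed)
                    candidates.add(neighbor)
        delta *= ratio
    # Claim: DC and planar join the group before the final selection.
    candidates |= {0, 1}
    return min(candidates, key=lambda m: costs.get(m, float("inf")))
```

Starting from a few widely spaced representative modes, the search converges on the neighborhood of the optimal mode without evaluating all 35 HEVC intra modes.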

US Pat. No. 10,341,654

COMPUTING DEVICE FOR CONTENT ADAPTIVE VIDEO DECODING

1. A non-transitory computer-readable device having instructions stored which, when executed by a processor, cause the processor to perform operations for decoding a bitstream encoded via a plurality of encoders, the operations comprising:identifying, from the bitstream, a first portion of a video covering a first period of time in the video and a second portion of the video covering a second period of time in the video, wherein each of the first portion of the video and the second portion of the video comprises a plurality of entire video frames;
identifying the first portion of the video covering the first period of time as comprising a first degree of action associated with an action scene;
identifying the second portion of the video covering the second period of time as comprising a second degree of action associated with a slow scene;
identifying, for the first portion of the video, a first model chosen from a plurality of predefined models based on the first portion of the video being the action scene, wherein each of the plurality of predefined models comprises a coding model associated with a set of composition features used for generating one of the first portion or the second portion and wherein the coding model defines a coding tool for encoding and decoding a portion of the bitstream comprising the set of composition features;
identifying, for the second portion of the video, a second model chosen from the plurality of predefined models based on the second portion of the video being the slow scene;
routing the first portion of the video to a first decoder of a plurality of decoders based on the first model;
decoding the first portion of the video by the first decoder according to the first model;
routing the second portion of the video to a second decoder based on the second model, wherein the plurality of decoders comprises a generic decoder and wherein the routing of each of the first portion and the second portion of the video to one of the plurality of decoders comprises routing one of the first portion or the second portion to the generic decoder when one of the first portion or the second portion is associated with a generic model and when content of one of the first portion of the video or the second portion of the video does not match a predetermined model; and
decoding the second portion of the video by the second decoder according to the second model, wherein the first model and the second model are each a different model.

US Pat. No. 10,341,653

METHOD AND SYSTEM FOR REDUCING SLICE HEADER PARSING OVERHEAD IN VIDEO CODING

Texas Instruments Incorpo...

1. A method for decoding a slice of picture from a bit stream when weighted prediction is enabled for the slice, the method comprising:decoding a first sequence of luminance weight flags corresponding to a first plurality of reference pictures from the bit stream;
decoding a first sequence of chrominance weight flags corresponding to the first plurality of reference pictures from the bit stream, wherein the encoded first sequence of chrominance weight flags follows the encoded first sequence of luminance weight flags in the bit stream;
decoding luminance weighting factors for each luminance weight flag of the first sequence of luminance weight flags that is set to indicate weighted prediction of a luminance component of a corresponding reference picture is enabled, wherein the encoded luminance weighting factors follow the encoded first sequence of chrominance weight flags in the bit stream; and
decoding chrominance weighting factors for each chrominance weight flag of the first sequence of chrominance weight flags that is set to indicate weighted prediction of chrominance components of a corresponding reference picture is enabled, wherein the encoded chrominance weighting factors follow the encoded luminance weighting factors in the bit stream.
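The essential point of this claim is syntax ordering: all luminance flags precede all chrominance flags, and all weighting factors follow both flag sequences, so a decoder can parse each group in a single pass. A hedged sketch over pre-parsed syntax values (a simplification; a real decoder reads entropy-coded bins, and chrominance carries separate Cb and Cr factors):

```python
def parse_weight_tables(values, num_refs):
    """Parse weighted-prediction syntax in the claimed order (sketch).

    `values` is an iterable of pre-parsed syntax elements (an assumption).
    Order per the claim: all luma flags, then all chroma flags, then luma
    factors for set luma flags, then chroma factors for set chroma flags.
    """
    it = iter(values)
    luma_flags = [next(it) for _ in range(num_refs)]
    chroma_flags = [next(it) for _ in range(num_refs)]
    luma_weights = {i: next(it) for i, f in enumerate(luma_flags) if f}
    # Cb/Cr are modeled as one value per reference here, for brevity.
    chroma_weights = {i: next(it) for i, f in enumerate(chroma_flags) if f}
    return luma_flags, chroma_flags, luma_weights, chroma_weights
```

Grouping the flags ahead of the factors is what reduces slice-header parsing overhead: the decoder never interleaves flag parsing with factor parsing.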

US Pat. No. 10,341,652

IMAGE CODING METHOD, IMAGE DECODING METHOD, IMAGE CODING APPARATUS, IMAGE DECODING APPARATUS, AND IMAGE CODING AND DECODING APPARATUS

SUN PATENT TRUST, New Yo...

1. An image coding apparatus, comprising:a processor; and
a memory storing thereon a computer program, which when executed by the processor, causes the processor to perform operations including:
performing Sample Adaptive Offset (SAO) processing on a luminance signal, a chrominance Cb signal, and a chrominance Cr signal which are included in a target block which is locally decoded;
performing arithmetic coding on a first flag indicating whether or not an SAO parameter for the target block is identical to an SAO parameter for a left neighboring block immediately left of the target block, the SAO parameter for the target block indicating details of the SAO processing;
performing arithmetic coding on the SAO parameter for the target block, when the SAO parameter for the target block is different from the SAO parameter for the left neighboring block; and
performing arithmetic coding on a second flag indicating whether or not the SAO parameter for the target block is identical to an SAO parameter for an upper neighboring block immediately above the target block,
wherein, in the performing of the arithmetic coding on the first flag, the arithmetic coding is performed on the first flag for the luminance signal, the chrominance Cb signal, and the chrominance Cr signal by using only a single first context,
in the performing of the arithmetic coding on the first flag and the performing of the arithmetic coding on the second flag, a same context determination method is used to determine both: the single first context to be used in the arithmetic coding on the first flag; and a single second context to be used in the arithmetic coding on the second flag,
the same context determination method is used to determine each of the single first context and the single second context to be shared for luminance signals, chrominance Cb signals, and chrominance Cr signals which are included in a same picture,
the single first context is used in common for coding a plurality of the first flag in a same picture,
the single second context is used in common for coding a plurality of the second flag in a same picture, and
in the performing of the SAO processing,
each of pixels included in the target block is classified to one of categories,
each of the pixels is added with an offset value corresponding to the classified one of the categories, and
the SAO parameter includes: information indicating a method of classifying to the categories; and information indicating the offset value.

US Pat. No. 10,341,651

IMAGE CODING METHOD, DECODING METHOD, CODING DEVICE AND DECODING DEVICE

TONGJI UNIVERSITY, Shang...

1. An image coding method comprising:determining a coding mode of a coding block; and
performing hybrid coding on the coding block using a plurality of coding modes, comprising performing coding on pixel sample segments in the coding block using one of two coding modes which are palette coding and string copy coding,
wherein, the coding block is a coding region of an image, comprising at least one of the following: a largest coding unit, LCU, a coding tree unit, CTU, a coding unit, CU, a sub-region of the CU, a prediction unit, PU, a transform unit, TU, and an asymmetric partition, AMP,
the pixel sample segments comprise any one of the following: a pixel, a pixel component, and a pixel index; wherein
when performing coding on any one of the pixel sample segments in the coding block using palette coding, the method comprises:
constructing or acquiring a palette and performing palette coding on the pixel sample segments to generate palette parameters related to palette decoding;
when performing coding on any one of the pixel sample segments in the coding block using string copy coding, the method comprises:
performing string copy coding on any one of the pixel sample segments to generate copy parameters related to string copy coding, and obtaining a string of reference pixel samples matching the pixel sample segments from a set of reconstructed reference pixel samples according to a copy path shape mode of the string copy coding of the coding block.

US Pat. No. 10,341,650

EFFICIENT STREAMING OF VIRTUAL REALITY CONTENT

ATI TECHNOLOGIES ULC, Ma...

1. A method of processing Virtual Reality (VR) content, the method comprising:receiving tracking information including at least one of user position information and eye gaze point information;
using one or more processors to:
predict, based on the user tracking information, a user viewpoint of a next frame of a sequence of frames including video data to be displayed,
estimate, for a video portion in a previously encoded frame, a corresponding location of the video portion in the next frame based on the user tracking information, wherein the video portion in the previously encoded frame is encoded using a first encoding mode;
render the video portion in the next frame to be displayed at the estimated corresponding location in the next frame;
identify, based on the estimated corresponding location of the video portion in the next frame, the video portion in the previously encoded frame;
encode the video portion in the next frame using the first encoding mode; and
encode another portion of the next frame using a second encoding mode determined from a prediction mode map.

US Pat. No. 10,341,648

AUTOMATED DETECTION OF PROBLEM INDICATORS IN VIDEO OF DISPLAY OUTPUT

Amazon Technologies, Inc....

1. A system comprising:an electronic data store that stores a digital video, wherein the digital video comprises a plurality of frames, wherein the digital video comprises a recording of display output from a computing device; and
a hardware processor in communication with the electronic data store, the hardware processor configured to execute computer-executable instructions to at least:
select a set of frames from the digital video;
for each of a plurality of frame pairings within the set of frames, determine a pixel change value for each of a plurality of pixel locations, wherein the pixel change value for each pixel location represents a mathematical difference between an intensity value at the pixel location in a first frame of the frame pairing and an intensity value at the pixel location in a second frame of the frame pairing;
determine adjusted pixel change values for each of the plurality of pixel locations, wherein the adjusted pixel change value for each individual pixel location comprises the lowest pixel change value determined within a window that includes the individual pixel location and one or more adjacent pixel locations;
determine, for each pixel location, a final pixel change value for the set of frames, wherein the final pixel change value for each individual pixel location comprises the highest adjusted pixel change value determined for the individual pixel location in any frame pairing within the set of frames;
generate a target motion score representing motion in a target area over the set of frames, wherein the target area is smaller than a size of each frame within the set of frames and has a fixed position across each of the set of frames, wherein the target motion score is based on the highest final pixel change value determined for any pixel location within the target area;
generate a peripheral motion score representing motion outside of the target area over the set of frames, wherein the peripheral motion score is based on the highest final pixel change value determined for any pixel location outside of the target area;
determine that isolated motion occurred within the target area over the set of frames based at least in part on a comparison of the target motion score to the peripheral motion score;
based at least in part on the final pixel change values of pixel locations within the target area, identify that a shape of the motion within the target area matches a target motion shape associated with a problem indicator; and
store an indication that a problem indicator appeared in the digital video during at least a portion of the set of frames.
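The per-pixel pipeline in this claim (absolute difference, windowed minimum, per-location maximum over pairings, then region scores) can be sketched in Python as below; treating the "frame pairings" as consecutive pairs, and the target area as a rectangle, are assumptions for illustration:

```python
def pixel_changes(frame_a, frame_b):
    """Per-pixel absolute intensity difference between two frames."""
    return [[abs(a - b) for a, b in zip(ra, rb)] for ra, rb in zip(frame_a, frame_b)]

def erode_min(changes, radius=1):
    """Adjusted change values: the minimum within a small window around
    each pixel, which suppresses isolated single-pixel noise."""
    h, w = len(changes), len(changes[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = min(changes[j][i]
                            for j in range(max(0, y - radius), min(h, y + radius + 1))
                            for i in range(max(0, x - radius), min(w, x + radius + 1)))
    return out

def final_changes(frames):
    """Final pixel change value: per-pixel maximum of the adjusted change
    values over consecutive frame pairings in the set."""
    h, w = len(frames[0]), len(frames[0][0])
    final = [[0] * w for _ in range(h)]
    for a, b in zip(frames, frames[1:]):
        adj = erode_min(pixel_changes(a, b))
        for y in range(h):
            for x in range(w):
                final[y][x] = max(final[y][x], adj[y][x])
    return final

def motion_scores(final, target):
    """Target score: max final change inside the target rectangle;
    peripheral score: max final change outside it."""
    x0, y0, x1, y1 = target  # inclusive-exclusive bounds (assumption)
    inside = outside = 0
    for y, row in enumerate(final):
        for x, v in enumerate(row):
            if x0 <= x < x1 and y0 <= y < y1:
                inside = max(inside, v)
            else:
                outside = max(outside, v)
    return inside, outside
```

Motion confined to the target area yields a high target score and a zero peripheral score, which is the isolated-motion condition the claim tests before matching the motion shape against a problem indicator.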

US Pat. No. 10,341,647

METHOD FOR CALIBRATING A CAMERA AND CALIBRATION SYSTEM

ROBERT BOSCH GMBH, Stutt...

1. A method for calibrating a dynamic vision sensor (DVS) camera, the method comprising:detecting a two-dimensional imaging trajectory of a moving calibration object by the DVS camera;
detecting a three-dimensional reference trajectory of the moving calibration object by a detection device that determines the reference trajectory on the basis of a plurality of accelerations of the moving calibration object at a plurality of detection times and corresponding calibration object positions, as measured by an acceleration sensor on the moving calibration object and received by the detection device from a transmitter on the moving calibration object, the imaging trajectory representing a trajectory of the calibration object imaged in image coordinates of the DVS camera and the reference trajectory representing the trajectory in world coordinates;
reading in the imaging trajectory and the reference trajectory by an interface device that supplies the imaging trajectory and the reference trajectory to an ascertainment device that includes a processing unit; and
calibrating the DVS camera by the ascertainment device on the basis of the imaging trajectory and the reference trajectory.

US Pat. No. 10,341,646

VARIABLE FOCAL LENGTH LENS SYSTEM WITH OPTICAL POWER MONITORING

Mitutoyo Corporation, Ka...

1. A variable focal length (VFL) lens system, comprising:a VFL lens;
a VFL lens controller that controls the VFL lens to periodically modulate an optical power of the VFL lens over a range of optical powers at an operating frequency;
an objective lens that inputs workpiece light arising from a workpiece surface during a workpiece imaging mode and transmits the workpiece light along an imaging optical path that passes through the VFL lens;
a camera that receives the workpiece light transmitted by the VFL lens along the imaging optical path during the workpiece imaging mode and provides a corresponding workpiece image exposure; and
an optical power monitoring configuration comprising a monitoring beam generator comprising a light source and a beam pattern element that inputs light from the light source and outputs a monitored beam pattern, wherein:
the optical power monitoring configuration transmits the monitored beam pattern along at least a portion of the imaging optical path to travel through the VFL lens to the camera during an optical power monitoring mode;
the camera provides at least a first monitoring image exposure including the monitored beam pattern during at least a first phase timing of the periodic modulation of the VFL lens during the optical power monitoring mode; and
a dimension of the monitored beam pattern in the first monitoring image exposure is related to an optical power of the VFL lens during the first phase timing.

US Pat. No. 10,341,645

BACKLIGHT AND IMAGE DISPLAY DEVICE USING THE SAME

LG DISPLAY CO., LTD., Se...

1. A backlight unit for a display device, comprising:a light path conversion sheet;
a light source above a first side of the light path conversion sheet; and
a reflecting plate above a second side opposite to the first side of the light path conversion sheet, the reflecting plate reflecting light emitted directly from the light source, without any intervening optical members, substantially in parallel with a light travel direction between the light source and a center portion of the reflecting plate,
wherein the light path conversion sheet directs the light reflected from the reflecting plate in a direction substantially perpendicular to the light travel direction between the light source and the center portion of the reflecting plate.

US Pat. No. 10,341,644

OMNISTEREO CAPTURE AND RENDER OF PANORAMIC VIRTUAL REALITY CONTENT

GOOGLE LLC, Mountain Vie...

1. A system comprising:at least one processor;
memory storing instructions that, when executed by the at least one processor, cause the system to perform operations including:
receiving a set of images based on captured video streams collected from at least one stereo pair of cameras;
calculating optical flow between images from the set of images to generate a plurality of image frames that are not part of the set of images, the calculating of the optical flow including analyzing image intensity fields for selected columns of pixels associated with the set of images;
interleaving the plurality of image frames into the set of images at the respective selected columns of pixels and stitching together a portion of the plurality of image frames and the set of images based at least in part on the optical flow; and
generating, using the portion of the plurality of image frames and the set of images, an omnistereo panorama.

US Pat. No. 10,341,643

PROCESS AND SYSTEM FOR ENCODING AND PLAYBACK OF STEREOSCOPIC VIDEO SEQUENCES

3DN, LLC, Ottawa (CA)

1. A method for displaying images, comprising:in a conventional two-dimensional (2D) viewing mode:
driving a 2D image of a plurality of 2D images to a head mountable display; and
in a stereoscopic three-dimensional (3D) viewing mode:
driving to the head mountable display a left image of a plurality of left images for left eye viewing, wherein at least a portion of the plurality of left images are generated using time interpolation to increase a frame rate of the display, and a right image of a plurality of right images for right eye viewing, the left image and the right image being time-synchronized with parallax for perception of depth in the stereoscopic 3D viewing mode, wherein the driving comprises simultaneous dual presentation of the left image and the right image.

US Pat. No. 10,341,642

DISPLAY DEVICE, CONTROL METHOD, AND CONTROL PROGRAM FOR STEREOSCOPICALLY DISPLAYING OBJECTS

KYOCERA CORPORATION, Kyo...

1. A display device, comprising:a display configured to three-dimensionally display a predetermined object, by displaying images respectively corresponding to both eyes of a user when the display device is worn;
a sensor configured to detect displacement of a real body in a display space of the object; and
a processor configured to
determine a material of the object, and
cause the display to display the object according to the displacement of the real body detected by the sensor and the determined material of the object,
wherein, in response to the sensor detecting movement in which the real body comes in contact with the object at a contact position and then moves away therefrom without maintaining contact with the object, the processor is configured to execute processing corresponding to the contact position of the object.

US Pat. No. 10,341,641

METHOD FOR PERFORMING IMAGE PROCESS AND ELECTRONIC DEVICE THEREOF

Samsung Electronics Co., ...

1. An electronic device comprising:a first image sensor configured to calculate a packet rate output in a dynamic vision sensor and to measure a motion speed of a subject from the packet rate output in the dynamic vision sensor;
a second image sensor configured to be synchronized with a system clock; and
a processor operatively coupled to the first image sensor and the second image sensor, wherein the processor is configured to:
identify at least one subject having a motion based on first information obtained using the first image sensor, the motion determined as a function of a comparison between a current image frame obtained from the first image sensor and a previous image frame obtained from the first image sensor,
determine at least one region of interest (ROI) corresponding to the motion of the at least one subject from the current image frame,
after determining the at least one ROI, obtain second information corresponding to the at least one ROI, using the second image sensor,
identify a motion speed of the subject included in each of the at least one ROI based on the second information,
determine one ROI among the at least one ROI based on the motion speed,
identify a motion of the subject included in the at least one ROI, and
perform a function corresponding to the motion.

US Pat. No. 10,341,640

MULTI-WAVELENGTH PHASE MASK

The Board of Trustees of ...

1. An apparatus comprising:a phase mask configured and arranged with optics in an optical path to provide modification of a shape of light simultaneously for each of a plurality of wavelengths of the light concurrently passed from a plurality of objects, wherein the modification is simultaneous for each of the plurality of wavelengths of the light concurrently passed from the plurality of objects, independent of one another and from the optical path; and
circuitry configured and arranged to characterize a three-dimensional image of the plurality of objects, that are associated with different colors, based on each of the modified shapes of light and the respective wavelengths that are labeled using different colors.

US Pat. No. 10,341,639

AUTOMATICALLY SCANNING AND REPRESENTING AN ENVIRONMENT WITH COLLISION AVOIDANCE

ABB Schweiz AG, Baden (C...

1. A computer-implemented method comprising:obtaining a first representation of an environment employing a 3D camera coupled to a robotic arm based on a first scanning path;
simulating a second scanning path of the 3D camera comprising movements of the robotic arm using the first representation of the environment;
determining the 3D camera cannot reach a portion of the second scanning path so as to complete the second scanning path in response to simulating the second scanning path;
adjusting the second scanning path in response to determining the robotic arm cannot reach the portion of the second scanning path;
obtaining a second representation of the environment employing the 3D camera based on the adjusted second scanning path,
wherein the second representation of the environment is different from the first representation of the environment.

US Pat. No. 10,341,638

METHOD AND APPARATUS OF DEPTH TO DISPARITY VECTOR CONVERSION FOR THREE-DIMENSIONAL VIDEO CODING

MediaTek Inc., Hsin-Chu ...

1. A method for three-dimensional or multi-view video coding, the method comprising:receiving input data associated with a conversion region of a current picture in a current dependent view, wherein the conversion region comprises a grid of pixels;
receiving depth data for the grid of pixels associated with the conversion region;
determining the conversion region is partitioned into multiple motion prediction sub-blocks;
determining a single converted DV (disparity vector) from the depth data associated with the conversion region based on at least (a) first depth data associated with a first motion prediction sub-block from the multiple motion prediction sub-blocks, and (b) second depth data associated with a second motion prediction sub-block from the multiple motion prediction sub-blocks; and
processing each of the multiple motion prediction sub-blocks of the conversion region using the single converted DV.
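A common way to realize the claimed single depth-to-DV conversion is a linear mapping from the maximum depth sample across the sub-blocks (the closest object) to a disparity. The scale/offset model in this Python sketch is an assumption in the spirit of 3D video coding practice, not the patented formula:

```python
def depth_to_disparity(depth_sample, scale, offset, bit_shift=8):
    """Linear depth-sample-to-disparity conversion driven by camera
    parameters (the scale/offset/shift model is an assumption)."""
    return (depth_sample * scale + offset) >> bit_shift

def single_dv_for_region(sub_block_depths, scale=64, offset=128):
    """Derive one disparity vector for a whole conversion region from the
    depth data of its motion-prediction sub-blocks: here, the maximum
    depth sample over all sub-blocks is converted, so every sub-block
    shares the single converted DV."""
    max_depth = max(max(sb) for sb in sub_block_depths)
    return depth_to_disparity(max_depth, scale, offset)
```

Using one converted DV for all sub-blocks, as the claim recites, avoids repeating the conversion per sub-block during motion prediction.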

US Pat. No. 10,341,636

BROADCAST RECEIVER AND VIDEO DATA PROCESSING METHOD THEREOF

LG ELECTRONICS INC., Seo...

1. A method for transmitting a signal in a transmitter, the method comprising:generating the signal including a first video sequence, a second video sequence, and viewpoint information,
wherein the first video sequence includes first video sections having frames of left viewpoint for first scenes and second video sections having frames of right viewpoint for second scenes,
wherein the first video sections and the second video sections are multiplexed in the first video sequence,
wherein the second video sequence includes third video sections having frames of right viewpoint of the first scenes and fourth video sections having frames of left viewpoint for the second scenes, and
wherein the third video sections and the fourth video sections are multiplexed in the second video sequence; and
transmitting the signal;
wherein the viewpoint information specifies whether viewpoints of the frames correspond to a left viewpoint or a right viewpoint.

US Pat. No. 10,341,635

STEREOSCOPIC IMAGING METHOD AND DEVICE

National Chiao Tung Unive...

1. A stereoscopic imaging method for generating, based on a pair of a first image and a second image that respectively correspond to different viewing angles, a stereoscopic image on a display screen for a viewer, said stereoscopic imaging method comprising:acquiring viewer-related information that includes a pupil distance between pupils of the viewer, a first parameter associated with a negative disparity condition, and a second parameter associated with a positive disparity condition;
upon receipt of positional information associated with a convergence position on the display screen at which the viewer is looking, acquiring, by a processor based on the positional information, a convergence disparity value from an original disparity map that corresponds to the first and second images, the convergence disparity value corresponding to a pixel of the display screen at the convergence position;
generating a disparity transformation model by the processor based on at least the convergence disparity value and the viewer-related information;
transforming, by the processor, the original disparity map into a transformed disparity map based on the disparity transformation model; and
synthesizing, by the processor, the first image and the second image into the stereoscopic image based on the transformed disparity map.

US Pat. No. 10,341,634

METHOD AND APPARATUS FOR ACQUIRING IMAGE DISPARITY

SAMSUNG ELECTRONICS CO., ...

1. A method of acquiring an image disparity by one or more hardware processors, the method comprising:acquiring, from dynamic vision sensors, a first image having a first view of an object and a second image having a second view of the object;
calculating a cost within a preset disparity range of an event of the first image and a corresponding event of the second image, wherein the event of the first image and the corresponding event of the second image are generated when an intensity of lighting is greater than a preset threshold;
calculating an intermediate disparity of the event of the first image and an intermediate disparity of the event of the second image based on the cost;
determining whether the event of the first image is a matched event based on the intermediate disparity of the event of the first image and the intermediate disparity of the event of the second image; and
in response to the event of the first image being determined as the matched event, predicting optimal disparities of all events of the first image based on the intermediate disparity of the event of the first image.

US Pat. No. 10,341,633

SYSTEMS AND METHODS FOR CORRECTING ERRONEOUS DEPTH INFORMATION

QUALCOMM Incorporated, S...

7. A method for generating a corrected depth map by an electronic device, comprising:obtaining a first depth map, wherein the first depth map comprises first depth information of a first portion of a scene sampled by a depth sensor at a first sampling;
obtaining a second depth map, wherein the second depth map comprises second depth information of a second portion of the scene sampled by the depth sensor at a second sampling;
obtaining displacement information indicative of a displacement of the depth sensor between the first sampling and the second sampling;
transforming the first depth map based on the displacement information to produce a transformed depth map;
determining a spatio-temporal interpolation between at least two depths of the transformed depth map and at least two depths of the second depth map;
detecting erroneous depth information by comparing one or more depths of the second depth map with a value that is based on at least the spatio-temporal interpolation and a threshold; and
generating a corrected depth map by correcting the erroneous depth information of the second depth map based on the transformed depth map.

US Pat. No. 10,341,632

SPATIAL RANDOM ACCESS ENABLED VIDEO SYSTEM WITH A THREE-DIMENSIONAL VIEWING VOLUME

1. A method for displaying an environment from a viewpoint, the method comprising:at an input device, receiving user input designating the viewpoint within a viewing volume;
at one or more processors, identifying, from among a plurality of vantages within the viewing volume, a subset of the vantages nearest to the viewpoint comprising at least two of the vantages, each of which has associated video data;
at a data store, retrieving the video data from the subset of the vantages;
at the one or more processors, combining the video data from the subset of the vantages to generate viewpoint video data depicting the environment from the viewpoint; and
at a display device, displaying the viewpoint video data, wherein identifying the subset comprises identifying four vantages of the plurality of vantages that define a tetrahedral shape around the viewpoint.

US Pat. No. 10,341,631

CONTROLLING MODES OF SUB-TITLE PRESENTATION

Harmonic, Inc., San Jose...

1. One or more non-transitory computer-readable storage mediums storing one or more sequences of instructions for creating a sub-titles stream or file composed of sub-titles elements, wherein execution of the one or more sequences of instructions by one or more processors causes:for each sub-titles element in said sub-titles elements, performing:
inserting a sub-titles element into the sub-titles stream or file;
determining whether at least one end-of-block condition, of a set of two or more end-of-block conditions, related to a mode of presentation of sub-titles is satisfied by the inserted sub-titles element; and
upon satisfying said at least one end-of-block condition, inserting into the sub-titles stream or file a datum representative of an end of a block according to each mode of presentation of sub-titles that is satisfied by the inserted sub-titles element.

US Pat. No. 10,341,630

CREATING TIME LAPSE VIDEO IN REAL-TIME

IDL CONCEPTS, LLC, Los A...

1. A computer-implemented method for creating time lapse video in real-time, comprising:receiving a real-time video feed from a video input device;
receiving one or more user configurations related to time lapse video, wherein the one or more user configurations comprise a duration of input that corresponds to a span of content covered by the real-time video feed;
automatically generating frames according to the one or more user configurations, wherein said generating comprises:
buffering frames of the real-time video feed; and automatically selecting buffered frames according to a frequency set by the one or more user configurations; and
outputting a time lapse video file, wherein outputting the time lapse video file comprises outputting the selected buffered frames,
wherein a duration of the time lapse video file is less than a duration of the real-time video feed while the span of content covered is substantially equivalent for both the time lapse video file and the real-time video feed.
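A minimal sketch of the claimed selection: keep evenly spaced buffered frames so the output plays back in less time while covering the same span. The frame rates and durations are illustrative assumptions:

```python
def select_time_lapse(frames, output_duration_s, output_fps=30):
    """Pick evenly spaced frames so playback lasts output_duration_s."""
    wanted = int(output_duration_s * output_fps)
    step = max(1, len(frames) // wanted)   # selection frequency
    return frames[::step][:wanted]

# 10 minutes of 30 fps footage condensed into a 20-second clip:
src = list(range(10 * 60 * 30))            # 18000 buffered frame indices
clip = select_time_lapse(src, output_duration_s=20)
```

The clip here is 600 frames (20 s at 30 fps) drawn from first to last across the full 18000-frame buffer, so the clip's duration shrinks while its content span does not.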

US Pat. No. 10,341,629

TOUCH SCREEN WIFI CAMERA

HIPCAM LTD., Givat Shmue...

1. An in-place imaging device, comprising:(a) a housing, said housing comprising:
(i) a camera;
(ii) a wireless communication module;
(iii) a processing unit; and
(iv) a touch sensitive screen configured for displaying content and receiving input, wherein a lens of said camera and said touch sensitive screen are disposed on a same surface; and
(b) a support stand mechanically and electronically coupled to said housing, said support stand including:
(i) a temperature sensor embedded in said support stand, for sensing the temperature in an immediate area surrounding the in-place imaging device and displaying said temperature on said touch sensitive screen;
wherein the in-place imaging device is adapted to capture footage with said camera and transmit said footage via said wireless communication module to a network access point.

US Pat. No. 10,341,628

MONOCHROME-COLOR MAPPING USING A MONOCHROMATIC IMAGER AND A COLOR MAP SENSOR

Google LLC, Mountain Vie...

1. A method for providing a color, high-resolution image of a scene comprising:capturing a high-resolution monochromatic image of the scene;
capturing a low-resolution color image of the same scene, a resolution of the captured low-resolution color image being lower than a resolution of the captured high-resolution monochromatic image;
mapping, using a nonlinear grayscale that is downscaled to a color scale based on a number of pixels of the captured high-resolution monochromatic image of the scene and another number of pixels of the captured low-resolution color image of the same scene, monochrome luminosities from the captured high-resolution monochromatic image to color luminosities corresponding to colors from the captured low-resolution color image of the scene to produce a map, the map correlating the monochrome luminosities in the captured high-resolution monochromatic image to the color luminosities corresponding to colors from the captured low-resolution color image of the scene;
using the map, colorizing the captured high-resolution monochromatic image of the scene with the colors corresponding to the color luminosities of the captured low-resolution color image of the scene, the colorizing providing a color, high-resolution image of the scene; and
providing the color, high-resolution image of the scene.
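One way the claimed mapping step could be sketched: downscale the high-resolution monochrome grid onto the low-resolution color grid, relate each monochrome luminosity to the co-located color, and colorize at full resolution. The nearest-neighbor scaling and luminance ratio are illustrative choices, not the patent's method:

```python
import numpy as np

def colorize(mono_hi, color_lo):
    """Colorize a high-res mono image from a low-res color image."""
    h, w = mono_hi.shape
    lh, lw = color_lo.shape[:2]
    ys = np.arange(h) * lh // h            # map each hi-res row ...
    xs = np.arange(w) * lw // w            # ... and column to low-res
    chroma = color_lo[ys][:, xs].astype(float)   # upsampled, (h, w, 3)
    luma_lo = chroma.mean(axis=2)
    # Scale the low-res chroma by the hi-res/low-res luminosity ratio:
    scale = np.where(luma_lo > 0, mono_hi / np.maximum(luma_lo, 1e-6), 1.0)
    return np.clip(chroma * scale[..., None], 0, 255).astype(np.uint8)

mono = np.full((4, 4), 100, dtype=float)             # 4x4 mono, luma 100
low = np.zeros((2, 2, 3), dtype=np.uint8)
low[...] = (200, 100, 0)                             # 2x2 orange patch
out = colorize(mono, low)
```

Because the mono luminosity matches the low-res luminance here, the output keeps the low-res colors at the full 4x4 resolution.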

US Pat. No. 10,341,627

SINGLE-HANDED FLOATING DISPLAY WITH SELECTABLE CONTENT

INTERMEC IP CORP., Fort ...

1. A barcode scanning device comprising:a stabilized component having a projector;
a nonstabilized component having a light emitting component; and
a camera adapted to:
scan a work item including a barcode;
wherein the barcode scanning device is adapted to:
decode the barcode scanned by the camera to identify the work item;
project, via the projector and in response to decoding the barcode, a first user interface on a surface of the identified work item such that the projected first user interface is within a field of view of the camera, wherein the first user interface comprises one or more commands and/or options associated with the identified work item that are selectable by the barcode scanning device based on a position of the stabilized component relative to a position of the nonstabilized component, wherein the one or more commands and/or options of the projected first user interface are selected based on a position of a light indicator projected by the light emitting component within the projected first user interface;
detect a position of the light indicator relative to the projected first user interface by recognizing presence of the light indicator within a particular portion of a grid structure associated with the field of view of the camera and correlating the position of the light indicator detected by the camera with a command or option of the one or more commands and/or options of the projected first user interface based on the grid structure, wherein the grid structure is generated using grid technology and is independent of the projected first user interface;
receive an indication of a selection of the command or option based on a position of the light indicator relative to a position of the projected first user interface; and
project a second user interface corresponding to the command or option in the first user interface in response to receiving the selection.

US Pat. No. 10,341,626

IMAGE PROJECTION SYSTEM, PROJECTOR, AND CONTROL METHOD FOR IMAGE PROJECTION SYSTEM

SEIKO EPSON CORPORATION, ...

1. An image projection system comprising:a first projector; and
a second projector, wherein
the first projector includes:
a first projecting section configured to project a first image; and
a first imaging section configured to capture a range including at least a part of the first image projected by the first projecting section and at least a part of a second image projected by the second projector,
the second projector includes a second projecting section configured to project the second image, and
the image projection system sets a target color on the basis of a first captured image obtained by capturing, with the first imaging section, at least a part of the first image projected by the first projecting section and calculates, on the basis of a second captured image obtained by capturing, with the first imaging section, at least a part of the second image projected by the second projecting section, first correction data for correcting a color of a projected image of the second projector to the target color, the correcting occurring based upon lattice points, the image projection system generating a correspondence table in which specified coordinates of the lattice points in data of the first captured image and coordinates of stored lattice points in pattern image data are registered in association with each other, wherein:
the first correction data is data for correcting a color of a first point in the second captured image to the target color, the first correction data being a first value, the first value being a difference between the target color and an imaging value of the first point in the second captured image,
the second projector includes a second imaging section configured to capture a third captured image, the third captured image being obtained by capturing at least a part of the second image projected by the second projecting section, and
the image projection system calculates, on the basis of the third captured image and the first correction data, second correction data for correcting a color of a second point to the target color by adding the first value and a second value that is a difference between an imaging value of the first point in the third captured image and an imaging value of the second point in the third captured image.
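The correction arithmetic in the last two clauses reduces to two differences and a sum; the numeric values below are invented for illustration:

```python
def first_correction(target, p1_in_capture2):
    """First value: target color minus point 1's value in capture 2."""
    return target - p1_in_capture2

def second_correction(first_value, p1_in_capture3, p2_in_capture3):
    """Second data: first value plus (point 1 minus point 2, capture 3)."""
    return first_value + (p1_in_capture3 - p2_in_capture3)

c1 = first_correction(target=200, p1_in_capture2=180)                # 20
c2 = second_correction(c1, p1_in_capture3=150, p2_in_capture3=140)   # 30
```

The second projector thus corrects point 2 using only its own capture plus the first correction value, without the first projector re-imaging point 2.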

US Pat. No. 10,341,625

DISPLAY APPARATUS, DISPLAY CONTROL METHOD, AND COMPUTER-READABLE STORAGE MEDIUM

CASIO COMPUTER CO., LTD.,...

1. A display apparatus comprising:a display screen for displaying a video based on video data, wherein the display screen comprises:
a light source configured to be controlled to emit light and to temporarily stop emission of the light, based on the video data; and
a light modulator configured to modulate the light emitted by the light source; and
a processor configured to:
control the light source to emit the light based on the video data and the light modulator to modulate the light emitted by the light source, to display the video; and
while controlling the light source to emit the light based on the video data to display the video:
determine whether a time-out period set to a first length has passed;
in response to determining that the time-out period set to the first length has passed, perform a first determination, based on the video data, of whether to temporarily stop emission of the light so that the display screen is black; and
in response to determining, in the first determination, to temporarily stop emission of the light so that the display screen is black:
control the light source to temporarily stop emission of the light;
set the time-out period to a second length shorter than the first length;
determine whether the time-out period set to the second length has passed;
in response to determining that the time-out period set to the second length has passed, perform a second determination, based on the video data, of whether to temporarily stop emission of the light; and
control the light source based on a result of the second determination.
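The two-timeout control flow above can be sketched as a small state machine; the per-frame brightness model and the threshold are assumptions for the sketch, not the patent's criteria:

```python
def control_light(frames, long_timeout, short_timeout, dark_threshold=10):
    """Return (frame_index, light_on) decisions for a brightness sequence."""
    timeout, elapsed, light_on = long_timeout, 0, True
    out = []
    for i, brightness in enumerate(frames):
        elapsed += 1
        if elapsed >= timeout:
            # Decide from the video data whether to blank the screen.
            should_blank = brightness < dark_threshold
            light_on = not should_blank
            # Once blanked, re-check on the shorter time-out.
            timeout = short_timeout if should_blank else long_timeout
            elapsed = 0
        out.append((i, light_on))
    return out

decisions = control_light([100, 100, 5, 5, 100],
                          long_timeout=3, short_timeout=1)
```

After the dark frame triggers blanking, the shorter time-out makes the system re-evaluate every frame, so the light comes back as soon as the video brightens.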

US Pat. No. 10,341,624

PROJECTION IMAGE DISPLAY APPARATUS

PANASONIC INTELLECTUAL PR...

1. A projection image display apparatus comprising:a plurality of light sources including a first light source and a second light source;
a light combiner combining light rays emitted from the plurality of light sources into a combined light;
a light modulation element modulating the combined light; and
a projection optical system projecting an image emitted from the light modulation element,
wherein the plurality of light sources are each controlled by pulse width modulation signals, each of which contains a plurality of pulses, and duty ratios of the plurality of pulses of the pulse width modulation signals differ from each other,
the duty ratios of the plurality of pulses of the pulse width modulation signals to the light sources are set so that a total luminance value of the light sources, which is determined based on the duty ratios of the plurality of pulses of the pulse width modulation signals to the light sources, becomes a target luminance value of an image projected from the projection optical system,
the duty ratios of the plurality of pulses of the pulse width modulation signals to the light sources increase as the target luminance value increases,
when the target luminance value is less than a predetermined value, which is less than 100%, a duty ratio of the first light source increases faster than a duty ratio of the second light source increases, as the target luminance value increases, and
when the target luminance value is equal to or greater than the predetermined value, the duty ratio of the first light source increases slower than the duty ratio of the second light source increases, as the target luminance value increases.
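The piecewise behavior of the two duty-ratio curves can be sketched as below; the knee point (0.6) and the slopes (1.2 and 0.8) are invented values chosen only to satisfy the claimed faster/slower relation:

```python
KNEE = 0.6  # predetermined value (assumed), as a fraction of full scale

def duty_ratios(target):
    """Duty ratios of the two sources for a target luminance in [0, 1]."""
    if target < KNEE:
        d1 = target * 1.2          # first source rises faster ...
        d2 = target * 0.8          # ... than the second, below the knee
    else:                          # above the knee the roles reverse
        d1 = KNEE * 1.2 + (target - KNEE) * 0.8
        d2 = KNEE * 0.8 + (target - KNEE) * 1.2
    return min(d1, 1.0), min(d2, 1.0)

low = duty_ratios(0.3)    # first source ahead
high = duty_ratios(0.9)   # second source now rising faster
```

Both duty ratios still increase monotonically with the target luminance; only their rates of increase swap at the knee.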

US Pat. No. 10,341,623

OPTICAL PROJECTION SYSTEM AND ENERGY CONTROL METHOD THEREFOR USING SELECTION UNIT

Coretronic Corporation, ...

1. An optical projection system, comprising:a light source module capable of emitting at least one light beam;
an optical engine for receiving the light beam and modulating the light beam according to at least one image signal to form an image beam;
a thermoelectric generator for absorbing heat in the optical projection system and converting the heat into electrical energy;
a storage unit for storing the electrical energy, wherein a state of charge of the storage unit is allowed to reach at least a first level or a second level, and the second level is larger than the first level;
a first electronic device and a second electronic device for receiving the electrical energy stored in the storage unit, wherein the first electronic device has a first threshold voltage, the second electronic device has a second threshold voltage, the first threshold voltage is a minimum voltage required to make the first electronic device operable, the second threshold voltage is a minimum voltage required to make the second electronic device operable, and the second threshold voltage is larger than the first threshold voltage; and
a selection unit, wherein the selection unit outputs a first selection signal to the storage unit to turn on the first electronic device having the first threshold voltage when the state of charge of the storage unit reaches the first level and turn on the second electronic device having the second threshold voltage when the state of charge of the storage unit reaches the second level, and the selection unit outputs a second selection signal to the storage unit to selectively shut down at least one of the first electronic device and the second electronic device according to (1) the state of charge of the storage unit and (2) the first threshold voltage of the first electronic device and the second threshold voltage of the second electronic device, wherein the second electronic device with the second threshold voltage is set by the selection unit to have higher priority to be shut down over the first electronic device with the first threshold voltage when the first electronic device and the second electronic device are in an operating state simultaneously.
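The threshold logic of the selection unit reduces to comparing the state of charge against two levels; the level values below are illustrative assumptions:

```python
def select_devices(soc, level1=0.3, level2=0.7):
    """Return (device1_on, device2_on) for a state of charge in [0, 1].

    Device 2 has the higher threshold, so it is the last to turn on and,
    symmetrically, the first to shut down as the stored energy drops.
    """
    device1_on = soc >= level1
    device2_on = soc >= level2
    return device1_on, device2_on

mid_charge = select_devices(0.5)    # only the low-threshold device runs
full_charge = select_devices(0.8)   # both devices run
```

Running the same comparison as the charge falls reproduces the claimed shutdown priority: the second (higher-threshold) device drops out first.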

US Pat. No. 10,341,622

MULTI-HALF-TONE IMAGING AND DUAL MODULATION PROJECTION/DUAL MODULATION LASER PROJECTION

Dolby Laboratories Licens...

1. A method for reducing halo artifacts in a final image rendered upon a dual modulation display system by creating a Point Spread Function (PSF) of a desired size with a first, pre-modulator that illuminates a second, primary modulator comprising the step of preparing a dual modulation energization signal comprising a pre-modulator energization signal comprising a plurality of half-tone images, each of the plurality of half-tone images to be displayed or energized on a pre-modulator of a dual modulation display system over a plurality of sub-frames time periods during a single frame time period, wherein further the single frame period is a modulation period of the primary modulator and the sub-frame time period is a modulation period of the pre-modulator and the pixel elements of the primary modulator are switched on/off once during the single frame period and the pixel elements of the pre-modulator are switched on/off once during a single sub-frame period and further that the single frame period comprises a plurality of sub-frame time periods, thereby increasing the number of levels in the pre-modulator without increasing PSF size, in synchronization with a primary modulator signal comprising an image to be displayed or energized on a primary modulator of the dual modulation display system, the primary modulator signal further comprising a bit sequence of N bits per pixel wherein the higher order bits are spread out across the frame synchronized with the pre-modulator energization signal.

US Pat. No. 10,341,621

SYSTEMS AND METHODS FOR CREATING FULL-COLOR IMAGE IN LOW LIGHT

Chromatra, LLC., Beverly...

1. A color imaging system, comprising:a radiation sensitive sensor configured to generate, in response to incident electromagnetic radiation from a scene:
a first set of electrical signals indicative of a first channel comprising a first spectrum of wavelengths of electromagnetic radiation, a first array of radiation sensitive pixels that are enabled to detect the first spectrum of wavelengths of electromagnetic radiation, and the first array of radiation sensitive pixels comprises clear, unfiltered pixels;
a second set of electrical signals indicative of a second channel comprising a second spectrum of wavelengths of electromagnetic radiation, and a second array of radiation sensitive pixels that are enabled to detect the second spectrum of wavelengths of electromagnetic radiation, and the second array of radiation sensitive pixels comprises a filter; and
an image processor coupled to the radiation sensitive sensor and having circuitry configured to:
receive the first set of electrical signals indicative of the first channel and the second set of electrical signals indicative of the second channel;
derive an output based on the first and second sets of electrical signals to channels of a red-green-blue (RGB) display to generate a full-color image of the scene based on the first and second sets of electrical signals by combining the first and second sets of electrical signals into a set of color vectors, wherein a color vector comprises an ordered set of numbers describing a color;
translate the set of color vectors into colors in a color space with at least one of using a color lookup table that records associations between color vectors and colors, or using predetermined formulas based on definitions of the first and second channels; and
display the full-color image.
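The color-vector translation step can be sketched with a lookup table; the table entries and channel values are invented for illustration:

```python
# Hypothetical color lookup table: (channel1, channel2) vector -> RGB.
lookup = {
    (255, 0): (255, 255, 255),   # strong clear signal, no filtered signal
    (128, 128): (0, 255, 0),     # equal channels -> green (assumed)
}

def to_rgb(first_signal, second_signal, table):
    """Combine two channel signals into a color vector and look it up."""
    vector = (first_signal, second_signal)  # ordered set describing a color
    return table.get(vector, (0, 0, 0))     # fallback: black (assumed)

rgb = to_rgb(128, 128, lookup)
```

The claim also allows predetermined formulas instead of a table; the table simply makes the vector-to-color association explicit.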

US Pat. No. 10,341,620

IMAGE SENSOR AND IMAGE-CAPTURING DEVICE

NIKON CORPORATION, Tokyo...

1. An image sensor comprising:a first microlens;
a first filter that is transmissive to a first wavelength of light having passed through the first microlens;
a first photoelectric converter that generates charges by performing photoelectric conversion of the first wavelength light transmitted through the first filter;
a second filter that is transmissive to a second wavelength of the light having passed through the first microlens;
a second photoelectric converter that generates charges by performing photoelectric conversion of the second wavelength light transmitted through the second filter;
a second microlens;
a third filter that is transmissive to the first wavelength of light having passed through the second microlens;
a third photoelectric converter that generates charges by performing photoelectric conversion of the first wavelength light transmitted through the third filter;
a fourth filter that is transmissive to the second wavelength of light having passed through the second microlens;
a fourth photoelectric converter that generates charges by performing photoelectric conversion of the second wavelength light transmitted through the fourth filter;
a first output unit that outputs a first signal based upon at least one of the charges generated by the first photoelectric converter and the charges generated by the third photoelectric converter; and
a second output unit that outputs a second signal based upon at least one of the charges generated by the second photoelectric converter and the charges generated by the fourth photoelectric converter.

US Pat. No. 10,341,619

METHODS, SYSTEMS, AND PRODUCTS FOR EMERGENCY SERVICES

1. A system, comprising:a connection for communicating with a network;
a processor in communication with the connection; and
memory storing instructions that cause the processor to effectuate operations, the operations comprising:
receiving an indication that an emergency notification was sent from a communications device;
determining a first location of the communications device;
querying for devices connected to the network that have a view of the first location;
based on the querying, identifying a nearby mobile device located at a second location;
in response to the querying, receiving a list of nearby devices, the list indicative of nearby device locations associated with the nearby devices, wherein the nearby devices comprise the nearby mobile device;
determining that a first nearby device of the nearby devices has a first camera resolution value; and
based on the first camera resolution value not meeting a minimum requirement, excluding the first nearby device from the list.

US Pat. No. 10,341,618

INFRASTRUCTURE POSITIONING CAMERA SYSTEM

Trimble Inc., Sunnyvale,...

1. A camera system comprising:a housing;
a first imaging system integrated with the housing, the first imaging system comprising:
a first lens, and
a first image sensor for detecting light from the first lens to generate first data, wherein:
the first data is image data,
the first imaging system has a first field of view, and
the first field of view is between 20 degrees and 40 degrees;
a second imaging system integrated with the housing, the second imaging system comprising:
a second lens, and
a second image sensor for detecting light from the second lens to generate second data, wherein:
the second data is image data,
the second imaging system has a second field of view,
the second field of view is between 20 degrees and 40 degrees,
the first field of view and the second field of view combine to form a third field of view,
the third field of view is larger than the first field of view, and
the third field of view is larger than the second field of view;
a processor, integrated with the housing, configured to generate third data from the first data and/or the second data, wherein the third data corresponds to position information of an object; and
a communication interface configured to transmit the third data.

US Pat. No. 10,341,617

PUBLIC SAFETY CAMERA IDENTIFICATION AND MONITORING SYSTEM AND METHOD

Purdue Research Foundatio...

10. A system for determining a travel path, comprising:a network of at least one camera;
a communication hub coupled to the network of at least one camera;
at least one electronic communication device;
a data processing system coupled to the communication hub, the data processing system comprising one or more processors configured to:
(a) establish an interface with a 3rd-party mapping system via the electronic communication device,
(b) receive a start point and an end point by a user on the interface for a preselected zone,
(c) generate input data for the 3rd-party mapping system based on the start and end points,
(d) provide the input data to the 3rd-party mapping system,
(e) receive output data from the 3rd-party mapping system associated with a path from the start point to the end point,
(f) identify waypoints in the output data,
(g) identify a camera from a predetermined list of cameras of the preselected zone closest to a line between each of the two consecutive waypoints,
(h) determine the center of a viewing angle of the identified camera from a list of predetermined viewing angles for each of the cameras in the list of cameras of the preselected zone,
(i) calculate a path from the start point through each of the viewing angle centers to the end point,
(j) set the viewing angle center between each of the two consecutive waypoints as a new start point and iterate steps (c) through (i) until the end point is one of the two consecutive waypoints, at which iteration the incremental path is calculated from a viewing angle center representing the last pair of consecutive waypoints to the end point, and
(k) display the calculated path on the electronic communication device,
wherein the predetermined list of cameras is determined by:
(A) receiving a name of an organization,
(B) identifying a range of internet protocol (IP) addresses associated with the organization,
(C) querying each IP address in the range of the IP addresses,
(D) receiving a response from the IP addresses in response to the queries,
(E) verifying the received response is from a camera by obtaining an image file from the IP address and analyzing the image file, and
(F) adding the IP address to the predetermined list of cameras, and
wherein a location of each camera is determined by:
(A) using an IP-address-to-physical-address translator, and
(B) verifying the location information by using a street view from 3rd-party mapping software.
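Step (g) above, picking the listed camera closest to the line between two consecutive waypoints, can be sketched with standard point-to-segment geometry; the coordinates are invented:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b (2-D coordinates)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment and clamp to its endpoints.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def nearest_camera(cameras, a, b):
    """Camera from the predetermined list closest to segment a-b."""
    return min(cameras, key=lambda c: point_segment_distance(c, a, b))

cams = [(0, 5), (5, 1), (10, 6)]
best = nearest_camera(cams, (0, 0), (10, 0))
```

Iterating this per pair of consecutive waypoints, then routing through each selected camera's viewing-angle center, yields the path of step (i).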

US Pat. No. 10,341,616

SURVEILLANCE SYSTEM AND METHOD OF CONTROLLING THE SAME

HANWHA TECHWIN CO., LTD.,...

1. A network camera comprising:a camera configured to capture images of a monitoring area;
a communication interface configured to communicate with a server and a beacon terminal; and
a processor configured to transmit a first beacon signal to the beacon terminal to receive a second beacon signal corresponding to the first beacon signal from the beacon terminal, generate beacon information and image information, in response to receiving the second beacon signal, and transmit the beacon information and the image information to the server via the communication interface,
wherein the beacon information comprises location information of the beacon terminal, and the image information comprises an image of a monitoring target that carries the beacon terminal.

US Pat. No. 10,341,615

SYSTEM AND METHOD FOR MAPPING OF TEXT EVENTS FROM MULTIPLE SOURCES WITH CAMERA OUTPUTS

Honeywell International I...

1. A video surveillance system comprising:a CCTV keyboard having a controller that groups each of a plurality of surveillance cameras into a respective one of a plurality of zones, wherein each of the plurality of zones contains a respective one of a plurality of transaction devices within a respective field of view of a respective one of the plurality of surveillance cameras associated with the respective one of the plurality of zones, and wherein each of the plurality of zones is a respective physical location where the respective one of the plurality of transaction devices is located;
a capture unit that receives transaction data from the plurality of transaction devices; and
a switching unit connected to the CCTV keyboard that selectively switches between the plurality of surveillance cameras to display respective video from the respective one of the plurality of surveillance cameras associated with a first one of the plurality of zones in response to a first selection from the CCTV keyboard selecting the first one of the plurality of zones,
wherein, responsive to the first selection, the capture unit and the switching unit display the respective video of live transactions from the respective one of the plurality of surveillance cameras associated with the first one of the plurality of zones,
wherein, when the first one of the plurality of zones includes more than one of the plurality of transaction devices, the CCTV keyboard is configured to receive a second selection selecting a selected transaction device of the more than one of the plurality of transaction devices,
wherein, responsive to the second selection, the capture unit and the switching unit display text corresponding to the transaction data associated with the live transactions from the selected transaction device superimposed on the respective video of the live transactions from the respective one of the plurality of surveillance cameras associated with the first one of the plurality of zones; and
wherein superimposition of the text on the respective video of the live transactions includes placing the text corresponding to the transaction data associated with the live transactions on top of the respective video of the live transactions.

US Pat. No. 10,341,614

BIOLOGICAL IMAGING DEVICE

NEC CORPORATION, Tokyo (...

1. A biological imaging device, comprising:an emitting unit that emits parallel light to a first part of a finger;
an imaging unit that images the first part and a second part connected to the first part, wherein the first part is a fingerprint part between a distal interphalangeal joint and a fingertip and the second part is a part between a proximal interphalangeal joint and the distal interphalangeal joint; and
a finger root guide on which a root of the finger is placed,
wherein the imaging unit captures blood vessels of the second part, the parallel light emitted by the emitting unit propagating inside the finger and radiated from the second part.

US Pat. No. 10,341,613

VIDEO SHARING PLATFORM PROVIDING FOR POSTING CONTENT TO OTHER WEBSITES

Crackle, Inc., Culver Ci...

1. A method for use in providing content, comprising:hosting a network site on a computer network, where the network site is remote from a plurality of client computers and accessible by the client computers over the computer network;
displaying on the network site links to one or more videos uploaded over the network from multiple client computers of the plurality of client computers;
generating one or more video files from the uploaded one or more videos in a format that is supported for playback on one or more portable video players;
displaying on the network site a tool for searching through the one or more videos available through the network site and accessible over the computer network;
displaying on the network site a result of a search through the one or more videos;
displaying on the network site procedures for allowing downloading of video that is representative of the result of the search to one or more portable video players;
causing downloading of one or more generated video files that are representative of the result of the search to one of the portable video players in response to the procedures being followed, wherein each transferred video file is playable on the portable video player, and wherein the downloading is performed in pieces from two or more client computers on the network;
updating the video that is representative of the result of the search in the portable video player;
displaying on the network site an option to be activated by a user to create a film strip widget that is representative of the result of the search, wherein the film strip widget includes display of the still images for the corresponding plurality of videos, code comprising identifiers that are used to identify one or more video files to be represented in the film strip widget and a command to start an on-demand playback of the created on-demand video clip for any video included in the film strip widget;
displaying on the network site an option to create an RSS (really simple syndication) feed corresponding to a search term, wherein the RSS feed is configured to provide notifications to the user of updates to the result of the search corresponding to at least the search term;
subscribing the user to the created RSS feed;
identifying when new video is shared that corresponds to the search term;
including the new video in the RSS feed;
identifying the user as being subscribed to the RSS feed; and
notifying the user, in response to the including the new video in the RSS feed and identifying the user as being subscribed to the RSS feed, when the new video is available; and
posting the film strip widget that is representative of the result of the search to a different network site in response to the option being selected.

US Pat. No. 10,341,612

METHOD FOR PROVIDING VIRTUAL SPACE, AND SYSTEM FOR EXECUTING THE METHOD

COLOPL, INC., Tokyo (JP)...

1. A method, comprising:defining a first virtual space, wherein the first virtual space is associated with a first user to be associated with a first head-mounted device (HMD);
defining a second virtual space, wherein the second virtual space is associated with a second user, different from the first user, to be associated with a second head-mounted device (HMD);
specifying a plurality of first potential match users as candidates to be matched with the first user, wherein the plurality of first potential match users comprise the second user;
presenting in the first virtual space information representing the plurality of first potential match users;
detecting a first input from the first user;
detecting a first period during which the first user designates the second user in accordance with the detected first input;
specifying a plurality of second potential match users as candidates to be matched with the second user, wherein the plurality of second potential match users comprise the first user;
presenting in the second virtual space information representing the plurality of second potential match users;
detecting a second input from the second user;
detecting a second period during which the second user designates the first user in accordance with the detected second input; and
matching the first user and the second user in accordance with the first period and the second period satisfying a predetermined relation.
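The matching step turns on the two designation periods "satisfying a predetermined relation." The claim leaves that relation open; a minimal sketch, assuming the relation is temporal overlap of the two periods:

```python
def periods_overlap(p1, p2):
    """True if two (start, end) designation periods overlap in time."""
    s1, e1 = p1
    s2, e2 = p2
    return max(s1, s2) < min(e1, e2)

def match_users(first_period, second_period):
    """Match the first and second users when the period in which each
    designates the other satisfies the assumed relation (overlap)."""
    return periods_overlap(first_period, second_period)
```

Other relations (e.g., both periods exceeding a minimum duration) would fit the claim equally well; overlap is only one plausible instance.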

US Pat. No. 10,341,611

SYSTEM AND METHOD FOR VIDEO CONFERENCING

Inuitive Ltd., RaAnana (...

1. A system for video conferencing, comprising: a data processor configured for receiving from a remote location a stream of imagery data of a remote user and displaying an image of said remote user on a display device, receiving a stream of imagery data of an individual in a local scene in front of said display device, extracting a head orientation of said individual, and varying a view of said image responsively to said head orientation;
wherein said variation of said view comprises varying a displayed orientation of said remote user such that a rate of change of said orientation matches a rate of change of said head orientation;
wherein said stream of imagery data comprises range data in a form of a depth map providing depth information at a lower resolution for a group of pixels of the image data.

US Pat. No. 10,341,610

METHOD AND APPARATUS USING AN INTEGRATED FEMTOCELL AND RESIDENTIAL GATEWAY DEVICE

1. A device, comprising: a processing system including a processor; and
a memory that stores executable instructions that, when executed by the processing system, facilitate performance of operations, comprising:
sending a cellular phone number to a gateway device associated with the device, wherein the gateway device comprises a femtocell, and wherein the sending is performed when a cellular phone associated with the cellular phone number is within a communication range of the femtocell;
determining whether to use a first service associated with the cellular phone number or a second service associated with a voice over internet protocol phone number when initiating an outgoing call or when answering an incoming call as presented options on a user interface, wherein the determining is performed based on an input received through the user interface as a selection from among the presented options, wherein the voice over internet protocol phone number is provided to the femtocell;
receiving the incoming call or initiating the outgoing call associated with one of the cellular phone number or the voice over internet protocol phone number through the femtocell responsive to receiving the selection and according to the selection from among the presented options;
performing an outgoing voice over internet protocol call when receiving a user initiated phone call signal from one of a voice over internet protocol phone or the device by processing the user initiated phone call as a voice over internet protocol phone call; and
modifying a presentation of a video program being presented by the device for a duration of the outgoing call or the incoming call.

US Pat. No. 10,341,609

GROUP VIDEO SYNCHRONIZATION

MOTOROLA SOLUTIONS, INC.,...

1. An apparatus comprising: a first Push-to-Talk (PTT) button;
a display configured to display video;
an over-the-air receiver(s) configured to receive a PTT audio transmission along with first video synchronization information, wherein the PTT audio transmission is initiated by a user pressing a second PTT button on a PTT radio, and wherein the PTT audio transmission is not part of the video;
an over-the-air transmitter(s) configured to transmit second video synchronization information;
logic circuitry configured to receive the first video synchronization information and advance or retard the video based on the first video synchronization information, and also configured to receive a trigger that the first PTT button was pressed, and in response to the received trigger, determine the second video synchronization information for the video, and instruct the transmitter to transmit the second video synchronization information via an over-the-air transmission to other radios.

US Pat. No. 10,341,608

COMMUNICATION SYSTEM AND COMMUNICATION METHOD

1. A communication system comprising: a plurality of different base sites communicably connected to each other, the plurality of different base sites including first and second base sites;
a video display device disposed in the first base site, the video display device being configured to display a video image at a first area corresponding to a first flow line on a floor of the first base site, the first flow line corresponding to a first walking route in the first base site;
an imaging device configured to capture the video image of a second area of the second base site, the second area corresponding to a second flow line on a floor of the second base site, the second flow line corresponding to a second walking route in the second base site; and
a communication device configured to transmit the video image captured by the imaging device to the video display device,
wherein the video display device is configured to display the video image at the first area, and
the video image is displayed at the first area by linking the second flow line to the first flow line.

US Pat. No. 10,341,607

DISPLAY DEVICE

MITSUMI ELECTRIC CO., LTD...

1. A display device that includes an attachment part mountable on a head of a user,
a control device to control the attachment part, and
a transmission cable to connect the attachment part with the control device,
the display device comprising:
an imager;
a first converter configured to convert a digital signal from the imager into an analog signal;
a second converter configured to convert the analog signal into a video signal;
a laser light generator configured to generate a laser light modulated depending on the video signal;
an optical scanner configured to scan the laser light; and
an optical projection system configured to project the scanned laser light to form an image,
wherein the imager, the first converter, the optical scanner, and the optical projection system are placed in the attachment part,
wherein the second converter and the laser light generator are placed in the control device, and
wherein the analog signal and the laser light are transmitted via the transmission cable.

US Pat. No. 10,341,606

SYSTEMS AND METHOD OF TRANSMITTING INFORMATION FROM MONOCHROME SENSORS

SA Photonics, Inc., Los ...

1. An imaging system comprising: a plurality of cameras comprising a plurality of optical sensor arrays, different sensor arrays of the plurality of optical sensor arrays configured to obtain a monochrome image of a scene and produce image data,
a multiplexing unit configured to multiplex image data obtained by different sensor arrays of the plurality of optical sensor arrays and generate a single image stream; and
a transmission line configured to accept the generated single image stream,
wherein the plurality of optical sensor arrays comprises three optical sensor arrays, the three optical sensor arrays comprising respective arrays of pixels, a pixel in the array of pixels associated with a unique set of coordinates designating the position of the pixel in the array of pixels, and
wherein the single image stream comprises a plurality of multiplexed pixels comprising information from co-located pixels of the three optical sensor arrays, the information from the co-located pixels of the three optical sensor arrays being time synchronized.
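The multiplexed-pixel structure — each output pixel carrying time-synchronized information from co-located pixels of the three sensor arrays — can be illustrated with this sketch. Plain Python lists stand in for the hardware pixel streams; the real system would interleave samples on a transmission line.

```python
def multiplex(streams):
    """streams: three equally sized 2-D monochrome arrays (lists of rows).
    Returns a single image stream whose pixels are tuples combining the
    co-located pixel of each of the three sensor arrays."""
    a, b, c = streams
    return [
        [(a[r][i], b[r][i], c[r][i]) for i in range(len(a[r]))]
        for r in range(len(a))
    ]
```

Each tuple corresponds to one "multiplexed pixel" of the claim: same (row, column) coordinates in all three arrays, captured at the same time.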

US Pat. No. 10,341,605

SYSTEMS AND METHODS FOR MULTIPLE-RESOLUTION STORAGE OF MEDIA STREAMS

WatchGuard, Inc., Allen,...

1. A method comprising, by a computer system: continuously receiving, from a plurality of cameras, raw video frames at an initial resolution, wherein the plurality of cameras are arranged to provide a 360-degree view relative to a point of reference;
for each camera of the plurality of cameras, for each raw video frame, as the raw video frame is received:
downscaling the raw video frame to a first resolution to yield a first scaled video frame;
downscaling the raw video frame to a second resolution distinct from the first resolution to yield a second scaled video frame;
identifying a location of a target in at least one of the raw video frame, the first scaled video frame, and the second scaled video frame;
cropping at least one video frame selected from among the raw video frame, the first scaled video frame, and the second scaled video frame based, at least in part, on the location of the target;
downscaling the cropped at least one video frame to a third resolution to yield a third scaled video frame; and
storing the first scaled video frame, the second scaled video frame, and information related to the cropped at least one video frame as part of a first video stream, a second video stream, and a third video stream, respectively; and
blending together a video stream of each of the plurality of cameras into a 360-degree video stream, wherein the video stream of each of the plurality of cameras comprises at least one of the first video stream, the second video stream, and the third video stream.
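The per-frame pipeline of the claim — two distinct downscales of the raw frame, a crop around a located target, and a downscale of that crop — can be sketched as below. The scale factors and crop size are illustrative assumptions, and simple box averaging stands in for a production scaler.

```python
def downscale(frame, factor):
    """Naive box-average downscale of a 2-D list by an integer factor."""
    h, w = len(frame), len(frame[0])
    h2, w2 = h // factor, w // factor
    return [
        [
            sum(frame[r * factor + i][c * factor + j]
                for i in range(factor) for j in range(factor)) / factor ** 2
            for c in range(w2)
        ]
        for r in range(h2)
    ]

def process_raw_frame(raw, target, half=8):
    """One raw frame in, three scaled frames out (per-camera step of the
    claim). target is the (row, col) location of the tracked target."""
    first = downscale(raw, 2)    # first resolution
    second = downscale(raw, 4)   # second, distinct resolution
    r, c = target
    # Crop around the target location, then downscale the crop.
    crop = [row[max(c - half, 0):c + half]
            for row in raw[max(r - half, 0):r + half]]
    third = downscale(crop, 2)   # third resolution
    return first, second, third
```

The three outputs would be appended to the first, second, and third video streams respectively; blending the per-camera streams into the 360-degree stream is a separate stitching step not shown here.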

US Pat. No. 10,341,604

CAMERA SYSTEM, CAMERA, INTERCHANGEABLE LENS, AND STORAGE MEDIUM STORING CONTROL PROGRAM FOR CAMERA SYSTEM THEREON

Olympus Corporation, Tok...

1. A camera system, comprising a lens-changeable camera body and an interchangeable lens, the camera system including: a usage history information collecting unit configured to collect a plurality of usage history information related to a usage state of the camera system;
a usage history information storage unit configured to store the collected usage history information;
an information extracting unit configured to extract, from a plurality of the usage history information stored in the usage history information storage unit, usage history information related to the interchangeable lens being attached; and
a lens-related information storage unit configured to store the extracted usage history information related to the interchangeable lens.

US Pat. No. 10,341,603

IDENTIFYING AND TRACKING DIGITAL IMAGES WITH CUSTOMIZED METADATA

Lifetouch Inc., Eden Pra...

1. A method of generating a digital image file, the method comprising: receiving a digital image of a subject;
receiving first barcode data;
receiving second barcode data, at least one of the first barcode data and the second barcode data including information identifying the subject in the digital image;
storing the first barcode data and the second barcode data in a plurality of metadata fields in the digital image file, wherein at least some of the first barcode data or the second barcode data are fragmented between metadata fields; and
storing the digital image file in a memory device of a digital camera, wherein the digital image file is a single digital data file encoded in the memory device according to a standard digital image file format.
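The fragmentation of barcode data across multiple metadata fields can be illustrated with this sketch. The field names and chunk width are assumptions for illustration; a real implementation would write into the metadata fields of a standard image file format such as EXIF.

```python
def fragment_barcode(data, fields, width=8):
    """Split barcode data into fixed-width chunks and spread the chunks
    across the given metadata field names (one chunk per field)."""
    chunks = [data[i:i + width] for i in range(0, len(data), width)]
    return dict(zip(fields, chunks))
```

Reassembly is the inverse: read the fields in order and concatenate their values to recover the original barcode data.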

US Pat. No. 10,341,602

TV POWER SUPPLY

SHENZHEN CHINA STAR OPTOE...

1. A TV power supply, comprising a main power supply and a voltage conversion circuit electrically connected to the main power supply; the main power supply comprising a first output terminal for outputting a backlight driving voltage and a second output terminal for outputting a motherboard driving voltage;
the voltage conversion circuit being configured to convert the backlight driving voltage or the motherboard driving voltage to a standby voltage, the voltage conversion circuit comprising an input terminal electrically connected to the first output terminal or the second output terminal and a third output terminal for outputting the standby voltage;
wherein the voltage conversion circuit comprises a first resistor, a second resistor, a third resistor, a first transistor, a second transistor, and a zener diode;
one terminal of the first resistor is electrically connected to a first node and the other terminal of the first resistor is electrically connected to an emitter electrode of the second transistor;
one terminal of the second resistor is electrically connected to the first node and the other terminal of the second resistor is electrically connected to one terminal of the third resistor;
the other terminal of the third resistor is electrically connected to a base electrode of the first transistor;
a collector electrode of the first transistor is electrically connected to the terminal of the third resistor, the emitter electrode of the first transistor is electrically connected to a second node;
the base electrode of the second transistor is electrically connected to the terminal of the third resistor and the collector electrode of the second transistor is electrically connected to the second node;
a cathode of the zener diode is electrically connected to the base electrode of the first transistor and the anode of the zener diode is grounded;
the first node is the input terminal of the voltage conversion circuit, the second node is the third output terminal of the voltage conversion circuit;
the voltage difference between a stable voltage of the zener diode and the conduction voltage drop of the emitter junction of the first transistor is equal to the standby voltage.
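The final limitation fixes the standby output arithmetically: the zener's stable voltage minus the conduction drop of the first transistor's emitter junction. As a one-line calculation (0.7 V is an assumed typical drop for a silicon BJT, not a value from the claim):

```python
def standby_voltage(v_zener, v_be=0.7):
    """Standby output per the claim: zener stable voltage minus the
    emitter-junction conduction drop of the first transistor."""
    return v_zener - v_be
```

So a 5.7 V zener with a 0.7 V emitter-junction drop yields a 5 V standby rail.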

US Pat. No. 10,341,601

IMAGE PROCESSING APPARATUS, IMAGING APPARATUS, AND IMAGE PROCESSING METHOD

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus that receives an input of an image signal obtained by imaging an object, and performs signal conversion on the image signal to output the image signal to a display apparatus, the image processing apparatus comprising: at least one processor or circuit configured to function as the following units:
a calculation unit configured to calculate an absolute luminance value of the object from a luminance value of the object acquired from the image signal and an exposure parameter in the imaging;
a determination unit configured to determine a predetermined absolute luminance code for the luminance value of the object according to input-output characteristics of the display apparatus so that the object is displayed at the absolute luminance value on the display apparatus;
a conversion unit configured to perform signal conversion for converting the image signal based on a relationship between the luminance value of the object and the absolute luminance code; and
a correction unit configured to perform gamma correction on the image signal output from the conversion unit,
wherein the determination unit determines the predetermined absolute luminance code for the luminance value of the object in the image signal after the gamma correction.

US Pat. No. 10,341,600

CIRCUIT APPLIED TO TELEVISION AND ASSOCIATED IMAGE DISPLAY METHOD

MSTAR SEMICONDUCTOR, INC....

1. A circuit, applied to a television comprising a memory and a display panel, comprising: an image processing circuit, processing image data to generate processed image data, wherein the image processing circuit processes at least one of scaling, color adjustment, brightness adjustment and noise reduction on the image data to generate the processed image data;
a control circuit, generating a control signal according to a switching signal;
an image capturing circuit, capturing the processed image data as predetermined image data according to the control signal and storing the predetermined image data to the memory;
a first output circuit, transmitting the predetermined image data to the display panel according to the control signal; and
a second output circuit, superimposing an interface image onto the processed image data and transmitting the processed image data superimposed with the interface image to the display panel, and stopping superimposing the interface image onto the processed image data and stopping transmitting the processed image data superimposed with the interface image to the display panel according to the control signal;
wherein the memory stores the interface image, the switching signal contains a resolution switching signal, and the second output circuit further sets a parameter corresponding to the resolution switching signal according to the control signal.

US Pat. No. 10,341,599

METHOD AND DEVICE FOR RENDERING VIDEO CONTENT ON A DISPLAY

ADTOOX AB, Solna (SE)

1. A method of continuously rendering video content on a display device, the method comprising: rendering the video content in one of a number of surfaces designated for displaying visual content on the display device, while rendering at least one image extracted from the video content in at least another one of the surfaces; and
automatically rendering, when the video content reaches a point in time corresponding to a position where said at least one image is extracted from the video content, the video content in said another one of the surfaces thereby replacing said at least one image; and
automatically rendering, in the surface in which the video content was rendered before being rendered in another surface, an image similar in appearance to the image extracted from the video content at the position that corresponds to the point in time which is reached when the video content is rendered in said another surface.

US Pat. No. 10,341,598

METHOD FOR NOTIFYING A USER OF A TELEVISION TO SAVE POWER CONSUMPTION BY MULTIPLE MULTIMEDIA DEVICES CONNECTED TO THE TELEVISION

TP VISION HOLDING B.V., ...

1. A method for notifying a user of a television to save power consumption by a plurality of multimedia devices which are connected to the television and which include at least a first multimedia device and a second multimedia device, the first multimedia device being connected to the television through a first interface connector, the second multimedia device being connected to the television through a second interface connector, said method comprising: presenting, by the television, desired multimedia content to the user based on signals provided by the first multimedia device through the first interface connector;
detecting, by the television, whether the second multimedia device is providing signals of multimedia content through the second interface connector during presentation of the desired multimedia content; and
outputting, by the television and upon detecting that the second multimedia device is providing signals of multimedia content through the second interface connector during presentation of the desired multimedia content, a notification that is perceivable by the user and that relates to the provision of signals of multimedia content by the second multimedia device.

US Pat. No. 10,341,597

SOLID-STATE IMAGING DEVICE, METHOD FOR MANUFACTURING SOLID-STATE IMAGING DEVICE, AND ELECTRONIC APPARATUS

Brillnics Japan Inc., To...

1. A solid-state imaging device comprising a pixel part having a plurality of pixels performing photoelectric conversion arranged in a matrix,
a reading part reading pixel signals from the pixel part to a column output direction,
a first substrate, and
a second substrate, wherein
the reading part includes at least
a row driver which drives a row designated in the pixel part so as to read out pixel signals in the row to the column output direction and
a column readout circuit applying predetermined signal processing with respect to the pixel signals read out in response to driving of the row driver,
the first substrate and the second substrate have a multilayer structure in which they are connected through a column level connection part which is formed in at least one side portion of the column output direction of each substrate and through a row level connection part which is formed in at least one side portion of a direction perpendicular to the column output direction of each substrate, on the first substrate, a pixel part is formed, in which pixel part, the side portion along the column level connection part of the column output direction has a first pitch, and the side portion along the row level connection part of the direction perpendicular to the column output direction has a second pitch,
on the second substrate, at least,
the column readout circuit with a side portion of the column output direction having a third pitch corresponding to the first pitch is formed along the column level connection part, and
the row driver with a side portion of the direction perpendicular to the column output direction having a fourth pitch corresponding to the second pitch is formed along the row level connection part, and
a pitch conversion-use interconnect region including a slanted interconnect for pitch conversion between interconnects is formed,
at least one of the third pitch of the column readout circuit or the fourth pitch of the row driver on the second substrate is shorter than a corresponding first pitch or second pitch of the pixel part on the first substrate, and
the pitch conversion-use interconnect region is formed at least between an end part of the column readout circuit having the third pitch shorter than the pixel part and an end part of the column level connection part and/or between an end part of the row driver having the fourth pitch shorter than the pixel part and an end part of the row level connection part.

US Pat. No. 10,341,596

IMAGE SENSOR SYSTEM, ASSOCIATED TIMING CONTROL CIRCUIT THEREOF AND ASSOCIATED METHOD

TAIWAN SEMICONDUCTOR MANU...

1. An image sensor system, comprising: a timing control circuit, arranged to determine if a coding condition is fit according to an input signal, the timing control circuit generating a control signal when the coding condition is fit, wherein the input signal includes a plurality of bits serially input to the timing control circuit, and each bit of the plurality of bits corresponds to each pulse of a clock signal respectively;
an image sensor, coupled to the timing control circuit wherein the image sensor comprises a plurality of pixels, and one of the plurality of pixels receives the control signal from the timing control circuit and outputs a sensing signal; and
a modulation circuit, coupled to the image sensor, wherein the modulation circuit is arranged to receive the sensing signal and generate an output signal according to the sensing signal, and a frequency of the output signal is different from a frequency of the sensing signal.

US Pat. No. 10,341,595

IMAGE SENSOR FOR COMPENSATING FOR SIGNAL DIFFERENCE BETWEEN PIXELS

SAMSUNG ELECTRONICS CO., ...

1. An image sensor, comprising: two or more phase-difference detection pixels disposed adjacent to each other;
a plurality of general pixels spaced apart from the phase-difference detection pixels;
a first peripheral pixel and a second peripheral pixel adjacent to the phase-difference detection pixels, and between the phase-difference detection pixels and the general pixels;
a first light shield disposed in one of the general pixels and having a first width;
a second light shield that extends into the first peripheral pixel from a first area between the phase-difference detection pixels and the first peripheral pixel, and having a second width different from the first width; and
a third light shield that extends into the second peripheral pixel from a second area between the phase-difference detection pixels and the second peripheral pixel, and having a third width different from the first width,
wherein the second width is different from the third width.

US Pat. No. 10,341,594

LIGHT FIELD CAPTURE CONTROL METHODS AND APPARATUSES, LIGHT FIELD CAPTURE DEVICES

BEIJING ZHIGU TECH CO., L...

1. A light field capture control method, comprising: acquiring depth information of a to-be-shot scene;
determining target pixel density distribution information of an image sensor of a light field camera according to the depth information;
adjusting pixel density distribution of the image sensor according to the target pixel density distribution information; and
performing, by the adjusted image sensor, light field capture of the to-be-shot scene,
wherein determining target pixel density distribution information according to the depth information comprises:
performing regional division on the to-be-shot scene in a depth of field (DOF) direction according to the depth information; and
determining the target pixel density distribution information according to a result of the regional division, wherein in the target pixel density distribution information, target pixel density distribution corresponding to at least two different regions divided varies.
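The regional division and density assignment can be sketched as a depth-to-density mapping. The boundary depths and density values below are illustrative assumptions; the claim only requires that at least two of the divided regions receive different target densities.

```python
def target_densities(depths, boundaries=(2.0, 5.0), densities=(4, 2, 1)):
    """Divide scene depths into DOF regions at the assumed boundary
    depths and map each region to its target pixel density (nearer
    regions assumed denser)."""
    out = []
    for d in depths:
        if d < boundaries[0]:
            out.append(densities[0])    # near region: densest sampling
        elif d < boundaries[1]:
            out.append(densities[1])    # middle region
        else:
            out.append(densities[2])    # far region: sparsest sampling
    return out
```

The resulting per-region densities would then drive the adjustment of the image sensor's pixel density distribution before capture.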

US Pat. No. 10,341,593

COMPREHENSIVE FIXED PATTERN NOISE CANCELLATION

DePuy Synthes Products, I...

1. A digital imaging method for use with an endoscope in ambient light deficient environments comprising: actuating an emitter to emit a plurality of pulses of electromagnetic radiation, wherein each pulse comprises an individual wavelength of electromagnetic radiation to cause illumination within the light deficient environment;
pulsing the emitter at a predetermined interval corresponding to a sensing interval of a pixel array;
sensing reflected electromagnetic radiation from a pulse with the pixel array to create an image frame in a plurality of cycles, the cycles including an integration time of the pixel array that is controlled using an electronic shutter, wherein a duration of the integration time is variable from cycle to cycle in the plurality of cycles between a readout of a first frame in a first cycle and a readout of a next subsequent frame in a second subsequent cycle and wherein the electromagnetic radiation pulse width and duration are variable during each cycle to adjust a level of light intensity caused by each pulse in the cycle, wherein variation of the integration time is related to the variation in the pulse width and the duration of the electromagnetic radiation pulse;
wherein the pixel array is actuated at the sensing interval that corresponds to the pulse interval of the emitter;
stopping the emitter from pulsing for a single iteration;
creating a dark frame by sensing the pixel array while the emitter is not pulsing a pulse of electromagnetic radiation;
creating one or more reference frames, based on pixel values in both the dark frame and in a plurality of previous dark frames, for use in removing fixed pattern noise;
removing fixed pattern noise from the image frame by subtracting stored reference data in the one or more reference frames from the image frame; and
creating a stream of images by combining a plurality of image frames to form a video stream.
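The reference-frame and subtraction steps can be sketched numerically. The claim requires the reference to be based on the current dark frame and a plurality of previous dark frames; exponential averaging is one assumed way to combine them (the weight `alpha` is illustrative).

```python
def update_reference(prev_ref, dark_frame, alpha=0.125):
    """Blend the newest dark frame into the stored reference frame.
    Rows are plain Python lists; real pipelines would use sensor buffers."""
    return [[(1 - alpha) * p + alpha * d for p, d in zip(pr, dr)]
            for pr, dr in zip(prev_ref, dark_frame)]

def remove_fpn(image_frame, reference):
    """Subtract the stored reference data from the image frame, leaving
    the scene content with the fixed pattern noise removed."""
    return [[i - r for i, r in zip(ir, rr)]
            for ir, rr in zip(image_frame, reference)]
```

Averaging many dark frames suppresses temporal noise in the reference, so the subtraction removes only the fixed (pattern) component.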

US Pat. No. 10,341,592

IMAGING ELEMENT, DRIVING METHOD, AND ELECTRONIC DEVICE

Sony Semiconductor Soluti...

1. An imaging device, comprising: a pixel including:
a first photoelectric converter;
a first transfer transistor coupled to the first photoelectric converter;
a first capacitance coupled to the first transfer transistor;
a reset transistor coupled to the first capacitance;
a second photoelectric converter;
a second transfer transistor coupled to the second photoelectric converter;
a second capacitance coupled to the second transfer transistor;
a third transfer transistor coupled between the first and the second capacitances; and
an amplification transistor coupled to the first capacitance, wherein a sensitivity of the first photoelectric converter is higher than a sensitivity of the second photoelectric converter, and wherein the second capacitance is configured to receive a power source voltage.

US Pat. No. 10,341,591

IMAGING DEVICE

PANASONIC INTELLECTUAL PR...

1. An imaging device comprising: a semiconductor layer; and
a pixel including
an impurity region of a first conductivity type, the impurity region being located in the semiconductor layer,
a photoelectric converter electrically connected to the impurity region, the photoelectric converter being located above the semiconductor layer,
a first transistor having a first gate, a first source and a first drain, one of the first source and the first drain being electrically connected to the impurity region,
a second transistor having a second gate of a second conductivity type different from the first conductivity type, a second source and a second drain, the second transistor including the impurity region as one of the second source and the second drain, the second gate being electrically connected to the impurity region, and
a third transistor having a third gate, a third source and a third drain, the third gate being electrically connected to the photoelectric converter.

US Pat. No. 10,341,590

METHODS AND APPARATUS FOR A CCD IMAGE SENSOR

SEMICONDUCTOR COMPONENTS ...

13. An imaging system, comprising: a memory device;
a processor coupled to the memory device; and
an image sensor coupled to the processor, the image sensor comprising:
a lateral overflow drain;
an active pixel array comprising a plurality of photosensitive regions arranged in rows and columns;
a center channel disposed along a region of the active pixel array substantially equidistant from two opposing edges of the active pixel array, wherein the center channel is located between and separates two adjacent photosensitive regions;
a center gate disposed above the center channel;
a first barrier disposed on a first side of the center channel;
a second barrier disposed on a second side of the center channel;
wherein the first and second barrier are disposed on opposing sides of the center channel; and
wherein the center channel is coupled to the lateral overflow drain.

US Pat. No. 10,341,589

IMAGING DEVICE AND CAMERA SYSTEM INCLUDING SENSE CIRCUITS TO MAKE BINARY DECISION

Sony Corporation, Tokyo ...

1. A light detecting device comprising: a first substrate including a plurality of avalanche photodiodes arranged in an array including columns and rows, the plurality of avalanche photodiodes including:
a first group of the plurality of avalanche photodiodes coupled to a first signal line arranged in an i-th column of the array, and
a second group of the plurality of avalanche photodiodes coupled to a second signal line arranged in an (i+1)-th column of the array; and
a second substrate bonded to the first substrate, the second substrate including:
an inverter configured to invert a signal based on at least one of a plurality of outputs from the plurality of avalanche photodiodes, and
a counter coupled to the first and second signal lines.

US Pat. No. 10,341,588

NOISE AWARE EDGE ENHANCEMENT

DePuy Synthes Products, I...

1. A digital imaging method for use with an endoscope in ambient light deficient environments comprising: illuminating an environment using a source of visible, infrared or ultraviolet electromagnetic radiation;
continuously focusing a scene onto a pixel array of a sensor;
sensing reflected electromagnetic radiation with said pixel array, wherein said pixel array generates image data;
creating an image frame from said image data;
detecting image textures and edges within the image frame;
enhancing textures and edges within the image frame;
retrieving from memory properties pertaining to a pixel technology comprising a known conversion gain for a given pixel or group of pixels and an applied sensor gain of the sensor to:
determine an expectation for a magnitude of noise within the image frame created by said sensor based on the known conversion gain for an individual pixel in the pixel array;
using said expectation for the magnitude of noise to control the enhancing of the textures and edges within the image frame; and
creating a stream of images by sequentially combining a plurality of image frames.

US Pat. No. 10,341,587

MATRIX-ARRAY SENSOR WITH TEMPORAL CODING WITHOUT ARBITRATION

1. A matrix-array sensor comprising a matrix of detection elements that are arranged in rows and in columns and a readout circuit for each column, the detection elements of one and the same column being linked to the corresponding readout circuit via a bus, each detection element comprising a sensor for generating an electric current having an intensity that is representative of a physical quantity to be detected, a charge integrator configured to accumulate charge generated by said sensor, a comparator configured to generate a trigger signal when a voltage level across the terminals of this comparator reaches a threshold level, and a bus access logic circuit configured to receive, as input, said trigger signal and, following reception of said signal, to attempt to transmit, over said bus, an address of said detection element in the column, wherein the detection elements of one and the same column have different bus access priority levels, and wherein said bus access logic circuit of each detection element is configured: to abandon transmission of said address and reset the charge integrator of the detection element if the bus is pre-empted by a detection element having a higher priority level;
to count a number of attempts made before being able to transmit said address; and
to communicate said number to said readout circuit along with said address of the detection element.

US Pat. No. 10,341,586

VISIBLE LIGHT IMAGE AND INFRARED IMAGE FUSION PROCESSING SYSTEM AND FUSION METHOD

WUHAN GUIDE INFRARED CO.,...

1. A fusion method, comprising,providing a visible light image and infrared image fusion processing system, comprising an image acquisition module, an image fusion module and an image display module,
wherein the image fusion module is connected with the image acquisition module and the image display module respectively; and
wherein the image acquisition module comprises an infrared image acquisition device and a visible light image acquisition device, and the image fusion module is used for fusion processing of an infrared image and a visible light image to obtain a fused image; the image display module transmits the fused image to a display device for display,
wherein lenses of the infrared image acquisition device and the visible light image acquisition device are mounted at the same position, and optical axes of the lenses are in the same direction and in parallel;
wherein the infrared image acquisition device and the visible light image acquisition device synchronously output video images frame by frame, and the field-angle range is registered according to the resolution so that the images cover the same scene area; and
wherein the region from which registered images are selected is preset, so that image registration can be achieved without complex calculation,
wherein the fusion method also comprises the following steps:
Step 1: transforming the selected target region of the visible light image into a grayscale format, either by converting the color image into a grayscale image or by selecting only the luminance component image of the color image;
Step 2: low-pass filtering the grayscale image or the luminance component of the visible light image to obtain the low-frequency component image of the visible light image; then computing the difference between the unfiltered visible grayscale image and the visible low-frequency component image to obtain a visible high-frequency component image;
Step 3: low-pass filtering the infrared image to obtain a low-frequency component image of the infrared image; then computing the difference between the unfiltered infrared grayscale image and the low-frequency component image of the infrared image to obtain a high-frequency component image of the infrared image;
Step 4: using a look-up table method to realize the pseudo-color enhancement of the low-frequency component of the infrared image and extracting the luminance component of the pseudo-color image;
Step 5: fusing the low-frequency component images by computing a weighted sum of the low-frequency luminance component of the infrared image and the low-frequency component of the visible grayscale image, the weights of each pixel summing to one;
Step 6: enhancing the high-frequency component images, adjusting the enhancement degree of the high-frequency component image of the visible light image and of the high-frequency component image of the infrared image by control parameters;
Step 7: superposing the enhanced high-frequency component image of the infrared image and the enhanced visible high-frequency component image on the fused low-frequency component image obtained in Step 5 to obtain a fused luminance component image; and
Step 8: replacing the luminance component in the infrared pseudo-color image with the fused luminance component image to obtain the final fused image.
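The frequency-split fusion of Steps 2 through 7 can be sketched as follows. This is a minimal numpy illustration under assumed choices: a box filter stands in for the claim's unspecified low-pass filter, and the weight `w_ir` and gains `g_vis`/`g_ir` are placeholder control parameters, not values from the patent.

```python
import numpy as np

def box_lowpass(img, k=5):
    """Simple box-filter low-pass (stand-in for the claim's low-pass step)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(vis_gray, ir_gray, w_ir=0.5, g_vis=1.0, g_ir=1.0):
    """Split both images into low/high-frequency bands, blend the lows with
    per-pixel weights summing to one, then add back the enhanced highs."""
    vis_low = box_lowpass(vis_gray)
    vis_high = vis_gray - vis_low                        # Step 2
    ir_low = box_lowpass(ir_gray)
    ir_high = ir_gray - ir_low                           # Step 3
    fused_low = w_ir * ir_low + (1.0 - w_ir) * vis_low   # Step 5
    return fused_low + g_vis * vis_high + g_ir * ir_high # Steps 6-7
```

Step 8 would then substitute this fused luminance back into the pseudo-colored infrared image.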

US Pat. No. 10,341,585

ELECTRONIC DEVICE

Samsung Electronics Co., ...

1. An electronic device comprising:a lens;
an optical filter having an opening and configured to transmit visible light and block at least one type of non-visible light, wherein the opening is asymmetric to an optical axis of the lens;
an image sensor including a visible light image sensor and a non-visible light image sensor, the visible light image sensor configured to sense the visible light and the non-visible light image sensor configured to sense the at least one type of non-visible light;
wherein the visible light image sensor comprises:
a blue light detecting device configured to one of selectively sense and selectively absorb light in a blue wavelength region;
a green light detecting device configured to one of selectively sense and selectively absorb light in a green wavelength region; and
a red light detecting device configured to one of selectively sense and selectively absorb light in a red wavelength region.

US Pat. No. 10,341,584

IMAGING DEVICE FOR DISTINGUISHING FOREGROUND AND SENSOR CHIP THEREOF

PIXART IMAGING INC., Hsi...

1. An imaging device, comprising:a condenser lens;
an image sensor configured to sense light penetrating the condenser lens, the image sensor comprising:
a pixel matrix comprising a plurality of infrared pixels, a plurality of first pixels and a plurality of second pixels arranged in a first direction and a second direction;
an opaque layer covering right upon a first region which is a part of the first pixels and right upon a second region which is a part of the second pixels, wherein the first region and the second region are mirror-symmetrically arranged in the first direction, one first pixel having the first region covered by the opaque layer and one second pixel having the second region covered by the opaque layer are two adjacent pixels in the first direction, and an uncovered region of the first pixel is adjacent to an uncovered region of the second pixel in the first direction;
a plurality of microlenses; and
an infrared filter layer covering upon the infrared pixels.

US Pat. No. 10,341,583

FULL FIELD VISUAL-MID-INFRARED IMAGING SYSTEM

Agilent Technologies, Inc...

1. A scanning system comprising:a first scanning station that generates component images of a specimen at visual wavelengths;
a second scanning station that generates component images of portions of said specimen at wavelengths that are different from wavelengths used by said first scanning station;
a stage that moves the specimen between the first and second scanning stations to allow the specimen to be scanned at each scanning station without being removed from said scanning system;
a controller that stores said component images, generates a compound image from a plurality of said component images, said compound image comprising a weighted sum of first and second ones of said component images, said controller displaying said compound image on a display controlled by said controller; and
a user interface adapted to control a weighting factor used in generating said weighted sum in response to user input, said controller redisplaying said compound image after said weighting factor is changed in response to user input;
wherein one of said component images has a first region with spatial resolution that is different from a spatial resolution in a second region of that component image.

US Pat. No. 10,341,582

ACTIVE SUBSTRATE AND IMAGING DEVICE

SHARP KABUSHIKI KAISHA, ...

1. An active substrate that results from forming on the same substrate a plurality of switching elements each of which includes a semiconductor layer, and a drive circuit that drives a plurality of scan signal lines that are connected to the switching elements, respectively,wherein the drive circuit includes N (N is equal to or greater than 2) shift register blocks in each of which a plurality of stage-wise shift registers, each of which outputs an output signal to one of the plurality of scan signal lines, are included,
wherein N shift registers in the same stage in the N shift register blocks are connected to N neighboring scan signal lines, respectively, and
wherein N shift registers that belong to each of at least one or more same stages, among a plurality of same stages in the N shift register blocks, output output signals in such a manner that two or more neighboring scan signal lines, among the N neighboring scan signal lines, are active at the same time.

US Pat. No. 10,341,581

RADIOGRAPHIC DETECTOR WITH HEAT SINK

Carestream Health, Inc., ...

1. A digital radiographic detector comprising:a multi-layered core comprising:
a two-dimensional array of photo-sensitive cells,
an integrated circuit chip in electrical communication with the array of photo-sensitive cells,
a printed circuit board in electrical communication with the integrated circuit chip, and
an electrically conductive plate made from a metal, the electrically conductive plate having a standoff integrally formed therewith, the standoff extending from the electrically conductive plate to the printed circuit board and electrically connected to the printed circuit board to provide an electrical ground plane for the printed circuit board, the electrically conductive plate further having a thermal extension integrally formed therewith, the thermal extension extending from the electrically conductive plate to the integrated circuit chip to provide a thermal exit path for heat generated by the integrated circuit chip; and
a housing to enclose the multi-layered core.

US Pat. No. 10,341,580

IMAGE PROCESSING DEVICE CONFIGURED TO CORRECT AN IMAGE SO AS TO DECREASE OUTPUT DATA

NIKON CORPORATION, Tokyo...

1. An image processing device, comprising:an image generation unit that is configured to generate an image at an optionally selected focal plane of a subject from output data of a plurality of photodetectors disposed at each of a plurality of microlenses; and
a correction unit that is configured to correct the image generated by the image generation unit so as to decompose overlapping of light from a first area of the subject and light from a second area of the subject in output data of one of the plurality of photodetectors on which the light from the first area of the subject and the light from the second area of the subject are incident.

US Pat. No. 10,341,579

CAMERA SYSTEM INCLUDING LENS WITH MAGNIFICATION GRADIENT

Google LLC, Mountain Vie...

8. A system comprising one or more computers and one or more storage devices storing instructions that are operable, when executed by the one or more computers, to cause the one or more computers to perform operations comprising:capturing a first digital image with a first digital camera including an image sensor optically coupled to a first lens, the first lens (i) being a non-rectilinear lens, and (ii) having a first magnification gradient exhibiting a peak magnification at a center of the first lens and a lowest magnification at edges of the first lens;
capturing a second digital image with a second digital camera including an image sensor optically coupled to a second lens, the second lens (i) being a non-rectilinear lens, and (ii) having a second magnification gradient exhibiting a peak magnification at a center of the second lens and a lowest magnification at edges of the second lens, wherein the second magnification gradient of the second lens is greater than the first magnification gradient of the first lens; and
fusing the first digital image with the second digital image to create a third digital image.

US Pat. No. 10,341,577

IMAGING APPARATUS

RICOH IMAGING COMPANY, LT...

1. An imaging apparatus comprising:an imaging device taking an image of an object formed by an optical system;
a photometry device for metering the brightness of an object;
first to third exposure factors for determining the exposure condition of said imaging device;
a first controller for setting said first exposure factor;
a second controller for setting said second exposure factor;
a first-exposure-factor-manually-setting exposure mode for manually setting said first exposure factor, and calculating and setting said second and third exposure factors depending on the set first exposure factor and the brightness value obtained by said photometry device; and
a first/second-exposure-factors-manually-setting exposure mode for manually setting said first and second exposure factors, and calculating and setting said third exposure factor depending on the set first and second exposure factors and the brightness value obtained by said photometry device;
when said second controller is operated in said first-exposure-factor-manually-setting exposure mode, said second exposure factor being manually set, and said first-exposure-factor-manually-setting exposure mode being changed to said first/second-exposure-factors-manually-setting exposure mode.

US Pat. No. 10,341,576

POLARIZATION SENSITIVE IMAGE SENSOR

X Development LLC, Mount...

1. A system, comprising:a device, comprising:
a first multi-element image sensor;
a second multi-element image sensor;
a polarizing layer positioned between the first and second multi-element image sensors,
wherein a portion of light having a first polarization state incident on the device along a first direction is transmitted through the first image sensor, is transmitted through the polarizing layer, and is detected by the second image sensor, and
light having a second polarization state orthogonal to the first polarization state incident on the device along the first direction is transmitted through the first image sensor, is blocked by the polarizing layer, and is substantially reflected by the polarizing layer, and
wherein at least some of the blocked light is detected by the first image sensor;
an electronic processing module in communication with the device,
wherein during operation the device detects incident light, sends signals to the electronic processing module, and the electronic processing module determines information about an intensity of the incident light and information about a polarization of the incident light based on the signals,
wherein the signals sent by the device comprises a first signal associated with the first multi-element image sensor and a second signal associated with the second multi-element image sensor, and
wherein the electronic processing module further determines the information about the polarization of the incident light based on both the first signal and the second signal.

US Pat. No. 10,341,575

METHOD AND SYSTEM FOR CAPTURING IMAGES OF A LIQUID SAMPLE

Biosurfit, S.A., Aveiro ...

1. A method of capturing images of a liquid sample flowing through a channel on a microfluidic cartridge, wherein the channel passes through a field of view of an imaging device, the method comprising:retaining the microfluidic cartridge relative to the imaging device such that the channel is disposed relative to the field of view;
stepping a focus mechanism of the imaging device through a first plurality of focus values;
capturing a plurality of images of the sample at each of the first plurality of focus values as the sample flows through the channel passing through the field of view of the imaging device to obtain a set of images of the liquid sample over time for each of the plurality of focus values;
determining a focus measure for each of the captured images; and
identifying for further processing a subset of the captured images based on the determined focus measures.

US Pat. No. 10,341,574

OPERATING A DEVICE TO CAPTURE HIGH DYNAMIC RANGE IMAGES

Apple Inc., Cupertino, C...

1. A non-transitory machine readable medium of a device that captures images, the medium storing a program that when executed by at least one processing unit generates an image of a scene, the program comprising sets of instructions for:capturing a plurality of images of the scene at different exposure levels, wherein the different exposure levels are selected based at least in part upon detected lighting conditions within the scene, comprising:
capturing and storing, prior to receiving input to generate the image, at least one image at a first exposure level;
detecting the lighting conditions within the scene based on an analysis of at least one of the at least one image captured and stored; and
capturing, upon receiving the input to generate the image, at least a different image at a second exposure level selected based at least in part upon the detected lighting conditions within the scene; and
compositing two or more of the plurality of images at different exposure levels to generate the image of the scene.
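The exposure-selection and compositing logic in this claim can be sketched in a few lines. The luminance thresholds, the ±2 EV offsets, and the hat-shaped per-pixel weighting below are illustrative assumptions, not Apple's implementation.

```python
def choose_second_exposure(mean_luma, base_ev=0.0):
    """Pick the second exposure (EV offset relative to the pre-captured
    frame) from a crude lighting analysis of that stored frame."""
    if mean_luma < 0.25:      # dark scene: brighten the second capture
        return base_ev + 2.0
    if mean_luma > 0.75:      # bright scene: protect highlights
        return base_ev - 2.0
    return base_ev

def composite(pixel_a, pixel_b):
    """Blend two exposures of the same pixel (values in [0, 1]),
    favouring the better-exposed sample: weights peak at mid-gray
    and fall to zero at clipping."""
    wa = 1.0 - abs(pixel_a - 0.5) * 2.0
    wb = 1.0 - abs(pixel_b - 0.5) * 2.0
    total = wa + wb
    if total <= 0.0:          # both samples clipped: fall back to average
        return (pixel_a + pixel_b) / 2.0
    return (wa * pixel_a + wb * pixel_b) / total
```

Applying `composite` over every pixel of the registered exposures yields the final HDR image of the scene.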

US Pat. No. 10,341,573

AIRCRAFT CONTROL METHOD AND APPARATUS AND AIRCRAFT

AUTEL ROBOTICS CO., LTD.,...

1. A method of controlling an aircraft, comprising:obtaining ambient luminance data, wherein the ambient luminance data indicates ambient luminance of an environment, wherein the aircraft is located in the environment;
determining a flight height of the aircraft;
determining whether the flight height is less than a preset low-altitude height threshold;
determining whether the ambient luminance data satisfies a luminance value required for normal running of a vision system of the aircraft; and
based on a determination that the flight height is less than the preset low-altitude height threshold and that the ambient luminance data fails to satisfy the luminance value, adjusting a working status of a light emitting apparatus on the aircraft.

US Pat. No. 10,341,571

IMAGE SENSORS WITH ELECTRONIC SHUTTER

INVISAGE TECHNOLOGIES, IN...

1. An image sensor, comprising:at least one pixel, the at least one pixel including a transistor to couple an overflow capacitor to a floating diffusion node;
wherein, under a low light condition, photocharge is to be collected in a floating diffusion, but not in an overflow node; and
wherein, under a high light condition, photocharge is to overflow into the overflow node.

US Pat. No. 10,341,570

ASSEMBLY AND METHOD FOR RESTRICTING INDEFINITE ONE-WAY ROTATION OF A ROTATABLE CAMERA

ADEMCO INC., Golden Vall...

8. A rotatable surveillance camera comprising:a stationary support structure securable to a mounting surface;
a rotatable gimbal ring supported by the stationary support structure, the gimbal ring having an annular interior surface provided with a slotted channel recessed therein, the slotted channel having opposing first and second ends and a length therebetween, the slotted channel having a spiral configuration along the interior surface and extending around the interior surface at least one full revolution between the opposing first and second ends;
a gimbal base having an annular rim defining a central opening and a body having at least one side wall defining an interior portion, the annular rim being securable against the gimbal ring, a slot being provided through at least a portion of the at least one side wall extending from the rim;
a flexible hook member secured to the gimbal base, the hook member having a proximal end and a distal end, the proximal end of the hook member being secured to the interior portion of the gimbal base, the hook member extending from the proximal end through the slot to the distal end outside of the gimbal base, a tab being formed on the distal end of the flexible hook member, the tab being engagable within the slotted channel;
a camera assembly having a rotatable camera, the camera assembly being securable to a portion of at least one of the gimbal ring and gimbal base opposite the stationary support structure;
wherein the slotted channel is configured for receiving the tab of the hook member and being slidably rotated around said tab between the opposing first and second ends during rotation of the gimbal ring relative the gimbal base.

US Pat. No. 10,341,569

METHOD AND APPARATUS FOR VARYING FOCAL LENGTH OF CAMERA DEVICE, AND CAMERA DEVICE

TENCENT TECHNOLOGY (SHENZ...

1. A method for varying a focal length of a camera device, comprising:obtaining a position where a single finger of a user first touches a touch screen of the camera device;
when it is detected that the single finger of the user touches the touch screen of the camera device and slides, obtaining a distance and a direction that the single finger of the user slides on the touch screen of the camera device;
obtaining an image zooming multiple according to the distance and the direction that the single finger of the user slides on the touch screen of the camera device; and
zooming, according to the image zooming multiple and by using the position on the camera device where the single finger of the user touches the touch screen of the camera device as a center, an image currently taken by the camera device, so as to vary the focal length of the camera device, comprising:
determining an image zooming multiple variation of the image according to the distance that the single finger of the user slides on the touch screen of the camera device;
determining whether to zoom in or zoom out the image according to the direction that the single finger of the user slides on the touch screen of the camera device, and at least one of a preset focal length shortening direction or a preset focal length lengthening direction; and
determining the image zooming multiple of the camera device according to an image zooming multiple for current image zooming of the camera device and at least one of an upper limit or a lower limit of an image zooming multiple range corresponding to the camera device.
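The slide-to-zoom mapping this claim describes can be sketched as a single clamped function. The pixels-per-step ratio, step size, direction labels, and zoom-multiple range below are hypothetical values, not taken from the patent.

```python
def zoom_multiple(current, slide_px, direction,
                  px_per_step=100, step=0.25,
                  lengthen_dir="up", lower=1.0, upper=8.0):
    """Map a one-finger slide to a new image zooming multiple.

    The slide direction, compared against the preset focal-length
    lengthening direction, decides zoom in vs. zoom out; the slide
    distance decides the variation; the result is clamped to the
    camera's zoom-multiple range.
    """
    delta = (slide_px / px_per_step) * step
    if direction != lengthen_dir:   # preset focal-length shortening direction
        delta = -delta
    return max(lower, min(upper, current + delta))
```

The image would then be zoomed by this multiple about the first touch position, as the claim's final limitation requires.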

US Pat. No. 10,341,568

USER INTERFACE TO ASSIST THREE DIMENSIONAL SCANNING OF OBJECTS

QUALCOMM Incorporated, S...

1. A user device to assist with three dimensional scanning of an object, comprising:a processor;
a scanner coupled to the processor and configured to perform a three dimensional scan of the object;
a display to display a graphical user interface (GUI), the display coupled to the processor;
a memory coupled to the processor and the display, the memory including one or more instructions that when executed by the processor cause the GUI to:
display a target marker for the object being scanned;
display a visual boundary;
display a first scanner position marker, located on the visual boundary, the first scanner position marker indicating an orientation of the scanner and a preferred position of the scanner in relation to the target marker; and
display a sight to assist a user to move the scanner to the preferred position,
wherein the sight extends from the visual boundary along the orientation of the scanner.

US Pat. No. 10,341,567

PHOTOGRAPHING APPARATUS

RICOH IMAGING COMPANY, LT...

1. A photographing apparatus comprising:an imaging-plane tilter configured to tilt an imaging plane, formed by a photographing optical system, relative to a plane that is orthogonal to an optical axis direction of said photographing optical system;
a focus detector provided with a plurality of focus detection areas;
a tilt controller configured to control said imaging-plane tilter to tilt said imaging plane based on focus deviation amounts of said plurality of focus detection areas;
a focus deviation-amount detector configured to detect a focus deviation amount for each of the plurality of focus detection areas;
an auto-focus driver configured to drive a focal adjustment optical element to an in-focus position based on at least one of the focus deviation amounts; and
a calculator configured to calculate, based on the focus deviation amounts of the plurality of focus detection areas detected by the focus deviation-amount detector, a tilt correction amount for tilting said imaging plane so that each of the focus deviation amounts becomes a minimum value.

US Pat. No. 10,341,566

METHOD AND APPARATUS FOR IMPLEMENTING A DIGITAL GRADUATED FILTER FOR AN IMAGING APPARATUS

CLEAR IMAGING RESEARCH, L...

1. A method for use in an imaging device for capturing an image of a high-dynamic range subject, the method including:displaying a preview of the subject to be captured in a display of the device;
capturing multiple images of the subject by the device, each image comprising parts of highlights and parts of shadows;
determining a light intensity dynamic range for parts of the subject;
determining a number of images to be combined to form parts of a final image based, at least in part, on the dynamic range of the parts of the subject;
combining the multiple images to form a final image, wherein the combining includes the determined number of images to be combined to form different parts of the final image;
displaying the final image in the display of the device; and
storing the final image.

US Pat. No. 10,341,565

SELF CORRECTING ADAPTIVE LOW LIGHT OPTICAL PAYLOAD

Raytheon Company, Waltha...

1. A method for capturing images, the method comprising:capturing, during an exposure period with at least one sensor on a moving imaging platform, frames of a scene in a first photon-rich spectral band and in a second photon-poor spectral band having a lower light level than the first photon-rich spectral band, the frames captured in the first photon-rich spectral band and the frames captured in the second photon-poor spectral band including identical image motion induced by a relative motion between the imaging platform and the scene, wherein the relative motion is unknown and not calculated;
calculating one or more transformations based on measured changes in inter-frame scenes captured in the first photon-rich spectral band to compensate for the induced image motion, wherein the induced image motion includes at least one of rotation, scale, and anamorphic stretch;
digitally transforming the captured frames of the second photon-poor spectral band with the one or more transformations compensating for the induced image motion in the frames captured in the second photon-poor spectral band to remove effects of the induced image motion; and
summing a plurality of successive compensated frames captured in the second photon-poor spectral band to obtain higher signal to noise ratio (SNR) imagery in the second photon-poor spectral band compared to successive uncompensated frames captured in the second photon-poor spectral band,
wherein capturing frames of the scene comprises interleaving first spectral band exposure times with second spectral band exposure times, the first spectral band exposure times being shorter than the second spectral band exposure times.
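The transform-and-sum step for the photon-poor band can be sketched as below. For simplicity this handles only integer translations (the claim's transforms also cover rotation, scale, and anamorphic stretch), and it assumes the per-frame shifts were already estimated from the photon-rich band.

```python
import numpy as np

def compensate_and_sum(poor_frames, shifts):
    """Undo per-frame (dy, dx) image motion, estimated from the
    photon-rich spectral band, in the photon-poor frames, then sum
    the aligned frames to raise SNR."""
    acc = np.zeros_like(poor_frames[0], dtype=float)
    for frame, (dy, dx) in zip(poor_frames, shifts):
        acc += np.roll(frame, shift=(-dy, -dx), axis=(0, 1))
    return acc
```

Summing N aligned frames grows signal linearly while uncorrelated noise grows as sqrt(N), which is the SNR gain the claim targets.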

US Pat. No. 10,341,564

SYSTEMS AND METHODS FOR STABILIZING VIDEOS

GoPro, Inc., San Mateo, ...

11. A method for stabilizing videos, the method performed by an image capture device including a housing and one or more processors, the housing carrying an optical element, an image sensor, and a position sensor, the optical element configured to guide light within an optical field of view to the image sensor, the optical field of view being greater than a capture field of view for generating video content, the method comprising:generating, by the image sensor, an image output signal based on light that becomes incident thereon during a capture duration, the image output signal conveying image information that defines images with the optical field of view;
generating, by the position sensor, a position output signal based on positions of the housing during the capture duration, the position output signal conveying position information that characterizes positions of the housing at different moments within the capture duration, the characterized positions of the housing including rotational positions of the housing, the position information being conveyed by the position output signal of the position sensor independent of the image information;
determining, by the one or more processors, an observed trajectory of the housing during the capture duration based on the position information, the observed trajectory reflecting the positions of the housing at the different moments within the capture duration, the observed trajectory including a first portion corresponding to a first moment within the capture duration and a second portion corresponding to a second moment subsequent to the first moment within the capture duration;
determining, by the one or more processors, a capture trajectory of the housing based on a look ahead of the observed trajectory, the capture trajectory reflecting actual and/or virtual positions of the housing from which orientations of the capture field of view are determined, the look ahead of the observed trajectory including use of a subsequent portion of the observed trajectory to determine a preceding portion of the capture trajectory such that a portion of the capture trajectory corresponding to the first portion of the observed trajectory, which corresponds to the first moment within the capture duration, is determined based on the second portion of the observed trajectory, which corresponds to the second moment within the capture duration, the capture trajectory having smoother changes in the positions of the housing than the observed trajectory;
determining, by the one or more processors, the orientations of the capture field of view for the images with respect to the optical field of view of the images based on the capture trajectory of the housing, the capture field of view including a smaller field of view within the optical field of view; and
generating, by the one or more processors, the video content based on a punch-out of visual content of the images within the capture field of view.
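The look-ahead smoothing this claim recites, where a preceding portion of the capture trajectory is determined from a subsequent portion of the observed trajectory, can be illustrated with a one-dimensional sketch; the moving-average filter and window size are assumptions for illustration only.

```python
def capture_trajectory(observed, look_ahead=4):
    """Derive a capture trajectory from an observed position trajectory
    by averaging each sample with the next `look_ahead` samples, so the
    smoothed path anticipates upcoming motion instead of lagging it."""
    out = []
    for i in range(len(observed)):
        window = observed[i:i + look_ahead + 1]
        out.append(sum(window) / len(window))
    return out
```

A step change in the observed positions is spread over the preceding samples, giving the smoother changes the claim requires; the punch-out orientation for each image would then follow this capture trajectory.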

US Pat. No. 10,341,563

CAMERA DEVICE AND SHAKE CORRECTION METHOD

PANASONIC INTELLECTUAL PR...

1. A camera device which is fixed to a fixing target surface and is provided in a place in which shaking occurs on the fixing target surface, the device comprising:a lens unit that includes a zoom lens on which subject light is incident, and is capable of changing a zoom magnification of the zoom lens;
an imaging element that images an image based on the subject light;
a shake sensor that detects shake of the camera device;
a shake correction mechanism that holds a holder which holds the imaging element, and performs shake correction on a captured image captured by the imaging element through driving of the holder based on a detection value of the shake sensor;
a processor that causes the lens unit to change the zoom magnification of the zoom lens based on the detection value of the shake sensor, cuts a part of the captured image on which the shake correction is performed by the shake correction mechanism through a zoom process according to the changed zoom magnification, and outputs the cut part of the captured image; and
a memory that stores movable amount information acquired by associating the zoom magnification of the zoom lens with a movable amount of the imaging element based on the driving of the holder,
wherein, when a movement amount of the imaging element based on the detection value of the shake sensor exceeds the movable amount of the imaging element corresponding to the zoom magnification of the zoom lens, the processor causes the lens unit to change the zoom magnification of the zoom lens, cuts the part of the captured image on which the shake correction is performed by the shake correction mechanism through the zoom process, and outputs the cut part of the captured image.

US Pat. No. 10,341,562

SYSTEMS AND METHODS FOR TRANSLATIONAL MOTION CORRECTION

GoPro, Inc., San Mateo, ...

1. A system that provides translational motion correction for videos, the system comprising:one or more physical processors configured by machine-readable instructions to:
obtain video information generated by an image sensor, the video information defining images of a video based on light received within a field of view of the image sensor during a capture period;
obtain motion information of the image sensor, the motion information of the image sensor characterizing motion of the image sensor during the capture period;
estimate relative positions of the image sensor between different moments within the capture period based on the motion information;
determine depth information for a portion of an environment of the image sensor based on the video information generated at the different moments and the relative positions of the image sensor between the different moments; and
apply a translational motion correction to one or more of the images based on the depth information and the relative positions of the image sensor, the translational motion correction warping the one or more of the images to compensate for translational motion of the image sensor during the capture period and to stabilize the video.
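The correction in the claim above is depth-dependent: a given camera translation displaces nearby scene points more than distant ones, which is why the warp needs both the estimated sensor positions and the depth information. A minimal one-dimensional sketch under a pinhole-camera assumption follows; the function names and the per-pixel shift model are illustrative, not taken from the patent.

```python
# Hedged sketch: pixel shift induced by camera translation falls off
# with scene depth (pinhole model: shift = focal_px * translation / depth).
# All names here are illustrative.

def translational_shift(focal_px, translation_m, depth_m):
    """Pixel displacement caused by camera translation at a given depth."""
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return focal_px * translation_m / depth_m

def stabilize_row(row, focal_px, translation_m, depth_row_m):
    """Warp one image row to compensate the translation
    (nearest-neighbor resampling, 1-D for brevity)."""
    out = [0] * len(row)
    for x, depth in enumerate(depth_row_m):
        dx = translational_shift(focal_px, translation_m, depth)
        src = round(x + dx)  # sample where the displaced pixel came from
        if 0 <= src < len(row):
            out[x] = row[src]
    return out
```

Because the shift scales with 1/depth, a flat global translation of the image cannot stabilize both foreground and background at once; the per-pixel depth term is what distinguishes this from rotational stabilization.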

US Pat. No. 10,341,561

DISTRIBUTED IMAGE STABILIZATION

Facebook, Inc., Menlo Pa...

1. A computer-implemented method for stabilizing captured video frames, implemented in a distributed computer network, comprising:
analyzing, using a single one of one or more master processors, frames of a video to generate a list of per-frame transform for mitigating camera movement for each frame;
partitioning, after analyzing, using the one or more master processors, the video into multiple video segments, wherein each video segment is assigned a bit rate based in part on a complexity of each video segment, wherein the complexity of each video segment is associated with an amount of bits in order to achieve a uniform quality in the video, and wherein each video segment comprises a plurality of consecutive frames and one or more overlapping frames of the video;
dividing, using the one or more master processors, the list of per-frame transform into multiple portions corresponding to the multiple video segments;
providing, from the one or more master processors to a plurality of worker processors, a different one of the multiple video segments generated from the partitioning, metadata indicating the one or more overlapping frames, and a corresponding portion of the list of per-frame transform;
performing, by the plurality of worker processors, image stabilization on the received video segment and the corresponding portion of the list of per-frame transform to produce stabilized video segments, wherein a first worker processor of the plurality of worker processors produces a stabilized video segment including the one or more overlapping frames and a second worker processor of the plurality of worker processors skips the one or more overlapping frames based on the metadata;
conveying the stabilized video segments from the plurality of worker processors to the one or more master processors; and
generating, by the one or more master processors, a stabilized video from the stabilized video segments.
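The master-side partitioning in the claim above (segments of consecutive frames that share one or more overlapping frames with their neighbors) can be sketched as an index computation. This is an illustrative reading, not the patented implementation; the function name and the single-overlap default are assumptions.

```python
# Hedged sketch of master-side video partitioning with overlapping
# frames between neighboring segments, so each worker can stabilize
# across its segment boundary. Names are hypothetical.

def partition_with_overlap(n_frames, segment_len, overlap=1):
    """Return (start, end) frame-index pairs covering n_frames;
    consecutive segments share `overlap` frames."""
    segments = []
    start = 0
    while start < n_frames:
        end = min(start + segment_len, n_frames)
        segments.append((start, end))
        if end == n_frames:
            break
        start = end - overlap  # next segment re-reads the overlap frames
    return segments
```

The per-frame transform list would then be sliced with the same index pairs, and the metadata telling the second worker to skip the overlap frames is just the `overlap` count at each boundary.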

US Pat. No. 10,341,560

CAMERA MODE SWITCHING BASED ON LIGHT SOURCE DETERMINATION

GOOGLE LLC, Mountain Vie...

1. A method for controlling a camera mode, comprising:
in a camera including a controller, memory storing instructions for execution by the controller, a sensor array comprising a plurality of sensors, an IR filter, and a lens assembly that is configured to focus light on the sensor array:
operating the camera in a night mode, wherein while in the night mode the IR filter is not interposed between the lens assembly and the sensor array, including:
receiving at the sensor array ambient light that is not filtered by the IR filter;
determining whether the received ambient light is due to an IR light source or a light source other than an IR light source; and
in response to a determination that the received ambient light is due to an IR light source, continuing operation of the camera in the night mode.
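One common way to make the IR-versus-visible determination in the claim above is channel balance: with the IR filter removed, IR illumination leaks nearly equally into the R, G, and B channels, while visible light sources produce channel imbalance. The sketch below uses that heuristic; the threshold, the day-mode switch, and all names are illustrative assumptions, not the patented test.

```python
# Hedged sketch: decide whether ambient light looks like an IR source
# by how balanced the mean R/G/B responses are. Threshold is illustrative.

def is_ir_dominant(r_mean, g_mean, b_mean, ratio_thresh=0.1):
    """True if the channel means are nearly equal (IR-like)."""
    mx = max(r_mean, g_mean, b_mean)
    mn = min(r_mean, g_mean, b_mean)
    if mx == 0:
        return True  # dark scene: no evidence of visible light
    return (mx - mn) / mx < ratio_thresh

def next_mode(current_mode, r, g, b):
    """Continue in night mode under IR light; otherwise switch
    (the day-mode branch is an assumption beyond the quoted claim)."""
    if current_mode == "night" and not is_ir_dominant(r, g, b):
        return "day"
    return current_mode
```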

US Pat. No. 10,341,559

IMAGING SYSTEM, METHOD, AND APPLICATIONS

1. A method for forming a panoramic image, comprising:
providing a panoramic imaging system having a plurality of discrete imaging systems disposed in a side-by-side array, each of the discrete imaging systems characterized by a field of view; and
constraining a plurality of chief rays that strike along an edge of the field of view of every one of the discrete imaging systems to be substantially parallel to another plurality of chief rays that strike along an adjacent edge of the field of view of an immediately adjacent one of the discrete imaging systems such that all of the chief rays that strike along the edge of the field of view of the discrete imaging systems appear to converge to a common NP point when viewed from object space and the field of view of each of the plurality of discrete imaging systems conjoins with but does not overlap the field of view of the immediately adjacent one of the discrete imaging systems.

US Pat. No. 10,341,558

SYSTEM AND APPARATUS FOR INCREASING QUALITY AND EFFICIENCY OF FILM CAPTURE AND METHODS OF USE THEREOF

1. A method, comprising:
receiving, by a system including a processor, first image data of a scene captured at a first resolution;
receiving second image data of the scene captured at a second resolution less than the first resolution;
identifying an object represented by a portion of the first image data;
determining a position of the object as represented in the second image data; and
generating a third image of the scene from the first image data and the second image data, wherein the third image represents the object according to the position of the object as represented in the second image data.

US Pat. No. 10,341,557

IMAGE PROCESSING APPARATUSES AND METHODS

QUANTA COMPUTER INC., Gu...

1. A mobile device, comprising:
a wide-angle camera, having a first Field Of View (FOV), configured to capture a first image of a first area of a scene;
an auxiliary camera, having a second FOV which is narrower than the first FOV, configured to capture a second image of a second area of the scene,
wherein the wide-angle camera and the auxiliary camera are disposed on the same surface of the image processing apparatus, and synchronized to capture the first image and the second image, respectively, and
wherein the first image in the area has a first resolution and the second image has a second resolution which is higher than the first resolution, and the first FOV is greater than 180 degrees, and the second FOV is between 60 and 75 degrees; and
a controller, configured to determine a portion of the first image, which corresponds to the second area of the scene, and superimpose the second image on the portion of the first image to generate an enhanced image.
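The controller's final step above is a paste of the higher-resolution narrow-FOV image over the matching portion of the wide-angle image. A minimal sketch with images as nested pixel lists follows; in practice the offset would come from camera calibration and the paste would involve rescaling and blending, none of which the sketch attempts. All names are illustrative.

```python
# Hedged sketch: superimpose the narrow-FOV (higher-resolution) image
# onto the corresponding region of the wide-angle image.
# Images are lists of rows; offsets are assumed known from calibration.

def superimpose(wide, tele, top, left):
    """Return a copy of `wide` with `tele` pasted at (top, left),
    clipped at the borders."""
    out = [row[:] for row in wide]
    for y, row in enumerate(tele):
        for x, px in enumerate(row):
            yy, xx = top + y, left + x
            if 0 <= yy < len(out) and 0 <= xx < len(out[0]):
                out[yy][xx] = px
    return out
```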

US Pat. No. 10,341,556

IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREFOR

Canon Kabushiki Kaisha, ...

1. An image capturing apparatus comprising:
an image sensor having a plurality of photoelectric conversion portions that correspond to each of a plurality of microlenses arranged in a matrix;
a control circuit that controls read-out from the image sensor by either of a first read-out control and second read-out control, wherein the first read-out control is to read out signals from the plurality of photoelectric conversion portions so as to be able to obtain pupil-divided signals, and the second read-out control is to combine signals of the plurality of photoelectric conversion portions corresponding to each microlens and read out an image signal;
a setting circuit that sets rows to be read out by the first read-out control among rows that include a focus detection area for which focus detection is performed;
an amplification circuit that amplifies a signal read out from the image sensor with a gain set in accordance with an exposure state; and
a signal processing circuit that performs signal processing on an image signal corresponding to each microlens obtained from the signals read out by the first read-out control and on the image signal corresponding to each microlens read out by the second read-out control, using an image signal of neighboring rows,
wherein the setting circuit sets the rows to be read out by the first read-out control according to the gain.

US Pat. No. 10,341,555

CHARACTERIZATION OF A PHYSICAL OBJECT BASED ON ITS SURFACE ROUGHNESS

Chromologic LLC, Monrovi...

1. A method for characterizing an object, said method comprising:
striking a surface of the object with light from at least two incoherent light sources, each incoherent light source oriented to simultaneously generate incoherent raking light onto a feature on a surface of the object at grazing angles of twenty degrees or less relative to the surface of the object, wherein the at least two incoherent light sources face the surface of the object such that the incoherent light travels directly to the surface of the object, wherein the at least two incoherent light sources are spaced apart such that the at least two incoherent light sources shine light onto the object from different directions;
obtaining a digital image of micro features on the surface of the object;
converting said image into electrical signals comprising micro surface features based on spatial relationships of multiple surface features; and
processing said signals into unique identifying reference information for characterizing the object; and
comparing the unique identifying reference information to corresponding features of a reference.

US Pat. No. 10,341,554

METHOD FOR CONTROL OF CAMERA MODULE BASED ON PHYSIOLOGICAL SIGNAL

Samsung Electronics Co., ...

1. A portable communication device, comprising:
a touchscreen display disposed in a front surface of the portable communication device;
a fingerprint sensor disposed in a rear surface of the portable communication device;
an image sensor; and
a processor adapted to:
display, via the touchscreen display, a preview image being obtained via the image sensor;
receive, via the fingerprint sensor, a user input while the preview image is displayed; and
perform photographing of a panorama image based at least in part on the user input, the performing including:
displaying a guide indication indicative of a specified direction;
capturing, using the image sensor, a plurality of images based at least in part on a determination that a movement of the portable communication device corresponds to the specified direction; and
generating the panorama image using the plurality of images.

US Pat. No. 10,341,553

IMAGE CAPTURING DEVICE WITH TOUCH SCREEN FOR ADJUSTING CAMERA SETTINGS

Apple Inc., Cupertino, C...

1. A method of capturing digital images, comprising:
displaying a first image of a scene on a display screen of a digital camera, wherein the first image comprises a plurality of pixels;
receiving a first selection on the display screen;
identifying a first object in the scene at or near a location of the first selection;
determining a selected region based on a first set of pixels corresponding to the first object;
adjusting focus parameters of the digital camera based on the selected region;
adjusting exposure parameters of the digital camera based on the selected region; and
capturing a second image using the adjusted focus and exposure parameters.

US Pat. No. 10,341,552

INFRARED RECORDING DEVICE AND INFRARED RECORDING METHOD

MISSION INFRARED ELECTRO ...

1. An infrared recording device having a processing and control function of part or all elements realized via special-use circuits, general processors, or Field Programmable Gate Arrays (FPGAs) comprising:
an acquiring part, wherein the acquiring part acquires thermal imaging data;
an information designating part, wherein the information designating part designates object information as special object information based on multiple object information stored in an information storing part;
the special object information acquires object instructing information displayed specially;
the object information being information representing identity of an object comprising a position, a type, a number of the object, or a combination thereof to uniquely differentiate multiple objects;
a display controlling part, wherein the display controlling part controls a display part to display an infrared thermal image generated by the thermal imaging data and simultaneously to display a specified amount of object instructing information according to a sequence of object information, based on the multiple object information stored in the information storing part;
the object instructing information acquired according to a specified amount of the object information being displayed and the object instructing information acquired according to the special object information being displayed specially in a display mode different from the display mode of other object instructing information;
an image display area displaying the infrared thermal image and an information display area displaying a specified amount of the object instructing information being configured to facilitate simultaneous observation of the infrared thermal image and the specified amount of the object instructing information by a user;
each object information stored in the information storing part comprising attribute information having a plurality of specified attributes;
the object instructing information being acquired according to all attribute information or specified part attribute information of the object information;
a task setting part, wherein the task setting part sets a filter condition based on the object information stored in the information storing part;
a photographing task being divided into a plurality of subtasks according to the filter condition, and the object information in the subtask being then sequenced and finally photographed;
a record part, wherein in response to record instructing operation or according to a specified record condition, the record part records specified infrared data associated with information related to the designated special object information, the infrared data being the thermal imaging data acquired by the acquiring part and/or data acquired after specified processing for the thermal imaging data acquired by the acquiring part;
wherein the record part generates a thermal image file according to the specified infrared data, the record part comprises a file name generating unit, wherein the file name generating unit generates a file name of the thermal image file, the file name comprises attribute information of the designated special object information, and a separation mark is located between at least two attribute information;
in response to switch instructing operation or according to a specified switch condition, the information designating part switching the designated special object information according to a sequence of the object information, based on the object information stored in the information storing part.

US Pat. No. 10,341,551

METHOD, AN APPARATUS AND A COMPUTER PROGRAM PRODUCT FOR FOCUSING

GRUNDIUM OY, Tampere (FI...

1. A method for focusing, comprising:
receiving a first image stack of a first field of view, the first image stack comprising images captured with different focus from the first field of view;
determining, from the first image stack, a first spatial distribution of focus depths in which different areas in the first field of view are in focus;
determining a first local sample thickness and a first sample tilt in the first field of view based on the first spatial distribution of focus depths;
receiving a third image stack of a third field of view, the third image stack comprising images captured with different focus from the third field of view and wherein the first field of view and the third field of view are adjacent fields of view for a second field of view;
determining, from the third image stack, a third spatial distribution of focus depths in which different areas in the third field of view are in focus;
determining a third local sample thickness and a third sample tilt based on the third spatial distribution of focus depths; and
estimating, based on the first local sample thickness, the first sample tilt, the third local sample thickness and the third sample tilt, a focus setting for capturing a second image stack from the second field of view.
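The estimation step above predicts a focus setting for the middle (second) field of view from the thickness and tilt measured in the two flanking fields. One plausible first-order reading is to extrapolate each neighbor's focal plane toward the midpoint using its tilt and average the two predictions; this linear model, the unit field-of-view spacing, and all names are assumptions, not the patented method.

```python
# Hedged sketch: predict the focus depth of an unvisited middle field
# of view from its two neighbors' measured focus depth, local sample
# thickness, and tilt. Linear (first-order) model, illustrative only.

def estimate_focus(depth1, thickness1, tilt1, depth3, thickness3, tilt3):
    """Return (focus_depth, expected_thickness) for the field between
    two adjacent fields of view (assumed unit spacing)."""
    fov_pitch = 1.0  # assumed distance between field-of-view centers
    # Extrapolate each neighbor's focal plane half a pitch toward the middle.
    pred_from_1 = depth1 + tilt1 * fov_pitch / 2
    pred_from_3 = depth3 - tilt3 * fov_pitch / 2
    focus = (pred_from_1 + pred_from_3) / 2
    thickness = (thickness1 + thickness3) / 2
    return focus, thickness
```

The payoff of such an estimate is that the second field of view can be captured with a single, well-placed image stack instead of a full focus sweep.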

US Pat. No. 10,341,550

END FACE INSPECTION APPARATUS AND FOCUSED IMAGE DATA ACQUISITION METHOD

ANRITSU CORPORATION, Kan...

1. An end face inspection apparatus including an optical system which forms an image of an end face of a held test object on an image acquisition unit, and inspecting the end face of the test object by using acquired image data, the apparatus comprising:
focusing degree changing means for changing a distance between the end face of the test object and a focal position of the optical system; and
a control unit that acquires a series of image data which is output from the image acquisition unit at a preset time interval while the distance between the end face of the test object and the focal position of the optical system is changed by the focusing degree changing means, determines whether or not each piece of the image data is focused, and selects focused image data as image data for end face inspection.

US Pat. No. 10,341,549

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGE CAPTURE APPARATUS CAPABLE OF REDUCING AN IMAGE DATA AMOUNT

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:
a processor and a memory which function as:
an obtaining unit configured to obtain data that is based on light beams that have passed through partial pupil areas obtained by dividing an exit pupil of an imaging optical system into a predetermined number Np; and
a reduction unit configured to reduce a bit depth or a number of tones of signals constituting the data based on the predetermined number Np,
wherein the reduction unit reduces the bit depth to a value (>0) that is less than log2 Np, or reduces the number of tones to be greater than or equal to 1/Np.
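One plausible reading of the rationale behind the claim above: summing the Np pupil-divided signals grows the required dynamic range by log2(Np) bits, so each sub-signal can shed up to log2(Np) bits of depth without losing range in the combined image. The sketch below illustrates that arithmetic only; the function names and the simple right-shift quantization are assumptions, not the patented reduction unit.

```python
import math

# Hedged sketch: bits recoverable when Np pupil-divided signals are
# later combined, and a simple truncation that sheds those bits.

def reduction_bits(np_areas):
    """Largest whole number of bits, at most log2(Np), that the
    per-sub-signal depth could be reduced by (illustrative reading)."""
    return int(math.log2(np_areas))

def reduce_bit_depth(sample, bits):
    """Drop the `bits` least-significant bits of a sample."""
    return sample >> bits
```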

US Pat. No. 10,341,548

GUIDED PHOTOGRAPHY AND VIDEO ON A MOBILE DEVICE

EBAY INC., San Jose, CA ...

1. A system comprising:
a first user device including:
a first application configured to:
receive data representing an indication that a user intends to photograph an object;
determine a category for the object;
use the category for the object to retrieve a script corresponding to the category from a plurality of possible scripts;
access an account stored for the user in a database to obtain a listing of registered devices for the user; and
trigger a camera application on one of the registered devices for the user to execute the script, causing the camera application to present instructions to the user as to how to capture images appropriate to the category using the one of the registered devices.

US Pat. No. 10,341,547

IMAGING DEVICE FOR SCENES IN APPARENT MOTION

Urugus S.A., Montevideo ...

1. An apparatus for imaging a scene having apparent motion, the apparatus comprising:
an area imaging device having a plurality of pixel sensors;
a positioning mechanism, the positioning mechanism operable to move the area imaging device along a tracking axis; and
a control module configured to:
direct the positioning mechanism to move the area imaging device in one or more cycles such that the area imaging device is moved, in each of the one or more cycles, forward along the tracking axis in a direction that is substantially parallel with a direction of the apparent motion and at a tracking speed that compensates for a speed of the apparent motion;
direct the area imaging device to take at least a first exposure during each of the one or more cycles and at least a second exposure during the same or different cycle;
analyze the first exposure and the second exposure for differences;
determine, based upon the differences between the first exposure and the second exposure, at least one of a run length, a second tracking speed, or a tracking axis orientation; and
direct the positioning mechanism to move the area imaging device in a subsequent cycle using at least one of the run length, the second tracking speed, or the tracking axis orientation.

US Pat. No. 10,341,546

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

SONY CORPORATION, Tokyo ...

1. An image processing apparatus, comprising:
a ray reproduction section configured to reproduce rays to be incident to a virtual lens, which has a synthetic aperture configured from a plurality of image pickup sections that pick up images at a plurality of visual points, from a real space point in a real space; and
a light condensing processing section configured to perform a light condensing process in which
positioning of a position at which the rays reproduced by the ray reproduction section are condensed on a virtual sensor through an emulation lens of an emulation target is performed depending upon an image plane shift position that is a position on the virtual sensor, which is reached by a principal ray emitted from the real space point through the emulation lens, and
the ray is condensed on the virtual sensor,
wherein the ray reproduction section and the light condensing processing section are each implemented via at least one processor.

US Pat. No. 10,341,545

WEARABLE APPARATUS WITH WIDE VIEWING ANGLE IMAGE SENSOR

OrCam Technologies Ltd., ...

21. A method for capturing image data, the method comprising:
capturing image data of an environment of a user, wherein a field of view of the image sensor includes at least a portion of a head of the user;
identifying a head of the user in the image data to obtain a direction the head of the user is facing with respect to a body of the user;
based on the direction the head of the user is facing, activating the at least one microphone to record sound in the direction the head is facing;
identifying a direction of sound received by the at least one microphone; and
after identifying the direction of the sound, adjusting a pixel resolution for capturing subsequent image data at a higher resolution.

US Pat. No. 10,341,544

DETERMINING A MATCHING SCORE BETWEEN USERS OF WEARABLE CAMERA SYSTEMS

OrCam Technologies Ltd., ...

1. A server for determining a matching score between users of wearable camera systems, the server comprising:
a memory for storing image data received from the wearable camera systems, wherein each wearable camera system is configured to capture images from an environment of a corresponding user and produce image data from the captured images; and
at least one processing device associated with the server and programmed to:
receive the image data from the wearable camera systems, wherein the image data includes first image data from a first wearable camera system of a first user and second image data from a second wearable camera system of a second user; and
determine a value of the matching score between the first user and the second user of the wearable camera systems based on:
the first image data received from the wearable camera system of the first user and the second image data received from the wearable camera system of the second user, and
information related to the first user and the second user, including profile information for each of the first user and the second user, wherein the matching score between the first user and the second user indicates a level of commonality between the first user and the second user according to one or more visual details of objects, people, or features in an environment of the first user and one or more visual details of objects, people, or features in an environment of the second user.

US Pat. No. 10,341,543

PARALLAX MASK FUSION OF COLOR AND MONO IMAGES FOR MACROPHOTOGRAPHY

QUALCOMM Incorporated, S...

1. A method of capturing color image data, the method comprising:
capturing, by a monochrome camera of a device, monochrome image data of a scene;
capturing, by a color camera of the device, color image data of the scene;
determining, by a processor of the device, a parallax value indicative of an amount of the color image data or the monochrome image data affected by parallax between the monochrome image data and the color image data;
determining, by the processor and based on a comparison of the parallax value to a parallax threshold, that the scene was captured using a first photography mode or a second photography mode; and
combining, by the processor and in response to the determination that the scene was captured using the first photography mode, a luma component of the color image data with a luma component of the monochrome image data to generate a luma component of enhanced color image data.
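The decision logic in the claim above compares a parallax value against a threshold to pick a photography mode, then fuses the color and mono luma channels in the low-parallax (macro-style) mode. The sketch below models the parallax value as the fraction of pixels whose color/mono disparity exceeds a tolerance and uses a fixed blend weight; the threshold, weight, and all names are illustrative assumptions, not QUALCOMM's implementation.

```python
# Hedged sketch: parallax-gated fusion of color and monochrome luma.
# Disparities, threshold, and the equal-weight blend are illustrative.

def parallax_value(disparities, tol=1.0):
    """Fraction of pixels affected by parallax beyond `tol`."""
    flagged = sum(1 for d in disparities if abs(d) > tol)
    return flagged / len(disparities)

def fuse_luma(color_y, mono_y, disparities, parallax_thresh=0.2, w=0.5):
    """Low-parallax mode: blend the two luma channels per pixel;
    high-parallax mode: fall back to the color luma unchanged."""
    if parallax_value(disparities) <= parallax_thresh:
        return [w * c + (1 - w) * m for c, m in zip(color_y, mono_y)]
    return list(color_y)
```

The gate matters because at macro distances the two cameras see noticeably different geometry; blending misaligned lumas would ghost, so fusion is only worthwhile when the measured parallax stays under the threshold.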

US Pat. No. 10,341,542

LIGHT SHIELDING MEMBERS FOR SOLID STATE IMAGE CAPTURING APPARATUS, CAMERA MODULE AND ELECTRONIC DEVICE

Sony Corporation, Tokyo ...

1. An imaging apparatus, comprising:
at least one structure supporting a lens, wherein the lens has an optical axis extending in a first direction;
a mold attached to the at least one structure;
an imaging device configured to receive incident light through the lens via an incident light path and perform photoelectric conversion;
a light shielding member having a light shielding portion extending in a second direction perpendicular to the first direction and having an opening in the incident light path;
an infrared cut filter held by the light shielding portion and disposed at a position of the opening; and
a substrate holding the imaging device and the mold;
wherein the light shielding portion is disposed between the imaging device and the infrared cut filter in the first direction,
wherein an edge of the substrate is attached to a portion of the mold such that the imaging device is spaced apart from the infrared cut filter,
wherein the light shielding portion includes an inclined surface that forms a first angle relative to the first direction, and
wherein the light shielding member has a light shielding wall extending in the first direction at a peripheral portion of the light shielding member such that the light shielding member has an L-shape in a cross-sectional view.

US Pat. No. 10,341,541

INTEGRATED SENSOR AND LENS ASSEMBLY WITH POST-TUNING OPTICAL ALIGNMENT

GoPro, Inc., San Mateo, ...

1. A method of manufacturing an integrated image sensor and lens assembly comprising:
positioning a collet within a channel of a tube portion of a lens mount, the lens mount affixed to an image sensor substrate comprising an image sensor;
positioning a lens barrel within the collet, the lens barrel housing a set of lenses for directing light to the image sensor, the lens barrel having threads on an exterior surface of the lens barrel that mate with threads on an interior surface of the collet, wherein rotation of the lens barrel within the collet causes a change in vertical alignment of the lens barrel;
checking an alignment of an optical axis and a focal plane;
aligning the lens barrel, the collet, and the lens mount such that the optical axis is substantially perpendicular to the focal plane and the focal plane is aligned with the image sensor;
adhering the collet to the tube portion of the lens mount using an adhesive;
curing the adhesive to affix the collet within the tube portion of the lens mount;
after curing the adhesive, re-checking alignment of the optical axis and the focal plane; and
rotating the lens barrel in the collet in a direction such that the focal plane is re-aligned with the image sensor.

US Pat. No. 10,341,540

CAMERA SYSTEM FOR GAS-INSULATED SWITCHGEAR SYSTEMS

SIEMENS AKTIENGESELLSCHAF...

1. A camera system for gas-insulated switchgear systems including a front plate of the gas-insulated switchgear system including two front plate connectors, a central unit, the camera system comprising:
at least one camera including a first camera; and
at least one camera housing, including at least one camera housing of the first camera, the at least one camera being housed in at least one camera housing, the at least one camera housing being conductive, being configured to enhance electromagnetic compatibility, and being conductively connectable to one component of the gas-insulated switchgear system, the at least one camera housing being strengthened via natural rubber seals or carbon-free rubber seals for use in at least one of SO2 and salt spray environments,
wherein the central unit is connectable to the first camera for data communication and including a Dynamic Host Configuration Protocol (DHCP) server and a captive portal,
wherein at least one camera housing of the central unit is accommodated in the at least one camera housing of the first camera,
wherein the central unit includes a power supply to supply current and voltage to the at least one camera, and wherein the power supply is suitable to supply the at least one camera with an input voltage to the at least one camera,
a shared connector, on an outer side of the camera housing of the first camera including the central unit, for a line for data communication between at least one of the two front plate connectors and the central unit of the first camera, and as a current supply and a voltage supply of the central unit of the first camera, the line running from the two front plate connectors to the shared connector of the first camera including a first side and a second side, wherein:
the line on the first side includes a plug, compatible with the shared connector, or a socket compatible with the shared connector,
the line branches between the first side and the second side in a y-like manner into a first connecting line and a second connecting line on the second side such that
the first connecting line provides the current supply and the voltage supply to the central unit and is connectable to a first front plate connector of the front plate connectors, and
the second connecting line provides the data communication between a second front plate connector of the two front plate connectors and the central unit and is connectable to the second front plate connector, wherein the camera housing of the first camera includes at least one additional shared connector to connect at least one additional camera in an additional camera housing to the first camera, and to establish the data communication and the current supply and the voltage supply for the at least one additional camera via the at least one additional shared connector.

US Pat. No. 10,341,539

PHOTOGRAPHING APPARATUS MODULE, USER TERMINAL INCLUDING THE SAME, AND METHOD OF OPERATING THE USER TERMINAL

SAMSUNG ELECTRONICS CO., ...

1. A photographing apparatus module comprising:
a photographing apparatus configured to rotate on a first axis and a second axis perpendicular to the first axis;
a lens disposed at a front portion;
a housing configured to accommodate the lens and having a hemispherical rear portion facing the front portion; and
an actuator module configured to rotate the photographing apparatus around the first axis or the second axis, wherein the actuator module comprises:
one or more first actuator devices configured for movement into and out of contact with the photographing apparatus so as to selectively apply a contact force to the photographing apparatus along an optical axis direction of the photographing apparatus that is perpendicular to the first axis and the second axis; and
one or more second actuator devices configured to apply a driving force to the photographing apparatus along the first axis direction or the second axis direction for rotating the photographing apparatus around the second axis or the first axis, wherein
a first actuator device of the one or more first actuator devices is configured to extend along one of the first axis direction and the second axis direction and includes one end portion moveable in the optical axis direction to generate a contact force with respect to the photographing apparatus, and
a second actuator device of the one or more second actuator devices is configured to extend along one of the first axis direction and the second axis direction and includes one end portion moveable in the other of the first axis direction and the second axis direction to generate a driving force with respect to the photographing apparatus.

US Pat. No. 10,341,538

DISPLAY SYSTEM AND DISPLAY METHOD

Japan Display Inc., Toky...

1. A display system comprising:
a controller; and
an image display panel,
wherein the controller includes
a signal transmitter configured to output at least a vertical synchronization signal to a plurality of image-capturing apparatuses; and
a synthesizer configured to provide synthesized image signals in units of lines obtained by synthesizing, in units of lines, image signals in units of lines output from the respective image-capturing apparatuses at a timing corresponding to a horizontal synchronization signal formed in a predetermined cycle based on an output timing of the vertical synchronization signal,
wherein the image display panel is configured to display sequentially, in units of lines, the synthesized image signals in units of lines,
wherein the signal transmitter is configured to output a display control vertical synchronization signal used for display control on the image display panel at a timing different from the output timing of the vertical synchronization signal to the image-capturing apparatuses, and
wherein the synthesizer is configured to receive the image signals in units of lines started to be output from the image-capturing apparatuses at the timing of the vertical synchronization signal, and to provide the synthesized image signals in units of lines obtained by synthesizing the image signals in units of lines output at the output timing of the display control vertical synchronization signal.
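The synthesizer above combines the per-line outputs of several image-capturing apparatuses into one synthesized line per horizontal sync tick. A minimal sketch of that line-unit synthesis, with all names and the side-by-side layout assumed rather than taken from the claim:

```python
def synthesize_lines(camera_frames):
    """Combine the i-th line from each camera into one synthesized line.

    camera_frames: one entry per camera, each a list of lines, where a
    line is a list of pixel values. Lines arriving for the same line
    index are concatenated side by side (a simplification of the
    claimed synthesis in units of lines).
    """
    num_lines = min(len(frame) for frame in camera_frames)
    synthesized = []
    for i in range(num_lines):
        line = []
        for frame in camera_frames:
            line.extend(frame[i])   # append this camera's i-th line
        synthesized.append(line)
    return synthesized
```

The display panel would then consume `synthesized` sequentially, line by line, as the claim describes.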

US Pat. No. 10,341,537

SPECTATOR VIEW INTO AN INTERACTIVE GAMING WORLD SHOWCASED IN A LIVE EVENT HELD IN A REAL-WORLD VENUE

Sony Interactive Entertai...

1. A method for enabling participation in a live event, comprising:
establishing a multi-player gaming session controlled by a plurality of players through execution of a gaming application at a server, the multi-player gaming session generating an interactive gaming world within which player participation is enabled, the live event being a real-world venue where the plurality of players is present and engaging with the multi-player gaming session being displayed on a screen at the live event;
generating at the server a three dimensional (3D) live view of the real-world venue based on one or more captured video streams, the 3D live view generated for a physical point-of-view (POV) of the live event, wherein the physical POV is anchored to a physical location in the real-world venue;
streaming the 3D live view via a network from the server to a head mounted display (HMD) of a first remote user, the first remote user being located outside of the real-world venue, the 3D live view presenting an augmented reality view of the live event including the screen to the first remote user through the HMD;
receiving at the server a request from the first remote user to jump from the 3D live view to a first spectator view of the multi-player gaming session displayed on the screen as a spectator of the interactive gaming world from a first virtual POV defined by a virtual location in the interactive gaming world;
generating by the server the first spectator view associated with the first virtual POV, wherein the first spectator view is generated for the multi-player gaming session being displayed on the screen at the live event; and
delivering the first spectator view via the network to the HMD of the first remote user for display, the first spectator view presenting in real-time a virtual reality view of the interactive gaming world to the first remote user through the HMD.

US Pat. No. 10,341,536

IMAGING DEVICE, CONTROL METHOD THEREFOR, AND IMAGING SYSTEM

FUJIFILM Corporation, To...

1. An imaging device which photoelectrically reads fluorescence or chemiluminescence emitted from an object to image the object, the imaging device comprising:
a control unit which receives first control information for controlling a first function and second control information for controlling a second function from a plurality of external terminals and performs the first function and the second function based on the received first and second control information; and
a terminal recognition unit which recognizes the plurality of external terminals,
wherein the control unit recognizes which external terminal output the first control information and the second control information by which the first function and the second function are executed based on the first and second control information output from each external terminal, sends a signal for prohibiting a reception of an operation instruction input of the first function to external terminals other than the external terminal that output the first control information, and performs parallel processing of the first function and the second function by the plurality of external terminals, and
wherein the first function is an imaging function and the second function is an analysis function of an image acquired by imaging.

US Pat. No. 10,341,535

COLOR CALIBRATION

Hewlett-Packard Developme...

1. A method, comprising:
determining that a print color established for printing a reference color by a printing system using multiple colors has a measured difference from the reference color by more than a color consistency threshold;
in response to determining that the print color measurably differs from the reference color by more than the color consistency threshold, creating a color calibration chart including a plurality of colors that are variations of the print color, each color measurably differing from the print color by less than the measured difference;
measuring the colors in the calibration chart; and
selecting from the calibration chart the color for printing the reference color which has a measurement value indicative of a color consistency within a predefined distance from a reference value for the print color, calibrating the printing system for printing the reference color using the multiple colors by replacing the print color with the selected color for printing the reference color; and
printing an image including the reference color by using the color selected from the calibration chart for the reference color.
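The calibration loop above can be sketched as: detect drift past the threshold, print a chart of nearby variations, measure them, and replace the print color with the closest candidate. Everything here is hypothetical — Euclidean distance stands in for a real color measurement, and the variation grid is one simple choice:

```python
def color_distance(a, b):
    # Euclidean distance in a device color space; a real system would
    # measure in a perceptual space such as CIELAB.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def calibrate_print_color(print_color, reference_color, threshold, step=4):
    """If the print color drifts past the threshold, build a chart of
    nearby variations and select the one measuring closest to the
    reference (a sketch of the claimed method, not HP's implementation)."""
    drift = color_distance(print_color, reference_color)
    if drift <= threshold:
        return print_color  # still within the color consistency threshold
    # Calibration chart: variations of the print color, each measurably
    # differing from the reference by less than the measured drift.
    chart = []
    for dr in (-step, 0, step):
        for dg in (-step, 0, step):
            for db in (-step, 0, step):
                cand = tuple(c + d for c, d in zip(print_color, (dr, dg, db)))
                if color_distance(cand, reference_color) < drift:
                    chart.append(cand)
    # Select the chart color whose measurement is closest to the reference.
    return min(chart, key=lambda c: color_distance(c, reference_color))
```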

US Pat. No. 10,341,534

COLOR CALIBRATION

Hewlett-Packard Developme...

1. A color calibration system comprising:
a calibration target generation engine to print a calibration target foreground and a calibration target background on a non-opaque media, the calibration target foreground being printed directly on the calibration target background;
a color measurement engine to perform a color measurement of the calibration target foreground, the color measurement representing an amount of light reflected by the calibration target foreground; and
a color calibration engine that calibrates a printer based on the color measurement.

US Pat. No. 10,341,533

COLOR TABLE COMPRESSION

1. A print cartridge component comprising:
a memory device comprising:
quantized coefficients based on a compression of a difference table including a plurality of difference nodes in which each difference node represents a value that is a difference of a value of a node of a color table and a value of a corresponding node of a reference table, the quantized coefficients useable to produce a reconstructed difference table; and
residue information representing a difference of the color table and the reconstructed difference table,
wherein the quantized coefficients and the residue information are accessible by a printing device to reconstruct a color table and perform a color transformation between color spaces when printing.
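The compression scheme above stores quantized coefficients for the difference table plus a residue that makes reconstruction lossless. A minimal sketch under simplifying assumptions — the "coefficients" are just quantized difference values (a real codec would transform before quantizing), and tables are flat lists:

```python
def compress_color_table(color_table, reference_table, q=8):
    """Quantize the difference table (color table minus reference table)
    and keep the residue so the printing device can reconstruct the
    color table exactly. All names and the quantizer are assumptions."""
    diffs = [c - r for c, r in zip(color_table, reference_table)]
    coeffs = [round(d / q) for d in diffs]           # quantized coefficients
    reconstructed = [k * q for k in coeffs]          # reconstructed difference table
    residue = [d - rd for d, rd in zip(diffs, reconstructed)]
    return coeffs, residue

def reconstruct_color_table(coeffs, residue, reference_table, q=8):
    # Printer-side: dequantize, add residue, add back the reference table.
    return [k * q + e + r for k, e, r in zip(coeffs, residue, reference_table)]
```

Because the residue captures exactly what quantization discarded, the round trip recovers the original table node for node.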

US Pat. No. 10,341,532

IMAGE FORMING APPARATUS, IMAGE FORMING METHOD, AND PROGRAM

KONICA MINOLTA, INC., To...

1. An image forming apparatus that forms an output image corresponding to input image data on an image carrier by superimposing toner images of a plurality of colors, comprising a hardware processor that:
detects color information of the output image for every pixel region;
converts color information of the input image data for every pixel region and color information of the output image for every pixel region into indexes which define colors in predetermined color space coordinates;
extracts a pixel region in which the color information of the output image is different from the color information of the input image data, and corrects the color information of the input image data with respect to the extracted pixel region on the basis of the color information of the output image; and
stores the color information of the input image data, which is corrected in the hardware processor, as data for calibration in combination with the color information of the output image and image forming conditions when forming the output image.

US Pat. No. 10,341,531

APPLYING A PERCEPTUALLY UNIFORM COLOR SPACE FOR IMAGE PROCESSING

Motorola Mobility LLC, C...

1. A method comprising:
identifying, via a processor of an image capturing device, a perceptually uniform color space that includes only real colors identified within a plurality of real-world images;
applying, via the processor, the perceptually uniform color space to a color processing stage of an image processing pipeline; and
in response to receiving image data captured by the image capturing device, the image data including image color values associated with a primary color space, converting, via the processor, the image color values to the perceptually uniform color space to generate a representation of the image data having more perceptual uniformity for real-world colors.
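The claimed space is identified from real-world images, which is not public; as a stand-in, the conversion step from a primary color space to a perceptually uniform one can be illustrated with the standard sRGB-to-CIELAB transform (linearize, map to CIE XYZ under D65, then apply the cube-root lightness function):

```python
def srgb_to_lab(r, g, b):
    """Convert an sRGB color (components in 0..1) to CIELAB, a
    perceptually uniform space. Stand-in for the claimed conversion."""
    def linearize(c):  # undo the sRGB gamma encoding
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # sRGB -> CIE XYZ (D65 white point)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):  # CIELAB compression function
        d = 6 / 29
        return t ** (1 / 3) if t > d ** 3 else t / (3 * d * d) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

In CIELAB, equal numeric steps correspond roughly to equal perceived color differences, which is the property the claim exploits in its color processing stage.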

US Pat. No. 10,341,530

SYSTEM AND METHOD FOR OBFUSCATING INDICIA ON REUSABLE MEDIA

Kabushiki Kaisha Toshiba,...

9. A method comprising:
feeding a paper sheet for scanning via a scan engine;
scanning a surface of the paper sheet to generate digital image data corresponding thereto;
analyzing the digital image data to determine a presence of indicia on the paper surface by an intelligent controller including a processor and associated memory;
identifying, by the intelligent controller, one or more areas of the paper surface containing indicia in accordance with analysis of the digital image data;
overprinting on the areas of the paper surface containing the indicia; and
erasing the paper surface containing the indicia and the overprint.

US Pat. No. 10,341,529

TERMINAL AND CONTROLLING METHOD THEREOF

LG ELECTRONICS INC., Seo...

1. A terminal comprising:
a display unit;
a camera configured to capture an image; and
a controller configured to:
display a preview image captured through the camera on the display unit,
display a UI (User Interface) for zoom processing the preview image on the preview image, wherein the UI represents a plurality of zoom magnification values and includes a zoom button for changing a zoom magnification value,
when an object is selected from the preview image, determine a maximal magnification value for the selected object to be displayed in a maximal size on the display unit,
display an indicator indicating the determined zoom magnification value on the UI, wherein the indicator is displayed at a zoom magnification value corresponding to the determined zoom magnification value among the plurality of zoom magnification values,
when the zoom button is selected, zoom process the preview image centering on the selected object based on the determined zoom magnification value, and
display the zoom-processed preview image on the display unit.

US Pat. No. 10,341,528

SETTING AN IMAGE FORMING APPARATUS AS A DEFAULT DEVICE

Konica Minolta, Inc., To...

1. A control device that executes an operating system program, comprising:
a hardware processor, wherein the operating system program defines:
a process of setting an image forming apparatus among a plurality of image forming apparatuses as a default device; and
a changing process of, in a case where an execution device different from the default device is designated among the plurality of image forming apparatuses by an application task for executing an application program and printing is executed by the execution device, changing the default device to the execution device, and
the hardware processor:
in response to an instruction to execute printing given by the application task, allows a control target device among the plurality of image forming apparatuses to execute the printing;
maintains the same image forming apparatus as the default device before and after the printing is executed by the control target device;
accepts setting of a selection of whether the default device is maintained or changed; and
maintains or changes the default device based on types of the printing, wherein
the changing process defined by the operating system program includes:
an acquiring process of, in the case where an execution device is designated from among the image forming apparatuses by the application task, acquiring device identification information for identifying the execution device; and
a process of, in the case where the execution device specified by the acquired device identification information is different from the default device, changing the default device to the execution device, and
the hardware processor:
acquires device identification information of the default device before allowing the control target device to execute printing; and
in the case where the default device specified by the acquired device identification information is different from the control target device, in response to an instruction for printing by the control target device given by the application task, allows a task for executing the operating system program to acquire the device identification information of the default device instead of device identification information of the control target device in the acquiring process defined by the operating system program.

US Pat. No. 10,341,527

COMPUTER-READABLE STORAGE MEDIUM AND PRINTING SYSTEM

Brother Kogyo Kabushiki K...

1. A non-transitory computer readable storage medium storing computer readable instructions that are executable by a computer in an information processing apparatus, the information processing apparatus comprising a communication interface, through which the information processing apparatus is connected with a printer, the information processing apparatus comprising a first channel and a second channel being processing channels configured to cause the printer to print an image based on image data, the computer readable instructions, when executed by the computer, causing the computer to:
in response to receiving a print command, conduct a first printing control to cause the printer to print the image through the first channel;
determine whether image printing to print the image by the printer under the first printing control failed; and
based on a determination that the image printing under the first printing control failed, conduct a second printing control to cause the printer to retry the image printing based on the image data through the second channel.

US Pat. No. 10,341,526

FACSIMILE COMMUNICATION DEVICE, LINE DISCONNECTION JUDGMENT METHOD AND NON-TRANSITORY RECORDING MEDIUM STORING A COMPUTER READABLE PROGRAM

KONICA MINOLTA, INC., To...

1. A facsimile communication device, comprising:
a line voltmeter that measures a voltage of a line used for a facsimile communication; and
a hardware processor that:
detects a predetermined change in the voltage of the line, during the facsimile communication, wherein the predetermined change is a voltage increase caused by unplanned disconnection of a facsimile device of an opposite side; and
judges whether the facsimile device of the opposite side opens the line or not according to whether the predetermined change is detected.

US Pat. No. 10,341,525

IMAGE FORMING SYSTEM, COMMUNICATION TERMINAL, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM

Konica Minolta, Inc., (J...

1. An image forming system comprising a first image forming apparatus, a second image forming apparatus, and a communication terminal,
the communication terminal including:
a first input device configured to accept input of (i) information for specifying the image forming apparatus and (ii) information to be transmitted to the image forming apparatus;
a first network interface configured to communicate with the first and second image forming apparatus; and
a first processor configured to control operations of the first input device and the first network interface,
the first processor being configured to connect the communication terminal to the first image forming apparatus via the first network interface when the first processor detects that the information to be transmitted to the image forming apparatus is being input into the first input device,
the first image forming apparatus including:
a second input device configured to accept an operation of inputting information to the first image forming apparatus;
a second network interface configured to communicate with the communication terminal; and
a second processor configured to control operations of the second input device and the second network interface,
the second image forming apparatus including a third network interface configured to communicate with the communication terminal,
when the second processor detects that an operation is being performed onto the second input device while the first image forming apparatus is connected to the communication terminal, the second processor being configured to perform a first notification to notify the communication terminal that the operation is being performed onto the second input device,
when the first notification is performed by the first image forming apparatus, the first processor being configured to perform a second notification corresponding to the first notification in the communication terminal, the second notification urging to change a connection destination of the first network interface to the second image forming apparatus.

US Pat. No. 10,341,524

SYSTEM AND METHOD OF PROCESSING DOCUMENTS TO PROVIDE DOCUMENT HISTORY AND COMPLIANCE CONTROL

Xerox Corporation, Norwa...

9. A document management method associated with a document processing system, the document processing system including one or more multifunction devices (MFDs) configured to perform one or more of document printing, document scanning, document copying and document faxing, the one or more MFDs operatively connected to a network;
one or more user workstations operatively connected to the network, the one or more user workstations configured to digitally process a content associated with a digital representation of a document;
one or more servers operatively associated with the one or more MFDs and the one or more user workstations, the one or more servers operatively connected to the network and the one or more servers operatively associated with a document management system, the one or more MFDs, the one or more workstations, and the one or more servers, the method comprising:
a) generating a unique secure mark associated with an original document created by one of the MFDs and workstations;
b) registering in a document registry database the unique secure mark associated with the original document and associating the unique security mark with metadata associated with the original document including a document creator, document creation time, document name and document security protection level;
c) monitoring document activity associated with all or part of the original document by detecting a presence of the unique security mark during a next occurrence of all or part of a content of the original document within an active document associated with the document activity, wherein document activity is activity associated with an image output processing device, including printing including one or more of printing the active document, scanning the active document, copying the active document, faxing the active document and digitally processing a digital representation of the active document;
d) recording in an activity log associated with the document registry database the document activity associated with the active document; and
e) repeating steps a)-d) for subsequent document activity associated with all or part of the original document.

US Pat. No. 10,341,523

PRINTER AND CONTROL METHOD OF A PRINTER

Seiko Epson Corporation, ...

1. A printer comprising:
a carriage configured to support and move a printhead;
a camera attached to the carriage and configured to photograph an image printed by the printhead;
an adjustment mechanism configured to adjust an installation position of the camera; and
a processor configured to control to move the carriage to a position of a predetermined specific mark, photograph the specific mark by the camera, and based on a photographed image of the specific mark, adjust an installation position of the camera by the adjustment mechanism,
wherein the processor is further configured to control the adjustment mechanism and adjust an installation position of the camera.

US Pat. No. 10,341,522

IMAGE CAPTURE AND OUTPUT METHOD

TECO IMAGE SYSTEMS CO., L...

1. An image capture and output method, comprising steps of:
(a) providing an image capture device, wherein the image capture device comprises a case, a control unit, a linear image sensor and two optical mouse sensors, the control unit is a processor or a microcontroller disposed in the case, the linear image sensor is disposed in the case and connected with the control unit, the two optical mouse sensors are disposed in the case and connected with the control unit, and the two optical mouse sensors are respectively disposed at two opposed sides of the linear image sensor;
(b) continuously capturing a plurality of line images from an initial position and recording a position information corresponded to each line image, wherein the position information comprise two coordinate positions of the two opposed sides of the line image captured by the two optical mouse sensors, the position information further comprise a shift degree, and the shift degree is a degree of the angle between the first line image and the initial position, or a degree of the angle between each of the rest line images and a former one of the line image;
(c) calculating the difference between the position information of the first line image and the initial position and the differences between the position information of each of the rest line images and a former one of the line image to obtain a fill information;
(d) filling each line image into an image buffer according to the fill information wherein the step (d) further comprises steps of:
(d1) respectively filling a plurality of image units of one of the line images into corresponded image positions of the image buffer; and
(d2) copying each of the image units and filling into next row of the image positions corresponded to the image units; and
(e) outputting the image buffer as a product image,
wherein the step (b) is implemented by the linear image sensor and the two optical mouse sensors, the step (c), the step (d) and the step (e) are implemented by the control unit.
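Steps (d1) and (d2) above — fill each line's image units into the buffer, then duplicate them into the next row — can be sketched as follows. The derivation of each line's target row from the optical mouse sensor positions (steps b and c) is not shown; rows are taken as given, and all names are hypothetical:

```python
def fill_image_buffer(line_images, fill_rows):
    """Place each captured line image into the buffer row given by its
    fill information, then copy it into the following row.

    line_images: list of lines, each a list of pixel units;
    fill_rows:   target row index per line (assumed precomputed from
                 the two optical mouse sensors' position information).
    """
    height = max(fill_rows) + 2                 # +1 row for the duplicate
    width = max(len(line) for line in line_images)
    buffer = [[0] * width for _ in range(height)]
    for line, row in zip(line_images, fill_rows):
        for col, unit in enumerate(line):
            buffer[row][col] = unit             # (d1) fill image units
            buffer[row + 1][col] = unit         # (d2) copy into next row
    return buffer
```

Duplicating each line into the adjacent row fills gaps left when hand motion skips rows, before the buffer is output as the product image (step e).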

US Pat. No. 10,341,521

IMAGE READING DEVICE

RISO KAGAKU CORPORATION, ...

1. An image reading device which optically reads a document, comprising:
a platen on which a document is placed;
a pressure plate openable and closable with respect to the platen;
an angle detector which detects a fact that an open/close angle of the pressure plate has passed a predetermined angle from a close state;
a reader which records a variation in a distribution for a fixed period of time of incident light heading toward the platen when the angle detector detects passing of the pressure plate;
a variation amount calculator which calculates a variation amount of incident light at each distribution point on a basis of a variation in distribution recorded by the reader; and
a variation amount determiner which determines whether or not a variation amount calculated by the variation amount calculator exceeds a predetermined threshold.
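The variation amount calculator and determiner above can be sketched as: record the incident-light distribution over a fixed period, compute a per-point variation amount, and test it against the threshold. The max-minus-min measure and all names are assumptions for illustration:

```python
def variation_amounts(distributions):
    """Per-point variation of incident light over a fixed period.

    distributions: distributions sampled over time, each a list of
    intensities per distribution point. The variation amount at each
    point is taken here as max minus min (one simple choice)."""
    return [max(samples) - min(samples) for samples in zip(*distributions)]

def exceeds_threshold(distributions, threshold):
    # The determiner: does any distribution point vary more than allowed?
    return any(v > threshold for v in variation_amounts(distributions))
```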

US Pat. No. 10,341,520

INFORMATION PROCESSING APPARATUS AND NON-TRANSITORY COMPUTER READABLE MEDIUM

FUJI XEROX CO., LTD., To...

1. An apparatus comprising:
a sound detector; and
a processor configured to remotely control an external device, at least one function of the external device being invalidated while the external device is being remotely controlled, wherein
the invalidated function becomes validated in response to a certain sound being detected by the sound detector when a certain screen is being displayed on the external device, and
the invalidated function does not become validated in response to the detected certain sound when the certain screen is not being displayed on the external device.

US Pat. No. 10,341,519

COMMUNICATION SYSTEM, IMAGE FORMING APPARATUS, METHOD OF CONTROLLING THE SAME, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A communication system, in which an image forming apparatus and an information processing apparatus are connected via a network and remote maintenance is performed between the image forming apparatus and the information processing apparatus,
wherein the image forming apparatus comprises:
a first wireless communication unit that wirelessly communicates with a plurality of mobile terminals;
a second wireless communication unit that directly communicates with a mobile terminal;
a display;
a memory that stores a set of instructions; and
at least one processor that executes the instructions to:
control the remote maintenance with the information processing apparatus;
perform wireless communication with a mobile terminal;
determine whether or not the remote maintenance is possible using the mobile terminal;
control to, in a case that it is determined that the remote maintenance is possible, establish wireless communication with the mobile terminal in response to a connection request from the mobile terminal, and perform input and output of audio data via the mobile terminal with the information processing apparatus;
determine whether the established wireless communication with the mobile terminal is by the first wireless communication unit or the second wireless communication unit;
in a case that it is determined that the wireless communication with the mobile terminal is by the first wireless communication unit, display on the display a screen for requesting permission for a connection with the mobile terminal; and
in a case that it is determined that the wireless communication with the mobile terminal is by the second wireless communication unit, not display on the display the screen.

US Pat. No. 10,341,518

OPERATION INPUT SYSTEM, ELECTRONIC DEVICE AND MOBILE TERMINAL

SHARP KABUSHIKI KAISHA, ...

1. An operation input method of an electronic device, the electronic device having a display portion and wirelessly communicating with a mobile terminal, the operation input method comprising:
wirelessly communicating with the mobile terminal in a case where the electronic device detects that the mobile terminal is in proximity with the electronic device and displaying a transmission destination address input screen on the mobile terminal while displaying a transmission destination address setting screen on the display portion of the electronic device;
receiving a transmission destination address set by a user for the transmission destination address input screen from the mobile terminal; and
updating contents displayed on the transmission destination address setting screen in accordance with the transmission destination address received.

US Pat. No. 10,341,517

IMAGE FORMING APPARATUS AND METHOD FOR CONTROLLING THE IMAGE FORMING APPARATUS

KYOCERA Document Solution...

10. A method for controlling an image forming apparatus, comprising steps of:
causing a first motor to rotate at a speed depending on a frequency of a first drive signal, the first motor causing a first rotor that conveys a paper sheet to rotate;
generating the first drive signal;
causing a second motor to rotate at a speed depending on a frequency of a second drive signal, the second motor causing a second rotor to rotate, the second rotor being provided downstream beyond the first rotor in a conveyance direction and conveying the paper sheet;
generating the second drive signal;
operating a reading unit so that, based on a read signal, the reading unit reads line by line the paper sheet being conveyed thereto, the reading unit including an image sensor and being provided between the first rotor and the second rotor;
generating the read signal;
at a time point when the paper sheet has arrived at the first rotor, maintaining the first motor in a stopped state;
after a start of rotation of the first rotor, accelerating a rotational speed of the first motor and subsequently decelerating the rotational speed so that the first drive signal has a reference frequency;
after the first drive signal has been set to have the reference frequency, stopping the first motor before arrival of a succeeding paper sheet at the first rotor;
using, as the reference frequency, such a frequency that a paper sheet conveyance speed of the first rotor becomes a reference speed;
generating the second drive signal so that a paper sheet conveyance speed at which the second rotor conveys the paper sheet becomes the reference speed;
in a first period from after a start of rotation of the first rotor until a preset switching time point, generating the read signal so that, in accordance with a frequency of the first drive signal, every line of the paper sheet is read upon conveyance thereof;
in a second period from the switching time point until a rear end of the paper sheet has passed beyond the image sensor, generating the read signal so that every line of the paper sheet being conveyed at the reference speed is read upon conveyance thereof; and
providing the switching time point within a time period in which the second rotor has started to convey the paper sheet and the first drive signal has the reference frequency.

US Pat. No. 10,341,516

ROTATION DRIVE DEVICE, IMAGE READING DEVICE, AND IMAGE FORMING APPARATUS

RICOH COMPANY, LTD., Tok...

1. A rotation drive device comprising:
a driven rotation member;
a driving rotation member to receive rotational driving force from a driving source;
a rotation relay member to transmit only a unidirectional rotational driving force from the rotational driving force of the driving rotation member to the driven rotation member;
a shaft; and
a holder to hold the driving rotation member,
wherein the driven rotation member, the rotation relay member, and the driving rotation member have through holes, through which the shaft passes, including center axis lines of the driven rotation member, the rotation relay member, and the driving rotation member,
wherein the rotation relay member includes:
a housing rotatable with the driven rotation member; and
a cylindrical one-way clutch disposed at an inner peripheral side of the housing,
wherein a contact portion between the driving rotation member and the shaft is configured to rotate the driving rotation member and the shaft in conjunction with each other to transmit the rotational driving force of the driving rotation member to an inner peripheral side of the cylindrical one-way clutch via the shaft,
wherein the driving rotation member includes a boss that is projected toward the holder and is rotatably held in a supporting hole of the holder, and
wherein a portion of the shaft to support the driving rotation member and an inner wall surface of the through hole of the driving rotation member are shaped to rotate the shaft and the driving rotation member together.

US Pat. No. 10,341,515

IMAGE FORMING SYSTEM INCLUDING IMAGE FORMING APPARATUS AND POSTPROCESSING DEVICE, AND IMAGE FORMING APPARATUS AND POSTPROCESSING DEVICE AVAILABLE FOR IMAGE FORMING SYSTEM

Kyocera Document Solution...

1. An image forming system comprising:
an image forming apparatus that is configured to perform an image formation process to form images on a plurality of sheets based on an execution request of a job; and
a postprocessing device connected to the image forming apparatus, the postprocessing device is configured to receive the plurality of sheets conveyed one by one from the image forming apparatus; wherein
the image forming apparatus includes
a first communication unit that communicates with the postprocessing device,
a paper feeder that is configured to house the plurality of sheets to feed the housed plurality of sheets one by one, and
a first controller that controls operation of the first communication unit and the paper feeder;
the postprocessing device includes
a second communication unit that communicates with the first communication unit,
a postprocessing unit that is configured to perform postprocessing on the plurality of sheets, and
a second controller that controls operation of the second communication unit and the postprocessing unit;
the second controller obtains a first time to perform the postprocessing and a second time to transition to a performable state configured to perform the postprocessing;
the first controller obtains attribute information necessary for the second controller to obtain the first time and the second time;
the first controller transmits the attribute information corresponding to a first feeding sheet to the second communication unit via the first communication unit, the first feeding sheet being the sheet as a feed target of the paper feeder;
when the second communication unit receives the attribute information, the second controller transmits information indicating the first time and the second time to the first communication unit via the second communication unit;
the first controller is configured to cause the paper feeder to feed the first feeding sheet after the second time has passed from the reception of the information indicating the first time and the second time by the first communication unit; and
the first controller is configured to transmit the attribute information corresponding to a second feeding sheet to the second communication unit via the first communication unit after the first time has passed from the feeding of the first feeding sheet, the second feeding sheet being the sheet to be fed by the paper feeder next to the first feeding sheet.
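The feed timing described by the last two clauses can be sketched as a simple simulation. This is a minimal illustration, not the patent's implementation; the time units, dictionary keys, and function names are assumptions.

```python
# Sketch of the sheet-feed handshake: the controller sends a sheet's
# attribute information, receives (first_time, second_time), feeds the
# sheet second_time after the reply, and sends the next sheet's
# attributes first_time after that feed. Times are arbitrary units.

def postprocessing_times(attributes):
    """Return (first_time, second_time) for a sheet:
    first_time  - time to perform the postprocessing,
    second_time - time to transition to a state ready to postprocess."""
    return attributes["process_time"], attributes["warmup_time"]

def schedule_feeds(sheets):
    """Compute the time at which each sheet is fed."""
    events = []
    t = 0.0
    for attrs in sheets:
        first, second = postprocessing_times(attrs)  # reply received at t
        feed_at = t + second       # feed after second_time has passed
        events.append(feed_at)
        t = feed_at + first        # next attributes sent after first_time
    return events

sheets = [{"process_time": 2.0, "warmup_time": 1.0},
          {"process_time": 3.0, "warmup_time": 0.5}]
print(schedule_feeds(sheets))  # [1.0, 3.5]
```

The postprocessor thus paces the feeder: each sheet waits for the device to become ready, and the next sheet's attributes are not sent until the current sheet's postprocessing time has elapsed.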

US Pat. No. 10,341,514

IMAGE PROCESSING APPARATUS, CONTROL METHOD FOR THE IMAGE PROCESSING APPARATUS, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:a scanner unit which reads an original to obtain an image;
a display unit which displays a setting screen of a job for performing setting of a plurality of setting items related to the job;
a memory that stores, in association with identification information about a user, setting values of the setting items related to the set job in accordance with logout of the user who has logged in to the image processing apparatus; and
a control unit configured to perform control so as to read the setting values stored in the memory and display, on the display, the setting screen of the job to which the setting values are restored while a user having logged out logs in again,
wherein the setting items include a transmission destination to which the image obtained by reading the original is transmitted, and the control unit does not restore, to the setting screen, a transmission destination set when the user logs out, and
wherein the control unit is implemented by at least one processor.
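The restore rule in the claim, where saved setting values come back at the next login except the transmission destination, can be sketched as follows. The setting key names are illustrative assumptions.

```python
# Minimal sketch: setting values saved at logout are restored at the
# next login, except the transmission destination, which is never
# restored to the setting screen.

def restore_on_login(saved_settings, default_settings):
    """Rebuild the job setting screen from saved values, dropping the
    transmission destination so it falls back to the default."""
    restored = dict(default_settings)
    restored.update({k: v for k, v in saved_settings.items()
                     if k != "transmission_destination"})
    return restored

saved = {"color_mode": "gray", "resolution": 600,
         "transmission_destination": "user@example.com"}
defaults = {"color_mode": "color", "resolution": 300,
            "transmission_destination": ""}
print(restore_on_login(saved, defaults))
# {'color_mode': 'gray', 'resolution': 600, 'transmission_destination': ''}
```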

US Pat. No. 10,341,513

ELECTRONIC APPARATUS AND IMAGE FORMING APPARATUS

KYOCERA Document Solution...

1. An electronic apparatus comprising:a touch panel and a display; and
a control unit including a processor, and being configured to function, when the processor operates according to an operation program, against the display so that the display is caused to
(i) switch and display, on a first region of the display, a plurality of guidance images representing an operation procedure in an order of steps of the operation procedure,
(ii) display a step bar on a second region of the display different from the first region, the step bar including a plurality of piece images corresponding to the plurality of guidance images on a one-to-one basis, the plurality of piece images being aligned in a first direction in the order of the steps, and
(iii) when detecting through the touch panel that one of the plurality of piece images is touched and further a drawing operation in which the one of the plurality of piece images is made to slide in a second direction orthogonal to the first direction, display, on the second region, the step bar in a manner such that one or more piece images of a number corresponding to a travel distance in the second direction at the drawing operation are to be drawn in the second direction from the step bar, the piece images at least including the one of the plurality of the piece images having been drawn out, and display, on the first region, the guidance images respectively corresponding to the piece images being displayed so as to be drawn out from the step bar.
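The drawing-out behavior in step (iii), where the number of piece images drawn from the step bar corresponds to the travel distance of the slide, might be reduced to a counting rule like the one below. The rounding rule is an assumption; the claim only requires that at least the touched piece is drawn out.

```python
# Sketch: a slide orthogonal to the step bar draws out a number of
# piece images (and their matching guidance images) proportional to
# the travel distance, always at least the touched piece.

def num_drawn_out(travel_distance, piece_extent):
    """Number of piece images drawn out for a given slide distance."""
    return max(1, round(travel_distance / piece_extent))

print(num_drawn_out(0.2, 1.0))  # 1 (at least the touched piece)
print(num_drawn_out(3.4, 1.0))  # 3
```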

US Pat. No. 10,341,512

IMAGE PROCESSING APPARATUS, METHOD OF CONTROLLING THE SAME, AND STORAGE MEDIUM

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus comprising:a scanner that reads an original, wherein the image processing apparatus has a transmission function for transmitting image data corresponding to the original read by the scanner;
a storage device that stores a plurality of address books including at least an address book for an administrator;
a memory storing instructions, and
a processor executing the instructions causing the image processing apparatus to:
authenticate a user of the image processing apparatus;
display, as a default, address information included in the plurality of address books stored in the storage device, if a request to display an address book is received from the authenticated user and the authenticated user does not have an administrator authority;
display, as a default, address information included in the plurality of address books stored in the storage device, if a request to display an address book is received via a first screen related to the transmission function from the authenticated user and the authenticated user has the administrator authority;
display, as a default, address information included in the address book for the administrator to select a transfer destination if a request to display an address book is received via a second screen different from the first screen from the authenticated user and the authenticated user has the administrator authority; and
control for automatically transferring image data received from an external apparatus to the selected transfer destination.
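The three display branches of this claim amount to a small decision table on the user's authority and the screen from which the request arrives. The screen names and return strings below are illustrative assumptions.

```python
# Sketch of the default address-book display logic: non-administrators
# and administrators on the transmission screen see all address books;
# administrators selecting a transfer destination on the second screen
# see the administrator's address book by default.

def default_address_book(is_admin, screen):
    """Which address information is shown by default on a display
    request, per the authenticated user's authority and the screen."""
    if not is_admin:
        return "all_address_books"
    if screen == "transmission":        # first screen (transmission function)
        return "all_address_books"
    return "administrator_address_book"  # second screen: transfer destination

print(default_address_book(False, "transfer"))  # all_address_books
print(default_address_book(True, "transfer"))   # administrator_address_book
```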

US Pat. No. 10,341,511

DATA TRANSFER APPARATUS, IMAGE FORMING APPARATUS, AND IMAGE READING APPARATUS

KYOCERA Document Solution...

1. A data transfer apparatus configured to receive an input of data transfer destination from a user, the data transfer apparatus comprising:a display unit;
a touch panel located on a display screen of the display unit, and configured to receive an instruction of the user;
a storage unit containing, beforehand, a name of the data transfer destination in association with each transfer destination address of each type of methods of the data transfer;
a data transfer unit that transfers the data to the data transfer destination;
a display controller that controls an operation of the display unit, and causes the display unit, when the touch panel receives a data transfer instruction from the user, to display a screen urging the user to select the data transfer destination and the method of the data transfer to the destination; and
a controller that causes the data transfer unit, when the touch panel receives the instruction from the user to select the data transfer destination and the method of the data transfer to the destination, to transfer the data to the selected destination, by the selected transfer method,
wherein the display controller causes the display unit to display a list showing a plurality of the names of a plurality of the destinations stored in the storage unit, when causing the display unit to display the screen urging the user to select the data transfer destination,
when the touch panel receives an instruction to select one destination out of the plurality of the destinations included in the list, the display controller recognizes from the storage unit the transfer destination addresses of the one destination selected through the touch panel, and causes the display unit to display icons, together with the list, on a selection field displayed on a portion besides the list on the display unit as the screen urging the user to select the method of the data transfer to the destination, the icons respectively corresponding to each type of methods of the data transfer and to each address of the one destination recognized from the storage unit,
when it is detected through the touch panel that the user has pressed one of the icons on the selection field, it is decided that the instruction to select the data transfer method, corresponding to the icon pressed by the user, has been inputted, and when it is detected through the touch panel that the user has held down the one icon on the selection field equal to or longer than a predetermined time, it is decided that a determination instruction has been inputted, for the data transfer method corresponding to the one icon that has been held down,
when the touch panel receives the determination instruction with respect to one of the data transfer methods, the display controller causes the display unit to display, together with the list and the selection field, on a portion besides the list and the selection field displayed on the display unit, a predetermined lock icon indicating that the data transfer method is fixed to the one designated by the determination instruction,
upon causing the display unit to display the lock icon, the display controller causes the display unit to display, on the selection field, only the icon representing the data transfer method that is the fixed method indicated by the lock icon, and causes the display unit not to display, on the selection field, icons other than the icon representing the data transfer method, and
the controller causes the data transfer unit, when the touch panel receives, in a case where the plurality of the destinations are selected out of the list, the determination instruction with respect to one transfer method selected for one selected destination, to transfer the data to the selected destinations, using the one transfer method as the method of the data transfer to other destinations than the one destination, without the process through which the user selects the data transfer method for other destinations than the one destination, on the touch panel.
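The icon interaction in this claim, where a short press selects a transfer method and a long press issues a determination instruction that locks the method, can be sketched as two small functions. The hold threshold value is an assumption.

```python
# Sketch of the touch handling: a press shorter than a threshold
# selects a transfer method; a press held at least that long is a
# determination instruction that fixes (locks) the method.

HOLD_THRESHOLD = 1.0  # seconds; assumed value

def classify_press(duration):
    """Map a press duration on a method icon to the claim's two cases."""
    return "determine" if duration >= HOLD_THRESHOLD else "select"

def icons_to_show(all_method_icons, locked_method):
    """After a determination instruction, only the locked method's icon
    remains on the selection field; otherwise all icons are shown."""
    if locked_method is None:
        return list(all_method_icons)
    return [m for m in all_method_icons if m == locked_method]

print(classify_press(0.3))                            # select
print(classify_press(1.5))                            # determine
print(icons_to_show(["email", "fax", "smb"], "fax"))  # ['fax']
```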

US Pat. No. 10,341,510

IMAGE FORMING APPARATUS, IMAGE EDITING METHOD AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM FOR FORMING AN IMAGE ON A RECORDING MEDIUM BASED ON AN IMAGE DISPLAYED ON A DISPLAY SECTION

SHARP KABUSHIKI KAISHA, ...

1. An image processing method for processing a display image by using an image forming apparatus including a display screen, and a position detector that detects a position of contact with the display screen, and that forms an image on a recording medium based on the display image displayed on the display screen, the method comprising:displaying on the display screen an image including an object image indicative of an object for an image formation, and a medium image indicative of the recording medium;
changing to an edit mode in response to a contact with the display screen detected by the position detector;
while in the edit mode, modifying magnification of the object image with respect to the medium image, without modifying the medium image, based on contact positions of two points detected by the position detector;
displaying in the medium image on the display screen an area occupied by the object image modified by the modifying;
selecting one recording medium of a plurality of selectable recording mediums based on the contact positions detected by the position detector; and
finishing the edit mode in response to a predetermined operation.
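The two-point magnification step in this method is commonly realized as a pinch gesture: the object image is scaled by the ratio of the distances between the two contact points, while the medium image is left unchanged. The ratio mapping below is an assumption about the gesture, not the patent's stated formula.

```python
# Sketch of the two-point magnification: scale factor for the object
# image derived from the change in distance between two touch points.
# Points are (x, y) tuples.

import math

def magnification(start_pts, end_pts):
    """Scale factor for the object image from a two-finger gesture."""
    d0 = math.dist(*start_pts)
    d1 = math.dist(*end_pts)
    return d1 / d0

scale = magnification(((0, 0), (10, 0)), ((0, 0), (20, 0)))
print(scale)  # 2.0 - the object image doubles relative to the medium
```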

US Pat. No. 10,341,509

CLIENT DEVICE STATE COLLECTION AND NETWORK-BASED PROCESSING SOLUTION

Qualys, Inc., Foster Cit...

1. A system comprising:a) a network interface for communicating over a network with an endpoint device, wherein the endpoint device is intermittently connected to the network;
b) a repository for storing a state image of the endpoint device, the state image being a last agreed upon state image between the system and the endpoint device; and
c) a processor unit in communication with the repository and the endpoint device; wherein the system is configured to:
provide a manifest to the endpoint device via the network interface, wherein the manifest includes priority and frequency requirements for collecting a state image delta associated with the endpoint device based on the state image of the endpoint device, the state image delta comprising data to add, update, or delete from the state image of the endpoint device;
receive the state image delta from the endpoint device via the network interface, wherein the state image delta is received when the endpoint device connects to the network;
merge the state image delta with the state image of the endpoint device in the repository to result in an updated state image, wherein the updated state image is stored in the repository, wherein the system being configured to merge the state image delta with the state image of the endpoint device in the repository comprises the system being configured to:
determine whether a data collection interval has expired and whether an instruction to drop the state image delta has been processed, and
in response to determining the data collection interval has expired and the instruction to drop the state image delta has not been processed, the system is configured to merge the state image delta with the state image of the endpoint device in the repository,
wherein the endpoint device instructs the system to drop a last state image delta if the last state image delta has not been processed and if the data collection interval has expired;
receive a request for confirmation that the state image delta has been merged with the state image of the endpoint device in the repository from the endpoint device via the network interface; and
transmit confirmation that the state image delta has been merged with the state image of the endpoint device in the repository to the endpoint device via the network interface, wherein an identifier associated with the state image delta is stored at the endpoint device, and wherein after the confirmation that the state image delta has been merged with the state image of the endpoint device in the repository is received by the endpoint device, the endpoint device merges the state image delta with a second endpoint image stored at the endpoint device.
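The core merge operation, applying a delta of additions, updates, and deletions to the last agreed-upon state image, can be sketched with dictionaries. The delta encoding (`add`/`update`/`delete` keys) is an assumption for illustration, not the patent's actual format.

```python
# Minimal sketch of merging a state image delta into the stored state
# image: the delta carries data to add, update, or delete.

def merge_delta(state_image, delta):
    """Return the updated state image after applying the delta."""
    updated = dict(state_image)
    updated.update(delta.get("add", {}))     # new entries
    updated.update(delta.get("update", {}))  # changed entries
    for key in delta.get("delete", []):      # removed entries
        updated.pop(key, None)
    return updated

state = {"os": "linux", "agent": "1.0", "stale_pkg": "0.9"}
delta = {"add": {"hostname": "ep-01"},
         "update": {"agent": "1.1"},
         "delete": ["stale_pkg"]}
print(merge_delta(state, delta))
# {'os': 'linux', 'agent': '1.1', 'hostname': 'ep-01'}
```

Because the endpoint applies the same delta to its own copy only after receiving the merge confirmation, both sides converge on the same updated state image.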

US Pat. No. 10,341,508

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An information processing apparatus which lays out a plurality of images, the apparatus comprising:a storage which stores images; and
a processor configured to:
receive a first instruction for creating a new album or a second instruction for re-editing images laid out on an existing album which has been created and ordered;
make a first selection of first images out of the images stored in the storage in a case where the first instruction is received;
make a second selection of second images out of the selected first images which are selected in the first selection;
lay out at least some of the selected second images which are selected in the second selection, on the new album;
delete, from candidates for re-editing the laid out images on the new album which has been ordered, at least some of the images that have not been selected in the first selection such that at least some of the first images that have been selected in the first selection and have not been selected in the second selection remain as the candidates in addition to the selected second images; and
re-edit the laid out images on the new album which has been ordered, by using the candidates from which the at least some of the images have been deleted in a case where the second instruction is received
wherein even if the images laid out on the new album which has been ordered are re-edited and the new album is re-ordered, deletion from the candidates for re-editing the laid out images on the new album is not executed.

US Pat. No. 10,341,507

IDENTIFYING A FOREIGN OBJECT IN AN ELECTRONIC DOCUMENT

Konica Minolta Laboratory...

1. A method for processing an electronic document (ED), the method comprising:generating a mark-up version of the ED that comprises:
a text attribute that covers a first area of the ED, and
an image attribute that covers a second area of the ED;
segmenting the text attribute into lines of text and the lines of text into words separated by spaces;
determining whether the second area overlaps with the first area based on at least one of the lines of text or the words; and
displaying, in response to determining that the second area overlaps with the first area, the ED on a display to notify a user that the image attribute is an overlapped object on the ED.
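The overlap determination reduces to rectangle-intersection tests between the image attribute's area and the bounding boxes of the segmented words (or lines). The `(x0, y0, x1, y1)` representation and function names below are illustrative assumptions.

```python
# Sketch of the overlap test: the image attribute's area is checked
# against the bounding boxes of the segmented words.

def rects_overlap(a, b):
    """True if two axis-aligned rectangles (x0, y0, x1, y1) intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def image_overlaps_text(image_area, word_boxes):
    """True if the image attribute's area overlaps any word's box,
    i.e. the image is a foreign (overlapped) object on the ED."""
    return any(rects_overlap(image_area, w) for w in word_boxes)

words = [(0, 0, 50, 10), (60, 0, 120, 10)]
print(image_overlaps_text((40, 5, 70, 40), words))   # True
print(image_overlaps_text((0, 20, 120, 40), words))  # False
```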

US Pat. No. 10,341,506

CALIBRATED PRINT DATA GENERATION BASED ON COLOR CALIBRATION DATA ACCOUNTING FOR BACKLIGHT CHARACTERISTICS

Hewlett-Packard Developme...

1. A printing system comprising:a calibration mechanism to:
receive print data of an image that is to be backlit when the image is displayed;
receive color calibration data that accounts for characteristics of a backlight; and
generate calibrated print data from the print data of the image, based on the color calibration data; and
a printing engine to print the image on a substrate by printing the calibrated print data.

US Pat. No. 10,341,505

CONTROLLER, CALIBRATION CONTROL PROGRAM, AND CALIBRATION CONTROL METHOD

Konica Minolta, Inc., Ch...

1. A controller in a printing system that includes the controller equipped with an image processor that generates image data for printing by rasterizing a print job, and an image forming device equipped with an engine that performs a printing process in accordance with the image data for printing, the controller and the image forming device each performing calibration to adjust an output of the image forming device, the controller comprisinga hardware processor that:
associates first calibration data indicating a result of first calibration performed by the image forming device with second calibration data indicating a result of second calibration performed by the controller in accordance with the first calibration, and stores the first calibration data and the second calibration data into a storage;
when obtaining the first calibration data of the first calibration performed by the image forming device, compares the obtained first calibration data with the stored first calibration data, and determines whether the stored first calibration data includes data equal to the obtained first calibration data; and,
when there is no stored first calibration data including data equal to the obtained first calibration data, creates the second calibration data to be associated with the obtained first calibration data by performing the second calibration using the controller and storing the second calibration data, and,
when there is stored first calibration data including data equal to the obtained first calibration data, creates the second calibration data to be associated with the obtained first calibration data using the second calibration data stored and associated with the stored first calibration data including the data equal to the obtained first calibration data.
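The controller's behavior is essentially a cache keyed by the first calibration result: the costly second calibration runs only when an equal first result has not been seen before. A minimal sketch, with `run_second_calibration` as a hypothetical stand-in for the controller-side calibration process:

```python
# Sketch of the calibration cache: second calibration data is stored
# keyed by the first calibration data that produced it, and reused
# when an equal first result is obtained again.

calibration_store = {}  # first calibration data -> second calibration data

def run_second_calibration(first_data):
    # placeholder for the actual controller-side second calibration
    return tuple(v * 2 for v in first_data)

def second_calibration_for(first_data):
    key = tuple(first_data)
    if key not in calibration_store:  # no equal stored data: perform and store
        calibration_store[key] = run_second_calibration(key)
    return calibration_store[key]     # otherwise reuse the stored result

print(second_calibration_for([1, 2]))  # (2, 4) - freshly computed
print(second_calibration_for([1, 2]))  # (2, 4) - reused from the store
```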

US Pat. No. 10,341,504

PHOTO RESPONSE NON-UNIFORMITY SUPPRESSION

Hewlett-Packard Developme...

1. A method for photo response non-uniformity (PRNU) suppression, comprising:performing a calibration surface PRNU characterization using a scanning system;
performing a document-based PRNU characterization using the scanning system;
determining a correction function for PRNU suppression for the scanning system based on the calibration surface PRNU characterization and the document-based PRNU characterization; and
suppressing PRNU regions in an output of the scanning system by adjusting calibration surface PRNU targets according to the calibration surface PRNU characterization and the document-based PRNU characterization.

US Pat. No. 10,341,503

REMOTE MAINTENANCE SYSTEM, IMAGE FORMING APPARATUS, EXTERNAL TERMINAL, METHODS OF CONTROLLING THESE, AND STORAGE MEDIUMS

Canon Kabushiki Kaisha, ...

1. A remote maintenance system including an image forming apparatus that executes an image formation, a print server that executes some image processing for the image formation, and an external terminal, whereinthe image forming apparatus comprises:
a memory device that stores a set of instructions;
at least one processor that executes the instructions to:
accept a start of a remote maintenance function for resolving a failure of the image forming apparatus in accordance with an instruction of an operator that uses the external terminal,
obtain, from the print server, device information of the print server which is information that is necessary when the operator analyzes the failure of the image forming apparatus, wherein the device information of the print server indicates setting information that has been set for the print server;
obtain device information of the image forming apparatus, which is information that is necessary when the operator analyzes the failure of the image forming apparatus, wherein the device information of the image forming apparatus indicates setting information that has been set for the image forming apparatus;
generate support data including information of the obtained device information of the print server and the device information of the image forming apparatus, wherein the device information included in the support data includes data for indicating which device information is related to the device information of the print server or the device information of the image forming apparatus; and
transmit the generated support data to the external terminal, and
the external terminal comprises:
a memory device that stores a set of instructions;
at least one processor that executes the instructions to
receive the support data transmitted by the image forming apparatus; and
display the support data received by the external terminal.

US Pat. No. 10,341,502

IMAGE FORMING APPARATUS THAT EXECUTES IMAGE PROCESSING CORRESPONDING TO RESOLUTION

Canon Kabushiki Kaisha, ...

1. An image forming apparatus comprising:a plurality of image forming units configured to form a plurality of images having different colors based on image data;
a first sensor configured to detect a color pattern formed on an intermediate transfer member, the color pattern being used for detecting a color misregistration;
a second sensor configured to measure a measuring image; and
a controller configured to control the plurality of image forming units to form a plurality of color patterns having different colors on the intermediate transfer member, control the first sensor to detect an amount of color misregistration related to a relative position of a color pattern having a reference color among the plurality of color patterns and a color pattern having another color among the plurality of color patterns, determine an offset value for adjusting an image writing start timing of the other color different from the reference color based on the amount of color misregistration detected by the first sensor, control the plurality of image forming units to form the measuring image on the intermediate transfer member, control the second sensor to measure the measuring image, and determine an image forming condition for adjusting densities of images to be formed by the plurality of image forming units based on a measurement result of the second sensor,
wherein:
the controller controls the image forming apparatus based on an image forming mode corresponding to the image data;
the controller controls the plurality of image forming units to form the plurality of color patterns in a case where a first condition is satisfied in a first image forming mode;
the controller controls the plurality of image forming units to form the measuring image in a case where a second condition is satisfied in the first image forming mode;
the controller controls the plurality of image forming units to form the measuring image and the plurality of color patterns in a case where both the first condition and the second condition are satisfied in a second image forming mode;
the controller skips the image formation of the measuring image when the first condition is not satisfied in the second image forming mode irrespective of a state of the second condition; and
the first condition is different from the second condition.
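The mode/condition logic at the end of this claim can be tabulated directly. The claim specifies: in the first mode each calibration image is formed when its own condition holds; in the second mode both are formed only when both conditions hold, and the measuring image is skipped whenever the first condition fails. Cases the claim leaves unstated are left empty below rather than guessed.

```python
# Sketch of the claim's condition logic for forming the color patterns
# (color misregistration) and the measuring image (density).

def images_to_form(mode, first_cond, second_cond):
    """Which calibration images the controller forms, per mode and
    the two conditions, following only the cases the claim states."""
    formed = []
    if mode == "first":
        if first_cond:
            formed.append("color_patterns")
        if second_cond:
            formed.append("measuring_image")
    elif mode == "second":
        if first_cond and second_cond:
            formed = ["color_patterns", "measuring_image"]
        # measuring image skipped when first_cond fails, regardless
        # of second_cond
    return formed

print(images_to_form("first", True, False))   # ['color_patterns']
print(images_to_form("second", False, True))  # [] - measuring image skipped
```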

US Pat. No. 10,341,501

MONITORING APPARATUS, MONITORING METHOD, AND RECORDING MEDIUM

Seiko Epson Corporation, ...

13. A method for acquiring, at a monitoring timing, device information which is a target of collection from a device, the monitoring method comprising:acquiring from the device, status information that includes information representing a power supply state of the device; and
setting a monitoring interval which is an interval of the monitoring timing in accordance with the power supply state denoted by the status information which has been acquired,
wherein in the setting of the monitoring interval, a setting of the monitoring interval is made longer than a current setting when the power supply state denoted by the status information which has been acquired corresponds to a sleep mode, and
in the setting of the monitoring interval, the setting of the monitoring interval is not made longer than the current setting when the setting of the monitoring interval reaches an upper limit.
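The interval-setting rule of claim 13, lengthening the monitoring interval in sleep mode but never past an upper limit, is a capped backoff. The doubling factor and limit value below are assumptions; the claim fixes only the lengthen-and-cap behavior.

```python
# Sketch of the monitoring-interval rule: lengthen the interval when
# the device reports a sleep state, capped at an upper limit.

UPPER_LIMIT = 600.0  # seconds; illustrative value

def next_interval(current, power_state):
    """Return the next monitoring interval given the device status."""
    if power_state == "sleep":
        return min(current * 2, UPPER_LIMIT)  # lengthen, capped at limit
    return current  # other power states leave the setting unchanged

interval = 60.0
for _ in range(5):
    interval = next_interval(interval, "sleep")
print(interval)  # 600.0 (capped at the upper limit)
```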