US Pat. No. 10,462,463

ARITHMETIC DECODING METHOD AND ARITHMETIC CODING METHOD

SUN PATENT TRUST, New Yo...

1. An arithmetic decoding method including: initializing a context variable, the context variable specifying a probability of a possible value of each of elements included in a binary string; and
arithmetic decoding, using the context variable as an initial value, on the binary string which corresponds to a value of a given variable,
wherein a group to which the given variable belongs is dynamically changeable, and
wherein in the initializing, (i) in a case where the given variable belongs to a first group, the context variable is initialized by a first initializing method selected from among a plurality of initializing methods, and (ii) in a case where the given variable belongs to a second group, the context variable is initialized by a second initializing method without using a quantization parameter.
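
The initialization step above switches between a quantization-parameter-dependent method and a QP-independent method depending on which group the variable currently belongs to. The Python sketch below is illustrative only, not the claimed method: the function name init_context, the group labels, and the CABAC-style linear model and clamping range are assumptions.

    def init_context(variable_group, slope=20, offset=64, qp=None):
        """Return an initial probability state for a binary symbol's context.

        Hypothetical sketch: variables in the "first" group use a
        QP-dependent linear model (CABAC-like); variables in the "second"
        group use a fixed initialization that ignores the quantization
        parameter, as the claim describes.
        """
        if variable_group == "first":
            if qp is None:
                raise ValueError("QP-dependent initialization needs a QP value")
            state = slope * qp // 16 + offset          # toy linear model in QP
        elif variable_group == "second":
            state = offset                              # fixed, QP-independent init
        else:
            raise ValueError("unknown group")
        # Clamp to a valid probability-state range (0..127 here, arbitrary).
        return max(1, min(126, state))

    # Example: the same syntax element initialized under each group assignment.
    print(init_context("first", qp=32))   # QP-dependent result
    print(init_context("second"))         # QP-independent result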

US Pat. No. 10,462,462

MOTION VECTOR DIFFERENCE CODING TECHNIQUE FOR VIDEO CODING

Qualcomm Incorporated, S...

1. A method of decoding video data, the method comprising: receiving an encoded block of video data;
receiving one or more syntax elements indicating a motion vector difference (MVD) associated with the encoded block of video data;
determining an MVD coding technique from two or more MVD coding techniques based on a difference between a picture order count (POC) value of a reference frame and a POC value of a current picture and based on a motion vector precision;
decoding the one or more syntax elements indicating the MVD using the determined MVD coding technique; and
decoding the encoded block of video data using the decoded MVD.
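
The claim selects between two MVD coding techniques from the POC distance between the reference frame and the current picture and from the motion vector precision. The sketch below is a hypothetical illustration of that kind of selection rule; the threshold, the precision test, and the technique names are assumptions, not the claimed criteria.

    def choose_mvd_technique(ref_poc, cur_poc, mv_precision_bits, poc_threshold=4):
        """Pick one of two MVD coding techniques (illustrative only).

        Assumption for this sketch: a large POC distance or coarse motion-vector
        precision favors a technique suited to larger differences; otherwise a
        technique tuned for small differences is used.
        """
        poc_distance = abs(cur_poc - ref_poc)
        if poc_distance > poc_threshold or mv_precision_bits <= 2:
            return "large_mvd_technique"
        return "small_mvd_technique"

    # Example: reference 8 pictures away, quarter-pel precision (2 fractional bits).
    print(choose_mvd_technique(ref_poc=0, cur_poc=8, mv_precision_bits=2))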

US Pat. No. 10,462,459

NON-LOCAL ADAPTIVE LOOP FILTER

MEDIATEK INC., Hsinchu (...

1. A method, comprising: receiving reconstructed video data corresponding to a picture;
dividing the picture into current patches;
forming patch groups each including a current patch and a number of reference patches that are similar to the current patch;
denoising the patch groups to modify pixel values of the patch groups to create a filtered picture, wherein denoising the patch groups includes deriving a variance of compression noise in the respective patch group based on a compression noise model in which a standard deviation (SD) of compression noise in the respective patch group is a function of a SD of pixel values of the respective patch group, and the function is represented as a polynomial function; and
generating a reference picture based on the filtered picture for encoding or decoding a picture.
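
The denoising step models the standard deviation of compression noise in a patch group as a polynomial function of the standard deviation of the patch group's pixel values, and uses the resulting variance when filtering. A minimal numpy sketch of that model shape follows; the polynomial coefficients are placeholders, not values from the patent.

    import numpy as np

    def compression_noise_sd(patch_group, coeffs=(0.5, 0.1, -0.001)):
        """Estimate the compression-noise standard deviation of a patch group.

        Sketch of the claimed model shape: noise SD is a polynomial function of
        the SD of the patch-group pixel values. The coefficients are placeholders.
        """
        pixel_sd = float(np.std(patch_group))
        # Evaluate c0 + c1*sd + c2*sd^2 + ...
        noise_sd = sum(c * pixel_sd ** i for i, c in enumerate(coeffs))
        return max(noise_sd, 0.0)

    # Example: a patch group of 5 similar 8x8 patches with mild noise.
    rng = np.random.default_rng(0)
    group = 128 + 3.0 * rng.standard_normal((5, 8, 8))
    sigma_n = compression_noise_sd(group)
    variance = sigma_n ** 2      # the variance used when denoising the group
    print(round(sigma_n, 3), round(variance, 3))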

US Pat. No. 10,462,458

CODEWORD ASSIGNMENT FOR INTRA CHROMA MODE SIGNALLING FOR HEVC

SONY CORPORATION, Tokyo ...

1. A decoding device, comprising: circuitry configured to:
execute a debinarization process on same-as-luma intra prediction mode for a chroma component based on a codeword assignment, wherein
in the same-as-luma intra prediction mode for the chroma component, an intra prediction mode for the chroma component is same as an intra prediction mode for a luma component corresponding to the chroma component,
the intra prediction mode for the chroma component is assigned to one bit in case the intra prediction mode for the chroma component is the same-as-luma intra prediction mode, and
the one bit of the same-as-luma intra prediction mode for the chroma component is a shortest codeword among a plurality of codewords associated with a plurality of intra prediction modes for the chroma component.
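
The claim assigns the same-as-luma chroma intra prediction mode the shortest codeword, a single bit, and debinarizes against that assignment. The sketch below illustrates such a codeword table and prefix match; the table contents beyond the one-bit same-as-luma entry are assumptions, not the standardized HEVC assignment.

    # Hypothetical codeword table for chroma intra modes: the same-as-luma
    # mode gets the single-bit, shortest codeword; the other modes get longer
    # prefixed codewords. The exact table is illustrative only.
    CHROMA_MODE_CODEWORDS = {
        "same_as_luma": "0",
        "planar":       "100",
        "vertical":     "101",
        "horizontal":   "110",
        "dc":           "111",
    }

    def debinarize_chroma_mode(bitstring):
        """Match the leading bits of `bitstring` against the codeword table."""
        for mode, code in CHROMA_MODE_CODEWORDS.items():
            if bitstring.startswith(code):
                return mode, len(code)
        raise ValueError("no codeword matched")

    print(debinarize_chroma_mode("0"))      # -> ('same_as_luma', 1)
    print(debinarize_chroma_mode("110"))    # -> ('horizontal', 3)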

US Pat. No. 10,462,457

DYNAMIC REFERENCE MOTION VECTOR CODING MODE

GOOGLE LLC, Mountain Vie...

1. A method for decoding a video stream including a processor, the method comprising: identifying, for a current block, a reference frame used to encode the current block within a current frame;
creating a reference motion vector candidate list for the reference frame using reference blocks within at least one frame of the video stream;
determining a popularity value of a motion vector within the reference motion vector candidate list, wherein the popularity value indicates a level of use of the motion vector by at least some of the reference blocks, and determining the popularity value comprises:
calculating a number of previously coded pixels within the at least some of the reference blocks having values that were predicted using the motion vector, the popularity value determined using the number of previously coded pixels;
ranking each motion vector within the reference motion vector candidate list by a distance from the current block to a reference block providing the motion vector, and by the popularity value of the motion vector;
assigning the motion vectors to a plurality of inter-prediction modes based on the ranking;
selecting an inter-prediction mode for decoding the current block;
decoding the current block using the inter-prediction mode;
determining whether the current block was encoded using single or compound prediction, wherein single prediction comprises using only one reference frame for inter prediction of the current block and compound prediction comprises using at least two reference frames for inter prediction of the current block;
in response to determining that the current block was encoded using compound prediction:
identifying the reference frame used to encode the current block comprises identifying a first reference frame and a second reference frame used to encode the current block; and
creating the reference motion vector candidate list for the reference frame comprises creating a first reference motion vector candidate list for the first reference frame and creating a second reference motion vector list for the second reference frame using the reference blocks.
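
The claim ranks candidate motion vectors by the distance to the reference block providing them and by a popularity value counted in previously coded pixels, then maps the ranked vectors to inter-prediction modes. The sketch below illustrates one plausible way to compute and combine those two criteria; the data layout, tie-breaking order, and mode names are assumptions.

    from collections import defaultdict

    def rank_candidates(reference_blocks, current_block_pos):
        """Rank candidate motion vectors by distance, then by popularity.

        Illustrative sketch: `reference_blocks` is a list of dicts with a motion
        vector, a block position, and a pixel count predicted with that vector.
        The popularity of a vector is the number of previously coded pixels that
        used it; ties in distance are broken by higher popularity.
        """
        popularity = defaultdict(int)
        nearest = {}
        for blk in reference_blocks:
            mv = blk["mv"]
            popularity[mv] += blk["pixels"]
            d = abs(blk["pos"][0] - current_block_pos[0]) + abs(blk["pos"][1] - current_block_pos[1])
            nearest[mv] = min(nearest.get(mv, d), d)
        # Sort: closer providers first, then more popular vectors first.
        return sorted(popularity, key=lambda mv: (nearest[mv], -popularity[mv]))

    blocks = [
        {"mv": (1, 0),  "pos": (0, 16),  "pixels": 256},
        {"mv": (0, -2), "pos": (16, 0),  "pixels": 64},
        {"mv": (1, 0),  "pos": (16, 16), "pixels": 256},
    ]
    ranked = rank_candidates(blocks, current_block_pos=(16, 32))
    # Assign the top-ranked vectors to inter-prediction modes (names are illustrative).
    modes = dict(zip(["NEAREST_MV", "NEAR_MV"], ranked))
    print(modes)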

US Pat. No. 10,462,456

AUTOMATED NETWORK-BASED TEST SYSTEM FOR SET TOP BOX DEVICES

Contec, LLC, Schenectady...

1. A network-based testing system for simultaneously testing a plurality of devices under test at a plurality of remote test locations, comprising: a plurality of remote test systems, each remote test system configured to test a subset of the plurality of devices under test and comprising
a Quadrature Amplitude Modulation (QAM) modulator for providing video test patterns to the subset of the plurality of devices under test at the remote test system, and
a plurality of video and audio analyzers, each of the video and audio analyzers associated with one of the subset of the plurality of devices under test at the remote test system and configured to provide infrared signals to the associated device under test and to capture video and audio output from the associated device under test,
a main controller that is operatively connected to the plurality of remote test systems and configured to simultaneously conduct tests on the plurality of devices under test;
a headend controller operatively connected to the main controller and to the plurality of remote test systems and configured to provide video services to the plurality of devices under test and to provision IP addresses to IP-based devices under test of the plurality of devices under test,
wherein
the main controller determines which of the plurality of devices under test are IP-based devices under test and which of the plurality of devices under test are non-IP-based devices under test,
the main controller receives from the headend controller the IP addresses associated with the IP-based devices under test and is configured to employ Simple Network Management Protocol (SNMP) to communicate with and test the IP-based devices under test using the associated IP addresses,
the main controller causes a subset of the plurality of video and audio analyzers to provide infrared signals to the non-IP-based devices under test and to capture video and audio signals that are output by the non-IP-based devices under test in response to the infrared signals, and
the main controller receives and analyzes the video and audio signals that are captured by the video and audio analyzers.

US Pat. No. 10,462,455

DISPLAY APPARATUS, DISPLAY METHOD, AND COMPUTER READABLE RECORDING MEDIUM

NIKON CORPORATION, Tokyo...

1. A display apparatus comprising: a processor that causes a computer to:
display an image that gives a stereoscopic perspective to an observer;
enhance or turn down the stereoscopic perspective of the image based upon detected distances of regions within the image; and
change an amount by which the stereoscopic perspective in the image displayed is enhanced or turned down according to a stereoscopic perspective setting of the observer,
wherein
the changing performs calibration to enable the observer to select an amount of enhancement for the stereoscopic perspective, by displaying images, each image being the same subject but having stereoscopic perspectives that have been enhanced or turned down by different amounts, and
the displaying displays an image according to the amount of enhancement for the stereoscopic perspective selected for the observer.

US Pat. No. 10,462,454

EXTENSIBLE AUTHORING AND PLAYBACK PLATFORM FOR COMPLEX VIRTUAL REALITY INTERACTIONS AND IMMERSIVE APPLICATIONS

1. An immersive video system comprising: a first sensor that provides information about a user's location;
a projector that projects images onto the user;
a processor in communication with the sensor, and the projector, wherein information about the user's location is used by the processor to generate a map regarding the user's location; and
a second sensor that tracks the user's eye movements;
wherein the processor manipulates the images projected onto the user based on user location data from the first sensor and eye movements from the second sensor;
wherein the processor directs the projector to project based on the user's eye movements.

US Pat. No. 10,462,453

DISPLAY DEVICE AND DISPLAY CONTROL METHOD

Koninklijke Philips N.V.,...

1. A display device comprising: a display panel;
an array of lenses arranged in front of the display panel, wherein the array of lenses have a lens pitch; and
a light blocking arrangement arranged between the array of lenses and the display panel;
wherein the display device is configurable in a privacy mode and a public mode,
wherein the light blocking arrangement blocks laterally directed light output from the display panel based on the polarization of the light in the privacy mode,
wherein the light blocking arrangement allows laterally directed light output from the display panel to pass in the public mode,
wherein the light blocking arrangement comprises a stack of layers, wherein each layer comprises a pattern of light blocking arrangement portions of two different types such that in the stack of layers, the portions align to form light blocking members of two different types,
wherein each light blocking member is associated with an associated lens such that the light blocking members form a pattern with a blocking pitch, and
wherein the blocking pitch is double the lens pitch.

US Pat. No. 10,462,452

SYNCHRONIZING ACTIVE ILLUMINATION CAMERAS

Microsoft Technology Lice...

1. A system for controlling operation of a plurality of active illumination time of flight (TOF) range cameras, the system comprising: a hub configured to communicate with mobile communication devices and comprising a database of active illumination TOF range cameras subscribed to the system; and
a set of computer executable instructions comprised in each subscriber TOF range camera that configures the camera to communicate with the hub when located in a same imaging neighbourhood with other subscriber TOF range cameras to establish a frequency division multiplexing (FDM) imaging mode of operation in which each of the subscriber TOF range cameras is assigned a different FDM imaging frequency that the TOF range camera uses to determine a modulation period for light that the camera transmits to illuminate a scene that the camera images and a modulation period for modulating sensitivity of the TOF range camera to register light reflected by features in the scene from the transmitted modulated light.
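
The FDM imaging mode assigns each co-located subscriber TOF camera a different imaging frequency, from which the camera derives its transmit-light and sensitivity modulation periods. A small illustrative sketch follows; the base frequency, spacing, and dictionary layout are assumptions, not values from the patent.

    def assign_fdm_frequencies(camera_ids, base_hz=20e6, step_hz=1e6):
        """Assign each co-located subscriber camera a distinct modulation frequency.

        Sketch only: the hub gives every camera in the same imaging neighbourhood
        a different FDM frequency; each camera derives its light-modulation period
        (and its sensitivity-modulation period) as the reciprocal of that frequency.
        """
        plan = {}
        for i, cam in enumerate(sorted(camera_ids)):
            f = base_hz + i * step_hz
            plan[cam] = {"frequency_hz": f, "modulation_period_s": 1.0 / f}
        return plan

    print(assign_fdm_frequencies(["camA", "camB", "camC"]))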

US Pat. No. 10,462,451

ASYMMETRIC STRUCTURED LIGHT SOURCE

Facebook Technologies, LL...

1. A depth camera assembly (DCA) comprising: an illumination source assembly configured to emit light in accordance with emission instructions, the illumination source assembly comprising a plurality of emitters on a single substrate, the plurality of emitters comprising at least a first emitter and a second emitter;
a projection assembly configured to project light from the illumination source assembly into a local area, the projection assembly comprising an optical element that is positioned to receive light from the first emitter at a first angle and project the received light from the first emitter to a first depth zone in the local area, and to receive light from the second emitter and project the received light from the second emitter to a second depth zone in the local area; and
an imaging device configured to capture one or more images of the local area illuminated with the light from the illumination source assembly.

US Pat. No. 10,462,450

COMBINING TWO-DIMENSIONAL IMAGES WITH DEPTH DATA TO DETECT JUNCTIONS OR EDGES

AUTODESK, INC., San Rafa...

1. A computer-implemented method for detecting a feature in three-dimensional (3D) space, comprising: obtaining three-dimensional (3D) pixel image data based on two-dimensional (2D) image data and depth data for the 2D image data;
within a given window over the 3D pixel image data, for each of one or more pixels within the given window, determining an equation for a plane passing through the pixel;
computing, for all of the determined planes within the given window, an intersection of all of the planes;
analyzing a spectrum of the intersection;
based on the spectrum, determining eigenvalues that determine a number of surfaces that intersect at the pixel wherein:
one and only one zero eigenvalue corresponds to a junction of three (3) or more surfaces;
two (2) zero eigenvalues correspond to a crease at the intersection of two (2) surfaces; and
three (3) zero eigenvalues correspond to one (1) planar surface; and
creating and displaying a computer-aided design (CAD) drawing that depicts the number of surfaces that intersect at the pixel.
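
The eigenvalue test in the claim maps the number of zero eigenvalues of the plane-intersection spectrum to a junction (one zero), a crease (two zeros), or a single planar surface (three zeros). The numpy sketch below reproduces that counting logic for planes written in homogeneous form; stacking the plane equations into a k-by-4 matrix and taking the eigenvalues of its Gram matrix is an illustrative reading of "spectrum of the intersection", not necessarily the patented construction.

    import numpy as np

    def classify_intersection(planes, tol=1e-8):
        """Classify the local surface structure from plane equations.

        Each plane is written in homogeneous form a*x + b*y + c*z + d = 0 and
        stacked into a k-by-4 matrix M. The eigenvalues of M.T @ M serve as the
        spectrum; the number of (near-)zero eigenvalues gives the dimension of
        the common solution set:
          1 zero eigenvalue  -> junction of three or more surfaces (a point)
          2 zero eigenvalues -> crease where two surfaces meet (a line)
          3 zero eigenvalues -> a single planar surface
        """
        M = np.asarray(planes, dtype=float)
        eigvals = np.linalg.eigvalsh(M.T @ M)          # 4 real eigenvalues
        zeros = int(np.sum(eigvals < tol * max(eigvals.max(), 1.0)))
        return {1: "junction (3+ surfaces)", 2: "crease (2 surfaces)",
                3: "planar surface"}.get(zeros, "degenerate")

    # Three coordinate planes meeting at the origin -> a junction.
    print(classify_intersection([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]))
    # Two planes -> a crease; one plane (sampled twice) -> planar.
    print(classify_intersection([[1, 0, 0, 0], [0, 1, 0, 0]]))
    print(classify_intersection([[0, 0, 1, -1], [0, 0, 1, -1]]))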

US Pat. No. 10,462,449

METHOD AND SYSTEM FOR 360-DEGREE VIDEO PLAYBACK

Novatek Microelectronics ...

1. A method for 360-degree video playback, applicable to a video playback system having a screen, comprising: receiving a current frame of a 360-degree video having a sequence of frames;
detecting a plurality of candidate objects in the current frame;
selecting a main object from the candidate objects by using a selector recurrent neural network (RNN) model based on information of the candidate objects in the current frame and a previous frame of the current frame comprising:
computing a current state of the selector RNN model corresponding to the current frame based on the information of each of the candidate objects and a previous state of the selector RNN model corresponding to the previous frame; and
classifying the main object from the candidate objects according to the current state of the selector RNN model;
computing a viewing angle corresponding to the current frame by using a regressor RNN model based on the main object in the current frame and the previous frame comprising:
obtaining an action of the main object in the current frame according to a position of the main object and a viewing angle corresponding to the previous frame and obtaining a motion feature of the main object in the current frame;
computing a current state of the regressor RNN model corresponding to the current frame based on the action and the motion feature of the main object in the current frame and a previous state of the regressor RNN model corresponding to the previous frame; and
computing the viewing angle corresponding to the current frame according to the current state of the regressor RNN model; and
displaying the current frame on the screen according to the viewing angle.
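
The playback method keeps two recurrent states, one for selecting the main object and one for regressing the viewing angle, each updated from the previous frame's state and the current frame's features. The toy numpy sketch below only illustrates that update-then-readout pattern: the weights are random, and the scoring and angle heads are placeholder assumptions, not the trained selector and regressor RNN models of the claim.

    import numpy as np

    def rnn_step(prev_state, features, W_h, W_x, b):
        """One vanilla recurrent update mixing the previous state with the
        current frame's features (illustrative of both RNN state updates)."""
        return np.tanh(W_h @ prev_state + W_x @ features + b)

    rng = np.random.default_rng(1)
    state_dim, feat_dim = 8, 4
    W_h = rng.standard_normal((state_dim, state_dim))
    W_x = rng.standard_normal((state_dim, feat_dim))
    b = np.zeros(state_dim)

    # Selector sketch: score each candidate object from the updated state and
    # pick the best-scoring one as the "main object" for the current frame.
    prev_state = np.zeros(state_dim)
    candidate_features = rng.standard_normal((3, feat_dim))     # 3 candidates
    scores = []
    for feats in candidate_features:
        state = rnn_step(prev_state, feats, W_h, W_x, b)
        scores.append(state.sum())                               # toy scoring head
    main_object = int(np.argmax(scores))

    # Regressor sketch: update a second recurrent state from the main object's
    # features and map it to a viewing angle with a toy linear head.
    regressor_state = rnn_step(np.zeros(state_dim),
                               candidate_features[main_object], W_h, W_x, b)
    viewing_angle_deg = float(rng.standard_normal(state_dim) @ regressor_state) * 10.0
    print(main_object, round(viewing_angle_deg, 2))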

US Pat. No. 10,462,448

IMAGE GENERATION SYSTEM AND IMAGE GENERATION METHOD

MITSUBISHI ELECTRIC CORPO...

1. An image generation system comprising: image generation apparatuses provided so as to correspond to areas, respectively, into which a target area is divided, the image generation apparatuses each being connected to a plurality of imaging apparatuses that each capture the corresponding area, and the image generation apparatuses each generating downward view image data for the corresponding area from imaging data captured by the connected plurality of imaging apparatuses; and
an image integration apparatus configured, when a display area has been specified again by scrolling the display area for which display is requested, to predict, as a predicted area, a location serving as the display area in the future, by predicting, from an angle and a distance at which the scrolling is performed, a direction in which scrolling will be performed in the future, to allow each of the image generation apparatuses to generate, as partial image data before the predicted area is specified as the display area, downward view image data for the corresponding area of the image generation apparatus, and to combine the generated partial image data to generate integrated image data, the image generation apparatuses being provided so as to correspond to areas, respectively, the areas each including at least a part of the predicted area, the integrated image data being downward view image data for the display area.

US Pat. No. 10,462,447

ELECTRONIC SYSTEM INCLUDING IMAGE PROCESSING UNIT FOR RECONSTRUCTING 3D SURFACES AND ITERATIVE TRIANGULATION METHOD

Sony Corporation, Tokyo ...

1. An electronic system, comprising circuitry configured to
obtain a sequence of frames of an object under different viewing angles,
process the sequence of frames, wherein the circuitry for processing the sequence of frames is further configured to
receive preconditioned frames,
for each preconditioned frame, extract salient image points, wherein the salient image points identify small image areas of high contrast,
reduce a total number of salient image points by selecting only a predetermined number of stable salient image points, and
generate, for a first time instance, a point cloud descriptive for an external surface of the object based on (i) a point cloud obtained for a second time instance preceding the first time instance and (ii) disparity information concerning a frame captured at the first time instance,
display a 3D representation of the point cloud generated for the first time instance, and
update the displayed 3D representation while obtaining the sequence of frames.

US Pat. No. 10,462,445

SYSTEMS AND METHODS FOR ESTIMATING AND REFINING DEPTH MAPS

FotoNation Limited, Galw...

1. A method for improving accuracy of depth map information derived from image data descriptive of a scene where pixels of such image data, acquired with one or more image acquisition devices, each have an assigned intensity value, the method comprising: performing a matching cost optimization by iteratively refining disparities between corresponding pixels in the image data and using optimization results to create a sequence of first disparity values for an initial disparity map for the scene based in part on a superpixel-wise cost function;
performing a guided filter operation on the first disparity values by applying other image data containing structural details that can be transferred to the first disparity values to restore degraded features or replace some of the first disparity values with values more representative of structural features present in the image data descriptive of the scene,
the guided filtering operation performed by applying a series of weighted median filter operations to pixel intensity values in the sequence of first disparity values so that each median filter operation replaces a member in the sequence of first disparity values with a median intensity value, where each median intensity value is based on intensity values in a group of pixels within a window of pixels positioned about said member in the sequence,
each window being of a variable size to include a variable number of pixels positioned about said member in the sequence, where selections of multiple ones of the window sizes are based on a measure of similarity between the first disparity values and said other image data, and wherein the series of weighted median filter operations provides a new sequence of disparity values for a refined disparity map or from which a depth map of improved accuracy can be created.
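
The guided filtering step replaces each member of the disparity sequence with a weighted median taken over a window, with weights driven by similarity to the guide image. The sketch below shows only that weighted-median step over a fixed window; the claim's variable window sizes and superpixel-wise cost function are omitted, and the Gaussian similarity weighting is an assumption.

    import numpy as np

    def weighted_median_filter(disparity, guide, radius=2, sigma=10.0):
        """Guided weighted median filtering of a disparity map (simplified sketch).

        For each pixel, disparities in a window are weighted by how similar the
        guide image is to the center pixel, and the weighted median replaces the
        center value.
        """
        d = np.asarray(disparity, dtype=float)
        g = np.asarray(guide, dtype=float)
        out = d.copy()
        h, w = d.shape
        for y in range(h):
            for x in range(w):
                y0, y1 = max(0, y - radius), min(h, y + radius + 1)
                x0, x1 = max(0, x - radius), min(w, x + radius + 1)
                vals = d[y0:y1, x0:x1].ravel()
                wts = np.exp(-((g[y0:y1, x0:x1] - g[y, x]) ** 2).ravel() / (2 * sigma ** 2))
                order = np.argsort(vals)
                cdf = np.cumsum(wts[order])
                out[y, x] = vals[order][np.searchsorted(cdf, 0.5 * cdf[-1])]
        return out

    rng = np.random.default_rng(0)
    guide = np.tile(np.arange(16.0), (16, 1)) * 8          # smooth guide image
    noisy = (guide > 60).astype(float) * 10 + rng.normal(0, 0.5, guide.shape)
    print(np.round(weighted_median_filter(noisy, guide)[8, 6:10], 2))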

US Pat. No. 10,462,444

THREE-DIMENSIONAL INSPECTION

FARO TECHNOLOGIES, INC., ...

1. A method comprising: providing at least one processor;
providing a camera system having a first linear polarizer oriented at a first angle, a second linear polarizer oriented at a second angle, and a third linear polarizer oriented at a third angle;
forming with the camera system a first image of a surface illuminated at a first angle and a second image of the surface illuminated at a second angle, the first image and the second image seen through the first linear polarizer;
forming with the camera system a third image of the surface illuminated at the first angle and a fourth image of the surface illuminated at the second angle, the third image and the fourth image seen through the second linear polarizer;
forming with the camera system a fifth image of the surface illuminated at the first angle and a sixth image of the surface illuminated at the second angle, the fifth image and the sixth image seen through the third linear polarizer;
with the at least one processor, determining machined characteristics of the surface based at least in part on the first image, the second image, the third image, the fourth image, the fifth image, and the sixth image; and
storing a description of the machined characteristics.

US Pat. No. 10,462,442

APPARATUS, SYSTEMS AND METHODS FOR MONITORING VEHICULAR ACTIVITY

1. A system for recording events relating to the operation of a vehicle, the system comprising: a plurality of video cameras recording visual information completely around a vehicle, wherein each of the plurality of video cameras comprises a visual field wherein a portion on the right side of each visual field overlaps with a portion on the left side of an immediately adjacent visual field to the right of each visual field, and a portion on the left side of each visual field overlaps with a portion on the right side of an immediately adjacent visual field to the left of each visual field, without any areas between adjacent visual fields external to the vehicle that are not covered by at least one of the plurality of video cameras;
at least one sensor associated with the vehicle, the at least one sensor for sensing a condition occurring to the vehicle;
a first hard drive for storing video information recorded by the plurality of video cameras and condition information relating to the condition sensed by the at least one sensor, the hard drive continuously recording the video information and the condition information when activated for a first period of time prior to an event, a second period of time during the event and a third period of time after the event, said hard drive recording over previously recorded information recorded by the video camera and the condition information sensed by the at least one sensor when the hard drive is full;
a first processor for detection of the event, the first hard drive storing the video information thereby forming stored video information, wherein the stored video information comprises each of the visual fields, and including the overlapping visual field including all of the video information without deletion of any of the video information in each of the visual fields and in the overlapping visual fields wherein the stored video information is used to form a composited recorded scene comprising all of the visual fields including the overlapping visual fields and all of the video information without deletion of any of the video information in each of the visual fields and in the overlapping visual fields, and the condition information of a subset period of time shorter than the first period of time prior to the event, said subset period of time predefined by a user of the system and immediately preceding the occurrence of the event and lasting until the event, the hard drive further storing the video information and the condition information of the second period of time during the event, and of the third period of time after the event; and
a data center external to the vehicle, the data center comprising a second hard drive, and means for uploading the stored video information for storage by the second hard drive,
wherein the data center external to the vehicle receives the stored video information, the data center configured for viewing the composited recorded scene, wherein the composited recorded scene is presented to a viewer as a presentation selected from the group consisting of an enhanced 2D presentation and a stereoscopic presentation.

US Pat. No. 10,462,440

IMAGE PROCESSING APPARATUS

OLYMPUS CORPORATION, Tok...

1. An image processing apparatus comprising: a processor comprising hardware, wherein the processor is configured to:
acquire, in time series, an imaging signal generated by capturing an object;
determine, when a freeze instruction signal that allows an image based on the imaging signal to be displayed as a still image is input, an image of a freeze target, thereby to specify an imaging signal corresponding to the image of the freeze target, or determine, when the freeze instruction signal is not input, a latest image as an image to be displayed, thereby to specify an imaging signal corresponding to the image to be displayed;
perform a color balance adjustment process by using a first color balance parameter, based on the imaging signal corresponding to the image which is determined to be either one of the image of the freeze target and the latest image in the determining, thereby to generate a first imaging signal;
generate a display purpose imaging signal, based on the generated first imaging signal;
perform the color balance adjustment process by using a second color balance parameter, based on the imaging signal corresponding to the image which is determined to be either one of the image of the freeze target and the latest image in the determining, in parallel with the performing the color balance adjustment process by using the first color balance parameter, thereby to generate a second imaging signal;
detect signals of a plurality of color components that are included in the second imaging signal;
calculate, based on the detected signals, a color balance parameter used for performing the color balance adjustment process; and
set, when the freeze instruction signal is not input, a latest color balance parameter that has been calculated in the calculating as the first and the second color balance parameters, or set, when the freeze instruction signal is input, a color balance parameter corresponding to the image of the freeze target as the first color balance parameter and the latest color balance parameter as the second color balance parameter.

US Pat. No. 10,462,439

COLOR CORRECTION WITH A LOOKUP TABLE

VID SCALE, Inc., Wilming...

1. A method for processing video content, the method comprising: receiving a first luma sample, a first chroma sample, and a two-dimensional (2D) look-up table (LUT) for producing corrected chroma values, wherein the 2D LUT defines a luma dimension and a chroma dimension associated with a 2D color space of the video content, the 2D color space is defined by a plurality of rectangular units with respective vertices, and the 2D LUT comprises corrected chroma component values associated with the respective vertices of the rectangular units;
determining an input luma value and an input chroma value to the 2D LUT via an upsampling operation utilizing at least one of the first luma sample or the first chroma sample, wherein the first luma sample and the first chroma sample are associated with different sampling locations, and the input luma value and the input chroma value are aligned to a same sampling location;
producing an output chroma value based on the 2D LUT, the input chroma value and the input luma value; and
reconstructing the video content using at least the output chroma value.
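
The 2D LUT stores corrected chroma values at the vertices of rectangular units over the (luma, chroma) space, and the output chroma is produced from the unit containing the aligned input sample. The sketch below uses bilinear interpolation over those four vertices as an illustration; the interpolation rule, grids, and LUT contents are assumptions, not the claimed specifics.

    import numpy as np

    def corrected_chroma(lut, luma_axis, chroma_axis, y, c):
        """Look up a corrected chroma value in a 2D (luma x chroma) LUT.

        Bilinear interpolation from the four vertices of the rectangular unit
        containing the input (luma, chroma) sample. Sketch only.
        """
        i = np.clip(np.searchsorted(luma_axis, y) - 1, 0, len(luma_axis) - 2)
        j = np.clip(np.searchsorted(chroma_axis, c) - 1, 0, len(chroma_axis) - 2)
        ty = (y - luma_axis[i]) / (luma_axis[i + 1] - luma_axis[i])
        tc = (c - chroma_axis[j]) / (chroma_axis[j + 1] - chroma_axis[j])
        top = (1 - tc) * lut[i, j] + tc * lut[i, j + 1]
        bot = (1 - tc) * lut[i + 1, j] + tc * lut[i + 1, j + 1]
        return (1 - ty) * top + ty * bot

    # Toy 3x3 LUT over an 8-bit (luma, chroma) space.
    luma_axis = np.array([0.0, 128.0, 255.0])
    chroma_axis = np.array([0.0, 128.0, 255.0])
    lut = np.array([[100.0, 120.0, 140.0],
                    [110.0, 130.0, 150.0],
                    [120.0, 140.0, 160.0]])
    print(corrected_chroma(lut, luma_axis, chroma_axis, y=64.0, c=192.0))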

US Pat. No. 10,462,438

DISPLAY APPARATUS, DISPLAY SYSTEM, AND METHOD FOR CONTROLLING DISPLAY APPARATUS THAT IS CONFIGURED TO CHANGE A SET PERIOD

SEIKO EPSON CORPORATION, ...

1. A display apparatus comprising: a display section that displays an image;
an input section to which an instruction for displaying the image is inputted;
a storage that stores a set period that is a period from a point of time when the instruction is inputted to the input section to a point of time when the display section displays the image;
an operation accepting section that accepts operation;
a changing section that changes the set period based on the operation accepted by the operation accepting section; and
a controller that causes the display section to display the image after the instruction is inputted to the input section and the set period stored in the storage then elapses, wherein the controller switches a first mode in which the display section displays the image after the instruction is inputted to the input section and the set period then elapses to a second mode in which the display section displays the image when the instruction is inputted to the input section irrespective of whether or not the set period elapses and vice versa, and the controller causes the display section to display the image in the selected mode.

US Pat. No. 10,462,437

HIGH LUMINANCE PROJECTION DISPLAYS AND ASSOCIATED METHODS

Dolby Laboratories Licens...

1. A projector comprising: a projection device configured to project a base image according to image data, the projection device comprising a first spatial light modulator and a light source arranged to illuminate the first spatial light modulator, the first spatial light modulator configured to modulate light from the light source to provide illumination of the base image;
a highlight projector configured to project boosted illuminations which boost luminance in one or more highlight areas of the base image, the highlight projector comprising a second spatial light modulator configured to prevent light from the highlight projector from reaching parts of the base image outside of the one or more highlight areas, the second spatial light modulator of the highlight projector directing the boosted illuminations for the one or more highlight areas to the first spatial light modulator of the projection device such that the first spatial light modulator of the projection device modulates both light incident from the light source and the boosted illuminations projected onto the first spatial light modulator by the highlight projector;
wherein the highlight projector comprises one of a steerable beam projector, a holographic projector, and spatial light modulation projector; and
the boosted illuminations vary smoothly and are of low spatial frequencies.

US Pat. No. 10,462,436

PROJECTOR

Seiko Epson Corporation, ...

1. A projector comprising: a light source;
a plurality of light modulation devices that modulate light emitted from the light source;
an optical projection device that projects the light modulated by the plurality of light modulation devices;
a light path changing element that is disposed between the plurality of light modulation devices and the optical projection device and changes a light path of the light modulated by the plurality of light modulation devices through fluctuation;
an exterior casing that forms an exterior;
an internal casing in which the light path changing element is disposed;
a light combination device that combines and exits pieces of light modulated by the plurality of light modulation devices; and
a support member that is disposed in the internal casing and supports the light combination device,
wherein
the optical projection device projects the light exited from the light combination device,
the light combination device is disposed in the internal casing,
the light path changing element includes
a permanent magnet,
an optical member that changes the light path of the incident light through fluctuation caused by the permanent magnet,
a holding portion that holds the optical member and the permanent magnet,
a pair of coils that are disposed in the holding portion with the permanent magnet interposed therebetween,
a coil holding portion that holds the one pair of coils, and
a heat transmission portion that comes into contact with the coil holding portion, and
the heat transmission portion comes into contact with the support member so that heat is transmittable to the support member.

US Pat. No. 10,462,435

IMAGE DISPLAY DEVICE AND SCREEN FOR CAR WINDSHIELD AND MANUFACTURING THEREOF

Panasonic Intellectual Pr...

1. An image display device comprising: a light source that emits laser light;
a screen on which an image is drawn by being scanned by the laser light;
a scanner that causes the laser light, which is emitted from the light source, to scan the screen; and
an optical system that generates a virtual image of the image, which is drawn on the screen, by the laser light transmitted through the screen, wherein:
the screen is configured such that, in a drawing region on which the image is drawn, a divergence angle is constant in a predetermined range in a center of the screen in a scanning direction, and the divergence angle becomes gradually larger toward ends in side ranges on both sides of the predetermined range of the screen in the scanning direction, the side ranges excluding the predetermined range, and
a plurality of lenses are arranged in the predetermined range and each of the side ranges.

US Pat. No. 10,462,434

PROJECTION DEVICE AND PROJECTION METHOD

DELTA ELECTRONICS, INC., ...

1. A projection device, comprising: a first chip set comprising a plurality of first micro mirrors, wherein angles of the plurality of first micro mirrors are adjusted correspondingly in a plurality of first time intervals to output a first image set according to a light source, and the first image set comprises a plurality of first images corresponding to the plurality of first time intervals; and
a second chip comprising a plurality of second micro mirrors, wherein angles of the plurality of second micro mirrors are adjusted correspondingly in the plurality of first time intervals to output a second image set according to the first image set, and the second image set comprises a plurality of second images corresponding to the plurality of first time intervals.

US Pat. No. 10,462,433

IMAGE SENSOR WITH BIG AND SMALL PIXELS AND METHOD OF MANUFACTURE

OmniVision Technologies, ...

1. A method of manufacturing an image sensor, comprising: providing a substrate;
forming a first set of sensor pixels on said substrate arranged in rows and columns;
forming a second set of sensor pixels on said substrate arranged in rows and columns, each pixel of said second set of pixels being smaller than each pixel of said first set of pixels;
forming a set of transparent windows over said first set of sensor pixels, said transparent windows being arranged in rows and columns and each being configured to pass light within a first range of wavelengths; and
forming a set of filters over said second set of sensor pixels, said filters being arranged in rows and columns and each being configured to pass light within one of a set of ranges of wavelengths, each range of wavelengths of said set of ranges of wavelengths being a subrange of said first range of wavelengths;
wherein each pixel of said first set of pixels has a center disposed between adjacent rows of said second set of pixels and between adjacent columns of said second set of pixels; and
wherein said step of forming a second set of sensor pixels includes forming each pixel of said second set of sensor pixels spaced apart from every other pixel of said second set of sensor pixels by a distance greater than a width of one of said sensor pixels of said second set of sensor pixels.

US Pat. No. 10,462,432

METHOD FOR DRIVING IMAGE PICKUP APPARATUS, AND SIGNAL PROCESSING METHOD

CANON KABUSHIKI KAISHA, ...

1. A method for driving an image pickup apparatus that includes an image pickup element and an output signal processor, the image pickup element having a plurality of color pixels, each having a color filter of one of red, green, and blue, and a white pixel, the method comprising: outputting, from the image pickup element to the output signal processor, signals output by a first color pixel of the plurality of color pixels in n frames where each frame of the n frames is output from the image pickup element at a different time and n is an integer greater than two, and signals output by the white pixel in m frames, where each frame of the m frames is output at a different time from the image pickup element and m is an integer smaller than n; and
generating, by the output signal processor, image data using the signals output by the first color pixel contained in the n frames, and the signals of the white pixel contained in the m frames,
wherein the output signal processor generates image data using a signal obtained by averaging the signals output by the first color pixel contained in the n frames and a signal obtained by averaging the signals of the white pixel contained in the m frames.
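
The signal processing averages the first color pixel's signals over the n frames and the white pixel's signals over the m frames (m < n) before forming image data. The short numpy sketch below shows only those two temporal averages; how they are combined into final image data is not specified here and is left out.

    import numpy as np

    def blend_color_and_white(color_frames, white_frames):
        """Average a color pixel over n frames and a white pixel over m frames.

        Minimal sketch of the claimed averaging: each result is the per-pixel
        temporal mean of the frames available for that pixel type.
        """
        color_avg = np.mean(np.stack(color_frames), axis=0)   # n frames, n > 2
        white_avg = np.mean(np.stack(white_frames), axis=0)   # m frames, m < n
        return color_avg, white_avg

    rng = np.random.default_rng(0)
    n, m = 4, 2
    color_frames = [rng.integers(0, 1024, (4, 4)) for _ in range(n)]
    white_frames = [rng.integers(0, 1024, (4, 4)) for _ in range(m)]
    c_avg, w_avg = blend_color_and_white(color_frames, white_frames)
    print(c_avg.shape, w_avg.shape)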

US Pat. No. 10,462,431

IMAGE SENSORS

VISERA TECHNOLOGIES COMPA...

1. An image sensor, comprising: a semiconductor substrate containing a plurality of photoelectric conversion elements;
a filter array, including a first color filter, a second color filter, a third color filter and an infrared filter, disposed above the semiconductor substrate;
an isolated partition disposed in the filter array to surround one of the first, second and third color filters and the infrared filter, wherein the isolated partition has a refractive index that is lower than the refractive indexes of the first, second and third color filters and the refractive index of the infrared filter, wherein an area of the isolated partition and the one of the first filter, the second filter, the third filter and the infrared filter surrounded by the isolated partition added together is equal to the area of the other one of the first filter, the second filter, the third filter and the infrared filter that is not surrounded by the isolated partition; and
an additional isolated partition disposed in the filter array, wherein the filter array includes a red color filter, a green color filter, a blue color filter and the infrared filter, the isolated partition and the additional isolated partition surround the red and the blue color filters, respectively, and the infrared filter is in contact with the green color filter.

US Pat. No. 10,462,430

IMAGING DEVICE, SIGNAL PROCESSING DEVICE, AND ELECTRONIC APPARATUS HAVING PIXEL SHARING UNITS THAT INCLUDE PORTIONS OF A COLOR FILTER ARRANGEMENT

Sony Semiconductor Soluti...

1. An imaging device, comprising: a first plurality of pixels configured to include only adjacent pixels of a same color;
a first pixel sharing unit including a color filter that is formed by a Bayer arrangement, wherein pixels of the first pixel sharing unit share a floating diffusion and include one or more pixels of the first plurality of pixels;
wherein a defect generated in at least one of the one or more pixels of the first pixel sharing unit that share the floating diffusion is corrected using at least one of the adjacent pixels in the first plurality of pixels that does not share the floating diffusion,
wherein the first pixel sharing unit is 2×4 pixels that include the color filter formed by the Bayer arrangement and that share an amplification transistor.

US Pat. No. 10,462,429

PROPERTY INSPECTION DEVICES, METHODS, AND SYSTEMS

United Services Automobil...

1. A method for performing an inspection of a structure using a remotely located multispectral sensor device comprising: positioning a multispectral sensor including an audio sensor and an image capture sensor;
capturing, by the multispectral sensor, data relating to the structure, the data including audio data and image data;
electronically transmitting the data to a computer server and electronically receiving the data in the computer server;
analyzing the captured data in the computer server to determine a condition of the structure, wherein analyzing the data includes overlaying the audio data and the image data from a common location to determine the condition of the structure; and
electronically transmitting the condition of the structure to an operator of the multispectral sensor device.

US Pat. No. 10,462,428

VIDEO SYSTEM AND METHOD FOR ALLOWING USERS, INCLUDING MEDICAL PROFESSIONALS, TO CAPTURE VIDEO OF RELEVANT ACTIVITIES AND PROCEDURES

1. A system comprising: a portable video camera;
a head-worn apparatus having an offset fixed mount for attachment of a camera such that a lens of said camera is in the center of the wearer's field of vision, said offset mount including multiple parallel extensions; and
one or more attachment members for attaching said portable video camera to said fixed mount on said head-worn apparatus, said one or more attachment members each including a first and second series of multiple parallel extensions configured to rotatably interlace with one another and said multiple parallel extensions of said offset mount permitting said portable video camera to rotate at least upward and downward.

US Pat. No. 10,462,427

SECURING REMOTE VIDEO TRANSMISSION FOR THE REMOTE CONTROL OF A VEHICLE

Siemens Mobility SAS, Ch...

1. A method for securing a remote transmission of an image of an object intended to be captured by a photosensitive receiver of a camera of a video system and transmitted remotely by the video system, which comprises the steps of: generating an optical securing datum for the remote transmission of the image of the object, the optical securing datum being generated by a first light source, the optical securing datum being different from the object;
forming a secure optical image by optical superposition of a securing image containing the optical securing datum and of the image of the object, the image of the object being generated by a second light source being different from the first light source, the secure optical image being formed as light rays and intended to act on the photosensitive receiver of the camera in order to be converted into a video signal; and
verifying the optical securing datum carried by the video signal.

US Pat. No. 10,462,425

PROCESSING SYSTEM FOR PROVIDING A TELLER ASSISTANT EXPERIENCE USING ENHANCED REALITY INTERFACES

Bank of America Corporati...

1. A computing platform comprising: at least one processor;
a communication interface communicatively coupled to the at least one processor; and
memory storing computer-readable instructions that, when executed by the at least one processor, cause the computing platform to:
establish, with an enhanced reality device, a first wireless data connection;
receive, from the enhanced reality device and while the first wireless data connection is established, pre-transaction information corresponding to an interaction with an automated teller machine terminal platform, wherein the pre-transaction information corresponding to the interaction with the automated teller machine terminal platform is sent by the enhanced reality device to the computing platform based on pre-transaction input received by the enhanced reality device as gesture input or detected eye movement;
generate, based on the pre-transaction information received from the enhanced reality device, teller assistant experience information and one or more commands directing the enhanced reality device to generate a teller assistant experience interface using the teller assistant experience information;
send, to the enhanced reality device and while the first wireless data connection is established, the teller assistant experience information and the one or more commands directing the enhanced reality device to generate the teller assistant experience interface using the teller assistant experience information;
cause the enhanced reality device to establish a second wireless data connection with a teller interaction platform; and
initiate a video call between the enhanced reality device and the teller interaction platform using the second wireless data connection, wherein the video call is displayed at the enhanced reality device in a portion of the teller assistant experience interface.

US Pat. No. 10,462,424

PAIRING DEVICES IN CONFERENCE USING ULTRASONIC BEACON AND SUBSEQUENT CONTROL THEREOF

Polycom, Inc., San Jose,...

1. A method of conferencing over a network, the method comprising: outputting, in an imperceptible acoustic beacon in a near-end environment with a second device of a network system, connection information for connecting to a given device of the network system in the network, wherein the connection information comprises a network address of the given device encoded in the acoustic beacon;
connecting a first device in the near-end environment to the given device of the network system by receiving a response in the network system from the first device, the response being based on the first device using the output connection information and requesting connection of the first device to the given device of the network system;
obtaining, at the network system via the network, content from the first device; and
operating the network system using the obtained content.

US Pat. No. 10,462,423

VIRTUAL ON-SET COMMUNICATION PLATFORM FOR ON-SET CONTRIBUTORS

Shutterstock, Inc., New ...

1. A computer-implemented method comprising: receiving, from a device of a first user, an input indicating that the first user is ready to start working on a project associated with the first user;
identifying, based on the project associated with the first user, a second set of users associated with the project;
generating a session identifier for a communication channel between the first user and one or more users of the second set of users;
selecting, based on a listing of communication mode preferences associated with the first user, a mode of communication for the communication channel;
generating messages for the communication channel, wherein the messages include a link to the communication channel;
transmitting the messages to the one or more users of the second set of users;
receiving one or more streams of images from the device of the first user while the one or more streams of images are being captured by an image capturing device of the first user;
transmitting the one or more streams of images to the second set of users; and
receiving, from a device of a user from the second set of users, one or more messages related to the one or more streams of images.

US Pat. No. 10,462,422

AUDIO SELECTION BASED ON USER ENGAGEMENT

Facebook, Inc., Menlo Pa...

1. A method comprising: receiving, during an audio-video communication session, audio input data from a microphone array comprising at least two microphones, wherein the audio input data is generated by a first sound source at a first location within an environment and a second sound source at a second location within the environment;
determining a first classification for the first sound source and a second classification for the second sound source;
predicting a first engagement metric for the first sound source and a second engagement metric for the second sound source, wherein:
the first engagement metric is based on the first classification and the second engagement metric is based on the second classification;
the first engagement metric approximates an interest level of a receiving user for the first sound source; and
the second engagement metric approximates an interest level from the receiving user for the second sound source;
determining that the first engagement metric is greater than the second engagement metric;
processing the audio input data to generate an audio output signal, wherein the audio output signal amplifies sound generated by the first sound source and attenuates sound generated by the second sound source; and
sending the audio output signal to a computing device associated with the receiving user.

US Pat. No. 10,462,421

PROJECTION UNIT

Microsoft Technology Lice...

1. A portable projection unit comprising: a rotating capture module comprising: at least one color camera, at least one microphone, and at least one depth camera, the rotating capture module configured to capture images of the environment;
a rotating projection module configured to project images onto at least one surface of a plurality of surfaces in the environment; and
a processor configured to:
identify a plurality of users physically in the environment;
identify a first set of users from the plurality of users physically in the environment as participants and a second set of users from the plurality of users physically in the environment as non-participants;
use data captured by the rotating capture module to select a surface from the plurality of surfaces in the environment on which to project the images, the selection being based on a current shared field of view of the first set of the plurality of users in the environment and not the second set of the plurality of users and on characteristics of the plurality of surfaces in the environment;
control rotation of the rotating capture module such that the data captured by the rotating capture module is suitable for computing the current shared field of view of the first set of the plurality of users and for determining characteristics of the plurality of surfaces; and
control operation of the rotating projection module to project the images onto the selected surface.

US Pat. No. 10,462,420

ESTABLISHING A VIDEO CONFERENCE DURING A PHONE CALL

APPLE INC., Cupertino, C...

1. A non-transitory computer readable medium of a first mobile device, the computer readable medium storing a computer program, said computer program comprising sets of instructions for: presenting, through a wireless communication network with one or more devices, a first composite view on the first mobile device, the first composite view comprising:
a first selectable user-interface (UI) item corresponding to video data captured by the first mobile device; and
a plurality of second selectable user-interface (UI) items corresponding to a plurality of video conference participants, the plurality of second selectable UI items comprising video data captured by second mobile devices of at least two video conference participants participating in a video conference with a user of the first mobile device;
receiving an indication of a first video conference participant of the plurality of video conference participants based upon detected speech of the first video conference participant; and
in response to the indication of the first video conference participant, presenting a second composite view comprising an enlarged view of one of the plurality of second selectable UI items corresponding to the first video conference participant.

US Pat. No. 10,462,419

HYBRID SPLITTER PASSING CATV+MOCA AND MOCA SIGNALS

CommScope Technologies LL...

13. A splitter comprising: a housing;
a first coaxial port attached to said housing;
a second coaxial port attached to said housing;
a power divider element within said housing, wherein a first terminal of said power divider element is directly connected to said first coaxial port without any intervening circuit element, and a second terminal of said power divider element is directly connected to said second coaxial port without any intervening circuit element, so that all frequencies presented to said first coaxial port can pass to said second coaxial port, and so that all frequencies presented to said second coaxial port can pass to said first coaxial port;
a high pass filter within said housing, said high pass filter having a first terminal directly connected to a third terminal of said power divider element without any intervening circuit element; and
a third coaxial port attached to said housing, said third coaxial port being directly connected to a second terminal of said high pass filter without any intervening circuit element, so that frequencies within said high pass filter's frequency range may pass from said first coaxial port to said third coaxial port, frequencies within said high pass filter's frequency range may also pass from said third coaxial port to said first coaxial port, and frequencies outside of said high pass filter's frequency range are attenuated by said high pass filter.

US Pat. No. 10,462,418

ELECTRONIC DEVICE, DISPLAY DEVICE, AND DISPLAY SYSTEM INCLUDING ELECTRONIC DEVICE AND DISPLAY DEVICE

SAMSUNG ELECTRONICS CO., ...

15. A display system comprising: a display device; and
an electronic device connected with the display device through a cable to transmit a test signal including specified data to the display device, and to receive error check information regarding the test signal from the display device, wherein the electronic device includes a PCB,
wherein the display device is connected with the electronic device through the cable to receive the test signal including the specified data from the electronic device, and is configured to examine the specified data to check for an error in the test signal, and to transmit the error check information regarding the test signal to the electronic device,
wherein the electronic device is configured to:
determine an impedance of the PCB,
perform calibration for adjusting the impedance of the PCB if the determined impedance is different from a specified impedance,
sequentially change transmission characteristic values of the test signal within a first specified range to transmit a plurality of test signals, and
perform calibration for changing a transmission characteristic of a signal to be transmitted to the display device based on the error check information regarding the plurality of test signals, and
wherein the display device is configured to:
sequentially change reception characteristic values of the test signal within a second specified range to receive the plurality of test signals; and
perform calibration for changing a reception characteristic of the signal received from the electronic device based on the error check information regarding the plurality of test signals.

US Pat. No. 10,462,417

METHODS AND APPARATUS FOR REDUCING ELECTROMAGNETIC INTERFERENCE RESULTANT FROM DATA TRANSMISSION OVER A HIGH-SPEED AUDIO/VISUAL INTERFACE

Apple Inc., Cupertino, C...

1. A method of operating a source device in an HDMI system, the method comprising: receiving an operating mode for a sink device coupled to the source device;
determining if a first operating mode is supported by the sink device, and if so, turning off one or more diodes present in an active filter circuit, otherwise if the first operating mode is not supported, turning on the one or more diodes present in the active filter circuit.

US Pat. No. 10,462,416

FACE PLATE COVER FOR OUTDOOR IN-LINE MULTITAP

1. A face plate cover for an outdoor directional in-line multitap cable communication distribution device, wherein the device comprises a body portion containing suitable distribution signal conditioning circuitry;
a pair of network trunk cable ports at a lower side of the body portion and adapted to connect to an input network cable and an output network cable, respectively, and connected with said signal conditioning circuitry which provides video output signals to apply to a plurality of customer drop cables;
a gasket member disposed between a periphery of said body portion and said face plate cover and forming an RF seal therebetween;
the face plate cover comprising a plate member adapted to removably attach onto said body portion and a plurality of customer connector ports affixed on said face plate member and projecting therefrom;
the customer connector ports each formed of a customer connector of generally tubular shape having a base portion that is affixed to and projects out from said face plate member at a right angle to the face plate and a connector portion affixed onto said base portion and angled downward at about 45 degrees from said base portion, such that a connector end of a respective customer drop cable can be attached directly to the respective customer connector port at a downward angle of about 45 degrees.

US Pat. No. 10,462,415

SYSTEMS AND METHODS FOR GENERATING A VIDEO CLIP AND ASSOCIATED CLOSED-CAPTIONING DATA

Tribune Broadcasting Comp...

1. A method comprising:accessing a first video clip demarcated into frames;
accessing closed-captioning (CC) data demarcated into CC blocks, wherein each of the frames correlates to a respective one of the CC blocks;
identifying a first frame from among the frames;
determining a first set of CC blocks that correlate to a first set of frames within a range of the identified first frame;
identifying a first position from among the determined first set of CC blocks;
identifying a second frame from among the frames;
using the identified second frame to identify a second position from among the accessed CC data; and
generating a second video clip and associated CC data, wherein the second video clip includes the frames of the accessed first video clip spanning from the identified first frame to the identified second frame, and wherein the generated CC data includes the CC blocks of the accessed CC data spanning from the identified first position to the identified second position.
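
Read as a procedure, the claim pairs each frame with one CC block and excerpts both streams between two identified positions. The Python sketch below assumes that one-to-one correlation and treats positions as list indices; the window size and the rule for picking the first position are placeholders, not the claimed logic.

    def excerpt_clip(frames, cc_blocks, first_frame_idx, second_frame_idx, window=5):
        assert len(frames) == len(cc_blocks)       # each frame correlates to one CC block
        # CC blocks correlating to frames within a range of the first frame
        lo = max(0, first_frame_idx - window)
        hi = min(len(cc_blocks) - 1, first_frame_idx + window)
        first_set = list(range(lo, hi + 1))
        first_pos = first_set[0]                   # placeholder selection rule
        second_pos = second_frame_idx              # position identified via the second frame
        clip = frames[first_frame_idx:second_frame_idx + 1]
        captions = cc_blocks[first_pos:second_pos + 1]
        return clip, captions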

US Pat. No. 10,462,414

METHOD AND SYSTEM FOR GENERATION OF CAPTIONS OVER STEREOSCOPIC 3D IMAGES

Cable Television Laborato...

1. A method for adaptive management of a graphical overlay within stereoscopic video comprising:generating a depth map for image frames used to form the stereoscopic video, the depth map using one or more depth lines for representing depth disparity for a plurality of objects appearing within the image frames, the depth lines varying in elevation to reflect corresponding parallax variances of a corresponding one of the objects over time;
identifying a first segment, a second segment and a third segment for a first depth line of the one or more depth lines, the first depth line being associated with a first object of the plurality of objects, including identifying a beginning and an ending for each of the first segment, second segment and third segment;
generating a first overlay line to represent depth of the graphical overlay relative to the first depth line, including shaping a first portion, a second portion and a third portion of the first overlay line such that the first portion extends from the beginning of the first segment to the ending of the first segment, the second portion extends from the beginning of the second segment to the ending of the second segment and the third portion extends from the beginning of the third segment to the ending of the third segment; and
positioning a first graphical overlay within the stereoscopic video to track the first overlay line, thereby positioning the first graphical overlay relative to the first object within the stereoscopic video.
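
As a sketch of how an overlay line might be shaped from the identified segments, the function below gives each overlay portion exactly the span of its segment and, as an assumption for illustration only, places it at the segment's maximum disparity plus a margin so the caption clears the tracked object; the placement rule is not taken from the patent.

    def overlay_line(depth_line, segments, margin=2.0):
        """depth_line: per-frame disparity values for the first object.
        segments: (begin_index, end_index) pairs for the identified segments."""
        portions = []
        for begin, end in segments:
            depth = max(depth_line[begin:end + 1]) + margin   # assumed placement rule
            portions.append((begin, end, depth))              # portion spans the whole segment
        return portions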

US Pat. No. 10,462,412

SURGICAL VISUALIZATION AND RECORDING SYSTEM

1. A method for capturing, communicating, and displaying images of a surgical site with up to an ultrahigh definition resolution in association with patient information in real time during a surgery, said method comprising:providing a surgical visualization and recording system comprising:
an ultrahigh definition camera system comprising an optical component and an image sensor positioned at a proximal end of a surgical scope device, said image sensor in optical communication with said optical component for receiving reflected light from said surgical site via said optical component and capturing images of said surgical site with up to said ultrahigh definition resolution;
a display unit comprising an embedded microcomputer in operable communication with said ultrahigh definition camera system, said embedded microcomputer comprising at least one processor configured to execute computer program instructions for receiving, transforming, and processing said captured images of said surgical site; and
said display unit further comprising a tactile user interface in operable communication with said embedded microcomputer for receiving one or more user inputs for controlling operation of said ultrahigh definition camera system and for displaying said captured images of said surgical site with up to said ultrahigh definition resolution;
receiving said patient information via said tactile user interface of said display unit of said surgical visualization and recording system by said embedded microcomputer of said display unit;
capturing and communicating said images of said surgical site with up to said ultrahigh definition resolution by said image sensor of said ultrahigh definition camera system of said surgical visualization and recording system to said embedded microcomputer of said display unit in said real time, on receiving one or more user inputs via one of said tactile user interface of said display unit and one or more input devices operably connected to said embedded microcomputer of said display unit;
associating said captured and communicated images of said surgical site with said received patient information by said embedded microcomputer of said display unit in said real time; and
displaying said captured and communicated images of said surgical site associated with said received patient information with up to said ultrahigh definition resolution by said tactile user interface of said display unit in said real time.

US Pat. No. 10,462,411

TECHNIQUES FOR VIDEO ANALYTICS OF CAPTURED VIDEO CONTENT

INTEL CORPORATION, Santa...

1. A method comprising:receiving, at a camera, information from a host processing system, the camera comprising a first camera located with a display device receiving streaming video from the host processing system, the information received from the host processing system including information directing the first camera to capture video content contemporaneously with the display device displaying video included in the received streaming video;
capturing video content based, at least in part, on the received information;
performing video analytics on the captured video content; and
sending data associated with the video analytics to the host processing system.

US Pat. No. 10,462,410

SYSTEMS AND METHODS FOR RE-RECORDING CONTENT ASSOCIATED WITH RE-EMERGED POPULARITY

Rovi Guides, Inc., San J...

1. A method for re-recording content associated with popularity that has re-emerged, the method comprising:storing on a storage device at a remote server a plurality of media assets;
determining, at a first time, whether popularity of a media asset of the plurality of media assets is below a first threshold;
in response to determining that popularity of the media asset is below the first threshold, deleting the media asset from the storage device;
determining, at a second time after the first time, whether the popularity of the deleted media asset is above a second threshold; and
in response to determining that the popularity of the deleted media asset is above the second threshold, storing the deleted media asset on the storage device again.
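
The claim reduces to two threshold checks made at different times. The toy sketch below shows that shape; the storage and popularity interfaces are hypothetical stand-ins for the remote server's recording back end.

    def prune_if_unpopular(asset, storage, low_threshold):
        # First time: delete an asset whose popularity has fallen below the threshold.
        if asset.popularity() < low_threshold:
            storage.delete(asset)

    def rerecord_if_reemerged(asset, storage, re_emerge_threshold):
        # Second, later time: store the deleted asset again if its popularity re-emerges.
        if asset not in storage and asset.popularity() > re_emerge_threshold:
            storage.store(asset)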

US Pat. No. 10,462,409

METHOD FOR COLLECTING MEDIA ASSOCIATED WITH A MOBILE DEVICE

GOOGLE TECHNOLOGY HOLDING...

1. A method performed at a computing system, comprising:receiving, without user interaction, video data and audio data captured via a plurality of distributed video devices configured to monitor one or more vicinities;
receiving a first request from a mobile device for a media collection service;
in response to the first request, determining, without user interaction, a user of a mobile device to be within a proximity of the plurality of distributed video devices;
in accordance with a determination that the user is within the proximity of the plurality of distributed video devices:
identifying, without user interaction, video data and audio data in which the user of the mobile device appears; and
storing the identified video data and the audio data to the computing system;
receiving a second request from the user of the mobile device to access the identified video data and the audio data; and
in response to the second request, transmitting the identified video data and the audio data to the user.

US Pat. No. 10,462,408

DISPLAY SYSTEM, DISPLAY METHOD, AND DISPLAY APPARATUS

Panasonic Intellectual Pr...

1. A display apparatus comprising:an electro-optical transfer function (EOTF) converter that receives a video signal having a first luminance range and performs EOTF conversion on the video signal to obtain a first luminance value;
a luminance converter, which is provided for converting the video signal into a pseudo High Dynamic Range signal having a second luminance range, a maximum value of the second luminance range being smaller than a maximum value of the first luminance range, the luminance converter that converts the first luminance value into a second luminance value within the second luminance range, the maximum value of the second luminance range being larger than a maximum value of a luminance of a Standard Dynamic Range (SDR) signal, and the maximum luminance value of the SDR signal being 100 nit; and
a display that displays the pseudo High Dynamic Range signal having the second luminance range based on the second luminance value,
wherein the maximum value of the second luminance range is a maximum possible display peak luminance (DPL) that the display is capable of displaying, and is greater than the maximum luminance value of the SDR signal, and
wherein the luminance converter performs conversion of the first luminance value into the second luminance value, based on characteristic information on a characteristic of the display apparatus.

US Pat. No. 10,462,407

DISPLAY METHOD AND DISPLAY DEVICE

PANASONIC INTELLECTUAL PR...

1. A display method of displaying, on a display device, video of video data including peak luminance information indicating peak luminance of the video, where luminance of video is defined by a first Electro-Optical Transfer Function (EOTF) indicating a correlation of High Dynamic Range (HDR) luminance and code values, the method comprising:acquiring the video data;
performing first conversion where the luminance of the video is converted to a luminance corresponding to a dynamic range of a third EOTF; and
displaying the video on the display device using the result of the first conversion,
wherein a luminance range of a second EOTF is a part of a luminance range of the first EOTF from a minimum luminance to the peak luminance of the video indicated by the peak luminance information included in the acquired video data, and
wherein the dynamic range of the third EOTF is obtained by reducing a dynamic range of the second EOTF so that a maximum luminance of the second EOTF matches a displayable luminance of the display device while maintaining a relative relationship of luminance of the second EOTF, by multiplying a variable representing a luminance in the second EOTF by a value obtained by dividing the displayable luminance of the display device by the peak luminance of the video.
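
The reduction step at the end of this claim is an explicit scaling: every luminance of the second EOTF is multiplied by the display's displayable luminance divided by the video's peak luminance, so the video's peak maps onto what the panel can show while relative relationships are preserved. A direct numeric reading in Python:

    def third_eotf_luminance(second_eotf_luminance_nits,
                             display_peak_nits, video_peak_nits):
        scale = display_peak_nits / video_peak_nits
        return second_eotf_luminance_nits * scale

    # Example: a 4000-nit video peak shown on a 1000-nit display.
    # third_eotf_luminance(2000.0, 1000.0, 4000.0) -> 500.0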

US Pat. No. 10,462,406

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD

Sony Corporation, Tokyo ...

1. An information processing apparatus, comprising:a camera;
a screen configured to display an image shot by the camera; and
a data processing unit configured to:
acquire virtual object display control data that records first positional information associated with an anchor and second positional information associated with a real-world registration information setting point in a virtual three-dimensional spatial coordinate system;
detect the anchor from the image shot by the camera;
determine a position of the camera in the virtual three-dimensional spatial coordinate system based on the first positional information associated with the anchor and a direction of the detected anchor in the virtual three-dimensional spatial coordinate system;
calculate a first position of the real-world registration information setting point in a coordinate system of the screen, based on the second positional information of the real-world registration information setting point; and
display, on the screen, a virtual object that indicates the real-world registration information setting point, wherein
the virtual object is displayed based on the calculated first position, and
the real-world registration information setting point is included in the image shot by the camera.

US Pat. No. 10,462,405

SOLID-STATE IMAGING DEVICE AND MANUFACTURING METHOD THEREFOR

CANON KABUSHIKI KAISHA, ...

17. A signal processing circuit substrate connectable to a photoelectric conversion substrate,wherein
the photoelectric conversion substrate includes
a pixel region;
a plurality of the photoelectric conversion units each including a first semiconductor region of a first conductivity type and arranged in the pixel region;
a plurality of the floating diffusion regions arranged in the pixel region; and
a first element isolation portion provided in the pixel region, and configured to electrically isolate two selected from the plurality of the photoelectric conversion units and the plurality of the floating diffusion regions, and
the first element isolation portion includes a second semiconductor region of a second conductivity type which contacts a first insulating film disposed on a first surface of the first semiconductor substrate and a second insulating film disposed on a second surface of the first semiconductor substrate, the second surface being opposite to the first surface,
the signal processing circuit substrate comprising:
a signal processing circuit configured to process a signal output from the plurality of photoelectric conversion units;
a plurality of transistors included in the signal processing circuit; and
a second element isolation portion configured to electrically isolate at least a part of the plurality of transistors and including an insulator portion.

US Pat. No. 10,462,404

SOLID-STATE IMAGE SENSOR, METHOD FOR PRODUCING SOLID-STATE IMAGE SENSOR, AND ELECTRONIC APPARATUS

Sony Corporation, Tokyo ...

7. An image sensor comprising:a semiconductor substrate including a first photoelectric conversion region adjacent to a second photoelectric conversion region;
a plurality of on-chip lenses located above the semiconductor substrate; and
a light-shielding section including:
an embedded section extending vertically in at least a region between the first photoelectric conversion region and the second photoelectric conversion region; and
a lid section extending horizontally above the first photoelectric conversion region and a portion of the second photoelectric conversion region, wherein a portion of the lid section is between a first on-chip lens that corresponds to the second photoelectric conversion region and the portion of the second photoelectric conversion region, wherein the light-shielding section further includes a front-side lid section disposed in such a way as to cover at least a charge retaining section on a front side of the semiconductor substrate opposite to a side on which light enters the first photoelectric conversion region, and wherein, in the front-side lid section of the light-shielding section, an opening is formed in a region corresponding to the first photoelectric conversion region.

US Pat. No. 10,462,403

ELECTRIC CAMERA

Maxell, Ltd., Kyoto (JP)...

1. A camera comprising:an image sensor having an array of pixels arranged vertically and horizontally in a grid pattern;
a processor to form a plurality of image signals by using the pixels of the image sensor in a static image mode and a moving video mode, the processor configured to:
form the image signals by using a first set of the pixels of the image sensor corresponding to a predetermined first view angle during recording in the static image mode, and
form the image signals by using a second set of the pixels of the image sensor corresponding to a predetermined second view angle during recording in the moving video mode, wherein the predetermined second view angle is different from the predetermined first view angle;
an image-instability detector configured to detect an amount of image-instability of the camera and configured to change a position of the second set of the pixels according to the amount of image-instability detected by the image-instability detector, in order to correct the image-instability; and
a display configured to:
display a still image corresponding to the image signals formed based on the first set of the pixels, and
display a moving image corresponding to the image signals formed based on the second set of the pixels.

US Pat. No. 10,462,402

IMAGE SENSOR HAVING FULL WELL CAPACITY BEYOND PHOTODIODE CAPACITY

Apple Inc., Cupertino, C...

1. An image sensor, comprising:a photodiode having a full well capacity;
a storage node configured with a charge storage capacity at least twice the full well capacity;
a first transfer gate linking the photodiode and the storage node;
a storage gate linking the photodiode and the storage node, in series with the first transfer gate;
a floating diffusion node;
a second transfer gate linking the storage node and the floating diffusion node; and
control circuitry operable to:
activate the first transfer gate to transfer a first charge acquired in the photodiode during a first integration period to the storage node;
deactivate the first transfer gate to acquire a second charge in the photodiode during a second integration period, the second integration period beginning after an end of the first integration period;
activate the first transfer gate to transfer the second charge acquired in the photodiode to the storage node; and
activate the second transfer gate at an end of an image exposure time frame to transfer a third charge from the storage node to the floating diffusion node,
wherein:
the first integration period and the second integration period occur during the image exposure time frame, and
the third charge comprises the first charge and the second charge.

US Pat. No. 10,462,401

ENCODING APPARATUS AND ENCODING METHOD ENCODING RAW DATA THAT IS QUANTIZED BASED ON AN INDEX VALUE OF NOISE AMOUNT IN AN OPTICAL BLACK AREA

CANON KABUSHIKI KAISHA, ...

1. An encoding apparatus comprising:a processor and a memory storing a program configured to execute:
acquiring an index value of a noise amount by analyzing at least a part of an optical black area that is included in raw data;
quantizing the raw data based on the index value; and
encoding the quantized raw data,
wherein the raw data includes a first area having at least a part of the optical black area, and a second area that does not include the optical black area,
wherein the acquiring acquires the index value based on a code amount of encoded data generated by encoding the raw data in the first area, and
wherein if the index value is a first value, the quantizing quantizes the raw data in the second area with a first quantization step, and if the index value is a second value corresponding to a larger noise amount than a noise amount in a case where the index value is the first value, the quantizing quantizes the raw data in the second area with a second quantization step that is larger than the first quantization step.
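
A minimal sketch of the noise-adaptive quantization described above: the noise index is derived from how much code the optical-black (OB) area produces, and a larger index never yields a smaller quantization step. The specific mapping and constants here are illustrative assumptions, not Canon's implementation.

    def noise_index_from_ob(encoded_ob_size_bits):
        # Assumed proxy: noisier OB data compresses worse, so a larger code
        # amount is read as a larger noise index.
        return encoded_ob_size_bits

    def quantization_step(noise_index, base_step=1.0, step_per_unit=0.001):
        # Monotonically non-decreasing in the noise index, matching the claim:
        # a larger noise amount gives a larger (never smaller) step.
        return base_step + step_per_unit * noise_index

    def quantize(raw_samples, step):
        return [round(s / step) for s in raw_samples]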

US Pat. No. 10,462,400

SOLID-STATE IMAGING DEVICE AND IMAGING SYSTEM

CANON KABUSHIKI KAISHA, ...

1. A solid-state imaging device comprising:a plurality of pixels each of which includes:
a photoelectric conversion portion which photoelectrically converts incident light to generate a signal charge;
a charge holding portion that accumulates the signal charge transferred from the photoelectric conversion portion;
a floating diffusion region to which the signal charge of the charge holding portion is transferred;
a first transfer gate configured to transfer the signal charge from the photoelectric conversion portion to the charge holding portion; and
a second transfer gate configured to transfer the signal charge from the charge holding portion to the floating diffusion region,
wherein:
the photoelectric conversion portion includes a first semiconductor region of a first conductivity type, and a second semiconductor region of a second conductivity type including a majority carrier of a polarity of the signal charge formed under the first semiconductor region,
the charge holding portion includes a third semiconductor region of the first conductivity type, and a fourth semiconductor region of the second conductivity type formed under the third semiconductor region,
in a cross section including the charge holding portion, the floating diffusion region and the second transfer gate, a first side edge of the fourth semiconductor region on the floating diffusion region side is provided between the floating diffusion region and a first side edge of the third semiconductor region on the floating diffusion region side, and
in the cross section, the floating diffusion region and the fourth semiconductor region are isolated by a semiconductor region of the first conductivity type, the semiconductor region being in contact with a surface of a semiconductor substrate.

US Pat. No. 10,462,399

ANTI-ECLIPSE CIRCUITRY WITH TRACKING OF FLOATING DIFFUSION RESET LEVEL

Micron Technology, Inc., ...

1. An imager, comprising:a pixel array including a plurality of imaging pixels, a first non-imaging pixel configured to produce a first nominal reset signal, and a second non-imaging pixel configured to produce a second nominal reset signal having a different signal level than the first nominal reset signal;
control circuitry operably coupled to the pixel array and configured to—
output image signals based on reset signals produced by corresponding ones of the imaging pixels,
adjust a signal level of a reset signal of one of the imaging pixels proximate the first non-imaging pixel based at least in part on the first nominal reset signal, and
adjust a signal level of a reset signal of another one of the imaging pixels proximate the second non-imaging pixel based at least in part on the second nominal reset signal.

US Pat. No. 10,462,398

SOLID-STATE IMAGING DEVICE, METHOD OF DRIVING THE SAME, AND ELECTRONIC APPARATUS

Sony Semiconductor Soluti...

1. An imaging device comprising:a pixel including:
a first photoelectric conversion unit;
a floating diffusion coupled to the first photoelectric conversion unit;
a first transistor, one of a source or a drain of the first transistor being coupled to the floating diffusion;
a second transistor, a gate of the second transistor being coupled to the floating diffusion;
a third transistor, one of a source or a drain of the third transistor being coupled to one of a source or a drain of the second transistor; and
a fourth transistor, one of a source or a drain of the fourth transistor being coupled to a node between the second transistor and the third transistor,
wherein the other of the source or the drain of the first transistor and the other of the source or the drain of the fourth transistor are configured to have a same potential,
wherein the second transistor and the fourth transistor have a same channel type.

US Pat. No. 10,462,397

SAMPLE-AND-HOLD CIRCUIT WITH FEEDBACK AND NOISE INTEGRATION

Sony Semiconductor Soluti...

1. A sample-and-hold circuit, comprising:a sampling capacitor;
an amplifier transistor; and
a noise reduction circuit including an integration capacitor and a feedback capacitor, the noise reduction circuit being configured to reduce noise via a four-phase operation including:
an auto-zero phase in which the feedback capacitor is discharged,
a feedback phase in which a gate voltage of the amplifier transistor is partially compensated through the feedback capacitor,
an integration phase in which the integration capacitor is charged, and
a feedforward phase in which the gate voltage of the amplifier transistor is fully compensated by a voltage on the integration capacitor through the feedback capacitor.

US Pat. No. 10,462,396

IMAGING DEVICE

PANASONIC INTELLECTUAL PR...

1. An imaging device comprising:a pixel;
a signal line electrically connected to the pixel;
a first sample-and-hold circuit electrically connected to the signal line; and
a second sample-and-hold circuit electrically connected to the signal line, wherein
the pixel includes
a photoelectric converter that generates signal charge by photoelectric conversion,
a charge accumulation region that accumulates the signal charge,
a reset transistor that resets a voltage of the charge accumulation region to a reference voltage, and
an amplifier transistor that amplifies a signal voltage corresponding to an amount of the signal charge accumulated in the charge accumulation region to output to the signal line,
the first sample-and-hold circuit includes
a first switch that is electrically connected to the signal line and has input-output characteristics in which an output is linear with respect to an input up to a clipping voltage and the output is clipped at the clipping voltage with respect to the input exceeding the clipping voltage, and
a first capacitor electrically connected to the signal line through the first switch, and
the second sample-and-hold circuit includes
a second switch that is electrically connected to the signal line and has input-output characteristics in which an output is linear with respect to an input, and
a second capacitor electrically connected to the signal line through the second switch.

US Pat. No. 10,462,394

DIGITAL PIXEL IMAGER WITH DUAL BLOOM STORAGE CAPACITORS AND CASCODE TRANSISTORS

RAYTHEON COMPANY, Waltha...

1. An integration capacitor network for connection to a photo-current source, the network comprising:an input;
a first path connected between the input and a reset voltage, the first path including a first integration capacitor and a first cascode transistor, the first cascode transistor coupled between the input and the first integration capacitor; and
a second path connected between the input and the reset voltage, the second path including a second integration capacitor and a second cascode transistor, the second cascode transistor coupled between the input and the second integration capacitor;
wherein gates of the first and second cascode transistors are connected to a reference voltage; and
wherein charge is accumulated on the first integration capacitor until a voltage on the first integration capacitor exceeds the reference voltage and then charge is accumulated on the second integration capacitor.

US Pat. No. 10,462,393

SOLID STATE IMAGE SENSOR AND IMAGE-CAPTURING DEVICE

NIKON CORPORATION, Tokyo...

27. A solid-state image sensor, comprising:a plurality of pixels, each including a photoelectric conversion unit and a charge accumulating unit that accumulates an electric charge from the photoelectric conversion unit;
a connection unit that is disposed between the charge accumulating units which are respectively included in the plurality of pixels and raises a capacitance value at the charge accumulating unit by electrically connecting to the charge accumulating unit; and
a control unit that disconnects the charge accumulating unit and the connection unit from each other so that the capacitance value is lowered on condition that a quantity of the electric charge generated at the photoelectric conversion unit is a first electric charge quantity, and connects the charge accumulating unit and the connection unit with each other so that the capacitance value is raised on condition that the quantity of the electric charge generated at the photoelectric conversion unit is a second electric charge quantity greater than the first electric charge quantity.

US Pat. No. 10,462,392

IMAGE SENSOR AND IMAGE-CAPTURING DEVICE

NIKON CORPORATION, Tokyo...

1. An image sensor, comprising:a first readout circuit that reads out a first signal, being generated by an electric charge resulting from a photoelectric conversion, to a first signal line;
a first holding circuit that holds a voltage based on an electric current from a power supply circuit; and
a first electric current source that supplies the first signal line with an electric current generated by the voltage held in the first holding circuit, wherein:
the first holding circuit holds the voltage based on the electric current from the power supply circuit when the first signal is not read out to the first signal line by the first readout circuit.

US Pat. No. 10,462,391

DARK-FIELD INSPECTION USING A LOW-NOISE SENSOR

KLA-Tencor Corporation, ...

1. A method of inspecting a sample using an image sensor and an analog-to-digital converter (ADC), the image sensor including multiple pixels disposed in at least one column and an output sensing node, the ADC being configured to convert analog output signals on said output sensing node into corresponding digital image data values, the method comprising:driving the image sensor such that a plurality of analog image data values are generated in the multiple pixels, each said analog image data value corresponding to a radiation portion directed onto said multiple pixels from a corresponding region of the sample, said driving including systematically transferring said analog image data values along said at least one column from said multiple pixels to said output sensing node while translating said sample relative to said image sensor such that each said analog image data value is shifted from a first said pixel to a second said pixel in said at least one column in coordination with said corresponding region of the sample, whereby said each analog image data value is influenced by said corresponding radiation portion from said corresponding region during a first time period when said each analog image data value is in said first pixel, and said each analog image data value is influenced by said corresponding radiation portion during a second time period when said each analog image data value is in said second pixel, and wherein said systematically transferring is performed such that said output sensing node stores charge values determined by said systematically transferred analog image data values and generates said analog output signals in accordance with said stored charge values, wherein driving the image sensor further includes periodically resetting the output sensing node to an initial charge value according to a reset clock signal; and
controlling the ADC to sequentially convert one or more of said analog output signals generated on said output sensing node into two or more said corresponding digital data values during each cycle of said reset clock signal.

US Pat. No. 10,462,390

IMAGE PICKUP APPARATUS, IMAGE PICKUP METHOD, PROGRAM, AND IMAGE PROCESSING APPARATUS

SONY CORPORATION, Tokyo ...

1. An image pickup apparatus, comprising:an image pickup device configured to divide one frame period corresponding to a predetermined frame rate into 3 or more sub-frame periods and generate, via a rolling shutter, in the one frame period, sub-frame images in a number corresponding to the number of sub-frame periods; and
processing circuitry configured to
turn on an infrared light source that irradiates infrared light onto an image pickup range in a time length unit that is equal to at least one of the 3 or more sub-frame periods in the one frame period, and
generate a color image at the predetermined frame rate based on an infrared image which is based on a first sub-frame image of the sub-frame images in which the infrared light is irradiated during a first exposure time of the first sub-frame image and a visible image which is based on a second sub-frame image of the sub-frame images in which the infrared light is not irradiated in a second exposure time of the second sub-frame image, wherein
the processing circuitry is configured to convert luminance information of the infrared image and color difference information of the visible image into color information of the color image based on luminance information of the visible image.
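
One plausible reading of the conversion step, offered only as an illustration, is to take luma from the infrared sub-frame and chroma from the visible sub-frame, with the visible luma used to rescale the chroma. The gain formula and clipping below are assumptions, not Sony's method.

    import numpy as np

    def fuse_ir_visible(y_ir, y_vis, cb_vis, cr_vis, eps=1e-3):
        """All inputs are float arrays of the same shape, values in [0, 1]."""
        y_out = y_ir                                       # luminance from the IR image
        gain = np.clip(y_ir / (y_vis + eps), 0.0, 4.0)     # visible luma drives the scaling
        cb_out = np.clip((cb_vis - 0.5) * gain + 0.5, 0.0, 1.0)   # chroma from the visible image
        cr_out = np.clip((cr_vis - 0.5) * gain + 0.5, 0.0, 1.0)
        return y_out, cb_out, cr_out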

US Pat. No. 10,462,389

METHOD AND APPARATUS FOR USING A REGIONALLY SPECIFIC SPECTRAL MODEL TO IDENTIFY THE SPECTRAL BANDS FOR USE FOR ISOLATING REMOTELY SENSED MATERIALS OF INTEREST

THE BOEING COMPANY, Chic...

1. A method of collecting spectral data of a region of interest with a sensor, comprising:(a) generating a simulated spectral representation of a region of interest, the simulated spectral representation comprising:
a plurality of geospatial portions at least partially disposed in the region of interest, each geospatial portion having fused spectral characteristics of a plurality of materials disposed in the respective geospatial portion;
(b) identifying at least one of the plurality of materials as a material of interest within the region of interest;
(c) identifying other of the plurality of materials not identified as the material of interest as background materials within the region of interest;
(d) selecting a subset spectral portion of the spectral data according to the simulated spectral representation of the material of interest and the simulated spectral representation of the background materials within the region of interest; and
(e) configuring the sensor to collect the subset spectral portion of the spectral data.

US Pat. No. 10,462,388

METHOD OF SHUTTERLESS NON-UNIFORMITY CORRECTION FOR INFRARED IMAGERS

1. A method of correcting an infrared image, the method comprising:providing a processor;
receiving the infrared image from a camera, the infrared image comprising a plurality of pixels arranged in an input image array, a first pixel in the plurality of pixels having a first pixel value and one or more neighbor pixels with one or more neighbor pixel values, wherein the first pixel and the one or more neighbor pixels are associated with an object in the infrared image, the one or more neighbor pixels being adjacent to the first pixel in the input image array;
processing the infrared image to generate a processed image;
determining, by the processor, whether the processed image is a first frame;
in the event that the processed image is the first frame:
initializing a correction table to an initialized correction table;
initializing a mean image counter to zero;
determining, by the processor, whether the camera is moving;
in the event that the camera is moving:
determining whether a motion of the camera is greater than or equal to a motion threshold;
in the event that the motion of the camera is greater than or equal to the motion threshold:
updating the correction table using a motion-based algorithm; and
providing an output image based on the updated correction table using the motion-based algorithm and the processed image.
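
The method is essentially a small state machine: initialize on the first frame, estimate camera motion, and only update the correction table when the motion exceeds a threshold. The skeleton below follows that flow; the frame-difference motion proxy and the running-mean table update are placeholders, not the claimed motion-based algorithm.

    import numpy as np

    class ShutterlessNUC:
        def __init__(self, motion_threshold=1.0):
            self.correction = None            # correction table
            self.mean_count = 0               # mean image counter
            self.prev = None
            self.motion_threshold = motion_threshold

        def _motion(self, frame):
            if self.prev is None:
                return 0.0
            return float(np.mean(np.abs(frame - self.prev)))   # crude motion proxy

        def process(self, frame):
            frame = frame.astype(np.float32)
            if self.correction is None:                        # first frame
                self.correction = np.zeros_like(frame)         # initialize correction table
                self.mean_count = 0                            # initialize mean image counter
            if self._motion(frame) >= self.motion_threshold:   # camera moving enough
                # Placeholder update: fold this frame's offset estimate into the table.
                self.mean_count += 1
                offset = frame - np.mean(frame)
                self.correction += (offset - self.correction) / self.mean_count
            self.prev = frame
            return frame - self.correction                     # output image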

US Pat. No. 10,462,387

INFRARED IMAGING DEVICE, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING PROGRAM

FUJIFILM Corporation, To...

1. An infrared imaging device comprising:an imaging element comprising a plurality of infrared detection pixels which are two-dimensionally arranged;
a diaphragm that is provided closer to an object than the imaging element;
a temperature detection unit, including a sensor that detects a temperature of the diaphragm; and
a processor configured to subtract, from at least a portion of each of a plurality of captured image data obtained by capturing images with the imaging element in a state in which an F-number of the diaphragm is set to a plurality of values, a signal value corresponding to an amount of infrared rays which are radiated from the diaphragm and are based on the F-number when each of the plurality of captured image data is acquired and the temperature detected by the temperature detection unit, and combine the plurality of captured image data after the subtraction to generate composite image data.

US Pat. No. 10,462,386

VIRTUAL FOCUS FEEDBACK

Microsoft Technology Lice...

1. An apparatus comprising:a display;
a camera; and
processing logic in communication with the display and the camera, the processing logic configured to:
receive image data associated with the camera;
determine a degree to which the camera is in focus based on the image data;
blur a visible base image having content that is independent of content in the image data from the camera to generate a proxy image that has a same degree of blurring across an entirety of the proxy image that inversely correlates with the degree to which the camera is in focus;
display the proxy image on the display; and
instruct a user to adjust a camera focus mechanism to better focus the proxy image,
wherein the processing logic receives updated image data associated with the camera after displaying the proxy image, determines a new degree to which the camera is in focus, modifies the degree of blurring of the proxy image to inversely correlate with the new degree to which the camera is in focus, and displays an updated proxy image based on the modified degree of blurring.
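
The key relationship is that the proxy image's uniform blur varies inversely with how well the camera is focused. The sketch below uses a gradient-energy sharpness score and a Gaussian blur as stand-ins for whatever focus metric and blur the actual implementation uses.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def focus_degree(camera_gray):
        # Crude sharpness score squashed into [0, 1); higher means better focus.
        gy, gx = np.gradient(camera_gray.astype(np.float32))
        energy = float(np.mean(gx * gx + gy * gy))
        return energy / (energy + 1.0)

    def make_proxy(base_gray, camera_gray, max_sigma=8.0):
        sigma = max_sigma * (1.0 - focus_degree(camera_gray))   # blur inversely tracks focus
        return gaussian_filter(base_gray, sigma=sigma)          # same blur across the whole proxy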

US Pat. No. 10,462,385

DISPLAY APPARATUS FOR VEHICLE

HONDA MOTOR CO., LTD., T...

1. A display apparatus for a vehicle, the apparatus comprising: an image capture device configured to capture an image of surroundings of the vehicle to acquire a captured image;
a display device configured to display images including the captured image;
a display controller configured to cause the display device to display the images in a plurality of display modes which include a first display mode and a second display mode; and
a plurality of switches including a first switch and a second switch and each configured to allow the display controller to switch each of the first and second display modes between an ON state and an OFF state, thereby selectively displaying the images of the corresponding display mode,
wherein the display controller is configured to perform a successive interruption process which successively switches the images to be displayed in response to a successive operation of two or more of the corresponding switches to be the ON state such that the image of the second display mode is displayed by interrupting the first display mode when the second display mode is turned on during the display of the image of the first display mode and that the image of the first display mode is displayed by interrupting the second display mode when the first display mode is turned on during the display of the image of the second display mode, and
wherein when any one of the first and second switches is switched from the ON state to the OFF state in the successive interruption process, the display controller causes the display device to display an image which was displayed immediately before the successive interruption process.

US Pat. No. 10,462,384

APPARATUS AND METHODS FOR THE STORAGE OF OVERLAPPING REGIONS OF IMAGING DATA FOR THE GENERATION OF OPTIMIZED STITCHED IMAGES

GoPro, Inc., San Mateo, ...

1. A computerized apparatus configured to re-stitch a stitched image, the computerized apparatus comprising:processor apparatus configured to execute a plurality of computer readable instructions; and
a non-transitory computer-readable medium comprising a storage apparatus in data communication with the processor apparatus and comprising at least one computer program, the at least one computer program comprising a plurality of instructions which are configured to, when executed by the processor apparatus, cause the computerized apparatus to:
obtain at least a portion of the stitched image, the stitched image being associated with a first stitching quality and a first image projection;
determine a desired re-stitch line based on metadata related to a stitch area, the stitch area associated with the at least portion of the stitched image, the re-stitch line being different from a first stitch line associated with the at least portion of the stitched image;
retrieve one or more overlapping portions of image data corresponding to the stitch area; and
generate a second image projection of the one or more overlapping portions based at least on a re-stitch of the stitch area using the re-stitch line, the second image projection comprising a second stitching quality that is greater than the first stitching quality.

US Pat. No. 10,462,383

SYSTEM AND METHOD FOR ACQUIRING VIRTUAL AND AUGMENTED REALITY SCENES BY A USER

Dropbox, Inc., San Franc...

1. A method comprising:receiving, from a first client device:
a plurality of images of a location, and
device orientation information for the plurality of images;
compositing, using the device orientation information, the plurality of images into a spherical spatial image scene by spatially organizing the plurality of images into the spherical spatial image scene based on the device orientation information; and
providing the spherical spatial image scene to a second client device, wherein the spherical spatial image scene is explorable by rotating or changing a direction of the second client device.

US Pat. No. 10,462,382

SINGLE-MODALITY-BASED VISUAL DISTINGUISHING OF MEDICAL INTERVENTION DEVICE FROM TISSUE

KONINKLIJKE PHILIPS N.V.,...

1. An imaging apparatus configured for imaging an intervention device and body tissue surrounding said device using a single imaging modality, said apparatus comprising:image acquisition and formation circuitry configured to generate a plurality of first images and a plurality of second images of a target area of a subject from a single imaging modality,
wherein the plurality of first images and the plurality of second images are each generated by imaging from different acoustic windows,
wherein individual images of the plurality of first images and individual images of the plurality of second images are generated in an alternating manner,
wherein the plurality of first images are obtained using one or more first parameter values specific for imaging an interventional device in the target area,
wherein the plurality of second images are obtained using one or more second parameter values specific for imaging tissue within the target area such that the image acquisition circuitry toggles back and forth between using the one or more first parameter values and the one or more second parameter values; and
image combination circuitry configured for forming a plurality of combined images of the interventional device surrounded by the tissue using the individual images of the plurality of first images and the individual ones of the plurality of second images, wherein forming comprises:
segmenting at least a portion of the individual ones of the plurality of first images, the segmented portion includes the interventional device and its surrounding boundary area; and
overlapping the segmented portion onto the individual ones of the plurality of second images to form the plurality of combined images, the plurality of combined images comprising the tissue, the interventional device, and the boundary area.

US Pat. No. 10,462,381

SYSTEM AND METHOD FOR PROCESSING A VIDEO SIGNAL WITH REDUCED LATENCY

Freedom Scientific, Inc.,...

1. A camera system with reduced latency for use by a blind or low vision (“BLV”) user, the system being used to highlight selected lines of textual material, the system comprising:an adjustably mounted high definition camera, the camera being adjustable to view lines of textual material placed beneath the camera, the camera generating a corresponding video signal of the textual material, the video signal comprising a series of frames, with each frame comprising an array of pixels;
a field programmable gate array (FPGA) in electrical communication with the camera, the FPGA storing an associated look-up table for use in processing the video signal on a pixel by pixel basis, the look-up table specifying the portion of the array to be shaded and the degree of translucence to be applied, with the translucence being such that the underlying textual material is not masked, and wherein the shaded portions of the array function to highlight the non-shaded portion of the array and thereby facilitate viewing of the selected lines of the textual material by the BLV user, the FPGA being characterized by the absence of a video buffer;
a monitor for displaying the video signal with the shaded textual material.
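
A software analogue of the per-pixel shading the FPGA look-up table encodes: rows outside a clear band are darkened with a translucent shade so the selected line of text is highlighted but nothing is fully masked. The alpha value and band geometry below are illustrative assumptions.

    import numpy as np

    def apply_highlight(frame, band_top, band_bottom, shade_alpha=0.5):
        """frame: H x W x 3 uint8; rows in [band_top, band_bottom) are left clear."""
        out = frame.astype(np.float32)
        alpha = np.full((frame.shape[0], 1, 1), shade_alpha, dtype=np.float32)
        alpha[band_top:band_bottom] = 0.0          # no shading over the selected line
        out *= (1.0 - alpha)                       # darken, but do not mask, the rest
        return out.astype(np.uint8)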

US Pat. No. 10,462,379

DRIVING METHOD FOR HIGH DYNAMIC RANGE DISPLAY SCREEN, DRIVING APPARATUS AND DISPLAY SCREEN

BOE TECHNOLOGY GROUP CO.,...

1. A driving method of a high dynamic range display screen, comprising:acquiring image data to be displayed;
determining a local backlight brightness of each region in a backlight module according to the image data to be displayed;
adjusting pixel brightness corresponding to the image data to be displayed according to the local backlight brightness and a maximum backlight brightness in the backlight module; and
carrying out display according to the adjusted brightness and the determined local backlight brightness of each region;
wherein the adjusting the brightness of the image data to be displayed according to the local backlight brightness and the maximum backlight brightness in the backlight module comprises:
determining original pixel brightness according to the image data to be displayed; and
adjusting the original pixel brightness according to the local backlight brightness and the maximum backlight brightness in the backlight module;
wherein the adjusting the original pixel brightness according to the local backlight brightness and the maximum backlight brightness in the backlight module comprises:
adjusting the original pixel brightness by adopting a formula as follows:

where Y0 is the original pixel brightness, Y is the adjusted pixel brightness, BLmax is a maximum backlight brightness in the backlight module, BLDB is the local backlight brightness of the backlight module, and “a” is an adjustable parameter.
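
The formula itself is not reproduced in this listing, so the stand-in below only shows the shape of such a compensation using the same quantities: the original pixel brightness is boosted when the local backlight BLDB is dimmed relative to the maximum BLmax, with "a" as an adjustable exponent. It is an assumption for illustration, not the claimed formula.

    def adjust_pixel_brightness(y0, bl_max, bl_db, a=1.0):
        # Assumed compensation using Y0, BLmax, BLDB and the adjustable parameter a.
        y = y0 * (bl_max / bl_db) ** a
        return min(y, 1.0)                         # keep the result in the displayable range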

US Pat. No. 10,462,378

IMAGING APPARATUS

Toyota Jidosha Kabushiki ...

1. An imaging apparatus comprising:an imager configured to image surroundings of a moving body, and configured to obtain a surrounding image;
a position detector configured to detect a position of the moving body;
a map device configured to store therein road map information;
a setting device configured to estimate an area corresponding to a road in the obtained surrounding image on the basis of the detected position and the road map information, and configured to set a photometric area including at least a part of the estimated area;
a calculator configured to calculate an exposure condition of said imager on the basis of image information in the set photometric area in the obtained surrounding image; and
a controller programmed to control said imager on the basis of the calculated exposure condition.

US Pat. No. 10,462,376

EXPOSURE METHOD IN PANORAMIC PHOTO SHOOTING AND APPARATUS

HUAWEI TECHNOLOGIES CO., ...

1. An exposure method in panoramic photo shooting, comprising:obtaining, by a terminal device, luminance values respectively corresponding to one or more pictures shot in advance and a luminance value of a current environment in which shooting is to be performed;
obtaining a luminance value adjustment coefficient ? and adjustment weights ?i respectively corresponding to the one or more pictures shot in advance, wherein i is a positive integer;
obtaining, by the terminal device according to the luminance values respectively corresponding to the one or more pictures shot in advance and the luminance value of the current environment in which shooting is to be performed, a luminance value of a picture to be shot, wherein the luminance value of the picture to be shot is obtained using a difference between the luminance value of the current environment in which shooting is to be performed and a deviation coefficient that is based at least in part on the luminance value adjustment coefficient ? and the adjustment weight ?i; and
performing, by the terminal device, exposure according to the luminance value of the picture to be shot.
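
The claim only requires that the target luminance be the current-environment luminance minus a deviation built from the adjustment coefficient and the per-picture weights. The deviation below, a coefficient-scaled weighted difference against the previously shot pictures, is one assumed form chosen purely for illustration.

    def target_luminance(env_luminance, prev_luminances, weights, coeff):
        deviation = coeff * sum(w * (env_luminance - prev)
                                for w, prev in zip(weights, prev_luminances))
        return env_luminance - deviation           # luminance value used for the exposure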

US Pat. No. 10,462,375

EXTERIOR VIEWING CAMERA MODULE FOR VEHICLE VISION SYSTEM

MAGNA ELECTRONICS INC., ...

1. A camera module for a vehicle, said camera module comprising:a housing that houses an imager assembly and a main circuit board;
wherein the imager assembly comprises (i) an imager disposed on an imager circuit board, (ii) a lens barrel accommodating at least one lens, (iii) a lens holder, and (iv) a flexible ribbon cable extending from the imager circuit board, said flexible ribbon cable terminating at a terminator portion;
wherein said housing comprises an upper cover and a lower cover, and wherein the upper cover and the lower cover are joined together;
wherein said terminator portion of said flexible ribbon cable comprises a first electrical connector;
wherein said main circuit board comprises a printed circuit board (PCB) having a first planar side and an opposing second planar side separated from the first planar side by a thickness dimension of the PCB of said main circuit board;
wherein circuitry disposed at the PCB of said main circuit board comprises (i) first circuitry disposed at the first planar side of the PCB of said main circuit board and (ii) second circuitry disposed at the second planar side of the PCB of said main circuit board;
wherein circuitry disposed at the PCB of said main circuit board comprises an image processor;
wherein said image processor is operable for processing image data captured by the imager;
wherein the PCB of said main circuit board comprises a second electrical connector, and wherein said second electrical connector of the PCB of said main circuit board is configured for connecting with said first electrical connector at the terminator portion of said flexible ribbon cable;
wherein said first electrical connector at the terminator portion of said flexible ribbon cable electrically connects with said second electrical connector of the PCB of said main circuit board;
wherein, with the imager assembly operated to capture image data and with said first electrical connector at the terminator portion of said flexible ribbon cable connected with said second electrical connector of the PCB of said main circuit board, image data captured by the imager of the imager assembly is provided via said flexible ribbon cable to circuitry disposed at the PCB of said main circuit board;
wherein said housing, with said upper cover and said lower cover joined together, comprises a front portion and a rear portion;
wherein said front portion is in front of said rear portion;
wherein said main circuit board is accommodated within said front and rear portions;
wherein the imager is accommodated within said rear portion and is not accommodated within said front portion of said housing;
wherein said housing of said camera module has breadth and length, and wherein the PCB of said main circuit board extends across the breadth of said housing and along the length of said housing;
wherein said rear portion of said housing has height;
wherein said front portion of said housing has height;
wherein the maximum height dimension of said rear portion of said housing is greater than the maximum height dimension of said front portion of said housing;
wherein the imager assembly is attached at an inner surface of the upper cover at said rear portion of said housing;
wherein the upper cover at said rear portion of said housing comprises an opening, and wherein the imager views to exterior of said housing via said opening;
wherein the lens barrel extends through said opening to protrude outside said housing;
wherein the lens barrel of the imager assembly is tilted at an acute angle upward relative to the plane of the PCB of said main circuit board; and
wherein the first circuitry disposed at the first planar side of the PCB of said main circuit board comprises an electrical socket connector configured for electrical connection to a plug connector of a vehicular wire harness.

US Pat. No. 10,462,374

ZOOMING CONTROL APPARATUS, IMAGE CAPTURING APPARATUS AND CONTROL METHODS THEREOF

Canon Kabushiki Kaisha, ...

1. A zooming control apparatus comprising:an object detection unit configured to detect an object from an image;
a first acquisition unit configured to acquire information regarding a distance to the object; and
a zooming control unit configured to perform one of a first zooming control for automatically changing a zoom magnification according to first information that includes information regarding a size of the object detected by the object detection unit and a second zooming control, that is different from the first zooming control, for automatically changing a zoom magnification according to second information regarding the distance to the object acquired by the first acquisition unit,
wherein the zooming control unit determines whether to perform the first zooming control or the second zooming control in accordance with whether or not the object is a face.

US Pat. No. 10,462,373

IMAGING DEVICE CONFIGURED TO CONTROL A REGION OF IMAGING

CASIO COMPUTER CO., LTD.,...

1. An imaging device comprising:an imaging unit including an image sensor;
a display unit including a display panel; and
a control unit including a CPU,
wherein the control unit is configured to:
make the display unit successively display an entire image that is successively taken by the imaging unit as a live-view image;
during display of the entire image as the live-view image, set a cutting region at a predetermined position in an imaging range of the imaging unit;
displace a position of the cutting region in the imaging range so as to track movement of a specific subject in the imaging range;
fix the position of the cutting region in the imaging range before a first operation is performed, and when the first operation is performed, displace the position of the cutting region so as to track movement of the specific subject in the imaging range;
at least before the first operation is performed, display an image at a region other than the cutting region in the entire image that is successively displayed on the display unit as the live-view image so that the image at the region other than the cutting region is displayed in a predetermined display mode so as to be relatively less prominent than a partial image corresponding to the cutting region;
before the first operation is performed, allow at least one of a size and a shape of the cutting region to be changed while fixing the position of the cutting region relative to the imaging range; and
when the first operation is performed, displace the position of the cutting region while fixing the size and the shape of the cutting region.
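
A compact sketch of the crop-region behaviour: the region's position stays fixed until the first operation, and afterwards it follows the detected subject while its size and shape stay locked. The subject detector is a hypothetical callable returning a centre point.

    class CuttingRegion:
        def __init__(self, x, y, w, h):
            self.x, self.y, self.w, self.h = x, y, w, h
            self.tracking = False

        def start_tracking(self):            # the "first operation"
            self.tracking = True             # size and shape are fixed from here on

        def update(self, detect_subject, frame_w, frame_h):
            if not self.tracking:
                return                       # position stays fixed before the first operation
            cx, cy = detect_subject()        # hypothetical subject detector
            self.x = min(max(0, int(cx - self.w / 2)), frame_w - self.w)
            self.y = min(max(0, int(cy - self.h / 2)), frame_h - self.h)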

US Pat. No. 10,462,372

IMAGING DEVICE, IMAGING SYSTEM, AND IMAGING METHOD

SONY CORPORATION, Tokyo ...

1. A display control system for a vehicle having a camera comprising:a distortion correcting unit configured to correct a left side region distortion and a right side region distortion of a first wide-angle image derived from a captured image captured by a single camera of the vehicle with a wide angle lens, the first wide-angle image including the left side region distortion and the right side region distortion based on the wide angle lens;
an image generating unit configured to generate a second wide-angle image derived from the captured image, the second wide-angle image including a left-directional image, a right-directional image and a one-directional image based on the first wide-angle image; and
a control unit configured to selectively display either (a) the first wide-angle image with distortion or (b) the second wide-angle image;
wherein the left-directional image in the second wide-angle image is generated based on the distortion-corrected left side region of the first wide-angle image and is positioned at the left side in the display of the second wide angle image;
wherein the right-directional image in the second wide-angle image is generated based on the distortion-corrected right side region of the first wide-angle image and is positioned at the right side in the display of the second wide angle image; and
the display on a display device in the vehicle of the first wide angle image with distortion is changed over to display on the display device the second wide angle image in response to a selection input.

US Pat. No. 10,462,371

IMAGING APPARATUS AND IMAGING METHOD FOR COMPARING A TEMPLATE IMAGE WITH A MONITORING IMAGE

RICOH COMPANY, LTD., Tok...

1. An imaging method that records an image of a subject shot by an imaging device in a recording medium as image data, the method comprising:displaying the image data;
receiving a user selection of an image from among images previously captured and stored on the recording medium;
controlling a display of a monitoring image of the subject together with the selected image after a transparency of the selected image is changed and until the monitoring image is recorded as the image data; and
in response to a determination that a shutter button of the imaging device is half-pressed, terminating display of the selected image so that only the monitoring image is displayed, wherein the method further comprises
displaying plural images and at least one motion picture previously captured and stored on the recording medium, and
in response to receiving a user selection of the motion picture, displaying an error message indicating that the motion picture cannot be selected.

US Pat. No. 10,462,370

VIDEO STABILIZATION

Google LLC, Mountain Vie...

1. A computer-implemented method, comprising:receiving, by a computing system, a series of frames of a video captured by a recording device using an optical image stabilization (OIS) system;
receiving, by the computing system, (i) OIS position data indicating positions of the OIS system during capture of the series of frames and (ii) device position data indicating positions of the recording device during capture of the series of frames;
determining, by the computing system, a first transformation for a particular frame in the series of frames, the first transformation being determined based on the OIS position data for the particular frame and the device position data for the particular frame;
determining, by the computing system, a set of camera positions occurring over a set of multiple frames based on the OIS position data and the device position data, the set of multiple frames including one or more frames before the particular frame and one or more frames after the particular frame;
applying, by the computing system, a filter to the set of camera positions;
determining, by the computing system, a second transformation for the particular frame based on the first transformation and positions of the recording device determined, based on the device position data, for one or more frames in the series of frames that are captured after the particular frame, wherein the second transformation is determined based on a camera position determined based on applying the filter to the set of camera positions; and
generating, by the computing system, a stabilized version of the particular frame using the second transformation.
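
The claim derives a smoothed camera position from a window that includes frames captured after the particular frame. A minimal sketch under assumptions the patent does not state: camera position reduced to a single yaw angle per frame and a Gaussian-weighted low-pass filter standing in for the claimed filter.

import numpy as np

def smooth_camera_positions(positions, center, radius=4, sigma=2.0):
    """Gaussian-weighted average of camera positions in a window that includes
    frames before and after `center` (illustrative filter only)."""
    lo = max(0, center - radius)
    hi = min(len(positions), center + radius + 1)
    idx = np.arange(lo, hi)
    weights = np.exp(-0.5 * ((idx - center) / sigma) ** 2)
    weights /= weights.sum()
    return float(np.dot(weights, positions[lo:hi]))

# Example: per-frame camera yaw angles (OIS and gyro data fused upstream), in degrees.
yaw = np.array([0.0, 0.4, 0.9, 1.1, 0.8, 1.5, 2.0, 1.9, 2.3, 2.5])
stabilized_yaw = smooth_camera_positions(yaw, center=5)
# A second transformation would rotate frame 5 by the difference below.
print(stabilized_yaw, stabilized_yaw - yaw[5])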

US Pat. No. 10,462,369

ROUTING OF TRANSMISSION MEDIA THROUGH ROTATABLE COMPONENTS

GoPro, Inc., San Mateo, ...

1. A gimbal assembly for use with an image capturing device, the gimbal assembly comprising:a motor assembly;
a first housing defining an internal compartment configured and dimensioned to receive the motor assembly, the first housing including an arm extending outwardly therefrom and a first guide configured and dimensioned to support transmission media adapted to communicate electrical and/or digital signals; and
a second housing including an arm extending outwardly therefrom, at least one of the arms of the first and second housings being configured and dimensioned to directly or indirectly support the image capturing device, the second housing being mechanically connected to the motor assembly such that actuation of the motor assembly causes relative rotation between the first and second housings about an axis of rotation, the second housing defining a channel configured and dimensioned to receive the first guide such that the first guide extends into the second housing through the channel, the transmission media being supported on the first guide such that the first guide routes the transmission media from the first housing into the second housing.

US Pat. No. 10,462,368

TEMPORAL FILTERING BASED ON MOTION DETECTION BETWEEN TEMPORALLY NON-ADJACENT PICTURES

Ambarella, Inc., Santa C...

1. A method for temporal filtering based on motion detection between non-adjacent pictures comprising the steps of:computing a motion score by motion detection between a target area in a target picture and a temporally non-adjacent first area in a non-adjacent one of a plurality of reference pictures;
generating a temporal filter strength based on the motion score, wherein said temporal filter strength is determined based upon said motion score and a blending function that is nonlinear over a range of values of said motion score; and
temporal filtering said target area with a second area in an adjacent one of said reference pictures based on said temporal filter strength to generate a filtered area in a filtered picture, wherein said target area, said first area, and said second area are spatially co-located and at least one of (i) said motion score and (ii) said generation of said filtered area is controlled by one or more gain settings in a circuit.
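
The claim maps a motion score to a temporal filter strength through a blending function that is nonlinear over the score range, then blends the target area with a co-located reference area. A minimal sketch assuming a sigmoid-shaped curve and simple alpha blending; both are stand-ins, since the patent text above does not disclose the exact curve.

import numpy as np

def filter_strength(motion_score, midpoint=32.0, steepness=0.2):
    """Nonlinear (sigmoid) mapping: low motion -> strong filtering,
    high motion -> weak filtering. Illustrative only."""
    return 1.0 / (1.0 + np.exp(steepness * (motion_score - midpoint)))

def temporal_filter(target_area, reference_area, motion_score):
    alpha = filter_strength(motion_score)      # blend weight for the reference area
    return (1.0 - alpha) * target_area + alpha * reference_area

target = np.full((8, 8), 120.0)
reference = np.full((8, 8), 100.0)
sad = float(np.abs(target - reference).mean())   # crude motion score (mean absolute difference)
filtered = temporal_filter(target, reference, sad)
print(sad, filtered[0, 0])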

US Pat. No. 10,462,367

IMAGE CAPTURE HAVING TEMPORAL RESOLUTION AND PERCEIVED IMAGE SHARPNESS

GVBB HOLDINGS S.A.R.L., ...

1. A camera system for generating an image with improved image sharpness by reducing motion blur of an object moving in the image, the camera system comprising:an image sensor comprising an array of pixels configured to accumulate an electrical charge representative of an image captured during a frame;
a pixel output sampler configured to sample a pixel output for at least one pixel of the image sensor during a beginning portion of the frame, at least one intermediate portion of the frame, and at an end portion of the frame;
a full frame exposure output configured to generate a full frame pixel output based on a difference between the sampled pixel output at the end portion of the frame and the sampled pixel output at the beginning portion of the frame;
an intermediate frame exposure output configured to generate an intermediate exposure pixel output based on the sampled pixel output at the at least one intermediate portion of the frame;
a detail processor configured to generate a detail correction signal from the generated intermediate exposure pixel output; and
an image signal output module configured to apply the generated detail correction signal to the full frame pixel output to produce an enhanced pixel output for generating a digital video output for the captured image with reduced motion blur and judder.
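
A minimal per-pixel sketch of the exposure arithmetic in this claim: the full-frame output is the end sample minus the beginning sample, and a detail correction is derived from the shorter intermediate exposure, which sees less motion blur. The high-pass detail extractor and additive correction below are assumptions for illustration, not the patent's circuit.

import numpy as np

def enhanced_output(begin_sample, mid_sample, end_sample, gain=0.5):
    """Full-frame exposure from the end-begin difference, plus a detail
    correction derived from the shorter intermediate exposure."""
    full = end_sample - begin_sample              # full-frame pixel output
    intermediate = mid_sample - begin_sample      # shorter exposure, less blur
    # Crude high-pass detail signal: intermediate minus its 3x3 local mean.
    k = np.ones((3, 3)) / 9.0
    pad = np.pad(intermediate, 1, mode="edge")
    h, w = intermediate.shape
    local_mean = sum(pad[i:i + h, j:j + w] * k[i, j]
                     for i in range(3) for j in range(3))
    detail = intermediate - local_mean
    return full + gain * detail

rng = np.random.default_rng(0)
begin = np.zeros((6, 6))
mid = rng.random((6, 6)) * 100
end = mid + rng.random((6, 6)) * 100
print(enhanced_output(begin, mid, end).shape)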

US Pat. No. 10,462,366

AUTONOMOUS DRONE WITH IMAGE SENSOR

Alarm.com Incorporated, ...

1. A monitoring system that is configured to monitor a property, the monitoring system comprising:one or more sensors that are located throughout the property and that are configured to generate sensor data;
a drone that is configured to move throughout the property and generate additional sensor data;
a drone dock that is configured to receive the drone, wherein the drone is configured to continue generating the additional sensor data while the drone dock is receiving the drone;
an additional drone dock that is configured to receive the drone, wherein the drone is configured to continue generating the additional sensor data while the additional drone dock is receiving the drone; and
a monitor control unit that is configured to:
receive the sensor data, the additional sensor data, and data indicating whether the drone dock or the additional drone dock received the drone;
analyze the sensor data, the additional sensor data, and the data indicating whether the drone dock or the additional drone dock received the drone;
based on analyzing the sensor data, the additional sensor data, and the data indicating whether the drone dock or the additional drone dock received the drone, determine a status of the property; and
provide, for output, data indicating the status of the property.

US Pat. No. 10,462,365

LOW POWER SURVEILLANCE SYSTEM

HRL Laboratories, LLC, M...

1. A low power surveillance system, the system comprising:one or more processors and a memory, the memory having executable instructions encoded thereon, such that upon execution of the instructions, the one or more processors performs operations of:
receiving a series of frames from a camera, each frame having a background and a foreground;
generating a background template from the series of frames;
generating a foreground detection kernel;
receiving reduced image measurements of a new image frame of the scene, the new image frame having a background and a foreground;
detecting potential regions of interest (ROI) in the foreground of the new image frame;
determining initial region descriptors in the potential ROI in the foreground of the new image frame;
segmenting the initial region descriptors to generate a segmented region;
re-determining region descriptors from the segmented region;
determining a contiguous sparse foreground from the re-determined region descriptors by maximizing a smoothness and contiguous nature of a boundary of pixel regions of a set of foreground pixels, the contiguous sparse foreground being a contiguous ROI;
reconstructing the ROI using foveated compressive sensing to generate an image of an interesting object; and
combining the interesting object image with the background template to generate a reconstructed foreground;
wherein, in determining the contiguous sparse foreground, a contiguity-constrained compressive sensing (CCCS) model is applied that penalizes isolated pixels and favors contiguous ROIs according to the following:

where S=supp(e) denotes the support of a sparse component e and ‖S‖_TV ≜ Σ_{i,j}|(∇S)_{i,j}| denotes a total variation of S, Φ denotes a measurement matrix, y=Φd, where d denotes a single raw image,
approximating S by a function f(e) according to the following:

where f(e)=f1(e)−f2(e), where f1(e) penalizes e=0 and f2(e) penalizes e≠0.
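
The total-variation term above is what penalizes isolated foreground pixels and favors contiguous regions: it counts boundary transitions of the support mask S. A minimal sketch of that term only, assuming forward differences and a hard support mask; the full CCCS objective and the surrogate f(e) are not reproduced in this listing.

import numpy as np

def support_total_variation(e, tol=1e-6):
    """Anisotropic TV of the support mask S = supp(e):
    sum of |S[i+1,j]-S[i,j]| + |S[i,j+1]-S[i,j]| over the image.
    Isolated nonzero pixels contribute many boundary terms, so this
    penalty favors contiguous regions of support."""
    S = (np.abs(e) > tol).astype(float)
    dy = np.abs(np.diff(S, axis=0)).sum()
    dx = np.abs(np.diff(S, axis=1)).sum()
    return dx + dy

contiguous = np.zeros((8, 8)); contiguous[2:5, 2:5] = 1.0
scattered = np.zeros((8, 8)); scattered[::3, ::3] = 1.0
print(support_total_variation(contiguous))   # one compact region -> small TV
print(support_total_variation(scattered))    # many isolated pixels -> larger TV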

US Pat. No. 10,462,364

ELECTRONIC DEVICES HAVING MULTIPLE POSITION CAMERAS

Hewlett-Packard Developme...

1. An electronic device comprising:an input device to receive an input to identify boundaries of a scene, wherein the identified boundaries correspond to boundaries of a panoramic image, the panoramic image has an associated first field of vision, and the boundaries comprise a first boundary and a second boundary;
a multiple position camera having an optical axis and having an associated second field of vision less than the first field of vision, wherein the camera to acquire sub-images of the panoramic image, and the sub-images correspond to different parts of the scene; and
an actuator to move the camera based on the input to pan the optical axis of the camera across the scene to acquire the sub-images, wherein the panning of the camera comprises moving the camera such that the optical axis of the camera moves from a first position at which the camera acquires a first sub-image of the sub-images coinciding with the first boundary to a second position at which the camera acquires a second sub-image of the sub-images coinciding with the second boundary.

US Pat. No. 10,462,363

OPTICAL APPARATUS

CANON KABUSHIKI KAISHA, ...

1. An optical apparatus comprising a plurality of first image capturing optical systems facing different directions,wherein each of the first image capturing optical systems includes a reflective element configured to bend an optical path to an image sensor,
wherein optical paths in the first image capturing optical systems intersect with one another on an object side of each reflective element,
wherein each of the first image capturing optical systems includes, in order from the object side, a first lens unit having a negative refractive power, and a second lens unit having a positive refractive power, the second lens unit including the reflective element configured to bend the optical path to the image sensor, and
wherein in each of the first image capturing optical systems, a condition of 3.205≤L/f≤5.50 is satisfied, where L is a distance between the first lens unit and the second lens unit, and f is a focal length of a whole optical system.
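
A trivial check of the claimed condition 3.205≤L/f≤5.50; the lens spacing and focal length values below are illustrative, not taken from the patent.

def satisfies_condition(L_mm, f_mm, lower=3.205, upper=5.50):
    """True if the unit spacing L and whole-system focal length f
    satisfy the claimed range for L/f."""
    ratio = L_mm / f_mm
    return lower <= ratio <= upper, ratio

print(satisfies_condition(L_mm=18.0, f_mm=4.5))   # (True, 4.0)
print(satisfies_condition(L_mm=10.0, f_mm=4.5))   # (False, ~2.22)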

US Pat. No. 10,462,362

FEATURE BASED HIGH RESOLUTION MOTION ESTIMATION FROM LOW RESOLUTION IMAGES CAPTURED USING AN ARRAY SOURCE

FotoNation Limited, Galw...

1. A method for performing feature based high resolution motion estimation of a camera comprising a plurality of imagers in an imager array from a plurality of low resolution images captured by imagers in the imager array, comprising:performing feature detection using a processor configured by software to identify an initial location for at least one visual feature in a first plurality of low resolution images, where the first plurality of low resolution images includes one image captured by each of a plurality of imagers in an imager array from different perspectives at a first point in time;
performing feature detection using the processor configured by software to identify an initial location for the at least one visual feature in a second plurality of low resolution images, where the second plurality of low resolution images includes one image captured by each of the plurality of imagers in the imager array from different perspectives at a second point in time;
synthesizing a first set of high resolution image portions from the first plurality of low resolution images using the processor configured by software to perform a super-resolution process using parallax information, where the synthesized high resolution image portions contain the identified at least one visual feature;
synthesizing a second set of high resolution image portions from the second plurality of low resolution images using the processor configured by software to perform a super-resolution process using parallax information, where the synthesized high resolution image portions contain the identified at least one visual feature;
performing feature detection within the first and second sets of high resolution image portions to identify locations for the at least one visual feature at a higher resolution than the initial locations identified in the low resolution images using the processor configured by software; and
estimating camera motion using the identified locations for the at least one visual feature in the first and second sets of high resolution image portions using the processor configured by software.

US Pat. No. 10,462,360

IMAGE CAPTURING APPARATUS AND CONTROL METHOD UTILIZING AN IMAGE SENSOR WITH TWO-DIMENSIONAL PIXELS AND PIXEL SIGNAL STORING UNIT

Canon Kabushiki Kaisha, ...

1. An image capturing apparatus comprising:an image sensor including two-dimensionally arranged pixels each having a photoelectric conversion element which output first and second image signals and a signal storing unit which stores at least one frame of image signals that are output from the pixels;
an image processor which is provided outside of the image sensor and performs a predetermined image processing on the first and second image signals; and
a controller which controls
to transfer the first image signals from the image sensor to the image processor without storing in the signal storing unit of the image sensor, and controls to store the second image signals in the signal storing unit of the image sensor and transfer the second image signals stored in the signal storing unit of the image sensor to the image processor.

US Pat. No. 10,462,359

IMAGE COMPOSITION INSTRUCTION BASED ON REFERENCE IMAGE PERSPECTIVE

Adobe Inc., San Jose, CA...

1. In a digital medium environment to capture digital images, a method implemented by a computing device having an integrated camera device, the method comprising:capturing, by the camera device, a digital image of an object as a template image depicting the object from an initial perspective;
displaying reference images depicting subjects that are similar to the object captured in the template image;
receiving a selection of one of the reference images indicating a selected reference image;
determining that a location of the computing device is proximate a perspective location associated with the selected reference image;
displaying instructions to adjust the initial perspective of the computing device to a position that aligns a composition of an image preview of the object with a composition of the selected reference image, the instructions including a first indicator of the composition of the image preview and a second indicator of the composition of the selected reference image;
adjusting at least one of the first indicator or the second indicator to indicate a direction to move the computing device to the position; and
capturing, by the camera device according to the position, an additional digital image of the object captured in the template image.

US Pat. No. 10,462,358

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD

SONY CORPORATION, Tokyo ...

1. An information processing apparatus, comprising:a central processing unit (CPU) configured to:
determine a first face in a first image and a second face in a second image are of a same user, wherein
the first image is captured by a first imaging device from a first position,
the second image is captured by a second imaging device from a second position, and
the first position is different from the second position;
determine, based on the determination that the first face and the second face are of the same user, a direction in which the first imaging device and the second imaging device are arranged is along one of a horizontal orientation of a user face or a vertical orientation of the user face; and
replace, based on the determination that the direction is along the one of the horizontal orientation of the user face or the vertical orientation of the user face, an eye part of the first image with an eye part of the second image to generate a line-of-sight correction image.

US Pat. No. 10,462,357

PHOTOGRAPHING APPARATUS, METHOD AND MEDIUM USING IMAGE RECOGNITION

FUJIFILM Corporation, Mi...

1. A non-transitory computer readable medium of a camera device, the computer readable medium storing a program for causing a processor to execute an image processing method, wherein the camera device generates a moving image of a subject by continuously taking photographs of the subject, the camera device including:a display for displaying information including the moving image,
a user interface that initiates a photographing operation to generate a photograph of the subject, the photograph comprising image data; and
memory for storing information including the image data,
the program comprising instructions for processing successive frames of the moving image until terminated responsive to a release input from a user, the processing comprising:
determining whether a human face is included in the frame;
detecting a facial position in the frame if the determining step determines that a face is included in the frame;
storing the detected facial position in the memory;
wherein the determining of whether a human face is included in the frame comprises:
determining whether a human face is included in a first frame of the successive frames;
if a human face is not determined to be in the first frame, repeating the determining on a next frame of the successive frames; and
if a human face is determined to be in the first frame, determining whether a human face is included in a next frame of the successive frames; and
wherein the determining of whether a human face is included in the next frame of the successive frames comprises:
processing only pixels in the next frame located in a region corresponding to a region of the first frame in which the human face was detected.

US Pat. No. 10,462,356

RANGE IMAGE CAMERA, RANGE IMAGE CAMERA SYSTEM, AND CONTROL METHOD OF THEM

Hitachi, Ltd., Tokyo (JP...

1. A range image camera comprising:a light emitting part that emits an irradiation light to a photographing space;
a light receiving part that receives a reflected light of the photographing space;
a range image generation part that generates a range image from a time difference between a light emitting timing of the irradiation light by the light emitting part and a light receiving timing of the reflected light by the light receiving part;
a luminance image generation part that generates a luminance image from an intensity of the reflected light by the light receiving part; and
a control part that controls the light emitting part, the light receiving part, the range image generation part, and the luminance image generation part,
wherein the control part gives at least one instruction of an image generation execution of:
an instruction of a light emission execution to the light emitting part;
an instruction of a light reception execution to the light receiving part;
an instruction of a range image generation execution to the range image generation part; and
an instruction of a luminance image generation execution to the luminance image generation part, as an image generation execution mode,
wherein the control part has at least one mode of:
an image generation execution mode for generating installation information; and
a light emission execution mode for generating installation information,
wherein the control part gives at least one instruction of: an instruction of a light emission stop to the light emitting part; an instruction of a light reception execution to the light receiving part; and an instruction of a luminance image generation execution to the luminance image generation part, as the image generation execution mode for generating installation information,
wherein the control part gives at least one instruction of: an instruction of a light emission execution to the light emitting part; an instruction of a light reception stop to the light receiving part; and an instruction of a luminance image generation stop to the luminance image generation part, as the light emission execution mode for generating installation information,
wherein the range image camera further comprises a 3D image generation part that generates a 3D image from the range image,
wherein the control part controls the 3D image generation part and gives an instruction of a generation stop or a generation execution of the 3D image to the 3D image generation part in conjunction with an instruction of a generation stop or a generation execution of the range image to the range image generation part,
wherein the range image camera further comprises:
a first installation information generation part that generates first installation information of the range image camera from at least one image of the range image, the 3D image, and the luminance image; and
a second installation information generation part that generates second installation information of the range image camera from the luminance image,
wherein the control part controls at least one of the first installation information generation part and the second installation information generation part,
wherein the control part has at least one mode of: the image generation execution mode for generating installation information; a first installation information generation execution mode; and a second installation information generation execution mode, as an installation information generation execution mode,
wherein the control part generates first installation information of the range image camera from at least one image of the range image, the 3D image, and the luminance image, as the first installation information generation execution mode; and
wherein the control part generates second installation information of the range image camera from the luminance image of the range image camera as the second installation information generation execution mode.
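
The range image generation part converts the emit-to-receive time difference into distance. A minimal per-pixel sketch using the standard round-trip time-of-flight relation d = c·Δt/2; the patent's actual modulation and timing scheme is not specified here.

import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_image_from_time_difference(delta_t_seconds):
    """Distance per pixel from the round-trip time difference: d = c * dt / 2."""
    return 0.5 * SPEED_OF_LIGHT * np.asarray(delta_t_seconds)

# Example: three pixels with 10 ns, 20 ns and 33 ns round-trip delays.
dt = np.array([10e-9, 20e-9, 33e-9])
print(range_image_from_time_difference(dt))  # ~[1.50, 3.00, 4.95] m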

US Pat. No. 10,462,355

IMAGE PROCESSING APPARATUS TO GENERATE COMBINED IMAGES

Canon Kabushiki Kaisha, ...

1. A display processing apparatus comprising:one or more processors; and
a memory storing instructions that, when executed by the one or more processors, cause the display processing apparatus to function as:
a specifying unit configured to specify an unspecified item from items displayed in a first area,
a display control unit configured to display the specified item distinguishably from an unspecified item in the first area and to display an item corresponding to the specified item in a second area in response to the unspecified item being specified,
an acceptance unit configured to accept a first operation to target the specified item displayed in the first area from among the items displayed in the first area, and
a control unit configured to perform control,
to contain an image corresponding to an item in the first area which is specified by the specifying unit into a series of images which is configured to be sequentially played back, in response to the unspecified item being specified, and to remove an image corresponding to the specified item on which the first operation is performed from the series of images and not to display the item, which relates to the specified item on which the first operation is performed, in the second area in response to the first operation being performed in the first area.

US Pat. No. 10,462,354

VEHICLE CONTROL SYSTEM UTILIZING MULTI-CAMERA MODULE

MAGNA ELECTRONICS INC., ...

1. A control system for a vehicle, said control system comprising:a camera module disposed in a vehicle, said camera module comprising a plurality of cameras;
a control operable to adjust a field of view of any one of said cameras responsive to an input;
wherein, responsive to image processing of image data captured by said cameras, said control generates an output to control at least one vehicle control system to maneuver the vehicle;
wherein, when the vehicle is traveling in and along a traffic lane, said control sets the field of view of at least some of said cameras to be forward through a windshield of the vehicle;
wherein, responsive to an input indicative of an intended lane change by the vehicle, said control adjusts an orientation of at least one camera of said cameras to set the field of view of said at least one camera to be toward an exterior rearview mirror at the side of the vehicle at which a target lane is located such that the field of view of said at least one camera encompasses a mirror reflective element of that exterior rearview mirror; and
wherein, responsive to determination, via image processing of captured image data when said at least one camera views the mirror reflective element of the exterior rearview mirror of the vehicle, that the target lane is clear, said control generates an output to control the at least one vehicle control system to maneuver the vehicle into the target lane.

US Pat. No. 10,462,353

IMAGING DEVICE, IMAGING METHOD, AND STORAGE MEDIUM

CASIO COMPUTER CO., LTD.,...

1. An imaging device comprising:a temporary storage which temporarily cyclically stores images for a predetermined duration or a predetermined number of images imaged in succession through an imager; and
a processor which detects a capturing instruction;
wherein the processor performs control to:
record, in a recorder, temporarily stored images for a first duration or a first number of images before detection of the capturing instruction, and record, in the recorder, images for a second duration or a second number of images imaged in succession through the imager after detection of the capturing instruction;
variably determine a ratio of the first duration to the second duration or a ratio of the first number to the second number in response to input of a first type of operation of an operational input unit, without changing a total duration of the first duration and the second duration or a total number of the first number of images and the second number of images;
variably determine the total duration of the first duration and the second duration or the total number of the first number and the second number in response to input of a second type of operation of the operational input unit, without changing the ratio of the first duration to the second duration or the ratio of the first number to the second number; and
determine the first duration and the second duration or the first number and the second number based on (i) the ratio of the first duration to the second duration or the ratio of the first number to the second number, and (ii) the total duration or the total number,
wherein in a case in which a last input operation among the first type of operation and the second type of operation is the second type of operation, the processor performs control to determine the first duration and the second duration or the first number and the second number based on the determined total duration or the determined total number determined in response to the input of the second type of operation, without changing the ratio of the first duration to the second duration or the ratio of the first number to the second number, and
wherein the processor performs control to control the recorder to record images in an amount of the determined first duration and the determined second duration, or the determined first number and the determined second number.
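
The claim derives the pre-capture and post-capture amounts from (i) a ratio and (ii) a total, changing one while holding the other fixed. A minimal sketch of that arithmetic for durations; the variable names and example values are illustrative.

def split_durations(total_seconds, before_to_after_ratio):
    """Derive the pre-capture and post-capture durations from a total
    duration and a before:after ratio (1.0 means an even split)."""
    before = total_seconds * before_to_after_ratio / (1.0 + before_to_after_ratio)
    after = total_seconds - before
    return before, after

# First type of operation: change the ratio, total stays 6 s.
print(split_durations(6.0, 2.0))   # (4.0, 2.0)
# Second type of operation: change the total, ratio stays 2:1.
print(split_durations(9.0, 2.0))   # (6.0, 3.0)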

US Pat. No. 10,462,352

FOCUS DETECTION APPARATUS AND IMAGE PICKUP APPARATUS

Canon Kabushiki Kaisha, ...

1. A focus detection apparatus comprising:a determination unit configured to determine a degree of effect of noise included in a pair of parallax image signals; and
an acquisition unit configured to acquire information about a phase difference between the pair of parallax image signals based on a calculation of correlation between the pair of parallax image signals,
wherein the acquisition unit selects, based on a determination result of the determination unit, a filter used to acquire the information about the phase difference from among a plurality of filters having different frequency characteristics, and outputs, as a focus detection result, the information about the phase difference acquired by the correlation calculation based on the pair of parallax image signals applied to the selected filter.

US Pat. No. 10,462,351

FAST AUTO-FOCUS IN IMAGING

Roche Diagnostics Hematol...

2. The method of claim 1, wherein at least two images are acquired under illumination by each color.

US Pat. No. 10,462,350

CAMERA CONTROL APPARATUS AND CAMERA CONTROL METHOD

SONY CORPORATION, Tokyo ...

1. A camera control method comprising the steps of:selecting by a user a first or a second control information for controlling operating parameters of a camera, at least one of the first or second control information being adapted to control at least one of pan, tilt and zoom operations of the camera;
sending command information to the camera to read out from a storage in the camera the first control information preset in the camera for controlling the camera in accordance with the first control information when the user selects the first control information for controlling operating parameters of the camera, the storage in the camera having a capacity of n pieces of information;
reading out from a storage in a camera control apparatus, physically separate from the camera, preset camera command information for controlling operating parameters of the camera, the storage in the camera control apparatus having a capacity greater than n pieces of information;
checking whether camera model information stored in the storage in the camera control apparatus is the same as model information of the camera connected to the camera control apparatus; and
sending to the camera the read out preset camera command information for controlling the parameters of the camera when the user selects the second control information.

US Pat. No. 10,462,349

METHOD AND SYSTEM FOR HOSTING ENTITY-SPECIFIC PHOTO-SHARING WEB SITES FOR ENTITY-SPECIFIC DIGITAL CAMERAS

Chemtron Research LLC, W...

1. A method of operating an image capture device programmed with instructions for transmitting an entity identification to a server coupled to the image capture device over an electronic network, the method comprising:displaying, on the image capture device, an action list received from the server over the electronic network, the action list comprising actions performable by the server on images captured by the image capture device;
receiving, by the image capture device, a selected action from the action list;
uploading, by the image capture device to the server over the electronic network, at least one selected image from the images;
transmitting by the image capture device to the server over the electronic network, an instruction for the server to perform the selected action on the uploaded image in response to the receiving the selected action; and
transmitting, from the image capture device to the server over the electronic network, the entity identification to:
identify an entity associated with the image capture device including a manufacturer of the image capture device, a company that owns or controls the image capture device, a retailer that sells the image capture device, or combinations thereof;
identify a specific photo-sharing website on the server corresponding to the entity;
locate an account on the specific photo-sharing website for the image capture device; and
associate the selected image transmitted by the image capture device to the server with the account.

US Pat. No. 10,462,348

COMMUNICATION APPARATUS CAPABLE OF ESTABLISHING COMMUNICATION WITH MULTIPLE DEVICES, CONTROL METHOD THEREOF, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A communication apparatus that can establish predetermined communication with another apparatus present on the same network, the communication apparatus comprising:a communication interface; and
a processor configured to perform the operations of following units:
while the communication apparatus has joined a first network,
a first receiving unit configured to receive, via the communication interface, a first signal, the first signal being transmitted from a first external apparatus that has joined the first network and the first signal including information pertaining to the first external apparatus;
a second receiving unit configured to receive, via the communication interface, a second signal, the second signal including information of a second network formed by a second external apparatus;
a determining unit configured to determine whether the second external apparatus meets a predetermined condition on the basis of the information of the second network received by the second receiving unit; and
a display control unit configured to display, in a display unit, a selection screen for selecting an apparatus with which to establish the predetermined communication,
wherein the display control unit displays the information of the first external apparatus and the information of the second external apparatus determined to meet the predetermined condition in the selection screen as information of candidate apparatuses with which the predetermined communication may be established.

US Pat. No. 10,462,347

POSITIONING APPARATUS FOR PHOTOGRAPHIC AND VIDEO IMAGING AND RECORDING AND SYSTEM UTILIZING THE SAME

GoPro, Inc., San Mateo, ...

1. A device for supporting a camera, comprising:a positioning apparatus coupled to the camera, the positioning apparatus having a first rotational axis and a second rotational axis; and
a driver in communication with the positioning apparatus,
wherein the driver is configured to receive a sequence comprising:
first predetermined positions of the positioning apparatus about the first rotational axis;
second predetermined positions of the positioning apparatus about the second rotational axis;
a first time period over which to move the positioning apparatus about the first rotational axis; and
a second time period over which to move the positioning apparatus about the second rotational axis,
wherein the driver is configured to send commands to move the positioning apparatus about the first rotational axis to the first predetermined positions according to the first time period and to move the positioning apparatus about the second rotational axis to the second predetermined positions according to the second time period, and
wherein the camera is configured to capture an image frame at each of the first and second predetermined positions of the positioning apparatus.

US Pat. No. 10,462,346

CONTROL APPARATUS, CONTROL METHOD, AND RECORDING MEDIUM

Canon Kabushiki Kaisha, ...

1. A control apparatus comprising:a determination unit configured, in a case where a definition condition defining a plurality of conditions including an image quality condition for defining image quality of a captured image captured by an imaging apparatus is specified, to determine whether predetermined processing according to a different image is being performed, the different image being different from an image corresponding to the definition condition and being generated by the imaging apparatus; and
a control unit configured, in a case where the definition condition is specified, to control the imaging apparatus to generate an image not satisfying at least part of the conditions defined by the definition condition and satisfying the other condition(s) according to a result of determination by the determination unit.

US Pat. No. 10,462,344

IMAGING SYSTEM HAVING MULTIPLE IMAGING SENSORS AND AN ASSOCIATED METHOD OF OPERATION

NCTech Ltd, Edinburgh (G...

1. An imaging system comprising:a plurality of imaging sensors,
wherein:
each imaging sensor comprises a plurality of pixels or sensing elements configured to detect incident radiation and output a signal representative thereof;
each imaging sensor is operable to sample different subsets of pixels or sensing elements at different times to collect output signals representative of radiation incident thereon;
the imaging system is configured to sample one or more of the subsets of pixels or sensing elements of one or more or each imaging sensor that are at least one of towards or closest to at least one or each neighboring or adjacent sensor whilst collecting output signals from one or more subsets of pixels or sensing elements of the at least one or each neighboring or adjacent imaging sensor that are at least one of towards or closest to the imaging sensor;
at least one or each of the imaging sensors is configured to sample the subsets of pixels or sensing elements as a scan or sweep of the subsets of pixels or sensing elements from one side of the imaging sensor to an other side of the imaging sensor; and
the plurality of imaging sensors are positioned on external non-co-planar surfaces of a single camera such that one or more or each of the imaging sensors is oriented or faces in a different direction to at least one or each other or to at least one or each adjacent or neighboring imaging sensor, the different direction for one or more or each of the imaging sensors being divergent relative to another direction for another one or each of the imaging sensors.

US Pat. No. 10,462,343

MULTI-ARRAY CAMERA IMAGING SYSTEM AND METHOD THEREFOR

Aqueti Incorporated, Dur...

1. A method for high-speed capture and rendering of a first image of a scene that includes a plurality of object points, the method comprising:storing a first set of raw image data points (214) at a plurality of servers (204), the servers being operatively coupled with a first plurality of sensor arrays (212) and a first rendering system (206-i), and the first set of raw image data points including raw-image data points from each of the first plurality of sensor arrays;
determining a second set of raw image data points (222-i) based on a rendering request (218-i) received from the first rendering system;
determining a second set of sensor arrays, the second set of sensor arrays being the sensor arrays of the first set of sensor arrays that provided the second set of raw image data points; and
generating a set of calibrated image data points (224-i) by applying a first set of calibration factors to the second set of raw image data points, the first set of calibration factors being based on the second set of sensor arrays, wherein the first image of the scene is based on the set of calibrated image data points.

US Pat. No. 10,462,342

IMAGING MODULE AND IMAGING-MODULE-ATTACHED CATHETER

FUJIKURA LTD., Tokyo (JP...

1. An imaging module comprising:an image-sensing device comprising an image-sensing device electrode;
a first substrate comprising:
a first insulating substrate main body;
a first wiring disposed on the first insulating substrate main body;
an electrode terminal electrically connected to the image-sensing device electrode and the first wiring; and
a first cable terminal disposed on a surface of the first insulating substrate main body, and electrically connected to the first wiring;
a second substrate comprising:
a second insulating substrate main body;
a second wiring disposed on the second insulating substrate main body; and
a second cable terminal disposed on a surface of the second insulating substrate main body, and electrically connected to the second wiring; and
a signal cable disposed between the first substrate and the second substrate and that electrically connects the first cable terminal to the second cable terminal,
wherein
D1 is a length of a first diagonal line on a light-receiving face of the image-sensing device,
D2 is a length of a second diagonal line on an end face of the second insulating substrate main body, and
D2 is less than or equal to D1.

US Pat. No. 10,462,341

AUTOMATIC GREETINGS BY OUTDOOR IP SECURITY CAMERAS

KUNA SYSTEMS CORPORATION,...

1. An apparatus comprising:a camera sensor configured to capture video data of an area of interest; and
a processor configured to (i) process said video data, (ii) generate control signals used to initiate security responses and (iii) execute computer readable instructions, wherein (A) said computer readable instructions are executed by said processor to (a) determine a first activation state and a second activation state for said security responses, (b) determine a status of a visitor in said area of interest in response to an analysis of said video data, (c) select one of said control signals to initiate said security responses based on (i) said status of said visitor and (ii) said activation state and (d) determine a reaction of said visitor in response to said security responses, (B) said first activation state for said security responses is selected based on a schedule, (C) said visitor is identified using a wireless identification signal and (D) when said wireless identification signal is detected, said second activation state for said security responses based on said wireless identification signal temporarily overrides said first activation state.

US Pat. No. 10,462,340

METHODS AND APPARATUS FOR WIRELESSLY CONTROLLING LIGHTING EFFECTS OF A NETWORKED LIGHT SOURCE

SIGNIFY HOLDING B.V., Ei...

1. A method for illuminating a target, the method comprising: selecting, using an image recording device comprising a wireless communications interface, one or more lighting units located within a first distance to the image recording device, wherein each lighting unit comprises an LED-based light source capable of emitting light with one or more of a plurality of lighting effects, wherein the selecting comprises the image recording device broadcasting a signal received by said one or more lighting units located within the first distance, receiving one or more responses from the one or more lighting units, and locating the one or more lighting units in relation to the target by determining whether said one or more lighting units is disposed within a location range to the image recording device; and communicating, via the wireless communication interface, to one or more of the selected lighting units a request to illuminate the target with one or more of said plurality of lighting effects when the image recording device captures an image or records a video, wherein the selecting comprises excluding at least one other lighting unit including a corresponding light source from being selected in response to determining that said at least one other lighting unit is disposed outside of said location range to the image recording device.

US Pat. No. 10,462,339

IMAGING APPARATUS AND IMAGING SYSTEM HAVING PRESS BENT INFRARED CUT FILTER HOLDER

Sony Corporation, Tokyo ...

1. An imaging apparatus, comprising:at least one structure supporting a lens, wherein the lens has an optical axis extending in a first direction;
a mold attached to the at least one structure;
an imaging device configured to receive incident light through the lens via an incident light path and perform photoelectric conversion;
an infrared cut filter disposed between the lens and the imaging device;
an infrared cut filter holder holding the infrared cut filter from a lower face of the infrared cut filter and having an opening in an area of the incident light path; and
a substrate holding the imaging device and the mold, wherein an edge of the substrate is attached to a portion of the mold such that the imaging device is spaced apart from the infrared cut filter, and
wherein the infrared cut filter holder is pressed to bend such that the infrared cut filter holder has an L-shape in a cross-sectional view.

US Pat. No. 10,462,338

FLOATING CAMERA MODULE

MICROSOFT TECHNOLOGY LICE...

1. An electronic device, comprising:a front side;
a back side separated from the front side by a device thickness;
a camera having a lens side and a sensor side, wherein the lens side and the sensor side are separated from one another by a camera depth that is greater than the device thickness; and
a linkage floatably holding the camera, the linkage configured to move the camera within a recess between the front side and the back side such that the sensor side extends through and past the front side responsive to force on the lens side of the camera, and the lens side extends through and past the back side responsive to force on the sensor side of the camera.

US Pat. No. 10,462,337

LIQUID CRYSTAL DISPLAY DEVICE AND IMAGE PROCESSING METHOD FOR SAME

SAMSUNG DISPLAY CO., LTD....

1. An image signal processing method comprising:receiving an image signal;
extracting an image depth of the image signal by analyzing the image signal if the image signal does not include separate image depth information;
separating the image signal into a foreground image signal and a background image signal based on the image depth;
correcting the foreground image signal based on a plurality of gamma curves; and
correcting the background image signal by applying only a single gamma curve to the background image signal.
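
A minimal sketch of the depth-based split and gamma correction in this claim, assuming a grayscale image, a simple threshold on the depth map, and two foreground gamma curves chosen by local brightness; all of these specific choices are illustrative, not the patent's.

import numpy as np

def gamma(x, g):
    """Apply a gamma curve to values normalized to [0, 1]."""
    return np.clip(x, 0.0, 1.0) ** g

def correct_by_depth(image, depth, depth_threshold=0.5):
    foreground = depth >= depth_threshold            # nearer pixels
    out = np.empty_like(image)
    # Background: only a single gamma curve.
    out[~foreground] = gamma(image[~foreground], 2.2)
    # Foreground: pick between two curves by local brightness (illustrative rule).
    dark = foreground & (image < 0.5)
    bright = foreground & (image >= 0.5)
    out[dark] = gamma(image[dark], 1.8)
    out[bright] = gamma(image[bright], 2.4)
    return out

img = np.random.default_rng(0).random((4, 4))
dep = np.random.default_rng(1).random((4, 4))
print(correct_by_depth(img, dep))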

US Pat. No. 10,462,336

LOW LATENCY TEARING WITHOUT USER PERCEPTION

Microsoft Licensing Techn...

1. A computer device, comprising:a memory to store data and instructions;
a processor in communication with the memory; and
an operating system in communication with the memory and the processor, wherein the operating system is operable to:
render at least one image frame received from an application for a virtual reality image for display on a display device;
determine that the rendered frame is received after a frame timing event that corresponds to a deadline for initiating display of a new frame;
communicate a prior rendered frame for presentation on the display device or communicate an instruction to re-present the prior rendered frame on the display device;
receive a selection of one of a plurality of tear thresholds, wherein each of the plurality of tear thresholds define different tear conditions, at a point in time after the frame timing event, when an amount of tearing in a displayed image is allowed;
determine whether the rendered frame meets a tear condition defined by the selected one of the plurality of tear thresholds; and
communicate the rendered image frame to the display device to switch the presentation from the prior rendered frame to the rendered image frame on the display device in response to the rendered frame meeting the tear condition defined by the selected one of the plurality of tear thresholds.

US Pat. No. 10,462,335

VIDEO PROCESSING DEVICE

FUJITSU TEN Limited, Kob...

1. A video processing device that generates a display video signal to be supplied to a liquid crystal display having a liquid crystal that is driven by a frame inversion scheme, the video processing device comprising:a control microcomputer that controls a data enable signal of an interlace video signal that is input to the control microcomputer from outside the control microcomputer, the interlace video signal input to the control microcomputer having a series of fields each of which includes a data signal and a vertical synchronization signal, the control microcomputer controlling the data enable signal such that a display invalid section having a length corresponding to a predetermined number of the fields is set for the interlace video signal at a predetermined period based on the vertical synchronization signal to cause skipping of a video output for one field at the predetermined period to inverse a polarity of an electrode of the liquid crystal display; and
a video signal processor that generates the display video signal by setting the display invalid section for the interlace video signal based on the data enable signal controlled by the control microcomputer and outputs the display video signal to the liquid crystal display.

US Pat. No. 10,462,334

PIPELINE FOR HIGH DYNAMIC RANGE VIDEO CODING BASED ON LUMINANCE INDEPENDENT CHROMATICITY PREPROCESSING

Disney Enterprises, Inc.,...

1. A method, comprising:receiving an additive color space digital image;
converting the received additive color space digital image into a uniform color space digital image having chrominance components and a luminance component;
scaling the chrominance components of the uniform color space digital image, wherein scaling the chrominance components of the uniform color space digital image comprises scaling a rectangular bounding box of a visual color gamut in a parameter space of the chrominance components;
quantizing the scaled chrominance components and the luminance component;
encoding the quantized chrominance components and the luminance component to create an encoded image; and
outputting a bitstream carrying the encoded image.
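
The preprocessing step scales the chrominance plane so that a rectangular bounding box of the visible gamut fills the quantization range, then quantizes uniformly. A minimal sketch under assumed values: a u'v'-style chromaticity pair and an approximate bounding box; the patent's exact color space and box are not reproduced here.

import numpy as np

# Illustrative bounding box of the visible gamut in a (u, v) chromaticity plane.
U_MIN, U_MAX = 0.0, 0.62
V_MIN, V_MAX = 0.0, 0.59

def scale_and_quantize_chroma(u, v, bits=10):
    """Map the gamut bounding box to [0, 1], then quantize uniformly."""
    levels = (1 << bits) - 1
    su = (np.asarray(u) - U_MIN) / (U_MAX - U_MIN)
    sv = (np.asarray(v) - V_MIN) / (V_MAX - V_MIN)
    qu = np.round(np.clip(su, 0.0, 1.0) * levels).astype(np.uint16)
    qv = np.round(np.clip(sv, 0.0, 1.0) * levels).astype(np.uint16)
    return qu, qv

print(scale_and_quantize_chroma([0.21, 0.45], [0.47, 0.52]))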

US Pat. No. 10,462,333

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD QUANTIZING FIRST, SECOND AND THIRD COLOR DATA USING FIRST AND SECOND THRESHOLD MATRICES

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus for printing an image on a print medium with use of inks of multiple colors including at least a color material of a first chromatic color, a color material of a second chromatic color different from the first chromatic color, and a color material of a third chromatic color different from the first chromatic color and the second chromatic color, the image processing apparatus comprising:one or more processors that function as:
an acquisition unit configured to acquire first, second, and third pieces of multivalued data respectively corresponding to the inks of the first, second, and third chromatic colors; and
a generation unit configured to generate first, second, and third pieces of quantization data respectively corresponding to the inks of the first, second, and third chromatic colors on a basis of the first, second, and third pieces of multivalued data acquired by the acquisition unit, and stored first and second threshold value matrices each configured to include arrayed multiple threshold values defining an arrangement of threshold values according to position of pixels respectively in one or more memory, the arrangement of threshold values in the second threshold value matrix being different from the arrangement of threshold values in the first threshold value matrix,
wherein:
the generation unit (i) generates the first quantization data on a basis of the first pieces of multivalued data acquired by the acquisition unit and the first threshold value matrix, (ii) generates the second quantization data on a basis of the second pieces of multivalued data acquired by the acquisition unit and the second threshold value matrix, and (iii) generates the third quantization data on a basis of the third pieces of multivalued data acquired by the acquisition unit and the second threshold value matrix.
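
The generation step compares each multivalued color plane against a stored threshold matrix: the first chromatic color uses the first matrix, while the second and third chromatic colors share the second matrix. A minimal sketch with tiled 4x4 Bayer-style matrices standing in for the stored matrices; the matrices and data here are illustrative.

import numpy as np

# Illustrative 4x4 threshold matrices with values spread over 0..255.
MATRIX_1 = (np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) + 0.5) * 16
MATRIX_2 = np.flipud(MATRIX_1)           # a different arrangement of thresholds

def quantize(plane, matrix):
    """1-bit quantization: form a dot where the multivalued data
    exceeds the tiled threshold matrix."""
    h, w = plane.shape
    tiled = np.tile(matrix, (h // matrix.shape[0] + 1, w // matrix.shape[1] + 1))
    return (plane > tiled[:h, :w]).astype(np.uint8)

rng = np.random.default_rng(0)
c1, c2, c3 = (rng.integers(0, 256, (8, 8)) for _ in range(3))
q1 = quantize(c1, MATRIX_1)   # first chromatic color, first matrix
q2 = quantize(c2, MATRIX_2)   # second chromatic color, second matrix
q3 = quantize(c3, MATRIX_2)   # third chromatic color, same second matrix
print(q1.sum(), q2.sum(), q3.sum())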

US Pat. No. 10,462,332

IMAGE PROCESSING APPARATUS GENERATING PROFILE MAPPING A PLURALITY OF INPUT VALUES TO RESPECTIVE ONES OF A PLURALITY OF OUTPUT VALUES

Brother Kogyo Kabushiki K...

1. An image processing apparatus comprising a processor configured to perform:acquiring a first profile and a second profile, the first profile mapping a plurality of first input values to respective ones of a plurality of first output values, each of the plurality of first input values being represented in a specific color space, each of the plurality of first output values being represented in a first color space and having N1 component values corresponding to N1 types of color materials among M types of color materials used by a print execution device, M being an integer larger than two, N1 being an integer larger than or equal to one and smaller than or equal to M, the second profile mapping a plurality of second input values to respective ones of a plurality of second output values, each of the plurality of second input values being represented in the specific color space, each of the plurality of second output values being represented in a second color space and having N2 component values corresponding to N2 types of color materials among the M types of color materials, N2 being an integer larger than or equal to one and smaller than or equal to M; and
generating a third profile mapping a plurality of third input values to respective ones of a plurality of third output values,
wherein the generating includes:
acquiring a first boundary defining a first boundary value, the first boundary value being represented in the specific color space; and
determining a second boundary by using a first boundary output value and a second boundary output value, the second boundary defining a second boundary value, the second boundary value being represented in the specific color space, the first boundary output value being determined from among the plurality of first output values on a basis of the first boundary, the second boundary output value being determined from among the plurality of second output values on a basis of the first boundary, the first boundary value and the second boundary value defining a first range, a second range, and a third range so that an end of the first range is in contact with an end of the second range at the first boundary value and another end of the second range is in contact with an end of the third range at the second boundary value,
wherein the third profile is generated so that:
when a specific input value is in the first range, the third profile maps the specific input value to a third output value equal to an output value to which the first profile maps the specific input value;
when a specific input value is in the third range, the third profile maps the specific input value to a third output value equal to an output value to which the second profile maps the specific input value; and
when a specific input value is in the second range, the third profile maps the specific input value to a third output value by using an output value to which the first profile maps the specific input value and an output value to which the second profile maps the specific input value.
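
In the second range the third profile is produced "by using" outputs of both profiles; the claim does not say how, so the sketch below assumes a linear blend across that range, with toy one-dimensional profiles standing in for the stored look-up tables.

def blended_output(x, boundary1, boundary2, profile1, profile2):
    """Piecewise mapping: first profile below boundary1, second profile above
    boundary2, and a linear blend of the two in between (the blend is an assumption)."""
    if x <= boundary1:
        return profile1(x)
    if x >= boundary2:
        return profile2(x)
    t = (x - boundary1) / (boundary2 - boundary1)
    return (1.0 - t) * profile1(x) + t * profile2(x)

# Toy 1-D "profiles" mapping an input in [0, 1] to an ink amount in [0, 255].
profile_a = lambda x: 255.0 * x            # stand-in for the first profile
profile_b = lambda x: 255.0 * x ** 0.8     # stand-in for the second profile

for x in (0.1, 0.5, 0.9):
    print(round(blended_output(x, 0.3, 0.7, profile_a, profile_b), 1))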

US Pat. No. 10,462,331

IMAGE PROCESSING METHOD AND APPARATUS, AND PRINTING APPARATUS, WITH DITHER RESULT APPLICATION LEVEL CONTROL

Seiko Epson Corporation, ...

1. An image processing apparatus configured to print an image, the image processing apparatus comprising:a preliminary halftone processing unit configured to determine whether a preliminary dot through a dithering method is to be formed on the basis of a data gradation value in image data;
a final halftone processing unit configured to determine whether a dot to be printed through an error diffusion method is to be formed on the basis of the data gradation value; and
a dither result application level control unit configured to control a dither result application level, the dither result application level being a level at which probability that the dot to be printed will be formed is raised when the preliminary dot is formed, wherein
the final halftone processing unit applies the dither result application level, and
the dither result application level control unit reduces the dither result application level during printing in a case where copy data from a reflected document is used as the image data or a case where the image data contains many high-frequency components, compared to other instances of printing.
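
The final halftone pass is error diffusion, but where the preliminary dithering pass formed a dot, the probability of printing a dot is raised by a controllable amount (the dither result application level). A minimal sketch assuming the level acts as a threshold bias inside a Floyd–Steinberg-style loop; the patent text above does not disclose the exact mechanism.

import numpy as np

def halftone(image, preliminary_dots, application_level=64.0):
    """Error diffusion whose threshold is lowered (dot more likely) at
    pixels where the preliminary dither pass formed a dot."""
    img = image.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), np.uint8)
    for y in range(h):
        for x in range(w):
            threshold = 128.0 - application_level * preliminary_dots[y, x]
            out[y, x] = 1 if img[y, x] >= threshold else 0
            err = img[y, x] - 255.0 * out[y, x]
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

rng = np.random.default_rng(0)
gray = np.full((16, 16), 100.0)
prelim = (rng.random((16, 16)) < 100 / 255).astype(np.uint8)   # stand-in dither result
dots_hi = halftone(gray, prelim, application_level=64)
dots_lo = halftone(gray, prelim, application_level=0)
# A higher application level tends to make the final dots coincide more with the preliminary dots.
print((dots_hi == prelim).mean(), (dots_lo == prelim).mean())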

US Pat. No. 10,462,330

IMAGE READING DEVICE, PRESSURE PLATE OPENING/CLOSING DETECTION METHOD, AND IMAGE FORMING APPARATUS

RICOH COMPANY, LTD., Tok...

4. A method of detecting opening/closing of pressure-plate, used for an image reading device includinga contact glass having a surface on which a document is disposed or along which the document moves,
a reading unit including a light source configured to emit light to the document on the surface of the contact glass and an image sensor configured to receive the light reflected by the document, the reading unit configured to obtain image data according to the light received by the image sensor,
a pressure plate moveable between a posture where the surface of the contact glass is closed and a posture where the surface of the contact glass is opened, and
a pressure-plate opening/closing sensor configured to detect whether the pressure plate is closed,
the method comprising:
determining a presence or absence of external light input to a space between the pressure plate and the surface of the contact glass based on black reference image data obtained by the reading unit when the light source is turned off;
comparing read white data of the light emitted from the light source and received by the image sensor when the pressure plate is closed, with reference white read data of the light emitted from the light source and received by the image sensor when the pressure plate is completely closed, to determine a difference between the read white data and the reference white read data; and
determining whether the pressure plate is open or closed based on the determined presence or absence of the external light when the light source is turned off and the difference between the read white data and the reference white read data.
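
The two tests in this method (external light with the lamp off, and the white-reading comparison) can be sketched as follows. The numeric thresholds and the rule combining the two tests are assumptions for illustration only.

    # Illustrative sketch only; the thresholds and the decision rule combining
    # the two tests are assumptions, not the patented logic.

    def external_light_present(black_reference, dark_threshold=8):
        """black_reference: pixel values read with the light source turned off."""
        return max(black_reference) > dark_threshold

    def plate_is_open(black_reference, read_white, reference_white,
                      white_diff_threshold=20):
        # Test 1: light leaking in while the lamp is off implies an open plate.
        if external_light_present(black_reference):
            return True
        # Test 2: compare the current white reading against the reading taken
        # when the plate was known to be completely closed.
        diff = sum(abs(a - b) for a, b in zip(read_white, reference_white))
        return diff / len(read_white) > white_diff_threshold

    print(plate_is_open([0, 1, 2], [250, 248, 251], [252, 250, 249]))   # False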

US Pat. No. 10,462,329

METHOD FOR SUBSTRATE SHRINKAGE COMPENSATION

Heidelberger Druckmaschin...

1. A method for compensating for substrate shrinkage during a printing operation on a printing machine by using a computer, the method comprising the following steps:generating multiple image parts of a digitally available image to be produced, factoring in information on substrate shrinkage, by subdividing the image by using the computer;
subdividing the respective multiple image parts that have been created into a number of data blocks by using the computer;
saving actual positions of all of the data blocks in the digital image by using the computer;
calculating target positions of all of the data blocks in the digital image by using the computer by shifting the data blocks away from one another by a respective pixel to create single-pixel-wide gaps between the data blocks in the digital image;
copying and rearranging the data blocks in the digital image in accordance with the calculated target positions by using the computer;
calculating positions of resultant single-pixel-wide gaps by using the computer;
filling the single-pixel-wide gaps with digital image data of neighboring pixels by using the computer to create a compensated digital image; and
printing the compensated digital image on the printing machine.
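
A much-simplified one-dimensional sketch of the gap-insertion and gap-filling steps is given below; real compensation operates on two-dimensional image data and derives the block geometry from measured substrate shrinkage. The function name and block width are assumptions.

    import numpy as np

    # Simplified 1-D sketch; names and the block width are assumptions.

    def compensate_row(row, block_width):
        blocks = [row[i:i + block_width] for i in range(0, len(row), block_width)]
        out = []
        for k, block in enumerate(blocks):
            if k > 0:
                # single-pixel-wide gap between shifted blocks, filled with the
                # value of the neighboring pixel (last pixel of previous block)
                out.append(out[-1])
            out.extend(block)
        return np.array(out)

    row = np.arange(8)                      # an 8-pixel scanline
    print(compensate_row(row, block_width=4))
    # -> [0 1 2 3 3 4 5 6 7]  (one filled gap between the two 4-pixel blocks)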

US Pat. No. 10,462,328

PRINTING DEVICE, CONTROL METHOD OF A PRINTING DEVICE, AND STORAGE MEDIUM HAVING A WIRELESS COMMUNICATOR THAT IS PAIRED WITH AN EXTERNAL DEVICE

Seiko Epson Corporation, ...

1. A printing device comprising:a print mechanism configured to print on a recording medium;
a wireless communicator configured to communicate wirelessly with a paired external device; and
a controller configured to receive print data by wireless communication from the paired external device, and control the print mechanism based on the received print data, wherein the controller is further configured to control the print mechanism to print pairing-related information on the recording medium before the printing device enters a second mode when changing the printing device from a first mode to a second mode, to not allow pairing of the printing device with any external device in the first mode, to allow pairing in the second mode when there is a pairing request, and to not allow pairing when the print mechanism cannot print,
wherein a pairing authentication method is used for pairing, the pairing authentication method being configured to be changed to any of a plurality of authentication methods, the pairing authentication method being changed according to the authentication method that the printing device is configured to use or according to a level of security required.

US Pat. No. 10,462,327

DOCUMENT ELEMENT RE-POSITIONING

Hewlett-Packard Developme...

1. A non-transitory machine-readable storage medium comprising instructions which, when executed by a processor, cause the processor to:receive a document comprising a plurality of document elements;
create a relevance score for each of the plurality of document elements;
determine that a first relevance score for a first document element of the plurality of document elements is less than a threshold score and that a second relevance score for a second document element of the plurality of document elements is greater than the threshold score;
remove the first document element from the document based on the determination that the first relevance score is less than the threshold score;
re-position the second document element in the document based on the determination that the second relevance score is greater than the threshold score; and
insert a new document element comprising a link to a source of the document.
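
A minimal sketch of the remove / re-position / insert-link flow is shown below. The claim does not specify how relevance is scored, so a trivial word-count score, the front-of-document repositioning policy, and the placeholder source URL are all assumptions.

    # Hypothetical sketch; scoring, repositioning policy, and the placeholder
    # URL are assumptions.

    def relevance_score(element):
        return len(element["text"].split())

    def rework_document(document, threshold=5):
        kept = []
        for element in document["elements"]:
            if relevance_score(element) < threshold:
                continue                              # remove low-relevance element
            kept.append(element)                      # keep for re-positioning
        # re-position: higher-scoring elements move toward the front
        kept.sort(key=relevance_score, reverse=True)
        # insert a new element linking back to the document source
        kept.append({"text": "Source", "link": document["source_url"]})
        document["elements"] = kept
        return document

    doc = {"source_url": "https://example.invalid/report",
           "elements": [{"text": "short note"},
                        {"text": "a much longer and therefore more relevant paragraph"}]}
    print([e["text"] for e in rework_document(doc)["elements"]])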

US Pat. No. 10,462,326

MACHINE READABLE SECURITY MARK AND PROCESS FOR GENERATING SAME

Xerox Corporation, Norwa...

1. A method of generating an encoding for printing secure information on a document, the method comprising:by a print device, printing on a substrate a security mark comprising a plurality of characters in a microtext font, wherein each of the plurality of characters appears more than one time in the security mark;
by a scanning device, scanning the security mark to create a digital image that includes the security mark;
by an image processing server, executing programming instructions to implement an optical character recognition (OCR) engine by:
receiving the digital image that was created by the scanning device and that includes the security mark,
applying an OCR process to the digital image to attempt to recognize the plurality of characters in the security mark,
identifying which of the plurality of characters in the security mark are recognized via the OCR process at least a number of times that exceeds a recognition threshold,
saving the plurality of characters that are recognized via the OCR process at least a threshold number of times as a character subset for an encoding; and
by a processing device, executing programming instructions to generate an encoding by:
determining how many characters are in the character subset,
determining how many font characters are required to print the microtext font,
generating an encoding comprising a representation of each of the font characters by an encoded character string that consists of one or more of the characters that are in the character subset, and
saving the encoding to a memory device.
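
The character-subset and encoding steps can be illustrated with the hypothetical sketch below, in which the OCR engine is mocked as a flat list of per-appearance recognition results and a simple base-N code over the reliable subset is assumed.

    from collections import Counter

    # Hypothetical sketch; the mocked OCR results and the base-N code over the
    # reliably recognized subset are assumptions.

    def build_character_subset(ocr_results, recognition_threshold):
        """ocr_results: characters recognized across the repeated appearances of
        each character in the printed microtext security mark."""
        counts = Counter(ocr_results)
        return sorted(c for c, n in counts.items() if n >= recognition_threshold)

    def build_encoding(font_characters, subset):
        """Represent every font character as a fixed-length string over the
        reliably recognized subset."""
        base = len(subset)
        width = 1
        while base ** width < len(font_characters):
            width += 1
        encoding = {}
        for index, ch in enumerate(sorted(font_characters)):
            digits, n = [], index
            for _ in range(width):
                digits.append(subset[n % base])
                n //= base
            encoding[ch] = "".join(reversed(digits))
        return encoding

    subset = build_character_subset("AAABBBCCCX", recognition_threshold=3)  # ['A', 'B', 'C']
    print(build_encoding("0123456789", subset))    # each digit -> a 3-character code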

US Pat. No. 10,462,325

IMAGE READING APPARATUS, CONTROL METHOD, AND CONTROL PROGRAM

PFU LIMITED, Kahoku-Shi,...

1. An image reading apparatus comprising:a first imaging device secured to the image reading apparatus;
a second imaging device movably provided between a first position facing the first imaging device and a second position facing the first imaging device, wherein the second position is at a longer distance from the first imaging device than the first position;
a guide for guiding a document between the first imaging device and the second imaging device;
an image capturing device provided on one of the first imaging device and the second imaging device for capturing an image of the document guided by the guide;
a drive force generating device for generating a first drive force for moving the second imaging device by a rotation in a first direction and generating a second drive force by a rotation in a second direction opposite to the first direction;
a conveyance roller;
a drive force transmitter arranged between the drive force generating device, and the second imaging device and the conveyance roller; and
a processor configured to cause the drive force generating device to generate the first drive force when the document is not conveyed, wherein
the drive force transmitter
transmits the first drive force to the second imaging device to move the second imaging device, and
transmits the second drive force to the conveyance roller to convey the document and interrupts transmission of the second drive force to the second imaging device in response to a switching from the first drive force to the second drive force.

US Pat. No. 10,462,324

IMAGE SENSOR UNIT, PAPER SHEET DISTINGUISHING APPARATUS, READING APPARATUS, AND IMAGE FORMING APPARATUS

CANON COMPONENTS, INC., ...

1. An image sensor unit, comprising:a lens array that condenses light from a target object;
an image sensor that receives the light condensed by the lens array;
an elongated housing elongated in a first direction and that supports the lens array and the image sensor, the elongated housing having a side surface extending in the first direction; and
an elongated rigid member that has a facing surface facing the side surface and an opposite surface opposite to the facing surface across a thickness of the elongated rigid member,
wherein
the elongated rigid member is provided with a penetration-hole that extends from the facing surface to the opposite surface, and the opposite surface of the elongated rigid member has a concave portion, and
a protrusion provided on the side surface extends through the penetration-hole and is positioned in the concave portion.

US Pat. No. 10,462,323

NON MECHANICAL OPTICAL BEAM STEERING MECHANISM FOR LASER PRINTERS

CAPITAL ONE SERVICES, LLC...

1. A method of laser printing, comprising:receiving image data from a computing device at a laser printer controller;
activating a laser using the printer controller in accordance with the received image data to produce a modulated laser light beam;
receiving the laser light beam into an array of waveguide cores of an optical waveguide having a controllable index of refraction, wherein an insulating layer is provided adjacent the array of waveguide cores with one or more openings through the insulating layer configured to expose different amounts of different waveguide cores of the array of waveguide cores to portions of a liquid crystal cladding layer;
controlling a voltage applied to the liquid crystal cladding layer to separately tune all of the waveguide cores and produce a different phase shifting effect for each different waveguide core by selectively changing an effective index of refraction of the liquid crystal cladding layer, thereby causing the laser light beam to exit the waveguide in a scanning motion; and
directing the scanning laser light beam across a photoconductive layer on a rotating photoreceptor drum in a direction parallel to a central axis of the drum.

US Pat. No. 10,462,322

IMAGE SCANNING APPARATUS AND METHODS OF OPERATING AN IMAGE SCANNING APPARATUS

Ventana Medical Systems, ...

1. A method of operating an image scanning apparatus;wherein the image scanning apparatus includes a scan line detector and is configured to image a surface of an object mounted in the image scanning apparatus in a plurality of swathes, wherein each swathe is formed by a group of scan lines, each scan line being acquired using the scan line detector from a respective elongate region of the surface of the object extending in a scan width direction, wherein each group of scan lines is acquired whilst the object is moved relative to the scan line detector in a scan length direction;
wherein the method includes:
using at least one scan line, acquired from a surface of a first object mounted in the image scanning apparatus using the scan line detector, to obtain at least one measure indicating that the surface of the first object is uneven or tilted in the scan width direction relative to an imaging plane of the image scanning apparatus,
setting a swathe width value for use in acquiring at least one swathe from a surface of a second object mounted in the image scanning apparatus, wherein the swathe width value is set based on the at least one measure and configured so as to keep each swathe acquired from the surface of the second object substantially in focus across its width in the scan width direction, and
acquiring at least one swathe from the surface of the second object using the scan line detector, wherein the at least one swathe acquired from the surface of the second object has a width in the scan width direction that corresponds to the swathe width value set based on the at least one measure.

US Pat. No. 10,462,321

SCANNER AND SCANNER DATA GENERATING METHOD

Seiko Epson Corporation, ...

9. A scanner comprising:a first mirror having a plurality of concavities configured to reflect light from a document;
a sensor configured to sense light reflected by a concavity of the first mirror; and
a wall disposed to the first mirror and protruding from between the plurality of concavities,
a protruding end of the wall being shaped conforming to the shape of the ridge formed by concavities of the first mirror.

US Pat. No. 10,462,320

INFORMATION PROCESSING SYSTEM INCLUDING SETTING VALUES FOR PRINT EXECUTION

Seiko Epson Corporation, ...

1. An information processing system that sets a setting value corresponding to a combination of a model of a printing apparatus and a print medium on which printing is performed to a setting item of the printing apparatus and that executes printing, the system comprising:a setting value table which stores a candidate setting value that can be set to the setting item in correlation with label information corresponding to each candidate setting value,
wherein, in a case of setting the setting value to the setting item of the printing apparatus, the information processing system determines the setting value corresponding to the setting item based on print medium information including the label information by using the setting value table, and sets the determined setting value to the setting item.

US Pat. No. 10,462,319

IMAGE READING APPARATUS TRANSMITTING DEVICE IDENTIFICATION INFORMATION AND READING INFORMATION TO PUSH NOTIFICATION SERVER, AND METHOD FOR CONTROLLING THE IMAGE READING APPARATUS

BROTHER KOGYO KABUSHIKI K...

1. An image reading apparatus comprising:an operation interface;
a scanner;
a communication interface;
a processor; and
a memory storing computer readable instructions, the computer readable instructions, when executed by the processor, causing the image reading apparatus to perform:
receiving via the operation interface an input instructing to control the scanner to read an original;
controlling the scanner to generate image data by reading the original on a basis of the input;
receiving device identification information identifying an information processing apparatus from the information processing apparatus via the communication interface; and
executing a first process including:
transmitting to a push notification server via the communication interface the received device identification information and reading information, the reading information including at least one of information specifying the controlling the scanner is started and information specifying the controlling the scanner is complete;
receiving request information from the information processing apparatus via the communication interface, wherein the request information is transmitted from the information processing apparatus when the information processing apparatus receives a push notification including the reading information from the push notification server; and
transmitting the generated image data to the information processing apparatus via the communication interface in response to receiving the request information.

US Pat. No. 10,462,318

IMAGE READING APPARATUS WITH A RESINOUS CONDUCTIVE SHEET

CANON KABUSHIKI KAISHA, ...

1. An image reading apparatus, comprising:a document feeding unit configured to convey a document;
a transparent member;
a reading element configured to read, through the transparent member, an image on a first surface of the document conveyed by the document feeding unit; and
a conductive sheet, which is a resinous sheet and is grounded, provided in a first region on a surface of the transparent member,
wherein a fluorine-containing organic compound is coated in a second region on the surface of the transparent member to be brought into contact with the first surface of the document, the second region being positioned downstream with respect to the first region in a first direction in which the document is conveyed, and
wherein the first region includes a region free from a region where the fluorine-containing organic compound is coated.

US Pat. No. 10,462,317

IMAGE READING APPARATUS

PFU LIMITED, Ishikawa (J...

1. An image reading apparatus comprising:a junction where a first feed path, a second feed path, and a third feed path join together;
a reading unit configured to read an image of a medium fed through the first feed path;
a switching unit that isolates the third feed path from the junction so as to guide the medium from the second feed path to the first feed path when being disposed at a first position and that guides the medium from the first feed path to the third feed path when being disposed at the second position;
a first transporting roller that rotates in a first direction so as to feed the medium disposed in the first feed path apart from the junction and that rotates in a second direction opposite to the first direction so as to feed the medium disposed in the first feed path toward the junction; and
a drive unit that moves the switching unit to the second position when the first transporting roller rotates in the second direction,
wherein the drive unit transmits a rotation of the first transporting roller to the switching unit so as to move the switching unit to the first position after a lapse of predetermined time from when the first transporting roller starts rotating in the first direction.

US Pat. No. 10,462,316

IMAGE PROCESSING APPARATUS, METHOD OF CONTROLLING IMAGE PROCESSING APPARATUS, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:a document sheet conveyance unit configured to drive a driving portion to convey document sheets one by one from a stack of document sheets;
a reading unit configured to read an image on a document sheet conveyed by the document sheet conveyance unit;
a transmission unit configured to transmit, by facsimile, image data generated by the reading unit reading the document sheet to an external image processing apparatus;
an output unit configured to output the image data generated by the reading unit reading the document sheet using a method different from transmission by facsimile; and
a conveyance control unit configured to stop driving of the driving portion for a predetermined time period after conveyance of the document sheets by the document sheet conveyance unit is started and before reading of the document sheets by the reading unit is completed,
wherein the predetermined time period is a first time period when the conveyance control unit stops the driving portion while a first job of transmitting the image data by facsimile is executed by the transmission unit concurrently with reading of the document sheet by the reading unit, wherein the first time period is shorter than a reference time period, wherein a line is disconnected from the image processing apparatus in a case where the external image processing apparatus does not receive the image data continuously during the reference time period, and
wherein the predetermined time period is a second time period when the conveyance control unit stops the driving portion while a second job of outputting the image data is executed by the output unit concurrently with reading of the document sheet by the reading unit, wherein the second time period is longer than the first time period.

US Pat. No. 10,462,315

IMAGE READING APPARATUS

KONICA MINOLTA, INC., Ch...

1. An image reading apparatus comprising:a light source that emits light in a main scanning direction to a subject to be read;
a light receiving unit that receives light reflected by the subject to be read; and
an optical system that images the light reflected by the subject to be read and guides the light to the light receiving unit,
the optical system including a reflector mirror of glass and an optical element of resin that is disposed adjacent to the reflector mirror and images the light reflected by the subject to be read,
the optical element being held by the reflector mirror.

US Pat. No. 10,462,314

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD

KONICA MINOLTA, INC., To...

1. An information processing apparatus capable of setting favorite function information in an image forming apparatus, the information processing apparatus comprising a processor, a display, a storage device, and a non-transitory storage medium, the non-transitory storage medium storing a program comprising a plurality of instructions which, when the processor executes the instructions, perform the following:collecting function setting item information from a plurality of models of image forming apparatuses;
selecting combinations of candidates of the function setting item information to be included in the favorite function information from the collected function setting item information and setting a specified function to each of the selected candidates of the function setting item information;
displaying availability in the plurality of models of image forming apparatuses, of each of the specified functions of the candidates of the selected function setting item information, wherein
responsive to selection of either required item information or recommended item information for each of the combinations of the candidates of the function setting item information, the displaying
does not display the combination of the candidates of the function setting item information, of an image forming apparatus in which the function setting item information specified as the required item information is unavailable, of the plurality of models of image forming apparatuses, and
changes the specified function of the function setting item information specified as the recommended item information to a specifiable function and displays the changed function, of an image forming apparatus in which the function setting item information specified as the recommended item information is unavailable, of the plurality of models of image forming apparatuses; and
storing, as favorite setting information to be set as the favorite function information in the plurality of models of image forming apparatuses, a combination of allowed candidates, of the combinations of the candidates of the function setting item information to which the displayed specified functions have been set.

US Pat. No. 10,462,313

RESTORING STATE OF OPERATION IN AN IMAGE FORMING APPARATUS

SHARP KABUSHIKI KAISHA, ...

1. An image forming apparatus having an image forming section that forms an image on a recording medium based on printing conditions, the apparatus comprising:state determination circuitry configured to determine a state of the image forming apparatus;
authentication circuitry configured to authenticate a user and permit login of the user to the image forming apparatus;
operation display circuitry configured to receive, via an operation screen, setting associated with the printing conditions from the user;
storage circuitry configured to store the printing conditions and a state of the operation screen in a case where the user is logged out from the image forming apparatus without performing a logout operation by the user; and
returning circuitry configured to read out the printing conditions and the state of the operation screen stored in the storage circuitry, and
return the image forming apparatus to the state at the time of logout in a case where the user is logged out without performing the logout operation and the state determination circuitry determines that a current state of the image forming apparatus is the same as the state of the image forming apparatus at the time of logout, when the authentication circuitry permits the login of the user, and
initialize the state of the image forming apparatus in a case where the user is logged out without performing the logout operation and the state determination circuitry determines that a current state of the image forming apparatus is different from the state of the image forming apparatus at the time of logout, when the authentication circuitry permits the login of the user.

US Pat. No. 10,462,312

INFORMATION PROCESSING APPARATUS FOR SETTING IMAGE-QUALITY ADJUSTMENT INFORMATION AND IMAGE FORMING APPARATUS MANAGEMENT SYSTEM INCLUDING THE SAME

SHARP KABUSHIKI KAISHA, ...

1. An information processing apparatus to be connected to an image forming apparatus in such a manner that communication is possible through a network, the image forming apparatus forming an image in accordance with information that is set therein, the information processing apparatus comprising:a controller; and
a storage; wherein
the controller obtains, through the network, first information having been set in a first image forming apparatus, and second information having been set in a second image forming apparatus that is an adjustment target,
the controller stores the first information and the second information in the storage,
the controller reads the first information and the second information from the storage, and creates third information for the second image forming apparatus based on the first information and the second information, and
the controller transmits the third information, which is different from the first information and the second information, through the network, to the second image forming apparatus such that the third information is set in the second image forming apparatus and the second image forming apparatus forms an image in accordance with the third information by transferring obtained image data onto a recording sheet.

US Pat. No. 10,462,311

COMMUNICATION APPARATUS, IMAGE CAPTURING APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A communication apparatus comprising a processor and a memory storing a program which, when executed by the processor, causes the communication apparatus to function as:a communication control unit configured to control a communication unit so as to connect to one of a plurality of external apparatuses;
a transfer unit configured to transfer a data item to an external apparatus to which the communication unit has connected;
a storage control unit configured to, in a case that a transfer of the data item to the external apparatus to which the communication unit has connected has failed, store transfer failure information in which the data item is associated with the external apparatus to which the communication unit has connected, in a storage unit; and
a transfer control unit configured to control, in a case that the communication unit has connected to a first external apparatus after a transfer of a data item by the transfer unit has failed, a transfer of a data item included in the transfer failure information,
wherein the transfer control unit performs control so as to, if a data item included in the transfer failure information is associated with the first external apparatus, automatically transfer the data item to the first external apparatus.
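
A minimal sketch of the transfer-failure bookkeeping and the automatic retry on reconnection is given below; the TransferManager class, the mocked transfer callable, and the list-based failure store are illustration-only assumptions.

    # Hypothetical sketch; names and the mocked transfer are assumptions.

    class TransferManager:
        def __init__(self, transfer):
            self.transfer = transfer        # callable(data_item, device) -> bool
            self.failures = []              # (data_item, device) pairs

        def send(self, data_item, device):
            if not self.transfer(data_item, device):
                # remember which item failed toward which external apparatus
                self.failures.append((data_item, device))

        def on_connected(self, device):
            # automatically retry only the items recorded against this device
            for data_item, dev in list(self.failures):
                if dev == device and self.transfer(data_item, dev):
                    self.failures.remove((data_item, dev))

    mgr = TransferManager(transfer=lambda item, dev: dev == "laptop")
    mgr.send("IMG_0001.JPG", "tablet")      # fails and is recorded against 'tablet'
    mgr.on_connected("laptop")              # nothing recorded for 'laptop'; no retry
    print(mgr.failures)                     # [('IMG_0001.JPG', 'tablet')]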

US Pat. No. 10,462,310

PULL-PRINT COMPATIBLE IMAGE FORMING SYSTEM, SERVER AND IMAGE FORMING METHOD

KYOCERA Document Solution...

1. An image forming system comprising a terminal, a server, and an image forming apparatus; whereinthe terminal comprises
a document-data-transmitting unit that transmits document data to the server as direct-output data that is directly outputted by the image forming apparatus, or as pull-output data that is stored in the server and outputted according to an instruction from the image forming apparatus;
the server comprises:
a document-data-receiving unit that receives the document data from the terminal;
a data-type-determining unit that determines whether document data that is received by the document-data-receiving unit is direct-output data or pull-output data;
a document-data-retransmitting unit that, when the document data is determined to be direct-output data by the data-type-determining unit, transmits the document data as is to the image forming apparatus;
a model-information-acquiring unit that, when an instruction for preview data is received from the image forming apparatus, acquires model information from the image forming apparatus;
a preview-creating unit that performs a plurality of rasterizations of the document data to create a plurality of respective preview data, wherein each of the plurality of rasterizations is performed according to different model information, and wherein the plurality of rasterizations are performed prior to receiving the instruction for preview data from the image forming apparatus; and
a preview-transmitting unit that transmits, according to model information of the image forming apparatus, preview data of the plurality of respective preview data created by the preview-creating unit to the image forming apparatus in response to the instruction for preview data being received from the image forming apparatus; and
the image forming apparatus comprises:
an input unit that acquires an instruction for selecting and outputting the document data that is pull-output data stored in the server;
a preview-data-acquiring unit that sends the instruction for preview data to the server and acquires the preview data of the document data that is selected by the input unit from the server;
a display unit that displays the preview data that is acquired by the preview-data-acquiring unit;
a document-data-acquiring unit that acquires the document data for which there is an output instruction by the input unit from the server, and also acquires the document data that is direct-output data from the server; and
a rasterizing unit that performs rasterization of the document data that is acquired by the document-data-acquiring unit, and creates output data.

US Pat. No. 10,462,309

SYSTEM AND METHOD FOR DIAGNOSING A PRINTING DEVICE BASED ON A CORRELATION COEFFICIENT BETWEEN PRINT VOLUME AND ERROR RATE

KYOCERA DOCUMENT SOLUTION...

7. A method to determine an operating status of at least one image forming device comprising:monitoring a plurality of operating parameters of the at least one image forming device during a first time interval, wherein monitoring the plurality of operating parameters of at least one image forming device comprises:
determining a total number of pages printed by the at least one image forming device during the first time interval; and
determining a total number of errors recorded by the at least one image forming device during the first time interval;
calculating a correlation coefficient of the at least one image forming device during a second time interval based on a total number of pages printed and a total number of errors recorded for the at least one image forming device during a plurality of data sampling intervals of the second time interval;
identifying the at least one image forming device as an abnormal operating status when the correlation coefficient is a positive correlation coefficient above a threshold level associated with a model of the at least one image forming device;
determining a maintenance action for the image forming device based on the abnormal operating status; and
wherein calculating the correlation coefficient comprises using Pearson's correlation coefficient defined as:

r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}\;\sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}}

wherein:
n is the sample size;
x_i is a single sample of the total number of pages printed, indexed with i;
y_i is a single sample of the errors recorded, indexed with i;
\bar{x} is the sample mean for the total number of pages printed; and
\bar{y} is the sample mean for the errors recorded.
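
A small sketch of the diagnostic test follows, assuming an illustrative threshold of 0.8 (the claim associates the actual threshold level with the model of the image forming device).

    from math import sqrt

    # Minimal sketch; the 0.8 threshold is an assumption.

    def pearson_r(xs, ys):
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mean_x) ** 2 for x in xs))
        sy = sqrt(sum((y - mean_y) ** 2 for y in ys))
        return cov / (sx * sy)

    def is_abnormal(pages_per_interval, errors_per_interval, threshold=0.8):
        r = pearson_r(pages_per_interval, errors_per_interval)
        return r > 0 and r > threshold       # positive correlation above threshold

    # Errors that grow with print volume suggest a volume-related fault:
    print(is_abnormal([100, 250, 400, 600], [1, 3, 5, 9]))   # True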

US Pat. No. 10,462,308

RIP TIME ESTIMATION METHOD

SCREEN HOLDINGS CO., LTD....

1. A method for estimating RIP time required for a RIP processing in a print data processing device configured to generate print data in a bitmap format by performing the RIP processing on submission data that is a file in a PDF format, the print data being to be sent to a printer, the file in the PDF format describing a print target by a page description language, the method comprising:specifying, as an original file in the PDF format, sample data for test printing performed before printing by the printer,
creating, by taking the original file or a duplicate of the original file as an initial file and analyzing the initial file, reuse information for allowing specification of whether a use state of each resource held in the initial file is a shared state of being used as a component in one file a plurality of times or a non-shared state of being used as a component in one file only once;
generating a page that constitutes a new file in the PDF format, by performing, based on the reuse information, duplication of a page in the initial file in such a way that the use state of each resource is same before the duplication and after the duplication, the new file having a larger number of pages than the original file; and
estimating the RIP time using the new file.

US Pat. No. 10,462,307

SYSTEM AND METHOD FOR MAINTAINING SHARING GROUPS IN A SERVICE DELIVERY SYSTEM

1. In a service delivery system, a computer-implemented method for maintaining membership in a sharing group associated with a sharable service, the method comprising:generating a user interface for displaying current membership in the sharing group;
receiving, via the user interface, an instruction to modify the current membership; automatically generating and inserting into a transaction table at least one pending modification transaction for the sharing group in accordance with the received instruction;
constructing a payload based on all pending modification transactions for the sharing group in the transaction table;
generating and inserting into a middleware messaging queue a message comprising the payload;
processing the message from the messaging queue to extract the payload;
modifying associations in a billing database between one or more member services and the sharing group in accordance with the payload extracted from the message; and
producing a success or failure message for each of the one or more member services in the payload based on success or failure of the modifying associations in the billing database.

US Pat. No. 10,462,306

MOBILE DEVICE USAGE OPTIMIZATION

vMOX, LLC, Roslyn Height...

1. A method, comprising:obtaining mobile device data for a plurality of mobile devices in an enterprise using a processor configured for any of: an application programming interface into a mobile device carrier portal, screen scraping a mobile device carrier website, and automated navigating of the mobile device carrier website to obtain usage reports of the plurality of mobile devices in the enterprise,
wherein the mobile device data is collected over a period of time and is indicative of mobile device data trends;
normalizing the mobile device data obtained for the plurality of mobile devices in the enterprise by the processor;
indexing a relational dataset of mobile device data created through the normalization;
utilizing the indexed relational dataset of mobile device data to compare previous optimized mobile device usage plans for one or more of the plurality of mobile devices to measure impact of compliance with previous optimization strategy by identifying a degree to which the previous optimization was effective;
segmenting the indexed relational dataset of mobile device data by device type to measure the impact by device type for one or more of the plurality of mobile devices to measure the impact of compliance with the previous optimization strategy by identifying the degree to which optimization was effective; and
analyzing each component of each category to exclude and/or adjust certain elements to normalize and quantify a difference between previous non-optimized plans and the previous optimized mobile device plans, to determine efficacy of the previous optimized mobile device plan.

US Pat. No. 10,462,305

WIRELESS ACCOUNT MANAGEMENT APPLICATION FOR A WIRELESS DEVICE

TracFone Wireless, Inc., ...

1. A method for displaying wireless service usage information and account information using an account management application operating on a wireless device, comprising:receiving, at the account management application operating on the wireless device and from a tracking module implemented by a wireless service provider, wireless service usage information for a wireless subscriber associated with the wireless device, the wireless service usage information including one or more units of wireless services that the wireless subscriber has used, the wireless service provider being different from the wireless device and a wireless network that provides wireless service to the wireless device;
receiving, at the account management application operating on the wireless device and from the wireless service provider over the wireless network, account information associated with the wireless subscriber associated with the wireless device that is stored in an account information database implemented by the wireless service provider;
receiving, at the account management application operating on the wireless device and from the tracking module implemented by the wireless service provider and from the wireless service provider over the wireless network, other wireless service usage information, the other wireless service usage information including one or more units of wireless services that the wireless subscriber has used;
enabling presentation, on a display of the wireless device, of the account management application including the wireless service usage information and the account information for the wireless subscriber associated with the wireless device;
receiving, at the account management application operating on the wireless device and from the wireless subscriber associated with the wireless device, an identifier of a prepaid card;
forwarding, from the account management application operating on the wireless device and to a service provider over the wireless network, the identifier of the prepaid card;
determining, at the service provider, a number of units of wireless service or a monetary value associated with the prepaid card; and
depositing, at the service provider, the number of units of wireless service or the monetary value associated with the prepaid card, in an account associated with the wireless subscriber.

US Pat. No. 10,462,304

SERVICE CONTROL POINT FUNCTIONALITY IMPLEMENTED AT COMMUNICATION ENDPOINTS

1. A system for establishing voice communication between a communications device and a contact center, the system comprising:a processor of the communications device; and
a memory of the communications device, wherein the memory stores instructions that, when executed by the processor, cause the processor to:
identify a first number of a destination to be dialed;
determine that the first number satisfies a particular criteria;
in response to determining that the first number satisfies the particular criteria, automatically transmit a request to a server over a data link, the request including geographic location information of the communications device, wherein in response to the request, the server is configured to:
lookup the first number and identify a related second number;
determine, based on the geographic location of the communications device, whether the identified second number is within a local dialing range of the destination; and
in response to determining that the identified second number is within the local dialing range, return the identified second number;
receive the identified second number from the server; and
initiate a voice call to the second number instead of the first number.
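
The server-side lookup could be sketched as below. The number table, the use of the caller's area code as a stand-in for the geographic location information, and the "same area code means within the local dialing range" test are all assumptions for illustration.

    # Hypothetical sketch; number table and the area-code test are assumptions.

    NUMBER_MAP = {
        # toll-free contact-center number -> regional direct numbers
        "+18005550100": ["+12125550150", "+13105550160"],
    }

    def area_code(number):
        return number[2:5]                   # assumes "+1NXXNXXXXXX" formatting

    def lookup_second_number(first_number, caller_area_code):
        for candidate in NUMBER_MAP.get(first_number, []):
            if area_code(candidate) == caller_area_code:
                return candidate             # within the local dialing range
        return None                          # fall back to the first number

    print(lookup_second_number("+18005550100", "212"))   # +12125550150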

US Pat. No. 10,462,303

SURROGATE CELLULARLESS ROAMING

MOHAMMED HAMAD AL HAJRI, ...

1. A method, comprising:establishing, by a surrogate cellular device, radio communication with a cellular base station associated with a home location in a private cellular network, the radio communication specifying a cellular identifier associated with a subscriber identity module installed in the surrogate cellular device;
converting, by the surrogate cellular device, the radio communication into messages utilizing an Internet protocol;
performing a cloud-based service that provides routing information for cellularless roaming;
determining an Internet protocol address associated with a SIMless mobile device, the SIMless mobile device having the subscriber identity module removed therefrom and installed in the surrogate cellular device; and
sending, by the surrogate cellular device, the messages via the public Internet in response to the cloud-based service, the messages sent to the SIMless mobile device associated with the Internet protocol address, the messages providing cellularless roaming outside the home location in the private cellular network.

US Pat. No. 10,462,302

METHOD FOR ESTABLISHING THE ROUTING, IN PARTICULAR FORWARDING OF AN OBJECT OF A COMMUNICATIONS ACTIVITY, AND DEVICES FOR CARRYING OUT SAID METHOD

1. A method for executing routing of an object of a communication activity, which is sent from a sender terminal device to a first user terminal device of a first user via a first data connection and/or telephone connection so that the object of the communication activity is routable to a second user terminal device of a second user via a second data connection and/or telephone connection when the object is received within a first pre-selected time period and prior to authorization of a routing change that approves of the routing of the object to the second user terminal device, the method comprising:prior to the object of the communication activity being sent, defining the second data connection and/or telephone connection with respect to the first data connection and/or telephone connection via the first user terminal device so that the object is routable to the second user terminal device;
in response to the defining of the second data connection and/or telephone connection, immediately implementing a routing change so that the object is routable to the second user terminal device via the second data connection and/or telephone connection before approval of the routing change is received from a second user of the second user terminal device;
sending a first notification regarding the defining of the second data connection and/or telephone connection for authorization of the routing change;
deactivating the routing change in response to (i) a second notification that responds to the first notification to deny authorization for the routing change or (ii) a non-receipt of a second notification that approves the routing change within a first pre-selected time period, the deactivating occurring to prevent any subsequent routing of communication objects to the second user terminal device.

US Pat. No. 10,462,301

CALL INTENT NOTIFICATION FOR ESTABLISHING A CALL

MICROSOFT TECHNOLOGY LICE...

1. A system comprising:a processing apparatus; and
memory storing code that is executable by the processing apparatus to perform operations including:
receiving an indication that a first user performs an action on a first user terminal to initiate a call with a second user of a second user terminal and via a communication service;
determining one or more of that the second user terminal is currently unavailable to answer a call through said communication service, or that the second user is currently unavailable to answer a call through said communication service;
sending, automatically and responsive to said determining, a call intent notification to the second user terminal via an out-of-band communication channel that is separate from the communication service; and
including in the notification a Public Switched Telephone Network (PSTN) number enabling the second user terminal to initiate a PSTN call with the first user terminal using the PSTN number and via a connection between the first and second user terminals over a PSTN network.

US Pat. No. 10,462,300

TECHNOLOGIES FOR MONITORING INTERACTION BETWEEN CUSTOMERS AND AGENTS USING SENTIMENT DETECTION

1. A method for assigning a video call from a customer from a service queue to an agent, utilizing sentiment detection from interactions between customers and agents, the method comprising:receiving, by an interaction management computing device, a video call from a customer;
performing, by the interaction management computing device, a facial recognition analysis of the customer based on images of the customer received with the video call;
performing, by the interaction management computing device and subsequent to a determination that an agent is available to receive the video call inserted into a service queue, a facial recognition analysis of the agent based on images of a video stream captured of the agent;
determining, by the interaction management computing device, a probable emotional state of the customer as a function of the facial recognition analysis of the customer;
determining, by the interaction management computing device and subsequent to the determination that the agent is available to receive the video call inserted into a service queue, a present emotional state of the agent as a function of the facial recognition analysis of the agent; and
determining, by the interaction management computing device, whether to transfer the video call to the agent as a function of the determined probable emotional state of the customer and the determined present emotional state of the agent.

US Pat. No. 10,462,299

METHOD AND SYSTEM FOR CREATING CONTACT CENTER MODELS

1. A method for creating contact center models by a network configuration optimization system, through derivation and optimization of model parameters using historical data, comprising the steps of:a. using the network configuration optimization system to analyze historical automatic communication distributor data of a contact center, wherein said historical automatic communication distributor data is analyzed by an extracting module for use in constructing a plurality of analytic models;
b. using the network configuration optimization system to construct the plurality of analytic models using predetermined input metrics for a plurality of corresponding queues, wherein a number of the plurality of analytic models corresponds to a number of the plurality of corresponding queues and each analytic model simulates a behavior of the corresponding queue, and optimizing parameters of each of the plurality of analytic models by analysis of the historical automatic communication distributor data by generating performance tables for each of the plurality of analytic models representing a change in a targeted performance metric in response to the addition of the agent of the network of the contact center;
c. using the network configuration optimization system to apply the optimized parameters, by the network configuration optimization system, to create a contact center model from the plurality of analytic models using the optimized parameters; and
d. using the network configuration optimization system, to apply the contact center model to compute forecasted staffing requirements by supplying forecasted inputs to the contact center model.

US Pat. No. 10,462,298

INTERACTIVE USER INTERFACE FOR PROFILE MANAGEMENT

eBay Inc., San Jose, CA ...

1. A system comprising:a display device; and
one or more processors and a memory including instructions, which when executed by the one or more processors, cause the one or more processors to perform operations comprising:
accessing, from a data structure, a user profile of a user, the user profile comprising a profile identifier and a plurality of data fields at least some of which contain information associated with the user;
establishing an audible communication with a service provider during which information in the user profile can be requested;
displaying, on the display device, an interactive user interface during at least a portion of the audible communication, the interactive user interface comprising:
a plurality of user interface action elements to enable requested information to be sent to the service provider during the audible communication, wherein each of the plurality of user interface action elements is:
associated with a data field in the data structure; and
individually selectable by a respective user action of the user to transmit the associated data field to a remote device associated with the service provider.

US Pat. No. 10,462,297

SYSTEM AND METHOD FOR AUTOMATED DETERMINING WHEN TO REVIEW AN AGENT RESPONSE PROCESS

Avaya Inc., Santa Clara,...

1. A method, comprising:receiving, by a transceiver of an oversight device, first data corresponding to a request received by a contact center, the first data including a type of the request, wherein the type of the request comprises an issue or a service of a plurality of issues or services handled by the contact center;
in response to receiving the first data, automatically determining, by a processor of the oversight device, a set of rules corresponding to the type of the request, the set of rules indicating at least one required step to be performed by an agent for the type of the request;
receiving, by the transceiver of the oversight device, second data corresponding to a response process used by the agent in generating a response to the request, the response process including at least one performed step performed by the agent for the request;
in response to receiving the second data, automatically determining, by a processor of the oversight device, whether the at least one required step is included in the at least one performed step of the response process;
upon determining that the at least one required step is not included in the at least one performed step:
forwarding, by the transceiver of the oversight device, the response to a supervisor device for review instead of transmitting the response to a user device that transmitted the request; and
after forwarding the response to the supervisor, receiving an override indication from the supervisor device to transmit the response to the user device.

US Pat. No. 10,462,296

SYSTEM AND METHOD FOR PROCESSING HIGH FREQUENCY CALLERS

1. A computer method for processing calls, comprising:providing a memory including the contents and frequency of previous telephonic conversations from a caller, and a controller programmed to:
determine reception of a telephonic call from a caller having one or more accounts associated with the caller;
utilize predictive analytics to determine a predicted question to be requested from the caller utilizing data correlating to previous conversations associated with the caller stored in memory;
determine an answer to the predicted question of the caller consisting of data from the one or more accounts associated with the caller; and
determine a time frequency to provide a notification to the detected caller containing the determined answer to the predicted question of the caller without necessitating the caller to initiate future telephonic calls to the communications interface regarding the predicted question utilizing the stored contents and frequency of previous telephonic calls associated with the caller.

US Pat. No. 10,462,295

COMMUNICATION ATTEMPTS MANAGEMENT SYSTEM FOR MANAGING A PREDICTIVE DIALER IN A CONTACT CENTER

NOBLE SYSTEMS CORPORATION...

1. A system for managing voice telephone calls originated by a predictive dialer in a contact center, comprising:a communication attempt management system (“CAMS”) comprising a first processor and a memory, wherein the first processor is configured to:
receive a communications list comprising a plurality of records, each record comprising identification data of an account involving a debt, a telephone number of an individual associated with the account and responsible for the debt, wherein each record reflects a planned communication attempt to be made by the predictive dialer to the telephone number within the next twenty-four hours;
retrieve from the memory communication attempt occurrence data for a current week for each account identified in the plurality of records;
determine for each account identified in the plurality of records in the communications list whether a number of communication attempts, including non-voice communication attempts, made to the individual associated with the account during the current week meets or exceeds a weekly communication attempt threshold;
update the communication attempt occurrence data in the memory for each account identified in the plurality of records in which the number of communication attempts does not meet or exceed the weekly communication attempt threshold during the current week to reflect the occurrence of the planned communication attempt;
generate a modified communications list comprising a subset of the plurality of records, wherein the number of communication attempts for each account identified by the records in the subset does not meet or exceed the weekly communication attempt threshold; and
transmit the modified communications list to the predictive dialer; and
the predictive dialer comprising a second processor configured to:
receive the modified communications list from the CAMS;
select a particular record from the subset of the plurality of records, the particular record comprising a particular telephone number and associated with a particular account;
originate a voice call to the particular telephone number; and
connect the voice call to an agent in response to the voice call being answered.
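
A simplified sketch of the CAMS filtering step is shown below; the record layout, the weekly threshold value of 7, and the in-memory attempt counts are assumptions, and the predictive-dialer side is omitted.

    # Simplified sketch; record layout and threshold value are assumptions.

    WEEKLY_THRESHOLD = 7

    def build_modified_list(records, weekly_attempts):
        """records: dicts with 'account' and 'phone' keys (the planned attempts).
        weekly_attempts: account id -> voice and non-voice attempts this week."""
        modified = []
        for record in records:
            account = record["account"]
            if weekly_attempts.get(account, 0) >= WEEKLY_THRESHOLD:
                continue                     # cap reached: drop the record
            # record the planned attempt and keep the record for the dialer
            weekly_attempts[account] = weekly_attempts.get(account, 0) + 1
            modified.append(record)
        return modified

    records = [{"account": "A-1", "phone": "+15550100"},
               {"account": "A-2", "phone": "+15550101"}]
    print(build_modified_list(records, {"A-1": 7}))   # only the A-2 record remains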

US Pat. No. 10,462,294

METHOD AND APPARATUS FOR PROCESSING A COMMUNICATION REQUEST FROM A ROAMING VOICE OVER IP TERMINAL

1. A method comprising:detecting, by a controller of a gateway, a request from a voice over internet protocol terminal to communicate with an emergency response center without the voice over internet protocol terminal querying the gateway for a media access control address for the gateway, wherein the request comprises an identification of the emergency response center and an identification of the voice over internet protocol terminal;
inserting, by the controller of the gateway, in response to the detecting the request from the voice over internet protocol terminal to communicate with the emergency response center, the media access control address for the gateway into the request to thereby form a modified request; and
transmitting, by the controller of the gateway, directly to a network proxy, the modified request with the media access control address for the gateway for enabling communications between the voice over internet protocol terminal and the emergency response center, wherein the gateway remains in a fixed location, wherein a location of the voice over internet protocol terminal is capable of being determined using a location of the gateway determined from the media access control address for the gateway, wherein the media access control address for the gateway and the identification of the voice over internet protocol terminal indicate whether the voice over internet protocol terminal is roaming outside of a home network.

US Pat. No. 10,462,293

PERSONALIZED AUDIO/VIDEO INVITATIONS FOR PHONE CALLS

Mobiline, Inc., Dover, D...

1. A method for providing the identity of a call initiator prior to acceptance of a call by a call recipient, comprising:prior to initiating the call, the call initiator's communication device recording a personalized audio or video invitation from the call initiator and, upon completion of the recording, the call initiator's communication device initiating the call by providing a call request message including the personalized audio or video invitation from the call initiator and call completion information needed to provide real-time communication with the call recipient's communication device prior to and after acceptance of the call;
the call initiator receiving from the call recipient an indication of the progress of the review of the personalized audio or video invitation by the call recipient;
initiating a timer on the call initiator's communication device that starts counting upon receipt of the indication that the call recipient has started reviewing the personalized audio or video invitation; and
upon receipt of a notification from the call recipient that the call is accepted pursuant to review of the personalized audio or video invitation by the call recipient, enabling a live voice connection between the call initiator and call recipient using the call completion information.
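
The initiator-side flow can be summarized in a small sketch: send the call request with the recorded invitation, start a timer when the recipient begins reviewing it, and open the live connection on acceptance. The message names and timer handling here are assumptions for illustration.

    # Sketch of the call-initiator state, assuming hypothetical progress events.
    import time

    class InitiatorCall:
        def __init__(self, invitation_media, call_completion_info):
            self.request = {"invitation": invitation_media,
                            "completion_info": call_completion_info}
            self.review_started_at = None

        def on_progress(self, event):
            if event == "review_started" and self.review_started_at is None:
                self.review_started_at = time.monotonic()    # start the timer

        def on_accepted(self):
            waited = time.monotonic() - (self.review_started_at or time.monotonic())
            print(f"reviewed for {waited:.1f}s; opening live voice connection")
            return self.request["completion_info"]           # used to connect the call

    call = InitiatorCall(b"<recorded invitation>", {"sdp": "offer"})
    call.on_progress("review_started")
    call.on_accepted()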

US Pat. No. 10,462,292

ANTI-SPOOFING TECHNIQUES FOR OUTBOUND TELEPHONE CALLS

Republic Wireless, Inc., ...

1. A method of verifying a caller ID field of an outbound telephone call, the method comprising:receiving a query from an inbound carrier call server in a termination service provider call server, the query including a caller ID telephone number associated with the outbound telephone call received by the inbound carrier call server where the outbound telephone call identified the termination service provider call server as the source of the outbound telephone call;
determining whether the caller ID telephone number in the received query is currently in use;
if the caller ID telephone number is not currently in use, returning a fail message to the inbound carrier call server;
if the caller ID telephone number is currently in use and has been for greater than a predetermined amount of time, returning a fail message to the inbound carrier call server; and
if the caller ID telephone number is currently in use and has been for less than the predetermined amount of time, returning a pass message to the inbound carrier call server.
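
The pass/fail rule in this claim maps directly onto a small predicate; the lookup of active calls and the length of the time window below are assumptions, not values from the patent.

    # Sketch of the verification decision, assuming an in-memory table of active calls.
    from datetime import datetime, timedelta

    MAX_AGE = timedelta(minutes=2)       # hypothetical "predetermined amount of time"

    def verify_caller_id(caller_id, active_calls):
        started_at = active_calls.get(caller_id)
        if started_at is None:
            return "fail"                                # not currently in use
        if datetime.utcnow() - started_at >= MAX_AGE:
            return "fail"                                # in use, but for too long
        return "pass"                                    # in use and recent

    active = {"+15551230001": datetime.utcnow() - timedelta(seconds=30)}
    print(verify_caller_id("+15551230001", active))      # pass
    print(verify_caller_id("+15559999999", active))      # fail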

US Pat. No. 10,462,291

SHARED GROUP NUMBER

T-Mobile USA, Inc., Bell...

1. A method comprising:receiving, by a communication server, a call addressed to a shared group number associated with a group including a plurality of member user equipments (UEs);
accessing, by the communication server, a group database containing information about the plurality of member UEs, each of the plurality of member UEs being associated with the shared group number and at least one of the plurality of member UEs being associated with a phone number different than the shared group number;
identifying, by the communication server, one or more active member UEs from the plurality of member UEs based on status information in the group database, the status information identifying each of the plurality of member UEs as one of active or inactive;
forwarding, by the communication server, the call to the one or more active member UEs;
receiving, by the communication server, a text message addressed to the shared group number; and
forwarding, by the communication server, the text message to at least one of the plurality of member UEs, without regard to the status information in the group database.
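
The forwarding rules reduce to two small lookups: voice calls go only to member UEs marked active, while text messages fan out to members regardless of status. The group-database schema below is an assumption for illustration.

    # Sketch of the routing logic over a hypothetical group database.
    group_db = {
        "+15550100": {                     # shared group number
            "ue-1": {"status": "active",   "number": "+15550101"},
            "ue-2": {"status": "inactive", "number": "+15550102"},
            "ue-3": {"status": "active",   "number": None},
        }
    }

    def route_call(shared_number):
        members = group_db[shared_number]
        return [ue for ue, info in members.items() if info["status"] == "active"]

    def route_text(shared_number):
        return list(group_db[shared_number])     # all members, status ignored

    print(route_call("+15550100"))   # ['ue-1', 'ue-3']
    print(route_text("+15550100"))   # ['ue-1', 'ue-2', 'ue-3']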

US Pat. No. 10,462,289

METHODS AND SYSTEMS FOR MANAGEMENT OF MEDIA CONTENT ASSOCIATED WITH MESSAGE CONTEXT ON MOBILE COMPUTING DEVICES

Vyng, Inc., Santa Monica...

1. A system comprising: a sender computing device configured with a sender-controlled contact media content-based application (SCCMC application), the SCCMC application configured to execute on the sender computing device and to interface with a sender state detection application that executes on the sender computing device, wherein the sender state detection application is configured to process content of a communication of a sender to a recipient and to determine a meaning and a context of the communication for determining a detected state of the sender that includes at least one of a mood of the sender, a physical status of the sender, an emotional state of the sender and a mental state of the sender, and wherein the SCCMC application is further configured to use the detected state of the sender to determine a sender-controlled contact media content structure (SCCMC structure) that corresponds to the detected state and to associate information that identifies the determined SCCMC structure with an outgoing message for the recipient.
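
A toy sketch of the state-to-content mapping follows; the keyword-based state detector, the state labels, and the SCCMC identifiers are all illustrative assumptions, not the analysis the patent contemplates.

    # Sketch: detect a sender state from message text and attach a matching
    # media-content identifier to the outgoing message.
    def detect_state(message_text):
        text = message_text.lower()
        if any(word in text for word in ("great", "awesome", ":)")):
            return "happy"
        if any(word in text for word in ("late", "stuck", "sorry")):
            return "stressed"
        return "neutral"

    SCCMC_BY_STATE = {
        "happy": "sccmc:confetti-video",
        "stressed": "sccmc:calm-audio",
        "neutral": "sccmc:default-card",
    }

    def attach_sccmc(outgoing_message):
        state = detect_state(outgoing_message["body"])
        outgoing_message["sccmc_id"] = SCCMC_BY_STATE[state]   # associate the identifier
        return outgoing_message

    print(attach_sccmc({"to": "recipient", "body": "Running late, sorry!"}))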

US Pat. No. 10,462,288

METHOD, DEVICE, AND SYSTEM OF PROVIDING CALLER IDENTIFICATION INFORMATION TO A USER OF A WIRELESS DEVICE

TracFone Wireless, Inc., ...

1. A method for displaying caller identification information within a wireless device, the method comprising:receiving, at a wireless device, a calling party phone number to be displayed by the wireless device;
searching a wireless device memory in the wireless device for the calling party phone number and a corresponding caller identification;
determining with the wireless device that the calling party phone number and a corresponding caller identification are not stored in the wireless device memory;
performing with the wireless device, in response to the determination that the calling party phone number and the corresponding caller identification are not stored in the wireless device memory, an external database search by connecting the wireless device to the Internet to search for the calling party phone number and the corresponding caller identification;
determining with the wireless device whether the calling party phone number and the corresponding caller identification have been identified in the external database search;
displaying on the wireless device, when it is determined that the calling party phone number and the corresponding caller identification have been identified in the external database search, the calling party phone number and caller identification;
displaying on the wireless device, when it is determined that the calling party phone number and the corresponding caller identification have not been identified in the external database search, an indication that the calling party phone number and caller identification are unavailable;
wherein the determining that the calling party phone number and the corresponding caller identification have been identified in the external database search further comprises:
displaying a web page that includes web page information on a display of the wireless device where the caller identification is found; and
providing an option to a user to add the calling party phone number identified in the external database search to a “do not answer list” stored in the wireless device,
wherein the searching the wireless device memory with the wireless device for the calling party phone number and a corresponding caller identification, the determining with the wireless device that the calling party phone number and a corresponding caller identification are not stored in the wireless device memory, the performing of an external database search, the determining whether the calling party phone number and the corresponding caller identification have been identified in the external database search, and the displaying on the wireless device of the calling party phone number and caller identification are performed by a processor of the wireless device.
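
The lookup order recited here, local memory first, then an external search, then either a result or an "unavailable" notice with an optional do-not-answer addition, can be sketched as below; the stubbed external search and the data layout are assumptions.

    # Sketch of the fallback lookup, with the web search replaced by a stub.
    local_contacts = {"+15550123": "Alice Example"}
    do_not_answer = set()

    def external_lookup(number):
        return {"+15550456": "Acme Support"}.get(number)   # stand-in for the web search

    def resolve_caller(number, add_to_do_not_answer=False):
        name = local_contacts.get(number)      # 1. search the wireless device memory
        if name is None:
            name = external_lookup(number)     # 2. search the external database
        if name is None:
            return f"{number}: caller identification unavailable"
        if add_to_do_not_answer:
            do_not_answer.add(number)          # 3. optional "do not answer list"
        return f"{number}: {name}"

    print(resolve_caller("+15550456", add_to_do_not_answer=True))
    print(resolve_caller("+15550789"))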

US Pat. No. 10,462,287

AUTOMATIC CALLER IDENTIFICATION TRANSLATION

Fraunhofer-Gesellschaft z...

1. A user voice communication device comprising:a microphone for recording of voice signals;
a loudspeaker for reproduction of voice signals;
an input device configured for controlling operations of the user voice communication device by user actions;
a display device configured to present information in visual form; and
a receiving unit configured to receive the interrogated information from an interrogator for acquiring information associated with a caller identification transmitted within an incoming telephone call, and to forward the interrogated information;
wherein the interrogator comprises:
a receiving device configured to receive the incoming telephone call and to extract the caller identification from the incoming telephone call,
an interrogating device configured to receive the extracted caller identification from the receiving device and to interrogate information associated with the caller identification from an external database, which is configured to operate independently from the user voice communication device for which the incoming telephone call is intended, and
a forwarding device configured to receive and to forward the interrogated information;
wherein the forwarding device is configured to forward the associated information to a remote user voice communication device for which the telephone call is intended, over at least one third communication channel, in particular over at least one third packet-based communication channel, connecting the interrogator and the remote user voice communication device;
wherein the display device is configured to present the interrogated information forwarded by the receiving unit when receiving the incoming telephone call;
wherein the user voice communication device is configured in such a way that it is selectable, by a respective user action at the input device, whether the interrogated information, information from the contact list related to the caller identification, or a mixture of both is presented at the display device when receiving the incoming telephone call.
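
The interrogation path can be pictured with a short sketch: extract the caller identification, query an external database that is independent of the called device, forward the result, and let the user choose what the display shows. The database contents and the presentation options below are assumptions.

    # Sketch of the interrogator and the selectable presentation.
    EXTERNAL_DB = {"+495551234": {"name": "Example GmbH", "city": "Munich"}}

    def interrogate(incoming_call):
        caller_id = incoming_call["caller_id"]          # extracted from the call
        info = EXTERNAL_DB.get(caller_id, {})           # external database lookup
        return {"caller_id": caller_id, "info": info}   # forwarded to the device

    def present(interrogated, contact_list, prefer="mixture"):
        local = contact_list.get(interrogated["caller_id"], {})
        if prefer == "external":
            return interrogated["info"]
        if prefer == "contacts":
            return local
        return {**interrogated["info"], **local}        # mixture of both

    result = interrogate({"caller_id": "+495551234"})
    print(present(result, {"+495551234": {"name": "HQ"}}))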

US Pat. No. 10,462,286

SYSTEMS AND METHODS FOR DERIVING CONTACT NAMES

Vonage Business, Inc., A...

1. A method of determining contact information for a first party when the first party initiates a telephony communication with a second party, comprising:recording audio of at least an initial portion of a telephony communication between the first party and the second party;
analyzing at least a portion of the recorded audio to determine contact information associated with the first party; and
inserting the contact information associated with the first party into a call log entry associated with the telephony communication within a call log that is maintained for the second party.
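
A minimal sketch of the idea follows; the transcription step is stubbed out because the claim names no speech-recognition method, and the greeting pattern used to pull a name out of the transcript is an assumption.

    # Sketch: mine the opening seconds of a call for a self-introduction and log it.
    import re

    def transcribe(initial_audio):
        return "Hi, this is Dana Smith calling from Example Corp."   # hypothetical stub

    def derive_contact_name(initial_audio):
        text = transcribe(initial_audio)
        match = re.search(r"this is ([A-Z][a-z]+(?: [A-Z][a-z]+)?)", text)
        return match.group(1) if match else None

    def log_call(call_log, first_party_number, initial_audio):
        call_log.append({"number": first_party_number,
                         "name": derive_contact_name(initial_audio)})

    log = []
    log_call(log, "+15550321", b"<audio>")
    print(log)   # [{'number': '+15550321', 'name': 'Dana Smith'}]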

US Pat. No. 10,462,285

SYSTEM AND METHOD FOR AUTHENTICATING CALLED PARTIES OF INDIVIDUALS WITHIN A CONTROLLED ENVIRONMENT

1. A call processing platform for processing telecommunications of a controlled environment, comprising:one or more processors and/or circuits configured to:
register, by the one or more processors and/or circuits, an application running on an external device;
handshake with the application;
receive a call attempt to the external device placed by a member of the controlled environment;
receive a device identification of the external device, a telephone number associated with the external device, and personal verification information of a user of the external device via the application;
delay connection of the call attempt so as to register the external device by storing the personal verification information, the telephone number, and the device identification such that the personal verification information, the telephone number, and the device identification are associated with each other;
process a call associated with the call attempt after completion of the registering of the external device; and
communicate with the application after the external device has been registered.
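
The delayed-connection registration can be sketched as below; the field names and the in-memory registry are assumptions, the point being only that the call is held until the device identification, telephone number, and personal verification information are stored together.

    # Sketch of registering an external device before processing the call attempt.
    registry = {}   # device_id -> registration record

    def handle_call_attempt(call_attempt, app_payload):
        device_id = app_payload["device_id"]
        if device_id not in registry:
            # Delay the connection: register the external device first.
            registry[device_id] = {
                "telephone_number": app_payload["telephone_number"],
                "personal_verification": app_payload["personal_verification"],
            }
        # Registration is complete (or already on file): process the call.
        return {"status": "connected",
                "to": registry[device_id]["telephone_number"],
                "from_member": call_attempt["member_id"]}

    print(handle_call_attempt(
        {"member_id": "resident-7"},
        {"device_id": "phone-abc", "telephone_number": "+15550700",
         "personal_verification": {"pin_hash": "<hash>"}}))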

US Pat. No. 10,462,284

CATEGORY-BASED FENCE

Apple Inc., Cupertino, C...

1. A method comprising:detecting, by a mobile device, a signal from a signal source at a particular location;
determining that a signal source identifier included in the signal matches a category identifier corresponding to a category-based fence, the category-based fence being a location-agnostic virtual fence corresponding to a group of signal sources, the category identifier being common to the group of signal sources;
determining a number of times the signal is detected at the particular location;
upon determining that the signal source identifier matches the category identifier corresponding to the category-based fence and the number of times the signal is detected is at least equal to a threshold number, notifying an application subsystem of the mobile device that the mobile device has entered the category-based fence; and
in response to the notification, activating, on the mobile device, an application program corresponding to the category-based fence.
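
The entry test is a match on a shared category identifier plus a per-location detection count reaching a threshold, which the sketch below mirrors; the identifier format, the prefix match, and the threshold value are assumptions.

    # Sketch of a category-based (location-agnostic) fence entry check.
    from collections import defaultdict

    CATEGORY_ID = "coffee-shop"       # identifier common to the group of signal sources
    THRESHOLD = 3                     # hypothetical minimum detection count

    detections = defaultdict(int)     # (category, location) -> times detected

    def on_signal(signal_source_id, location, notify_app):
        if not signal_source_id.startswith(CATEGORY_ID):
            return                                       # not part of this category fence
        detections[(CATEGORY_ID, location)] += 1
        if detections[(CATEGORY_ID, location)] >= THRESHOLD:
            notify_app(f"entered category fence '{CATEGORY_ID}' near {location}")

    for _ in range(3):
        on_signal("coffee-shop:beacon-17", (37.33, -122.03), notify_app=print)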

US Pat. No. 10,462,283

GEO-FENCING IN A BUILDING AUTOMATION SYSTEM

ADEMCO INC., Golden Vall...

1. A non-transitory computer-readable storage medium with an executable program stored thereon, wherein the executable program instructs a mobile device having location services and a communication interface for communicating with a remote server to perform the following:save information pertaining to a geo-fence, wherein the geo-fence is associated with a user of the mobile device and defines a boundary of a fixed location of a building;
save a previous geo-fence state of the mobile device both locally on the mobile device and remotely on the remote server, wherein the previous geo-fence state is based on a previous location of the mobile device, and wherein the previous geo-fence state is selected from at least an inside geo-fence state in which the previous location of the mobile device is determined to be inside of the boundary of the fixed location of the building and an outside geo-fence state in which the previous location of the mobile device is determined to be outside of the boundary of the fixed location of the building;
identify a current location of the mobile device via the location services;
determine a current geo-fence state of the mobile device based on the current location of the mobile device, wherein the current geo-fence state is selected from at least the inside geo-fence state in which the current location of the mobile device is determined to be inside of the boundary of the fixed location of the building and the outside geo-fence state in which the mobile device is determined to be outside of the boundary of the fixed location of the building;
compare the current geo-fence state with the previous geo-fence state;
when the current geo-fence state fails to match the previous geo-fence state, communicate the current geo-fence state to the remote server to update the previous geo-fence state to reflect the current geo-fence state; and
when the current geo-fence state matches the previous geo-fence state, refrain from communicating the current geo-fence state to the remote server.
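
The report-only-on-change behavior can be sketched as below; the circular boundary test, planar coordinates in metres, and the callback to the server are assumptions used only to make the comparison concrete.

    # Sketch: keep the previous geo-fence state locally and contact the server
    # only when the freshly computed state differs from it.
    import math

    INSIDE, OUTSIDE = "inside", "outside"

    def geofence_state(location, center, radius_m):
        return INSIDE if math.dist(location, center) <= radius_m else OUTSIDE

    class GeofenceClient:
        def __init__(self, center, radius_m, send_to_server):
            self.center, self.radius_m = center, radius_m
            self.previous_state = OUTSIDE            # saved locally and on the server
            self.send_to_server = send_to_server

        def on_location(self, location):
            current = geofence_state(location, self.center, self.radius_m)
            if current != self.previous_state:
                self.send_to_server(current)         # update the remote copy
                self.previous_state = current
            # matching states: refrain from communicating

    client = GeofenceClient((0.0, 0.0), 100.0, send_to_server=print)
    client.on_location((10.0, 10.0))    # inside  -> reported
    client.on_location((20.0, 20.0))    # inside  -> not reported
    client.on_location((500.0, 0.0))    # outside -> reported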

US Pat. No. 10,462,282

ESTIMATING THE ELEVATION OF A WIRELESS TERMINAL BASED ON DETERMINING THE MEASUREMENT BIAS OF A PRESSURE REFERENCE

Polaris Wireless, Inc., ...

1. A method of estimating elevation of one or more wireless terminals, the method comprising:receiving, by a data processing system, a first series of measurements of barometric pressure made at a first pressure reference, wherein the first pressure reference is at a first outdoor location and at an unknown height above a reference level, the unknown height being unknown to the data processing system;
receiving, by the data processing system, a second series of measurements of barometric pressure made at a second pressure reference, wherein the second pressure reference is at a second outdoor location and at a first known height above the reference level, the first known height being known to the data processing system;
generating, by the data processing system, a first estimate of bias of barometric pressure measured by the first pressure reference based on:
(i) the first series of measurements of the barometric pressure made at the first pressure reference,
(ii) the second series of measurements of the barometric pressure made at the second pressure reference, and
(iii) the first known height of the second pressure reference;
receiving, by the data processing system, a first measurement of the barometric pressure made at a first wireless terminal;
receiving, by the data processing system, a subsequent measurement of barometric pressure made at the first pressure reference, wherein the subsequent measurement of the barometric pressure is separate from the first series of measurements and is received after the first series of measurements is received;
generating, by the data processing system, an estimate of the elevation of the first wireless terminal based on:
(i) the first measurement of the barometric pressure made at the first wireless terminal,
(ii) the subsequent measurement of the barometric pressure made at the first pressure reference, and
(iii) the first estimate of the bias.
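
The two estimation steps lend themselves to a compact numerical sketch under an isothermal-atmosphere assumption. The scale height, the averaging, and the way the bias term here also absorbs the first reference's unknown height are simplifying assumptions for illustration, not the estimator claimed in the patent.

    # Simplified sketch: estimate the first reference's bias from the known-height
    # reference, then estimate a terminal's elevation above the reference level.
    import math

    SCALE_HEIGHT_M = 8400.0   # approximate pressure scale height of the lower atmosphere

    def to_reference_level(pressure_hpa, height_m):
        # Reduce a reading taken at height_m to the reference level (isothermal model).
        return pressure_hpa * math.exp(height_m / SCALE_HEIGHT_M)

    def estimate_bias(p1_series, p2_series, h2_known_m):
        # Mean offset of the unknown-height reference from the reference-level
        # pressure implied by the known-height reference.
        ref_level = [to_reference_level(p2, h2_known_m) for p2 in p2_series]
        return sum(p1 - r for p1, r in zip(p1_series, ref_level)) / len(p1_series)

    def estimate_elevation(p_terminal, p1_subsequent, bias):
        corrected = p1_subsequent - bias             # bias-corrected reference pressure
        return SCALE_HEIGHT_M * math.log(corrected / p_terminal)

    bias = estimate_bias([1019.8, 1019.6], [1010.0, 1009.8], h2_known_m=80.0)
    print(round(estimate_elevation(1013.0, 1019.7, bias), 1))   # elevation in metres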

US Pat. No. 10,462,281

TECHNOLOGIES FOR USER NOTIFICATION SUPPRESSION

Intel Corporation, Santa...

1. A user notification controller to control a presentation of notifications to a user of a vehicle, the user notification controller comprising:a user notification determiner to determine whether a user notification is to be presented to the user; and
a user activity analyzer to (i) determine a present navigation activity of the user to control navigation of the vehicle, (ii) determine whether the navigation activity of the user is indicative of the user's compliance with a user instruction included in the user notification prior to the presentation of the user notification to the user, and (iii) determine whether the navigation activity of the user is indicative of the user's knowledge of the user instruction included in the user notification based on the user's compliance; and
a notification suppressor to suppress the presentation of the user notification to the user in response to a determination that the navigation activity of the user is indicative of the user's knowledge of the user instruction included in the user notification.
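
The suppression rule can be shown with a tiny predicate: a navigation notification is dropped when the user's current maneuver already matches the instruction it carries. The encoding of activities and instructions below is an assumption.

    # Sketch of suppressing a notification whose instruction the user already follows.
    def should_suppress(notification, navigation_activity):
        return navigation_activity.get("maneuver") == notification["instruction"]

    def present(notification, navigation_activity, show):
        if should_suppress(notification, navigation_activity):
            return                                   # suppressed
        show(notification["text"])

    note = {"instruction": "exit-right", "text": "Take the next exit on the right"}
    present(note, {"maneuver": "exit-right"}, show=print)         # suppressed
    present(note, {"maneuver": "continue-straight"}, show=print)  # presented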

US Pat. No. 10,462,280

DEVICE AND METHOD FOR USING APPROPRIATE TELEPHONE NUMBER FOR CALL CONNECTION

NTT DOCOMO, INC., Tokyo ...

1. A device comprising:a memory that stores instructions; and
a processor that executes the instructions stored in the memory to:
acquire a call connection request that requests a call connection from a source telephone number to a destination telephone number;
select, as a calling telephone number, one of the source telephone numbers that have been assigned to a source telephone that has sent the call connection request, in accordance with a condition relevant to the destination telephone number;
compare the calling telephone number with the source telephone number of the requested call connection to determine whether the calling telephone number differs from the source telephone number of the requested call connection; and
upon detecting that the calling telephone number differs from the source telephone number of the requested call connection, cause the source telephone to display a message prompting a user to change the source telephone number of the requested call connection to the calling telephone number.
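
The select-and-compare step can be sketched as below; the prefix rule standing in for the "condition relevant to the destination telephone number" and the prompt text are assumptions.

    # Sketch: pick a calling number for the destination, prompt if it differs
    # from the number the call connection request was placed from.
    assigned_numbers = ["+81355500001", "+81655500002"]   # assigned to the source telephone

    def select_calling_number(destination):
        for number in assigned_numbers:
            if number[:5] == destination[:5]:    # hypothetical area-prefix condition
                return number
        return assigned_numbers[0]

    def handle_call_request(source_number, destination_number):
        calling = select_calling_number(destination_number)
        if calling != source_number:
            return f"Prompt: change the calling number from {source_number} to {calling}?"
        return "Proceed with the call"

    print(handle_call_request("+81355500001", "+81655512345"))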

US Pat. No. 10,462,279

NOTIFYING A USER OF EVENTS IN A COMPUTING DEVICE

QUALCOMM Incorporated, S...

1. A method performed by one or more processors of a mobile computing device, comprising:associating a first type of event with a persistent event notification type;
associating a second type of event with a non-persistent event notification type, wherein the first type of event is one of a text message and an email, the second type of event is the other of the text message and the email, and the associating of each of the first and second type of events is based at least in part on a user's preferences;
presenting, along a first edge of a touch-sensitive display of the mobile computing device, a status bar indicating at least one of a time, a remaining battery strength, or a signal strength;
detecting an event comprising reception of the first type of event or the second type of event;
presenting, along an edge of the touch-sensitive display of the mobile computing device, a persistent banner alert in response to detecting the first type of event, wherein the persistent banner alert is separate from the status bar;
presenting, on the touch-sensitive display of the mobile computing device, a non-persistent banner alert in response to detecting the second type of event, wherein the non-persistent banner alert is separate from the status bar;
dismissing the persistent banner alert based on the user swiping the persistent banner alert off the touch-sensitive display; and
automatically dismissing the non-persistent banner alert after expiration of a time period.
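
The two alert behaviors can be sketched with a timer: a persistent banner stays until it is swiped away, while a non-persistent banner dismisses itself after a timeout. The event-to-type mapping and the timeout value below are assumptions.

    # Sketch of persistent versus non-persistent banner alerts.
    import threading

    NON_PERSISTENT_TIMEOUT_S = 3.0
    alert_type_by_event = {"text_message": "persistent", "email": "non_persistent"}
    active_alerts = set()

    def show_alert(event_type, payload):
        alert = (event_type, payload)
        active_alerts.add(alert)
        if alert_type_by_event[event_type] == "non_persistent":
            threading.Timer(NON_PERSISTENT_TIMEOUT_S,
                            lambda: active_alerts.discard(alert)).start()
        return alert

    def on_swipe(alert):              # user swipes a persistent banner off the display
        active_alerts.discard(alert)

    banner = show_alert("text_message", "New message from Sam")
    on_swipe(banner)                       # persistent alert dismissed by the user
    show_alert("email", "Weekly report")   # dismissed automatically after the timeout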

US Pat. No. 10,462,278

PORTABLE TERMINAL APPARATUS

MAXELL, LTD., Kyoto (JP)...

1. A portable terminal apparatus comprising:a display;
an operation input interface receiving an input corresponding to a user operation;
a wireless communication interface performing communication over a wireless local area network; and
a processor programmed to control state transitions of the portable terminal apparatus to and from a sleep state and further to provide a plurality of display modes, the plurality of display modes including:
a first display mode during which a lock release icon is displayed on the display without display of a control icon when the portable terminal apparatus wakes up from a sleep state based on a first condition, wherein the first condition is met when a control screen for remotely and wirelessly controlling an external device via the wireless communication interface is not displayed on the display before the portable terminal apparatus enters a sleep state;
a second display mode during which a lock release icon is displayed on the display without display of a control icon when the portable terminal apparatus wakes up from a sleep state based on a second condition, wherein the second condition is met when the wireless communication interface is unable to communicate with the external device and the control screen is displayed on the display before the portable terminal apparatus enters a sleep state; and
a third display mode during which a control icon and a lock release icon are displayed on the display when the portable terminal apparatus wakes up from a sleep state based on a third condition, wherein the third condition is met when the wireless communication interface is able to communicate with the external device and the control screen is displayed on the display before the portable terminal apparatus enters a sleep state,
wherein the control screen is used to execute control operations on the external device based on the input received by the operation input interface,
wherein, when the lock release icon is selected in one of the first, second, or third display modes, the display displays a screen for performing authentication to execute an unlock operation for the portable terminal apparatus, and
wherein, when the control icon is selected in the third display mode without selection of the lock release icon, the wireless communication interface transmits a remote control signal corresponding to the selected control icon to the external device.
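
The three display modes reduce to a decision over two facts saved before sleep plus current reachability, as in the sketch below; the mode labels and boolean inputs are assumptions about how that state would be tracked.

    # Sketch of choosing a display mode on wake-up.
    def wake_display_mode(control_screen_was_shown, external_device_reachable):
        if not control_screen_was_shown:
            return "mode 1: lock release icon only"           # first condition
        if not external_device_reachable:
            return "mode 2: lock release icon only"           # second condition
        return "mode 3: control icon and lock release icon"   # third condition

    print(wake_display_mode(False, False))
    print(wake_display_mode(True, False))
    print(wake_display_mode(True, True))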

US Pat. No. 10,462,277

METHOD AND DEVICE FOR PROVIDING FUNCTION OF MOBILE TERMINAL

SAMSUNG ELECTRONICS CO., ...

1. An electronic device comprising:a communication interface for communicating with a first wearable device and a second wearable device; and
a processor configured to:
select the first wearable device and the second wearable device from among a plurality of wearable devices connected to the electronic device via the communication interface, based on respective performance information of each of the plurality of wearable devices, which identifies a performance value at which a specific operation is performed, and on minimum performance information required for performing the specific operation associated with the electronic device;
receive, via the communication interface, sensing information from the first wearable device based on selecting the first wearable device, wherein the sensing information is obtained by a sensor of the first wearable device being worn by a user; and
control the communication interface to transmit the sensing information to the second wearable device being worn by the user based on selecting the second wearable device.
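
The selection rule can be sketched as a filter over reported performance values against the required minimum, with sensing data then routed from one selected device to the other; the metric used here is an assumption.

    # Sketch: select wearables that meet a minimum performance value, then forward
    # sensing information from the first selected device to the second.
    wearables = [
        {"id": "watch",   "perf": {"sample_rate_hz": 100}},
        {"id": "band",    "perf": {"sample_rate_hz": 25}},
        {"id": "earbuds", "perf": {"sample_rate_hz": 50}},
    ]

    def select_devices(minimum, count=2):
        eligible = [w for w in wearables
                    if w["perf"]["sample_rate_hz"] >= minimum["sample_rate_hz"]]
        return eligible[:count]

    first, second = select_devices({"sample_rate_hz": 40})
    sensing = {"from": first["id"], "heart_rate": 72}    # received from the first device
    print(f"forwarding {sensing} to {second['id']}")     # transmitted to the second device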

US Pat. No. 10,462,276

FUNCTION SELECTION IN A PORTABLE DEVICE COUPLED TO THE HEAD UNIT OF A VEHICLE

GOOGLE LLC, Mountain Vie...

1. A method for providing geographic information to head units of vehicles, the method comprising:providing, by one or more processors, a set of user interface features of a navigation application executing on a portable device that currently operates in a vehicle, including providing a list of geographic search results from the navigation application to the head unit;
receiving, at the navigation application from a head unit of the vehicle, an indication of a changed environmental condition; and
reducing, by the one or more processors, a level of detail for at least one of the set of user interface features of the navigation application in response to the changed environmental condition, including locking out a scrolling list/paging feature by reducing the list to a single page.
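
The reduction step can be pictured as clamping the scrollable result list to a single page when the head unit reports a more demanding environmental condition; the page size and condition label below are assumptions.

    # Sketch of reducing the level of detail of one user interface feature.
    PAGE_SIZE = 5

    def render_results(results, environment="normal"):
        if environment == "degraded":    # changed condition reported by the head unit
            return results[:PAGE_SIZE], {"scrolling": False}   # lock out scrolling/paging
        return results, {"scrolling": True}

    places = [f"Coffee shop #{i}" for i in range(12)]
    visible, ui = render_results(places, environment="degraded")
    print(len(visible), ui)    # 5 {'scrolling': False}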