US Pat. No. 10,891,900

EMISSION DRIVER AND ORGANIC LIGHT EMITTING DISPLAY DEVICE HAVING THE SAME

SAMSUNG DISPLAY CO., LTD....

15. An organic light emitting display device, comprising:a display panel including a plurality of pixels;
a scan driver to supply a scan signal to the pixels through a plurality of scan lines;
a data driver to supply a data signal to the pixels through a plurality of data lines; and
an emission driver to supply an emission control signal to the pixels through a plurality of emission control lines, wherein the emission driver includes:
a plurality of stages each outputting the emission control signal, wherein a k-th stage (k is a natural number) includes:
an input block to supply a signal supplied to a first input terminal to a first node and to supply a voltage of a first power source to a second node, in response to a signal supplied to a second input terminal;
an output block to supply the voltage of the first power source or a voltage of a second power source to an output terminal in response to a voltage of a third node and a voltage of a fourth node;
a first signal processing block to control a voltage of the first node in response to a voltage of the second node and a signal supplied to a third input terminal;
a second signal processing block connected to a fifth node electrically connecting the second node and the fourth node, wherein the second signal processing block is to control the voltage of the fourth node in response to the signal supplied to the third input terminal and a voltage of the fifth node;
a third signal processing block to control the voltage of the fourth node in response to the voltage of the first node;
a fourth signal processing block to control the voltage of the third node in response to the voltage of the fourth node; and
a stabilization block electrically connected between the input block and the output block to limit a voltage drop between the first node and the third node, wherein the stabilization block limits a voltage drop between the second node and the fourth node by lowering the voltage of the second power source supplied to the fifth node.

US Pat. No. 10,891,899

DISPLAY DEVICE, ELECTRONIC DEVICE, AND BODY-BIASING CIRCUIT

LG Display Co., Ltd., Se...

1. A display device comprising:a pixel array comprising a plurality of sub-pixels defined by a plurality of data lines and a plurality of gate lines;
a plurality of driving transistors, each configured to drive a light emitting element of a corresponding sub-pixel of the plurality of sub-pixels, and configured to receive a driving voltage;
a source driving circuit configured to drive the plurality of data lines;
a gate driving circuit configured to drive the plurality of gate lines;
a controller configured to control the source driving circuit and the gate driving circuit;
a power supply circuit configured to output a common voltage commonly applied to the plurality of sub-pixels; and
a circuit configured to:
receive the common voltage,
receive the driving voltage, and
apply a body voltage corresponding to the common voltage to a body of a driving transistor of the plurality of driving transistors, wherein the body voltage is between the common voltage and the driving voltage, wherein the body voltage has an amplitude that varies according to a variation of an amplitude of the common voltage, and wherein the circuit is a body biasing circuit connected between the body of the driving transistor and the power supply circuit.

US Pat. No. 10,891,898

PIXEL CIRCUIT FOR TOP-EMITTING AMOLED PANEL AND DRIVING METHOD THEREOF

SHENZHEN CHINA STAR OPTOE...

1. A pixel circuit for a top-emitting active matrix organic light-emitting diode (AMOLED) panel, which comprises:a first thin film transistor (TFT), having a gate connected to a first node, a source and a drain connected respectively to a second node and a third node;
a second TFT, having a gate connected to a scan signal, a source and a drain connected respectively to the first node and a data signal;
a third TFT, having a gate connected to the scan signal, a source and a drain connected respectively to the second node and a reference voltage;
a fourth TFT, having a gate connected to the scan signal, a source and a drain connected respectively to the third node and a high voltage power source;
a first capacitor, having two ends connected respectively to the first node and the second node;
a second capacitor, having two ends connected respectively to the third node and the reference voltage;
an OLED, having an anode connected to the second node, and a cathode connected to a low voltage power source;
wherein the high voltage power source has a first voltage determined by a second voltage of the low voltage power source, a voltage difference between the first voltage of the high voltage power source and the second voltage of the low voltage power source remaining unchanged, and the reference voltage being less than an activation voltage of the OLED;
wherein the first voltage of the high voltage power source is a function of the second voltage of the low voltage power source and a difference between the voltage of the high voltage power source and the voltage of the low voltage power source remains unchanged; and
wherein the fourth TFT and the first TFT are connected, in series, between the high voltage power source and the anode of the OLED, with the end of the second capacitor that is connected to the third node being connected to the connection between the fourth TFT and the first TFT, such that the first voltage of the high voltage power source is applied through the fourth TFT and the first TFT, in sequence, to the anode of the OLED, and wherein the second voltage of the low voltage power source is applied to the cathode of the OLED such that the voltage applied to the anode of the OLED is a function of the voltage applied to the cathode of the OLED.

US Pat. No. 10,891,897

METHOD AND SYSTEM FOR ESTIMATING AND COMPENSATING AGING OF LIGHT EMITTING ELEMENTS IN DISPLAY PANEL

SHENZHEN YUNYINGGU TECHNO...

1. A method for estimating aging of light emitting elements in a display panel, comprising:determining a current, a position, and a temperature associated with a light emitting element in the display panel based on display data provided to the display panel at a time interval;
determining a current aging weight of the light emitting element based on the current and a current-aging relationship measured at a standard temperature;
determining a temperature aging weight of the light emitting element based on the temperature and a temperature-aging relationship measured at a standard current;
determining a position aging weight of the light emitting element based on the position;
determining an aging rate of the light emitting element based on the current aging weight, the temperature aging weight, and the position aging weight;
determining an aging time of the light emitting element based on the aging rate of the light emitting element and the time interval; and
determining a luminance loss of the light emitting element based on the aging time and a luminance loss-aging time relationship measured at the standard temperature and the standard current.
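
A minimal Python sketch of the claimed estimation flow follows; the lookup tables, the multiplicative combination of the three weights, and the exponential loss curve are illustrative assumptions rather than the patent's actual measured relationships.

    import numpy as np

    # Hypothetical lookup tables (measured at a standard temperature / standard current).
    CURRENT_AGING = {0.5: 0.8, 1.0: 1.0, 2.0: 1.5}      # drive current (mA) -> aging weight
    TEMP_AGING = {25: 1.0, 40: 1.3, 60: 1.8}             # temperature (C) -> aging weight
    POSITION_AGING = {"center": 1.0, "edge": 1.1}        # panel position -> aging weight

    def interp(table, x):
        """Linear interpolation over a numeric lookup table."""
        xs = sorted(table)
        return float(np.interp(x, xs, [table[k] for k in xs]))

    def estimate_luminance_loss(current_ma, temp_c, position, interval_s, elapsed_aging_s=0.0):
        # Per-factor weights from the measured relationships (claimed steps 2-4).
        w_i = interp(CURRENT_AGING, current_ma)
        w_t = interp(TEMP_AGING, temp_c)
        w_p = POSITION_AGING.get(position, 1.0)
        # Aging rate as a combination of the three weights (multiplicative here, by assumption).
        rate = w_i * w_t * w_p
        # Equivalent aging time accumulated over this display interval (claimed step 6).
        aging_time = elapsed_aging_s + rate * interval_s
        # Luminance loss from an assumed loss-vs-aging-time relationship (claimed step 7).
        luminance_loss = 1.0 - np.exp(-aging_time / 1e7)
        return aging_time, luminance_loss

    aging_time, loss = estimate_luminance_loss(1.2, 45, "edge", interval_s=1 / 60)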

US Pat. No. 10,891,896

DISPLAY DEVICE AND DRIVING METHOD FOR DISPLAY DEVICE

Japan Display Inc., Toky...

1. A display device comprising:a first pixel having a first light emitting element including a first pixel electrode and a common electrode, and a drive transistor having an input/output terminal, one end of the input/output terminal being connected to the first pixel electrode;
a second pixel adjoining the first pixel, and having a second light emitting element including a second pixel electrode and the common electrode; and
first switches including a first switch of the first pixel,
wherein
the first pixel electrode and the second pixel electrode are connected via the first switch of the first pixel,
the first switch of the first pixel is ON when a video signal is written to the first pixel, and
the first light emitting element and the second light emitting element are connected in parallel.

US Pat. No. 10,891,895

LIGHT EMITTING DEVICE, DISPLAY DEVICE, AND LED DISPLAY DEVICE

SHARP KABUSHIKI KAISHA, ...

1. A light emitting device with LEDs as light sources, the light emitting device comprising:a plurality of LED units arranged in matrix, each of the plurality of LED units including one or more LEDs;
a plurality of LED drive circuits configured to drive LEDs included in the plurality of LED units, the plurality of LED drive circuits corresponding to the plurality of LED units one to one;
a drive control circuit configured to control an operation of the plurality of LED drive circuits so that the LEDs included in the plurality of LED units are driven row by row, wherein
each of the LED drive circuits includes:
a data voltage holding unit configured to hold a data voltage corresponding to target brightness of one or more LEDs included in a corresponding LED unit; and
a lighting control unit configured to perform a lighting period control operation in which the one or more LEDs included in the corresponding LED unit is lit for a period of a length depending on the data voltage held in the data voltage holding unit,
a signal line for supplying the data voltage to the plurality of LED drive circuits is provided for each column,
regarding each of the plurality of LED drive circuits, a charge period of a predetermined length is provided every one frame period, and a plurality of lighting enable periods are provided during a period of a length corresponding to a length of one frame period from a time point at which the charge period ends, and
an operation of each of the plurality of LED drive circuits is controlled by the drive control circuit, so that the data voltage corresponding to target brightness of the one or more LEDs included in the corresponding LED unit is written to the data voltage holding unit during the charge period, and so that the lighting period control operation by the lighting control unit is performed in the plurality of lighting enable periods.

US Pat. No. 10,891,894

SEMICONDUCTOR DEVICE AND DRIVING METHOD THEREOF

Semiconductor Energy Labo...

4. A semiconductor device comprising:a pixel comprising:
a first transistor;
a second transistor;
a third transistor;
a fourth transistor;
a fifth transistor;
a sixth transistor;
an electroluminescent device; and
a capacitor,
wherein a gate of the first transistor is directly connected to a first electrode of the capacitor,
wherein a second electrode of the capacitor is directly connected to the electroluminescent device,
wherein one of a source and a drain of the second transistor is directly connected to a source signal line,
wherein the other of the source and the drain of the second transistor is directly connected to one of a source and a drain of the first transistor,
wherein one of a source and a drain of the third transistor is directly connected to the gate of the first transistor,
wherein the other of the source and the drain of the third transistor is directly connected to the other of the source and the drain of the first transistor,
wherein one of a source and a drain of the fourth transistor is directly connected to the one of the source and the drain of the first transistor,
wherein the other of the source and the drain of the fourth transistor is directly connected to the electroluminescent device,
wherein one of a source and a drain of the fifth transistor is directly connected to the other of the source and the drain of the first transistor,
wherein the other of the source and the drain of the fifth transistor is directly connected to an electric current supply line,
wherein one of a source and a drain of the sixth transistor is directly connected to the second electrode of the capacitor, and
wherein the other of the source and the drain of the sixth transistor is directly connected to an electric power source line.

US Pat. No. 10,891,893

CURRENT CONTROLLER FOR OUTPUT STAGE OF LED DRIVER CIRCUITRY

Planar Systems, Inc., Be...

1. A current controller for an output stage of light emitting diode (LED) driver circuitry defining a set of channels through which electrical current is controllably deliverable to a set of LEDs along an actuatable scanline, the current controller comprising:a current source to establish a nominal amount of current available for each member of the set of channels, the nominal amount of current being based on a desired brightness level;
pulse width modulation (PWM) circuitry electrically coupled to the current source and configured to control durations in which adjusted amounts of current are applied to corresponding members of the set of LEDs; and
compensation circuitry electrically coupled to the current source and the PWM circuitry, the compensation circuitry including a set of switching elements to adjust, for each corresponding member of the set of LEDs, the nominal amount of current and thereby provide to the PWM circuitry the adjusted amounts of current based on feedback representing one or both of load impedance variations and parasitic conditions (LIVPC) and process, voltage, and temperature (PVT) conditions.

US Pat. No. 10,891,892

DISPLAY DEVICE

Panasonic Liquid Crystal ...

1. A display device, comprising:a plurality of data lines extending in a first direction;
a plurality of groups, each group having a plurality of gate lines extending in a second direction, wherein the plurality of gate lines for each group are adjacent in the first direction;
a plurality of blocks, each block including:
a plurality of selector transistors and each of the plurality of selector transistors includes:
a first conductive electrode connected to an end of a corresponding gate line of the plurality of gate lines,
a second conductive electrode, and
a control electrode,
wherein each block among the plurality of blocks corresponds to a group among the plurality of the groups;
a plurality of selection signal supplying wirings, each of which is connected to the control electrode of each of the plurality of selector transistors for a corresponding block of the plurality of blocks;
a plurality of gate voltage supplying wirings each of which is connected to the second conductive electrode of one of the plurality of selector transistors in each of the groups;
a gate driver that sequentially supplies a gate voltage to the plurality of gate voltage supplying wirings and supplies a control voltage to the plurality of selection signal supplying wirings to turn on or off one or more of the plurality of selector transistors; and
a dummy transistor having a control electrode and being disposed between two adjacent blocks among the plurality of blocks,
wherein the plurality of blocks include a first block and a second block next to the first block,
the first conductive electrodes of the plurality of selector transistors included in the first block are connected to the ends of the corresponding gate lines respectively,
the first conductive electrodes of the plurality of selector transistors included in the second block are connected to the ends of the corresponding gate lines respectively,
the dummy transistor is disposed between one of the plurality of selector transistors in the first block and one of the plurality of selector transistors in the second block,
the control electrode of each of the plurality of selector transistors included in the first block and the control electrode of the dummy transistor disposed between the first block and the second block are integrally formed, and
the control electrode of the dummy transistor disposed between the first block and the second block is electrically isolated from and spaced apart from the control electrode of each of the plurality of selector transistors included in the second block.

US Pat. No. 10,891,891

ELECTRO-OPTICAL DEVICE AND ELECTRONIC APPARATUS

SEIKO EPSON CORPORATION, ...

1. An electro-optical device comprising:a scan line;
a data line;
a pixel circuit located at a position corresponding to an intersection of the scan line and the data line; and
an enable line, wherein
the pixel circuit includes a memory circuit, a light emitting element, and a control circuit,
the light emitting element changes brightness in response to an image signal held in the memory circuit,
the control circuit controls a state of the light emitting element between a light emission state and a non-light-emission state to display a single image during a field, the field including a first sub-field and a second sub-field,
the first sub-field includes a first period, during which the light emitting element does not emit light, and a second period during which the light emitting element is allowed to emit light,
the second sub-field includes a third period, during which the light emitting element does not emit light, and a fourth period during which the light emitting element is allowed to emit light,
a length of the fourth period is longer than a length of the second period,
a length of the first period is equal to a length of the third period,
the length of the fourth period is longer than one vertical period from a first time, at which selection potential starts to be supplied to the scan line in the first sub-field, to a second time at which selection potential starts to be supplied to the scan line in a next sub-field subsequent to the first sub-field,
the field further includes a third sub-field,
the third sub-field includes a fifth period, during which the light emitting element does not emit light, and a sixth period during which the light emitting element is allowed to emit light,
a length of the sixth period is longer than the length of the fourth period, and
a length of the fifth period is equal to the length of the first period.

US Pat. No. 10,891,890

ASYMMETRIC PIXEL OPERATION FOR COMPENSATING LENS OPTICS LIMITATIONS

FACEBOOK TECHNOLOGIES, LL...

1. A method of calibrating display screens of artificial-reality devices, comprising:at an artificial-reality device with a display comprising a plurality of pixels:
determining gray-level values for the plurality of pixels using a uniform test image, wherein each pixel has an initial luminance level proportional to its gray-level value;
grouping the plurality of pixels into a plurality of distinct non-overlapping segments according to the initial luminance levels of each of the plurality of pixels and an initial gamma band;
for each segment of the plurality of segments:
computing an overall luminance level and a luminance target for the segment according to the determined gray-level values for the pixels in the segment;
in accordance with a determination that the overall luminance level is below the luminance target for the segment, calculating calibration data for the segment to adjust the overall luminance level of the segment by: (i) increasing the gray-level of each pixel in the segment by a first predefined amount or (ii) selecting an alternative gamma band for the segment corresponding to a difference between the luminance target and the overall luminance level; and
storing the calibration data for the segment on the artificial-reality device; and
configuring the artificial-reality device to use the stored calibration data for the segments in subsequent display of images on the display, wherein a first pixel of the plurality of pixels has an adjusted luminance level that is either greater than its initial luminance level or less than its initial luminance level.
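
The segmentation-and-adjustment loop can be illustrated with a short Python sketch; the quantile-based segmentation, the panel-mean luminance target, and the gray-step and gamma-band values are assumptions made only for illustration, not the patent's calibration parameters.

    import numpy as np

    def calibrate_segments(gray_levels, num_segments=4, gray_step=2, gamma_bands=(2.2, 2.3, 2.4)):
        """Toy calibration pass for a uniform test image (all parameters are illustrative)."""
        # Initial luminance assumed proportional to gray level (as recited in the claim).
        luminance = gray_levels.astype(float)
        # Group pixels into non-overlapping segments by initial luminance.
        edges = np.quantile(luminance, np.linspace(0, 1, num_segments + 1))
        segment_ids = np.clip(np.searchsorted(edges, luminance, side="right") - 1, 0, num_segments - 1)
        calibration = {}
        for seg in range(num_segments):
            mask = segment_ids == seg
            overall = luminance[mask].mean()
            target = luminance.mean()          # assumed target: panel-wide mean luminance
            if overall < target:
                # Option (i): raise gray levels by a predefined amount, or
                # option (ii): pick a gamma band sized to the shortfall.
                shortfall = target - overall
                band = gamma_bands[min(int(shortfall // 10), len(gamma_bands) - 1)]
                calibration[seg] = {"gray_offset": gray_step, "gamma_band": band}
        return calibration

    cal = calibrate_segments(np.random.randint(0, 256, size=(64, 64)).ravel())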

US Pat. No. 10,891,889

DISPLAY DEVICE

Samsung Display Co., Ltd....

1. A display device comprising:a substrate comprising a display area and a non-display area;
a thin film transistor disposed in the display area;
a first connection electrode disposed in the non-display area and connected to the thin film transistor;
a second connection electrode disposed in the non-display area and disposed apart from the first connection electrode;
a first insulating layer disposed on the substrate and overlapping one end portion of the first connection electrode and one end portion of the second connection electrode;
a floating electrode disposed on the first insulating layer; and
a bridge electrode connected to the first connection electrode through a first contact hole, and connected to the second connection electrode through a second contact hole,
wherein:
the first connection electrode and the second connection electrode are disposed on the same layer; and
the first insulating layer directly contacts an upper surface of the substrate in an area between the first and second connection electrodes.

US Pat. No. 10,891,888

DISPLAY DEVICE CAPABLE OF MONITORING VOLTAGE OF PIXEL ARRAY

InnoLux Corporation, Mia...

1. A display device comprising:a pixel array;
a power line;
a ground line;
a plurality of power detection lines;
a plurality of ground detection lines; and
a power supply circuit, configured to provide to the pixel array a supply voltage via the power line and a ground voltage via the ground line, receive from the pixel array a plurality of detected supply voltages via the plurality of power detection lines and a plurality of detected ground voltages via the plurality of ground detection lines, and adjust the supply voltage and/or the ground voltage according to the plurality of detected supply voltages and the plurality of detected ground voltages;
wherein the plurality of detected supply voltages are obtained from a plurality of different locations in the pixel array, and the plurality of detected ground voltages are obtained from the plurality of different locations in the pixel array.
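
A toy Python sketch of the claimed feedback adjustment; the proportional-correction rule and the gain value are assumptions, since the claim only requires adjusting the supply and/or ground voltage according to the detected voltages.

    def adjust_supply(nominal_supply, nominal_ground, detected_supplies, detected_grounds, gain=0.5):
        """Raise the supply (and lower the ground) to compensate the worst-case drop seen
        across the detection lines; the proportional rule and gain are illustrative."""
        supply_drop = nominal_supply - min(detected_supplies)   # largest drop on the power detection lines
        ground_rise = max(detected_grounds) - nominal_ground    # largest rise on the ground detection lines
        return nominal_supply + gain * supply_drop, nominal_ground - gain * ground_rise

    new_supply, new_ground = adjust_supply(4.6, 0.0, [4.45, 4.52, 4.40], [0.05, 0.12, 0.08])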

US Pat. No. 10,891,887

FRAME-LEVEL RESYNCHRONIZATION BETWEEN A DISPLAY PANEL AND A DISPLAY SOURCE DEVICE FOR FULL AND PARTIAL FRAME UPDATES

Intel Corporation, Santa...

1. A display source device, comprising logic to:receive a frame start indication from a display panel at a start of a frame;
align a timing of the display source device to a timing of the display panel based on the frame start indication received from the display panel to obtain frame-level synchronization between the display source device and the display panel;
send one or more frame update regions to the display panel in accordance with the timing of the display source device that is aligned to the timing of the display panel;
send a plurality of partial frame update regions to the display panel as a burst in accordance with the timing of the display source device that is aligned with the timing of the display panel; and
enter a low power mode at the display source device after the plurality of partial frame update regions are sent to the display panel.

US Pat. No. 10,891,886

SHIFT REGISTER, GATE LINE DRIVING METHOD, ARRAY SUBSTRATE AND DISPLAY DEVICE FOR HIGH AND LOW RESOLUTION AREAS

BOE TECHNOLOGY GROUP CO.,...

1. A shift register, comprising:a plurality of shift register circuits arranged in a one-to-one correspondence with gate lines on an array substrate; and
a control circuit configured to control signals output from the shift register circuits to the gate lines, to control each row of gate lines to be turned on and off, so that a display area has a high-resolution area and a low-resolution area, wherein a resolution of the low-resolution area is smaller than that of the high-resolution area;
wherein in the high-resolution area, the control circuit controls the gate lines to be turned on and off row by row; and in the low-resolution area, the control circuit controls the gate lines to be turned on and off group by group, each group of gate lines including at least two adjacent gate lines, and gate lines in the same group being turned on and off synchronously;
wherein the plurality of shift register circuits are divided into a plurality of groups, each group of shift register circuits are arranged to correspond to at least two consecutive gate lines, each group of shift register circuits comprise a first shift register circuit and at least one second shift register circuit, the first shift register circuit is directly connected to a corresponding gate line, and the second shift register circuit is connected to a corresponding gate line through the control circuit;
wherein the control circuit comprises a plurality of groups of first control switches and a plurality of second control switches, each group of first control switches comprise at least one control switch, gate lines corresponding to each group of shift register circuits are connected successively using one group of the first control switches, and each second shift register circuit is connected to the corresponding gate line through one second control switch respectively, and
the first control switch is configured to be switched off when the gate line corresponding thereto is located in the high-resolution area and scanned, and switched on when the gate line corresponding thereto is located in the low-resolution area and scanned; and the second control switch is configured to be switched on when the gate line corresponding thereto is located in the high-resolution area and scanned;
wherein when two consecutive areas in a direction of scanning data line are sequentially the low-resolution area and the high-resolution area, the second control switch is configured to be switched off when a first gate line in the low-resolution area is scanned, and switched on when a second gate line in the low-resolution area is scanned, and a next stage of shift register circuit of the second shift register circuit corresponding to the second gate line is a second shift register circuit corresponding to a gate line in the high-resolution area, and the first gate line is a gate line in the low-resolution area other than the second gate line; and
wherein the shift register further comprises a first control line and a second control line, wherein the first control line is connected to control terminals of the plurality of groups of first control switches, and the second control line is connected to control terminals of the plurality of second control switches.

US Pat. No. 10,891,885

METHOD AND TEST MACHINE PLATFORM FOR QUICKLY SEARCHING FOR COMMON VOLTAGE OF DISPLAY PANEL

CHONGQING ADVANCE DISPLAY...

1. A method of quickly searching for a common voltage of a display panel, comprising the following steps of:measuring a flicker value of the display panel at each of at least three different common voltage values;
performing a power conversion on the at least three flicker values above, and performing a unary quadratic function fitting on the converted flicker values and the corresponding common voltages; and
obtaining the common voltage value corresponding to the vertex of the unary quadratic function as the common voltage of the display panel.
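
The claimed search reduces to a quadratic fit and its vertex; the sketch below assumes a square-law power conversion and uses numpy.polyfit, neither of which is specified by the patent.

    import numpy as np

    def find_common_voltage(vcom_samples, flicker_values, power=2.0):
        """Fit a parabola to (power-converted) flicker vs. Vcom and return the vertex voltage."""
        v = np.asarray(vcom_samples, dtype=float)
        f = np.asarray(flicker_values, dtype=float) ** power   # power conversion of the flicker values
        a, b, c = np.polyfit(v, f, 2)                           # f ~ a*v**2 + b*v + c
        return -b / (2.0 * a)                                   # vertex of the fitted parabola

    # Three (or more) measured points are enough to determine the parabola.
    vcom = find_common_voltage([3.0, 3.5, 4.0], [0.42, 0.18, 0.39])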

US Pat. No. 10,891,884

TEST-RESPONSE COMPARISON CIRCUIT AND SCAN DATA TRANSFER SCHEME IN A DFT ARCHITECTURE FOR MICRO LED BASED DISPLAY PANELS

Apple Inc., Cupertino, C...

1. A comparison circuit comprising:a plurality of scan-data out (SDO) inputs;
a corresponding plurality of comparators to compare SDO data streams from the plurality of SDO inputs with an expected data stream, each comparator to transmit a compared data stream indicative of whether or not an error exists in any of the SDO data streams;
a corresponding plurality of sticky registers coupled to the plurality of comparators, each sticky register to store a value indicative of whether an error is present in the compared data stream; and
a scan-chain register to store values from the corresponding plurality of sticky registers.
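
A small software model of the claimed comparison path (comparators, sticky error flags, scan-chain capture); the list-based representation is an assumption for illustration, not the claimed hardware.

    def run_scan_compare(sdo_streams, expected_stream):
        """Per-channel bit-stream comparison with sticky error flags, captured into a
        scan-chain register modeled as a list of bits."""
        sticky = [0] * len(sdo_streams)
        for channel, stream in enumerate(sdo_streams):
            for got, exp in zip(stream, expected_stream):
                if got != exp:            # comparator flags a mismatch
                    sticky[channel] = 1   # sticky register latches the error
        return list(sticky)               # scan-chain register stores the sticky values

    scan_chain = run_scan_compare([[1, 0, 1, 1], [1, 0, 0, 1]], expected_stream=[1, 0, 1, 1])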

US Pat. No. 10,891,883

DISPLAY DEVICE

Samsung Display Co., Ltd....

1. A display device comprising:a substrate including a display area and a non-display area outside the display area;
a plurality of pixels disposed in the display area on the substrate;
a plurality of data lines connected to the pixels;
a first crack detection line disposed in the non-display area on the substrate, the first crack detection line being electrically connected to at least one of the data lines; and
a second crack detection line disposed in the non-display area and surrounding at least a portion of the first crack detection line, the second crack detection line being electrically connected to at least one of the data lines.

US Pat. No. 10,891,882

TECHNIQUES FOR TESTING ELECTRICALLY CONFIGURABLE DIGITAL DISPLAYS, AND ASSOCIATED DISPLAY ARCHITECTURE

Apple Inc., Cupertino, C...

1. A method of testing a display having an array of microdrivers arranged in a plurality of rows and columns, comprising:(a) selecting a row of microdrivers to be tested;
(b) delivering test data in parallel from support circuitry to each of the microdrivers in the selected row;
(c) transmitting an output in parallel corresponding to the test data from each of the microdrivers in the selected row to the support circuitry; and
(d) repeating steps (a) through (c) for each row in the array of microdrivers.
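
Steps (a) through (d) amount to a row-scan test loop; in the Python sketch below, the SupportCircuitry class and its broadcast/read_back interface are hypothetical stand-ins for the panel's support circuitry.

    class SupportCircuitry:
        """Minimal stand-in for the panel's support circuitry (hypothetical interface)."""
        def __init__(self, array):
            self.array = array                      # 2-D list of microdriver states
        def broadcast(self, row_index, data):
            for col in range(len(self.array[row_index])):
                self.array[row_index][col] = data   # deliver test data in parallel to the row
        def read_back(self, row_index):
            return list(self.array[row_index])      # outputs transmitted back in parallel

    def test_display(support, num_rows, test_pattern=0xA5):
        failures = []
        for row in range(num_rows):                  # (a) select a row of microdrivers
            support.broadcast(row, test_pattern)     # (b) deliver test data in parallel
            outputs = support.read_back(row)         # (c) read the corresponding outputs
            failures += [(row, col) for col, v in enumerate(outputs) if v != test_pattern]
        return failures                              # (d) repeated for every row

    support = SupportCircuitry([[0] * 8 for _ in range(4)])
    bad_sites = test_display(support, num_rows=4)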

US Pat. No. 10,891,881

LIGHTING ASSEMBLY WITH LEDS AND OPTICAL ELEMENTS

Ultravision Technologies,...

1. A lighting apparatus comprising:a circuit board;
a plurality of light emitting diodes (LEDs) attached to the circuit board, the LEDs being arranged in an array of rows and columns, wherein all of the LEDs attached to the circuit board are arranged in a single plane;
a support substrate supporting the circuit board, the support substrate being made of a thermally conductive material and configured to dissipate heat during operation of the LEDs; and
a plurality of optical elements configured to redirect light from the plurality of LEDs, each optical element being substantially the same as all other optical elements and configured to shape and direct light in a rectangular waveform, wherein each LED is associated with a single optical element and each optical element is associated with a single LED, wherein each optical element comprises a convex portion at least partially overlying the associated LED, and wherein the optical elements are part of an outer surface that forms an exposed surface of the lighting apparatus;
wherein the lighting apparatus is configured so that, when all of the LEDs are operating, a substantially rectangular surface that is off-center relative to the lighting apparatus is illuminated with an illumination level and a uniformity; and
wherein failure of one or more of the LEDs will cause the illumination level of light impinging the substantially rectangular surface to decrease while the uniformity of light impinging the substantially rectangular surface remains substantially the same.

US Pat. No. 10,891,880

METHOD FOR THE PRODUCTION, RECOGNITION, IDENTIFICATION, READING AND TRACEABILITY OF A SEAL OR LABEL, SEAL OR LABEL AND APPARATUS FOR THE RECOGNITION, IDENTIFICATION, READING AND TRACEABILITY OF SAID SEAL OR LABEL

PUNTO 2 S.R.L., Padua (I...

1. Method for the production, recognition, identification, reading and traceability of a seal or label, comprising the steps of:preparing a mold for said seal or label;
delivering to a molding chamber of the mold a first material comprising a matrix of the seal or label;
delivering to the molding chamber of the mold a second, tracking material, which is mixed into the matrix at random, creating a pattern inside the matrix, said pattern defining a marking of the seal or label;
wherein said mixing of the first and second material is carried out before the materials are introduced into the mold to pre-mix said materials before said materials enter into said molding chamber;
wherein the molding chamber is shaped so that, following extraction from the mold, the seal or label has a reading area of the marking of the seal or label said reading area being a subset of said seal or label; and
wherein said reading area of the marking of the seal or label is delimited by a perimeter groove or a bas-relief.

US Pat. No. 10,891,879

REPURPOSED PACKAGES

Amazon Technologies, Inc....

1. A computer-implemented method, comprising:under control of one or more computing systems configured with executable instructions,
receiving, from an application executing on a device:
a first user identifier of a first user associated with at least one of the device or the application; and
a first image, generated by a camera of the device, of a package to be used as a repurposed package for shipment of an item from the first user to a second user;
processing the first image to determine that a unique identifier is not included on the package;
sending, to the application executing on the device, a request that a user generated unique identifier be added to the package, wherein the user generated unique identifier is generated by the first user;
subsequent to sending the request, receiving, from the application, a second image of the package;
processing the second image to determine that the user generated unique identifier is unique compared to other identifiers maintained in a data store; and
in response to determining that the user generated unique identifier is unique, storing, in the data store, the user generated unique identifier as a stored user generated unique identifier;
associating the first user identifier with the stored user generated unique identifier to indicate the first user as a sender of the repurposed package;
receiving, from the application, an indication of a delivery destination for the repurposed package;
associating the delivery destination with the stored user generated unique identifier;
subsequent to associating the first user identifier and the delivery destination with the stored user generated unique identifier, causing a carrier to retrieve the repurposed package; and
causing the repurposed package to be shipped to the delivery destination, wherein at least a portion of a routing of the repurposed package to the delivery destination is based at least in part on the user generated unique identifier added to the package.
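
The uniqueness check and the sender/destination associations can be sketched with an in-memory data store; the dictionary layout and function name below are illustrative assumptions, not the claimed computing systems.

    data_store = {"identifiers": set(), "senders": {}, "destinations": {}}

    def register_repurposed_package(user_id, user_generated_id, destination):
        """Store a user-generated identifier if it is unique, then associate sender and destination."""
        if user_generated_id in data_store["identifiers"]:
            return False                                   # not unique; a new identifier would be requested
        data_store["identifiers"].add(user_generated_id)   # store the unique identifier
        data_store["senders"][user_generated_id] = user_id            # first user as sender
        data_store["destinations"][user_generated_id] = destination   # delivery destination
        return True

    register_repurposed_package("user-123", "PKG-7G4K", "Seattle, WA")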

US Pat. No. 10,891,878

TRACKING SYSTEM FOR WEB-BASED ARTICLES

3M INNOVATIVE PROPERTIES ...

1. A parent film comprising:a. one or more layers including a removable liner, and
b. a plurality of identifying marks,
wherein the parent film has a width and length,
wherein each of the plurality of identifying marks is different from the others,
wherein the identifying marks are located on the parent film across the width of the parent film and across the length of the parent film,
wherein each of the identifying marks is printed on an adhesive layer bonding the removable liner to the parent film, the identifying marks disposed at regular intervals along the length of the parent film, the identifying marks being scannable through the removable liner,
wherein each of the identifying marks is chosen from a one-dimensional matrix data code and a two-dimensional matrix data code, and
wherein the parent film is subdivided into one or more child films, each having a smaller surface area than the parent film.

US Pat. No. 10,891,877

METHODS AND APPARATUS FOR SECURING SOUNDING SYMBOLS

Intel Corporation, Santa...

1. An apparatus to secure a sounding signal, the apparatus comprising:a cipher to generate a bit value based on a common key and random seed information, the random seed information including a seed value;
a frame generator to generate a sounding signal based on the bit value;
an interface to instruct radio architecture to transmit the sounding signal to a first device; and
a decipher to determine that a response to the sounding signal is from a second device different than the first device, the determination made using the random seed information.
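
One way to realize the claimed key-and-seed bit generation is a keyed hash; HMAC-SHA256 is an illustrative choice here, as the patent does not name a particular cipher.

    import hashlib
    import hmac
    import os

    def sounding_bits(common_key: bytes, seed: bytes, num_bits: int = 64) -> list:
        """Derive sounding-signal bits from the shared key and a random seed (illustrative cipher)."""
        digest = hmac.new(common_key, seed, hashlib.sha256).digest()
        bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
        return bits[:num_bits]

    key, seed = os.urandom(16), os.urandom(8)
    tx_bits = sounding_bits(key, seed)
    # A receiver holding the same key and seed regenerates the same bits, so a response that does
    # not match can be attributed to a device other than the one that was sounded.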

US Pat. No. 10,891,876

DUMMY OBJECT WITH EXTREMITIES WHICH UTILIZE THE MASS INERTIA THEREOF TO REPLICATE A NATURAL MOVEMENT PROCESS

4ActiveSystems GmbH, Tra...

1. A dummy object, comprising:a torso;
at least one extremity, the extremity representing an arm or a leg, wherein the extremity comprises a proximal extremity portion and a distal extremity portion, and wherein a first end of the proximal extremity portion is coupled at the torso in an articulated manner;
a joint coupling the distal extremity portion with a second end of the proximal extremity portion;
a first mechanical stop associated with the joint, wherein the first mechanical stop is structured to limit movement of the distal extremity portion relative the proximal extremity portion;
at least one drive, which is arranged in the torso and is drivingly coupled to the proximal extremity portion, wherein the at least one drive is structured to selectively exert a force on the proximal extremity portion, thereby moving the proximal extremity portion relative to the torso, and wherein movement of the proximal extremity portion results in the creation of a mass inertia at the distal extremity portion; and
wherein the mass inertia propels the distal extremity portion.

US Pat. No. 10,891,875

METHOD, DEVICE, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM FOR CONTROLLING TACTILE INTERFACE DEVICE

GACHON UNIVERSITY-INDUSTR...

1. A method of controlling a tactile interface device implemented by a computing device including a processor, and connected to the computing device to interact with a user, the method comprising:generating information inputted to an application configured to be executed in the computing device based on an input at the tactile interface device including a tactile display;
selecting a focus area of a display screen of the computing device, based on outputs from the application, and identifying input coordinates within the selected focus area;
generating first output information configured to be transmitted to the tactile interface device, based on the selected focus area and the identified input coordinates, wherein
the first output information comprises data for implementing a tactile graphic formed of a plurality of two-dimensional tactile pixels,
the tactile display includes a first layer and a second layer,
the first layer includes a tactile icon for executing an instruction to change a displayed image in the focus area of the display screen of the computing device,
the second layer includes text information in a form of Braille, the text information is information of a tactile graphic element corresponding to the input coordinates,
the input coordinates are changed in response to the input at the tactile interface device, and
the text information of the second layer is automatically changed in response to a change of the input coordinates, to be information of another tactile graphic element corresponding to changed input coordinates; and
generating second output information configured to be transmitted to the tactile interface device, the second output information controlling the tactile interface device to perform, while the input coordinates correspond to a location of the tactile graphic element, a periodical movement of at least one tactile pixel of the tactile graphic element corresponding to the input coordinates.

US Pat. No. 10,891,874

ROBOT USING MULTI-COLOR CODE CARDS

ROBOMATION CO., LTD, Seo...

1. A robot using multi-color code cards, wherein a starting color portion, at least one of middle color portions, and a base color portion are arranged adjacently to one another and consecutively in sequence on the surface of each multi-color code card;the starting color portion indicates a start of each multi-color code card, and the same color is used with respect to all multi-color code cards;
at least one of middle color portions has colors different from one another and also different from the color of the starting color portion;
the base color portion has a color different from the starting color portion and also different from the adjacent one of the middle color portions;
codes corresponding to combined colors of the at least one of middle color portions and the base color portion are allotted; and
when a user pushes the starting color portion of each multi-color code card below a single-color sensor on the bottom of the robot, operations corresponding to the codes of combined colors of the at least one of middle color portions and the base color portion recognized by the robot are stored.

US Pat. No. 10,891,873

METHOD AND APPARATUS FOR MONITORING LEARNING AND ELECTRONIC DEVICE

BEIJING YIZHEN XUESI EDUC...

1. A method for monitoring learning, comprising:performing three-dimensional modeling on a classroom of a class student to generate a three-dimensional classroom model, determining a location of the class student in real time based on a thermodynamic diagram in the three-dimensional classroom model, and associating the class student to a portion of the thermodynamic diagram corresponding to the class student, wherein the thermodynamic diagram is formed based on a detected heat radiated by a human body;
acquiring a class image of the class student in the three-dimensional classroom model when it is determined that the class student is currently in class based on the determined location of the class student;
recognizing the class image to acquire characteristic data of the class student, wherein the characteristic data comprises at least one of the following: facial characteristic data of the class student, visual characteristic data of the class student, and body characteristic data of the class student; and
determining, based on the characteristic data of the class student, a class status of the class student.

US Pat. No. 10,891,872

METHOD AND APPARATUS OF MUSIC EDUCATION

School of Rock, LLC, Can...

1. A computer-aided method of educating music students comprising:assembling an ensemble of at least three music students, the students using differing instruments at differing proficiency levels and focusing on differing musical techniques;
compiling a multidimensional database of songs, wherein dimensions of the database include three or more of: instrumentation requirements, technique requirements, musical styles represented, proficiency levels, and show theme suitability;
searching the database for songs having desired characteristics matching the differing instruments, differing proficiency levels, and differing musical techniques of the students;
selecting a first song for the students based on results from the searching of the database, the selected first song being playable by the students using the differing instruments;
receiving feedback regarding shortcomings of the students in regard to the students playing the first song; and
selecting a second song based at least in part on the feedback, the second song having one or more of: (i) a different proficiency level for at least one of the differing instruments when compared to the first song, (ii) a different technique requirement for at least one of the differing instruments when compared to the first song, or (iii) a different musical style when compared to the first song.
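
The database search can be sketched as a filter over song records; the field names and the "level at or below the song's requirement" playability rule are assumptions for illustration only.

    # Hypothetical song records; the field names mirror the claimed database dimensions.
    SONGS = [
        {"title": "Song A", "instruments": {"guitar": 2, "drums": 3, "bass": 2},
         "techniques": {"palm muting", "syncopation"}, "style": "rock", "theme": "classics"},
        {"title": "Song B", "instruments": {"guitar": 3, "drums": 2, "bass": 1},
         "techniques": {"slide", "shuffle"}, "style": "blues", "theme": "classics"},
    ]

    def search_songs(students, required_techniques, style=None):
        """Return songs playable by the ensemble: every student's instrument must appear in the
        song at a proficiency requirement no higher than that student's level, and the song must
        cover the target techniques."""
        results = []
        for song in SONGS:
            playable = all(student["instrument"] in song["instruments"]
                           and song["instruments"][student["instrument"]] <= student["level"]
                           for student in students)
            covers = required_techniques <= song["techniques"]
            if playable and covers and (style is None or song["style"] == style):
                results.append(song["title"])
        return results

    ensemble = [{"instrument": "guitar", "level": 3},
                {"instrument": "drums", "level": 3},
                {"instrument": "bass", "level": 2}]
    matches = search_songs(ensemble, {"palm muting"})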

US Pat. No. 10,891,871

SYSTEMS AND METHODS PROVIDING AN ENHANCED USER EXPERIENCE IN A REAL-TIME SIMULATED VIRTUAL REALITY WELDING ENVIRONMENT

LINCOLN GLOBAL, INC., Sa...

1. A server computing device, comprising:a processor;
a non-transitory, computer-readable storage medium having stored thereon computer-executable instructions for an on-line welding game, the computer-executable instructions, when executed, configure the processor to:
communicate with a simulated welding system via an external communication infrastructure to receive user statistics representative of a user's performance of a simulated welding activity with the simulated welding system;
update a game profile of the user with respect to the on-line welding game based on the user statistics received from the simulated welding system; and
provide a ranking of the user in relation to a plurality of other users of the on-line welding game based on a comparison of the user statistics with respective statistics associated with the plurality of other users.

US Pat. No. 10,891,870

SYSTEM AND METHOD FOR AIDING THE NAVIGATION OF AN AIRCRAFT WITHIN AN AIRPORT

AIRBUS (S.A.S.), Blagnac...

9. A system for aiding ground navigation of an aided aircraft within an airport, the system comprising:an airport navigation device configured to generate a ground route within the airport for the aided aircraft between a first position and a second position, the ground route divided into successive segments;
a position determination device configured to determine a current ground position of the aided aircraft;
a traffic surveillance device configured to identify a current location and type of each other aircraft on each of the successive segments;
a computation unit configured to determine, for each of the successive segments, a congestion information item as a function of, at least, the current location and the type of each of the other aircraft on the segment; and
a display device on board the aided aircraft and configured to display, on at least one viewing screen in the aided aircraft, an airport map of the airport, a symbol of the aided aircraft on a position of the airport map corresponding to the current position of the aided aircraft, a plot on the airport map illustrating the ground route, and an indicator on or near each segment of the congestion information item corresponding to the segment,
wherein the computation unit is configured to determine a predicted individual journey time for each of the successive segments based on a reference speed for the segment and a length of the segment, and to compute at least one of:
a total journey time between the first and second positions along the ground route;
a remaining journey time between the current position of the aided aircraft and the second position,
wherein the total journey time and the remaining journey time are computed based on a sum of the predicted individual journey times for either all of the successive segments or the segments between the current position of the aided aircraft and the second position, respectively.
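
The per-segment journey-time formula recited above is simply t_i = length_i / reference_speed_i, summed over the relevant segments, as in this short sketch (the segment representation is an assumption):

    def journey_times(segments, current_segment_index):
        """segments: list of (length_m, reference_speed_mps). Returns (total, remaining) journey times."""
        times = [length / speed for length, speed in segments]      # predicted individual journey times
        return sum(times), sum(times[current_segment_index:])       # total and remaining journey times

    total_s, remaining_s = journey_times([(400, 8.0), (250, 5.0), (600, 10.0)], current_segment_index=1)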

US Pat. No. 10,891,869

GROUND COLLISION AVOIDANCE SYSTEM AND METHOD THEREOF

1. A system for collision avoidance, the system comprising:a first apparatus comprising a first terminal and a second terminal;
a first vehicle electrically coupled to the first terminal and having a first electrical characteristic;
a reference point electrically coupled to the second terminal and having a reference electrical characteristic; and
wherein the first apparatus is configured to determine a proximity between the first vehicle and a nearby object using the first electrical characteristic and the reference electrical characteristic.

US Pat. No. 10,891,868

EFFICIENT FLIGHT OPERATIONS BASED ON NATURALLY PRESENT ENERGY SOURCES OR SINKS

Amazon Technologies, Inc....

1. An aerial vehicle comprising:an airspeed sensor;
an altimeter;
a position sensor;
a thermometer;
at least one propulsion motor; and
a control system having at least one computer processor, wherein the control system is in communication with each of the airspeed sensor, the altimeter, the position sensor, the thermometer, and the at least one propulsion motor, and
wherein the control system is configured to execute a method comprising:
determining a first airspeed of the aerial vehicle at a first time, wherein the at least one propulsion motor is operating at a first power level at the first time;
determining a first altitude above a first ground-based location of the aerial vehicle at the first time;
determining a first temperature of air at the first altitude above the first ground-based location at the first time;
determining that the aerial vehicle is at a second altitude at a second time;
in response to determining that the aerial vehicle is at the second altitude at the second time,
determining a second airspeed of the aerial vehicle at the second time, wherein the at least one propulsion motor is operating at a second power level at the second time;
determining a second altitude above a second ground-based location of the aerial vehicle at the second time;
determining a second temperature of air at the second altitude above the second ground-based location at the second time;
calculating a first energy level of the aerial vehicle at the first time based at least in part on at least one of:
a product of one-half of a mass of the aerial vehicle and a square of the first airspeed;
a product of the mass of the aerial vehicle, an acceleration constant due to gravity and the first altitude; and
the first power level;
calculating a second energy level of the aerial vehicle at the second time based at least in part on at least one of:
a product of one-half of the mass of the aerial vehicle and a square of the second airspeed;
a product of the mass of the aerial vehicle, the acceleration constant due to gravity and the second altitude; and
the second power level;
determining a difference between the first energy level and the second energy level;
attributing the difference between the first energy level and the second energy level to a difference between the first temperature and the second temperature;
generating a map of at least a portion of a region including at least the second ground-based location, wherein the map comprises an indicator of naturally present energy at the second ground-based location; and
storing at least the map of at least the portion of the region in at least one data store,
wherein the indicator identifies a natural energy source associated with the second ground-based location if the second energy level exceeds the first energy level, and
wherein the indicator identifies a natural energy sink associated with the second ground-based location if the second energy level does not exceed the first energy level.
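
The recited energy terms are the familiar kinetic and potential energies, E = 1/2·m·v² + m·g·h; the sketch below computes the two levels and labels the second ground-based location as a source or sink, omitting the optional power-level term.

    G = 9.81  # m/s^2

    def energy_level(mass_kg, airspeed_ms, altitude_m):
        """E = 0.5*m*v**2 + m*g*h, the two energy terms the claim recites."""
        return 0.5 * mass_kg * airspeed_ms ** 2 + mass_kg * G * altitude_m

    def classify_location(mass_kg, v1, h1, v2, h2):
        e1 = energy_level(mass_kg, v1, h1)
        e2 = energy_level(mass_kg, v2, h2)
        # The claim attributes the energy difference to the temperature difference and labels the
        # second ground-based location accordingly.
        return ("natural energy source" if e2 > e1 else "natural energy sink"), e2 - e1

    label, delta = classify_location(mass_kg=12.0, v1=18.0, h1=120.0, v2=18.5, h2=135.0)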

US Pat. No. 10,891,867

DIGITAL COPILOT

The MITRE Corporation, M...

1. An electronic device, comprising:one or more processors;
memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving a destination airport for an aircraft;
determining whether visibility at the destination airport is below a threshold visibility;
in response to a determination that the visibility at the destination airport is below a threshold visibility, providing, via the electronic device, the aircraft with a visibility notification;
determining whether ceiling at the destination airport is below a threshold ceiling;
in response to a determination that the ceiling at the destination airport is below the threshold ceiling, providing, via the electronic device, the aircraft with a ceiling notification;
determining a phase of flight for the aircraft, wherein the phase of flight for the aircraft comprises arrival;
in response to determining the phase of flight for the aircraft, providing the aircraft with at least one notification based on the phase of flight for the aircraft;
determining whether the aircraft distance to the destination airport is less than a threshold distance;
determining whether the tower at the destination airport is open or closed;
in response to determining that the aircraft distance to the destination airport is less than a threshold distance and that the tower at the destination airport is open, providing, via the electronic device, the aircraft with a tower frequency notification; and
in response to determining that the aircraft distance to the destination airport is less than a threshold distance and that the tower at the destination airport is closed, providing, via the electronic device, the aircraft with a CTAF notification.
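
The arrival-phase logic reduces to a set of threshold checks; the threshold values and function layout in this sketch are illustrative only.

    def copilot_notifications(visibility_sm, ceiling_ft, distance_nm, tower_open,
                              visibility_min=3.0, ceiling_min=1000, distance_threshold_nm=10.0):
        """Threshold checks for the arrival phase (all thresholds are assumptions)."""
        notifications = []
        if visibility_sm < visibility_min:
            notifications.append("visibility notification")
        if ceiling_ft < ceiling_min:
            notifications.append("ceiling notification")
        if distance_nm < distance_threshold_nm:
            notifications.append("tower frequency notification" if tower_open else "CTAF notification")
        return notifications

    alerts = copilot_notifications(visibility_sm=2.5, ceiling_ft=800, distance_nm=8.0, tower_open=False)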

US Pat. No. 10,891,866

PARKING ASSIST APPARATUS

Hall Labs LLC, Provo, UT...

1. A parking assist system comprising:an overhead sensor mounted to a ceiling of a garage;
a screen in a vehicle;
a motion sensor attached to a garage door, wherein the motion sensor detects when the garage door is opening and when the garage door is closed; and
a controller in communication with the overhead sensor, the screen, and the motion sensor, wherein the controller:
receives a first signal from the motion sensor that indicates that the garage door is opening,
switches the overhead sensor to an awake mode in response to the indication that the garage door is opening, wherein the overhead sensor provides images to the controller during the awake mode,
receives an advertisement from a short-range wireless beacon,
determines an optimum parking position for the vehicle based on the advertisement,
receives an image from the overhead sensor,
draws one or more guidelines on the image to indicate the optimum parking position for the vehicle, wherein the image illustrates a position of the vehicle in relation to the optimum parking position, and
provides the image to the screen in the vehicle.

US Pat. No. 10,891,865

CONTROL DEVICE FOR LANE DEPARTURE WARNING DEVICE, VEHICLE, AND LANE DEPARTURE WARNING CONTROL METHOD

Isuzu Motors Limited, To...

1. A control device for a lane departure warning device that outputs a warning based on positional relationship between a vehicle and a lane boundary line, the control device for a lane departure warning device, comprising:a detection section that detects switching from an ON state to an OFF state of a brake; and
a warning control section that suppresses output of the warning once the detection section detects the switching until a predetermined time elapses from the switching,
wherein once the detection section detects the switching in a warning suppression state in which the suppression is enabled, the warning control section extends the time to suppress the warning.

US Pat. No. 10,891,864

OBSTACLE WARNING METHOD FOR VEHICLE

LG Electronics Inc., Seo...

1. A method of a first vehicle for warning a second vehicle of an obstacle, the method comprising:detecting a first obstacle through a laser sensor;
identifying a location of the second vehicle;
determining a blind spot of the second vehicle at the location of the second vehicle;
detecting a second obstacle in the blind spot of the second vehicle through the laser sensor; and
transmitting a danger message to the second vehicle,
wherein determining the blind spot of the second vehicle comprises determining the blind spot of the second vehicle based on location coordinates of the second vehicle, and a location and volume of the first obstacle.
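
A simplified 2-D sketch of the blind-spot determination; reducing the first obstacle's volume to an occluding radius and using an angular shadow sector are assumptions of this illustration, not the patented method.

    import math

    def blind_spot_sector(vehicle_xy, obstacle_xy, obstacle_radius_m):
        """Angular sector occluded by the first obstacle, as seen from the second vehicle."""
        dx, dy = obstacle_xy[0] - vehicle_xy[0], obstacle_xy[1] - vehicle_xy[1]
        distance = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx)
        half_width = math.asin(min(1.0, obstacle_radius_m / distance))
        return bearing - half_width, bearing + half_width, distance

    def is_hidden(vehicle_xy, obstacle_xy, obstacle_radius_m, target_xy):
        """True when target_xy lies inside the occluded sector and beyond the obstacle."""
        lo, hi, obstacle_dist = blind_spot_sector(vehicle_xy, obstacle_xy, obstacle_radius_m)
        dx, dy = target_xy[0] - vehicle_xy[0], target_xy[1] - vehicle_xy[1]
        bearing = math.atan2(dy, dx)
        return lo <= bearing <= hi and math.hypot(dx, dy) > obstacle_dist

    # Second obstacle hidden from the second vehicle -> the first vehicle would send a danger message.
    send_danger_message = is_hidden((0, 0), (30, 0), 2.0, (60, 1))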

US Pat. No. 10,891,863

VEHICLE AND TRIP DATA NAVIGATION FOR COMMUNICATION SERVICE MONITORING USING MAP GRAPHICAL INTERFACE

VIASAT, INC., Carlsbad, ...

1. An apparatus for monitoring a network communication system onboard a vehicle, the apparatus comprising:a network interface; and
control circuitry configured to:
obtain location data indicating geographic locations of a plurality of vehicles within a geographic area;
generate first graphical interface data simultaneously representing:
a map of the geographic area; and
a plurality of vehicle icons, each of the plurality of vehicle icons being at a position on the map corresponding to a respective geographic location of one of the plurality of vehicles;
in response to receiving an indication of a first user input associated with a first vehicle icon of the plurality of vehicle icons, generate second graphical interface data representing:
a vehicle identifier identifying a first vehicle of the plurality of vehicles associated with the first vehicle icon; and
a trip identifier for a trip associated with the first vehicle;
establish a network connection with an onboard server of the first vehicle using the network interface;
receive a set of vehicle data from the onboard server via the network connection;
receive a set of trip data associated with a communication service provided by the onboard server on the first vehicle during the trip;
in response to receiving an indication of a second user input associated with the vehicle identifier, generate third graphical interface data representing the set of vehicle data; and
in response to receiving an indication of a third user input associated with the trip identifier, generate fourth graphical interface data representing the set of trip data.

US Pat. No. 10,891,862

INFORMATION COMMUNICATION DEVICE AND POSITION MANAGEMENT SYSTEM

Honda Motor Co., Ltd., T...

1. An information communication apparatus that is held by a displacement body whose current position changes according to passage of time, the information communication apparatus comprising:a positional change amount acquiring section configured to acquire information concerning a change amount of the current position of the displacement body; and
a transmission control section configured to transmit information to outside of the displacement body,
wherein the transmission control section is configured to transmit current position transmission information that includes information concerning the current position of the displacement body at a prescribed time to the outside,
the transmission control section is configured to generate change amount transmission information as information that includes information concerning the change amount of the current position but does not include information concerning the current position, and
the transmission control section is configured to transmit the change amount transmission information to the outside, at a different timing than transmission of the current position transmission information to the outside.

US Pat. No. 10,891,861

INFORMATION PROVISION SYSTEM, SERVER, AND INFORMATION PROVISION METHOD

MITSUBISHI ELECTRIC CORPO...

1. An information provision system comprising:a receiver to receive request information and condition information, the request information including various pieces of information requested by in-vehicle devices mounted on a plurality of vehicles, respectively, and the condition information indicating internal and external conditions of each of the plurality of vehicles;
a controller to
set priorities for information transmission for respective types of the pieces of information included in the request information from each of the plurality of vehicles, and for the internal and external conditions of each of the plurality of vehicles indicated by the condition information, and
determine a transmission schedule for transmitting the pieces of information to the in-vehicle devices on a basis of the priorities, the request information and the condition information being received by the receiver; and
a transmitter to transmit the pieces of information to the in-vehicle devices, in accordance with the transmission schedule determined by the controller,
wherein the controller extends communication bandwidth allocated for transmission of the pieces of information in descending order of the priorities and determines the transmission schedule.
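
The controller's scheduling step can be pictured as a greedy allocation: requested items are ordered by priority and bandwidth is granted in descending order until it runs out. A short Python sketch, with invented field names and units, is given below as one plausible reading of that step.

def schedule_transmissions(requests, total_bandwidth):
    # requests: list of dicts with "type", "priority" (higher = more urgent) and
    # "size"; bandwidth is granted in descending priority order until exhausted.
    remaining = total_bandwidth
    plan = []
    for req in sorted(requests, key=lambda r: r["priority"], reverse=True):
        share = min(req["size"], remaining)
        if share <= 0:
            break
        plan.append({"type": req["type"], "allocated": share})
        remaining -= share
    return plan

print(schedule_transmissions(
    [{"type": "map update", "priority": 1, "size": 30},
     {"type": "hazard alert", "priority": 5, "size": 10},
     {"type": "weather", "priority": 3, "size": 20}],
    total_bandwidth=40))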

US Pat. No. 10,891,860

APPARATUS AND ASSOCIATED METHODS FOR NAVIGATION OF ROAD INTERSECTIONS

HERE Global B.V., Eindho...

1. An apparatus comprising:at least one processor; and
at least one memory including computer program code for one or more programs, the at least one memory and computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
filter two or more different complete sets of lane traversals based, at least in part, on one or more known safety-traffic flow restrictions;
generate safety-traffic flow scores for the two or more different complete sets of lane traversals according to one or more safety-traffic flow criteria, wherein the safety-traffic flow scores comprise a poor-scoring indicator, a similarly-scoring indicator, a penalty indicator, or a combination thereof;
identify, for use in route navigation, a complete set of lane traversals for a road intersection from the filtered two or more different complete sets of lane traversals based on the safety-traffic flow scores; and
provide one or more driving instructions on an audio user interface, a video user interface, or a combination thereof based on the complete set of lane traversals for use in navigating the road intersection,
wherein each lane traversal defines a path of travel from an inbound lane of the road intersection to an outbound lane of the road intersection, and
wherein each different complete set of lane traversals comprises a different combination of lane traversals for the road intersection.

US Pat. No. 10,891,859

DISPLAYING SENSOR DATA AND SUPPLEMENTAL DATA AS A MASK FOR AUTONOMOUS VEHICLES

Waymo LLC, Mountain View...

1. A method of displaying an environment near a vehicle, the method comprising:receiving, by one or more processors, data points generated by a lidar sensor representative of one or more objects in an external environment of the vehicle;
displaying, by the one or more processors, a scene including a representation of an environment near the vehicle; and
displaying, by the one or more processors, an animation in the scene, wherein the animation includes a pulse that makes temporarily visible on the scene a representation of at least one object of the one or more objects.

US Pat. No. 10,891,858

SYSTEMS AND METHODS FOR ALERTING DRIVERS TO ALERT ZONES

1. A computer-implemented method for communicating a message from an alert management computing device to at least some of a plurality of vehicles approaching an alert zone, said method comprising:receiving, by the alert management computing device, an alert request message including an identifier of a requester of the alert request message, a timestamp, a description of the alert zone including characteristics of the alert zone, and a location of the alert zone, the timestamp distinguishing an earlier alert request message from a later alert request message;
receiving, by the alert management computing device, vehicle location data from a plurality of vehicle user computing devices located in a respective plurality of vehicles, the vehicle location data includes a present vehicle location, and a present vehicle trajectory;
calculating, by the alert management computing device, a vehicle zone for each of the plurality of vehicles based on a projected vehicle location and a projected vehicle trajectory, wherein the projected vehicle location and the projected vehicle trajectory are based on the present vehicle location and the present vehicle trajectory, and a predetermined time to travel, the vehicle zone including a variable-sized vehicle zone larger than and surrounding each of the plurality of vehicles;
determining, by the alert management computing device, an overlap of the alert zone and each of the vehicle zones;
identifying, by the alert management computing device, a subset of the plurality of vehicles within the alert zone by comparing each vehicle zone to the alert zone, the subset of the plurality of vehicles consists of vehicles whose vehicle zone overlaps the alert zone; and
transmitting an alert message directly from the alert management computing device to the subset of the plurality of vehicles via the vehicle user computing devices, the alert message includes information regarding the alert zone location received in the alert request message and the calculated alert zone.
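
The overlap test in this claim is essentially geometric: each vehicle's present location and trajectory are projected forward over a predetermined travel time, a zone larger than the vehicle is drawn around the projected position, and an alert is sent when that zone intersects the alert zone. A toy Python sketch using circular zones (an assumption; the claim does not fix a shape) illustrates the idea.

import math

def project(pos, heading_deg, speed, t):
    # Project a location forward along the present trajectory for time t.
    rad = math.radians(heading_deg)
    return (pos[0] + speed * t * math.cos(rad),
            pos[1] + speed * t * math.sin(rad))

def zones_overlap(center_a, radius_a, center_b, radius_b):
    # Two circular zones overlap when their centers are closer than the sum of radii.
    return math.dist(center_a, center_b) <= radius_a + radius_b

def vehicles_to_alert(alert_center, alert_radius, vehicles, travel_time=5.0):
    # Return the ids of vehicles whose projected vehicle zone overlaps the alert zone.
    hits = []
    for v in vehicles:
        projected = project(v["pos"], v["heading_deg"], v["speed"], travel_time)
        if zones_overlap(alert_center, alert_radius, projected, v["zone_radius"]):
            hits.append(v["id"])
    return hits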

US Pat. No. 10,891,857

ELECTRONIC APPARATUS, ROADSIDE UNIT, AND TRANSPORT SYSTEM

KYOCERA Corporation, Kyo...

1. An electronic apparatus, comprising:a communication unit configured to wirelessly communicate with a roadside unit and a vehicle; and
at least one processor configured, when receiving through the communication unit a reception signal transmitted from the roadside unit, to determine whether to restrain transmission of a transmission signal through the communication unit, based on the reception signal,
wherein the at least one processor acquires information indicating a road to which the vehicle is prohibited from traveling, and
the at least one processor sets the road to a restraining area based on the information.

US Pat. No. 10,891,856

TRAFFIC DRONE SYSTEM

United Services Automobil...

1. An unmanned vehicle comprising:a processor; and
a memory coupled with the processor, the memory comprising executable instructions that when executed by the processor cause the processor to effectuate operations comprising:
receiving, from a device, an alert associated with an accident of a vehicle, wherein the device comprises a communication module of the vehicle;
responsive to receiving the alert, deploying to a location of the accident;
receiving, at the location of the accident, data from one or more vehicles; and
providing instructions, based at least on the data, for directing traffic, wherein the instructions for directing traffic comprises instructions communicated wirelessly to a traffic control device at the location of the accident of the vehicle.

US Pat. No. 10,891,855

METHOD TO SCHEDULE INTELLIGENT TRAFFIC LIGHTS IN REAL TIME BASED ON DIGITAL INFOCHEMICALS

DALIAN UNIVERSITY OF TECH...

1. A method to schedule intelligent traffic lights in real time based on digital infochemicals, DIs, comprising the following steps:step 1, collect digital infochemicals
according to the target requirements, a road is split into several cells; at time tick t, the traffic light system automatically collects the DIs generated by the traffic flow in each cell, and then updates the DIs through three processes, i.e., aggregation, evaporation, and propagation;
said aggregation refers to the accumulation of DIs generated by different vehicles within the same cell;
τi,t = τi,t−1 + ni,t  (1)
where τi,t−1 is the number of DIs in the ith cell at time t−1; ni,t is the number of vehicles in the ith cell at time t; τi,t is the updated number of DIs in the ith cell at time t;
said evaporation refers to the gradual deduction of DIs as time goes on:
τ′i,t = (1 − ρv)τi,t  (2)
where τi,t is the number of DIs in the ith cell at time t; ρv is the evaporation rate; τ′i,t is the number of DIs left after evaporation;
said propagation refers to the DIs propagating to the neighboring areas along the driving direction of the vehicles:
τ″i,t = (1 − ρp)τ′i,t  (3)
where τ′i,t is the number of DIs left after evaporation; ρp is the propagation rate, i.e., the percentage of DIs propagated to the neighboring areas; τ″i,t is the number of DIs left after propagation;
under synchronized update, the DIs in all the cells propagate simultaneously, and then receive the DIs propagated from other cells:
where Ω is the set of upstream cells whose DIs are propagated to the ith cell; Δτj,t is the DIs propagated from the jth cell and sprayed evenly over the cells it passes;
where τ′j,t is the DIs left after evaporation; ρpτ′j,t is the total DIs propagated to the neighboring areas; v is the propagation speed; Δ is the unit time length; vΔ is the distance the DIs are able to propagate within time Δ; CS is the cell length; vΔ/CS is the number of cells the DIs pass during propagation within time Δ;
step 2, adjust the Green/Cycle (g/C) ratio
assume t to be the beginning time of a signal cycle, i.e., mod(t, TC) = 0; then the traffic light adjusts the g/C ratio for the next signal cycle according to the number of DIs on the adjacent roads of the intersection in the current cycle:
where TiG is the green duration of the ith phase; Di is the number of DIs on the roads corresponding to the ith phase; ΣjDj is the total number of DIs on all the roads of the intersection; TC is the cycle length;
if t is not the beginning time of a signal cycle, then follow step 1 to collect the DIs for time t+1; this process forms an infinite loop and keeps updating.
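
Steps 1 and 2 of this claim describe a pheromone-style update (aggregation, evaporation, propagation) followed by a green-time split proportional to the DIs seen by each phase. The Python sketch below is a simplified reading, not the patented method: it propagates DIs only to the next downstream cell rather than over vΔ/CS cells, and it assumes the proportional g/C split implied by the variable definitions.

def update_cells(di, vehicles, rho_v, rho_p):
    # One synchronized DI update per road cell: aggregation (1), evaporation (2),
    # propagation (3), with the propagated share added to the next downstream cell.
    n = len(di)
    aggregated = [di[i] + vehicles[i] for i in range(n)]   # eq. (1)
    after_evap = [(1 - rho_v) * x for x in aggregated]     # eq. (2)
    updated = [(1 - rho_p) * x for x in after_evap]        # eq. (3)
    for i in range(n - 1):
        updated[i + 1] += rho_p * after_evap[i]            # simplified downstream spread
    return updated

def green_durations(dis_per_phase, cycle_length):
    # Split the cycle among phases in proportion to the DIs on each phase's roads.
    total = sum(dis_per_phase) or 1
    return [cycle_length * d / total for d in dis_per_phase]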

US Pat. No. 10,891,854

TRAFFIC MANAGEMENT SYSTEM

SUBARU CORPORATION, Toky...

1. A traffic management system comprising:a traffic information collector configured to collect a number of vehicles passing through a predetermined section for each traveling lane;
a vehicle sensor installed behind the predetermined section and configured to collect vehicle information of the vehicles passing through; and
a traffic control device comprising:
a traffic information acquirer configured to acquire, as traffic information, the number of vehicles traveling in the predetermined section that is collected by the traffic information collector, and the vehicle information collected by the vehicle sensor; and
a traffic manager configured to detect a congestion indication in the predetermined section for each traveling lane based on the traffic information acquired by the traffic information acquirer, the traffic manager comprising:
a traffic density calculator configured to determine a traffic density for each traveling lane based on the number of vehicles traveling in the predetermined section acquired by the traffic information acquirer and a section length of the predetermined section;
a traffic volume calculator configured to calculate a traffic volume for each traveling lane from the number of vehicles passing by the vehicle sensor based on the vehicle information which is acquired by the traffic information acquirer and collected by the vehicle sensor;
an actual traffic density calculator configured to correct the traffic density calculated by the traffic density calculator with the traffic volume calculated by the traffic volume calculator to determine an actual traffic density for each traveling lane;
a congestion indication determining unit configured to compare the actual traffic density for each traveling lane calculated by the traffic density calculator to a preset determination threshold value for determining a congestion indication to check a traveling lane exhibiting the congestion indication; and
an instruction signal transmitter configured to transmit a lane changing instruction signal to a vehicle traveling behind the vehicle sensor when a traveling lane exhibiting the congestion indication is detected by the congestion indication determining unit.
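
The density chain in this claim is simple arithmetic: density is the vehicle count over the section length, and the "actual" density corrects that figure with the volume measured at the downstream sensor. The sketch below uses the fundamental flow relation (volume ≈ density × speed) as the correction, which is only one plausible choice; the speed and threshold are illustrative.

def congestion_flags(vehicle_counts, section_length_km, volumes_vph,
                     assumed_speed_kmh=60.0, threshold_veh_per_km=40.0):
    # Per-lane: density from the traffic information collector, an "actual"
    # density corrected with the sensor-measured volume, then a threshold check.
    flags = []
    for count, volume in zip(vehicle_counts, volumes_vph):
        density = count / section_length_km
        actual = max(density, volume / assumed_speed_kmh)   # assumed correction step
        flags.append(actual >= threshold_veh_per_km)
    return flags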

US Pat. No. 10,891,853

METHOD AND SYSTEM FOR CLASSIFYING TRAFFIC FLOW

TNICO Technology Division...

1. A device for collecting traffic data comprising:a touch screen display;
a computer readable medium;
an orientation sensor configured to report a current orientation; and
a processor configured to execute instructions stored on said computer readable medium for:
displaying on said touch screen display a representation of an intersection comprising a plurality of legs in different directions;
displaying on said touch screen display a first plurality of count buttons on each one of said plurality of legs, each one of said first plurality of count buttons representing a vehicle type moving in a first direction of motion on each one of said plurality of legs;
displaying on said touch screen display a second plurality of count buttons on each one of said plurality of legs, each one of said second plurality of count buttons representing a vehicle type moving in a second direction of motion on each one of said plurality of legs;
associating each one of said first and second plurality of count buttons with an associated counter, said touch screen display configured to detect contact with each one of said first and second plurality of count buttons, and said processor configured to increment the associated counter in response to said contact;
wherein said processor is configured to record each associated counter with said first and second plurality of count buttons to a database along with a time code at the end of a predetermined slice of time;
setting an original orientation during a calibration period; and
remapping each one of said first and second plurality of count buttons to a different counter in response to an orientation change.

US Pat. No. 10,891,852

PRECISE PREDICTIVE MAINTENANCE METHOD FOR DRIVING UNIT

ITS CO., LTD., Ulsan (KR...

1. A precise predictive maintenance method for a driving unit, used for various facilities, the method comprising:a first base information collecting step S10 of collecting change information of an energy size in accordance with a time measured in a normal driving state of the driving unit by arbitrarily setting at least two or more time zones from the change information of the energy size and collecting energy values of the set arbitrary time zones;
a second base information collecting step S20 of collecting energy values of the same arbitrary time zones as in the first base information collecting step S10 from change information of an energy size in accordance with a time measured in a driving state of the driving unit before a malfunction of the driving unit is generated;
a setting step S30 of setting an alarm upper limit and an alarm lower limit for the energy values of the arbitrary time zones set based on the information collected in the first and second base information collecting steps S10 and S20; and
a detecting step S40 of measuring energy values of the arbitrary time zones set in the change information of the energy size in accordance with the time measured in the real-time driving state of the driving unit and detecting the driving unit to be an abnormal state when the measured energy value exceeds the alarm upper limit set in the setting step S30 or is lower than the alarm lower limit,
wherein an energy measured by the driving unit is selected from any one of a current consumed to drive the driving unit, a vibration generated during the driving of the driving unit, a noise generated during the driving of the driving unit, a frequency of a power source of the driving unit, a temperature, a humidity, and a pressure of the driving unit during the driving of the driving unit.
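
Steps S30 and S40 reduce to building an alarm band per monitored time zone and flagging any real-time energy value that leaves it. The Python sketch below assumes the band is derived from the normal-state measurements with a margin; how the pre-fault measurements shape that margin is not spelled out here, so that part is omitted.

def build_limits(normal_runs, margin=0.1):
    # normal_runs: one list of per-time-zone energy values per measured run.
    limits = []
    for zone_values in zip(*normal_runs):
        lo, hi = min(zone_values), max(zone_values)
        span = (hi - lo) or 1e-9
        limits.append((lo - margin * span, hi + margin * span))
    return limits

def is_abnormal(measured, limits):
    # Abnormal when any time zone's value exceeds its upper or lower alarm limit.
    return any(v < lo or v > hi for v, (lo, hi) in zip(measured, limits))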

US Pat. No. 10,891,851

ALARM SIGNALING TECHNOLOGY

Alarm.com Incorporated, ...

1. A server comprising:at least one processor; and
at least one non-transitory computer-readable storage medium coupled to the at least one processor having stored thereon instructions which, when executed by the at least one processor, causes the at least one processor to perform operations comprising:
monitoring signaling quality for signals received from at least one component of a monitoring system;
determining signaling quality for the at least one component of the monitoring system based on the monitoring;
determining a wait time for processing sensor data from the monitoring system based on the determined signaling quality for the at least one component of the monitoring system; and
processing the sensor data from the monitoring system based on the determined wait time for processing sensor data from the monitoring system.

US Pat. No. 10,891,850

HUMIDITY CONTROL SYSTEM

Waymo LLC, Mountain View...

1. A method comprising:receiving, by one or more processors, first information corresponding to a saturation percentage of a desiccant within a sensor housing of a sensor of a vehicle;
receiving, by the one or more processors, second information identifying temperature data corresponding to a first route along which the vehicle is projected to travel;
determining, by the one or more processors, a condensation risk corresponding to a likelihood of condensation forming within the sensor housing in the event that the vehicle travels the first route, based upon the saturation percentage and the temperature data; and
based upon the determined condensation risk, determining a second route for the vehicle to travel.
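
The routing decision combines two inputs: how saturated the desiccant already is, and how cold the forecast is along a candidate route. The toy Python sketch below scores that combination and picks a route; the scoring formula and thresholds are invented for illustration only.

def condensation_risk(saturation_pct, route_temps_c, cold_threshold_c=5.0):
    # Scale desiccant saturation by how far the coldest forecast temperature
    # on the route falls below a threshold; result stays in [0, 1].
    coldest = min(route_temps_c)
    cold_factor = min(1.0, max(0.0, (cold_threshold_c - coldest) / 20.0))
    return (saturation_pct / 100.0) * cold_factor

def choose_route(routes, saturation_pct, risk_limit=0.5):
    # Prefer a route whose risk stays under the limit; otherwise take the least risky.
    scored = [(condensation_risk(saturation_pct, temps), name)
              for name, temps in routes.items()]
    acceptable = [name for risk, name in scored if risk < risk_limit]
    return acceptable[0] if acceptable else min(scored)[1]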

US Pat. No. 10,891,849

SYSTEM FOR SUPPRESSING FALSE SERVICE OUTAGE ALERTS

Microsoft Technology Lice...

1. A system comprising:a processing unit; and
a storage device comprising instructions, which when executed by the processing unit, configure the processing unit to perform operations comprising:
receiving a service outage alert for a service used by an entity;
retrieving a current count of non-recurring meetings for the entity;
determining that the service outage alert is a false positive based on a current service load for the service and the current count; and
based on the determining, suppressing the service outage alert.
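
The suppression test rests on a single comparison: is the current service load about what the current count of non-recurring meetings would explain? If so, the outage alert is treated as a false positive. A hypothetical one-function Python sketch (coefficients invented) makes the shape of that check concrete.

def is_false_positive(current_load, nonrecurring_meetings, load_per_meeting=0.8, baseline=5.0):
    # Treat the alert as a false positive when the observed load is no more than
    # what the current meeting count would be expected to generate.
    expected_load = baseline + load_per_meeting * nonrecurring_meetings
    return current_load <= expected_load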

US Pat. No. 10,891,848

SYSTEM AND METHOD FOR VOLTAGE DETECTION AND COMMUNICATION BETWEEN ELECTRIC FIELD DETECTORS

GREENLEE TOOLS, INC., Ro...

1. A system provided in an environment, comprising:a first electric field detector operably connected with first field detection circuitry configured to detect a voltage in an electric field in the environment which meets or exceeds a voltage threshold;
a first communication module operably connected with the first field detection circuitry;
a first warning module operably connected with the first field detection circuitry and configured to provide a warning to operators based on the voltage meeting or exceeding the voltage threshold;
a second electric field detector operably connected with second field detection circuitry configured to detect a voltage in an electric field in the environment which meets or exceeds a voltage threshold;
a second communication module operably connected with the second field detection circuitry;
a second warning module operably connected with the second field detection circuitry and configured to provide a warning to operators based on the voltage meeting or exceeding the voltage threshold;
the first communication module configured to communicate a warning notification to the second electric field detector; and
the second communication module configured to receive the warning notification, and in response, communicate the received warning notification to the second field detection circuitry so as to provide the warning to operators based on the voltage meeting or exceeding the voltage threshold, via the second warning module.

US Pat. No. 10,891,847

VISIBLE INDICATION OF A PORT AS CONFIGURED TO MANAGEMENT FUNCTIONALITY

Hewlett Packard Enterpris...

1. An apparatus comprising:a panel;
a networking port coupled to the panel and configured for a networking functionality of a server; and
a light emitting diode (LED), coupled to the panel as separate from the networking port, and to provide a visible indication of a status of the network functionality, wherein the networking functionality comprises a management functionality of the server.

US Pat. No. 10,891,846

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM

SONY CORPORATION, Tokyo ...

1. An information processing device comprising:a communication unit configured to
receive, from a server, location information of another information processing device; and
a notifying unit configured to
output, based on the location information, a notification as to whether the another information processing device is within a predetermined range from the information processing device,
wherein the communication unit is further configured to
directly receive, from the another information processing device and based on the notification, identification information including an identifier specific to the another information processing device for identifying the another information processing device, the identification information being displayed at the another information processing device,
transmit the received identification information to the server which manages operation setting information of one or more sensors included in the another information processing device and configured to sense a user, the operation setting information including a state of the one or more sensors included in the another information processing device, and
receive, from the server, the operation setting information on the basis of the transmitted identification information,
wherein the notifying unit is further configured to display the another information processing device to have an appearance notifying that the operation setting information indicates that the state of the one or more sensors included in the another information processing device is not activated,
wherein a volume of the sound notification is a first level when the state of the one or more sensors is activated, and the volume of the sound notification is a second level different than the first level when the state of the one or more sensors is not activated,
wherein the state of one sensor of the one or more sensors and the state of another sensor of the one or more sensors are switched based on a remaining battery level or a measurement position of the one sensor or the another sensor, and
wherein the communication unit and the notifying unit are each implemented via at least one processor.

US Pat. No. 10,891,845

MOUTH AND NOSE OCCLUDED DETECTING METHOD AND SYSTEM THEREOF

NATIONAL YUNLIN UNIVERSIT...

1. A mouth and nose occluded detecting method, comprising:a detecting step, comprising:
a facial detecting step, wherein an image is captured by an image capturing device, and a facial portion image is obtained from the image according to a facial detection;
an image extracting step, wherein a mouth portion is extracted from the facial portion image according to an image extraction so as to obtain a mouth portion image; and
an occluded determining step, wherein the mouth portion image is entered into an occluding convolutional neural network so as to produce a determining result, and the determining result is an occluding state or a normal state; and
a warning step, wherein a warning is provided according to the determining result, when the determining result is the normal state, the detecting step is performed; when the determining result is the occluding state, the warning is provided;
wherein the occluding convolutional neural network comprises a softmax layer, the softmax layer comprises at least one image state, the mouth portion image, at least one image state parameter, at least one image state probability and an image state probability set, and the softmax layer is given by:

wherein y(i) is the at least one image state, k is a number of the image state, x(i) is the mouth portion image, θ1, θ2, . . . , θK are the image state parameters, p(y(i) = k | x(i); θ) is the image state probability, hθ(x(i)) is the image state probability set, and T denotes the matrix transpose.
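
The softmax layer referenced in the claim turns the per-state scores θk·x(i) into the image state probability set hθ(x(i)), and the larger probability decides between the occluding and normal states. A small, self-contained Python sketch of a generic softmax classifier follows; the weights and labels here are placeholders, not values from the patent.

import math

def softmax_probs(x, thetas):
    # p(y = k | x; theta) for each k, from the scores theta_k . x (theta_k^T x).
    scores = [sum(t_i * x_i for t_i, x_i in zip(theta, x)) for theta in thetas]
    m = max(scores)                              # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]             # the image state probability set h_theta(x)

def classify(x, thetas, labels=("occluding", "normal")):
    probs = softmax_probs(x, thetas)
    return labels[probs.index(max(probs))]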

US Pat. No. 10,891,844

ELECTRONIC BRACELET AND AN OFFENDER MONITORING SYSTEM

UPSTREEM, Jumet (BE)

1. An electronic bracelet (1) for monitoring an offender comprising a monitoring device (11), a strap (12) for attaching the monitoring device (11) to a wrist of the offender and a tamper detection device for detecting a bracelet attachment error, and wherein said monitoring device (11) comprises a housing (15) and a screen (13) coupled to said housing (15), said housing (15) enclosing a geographic location determining device (25) for determining a current location;
a rechargeable battery (22);
one or more computer-readable storage media (21) comprising i) offender configuration data defining home curfew rules and geographical restriction rules, and ii) a computer program;
a processor (20) adapted to execute said computer program; and
a date and time determining device (27) configured for determining a current date and time; characterized in that said housing (15) further encloses
a first communication module (23) configured for wireless communication with a central monitoring station across a wide area network; and
a second communication module (24) configured for wireless communication with a home station across a local area network; and in that said computer program comprises a first algorithm, when executed, to perform a step of transmitting a strap alarm signal to said central monitoring station if a bracelet attachment error is detected by said tamper detection device;
a second algorithm, when executed, to perform a step of displaying said current date and time on said screen (13); and
a third algorithm, when executed, to perform steps of
a) verifying if the offender is authorized to leave home or not by comparing the current date and time with the home curfew rules;
b) verifying if said second communication module (24) can establish a communication connection with said home station across said local area network;
c) if no communication connection can be established with the home station and if the offender is authorized to leave home then periodically acquiring the current location of the electronic bracelet with said geographical location determining device (25), comparing the acquired current location with said geographical restriction rules and if a geographical restriction rule is violated then
i) displaying a first text message on said screen (13) instructing the offender to take a first action, and
ii) transmitting a geographical location violation signal to said central monitoring station indicating an occurrence of a violation of the geographical restriction rules; and
d) if no communication connection can be established with the home station and if the offender is not authorized to leave home then
i) displaying a second text message on said screen (13) instructing the offender to take a second action.
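
The third algorithm is a branch structure: check the curfew rules against the current date and time, check whether the home station is reachable, and only then fall back to location checks and violation reporting. The Python sketch below mirrors that branching with invented data shapes (a time-window curfew and a bounding-box restriction zone).

from datetime import datetime

def inside(position, zone):
    # Stand-in for the geographical restriction rules: a simple bounding box.
    (x, y), (xmin, ymin, xmax, ymax) = position, zone
    return xmin <= x <= xmax and ymin <= y <= ymax

def check_offender(now, home_reachable, curfew, position, allowed_zone):
    # Decide which bracelet action applies, following the claim's branch order.
    authorized_out = curfew["leave_after"] <= now.time() <= curfew["return_by"]
    if home_reachable:
        return "connected_to_home_station"
    if authorized_out:
        if not inside(position, allowed_zone):
            return "display_first_message_and_report_geo_violation"
        return "ok"
    return "display_second_message_return_home"

print(check_offender(datetime(2024, 1, 1, 14, 0), home_reachable=False,
                     curfew={"leave_after": datetime(2024, 1, 1, 8, 0).time(),
                             "return_by": datetime(2024, 1, 1, 18, 0).time()},
                     position=(3.0, 4.0), allowed_zone=(0, 0, 10, 10)))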

US Pat. No. 10,891,843

METHODS AND SYSTEMS FOR MANAGING HAZARD RISK BASED ON LOCATION AND INCIDENT DATA

INTERNATIONAL BUSINESS MA...

1. A method, by one or more processors, for managing hazard risk comprising:detecting a presence of an individual at a location, the location comprising a certain geographical position;
assigning a hazard risk score based on at least one data source associated with at least one of the individual and the location, wherein the hazard risk score is representative of a risk of injury to the individual at the location stemming from detected prior incidents at the location, and wherein detecting the prior incidents includes utilizing a deep learning neural network (DLNN) machine learning algorithm to analyze input features pertaining to geo-spatial conditions to perform object detection at the location from an incident vector matrix; and
causing a notification of the assigned hazard risk score to be generated.

US Pat. No. 10,891,842

SYSTEMS AND METHODS FOR ADAPTIVELY MONITORING FOR AN ENVIRONMENTAL ANOMALY USING A DESIGNATED BRIDGING ID NODE FOR A REMOTE MONITOR BEACON

FEDEX CORPORATE SERVICES,...

1. A method for adaptively monitoring a shipping container for an environmental anomaly selectively using elements of a wireless node network having at least a plurality of ID nodes disposed within the shipping container and a command node associated with the shipping container maintaining a plurality of packages, the ID nodes being lower level elements of the wireless node network and the command node being a mid-level element of the wireless node network, and wherein the command node being operative to communicate with each of the ID nodes and an external transceiver associated with a transit vehicle, the method comprising:designating, by the command node, a group of monitor beacons from the ID nodes, wherein each member of the group of monitor beacons broadcasts according to a communication profile associated with that member of the group of monitor beacons, wherein each member of the group of monitor beacons is deployed at a different location within the shipping container, and wherein the group of monitor beacons includes at least a remote monitor beacon located outside a reception range of the command node;
programmatically configuring, by the command node, at least another of the ID nodes not included in the group of monitor beacons to be a dedicated bridging node providing a dedicated intermediary communication link between the command node and the remote monitor beacon, the dedicated bridging node being deployed within the reception range of the command node and a broadcast range of the remote monitor beacon;
receiving, by the command node, a respective broadcast signal from each respective member of the group of monitor beacons,
wherein the command node directly receives the respective broadcast signal from the group of monitor beacons not including the remote monitor beacon, and
wherein the command node indirectly receives the respective broadcast signal from the remote monitor beacon through the dedicated intermediary communication link provided by the dedicated bridging node;
monitoring, by the command node, the received respective broadcast signals from the group of monitor beacons for an unanticipated state of ceased broadcasting from any of the group of monitor beacons;
identifying, by the command node, an unresponsive subset from the group of monitor beacons to be in the unanticipated state of ceased broadcasting based upon the monitoring step;
detecting, by the command node, the environmental anomaly when a size of the unresponsive subset of the group of monitor beacons exceeds a threshold setting maintained by the command node;
automatically generating, by the command node, an alert notification related to the detected environmental anomaly for the shipping container; and
transmitting, by the command node, the alert notification to the external transceiver to initiate a mediation response related to the detected environmental anomaly.
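
Anomaly detection here boils down to bookkeeping at the command node: track when each monitor beacon was last heard (directly or via the bridging node), collect the ones that have gone silent beyond their expected broadcast interval, and raise the alert when that subset outgrows the threshold. A compact Python sketch with assumed parameter names:

def detect_environmental_anomaly(last_heard, now, silence_timeout, threshold):
    # last_heard: mapping of beacon id -> time of the most recent broadcast.
    unresponsive = [node for node, t in last_heard.items()
                    if now - t > silence_timeout]
    anomaly = len(unresponsive) > threshold
    return unresponsive, anomaly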

US Pat. No. 10,891,841

APPARATUS AND SYSTEM FOR CAPTURING CRIMINALS

1. A gripper assembly for capturing criminals from overhead comprising: gripper palm;
laser, attached to said gripper palm;
remote button, to activate said laser;
ejection chute, constructed onto said gripper palm;
tether, attached to inside of said ejection chute and said tether stored in said ejection chute;
entrapping net, attached to said tether and said entrapping net stored in said ejection chute;
compressed gas cartridge, attached to said gripper palm;
electric motor, attached to said gripper palm;
said electric motor to puncture said compressed gas cartridge to release compressed gas cartridge contents into said ejection chute to eject said entrapping net;
remote button, to activate said electric motor;
two cameras, attached to said gripper palm;
remote button, to activate said cameras;
speaker, attached to said gripper palm;
remote button, to activate said speaker;
LED, attached to said gripper palm;
remote button, to activate said LED;
infrared emitter, attached to said gripper palm;
remote button, to activate said infrared emitter;
microphone, attached to said gripper palm;
remote button, to activate said microphone;
two electrically motorized multiple jointed mechanical fingers, attached to said gripper palm;
metal electrical shock causing surfaces, attached to said mechanical fingers;
contact button, attached to said gripper palm;
said contact button to activate said mechanical fingers;
remote button, to activate said mechanical fingers;
remote button, to activate said shock causing surfaces;
an electrically motorized rotating triple axis mounting base, attached to said gripper palm;
remote button, to activate said mounting base;
remote joysticks, to control said mounting base;
remote speaker, to relay the sound detected by said microphone;
remote button, to activate said remote speaker;
remote microphone, to transmit sound to said speaker;
remote button, to activate said remote microphone;
remote display panel, to display images from said cameras;
remote button, to activate said remote display panel;
remote virtual reality headset, to display images from said cameras; and
remote button, to activate said virtual reality headset.

US Pat. No. 10,891,840

SYSTEMS AND METHODS FOR MONITORING COMPONENTS OF AND DETECTING AN INTRUSION INTO AN AUTOMATED TELLER MACHINE

Capital One Services, LLC...

1. An automated teller machine (ATM), comprising:a housing comprising an interior surface;
a substance adhered to the interior surface, the substance comprising a piezoelectric element; and
a detection circuit coupled to the substance, the detection circuit being configured to:
receive a response pattern generated by the piezoelectric element based on a vibration of at least one component within the ATM;
compare the response pattern to a stored signature corresponding to an expected vibration pattern of the at least one component to detect a vibration pattern indicative of intrusion, the stored signature being derived by the ATM from vibrations of the at least one component within the ATM occurring during a calibration period; and
generate, based on the comparison detecting a vibration pattern indicative of intrusion, an indication of an intrusion into the housing.
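
The detection circuit's comparison can be read as a distance test between the live piezoelectric response pattern and the signature recorded during calibration. The Python sketch below uses a normalized error against a tolerance, which is one plausible comparison rather than necessarily the patented one.

def intrusion_indicated(response, signature, tolerance=0.2):
    # Normalized mismatch between the live response pattern and the stored
    # calibration signature; exceeding the tolerance suggests intrusion.
    if len(response) != len(signature):
        raise ValueError("patterns must be the same length")
    mismatch = sum((r - s) ** 2 for r, s in zip(response, signature))
    reference = sum(s ** 2 for s in signature) or 1e-9
    return (mismatch / reference) ** 0.5 > tolerance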

US Pat. No. 10,891,839

CUSTOMIZABLE INTRUSION ZONES ASSOCIATED WITH SECURITY SYSTEMS

Amazon Technologies, Inc....

1. A method comprising:storing, by a network device, intrusion zone data representing an intrusion zone for an audio/video recording and communication device (A/V device), the intrusion zone being associated with at least:
a motion zone from a plurality of motion zones associated with the A/V device, the motion zone representing a portion of a field of view (FOV) of the A/V device;
a conditional setting; and
at least one action that a security system is to perform based at least in part on the A/V device detecting motion within the motion zone and the conditional setting being satisfied;
receiving, by the network device, motion data from the A/V device, the motion data indicating that the A/V device detected motion within the motion zone;
determining, by the network device, that the conditional setting for the intrusion zone is satisfied; and
after receiving the motion data from the A/V device, and after determining that the conditional setting for the intrusion zone is satisfied, determining, by the network device, to cause the security system to perform the at least one action.

US Pat. No. 10,891,838

DETECTING DEVICE AND CONTROL SYSTEM WITH SUCH DETECTING DEVICE

CARRIER CORPORATION, Pal...

1. A detection apparatus, comprising:a passive infrared sensor and a Fresnel lens provided on the passive infrared sensor, characterized in that the detection apparatus further comprises a rotation unit, the rotation unit being configured to drive the passive infrared sensor and the Fresnel lens to rotate together.

US Pat. No. 10,891,837

MONITORING OPERATIVES IN HAZARDOUS ENVIRONMENTS

Wearable Technology Limit...

1. A system for monitoring operatives in hazardous environments, comprising:a base station;
items of clothing with illuminatable devices connected to a control unit by a wiring loom;
condition detection devices, each attachable to a respective item of clothing of said items of clothing for producing condition data; and
non-cellular radio devices, each attachable to said respective item of clothing of said items of clothing, wherein:
said control units include a processor and a cellular radio communication module, said cellular radio communication module is configured to communicate with said base station over a cellular radio network to transmit uplink signals and receive downlink signals under control of said processor;
said condition detection devices supply said condition data to a connected control unit over said wiring loom;
said control units transmit said condition data to said base station when said downlink signals are being received, over said cellular radio network, via said cellular radio communication module;
said control units transfer said condition data to a connected non-cellular radio device over said wiring loom when said downlink signals are not being received;
said connected non-cellular radio devices transmit said condition data received from said control units to said base station over a non-cellular radio network;
said base station includes a visual display device for displaying a graphical interface;
said graphical interface displays an indicator for each connected device; and
an attribute of an indicator changes in a response to operational changes to said respective connected device.

US Pat. No. 10,891,836

INPUT/OUTPUT PORT MOUNTING UNIT INTEGRATED TYPE DISPLAY FRAME FOR POS EQUIPMENT

POSBANK CO., LTD, Seoul ...

1. An input/output port mounting unit integrated type display frame (300) for POS equipment, comprising:a display panel mounting unit (310), in which a display panel (50) is loaded, formed at a front surface thereof;
an input/output port mounting unit (320) formed at a rear surface thereof and having an input/output port mounting hole (322), in which an input/output port (31) provided at a main substrate (30) is inserted and coupled, the input/output port mounting unit having a part cut from an input/output port mounting unit perforation (321) and being formed to be bent towards a rear direction so as to be protruded therefrom; and
a lower main ventilation hole (330) formed at a lower side of the rear surface thereof.

US Pat. No. 10,891,835

CASH STORAGE APPARATUS

CASIO COMPUTER CO., LTD.,...

1. A cash storage apparatus comprising:a cash storage case; and
a housing comprising:
a first surface and a second surface substantially parallel to the first surface; and
a third surface and a fourth surface substantially parallel to the third surface,
wherein in a horizontal orientation state of the housing relative to a placement surface on which the housing is placed, the first surface faces a front direction relative to the placement surface, the second surface faces a back direction relative to the placement surface, the third surface faces an upward direction relative to the placement surface and the fourth surface faces a downward direction relative to the placement surface;
the cash storage case is configured to be pulled out or pushed out away from the first surface of the housing in the front direction; and
a depth of the housing between the first surface and the second surface is greater than a height of the housing between the third surface and the fourth surface, and
where the housing is rotated and turned to be in a vertical orientation state of the housing relative to the placement surface, such that the first surface faces the upward direction, the second surface faces the downward direction, the third surface faces the front direction, the fourth surface faces the back direction, a first portion of the third surface forms an open/close cover configured to be moved in the front direction away from a second portion of the third surface to an open position to expose the cash storage case arranged within the housing and to be moved in the back direction toward the second portion of the third surface to a closed position.

US Pat. No. 10,891,834

AUTOMATIC TRANSACTION APPARATUS AND CONTROL METHOD THEREOF

HITACHI-OMRON TERMINAL SO...

1. An automatic transaction apparatus which performs a payout transaction according to a request from a user, comprising:a plurality of modules which each execute processing required for the payout transaction;
an overall controller configured to control each of the modules; and
a security controller,
wherein the security controller is configured to hold in advance a first list in which content of signals which are exchanged between the overall controller and each of the modules at the time of a normal payout transaction being recorded in the order in which the signals are exchanged, to sequentially record the content of the signals which are exchanged between the overall controller and each of the modules at the time of the actual payout transaction in a second list in the order in which the signals are exchanged, and to issue a signature approving operation of the modules when there is a match between the content of the first list and the content of the second list, and
wherein the modules execute a corresponding operation when the security controller has issued the signature,
the modules further comprising:
a card reader configured to read information that is recorded on a card of the user that is presented by the user, the card reader comprising:
a card lock mechanism configured to lock the card inserted by the user inside the apparatus;
a card reader controller configured to control the card lock mechanism;
a list holder configured to hold a third list that associates each of the commands supplied to the card reader from the overall controller with respective command execution authorizations for executing the commands;
a command execution authorization holder configured to hold the command execution authorizations assigned to its own apparatus; and
an authentication unit configured to execute predetermined authentication processing, wherein, in the third list, the command execution authorization for executing a card lock cancellation command, which is the command for canceling the lock when the card has been locked by the card lock mechanism, is associated with a second command execution authorization, which is the command execution authorization for executing a command of a higher security than the first command execution authorization, which is the command execution authorization normally assigned to the apparatus,
wherein the authentication unit is configured to execute the authentication processing according to a request from a high-level host or according to an external operation,
wherein the card reader controller is configured to update the command execution authorization held in the command execution authorization holder from the first command execution authorization to the second command execution authorization when the authentication processing by the authentication unit has been performed normally, wherein, when the card lock cancellation command has been supplied from the overall controller, the card reader controller is configured to confirm the command execution authorization held by the command execution authorization holder and the command execution authorization for executing the card lock cancellation command registered in the third list, and
wherein the card reader controller is configured to cancel the lock on the card by causing the card lock mechanism to operate when the command execution authorization assigned to its own apparatus is the second command execution authorization.

US Pat. No. 10,891,833

METHOD OF FORMING A LOTTERY TICKET WITH A TRANSLUCENT SUBSTRATE

Pollard Banknote Limited,...


and preventing observation by an intruder of any ink material from the lottery game indicia which has migrated from above the white lilypad to the translucent polymer material by printing onto a rear surface of the translucent polymer material at a position aligned with the opaque lilypad a coating which is arranged to absorb or reflect any electromagnetic energy applied thereto;
said coating being printed directly onto the translucent polymer material having the above characteristics without any intervening ink material arranged to display lottery game indicia;
and printing the lottery game indicia in an arrangement for playing a game related to the lottery ticket using at least said lottery game indicia.

US Pat. No. 10,891,832

SYSTEM AND METHOD FOR TRACKING MACHINE USE

17. A pool table machine, comprising:a pool ball release switch;
an electric relay that is connected to the pool ball release switch;
a tracking device that is connected to the electric relay; and
an electronic payment device that is connected to the electric relay;
wherein the electric relay is configured to transfer a first electrical pulse from the pool ball release switch to the tracking device;
wherein the electric relay is configured to transfer a second electrical pulse from either the electronic payment device or the pool ball release switch to the tracking device; and
wherein the electric relay transfers the first electrical pulse or the second electrical pulse when the pool ball release switch releases a plurality of pool balls from a cavity of the pool table machine.

US Pat. No. 10,891,831

METHOD AND ASSOCIATED HARDWARE FOR AWARDING A BONUS

Aristocrat Technologies, ...

1. A controller of a central control system configured to communicatively interface with a plurality of electronic gaming machines (EGMs), each EGM including a player interface, a display, and a credit mechanism, the controller configured to execute instructions stored on a memory, which when executed, cause the controller to at least:generate an operator interface that enables definition of a plurality of target combinations by an operator of the central control system, the operator interface including a first input field that enables the operator to select between a rules engine mode and a custom entry mode for defining the target combinations, wherein the selection of the rules engine mode causes the operator interface to display:
a second input field that enables the operator to select a number of cards;
a third input field that enables the operator to select a start of a card range;
a fourth input field that enables the operator to select an end of the card range;
a fifth input field that enables the operator to select a relationship associated with at least one of the start and the end of the card range; and
a sixth input field that enables the operator to select between a same rank and a same suit for the cards defined by the second through the fifth input field, such that each of the target combinations is one of a predefined sub-set of combinations in a predefined sub-set of poker hands;
define the plurality of target combinations to include each combination of the cards that satisfies the operator input into the second through the sixth input field;
store a counter value and the plurality of target combinations within the memory;
initialize the counter value to equal a total number of the plurality of target combinations;
cause the plurality of target combinations to be displayed on the display of each of the EGMs;
receive a signal from a first EGM of the plurality of EGMs indicating that the first EGM has executed a game and including information indicative of a first game outcome, the first game outcome defining a combination of cards in a first poker hand;
credit a credit balance associated with the player of the first EGM if the first game outcome is a winning outcome;
subsequently to crediting the credit balance of the first player, determine if the combination of cards in the first poker hand is one of the predefined sub-set of combinations in the predefined sub-set of poker hands, and, if so, highlight the corresponding one of the plurality of target combinations on the display of each of the EGMs and decrement the counter value by one;
receive a further signal from a second EGM of the plurality of EGMs indicating that the second EGM has executed a further game and including information indicative of a second game outcome, the second game outcome defining a further combination of cards in a second poker hand, wherein the second poker hand is generated independently of the first poker hand and the plurality of target combinations;
credit the credit balance associated with the player of the second EGM if the second game outcome is a winning outcome;
subsequently to crediting the credit balance of the second player, determine if the further combination of cards in the second poker hand is another of the predefined sub-set of combinations in the predefined sub-set of poker hands and, if so, i) highlight the corresponding another of the plurality of target combinations on the display of each of the EGMs and ii) decrement the counter value by another one;
compare the counter value to a threshold value of zero; and
one of: i) award a bonus to a respective credit balance associated with at least one of the players of the plurality of EGMs if the counter value is equal to the threshold value, or ii) continue to execute one or more additional games on each of the plurality of EGMs to generate one or more additional game outcomes and, in response to the one or more additional game outcomes satisfying others of the predefined sub-set of combinations in the predefined sub-set of poker hands, highlight corresponding others of the plurality of target combinations on the displays of the plurality of EGMs and decrement the counter until the counter value reaches the threshold value.

US Pat. No. 10,891,830

GAMING DEVICES AND SYSTEMS FOR PRESENTING IMPROVED BLACKJACK TYPE WAGERING GAMES

1. A method of presenting a wagering game to a plurality of players comprising the steps of:providing a gaming table having a playing surface and a game layout;
receiving, via at least one input device of said gaming table, player input of a game wager upon either a banker hand or a player hand from each player;
dealing an initial two card player hand and an initial two card banker hand;
determining if either or both of said initial two card player hand and said initial two card banker hand comprise a blackjack;
when either said initial two card player hand or said initial two card banker hand comprise a blackjack, determining an outcome of said game with reference to said initial two card banker hand and said initial two card player hand and resolving said game wager of each player based upon said determined outcome;
when neither said initial two card player hand or said initial two card banker hand comprise a blackjack, completing said initial two card player hand and said initial two card banker hand into a final player hand and a final banker hand according to one or more predefined rules regarding the dealing of additional cards to said initial player hand and initial banker hand, determining an outcome of said game with reference to said final player hand and said final banker hand by at least determining which of said final player hand or final banker hand has a point total closest to or equal to 21, without exceeding 21, and resolving said game wager of each player based upon said determined outcome, except that if said final player hand and final banker hand each have the same point total of 21 or less, declaring the outcome as a push; and
wherein the step of resolving said game wager of each player based upon said determined outcome comprises:
a) returning each player's game wager in the event said outcome is determined to be a push;
b) paying winnings to each player who placed a game wager on a winning player hand and collecting the game wager of each player who placed a game wager on a losing banker hand;
c) unless a winning banker hand is a three card hand with a point total of 17, paying winnings to each player who placed a game wager on a winning banker hand and collecting the game wager of each player who placed a game wager on a losing player hand; and
(d) if the winning banker hand is a three card hand with a point total of 17, returning the game wager to each player who placed a game wager on the winning banker hand.
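
The resolution rules (a) through (d) form a short decision tree over the final totals and the banker hand's card count. The Python sketch below settles a single wager under those rules; cases the claim leaves open (a double bust, and the player-side wager when a three-card banker 17 wins) are pushed here as an assumption.

def resolve(bet_side, player_total, banker_total, banker_cards, wager):
    # bet_side is "player" or "banker"; returns the bettor's net result.
    if player_total > 21 and banker_total > 21:
        return 0                                  # double bust: not specified; pushed in this sketch
    if player_total == banker_total and player_total <= 21:
        return 0                                  # rule (a): equal totals of 21 or less push
    player_wins = banker_total > 21 or (player_total <= 21 and player_total > banker_total)
    if player_wins:
        return wager if bet_side == "player" else -wager    # rule (b)
    if banker_cards == 3 and banker_total == 17:
        return 0                                  # rules (c)/(d): banker wagers returned; player side pushed here
    return wager if bet_side == "banker" else -wager        # rule (c)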

US Pat. No. 10,891,829

SYSTEM AND METHOD FOR GENERATING CUSTOMIZED ODDS BET FOR AN EVENT

CANTOR INDEX, LLC, New Y...

1. A method comprising:generating, by at least one processor, for a field of participants for an event, particular data representing odds for each participant to finish in a particular subset of finishing positions in the event;
receiving, by the at least one processor, from at least one computer interface, data representing customized odds for a bet;
selecting, by the at least one processor, a lead participant from the field of participants;
selecting, by the at least one processor, one or more additional participants from the field of participants such that the odds associated with the lead participant combined with the odds associated with the one or more selected additional participants at least approximates the customized odds for the bet;
representing, by the at least one processor, a bet on a group of participants that comprises the lead participant and the additional participants;
communicating, by the at least one processor, the data representing a bet to at least one computer interface; and
determining, by the at least one processor, whether a bet associated with the customized odds cannot be established and, in response, generating data establishing a bet on the lead participant based upon the odds of the lead participant.
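
Building the group bet amounts to a search: fix a lead participant, then add participants until the combined odds of the group come close enough to the customized odds, falling back to a bet on the lead participant alone when no combination works. The Python sketch below combines decimal odds by summing implied probabilities, one simple convention; all names and tolerances are assumptions.

from itertools import combinations

def build_group_bet(field, target_odds, lead=None, max_extra=3, tolerance=0.05):
    # field: mapping of participant -> decimal odds; the lead defaults to the favourite.
    lead = lead or min(field, key=field.get)
    others = [p for p in field if p != lead]
    best_group, best_err = [lead], abs(field[lead] - target_odds)
    for r in range(1, max_extra + 1):
        for combo in combinations(others, r):
            group = [lead, *combo]
            implied = sum(1.0 / field[p] for p in group)   # combined implied probability
            if implied >= 1.0:
                continue
            err = abs(1.0 / implied - target_odds)
            if err < best_err:
                best_group, best_err = group, err
    if best_err / target_odds > tolerance:
        return [lead]       # customized odds cannot be established: bet on the lead alone
    return best_group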

US Pat. No. 10,891,828

HEAD TO HEAD SYSTEMS

Gamblit Gaming, LLC, Mon...

1. An electromechanical gaming machine constructed to receive currency, comprising:a real world controller connected to a game world controller, wherein the real world controller is constructed to:
accept from the game world controller, a trigger to run a gambling game; and
distribute to the game world controller, in response to the trigger, a randomly generated payout of real world credits from a wager in the gambling game; and
the game world controller connected to the real world controller and connected by a network to an entertainment software controller executing a multiplayer entertainment game, wherein the game world controller is constructed to:
receive from the entertainment software controller via the network, a plurality of players' actions taken during the plurality of players' execution of the multiplayer entertainment game; and
trigger the wager in the gambling game based on the players' actions taken during the plurality of players' execution of the multiplayer entertainment game,
wherein the game world controller utilizes a gambling controller constructed to:
enter the plurality of players into a multiplayer simultaneous gambling session;
parameterize wager terms of the wager made in the gambling game based on information associated with each of the plurality of players entered into the multiplayer simultaneous gambling session;
trigger the wager in the gambling game during the multiplayer simultaneous gambling session based on the plurality of players' actions;
distribute the randomly generated payout of real world credits as a result of the wager in the gambling game during the multiplayer simultaneous gambling session between the plurality of players of the multiplayer entertainment game entered into the multiplayer simultaneous gambling session;
determine a payout of resources utilized by the plurality of players in the multiplayer entertainment game gameplay session based on the result of the wager; and
distribute to the entertainment software controller via the network, the payout of resources for utilization by the plurality of players in the entertainment game during the multiplayer entertainment game gameplay session.

US Pat. No. 10,891,827

METHOD FOR SHARING GAME PLAY ON AN ELECTRONIC GAMING DEVICE

ACRES TECHNOLOGY, Las Ve...

1. At least one non-transitory computer readable medium that stores a plurality of instructions, the plurality of instructions, when executed by at least one processor, causes the at least one processor to:receive a first game credit via an electronic gaming device from a first player at a first gaming station of the electronic gaming device;
receive a second game credit via the gaming device from a second player at a second gaming station of the electronic gaming device;
receive an input from at least one of the players at the electronic gaming device to allocate at least a portion of their game credits to a player-selectable wager amount for each outcome in each of the group consisting of an outcome of a first game, an outcome of a second game, and the better of the first and second game outcomes;
receive a first signal responsive to an input made by the first player at the electronic gaming device indicating that the player has initiated the first game, the first game being displayed at least at the first gaming station and generating the first game outcome;
receive a second signal responsive to an input made by the second player at the electronic gaming device indicating that the player has initiated the second game separate from the first game;
substantially synchronize display of the outcomes of the first and second games;
display the outcome of the second game adjacent display of the outcome of the first game; and
award prizes to each player based on the outcomes.

US Pat. No. 10,891,826

GAMING SYSTEM HAVING ASYNCHRONOUS MOTION OF SYMBOLS DETERMINING AWARD OUTCOMES

SG Gaming, Inc., Las Veg...

1. A gaming system comprising:an input device configured to detect a physical item associated with monetary value that establishes a credit balance;
an electronic display device; and
game-logic circuitry configured to:
initiate a wagering game responsive to an input indicative of a wager drawn on the credit balance;
direct the electronic display device to display a gamescape including an award zone and a plurality of symbols moving around the gamescape, and
trigger an award sequence in response to a predetermined threshold of the plurality of symbols being completely inside the award zone at the same time, the predetermined threshold including at least two moving symbols, the award sequence including directing the electronic display device to display stopping the movement of the at least two symbols and determining an award value to be credited to a player, the award value being associated with at least the symbols of the triggering predetermined threshold.
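
The triggering condition in the claim above is geometric: at least a predetermined number of moving symbols must be completely inside the award zone at the same time. A small sketch of that containment test, assuming circular symbols and a rectangular zone (both assumptions, since the claim does not fix the shapes):

# Sketch: trigger an award when at least `threshold` moving symbols are
# completely inside a rectangular award zone at the same time.
# Symbols are modelled as circles (x, y, radius); the zone as (xmin, ymin, xmax, ymax).

def completely_inside(symbol, zone):
    x, y, r = symbol
    xmin, ymin, xmax, ymax = zone
    return (x - r >= xmin and x + r <= xmax and
            y - r >= ymin and y + r <= ymax)

def check_award_trigger(symbols, zone, threshold=2):
    inside = [s for s in symbols if completely_inside(s, zone)]
    return (len(inside) >= threshold), inside

symbols = [(5.0, 5.0, 1.0), (8.0, 6.0, 1.5), (20.0, 3.0, 1.0)]
zone = (0.0, 0.0, 10.0, 10.0)
triggered, winners = check_award_trigger(symbols, zone)
print(triggered, winners)   # True, the first two symbols qualify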

US Pat. No. 10,891,825

EXAMPLE VIRTUAL WALLET FOR FUND MANAGEMENT OF ACCOUNT BASED WAGERING ACCOUNTS

CFPH, LLC, New York, NY ...

20. A method comprising:controlling, by at least one processor of a server:
determining, from a first location determination signal over a communication network from a first mobile computing device of a player indicating a current location of the player, a first location of the player;
presenting, over the communication network on a first graphical user interface of a display of a given network computing device, information identifying a balance in a first account for the player, in which the first account is accessible by the player to place wagers associated with a first gaming operator that operates at the first location but not to place wagers associated with a second gaming operator;
receiving, over the communication network from the given network computing device, an indication based on an operation to the first graphical user interface that at least a portion of the balance is available for transfer to a second account for the player, in which the second account is accessible by the player to place wagers associated with the second gaming operator at a second location but not to place wagers associated with the first gaming operator, in which wagering is allowed at both the first location and the second location,
in which the first account and the second account are both linked to a linking account, to which money from each of the first and second accounts is transferrable and from which money is transferable to each of the first and second accounts;
in response to the indication that the at least the portion is available for transfer, storing information identifying that the at least the portion has been pre-permissioned for transfer;
presenting, over the communication network, indicia on a graphical user interface of a display of the first mobile computing device through which the player may request a transfer of a given amount of money from one of the linked first and second accounts that causes the given amount of money to be transferred from the linking account to the other of the first and second accounts;
receiving, over the communication network from the first mobile computing device, a request to transfer a first amount of money from the first account to the second account, in which the request to transfer is received automatically in response to an operation to the graphical interface that places a wager from the second account;
determining whether the first amount of money is less than or equal to the at least the portion in response to receiving the request to transfer;
determining whether a current location of the player is the second location from a second location determination signal over the communication network from the first mobile computing device indicating a current location of the player at a time the wager is placed from the second account; and
automatically in response to determining that the first amount is less than or equal to the at least the portion and while the current location of the player continues to be determined as the second location from the second location determination signal,
transferring, the first amount of money from the first account to the linking account;
transferring the first amount of money from the linking account to the second account; and
causing, over the communication network, the first mobile computing device to operate in an operating state in which the first mobile computing device is operable by the player to place through the first mobile computing device wagers associated with the second gaming operator using money in the second account and not operable by the player to place through the first mobile computing device wagers using money in the first account.
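
The transfer portion of the claim above moves a pre-permissioned amount from the first account to the second account via the linking account, gated by the pre-permissioned limit and by the player's current location. The following sketch illustrates that two-hop flow under assumed data structures; the dictionary layout, amounts and location strings are hypothetical.

# Sketch of the claimed transfer flow: a pre-permissioned amount moves from the
# first account through the linking account into the second account, but only if
# the requested amount does not exceed the pre-permissioned portion and the player
# is still determined to be at the second location.

def transfer_via_linking_account(accounts, pre_permissioned, amount, current_location,
                                 second_location="location_2"):
    if amount > pre_permissioned:
        return False, "amount exceeds pre-permissioned portion"
    if current_location != second_location:
        return False, "player not at the second location"
    # two hops: first account -> linking account -> second account
    accounts["first"] -= amount
    accounts["linking"] += amount
    accounts["linking"] -= amount
    accounts["second"] += amount
    return True, accounts

accounts = {"first": 200.0, "linking": 0.0, "second": 50.0}
print(transfer_via_linking_account(accounts, pre_permissioned=100.0, amount=75.0,
                                   current_location="location_2"))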

US Pat. No. 10,891,824

SIDE RECORDING SYSTEM FOR GAMING DEVICE

8. A side recording system for a gaming device, provided for side-recording a gaming device which is a gaming machine platform with a game display unit, characterized by comprising:an image acquisition device, said image acquisition device being connected to said gaming device so as to acquire images of said game display unit and then generate a continuously side-recorded video;
a game state monitoring device, said game state monitoring device being connected to said gaming device and monitoring it to determine if said gaming machine platform executes an event which is a member system operation, wherein said game state monitoring device generates a beginning time point and an ending time point at the beginning and the ending of said event, respectively, if said event occurs;
an association recording unit, said association recording unit being connected to said game state monitoring device and said image acquisition device, said association recording unit receiving said beginning and ending time points of said event and said continuously side-recorded video, wherein said association recording unit correlates said beginning and ending time points with a timeline of said continuously side-recorded video to generate an association recording table; and
a data storage unit, said data storage unit being connected to said association recording unit and said image acquisition device, as well as being allowed to receive and store said continuously side-recorded videos and said association recording table.
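
The association recording table in the claim above ties event beginning and ending time points to the timeline of the continuously side-recorded video. One simple way to express that correlation, assuming absolute timestamps in seconds (the field names are illustrative):

# Sketch: correlate event begin/end times with the timeline of a continuously
# side-recorded video, producing an association recording table of offsets.

def build_association_table(video_start_time, events):
    """events: list of dicts with 'name', 'begin' and 'end' absolute timestamps (seconds)."""
    table = []
    for ev in events:
        table.append({
            "event": ev["name"],
            "video_offset_begin": ev["begin"] - video_start_time,
            "video_offset_end": ev["end"] - video_start_time,
        })
    return table

events = [{"name": "member login", "begin": 1609459230.0, "end": 1609459290.0}]
print(build_association_table(video_start_time=1609459200.0, events=events))
# [{'event': 'member login', 'video_offset_begin': 30.0, 'video_offset_end': 90.0}]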

US Pat. No. 10,891,823

NETWORK ARCHITECTURE FOR GAMING INDUSTRY ACCOUNTING

JCM American Corporation,...

1. A peripheral device for an electronic gaming machine in a casino environment, the peripheral device comprising:a mobile interface device configured to establish a communication channel with a personal electronic device of a player; and
a dedicated processing unit configured to:
receive an identifier from the mobile interface device; and
send an instruction to a peripheral controller associated with the electronic gaming machine to authorize transactions between an account associated with the identifier and the electronic gaming machine, the peripheral controller operable to control multiple peripherals associated with the electronic gaming machine based on communications with an external server without interacting with a game machine processing unit of the electronic gaming machine, the dedicated processing unit not configured to control the multiple peripherals directly.

US Pat. No. 10,891,822

GAMING MACHINES USING HOLOGRAPHIC IMAGING

IGT, Las Vegas, NV (US)

1. A gaming device comprising:a gesture detection device that includes a gesture detector that is operable to detect a gesture of a player and a gesture interpreter that is configured to generate gesture data corresponding to the gesture;
a housing that includes an interior and a front surface that includes a portion that is a semi-transparent window that is configured to reflect, to the player, a reflected image of at least a portion of the player that is an object distance away from the semi-transparent window;
an object illumination source that is configured to provide illumination to the portion of the player; and
a display device that is arranged within the housing and that includes a display surface that faces the semi-transparent window, that is spaced apart from the semi-transparent window by about the object distance and that is configured to display a supplemental image that is visible to the player through the semi-transparent window.

US Pat. No. 10,891,821

CONFIGURABLE AND MODULAR TRANSACTION TERMINAL

NCR Corporation, Atlanta...

1. A terminal, comprising:a horizontal operating surface comprising a touchscreen display;
a first vertical support comprising first integrated devices; and
a second vertical support comprising second integrated devices;
wherein a bottom portion of the horizontal operating surface rests on tops of the first vertical support and the second vertical support;
wherein the terminal is configured to dynamically switch between different modes of operation based on touch interaction with the touchscreen display, and each mode of operation configured to:
activate and deactivate selective first integrated devices and selective second integrated devices; and
orient displayed information in the touchscreen display to one side or both sides of the horizontal operating surface.

US Pat. No. 10,891,820

COUNTERFEIT NOTE TRACKING

NCR Corporation, Atlanta...

6. A method for tracking a counterfeit note, the method comprising:receiving, by a self-service terminal comprising a processor, a counterfeit note;
collecting, by the self-service terminal, data from the counterfeit note;
receiving a request for the data collected from the counterfeit note;
obtaining properties associated with known counterfeit notes that are assigned to a counterfeit template; and
modifying the counterfeit template by expanding a range of values for at least one of the properties included in the counterfeit template based on the data collected or contracting a range of values for at least one of the properties included in the counterfeit template based on the data collected.
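
The final step of the claim above adjusts a counterfeit template by expanding or contracting a property's range of values in light of newly collected note data. A minimal sketch of such an adjustment, with the range representation and the contraction rule chosen for illustration only:

# Sketch: expand or contract a property's value range in a counterfeit template
# based on data collected from a newly received counterfeit note.

def modify_template(template, collected, expand=True):
    """template: dict property -> (low, high); collected: dict property -> observed value."""
    updated = dict(template)
    for prop, value in collected.items():
        low, high = updated.get(prop, (value, value))
        if expand:
            # widen the range so it covers the newly observed value
            updated[prop] = (min(low, value), max(high, value))
        else:
            # narrow the range by pulling the nearer bound toward the observed value
            if abs(value - low) <= abs(high - value):
                updated[prop] = (value, high)
            else:
                updated[prop] = (low, value)
    return updated

template = {"length_mm": (155.0, 156.0), "magnetic_response": (0.2, 0.6)}
collected = {"length_mm": 154.8, "magnetic_response": 0.4}
print(modify_template(template, collected, expand=True))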

US Pat. No. 10,891,819

BEZEL ASSEMBLY FOR USE WITH AN AUTOMATED TRANSACTION DEVICE

JCM American Corporation,...

1. A bezel assembly for use in a transactional device having a bill validator with a document acceptance slot and a validator window, said bezel assembly comprising:a bezel housing comprising, in combination:
a front portion including a casing having an upper portion and a lower portion, wherein the casing is configured with an insertion/dispensing slot through which notes are received and pass through into the bill validator, the lower portion including a protrusion projecting distally from the casing proximate to said insertion/dispensing slot, wherein a top surface of said protrusion and an interior surface of said casing form a continuous runway; and
a back plate attached to said front portion and configured to couple to said transactional device or the bill validator;
a short-range wireless communication module in the bezel housing and positioned to connectively communicate with a mobile device wherein communication is enabled along an area aligned with the protrusion and the insertion/dispensing slot; and
a processor integral to the bezel assembly that controls operation of the bezel assembly including interfacing with the bill validator wherein transactional information received by the processor from the mobile device is presented to the bill validator and passed to the transactional device to provide credit on the transactional device.

US Pat. No. 10,891,818

IDENTIFICATION DEVICES, IDENTIFICATION METHODS, IDENTIFICATION PROGRAMS AND COMPUTER READABLE MEDIA INCLUDING IDENTIFICATION PROGRAMS

TOPPAN PRINTING CO., LTD....

1. A computer readable medium including an identification program for causing a computer to execute an identification process for performing authenticity determination of an article provided with an anti-counterfeiting medium whose observed light pattern changes depending on an observation angle, the medium comprising instructions to cause the computer to perform the identification method comprising:calculating a degree of similarity between captured image data of the anti-counterfeiting medium and reference image data;
performing authenticity determination as to whether the anti-counterfeiting medium is genuine or not on the basis of whether the degree of similarity exceeds a predetermined threshold or not; and
notifying a match-percentage indicative of a degree of match between an imaging viewpoint for imaging the anti-counterfeiting medium and a reference imaging viewpoint which is predefined as the imaging viewpoint for the captured image data used for authenticity determination, wherein the notifying comprises calculating the match-percentage on the basis of an imaging angle difference and a coordinate value difference, which are differences between an imaging angle and an imaging device coordinate value of the imaging viewpoint and a reference imaging angle and a reference imaging device coordinate value of the reference imaging viewpoint, respectively, in a three-dimensional coordinate system.
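
The notification step of the claim above derives a match-percentage from an imaging angle difference and an imaging device coordinate difference relative to the reference viewpoint. The claim does not specify the formula; the sketch below assumes a simple tolerance-normalized linear score as one possibility, with the tolerances and equal weighting being illustrative.

# Sketch: compute a match-percentage between the current imaging viewpoint and a
# reference viewpoint from an angle difference and a coordinate difference.
import math

def match_percentage(angle_deg, coord_xyz, ref_angle_deg, ref_coord_xyz,
                     angle_tol_deg=15.0, coord_tol=0.05):
    angle_diff = abs(angle_deg - ref_angle_deg)
    coord_diff = math.dist(coord_xyz, ref_coord_xyz)
    angle_score = max(0.0, 1.0 - angle_diff / angle_tol_deg)
    coord_score = max(0.0, 1.0 - coord_diff / coord_tol)
    return 100.0 * 0.5 * (angle_score + coord_score)

print(round(match_percentage(42.0, (0.10, 0.02, 0.30),
                             45.0, (0.11, 0.02, 0.31)), 1))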

US Pat. No. 10,891,817

COIN HANDLING APPARATUS AND COIN HANDLING METHOD

GLORY LTD., Hyogo (JP)

1. A coin handling apparatus that handles a coin, the coin handling apparatus comprising:an inlet section that is provided on a customer side of the coin handling apparatus and from which the coin is put in;
a recognition unit that recognizes the coin put in from the inlet section; and
a first ejection section that is provided on an attendant side of the coin handling apparatus and to which the coin recognized as a rejected coin by the recognition unit is ejected,
wherein the first ejection section is provided at a position inaccessible from the customer side.

US Pat. No. 10,891,816

SPATIO-TEMPORAL TOPOLOGY LEARNING FOR DETECTION OF SUSPICIOUS ACCESS BEHAVIOR

CARRIER CORPORATION, Pal...

1. A spatio-temporal topology learning system for detection of suspicious access control behavior in a physical access control system (PACS), the spatio-temporal topology learning system comprising:an access pathways learning module configured to determine a set of spatio-temporal properties associated with a resource in the PACS;
an inconsistency detection module in operable communication with the access pathways learning module, the inconsistency detection module configured to
analyze a plurality of historical access control events and identify an inconsistency with regard to the set of spatio-temporal properties; and
if an inconsistency is detected, at least one of the events is flagged as potentially suspicious access control behavior;
wherein the spatio-temporal properties include a reachability graph;
wherein the spatio-temporal topology learning system refines the reachability graph based on an initial estimate of the notional distance between readers determined as the minimum difference between access event time stamps at two connected readers;
the inconsistency detection module detecting the inconsistency in response to the refined reachability graph.
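
The refinement described above estimates the notional distance between two connected readers as the minimum difference between access event time stamps observed at them, and then flags events inconsistent with that estimate. A compact sketch under assumed event tuples and a hypothetical slack factor:

# Sketch: estimate the notional distance (minimum travel time) between connected
# readers from historical access events, then flag event pairs for one credential
# that arrive faster than that learned minimum.
from collections import defaultdict

def estimate_min_travel_times(events):
    """events: list of (credential, reader, timestamp), timestamps in seconds."""
    by_credential = defaultdict(list)
    for cred, reader, ts in events:
        by_credential[cred].append((ts, reader))
    min_travel = {}
    for cred, seq in by_credential.items():
        seq.sort()
        for (t1, r1), (t2, r2) in zip(seq, seq[1:]):
            if r1 != r2:
                key = frozenset((r1, r2))
                min_travel[key] = min(min_travel.get(key, float("inf")), t2 - t1)
    return min_travel

def flag_inconsistencies(events, min_travel, slack=0.5):
    """Flag consecutive events for a credential that beat the learned minimum time."""
    flagged = []
    by_credential = defaultdict(list)
    for cred, reader, ts in events:
        by_credential[cred].append((ts, reader))
    for cred, seq in by_credential.items():
        seq.sort()
        for (t1, r1), (t2, r2) in zip(seq, seq[1:]):
            expected = min_travel.get(frozenset((r1, r2)))
            if expected is not None and (t2 - t1) < slack * expected:
                flagged.append((cred, r1, r2, t2 - t1))
    return flagged

history = [("alice", "door_A", 0), ("alice", "door_B", 60),
           ("bob", "door_A", 10), ("bob", "door_B", 75)]
suspect = history + [("carol", "door_A", 100), ("carol", "door_B", 110)]
mt = estimate_min_travel_times(history)
print(flag_inconsistencies(suspect, mt))   # carol's 10-second hop is flagged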

US Pat. No. 10,891,815

KEY INFORMATION MANAGEMENT DEVICE, MANAGEMENT METHOD OF KEY INFORMATION, COMPUTER-READABLE NON-TRANSITORY STORAGE MEDIUM STORING KEY INFORMATION MANAGEMENT PROGRAM

TOYOTA JIDOSHA KABUSHIKI ...

1. A key information management device, comprising:a central processing unit (CPU); and
a memory coupled to the CPU and configured to store key information,
wherein the CPU is configured to:
distribute the key information to a company that provides a delivery service that allows an inside of a vehicle, a building, or a facility used by a user to be designated as a delivery destination of a package, the key information being used by a delivery person of the company to unlock a specified entrance of the vehicle, the building, or the facility;
determine use schedule information of the vehicle, the building, or the facility;
determine whether or not the delivery person who delivers the package has reached within a specified range of the vehicle, the building, or the facility; and
distribute the key information to the company based on the use schedule information and when it is determined that the delivery person has reached within the specified range of the vehicle, the building, or the facility,
wherein when a plurality of the packages is delivered to the inside of the vehicle, the building, or the facility at the same time period, the CPU is further configured to
distribute the key information for the plurality of the packages to the delivery company when the plurality of packages is to be delivered by the delivery company, and
distribute the key information to each of a plurality of the delivery companies when the plurality of packages is to be delivered by different delivery companies.

US Pat. No. 10,891,814

MOBILE CREDENTIAL MANAGEMENT SYSTEM FOR VEHICLE KEY BOX ACCESS CONTROL

CARRIER CORPORATION, Pal...

1. A mobile credential management system comprising:a processing system; and
a memory system comprising computer-executable instructions that, when executed by the processing system, cause the processing system to perform a plurality of operations, the operations comprising:
issuing a credential to a mobile device of a user, the mobile device comprising a computing device operable to trigger opening of a releasable latch of a vehicle key box responsive to a controller of the vehicle key box wirelessly receiving the credential from the mobile device within an assigned window of time;
monitoring for a releasable latch opened indicator from the vehicle key box indicative of the vehicle key box acknowledging validation of the credential from the mobile device within the assigned window of time and providing access to a vehicle key corresponding to a vehicle assigned to the vehicle key box; and
sending a notification to a user system responsive to receiving the releasable latch opened indicator, the notification identifying the vehicle and information about the user of the mobile device.

US Pat. No. 10,891,813

COGNITIVE VETTING SYSTEM FOR UNSOLICITED RESIDENTIAL VISITORS

International Business Ma...

1. A method for cognitive vetting, the method comprising:determining, by one or more processors, that an identified person is not expected; and
generating, by one or more processors, a recommendation as to whether the identified person should be permitted entry based on analyzing a plurality of historical information relevant to at least one past interaction with the identified person, wherein analyzing the plurality of historical information includes generating a numerical score.

US Pat. No. 10,891,812

UNIVERSAL BARRIER OPERATOR TRANSMITTER

GMI Holdings, Inc., Moun...

1. An apparatus, comprising:a portable controller for a door operator, comprising a processor, a first actuator operatively connected to the processor, and a transmitter operatively connected to the processor;
wherein the processor is configured to:
invert bit positions of a rolling code,
divide the inverted code by a predetermined number,
convert the divided inverted code to base 9 coefficients, and
substitute an output nibble for each base 9 coefficient in the divided converted code; and
wherein the transmitter is configured to transmit the substituted output nibble responsive to actuation of the first actuator.
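
The processor steps of the claim above form a small encoding pipeline: invert the bit positions of a rolling code, divide by a predetermined number, convert to base-9 coefficients, and substitute an output nibble for each coefficient. A sketch follows; note that "invert bit positions" is read here as reversing the bit order (it could equally mean complementing the bits), and the divisor and nibble table are illustrative assumptions.

# Sketch of the claimed transmitter encoding, with assumptions called out in the lead-in.

NIBBLE_TABLE = [0x1, 0x3, 0x5, 0x7, 0x9, 0xB, 0xD, 0xE, 0x6]  # one nibble per base-9 digit 0..8

def encode_rolling_code(code, width=32, divisor=3):
    # invert (reverse) the bit positions of the rolling code
    reversed_code = int(format(code, f"0{width}b")[::-1], 2)
    # divide the inverted code by a predetermined number (integer division assumed)
    divided = reversed_code // divisor
    # convert the divided inverted code to base-9 coefficients
    digits = []
    while divided:
        divided, d = divmod(divided, 9)
        digits.append(d)
    digits = digits[::-1] or [0]
    # substitute an output nibble for each base-9 coefficient
    return [NIBBLE_TABLE[d] for d in digits]

print([hex(n) for n in encode_rolling_code(0x12345678)])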

US Pat. No. 10,891,811

AUTHENTICATION INFORMATION ISSUING APPARATUS, DELIVERY SYSTEM AND AUTHENTICATION INFORMATION ISSUING METHOD

TOYOTA JIDOSHA KABUSHIKI ...

1. An authentication information issuing apparatus configured to issue authentication information with which a vehicle is unlocked, to a portable terminal, the vehicle being capable of containing a parcel and being locked and unlocked based on the authentication information acquired from the portable terminal, the authentication information issuing apparatus comprising:a storage device configured to store, in association with each other, (i) a vehicle ID as position information with which the vehicle is identified, and (ii) a parcel receipt ID as information with which the vehicle ID is specified instead of an address or a name;
an acquiring device configured to acquire the parcel receipt ID associated with a target parcel; and
an authentication information sending device configured to:
specify the vehicle ID stored in the storage device based on the acquired parcel receipt ID,
acquire the authentication information with which a vehicle as a containment destination of the parcel is unlocked based on the vehicle ID, and
send the authentication information to the portable terminal.

US Pat. No. 10,891,810

ENABLING DISTRIBUTION OF A MOBILE KEY

ASSA ABLOY AB

1. A method for enabling distribution of a mobile key for obtaining access to a physical space, the method being performed in a mobile key agent and comprising:receiving a booking signal from a booking agent, the booking signal being associated with a booking of physical space, wherein the booking signal comprises a property identifier, an allocation time, and a name of a user;
obtaining a system booking reference associated with the booking of physical space;
providing the system booking reference to the booking agent;
establishing contact with a mobile key repository, yielding an identifier of the mobile key repository being an end-point identifier to a key holding application on a smartphone;
receiving a space allocation message from a different entity than the booking agent, the space allocation message comprising an identifier of the physical space and the system booking reference; and
transmitting a key allocation command to an electronic access control system associated with the physical space, the key allocation command comprising the identifier of the mobile key repository, the identifier of the physical space and the allocation time, wherein the key allocation command enables the electronic access control system to determine access rights and generate an electronic key containing the access rights with a validity time corresponding to the allocation time.

US Pat. No. 10,891,809

INTELLIGENT DIAGNOSIS ASSISTANCE METHOD, DEVICE AND EQUIPMENT

SHENZHEN LAUNCH SOFTWARE ...

1. An intelligent diagnosis assistance method, comprising:receiving a voice command inputted by a user;
obtaining diagnosis assistance information of a vehicle from a diagnosis assistance information database of a server according to the voice command; and
displaying the diagnosis assistance information to the user;
wherein said obtaining the diagnosis assistance information of the vehicle according to the voice command particularly comprises:
parsing the voice command;
determining whether the voice command includes a first vehicle identification; and
obtaining the diagnosis assistance information from the diagnosis assistance information database of the server according to the first vehicle identification, if the voice command includes the first vehicle identification; or
controlling a vehicular device to obtain a second vehicle identification and obtaining the diagnosis assistance information of the vehicle from the diagnosis assistance information database of the server according to the second vehicle identification, if the voice command doesn't include the first vehicle identification.

US Pat. No. 10,891,808

CROWD-SOURCED DRIVER GRADING

Allstate Insurance Compan...

1. A system comprising:a video capture device associated with a first vehicle and capable of capturing a video of one or more other vehicles;
memory communicatively coupled to the video capture device, wherein the memory stores a video captured by the video capture device;
a processor executing instructions that cause the processor to:
process, using a video analysis algorithm, the captured video stored in the memory to determine a driving event performed by a second vehicle in the captured video;
determine, by an event analyzer, a first event rating indicating a level of danger of the driving event, wherein determining the first event rating includes:
receiving other event ratings of the same driving event from a plurality of sources; and
combining at least a portion of the received other ratings to determine the first event rating;
receive a second event rating of the driving event from a driver of the second vehicle;
determine an accuracy of the second event rating relative to the first event rating; and
determine an incentive to provide the driver of the second vehicle based on the determined accuracy of the second event rating.
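
The rating logic of the claim above combines event ratings from multiple sources into a first rating, measures how close the driver's own rating comes to it, and sizes an incentive accordingly. The averaging rule, the accuracy metric and the reward scale in the sketch below are assumptions, not the claimed computation.

# Sketch: combine third-party ratings of a driving event into a first rating,
# compare the driver's own rating to it, and size an incentive by accuracy.

def first_event_rating(other_ratings):
    """Combine ratings from a plurality of sources (simple mean here)."""
    return sum(other_ratings) / len(other_ratings)

def accuracy(second_rating, first_rating, scale=10.0):
    """1.0 when the driver's rating matches the combined rating, 0.0 when it is
    off by the full rating scale."""
    return max(0.0, 1.0 - abs(second_rating - first_rating) / scale)

def incentive(second_rating, other_ratings, max_reward=5.00):
    first = first_event_rating(other_ratings)
    return round(max_reward * accuracy(second_rating, first), 2)

print(incentive(second_rating=7, other_ratings=[8, 6, 7, 9]))  # close match -> larger reward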

US Pat. No. 10,891,807

SYSTEMS AND METHODS UTILIZING GRAVITY FEED FOR POSTAGE METERING

Stamps.com Inc., El Segu...

1. A system comprising:a postage metering apparatus operable to provide postage metering operation with respect to mail items to result in said mail items having valid postage indicia thereon;
a mail item storage tray holding said mail items prior to postage metering processing by said system, wherein at least a portion of said postage metering apparatus is disposed to interact with a mail item of said mail items as said mail item is held in said mail item storage tray;
a gravity drop feed chute configured to accommodate said mail items being fed therethrough for interaction with said postage metering apparatus during at least a portion of said postage metering operation, wherein at least a portion of said postage metering operation is provided with respect to a mail item of said mail items when said mail item is passing through said gravity drop feed chute; and
a mail item gravity drop controller configured to provide control of the passage of said mail item through said gravity drop feed chute during said at least a portion of said postage metering operation, wherein said control includes moving said mail item from a first position to a second position while said mail item is engaged with said mail item gravity drop controller, wherein said at least a portion of said postage metering operation comprises printing postage indicia on said mail item, and wherein at least the portion of the postage metering operation occurs during the transition between the first position and the second position while said mail item is engaged with said mail item gravity drop controller.

US Pat. No. 10,891,806

MOBILE PHONE AND CLOUD BASED VIRTUALIZED PARKING METER SYSTEM

1. A parking meter system comprising one or more computers configured to perform operations comprising:receiving a query and a location identifier identifying a geographic location from a first device communicatively coupled with the one or more computers;
generating a virtual map comprising conventional map data of a geographic area in proximity to the geographic location and parking space data, wherein the parking space data includes at least one or more parking space status for one or more parking spaces in the geographic area and at least one or more parking space identifiers assigned to one or more of the at least one or more parking spaces in the virtual map;
sending the virtual map to the first device in response to the query;
receiving from the first device a selected parking space information, wherein the selected parking space information is identified by a first parking space identifier associated with the selected parking space; and
allocating the selected parking space responsive to receiving the selected parking space information.

US Pat. No. 10,891,805

3D MODEL ESTABLISHING DEVICE AND CALIBRATION METHOD APPLYING TO THE SAME

INDUSTRIAL TECHNOLOGY RES...

1. A 3D model construction device, comprising:a camera configured to obtain a plurality of first frames, a second frame and a depth information; and
a wearable display coupled to the camera, wherein the wearable display comprises:
a display unit;
a processing unit;
a storage unit coupled to the processing unit and configured to store a first module and a second module, wherein when the first module is performed by the processing unit, the processing unit calculates a first pose of the wearable display; when the second module is performed by the processing unit, the processing unit calculates a 3D model according to the first frames, the depth information, the first pose and a plurality of calibration parameters, and updates the 3D model according to the second frame; and
a projection unit coupled to the processing unit and configured to project the 3D model and the second frame onto the display unit according to the first pose for being displayed with a real image on the display unit,
wherein the calibration parameters are a basis for transforming between a coordinate system of the camera and a coordinate system of the wearable display,
wherein the calibration parameters comprise a rotation parameter generated according to the following steps:
enabling the 3D model construction device to move along a plurality of directional straight lines and respectively obtain a plurality of first moving velocity vectors of the wearable display with respect to each direction and a plurality of second moving velocity vectors of the camera with respect to each direction; and
calculating the rotation parameter according to the first moving velocity vectors and the second moving velocity vectors.
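
The calibration steps above pair, for each direction of motion, a velocity vector measured in the wearable-display frame with one measured in the camera frame, and derive a rotation parameter from the pairs. Recovering a rotation that best maps one set of vectors onto the other is a standard least-squares problem; the sketch below uses the Kabsch/SVD solution, which is one common approach and not necessarily the patented procedure.

# Sketch: recover the rotation between the camera frame and the wearable-display
# frame from paired per-direction velocity vectors, using the Kabsch/SVD method.
import numpy as np

def rotation_from_velocity_pairs(display_velocities, camera_velocities):
    """Each argument: (N, 3) array, row i being the velocity measured along direction i."""
    A = np.asarray(camera_velocities, dtype=float)
    B = np.asarray(display_velocities, dtype=float)
    H = A.T @ B                      # 3x3 cross-covariance of the paired vectors
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])       # guard against reflections
    return Vt.T @ D @ U.T            # rotation mapping camera vectors to display vectors

# Synthetic check: rotate known camera velocities by 90 degrees about z.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
cam = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0], [1.0, 1.0, 0]])
disp = cam @ R_true.T
print(np.allclose(rotation_from_velocity_pairs(disp, cam), R_true))  # True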

US Pat. No. 10,891,804

IMAGE COMPENSATION FOR AN OCCLUDING DIRECT-VIEW AUGMENTED REALITY SYSTEM

Adobe Inc., San Jose, CA...

1. A system operative in a digital medium environment to present a synthetic graphical image in conjunction with a direct-view of a real-world scene, the system comprising:a camera configured to detect light representative of the real-world scene;
an emissive display layer configured to present an emissive graphic;
an attenuation display layer configured to present an attenuation graphic;
a dark region compensation module configured to:
determine an unintended dark region in which the attenuation graphic blocks light of the real-world scene from an eye of a user; and
generate a replica graphic for the unintended dark region, the replica graphic replicating an appearance of the real-world scene based on the light representative of the real-world scene detected by the camera; and
a display module configured to cause the emissive display layer to present the emissive graphic along with the replica graphic as part of the synthetic graphical image viewable in conjunction with the direct-view of the real-world scene, the replica graphic replicating the appearance of the real-world scene in the determined unintended dark region as reproducing at least a portion of the light representative of the real-world as detected by the camera.

US Pat. No. 10,891,803

USER INTERFACE AND FUNCTIONS FOR VIRTUAL REALITY AND AUGMENTED REALITY

Comcast Cable Communicati...

1. A method comprising:receiving at least a first digital asset and a second digital asset, each of the at least first digital asset and second digital asset being associated with a content item;
determining at least a first virtual plane associated with the first digital asset and a second virtual plane associated with the second digital asset, wherein the first virtual plane is associated with a first depth and a first template, and wherein the second virtual plane is associated with a second depth and a second template;
determining a first modified size of the first digital asset to fit within the first template and a second modified size of the second digital asset to fit within the second template, wherein the first modified size is different from the second modified size; and
outputting a user interface comprising the first digital asset output in the first modified size in the first virtual plane and the second digital asset output in the second modified size in the second virtual plane.

US Pat. No. 10,891,802

INFORMATION PROCESSING DEVICE, IMAGE FORMING APPARATUS, METHOD FOR MANUFACTURING OUTPUT OBJECT, AND COMPUTER-READABLE RECORDING MEDIUM

RICOH COMPANY, LTD., Tok...

1. An information processing device comprising:a memory storing computer readable instructions; and
one or more processors configured to execute the computer readable instructions such that the one or more processors are configured to,
when a three-dimensional image is to be built by depositing a build material based on height information indicating heights of the three-dimensional image on a per-pixel basis and based on color information indicating colors of the three-dimensional image on a per-pixel basis, correct the height information so that the surface of the three-dimensional image is covered with the colors indicated by the color information; and
generate layer information indicating, on a per-layer basis, pixel layouts for building the three-dimensional image whose shape is corrected so that the surface of the three-dimensional image is covered with the colors indicated by the color information, based on the corrected height information and the color information.

US Pat. No. 10,891,801

METHOD AND SYSTEM FOR GENERATING A USER-CUSTOMIZED COMPUTER-GENERATED ANIMATION

DreamWorks Animation L.L....

1. A method for generating a user-customized computer-generated animation, the method comprising:receiving digital content including a rendered video of a computer-generated animation;
determining a modifiable portion of the digital content, wherein the digital content includes texture and shading data of the modifiable portion;
receiving a design template, wherein the design template includes a representation of the modifiable portion of the digital content;
generating template image data by performing image analysis on the representation of the modifiable portion of the digital content, wherein generating the template image data includes determining a color profile of the representation of the modifiable portion of the digital content;
generating a revised portion of the digital content, wherein the revised portion is an altered portion of the modifiable portion of the digital content, and wherein the revised portion of the digital content includes a set of points, wherein each point in the set of points is generated based on the texture and shading data of the modifiable portion from the received digital content and the color profile of the representation of the modifiable portion of the digital content determined from the image analysis performed on the representation of the modifiable portion of the digital content from the design template;
generating an updated version of the video of the computer-generated animation, wherein the updated video comprises a version of the computer-generated animation including the revised portion of the digital content; and
causing a display of the updated video.

US Pat. No. 10,891,800

PROVIDING FEATURES OF AN ELECTRONIC PRODUCT IN AN AUGMENTED REALITY ENVIRONMENT

Apple Inc., Cupertino, C...

1. A device for providing a software feature of an electronic product in an augmented reality (AR) environment, comprising:one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
obtaining images using one or more image sensors;
determining whether the obtained images include printed media depicting the electronic product;
in accordance with a first set of one or more conditions being satisfied, the first set of one or more conditions including a first condition that is satisfied when the obtained images include the printed media depicting a screen of the electronic product:
displaying a virtual object corresponding to the electronic product in the AR environment, wherein the virtual object includes a virtual screen overlaying the screen of the electronic product depicted in the printed media; and
in accordance with a second set of one or more conditions being satisfied, the second set of one or more conditions including a second condition that is satisfied when the virtual object corresponding to the electronic product is displayed:
displaying, with the virtual screen of the virtual object, the software feature of the electronic product.

US Pat. No. 10,891,799

AUGMENTED REALITY PROCESSING METHOD, OBJECT RECOGNITION METHOD, AND RELATED DEVICE

TENCENT TECHNOLOGY (SHENZ...

1. An augmented reality processing method for a terminal, comprising:obtaining a plurality of frames of images, the plurality of frames of images comprising a first image and a second image, the second image being a frame of an image immediately following the first image;
obtaining a key point set of a first object in the first image;
obtaining, through a neural network model, first pose key point sets respectively corresponding to a plurality of objects in the second image, the neural network model being configured to obtain a key point set of an object in an image, and the first pose key point set comprising at least one first pose key point;
determining a second pose key point set of the first object in the second image according to the key point set and a motion trend of the first object, the second pose key point set comprising at least one second pose key point;
determining, for any target first pose key point set in the plurality of first pose key point sets, a target distance between the target first pose key point set and the second pose key point set according to at least one first pose key point in the target first pose key point set and the at least one second pose key point;
using the target first pose key point set as a key point set of the first object in the second image when the target distance is less than a preset threshold; and
generating an augmented information image according to the key point set of the first object in the second image.

US Pat. No. 10,891,798

SYSTEM AND METHOD FOR DISPLAYING AN ASSET OF AN INTERACTIVE ELECTRONIC TECHNICAL PUBLICATION SYNCHRONOUSLY IN A PLURALITY OF EXTENDED REALITY DISPLAY DEVICES

2689090 Canada Inc., Mir...

1. A system for displaying an asset of an interactive electronic technical publication synchronously in a plurality of extended reality display devices, the system comprising:a computer having a display displaying the interactive electronic technical publication with a reference to the asset;
a content-management system storing the asset referenced in the interactive electronic technical publication;
a multi-channel messaging subsystem comprising a key-value server and a subscription manager together defining a message broker and using a reactive protocol component allowing clients to subscribe to a channel associated to a conversation, the reactive protocol component used by the message broker allowing sending and receiving messages containing at least one of commands, events and property value changes relative to the asset referenced in the interactive electronic technical publication, with the messages being communication vessels using key-value pairs where the key of each one of the key-value pairs identifies the channel, the asset and the at least one of the commands, the events and the property thereof, and the value of each one of the key-value pairs identifies the value to which the key is to be changed;
an extended reality application module including a cross-platform game engine generating a virtualized 3D environment displayable on the plurality of extended reality display devices for rendering the asset referenced in the interactive electronic technical publication and loaded into the cross-platform game engine, the extended reality application module being configured to subscribe to the channel associated to the conversation, to communicate messages corresponding to that channel with the message broker of the multi-channel messaging subsystem;
wherein the plurality of extended reality display devices are connected to the extended reality application module and display the virtualized 3D environment generated by the cross-platform game engine thereof, the plurality of extended reality display devices subscribing to the channel associated to the conversation.

US Pat. No. 10,891,797

MODIFYING MULTIPLE OBJECTS WITHIN A VIDEO STREAM

Snap Inc., Santa Monica,...

1. A method comprising:receiving, by one or more processors, a set of images within a video stream depicting a face;
applying a graphical representation of glasses to the face depicted in the video stream;
determining a pixel depth of one or more pixels representing the face;
determining that the pixel depth exceeds a depth value; and
removing a portion of the graphical representation of the glasses in response to determining that the pixel depth exceeds the depth value.
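
The claim above removes part of the glasses overlay wherever the measured pixel depth of the face exceeds a depth value. A minimal per-pixel masking sketch, with the alpha-mask representation and array shapes assumed for illustration:

# Sketch: remove the part of a rendered glasses overlay wherever the face pixels
# behind it are farther than a depth threshold.
import numpy as np

def mask_glasses_by_depth(glasses_alpha, face_depth, depth_threshold):
    """glasses_alpha: (H, W) overlay alpha in [0, 1]; face_depth: (H, W) per-pixel depth."""
    masked = glasses_alpha.copy()
    masked[face_depth > depth_threshold] = 0.0   # drop the overlay where depth exceeds the threshold
    return masked

alpha = np.ones((4, 4))
depth = np.full((4, 4), 1.0)
depth[:, 2:] = 3.0                               # right half of the face is farther away
print(mask_glasses_by_depth(alpha, depth, depth_threshold=2.0))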

US Pat. No. 10,891,796

SYSTEMS AND METHODS FOR AUGMENTED REALITY APPLICATIONS

1. A method of rendering a virtual shadow onto a real-world surface in an image using an augmented reality application running on a computing device having a camera, the method comprising:capturing an image of a scene using the camera;
detecting the real-world surface in the image;
obtaining the geometry of the real-world surface;
rendering a transparent occluding virtual plane onto the real-world surface by using the obtained geometry;
creating a virtual directional light for the image, the virtual directional light radially extending from a point in space in the scene;
using the created virtual directional light source to write a texture associated with a virtual object into a shadow buffer; and
projecting the texture written to the shadow buffer onto the transparent occluding virtual plane in the image.

US Pat. No. 10,891,795

LOCALIZATION METHOD AND APPARATUS BASED ON 3D COLOR MAP

Samsung Electronics Co., ...

1. A localization method comprising:generating a synthetic image corresponding to a current time period based on a three-dimensional (3D) color map corresponding to an image captured at a time period prior to the current time period, and position information at the current time period by projecting the 3D color map onto a 3D semantic map; and
determining final position information at the current time period based on the synthetic image corresponding to the current time period and an image captured at the current time period.

US Pat. No. 10,891,794

METHOD AND SYSTEM FOR GENERATING AUGMENTED REALITY CONTENT ON THE FLY ON A USER DEVICE

ARGO, Montpellier (FR)

1. A method for generating on the fly augmented reality content on a mobile user device, said method comprising the following steps:during a phase, called preliminary phase, carried out for at least one image, called real image:
configuring a scenario comprising at least one video and at least one display parameter, and
associating said scenario with said real image;
during a phase, called execution phase:
reading of a real image by said user device,
identifying a scenario associated beforehand with said real image,
producing augmented reality content on said user device, by executing said scenario on said real image, and
reading of said content by said mobile user device;
the production step comprising deleting, in the entire video, the pixels of a predetermined colour provided as display parameter, in said scenario; and
the step of identifying the scenario associated with the image read by the user device comprises the following operations:
transmitting the image read by the user device to a first remote site;
identifying said read image at said first remote site by analyzing the content of said image;
transmitting an identifier of said read image from said first remote site to said user device;
transmitting said identifier by the user device to a second remote site; and
identifying the scenario associated with said read image at said second remote site.
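
The production step in the claim above deletes, across the entire video, pixels of a predetermined colour supplied as a display parameter, which is essentially a chroma-key removal. A per-frame sketch under assumed array shapes and an assumed matching tolerance:

# Sketch: delete (make transparent) every pixel of a predetermined colour across
# all frames of a video. A per-frame RGBA operation is assumed for illustration.
import numpy as np

def delete_colour(frames_rgb, colour, tol=10):
    """frames_rgb: (T, H, W, 3) uint8 video; returns (T, H, W, 4) with keyed pixels transparent."""
    frames = frames_rgb.astype(np.int16)
    diff = np.abs(frames - np.asarray(colour, dtype=np.int16))
    keyed = np.all(diff <= tol, axis=-1)                       # True where the pixel matches the colour
    alpha = np.where(keyed, 0, 255).astype(np.uint8)
    return np.concatenate([frames_rgb, alpha[..., None]], axis=-1)

video = np.zeros((2, 2, 2, 3), dtype=np.uint8)
video[..., 1] = 255                                            # a pure-green video
out = delete_colour(video, colour=(0, 255, 0))
print(out[..., 3])                                             # all zeros: every pixel keyed out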

US Pat. No. 10,891,793

REALITY TO VIRTUAL REALITY PORTAL FOR DUAL PRESENCE OF DEVICES

Microsoft Technology Lice...

1. On a virtual reality display device, a computer-implemented method for virtualizing a physical electronic device in a portal zone of the virtual reality display device, the method comprising:displaying on the virtual reality display device application content from an application running on the virtual reality display device;
receiving identifying information associated with the physical electronic device in the portal zone, the physical electronic device comprising a display;
rendering a virtual object representing the physical electronic device in the portal zone based on the identifying information received, wherein the virtual object comprises a virtual display that corresponds to the display of the physical electronic device in the portal zone;
receiving, from the physical electronic device in the portal zone, display data generated for display on the display of the physical electronic device in the portal zone; and
rendering, on the virtual display, the display data from the physical electronic device in the portal zone such that the virtual display is displayed over a portion of the application content of the application running on the virtual reality display device.

US Pat. No. 10,891,792

PRECISE PLANE DETECTION AND PLACEMENT OF VIRTUAL OBJECTS IN AN AUGMENTED REALITY ENVIRONMENT

SPLUNK INC., San Francis...

1. A computer-implemented method, comprising:detecting a first orientation of a client device;
projecting a line from a reference position on the client device to a physical object;
identifying a first location on the physical object that intersects with the line;
determining an x-coordinate and a y-coordinate of a portion of the physical object included in an image displayed on the client device based on the first location;
receiving a z-coordinate of the portion of the physical object; and
in response to receiving user input via the client device, anchoring an augmented reality object at a second location that corresponds to the x-coordinate, the y-coordinate, and the z-coordinate, wherein an orientation of the anchored augmented reality object is determined based on the first orientation of the client device without consideration of an orientation of the physical object.

US Pat. No. 10,891,791

DETECTION AND VISUALIZATION OF SYSTEM UNCERTAINTY IN THE REPRESENTATION OF AUGMENTED IMAGE CONTENT IN HEADS-UP DISPLAYS

VOLKSWAGEN AKTIENGESELLSC...

1. A device for the recognition and visual display of system uncertainty in the representation of augmented image contents in a heads-up display that displays augmented image contents as a contact-analog display, the device comprising:at least one camera for the capture of image data for an image of the real world;
a control device for determining augmented image contents for representation as contact-analog data along with image data depicting the real world; and
a heads-up display on which the determined augmented image contents are represented, wherein the augmented image contents appear integrated into an image displayed on the heads-up display based on the image data depicting the real world from the point of view of the observer,
wherein the control device determines a characteristic value that describes the system uncertainty of the determination of the augmented image contents in relation to the image of the real world, and
wherein the control device modifies the augmented image contents displayed on the heads-up display depending on the determined characteristic value by deemphasizing the augmented image contents to be less erroneous in order to reduce the observer's perception of inaccuracy of the representation of the augmented image contents that results from movement of the transportation vehicle relative to an environment.

US Pat. No. 10,891,790

METHODS AND SYSTEMS FOR ALIGNMENT OF A SUBJECT FOR MEDICAL IMAGING

Stryker European Operatio...

1. A method of aligning a camera, the method comprising:receiving a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue;
processing the reference image to generate an alignment reference image;
displaying the alignment reference image on a display concurrently with real-time video of the anatomical region acquired from the camera;
dynamically updating the displayed real-time video to reflect adjustments to a current alignment of the camera relative to the anatomical region of the subject; and
displaying the real-time video and the alignment reference image as overlaid with one another when the current alignment of the camera is aligned with a predefined alignment associated with the reference image;
determining that the current alignment is aligned with the predefined alignment, wherein the determination is performed before initiating illuminating, by a light source, of the target tissue to induce fluorescence emission and is performed before initiating capturing, by the camera, of a time series of fluorescence input data from the fluorescence emission;
in response to the determining that the current alignment is aligned with the predefined alignment:
initiating the illuminating, by the light source, the target tissue to induce the fluorescence emission;
initiating the capturing, by the camera, the time series of fluorescence input data from the fluorescence emission, wherein the camera is configured to receive fluorescence images for fluorescence medical imaging.

US Pat. No. 10,891,789

METHOD TO PRODUCE 3D MODEL FROM ONE OR SEVERAL IMAGES

ITSEEZ3D, INC., Santa Cl...

1. A method of neural network learning of 3D models of human heads comprising the steps of:a) providing at least two training 3D models produced by scanning or modeling representative human heads;
b) mapping each training 3D model to a pair of target images comprising an Is image that describes the model shape, and an It image that describes the model texture; and
c) rendering a frontal image of each training 3D model, detecting facial features in the frontal images, and applying a 2D affine transformation to the frontal images in order to make the coordinates of the facial features as close to the average position of facial features for all the training 3D models to produce frontal images of the representative heads,
wherein step b) comprises:
i) placing each of the training 3D models of a human head in the standard orientation into a reference surface, wherein the reference surface consists of a cylinder and a half-sphere, the cylinder axis coinciding with the z axis, the cylinder upper plane being placed on the level of a forehead of the training 3D model, and the half-sphere placed on top of the cylinder upper plane; and the standard orientation is the training 3D model orientation with a line between the eyes parallel to the x axis, the line of sight parallel to the y axis, and the z axis going from the (approximately) center of the neck up to the top of the head;
ii) for points with z coordinate smaller than or equal to the upper cylinder plane, producing a cylindrical projection to establish the correspondence between the human head and the reference surface;
iii) for points with z coordinate larger than the cylinder upper plane, producing a spherical projection to establish the correspondence between the human head and the reference surface; and
iv) defining a distance r for each point from the human head as a distance from a point from the human head to the cylinder axis for points lower than or equal to the cylinder upper plane, and as a distance from a point from the human head to the half-sphere center for points above the cylinder upper plane.
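
Step b) above maps every point of a training head model onto a reference surface made of a cylinder capped by a half-sphere, using a cylindrical projection below the cylinder's upper plane and a spherical projection above it, and records a distance r in each case. The sketch below illustrates that mapping for a single point; the (u, v) parametrization and coordinate conventions are assumptions.

# Sketch: map a 3D head-surface point to the cylinder/half-sphere reference
# surface, returning (u, v) coordinates and the distance r described in step iv).
import math

def map_point(x, y, z, cyl_top_z):
    if z <= cyl_top_z:
        # cylindrical projection: angle around the z axis and height along it
        r = math.hypot(x, y)                       # distance to the cylinder axis
        u = math.atan2(y, x)
        v = z
    else:
        # spherical projection onto the half-sphere centred on the cylinder upper plane
        dz = z - cyl_top_z
        r = math.sqrt(x * x + y * y + dz * dz)     # distance to the half-sphere centre
        u = math.atan2(y, x)
        v = cyl_top_z + math.atan2(dz, math.hypot(x, y))
    return u, v, r

print(map_point(0.08, 0.0, 0.10, cyl_top_z=0.15))  # a point on the face (cylinder region)
print(map_point(0.02, 0.0, 0.22, cyl_top_z=0.15))  # a point on the crown (half-sphere region)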

US Pat. No. 10,891,788

SYSTEMS AND METHODS FOR FINITE ELEMENT MESH REPAIR

DASSAULT SYSTEMES SIMULIA...

1. A computer-implemented method of repairing a finite element mesh in a simulation determining behavior of a real-world physical object represented by the finite element mesh, the method comprising:identifying a non-compliant mesh element in a finite element mesh representing a real-world physical object;
extracting a mesh patch from the finite element mesh that includes the identified non-compliant mesh element;
based on one or more rules executed by a processor, generating an invariant patch topological description for the extracted mesh patch, wherein the generated invariant patch topological description is: (i) a binary sequence, indicated by the one or more rules, that is a function of one or more properties of edges of the extracted mesh patch and (ii) invariant to transformations, specified by the one or more rules, of the extracted mesh patch;
communicating with a database that stores pre-determined repair solutions and obtaining from the database a repair solution corresponding to the generated invariant patch topological description;
automatically repairing the mesh patch in the finite element mesh using the obtained repair solution from the database, and said repairing resulting in a repaired finite element mesh stored in computer memory; and
performing a finite element simulation of the real-world physical object using the repaired finite element mesh stored in the computer memory to determine behavior of the real-world physical object;
said identifying, extracting, generating, communicating, repairing, and performing being implemented by one or more processors in an automated manner.

US Pat. No. 10,891,787

APPARATUS AND METHOD FOR CREATING BIOLOGICAL MODEL

FUJITSU LIMITED, Kawasak...

1. A biological model creation apparatus comprising:a memory configured to store therein mesh model data and target point data, the mesh model data representing a three-dimensional mesh model of a heart, the target point data indicating positions of a plurality of target points in a three-dimensional space, the plurality of target points being set on a plurality of valve annuli of a specified heart; and
a processor configured to perform a process including
setting a plurality of control points respectively corresponding to the plurality of target points, on a plurality of valve annuli in the mesh model,
determining positions of the plurality of control points on the plurality of valve annuli in the mesh model, based on a first evaluation value and a second evaluation value, the first evaluation value indicating a degree of matching of relative positions among control points corresponding to target points belonging to a same valve annulus to relative positions among the target points belonging to the same valve annulus, the second evaluation value indicating a degree of matching of relative positions among control points corresponding to target points belonging to different valve annuli to relative positions among the target points belonging to the different valve annuli, and
deforming, upon arranging the mesh model in the three-dimensional space, the mesh model such that the positions of the plurality of control points arranged at the predetermined positions in the mesh model coincide with positions of their corresponding target points.

US Pat. No. 10,891,786

GENERATING DATA FOR A THREE-DIMENSIONAL (3D) PRINTABLE OBJECT, INCLUDING A TRUSS STRUCTURE

Hewlett-Packard Developme...

1. A computer program product for generating data for a three-dimensional (3D) printable object, the computer program product comprising:a non-transitory computer readable storage medium comprising computer usable program code embodied therewith, the computer usable program code to, when executed by a processing device:
convert mesh data into a volumetric voxel data set, the voxel data set being organized as an N-ary tree defining at least a portion of the 3D printable object;
convert the N-ary tree of the voxel data set into print instructions defining a truss structure as volumetric infill within voxels of the 3D printable object; and
print the 3D object including the truss structure.

US Pat. No. 10,891,785

SYSTEMS AND METHODS FOR FITTING PRODUCT

1. A method for rendering reality for an object, comprising: using a mobile device with an accelerometer and a camera, performing motion tracking and learning in an environment of the object with accelerometer and camera data in the mobile device;selecting a pattern or color from a plurality of product variations or service variations;
blending the pattern or color of the object and the environment; and displaying the color on the object as reality view.

US Pat. No. 10,891,784

METHOD, APPARATUS AND STREAM FOR IMMERSIVE VIDEO FORMAT

InterDigital VC Holdings,...

1. A method of generating a stream, the method comprising:partitioning a three-dimensional point cloud in a plurality of three-dimensional parts each comprising at least a point of the three-dimensional point cloud;
for each of said three-dimensional parts:
determining a depth map of said three-dimensional part according to parameters representative of a two-dimensional parametrization responsive to geometric information associated with said at least a point and to pose information associated with a range of points of view; and
determining a color map of said three-dimensional part according to said parameters;
the plurality of determined depth maps being represented in a first patch atlas, each depth map corresponding to one patch of the first patch atlas, and the plurality of determined color maps being represented in a second patch atlas, each color map corresponding to one patch of the second patch atlas;
generating said stream comprising the parameters representative of the two-dimensional parametrizations, data representative of the first patch atlas, data representative of the second patch atlas and a first mapping information representative of a mapping between said two-dimensional parametrizations and corresponding depth maps in the first patch atlas and corresponding color maps in the second patch atlas.

US Pat. No. 10,891,783

IMPROVING AREA LIGHT SHADOWS IN COMPUTER GENERATED SCENES

Nvidia Corporation, Sant...

1. A method of adaptive occlusion sampling from a rectangular area light to provide realistic shadowing in a computer generated scene, comprising:generating, based on a rectangular area light, a spherical polygon on a unit sphere around a surface point of a geometry within the scene;
determining angular sizes of the rectangular area light along a first and a second dimension employing the spherical polygon;
determining a number of cones needed to sample the rectangular area light along the first and the second dimension employing the angular sizes;
generating a sample pattern for the first and the second dimensions based on the number of cones for sampling along each of the first and second dimensions;
creating a sampling grid within the spherical polygon based on the sample patterns of the first and second dimensions, wherein each sample point of the sampling grid has two dimensional coordinates within the spherical polygon;
obtaining samples from a light surface of the rectangular area light by tracing cones, through a voxel representation of the scene, from the surface point through the sample points to the light surface, wherein a total number of the cones is a product of the number of cones for sampling along each of the first and second dimensions; and
computing average visibility for the surface point based on the samples and a weight associated with each of the cones as determined by the sample patterns.
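
As a purely illustrative companion to the claim above, the sketch below derives a cone count per dimension from the light's angular sizes and lays out the corresponding sample grid; the divide-by-aperture rule, the aperture value, and the regular (u, v) grid are assumptions of the sketch rather than the claimed procedure.

    import math

    def cone_counts(angular_width, angular_height, cone_aperture):
        # One cone per 'cone_aperture' of angular extent in each dimension.
        nx = max(1, math.ceil(angular_width / cone_aperture))
        ny = max(1, math.ceil(angular_height / cone_aperture))
        return nx, ny

    def sample_grid(nx, ny):
        # Regular (u, v) sample points; the total cone count is nx * ny,
        # matching the product described in the claim.
        return [((i + 0.5) / nx, (j + 0.5) / ny)
                for j in range(ny) for i in range(nx)]

    nx, ny = cone_counts(30.0, 10.0, 5.0)   # angular sizes in degrees
    print(nx * ny, "cones")                 # 6 * 2 = 12 cones
    print(sample_grid(nx, ny)[:3])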

US Pat. No. 10,891,782

DETERMINING ILLUMINATION DIRECTION WITH THREE-DIMENSIONAL ILLUMINATION CHARTS

Hewlett-Packard Developme...

1. A three-dimensional illumination chart, comprising:a substrate divided into a number of portions, wherein each portion comprises a raised relief pattern disposed thereon, wherein relief elements of each relief pattern are arranged having a relief angle relative to a reference line,
wherein each relief angle of the raised relief patterns of the number of portions is different from other relief angles of other raised relief patterns of other portions such that the relief pattern of each portion of the number of portions casts a unique shadow with respect to a common illuminant;
wherein a difference between each pair of subsequent relief angles is a set value.
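
The layout rule in the claim (each portion's relief angle differing from the next by a set value) can be illustrated with a one-line generator; the start angle and step below are arbitrary example values, not values from the patent.

    def relief_angles(num_portions, start_deg=0.0, step_deg=15.0):
        # Each portion's relief angle differs from the previous one by a fixed
        # step, so every portion casts a distinct shadow under one illuminant.
        return [start_deg + i * step_deg for i in range(num_portions)]

    print(relief_angles(6))  # [0.0, 15.0, 30.0, 45.0, 60.0, 75.0]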

US Pat. No. 10,891,781

METHODS AND SYSTEMS FOR RENDERING FRAMES BASED ON VIRTUAL ENTITY DESCRIPTION FRAMES

Verizon Patent and Licens...

1. A method comprising:generating, by a virtual scene capture system for a first point in a temporal sequence, a key description frame comprising virtual entity description data that describes:
a state of a virtual object that is to be rendered for the first point in the temporal sequence, the description of the state of the virtual object including a location of the virtual object with respect to a global coordinate system associated with a three-dimensional (3D) space, and
a state of a virtual vantage point from which the virtual object is to be rendered for the first point in the temporal sequence, the description of the state of the virtual vantage point including a location of the virtual vantage point with respect to the global coordinate system;
generating, by the virtual scene capture system for a second point in the temporal sequence subsequent to the first point, an update description frame comprising virtual entity description data that describes the state of the virtual object for the second point in the temporal sequence and does not describe the state of the virtual vantage point for the second point in the temporal sequence; and
providing, by the virtual scene capture system, the key description frame and the update description frame to a 3D rendering engine server configured to render the virtual object from the virtual vantage point for the first and second points in the temporal sequence by generating, based on the key and update description frames, surface data frames depicting the virtual object from the virtual vantage point.

US Pat. No. 10,891,780

METHODS AND SYSTEMS FOR VIEWING A THREE-DIMENSIONAL (3D) VIRTUAL OBJECT

Google LLC, Mountain Vie...

1. A method comprising:a computing device operating a display to indicate a view of a virtual object according to a first viewpoint at a first distance to an object surface of the virtual object, wherein the first viewpoint is along a first viewpoint path having a shape that is a first blend of a shape of the object surface and a circular path around the virtual object, and wherein the first viewpoint path has a first extent of similarity with the circular path;
receiving a request for changing the view indicated in the display to a second view of the virtual object according to a second viewpoint at a second distance to the object surface of the virtual object, wherein the second distance is less than the first distance;
in response to receiving the request for changing the view indicated in the display to the second view, and based on the second distance being less than the first distance, determining a second viewpoint path along which the second viewpoint falls, wherein determining the second viewpoint path comprises determining the second viewpoint path (i) to have a shape that is a second, different blend of the shape of the object surface and the circular path around the virtual object, and (ii) to have a second extent of similarity with the circular path that is smaller than the first extent of similarity; and
operating the display to indicate a change of the view to the second view.
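
To make the path-blending idea above concrete, the sketch below blends an object-surface radius with a circular-path radius using a weight that shrinks as the viewing distance shrinks; the linear weighting and the radius-only parametrization are simplifying assumptions for illustration.

    def blend_weight(distance, min_dist, max_dist):
        # 1.0 -> follow the circular path, 0.0 -> follow the object surface.
        t = (distance - min_dist) / (max_dist - min_dist)
        return max(0.0, min(1.0, t))

    def viewpoint_radius(surface_radius, circle_radius, distance, min_dist, max_dist):
        w = blend_weight(distance, min_dist, max_dist)
        return (1.0 - w) * surface_radius + w * circle_radius

    # Farther viewpoint: the path is more similar to the circle.
    print(viewpoint_radius(1.2, 2.0, distance=5.0, min_dist=1.0, max_dist=6.0))
    # Closer viewpoint: the path hugs the object-surface shape more closely.
    print(viewpoint_radius(1.2, 2.0, distance=1.5, min_dist=1.0, max_dist=6.0))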

US Pat. No. 10,891,779

EFFICIENT VOLUMETRIC RECONSTRUCTION WITH DEPTH SENSORS

Naked Labs Austria GMBH, ...

1. A method for real-time volumetric 3D reconstruction of an object using at least one depth sensor camera, the method comprising:performing a preparation step of a new depth map frame, wherein the preparation step includes generating a first cast ray that extends from a camera center through an associated 3D position of an associated pixel on a camera plane, wherein the preparation step includes determining a first depth interval on the first cast ray, wherein the first depth interval is determined on the first cast ray depending on the new depth map frame and depending on a depth value of the associated pixel;
wherein the preparation step includes a search step in which are searched only an existing voxel and a non-existing voxel that is adjacent the existing voxel, which are hit by the first cast ray in the first depth interval;
wherein in the preparation step multiple search steps are parallelized as a plurality of individual search threads and each individual search thread does searching on a separate pixel;
wherein in the preparation step only the existing voxels and the non-existing voxels are collected in a reconstructed scene depending on the new depth map frame and in which the collected existing voxels and the non-existing voxels are cached in order to perform an update of the reconstructed scene; and
performing an integration step, in which only the collected and cached voxels and non-existing voxels of the preparation step are updated with a newly captured depth map frame;
wherein the preparation step and the integration step are separated from each other so that both the preparation and integration steps can be carried out in parallel at the same time.

US Pat. No. 10,891,778

APPARATUS AND METHOD FOR PRODUCING THREE-DIMENSIONAL MODELS FROM MAGNETIC RESONANCE IMAGING

THE BOARD OF TRUSTEES OF ...

1. A method comprising:receiving, by a processing system including a processor, an input three-dimensional dataset comprising a first plurality of two-dimensional images, wherein the first plurality of two-dimensional images covers an entire myocardium of a subject, and wherein the input three-dimensional dataset had been generated based upon one or more magnetic resonance imaging scans of the subject;
applying, by the processing system, bias field correction to the input three-dimensional dataset to generate a corrected three-dimensional dataset comprising a second plurality of two-dimensional images;
generating, by the processing system, a labeled three-dimensional dataset comprising a third plurality of two-dimensional images, wherein the labeled three-dimensional dataset further comprises one or more labels associated with the entire myocardium of the subject, and wherein the labeled three-dimensional dataset is generated via a convolutional neural network based upon the corrected three-dimensional dataset and based upon a previously trained three-dimensional dataset; and
outputting, by the processing system, a three-dimensional digital model of the entire myocardium of the subject based upon the labeled three-dimensional dataset;
wherein the method operates in an automatic manner proceeding without manual user intervention after the receiving the input three-dimensional dataset to the applying the bias field correction and then to the generating the labeled three-dimensional dataset and then to the outputting the three-dimensional digital model.

US Pat. No. 10,891,777

ULTRASOUND IMAGING SYSTEM AND METHOD FOR IMAGE GUIDANCE PROCEDURE

KONINKLIJKE PHILIPS N.V.,...

1. An ultrasound imaging system comprising:a 3D flow processing unit configured to generate an ultrasound flow volume based on ultrasound signals corresponding to a subject's vasculature;
a flow image processing unit configured to create a 3D-vessel map based on the ultrasound flow volume;
a B-mode volume processing unit configured to generate a B-mode volume based on the ultrasound signals;
a registration unit configured to automatically register the 3D-vessel map to the ultrasound flow volume and to select a portion of the 3D-vessel map corresponding to a current ultrasound flow image;
a display configured to display a live ultrasound flow image, which is updated in real-time, based on the current ultrasound flow image and the selected portion of the 3D-vessel map;
an image processing unit configured to overlay the current ultrasound flow image and the selected portion of the 3D-vessel map to provide the live ultrasound flow image; and
a controller configured to select either the B-mode volume processing unit or the 3D flow processing unit, wherein the controller is configured to first select the 3D flow processing unit, prior to an insertion of an invasive device, and subsequently select the B-mode volume processing unit after insertion of the invasive device.

US Pat. No. 10,891,776

IMAGING SYSTEM AND METHOD

KONINKLIJKE PHILIPS N.V.,...

1. An imaging system comprising:an input unit adapted to receive 3D image data of a volume of an object to be imaged;
a segmentation unit adapted to perform segmentation of the 3D image data received by the input unit, the segmentation being based upon an anatomical model and configured to determine at least one segmented surface within the 3D image data;
a surface rendering unit, adapted to generate one or more surface values to be assigned to points on the at least one segmented surface, said values being based upon image data values of the 3D image data falling along projection vectors extended through said points, each projection vector having length or angle to the segmented surface determined at least partly on the basis of the anatomical model; and
an image rendering unit adapted to generate one or more images of the object to be imaged based upon the surface values generated by the surface rendering unit.

US Pat. No. 10,891,775

AUTOMATIC LEVEL-OF-DETAIL FOR PHYSICALLY-BASED MATERIALS

NVIDIA CORPORATION, Sant...

1. A method comprising:identifying a declarative representation of a material to be rendered, the declarative representation including expressions representing a structural part in the material as one or more functions defining a distribution of light as it interacts with the material;
creating a reduced complexity declarative representation of the material by applying one or more term rewriting rules to the expressions, wherein the one or more term rewriting rules map the material to another material by pattern matching one or more of the expressions with one or more other expressions; and
returning the reduced complexity declarative representation of the material.

US Pat. No. 10,891,774

METHOD AND APPARATUS FOR PROFILE-GUIDED GRAPHICS PROCESSING OPTIMIZATIONS

Intel Corporation, Santa...

1. An apparatus comprising:a graphics processor to process graphics commands responsive to execution of an application; and
a non-transitory machine-readable medium having program code stored thereon which, when executed, causes the apparatus to:
obtain a graphics execution profile through an application programming interface (API) extension to an existing three-dimensional (3D) API,
store the graphics execution profile data associated with one or more graphics workloads that are being executed by the graphics processor, wherein the graphics execution profile data indicates a number of Single Instruction Multiple Data (SIMD) lanes to be allocated for processing the one or more graphics workloads, wherein the number of SIMD lanes is identified corresponding to characteristics of the one or more graphics workloads, and wherein the number of SIMD lanes results in a least execution time of the one or more graphics workloads by the graphics processor after the one or more graphics workloads are executed using each of available numbers of SIMD lanes of the graphics processor, and
read the graphics execution profile data upon detecting one of the graphics workloads during execution of the application, and
configure the graphics processor in accordance with the graphics execution profile data to execute the one or more graphics workloads during a next execution interval, wherein a shader is generated to execute the one or more graphics workloads based on the graphics execution profile data.

US Pat. No. 10,891,773

APPARATUS AND METHOD FOR EFFICIENT GRAPHICS VIRTUALIZATION

Intel Corporation, Santa...

1. A processor comprising:a command streamer to queue commands from a plurality of virtual machines (VMs), the commands to be distributed from the command streamer and executed by graphics processing resources of a graphics processing unit (GPU);
a tile cache to store graphics data associated with the plurality of VMs as the commands are executed by the graphics processing resources; and
tile cache allocation hardware logic to allocate a first portion of the tile cache to a first VM and a second portion of the tile cache to a second VM, wherein a priority is associated with each of the plurality of VMs and sizes of the first portion and second portion of the tile cache are selected in accordance with the priorities;
the tile cache allocation hardware logic to further allocate a first region in system memory to store spill-over data for the first VM when the first portion of the tile cache allocated to the first VM becomes full and the second portion of the tile cache allocated to the second VM is available.

US Pat. No. 10,891,772

RENDERING APPARATUS, RENDERING METHOD AND RECORDING MEDIUM

NEC CORPORATION, Tokyo (...

1. A rendering apparatus comprising:a memory storing a computer program; and
at least one processor configured to run the computer program to:
divide a three-dimensional model into a plurality of fragments;
generate a directed graph with each of the plurality of fragments as vertex and radiance in the direction from each of the plurality of fragments to every other of the plurality of fragments as edge, wherein each vertex includes three-dimensional coordinates where each of the plurality of fragments is located;
calculate first radiance for each of the plurality of fragments based on the directed graph, the first radiance being radiance in a direction from a fragment to another fragment, the first radiance being a sum of the radiance in a direction from the fragment to the other fragment and a value obtained by multiplying each radiance incident on the fragment by a reflectance in the direction of the other fragment; and
calculate second radiance for each fragment visible from a viewpoint among the plurality of fragments based on the first radiance, the second radiance being radiance in a direction from the fragment to the viewpoint, project each fragment visible having the second radiance on a rendering screen relevant to the viewpoint, and generate a rendered image.
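
One gather iteration over a directed radiance graph of the kind the claim describes can be sketched as below; the uniform scalar reflectance (rather than a per-direction reflectance) and the dense matrix representation are simplifications assumed for the example.

    def update_radiance(emitted, incoming, reflectance):
        """emitted[i][j]  : radiance from fragment i toward fragment j (edge weight)
           incoming[i]    : radiances currently arriving at fragment i
           reflectance[i] : scalar reflectance of fragment i"""
        n = len(emitted)
        out = [[0.0] * n for _ in range(n)]
        for i in range(n):
            gathered = reflectance[i] * sum(incoming[i])
            for j in range(n):
                if i != j:
                    # outgoing radiance = direct term plus reflected incident sum
                    out[i][j] = emitted[i][j] + gathered
        return out

    emitted = [[0.0, 1.0], [0.5, 0.0]]   # fragment 0 emits toward fragment 1, etc.
    incoming = [[0.5], [1.0]]            # radiance currently hitting each fragment
    print(update_radiance(emitted, incoming, reflectance=[0.8, 0.3]))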

US Pat. No. 10,891,771

INFORMATION PROCESSING APPARATUS FOR CREATING AN ANIMATION FROM A SPHERICAL IMAGE

RICOH COMPANY, LTD., Tok...

1. An information processing apparatus, comprising:circuitry configured to
receive selection of a first projection mode indicating a relative position of a virtual image capture so as to constrain a location of a first viewpoint, the first projection mode being selected by a user as one of inside, on, and outside a surface of a virtual three-dimensional object;
receive designation of the first viewpoint, the location of the first viewpoint being constrained by the selected first projection mode to be one of inside, on, and outside the surface of the virtual three-dimensional object;
register at least the first viewpoint and a second viewpoint, the second viewpoint being designated next to the first viewpoint, as viewpoints in a full-view, 360-degree spherical image that is mapped onto the surface of the virtual three-dimensional object;
set a transition path of the viewpoints by interpolating between the first viewpoint and the second viewpoint in the full-view spherical image;
generate a first partial image matching the first viewpoint, and generate a second partial image matching the second viewpoint; and
play an animation by sequentially displaying partial images while transitioning the viewpoints from the first partial image to the second partial image along the set transition path.

US Pat. No. 10,891,770

SYSTEMS AND METHODS FOR AUGMENTED REALITY VIDEO FEED ON AN UNMANNED VEHICLE

FLIR UNMANNED AERIAL SYST...

1. An unmanned vehicle (UV) controller comprising:a processor;
a communication interface for communicating with an unmanned vehicle (UV); and
a non-transitory memory device storing machine-readable instructions that, when executed by the processor, cause the processor to render a ground plane over images or video feed, the processor configured to:
receive an image or video feed from a camera on the UV;
receive from the UV telemetry information of the camera;
receive from the camera a zoom factor of the camera;
calculate a horizon of the camera based upon the telemetry information and the zoom factor; and
render the horizon as an overlay image on the image or video feed.

US Pat. No. 10,891,769

SYSTEM AND METHOD OF SCANNING TWO DIMENSIONAL FLOORPLANS USING MULTIPLE SCANNERS CONCURRENTLY

FARO TECHNOLOGIES, INC, ...

1. A system of generating a two-dimensional (2D) image of an environment, the system comprising:a plurality of 2D scanner systems, each 2D scanner system comprises a light source, an image sensor, and a controller, and wherein, in each 2D scanner system, said light source steers a beam of light within a first plane to illuminate object points in the environment, said image sensor is arranged to receive light reflected from the object points, and said controller is operable to determine a distance value to at least one of the object points;
one or more processors operably coupled to the 2D scanner systems, the one or more processors being responsive to non-transitory executable instructions for generating a plurality of 2D submaps of the environment in response to an activation signal from an operator and based at least in part on the distance value, each submap generated from a respective point in the environment and by a respective 2D scanner system; and
a central processor operably coupled to the one or more processors, the central processor being responsive to non-transitory executable instructions for generating the 2D image of the environment using the 2D submaps from the 2D scanner system, wherein generating the 2D image of the environment using the 2D submaps comprises aligning a first submap to the 2D image of the environment, the alignment comprising:
performing at least one of a shift or a rotation of the first submap relative to the 2D image;
identifying overlapping natural features in the 2D image and the first submap; and
automatically shifting and rotating the first submap to align with the 2D image when the overlapping natural features are aligned by the operator within a predetermined threshold.

US Pat. No. 10,891,768

ANNOTATING AN IMAGE WITH A TEXTURE FILL

Snap Inc., Santa Monica,...

1. A method comprising:accessing user input data defining a border with respect to a target digital image, the border separating a first portion of the target digital image from a second portion of the target digital image, the first portion being disposed within the border, and the second portion being disposed outside of the border;
applying a contouring technique to coordinate data received via the user input data, the applying of the contouring technique to the coordinate data yielding the border;
selecting, based on a sample digital image for a media overlay, a texturing technique from a set of texturing techniques;
generating the media overlay using the selected texturing technique;
generating a contour mask for the target digital image based on the border with respect to the target digital image; and
applying the media overlay to the target digital image based on the contour mask to yield an annotated digital image.

US Pat. No. 10,891,767

IMAGE PROCESSING APPARATUS AND NON-TRANSITORY COMPUTER READABLE RECORDING MEDIUM FOR TRIANGLE-BASED PIXEL AREA CALCULATIONS TO PERFORM ANTI-ALIASING ON EDGES IN SCAN LINE CONVERSIONS

KYOCERA DOCUMENT SOLUTION...

1. An image processing apparatus, comprising:a controller circuit configured to
detect a first point in a pixel group, the pixel group including multiple rectangular pixels arrayed in one line in series in a scan line direction, each pixel of the multiple pixels being divided into multiple rectangular sub-pixels, the multiple rectangular sub-pixels being arranged in a matrix in the scan line direction and in a perpendicular direction, the perpendicular direction being a direction perpendicular to the scan line direction, a lower side and an upper side of the pixel group being in parallel with the scan line direction, the first point being a point on which the lower side of the pixel group and an edge line cross each other, the edge line being a line indicating an edge of an image-to-be-rendered,
detect a second point, the second point being a point on which the upper side of the pixel group and the edge line cross each other,
select a first vertex, the first vertex being one vertex of two vertices positioned on a lower side of a first sub-pixel, the first sub-pixel being a sub-pixel including the first point out of the multiple sub-pixels,
select a second vertex, the second vertex being one vertex of two vertices positioned on an upper side of a second sub-pixel, the second sub-pixel being a sub-pixel including the second point out of the multiple sub-pixels,
create a right triangle, the right triangle including the image, the right triangle having a hypotenuse and a first cathetus, the hypotenuse being a line segment from the first vertex to the second vertex, the first cathetus being the upper side or the lower side of the pixel group,
calculate an area of a first micro-triangle, the first micro-triangle being included in the first sub-pixel or the second sub-pixel, the first micro-triangle including the image, the image having a hypotenuse, one cathetus, and one other cathetus, the hypotenuse being the line segment, the one cathetus being the first cathetus of the right triangle, the one other cathetus being one side of the first sub-pixel or the second sub-pixel, the first micro-triangle and the right triangle having a geometrical similarity,
calculate a number of multiple micro-triangles included in a region of each pixel of the multiple pixels, the region being covered by the right triangle, each micro-triangle of the multiple micro-triangles and the first micro-triangle being congruent, and
calculate an area of the region of each pixel of the multiple pixels, the region being covered by the right triangle, on a basis of the number of the multiple micro-triangles and the area of the first micro-triangle.

US Pat. No. 10,891,766

ARTISTIC REPRESENTATION OF DIGITAL DATA

Google LLC, Mountain Vie...

1. A method comprising:receiving, by one or more processors, data from one or more sensors, the data related to a user state measured over a period, the data having at least a first value and a second value within the period;
generating, by the one or more processors, a modified Cartesian representation of the received data, wherein the modified Cartesian representation comprises:
a first end corresponding to a beginning of the period and a second end corresponding to an end of the period, the first end being connected to the second end, and
a center;
wherein a first distance between the center and a first portion of the representation corresponds to the first value of the data, and a second distance between the center and a second portion of the representation corresponds to the second value of the data, wherein the first distance is different than the second distance; and
providing the modified Cartesian representation for display.
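
The claim's closed, center-anchored layout (a period's start joined to its end, with distance from the center encoding the value) resembles a radial plot; the polar placement and scaling constants in the sketch below are illustrative choices, not the claimed construction.

    import math

    def to_closed_loop(values, base_radius=1.0, scale=0.5):
        points = []
        n = len(values)
        for i, v in enumerate(values):
            theta = 2.0 * math.pi * i / n   # angular position within the period
            r = base_radius + scale * v     # distance from center encodes the value
            points.append((r * math.cos(theta), r * math.sin(theta)))
        return points

    sensor_values = [0.2, 0.5, 0.9, 0.6, 0.3, 0.1]   # invented example readings
    print(to_closed_loop(sensor_values)[:2])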

US Pat. No. 10,891,765

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD

SONY CORPORATION, Tokyo ...

1. An information processing apparatus comprising:a display size decision unit configured to decide a display size of each piece of a plurality of pieces of image data that is based on learning of a label corresponding to a class into which a selected piece among the plurality of pieces of image data is classified, wherein the display size of each respective piece of the image data is decided according to a respective likelihood vector obtained by recognition of the image data, the respective likelihood vector indicating a distance between a position of the respective piece of the image data arranged based on the recognition of the image data and a position of the class into which the selected piece of the image data is classified; and
a communication control unit configured to cause the decided display size of each piece of the image data to be transmitted,
wherein the display size decision unit and the communication control unit are each implemented via at least one processor.

US Pat. No. 10,891,764

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING PROGRAM

FUJIFILM Corporation, To...

1. An image processing apparatus comprising:a processor configured to:
acquire a plurality of projection images obtained by irradiating a subject disposed between a radiation source and a radiation detector with radiation emitted from the radiation source at different irradiation angles and capturing the radiation with the radiation detector at each of the irradiation angles;
decompose each of the plurality of projection images into a plurality of first projection images with a low-frequency component lower than a predetermined spatial frequency and a plurality of second projection images with a high-frequency component higher than the predetermined spatial frequency; and
generate a pseudo two-dimensional image using a plurality of first tomographic images which have been reconstructed from the plurality of first projection images and subjected to first image processing and a plurality of second tomographic images which have been reconstructed from the plurality of second projection images and subjected to second image processing different from the first image processing in type,
wherein the processor is further configured to combine each of the plurality of first tomographic images and each of the plurality of second tomographic images for each height according to the height based on a detection surface of the radiation detector to generate a plurality of composite tomographic images and generate a composite pseudo two-dimensional image as the pseudo two-dimensional image, using the plurality of composite tomographic images.

US Pat. No. 10,891,763

ENHANCED IMAGING OF A VASCULAR TREATMENT

KONINKLIJKE PHILIPS N.V.,...

1. An apparatus for determining an enhanced image of a vascular treatment, comprising:a data processor configured to:
receive a plurality of sets of images generated by the same diagnostic imaging system comprising a representation of a region of interest of a vascular structure wherein the images are contrast agent free, wherein each image of each set of images comprises image data of at least one localizing feature associated with at least one tool configured to be used in the vascular treatment and each image of each set of images comprises image data associated with the at least one tool,
determine registration information for each of the images of each set of images,
select a subset of images from each set of images on the basis of the registration information for each of the images, and
determine an enhanced image of a selectable one or both of the at least one tool and the region of interest of the vascular structure from the subset of images, by combining images of at least a part of the subset of images and wherein the enhanced image provides for enhanced visibility of the at least one tool including:
(i) determining tool location information from the subset of images as a function of the at least one localizing feature, and determining the enhanced image of the tool from the subset of images based on the tool location information, and/or
(ii) determining element location information for the subset of images as a function of the at least one localizing feature, and determining the enhanced image of the region of interest of the vascular structure from the subset of images based on the body element location information; and,
a display device configured to display data representative of the enhanced image.

US Pat. No. 10,891,762

APPARATUS AND METHOD FOR MEDICAL IMAGE DENOISING BASED ON DEEP LEARNING

CLARIPI INC., Seoul (KR)...

1. A method for medical image denoising based on deep learning, the method comprising: generating multiple trained deep learning models, the multiple trained deep learning models being grouped by examination areas;
extracting examination information from an input CT data, the examination information including examination area information;
selecting at least one deep learning model corresponding to the examination information from the multiple trained deep learning models; and
outputting a CT data denoised from the input CT data by feeding the input CT data into the selected at least one deep learning model wherein the generating comprises:
generating a second training CT data set to which noises of multiple predetermined levels are added by applying a CT data image noise simulator to a first training CT data set;
extracting examination information from the second training CT data set and grouping the second training CT data set into multiple groups according to a predetermined rule; and
generating and training multiple training-target deep learning models so as to correspond to the respective groups of the second training CT data set by groups,
wherein in the selecting, the multiple previously trained deep learning models are the multiple training-target deep learning models trained in the generating and training.

US Pat. No. 10,891,761

ADAPTIVE THREE-DIMENSIONAL SPACE GENERATION METHOD AND SYSTEM

3I Inc., Daegu (KR)

1. An adaptive three-dimensional space generation method comprising:allowing an adaptive three-dimensional space generation system to determine whether a space is a first-type space or a second-type space depending on structural characteristics of the space based on a plurality of images captured from different locations in the space;
allowing the adaptive three-dimensional space generation system to adaptively select an image for performing texturing of the space among the images depending on whether the space is the first-type space or the second-type space; and
performing texturing of the space based on the image selected by the adaptive three-dimensional space generation system,
wherein said determining whether the space is the first-type space or the second-type space depending on the structural characteristics of the space includes:
determining whether the space is the first-type space or the second-type space depending on whether or not at least two criteria are satisfied among a first criterion in which the number of corners of the space is smaller than or equal to a predetermined number, a second criterion in which a ratio of an area of a plane structure of the space to an area of a minimum bounding box of the plane structure of the space is greater than or equal to a reference area ratio, a third criterion in which distances from positions corresponding to the images to all walls of the space are smaller than or equal to a predetermined distance, and a fourth criterion in which all the walls of the space are seen from the positions corresponding to the images.
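
A minimal sketch of the two-of-four decision described above follows; the threshold values and the mapping of "at least two criteria satisfied" to the first-type label are assumptions made only for illustration.

    def classify_space(num_corners, plane_area, bbox_area,
                       wall_distances, all_walls_visible,
                       max_corners=6, min_area_ratio=0.7, max_wall_dist=5.0):
        criteria = [
            num_corners <= max_corners,                        # first criterion
            plane_area / bbox_area >= min_area_ratio,          # second criterion
            all(d <= max_wall_dist for d in wall_distances),   # third criterion
            all_walls_visible,                                 # fourth criterion
        ]
        return "first-type" if sum(criteria) >= 2 else "second-type"

    print(classify_space(4, 18.0, 20.0, [3.2, 4.1, 2.8, 3.9], True))  # -> first-type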

US Pat. No. 10,891,760

DIGITAL OVERPAINTING CONTROLLED BY OPACITY AND FLOW PARAMETERS

ADOBE INC., San Jose, CA...

1. A method for responding to a brushstroke input by updating canvas display data to include an output canvas color and an output canvas opacity, wherein the method includes one or more processing devices executing a graphics manipulation application and thereby performing operations comprising:receiving the brushstroke input within a digital canvas edited with the graphics manipulation application, the brushstroke input having a maximum alpha-deposition parameter and a fractional alpha-deposition parameter, wherein the maximum alpha-deposition parameter controls a maximum change in opacity resulting from brushstroke inputs and the fractional alpha-deposition parameter controls a fraction of the maximum alpha-deposition parameter applied in each brushstroke input;
computing an alpha flow increment from the maximum alpha-deposition parameter weighted with the fractional alpha-deposition parameter;
computing the output canvas opacity by increasing a current canvas opacity based on the alpha flow increment;
computing an interim canvas opacity that is the current canvas opacity adjusted by a value corresponding to the alpha flow increment;
computing a color mixing factor indicating a contribution of the alpha flow increment to a combination of the alpha flow increment and the interim canvas opacity; and
selecting, as the output canvas color, a sum of (i) a current canvas color that is weighted based on the color mixing factor and (ii) a brush color selected for the brushstroke input that is differently weighted based on the color mixing factor; and
modifying a portion of the digital canvas, to which the brushstroke input is applied, to display the updated canvas display data.
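
Because the claim names the quantities but not their exact formulas, the sketch below is only one plausible reading of the opacity/flow arithmetic (an "over"-style update); every formula in it is an assumption for illustration.

    def apply_brush(current_color, current_alpha, brush_color,
                    max_alpha_deposition, fractional_alpha_deposition):
        # alpha flow increment: fraction of the maximum deposition per stroke event
        increment = max_alpha_deposition * fractional_alpha_deposition
        # output canvas opacity: current opacity increased by the increment
        output_alpha = current_alpha + increment * (1.0 - current_alpha)
        # interim canvas opacity: current opacity adjusted by the increment
        interim_alpha = current_alpha * (1.0 - increment)
        # color mixing factor: the increment's share of (increment + interim opacity)
        denom = increment + interim_alpha
        mix = increment / denom if denom else 0.0
        # output canvas color: weighted sum of canvas color and brush color
        output_color = tuple((1.0 - mix) * c + mix * b
                             for c, b in zip(current_color, brush_color))
        return output_color, output_alpha

    print(apply_brush((0.9, 0.1, 0.1), 0.5, (0.1, 0.1, 0.9), 0.8, 0.25))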

US Pat. No. 10,891,759

METHOD FOR LOSSLESS COMPRESSION AND REGENERATION OF DIGITAL DESIGN DATA

1. A secure method for lossless compression and regeneration of native computer aided design (CAD) file representing a design, comprising:a. executing a severable first routine configured to extract part data from the native CAD file and thereby expressing the extracted data in a private file format having significantly reduced file size; and
b. executing a severable second routine arranged to complement the first routine and which is configured to receive the private file format expressed by said first routine to thereby regenerate the native CAD file in a lossless manner,characterized in that:the regenerated CAD file has lesser file size than that of original CAD file due to optimization of part creation keeping the number of features same as that of the original parts in the native CAD file;
colour scheme, material properties, and orientation of parts are maintained after regeneration of the native CAD format;
the regenerated CAD file is backward compatible with previous versions of CAD software;
the private file format established is an .ampedv extension file;
security is introduced by the fact that the private .ampedv extension file established can solely be created, and read, by the first and second subroutines respectively which therefore exclude any instance of unauthorized access, and alternatively use, of said private .ampedv extension file.

US Pat. No. 10,891,758

GEOMETRY ENCODER

GOOGLE LLC, Mountain Vie...

11. A method comprising:receiving geometric data to be encoded;
generating a signature for the geometric data based on at least one property associated with the geometric data, the signature being represented by a number of variables based on a statistical analysis of the at least one property;
receiving a first set of options defined by a first set of parameters associated with compressing the geometric data;
accessing a classifier based on the signature and the first set of options;
selecting a set of second options based on the classifier, the set of second options being defined by a second set of parameters associated with compressing the geometric data; and
encoding the geometric data using the first set of options and the set of second options.

US Pat. No. 10,891,757

LOW-LIGHT CAMERA OCCLUSION DETECTION

Waymo LLC, Mountain View...

1. A method of determining whether a camera is occluded, the method comprising:capturing an image using the camera, the camera having red, green, and blue pixels each including a photosensor;
determining, by one or more processors, output values for the photosensors of each of the red pixels, green pixels, and blue pixels for the image;
comparing, by the one or more processors, the output values of the green pixels to one or more of the output values of the red pixels or the output values of the blue pixels; and
based on the comparison, determining, by the one or more processors, that the camera is occluded.
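
An illustrative channel-comparison test in the spirit of the claim is sketched below; the use of channel means, the ratio test, and the 1.05 threshold are assumptions, not the patented criterion.

    def is_occluded(red_values, green_values, blue_values, ratio_threshold=1.05):
        mean = lambda xs: sum(xs) / len(xs)
        r, g, b = mean(red_values), mean(green_values), mean(blue_values)
        # Behind an opaque occlusion all channels collapse to a similar dark
        # level, so the green photosites no longer stand out from red/blue.
        return g < ratio_threshold * max(r, b)

    print(is_occluded([5, 5, 5], [5, 5, 5], [5, 5, 5]))           # similar -> True
    print(is_occluded([20, 22, 21], [60, 58, 61], [25, 27, 24]))  # green dominant -> False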

US Pat. No. 10,891,756

IMAGE PROCESSING DEVICE, CHART FOR CALIBRATION, AND CALIBRATION SYSTEM

SONY CORPORATION, Tokyo ...

1. An image processing device, comprising:at least one processor configured to:
acquire a far-infrared image and a visible light image;
allocate a first pixel value to each pixel that has a pixel value lower than a threshold value in the far-infrared image;
allocate a second pixel value to each pixel that has a pixel value higher than or equal to the threshold value in the far-infrared image, wherein the first pixel value is lower than the second pixel value;
extract, based on the allocation of the first pixel value and the second pixel value, a plurality of first markers having a first temperature from the far-infrared image;
extract the plurality of first markers having a first color from the visible light image; and
specify a position of each of a plurality of second markers, having both a second temperature in the far-infrared image and a second color in the visible light image, based on a geometric relationship between the plurality of first markers, wherein
the threshold value is set at the pixel value which corresponds to a temperature that is lower than the first temperature of the plurality of first markers and higher than the second temperature of the plurality of second markers, and
a first marker of the plurality of first markers is distinguishable from a second marker of the plurality of second markers based on both of: a difference between the first temperature and the second temperature, and a difference between the first color and the second color.

US Pat. No. 10,891,755

APPARATUS, SYSTEM, AND METHOD FOR CONTROLLING AN IMAGING DEVICE

Danxiao Information Techn...

1. A method, comprising:imaging a target object in a first imaging mode using an imaging device;
identifying a feature of the target object;
using image recognition to recognize the feature;
determining a spatial coordinate data of the feature;
updating the spatial coordinate data of the feature after the target object moves;
transferring a feature input data, the feature input data requesting additional imaging of the feature; and
actuating the imaging device based on the spatial coordinate data to additionally image the feature in a second imaging mode using the imaging device.

US Pat. No. 10,891,754

LIGHT-BASED VEHICLE POSITIONING FOR MOBILE TRANSPORT SYSTEMS

OSRAM SYLVANIA Inc., Wil...

1. A method for determining a position of a vehicle within an area, the method comprising:receiving an image of an asymmetric fiducial pattern displayed by a luminaire within the area, wherein:
the luminaire comprises a plurality of light sources;
when the luminaire is in illumination mode, the plurality of light sources are configured to emit a maximum light output; and
when the luminaire is in messaging mode, the plurality of light sources are configured to emit light output of varying intensity to display the asymmetric fiducial pattern;
determining a coordinate position of the luminaire based on the received image of the asymmetric fiducial pattern displayed by the luminaire;
determining an orientation of the vehicle relative to the area based on an orientation of the asymmetric fiducial pattern in the received image; and
determining the position of the vehicle relative to the area based at least in part on the determined coordinate position of the luminaire.

US Pat. No. 10,891,753

DEVICE, SYSTEM AND METHOD FOR NOTIFYING A PERSON-OF-INTEREST OF THEIR LOCATION WITHIN AN ESTIMATED FIELD-OF-VIEW OF A CAMERA

MOTOROLA SOLUTIONS, INC.,...

1. An electronic device comprising:a controller in communication with a first camera, the controller configured to:
identify a first responder in an area proximal one or more of the first camera and the electronic device, the first responder responding to a current incident;
receive images from the first camera;
process the images from the first camera to identify a second camera in the images located in the area proximal one or more of the first camera and the electronic device using one or more of: machine learning algorithms; video analytic algorithms; three-dimensional video analytic algorithms; and trained video analytic classifiers applied to the images from the first camera;
determine a location of the first responder;
determine, from the images from the first camera: a respective location of the second camera; a direction that the second camera is pointing in the images; an estimated field-of-view of the second camera; and a bystander, to the current incident, operating the second camera to acquire one or more of respective images and video of one or more of the current incident and the first responder,
determine, by comparing the location of the first responder and the respective location and the estimated field-of-view of the second camera, whether the first responder is located within the estimated field-of-view of the second camera; and,
in response to determining that the first responder is located within the estimated field-of-view of the second camera, control a notification device to provide a notification to notify the first responder of: the location of the first responder within the estimated field-of-view of the second camera; and the direction of the second camera with respect to the location of the first responder.

US Pat. No. 10,891,752

SENSOR-BASED MOVING OBJECT LOCALIZATION SYSTEM AND METHOD

INDUSTRY ACADEMIC COOPERA...

1. A moving object localization system comprising:a plurality of sensors including an ordered pair of sensors that detect a moving object and measure positional information of the moving object; anda server that:collects the measured positional information from the ordered pair of sensors,
calculates a position of the moving object,
selects other ordered pairs among the plurality of sensors, and
obtains an average of ordered pair positions for all ordered pairs of the plurality of sensors.
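
The server-side averaging step can be illustrated as below; the midpoint-of-measurements pairwise estimator is a placeholder assumption standing in for whatever per-pair position calculation the system actually uses.

    from itertools import permutations

    def pair_estimate(meas_a, meas_b):
        # Placeholder per-ordered-pair position estimate: the midpoint.
        return ((meas_a[0] + meas_b[0]) / 2.0, (meas_a[1] + meas_b[1]) / 2.0)

    def localize(measurements):
        # One estimate per ordered pair of sensors, then the overall average.
        estimates = [pair_estimate(measurements[i], measurements[j])
                     for i, j in permutations(range(len(measurements)), 2)]
        xs = [p[0] for p in estimates]
        ys = [p[1] for p in estimates]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    print(localize([(1.0, 2.0), (1.4, 1.8), (0.9, 2.3)]))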

US Pat. No. 10,891,751

RANGE IMAGE GENERATING METHOD, RANGE IMAGE GENERATING APPARATUS, AND ON-BOARD DEVICE HAVING RANGE IMAGE GENERATING FUNCTION

Ricoh Company, Ltd., Tok...

1. An image processing method of generating a range image, the method comprising:detecting, by a distance measurement unit including a light outputting device, a distance to an object as a measured distance by emitting light from the light outputting device to the object;
performing, by circuitry, matching of a stereogram by shifting a comparison image with respect to a reference image, to calculate matching cost values of pixels in the stereogram in a parallax space, each of the matching cost values in the parallax space corresponding to a corresponding shift amount of a pixel between the reference image and the comparison image;
converting, by the circuitry, the matching cost values in the parallax space into a set of matching cost values in a distance space, each of the matching cost values in the distance space corresponding to a corresponding distance;
performing, by the circuitry, integration of a distance evaluation value related to the measured distance, with a matching cost value corresponding to the measured distance among the matching cost values of a pixel in the stereogram corresponding to the location on the object of which the distance is detected by the distance measurement unit; and
generating, by the circuitry, after the integration, a range image for measuring a distance to an object, based on a result of the integration.

US Pat. No. 10,891,750

PROJECTION CONTROL DEVICE, MARKER DETECTION METHOD, AND STORAGE MEDIUM

CASIO COMPUTER CO., LTD.,...

1. A projection control device comprising:a projection module configured to project a correcting image onto a projection target, the correcting image including an image position identifying marker which includes a first polygon and a second polygon positioned inside the first polygon and which is arranged in a predetermined position, and the correcting image being projected onto the projection target for pickup by an image pickup device,
wherein at least one of sides of the first polygon is non-parallel to sides of the second polygon, and
wherein the second polygon has as many vertices as the first polygon or has fewer vertices than the first polygon.

US Pat. No. 10,891,749

DEPTH MAPPING

Cambridge Mechatronics Li...

1. A device for depth mapping of an object, the device comprising:an emitter comprising a laser for emitting radiation and an optical component for transmitting the radiation to the object and from which reflected radiation is reflected from a surface of the object for measuring a distance of the device relative to the object and generating a depth map of the object;
a depth mapping processor unit that executes software to generate the depth map from the reflected radiation;
a shape memory alloy (SMA) actuator arranged to change the relative position and/or relative orientation of the optical component to the laser to control a position of the radiation on the object; and
a control processor unit that executes software to:
receive information relating to change in the location and/or orientation of the device relative to the object during use of the device; and
instruct the SMA actuator to change the relative position and/or relative orientation of the optical component to the laser so as to compensate for and reduce the extent to which a change in the location and/or orientation of the device relative to the object during use of the device affects measurement of the distance of the device relative to the object, the position of the radiation on the object, and the depth map of the object generated by the depth mapping processor unit from the reflected radiation.

US Pat. No. 10,891,748

PROCESSING OPTICAL COHERENCE TOMOGRAPHY SCANS

Michelson Diagnostics Ltd...

1. A method of processing optical coherence tomography (OCT) scans through a subject's skin, the method comprising:receiving a plurality of scans through the subject's skin, the scans representing an OCT signal in slices through the user's skin at different times;
comparing the scans to determine time-varying regions in the scans; and
determining a distribution of density of the time varying regions with varying depth in the scans;
determination of the time-varying regions comprising a determination of a threshold depth through the user's skin at which a density of the time-varying regions exceeds a density threshold.
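
The depth-threshold determination reads naturally as a first-crossing search over a density-versus-depth profile; the sketch below assumes such a profile is already available, and all numbers in it are invented.

    def threshold_depth(depths, densities, density_threshold):
        # Return the first depth at which the density of time-varying regions
        # exceeds the density threshold (None if it never does).
        for depth, density in zip(depths, densities):
            if density > density_threshold:
                return depth
        return None

    depths_mm = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
    densities = [0.00, 0.01, 0.03, 0.08, 0.15, 0.20]  # fraction of voxels varying
    print(threshold_depth(depths_mm, densities, density_threshold=0.05))  # -> 0.3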

US Pat. No. 10,891,747

SENSOR CALIBRATION SYSTEM FOR AUTONOMOUS DRIVING VEHICLES

BAIDU USA LLC, Sunnyvale...

1. A computer-implemented method for calibrating a sensor of an autonomous driving vehicle (ADV), the method comprising:determining a horizon line representing a vanishing point from a viewpoint of the ADV based on a pitch angle of a camera that captured a first image representing a two-dimensional (2D) view from the viewpoint of the ADV;
displaying, on a first display area of a display device, the horizon line superimposed on a location within the first image that is based on a setting of the camera;
displaying, on a second display area of the display device, a second image that is a three-dimensional (3D) view based on the first image and 3D information obtained from one or more other sensors of the ADV;
determining one or more lane lines based on the first image based on a perception process performed on the first image and projecting the one or more lane lines onto the second image;
in response to a first input signal received from an input device,
updating a position of the horizon line superimposed on the first image, based on the first input signal, and
updating a position of at least one of the lane lines projected onto the second image, based on the updated position of the horizon line superimposed on the first image; and
determining a first calibration factor for calibrating the pitch angle of the camera based on a difference between the horizon line and the updated horizon line in view of the one or more lane lines and the updated position of the at least one of the one or more lane lines.
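
Under a simple pinhole-camera assumption (not stated in the claim), the horizon's image row follows directly from the pitch angle and focal length, as sketched below; the sign convention and example numbers are assumptions.

    import math

    def horizon_row(pitch_down_rad, focal_px, principal_row):
        # A camera pitched downward places the horizon above the principal
        # point (smaller row index; image rows increase downward).
        return principal_row - focal_px * math.tan(pitch_down_rad)

    # 2 degrees of downward pitch, 1000 px focal length, principal row 540:
    print(horizon_row(math.radians(2.0), 1000.0, 540.0))  # ~505, above center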

US Pat. No. 10,891,746

THREE-DIMENSIONAL GEOMETRY MEASUREMENT APPARATUS AND THREE-DIMENSIONAL GEOMETRY MEASUREMENT METHOD

MITUTOYO CORPORATION, Ka...

1. A three-dimensional geometry measurement apparatus that measures a three-dimensional geometry of an object to be measured by projecting, onto the object to be measured, a projection image including a light pattern in which luminance changes depending on a position, the three-dimensional geometry measurement apparatus comprising:a projection part that projects the projection image onto the object to be measured;
an image capturing part that generates a captured image by capturing the object to be measured on which the projection image is projected;
a relationship identification part that identifies a projection pixel position which is a position of a pixel of the projection image having correspondence with a captured pixel position which is a position of a pixel of the captured image;
a defective pixel determination part that determines whether or not the pixel at the captured pixel position is a defective pixel on the basis of a positional relationship between (i) a projection light beam starting from the projection part and passing through the pixel at the projection pixel position and (ii) a captured light beam starting from the image capturing part and passing through the pixel at the captured pixel position having correspondence with the projection pixel position, and (i) identifies, as a first three-dimensional position, a position at which the captured light beam passing through the captured pixel position intersects with a projection light beam plane in a predetermined direction passing through the projection pixel position having correspondence with the captured pixel position, (ii) specifies, as a second three-dimensional position, a position at which the projection light beam passing through the projection pixel position intersects with the captured light beam plane in a predetermined direction passing through the captured pixel position, and (iii) determines that the pixel at the captured pixel position is a defective pixel when a distance between the first three-dimensional position and the second three-dimensional position is greater than a threshold value when determining whether or not the pixel at the captured pixel position is the defective pixel; and
a geometry identification part that identifies the three-dimensional geometry of the object to be measured on the basis of pixel values of the captured pixel positions excluding the position of the pixel determined to be the defective pixel by the defective pixel determination part.

US Pat. No. 10,891,745

HYBRID SYSTEM WITH A STRUCTURED-LIGHT STEREO DEVICE AND A TIME OF FLIGHT DEVICE

Apple Inc., Cupertino, C...

1. A system for depth sensing, the system comprising:a structured-light stereo device;
a time of flight device;
a modulator; and
a computing device including a processor configured to execute instructions stored in a memory to:
project a light pattern onto a scene, wherein modulation of the light pattern is modified based on at least one of mapping information, route information, surface conditions, or speed information from a vehicle moving through the scene,
determine depth measurements of the scene using information about the light pattern received by the structured-light stereo device,
determine time of flight measurements of the scene using information about the light pattern received by the time of flight device, wherein the light pattern is projected onto the scene using the modulator and both the structured-light stereo device and the time of flight device to determine the depth measurements and the time of flight measurements,
generate a depth map using the depth measurements,
generate calibration points using the time of flight measurements and the depth measurements, each calibration point including information related to distance between a corresponding time of flight measurement and a corresponding depth measurement, and
update the depth map using the calibration points.
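
Claim 1 combines sparse time-of-flight measurements with a stereo depth map through "calibration points" that record the distance between corresponding measurements. A minimal sketch, assuming the ToF samples are already registered to depth-map pixels and using a simple global bias correction (the claim itself does not fix the update rule):

```python
import numpy as np

def calibrate_depth_map(depth_map, tof_points):
    """Correct a structured-light stereo depth map with sparse ToF samples.

    depth_map  : HxW array of stereo depth measurements
    tof_points : list of (row, col, tof_depth) samples from the ToF device

    Each calibration point stores the distance (offset) between the ToF
    measurement and the stereo depth at the same pixel; the map is then
    updated with the mean offset (a deliberately simple correction model).
    """
    offsets = [tof_d - depth_map[r, c] for r, c, tof_d in tof_points]
    calibration_points = [(r, c, off) for (r, c, _), off in zip(tof_points, offsets)]
    updated = depth_map + np.mean(offsets)   # global bias correction
    return updated, calibration_points
```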

US Pat. No. 10,891,744

DETERMINING THE KINETIC STATE OF A BODY USING LIDAR POINT CLOUD REGISTRATION WITH IMPORTANCE SAMPLING

Argo AI, LLC, Pittsburgh...

1. A method of estimating a kinetic state of a rigid body, the method comprising:by a LiDAR sensor, receiving a first point cloud and a second point cloud, wherein the first point cloud and second point cloud each comprise a subset of points, and wherein each point in each subset of points comprises data returned to the LiDAR sensor from a rigid body;
by a processor:
determining a rigid body transform for the first point cloud and the second point cloud,
obtaining a sample space of rigid body transforms, wherein a previous point cloud registration identified each rigid body transform of the sample space of rigid body transforms,
identifying a proposal distribution, wherein the proposal distribution comprises probabilities that each rigid body transform in the sample space is the rigid body transform that best aligns the subset of points of the first point cloud to the subset of points in the second point cloud,
performing importance sampling on the proposal distribution to identify a probability distribution for the rigid body transform that best aligns the subset of points of the first point cloud to the subset of points in the second point cloud, and
using the identified probability distribution and a history of previously-identified probability distributions to estimate a kinetic state of the rigid body, wherein using the history of probability distributions to estimate the kinetic state of the rigid body comprises using the history of probability distributions as a measurement of object pose in a Kalman Filter to estimate position, velocity, and acceleration of the rigid body over time.
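
The core of the method is importance sampling over a sample space of candidate rigid-body transforms, followed by a Kalman filter that treats the resulting pose estimate as a measurement. The sketch below simplifies the transforms to 2-D translations and assumes point correspondences are known; the proposal probabilities, noise matrices and state layout are illustrative placeholders.

```python
import numpy as np

def alignment_likelihood(transform, src, dst, sigma=0.5):
    """Unnormalized likelihood that `transform` (a 2-D translation here)
    aligns the first point cloud `src` to the second point cloud `dst`;
    correspondences between rows of src and dst are assumed known."""
    err = np.linalg.norm((src + transform) - dst, axis=1).mean()
    return np.exp(-err ** 2 / (2 * sigma ** 2))

def importance_sample(sample_space, proposal_probs, src, dst):
    """Probability distribution over the candidate transforms: proposal
    probability times alignment likelihood, renormalized."""
    w = np.array([proposal_probs[i] * alignment_likelihood(T, src, dst)
                  for i, T in enumerate(sample_space)])
    return w / w.sum()

def kalman_update(x, P, z, F, Q, H, R):
    """One predict/update cycle; z is the pose measurement derived from the
    importance-sampled distribution (e.g. its weighted mean)."""
    x = F @ x                      # predict position, velocity, acceleration
    P = F @ P @ F.T + Q
    y = z - H @ x                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Typical use: dist = importance_sample(...), then
# z = np.average(np.array(sample_space), axis=0, weights=dist) fed to kalman_update.
```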

US Pat. No. 10,891,743

IMAGE PROCESSING DEVICE, OPERATION METHOD PERFORMED BY IMAGE PROCESSING DEVICE AND COMPUTER READABLE RECORDING MEDIUM FOR PERFORMING DIFFERENT ENHANCEMENT PROCESSINGS BASED ON CONTEXT OF UPDATE DETERMINED FROM LATEST IMAGE ACQUIRED

OLYMPUS CORPORATION, Tok...

1. An image processing device comprisinga processor comprising hardware, the processor being configured to:
sequentially acquire multiple images of different types;
determine whether a latest acquired image of the multiple images is of a pre-set type of the different types;
in response to determining that the latest acquired image is of the pre-set type, perform a first enhancement processing on the latest acquired image to generate an enhanced image; and
in response to determining that the latest acquired image is not of the pre-set type, perform a second enhancement processing comprising setting a previously generated enhanced image of the pre-set type as the enhanced image.
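
In other words, the first enhancement processing runs only on images of the pre-set type, while other images simply reuse the most recent enhanced image of that type. A minimal sketch, where first_enhancement is a hypothetical stand-in for the device's actual enhancement routine:

```python
def enhance(latest_image, image_type, preset_type, state, first_enhancement):
    """Select the enhancement path based on the type of the latest image.

    `state` keeps the most recent enhanced image of the pre-set type so it
    can be reused when the latest image is of a different type.
    """
    if image_type == preset_type:
        enhanced = first_enhancement(latest_image)   # first enhancement processing
        state["last_enhanced"] = enhanced
    else:
        enhanced = state["last_enhanced"]            # second enhancement processing
    return enhanced
```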

US Pat. No. 10,891,742

DENSE MOTION TRACKING MECHANISM

Intel Corporation, Santa...

1. An apparatus to facilitate motion tracking, comprising:one or more processors to receive video data including a plurality of point clouds in a time series, convert each of the plurality of point clouds into a deformable model representation including generating an isosurface mesh for each point cloud and deconstructing the isosurface mesh into a mesh constellation having a plurality of independent mesh surface elements, perform a skinning and articulation operation on the deformable model representation to assign bone weights to each of the plurality of independent surface elements and generate an inter-frame mapping between the deformable model representations to track motion between the plurality of point clouds, wherein a centroid of each surface element comprises a separate point cloud.

US Pat. No. 10,891,741

HUMAN ANALYTICS USING FUSION OF IMAGE AND DEPTH MODALITIES

RetailNext, Inc., San Jo...

1. A method comprising:obtaining, by a computing device, multiple frames of stereo image pairs from an image capturing device;
rectifying, by the computing device, each frame of the stereo image pairs;
computing, by the computing device, a stereo disparity for each frame of the stereo image pairs;
determining, by the computing device, a first set of object detections in each frame using the computed stereo disparity;
determining, by the computing device, a second set of object detections in each left or right frame of the stereo image pair using one or more machine learning models;
fusing, by the computing device, the first and second sets of object detections;
accumulating, by the computing device, batches of the fused object detections;
creating, by the computing device, tracks based on the batches of fused object detections; and
associating, by the computing device, the tracks over consecutive batches of the fused object detections.
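
Fusing the disparity-based and model-based detections is the step the claim leaves unspecified. One plausible sketch, using IoU matching of bounding boxes (the box format and the 0.5 threshold are assumptions for illustration):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def fuse_detections(disparity_dets, ml_dets, iou_thresh=0.5):
    """Merge detections from stereo disparity with detections from an ML model.
    Overlapping pairs are kept once; unmatched detections from either source
    are kept as-is (one plausible fusion rule; the claim does not fix one)."""
    fused, used = [], set()
    for d in disparity_dets:
        for j, m in enumerate(ml_dets):
            if j not in used and iou(d, m) >= iou_thresh:
                used.add(j)
                break
        fused.append(d)
    fused.extend(m for j, m in enumerate(ml_dets) if j not in used)
    return fused
```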

US Pat. No. 10,891,740

MOVING OBJECT TRACKING APPARATUS, MOVING OBJECT TRACKING METHOD, AND COMPUTER PROGRAM PRODUCT

Kabushiki Kaisha Toshiba,...

1. A moving object tracking apparatus comprising:one or more processors configured to:
acquire a plurality of pieces of moving object information representing a moving object included in a photographed image;
execute an associating process for associating a plurality of pieces of the moving object information similar to each other as the moving object information of the same moving object for three or more pieces of the moving object information; and
output the associated moving object information, wherein
each of the plurality of pieces of moving object information is acquired based on the photographed image that is photographed by one of a plurality of imaging units,
each of the plurality of imaging units is classified into one of a plurality of groups,
each of the plurality of groups includes two or more of the plurality of imaging units,
each of the plurality of pieces of moving object information is classified into one of the plurality of groups to which a first imaging unit belongs, the first imaging unit photographing the corresponding photographed image, and
the associating process includes:
an intra-group associating process for the moving object information included in the group, and
an inter-group associating process for the moving object information associated by the intra-group associating process.
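
A minimal sketch of the two-stage association, with each piece of moving object information reduced to a feature vector and similarity measured by Euclidean distance; the greedy pairing, the camera-to-group map and the threshold are assumptions, and the inter-group stage is simplified to run over all records rather than only the intra-group results:

```python
import numpy as np
from itertools import combinations

def associate(infos, threshold):
    """Greedily link moving-object records whose feature vectors are close."""
    return [(a["id"], b["id"]) for a, b in combinations(infos, 2)
            if np.linalg.norm(a["feature"] - b["feature"]) < threshold]

def two_stage_association(records, camera_to_group, threshold=1.0):
    """records: dicts with 'id', 'camera', 'feature' (np.ndarray).
    Stage 1 associates records inside each camera group; stage 2 then
    associates records across groups (simplified here)."""
    groups = {}
    for r in records:
        groups.setdefault(camera_to_group[r["camera"]], []).append(r)
    intra = {g: associate(members, threshold) for g, members in groups.items()}
    inter = associate(records, threshold)
    return intra, inter
```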

US Pat. No. 10,891,739

OBJECT TRACKING WITH A HOLOGRAPHIC PROJECTION

International Business Ma...

1. A method of tracking a physical object with a holographic projection, comprising:obtaining a feed from a sensor of an area containing at least one physical object;
locating positions of a set of physical objects in the area based on the feed from the sensor;
categorizing the set of physical objects into a set of categories;
determining a distribution of the set of physical objects based on the positions;
creating a set of visible holographic objects to mark the set of physical objects; and
projecting, based on the set of categories and the distribution, the set of visible holographic objects into the area at the positions of the set of physical objects, each visible holographic object marking at least one physical object of the set of physical objects.

US Pat. No. 10,891,738

BOUNDARY LINE RECOGNITION APPARATUS AND BRANCH ROAD DETERMINATION APPARATUS

NIPPON SOKEN, INC., Nish...

1. A branch road determination apparatus comprising:a camera that is installed in a vehicle to capture an image of a travel road;
a boundary line candidate extraction section that extracts candidate lines from a pair of left-and-right boundary lines sectioning the travel road, based on image information captured by the camera;
a curvature calculation section that calculates a curvature of the travel road; and
a branch road determination section that
calculates a probability of whether the travel road is a branch road, in which the probability increases as a degree of the travel road having characteristics of a branch road increases, and
calculates an integrated probability to determine whether the travel road is a branch road by integrating the probability with a plurality of branch road characteristics, wherein
the plurality of branch road characteristics include at least one of
a branch road characteristic in which a candidate line extracted by the boundary line candidate extraction section is a solid line, and
a branch road characteristic in which the curvature of the travel road calculated by the curvature calculation section is smaller than a predetermined value.
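
The claim requires integrating the per-characteristic probabilities into one branch-road probability but does not fix the integration rule. One plausible sketch, using an odds-product style fusion with illustrative probabilities:

```python
def integrated_branch_probability(characteristic_probs):
    """Combine per-characteristic branch-road probabilities into one score.

    characteristic_probs: probabilities in [0, 1], one per branch road
    characteristic (e.g. 'candidate line is solid', 'curvature below
    threshold').  The odds-product fusion below is one plausible choice.
    """
    p_branch, p_not = 1.0, 1.0
    for p in characteristic_probs:
        p_branch *= p
        p_not *= (1.0 - p)
    return p_branch / (p_branch + p_not)

# Example: solid candidate line (0.7) and small curvature (0.8)
print(integrated_branch_probability([0.7, 0.8]))   # ~0.90
```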

US Pat. No. 10,891,737

MEDICAL IMAGE PROCESSING DEVICE, ENDOSCOPE SYSTEM, DIAGNOSIS SUPPORT DEVICE, AND MEDICAL SERVICE SUPPORT DEVICE

FUJIFILM Corporation, To...

1. A medical image processing device comprising:an image acquisition unit that acquires a medical image obtained from image pickup of an object to be observed;
a region-of-interest extraction section that extracts a first region of interest as a region of interest from the medical image;
a region-of-interest change section that performs correction processing for correcting the first region of interest to a second region of interest; and
a user interface that receives an instruction given to the region-of-interest change section by a user,
wherein a first medical image and a second medical image different from each other are included in the medical image,
the region-of-interest change section performs the correction processing on a region of interest extracted from the first medical image,
the region-of-interest extraction section extracts a region of interest from the second medical image by using region correction information about the correction processing,
the region-of-interest change section performs the correction processing by changing a region-of-interest extraction condition for extraction of the region of interest,
the region-of-interest extraction section calculates a first feature quantity from the medical image and extracts a region where the first feature quantity is in a first region extraction range as the region of interest, and
the region-of-interest extraction condition is a condition about the first region extraction range.
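
The extraction condition is a range on the first feature quantity, and the correction processing amounts to changing that range and reusing it on the next image. A minimal sketch, where pixel intensity stands in for the actual feature quantity and the correction is modeled as a user-supplied shift:

```python
import numpy as np

def extract_roi(image, extraction_range):
    """Region of interest = mask of pixels whose first feature quantity falls
    inside the extraction range (intensity is used as a stand-in feature)."""
    lo, hi = extraction_range
    feature = image.astype(float)
    return (feature >= lo) & (feature <= hi)

def correct_extraction_condition(extraction_range, user_adjustment):
    """Correction processing: modeled here as shifting the extraction range;
    the returned range is the 'region correction information' reused on the
    second medical image."""
    lo, hi = extraction_range
    return (lo + user_adjustment, hi + user_adjustment)
```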

US Pat. No. 10,891,736

ASSOCIATING AN AGENT WITH AN EVENT USING MOTION ANALYSIS

Amazon Technologies, Inc....

1. A computer implemented method, comprising:under control of one or more computing systems configured with executable instructions,
receiving an event notification that indicates an event location within a materials handling facility and an event time for an event;
determining an event time window that includes the event time;
obtaining a plurality of overhead images, created during the event time window, that include a corresponding representation of an area near the event location;
determining, based at least in part on the plurality of overhead images, a motion model representative of a motion of an agent represented in at least some of the plurality of overhead images, the motion model generated by at least:
combining a plurality of pixels of at least some of the plurality of overhead images to form the motion model; and
altering at least some of the plurality of pixels representative of the agent to represent a gradient history illustrating the motion of the agent during the event time window;
searching pixels of the plurality of pixels of the motion model to determine a closest contour point of the motion model to the event location;
determining a distance between the closest contour point of the motion model and the event location;
determining a score for the agent based at least in part on the distance and the motion model;
determining that the score exceeds a threshold; and
associating the event with the agent.
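
The gradient-history motion model is essentially a motion-history image: recently moving pixels carry high values that decay over time, and the agent's score depends on how close the model lies to the event location. A numpy-only sketch under assumed grayscale overhead frames (the change threshold and decay step are illustrative):

```python
import numpy as np

def motion_history(frames, tau):
    """Build a gradient-history ('motion history') image from overhead frames:
    pixels that changed most recently get the largest value, older motion
    decays toward zero."""
    mhi = np.zeros(frames[0].shape, dtype=float)
    for t in range(1, len(frames)):
        moving = np.abs(frames[t].astype(float) - frames[t - 1]) > 15   # illustrative threshold
        mhi[moving] = tau
        mhi[~moving] = np.maximum(mhi[~moving] - 1, 0)
    return mhi

def closest_contour_distance(mhi, event_xy):
    """Distance from the event location to the nearest non-zero pixel of the
    motion model (a stand-in for the claim's closest contour point)."""
    ys, xs = np.nonzero(mhi)
    if len(xs) == 0:
        return np.inf
    return np.hypot(xs - event_xy[0], ys - event_xy[1]).min()
```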

US Pat. No. 10,891,735

GENERATION OF SYNTHETIC HIGH-ELEVATION DIGITAL IMAGES FROM TEMPORAL SEQUENCES OF HIGH-ELEVATION DIGITAL IMAGES

X DEVELOPMENT LLC, Mount...

1. A method implemented using one or more processors, comprising:obtaining a first temporal sequence of high-elevation digital images,
wherein the first temporal sequence of high-elevation digital images capture a geographic area at a first temporal frequency,
wherein each high-elevation digital image of the first temporal sequence comprises a plurality of pixels that align spatially with a respective first plurality of geographic units of the geographic area, and
wherein each high-elevation digital image of the first temporal sequence is captured at a first spatial resolution;
obtaining a second temporal sequence of high-elevation digital images,
wherein the second temporal sequence of high-elevation digital images capture the geographic area at a second temporal frequency that is less than the first temporal frequency,
wherein each high-elevation digital image of the second temporal sequence comprises a plurality of pixels that align spatially with a second plurality of geographic units of the geographic area, and
wherein each high-elevation digital image of the second temporal sequence is captured at a second spatial resolution that is greater than the first spatial resolution;
generating a mapping of the pixels of the high-elevation digital images of the second temporal sequence to respective sub-pixels of the first temporal sequence, wherein the mapping is based on spatial alignment of the geographic units of the second plurality of geographic units that underlie the pixels of the second temporal sequence with portions of the geographic units of the first plurality of geographic units that underlie the respective sub-pixels;
selecting a point in time for which a synthetic high-elevation digital image of the geographic area at the second spatial resolution will be generated;
selecting, as a low-resolution reference digital image, the high-elevation digital image from the first temporal sequence that was captured in closest temporal proximity to the point in time;
determining a first deviation of ground-truth data forming the low-resolution reference digital image from corresponding data interpolated for the point in time from the first temporal sequence of high-elevation digital images;
predicting, based on the first deviation, a second deviation of data forming the synthetic high-elevation digital image from corresponding data interpolated for the point in time from the second temporal sequence of high-elevation digital images; and
generating the synthetic high-elevation digital image based on the mapping and the predicted second deviation.
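
Conceptually: interpolate both sequences to the chosen time, measure how far the nearest real low-resolution image deviates from its own interpolation, and apply a predicted version of that deviation to the interpolated high-resolution image. A heavily simplified sketch, where np.kron stands in for the pixel/sub-pixel mapping and the deviation is simply upscaled rather than learned:

```python
import numpy as np

def synthesize_high_res(low_res_seq, low_res_times, high_res_seq, high_res_times,
                        t, scale):
    """low_res_seq / high_res_seq: lists of 2-D arrays; *_times: capture times;
    t: point in time to synthesize; scale: spatial resolution ratio."""
    def interp(seq, times):
        i = np.searchsorted(times, t)
        i0, i1 = max(i - 1, 0), min(i, len(seq) - 1)
        if times[i1] == times[i0]:
            return seq[i0].astype(float)
        w = (t - times[i0]) / (times[i1] - times[i0])
        return (1 - w) * seq[i0] + w * seq[i1]

    ref_idx = int(np.argmin(np.abs(np.asarray(low_res_times) - t)))
    reference = low_res_seq[ref_idx].astype(float)            # low-resolution reference image
    first_deviation = reference - interp(low_res_seq, low_res_times)
    predicted_deviation = np.kron(first_deviation, np.ones((scale, scale)))
    return interp(high_res_seq, high_res_times) + predicted_deviation
```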

US Pat. No. 10,891,734

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND CELL ANALYSIS SYSTEM

Sony Corporation, Tokyo ...

1. An analyzing system comprising:a processor configured to:
calculate, for each time range, a feature value indicating a feature of an amount of movement in a target video image in which a target of analysis is imaged over time, wherein the feature is one of a mean value, a maximum value, a minimum value, a standard deviation, a variance, or a variation coefficient; and
superimpose the feature value for each time range on a respective frame of the target video image, wherein the feature value is visualized by applying shading in accordance with a magnitude of the feature value such that the feature value is superimposed as the shading on the target video image to form a feature value-displaying video image.
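
A minimal sketch of both steps, with the per-frame amount of movement assumed to be precomputed (for example as mean absolute frame difference) and shading modeled as a proportional darkening of the frame:

```python
import numpy as np

def movement_feature_per_range(motion_amounts, frames_per_range, stat=np.mean):
    """Split per-frame movement amounts into time ranges and compute the chosen
    statistic (mean, max, min, std, var, ...) for each range."""
    n_ranges = len(motion_amounts) // frames_per_range
    return [stat(motion_amounts[i * frames_per_range:(i + 1) * frames_per_range])
            for i in range(n_ranges)]

def superimpose_shading(frame, feature_value, max_value):
    """Darken the frame in proportion to the feature value so the shading itself
    visualizes the magnitude (one simple way to 'apply shading')."""
    shade = 1.0 - 0.5 * (feature_value / max_value)   # 1.0 = no shading
    return (frame.astype(float) * shade).astype(frame.dtype)
```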

US Pat. No. 10,891,733

RADIOGRAPHING SYSTEM, RADIOGRAPHING METHOD, CONTROL APPARATUS, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

12. A radiographing system comprising:an image acquisition unit configured to acquire a radiographic image based on radiation transmitted through a subject;
an image processing setting unit configured to set plural types of image processing to a specific imaging procedure;
a determination unit configured to determine whether an imaging procedure included in a subject examination order matches the specific imaging procedure; and
an image processing unit configured to execute the set plural types of image processing on the acquired radiographic image to generate a plurality of radiographic images in a case where the imaging procedure included in the subject examination order matches the specific imaging procedure.

US Pat. No. 10,891,732

DYNAMIC IMAGE PROCESSING SYSTEM

KONICA MINOLTA, INC., To...

1. A dynamic image processing system comprising a hardware processor that extracts a heart region from a chest dynamic image which is obtained by radiation imaging of a dynamic state at a chest, extracts a density waveform for each pixel in the extracted heart region, determines an extraction target candidate region of blood flow information based on the extracted density waveform for each pixel, and sets an extraction target region of the blood flow information in the determined extraction target candidate region of the blood flow information, wherein the hardware processor calculates a feature amount of the extracted density waveform for each pixel, and determines the extraction target candidate region of the blood flow information based on the calculated feature amount for each pixel.

US Pat. No. 10,891,731

SYSTEMS AND METHODS FOR PRE-PROCESSING ANATOMICAL IMAGES FOR FEEDING INTO A CLASSIFICATION NEURAL NETWORK

Zebra Medical Vision Ltd....

1. A system for prioritizing patients for treatment for an acute medical condition requiring early and rapid treatment thereof based on a created triage list of inference anatomical images likely depicting a visual finding type indicative of the acute medical condition, comprising:at least one hardware processor executing a code for:
feeding each one of a plurality of inference anatomical images into a visual filter neural network for inference of each one of the plurality of inference anatomical images by outputting a classification category indicative of a target body region depicted at a target sensor orientation and a rotation relative to a baseline;
rejecting a sub-set of the plurality of inference anatomical images classified into another classification category;
rotating to the baseline a remaining sub-set of the plurality of inference anatomical images classified as rotated relative to the baseline by the visual filter neural network;
identifying pixels for each respective image of the plurality of inference anatomical images having outlier pixel intensity values denoting an injection of content injected into the plurality of inference anatomical images after capture of the plurality of inference anatomical images;
adjusting the outlier pixel intensity values of the identified pixels denoting the injection of content injected into the plurality of inference anatomical images after capture of the plurality of inference anatomical images to values computed as a function of non-outlier pixel intensity values;
feeding each one of the remaining sub-set of the plurality of inference anatomical images with adjusted outlier pixel intensity values of the identified pixels denoting the injection of content injected into the plurality of inference anatomical images after capture of the plurality of inference anatomical images into a classification neural network for inference of the plurality of inference anatomical images by detecting the visual finding type; and
generating instructions for creating a triage list for which the classification neural network detected the indication for a plurality of patients,
wherein the plurality of patients likely suffering from the acute medical condition denoted by the indication are selected for early and rapid treatment thereof based on the triage list created based on the feeding the plurality of inference anatomical images into the visual filter neural network, the rejecting, the rotating, the identifying pixels, the adjusting, and the feeding inference anatomical images into the classification neural network.
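
The injected-content step looks for pixels whose intensities are statistical outliers (burned-in annotations added after capture) and replaces them with a value computed from the non-outlier pixels. A minimal sketch; the z-score rule and the median replacement are assumptions, since the claim only requires "a function of non-outlier pixel intensity values":

```python
import numpy as np

def adjust_injected_content(image, z_thresh=4.0):
    """Identify outlier pixel intensities and adjust them to the median of the
    remaining (non-outlier) pixels."""
    img = image.astype(float)
    z = np.abs(img - img.mean()) / (img.std() + 1e-9)
    outliers = z > z_thresh
    img[outliers] = np.median(img[~outliers])
    return img, outliers
```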

US Pat. No. 10,891,730

FIBER PATTERN REMOVAL AND IMAGE RECONSTRUCTION ENDOSCOPIC DEVICES AND RELATED METHODS

University of Southern Ca...

1. A computer-implemented method for image reconstruction, the method being executed using one or more processors and comprising:receiving, by the one or more processors, an image acquired by an endoscopic system comprising an optical fiber bundle comprising a plurality of optical fibers, wherein each of the plurality of optical fibers is surrounded by cladding;
determining, by the one or more processors, in the image a plurality of fiber core locations corresponding to the plurality of optical fibers in the optical fiber bundle;
reconstructing, by the one or more processors, missing information from the image using interpolation performed in accordance with the plurality of fiber core locations, wherein the missing information corresponds to artifacts in the acquired image that result from the cladding;
providing, by the one or more processors, a fiber-pattern removed image in which the artifacts in the acquired image are removed using the missing information; and
performing, by the one or more processors, spectral correction for the fiber-pattern removed image by correcting spectral distortions associated with the endoscopic system.
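
Reconstruction here is interpolation of the intensities sampled at the fiber core locations across the cladding gaps. A minimal sketch using scipy.interpolate.griddata (the linear-plus-nearest fill strategy is an assumption):

```python
import numpy as np
from scipy.interpolate import griddata

def remove_fiber_pattern(image, core_locations):
    """Interpolate a fiber-pattern-removed image from the core samples.

    image          : 2-D grayscale endoscopic frame
    core_locations : integer (N, 2) array of (row, col) fiber core positions
    """
    rows, cols = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    core_values = image[core_locations[:, 0], core_locations[:, 1]]
    # Linear interpolation between cores; nearest-neighbour fill at the border.
    filled = griddata(core_locations, core_values, (rows, cols), method='linear')
    nearest = griddata(core_locations, core_values, (rows, cols), method='nearest')
    filled[np.isnan(filled)] = nearest[np.isnan(filled)]
    return filled
```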

US Pat. No. 10,891,729

AUTOMATED METHODS FOR THE OBJECTIVE QUANTIFICATION OF RETINAL CHARACTERISTICS BY RETINAL REGION AND DIAGNOSIS OF RETINAL PATHOLOGY

University of Louisville ...

1. A method of quantifying a retinal characteristic from retinal OCT image data, the method comprising:(a) obtaining a retinal spectral domain-OCT image from a subject;
(b) segmenting the OCT image into retinal regions according to a retinal OCT segmentation algorithm, said regions comprising at least the vitreous and the retinal pigment epithelium (RPE), and one or more regions selected from the nerve fiber layer (NFL), ganglion cell layer (GCL), inner plexiform layer (IPL), internal limiting membrane (ILM), inner nuclear layer (INL), outer plexiform layer (OPL), outer nuclear layer (ONL), external limiting membrane (ELM), inner segments (IS), inner/outer segment junction (IS/OS), and outer segments of the photoreceptors (OSP);
(c) calculating a mean raw characteristic value within each region;
(d) assigning the vitreous region mean raw characteristic value equal to zero normalized characteristic units, and assigning the RPE region mean raw characteristic level equal to X normalized characteristic units;
(e) converting the calculated mean raw characteristic values for each region based on an offset and uniform scale to yield a control normalized characteristic unit scale from 0 to X; and
(f) quantifying the characteristic according to the normalized characteristic unit scale.
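
Steps (d)-(f) define a two-point normalization: the vitreous mean maps to 0 and the RPE mean maps to X via one offset and one uniform scale. A minimal sketch with X = 100 and illustrative region means:

```python
def normalize_characteristic(region_means, X=100.0):
    """Convert mean raw characteristic values to the normalized scale on which
    the vitreous mean is 0 and the RPE mean is X (X = 100 is an assumed choice)."""
    vit = region_means["vitreous"]
    rpe = region_means["RPE"]
    scale = X / (rpe - vit)
    return {name: (value - vit) * scale for name, value in region_means.items()}

# Example with illustrative reflectivity-like values
print(normalize_characteristic({"vitreous": 12.0, "RPE": 68.0, "ONL": 30.0}))
# vitreous -> 0, RPE -> 100, ONL -> ~32
```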

US Pat. No. 10,891,728

METHOD AND SYSTEM FOR IDENTIFYING AN ISTHMUS IN A THREE-DIMENSIONAL MAP

UNIVERSITE DE LORRAINE, ...

1. A method for identifying an isthmus in a three-dimensional mapping of a cardiac cavity from surface electrocardiograms (ECGs), excluding ventricular tachycardia, obtained for each stimulated point of a plurality of stimulated points of the cardiac activity, by a processing unit configured to carry out the following steps:a) determining a correlation coefficient for each pair of the stimulated points by comparing the surface electrocardiogram (ECG) associated with each stimulated point of the pair of the stimulated points;
b) identifying a watershed line based on the correlation coefficients associated with the pair of stimulated points and 3D coordinates of the stimulated points obtained from 3D mapping of the cardiac cavity; and
c) determining the isthmus in the cardiac cavity that is transverse to the watershed line.
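
Step a) is a pairwise correlation of the surface ECGs recorded for the stimulated points; low correlations between neighbouring points, combined with their 3-D coordinates, trace the watershed line in step b). A minimal sketch of the correlation step:

```python
import numpy as np
from itertools import combinations

def pairwise_ecg_correlation(ecgs):
    """Correlation coefficient for every pair of stimulated points.

    ecgs: dict mapping a stimulated-point id to its surface ECG trace
    (1-D arrays of equal length).  Returns {(id_a, id_b): r}.
    """
    return {(a, b): float(np.corrcoef(ecgs[a], ecgs[b])[0, 1])
            for a, b in combinations(sorted(ecgs), 2)}
```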

US Pat. No. 10,891,727

AUTOMATED AIRFIELD GROUND LIGHTING INSPECTION SYSTEM

Airport Authority, Lanta...

1. A method of assessing the condition of one or more lights in an airfield ground lighting system, the method comprising:capturing by an image acquisition means an image stream of the light of the airfield ground lighting system by moving a housing having the image acquisition means disposed therein across said light;
detecting the location information of the image acquisition means whilst capturing the plurality of images comprised in the image stream by a location sensor coupled to the image acquisition means;
processing the image stream of a light of the airfield ground lighting system by a processor coupled to the image acquisition means by:
(a) associating characteristics of a plurality of points in an image with an item to be checked in the image of the light to be checked, and
(b) extracting the points from the images of the image stream wherein said extraction is by:
(i) analysing a plurality of randomly selected pairs of images from the images of the image stream to determine a plurality of tentative reference locations, one for each of the pairs of images for each extracted point relative to a three dimensional coordinate frame of the light;
(ii) assessing the tentative reference locations determined for each extracted point, to determine a reference location for each extracted point;
(c) projecting each extracted point into the images of the image stream based upon the determined reference location of said each extracted point and location information of the image acquisition means for each image;
(d) analysing the images of the image stream by comparing the locations in the images of the extracted points and the corresponding projected points and calculating the proximity therebetween;
(e) verifying existence in an image of points of the item to be checked in the image stream of the light being checked by comparing the calculated proximity against a threshold value;
(f) repeating steps (b) to (e) to verify existence of each point in the plurality of points associated with each item to be checked in the light being checked; and
(g) determining the state of the item to be checked based upon analysis of verified points.
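
Steps (c)-(e) project each determined reference location back into the images and verify a point when it lands close enough to its extracted counterpart. A minimal sketch under an assumed pinhole model, with camera poses from the location sensor and intrinsics K as hypothetical inputs:

```python
import numpy as np

def project_point(point_3d, camera_pose, K):
    """Pinhole projection of a 3-D reference location into an image, given the
    camera pose (R, t) from the location sensor and intrinsic matrix K."""
    R, t = camera_pose
    p_cam = R @ point_3d + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

def verify_item_points(ref_points, extracted_points, camera_poses, K, threshold):
    """Project each reference location into every image, compare it with the
    corresponding extracted point and verify the point when the mean proximity
    is below the threshold."""
    verified = []
    for ref, extracted_per_image in zip(ref_points, extracted_points):
        d = [np.linalg.norm(project_point(ref, pose, K) - np.asarray(obs))
             for pose, obs in zip(camera_poses, extracted_per_image)]
        verified.append(np.mean(d) < threshold)
    return verified
```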

US Pat. No. 10,891,726

SEAL TAMPER DETECTION

Authentic Vision GmbH, S...

1. A method for confirming an integrity of a tamper-evident seal, comprising:capturing at least one image of the seal;
determining a first parameter value of a first parameter describing physical structure of the seal from the at least one captured image;
comparing the determined first parameter value with a pre-determined first reference parameter value;
confirming the integrity of the seal when the compared parameter values match;
wherein the first parameter describes physical structure of a predetermined breaking point provided in the seal,
wherein before capturing an image of the seal, the predetermined breaking point is produced in the seal according to a predefined value of the first parameter describing physical structure of the predetermined breaking point,
wherein the predefined value is stored as the first reference parameter value for comparison with the determined first parameter value describing the physical structure of a predetermined breaking point.