US Pat. No. 10,971,054

DISPLAY PANEL

WUHAN CHINA STAR OPTOELEC...

1. A display panel, comprising:at least two pixel repeating units arranged in an array, the pixel repeating units comprising a first pixel, a second pixel, and a third pixel;
wherein the first pixel, the second pixel, and the third pixel are ones of red, green, and blue pixels; the first pixel, the second pixel, and the third pixel are different from each other;
wherein a ratio y1 of an aperture ratio of the red pixel to an aperture ratio of the green pixel is in the range of 0.78e^(−1.98r) ≤ y1 ≤ 2.297e^(−1.85r), and 0.1 ≤ y1 ≤ 3, where r is a ratio of a luminous efficiency of the red pixel to a luminous efficiency of the green pixel; and
wherein a ratio y2 of an aperture ratio of the blue pixel to the aperture ratio of the green pixel is in a range of 1.32e^(−10.7b) ≤ y2 ≤ 5.95e^(−14.1b), and 0.3 ≤ y2 ≤ 4, where b is a ratio of a luminous efficiency of the blue pixel to the luminous efficiency of the green pixel.
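
The claimed envelopes are simple exponential bounds and can be checked numerically. A minimal sketch follows, assuming hypothetical luminous-efficiency ratios r and b and hypothetical aperture-ratio values y1 and y2 (none of the sample numbers come from the patent):

```python
import math

def y1_in_range(y1, r):
    """Claimed bounds: 0.78*e^(-1.98r) <= y1 <= 2.297*e^(-1.85r) and 0.1 <= y1 <= 3."""
    lower = 0.78 * math.exp(-1.98 * r)
    upper = 2.297 * math.exp(-1.85 * r)
    return lower <= y1 <= upper and 0.1 <= y1 <= 3

def y2_in_range(y2, b):
    """Claimed bounds: 1.32*e^(-10.7b) <= y2 <= 5.95*e^(-14.1b) and 0.3 <= y2 <= 4."""
    lower = 1.32 * math.exp(-10.7 * b)
    upper = 5.95 * math.exp(-14.1 * b)
    return lower <= y2 <= upper and 0.3 <= y2 <= 4

# Hypothetical luminous-efficiency ratios and aperture ratios, for illustration only.
print(y1_in_range(y1=1.0, r=0.3))   # red/green aperture ratio check
print(y2_in_range(y2=2.0, b=0.05))  # blue/green aperture ratio check
```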

US Pat. No. 10,971,053

ELECTRONIC DEVICE FOR CHANGING CHARACTERISTICS OF DISPLAY ACCORDING TO EXTERNAL LIGHT AND METHOD THEREFOR

Samsung Electronics Co., ...

1. An electronic device comprising:a display;
a sensor;
a memory; and
at least one processor operably connected to the display, the sensor, and the memory,
wherein the at least one processor is configured to:
in response to identifying that the electronic device is being worn by a user, identify first information regarding external light directed to the electronic device by using the sensor,
based on the first information regarding external light and second information regarding the user, acquire first frame data,
in response to acquiring the first frame data, control the display to display the first frame data,
identify second frame data distinguished from the first frame data from an application stored in the memory while the first frame data is displayed on the display,
in response to identifying the second frame data, adjust a color of at least one of multiple pixels included in the second frame data at least partially based on the first frame data, and
control the display based on at least one of the first frame data or the adjusted second frame data.

US Pat. No. 10,971,052

DRIVING METHOD AND DRIVING DEVICE FOR DISPLAY PANEL, AND DISPLAY DEVICE

BOE TECHNOLOGY GROUP CO.,...

1. A driving method for a display panel, wherein each pixel of the display panel comprises at least two primary-color sub-pixels of different colors and one mixed-color sub-pixel, and the driving method comprises:determining display power consumption according to an obtained brightness value of each of the primary-color sub-pixels;
compensating for the brightness value of each of the primary-color sub-pixels according to the display power consumption;
determining an output brightness value of each of the sub-pixels according to the compensated brightness value of each of the primary-color sub-pixels and a color coordinate of each of the sub-pixels; and
outputting the output brightness value of each of the sub-pixels to a source driving circuit, wherein the output brightness value of each of the sub-pixels is used by the source driving circuit to drive the display panel to display images; and
wherein determining the output brightness value of each of the sub-pixels according to the compensated brightness value of each of the primary-color sub-pixels and the color coordinate of each of the sub-pixels comprises:
determining a color mixing ratio corresponding to each of the primary-color sub-pixels according to a color coordinate of each of the primary-color sub-pixels and a color coordinate of the mixed-color sub-pixel, wherein the color mixing ratio corresponding to each of the primary-color sub-pixels refers to the proportion of light having a color of the primary-color sub-pixel in light emitted by the mixed-color sub-pixel;
calculating a ratio of the compensated brightness value of each of the primary-color sub-pixels to the color mixing ratio of the primary-color sub-pixel, to obtain a reference brightness value corresponding to each of the primary-color sub-pixels;
determining the minimum reference brightness value among the reference brightness values corresponding to the primary-color sub-pixels as an output brightness value of the mixed-color sub-pixel; and
determining an output brightness value of each of the primary-color sub-pixels according to the output brightness value of the mixed-color sub-pixel, wherein the output brightness value of each of the primary-color sub-pixels is a difference between the compensated brightness value of the primary-color sub-pixel and the brightness component of the primary-color sub-pixel, wherein the brightness component of the primary-color sub-pixel is a product of the output brightness value of the mixed-color sub-pixel and the color mixing ratio corresponding to the primary-color sub-pixel.
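
The last two limitations describe straightforward arithmetic: the reference values are the compensated primary brightnesses divided by their color mixing ratios, the mixed-color output is the minimum reference value, and each primary output is its compensated value minus the mixed-color output weighted by that primary's mixing ratio. A minimal sketch, with illustrative variable names and sample values not taken from the patent:

```python
def output_brightness(compensated, mixing_ratio):
    """compensated, mixing_ratio: dicts keyed by primary color (e.g. 'R', 'G', 'B').
    Returns (mixed_output, primary_outputs) per the claimed steps."""
    # Reference brightness per primary: compensated value / color mixing ratio.
    reference = {c: compensated[c] / mixing_ratio[c] for c in compensated}
    # Output of the mixed-color sub-pixel: minimum reference brightness.
    mixed_output = min(reference.values())
    # Each primary output: compensated value minus its brightness component
    # (mixed-color output times that primary's mixing ratio).
    primary_outputs = {c: compensated[c] - mixed_output * mixing_ratio[c] for c in compensated}
    return mixed_output, primary_outputs

# Illustrative values only.
mixed, primaries = output_brightness(
    compensated={'R': 120.0, 'G': 200.0, 'B': 80.0},
    mixing_ratio={'R': 0.3, 'G': 0.6, 'B': 0.1},
)
print(mixed, primaries)
```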

US Pat. No. 10,971,050

METHOD FOR DRIVING A DISPLAY PANEL TO DISPLAY IMAGE, DISPLAY APPARATUS THEREOF, AND DRIVER ENABLED TO PERFORM THE METHOD

BOE TECHNOLOGY GROUP CO.,...

10. A driver for controlling image display operation of a display panel comprising at least a processor including a computer readable storage media, the computer readable storage media being configured to store computer-executable instructions, the processor being configured to execute the computer-executable instructions for controlling a display panel for displaying images, the computer-executable instructions comprising:a first set of instructions for determining multiple subpixels in a bright area of an image to be displayed by the display panel, wherein the bright area is an image display area having a luminance value maintained greater than a threshold luminance value for a duration longer than a threshold duration; and
a second set of instructions for driving the multiple subpixels to emit light alternatingly in a period of alternate light-emission for displaying the image and driving at least one adjacent subpixel surrounding a first subpixel that is not emitting light during the period of alternate light-emission to provide luminance rendering to the first subpixel such that the luminance value of the bright area is greater than the threshold luminance value;
wherein the second set of instructions comprises a first subset of instructions for determining the first subpixel and second subpixels among the multiple subpixels in the bright area corresponding to each period of alternate light-emission; and
a second subset of instructions for controlling the first subpixel to not emit light and the second subpixels to emit light in each period of alternate light-emission so that luminance values of the second subpixels are increased to provide the luminance rendering to the first subpixel without emitting light;
wherein the second subpixels comprise all subpixels immediately adjacent to the first subpixel;
the first subpixel is a subpixel configured to emit light of a first color;
at least one of the subpixels immediately adjacent to the first subpixel is configured to emit light of a second color; and
the first color and the second color are different.

US Pat. No. 10,971,049

DISPLAY DEVICE, INTERFACE UNIT AND DISPLAY SYSTEM

LG Display Co., Ltd., Se...

1. A display device, comprising:a connector unit including a plurality of receiving electrodes configured to receive an input signal from a plurality of transmission electrodes of an interface unit having a flat plate shape;
a signal processor which determines an arrangement of an input data included in the input signal; and
a display output portion which performs an image output processing according to the arrangement of the input data determined by the signal processor,
wherein the plurality of receiving electrodes are disposed to face the plurality of transmission electrodes in each of a plurality of connection directions defined by in-plane rotation angles of the interface unit, in a predetermined region which the interface unit opposes, and
wherein the signal processor determines the arrangement of the input data according to the connection direction of the interface unit.

US Pat. No. 10,971,048

SIGNAL TRANSMISSION METHOD, TRANSMITTING UNIT, RECEIVING UNIT AND DISPLAY DEVICE

BEIJING BOE DISPLAY TECHN...

1. A method for transmitting a signal in a display device, the display device comprising a timing controller and a source driver, the method being applied to any one of a plurality of transmitting units of the timing controller, the plurality of transmitting units corresponding to a plurality of receiving units of the source driver in a one-to-one relationship, the method comprising:obtaining a scrambled signal by scrambling, via a scrambler in a transmitting unit, a non-identification signal in a signal to be transmitted, the scrambled signal comprising an identification signal and a scrambled non-identification signal; and
transmitting the scrambled signal to a corresponding receiving unit,
wherein a signal obtained by scrambling a signal X via the scrambler is X^16+X^5+X^4+X^3+1, X^24+X^4+X^3+X+1 or X^32+X^7+X^5+X^3+X^2+X+1.
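
The recited expressions are polynomials in the signal X of the kind commonly realized with linear-feedback shift registers. As an illustration only, the sketch below applies an additive scrambler driven by a Fibonacci LFSR whose taps follow X^16+X^5+X^4+X^3+1; the architecture, seed, bit ordering, and handling of identification bits are assumptions, since the claim specifies only the polynomials:

```python
def lfsr_stream(taps, degree, seed, nbits):
    """Fibonacci LFSR keystream. taps: exponents of the generator polynomial
    (excluding the constant 1); degree: polynomial degree; seed: nonzero initial state."""
    state = seed & ((1 << degree) - 1)
    out = []
    for _ in range(nbits):
        out.append(state & 1)
        # Feedback is the XOR of the state bits selected by the polynomial taps.
        fb = 0
        for t in taps:
            fb ^= (state >> (degree - t)) & 1
        state = (state >> 1) | (fb << (degree - 1))
    return out

def scramble(bits, taps=(16, 5, 4, 3), degree=16, seed=0xFFFF):
    """Additive scrambler: XOR data bits with the LFSR keystream (identification
    bits would be left unscrambled per the claim; only payload bits are passed here)."""
    key = lfsr_stream(taps, degree, seed, len(bits))
    return [b ^ k for b, k in zip(bits, key)]

data = [1, 0, 1, 1, 0, 0, 1, 0]    # illustrative non-identification bits
scrambled = scramble(data)
descrambled = scramble(scrambled)  # XOR with the same keystream restores the data
assert descrambled == data
print(scrambled)
```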

US Pat. No. 10,971,047

DEVICE SUBSTRATE

Au Optronics Corporation,...

7. A device substrate, comprising:a substrate;
a reset signal line, a first voltage signal line, a second voltage signal line and a plurality of high-frequency clock signal lines, located on the substrate; and
a 1st-stage driver unit to an nth-stage driver unit, located on the substrate, wherein n is a positive integer, and each of the 1st-stage driver unit to the nth-stage driver unit comprises:
a first start signal line;
a pulldown element, wherein a gate of the pulldown element is electrically connected to the first start signal line, and a source of the pulldown element is electrically connected to the first voltage signal line;
a reset element, wherein a gate of the reset element is electrically connected to the reset signal line, and a source of the reset element is electrically connected to the second voltage signal line; and
an output element, wherein a gate of the output element is electrically connected to a drain of the pulldown element and a drain of the reset element, a source of the output element is electrically connected to the corresponding high-frequency clock signal line, and a drain of the output element is used for outputting a corresponding gate driving signal; wherein
the first start signal line of the nth-stage driver unit is electrically connected to the reset signal line.

US Pat. No. 10,971,046

DISPLAY DEVICE, POWER SUPPLY DEVICE FOR DISPLAY DEVICE, AND DRIVING METHOD OF DISPLAY DEVICE

SAMSUNG DISPLAY CO., LTD....

1. A display device comprising:a display unit including a plurality of pixels;
a scan driver configured to apply a scan signal to a plurality of scan lines connected to the plurality of pixels;
a data driver configured to apply a data signal to a plurality of data lines connected to the plurality of pixels; and
a power supply unit configured to supply a driving voltage to at least one among the display unit, the scan driver, and the data driver,
wherein the power supply unit includes:
an inductor connected between an input terminal to which an input voltage is input and a driving voltage output terminal to which the driving voltage is output;
a switch connected between the inductor and a ground; and
a switch controller configured to output a first ramp pulse having a first frequency at a first load of the display device and output a second ramp pulse having a second frequency at a second load of the display device to control a switching operation of the switch,
wherein the first frequency is lower than the second frequency,
the first load is greater than the second load, and
the switch controller includes a pulse generator configured to receive a current flowing through the switch and output a ramp pulse having a frequency corresponding to the received current.

US Pat. No. 10,971,045

DISPLAY APPARATUS

Au Optronics Corporation,...

1. A display apparatus, comprising:a pixel array, comprising a plurality of display rows, each of the display rows comprising a plurality of pixel circuits, each of the pixel circuits comprising a first transistor and a second transistor coupled in series between a data line and a display pixel, wherein the second transistor is connected between the data line and the first transistor;
a plurality of gate lines, wherein a control end of the first transistor of each of the pixel circuits is coupled to one of the plurality of gate lines, and receives a gate driving signal; and
a plurality of de-loading lines, wherein a control end of the second transistor of each of the pixel circuits is coupled to one of the de-loading lines for receiving a de-loading signal,
wherein an enable time period of the de-loading signal received by each of the pixel circuits overlaps with an enable time period of the gate driving signal received by each of the pixel circuits.

US Pat. No. 10,971,044

METHODS AND SYSTEMS FOR MEASURING ELECTRONIC VISUAL DISPLAYS USING FRACTIONAL PIXELS

Radiant Vision Systems, L...

1. A method in a computing system having an image capture device for measuring one or more characteristics of a region of interest (ROI) in an image captured by the image capture device, the image having an array of a number of pixels containing information about an electronic visual display that is separate from the image capture device, the method comprising:determining a fractional pixel location of a center of the ROI and at least one dimension of the ROI in pixels;
determining a pixel location of bounds of the ROI based on the fractional pixel location of the center and the at least one dimension of the ROI;
measuring, for each pixel located wholly within the bounds, a first image characteristic of the pixel;
measuring, for each pixel partially within and partially outside the bounds, a second image characteristic of the pixel;
scaling each second image characteristic; and
determining an overall image characteristic for the ROI based at least on a summation of the first image characteristics and the scaled second image characteristics.
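
The measurement steps can be sketched as a weighted sum over the ROI. The sketch below assumes a grayscale image, axis-aligned rectangular bounds, per-pixel luminance as the measured characteristic, and a scaling factor equal to each boundary pixel's area fraction inside the bounds; none of these choices are fixed by the claim:

```python
def roi_measurement(image, center, size):
    """image: 2D list of luminance values; center: (row, col) fractional pixel location;
    size: (height, width) of the ROI in pixels. Returns the summed ROI characteristic,
    scaling boundary pixels by their overlap fraction with the ROI bounds."""
    top = center[0] - size[0] / 2.0
    bottom = center[0] + size[0] / 2.0
    left = center[1] - size[1] / 2.0
    right = center[1] + size[1] / 2.0

    total = 0.0
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            # Overlap of the unit pixel [r, r+1) x [c, c+1) with the ROI bounds.
            dy = max(0.0, min(r + 1, bottom) - max(r, top))
            dx = max(0.0, min(c + 1, right) - max(c, left))
            coverage = dy * dx
            if coverage >= 1.0:
                total += value              # pixel wholly within the bounds
            elif coverage > 0.0:
                total += value * coverage   # partial pixel, scaled characteristic
    return total

# Illustrative 4x4 image, ROI centered at a fractional pixel location.
img = [[10, 10, 10, 10]] * 4
print(roi_measurement(img, center=(2.25, 2.25), size=(2, 2)))
```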

US Pat. No. 10,971,043

SYSTEM AND METHOD FOR EXTRACTING CORRELATION CURVES FOR AN ORGANIC LIGHT EMITTING DEVICE

Ignis Innovation Inc., W...

1. A method of determining the efficiency degradation of an organic light emitting device (OLED) in an array-based semiconductor device having an array of pixels that include OLEDs, said method comprising:subjecting OLEDs of a plurality of OLEDs of the array of pixels to a corresponding plurality of distinct stress conditions by electrically driving said OLEDs according to a corresponding plurality of predetermined distinct electrical stress conditions, the plurality of OLEDs being substantially similar to said OLED;
periodically, for each OLED of the plurality of OLEDs, measuring an electrical operating parameter of the OLED and measuring, with use of a photo sensor and an output of the photo sensor, an efficiency degradation of the OLED, generating measurements for the distinct stress condition corresponding to the OLED, said photo sensor being on the array-based semiconductor device and in or next to a pixel of the array of pixels including the OLED;
measuring a change in said electrical operating parameter of said OLED from a baseline of said electrical operating parameter;
determining a stress condition of said OLED; and
using said measurements for the distinct stress conditions, said measured change in said electrical operating parameter of said OLED, and said determined stress condition to determine the efficiency degradation of said OLED.

US Pat. No. 10,971,042

RELIABILITY TEST FIXTURE FOR FLEXIBLE DISPLAY COMPONENT AND ONLINE RELIABILITY TEST DEVICE FOR FLEXIBLE DISPLAY COMPONENT

Kunshan New Flat Panel Di...

1. A test fixture for testing reliability of a flexible display component, comprising:a bracket; and
a rotating shaft rotatably coupled to the bracket, the rotating shaft including:
a latching groove disposed along an axial direction on a surface of the rotating shaft and adapted to couple the flexible display component to the rotating shaft; and
a test module disposed in the rotating shaft and adapted to detect an electrical parameter of an internal circuit of the flexible display component, the test module having a test contact electrically coupled to the flexible display component in the rotating shaft.

US Pat. No. 10,971,041

ELECTRONIC DISPLAY DEVICE AND INFORMATION DISPLAY METHOD OF SAME

DOT INCORPORATION, Seoul...

1. An electronic display device comprising:a storage unit configured to store a plurality of display information;
a display signal generating unit configured to generate a first display signal corresponding to each of the plurality of display information and a second display signal different from the first display signal;
a display signal selecting unit configured to select the first display signal or the second display signal;
a display unit comprising a protrusion unit and configured to generate a display via the protrusion unit in accordance with the first display signal or the second display signal;
a driving unit connected to the protrusion unit and configured to drive the protrusion unit; and
a drive determining unit electrically connected to the driving unit and configured to operate at least a part of the driving unit in accordance with the first display signal or the second display signal,
wherein the protrusion unit comprises at least one protrusion,
wherein the driving unit comprises at least one driving module connected to the at least one protrusion,
wherein the drive determining unit further comprises a period determining unit configured to determine a projection period of the at least one protrusion,
wherein the protrusion unit comprises a plurality of protrusions separated from each other,
wherein the driving unit comprises a plurality of driving modules individually connected to at least one of the plurality of protrusions, and
wherein the drive determining unit further comprises a drive module selecting unit configured to operate at least one of the plurality of driving modules.

US Pat. No. 10,971,040

DISPLAY DEVICE AND METHOD FOR CONTROLLING DISPLAY DEVICE

SEIKO EPSON CORPORATION, ...

1. A display device comprising:a display unit having an electro-optical device and displaying an image via the electro-optical device;
an image pickup unit picking up the image displayed by the display unit and generating picked-up image data; and
a processor programmed to act as:
a decision unit deciding a first time period during which an elimination function to eliminate burn-in on the electro-optical device is to be executed, based on the picked-up image data; and
a processing unit executing the elimination function during the first time period decided by the decision unit, wherein:
the decision unit determines whether the burn-in on the electro-optical device is eliminated by the execution of the elimination function or not, and
when it is determined that the burn-in on the electro-optical device is eliminated, the processing unit ends the execution of the elimination function even before the first time period passes after the execution of the elimination function is started.

US Pat. No. 10,971,039

DISPLAY SYSTEM FOR GUITAR SOUND HOLE

1. A display system for use in a sound hole of a guitar, the display system comprising:a plurality of legs extending outwardly from a central point;
a base connected to the plurality of legs;
a top plate rotationally connected to the base; and
a weight incorporated into the top plate to create rotation;
wherein the plurality of legs are configured to engage with the guitar to secure the display system within the sound hole.

US Pat. No. 10,971,038

SIGNAGE SYSTEMS AND MERCHANDISING DISPLAY ASSEMBLIES

T.M. SHEA PRODUCTS, INC.,...

1. An overhead signage system comprising:first and second uprights of a gondola;
a frame for engaging the first and second uprights of the gondola, the frame including first and second laterally spaced apart side portions and a front portion connecting the first and second laterally spaced apart side portions; and
a sign holding member having a front panel attached to the front portion of the frame and first and second side panels attached to the first and second laterally spaced apart side portions of the frame, respectively,
wherein the sign holding member has a height greater than the frame, the frame is mounted to the sign holding member proximate a lower end of the sign holding member and the sign holding member upwardly extends from the frame.

US Pat. No. 10,971,037

SYNTHETIC KNEE JOINT APPARATUS AND RELATED EDUCATIONAL METHODS FOR CLINICAL KNEE JOINT EXAMINATIONS

Iowa State University Res...

1. An educational knee joint model, comprising:(a) a synthetic tibia comprising a tibial joint surface;
(b) a synthetic medial meniscus attached to the tibial joint surface;
(c) a synthetic lateral meniscus attached to the tibial joint surface; and
(d) a synthetic femur comprising a femoral joint surface, wherein the synthetic femur is positioned such that the femoral joint surface is positioned in contact with the synthetic medial meniscus and the synthetic lateral meniscus.

US Pat. No. 10,971,036

MEDICAL SIMULATION SYSTEM AND METHOD

Mentice-AB, Gothenburg (...

1. A medical simulation system comprising:a braking unit for applying friction to a medical instrument, wherein the braking unit comprises:
an electrical drive unit;
a drive gear arranged for cooperation with the electrical drive unit;
a curved braking arm having a first end and a second end which is opposite to the first end, wherein the first end of the braking arm is arranged for cooperation with the drive gear, and wherein a fulcrum of the braking arm is located on the second end of the braking arm opposite to the first end of the braking arm and drive wheel; and
a brake pad attached to the braking arm which engages the medical instrument.

US Pat. No. 10,971,035

TRAINING IN DISPENSING A MEDICAMENT

Teva UK Limited, West Yo...

1. A system comprising:a computer application residing on a mobile device;
a medicament-free device not containing medicament comprising:
a housing;
a wireless communication device;
an acoustic sensor configured to output acoustic data indicative of a user's inhalation;
an accelerometer configured to output movement data indicative of whether the medicament-free device was shaken before or during a user's action performed upon the medicament-free device;
an orientation sensor configured to output orientation data indicative of an orientation of the medicament-free device during the user's inhalation; and
a processor configured to receive the acoustic data, the movement data, and the orientation data, and cause the wireless communication device to transmit the acoustic data, the movement data, and the orientation data to the mobile device; and
wherein the computer application is configured to:
receive the acoustic data, the movement data, and the orientation data;
cause the mobile device to display an indication of whether the medicament-free device was shaken before or during the user's action based on the movement data;
cause the mobile device to display an indication of the user's inhalation based on the acoustic data; and
cause the mobile device to display an indication of orientation of the medicament-free device based on the orientation data.

US Pat. No. 10,971,034

DYNAMIC PARTITIONING OF A REFRESHABLE BRAILLE DISPLAY BASED ON PRESENCE OF ANCILLARY ALPHANUMERIC CONTENT

Freedom Scientific, Inc.,...

1. A method of dynamically partitioning a refreshable braille display having a plurality of braille cells to simultaneously display a primary alphanumeric content and an ancillary alphanumeric content associated therewith, the method comprising the steps of:outputting a first segment of the primary alphanumeric content onto the refreshable braille display, wherein the first segment of the primary alphanumeric content is output using the plurality of the braille cells of the refreshable braille display;
determining whether a second segment of the primary alphanumeric content has the ancillary alphanumeric content associated therewith;
responsive to determining that the second segment of the primary alphanumeric content has the ancillary alphanumeric content associated therewith, automatically partitioning the refreshable braille display into a first partition having a first set of the plurality of braille cells and a second partition having a second set of the plurality of braille cells;
outputting the second segment of the primary alphanumeric content using the first set of the plurality of braille cells, and outputting the ancillary alphanumeric content using the second set of the plurality of braille cells;
determining whether the ancillary alphanumeric content is associated with a third segment of the primary alphanumeric content, wherein the third segment of the primary alphanumeric content is subsequent to the second segment of the primary alphanumeric content;
responsive to determining that the ancillary alphanumeric content is not associated with the third segment of the primary alphanumeric content, allocating all braille cells of the plurality of braille cells of the refreshable braille display for outputting the third segment of the primary alphanumeric content; and
outputting the third segment of the primary alphanumeric content onto the refreshable braille display using the plurality of braille cells.
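
The partitioning decision reduces to a small per-segment routine. The sketch below is purely illustrative: the 40-cell display, the 50/50 split, and the sample segments are assumptions, since the claim fixes neither the cell count nor the partition ratio:

```python
def allocate_cells(total_cells, has_ancillary):
    """Return (primary_cells, ancillary_cells) for one segment of primary content.
    The display is partitioned only while ancillary content accompanies the segment."""
    if not has_ancillary:
        return total_cells, 0                 # all cells go to the primary content
    primary_cells = total_cells // 2          # assumed split; the claim fixes no ratio
    return primary_cells, total_cells - primary_cells

segments = [
    ("The committee met on", None),
    ("Thursday", "footnote 3"),               # segment with ancillary alphanumeric content
    ("to review the budget", None),
]
for text, ancillary in segments:
    primary, extra = allocate_cells(total_cells=40, has_ancillary=ancillary is not None)
    line = f"{primary} cells: {text!r}"
    if ancillary:
        line += f" | {extra} cells: {ancillary!r}"
    print(line)
```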

US Pat. No. 10,971,033

VISION ASSISTIVE DEVICE WITH EXTENDED DEPTH OF FIELD

Freedom Scientific, Inc.,...

1. A vision assistive device with an extended depth of field, the vision assistive device comprising:a housing having a base region, a top region, and an intermediate extent therebetween, the housing further including a front wall and a back wall, a screen positioned within the front wall;
an overhang formed at the top region of the housing, the overhang including a top surface, a bottom surface, opposing sides, and a forwardly facing peripheral extent, controls for the vision assistive device positioned within the top surface and the forwardly facing peripheral extent;
a single imaging unit positioned within the bottom surface of the overhang, the single imaging unit adapted to image an object positioned below the overhang, the distance between the single imaging unit and object constituting a focal length; and
a focus motor operatively connected to the single imaging unit, the focus motor functioning to selectively vary the focal length and take sequential images of the object at different focal lengths;
a fin extending from the back wall of the housing, wherein the fin has an upper extent at the top region of the housing and a lower extent at the base region of the housing, with the fin functioning to stabilize the device when the device is being used upon a desktop and further facilitating use of the device upon the lap of the user.

US Pat. No. 10,971,032

SYSTEMS AND METHODS FOR PROVIDING EXTENSIBLE ELECTRONIC LEARNING SYSTEMS

1. An extensible electronic learning system operated by an educational services provider comprising:at least one learning management system having a learning management processor and a learning management memory operatively coupled thereto, said processor programmed for providing at least one extensible integration module, each extensible integration module comprising:
i. a predefined vendor services interface configured for requesting vendor services from at least one vendor system, the vendor services comprising one or more electronic learning services and the predefined vendor services interface comprising at least one vendor services definition based on a parameter required for the at least one vendor services definition, and
ii. a vendor configuration upload component configured for receiving vendor configuration settings about the at least one vendor system, the vendor configuration upload component configured for receiving one or more vendor configuration settings from at least one vendor integration module of the at least one vendor system, wherein each vendor integration module comprises a corresponding predefined vendor services interface,
wherein the extensible integration module includes at least one or more configuration settings of the learning management system and one or more of the vendor configuration settings,
wherein the at least one vendor system is not the educational services provider;
wherein the one or more electronic learning services comprise an electronic tool for at least one of electronic live or prerecorded lecture webcasts, online presentations, electronic publications, generation of electronic course content, electronic chat rooms, online course management; and
wherein the learning management processor of the at least one learning management system is programmed to provide a learning management services module for performing at least one learning management service, the at least one learning management service comprising at least one of the one or more electronic learning services provided by the at least one vendor system upon request.

US Pat. No. 10,971,031

APPARATUS AND METHOD FOR IDENTIFYING FOOD NUTRITIONAL VALUES

Fitly Inc., Philadelphia...

1. A system for monitoring a food item being consumed by a user, comprising:a food container forming one or more partitions to receive and retain the food item;
a weight sensor placed within the food container, positioned to measure a weight of the food item being received by the one or more partitions;
one or more cameras, each of the one or more cameras being respectively positioned to identify a type and a preparation status of the food item, each of the one or more cameras capturing an image of the food item, the image comprising at least a portion of the food item; and
wherein the weight sensor and the one or more cameras are in communication with one or more processors, the one or more processors being configured to determine a nutritional value of the food item based on the type, the preparation status, and the weight, the type and the preparation status being identified by comparing properties of the food item identified from the captured image to reference properties corresponding to one of various types of food items stored in a storage unit, the preparation status indicating a preparation method of the food item, the one or more processors in communication with the storage unit.

US Pat. No. 10,971,030

REMOTE PHYSICAL TRAINING

INTERNATIONAL BUSINESS MA...

1. A computer-implemented method of performing remote physical training, the method comprising:receiving, using a remotely located processor system, movement of a controller held by an operator resulting from movements performed by the operator who is remotely located from the remotely located processor system;
presenting, using the remotely located processor system, the movement of the controller held by the operator as movement of an icon representing the controller in a virtual reality environment, and a path to be traversed by the icon in the virtual reality environment; and
providing real-time feedback on the movement to the operator, wherein the real-time feedback indicates how closely the movement of the icon follows the path to be traversed by the icon.

US Pat. No. 10,971,029

INFORMATION PROCESSING DEVICE, METHOD, AND STORAGE MEDIUM

Kabushiki Kaisha Toshiba,...

1. An information processing device comprising:a memory; and
a hardware processor in communication with the memory, the hardware processor configured to:
acquire a first motion data indicating a motion of a first operator, the first motion data comprising a first time series;
acquire a second motion data indicating a motion of a second operator, the second motion data comprising a second time series;
synchronize the first motion data and the second motion data by detecting sharp rising data edges that are common to the first and second time series and adjusting rising time positions;
compare the first motion data and the second motion data;
determine a similarity of the first motion data and the second motion data; and
present to the first operator instruction data indicating an improvement point relating to a motion at a time of performing a predetermined operation in accordance with a determination result,
wherein the first motion data and the second motion data are acquired from one or more motion sensors attached to one or more tools.

US Pat. No. 10,971,028

SYSTEMS AND METHODS FOR MUSIC AND MOVING IMAGE INTERACTION

1. A method of instructing a first player to play a music playing tool to create a repeating effect displayed in a timeline at any point along a length of a progress bar via a screen of a remote computing gadget, comprising:importing a visual portion archive into a folder of a remote computing gadget, wherein the visual portion archive comprises visual portion of a second player playing the music playing tool and sound portion of the second player playing the music playing tool, the first player resembling a student and second player resembling a teacher;
playing the visual portion archive with the remote computing gadget under a learning mode at regular speed, wherein
decreasing speed at various visual speed of the visual portion, wherein:
decreasing visual speed at a first location of the visual portion, the first location relative to the displayed timeline along the length of the progress bar via the screen of the remote computing gadget as instructed by the first player, a sound portion at the first location continues to remain in sound frequency; and
decreasing visual speed at a second location of the visual portion, the second location relative to the displayed timeline along the length of the progress bar via the screen of the remote computing gadget as instructed by the first player, a sound portion at the second location continues to remain in sound frequency;
playing the visual portion archive with the remote computing gadget under a practicing mode, wherein:
playing the visual portion archive at a first decreased visual speed at the first location of the visual portion without being instructed by the first player, the sound portion at the first location continues to remain in sound frequency;
playing the visual portion archive at a second decreased visual speed at the second location of the visual portion without being instructed by the first player, the sound portion at the second location continues to remain in sound frequency; and
playing of the visual portion archive at all locations other than the first and second locations at the regular speed.

US Pat. No. 10,971,027

SYSTEM AND METHOD FOR IMPROVING SAFETY WHEN OPERATING AIRCRAFT IN REDUCED- OR MODIFIED-VISIBILITY CONDITIONS

AT Systems, LLC, Carroll...

1. A method for training a pilot to operate an aircraft in sudden-onset reduced-visibility conditions, the method comprising:(a) providing a flight helmet comprising:
(i) a visor comprising an electrooptic material having an optical transmittance, wherein the electrooptic material is disposed to restrict the pilot's view outside the aircraft when the electrooptic material is in a low-optical-transmittance state;
(ii) a power supply connected to the electrooptic material;
(iii) a flight-safety sensor configured to generate a signal; and
(b) changing the output of the power supply to change the optical transmittance of the electrooptic material, wherein the changing step is performed without first informing the pilot that the changing step is to be performed at the moment it is performed;
(c) receiving a signal from the flight-safety sensor; and
(d) changing the position of the visor depending on the signal received from the flight-safety sensor.

US Pat. No. 10,971,026

METHOD FOR INTEGRATING EDUCATIONAL LEARNING INTO ENTERTAINMENT MEDIA

1. A computer-implemented method of integrating a succession of educational questions into ongoing entertainment media that is being presented to a student on a display, the method comprising the steps of:a. identifying a target process associated with an ongoing entertainment media;
b. injecting an educational library that includes the educational questions into the target process comprising the steps of:
allocating memory in the target process;
writing instructions in the allocated memory in the target process to load the educational library; and,
creating and executing a remote thread in the educational library to process the instructions in the allocated memory to render images of the educational questions;
c. receiving an image of a first educational question rendered by the remote thread created and executed by the injecting step;
d. overlaying the image of the first educational question rendered by the remote thread created and executed by the injecting step over the ongoing entertainment media, the first educational question overlay and the ongoing entertainment media being separately executed;
e. initially positioning the first educational question overlay in an unimportant area of the ongoing entertainment media, the unimportant area of the ongoing entertainment media being an area where the student would not naturally focus their attention;
f. providing the student with an available amount of time to correctly answer the first educational question; and,
g. moving the first educational question overlay from the unimportant area of the ongoing entertainment media to an important area of the ongoing entertainment media if the available amount of time has run out, the important area of the ongoing entertainment media being an area where the student would naturally focus their attention.

US Pat. No. 10,971,025

INFORMATION DISPLAY APPARATUS, INFORMATION DISPLAY TERMINAL, METHOD OF CONTROLLING INFORMATION DISPLAY APPARATUS, METHOD OF CONTROLLING INFORMATION DISPLAY TERMINAL, AND COMPUTER READABLE RECORDING MEDIUM

CASIO COMPUTER CO., LTD.,...

1. An information display apparatus comprising:a display; and
a processor, the processor being configured to:
display a text in a first display area of the display;
display, in a second display area of the display, a location indicator indicating a location of a portion of the text displayed in the first display area;
display, in a third display area of the display, the portion of the text at the location indicated by the location indicator in an enlarged manner such that a display size of the portion of the text in the third display area is larger than a display size of the portion of the text displayed in the first display area, wherein the processor updates the portion of the text displayed in the third display area in accordance with a user operation to move the location indicator to indicate a new location;
detect a user operation to specify and designate a previously undesignated character string in the portion of the text displayed in the third display area, and change a display state of the designated character string from an original display state in which the character string was displayed when it was undesignated to a changed display state to visually indicate that the character string has been designated by the user operation, wherein the processor is operable to detect the user operation to specify and designate any character string in the text displayed on the display such that the designated character string is not a character string that has been predetermined to be designated prior to detection of the user operation to specify and designate the character string;
determine, by referring to a memory in which a plurality of keywords are stored in advance, whether a keyword among the plurality of keywords stored in advance is included in a part of the designated character string;
if it is determined that the keyword is included in the part of the designated character string, change a display state of the keyword such that the keyword is visually distinguishable from another part of the character string, and generate problem setting data based on the keyword included in the designated character string; and
transmit the problem setting data to an external terminal,
wherein the processor is further configured to:
display an image in the third display area; and
in a case in which the keyword is included in the designated character string and a user operation is detected to designate a part of the image corresponding to the keyword, change a display state of the designated part of the image to visually indicate that the part has been designated and to generate the problem setting data based on the designated character string, the keyword, and the designated part of the image.

US Pat. No. 10,971,024

COMMUNICATION SYSTEM AND METHOD

Core Vocabulary Exchange ...

1. A communication system for assisting a user in conveying or exchanging a communication, the communication system comprising:a core vocabulary substrate comprising a first surface having a plurality of icons defined thereon and arranged in a predetermined arrangement, each of the plurality of icons representing one of a plurality of core vocabulary words;
a plurality of icon pieces removably attached to the surface of the core vocabulary substrate, wherein each of the plurality of icon pieces is removably attached to a matching one of the plurality of icons defined on the core vocabulary substrate;
a communication card landing pad extending from and interconnected with the core vocabulary substrate, the communication card landing pad configured to receive a communication card removably attached thereto; and
a communication card removably attached to the communication card landing pad, the communication card configured to receive one or more of the plurality of icon pieces removably attached thereto to convey or exchange a communication.

US Pat. No. 10,971,023

SYSTEMS AND METHODS FOR MULTI-CHANNEL REMOTE IDENTIFICATION OF AIRCRAFT

Kittyhawk.io, Inc., San ...

1. A method comprising:receiving identifying information about a first set of aircraft flying over airspace via a first detection channel that uses a first set of sensors distributed throughout the airspace;
receiving identifying information about a second set of aircraft flying over the airspace via a different second detection channel that uses a second set of sensors that are different than the first set of sensors and that are distributed throughout the airspace;
determining a plurality of aircraft flying in the airspace based on receiving the identifying information about the first set of aircraft and receiving the identifying information about the second set of aircraft;
assigning an identifier that differentiates each aircraft of the plurality of aircraft from other aircraft of the plurality of aircraft; and
providing the identifier for a particular aircraft of the plurality of aircraft at a position corresponding to a relative position of the particular aircraft that is determined from either the first set of sensors or the second set of sensors used to identify the particular aircraft, and different identifiers for other aircraft of the plurality of aircraft at positions corresponding to relative positions of the other aircraft that are determined from either the first set of sensors or the second set of sensors used to identify the other aircraft.
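
One way to picture the merge-and-tag flow is a small routine that folds reports from both detection channels into a single table keyed per aircraft. The data layout, the tail-number dedup key, and the uuid-derived identifier below are assumptions for illustration; the claim does not say how the two channels are reconciled:

```python
import uuid

def merge_channels(channel_a, channel_b):
    """channel_a / channel_b: lists of dicts with 'tail' (reported identity) and
    'position' from the first and second sets of sensors. Assigns a differentiating
    identifier per aircraft and keeps the position reported by whichever channel saw it."""
    aircraft = {}
    for source, reports in (("channel-1", channel_a), ("channel-2", channel_b)):
        for report in reports:
            key = report["tail"]                 # assumed dedup key across channels
            if key not in aircraft:
                aircraft[key] = {
                    "id": uuid.uuid4().hex[:8],  # identifier differentiating each aircraft
                    "position": report["position"],
                    "source": source,
                }
    return aircraft

a = [{"tail": "N123AB", "position": (37.77, -122.42)}]
b = [{"tail": "N123AB", "position": (37.77, -122.41)},
     {"tail": "DRONE-7", "position": (37.80, -122.40)}]
print(merge_channels(a, b))
```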

US Pat. No. 10,971,022

DRONE USER EQUIPMENT INDICATION

Qualcomm Incorporated, S...

1. A method for providing a drone user equipment (UE) indication, comprising:connecting to a wireless network by a UE having flight capabilities; and
transmitting, by the UE, a drone UE indicator to the wireless network to indicate that the UE is an unmanned aircraft system having the flight capabilities, wherein the drone UE indicator is transmitted to the wireless network as part of a Radio Resource Control (RRC) procedure, during a non-access stratum (NAS) attach procedure, or using Medium Access Control (MAC) signaling during a Random Access Procedure (RACH) when establishing the connection to the wireless network.

US Pat. No. 10,971,021

COMPREHENSIVE FLIGHT PLANNING TOOL FOR A MOBILE DEVICE

AIRCRAFT OWNERS AND PILOT...

1. A client device for communicating with a flight planning system, comprising:a computing system configured to execute software instructions, the computing system including:
a memory device storing software instructions;
a processor configured to execute the software instructions stored in the memory device to communicate with the flight planning system;
a flight planning application stored in the memory device, wherein the processor is configured to execute the flight planning application to access the flight planning system via a flight planning application program interface (API),
wherein accessing the flight planning system includes identifying a user and authorizing access to the flight planning system via the flight planning API when the flight planning application is executed;
a display device comprising interface hardware configured to display interface screens generated by the flight planning application;
wherein, when executed, the flight planning application is configured to:
access a server of the flight planning system, the server including a plurality of modules comprising a flight planning module, a weather module, and an airport module;
communicate, via the API, a request for information based on a selection of one of a plurality of tools corresponding to one of the plurality of modules, the selection received via the interface hardware;
receive, from the flight planning system via the API, an aviation flight plan generated using responsive information that fulfills the request, wherein the responsive information was generated by the flight planning system based on information from a third-party device; and
display the aviation flight plan via the interface hardware.

US Pat. No. 10,971,020

AIRCRAFT SYSTEM AND METHOD TO PROVIDE LINEAR MAP OF TRAFFIC

Rockwell Collins, Inc., ...

1. A system, comprising:a display configured to present images to a user; and
at least one processor communicatively coupled to the display, the at least one processor configured to:
receive aircraft traffic data and ownship data, the ownship data associated with an ownship and from the ownship, the ownship data including at least one of position information, speed information, trajectory information, or heading information associated with the ownship, the aircraft traffic data associated with aircraft traffic other than the ownship, the aircraft traffic data from the aircraft traffic, the aircraft traffic data including at least one of position information, speed information, trajectory information, or heading information associated with the aircraft traffic;
generate and update a linear map based at least on the aircraft traffic data and the ownship data; and
output the linear map as graphical data to the display;
wherein the display is configured to display the linear map to the user;
wherein the linear map depicts a one-dimensional relationship between the ownship and designated traffic of the aircraft traffic, wherein the linear map conveys a range between the ownship and the designated traffic and conveys a closure rate between the ownship and the designated traffic, wherein the linear map comprises:
a graphical threshold indicator, the threshold indicator representing a time or distance from a threshold to the designated traffic;
a graphical ownship indicator depicting a position of the ownship;
a graphical scale representing a time or distance displayed by the linear map; and
a graphical designated traffic indicator depicting a position of the designated traffic.

US Pat. No. 10,971,019

VEHICLE COMMUNICATION AND NAVIGATION SYSTEM

Honeywell International I...

1. A system comprising:a vehicle communication application hosted on a portable device, the vehicle communication application comprising instructions stored on a non-transitory processor readable medium and executable by a processor on the portable device such that the vehicle communication application is operative to:
generate a traffic control log-on screen of the portable device, wherein the traffic control log-on screen is operative by a user to access a datalink communications system;
generate a free text screen accessible by the user to input text messages on the portable device;
generate one or more standard traffic control message screens accessible by the user on the portable device;
receive a user input of a text message via the free text screen or a selection of a standard traffic control message via the one or more standard traffic control message screens, the user input confirming or requesting clearances and/or changes;
format a message based on the received user input in accordance with a regulatory standard communication compliant protocol; and
transmit the message using a standard IP interface to a datalink communication device, the datalink communication device transmitting the message to a ground center configured to receive, process, and re-transmit, to a traffic control center, messages received from the vehicle communication application that meet protocol requirements prior to transmission of the messages to the traffic control center, using the regulatory standard communication compliant protocol.

US Pat. No. 10,971,018

VEHICULAR PLATOON SUBSCRIPTION AND MANAGEMENT SYSTEM

Ford Global Technologies,...

1. A vehicle platoon system, comprising:a database configured to maintain a user profile and data descriptive of existing platoons, wherein the user profile includes a driver compliance score indicative of historical compliance with platoon rules;
a processor configured to
receive a platoon request indicating desire to join a platoon;
receive, in response to the request, the user profile defining at least one compliance threshold indicative of a minimum compliance score of other platoon participants;
determine whether all members of at least one of the existing platoons have a compliance score exceeding the compliance threshold; and
instruct an indication of the platoon to be presented for selection.

US Pat. No. 10,971,017

SENSOR FUSION AND INFORMATION SHARING USING INTER-VEHICLE COMMUNICATION

Cummins Inc., Columbus, ...

1. A method, comprising:determining information based on sensor data collected by each of two or more vehicles operating within a predetermined range of one another;
communicating the information from at least one of the two or more vehicles to at least one of a vehicle controller of the other two or more vehicles and a cloud computing control system;
comparing the information from the at least one of the two or more vehicles with the information from the other of the two or more vehicles;
diagnosing a sensor error of a sensor of the other of the two or more vehicles based on the comparison of the information; and
updating a sensor processing output for the other of the two or more vehicles based on the diagnosis to correct the sensor error of the sensor.
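
A minimal sketch of the compare-and-diagnose step follows, using a median consensus, a fixed tolerance, and correction by substitution; all three are illustrative assumptions, as the claim does not prescribe how the comparison or the update is performed:

```python
def diagnose_and_correct(readings, tolerance=2.0):
    """readings: dict of vehicle_id -> sensed value (e.g. ambient temperature) from
    vehicles operating within a predetermined range of one another. Flags a sensor
    whose reading deviates from the consensus by more than `tolerance` and returns
    a corrected value for its processing output."""
    consensus = sorted(readings.values())[len(readings) // 2]   # median as reference
    diagnosis = {}
    for vehicle, value in readings.items():
        if abs(value - consensus) > tolerance:
            diagnosis[vehicle] = {"fault": True, "corrected": consensus}
        else:
            diagnosis[vehicle] = {"fault": False, "corrected": value}
    return diagnosis

print(diagnose_and_correct({"truck_a": 21.4, "truck_b": 21.9, "truck_c": 35.0}))
```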

US Pat. No. 10,971,016

SYSTEM AND METHOD FOR IDENTIFYING A VEHICLE VIA AUDIO SIGNATURE MATCHING

BLUEOWL, LLC, San Franci...

1. A computer-implemented method comprising:receiving a set of raw audio signals associated with a vehicle at a feature extraction layer of a trained neural network;
generating a set of audio signal features from the set of raw audio signals by using the feature extraction layer of the trained neural network, the set of audio signal features representing audio data in which information private to an operator of the vehicle has been removed;
analyzing the set of audio signal features to identify a type of the vehicle by using a feature recognition layer of the trained neural network; and
determining a level of risk associated with the operator of the vehicle based upon the type of the vehicle.

US Pat. No. 10,971,015

ACCIDENT PRONE LOCATION NOTIFICATION SYSTEM AND METHOD

Nationwide Mutual Insuran...

1. A method comprising:maintaining vehicle accident occurrence information, wherein the vehicle accident occurrence information comprises information describing accidents including accident location information;
identifying, based at least in part on the accident location information, one or more accident-prone locations; and
in response to a user-initiated navigation query, the query comprising a request to identify alternate directions, displaying on a display component a visual representation of the one or more accident-prone locations and the alternate directions, wherein the alternate directions are based on avoiding the one or more accident-prone locations,
wherein the visual representation of the one or more accident-prone locations comprises an indicator superimposed on a map and, based on a selection of the indicator, the display is configured to indicate at least some of the vehicle accident occurrence information associated with the one or more accident-prone locations.

US Pat. No. 10,971,014

BOLLARD RECEIVER IDENTIFICATION

FORD GLOBAL TECHNOLOGIES,...

1. A method comprising:receiving perception data from a perception sensor of a vehicle;
calculating, based on the perception data, a location of a bollard receiver in relation to a body of the vehicle;
calculating a safety margin centered on the bollard receiver;
calculating a tangent from the body of the vehicle to an edge of the safety margin such that the tangent is drawn towards a side away from oncoming traffic; and
determining a driving maneuver for the vehicle based on the location of the bollard receiver, the safety margin, and the tangent.
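
The tangent step is a standard point-to-circle construction: the tangent direction deviates from the bearing toward the bollard receiver by asin(radius/distance). The sketch below uses hypothetical coordinates and a simple sign convention for the side away from oncoming traffic:

```python
import math

def tangent_heading(vehicle_point, bollard_center, margin_radius, traffic_side="left"):
    """Heading (radians) of the tangent from vehicle_point to the circle of radius
    margin_radius centered on the bollard receiver, chosen on the side away from
    oncoming traffic. Raises if the vehicle point is already inside the margin."""
    dx = bollard_center[0] - vehicle_point[0]
    dy = bollard_center[1] - vehicle_point[1]
    dist = math.hypot(dx, dy)
    if dist <= margin_radius:
        raise ValueError("vehicle point lies inside the safety margin")
    bearing_to_center = math.atan2(dy, dx)
    offset = math.asin(margin_radius / dist)   # half-angle subtended by the margin circle
    # Take the tangent on the side away from oncoming traffic (sign convention assumed).
    return bearing_to_center + offset if traffic_side == "left" else bearing_to_center - offset

# Illustrative values: bollard 10 m ahead and 1 m to the right, 0.8 m safety margin.
print(math.degrees(tangent_heading((0.0, 0.0), (10.0, -1.0), 0.8)))
```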

US Pat. No. 10,971,013

SYSTEMS AND METHODS FOR AUTOMATICALLY WARNING NEARBY VEHICLES OF POTENTIAL HAZARDS

Micron Technology, Inc., ...

1. A system comprising:one or more sensors configured to detect a potential safety hazard in or near a roadway;
a memory containing computer-readable instructions for generating a message indicating a location of the potential safety hazard and identifying one or more actions to be taken by a first vehicle for avoiding or mitigating a risk of collision with the potential safety hazard;
a processor configured to read the computer-readable instructions from the memory and generate the message; and
a transceiver to transmit the message to a second vehicle, and to receive, from the second vehicle, a second message including a request to the processor of the first vehicle to execute one or more alternative actions to avoid or mitigate the risk of collision, wherein the second message is based on a processor of the second vehicle identifying that actions to be taken by the second vehicle for avoiding or mitigating a risk of collision with the potential safety hazard and the first vehicle conflict with the one or more actions to be taken by the first vehicle.

US Pat. No. 10,971,012

DUAL MODE INDOOR PARKING DATA DELIVERY AND MAP INTEGRATION

HERE Global B.V., Eindho...

1. A method for vehicle parking navigation and communication, the method comprising:negotiating a first communication link between a mobile device and a parking server;
generating a parking map of a parking facility from first parking data received over the first communication link;
receiving a parking request including a parking facility identifier;
identifying a short range communication network at least in part in response to the parking facility identifier;
negotiating a second communication link between the mobile device and the parking server, wherein the short range communication network includes the second communication link;
receiving second parking data over the second communication link; and
merging the parking map from the first parking data with dynamic indicators from the second parking data.

US Pat. No. 10,971,011

DATABASE CREATION APPARATUS, PARKING LOT SEARCH APPARATUS, AND DATABASE CREATION METHOD

TOYOTA JIDOSHA KABUSHIKI ...

1. A database creation apparatus comprising:a controller comprising at least one processor configured to:
(1) acquire vehicle information including position information and status information of a vehicle and information relating to vehicle size of said vehicle associated with each other;
(2) extract a position of said vehicle in a parked state on the basis of said vehicle information; and
(3) add data indicating that general vehicles of the same vehicle size as said vehicle can be parked at said position to a parking lot database when the position of said vehicle in the parked state is not an exclusive parking lot for said vehicle, and do not add the data when the position of said vehicle in the parked state is said exclusive parking lot,
wherein said exclusive parking lot is a parking lot at which only said vehicle or only specific vehicles including said vehicle are allowed to be parked and general vehicles are not allowed to be parked.

US Pat. No. 10,971,010

TRACKING SYSTEM, METHOD AND MEDIUM FOR ENHANCING THE USE OF SELECT TRANSIT

MASTERCARD INTERNATIONAL ...

1. A method to track, facilitate, and induce utilization of select transit entities that produce less carbon emissions relative to other transit entities, the method comprising:receiving, with a transit inducement server, a first signal from an input device, the first signal indicating that an item utilized a select transit entity, the first signal including a procurement vehicle identifier and a transit entity identifier;
responsive to receiving the first signal, determining, with the transit inducement server, a quantity of units to incrementally associate with the procurement vehicle identifier based on at least the transit entity identifier and the procurement vehicle identifier,
wherein a total quantity of the units associated with the procurement vehicle identifier are dynamically stored in a units database;
responsive to determining the quantity of the units, associating, with the transit inducement server, the quantity of the units with the procurement vehicle identifier in the units database;
receiving, with the transit inducement server, a second signal from the input device, the second signal associated with the procurement vehicle identifier indicating a procurement request was made to a transit provider;
responsive to receiving the second signal, determining, with the transit inducement server, whether the procurement request is associated with a select transit provider that produces less carbon emissions relative to other transit entities;
responsive to determining that the procurement request is associated with the select transit provider, determining, with the transit inducement server, whether a deduction criteria is met;
responsive to determining that the deduction criteria is met, automatically deducting, with the transit inducement server, at least some of the units associated with the procurement vehicle identifier in the units database; and
responsive to deducting the at least some of the units associated with the procurement vehicle identifier in the units database, transmitting a third signal to the select transit provider indicating that the procurement request is to transpire at a reduced rate.

US Pat. No. 10,971,009

EXTRACTING EVENTS AND ASSESSING THEIR IMPACT ON A TRANSPORTATION NETWORK

INTERNATIONAL BUSINESS MA...

1. A method of assessing impact of an event on a transport network, said method comprising:utilizing at least one processor to execute computer code configured for:
receiving, using the at least one processor, an event notification provided in an identified format and containing information associated with more than one event, wherein at least one of the more than one events affects at least one target vehicle in the transport network, wherein the at least one vehicle follows a predetermined route comprising a plurality of stops in the transport network;
extracting, using the at least one processor, metadata for each of the events from the event notification, the metadata including at least one of: event location information, event time information, and event type information, wherein the extracting comprises:
comparing the event notification to a plurality of event models, wherein each of the plurality of event models is associated with an extraction pattern;
identifying the event model relevant to each of the events contained within the event notification based upon a pattern of the event notification; and
extracting the metadata for each of the events from the event notification using the extraction pattern associated with the identified event model;
storing, within a database, the metadata extracted from the event notification, wherein the metadata stored within the database is utilized to tune the plurality of event models via, at least, identifying, within a database, a frequency of occurrence of attributes of events;
converting, using the at least one processor, the event notification provided in the identified format to a notification having a predetermined format different from the identified format by using the extracted metadata for the at least one of the more than one events affecting the at least one target vehicle;
determining, using the converted metadata, an estimated impact of the at least one of the more than one events on the at least one target vehicle in the transport network and storing the estimated impact of the event;
said determining comprising:
estimating, using the at least one processor, a delay with respect to at least one stop in the plurality of stops for the at least one target vehicle in the transport network; and
determining, using the at least one processor, a probability of a delay occurring at the at least one stop, depending on one or more events at the stop and a prior probability of a delay occurring at the at least one stop, wherein the prior probability of a delay is determined, at least in part, utilizing the frequency of occurrence identified by and stored within the database, wherein the determining further comprises storing the probability of delay in an events database; and
providing, responsive to a user requesting, within an application, information associated with the at least one stop, the stored estimated impact, wherein the providing comprises querying the events database for the probability of delay.

US Pat. No. 10,971,008

SAFETY EVENT MESSAGE TRANSMISSION TIMING IN DEDICATED SHORT-RANGE COMMUNICATION (DSRC)

QUALCOMM Incorporated, S...

1. A method for transmitting vehicle information messages among a plurality of vehicles, comprising:transmitting, by a transceiver of a vehicle of the plurality of vehicles, a first set of vehicle information messages over a vehicle communication channel on a wireless medium at a first periodic rate, the first set of vehicle information messages including information related to the vehicle;
detecting, by one or more sensors of the vehicle, an event related to operation of the vehicle;
generating, by at least one processor of the vehicle, a second set of vehicle information messages each including an event flag and information about the event, the event flag indicating that the second set of vehicle information messages is reporting the event;
transmitting, by the transceiver of the vehicle, a first vehicle information message of the second set of vehicle information messages over the vehicle communication channel on the wireless medium without waiting for a next transmission opportunity of the first periodic rate; and
transmitting, by the transceiver of the vehicle, a remainder of the second set of vehicle information messages over the vehicle communication channel on the wireless medium.
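
The timing idea recited in this claim can be illustrated with a short sketch. The Python fragment below is illustrative only; the names (MessageScheduler, report_event) and the burst size and period are hypothetical, not taken from the patent. It shows a sender that emits regular messages at a fixed periodic rate but transmits an event-flagged message immediately rather than waiting for the next scheduled slot.

    # Illustrative sketch only (hypothetical names and parameters).
    import time

    class MessageScheduler:
        def __init__(self, period_s: float, transmit):
            self.period_s = period_s          # first periodic rate, e.g. 10 Hz -> 0.1 s
            self.transmit = transmit          # callable that hands a message to the radio
            self.next_slot = time.monotonic() + period_s

        def tick(self, vehicle_info: dict):
            """Send the regular message only when its periodic slot arrives."""
            now = time.monotonic()
            if now >= self.next_slot:
                self.transmit({"event_flag": False, **vehicle_info})
                self.next_slot = now + self.period_s

        def report_event(self, vehicle_info: dict, event: dict, burst: int = 3):
            """Send event-flagged messages right away, bypassing the periodic slot."""
            for _ in range(burst):
                self.transmit({"event_flag": True, "event": event, **vehicle_info})

    if __name__ == "__main__":
        sched = MessageScheduler(0.1, transmit=lambda m: print("TX", m))
        sched.report_event({"speed_mps": 22.0}, {"type": "hard_braking"})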

US Pat. No. 10,971,007

ROAD CONDITION INFORMATION SHARING METHOD

Huawei Technologies Co., ...

1. A road condition information sharing method comprising:receiving, by a server, a road condition information request sent by a first terminal, wherein the road condition information request carries: a road condition position of concern, and a time point of concern, and wherein the server manages device information of a plurality of second terminals including:
position information reported to, and obtained in advance by, the server at specific time points from the plurality of second terminals, and
road condition recording capability information indicative of road condition recording capabilities of the plurality of second terminals, the road condition recording capabilities being determined according to credit information of the plurality of second terminals, the credit information of each of the plurality of second terminals indicating a number of times for which road condition information collected by the second terminal was adopted historically;
determining, by the server, a second terminal of the plurality of second terminals according to: the road condition position of concern, the time point of concern, and the road condition recording capability information, wherein determining the second terminal comprises searching the position information and the road condition recording capability information to obtain the second terminal having a road condition recording capability that meets a preset criterion and located at a distance from the road condition position of concern that is less than a preset distance threshold at the time point of concern, and wherein the second terminal has a road condition recording capability; and
sending, by the server, a road condition sharing request to the second terminal, the road condition sharing request including a terminal identifier of the first terminal, to enable the second terminal to share road condition information according to: the road condition position of concern, the time point of concern, and the road condition recording capability information, directly with the first terminal according to the terminal identifier of the first terminal, wherein the road condition information includes a photograph of or near the road condition position of concern.
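
The terminal-selection step of this claim can be sketched as a simple filter over pre-reported positions and recording capabilities. The code below is a minimal illustration with a hypothetical data model (Terminal, capability score, distance threshold), not the patented implementation.

    # Illustrative sketch (hypothetical data model and thresholds).
    from dataclasses import dataclass
    from math import hypot

    @dataclass
    class Terminal:
        terminal_id: str
        positions: dict        # time point -> (x, y), reported in advance
        capability: float      # derived from how often its reports were adopted

    def select_terminal(terminals, position, time_point,
                        max_distance=50.0, min_capability=0.5):
        """Pick a terminal near the position of concern at the time of concern
        whose recording capability meets the preset criterion."""
        for t in terminals:
            loc = t.positions.get(time_point)
            if loc is None:
                continue
            distance = hypot(loc[0] - position[0], loc[1] - position[1])
            if distance < max_distance and t.capability >= min_capability:
                return t
        return None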

US Pat. No. 10,971,006

TARGET VEHICLE SELECTION AND MESSAGE DELIVERY IN VEHICULAR SYSTEMS

NOKIA SOLUTIONS AND NETWO...

1. A method comprising:maintaining in a database of a vehicular message distribution entity in a communications system,
map information,
driving context information of a plurality of vehicles, wherein the driving context information of each vehicle of the plurality of vehicles comprises at least information for determining a geographical location and a short-term trajectory of said vehicle; and
message history information for each vehicle pair formed by a first source vehicle and a target vehicle of the plurality of vehicles,
receiving, in the vehicular message distribution entity, from the first source vehicle a vehicular message comprising at least driving context information of the first source vehicle;
selecting, by the vehicular message distribution entity, a first set of target vehicles in proximity of the first source vehicle from the plurality of vehicles based on geographical locations of the first source vehicle and the plurality of vehicles;
selecting, by the vehicular message distribution entity, a second set of one or more target vehicles from the first set of target vehicles based on
the driving context information of the first source vehicle,
driving context information of the first set of target vehicles, and
the map information;
analyzing, by the vehicular message distribution entity, message history information relating to the first source vehicle and the second set of one or more target vehicles to determine values of one or more message history metrics;
selecting, by the vehicular message distribution entity, a third set of one or more target vehicles from the second set based on the values of the one or more message history metrics; and
causing, by the vehicular message distribution entity, sending the vehicular message to the one or more target vehicles in the third set using unicast transmission.

US Pat. No. 10,971,005

DETERMINING I2X TRAFFIC-PARTICIPANT CRITICALITY

Continental Automotive Sy...

1. A method comprising:for each pair of traffic participants (TPs) entering an intersection, calculating a time to collision (TTC) value;
creating a matrix of TTC values between TPs entering the intersection;
creating a matrix of criticality values containing a respective criticality value for each pair of TPs, wherein each criticality value is determined as a function of both TTC and at least one additional weighting factor;
broadcasting at least one of a basic safety message (BSM) and a pedestrian safety message (PSM) on behalf of at least one traffic participant based on having determined the criticality for each pair of traffic participants and having identified at least one of a near-miss situation and a potential-collision situation between traffic participants based on the respective criticality values in the matrix of criticality values for the pairs of TPs; and
providing one or more warnings to at least one of a pedestrian, a driver of a vehicle, and a cyclist based on the broadcasted at least one of a BSM and a PSM.
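
A minimal sketch of the TTC and criticality matrices follows. It is illustrative only: the one-dimensional closing-speed TTC, the vulnerability weights, and the near-miss threshold are assumptions made for the example, not values from the patent.

    # Illustrative sketch (simplified 1-D TTC, hypothetical weights/threshold).
    import itertools

    def time_to_collision(p1, p2):
        """Rough 1-D closing-speed TTC; positions/speeds are along one axis."""
        closing_speed = p1["speed"] - p2["speed"]
        gap = p2["position"] - p1["position"]
        if closing_speed <= 0 or gap <= 0:
            return float("inf")                 # not converging
        return gap / closing_speed

    def criticality(ttc, weight):
        return weight / ttc if ttc != float("inf") else 0.0

    participants = {
        "car":        {"position": 0.0,  "speed": 12.0, "weight": 1.0},
        "pedestrian": {"position": 30.0, "speed": 0.0,  "weight": 2.0},
    }

    ttc_matrix, crit_matrix = {}, {}
    for a, b in itertools.combinations(participants, 2):
        ttc = time_to_collision(participants[a], participants[b])
        ttc_matrix[(a, b)] = ttc
        crit_matrix[(a, b)] = criticality(ttc, max(participants[a]["weight"],
                                                   participants[b]["weight"]))

    for pair, crit in crit_matrix.items():
        if crit > 0.5:                           # hypothetical near-miss threshold
            print("broadcast BSM/PSM for", pair, "criticality", round(crit, 2))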

US Pat. No. 10,971,004

DENSITY BASED TRAFFIC LIGHT CONTROL SYSTEM FOR AUTONOMOUS DRIVING VEHICLES (ADVS)

BAIDU USA LLC, Sunnyvale...

1. A computer-implemented method to direct a traffic flow for autonomous driving vehicles (ADVs), the method comprising:receiving vehicle information from one or more ADVs;
determining a location and a heading of an ADV from the one or more ADVs from the vehicle information;
determining if each of the one or more ADVs is approaching a traffic light junction based on the location and the heading of the ADV, including
determining if the ADV is within a predetermined proximity to the traffic light junction;
determining if a current road segment of the ADV routes to the traffic light junction based on a route information; and
determining if a heading to route the ADV to the traffic light junction matches the heading of the ADV;
determining a number of ADVs from the one or more ADVs anticipated to approach the traffic light junction based on the determining if each of the one or more ADVs is approaching the traffic light junction to assist a traffic light control system in predicting a density of vehicles at the traffic light junction for a predetermined time period; and
sending the vehicle information of the one or more ADVs and for each of the ADVs determined as being approaching the traffic light junction, an indication that the ADV is predicted to arrive at the traffic light junction to a traffic light control system in response to determining the ADV is approaching the traffic light junction, the indication comprising a current time, an estimated time of arrival (ETA) of the ADV at the traffic light junction, and a traffic light identifier (ID) for at least one traffic light that the ADV would need to pass through at the traffic light junction, wherein the vehicle information is used by the traffic light control system to direct a traffic flow at the traffic light junction based on a density-based traffic light control algorithm, including adjusting a time duration of a light signal at one or more traffic lights disposed at the traffic light junction in advance of the number of ADVs arriving at the traffic light junction, wherein the traffic light control system includes a traffic density sensor placed well ahead of the traffic light junction to capture traffic data, and wherein the vehicle information received from the one or more ADVs simulates traffic data that, along with the captured traffic data, is used to predict the density of vehicles at the traffic light junction for the predetermined time period, and the density-based traffic light control algorithm is performed based on the number of ADVs anticipated to approach or have arrived at the traffic light junction.


US Pat. No. 10,971,003

SYSTEMS AND METHODS FOR PREDICTING PEDESTRIAN BEHAVIOR

Ford Global Technologies,...

1. A method comprising:obtaining, by at least one computer located in an automobile, from a personal communication device of a first pedestrian, data indicative of at least a first behavioral characteristic of the first pedestrian;
generating, by the at least one computer, a behavioral profile of the first pedestrian, based at least in part on the at least first behavioral characteristic of the first pedestrian;
detecting, by a navigation assistance equipment located in the automobile, a presence of the first pedestrian outside the automobile; and
predicting, by the at least one computer, an upcoming behavior of the first pedestrian, based at least in part on the behavioral profile of the first pedestrian,
wherein the data indicative of the at least first behavioral characteristic of the first pedestrian comprises at least one of an image, a document, a file, a database, or a recording, and
wherein the at least one of the image, the document, the file, the database, or the recording is indicative of a route traveled by the first pedestrian, the route comprising a first portion that is a road surface traveled by the first pedestrian and a second portion that is a sidewalk traveled by the first pedestrian.

US Pat. No. 10,971,002

INTERSECTION PHASE MAP

Waymo LLC, Mountain View...

1. A method comprising:receiving, at a server from at least one information source, data related to a traffic signal controlling traffic through a road intersection being approached by a vehicle;
determining, based on the data, a current status of the traffic signal and a prediction of future status of the traffic signal at one or more future times, wherein the prediction of future status of the traffic signal includes indicating a beginning time and an end time between which the traffic signal will display a particular color;
determining that, for the traffic signal controlling traffic through the road intersection, a drift has occurred between two adjacent days in the beginning time at which the traffic signal will display the particular color;
determining an amount of the drift in the beginning time between the two adjacent days;
determining a level of certainty for the prediction of future status of the traffic signal, wherein the level of certainty is based at least on the determined amount of drift;
receiving, at the server, an information request by the vehicle related to the road intersection;
in response to the information request, the server generating information including the current status of the traffic signal, the prediction of future status of the traffic signal at a specified time at which the vehicle reaches the road intersection, and the level of certainty; and
sending the information to the vehicle.
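
The drift and certainty steps of this claim can be shown in a few lines. The sketch below is illustrative only: the 30-second tolerance and the linear mapping from drift to certainty are assumptions for the example, not taken from the patent.

    # Illustrative sketch (hypothetical tolerance and certainty mapping).
    from datetime import datetime

    def seconds_of_day(ts: datetime) -> float:
        return ts.hour * 3600 + ts.minute * 60 + ts.second

    def phase_drift(day1_start: datetime, day2_start: datetime) -> float:
        """Drift, in seconds, of the phase start time between two adjacent days."""
        return abs(seconds_of_day(day2_start) - seconds_of_day(day1_start))

    def certainty(drift_s: float, tolerance_s: float = 30.0) -> float:
        """Map drift to a 0..1 certainty: no drift -> 1, >= tolerance -> 0."""
        return max(0.0, 1.0 - drift_s / tolerance_s)

    monday = datetime(2021, 3, 1, 8, 0, 0)
    tuesday = datetime(2021, 3, 2, 8, 0, 12)    # green phase starts 12 s later
    drift = phase_drift(monday, tuesday)
    print(f"drift={drift:.0f}s certainty={certainty(drift):.2f}")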

US Pat. No. 10,971,001

ARTIFICIAL INTELLIGENT SYSTEMS AND METHODS FOR PREDICTING TRAFFIC ACCIDENT LOCATIONS

BEIJING DIDI INFINITY TEC...

1. A system of one or more electronic devices for displaying traffic accident locations on a user terminal associated with the traffic accident locations, comprising:at least one storage medium including a first operation system and a set of instructions compatible with the first operation system for predicting traffic accident locations; and
at least one processor in communication with the storage medium, wherein when executing the first operation system and the set of instructions, the at least one processor is directed to:
obtain a plurality of accident records of a plurality of traffic accidents, each of the plurality of accident records being associated with a corresponding target user terminal and including an on-record accident time of a traffic accident and a plurality of locations that the target user terminal appeared around the on-record accident time;
determine a plurality of refined accident locations by, for each of the plurality of accident records,
operating a first clustering procedure with the corresponding plurality of locations of the target user terminal as inputs of the first clustering procedure and assigning a first result of the first clustering procedure as a refined accident location of the plurality of locations of the target user terminal,
wherein the operating the first clustering procedure includes:
identifying a plurality of points corresponding to the inputs;
determining a result cluster and a result point associated with the result cluster by a point-identification operation, including:
 selecting a candidate cluster of points from the plurality of points;
 selecting a candidate point from the candidate cluster of points;
 operating a first iterative operation until a first stop criterion is met, wherein the first iterative operation includes a plurality of first iterations, and each of the first iterations includes:
 using the candidate point as a center point and using the candidate cluster of points as a target cluster of points;
 identifying, from the target cluster of points, a cluster of points that is within a predetermined distance from the center point as the candidate cluster of points; and
 identifying a point from the candidate cluster of points as the candidate point;
determine at least one accident-prone road section by operating a second clustering procedure with the plurality of refined accident locations;
generate electronic signals including information of one of the at least one accident-prone road section and a triggering code, in a format recognizable by an application installed in the user terminal, configured to render the application to generate a presentation of the at least one accident-prone road section on an interface of the user terminal; and
direct the user terminal to display on the interface the accident-prone road section.
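
The first clustering procedure in this claim resembles an iterative fixed-radius refinement. The sketch below is a simplified illustration, not the patented algorithm: the radius, stop tolerance, and the choice of the cluster centroid as the next candidate point are assumptions made for the example.

    # Illustrative sketch (simplified; hypothetical radius/tolerance).
    from math import dist

    def refine_location(points, radius=100.0, tol=1.0, max_iter=50):
        """Keep only points within a fixed radius of the current center and move
        the center toward them, until the center stops moving; the final center
        plays the role of the refined accident location."""
        candidate_cluster = list(points)
        center = candidate_cluster[0]              # initial candidate point
        for _ in range(max_iter):
            nearby = [p for p in candidate_cluster if dist(p, center) <= radius]
            if not nearby:
                break
            new_center = (sum(p[0] for p in nearby) / len(nearby),
                          sum(p[1] for p in nearby) / len(nearby))
            if dist(new_center, center) < tol:     # first stop criterion
                return new_center
            center, candidate_cluster = new_center, nearby
        return center

    locations = [(0, 0), (5, 4), (6, 5), (7, 4), (300, 300)]
    print(refine_location(locations))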

US Pat. No. 10,971,000

ESTIMATING TIME TRAVEL DISTRIBUTIONS ON SIGNALIZED ARTERIALS

Uber Technologies, Inc., ...

1. A system for estimating time travel distributions on signalized arterials, comprising:a processor;
memory; and
an application stored in memory and executable by the processor to perform operations comprising:
receiving travel data from one or more re-identification devices;
estimating a first distribution on one or more signalized arterials based on the received travel data, the first distribution comprising a linear combination of individual paces weighted by distance traveled;
calibrating the first distribution to obtain a second distribution, the second distribution being a more recent estimate of travel time compared to the first distribution;
estimating a traffic condition on a particular arterialized segment, of the one or more signalized arterials, at a particular time based on the second distribution; and
causing the traffic condition to be displayed through a graphical interface being displayed on a device.
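
The "linear combination of individual paces weighted by distance traveled" can be written out directly. The sketch below uses hypothetical field names and example trips; it simply computes sum_i (t_i / d_i) * (d_i / D) for one segment, which reduces to total time over total distance.

    # Illustrative sketch (hypothetical field names and data).
    def segment_pace(observations):
        """observations: list of dicts with travel_time_s and distance_m."""
        total_distance = sum(o["distance_m"] for o in observations)
        if total_distance == 0:
            return None
        # pace_i = t_i / d_i ; weight_i = d_i / sum(d) ; result = sum(weight_i * pace_i)
        return sum((o["travel_time_s"] / o["distance_m"]) *
                   (o["distance_m"] / total_distance)
                   for o in observations)

    trips = [
        {"travel_time_s": 60.0,  "distance_m": 500.0},
        {"travel_time_s": 150.0, "distance_m": 1000.0},
    ]
    print(f"{segment_pace(trips):.3f} s/m")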

US Pat. No. 10,970,999

SYSTEM AND METHOD FOR OPTIMIZED APPLIANCE CONTROL

Universal Electronics Inc...

1. A non-transitory, computer readable media having instructions stored thereon, the instructions, when executed by an intermediate device, cause the intermediate device to perform steps to facilitate configuring a controlling device to command functional operations of a target appliance comprising:using appliance identifying data received by the intermediate device from the target appliance to determine that the target appliance is capable of receiving command communications via use of an RF communications protocol;
using the intermediate device to configure the controlling device to communicate commands to the target appliance via use of the RF communications protocol; and
causing the intermediate device to transmit to the target appliance a command for causing the target appliance to be placed into a pairing state for the purpose of pairing the target appliance with the controlling device whereupon the target appliance will be configured to receive command communications from the controlling device via use of the RF communications protocol.

US Pat. No. 10,970,998

SYSTEMS, METHODS, AND DEVICES FOR REMOTELY CONTROLLING FUNCTIONALITIES OF VEHICLES

Ford Global Technologies,...

1. A vehicle comprising:a first remote control device configured to remotely control one or more functionalities of the vehicle;
at least one memory comprising computer-executable instructions; and
one or more computing processors configured to access the at least one memory and execute the computer-executable instructions to:
determine that the first remote control device is located at a first position within the vehicle, the first position associated with a first functionality of the vehicle;
establish a first communication connection between the first remote control device and the first functionality based at least in part on a first machine readable medium associated with the first position; and
cause a user interface on the first remote control device to present one or more settings associated with the first functionality.

US Pat. No. 10,970,997

SYSTEM AND METHOD FOR OPTIMIZED APPLIANCE CONTROL

Universal Electronics Inc...

1. A device, comprising:a multimedia port configured to receive identification data from at least one of a plurality of devices coupled to the multimedia port, the plurality of devices including an intermediate device coupled to the multimedia port via a first multimedia cable, and a media rendering device coupled to the intermediate device via a second multimedia cable;
an attribute identifier configured to identify a unique set of information in the received identification data;
a device correlator configured to associate in a system topography the identified unique set of information with one or more of the plurality of devices coupled to the multimedia port; and
a power state determiner configured to monitor the multimedia port and to use the system topography to determine a power state of the plurality of devices.

US Pat. No. 10,970,996

SYSTEM FOR AUTOMATICALLY OPENING A LID TO A GRAIN BIN

2320 SOLUTIONS, LLC, Ced...

1. A system for automatically opening a lid on the top of a grain bin to receive grain from an unloading device therein to fill the grain bin with the grain, the system comprising:an actuator combined to the lid of the grain bin;
a controller in communication with the actuator for receiving a signal containing a data packet to activate the actuator; and
an RFID combined to the unloading device for transmitting the data packet to the controller when the RFID is positioned proximate to the controller, and the controller comparing the device id with a stored identification value and upon a match the controller activates the actuator to open the lid of the grain bin to receive grain from the unloading device therein to fill the grain bin with the grain.

US Pat. No. 10,970,995

SYSTEM FOR MONITORING EVENT RELATED DATA

NEC CORPORATION, Tokyo (...

1. A control system comprising:at least one memory storing instructions; and
at least one processor coupled to the at least one memory, the at least one processor configured to execute the instructions to:
detect that an event occurred in a surveillance area by using sensor data;
identify a type of the detected event; and
control a predetermined imaging range of a camera depending on the identified type of the detected event in the surveillance area,
wherein varieties of imaging ranges, including the imaging range, and varieties of types, including the type, correspond to each other, and
wherein the varieties of the imaging ranges are defined differently depending on the type.

US Pat. No. 10,970,994

METHOD AND SYSTEM FOR MONITORING FIRE ALARM SYSTEMS

JOHNSON CONTROLS FIRE PRO...

11. A system for testing a fire alarm system, comprising: a non-compatible control panel of a fire alarm system that sends event data; a monitoring station that receives the event data and forwards the event data, wherein the monitoring station monitors multiple fire alarm systems for indications of an alarm triggering event; and a connected services system for receiving the event data forwarded from the monitoring station and storing the event data and also passing the event data to a technician for testing the control panel remotely from the connected services system, wherein the non-compatible control panel is not connected to the connected services system and supports one or more loops or networks of fire detection and alarm notification devices, and the monitoring station notifies authorities in response to receiving alarm signals from the non-compatible control panel.

US Pat. No. 10,970,993

METHOD FOR MANAGING THE ASSISTANCE TO A PERSON IN RESPONSE TO THE EMISSION OF AN ALERT

HAREAU, Paris (FR)

1. A method for managing the assistance to a person in response to the emission of an alert comprising:emitting a save our souls (SOS) alert from a first mobile equipment of a first user by a wireless interface to a first set of terminals of a plurality of users, each one having a call identifier, the call identifiers being recorded in a memory of the first mobile equipment or of a remote equipment;
establishing a two-way communication between the first equipment and a given terminal of the first set belonging to an assisting user;
automatically generating a plurality of first notifications to a subset of terminals of the first set, each one of the first notifications comprising at least one piece of data identifying the assisting user;
automatically generating a plurality of second notifications to terminals of the second subset by an action of the assisting user or the first user, each second notification comprising a status relative to the processing of the alert by the assisting user, at least one second notification being prerecorded and proposed to one or the other of the first user or of the assisting user.

US Pat. No. 10,970,992

EMERGENCY NOTIFICATION APPARATUS AND METHOD

Choprix LLC, Rochester, ...

1. A wearable emergency alert apparatus, comprising:a memory configured to store a unique identifier;
a processor in communication with the memory;
one or more sensors in communication with the processor;
a location device in communication with the processor; and
a transceiver in communication with the processor, wherein the wearable emergency alert apparatus is configured to perform a method, the method comprising:
obtaining, via the transceiver, by the processor, data from a portion of the one or more sensors;
determining, by the processor, that the data indicates an emergency condition;
based on the determining, obtaining, by the processor, the unique identifier from the memory;
based on the determining, obtaining, by the processor, location information from the location device; and
communicating, by the processor, utilizing the transceiver, the unique identifier to a computing node via a network connection, wherein the computing node utilizes the unique identifier to obtain additional data for communicating to an additional computing node, and wherein responsive to obtaining the additional data, the computing node communicates the additional data to the additional computing node; and
communicating, by the processor, utilizing the transceiver, via the network connection, the unique identifier and the location information to the additional computing node.

US Pat. No. 10,970,991

MOISTURE SENSING ROOFING SYSTEMS AND METHODS THEREOF

Building Materials Invest...

1. A system comprising:a plurality of radio frequency (RF) tags;
wherein the plurality of RF tags is positioned at a plurality of locations throughout a roof;
wherein a plurality of location identifiers for the plurality of locations is stored in a database in communication with a computing device;
at least one tag reader configured to:
generate at least one reader RF signal at one or more frequencies to read the plurality of RF tags, and
detect at least one return RF signal from at least one RF tag of the plurality of RF tags;
wherein the at least one return RF signal of the at least one RF tag carries tag data comprising at least one impedance value and at least one tag identifier;
wherein the at least one tag identifier is associated with a corresponding location identifier of the plurality of location identifiers in the database;
at least one processor of the computing device;
wherein the at least one processor is programmed to:
receive the tag data;
determine, based on the at least one tag identifier of the tag data and from the database, a corresponding location of the at least one RF tag from the plurality of locations;
obtain at least one dry state linear regression function associated with the at least one tag identifier of the at least one RF tag;
determine a wet state or dry state of the roof at the corresponding location associated with the at least one RF tag based, at least in part, on:
i) the at least one impedance value of the at least one RF tag and
ii) the at least one dry state linear regression function of the at least one RF tag; and
output an indicator of the wet state or the dry state of the roof at the corresponding location.
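
The wet/dry decision in this claim compares a measured tag impedance against a dry-state linear regression function. The sketch below is illustrative only; the assumption that the regression maps temperature to expected dry impedance, the margin, and the example values are all hypothetical.

    # Illustrative sketch (hypothetical regression variable, margin, and values).
    def dry_state_impedance(slope, intercept, temperature_c):
        """Dry-state linear regression: expected impedance at a given temperature."""
        return slope * temperature_c + intercept

    def classify(measured_ohms, slope, intercept, temperature_c, margin_ohms=5.0):
        expected_dry = dry_state_impedance(slope, intercept, temperature_c)
        return "wet" if abs(measured_ohms - expected_dry) > margin_ohms else "dry"

    # A tag calibrated dry as Z = 0.4 * T + 50 ohms, read at 25 C
    print(classify(measured_ohms=72.0, slope=0.4, intercept=50.0, temperature_c=25.0))  # wet
    print(classify(measured_ohms=61.0, slope=0.4, intercept=50.0, temperature_c=25.0))  # dry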

US Pat. No. 10,970,990

SYSTEMS AND METHODS FOR MONITORING BUILDING HEALTH

STATE FARM MUTUAL AUTOMOB...

1. A building monitoring system for monitoring a building, the building monitoring system comprising:construction materials that form at least a portion of the building, the construction materials comprising a waterproofing layer, a baseboard of a roof of the building, and a frame of the roof;
a first water sensor positioned at a first position on an interior surface of the waterproofing layer and an exterior surface of the baseboard, wherein the first water sensor generates a signal in response to detecting a presence of water within the construction materials near the first water sensor, the first water sensor being embedded within the construction materials of the building;
a second water sensor positioned at a second position on the frame, wherein the second water sensor generates a signal in response to detecting a presence of water near the second water sensor; and
one or more processors communicatively coupled to the first water sensor and the second water sensor, wherein the one or more processors are programmed to:
receive, over a baseline period of time, a plurality of first signals from the first water sensor and the second water sensor via wireless communication, the plurality of first signals indicating a moisture level associated with the construction materials near the first position and the second position;
calculate a historic baseline condition associated with the first water sensor and the second water sensor, the historic baseline condition including, for each of the first and second water sensors, an average of the plurality of first signals received by the one or more processors over the baseline period of time;
receive a second signal from the first water sensor and the second water sensor via wireless communication, wherein the second signal is received after the baseline period of time and corresponds to a current sensor condition;
compare, in response to receiving the second signal, the current sensor condition to the historic baseline condition to generate a water signal difference;
compare, in response to generating the water signal difference, the water signal difference to a pre-determined threshold to determine that the water signal difference exceeds the pre-determined threshold;
generate, in response to a determination that the water signal difference exceeds the pre-determined threshold, a water alert indicating the presence of water damage at the first position; and
transmit a water alert message to a mobile device of the user of the building monitoring system via wireless communication to inform the user of the presence of water damage at the first position.
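
The baseline-and-threshold comparison in this claim is easy to show in miniature. The sketch below uses made-up readings and a made-up threshold: it averages the baseline-period signals, subtracts that baseline from the current reading, and raises an alert when the difference exceeds the threshold.

    # Illustrative sketch (hypothetical readings and threshold).
    from statistics import mean

    def water_alert(baseline_readings, current_reading, threshold):
        baseline = mean(baseline_readings)            # historic baseline condition
        difference = current_reading - baseline       # water signal difference
        return difference > threshold, difference

    alert, diff = water_alert(baseline_readings=[0.11, 0.12, 0.10, 0.13],
                              current_reading=0.45,
                              threshold=0.2)
    if alert:
        print(f"water alert: signal difference {diff:.2f} exceeds threshold")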

US Pat. No. 10,970,989

PROXIMITY ALERT DEVICE AND METHOD

Tereo Corporation, Inc., ...

1. A method of alerting a user comprising:loading, in a proximity alert device having a processor, non-volatile memory, a user input device, and a Bluetooth transceiver, a first configuration value from the non-volatile memory and setting a transmission power level of the Bluetooth transceiver, based on the loaded first configuration value, to a first power level selected from a plurality of power levels comprising +4 dBm, 0 dBm, −4 dBm, −8 dBm, −12 dBm, −16 dBm, and −20 dBm;
advertising for a pairing and subsequently pairing and bonding with a protected device that also has a Bluetooth transceiver, via a Human Interface Device (HID) over Generic Attribute Profile (GATT) profile (HOGP);
upon pairing and bonding with the protected device, loading a second configuration value from the non-volatile memory and setting the transmission power level to a second power level that is less than the first power level;
receiving input from the user input device to calibrate a threshold distance that is inferred to exist between the proximity alert device and the protected device;
in response to receiving the input, retrieving at least one value for a received signal strength indicator (RSSI) associated with the paired protected device, and based on a comparison of the at least one value to a stored reference value, increasing the transmission power level by at least 4 dBm if the at least one value is less than the stored reference value by more than a first threshold, and decreasing the transmission power level by at least 4 dBm if the at least one value is greater than the stored reference value by more than a second threshold, and saving to non-volatile memory the increased or decreased transmission power level as a calibrated power level;
detecting a loss of connection with the paired protected device;
in response to detecting the loss of connection, triggering an alarm and decreasing the transmission power level relative to the calibrated power level; and
upon restoration of a connection with the paired protected device, restoring the transmission power level to the calibrated power level.
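
The RSSI-based calibration step of this claim can be sketched as follows. The margins and the example values are assumptions for illustration; only the ±4 dBm step and the listed power levels come from the claim itself.

    # Illustrative sketch (hypothetical margins and example values).
    POWER_LEVELS_DBM = [4, 0, -4, -8, -12, -16, -20]

    def calibrate(current_dbm, rssi_dbm, reference_dbm,
                  low_margin=6, high_margin=6, step=4):
        """Step the transmit power up or down by at least 4 dBm depending on how
        the measured RSSI compares with the stored reference value."""
        if rssi_dbm < reference_dbm - low_margin:       # link weaker than expected
            target = current_dbm + step
        elif rssi_dbm > reference_dbm + high_margin:    # link stronger than expected
            target = current_dbm - step
        else:
            target = current_dbm
        # clamp to the nearest supported level before saving as calibrated power
        return min(POWER_LEVELS_DBM, key=lambda lvl: abs(lvl - target))

    calibrated = calibrate(current_dbm=-8, rssi_dbm=-80, reference_dbm=-65)
    print(calibrated)   # -4: power raised because the RSSI was well below reference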

US Pat. No. 10,970,988

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, METHOD, AND PROGRAM

CANON KABUSHIKI KAISHA, ...

1. An apparatus configured to monitor a person in a predetermined area, comprising:an acquisition unit configured to acquire information indicating a location of the person and information indicating a destination of the person in the predetermined area, the acquisition being performed for each of a plurality of persons present in the predetermined area;
a determination unit configured to determine whether a contact between persons having different destinations has occurred based on the acquired information; and
an output unit configured to, in a case where the contact has occurred, output information identifiably indicating the occurrence of the contact.

US Pat. No. 10,970,987

SYSTEM AND METHOD FOR IDENTIFYING VAPING AND BULLYING

SOTER TECHNOLOGIES, LLC, ...

1. A sensor system comprising:an air quality sensor including a combination of sensors configured to sense air quality;
a controller configured to identify vaping based on the sensed air quality of the combination of sensors; and
a network interface configured to transmit a signal indicating vaping when vaping is identified by the controller.

US Pat. No. 10,970,986

DEVICE, SYSTEM, AND PROCESS FOR AUTOMATIC FALL DETECTION ANALYSIS

TracFone Wireless, Inc., ...

1. A system for optimizing fall detection determination, the system comprising:a server configured to receive potential fall parameter data associated with a user from a fall detection device associated with a wireless device, the wireless device implementing a three-axis accelerometer, a gyroscope, and an altitude sensor, the potential fall parameter data comprising 3-axis acceleration data, gyroscopic data, and altitude data received from the wireless device implementing the three-axis accelerometer, the gyroscope, and the altitude sensor;
a database associated with and in communication with the server, the database configured to store the potential fall parameter data of the user, and the database further configured to store a library of previous potential fall parameter data of the user;
the server further configured to analyze the potential fall parameter data to determine whether the potential fall parameter data is consistent with a real fall, wherein the server analyzes the potential fall parameter data and compares the potential fall parameter data comprising the 3-axis acceleration data, the gyroscopic data, and the altitude data to the library of previous potential fall parameter data to determine whether the potential fall parameter data is consistent with a real fall;
the server further configured to send an alert to the wireless device if the potential fall parameter data is indicative of a real fall based on the comparison of the potential fall parameter data comprising the 3-axis acceleration data, the gyroscopic data, and the altitude data to the library of previous potential fall parameter data;
the server further configured to receive an indication from the wireless device in response to the alert, wherein the indication includes an indication that the potential fall parameter data was one of the following: a real fall or a false positive;
the server further configured to receive an indication from the wireless device in response to the alert requesting help;
the server further configured to receive a location of the user in response to the indication from the wireless device requesting help; and
the server further configured to transmit the location of the user and the potential fall parameter data to emergency medical services in response to the indication from the wireless device requesting help,
wherein the server communicates to the wireless device over a wireless network that comprises a wireless mobile telecommunications network;
wherein the wireless device comprises a smartphone and the smartphone implements the three-axis accelerometer, the gyroscope, and the altitude sensor to obtain the potential fall parameter data; and
wherein the smartphone includes at least one analog-to-digital converter and at least one filter configured to process signals associated with the three-axis accelerometer, the gyroscope, and the altitude sensor to obtain the potential fall parameter data.

US Pat. No. 10,970,985

SENSOR DEVICE AND SYSTEM

Halo Smart Solutions, Inc...

1. A device, comprising:a housing;
a group of sensors secured within the housing, wherein the group of sensors comprises a particle detection sensor and a gas detection sensor, wherein the gas detection sensor comprises at least one of a carbon dioxide sensor and a volatile organic compound sensor; and
a processor, and a memory storing instructions, that when executed by the processor, cause the processor to:
generate a profile for vaping activity comprising a threshold for measurements from the particle detection sensor;
receive monitoring data from each of the particle detection sensor and gas detection sensor; and
upon determining that at least a portion of the received monitoring data is indicative of an exceeded threshold for particulates from the profile for vaping activity and not indicative of an exceeded threshold for one of carbon dioxide or volatile organic compounds, generate a detected event communication for the vaping activity.
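
The event logic of this claim is a conjunction of threshold tests. The sketch below uses entirely hypothetical threshold values; it flags vaping when the particulate threshold from the profile is exceeded while the CO2 and VOC readings stay below their thresholds.

    # Illustrative sketch (hypothetical thresholds).
    VAPING_PROFILE = {"particulate_ug_m3": 60.0, "co2_ppm": 1500.0, "voc_ppb": 500.0}

    def detect_vaping(particulate_ug_m3, co2_ppm, voc_ppb, profile=VAPING_PROFILE):
        particulates_high = particulate_ug_m3 > profile["particulate_ug_m3"]
        gases_high = co2_ppm > profile["co2_ppm"] or voc_ppb > profile["voc_ppb"]
        return particulates_high and not gases_high

    if detect_vaping(particulate_ug_m3=95.0, co2_ppm=800.0, voc_ppb=120.0):
        print("detected event: vaping activity")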

US Pat. No. 10,970,984

PRESENCE SIMULATION BY MIMICKING LIGHTING SETTINGS AT AN UNOCCUPIED ENVIRONMENT USING LIGHT SETTINGS FROM ANOTHER ENVIRONMENT

SIGNIFY HOLDING B.V., Ei...

1. A computer system comprising:a communication interface configured to receive from a first lighting system an indication of a sequence of illumination settings, wherein at least some of the illumination settings have been applied to the first lighting system by at least one user when present in a first environment illuminated by the first lighting system;
an input configured to receive a mimic instruction identifying a second environment as unoccupied; and
at least one processor configured to:
select the first lighting system from a set of candidate lighting systems by comparing a profile of a second lighting system with profiles of the candidate lighting systems; and
mimic, in response to the mimic instruction, the presence of the at least one user of the first environment in the unoccupied second environment, by applying a matching of the sequence of illumination settings to the second lighting system of the second environment,
wherein the at least one processor is further configured to disqualify at least one candidate lighting system from the set of candidate lighting systems from being selected in response to determining that a distance of the at least one candidate lighting system to the second lighting system is less than a first distance denoted by a first threshold, and
wherein the application of the at least some of the illumination settings to the first lighting system illuminating the first environment is simultaneous with the application of the matching of the sequence of illumination settings to the second lighting system of the second environment.

US Pat. No. 10,970,983

INTELLIGENT DOOR LOCK SYSTEM WITH CAMERA AND MOTION DETECTOR

August Home, Inc., San F...

1. An apparatus for use with a motion detector and a camera, the camera adapted to operate in a lower-power mode and a higher-power mode, the camera consuming less power in the lower-power mode than in the higher-power mode, the apparatus comprising:at least one processor;
a first transceiver to communicate wirelessly according to a first wireless communication protocol; and
at least one storage medium having encoded thereon executable instructions that, when executed by the at least one processor, cause the at least one processor to carry out a method, the method comprising:
in response to the motion detector detecting motion at a dwelling, instructing the camera to exit the lower-power mode, wherein instructing the camera to exit the lower-power mode comprises communicating at least one message to the camera via the first transceiver and according to the first wireless communication protocol; and
in response to determining that the motion corresponds to a person, triggering the camera to capture an image and/or video and to transmit the image and/or video to the Internet via a second wireless communication protocol,
wherein the first wireless communication protocol is a wireless personal area network protocol (WPAN) and the second wireless communication protocol is a wireless local area network (WLAN) or wireless wide area network (WWAN) protocol.

US Pat. No. 10,970,982

VIDEO SURVEILLANCE APPARATUS AND METHOD

Canon Kabushiki Kaisha, ...

1. A video surveillance apparatus comprising:an identifying means configured to identify a target to be captured by a camera having a sensor mounted thereon;
a distance measuring means configured to obtain a plurality of distance measurements based on data received from the sensor; and
a determining means configured to determine, based on the plurality of distance measurements, whether the target is obscured by at least a part of an object within a field of view of the camera.

US Pat. No. 10,970,981

METHOD FOR COLLECTING AND SHARING LIVE VIDEO FEEDS OF EMPLOYEES WITHIN A DISTRIBUTED WORKFORCE

BroadPath, Inc., Tucson,...

1. A method comprising:during a first period of time:
distributing a first side-facing video feed to a second computing device associated with a second user and to a third computing device associated with a third user, the first side-facing video feed recorded by a first side-facing camera facing a first user at a first computing device;
distributing a second side-facing video feed to the first computing device associated with the first user, the second side-facing video feed recorded by a second side-facing camera facing the second user at the second computing device;
distributing a third side-facing video feed to the first computing device associated with the first user, the third side-facing video feed recorded by a third side-facing camera facing the third user at the third computing device; and
in response to activation of a communication channel between the first user and second user:
activating a first forward-facing video feed recorded by a first forward-facing camera facing the first user at the first computing device;
distributing the first forward-facing video feed to the second computing device;
activating a second forward-facing video feed recorded by a second forward-facing camera facing the second user at the second computing device;
distributing the second forward-facing video feed to the first computing device;
distributing the first side-facing video feed to the third computing device; and
distributing the third side-facing video feed to the first computing device.

US Pat. No. 10,970,980

TELECOMMUNICATIONS SYSTEM FOR TRANSPORTING FACILITY CONTROL DATA AND WIRELESS COVERAGE INFORMATION

CommScope Technologies LL...

1. A head-end unit for a distributed antenna system, the head-end unit comprising:a radio frequency interface configured to transport wireless communication information between the head end unit and a base station, wherein the head-end unit communicates the wireless communication information with at least one remote unit of the distributed antenna system;
one or more facility control communication ports configured to communicate facility control information with a facility control center of a facility control and monitoring system;
a signal processing module comprising a signal processor configured to communicate translated facility control information with a facility control node of the facility control and monitoring system via the at least one remote unit;
wherein the facility control node is configured to detect an environmental parameter within a coverage area of the distributed antenna system; and
wherein the signal processing module is configured to receive the environmental parameter from the at least one remote unit and to change a mode of operation of the distributed antenna system with respect to wireless communication coverage, based on the environmental parameter.

US Pat. No. 10,970,979

PRODUCT DISPLAY AND INVENTORY MONITORING

SENNCO SOLUTIONS, INC., ...

1. A system for monitoring a product on display, the system comprising:a shelf;
a barrier configured to define a restricted access zone for storage of inventory of the product on the shelf and to define an open access zone for instances of the product on the shelf; and
a sensor system comprising a capacitive sensor pad, the capacitive sensor pad being supported by the shelf, extending across the restricted access zone and the open access zone, and configured to detect movement of the product relative to the restricted access zone or the open access zone;
wherein the sensor system comprises a processor configured to determine whether the detected movement is indicative of a removal of the product from the restricted access zone or the open access zone.

US Pat. No. 10,970,978

METHOD FOR THE BROADCASTING, BY A WATCH, OF AN INFORMATIVE MESSAGE RELATING TO AN EVALUATION OF THE QUALITY OF LIFE OF A WEARER OF SAID WATCH

Tissot SA, Le Locle (CH)...

1. A method for broadcasting, by a watch, an informative message relating to an evaluation of a quality of life of a wearer of said watch, the method comprising:recording, by processing circuitry, data describing at least one factual episode of at least one type of environmental event recorded during a given period;
identifying a type of a disturbing environmental event that disturbs the quality of life of the wearer, from processing of the recorded data, wherein the identifying step comprises selecting one or more particular factual episodes, from the recorded data describing the at least one factual episode, each of the selected one or more particular factual episodes having corresponding measurement data that exceeds a first threshold and a corresponding time duration that exceeds a second threshold;
estimating an evaluation index of the quality of life of the wearer from an indicator of a level of disturbance of the quality of life that is calculated for the identified type of the disturbing environmental event; and
generating the informative message comprising the estimated evaluation index and broadcasting the informative message to the wearer.

US Pat. No. 10,970,977

RADIO TAG READING DEVICE AND METHOD

TOSHIBA TEC KABUSHIKI KAI...

1. A radio tag reading device, comprising:a placement table;
an antenna configured to communicate with wireless tags within a communication range including a placement region on the placement table;
a reader configured to output commodity information based on a signal received by the antenna from wireless tags;
a sensor configured to detect a user at a position proximate to the placement region;
a user-operable element; and
a controller configured to start a reading operation by activating the reader and causing the antenna to start emitting radio waves upon the sensor detecting the user, and terminate the reading operation in response to a user operation of the user-operable element.

US Pat. No. 10,970,976

END USER PROTECTION AGAINST ATM KEYPAD OVERLAY

INTERNATIONAL BUSINESS MA...

1. A computer-implemented method for determining a presence of a fraud device overlaid on a data entry device, the method comprising:emitting, from beneath a surface of the data entry device, security light from a set of one or more emitters, wherein the security light has a security light luminous power,
wherein the surface comprises one or more apertures, and
wherein the one or more apertures include a material with a reflection coefficient;
collecting a reflection of the security light off the material by a first set of one or more sensors, wherein the reflection has a reflection luminous power;
determining a luminous power range based on the security light luminous power and the reflection coefficient; and
based at least in part on the reflection luminous power being outside the luminous power range, engaging one or more security measures at the data entry device;
wherein the first set of one or more sensors is randomly selected.
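
The luminous-power check in this claim can be sketched directly from its terms. The sketch below is illustrative only: the ±15% tolerance band and the example numbers are assumptions; the expected reflection is simply the emitted power times the aperture material's reflection coefficient.

    # Illustrative sketch (hypothetical tolerance and example values).
    def expected_range(emitted_mw, reflection_coefficient, tolerance=0.15):
        nominal = emitted_mw * reflection_coefficient
        return nominal * (1 - tolerance), nominal * (1 + tolerance)

    def overlay_suspected(emitted_mw, reflection_coefficient, measured_mw):
        low, high = expected_range(emitted_mw, reflection_coefficient)
        return not (low <= measured_mw <= high)

    if overlay_suspected(emitted_mw=10.0, reflection_coefficient=0.3, measured_mw=4.8):
        print("reflection outside expected range: engage security measures")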

US Pat. No. 10,970,975

END-TO-END SECURED CURRENCY DISPENSING

Diebold Nixdorf, Incorpor...

1. A method of dispensing currency from an automated transaction machine (ATM) comprising:receiving, with a managing computing device positioned in the ATM, a dispense request and account information from a user, the dispense request including an amount of currency to be dispensed from a currency dispenser positioned in the ATM;
generating, with a controller of the currency dispenser, a first number and storing the first number in a memory communicating with the controller;
receiving the first number at a host computing device that is physically remote from the ATM;
receiving, at the host computing device, the dispense request and the account information;
generating, with the host computing device, a first message authentication code by applying a signing algorithm to at least one of the first number and the amount of currency to be dispensed, the signing algorithm based on a cryptographic key;
receiving, at the controller of the currency dispenser, at least the first message authentication code from the host computing device;
receiving, at the controller of the currency dispenser, the amount of currency to be dispensed;
retrieving, with the controller of the currency dispenser, from the memory, the first number;
generating, with the controller of the currency dispenser, a second message authentication code by applying the signing algorithm to at least one of the first number and the amount of currency to be dispensed, the first number retrieved during said retrieving;
confirming, with the controller of the currency dispenser, identity between the first message authentication code and the second message authentication code;
dispensing, with the currency dispenser by the controller, the amount of currency in response to at least said confirming;
generating, with the controller of the currency dispenser, a fifth message authentication code by applying the signing algorithm to at least the first number and the amount of currency dispensed during said dispensing;
transmitting, with the controller of the currency dispenser, the fifth message authentication code to the host computing device;
generating, with the host computing device, a sixth message authentication code by applying the signing algorithm to the first number and the amount of currency to be dispensed; and
confirming, with the host computing device, identity between the fifth message authentication code and the sixth message authentication code.
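
The message-authentication flow of this claim can be illustrated with a small sketch. HMAC-SHA-256 is used here as a stand-in for the claim's keyed signing algorithm (the claim does not name one), and the key, nonce size, and amount encoding are assumptions; the point is that host and dispenser independently compute codes over the same first number and amount and then check for identity before dispensing.

    # Illustrative sketch (HMAC-SHA-256 as a stand-in signing algorithm).
    import hmac, hashlib, secrets

    KEY = b"shared-cryptographic-key"            # provisioned to host and dispenser

    def mac(first_number: bytes, amount_cents: int) -> bytes:
        message = first_number + amount_cents.to_bytes(8, "big")
        return hmac.new(KEY, message, hashlib.sha256).digest()

    # Dispenser generates and stores the first number; host signs the request.
    first_number = secrets.token_bytes(16)
    amount = 20000                               # $200.00 requested

    first_mac = mac(first_number, amount)        # computed by the host
    second_mac = mac(first_number, amount)       # recomputed by the dispenser

    if hmac.compare_digest(first_mac, second_mac):
        print("MACs match: dispense", amount, "cents")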

US Pat. No. 10,970,974

AMUSEMENT DEVICES AND GAMES INVOLVING MULTIPLE OPERATORS, MULTIPLE PLAYERS, AND/OR MULTIPLE JURISDICTIONS

CFPH, LLC, New York, NY ...

1. A method comprising:controlling, by at least one processor:
determining that a first player desires to start play of a networked game through a first mobile device, in which the first player accesses the networked game through an account maintained by a first gaming operator;
determining that a second player desires to start play of the networked game through a second mobile device, in which the second player accesses the networked game through an account maintained by a second gaming operator;
in response to determining that the first and second players desire to start play of the networked game, matching the first player and the second player into a round of the networked game based on a determination that there are no other players associated with either the first or the second gaming operators looking to play the networked game; and
in response to facilitating play of the round of the networked game by the first and second players, allocating a first payment to the first gaming operator and a second payment to the second gaming operator based on the first player and the second player accessing the networked game through an account maintained by one of the first gaming operator and the second gaming operator.

US Pat. No. 10,970,973

CARD-BASED ELECTRONIC GAMING SYSTEMS AND TECHNIQUES FOR FIVE-CARD DRAW POKER

Generation Z, LLC, Colum...

1. An electronic five-card draw poker gaming system using common physical cards, the system comprising:a plurality of physical playing cards;
a scanner that is configured to identify each of the plurality of physical playing cards as they are dealt by a dealer;
a gaming table where the plurality of physical playing cards are dealt and identified by the scanner;
a plurality of player computing equipment with graphical displays that are programmed to provide individualized five-card draw poker gaming interfaces for a plurality of players, the individualized five-card draw poker gaming interfaces each being programmed to:
output a virtual five-card draw poker hand for a corresponding player, and
receive user input to perform one or more discard actions with regard to the virtual five-card draw poker hand;
a five-card draw poker gaming computing system that is communicably connected to the scanner and the plurality of player computing equipment, the five-card draw poker gaming computing system being programmed to:
provide five-card poker gaming on the plurality of player computing equipment using cards dealt on the gaming table based on (i) predetermined pairings between the plurality of player computing equipment and the gaming table, (ii) user selection of the gaming table from among a plurality of gaming tables via the individualized five-card draw poker gaming interfaces on the plurality of player computing equipment, or (iii) automatic selection of the gaming table from among the plurality of gaming tables by the five-card draw poker gaming computing system;
identify five of the physical playing cards that are detected by the scanner, the five of the physical playing cards being common across the plurality of players;
assign the five of the physical playing cards as an initial five-card draw poker hand for each of the plurality of players;
transmit the initial five-card draw poker hand to the plurality of player computing equipment;
receive, from the plurality of player computing equipment, information identifying the discard actions performed by each of the plurality of players with regard to the initial five-card draw poker hand;
identify next five of the physical playing cards that are detected by the scanner, the next five of the physical playing cards being common across the plurality of players;
generate final five-card draw poker hands for the plurality of players based on the initial five-card draw poker hand, the discard actions performed by each of the plurality of players, and the next five of physical playing cards; and
determine gaming outcomes for each of the plurality of players based on the final five-card draw poker hands.

US Pat. No. 10,970,972

WAGER REGISTRATION FOR AFTERMARKET BROKERED WAGERS

HEDGEKINGS LLC, Key Bisc...

1. A wager registration method for aftermarket brokered wagers comprising:receiving from an initial bettor in an aftermarket brokered wager computing system, from over a computer communications network, a digital image of a physical wager established with a purveyor of wagers independent of the aftermarket brokered wager computing system;
validating an authenticity of the digital image and responsive to the validation, writing a record to a registry in the aftermarket brokered wager computing system indicating one hundred percent (100%) ownership of the wager by the bettor at an amount and odds indicated by the image;
confirming the record in the registry upon receipt of the physical wager; and,
subsequently updating the record in the registry to indicate only a fractional less than 100% ownership of the wager responsive to a third-party aftermarket buyer purchasing a fractional share of the wager.
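
As a rough data-model sketch of the registry updates this claim describes, the Python below records one hundred percent ownership on registration and reduces it when a fractional share is sold to an aftermarket buyer; the structures, field names, and figures are hypothetical.

from dataclasses import dataclass, field

@dataclass
class WagerRecord:
    wager_id: str
    amount: float
    odds: str
    owners: dict = field(default_factory=dict)  # owner id -> fractional share

registry: dict[str, WagerRecord] = {}

def register_wager(wager_id, bettor_id, amount, odds):
    # On validating the digital image, record 100% ownership by the initial bettor.
    registry[wager_id] = WagerRecord(wager_id, amount, odds, {bettor_id: 1.0})

def sell_fraction(wager_id, bettor_id, buyer_id, fraction):
    # An aftermarket purchase reduces the original bettor's share below 100%.
    record = registry[wager_id]
    record.owners[bettor_id] -= fraction
    record.owners[buyer_id] = record.owners.get(buyer_id, 0.0) + fraction

register_wager("W-1", "bettor", amount=100.0, odds="+250")
sell_fraction("W-1", "bettor", "buyer", 0.25)
print(registry["W-1"].owners)   # {'bettor': 0.75, 'buyer': 0.25}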

US Pat. No. 10,970,971

REGULATED CASINO GAMES AND GAMING MACHINES CONFIGURED TO OFFER CONDITIONAL WINS AND/OR CONDITIONAL WIN OPPORTUNITIES

Synergy Blue LLC, Las Ve...

1. A computer-implemented method of operating a computing device, comprising:accepting, by the computing device, funds from a player and enabling the player to play a wager-based game using the accepted funds, the wager-based game being configured to present a plurality of in-game assets for player interaction during the game, at least some of the plurality of in-game assets being configured as wagering opportunities, player interactions with which generates wagers;
receiving player interactions with the wagering opportunities via an interface of the computing device;
generating wagers upon receiving player interactions with the wagering opportunities;
during game play of the wager-based game, triggering and presenting a conditional prize to the player on a display of the computing device depending upon at least one of received player interactions and states of the wager-based game, an award of the presented conditional prize to the player being predicated upon receiving future player interactions by the player indicative of the player having caused a predetermined number of additional wagers to have been generated;
continuing game play of the wager-based game and counting a number of additional wagers that are generated since the conditional prize was presented to the player; and
awarding the conditional prize to the player when the counted number of additional wagers equals the predetermined number of additional wagers; and
foregoing awarding the conditional prize to the player when the accepted funds run out, when a player interaction is received that is indicative of a cash out event before the counted number of generated additional wagers equals the predetermined number of additional wagers or when a player interaction is received that is indicative of the player choosing to forego the award of the conditional prize,
wherein the conditional prize comprises at least one of:
money;
a symbol, with the wager-based game being configured such that a predetermined combination of symbols wins a predetermined prize or a chance to win a prize;
increasing a potential payout of a later-occurring wager; and
game play at an optimal return to player (RTP) for a predetermined period of time.

US Pat. No. 10,970,970

GAMING DEVICE HAVING MULTI-CHANCE FEATURE

1. A gaming device comprising:a display;
a user interface;
a memory configured to store a credit amount;
a wager acceptor structured to receive a physical item associated with a currency value; and
a processor operable to:
receive a signal from the wager acceptor indicating receipt of a physical item associated with a currency value;
increase the credit amount in memory based upon the received signal from the wager acceptor;
receive a signal on the gaming device to initiate a first poker game, the signal indicating a wager amount, where the credit amount stored in the memory is reduced by the wager amount;
determine if a bonus dice feature is randomly triggered;
randomly roll at least one die to generate a dice value outcome when the bonus dice feature is triggered;
associate the dice value outcome, if any, with a bonus multiplier;
display a result of the first poker game on the display;
evaluate the poker hand for the first poker game to determine first awards associated with the first poker game;
multiply any determined first awards for the first poker game with the bonus multiplier, if any;
increase the credit amount in memory based on the multiplied first awards;
receive a signal on the gaming device to initiate a second poker game, the signal indicating a wager amount, where the credit amount is reduced by the wager amount;
select a plurality of cards to present in the card positions of the second poker game to form a poker hand;
display a result of the second poker game on the display;
evaluate the poker hand of the second poker game to determine second awards associated with the second poker game;
multiply any determined second awards for the second poker game with the bonus multiplier, if any; and
increase the credit amount in memory based on the multiplied second awards.

US Pat. No. 10,970,969

WAGERING APPARATUS, METHODS AND SYSTEMS

Diogenes Limited, Dougla...

1. A method of conducting a wagering event for one or more players, comprising:causing a central server system to initiate a wagering event for one or more players, wherein the wagering event corresponds to a sporting event;
causing a wager corresponding to a player of the one or more players that is received by a wagering input device for the sporting event to be transmitted to the central server system;
causing the central server system to generate a cash out offer for the player based on the wager and progress of the sporting event, by:
monitoring progress of the sporting event, wherein the progress of the sporting event may comprise at least a current status;
identifying that the player's wager is active; and
determining an amount of the cash out offer; and
causing the central server system to transmit the cash out offer to the wagering input device, prompting the player to accept or reject the cash out offer.

US Pat. No. 10,970,968

SYSTEM AND METHOD FOR INCENTIVIZING THE MAINTENANCE OF FUNDS IN A GAMING ESTABLISHMENT ACCOUNT

IGT, Las Vegas, NV (US)

1. A system comprising:a gaming establishment component processor; and
a gaming establishment component memory device which stores a plurality of instructions, which when executed by the gaming establishment component processor responsive to an occurrence of an external account destination fund transfer event associated with an amount of funds, cause the gaming establishment component processor to:
determine an incentive,
communicate data which results in a display device displaying the determined incentive,
responsive to a receipt of data associated with a rejection by a user of the determined incentive, communicate data associated with the amount of funds to be electronically transferred from a gaming establishment account to an external account, and
responsive to a receipt of data associated with an acceptance by the user of the determined incentive:
not communicate any data associated with the amount of funds being electronically transferred from the gaming establishment account to the external account,
determine any benefit associated with the determined incentive, and
communicate data which results in the display device displaying the determined benefit.

US Pat. No. 10,970,967

ELECTRONIC VOUCHER TICKET SYSTEM

1. A voucher ticket system, comprising:a value input device communicably coupled to an electronic gaming machine; and
at least one server that:
obtains information about a print voucher ticket received by the value input device, the print voucher ticket associated with a first value;
receives data from the value input device that is based on a communication between the value input device and a mobile device;
determines an association between the information about the print voucher ticket received by the value input device and a mobile wallet account or a voucher ticket account using the data received from the value input device; and
based on the association, at least one of:
increments the mobile wallet account by a second value; or
records at least a portion of the information to the voucher ticket account.

US Pat. No. 10,970,966

SPONTANEOUS ECO-SYSTEM OF AFTERMARKET BROKERED WAGERS

HEDGEKINGS LLC, Key Bisc...

1. A wager management method for a spontaneous eco-system of wagers, the method comprising:registering a multiplicity of different subscribers to an aftermarket wager brokering computing system brokering a multiplicity of published wagers owned fully by owning ones of the different subscribers for fractional purchase by others of the different subscribers;
receiving a request from one of the subscribers to import a wager pool of new wagers based upon a stated outcome of an event yet to occur wherein none of the new wagers are published in the aftermarket wager brokering computing system;
extracting from the request, odds for each of the new wagers and an amount wagered for each of the new wagers, and a list of participants owning the new wagers in the pool, none of whom are subscribed to the aftermarket wager brokering system other than the one of the subscribers from which the request is received;
writing to a registry of the aftermarket wager brokering computing system, the extracted information;
publishing the new wagers in the pool for aftermarket re-sale by the aftermarket wager brokering computing system; and,
writing an entry to the registry responsive to an acquisition of a portion of one of the new wagers by a purchasing one of the different subscribers indicating a fractional ownership of the one of the wagers.

US Pat. No. 10,970,965

SUGGESTION ENGINE FOR AFTERMARKET BROKERED WAGERS

HEDGEKINGS LLC, Key Bisc...

1. A suggestion method for aftermarket brokered wagers comprising:selecting a subscriber to an aftermarket brokered wager computing system;
determining at least one profile characteristic of the selected subscriber;
filtering in accordance with the determined profile characteristic, a set of available wagers owned by other subscribers to the aftermarket brokered wager computing system and registered with the wager computing system, the filtering producing a recommended wager owned by a specific one of the other subscribers;
pushing a message to a mobile device of the selected subscriber recommending a fractional purchase of less than a complete portion of the recommended wager at odds established at a time of acquisition of the recommended wager irrespective of contemporaneous odds for the recommended wager; and,
responsive to an acceptance of the recommended fractional purchase by the selected subscriber, writing a registry entry in the brokered wager computing system denoting a fractional ownership of the recommended wager by the selected subscriber along with the specific one of the other subscribers.

US Pat. No. 10,970,964

TRIGGERING IN-APPLICATION CURRENCY TRANSFER

PAYPAL, INC., San Jose, ...

1. A system comprising:a non-transitory memory storing instructions; and
one or more hardware processors coupled to the non-transitory memory and configured to read the instructions from the non-transitory memory to cause the system to perform operations comprising:
receiving a first selection of a first game application from a device of a first user;
detecting, via a network interface with the first game application, that an in-game balance of a second user account of a second user in the first game application is below a threshold for purchase of a virtual item in the first game application by the second user account;
determining a plurality of game account balances for the first user with one or more additional game applications, wherein the one or more additional game applications are linked to a first user account of the first user based on past purchases of application currency through the first user account with the one or more additional game applications;
transmitting first game information for the first game application with the plurality of game account balances to the device through an interface associated with the first user account of the first user, wherein the first game information comprises an option for a transfer to the second user account of the second user associated with the first game application and the plurality of game account balances;
receiving a second selection of the option from the interface;
receiving a second request to process the transfer to the second user through the interface; and
processing the transfer through a payment service provider based on the second request.
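
A minimal sketch of the balance-threshold trigger described above, in Python; the balances, item price, and account layout are invented for illustration and are not taken from the patent.

# Hypothetical balances; names are illustrative only.
second_user_balance = 40        # in-game currency held by the second user
virtual_item_price = 100        # threshold for the intended purchase

# Game account balances linked to the first user via past purchases.
first_user_game_balances = {"game_a": 250, "game_b": 75}

if second_user_balance < virtual_item_price:
    shortfall = virtual_item_price - second_user_balance
    # Present the first user an option to transfer from a linked balance.
    for game, balance in first_user_game_balances.items():
        if balance >= shortfall:
            print(f"Offer transfer of {shortfall} from {game} to the second user")
            break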

US Pat. No. 10,970,963

COIN OPERATED ENTERTAINMENT SYSTEM

TOUCHTUNES MUSIC CORPORAT...

1. An interactive entertainment system provided in a venue comprising:a central audiovisual data reproduction device including a non-transitory computer-readable storage medium and a processor configured to reproduce audiovisual data in accordance with user inputs;
a plurality of stands distributed within the venue;
a plurality of portable interactive entertainment devices operable independently of one another, each of the portable interactive entertainment devices comprises:
a lock configured to removably secure the portable interactive entertainment device to one of the plurality of stands,
a display,
a payment acceptor configured to accept payment from a user,
a communication interface configured to send and receive data to and from other portable interactive entertainment devices and/or the central audiovisual data reproduction device, and
at least one processor configured to:
cause a user interface to be displayed on the display, and
in response to inputs to the user interface, enable selection of instances of media for playback via the portable interactive entertainment device and/or the central audiovisual data reproduction device in exchange for credits.

US Pat. No. 10,970,962

MANAGEMENT SYSTEM OF SUBSTITUTE CURRENCY FOR GAMING

ANGEL PLAYING CARDS CO., ...

1. A management system of substitute currency for gaming comprising:substitute currency for gaming to which a unique ID that is individually identifiable is attached;
a winning/losing determining device determining and displaying a result of winning/losing of each game at a game table;
a chip tray configured to hold the substitute currency for gaming at the game table;
a measurement device measuring an amount and a quantity of the substitute currency for gaming placed on the game table by using a camera; and
a management control device that:
is configured to identify a total amount of the substitute currency for gaming in the chip tray and to determine and store a position, an amount, and a quantity of the substitute currency for gaming on the game table based on a result of the measurement performed by the measurement device in each game;
has a calculation function of calculating a balance of a casino side in the game table by using the result of winning/losing acquired from the winning/losing determining device and a result of the measurement of the position, the amount, and the quantity of the substitute currency for gaming for each collection and payment; and
has a collection determination function and a payment determination function, wherein, for each player of the multiple players:
the collection determination function determines a result of the calculation of the balance of the casino side in the gaming table and a total amount of the substitute currency for gaming in the chip tray after finishing collection of lost chips of the substitute currency bet by the player to determine whether or not there has been a fraud or mistake in the collection; and
the payment determination function determines the result of the calculation of the balance of the casino side in the gaming table and the total amount of the substitute currency for gaming in the chip tray after finishing payment of chips of the substitute currency to the player to determine whether or not there has been a fraud or mistake in the payment.

US Pat. No. 10,970,961

SYSTEMS AND METHODS FOR ELECTRONIC GAMING

Aristocrat Technologies A...

1. An electronic gaming system comprising:a cabinet;
a display device supported by the cabinet;
a player input interface including a touch-screen input device, the player input interface supported by the cabinet; and
a game controller enclosed by the cabinet and configured to execute instructions stored in a memory, which when executed by the game controller, cause the game controller to at least:
control the display device to present a game area including a plurality of reel strips, each reel strip including a plurality of symbol display positions, each symbol display position having a symbol display position width and a symbol display position height;
add a first oversized symbol to at least two reel strips of the plurality of reel strips, the at least two reel strips adjacent one another, the first oversized symbol having a first oversized symbol width that is at least twice the symbol display position width;
determine a number of reel strips remaining that do not include the first oversized symbol, the remaining number of reel strips defining a remaining width;
select a second oversized symbol based on the remaining width, the second oversized symbol having a second oversized symbol width that is less than or equal to the remaining width, whereby the second oversized symbol is selected to fit within the remaining number of reel strips without overlapping the first oversized symbol;
add the second oversized symbol to at least two reel strips of the remaining number of reel strips;
control the display device to simulate spinning and stopping the at least two reel strips that include the first oversized symbol together and based upon a single first reel stop position to facilitate spinning and stopping the at least two reel strips that include the first oversized symbol in unison;
control the display device to simulate spinning and stopping the at least two reel strips that include the second oversized symbol together and based upon a single second reel stop position to facilitate spinning and stopping the at least two reel strips that include the second oversized symbol in unison, wherein the display device is further controlled to simulate spinning and stopping the at least two reel strips that include the first oversized symbol independently of the at least two reel strips that include the second oversized symbol;
generate a game outcome based, at least, on the first oversized symbol and the second oversized symbol; and
generate a game award from the game outcome.
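
The width-based selection of a second oversized symbol can be illustrated with the short Python sketch below; the reel count, symbol names, and widths are assumptions.

import random

NUM_REELS = 5   # total reel strips in the game area (illustrative)

# Candidate oversized symbols keyed by width in reel-strip units (illustrative).
oversized_symbols = {2: ["WILD_2x", "BONUS_2x"], 3: ["WILD_3x"], 4: ["MEGA_4x"]}

# Add a first oversized symbol spanning at least two adjacent reel strips.
first_width = random.choice([2, 3])
first_symbol = random.choice(oversized_symbols[first_width])

# Reel strips not covered by the first oversized symbol define the remaining width.
remaining_width = NUM_REELS - first_width

# Select a second oversized symbol no wider than the remaining width,
# so it fits in the remaining reel strips without overlapping the first symbol.
candidates = [s for w, syms in oversized_symbols.items()
              if w <= remaining_width for s in syms]
second_symbol = random.choice(candidates)

print(first_symbol, "on", first_width, "reels;",
      second_symbol, "on the remaining", remaining_width)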

US Pat. No. 10,970,960

INTERACTIVE VIDEO GAMING SYSTEM INVOLVING A MATCHING FEATURE AND MULTIPLE PAY TABLES AND METHOD OF UTILIZING THE SAME

GAMECO, LLC, Las Vegas, ...

1. A gaming machine comprising:a monetary input device configured to receive a physical item associated with a monetary value;
a user interface configured to:
enable a player to select a wager for a game of chance and enable the player to initiate a cash out operation;
at least one processor running executable instructions related to a game of chance;
said at least one processor programmed to:
add said monetary value to a credit balance for said player;
deduct said selected wager from said credit balance; and
decrease said credit balance in response to said cash out operation;
a display;
a ticket reader;
a bill validator;
memory in communication with said processor, said memory storing at least multiple pay tables; and
wherein said processor running said executable instructions: (i) causes to be displayed a grid comprising a plurality of unique game icons; (ii) receives a player input consistent with moving one or more of said plurality of unique game icons to group like game icons; (iii) responsive to said player input being deemed proper, causes to be moved said one or more of said plurality of unique game icons and a bet amount to be deducted; (iv) if said move causes three or more like game icons to be grouped in a pre-established pattern, identifies a corresponding pay table, based on a number of grouped like game icons in said pre-established pattern, from said multiple pay tables stored in said memory, said multiple pay tables having increasing RTP as the number of grouped like icons increases; and (v) awards a prize from said identified pay table.
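
A small Python sketch of selecting among multiple pay tables by the number of grouped like icons follows; the table values and award progression are illustrative only.

# Pay tables keyed by the number of like icons grouped in the pre-established
# pattern; awards (and RTP) increase with the group size. Values are invented.
pay_tables = {
    3: {"cherry": 2, "bell": 5, "star": 10},
    4: {"cherry": 4, "bell": 10, "star": 25},
    5: {"cherry": 8, "bell": 20, "star": 60},
}

def award_for_group(icon: str, group_size: int, bet: int) -> int:
    """Identify the pay table matching the number of grouped like icons."""
    if group_size < min(pay_tables):
        return 0
    table = pay_tables[min(group_size, max(pay_tables))]
    return table[icon] * bet

print(award_for_group("bell", 4, bet=2))   # 20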

US Pat. No. 10,970,959

ELECTRONIC DEVICE FOR PLAYING A REEL-BASED GAME WITH MINI-REELS

Lightning Box Games Pty L...

1. A gaming device used by a player for the operation of electronic game play comprising:at least one fund-receiving mechanism for receiving funds and establishing a monetary credit associated with said gaming device;
at least one memory storage location for storing instructions associated with a game;
at least one processor;
at least one data input mechanism for allowing said player to input data associated with said game including a wager amount drawn from said monetary credit; and
at least one display device, wherein upon input associated with said game including said wager amount, the instructions are executed by the at least one processor to display on the display device: (i) a plurality of positional sets with each of said positional sets comprising a plurality of positions wherein the positions of at least one of said positional sets comprises segments of a simulated reel strip and the positions of at least one other of said positional sets comprises a plurality of simulated mini-reels, and (ii) a symbol residing within each of said positions, said symbol randomly selected from a symbol set;
a payline defined by the selection in a predetermined order of one of the positions from each of the positional sets that comprise segments of a simulated reel strip and one of the positions from each of the positional sets that comprise a plurality of simulated mini-reels;
a payline set comprising all combinations of positions of said payline wherein the position from each of the positional sets that comprise segments of a simulated reel strip remains the same for each of said combinations of positions and one of the positions from one of the positional sets that comprise a plurality of simulated mini-reels is different for each of said combinations of positions;
at least one predetermined winning symbol combination comprised of an arrangement of specified symbols from said symbol set wherein said controller: (i) reads each of said symbols from each of said combinations of positions in said predetermined order; (ii) determines if the arrangement of said symbols from each of said combinations of positions matches any of said at least one predetermined winning symbol combinations; and (iii) awards the player a pre-determined award value for each arrangement of said symbols that match one of said at least one predetermined winning symbol combinations.

US Pat. No. 10,970,958

GAMING MACHINE, CONTROL METHOD FOR MACHINE, AND PROGRAM FOR GAMING MACHINE

KONAMI GAMING, INC., Las...

1. A gaming machine for providing a game, comprising:an operation unit configured to receive an operation of a player;
a display unit operably coupled to the operation unit, the display unit including a first display area and a second display area, the first display area configured to display a primary game including a plurality of reels, the second display area configured to display a skill-based bonus game including a rotating selector device and a pointer, the rotating selector device including a reel strip being displayed within a plurality of cells arranged in a grid, the grid having a single column and a plurality of rows; and,
a control unit operably coupled to the operation unit and the display unit and being configured to initiate the primary game in response to player operation and to establish an outcome of the primary game, the control unit, in response to initiation of the primary game, being further configured to:
randomly determine the outcome of the primary game and spin and stop the plurality of reels to display the outcome of the primary game; and
upon detecting a trigger condition in the outcome of the primary game, initiate the skill-based bonus game by:
generating the reel strip displaying a plurality of symbols for use with the selector device, the generated reel strip including a plurality of stop segments, each stop segment including a plurality of segment symbol positions, each segment symbol position including a corresponding symbol and an associated selection probability;
displaying the selector device including the generated reel strip being associated with the single column of the grid and including each segment symbol position having an associated symbol displayed therein;
rotating the selector device in the grid such that symbols in the reel strip are displayed in the cells of the grid;
receiving player input via the operation unit indicating the player actuating a stop button while the selector device is rotating; and
upon receiving the player input, displaying the outcome of the skill-based bonus game by:
detecting a timing of the received player input and establishing a stop segment of the reel strip by selecting one of the plurality of stop segments as a function of the detected timing associated with the player input, wherein the established stop segment includes two or more segment symbol positions;
determining a selection probability associated with each of the segment symbol positions included in the established stop segment;
randomly selecting one of the segment symbol positions included in the established stop segment based on the associated selection probabilities; and,
stopping the selector device to display the corresponding symbol associated with the randomly selected segment symbol position adjacent to the pointer.
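
The timing-derived stop segment and probability-weighted symbol selection in this claim can be sketched as follows in Python; the segment layout, probabilities, and the timing-to-segment mapping are assumptions.

import random

# Each stop segment holds several symbol positions, each with a symbol and a
# selection probability; the segment is chosen from the timing of the stop press.
stop_segments = [
    [("CREDIT_50", 0.7), ("CREDIT_500", 0.3)],
    [("CREDIT_100", 0.5), ("JACKPOT", 0.5)],
]

def resolve_stop(press_time_ms: int) -> str:
    # Establish the stop segment as a function of the detected press timing.
    segment = stop_segments[(press_time_ms // 250) % len(stop_segments)]
    symbols = [s for s, _ in segment]
    weights = [p for _, p in segment]
    # Randomly select one symbol position using the associated probabilities.
    return random.choices(symbols, weights=weights, k=1)[0]

print(resolve_stop(press_time_ms=1337))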

US Pat. No. 10,970,957

SYSTEMS AND METHODS FOR SIMULATING PLAYER BEHAVIOR USING ONE OR MORE BOTS DURING A WAGERING GAME

Aristocrat Technologies A...

1. A tangible, non-transitory, computer-readable storage medium having instructions stored thereon, which when executed by a processor, cause the processor to at least:initiate a simulated multiplayer game;
control a display device to display a plurality of selectable items;
select a plurality of bots from a bot selection table, the bot selection table including a bot selection weight associated with each bot of the plurality of bots, each bot simulating a player behavior;
determine, using a timed entry distribution table, an entry time for at least one bot of the plurality of bots in the simulated multiplayer game; and
control the display device to simulate a selection, by the at least one bot, of at least one selectable item at the determined entry time.

US Pat. No. 10,970,956

DATABASE AND SERVER FOR AUTOMATIC WAGERING

1. A system, comprising:a database configured to store data associated with a plurality of players, the associated data including statistics related to prior wagers made by each player;
a computing system configured to establish a ranked list of players, at least in part, as a function of the statistics of the players; and,
an application executable by a processor of a user computing device, wherein the application is configured to:
present the ranked list of players to a user using the application on the user computing device;
enable the user to select one of the players from the ranked list of players; and
transmit the selected player to the computing system, the computing system configured to receive the selected player from the application and to responsively link the user with the selected player, the computing system further configured to automatically place a user wager in response to the selected player placing a player wager, wherein the user wager is automatically placed as the same wager as the player wager, wherein an amount of the user wager will be an amount wagered by the selected player multiplied by a percentage value selected by the user.
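
The wager-mirroring arithmetic in this claim reduces to multiplying the followed player's wager by the user-selected percentage, as the short Python sketch below shows; the example figures are invented.

def mirror_wager(player_wager_amount: float, user_percentage: float) -> float:
    """Place the same wager as the linked player, scaled by the user's percentage."""
    return player_wager_amount * user_percentage

# A user who follows a ranked player at 25% mirrors a $200 wager with $50.
print(mirror_wager(200.0, 0.25))   # 50.0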

US Pat. No. 10,970,955

ACCOUNTING DEVICE AND CONTROL PROGRAM

TOSHIBA TEC KABUSHIKI KAI...

1. An accounting device comprising:a first display device that includes a first screen facing a first direction towards an operator and configured to accept a touch input;
a change machine that includes a depositing port for receiving money and a dispensing port for dispensing money, and faces a second direction different from the first direction;
a second display device that includes a second screen that is configured to accept a touch input and faces the second direction; and
a processor configured to:
upon receipt of a first touch input via the first screen, determine whether the first touch input is made to start a first job performed on the change machine, or to start a second job that is different from the first job,
when the first touch input is made to start the first job, control the second display device to display a first button on the second screen, and
when the first touch input is made to start the second job, control the first display device to display information for supporting execution of the second job on the first screen,
wherein, when the first touch input is made to start the first job and a second touch input is made on the first button displayed on the second screen, the processor enables the change machine to receive or dispense money, and controls the second display device to display a second button for completing the first job on the second screen, and
when a third touch input is made on the second button displayed on the second screen, the processor disables the change machine from receiving or dispensing money.

US Pat. No. 10,970,954

MOVABLE BARRIER OPERATOR REGISTRATION VERIFICATION

The Chamberlain Group, In...

1. A method for managing registration of a network-enabled movable barrier operator, the method comprising:receiving, from a user device, a user account identifier and a movable barrier operator identifier;
providing, to the user device, an instruction to perform a specified action with a movable barrier operator associated with the movable barrier operator identifier;
determining a registration condition is met upon detecting that the specified action has been performed; and
upon the registration condition being met, associating the movable barrier operator identifier with the user account identifier to allow a user account associated with the user account identifier to control and/or monitor the movable barrier operator over a network;
wherein the specified action comprises changing a state of a movable barrier coupled with the movable barrier operator.

US Pat. No. 10,970,953

FACE AUTHENTICATION BASED SMART ACCESS CONTROL SYSTEM

1. A method comprising:detecting, by a motion detection module, a motion by a subject within a predetermined area of view;
assigning a unique session identification number to the subject detected within the predetermined area of view;
detecting a facial area of the subject detected within the predetermined area of view;
generating an image of the facial area of the subject;
assessing a quality of the image of the facial area of the subject;
determining an identity of the subject based on the image of the facial area of the subject;
identifying an intent of the subject; and
authorizing access to a point of entry based on the determined identity of the subject and based on the intent of the subject, wherein identifying an intent of the subject comprises:
upon detecting the facial area in a bounding box, commencing authentication of the subject;
calculating a directional vector of a face of the subject;
determining an intent of the subject to gain access to the point of entry based on the directional vector of the face of the subject; and
granting the access to the point of entry based on authentication of the subject and based on determining the intent of the subject.
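
One plausible (hypothetical) reading of the directional-vector intent test is an angular threshold between the face direction and the direction toward the point of entry, sketched in Python below; the threshold value and vectors are assumptions, not details from the patent.

import math

def intent_to_enter(face_direction, entry_direction, threshold_deg=20.0) -> bool:
    """Decide intent from the angle between the face's directional vector and
    the direction toward the point of entry (both given as 3D vectors)."""
    dot = sum(a * b for a, b in zip(face_direction, entry_direction))
    norm = math.sqrt(sum(a * a for a in face_direction)) * \
           math.sqrt(sum(b * b for b in entry_direction))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= threshold_deg

# Subject roughly facing the door -> intent; facing away -> no intent.
print(intent_to_enter((0.1, 0.0, 0.99), (0.0, 0.0, 1.0)))   # True
print(intent_to_enter((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))    # False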

US Pat. No. 10,970,952

USER IDENTIFICATION SYSTEM

Toyota Jidosha Kabushiki ...

1. A user identification system comprising a processor configured to execute instructions, wherein the instructions, when executed by the processor, cause the processor to perform operations including:(a) determining whether a user registered in an information storage boards or is on board a vehicle; and
(b) controlling at least one of a navigation device, an audio device and an information display device, based on information on the user who boards or is on board the vehicle,
wherein the operation (a) includes determining whether the user boards or is on board the vehicle based on a plurality of types of boarding information indicating that one of a plurality of users registered in the information storage in advance boards or is on board the vehicle, the plurality of types of boarding information including a first type of boarding information which is acquired to correspond to one of a plurality of seats in the vehicle, and
wherein the operations further include (c) identifying a seat of the user who boards or is on board the vehicle among the plurality of seats based on the first type of boarding information.

US Pat. No. 10,970,951

DATA MANAGEMENT METHOD, APPARATUS, DEVICE, SYSTEM AND STORAGE MEDIUM FOR SMART LOCK

SHENZHEN GOODIX TECHNOLOG...

1. A data management method for a smart lock, wherein the smart lock has N databases, the N databases have a one-to-one correspondence with N unlock modes, N is an integer greater than 1, and the method comprises:receiving a registration command transmitted by a terminal device;
acquiring first unlock information of a current unlock mode according to the registration command, wherein the current unlock mode is comprised in the N unlock modes;
storing the first unlock information of the current unlock mode into a database corresponding to the current unlock mode, wherein the database corresponding to the current unlock mode is comprised in the N databases;
acquiring second unlock information of the current unlock mode;
identifying the second unlock information of the current unlock mode in the database corresponding to the current unlock mode; and
controlling the smart lock to be in an open state when the second unlock information of the current unlock mode is successfully identified.
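
A minimal Python sketch of the one-database-per-unlock-mode arrangement follows; the mode names and stored values are illustrative.

# One database per unlock mode (one-to-one correspondence); contents are illustrative.
databases = {
    "fingerprint": set(),
    "pin": set(),
    "nfc_card": set(),
}

def register(mode: str, unlock_info: str) -> None:
    # Store first unlock information only in the database for its own mode.
    databases[mode].add(unlock_info)

def try_unlock(mode: str, unlock_info: str) -> bool:
    # Identify second unlock information in the database of the current mode.
    return unlock_info in databases[mode]

register("pin", "4921")
print(try_unlock("pin", "4921"))          # True -> lock opens
print(try_unlock("fingerprint", "4921"))  # False -> other databases untouched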

US Pat. No. 10,970,950

SYSTEMS AND METHODS FOR ACTIVATING A SECURITY ACTION RESPONSIVE TO PROXIMAL DETECTION AND IDENTIFICATION OF A WIRELESS DEVICE

Ademco Inc., Golden Vall...

1. A method comprising:a wireless radio of a door mounted contact sensor of a security system receiving a broadcast message that includes a device identifier from a wireless device;
determining whether the device identifier corresponds to any of a plurality of authorized devices enrolled with the security system;
determining whether a signal parameter of the broadcast message satisfies a preconfigured access condition, wherein the signal parameter includes a current moving direction of the wireless device, wherein the preconfigured access condition includes a preconfigured moving direction associated with opening a door associated with the door mounted contact sensor, and wherein the preconfigured moving direction includes a moving direction toward and outside of the door; and
when the device identifier corresponds to any of the plurality of authorized devices enrolled with the security system and the signal parameter satisfies the preconfigured access condition by matching the current moving direction with the preconfigured moving direction, performing a security action comprising disarming the security system without providing an entry delay at the security system.

US Pat. No. 10,970,949

SECURE ACCESS CONTROL

GENETEC INC., St-Laurent...

1. An access controller for use in a secure access control system having a number of smart card readers and door controllers, the access controller being operative to communicate with said smart card readers and door controllers for authenticating users and enabling authorized access to secured premises, the access controller comprising:at least one communication interface connectable to said number of smart card readers and door controllers;
a plurality of secure access module (SAM) interfaces, each one of said SAM interfaces able to connect to a corresponding one of a plurality of SAMs and to communicate with any one of said number of smart card readers through said at least one communication interface.

US Pat. No. 10,970,948

SYSTEMS, DEVICES, AND METHODS FOR ACCESS CONTROL AND IDENTIFICATION OF USER DEVICES

INTRINSIC VALUE, LLC, Na...

1. A computer implemented method, comprising:receiving, by a processor circuit on a user's mobile device, a first user-profile including one or more first access rules governing the user's access rights to a point-of-entry device during non-emergency situations;
receiving, in response to an emergency event, a second user-profile including one or more second access rules governing the user's access rights to the point-of-entry device during emergency situations;
when the user attempts to gain access to the point-of-entry device, determining a decision to grant or deny access to the point-of-entry device based on the first or second user-profile during non-emergency and emergency situations, respectively; and
transmitting the decision to grant or deny access to the point-of-entry device,
wherein determining, during an emergency situation, the decision to grant or deny access to the point-of-entry device is based on a second access rule of the second user-profile that includes:
granting a first access to the point-of-entry device to allow the user to pass through a door into an enclosure; and
denying a second access to the point-of-entry device to deny the user from exiting the enclosure thereby confining the user to the enclosure.

US Pat. No. 10,970,947

SYSTEM AND METHOD TO PROVIDE A REAR WINDOW WIPER USAGE NOTIFICATION

GM GLOBAL TECHNOLOGY OPER...

1. A method to provide a rear window wiper notification, the method comprising:when a rear window wiper is in an active state, via a processor, establishing a time duration wherein the time duration is one of a long time duration and a short time duration;
when a rearview mirror is set to a camera mode, via the processor, selecting the short time duration; otherwise, via the processor, selecting the long time duration; and
when the rear window wiper remains in the active state throughout the time duration and after a conclusion of the time duration, via the processor, providing the rear window wiper notification in an interior cabin of a vehicle.

US Pat. No. 10,970,946

INFORMATION ACQUISITION APPARATUS, INFORMATION ACQUISITION METHOD, AND PROGRAM

TOYOTA JIDOSHA KABUSHIKI ...

1. An apparatus comprising:circuitry configured to:
calculate first information regarding a first actual use state of a vehicle in a first period based on displacement information of a sprung part of the vehicle in a height direction acquired by the vehicle; and
calculate second information regarding a second actual use state of the vehicle in a second period that includes a plurality of the first periods,
wherein the second information is calculated using the first information of the plurality of first periods.

US Pat. No. 10,970,945

CONTROL APPARATUS FOR VEHICLE AND CONTROL METHOD FOR VEHICLE

SUBARU CORPORATION, Toky...

1. A control apparatus for a vehicle, the apparatus comprising:tire-force sensors provided on respective wheels of the vehicle;
a tire-force estimator configured to estimate tire forces of the respective wheels on a basis of sensor signals outputted from the respective tire-force sensors;
a turning-state detector configured to detect a turning state of the vehicle; and
a warning unit configured to generate a warning in a mode that differs depending on the turning state when at least one of the wheels is estimated to be in a limit state on a basis of the estimated tire forces.

US Pat. No. 10,970,944

SYSTEMS AND METHODS FOR TRANSPORTATION SERVICE SAFETY ASSESSMENT

BEIJING DIDI INFINITY TEC...

1. A system configured to assess a service order, comprising:at least one server operating an online transportation service platform;
at least one database in communication with the at least one server, including information of a plurality of service requestors and a plurality of service providers of a transportation service;
a plurality of authorized service requestor terminals associated with the plurality of service requestors, in communication with the at least one server via at least one network, wherein the plurality of requestor terminals connects to a positioning system to obtain positions of the plurality of requestor terminals and transmit the positions to the at least one server;
a plurality of authorized provider terminals associated with the plurality of service providers, in communication with the at least one server via the at least one network, wherein the plurality of provider terminals connects to the positioning system to obtain positions of the plurality of provider terminals and transmit the positions to the at least one server;
wherein the at least one server, during operation, further:
loads a set of instructions for providing an online transportation service from a storage device;
receives electronic signals including a plurality of service orders for the online transportation service from the plurality of requestor terminals and the plurality of provider terminals;
for each service order of the plurality of service orders, operates logic circuits in the server to:
determine whether the service order starts;
in response to the determination that the service order starts, obtain transportation service data of the service order and first historical data relating to the service requestor and the service provider of the service order;
determine, based on the transportation service data and the first historical data, at least one safety score for the service order at a current time point; and
assess, based on the at least one safety score, a dangerous condition of the service order at the current time point for the service requestor or the service provider.

US Pat. No. 10,970,943

METHOD AND APPARATUS FOR A VEHICLE FORCE INDICATOR

1. A method of tracking tilt in a motor vehicle, the method comprising:a. registering an at rest state of the motor vehicle with one or both of an electromechanical device and a solid state device;
b. determining an amount of velocity of the motor vehicle at a given time interval;
c. determining an amount of tilt of the motor vehicle during the given time interval;
d. transmitting an indication of an amount of tilt experienced by the vehicle during motion of the motor vehicle; and
e. displaying an indication of the amount of tilt on a human discernable display device, wherein no artificial delay is introduced between the time of determining the amount of tilt of the motor vehicle during motion and the display of the indication of the amount of tilt.

US Pat. No. 10,970,942

FOG DATA AGENT FOR CONNECTED CARS

Wistron Aiedge Corporatio...

1. A fog data agent for a car, comprising:a connector configured to couple to an on-board diagnostic port of the car;
a microprocessor;
a wireless communication interface coupled to the microprocessor;
a cellular communication interface coupled to the microprocessor;
a data storage device coupled to the microprocessor; and
logic configured to form a fog network via the wireless communication and cellular communication interfaces, receive data from the on-board diagnostic port, wherein the received data includes at least one of tire pressure, engine coolant temperature, engine speed, throttle position, cam position, fuel pressure, fuel level, fuel temperature, air flow, air-fuel ratio, hall effect, vehicle speed, airbag, automatic transmission speed, manifold absolute pressure, oil level, oil pressure, spark knock monitoring, oxygen, navigational and GPS, outside temperature, inside temperature, radars, LiDARs (Light Detection and Ranging), imaging data, analyze the received data in real time, and establish a bi-directional communication channel with a remote server via the fog network to transmit at least a subset of the received data to the remote server in response to the data analysis when there is sufficient bandwidth to transmit at least the subset of the received data.

US Pat. No. 10,970,941

ALL SEEING ONE CAMERA SYSTEM FOR ELECTRONIC TOLLING

Raytheon Company, Waltha...

1. A method of electronic tolling for a vehicle travelling on a road surface, the method comprising:arranging a single camera or a single array of cameras having a rectilinear wide-angle lens that provides a field-of-view that is normal to the road surface and is defined by a plane that is parallel with the road surface;
arranging a light source to provide uniform light throughout the field-of-view;
capturing a front image and a rear image of the vehicle traveling along the road surface in the field-of-view using the single camera or the single array of cameras;
determining a classification of the vehicle based on the captured front image and the rear image; and
determining a toll for the vehicle based on the classification of the vehicle.

US Pat. No. 10,970,940

TICKET CHECKING DEVICE, TICKET CHECKING METHOD, AND OCCUPANT SEAT POSITION

BOE TECHNOLOGY GROUP CO.,...

4. A ticket checking device being configured on a seat, the ticket checking device comprising:an identifier configured to identify ticket surface information of a ticket presented by a user;
a control unit configured to compare the ticket surface information identified by the identifier with preset ticket surface information corresponding to the seat, wherein the control unit is configured to send alarm information in response to the ticket surface information identified by the identifier being inconsistent with the preset ticket surface information; and
an electronic tag configured to display the preset ticket surface information corresponding to the seat.

US Pat. No. 10,970,939

APPARATUS AND METHOD FOR TRANSFERRING GARMENT DRAPING BETWEEN AVATARS

CLO Virtual Fashion Inc.,...

1. A method of transferring garment draping between avatars, the method comprising:preparing a virtual reference garment modeled with meshes having a plurality of vertices connected to each other;
draping the virtual reference garment on a source avatar with meshes of the virtual reference garment adhered to surfaces of the source avatar;
draping the virtual reference garment on a target avatar with the meshes of the virtual reference garment adhered to surfaces of the target avatar;
obtaining a correspondence map indicating correspondence relationships between the meshes of the virtual reference garment draped on the source avatar and the corresponding meshes of the virtual reference garment draped on the target avatar;
determining avatar deformation as deformation transformations between the meshes of the virtual reference garment draped on the source avatar and the corresponding meshes of the virtual reference garment draped on the target avatar as defined by the correspondence map; and
transfer-and-draping a target garment draped on the source avatar, the target garment being distinct from the virtual reference garment, onto the target avatar based on the avatar deformation.
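
To make the correspondence-map idea concrete, the toy Python sketch below applies a per-mesh deformation (reduced here to a translation) from the source-avatar drape to the target-avatar drape; real garment transfer would use full per-mesh transformations, so this is only a schematic with invented coordinates.

# Centroids of reference-garment meshes draped on the source and target avatars.
source_ref_meshes = {0: (0.0, 1.0, 0.0), 1: (0.0, 1.5, 0.0)}
target_ref_meshes = {0: (0.0, 1.1, 0.1), 1: (0.0, 1.7, 0.1)}

correspondence_map = {0: 0, 1: 1}  # source mesh id -> target mesh id

def mesh_deformation(src_id):
    """Deformation transformation for one mesh pair (translation only here)."""
    sx, sy, sz = source_ref_meshes[src_id]
    tx, ty, tz = target_ref_meshes[correspondence_map[src_id]]
    return (tx - sx, ty - sy, tz - sz)

def transfer_vertex(vertex, nearest_ref_mesh_id):
    # Move a target-garment vertex by the deformation of its associated mesh.
    dx, dy, dz = mesh_deformation(nearest_ref_mesh_id)
    x, y, z = vertex
    return (x + dx, y + dy, z + dz)

# A target-garment vertex draped on the source avatar, associated with mesh 1.
print(transfer_vertex((0.2, 1.4, 0.05), 1))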

US Pat. No. 10,970,938

METHOD AND APPARATUS FOR GENERATING 3D INFORMATION

Baidu Online Network Tech...

1. A method for generating information, comprising:selecting a three-dimensional object model from a preset three-dimensional object model set based on a to-be-matched object image in a target two-dimensional image;
determining, based on a normal vector of a ground plane of the target two-dimensional image, a plane equation of ground corresponding to the normal vector of the ground plane in a three-dimensional space;
adjusting a rotation parameter and a translation parameter of the three-dimensional object model in a plane characterized by the plane equation; and
generating, in response to determining that a contour of the adjusted three-dimensional object model matches a contour of the to-be-matched object image in the target two-dimensional image, three-dimensional information of an object corresponding to the to-be-matched object image based on the adjusted three-dimensional object model.
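
The ground-plane constraint in this claim can be sketched as a plane equation a*x + b*y + c*z + d = 0 built from the ground-plane normal, with the model pose restricted to in-plane translation and rotation; the Python below is a hypothetical illustration with invented coordinates.

# Ground-plane normal in 3D coordinates and a known point on the ground
# (both illustrative). The plane is a*x + b*y + c*z + d = 0 with (a, b, c) = normal.
normal = (0.0, 1.0, 0.05)
point_on_ground = (0.0, -1.6, 5.0)
d = -sum(n * p for n, p in zip(normal, point_on_ground))

def on_ground(x, z, yaw_deg, translation):
    """Constrain a 3D object-model pose to the ground plane: solve the plane
    equation for y, allowing only in-plane translation and rotation (yaw)."""
    tx, tz = translation
    x, z = x + tx, z + tz
    a, b, c = normal
    y = -(a * x + c * z + d) / b          # keep the model on the plane
    return {"position": (x, y, z), "yaw": yaw_deg % 360}

print(on_ground(1.0, 6.0, yaw_deg=30, translation=(0.5, -0.5)))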

US Pat. No. 10,970,937

TECHNOLOGIES FOR VIRTUAL ATTRIBUTE ASSIGNMENT REFERENCING REAL OBJECTS

Intel Corporation, Santa...

1. A compute device for virtual attribute assignment, the compute device comprising:an augmented reality renderer to render one or more virtual objects in an augmented reality space; and
a user input analyzer to:
receive an attribute assignment command from a user that identifies a user-selected virtual object, a user-referenced attribute of the user-selected virtual object, a user-selected real object, and a user-referenced attribute of the user-selected real object;
analyze the attribute assignment command to determine:
the user-selected virtual object identified by the user in the attribute assignment command, wherein the user-selected virtual object identified in the attribute assignment command has a plurality of user-assignable attributes;
the user-referenced attribute of the user-selected virtual object, wherein the user-referenced attribute of the user-selected virtual object is one of the plurality of user-assignable attributes of the user-selected virtual object;
the user-selected real object identified by the user in the attribute assignment command; and
the user-referenced attribute of the user-selected real object, wherein the user-referenced attribute of the user-selected real object is identified by the user in the attribute assignment command;
determine, in response to the attribute assignment command, a state of the user-referenced attribute of the user-selected real object; and
copy, in response to the attribute assignment command, the state of the user-referenced attribute of the user-selected real object to a state of the user-referenced attribute of the user-selected virtual object.

US Pat. No. 10,970,936

USE OF NEUROMUSCULAR SIGNALS TO PROVIDE ENHANCED INTERACTIONS WITH PHYSICAL OBJECTS IN AN AUGMENTED REALITY ENVIRONMENT

Facebook Technologies, LL...

1. A computerized system for interacting with a physical object in an extended reality (XR) environment generated by an XR system, the computerized system comprising:a plurality of neuromuscular sensors configured to sense a plurality of neuromuscular signals from a user, wherein the plurality of neuromuscular sensors are arranged on one or more wearable devices worn by the user to sense the plurality of neuromuscular signals; and
at least one computer processor programmed to:
determine, based at least in part on the plurality of neuromuscular signals sensed by the plurality of neuromuscular sensors, information relating to an interaction of the user with the physical object in the XR environment generated by the XR system;
instruct the XR system to provide feedback based, at least in part, on the information relating to the interaction of the user with the physical object; and
instruct the XR system to provide the feedback to the user, wherein:
the feedback provided to the user comprises an indication of an amount of force applied to the physical object by the user, and
the amount of force is determined based, at least in part, on the plurality of neuromuscular signals and image data associated with the force applied to the physical object, wherein the image data comprises images captured using visible and non-visible light.

US Pat. No. 10,970,935

BODY POSE MESSAGE SYSTEM

1. A method comprising:capturing a first image using a sensor of a hybrid reality (HR) system worn or carried by a user;
detecting, by a computing device, an individual, different than the user, in the first image;
ascertaining, by the computing device, a first situation of at least one body part of the individual in 3D space at a first time using the first image;
recognizing, by the computing device, a body pose of the individual based on the first situation of the at least one body part of the individual;
determining, by the computing device, information for the user to relay to the individual based on the recognized body pose of the individual;
selecting a response body pose that corresponds to the information to relay to the individual; and
prompting the user using the HR system to perform the response body pose to relay the information to the individual.

US Pat. No. 10,970,934

INTEGRATED OPERATING ENVIRONMENT

Roam Holdings, LLC, Nixa...

1. An integrated operating environment system, comprising:a central processor;
at least one virtual reality (VR) device in data communication with the central processor, each VR device adapted to allow a VR user to navigate a virtual environment; and
at least one augmented reality (AR) device in data communication with the central processor, each AR device adapted to allow an AR user to view AR objects in a real-world environment;
wherein, the central processor:
receives data from each VR device and each AR device;
assigns each AR device and each VR device a unique tracking identifier;
compiles the received data into a real-time virtual construction;
disseminates the real-time virtual construction to each VR device and each AR device; and
positions and moves a virtual representation of each AR device and a virtual representation of each VR device in each of the virtual environment and real-world environment relative to each other virtual representation of the AR device or virtual representation of the VR device such that each AR device and each VR device can interact with and see each other AR device and each other VR device in real time.

US Pat. No. 10,970,933

DEVICE, SYSTEM AND METHOD FOR EMBEDDING ONE OR MORE ATTRIBUTES IN A GRAPHICAL OBJECT

1. A system comprising:a user device configured to obtain a graphical object and place it in a data packet;
a server communicatively coupled to an application program on the user device, wherein the server comprises:
a data packet processor that processes an image from an application program image and an attached video into a binary format and codes any video into different versions for viewing by Android, iOS or other formats;
a marker subsystem configured to create at least one marker on the graphical object;
a user ID database containing settings, preferences and attributes which are applied to the data packet;
a video re-sizer and thumbnail creator;
an embedding subsystem operatively coupled to the marker system and configured to embed one or more attributes in the graphical object, wherein the one or more attributes comprises a video, an augmented reality video, a 3-dimensional content, a geolocation, a hyperlink or a text;
a queue manager operatively coupled to the embedding subsystem and configured to process and upload the graphical object and the embedded attributes, wherein processing and uploading the embedded graphical object comprises processing and uploading the embedded graphical object in a user profile;
a permission subsystem configured to impose a viewing permission in the embedding subsystem and embed the viewing permission in the graphical object; and
an alert subsystem configured to generate a notification to the application program, upon uploading of the embedded graphical object.

US Pat. No. 10,970,932

PROVISION OF VIRTUAL REALITY CONTENT

Nokia Technologies Oy, E...

1. A method comprising:providing data indicative of dimensions of a real-world space within which a virtual world is to be consumed;
identifying one or more objects within said real-world space, wherein the one or more objects comprise real-world object(s);
determining one or more available areas within the real-world space for rendering three-dimensional virtual content, based at least partly on the indicated dimensions of the real-world space;
identifying one or more of the one or more objects as being movable with an action of a virtual reality user within said real-world space, wherein identifying the one or more movable objects comprises assigning a mobility score to each of the one or more objects indicative of whether or not it is movable with the action of the virtual reality user, where an object is identified as movable with the action of the virtual reality user where the respective mobility score is above a predetermined threshold;
identifying, from a set of three-dimensional virtual content items, one or more candidate items unable to be rendered within the one or more available areas unless at least one of the one or more moveable objects is moved; and
providing an indication of the one or more candidate virtual items and of the one or more movable objects required to be moved with the action of the virtual reality user to be able to render the one or more candidate virtual items.

US Pat. No. 10,970,931

METHOD FOR TRANSMITTING VIRTUAL REALITY IMAGE CREATED BASED ON IMAGE DIRECTION DATA, AND COMPUTER READABLE MEDIUM STORING PROGRAM USING THE SAME

CLICKED INC., Seoul (KR)...

1. A method for transmitting a virtual reality image, the method comprising:generating, by a server, an initial image frame;
determining, by the server, image direction data on a direction in a virtual 3-dimensional space of the generated initial image frame;
creating a first final image frame configured to be displayed at a first time point, by coupling the image direction data to the initial image frame as meta information;
transmitting the first final image frame to a client through wireless communication, wherein the client is a device that reproduces the first final image frame corresponding to the virtual reality image;
displaying, by the client, the first final image frame at the first time point;
detecting, by the client, whether a second final image frame configured to be displayed at a second time point is received at the client, wherein the second time point is a time point after a transmission cycle from the first time point;
when the second final image frame is received at the client, displaying, by the client, the second final image frame at the second time point; and
only when the second final image frame is not received at the client, performing:
calculating, by the client, a difference value between the image direction data, which is determined by the server based on the direction in the virtual 3-dimensional space of the initial image frame generated by the server, and reproduction direction data, which is generated by the client based on sensing data acquired by the client at the second time point;
correcting, by the client, the first final image frame based on the calculated difference value, to generate a corrected image frame, wherein the corrected image frame includes a marginal area that is generated due to a movement of the first final image based on the calculated difference value; and
displaying, by the client, the corrected image frame, wherein the marginal area is displayed as a black color area or a white color area.
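For illustration only, a minimal sketch of the client-side fallback correction this claim describes, assuming a simple linear mapping from the yaw/pitch difference to a pixel shift; the function name correct_frame, the px_per_degree parameter, and the angle convention are hypothetical and not from the patent.

import numpy as np

def correct_frame(frame, image_dir, reproduction_dir, px_per_degree=10):
    # Shift the last received frame by the difference between the server's
    # image direction data and the client's reproduction direction data,
    # leaving the exposed marginal area black (a white fill would also match
    # the claim).
    d_yaw = reproduction_dir[0] - image_dir[0]        # degrees
    d_pitch = reproduction_dir[1] - image_dir[1]
    dx = int(round(d_yaw * px_per_degree))
    dy = int(round(d_pitch * px_per_degree))
    h, w = frame.shape[:2]
    corrected = np.zeros_like(frame)                  # marginal area stays black
    src_x = slice(max(0, -dx), min(w, w - dx))
    src_y = slice(max(0, -dy), min(h, h - dy))
    dst_x = slice(max(0, dx), min(w, w + dx))
    dst_y = slice(max(0, dy), min(h, h + dy))
    corrected[dst_y, dst_x] = frame[src_y, src_x]
    return corrected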

US Pat. No. 10,970,930

ALIGNMENT AND CONCURRENT PRESENTATION OF GUIDE DEVICE VIDEO AND ENHANCEMENTS

Amazon Technologies, Inc....

1. A computer-implemented method, comprising:determining that a guide device is positioned at a first known location within an environment;
processing guide device video data generated by the guide device while the guide device is positioned at the first known location within the environment to determine a marker represented in the guide device video data, the marker positioned at a second known location within the environment;
generating alignment data that aligns first pixels of the guide device video data corresponding to the marker with position data of the marker in a visual mapping of the environment such that pixels of the guide device video data are aligned with a representation of the marker within the visual mapping;
sending, to a user device, the guide device video data, at least a portion of the visual mapping, and the alignment data;
aligning, based at least in part on the alignment data, second pixels of a portion of the guide device video data that is representative of a first portion of the environment with a first corresponding portion of the visual mapping that is representative of the environment; and
causing, on the user device, a concurrent presentation of the portion of the guide device video data that is representative of the first portion of the environment and the at least a portion of visual mapping representative of the first corresponding portion of the environment such that the portion of the guide device video data that is representative of the first portion of the environment is presented in alignment with the at least a portion of the visual mapping representative of the first corresponding portion of the environment.

US Pat. No. 10,970,929

BOUNDARY DETECTION USING VISION-BASED FEATURE MAPPING

Occipital, Inc., Boulder...

1. A system comprising:one or more processors;
non-transitory computer-readable media storing computer-executable instructions, which when executed by the one or more processors cause the one or more processors to:
generate point data including a set of safe points in a space and a set of unsafe points in the space, the space surrounding a user device of the system;
generate a triangulation over a union of the set of safe points and the set of unsafe points;
determine triangles of the triangulation that include at least one safe point;
determine edges of determined triangles which are part of a single triangle that includes at least one safe point; and
determine one or more boundaries of the space using the determined edges.
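For illustration, a minimal sketch of the boundary extraction the claim recites, assuming 2-D points and a Delaunay triangulation from scipy; the names safe_pts, unsafe_pts, and boundary_edges are illustrative, not from the patent.

import numpy as np
from scipy.spatial import Delaunay

def boundary_edges(safe_pts, unsafe_pts):
    pts = np.vstack([safe_pts, unsafe_pts])
    n_safe = len(safe_pts)                         # indices below n_safe are safe points
    tri = Delaunay(pts)                            # triangulation over the union of both sets
    edge_count = {}
    for simplex in tri.simplices:
        if not any(v < n_safe for v in simplex):   # keep only triangles with >= 1 safe point
            continue
        for a, b in ((0, 1), (1, 2), (2, 0)):
            edge = tuple(sorted((simplex[a], simplex[b])))
            edge_count[edge] = edge_count.get(edge, 0) + 1
    # edges used by exactly one kept triangle approximate the space boundary
    return [edge for edge, count in edge_count.items() if count == 1]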

US Pat. No. 10,970,928

INCIDENT SITE INVESTIGATION AND MANAGEMENT SUPPORT SYSTEM BASED ON UNMANNED AERIAL VEHICLES

1. A method for generating incident site data based on a reconstructed 3D model of the incident site and input data from one or more measurement tools comprising:a. causing generation of the reconstructed 3D model based on one or more object models of objects at the incident site and a current state of the objects;
b. causing generation of the incident site data based on:
(i) receipt of a high resolution (HR) line with point-to-point data, in 3D point cloud, selected with the one or more measurement tools being applied to an interface displaying a visualization of the reconstructed 3D model;
(ii) receipt of a high resolution (HR) area based on a drawn boundary and selected points, in 3D point cloud, selected with the one or more measurement tools being applied to the interface;
(iii) generation of a high resolution (HR) volume based on an undamaged object template in 3D point cloud;
(iv) generation of a high resolution (HR) object model based on a static 3D infrastructure model; and
c. identifying differences between volumes in the reconstructed 3D model and volumes in the high resolution (HR) object model to identify damage at the incident site.

US Pat. No. 10,970,927

POSITIONABLE EMISSIONS CONTROL WATERCRAFT

1. A positionable emissions control watercraft consisting of:a. a floating platform;
b. a plurality of spuds attached to said floating platform for anchoring said floating platform at a predetermined orientation relative to a serviced watercraft at berth, each spud including a spud drive and a spud column which is configured to be raised or lowered by the spud drive, wherein the floating platform is movable along the water with the spud columns in a raised position and fixed in a desired position fixed relative to the serviced watercraft with the spud columns lowered to the seabed, wherein the floating platform is fixed in said desired position relative to the serviced watercraft without the use of floating fenders, spacers, or mooring lines and devices;
c. an exhaust capture system for receiving emissions from said serviced watercraft when the floating platform has been fixed in the desired position relative to the serviced at-berth watercraft;
d. a purification system connected to said exhaust capture system for accepting and purifying said emissions from said exhaust capture system when the floating platform has been fixed in the desired position relative to the serviced at-berth watercraft;
e. said exhaust capture system comprising an arm and an exhaust pipe connector configured to connect to an exhaust pipe of the serviced at-berth watercraft;
whereby said positionable emissions control watercraft may be placed any distance away from said serviced watercraft which allows the interconnection of said exhaust capture system to an exhaust pipe of said serviced watercraft for operation of the purification system to accept and purify said emissions.

US Pat. No. 10,970,926

SYSTEM AND METHOD FOR LUNG-VOLUME-GATED X-RAY IMAGING

DATA INTEGRITY ADVISORS, ...

1. A method, comprising:positioning a patient at a first orientation relative to an x-ray imaging apparatus;
obtaining a volumetric measurement of the patient's breathing;
while obtaining the volumetric measurement of the patient's breathing:
determining, based on the volumetric measurement of the patient's breathing, a breathing phase of the patient;
gating the x-ray imaging apparatus, based on the patient's breathing phase as determined from the volumetric measurement, to obtain multiple x-ray measurements corresponding to different breathing phases of the lung; and
extracting multiple displacement fields of lung tissue from the multiple x-ray measurements corresponding to different breathing phases of the lung, wherein each displacement field represents movement of the lung tissue from a first breathing phase to a second breathing phase and each breathing phase has a corresponding set of biometric parameters.

US Pat. No. 10,970,925

METHOD FOR CREATING A CURVED COVERING FROM FLAT MATERIAL

BIGGIE INC., Thousand Oa...

1. A method, comprising:providing a three-dimensional representation of a doubly curved surface as a smooth function or triangulated mesh;
determining a map of principal curvatures of the three-dimensional representation of the doubly curved surface;
cutting a flat, developable surface into a set of panels, wherein each panel of the set of panels is represented by a three-dimensional triangulated mesh;
introducing cuts to the set of panels primarily oriented along a direction of maximum principal curvature identified by the map of principal curvatures of the three-dimensional representation of the doubly curved surface; and
creating a two-dimensional approximation of the set of panels by representing each panel of the set of panels as a triangulated mesh that is topologically equivalent to the three-dimensional triangulated meshes that represent each of the panels.

US Pat. No. 10,970,924

RECONSTRUCTION OF A SCENE FROM A MOVING CAMERA

Foresight AI Inc., Los G...

1. A method comprising:directing an aerial drone to position at a key frame location;
capturing a key frame image of an object of a scene using a camera carried by the aerial drone at the key frame location;
recording location data of the aerial drone at the key frame location as the location data at which the key frame image is captured;
directing the aerial drone to move at a first radial direction away from the key frame location to a first displaced location;
capturing a first additional image of the object from the first displaced location;
recording first additional location data of the aerial drone at the first displaced location as the first additional location data at which the first additional image is captured, the first additional location data recorded as different data than the location data at which the key frame image is captured;
returning the aerial drone to the key frame location from the first displaced location;
directing the aerial drone to move at a second radial direction away from the key frame location to a second displaced location, the second radial direction different than the first radial direction;
capturing a second additional image of the object from the second displaced location;
recording second additional location data of the aerial drone at the second displaced location as the second additional location data at which the second additional image is captured, the second additional location data recorded as different data than the location data at which the key frame image is captured and as different data than the first additional location data at which the first additional image is captured.

US Pat. No. 10,970,923

METHOD AND SYSTEM FOR VIRTUAL AREA VISUALIZATION

State Farm Mutual Automob...

1. A computer-implemented method of visualizing overall regions, the method comprising:obtaining, by a server, a first set of image data indicative of an overall region, the first set of image data captured by a remote imaging vehicle during a first period of time;
generating, by the server, a first virtual model of the overall region based on the first set of image data;
determining, by the server and using the first virtual model, one or more virtual coordinates associated with one or more locations within the overall region;
determining, by the server, and based on the one or more virtual coordinates and customer information of one or more customers, that the one or more locations are associated with the one or more customers;
linking, based at least in part on determining that the one or more locations are associated with the one or more customers, the customer information to the first virtual model;
providing, by the server, a virtual environment including the first virtual model of the overall region for rendering by a user electronic device;
receiving, from the user electronic device, a first request from a user of the user electronic device interacting with the first virtual model in the virtual environment to capture additional image data of an indicated area within the overall region, the indicated area being associated with a customer of the one or more customers;
transmitting, to a remote control client, a second request to dispatch the remote imaging vehicle to capture a second set of image data, wherein the second set of image data includes image data representative of the indicated area within the overall region;
obtaining, by the server, the second set of image data;
generating, by the server, a second virtual model for the indicated area within the overall region based on the second set of image data, wherein the second virtual model for the indicated area within the overall region has a higher resolution than the first virtual model for the overall region;
integrating, by the server, the second virtual model for the indicated area within the overall region into the virtual environment to form an updated virtual environment; and
providing, by the server, the updated virtual environment for rendering by the user electronic device such that the updated virtual environment toggles between display of the first virtual model and display of the second virtual model as a zoom level of rendering the updated virtual environment crosses a threshold zoom level.

US Pat. No. 10,970,922

GENERATING A 3D MODEL OF A FINGERTIP FOR VISUAL TOUCH DETECTION

Apple Inc., Cupertino, C...

1. A computer readable medium comprising computer readable code executable by one or more processors to:determine, based on sensor data from a touch sensor on a first device, a touch event, wherein the touch event comprises a touch on the first device by a touching object, and wherein the touch event occurs during a first time;
in response to a touch event:
obtain a first image of the touching object by a first camera of the first device during the first time, and
obtain, from a second device, a second image of the touching object captured during the first time,
wherein the first image of the touching object captures a first view of the touching object, and wherein the second image of the touching object captures a second view of the touching object; and
cause a model of the touching object to be generated based on the first image and the second image.

US Pat. No. 10,970,921

APPARATUS AND METHOD FOR CONSTRUCTING A VIRTUAL 3D MODEL FROM A 2D ULTRASOUND VIDEO

UNIVERSITY HOSPITALS CLEV...

1. A method for creating a three-dimensional image of an object from a two-dimensional ultrasound video, the method comprising:acquiring a plurality of two-dimensional ultrasound images of the object;
recording a plurality of videos based on the acquired two-dimensional ultrasound images, each of the plurality of videos comprising a plurality of frames;
separating each of the plurality of frames;
cropping each of the plurality of frames to isolate structures intended to be reconstructed;
selecting a frame near a center of the object and rotating the selected frame to create a main horizontal landmark;
aligning each of the plurality of frames to the main horizontal landmark; and
stacking each of the aligned plurality of frames into a three-dimensional volume.
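A minimal sketch, for illustration, of the crop / rotate / align / stack sequence in this claim, assuming OpenCV for the rotation; the crop box and landmark angle inputs are hypothetical stand-ins for the frame-selection steps.

import numpy as np
import cv2

def build_volume(frames, crop_box, landmark_angle_deg):
    x, y, w, h = crop_box
    aligned = []
    for frame in frames:
        roi = frame[y:y + h, x:x + w]                        # crop to the structure of interest
        rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), landmark_angle_deg, 1.0)
        aligned.append(cv2.warpAffine(roi, rot, (w, h)))     # align to the horizontal landmark
    return np.stack(aligned, axis=0)                         # stack frames into a 3-D volume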

US Pat. No. 10,970,920

SYSTEMS AND METHODS FOR RAY-TRACED SHADOWS OF TRANSPARENT OBJECTS

Electronic Arts Inc., Re...

1. A method for rendering shadows in a scene, the method comprising:selecting, by one or more processors, a pixel in an image of a scene, wherein the pixel corresponds to a point on a first object that is visible to a camera capturing the image of the scene;
initializing, by the one or more processors, an occlusion value for the pixel;
performing, by the one or more processors, ray tracing by launching a first ray in the direction of a light source from the point on the first object corresponding to the pixel;
determining, by the one or more processors, that the first ray intersects a second object;
performing, by the one or more processors, ray tracing by launching a second ray from a point of intersection of the first ray with the second object, wherein a direction of the second ray is based on reflection or refraction of the first ray from a surface of the second object;
generating, by the one or more processors, an updated occlusion value based on accumulating lighting information of the second object with the occlusion value;
adjusting, by the one or more processors, the updated occlusion value to generate an adjusted occlusion value, wherein the adjusting is based on a divergence between a direction of the first ray and a direction of the second ray, wherein adjusting the updated occlusion value to generate the adjusted occlusion value is based on applying a factor to the updated occlusion value according to an equation based on the direction of the first ray and the direction of the second ray, wherein the equation is a monotonically increasing function; and
storing, by the one or more processors, the adjusted occlusion value in a buffer at a location corresponding to the pixel, wherein the buffer includes shadow information that is applied to the image of the scene.
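For illustration, a sketch of the occlusion accumulation and divergence-based adjustment described above; the transmission input and the particular monotonically increasing factor are illustrative choices, not the patent's equation.

import numpy as np

def shadow_occlusion(first_ray_dir, second_ray_dir, occluder_transmission):
    occlusion = 0.0                                             # initialized occlusion value
    occlusion += 1.0 - occluder_transmission                    # accumulate the occluder's lighting information
    d1 = first_ray_dir / np.linalg.norm(first_ray_dir)
    d2 = second_ray_dir / np.linalg.norm(second_ray_dir)
    divergence = np.arccos(np.clip(np.dot(d1, d2), -1.0, 1.0))  # angle between the two rays
    factor = 1.0 + divergence / np.pi                           # illustrative monotonically increasing function
    return min(1.0, occlusion * factor)                         # adjusted value stored per pixel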

US Pat. No. 10,970,919

METHOD OF DETERMINING AN ILLUMINATION EFFECT OF A VOLUMETRIC DATASET

SIEMENS HEALTHCARE GMBH, ...

1. A method of determining an illumination effect value of a volumetric dataset, the method comprising:determining, by one or more processors, one or more first parameter values based on the volumetric dataset, the one or more first parameter values relating to one or more properties of the volumetric dataset at a sample point; and
determining, by the one or more processors, an illumination effect value relating to an illumination effect at the sample point by inputting the one or more first parameter values to an anisotropic illumination model, the illumination effect value defining a relationship between an amount of incoming light and an amount of outgoing light at the sample point, one or more second parameters of the anisotropic illumination model defining a major axis of an ellipse.

US Pat. No. 10,970,918

IMAGE PROCESSING METHOD AND APPARATUS USING A PIXELATED MASK IMAGE AND TERMINAL ORIENTATION FOR A REFLECTION EFFECT

HUAWEI TECHNOLOGIES CO., ...

1. An image processing method applied to a terminal, the image processing method comprising:obtaining a first pixelated image when a diffuse reflection object of the first pixelated image is illuminated by a light source;
obtaining a pixelated mask image corresponding to the diffuse reflection object;
obtaining orientation information of the terminal;
moving the pixelated mask image based on the orientation information to obtain a moved pixelated masked image;
generating a second pixelated image, wherein the second pixelated image comprises the diffuse reflection object and a diffuse reflection effect, by:
superposing the first pixelated image and the moved pixelated mask image;
calculating a color value obtained after superposing each pixel in the first pixelated image and each pixel at a corresponding position on the moved pixelated mask image;
calculating an after-superposition color value for a first pixel based on a color value of the first pixel and transparency of a pixel at a corresponding position on the moved pixelated mask image, wherein the first pixel is any pixel in the first pixelated image;
calculating the after-superposition color value for each pixel in the first pixelated image; and
before generating the second pixelated image:
obtaining a third pixelated image comprising a diffuse reflection object obtained when no light source illuminates the diffuse reflection object; and
calculating the after-superposition color value for the first pixel based on the color value of the first pixel, a color value of a pixel at a corresponding position on the third pixelated image, and the transparency of the pixel at the corresponding position on the moved pixelated mask image.
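A minimal sketch, assuming the superposition is a per-pixel alpha blend weighted by the moved mask's transparency between the lit (first) and unlit (third) pixelated images; the blend formula and the np.roll-based mask movement are illustrative readings of the claim, not the patent's exact math.

import numpy as np

def move_mask(mask_alpha, dx, dy):
    # Shift the pixelated mask according to the terminal orientation (pixels).
    return np.roll(np.roll(mask_alpha, dy, axis=0), dx, axis=1)

def superpose(lit_img, unlit_img, moved_mask_alpha):
    # lit_img / unlit_img: HxWx3 float arrays in [0, 1];
    # moved_mask_alpha: HxW transparency of the moved pixelated mask image.
    a = moved_mask_alpha[..., None]
    return np.clip(lit_img * a + unlit_img * (1.0 - a), 0.0, 1.0)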

US Pat. No. 10,970,917

DECOUPLED SHADING PIPELINE

Intel Corporation, Santa...

1. A portable computing device comprising:a general purpose processing unit; and
a graphics processing unit to:
partition an image into at least a first region and a second region, wherein the first region is a peripheral region of the image;
perform rendering in the first region at a first rate; and
perform rendering in the second region at a second rate higher than the first rate.
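For illustration, a sketch of rendering the two regions at different rates, assuming one shading sample per 2x2 block in the peripheral region and per-pixel shading in the central region; the shade callback and the fixed block size are illustrative.

import numpy as np

def render(shade, height, width, central_box):
    img = np.zeros((height, width, 3), dtype=np.float32)
    x0, y0, x1, y1 = central_box
    # First (peripheral) region: coarse rate, one shading sample per 2x2 block.
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            img[y:y + 2, x:x + 2] = shade(x, y)
    # Second (central) region: higher rate, one shading sample per pixel.
    for y in range(y0, y1):
        for x in range(x0, x1):
            img[y, x] = shade(x, y)
    return img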

US Pat. No. 10,970,916

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

SONY CORPORATION, Tokyo ...

1. An image processing apparatus, comprising:a processor configured to:
control reception of depth-related images of a plurality of viewpoints, wherein
the depth-related images comprise depth images as luminance components of the depth-related images and foreground images as color components of the depth-related images, and
the foreground images are silhouette images corresponding to silhouettes of a foreground; and
generate a three-dimensional (3D) model of the foreground based on the depth images as the luminance components of the depth-related images of the plurality of viewpoints and the foreground images as the color components of the depth-related images of the plurality of viewpoints.
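A minimal sketch of splitting a received depth-related image into its depth (luminance) and foreground-silhouette (color) components; the YCbCr channel layout and the near/far depth mapping are assumptions for illustration.

import numpy as np

def split_depth_related(ycbcr_img, near, far):
    y = ycbcr_img[..., 0].astype(np.float32) / 255.0   # luminance component -> depth image
    depth = near + y * (far - near)
    silhouette = ycbcr_img[..., 1] > 128                # color component -> foreground silhouette
    return depth, silhouette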

US Pat. No. 10,970,915

VIRTUAL VIEWPOINT SETTING APPARATUS THAT SETS A VIRTUAL VIEWPOINT ACCORDING TO A DETERMINED COMMON IMAGE CAPTURING AREA OF A PLURALITY OF IMAGE CAPTURING APPARATUSES, AND RELATED SETTING METHOD AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. A setting apparatus comprising:(A) one or more hardware processors; and
(B) one or more memories that store instructions executable by the one or more hardware processors to cause the setting apparatus:
(a) to acquire information indicating respective configurations of a plurality of image capturing apparatuses at different positions, each of the plurality of image capturing apparatuses capturing an image of at least a part of a target space;
(b) to determine, based on the acquired information, a common image capturing area that is included within each of a plurality of fields of view, of the plurality of image capturing apparatuses;
(c) to cause a graphical user interface (GUI), which is used for setting a movement path of a virtual viewpoint for generating a virtual viewpoint image representing a view from the virtual viewpoint on the set movement path based on a plurality of images including the common image capturing area captured by the plurality of image capturing apparatuses, to display an image of the target space where (i) a designated movement path of the virtual viewpoint in the target space, (ii) a designated movement path of a gaze point positioned in a viewing direction from the virtual viewpoint in the target space, and (iii) the determined common image capturing area in the target space are shown such that a positional relationship between the designated movement path of the virtual viewpoint and the determined common image capturing area is indicated and a positional relationship between the designated movement path of the gaze point and the determined common image capturing area is indicated, wherein quality of the virtual viewpoint image depends on the positional relationship between a position of the virtual viewpoint and the common image capturing area; and
(d) to set the movement path of the virtual viewpoint according to a user input obtained based on the GUI.

US Pat. No. 10,970,914

MULTIPLE PRECISION LEVEL INTERSECTION TESTING IN A RAY TRACING SYSTEM

Imagination Technologies ...

1. A ray-tracing system configured to perform intersection testing, comprising:a tester module for testing rays for intersection with a volume, the tester module being configured to receive a packet of one or more rays to be tested for intersection with the volume, wherein the tester module comprises:
a first set of one or more testers configured to perform intersection testing at a first level of precision to provide intersection testing results, wherein for a first type of the intersection testing result from the first set of one or more testers intersection testing does not need to be reperformed at a second level of precision greater than the first level of precision, and for a second type of the intersection testing result from the first set of one or more testers intersection testing is to be reperformed at the second level of precision; and
a second set of one or more testers configured to perform intersection testing at the second level of precision;
wherein the tester module is configured to:
allocate a ray from a received packet to one of the first set of testers for intersection testing at the first level of precision;
identify the type of an intersection testing result for the ray provided by said one of the first set of testers to determine whether intersection testing for the ray is to be reperformed at the second level of precision; and
if it is determined that intersection testing for the ray is to be reperformed at the second level of precision, allocate the ray to one of the second set of testers for intersection testing at the second level of precision.
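For illustration, a sketch of the two-level testing flow: a cheap conservative test first, with a full-precision retest only when the first result type calls for it. The padded slab test stands in for reduced-precision arithmetic and is an assumption, not the patent's tester design.

def slab_test(ray_o, ray_inv_d, box_min, box_max):
    tmin, tmax = 0.0, float("inf")
    for o, inv_d, lo, hi in zip(ray_o, ray_inv_d, box_min, box_max):
        t0, t1 = (lo - o) * inv_d, (hi - o) * inv_d
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def test_ray(ray_o, ray_inv_d, box_min, box_max, eps=1e-2):
    # First level: conservative test against a padded box (low precision).
    padded_min = [v - eps for v in box_min]
    padded_max = [v + eps for v in box_max]
    if not slab_test(ray_o, ray_inv_d, padded_min, padded_max):
        return "miss"            # first result type: no retest needed
    # Second result type: possible hit, so reperform at full precision.
    return "hit" if slab_test(ray_o, ray_inv_d, box_min, box_max) else "miss"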

US Pat. No. 10,970,913

SYSTEMS AND METHODS FOR TEXTURE-SPACE RAY TRACING OF TRANSPARENT AND TRANSLUCENT OBJECTS

Electronic Arts Inc., Re...

1. A method for rendering a translucent object in a scene, the method comprising:generating, by one or more processors, a first data structure that stores a surface normal value for each location on the translucent object, wherein the translucent object comprises a light-scattering medium;
generating, by the one or more processors, a second data structure that stores a position value in world space of the scene for each location on the translucent object;
selecting, by the one or more processors, a valid position value in the second data structure corresponding to a location on the surface of the translucent object;
generating, by the one or more processors, an inverted normal vector from the location on the surface of the translucent object corresponding to the valid position value;
identifying, by the one or more processors, a location inside the translucent object along the inverted normal vector;
performing, by the one or more processors, ray tracing by launching a plurality of rays from the location inside the translucent object along the inverted normal vector;
determining, by the one or more processors, for each ray of the plurality of rays, a lighting result from an intersection of the ray with a surface of the translucent object;
aggregating, by the one or more processors, the lighting results corresponding to the plurality of rays into an aggregated lighting value; and
storing, by the one or more processors, the aggregated lighting value for the valid position value in a texture map, wherein the texture map is applied to geometry corresponding to the translucent object to render the translucent object.

US Pat. No. 10,970,912

3-D GRAPHICS RENDERING WITH IMPLICIT GEOMETRY

Imagination Technologies ...

1. A computer-implemented method of testing a ray for intersection with an implicit surface in a 3-D space of a computer graphics scene to be rendered, comprising:entering, by a processor, a surface of a shell bounding a 3-D volume in said 3-D space with a ray, the shell defining a maximum extent for implicitly-defined geometry within the shell;
iteratively stepping, by a processor, a current 3-D position of the ray along its path through the 3-D volume defined by the shell;
for each current 3-D position, by a processor
projecting the current 3-D position of the ray to a current position on an explicitly-defined 2-D surface in said 3-D space and bounded in the shell,
producing data for the implicitly-defined geometry using the current position on the explicitly-defined 2-D surface in said 3-D space, and
characterizing the ray as either hitting or missing the implicitly-defined geometry at the current 3-D position in said 3-D space, using the produced data;
wherein the ray characterization is used in rendering of said scene on a visual display.
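A minimal sketch of the shell-bounded ray march, assuming the explicitly-defined 2-D surface is the plane z = 0 and the implicit geometry is a height field over it; the step size, height_fn callback, and projection are illustrative.

import numpy as np

def trace_implicit(ray_o, ray_d, height_fn, t_enter, t_exit, step=0.01):
    ray_d = ray_d / np.linalg.norm(ray_d)
    t = t_enter                                   # ray enters the bounding shell here
    while t <= t_exit:                            # iterate current 3-D positions inside the shell
        p = ray_o + t * ray_d
        u, v = p[0], p[1]                         # project onto the explicit 2-D surface
        if p[2] <= height_fn(u, v):               # data produced for the implicit geometry
            return True, t                        # characterize as a hit at this position
        t += step
    return False, None                            # left the shell without a hit: miss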

US Pat. No. 10,970,911

GRAPHICS PROCESSING CHIP WITH MACHINE-LEARNING BASED SHADER

Facebook Technologies, LL...

1. A graphics processing chip, comprising:a controller configured to manage operations of the graphics processing chip in accordance with a graphics-rendering pipeline, the operations comprising geometry-processing operations, rasterization operations, and shading operations;
at least one programmable memory component configured to store a machine-learning model configured to perform at least a portion of the shading operations, wherein the machine-learning model is trained using a plurality of training samples, and wherein each of the plurality of training samples is associated with material property data associated with a virtual surface and target color data associated with a fragment;
a plurality of processing units configured to be selectively used to perform the shading operations in accordance with the machine-learning model; and
at least one output memory configured to store image data generated using the shading operations.

US Pat. No. 10,970,910

ANIMATION OF CONCEPTS IN PRINTED MATERIALS

INTERNATIONAL BUSINESS MA...

1. A method, comprising:tracking a reading position of a reader with respect to text of printed material;
detecting an area of the printed material of interest to the reader, wherein the detecting comprises correlating a reading position with an indication of interest provided by the reader, wherein the indication is based upon sentiment of the reader identified through at least one of: facial analysis, speech analysis, and gaze analysis, wherein the detecting an area of interest comprises determining the sentiment indicating the interest exceeds a predetermined threshold value;
identifying a concept corresponding to the area of interest, wherein the identifying a concept comprises (i) parsing the text of the area of interest and (ii) extracting at least one concept from the text, wherein the parsing comprises utilizing a natural language parser to extract portions of the text, wherein the at least one concept corresponds to an entity within the area of interest and identified in a portion of the text;
generating an animation for the concept by (i) accessing an image related to the concept, (ii) identifying a portion of the image to animate, and (iii) animating the portion of the image, wherein the accessing an image comprises querying a secondary source utilizing the entity as a query search term, wherein the generating an animation comprises identifying at least one action of the at least one concept extracted from the text, wherein the identifying a portion of the image to animate comprises identifying a portion of the image corresponding to the identified at least one action and determining the portion corresponds to a portion identified by the user within the indication of interest, wherein the generating comprises generating a series of images illustrating movement of the portion of the image performing the at least one action; and
providing, on an information handling device, the generated animation to the reader.

US Pat. No. 10,970,909

METHOD AND APPARATUS FOR EYE MOVEMENT SYNTHESIS

BEIHANG UNIVERSITY, Beij...

1. A method for eye movement synthesis, comprising:obtaining eye movement feature data and speech feature data, wherein the eye movement feature data reflects an eye movement behavior, and the speech feature data reflects a voice feature;
obtaining a driving model according to the eye movement feature data and the speech feature data, wherein the driving model is configured to indicate an association between the eye movement feature data and the speech feature data; and
synthesizing an eye movement of a virtual human according to speech input data and the driving model and controlling the virtual human to exhibit the synthesized eye movement,
wherein the obtaining a driving model according to the eye movement feature data and the speech feature data comprises:
normalizing the eye movement feature data and the speech feature data separately to obtain eye movement feature data and speech feature data in a common data format;
performing data alignment to the eye movement feature data and the speech feature data in the common data format according to a time series corresponding thereto to obtain eye movement feature data and speech feature data aligned in time series; and
obtaining a driving model according to the eye movement feature data and the speech feature data aligned in time series.

US Pat. No. 10,970,908

DISPLAY CONTROL METHOD AND APPARATUS FOR GAME SCREEN, ELECTRONIC DEVICE, AND STORAGE MEDIUM

TENCENT TECHNOLOGY (SHENZ...

1. A method for display control, comprising:detecting, by processing circuitry of a terminal device that displays animation for a game on a display screen, a frame rate inadequacy of animation frames that are generated according to animation features respectively associated with animation files;
obtaining, by the processing circuitry, preconfigured values respectively associated with the animation files, a preconfigured value associated with an animation file being indicative of performance influence for turning off an animation feature associated with the animation file and being a measure of at least a combination of graphic processing consumption and user experience of the animation feature; and
turning off, by the processing circuitry, one or more animation features according to the preconfigured values associated with the animation files until an adequate frame rate is achieved,
wherein the method further includes performing, by the processing circuitry:
sorting the animation features into a turn-off sequence or classifying the animation features into multiple classes according to the preconfigured values, and
turning off the animation features according to the turn-off sequence, when the animation features are sorted into the turn-off sequence, or class by class, when the animation features are classified into multiple classes, until the adequate frame rate is achieved.
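For illustration, a sketch of turning animation features off in a preconfigured order until the frame rate recovers; the dictionary layout, the descending sort, and the measure_fps probe are assumptions, not the patent's data model.

def restore_frame_rate(features, measure_fps, target_fps):
    # features: dict mapping feature name -> (preconfigured_value, disable_fn).
    turn_off_sequence = sorted(features, key=lambda name: features[name][0], reverse=True)
    for name in turn_off_sequence:
        if measure_fps() >= target_fps:
            break                                 # adequate frame rate reached, stop turning features off
        features[name][1]()                       # turn this animation feature off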

US Pat. No. 10,970,907

SYSTEM AND METHOD FOR APPLYING AN EXPRESSION TO AN AVATAR

Facebook Technologies, LL...

1. A method comprising:determining, by one or more processors, a class of an expression of a face according to a set of attributes indicating states of portions of the face;
determining, by the one or more processors according to the class of the expression of the face, a set of blendshapes with respective weights corresponding to the expression of the face;
providing, by the one or more processors, the set of blendshapes with respective weights as an input to train a machine learning model;
training, by the one or more processors, a neural network of the machine learning model according to the set of blendshapes with respective weights provided as the input; and
configuring, by the one or more processors via training, the machine learning model to generate an output set of blendshapes with respective weights, according to an input image.

US Pat. No. 10,970,906

FILLING IN AN ENTITY WITHIN A VIDEO

International Business Ma...

1. A computer-implemented method, comprising:identifying an entity within a selected plurality of video frames in response to detecting that results of a first attempt to fill in an area of the selected plurality of video frames by an editing application have failed, where the entity has one or more missing portions;
determining a three-dimensional (3D) model for the entity having the one or more missing portions, utilizing first metadata and second metadata, where:
the first metadata is associated with the entity and includes a classification of the entity and one or more specific characteristics of the entity, and
the second metadata is associated with the selected plurality of video frames and includes a date and time the selected plurality of video frames was created;
utilizing the 3D model to fill in the one or more missing portions of the entity within the selected plurality of video frames; and
validating the filling in of the one or more missing portions of the entity utilizing image recognition.

US Pat. No. 10,970,905

FILLING IN AN ENTITY WITHIN AN IMAGE

International Business Ma...

1. A computer-implemented method, comprising:identifying an entity within an image in response to detecting that results of a first attempt to fill in an area of the image by an image editing application have failed, where the entity has one or more missing portions;
determining a three-dimensional (3D) model for the entity having the one or more missing portions, utilizing first metadata and second metadata, where:
the first metadata is associated with the entity and includes a classification of the entity and one or more specific characteristics of the entity, and
the second metadata is associated with the image and includes a date and time the image was created;
utilizing the 3D model to fill in the one or more missing portions of the entity within the image; and
validating the filling in of the one or more missing portions of the entity utilizing image recognition.

US Pat. No. 10,970,904

INTERFACE LAYOUT USING RELATIVE POSITIONING

Twitch Interactive, Inc.,...

1. A computing system comprising:one or more processors; and
one or more memories having stored therein computing instructions that, upon execution by the one or more processors, cause the computing system to perform operations comprising:
receiving one or more commands associated with a virtual surface for display in a streaming application, wherein the one or more commands indicate an animation that performs a movement in the virtual surface, and wherein the one or more commands include a relative width that is relative to a total width for the virtual surface in its entirety and a relative height that is relative to a total height for the virtual surface in its entirety;
generating, based on the one or more commands, rendering instructions for rendering the virtual surface including the animation in a user interface of the streaming application; and
providing the rendering instructions to one or more rendering components that render the virtual surface including the animation in the user interface based on the rendering instructions, wherein the relative width and the relative height indicate a first position at which the movement of the animation is at least one of started, stopped, or redirected.

US Pat. No. 10,970,903

SYSTEMS AND METHODS FOR DISPLAYING AUTONOMOUS VEHICLE ENVIRONMENTAL AWARENESS

Lyft, Inc., San Francisc...

1. A computer-implemented method comprising:providing a visualization comprising a representation of a vehicle and a representation of a physical environment associated with the vehicle;
receiving, from at least one sensor of the vehicle, sensor data associated with the physical environment;
determining, based on the sensor data, movement of at least one object in the physical environment to occupy a new position relative to the vehicle;
representing the movement of the at least one object with a smoothing effect by incrementally modifying a represented position of the object within the physical environment based at least in part on an amount of time that the object has occupied the new position; and
providing an updated visualization showing the movement of the at least one object with the smoothing effect.

US Pat. No. 10,970,902

ALLOCATING AND EXTRAPOLATING DATA FOR AUGMENTED REALITY FOR 6G OR OTHER NEXT GENERATION NETWORK

1. A method, comprising:generating, by a vehicle comprising a processor, first object data representative of a first location of an object sensed by the vehicle;
in response to generating the first object data, transmitting, by the vehicle, the first object data to network equipment to facilitate an augmented reality representation of the object;
in response to transmitting the first object data, receiving, by the vehicle, second object data representative of a second location of the object from the network equipment;
in response to receiving the second object data, determining, by the vehicle: that the second object data has been previously received by the vehicle, and that a threshold amount of a media has not been satisfied, wherein the media is a sound file;
in response to determining that the threshold amount of the media has not been satisfied, prompting, by the vehicle, a user equipment to transmit the media;
based on determining that the second object data has been previously received, deleting, by the vehicle, the second object data to increase a processing efficiency from a first efficiency to a second efficiency greater than the first efficiency according to a defined efficiency criterion; and
based on receiving the second object data, generating, by the vehicle, augmented reality data representative of the second location of the object.

US Pat. No. 10,970,901

SINGLE-PHOTO GENERATING DEVICE AND METHOD AND NON-VOLATILE COMPUTER-READABLE MEDIA THEREOF

WISTRON CORP., New Taipe...

1. A single-photo generating device, comprising:an image capturing device, generating a first image, wherein the first image comprises a plurality of people; and
a processing device, coupled to the image capturing device and obtaining the first image from the image capturing device;
wherein the processing device extracts each human image corresponding to the plurality of people from the first image and selects a background image from the first image, and the processing device generates a plurality of single photos corresponding to each of the human images according to the extracted human images and the background image,
wherein the background image does not comprise the human images, and
wherein the first image and the single photos corresponding to each of the human images have a first size, and the processing device selects a plurality of second images with a second size from the first image, and the processing device selects one of the second images to be the background image, wherein the second size is smaller than the first size.
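A minimal sketch of the compositing flow, assuming person bounding boxes are already available, the second size is half the first size, and the person-free background crop is rescaled before pasting each person back at their original location; all of these choices are illustrative, not from the patent.

import numpy as np
import cv2

def overlaps(box, x, y, w, h):
    bx, by, bw, bh = box
    return not (bx >= x + w or bx + bw <= x or by >= y + h or by + bh <= y)

def single_photos(first_image, person_boxes):
    H, W = first_image.shape[:2]                   # first size
    h2, w2 = H // 2, W // 2                        # second, smaller size
    background = first_image[0:h2, 0:w2]           # fallback second image
    for y in range(0, H - h2 + 1, h2):
        for x in range(0, W - w2 + 1, w2):
            if not any(overlaps(b, x, y, w2, h2) for b in person_boxes):
                background = first_image[y:y + h2, x:x + w2]   # person-free crop
    photos = []
    for (bx, by, bw, bh) in person_boxes:
        photo = cv2.resize(background, (W, H))     # scale background back to the first size
        photo[by:by + bh, bx:bx + bw] = first_image[by:by + bh, bx:bx + bw]
        photos.append(photo)                       # one single photo per person
    return photos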

US Pat. No. 10,970,900

ELECTRONIC APPARATUS AND CONTROLLING METHOD THEREOF

Samsung Electronics Co., ...

1. A method of controlling an electronic apparatus, the method comprising:providing a user interface (UI) comprising an illustration generation icon on a display of the electronic apparatus;
displaying a text on a first area of the UI based on a first input of a user;
identifying a plurality of key terms from the text;
in response to a selection of the illustration generation icon, acquiring at least one image corresponding to the plurality of key terms based on a first artificial intelligence model, wherein each of the at least one image is acquired based on the plurality of key terms;
displaying a plurality of illustrations by synthesizing at least two images from among the at least one image on a second area of the UI different from the first area of the UI based on a second artificial intelligence model, each illustration of the plurality of illustrations including the text and the at least one image, wherein the plurality of illustrations include two or more illustrations having different arrangements between the text and the at least one image;
displaying, based on a second input of the user for selecting a first illustration of the plurality of illustrations being received, the selected first illustration on the first area of the display while the plurality of illustrations displayed on the second area of the UI is maintained; and
based on a third input of the user for modifying the at least one image included in the first illustration, displaying, on the first area of the display, a second illustration modified to correspond to the third input of the user,
wherein background style of the plurality of illustrations is determined according to a selection of the user,
wherein, in the displaying of the plurality of illustrations, each of the plurality of illustrations is modified so that the at least two images of the at least one image correspond to a design of a presentation image which is acquired by inputting the information on the design of the presentation image and the at least one image to the second artificial intelligence model as an input image, and
wherein the input image is modified by the second artificial intelligence model according to the design of the presentation image in relation to at least one of a theme, a line style, a line thickness, a color, a size, a graphic effect, a brightness, a contrast, a shape, a layout, and synthesis of the input image to produce a style of the presentation image.

US Pat. No. 10,970,899

AUGMENTED REALITY DISPLAY FOR A VEHICLE

International Business Ma...

1. A method for an in-vehicle display of information for an object of interest, the method comprising:receiving, by a computer system, sound data from a sound sensor system connected to a vehicle, wherein the sound data is for sounds detected in an exterior environment around the vehicle;
analyzing, by the computer system, the sound data for a presence of an object of interest;
in response to detecting the presence of the object of interest, determining, by the computer system, a location of the object of interest with respect to the vehicle using the sound data;
displaying, by the computer system, a visual indicator for the object of interest with an indication of the location of the object of interest using an electronic display system in the vehicle to augment a live view of the exterior environment seen directly by an occupant through a window of the vehicle for the in-vehicle display of the information for the object of interest, wherein the displaying of the visual indicator for the object of interest with the location of the object of interest is overlaid on the live view to draw attention to the object of interest in the live view outside the vehicle;
displaying, using the electronic display system in the vehicle to augment the live view of the exterior environment seen directly by the occupant through the window of the vehicle, a direction of travel of the object of interest whose presence was detected by the analyzing of the sound data responsive to detecting the object of interest whose presence was detected by the analyzing of the sound data, wherein the displaying of the visual indicator for the object of interest is overlaid on the object of interest in the live view to draw attention to the object of interest in the live view outside the vehicle; and
displaying text from a voice to text conversion of the sound from the object of interest when the object of interest is a person using the electronic display system in the vehicle to augment the live view of the exterior environment seen directly by the occupant through the window of the vehicle.

US Pat. No. 10,970,898

VIRTUAL-REALITY BASED INTERACTIVE AUDIENCE SIMULATION

International Business Ma...

1. A computer-implemented method, comprising steps of:determining one or more situational and location characteristics for a given performance by a user;
generating a virtual reality (VR)-based simulated audience for the given performance based at least in part on the determined situational and location characteristics;
presenting the VR-based simulated audience to a user during the given performance utilizing a VR headset;
utilizing one or more sensors to measure one or more aspects of the given performance before the VR-based simulated audience; and
generating real-time feedback adjusting the VR-based simulated audience presented to the user utilizing the VR headset based at least in part on the measured aspects of the given performance, wherein said generating the real-time feedback comprises:
utilizing at least one feedback repository to identify reactions of different types of audience members to the measured aspects of the given performance; and
adjusting characteristics of respective ones of a plurality of members of the VR-based simulated audience to provide the identified reactions for the different types of audience members in proportion to the representation of the different types of audience members in the VR-based simulated audience, wherein said adjusting characteristics of respective ones of the plurality of members of the VR-based simulated audience comprises:
adjusting characteristics relating to cues and interjections, the characteristics relating to cues and interjections comprising (i) applause carried out by one or more of the plurality of members of the VR-based simulated audience during the given performance and (ii) one or more of the plurality of members of the VR-based simulated audience leaving the VR-based simulated audience during the given performance; and
adjusting characteristics relating to a non-homogenous portion of the VR-based simulated audience comprising (i) a diversity of reactions of the non-homogenous portion of the VR-based simulated audience members and (ii) projection of the non-homogenous portion of the VR-based simulated audience members as a connected network measuring (a) audience coverage percentage and (b) address accuracy percentage of the user with respect to the non-homogenous portion of the VR-based simulated audience members;
wherein the steps are carried out by at least one processing device.

US Pat. No. 10,970,897

USING AUGMENTED REALITY FOR ACCESSING LEGACY TRANSACTION TERMINALS

PAYPAL, INC., San Jose, ...

1. A method for accessing legacy devices via augmented reality devices, the method comprising:acquiring, via an augmented reality (AR) device, an image of a keyboard of a legacy device;
accessing one or more customer commands from audio data associated with a user with the AR device for initiating a transaction using the legacy device;
determining a command sequence based, at least in part, on the one or more customer commands, the command sequence associated with the transaction that maps to a keystroke sequence using the keyboard;
determining a complexity level of the command sequence associated with the transaction based on the keystroke sequence and a competency level associated with the user;
determining that the complexity level meets or exceeds a threshold level for generating an overlay associated with the keystroke sequence;
generating the overlay indicating the keystroke sequence based on the determining that the complexity level meets or exceeds the threshold level; and
displaying, via the AR device, the overlay by visually projecting the overlay over the image of the keyboard of the legacy device.

US Pat. No. 10,970,896

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus comprising a computer executing instructions that, when executed by the computer, cause the computer to function as:an acquisition unit configured to acquire a captured image captured by an image capturing unit;
a region extraction unit configured to extract a region corresponding to a human body from the captured image;
a selection unit configured to select a background image from a plurality of background images;
a processing unit configured to perform image processing for obscuring the region in the background image selected by the selection unit; and
a determination region obtaining unit configured to acquire object region information of the captured image and remove the object region from the captured image to obtain a determination region,
wherein the selection unit selects the background image based on similarity between the determination region of the captured image and a corresponding region in each of the plurality of background images.

US Pat. No. 10,970,895

INTELLIGENT AND CONTEXT AWARE READING SYSTEMS

Massachusetts Mutual Life...

1. A method comprising:generating, by a server, a first graphical user interface for a display of a computing device;
generating, by the server, text comprising a set of words for the first graphical user interface, wherein one word of the set of words is sequentially displayed for a predetermined amount of time and removed prior to display of a subsequent word of the set of words on the first graphical user interface;
generating, by the server, an instruction configured to activate an eye-tracking sensor device configured to track movement of at least one eye of a user operating the computing device with respect to the text displayed on the first graphical user interface and generate ocular sensor data, wherein the eye-tracking sensor device is associated with the computing device;
upon transmitting the instruction to the eye-tracking sensor device, receiving, by the server, the ocular sensor data from the eye-tracking sensor device for the user, wherein the user is operating the first graphical user interface of the computing device;
generating, by the server, ocular engagement data based on the ocular sensor data, wherein the ocular engagement data is associated with a portion of a first field containing at least a portion of the text on the first graphical user interface displayed on the computing device; and
in response to an ocular engagement value indicating that the user has looked away from the one word currently being displayed, and using emotional state data, generating, by the server, supplemental content data for the portion of the first field on a second graphical user interface corresponding with the ocular engagement data while automatically pausing the sequential displaying of the words on the first graphical user interface, wherein the supplemental content data comprises (1) a definition of the one word currently being displayed within the text in the first field based on a context of the one word currently being displayed within the text and (2) a number of times the one word currently being displayed is presented within a portion of the text currently being sequentially displayed, and wherein the server resumes the sequential displaying of the words on the first graphical user interface based on a length of a gaze of the user at the first graphical user interface.
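
A compact Python sketch of the pause-and-supplement behaviour in the claim: words are shown one at a time, the display pauses and supplemental content (a definition plus an occurrence count) is produced when gaze leaves the current word, and display resumes once gaze dwells on the interface again. The gaze callbacks and dictionary are placeholders, not part of the patent.

    import time
    from collections import Counter

    DICTIONARY = {"ocular": "relating to the eyes"}    # illustrative definition source

    def supplemental_content(word: str, text: list) -> dict:
        return {"definition": DICTIONARY.get(word, "(no definition available)"),
                "occurrences": Counter(text)[word]}

    def rsvp(text: list, gaze_on_word, dwell_ok, delay: float = 0.3):
        for word in text:
            print(f"display: {word}")
            time.sleep(delay)                  # predetermined display time per word
            if not gaze_on_word():             # user looked away from the current word
                print("supplemental:", supplemental_content(word, text))
                while not dwell_ok():          # resume based on length of gaze
                    time.sleep(0.05)

    if __name__ == "__main__":
        words = "the ocular sensor reports ocular engagement".split()
        rsvp(words, gaze_on_word=lambda: True, dwell_ok=lambda: True)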

US Pat. No. 10,970,894

FILLING EMPTY PIXELS

DREAMWORKS ANIMATION LLC

1. A method of rendering a computer image, the method comprising:for each pixel of a plurality of N×M pixels forming a tile of the computer image,
determining a plurality of masks for the pixel,
wherein N denotes an integer larger than 1,
wherein M denotes an integer larger than 1, and
wherein each of the plurality of masks identifies a respective subset of the plurality of N×M pixels that are equidistant from the pixel and located at a respective distance from the pixel;
determining an active mask for the tile, the active mask identifying active pixels of the plurality of N×M pixels, each of the active pixels being determined as having color information;
based on the active mask, identifying an empty pixel of the plurality of N×M pixels, the empty pixel lacking color information; and
determining at least a first nearest active pixel of the active pixels that is nearest to the identified empty pixel,
wherein determining at least the first nearest active pixel comprises comparing the active mask with at least one mask of the plurality of masks for the identified empty pixel.
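
The ring-mask lookup in this claim lends itself to a short NumPy sketch: precompute, for each pixel, masks of the pixels that are equidistant from it at increasing distances, then find an empty pixel's nearest active neighbour by intersecting those masks with the tile's active mask. Chebyshev distance is an assumption; the claim only requires pixels "equidistant ... at a respective distance".

    import numpy as np

    def ring_masks(n: int, m: int, y: int, x: int) -> list:
        """Masks of pixels at Chebyshev distance 1, 2, ... from pixel (y, x)."""
        yy, xx = np.mgrid[0:n, 0:m]
        dist = np.maximum(np.abs(yy - y), np.abs(xx - x))
        return [dist == d for d in range(1, max(n, m))]

    def nearest_active(active: np.ndarray, y: int, x: int) -> list:
        """Return coordinates of active pixels nearest to empty pixel (y, x)."""
        n, m = active.shape
        for mask in ring_masks(n, m, y, x):
            hits = mask & active               # compare ring mask with active mask
            if hits.any():
                return [(int(r), int(c)) for r, c in zip(*np.nonzero(hits))]
        return []

    if __name__ == "__main__":
        tile = np.zeros((8, 8), dtype=bool)
        tile[0, 0] = tile[5, 6] = True         # two active (colored) pixels
        print(nearest_active(tile, 4, 4))      # -> [(5, 6)]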

US Pat. No. 10,970,893

SELECTING AN ANOMALY FOR PRESENTATION AT A USER INTERFACE BASED ON A CONTEXT

Oracle International Corp...

1. One or more non-transitory machine-readable media storing instructions which, when executed by one or more processors, cause:identifying a set of entries, in a set of one or more tables, associated with a plurality of metrics and a plurality of attributes, wherein each of the plurality of attributes is associated with a respective set of attribute values;
determining a plurality of insights based on the set of entries, wherein determining the plurality of insights comprises:
identifying a first subset of the set of entries that are associated with a first attribute value for a first attribute of the plurality of attributes, wherein the first subset of the set of entries includes two or more entries;
applying a first insight algorithm to each of the first subset of entries, with respect to a first metric of the plurality of metrics, in order to determine a first insight of the plurality of insights;
identifying a second subset of the set of entries that are associated with a second attribute value for a second attribute of the plurality of attributes, wherein the second subset of the set of entries includes two or more entries;
applying a second insight algorithm to each of the second subset of entries, with respect to a second metric of the plurality of metrics, in order to determine a second insight of the plurality of insights;
determining a context for determining one or more insights to be presented on a user interface, wherein the context comprises at least one of the first attribute and the first attribute value, and does not comprise the second attribute or the second attribute value;
identifying a plurality of relevant insights, from the plurality of insights, that is associated with the context, wherein the plurality of relevant insights includes the first insight and does not include the second insight;
presenting, on the user interface, at least one of the plurality of relevant insights that are associated with the context, without presenting any insight not within the plurality of relevant insights.
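
A small Python sketch of the grouping, insight, and context-filtering flow in the claim, using plain dictionaries as the table entries. The "insight algorithm" here is a simple mean over a metric and the context match is an equality test; both are stand-ins for whatever the system actually applies.

    from statistics import mean

    entries = [
        {"region": "EMEA", "product": "A", "revenue": 120, "churn": 0.02},
        {"region": "EMEA", "product": "B", "revenue": 90,  "churn": 0.05},
        {"region": "APAC", "product": "A", "revenue": 60,  "churn": 0.01},
    ]

    def make_insight(attribute, value, metric):
        subset = [e for e in entries if e[attribute] == value]   # attribute-value subset
        return {"attribute": attribute, "value": value, "metric": metric,
                "summary": mean(e[metric] for e in subset)}

    insights = [make_insight("region", "EMEA", "revenue"),   # first insight
                make_insight("product", "A", "churn")]       # second insight

    context = {"attribute": "region", "value": "EMEA"}        # context for the UI

    relevant = [i for i in insights
                if i["attribute"] == context["attribute"]
                and i["value"] == context["value"]]

    print(relevant)   # only the EMEA insight is presented; the product insight is not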

US Pat. No. 10,970,892

ELECTRONIC APPARATUS FOR DRAWING FIGURE BASED ON FUNCTION DATA STORED IN ADVANCE OR BASED ON DRAWING DATA RECEIVED FROM SERVER DEPENDING ON TYPE OF FIGURE TO BE DRAWN, AND INFORMATION PROCESSING METHOD, SYSTEM, AND MEDIUM FOR SAME

CASIO COMPUTER CO., LTD.,...

1. An electronic apparatus which communicates with at least one server, comprising:a memory which stores in advance function data for figure drawing, the function data including a plurality of functions, wherein each of the plurality of functions is expressible as a figure on a graph by plotting coordinate points on the graph based on the function and based on a value inserted in the function in accordance with a user operation;
a display; and
a processor which:
obtains an instruction on a figure-related process to draw a figure;
if a type of the figure indicated by the obtained instruction is a type that is not expressible by any of the functions included in the function data stored in advance in the memory, (i) sends relevant data including information on the figure to the at least one server, (ii) receives, from the at least one server, drawing data generated by the at least one server executing, based on the relevant data, an arithmetic operation required for the figure-related process, and (iii) performs the figure-related process on the display in accordance with the received drawing data without executing the arithmetic operation required for the figure-related process using the function data; and
if the type of the figure indicated by the obtained instruction is a type that is expressible by one of the plurality of functions included in the function data stored in advance in the memory, (i) executes an arithmetic operation required for the figure-related process using the function data without requesting the at least one server to execute the arithmetic operation required for the figure-related process, and (ii) performs the figure-related process on the display in accordance with drawing data generated as a result of the arithmetic operation,
wherein:
the relevant data sent by the processor to the at least one server includes identification data identifying to which figure the relevant data is relevant, and
the drawing data received by the processor from the at least one server agrees with the identification data included in the relevant data.
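
A minimal dispatch sketch of the local-versus-server decision in the claim. The function table, the server call, and the sampling of coordinate points are illustrative assumptions; the claim does not specify these APIs.

    import math

    # Function data "stored in advance": figure type -> callable y = f(x, a)
    FUNCTION_DATA = {
        "parabola": lambda x, a: a * x * x,
        "sine":     lambda x, a: a * math.sin(x),
    }

    def request_drawing_from_server(figure_type: str, params: dict) -> list:
        # Placeholder for the round trip to the server described in the claim.
        raise NotImplementedError("network call not sketched here")

    def draw_figure(figure_type: str, a: float) -> list:
        xs = [i / 10 for i in range(-50, 51)]
        f = FUNCTION_DATA.get(figure_type)
        if f is None:
            # Type not expressible locally: delegate the arithmetic to the server.
            return request_drawing_from_server(figure_type, {"a": a})
        # Type expressible locally: plot coordinate points without contacting the server.
        return [(x, f(x, a)) for x in xs]

    if __name__ == "__main__":
        points = draw_figure("parabola", a=2.0)
        print(points[:3])                      # first few plotted coordinate points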

US Pat. No. 10,970,891

SYSTEMS AND METHODS FOR DETECTING AND ACCOMMODATING STATE CHANGES IN MODELLING

Oracle International Corp...

9. One or more non-transitory computer-readable media storing instructions, which when executed by one or more hardware processors, cause at least one computing device to perform operations comprising:receiving a time-series signal that includes a sequence of values captured by one or more computing devices over time;
generating, within at least one of volatile or non-volatile storage of at least one computing device, a representation of the time-series signal;
determining whether an average number of states per seasonal period within the representation of the time-series signal satisfies a threshold;
responsive to determining that the average number of states per seasonal period does not satisfy the threshold:
(a) classifying, as abnormal by at least one computing device, a first state change of a plurality of state changes within the representation of the time-series signal that is abnormal; and
(b) classifying, as normal by the at least one computing device, a second state change that recurs seasonally at least after the first state change;
selecting, based at least in part on the first state change and the second state change, a subset of values from the sequence of values to train a model as if there were no state changes in the time-series signal, wherein the subset of values excludes one or more values that occurred prior to the first state change and includes one or more values that occurred prior to the second state change;
training, by at least one computing device, the model using the selected subset of values, from the sequence of values, that excludes the one or more values that occurred prior to the first state change and includes the one or more values that occurred prior to the second state change; and
generating, within at least one of volatile or non-volatile storage of at least one computing device, an analytical output using the trained model.
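
The training-window selection in this claim can be sketched in a few lines of Python: values before an abnormal state change are dropped, while values before a seasonally recurring (normal) state change are kept. The change-point detection itself is assumed to have been done elsewhere.

    from dataclasses import dataclass

    @dataclass
    class StateChange:
        index: int        # position in the time series where the state changes
        abnormal: bool    # True if classified abnormal, False if seasonal/normal

    def training_subset(values: list, changes: list) -> list:
        # Keep everything after the most recent abnormal change; normal
        # (seasonally recurring) changes do not shrink the training window.
        abnormal_cuts = [c.index for c in changes if c.abnormal]
        start = max(abnormal_cuts) if abnormal_cuts else 0
        return values[start:]

    if __name__ == "__main__":
        series = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9, 5.2, 5.0, 5.1]
        changes = [StateChange(index=3, abnormal=True),    # level shift
                   StateChange(index=6, abnormal=False)]   # seasonal recurrence
        print(training_subset(series, changes))  # [5.0, 5.1, 4.9, 5.2, 5.0, 5.1]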

US Pat. No. 10,970,890

MAZE SOLVING METHOD BASED ON LINE-SURFACE SPATIAL RELATION

GUANGXI HUIGUIXIAN INFORM...

1. A maze solving method based on a line-surface spatial relation, wherein the method comprises the following steps:A. converting a maze picture into vector data by raster to vector conversion, and extracting channel surface-shaped data;
B. converting the channel surface-shaped data into figures of channel boundary lines, and removing the channel boundary lines at a start point and a terminal point, so that the channels at the start point and the terminal point are open, a channel line start point I and a channel line start point II are formed at the start point, and a channel line terminal point I and a channel line terminal point II are formed at the terminal point;
C. respectively extending extension lines from the channel line start point I and the channel line start point II of the start point along the two outer sides of the maze, wherein a base point X1 is provided on the extension line of the channel line start point I, and a base point X2 is provided on the extension line of the channel line start point II; respectively extending extension lines from the channel line terminal point I and the channel line terminal point II of the terminal point along the two outer sides of the maze, wherein a base point Y1 is provided on the extension line of the channel line terminal point I, and a base point Y2 is provided on the extension line of the channel line terminal point II; at the outside of the maze, constructing a line which connects the base point X1 and the base point Y1 to obtain a virtual connecting line I from the channel line start point I to the channel line terminal point I, and constructing a line which connects the base point X2 and the base point Y2 to obtain a virtual connecting line II from the channel line start point II to the channel line terminal point II, wherein the virtual connecting line I has no cross point with the extension line of the channel line start point II or the extension line of the channel line terminal point II, and the virtual connecting line II has no cross point with the extension line of the channel line start point I or the extension line of the channel line terminal point I;
D. constructing a straight line G which passes the start point of the maze and the terminal point of the maze,
constructing a virtual connecting line polygon I through the channel boundary lines and the virtual connecting line I, within a range enclosed by the virtual connecting line I and the straight line G;
constructing a virtual connecting line polygon II through the channel boundary lines and the virtual connecting line II, within a range enclosed by the virtual connecting line II and the straight line G;
if both of the virtual connecting line polygon I and the virtual connecting line polygon II fail to be constructed, the maze has no solution, and the solving is ended;
if both of the virtual connecting line polygon I and the virtual connecting line polygon II are successfully constructed, the maze has a solution, and entering step E; and
E. comparing paths connecting the start point of the maze and the terminal point of the maze in the virtual connecting line polygon I and the virtual connecting line polygon II according to line-surface relation, and selecting the shortest path as the solution of the maze.

US Pat. No. 10,970,889

STROKE PREDICTION FOR STYLIZED DRAWINGS BASED ON PRIOR STROKES AND REFERENCE IMAGE

Adobe Inc., San Jose, CA...

1. A method for generating stroke predictions, the method comprising:accessing a set of prior strokes of a partial sketch superimposed on top of a reference image, distinct from the partial sketch;
identifying a target region associated with the reference image;
generating, based on the set of prior strokes and the reference image, a stroke prediction for a future stroke of the partial sketch at a location associated with the target region by iteratively minimizing an energy function considering comparisons between compared strokes of the partial sketch and between image patches of the reference image corresponding to the compared strokes; and
rendering the stroke prediction in association with the partial sketch.

US Pat. No. 10,970,888

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM

Sony Corporation Sony Int...

1. An information processing apparatus, comprising:circuitry configured to
acquire a plurality of images in which position information showing an imaging position is attached;
control display of a first map image, a positional information, and the plurality of images on a first screen image, wherein the positional information indicates the position of a currently selected pointer from among a plurality of pointers displayed on the first map image, corresponds to two or more images of the plurality of images, and is displayed in a part of the first screen image that is outside the first map image, wherein the first map image, the positional information and the plurality of images are displayed simultaneously on the first screen image;
acquire a user input of selecting an image;
control display of the image selected by the user input and a second map image related to the selected image on a second screen image, wherein the second screen image includes second positional information and a second plurality of images, the image selected by the user being one of the second plurality of images, and wherein the second positional information indicates the imaging position of the image selected by the user and is displayed in a part of the second screen image that is outside the second map image, wherein the second map image, the second positional information and the second plurality of images are displayed simultaneously on the second screen image; and
control switching between display of the first screen image and display of the second screen image,
wherein the circuitry is configured to control display of an indication of availability of a back function on the second screen image, and wherein activation of the back function causes the information processing apparatus to switch from an image selection mode to a position selection mode.

US Pat. No. 10,970,887

TOMOGRAPHIC IMAGE RECONSTRUCTION VIA MACHINE LEARNING

Rensselaer Polytechnic In...

1. A method of reconstructing an image from tomographic data obtained by an imaging process, the method comprising:performing at least one algorithm step on a raw data set or an intermediate data set of the tomographic data to obtain a final reconstructed image, wherein performing the at least one algorithm step comprises:
performing at least one conventional, non-deep-learning algorithm on a raw data set of the tomographic data to obtain an intermediate data set of an initial reconstructed image; and
performing a deep learning algorithm on the intermediate data set to obtain the final reconstructed image.
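
A high-level Python sketch of the two-stage pipeline in the claim: a conventional, non-deep-learning reconstruction followed by a learned refinement. scikit-image's iradon stands in for the conventional algorithm and the refinement network is left as a placeholder; both choices are assumptions, not the patented method.

    import numpy as np
    from skimage.transform import iradon

    def refine_with_network(image: np.ndarray) -> np.ndarray:
        # Placeholder for a trained deep network (e.g., a CNN) that maps the
        # intermediate reconstruction to the final reconstructed image.
        return image  # identity stand-in

    def reconstruct(sinogram: np.ndarray, angles: np.ndarray) -> np.ndarray:
        # Step 1: conventional filtered back-projection on the raw data set.
        intermediate = iradon(sinogram, theta=angles)
        # Step 2: deep learning refinement of the intermediate data set.
        return refine_with_network(intermediate)

    if __name__ == "__main__":
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, resize
        phantom = resize(shepp_logan_phantom(), (64, 64))
        angles = np.linspace(0.0, 180.0, 60, endpoint=False)
        sino = radon(phantom, theta=angles)
        print(reconstruct(sino, angles).shape)   # final reconstructed image size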

US Pat. No. 10,970,886

DEVICES, SYSTEMS AND METHODS UTILIZING FRAMELET-BASED ITERATIVE MAXIMUM-LIKELIHOOD RECONSTRUCTION ALGORITHMS IN SPECTRAL CT

Shandong University, Sha...

1. A system for reconstructing a polychromatic CT image from a single scan using energy-dependent attenuation coefficients, comprising:a. a computer-readable media coupled to a computer storing a single CT scan; and
b. a processor executing a computer program configured for implementing a framelet-based iterative algorithm for CT image reconstruction from the single CT scan, the iterative algorithm comprising:
i. a scaled-gradient descent step;
ii. a non-negativity step; and
iii. a soft thresholding step,
wherein the processor is configured to reconstruct the CT image following the execution of the iterative algorithm.
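
One iteration of the three-step loop named in the claim, as a NumPy sketch: (i) a gradient descent step on the data term, (ii) a non-negativity projection, and (iii) soft thresholding in a transform domain. The forward operator A and the transform pair W/Wt are generic stand-ins (an identity transform in the toy demo), not the patent's framelet operators.

    import numpy as np

    def soft_threshold(x: np.ndarray, tau: float) -> np.ndarray:
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def iterate(x, A, y, step, tau, W, Wt):
        """One pass: x is the image estimate, A the system matrix, y the measured
        data, W/Wt an analysis/synthesis transform pair standing in for framelets."""
        grad = A.T @ (A @ x - y)               # (i) gradient descent on ||Ax - y||^2
        x = x - step * grad
        x = np.maximum(x, 0.0)                 # (ii) non-negativity step
        return Wt(soft_threshold(W(x), tau))   # (iii) soft thresholding step

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.normal(size=(40, 20))
        x_true = np.abs(rng.normal(size=20))
        y = A @ x_true
        W = Wt = lambda v: v                   # identity transform as a toy stand-in
        x = np.zeros(20)
        for _ in range(300):
            x = iterate(x, A, y, step=5e-3, tau=1e-4, W=W, Wt=Wt)
        print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # shrinks toward 0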

US Pat. No. 10,970,885

ITERATIVE IMAGE RECONSTRUCTION

GENERAL ELECTRIC COMPANY,...

1. A method for reconstructing an image, comprising:receiving a scan data input from a scan device;
processing the scan data input to generate a final reconstructed image using a sequence of processing steps, wherein each processing step comprises:
applying a forward model to an input image provided as an input to the respective processing step to generate calculated data;
providing the calculated data and measured data as inputs to a data fidelity calculator that outputs an error term;
backtransforming the error term to generate an image domain error term; and
providing at least the image domain error term and the input image as inputs to an image update neural network to generate a next reconstructed image as input to a next processing step.
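
A PyTorch sketch of one unrolled processing step from the claim: apply the forward model, compute a data-fidelity residual, backtransform it to the image domain, then feed the image-domain error and current image to an image-update network. The tiny MLP and the matrix forward model are illustrative assumptions.

    import torch
    import torch.nn as nn

    class UpdateNet(nn.Module):
        # Maps (current image, image-domain error) -> next reconstructed image.
        def __init__(self, n: int):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(2 * n, 64), nn.ReLU(), nn.Linear(64, n))

        def forward(self, image, error):
            return image + self.net(torch.cat([image, error], dim=-1))

    def processing_step(image, A, measured, update_net):
        calculated = image @ A.T               # forward model applied to the input image
        error = measured - calculated          # data fidelity (residual) term
        image_domain_error = error @ A         # backtransform to the image domain
        return update_net(image, image_domain_error)

    if __name__ == "__main__":
        n_pix, n_meas = 16, 24
        A = torch.randn(n_meas, n_pix)
        x0 = torch.zeros(1, n_pix)
        y = torch.randn(1, n_meas)
        net = UpdateNet(n_pix)
        x1 = processing_step(x0, A, y, net)    # input to the next processing step
        print(x1.shape)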

US Pat. No. 10,970,884

SYSTEMS AND METHODS FOR PROVIDING CONTENT

Facebook, Inc., Menlo Pa...

1. A computer-implemented method comprising:obtaining, by a computing system, a media item;
generating, by the computing system, a plurality of histograms based on colors of a first portion and a second portion of the media item, the generating further comprising:
sampling, by the computing system, colors included in the first portion of the media item, wherein the first portion corresponds to pixels associated with a top portion of the media item;
generating, by the computing system, a first histogram based on colors sampled from the first portion of the media item;
sampling, by the computing system, colors included in the second portion of the media item, wherein the second portion corresponds to pixels associated with a bottom portion of the media item; and
generating, by the computing system, a second histogram based on colors sampled from the second portion of the media item; and
generating, by the computing system, a custom background for the media item in a content item based on the first histogram and the second histogram.
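
A NumPy sketch of the two-histogram background generation in the claim: sample the top and bottom strips of the media item, build per-channel colour histograms, and use each histogram's dominant colour to fill the matching half of a custom background. The strip height and the "most populated bin" choice are assumptions.

    import numpy as np

    def dominant_color(region: np.ndarray, bins: int = 16) -> np.ndarray:
        """Pick the most populated bin per channel and return its bin-centre colour."""
        color = np.empty(3)
        for c in range(3):
            hist, edges = np.histogram(region[..., c], bins=bins, range=(0, 256))
            k = hist.argmax()
            color[c] = (edges[k] + edges[k + 1]) / 2
        return color

    def custom_background(media: np.ndarray, out_h: int, out_w: int,
                          strip: float = 0.2) -> np.ndarray:
        h = media.shape[0]
        top = media[: int(h * strip)]              # first portion: top pixels
        bottom = media[int(h * (1 - strip)):]      # second portion: bottom pixels
        bg = np.empty((out_h, out_w, 3), dtype=np.uint8)
        bg[: out_h // 2] = dominant_color(top)     # first histogram drives the top half
        bg[out_h // 2:] = dominant_color(bottom)   # second histogram drives the bottom half
        return bg

    if __name__ == "__main__":
        img = np.random.randint(0, 256, size=(120, 80, 3), dtype=np.uint8)
        print(custom_background(img, 200, 100).shape)   # (200, 100, 3)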

US Pat. No. 10,970,883

AUGMENTED REALITY SYSTEM AND METHOD OF DISPLAYING AN AUGMENTED REALITY IMAGE

AUGMENTI AS, Trondheim (...

1. An augmented reality (AR) system, comprising:a global navigation satellite system module adapted to output position data;
an orientation measurement module adapted to output orientation data;
an augmented reality module; and
at least one AR-client comprising a camera and a display, wherein
the augmented reality module is adapted to
determine a position and an orientation of the camera of the at least one AR-client based on the position data and orientation data,
calculate screen positions of at least one AR object based on the position and the orientation of the camera of the at least one AR-client to create at least one AR-overlay,
transmit the at least one AR-overlay to at least one AR-client,
the at least one AR-client is adapted to:
merge the at least one AR-overlay with a picture received from the camera of the at least one AR-client to provide an AR-image, and
display the AR-image on the display,
the augmented reality module further comprises another camera, and
the augmented reality module is further adapted to adjust, based on the position data and orientation data, a heading of the camera of the at least one AR-client by:
identifying earth fixed features captured by the another camera of the augmented reality module,
tracking movement of the earth fixed features, and
adjusting the heading of the camera of the at least one AR-client by compensating for the movement of the earth fixed features.

US Pat. No. 10,970,882

METHOD FOR SCALABLE VOLUMETRIC VIDEO CODING

1. A device, comprising:a processing system including a processor; and
a memory that stores executable instructions that, when executed by the processing system, facilitate performance of operations, the operations comprising:
encoding a point cloud in a first frame of volumetric video in an octree format;
identifying a first group of points of the point cloud, resulting in a first base layer;
encoding the first base layer in the octree format;
identifying a second group of points of the point cloud, resulting in a first enhancement layer, wherein the second group of points is different from the first group of points;
identifying a third group of points of the point cloud, resulting in a second enhancement layer, wherein the third group of points is different from the first group of points and the second group of points, wherein the second enhancement layer comprises all points in each octet of leaf nodes in the octree format of the point cloud that are not in the first base layer and the first enhancement layer;
encoding the first enhancement layer in the octree format;
providing the first base layer of the point cloud over a communication network to a media device; and
providing the first enhancement layer to the media device responsive to available sufficient bandwidth in the communication network for delivery of the first enhancement layer.
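
A simplified Python sketch of the layered split implied by the claim: within each leaf octet, one representative point goes to the base layer, one more to the first enhancement layer, and all remaining points to the second enhancement layer. Grouping leaf octets by a coarse voxel key is an assumption; a real codec would derive the octets from the octree itself.

    import numpy as np
    from collections import defaultdict

    def split_layers(points: np.ndarray, voxel: float = 1.0):
        octets = defaultdict(list)
        for p in points:
            octets[tuple(np.floor(p / voxel).astype(int))].append(p)
        base, enh1, enh2 = [], [], []
        for group in octets.values():
            base.append(group[0])              # first base layer
            if len(group) > 1:
                enh1.append(group[1])          # first enhancement layer
            enh2.extend(group[2:])             # second enhancement layer: all remaining points
        return np.array(base), np.array(enh1), np.array(enh2)

    if __name__ == "__main__":
        pts = np.random.rand(1000, 3) * 10
        b, e1, e2 = split_layers(pts)
        print(len(b), len(e1), len(e2))        # layer sizes sum to 1000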

US Pat. No. 10,970,881

FALLBACK MODES FOR DISPLAY COMPRESSION

Samsung Display Co., Ltd....

1. A display stream codec for a display device comprising an encoder, wherein the encoder is configured to:determine a fallback display mode as a display mode for a current block to be encoded;
represent a mode signaling of the fallback display mode that is identical to a mode signaling of a regular display mode corresponding to the fallback display mode;
encode the current block in accordance with the fallback display mode using a same signaling syntax as the corresponding regular display mode; and
send the encoded current block.

US Pat. No. 10,970,880

TEXTURE COMPRESSION

ELECTRONIC ARTS INC., Re...

1. A computer-implemented method comprising:receiving a first compressed representation of a texture map in a first compression format, wherein the first compressed representation has been compressed using a first compressor;
prior to performing compression using a second compressor, receiving an array of compression parameters for the second compressor, the array of compression parameters including one or more respective compression parameters for each of a plurality of pixel regions of the texture map;
decompressing the first compressed representation of the texture map to obtain a decompressed output comprising the texture map;
compressing, using the second compressor, the decompressed output comprising the texture map to a second compressed representation in a second compression format, comprising compressing each of said plurality of pixel regions of the texture map in accordance with the respective one or more compression parameters, wherein the compression parameters have been determined separately from the compressing;
storing the second compressed representation of the texture map to one or more memories accessible by a graphics processing unit;
selectively decompressing portions of the second compressed representation of the texture map using the graphics processing unit.
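
A toy Python sketch of the transcode pipeline in the claim: decompress a texture from a first format, then recompress each pixel region with the second compressor using that region's precomputed parameter. The compressors here are trivial stand-ins (a per-region quantisation level), purely for illustration.

    import numpy as np

    def decompress_first(compressed: np.ndarray) -> np.ndarray:
        # Stand-in for decoding the first compression format.
        return compressed.astype(np.uint8)

    def compress_region_second(region: np.ndarray, quant: int) -> np.ndarray:
        # Stand-in for the second compressor: quantise with a per-region parameter.
        return (region // quant).astype(np.uint8)

    def transcode(compressed: np.ndarray, region_params: dict, block: int = 4) -> dict:
        texture = decompress_first(compressed)
        h, w = texture.shape[:2]
        out = {}
        for by in range(0, h, block):
            for bx in range(0, w, block):
                quant = region_params[(by, bx)]    # determined separately, ahead of time
                out[(by, bx)] = compress_region_second(
                    texture[by:by + block, bx:bx + block], quant)
        return out

    if __name__ == "__main__":
        tex = np.random.randint(0, 256, size=(8, 8, 3), dtype=np.uint8)
        params = {(by, bx): 8 for by in range(0, 8, 4) for bx in range(0, 8, 4)}
        blocks = transcode(tex, params)
        print(len(blocks))                         # 4 compressed 4x4 regions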

US Pat. No. 10,970,879

FORMULATION SYSTEMS AND METHODS EMPLOYING TARGET COATING DATA RESULTS

PPG Industries Ohio, Inc....

1. A computer system for seeding a formulation engine, comprising:one or more processors; and
one or more computer-readable media having stored thereon executable instructions that when executed by the one or more processors configure the computer system to perform at least the following:
receive spectrometric data from a target coating;
process the spectrometric data through a probabilistic colorant analysis, the probabilistic colorant analysis generating a set of final colorants, each final colorant within the set of final colorants being associated with a calculated probability that the associated final colorant is present within the target coating, wherein processing the spectrometric data through a probabilistic colorant analysis comprises:
initiating a set of colorant decision points, wherein each colorant decision point comprises a set of independent calculations of the spectrometric data that each provide an independent assessment about the presence of a particular effect pigment type within the target coating,
calculating in parallel each colorant decision point within the set of colorant decision points, wherein each colorant decision point provides a probability that a different effect pigment type is present within the target coating,
calculating a set of final colorant probabilities, wherein each final colorant probability within the set of final colorant probabilities is calculated by combining a unique subset of probabilities calculated by the set of colorant decision points, and
associating each final colorant probability with a final colorant from within the set of final colorants;
add at least a portion of the final colorants within the set of final colorants to a formulation engine, wherein the portion of the final colorants are added to the formulation engine in order of decreasing probability; and
generate, from an output of the formulation engine, a coating formulation that is calculated to match the target coating within a predetermined threshold.
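
A Python sketch of the decision-point structure in the claim: several independent calculations each produce a probability that a particular effect pigment is present, those probabilities are combined per final colorant, and colorants are handed to the formulation engine in order of decreasing probability. The decision functions, the wavelength bands, and the averaging used to combine them are assumptions, not the patented analysis.

    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean

    def decision_point(spectro: list, band: slice, cutoff: float) -> float:
        # One independent assessment: probability driven by mean reflectance in a band.
        level = mean(spectro[band])
        return max(0.0, min(1.0, (level - cutoff) / (1.0 - cutoff)))

    def colorant_probabilities(spectro: list) -> dict:
        points = {
            "aluminum_flake":        (slice(0, 10), 0.3),
            "mica_pearl":            (slice(10, 20), 0.4),
            "interference_pigment":  (slice(20, 30), 0.5),
        }
        with ThreadPoolExecutor() as pool:    # decision points calculated in parallel
            futures = {name: pool.submit(decision_point, spectro, band, cutoff)
                       for name, (band, cutoff) in points.items()}
            probs = {name: f.result() for name, f in futures.items()}
        # Combine a unique subset of decision points per final colorant (simple average).
        return {
            "metallic_silver": mean([probs["aluminum_flake"], probs["interference_pigment"]]),
            "pearl_white":     probs["mica_pearl"],
        }

    def seed_formulation_engine(spectro: list) -> list:
        final = colorant_probabilities(spectro)
        return sorted(final, key=final.get, reverse=True)   # decreasing probability

    if __name__ == "__main__":
        measurement = [0.2 + 0.02 * i for i in range(30)]    # toy spectrometric curve
        print(seed_formulation_engine(measurement))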