US Pat. No. 10,657,748

TRI-ANGLED ANTENNA ARRAY FOR SECURE ACCESS CONTROL

Schlage Lock Company LLC,...

1. An access control device adapted to be secured to a door having an interior side and an exterior side, the access control device comprising:an access control mechanism adapted to control access to a passageway;
a housing adapted to be positioned at the exterior side of the door;
an antenna array secured within the housing and including a plurality of antennas that are angularly offset relative to one another;
a processor; and
a memory comprising a plurality of instructions stored thereon that, in response to execution by the processor, causes the access control device to:
determine a first signal strength of a first signal received by one of the plurality of antennas from a mobile device;
determine a second signal strength of a second signal received by another of the plurality of antennas from the mobile device;
determine whether a location of the mobile device relative to the access control device is indicative of an intent of the user of the mobile device to access the passageway based on the first signal strength and the second signal strength; and
automatically unlock the access control mechanism to allow access to the passageway in response to a determination that the location of the mobile device relative to the access control device is indicative of the intent of the user to access the passageway.
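
The core of this claim is a comparison of signal strengths seen by angularly offset antennas. Below is a minimal sketch of one way that comparison could drive the unlock decision; the RSSI thresholds, function names, and lock API are invented for illustration and are not from the patent.

    # Hypothetical sketch only; thresholds and the lock API are assumptions.
    NEAR_RSSI_DBM = -55.0      # assumed "close to the exterior housing" strength
    MAX_IMBALANCE_DB = 6.0     # assumed limit on the difference between antennas

    def intent_to_access(rssi_antenna_a: float, rssi_antenna_b: float) -> bool:
        """Infer intent from the two signal strengths named in the claim."""
        strongest = max(rssi_antenna_a, rssi_antenna_b)
        imbalance = abs(rssi_antenna_a - rssi_antenna_b)
        # Device must be near the door and roughly in front of it (both
        # angularly offset antennas report comparable strength).
        return strongest >= NEAR_RSSI_DBM and imbalance <= MAX_IMBALANCE_DB

    class Lock:
        def unlock(self) -> None:
            print("access control mechanism unlocked")

    if intent_to_access(-48.0, -52.0):
        Lock().unlock()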

US Pat. No. 10,657,747

ACCESS CONTROL SYSTEM AND METHOD FOR USE BY AN ACCESS DEVICE

Liberty PlugIns, Inc., S...

1. An access control system comprising:a) an access device comprising a communication module connected to a processor having control of a door lock;
b) a secure reservation interface to receive a reservation request from a first device for a reservation at a given destination, the reservation interface comprising one or more screens for receiving a selection of a location at the destination and an interval of the reservation;
c) a reservation server, in communication with the reservation interface and a network, to:
receive the reservation request for the destination, the reservation request including a selected location at the destination and an interval of the reservation;
issue a reservation certificate describing the interval of the reservation based on the reservation request and the selected location; and
transmit, via the network, from the reservation server to a second device distinct from the first device, the reservation certificate and a communication setting corresponding to the access device;
d) an application installed on the second device to receive the reservation certificate and the communication setting corresponding to the access device, wherein the application wirelessly transmits the reservation certificate to the access device using the communication setting upon receipt of a command to activate the door lock; and
e) wherein the access device receives the reservation certificate from the application based on use by the application of the communication setting, and the processor activates the door lock based on at least the receipt of the reservation certificate.

US Pat. No. 10,657,746

ACCESS CONTROL SYSTEM INCLUDING OCCUPANCY ESTIMATION

Robert Bosch GmbH, Stutt...

1. An access control system comprising:an entrance threshold;
a privilege access management module configured to, responsive to an authorization input from a reader, output an access grant to traverse the threshold and generate an approved number of individuals allowed admittance across the threshold during a predetermined period of time;
a depth sensor located above the threshold and configured to generate and output 3-dimensional depth data, wherein the depth sensor is a time of flight sensor or a stereo camera sensor, and generation of the depth data is triggered by a Red Green Blue (RGB) camera, an ultrasonic sensor, a motion detector, or an infrared array sensor;
an occupancy estimation module, in communication with the privilege access management module, configured to output a count of individuals crossing the threshold during the predetermined period of time, that is based on a hemi-ellipsoid model of the depth data, wherein the hemi-ellipsoid model is indicative of a human head traveling across the entrance threshold, and the occupancy estimation module is further configured to detect a rate of motion and trajectory of the hemi-ellipsoid model; and
responsive to a difference between the number and count, inhibit the access grant.
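
A minimal sketch of the final reconciliation step, assuming the privilege access management module supplies the approved number and the occupancy estimation module supplies a depth-based head count for the same period; the class and method names are invented, not Bosch's implementation.

    class AccessGrant:
        """Illustrative grant issued by the privilege access management module."""

        def __init__(self, approved_count):
            self.approved_count = approved_count   # individuals allowed through
            self.active = True

        def reconcile(self, counted_crossings):
            # Inhibit the grant whenever the depth-based head count of people
            # crossing the threshold differs from the approved number.
            if counted_crossings != self.approved_count:
                self.active = False

    grant = AccessGrant(approved_count=1)
    grant.reconcile(counted_crossings=2)   # e.g. tailgating detected by the sensor
    print(grant.active)                    # -> False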

US Pat. No. 10,657,745

AUTONOMOUS CAR DECISION OVERRIDE

Be TopNotch LLC, Henrico...

1. A smart device display for an autonomous vehicle, comprising:a first display section on the smart device that displays an external view of the autonomous vehicle;
a second display section on the smart device that displays vehicle actions that the autonomous vehicle will take; and
a third display section on the smart device that displays alternative vehicle actions that an authorized passenger of the autonomous vehicle may select to override the vehicle actions.

US Pat. No. 10,657,744

ACCESS CONTROL SYSTEM AND METHOD USING ULTRASONIC TECHNOLOGY

Schlage Lock Company LLC,...

1. An access control system, comprising:a mobile device including a microphone and a credential; and
an access control device configured to wirelessly communicate with the mobile device, the access control device including a wireless transceiver and an ultrasonic transmitter;
wherein the access control device is configured to generate an access control device identifier that is transmitted by the ultrasonic transmitter and received by the microphone of the mobile device;
wherein the mobile device is configured to evaluate the access control device identifier to determine a proximity of the mobile device relative to the access control device; and
wherein the wireless transceiver of the access control device is configured to receive the credential from the mobile device.

US Pat. No. 10,657,742

VERIFIED ACCESS TO A MONITORED PROPERTY

Alarm.com Incorporated, ...

1. A monitoring system that is configured to monitor a property, the monitoring system comprising:a sensor that is located at the property and that is configured to generate a biometric identifier of a visitor;
a monitor control unit that is configured to:
receive, from the sensor, the biometric identifier of the visitor;
compare the biometric identifier to a stored biometric identifier of a known visitor;
based on the biometric identifier corresponding to a stored biometric identifier of the known visitor, receive location information that corresponds to locations of the known visitor during a time period before receiving the biometric identifier of the visitor;
based on the biometric identifier corresponding to the stored biometric identifier of the known visitor and the location information of the known visitor during the period of time before receiving the biometric identifier of the visitor, determine a confidence score that reflects a likelihood that the visitor is authorized to access the property;
based on the confidence score that reflects the likelihood that the visitor is authorized to access the property, select, from among multiple monitoring system actions, a monitoring system action; and
perform the monitoring system action.
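
A minimal sketch of how a confidence score could be formed from the biometric match and the prior location information and then mapped to a monitoring system action; the weights, thresholds, and action names are assumptions, not Alarm.com's method.

    def confidence_score(biometric_match, location_consistency):
        """Both inputs assumed in [0, 1]; the weights are illustrative only."""
        return 0.6 * biometric_match + 0.4 * location_consistency

    def select_action(score):
        # Map the confidence score to one of several monitoring system actions.
        if score >= 0.8:
            return "unlock_door"
        if score >= 0.5:
            return "notify_resident"
        return "sound_alarm"

    print(select_action(confidence_score(0.9, 0.7)))   # -> unlock_door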

US Pat. No. 10,657,741

REGULATING ACCESS TO ELECTRONIC ENTERTAINMENT TO INCENTIVIZE DESIRED BEHAVIOR

1. A method comprising:monitoring, by a processor deployed in a communication network, a behavior of an individual through data received from an electronic device associated with the individual;
comparing, by the processor, the behavior to a predefined behavioral goal stored in a profile for the individual, wherein the predefined behavioral goal comprises a target level of physical activity;
identifying, by the processor, a predefined incentive associated with satisfaction of the predefined behavioral goal by the individual, wherein the predefined incentive comprises an access to an electronic entertainment medium, wherein the electronic entertainment medium is associated with a digital service that is accessible via the electronic device associated with the individual over the communication network, wherein the digital service comprises a data service or a video streaming service; and
transmitting, by the processor, an instruction to the electronic entertainment medium to grant the access to the individual when it is determined that the individual has satisfied the predefined behavioral goal, wherein the electronic entertainment medium is deployed in the communication network as a network-based service, wherein the electronic entertainment medium is distinct from the electronic device associated with the individual.

US Pat. No. 10,657,740

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM

NEC CORPORATION, Minato-...

1. An information processing apparatus comprising:a memory configured to store instructions; and
a processor configured to execute the instructions to:
receive a first facial image of a user who is present in front of a display terminal;
compare the first facial image with a second facial image, among a plurality of second facial images registered at a check-in stage; and
display a first display information including a boarding information registered at the check-in stage to be larger than a second display information, which is different from the first display information, when a remaining time is less than or equal to a predetermined threshold, the boarding information associated with the second facial image matching the first facial image of the user.

US Pat. No. 10,657,739

VEHICLE TIRE MONITORING SYSTEMS AND METHODS

Solera Holdings, Inc., W...

1. A system comprising:one or more accelerometers;
one or more processors; and
a memory communicatively coupled to the one or more processors, the memory comprising instructions executable by the one or more processors, the one or more processors being operable when executing the instructions to:
receive on-board diagnostic (OBD) data from an OBD port of a vehicle;
receive tire pressure data from one or more tire pressure monitoring system (TPMS) sensors;
receive accelerometer data from the one or more accelerometers;
determine a daily commute of a driver;
determine, using the accelerometer data, road conditions along the daily commute of the driver;
determine, based on at least some of the OBD data and at least some of the accelerometer data, recommended tires to install on the vehicle;
determine, based on the determined daily commute of the driver, at least some of the OBD data, the road conditions along the daily commute of the driver, and at least some of the tire pressure data, a recommended cold tire pressure for at least one tire of the vehicle; and
send information to display the recommended tires to install on the vehicle and the recommended cold tire pressure on a display device.
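
The cold-pressure step combines the daily commute, OBD data, road conditions, and TPMS readings. A minimal sketch under invented rules (the placard starting value, the adjustments, and the thresholds are not from the patent):

    def recommended_cold_pressure_psi(placard_psi, rough_road_fraction, avg_tpms_psi):
        """Start from the door-placard value and nudge it for commute conditions."""
        psi = placard_psi
        if rough_road_fraction > 0.3:            # accelerometer says mostly rough roads
            psi += 2.0                           # slightly firmer to resist impacts
        if avg_tpms_psi < placard_psi - 3.0:     # chronic underinflation in TPMS history
            psi += 1.0
        return psi

    print(recommended_cold_pressure_psi(35.0, 0.4, 30.0))   # -> 38.0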

US Pat. No. 10,657,738

RECONSTRUCTING AN ACCIDENT FOR A VEHICLE INVOLVED IN THE ACCIDENT

International Business Ma...

1. A method for reconstructing an accident for a vehicle involved in the accident, said method comprising:receiving, by a processor of a computer system from an accident report pertaining to the accident, vehicle data pertaining to the vehicle over a period of time relevant to the accident, said period of time relevant to the accident encompassing I discrete times, wherein I is a positive integer of at least 2; wherein for i=1, 2, . . . , I: the vehicle data comprises Ti, xi, yi, Dxi, and Dyi, wherein Ti denotes time i whose value is an integer, and wherein the vehicle is the only vehicle appearing in the accident report;
said processor identifying locations (xi, yi) determined by a Global Navigation Satellite System (GNSS), such that xi and yi denote a position of the vehicle along an x-axis and a y-axis of a cartesian coordinate system, respectively, at time Ti, wherein Dxi and Dyi are values along the x-axis and y-axis such that (Dxi, Dyi) identifies a direction in which the vehicle is pointing, and wherein Ti+1-Ti≥2 for i=1, 2, . . . , I-1;
for each time interval (ΔT)i from time Ti to time Ti+1 (i=1, 2, . . . , I-1), said processor computing and plotting a trajectory of the vehicle during the accident, said plotting the trajectory comprising plotting on a computer screen a position (XX, YY)j of the vehicle at each time j for j=Ti+1, Ti+2, . . . , Ti+1-1 such that XX and YY denote a position of the vehicle along the x-axis and the y-axis, respectively, at time j, wherein the plotted graph on the computer screen is visible to a user viewing the computer screen, wherein said computing and plotting the position (XX, YY)j of the vehicle at time j utilizes the received vehicle data and identified locations as input and comprises:
determining an integer z that satisfies a condition of Tz≤j<Tz+1, computing a parameter λ according to λ=(j-Tz)/(Tz+1-Tz),
computing XX at time j as a function of λ, xi, xi+1, Dxi, and Dxi+1,
computing YY at time j as a function of λ, yi, yi+1, Dyi, and Dyi+1; and
plotting XX and YY at time j as a spatial point on a graph in the cartesian coordinate system;
after said computing and plotting a position (XX, YY)j for all said times j for i=1, 2, . . . , I-1, said processor sending the graph of the plotted spatial points to an output device of the computer system;
determining, utilizing the plotted graph, whether the vehicle is speeding in each time interval (ΔT)i (i=1, 2, . . . , I-1) by:
computing, utilizing the plotted graph, an average speed (Vi) of the vehicle for each time interval (ΔT)i from time Ti to time Ti+1 (i=1, 2, . . . , I-1) according to (Distance Traveled)/(Time of Travel) wherein Distance Traveled in time interval (ΔT)i is a function of xi, yi, xi+1, and yi+1, and wherein Time of Travel in time interval (ΔT)i is a function of Ti and Ti+1,
determining, utilizing the plotted graph, whether the average speed Vi of the vehicle for each time interval (ΔT)i exceeds a specified speed threshold (Vth) equal to a speed limit for a road on which the accident occurred,
determining that the vehicle is speeding in time interval (ΔT)i (i=1, 2, . . . , I-1) in response to a determination that Vi exceeds Vth,
determining that the vehicle is not speeding in time interval (ΔT)i (i=1, 2, . . . , I-1) in response to a determination that Vi does not exceed Vth; and
determining whether the vehicle is skidding at each time Ti (i=1, 2, . . . , I-1) by:
determining, utilizing the plotted graph, whether the vehicle has an Orientation (ORIENTi) at time Ti that exceeds a specified skid threshold (SKIDth), said Orientation (ORIENTi) at time Ti being measured by (Dxi, Dyi),
determining, utilizing the plotted graph, that the vehicle is skidding at time Ti (i=1, 2, . . . , I-1) in response to a determination that ORIENTi exceeds SKIDth,
determining, utilizing the plotted graph, that the vehicle is not skidding at time Ti (i=1, 2, . . . , I-1) in response to a determination that ORIENTi does not exceed SKIDth;
reconstructing the accident for the vehicle, utilizing: said plotting the trajectory of the vehicle during the accident, said determining whether the vehicle is speeding in each time interval (ΔT)i (i=1, 2, . . . , I-1), and said determining whether the vehicle is skidding at each time Ti (i=1, 2, . . . , I-1);
making a determination, from the reconstructed accident, that the vehicle engaged in skidding, including uncontrollable sliding, during the accident.
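
Restated in conventional notation, one reading of the claim's formulas is as follows; the interpolation functions $f_x$ and $f_y$ are left abstract, as in the claim, and the Euclidean distance is assumed for "Distance Traveled". For an intermediate time $j$ with $T_z \le j < T_{z+1}$:

\[
\lambda = \frac{j - T_z}{T_{z+1} - T_z}, \qquad
XX_j = f_x(\lambda,\, x_i,\, x_{i+1},\, Dx_i,\, Dx_{i+1}), \qquad
YY_j = f_y(\lambda,\, y_i,\, y_{i+1},\, Dy_i,\, Dy_{i+1}),
\]

and the speeding test for interval $(\Delta T)_i$ compares

\[
V_i = \frac{\sqrt{(x_{i+1}-x_i)^2 + (y_{i+1}-y_i)^2}}{T_{i+1}-T_i}
\]

against the speed limit $V_{th}$.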

US Pat. No. 10,657,737

VEHICLE ERROR IDENTIFICATION SYSTEM

1. A diagnostic system for diagnosing a vehicle or a vehicle model, comprising:an input device configured to receive user input including desirable vehicle performance characteristics of the vehicle corresponding to target operation of vehicle performance, desirable system performance characteristics of systems of the vehicle corresponding to target operation of the systems, and desirable component performance characteristics corresponding to target operation of components of the systems;
an output device configured to output data; and
a diagnostic processor coupled to the input device and the output device and configured to:
receive test data corresponding to a simulation of the vehicle or a performance test of the vehicle and including detected vehicle data, detected system data, and detected component data,
determine undesirable vehicle performance data points that correspond to vehicle performance that falls below the target operation of the vehicle performance by comparing the desirable vehicle performance characteristics to the detected vehicle data,
determine undesirable system data points that correspond to likely causes of the undesirable vehicle performance data points by comparing the desirable system performance characteristics to the detected system data, the undesirable system data points corresponding to system performance that falls below the target operation of the systems,
determine undesirable component data points that correspond to likely causes of the undesirable system data points by comparing the desirable component performance characteristics to the detected component data, the undesirable component data points corresponding to component performance that falls below the target operation of the components,
generate an analysis of the undesirable component data points including selected component data points of the undesirable component data points that are likely causes of the undesirable vehicle performance data points,
identify unlikely root cause data points corresponding to at least one of the undesirable component data points that was caused by operator error or an incorrectly-set desirable component characteristic,
prioritize remaining undesirable component data points over the unlikely root cause data points in the analysis, and
control the output device to output the analysis of the undesirable component data points.

US Pat. No. 10,657,736

SYSTEM AND METHOD FOR AIRCRAFT FAULT DETECTION

The Boeing Company, Chic...

1. An aircraft fault detection system comprising:at least one aircraft data logging device configured to capture parametric flight data from at least one aircraft subsystem; and
an aircraft controller coupled to the data logging device, the aircraft controller being configured to
group the parametric flight data from the at least one aircraft subsystem into a plurality of test states, the plurality of test states being determined from a plurality of training states and one or more of the test states is different from other test states in the plurality of test states,
generate at least one test transition matrix based on the plurality of test states and determine anomalous behavior of the at least one aircraft subsystem based on the at least one test transition matrix, and
forecast a fault within the at least one aircraft subsystem based on the anomalous behavior of the at least one aircraft subsystem determined from the at least one test transition matrix.
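
A minimal sketch of the transition-matrix comparison, assuming states have already been assigned to the parametric flight data; the distance measure and threshold are illustrative assumptions, not Boeing's algorithm.

    import numpy as np

    def transition_matrix(state_sequence, n_states):
        """Row-normalized matrix of transitions between consecutive states."""
        m = np.zeros((n_states, n_states))
        for a, b in zip(state_sequence[:-1], state_sequence[1:]):
            m[a, b] += 1.0
        row_sums = m.sum(axis=1, keepdims=True)
        return np.divide(m, row_sums, out=np.zeros_like(m), where=row_sums > 0)

    def is_anomalous(test_states, training_states, n_states=4, threshold=0.25):
        # Flag anomalous behavior when any transition probability in the test
        # matrix drifts from the training matrix by more than the threshold.
        diff = transition_matrix(test_states, n_states) - \
               transition_matrix(training_states, n_states)
        return bool(np.abs(diff).max() > threshold)

    print(is_anomalous([0, 1, 2, 3, 3, 3], [0, 1, 2, 3, 0, 1, 2, 3]))   # -> True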

US Pat. No. 10,657,735

SYSTEM AND METHOD FOR ADAPTABLE TREND DETECTION FOR COMPONENT CONDITION INDICATOR DATA

TEXTRON INNOVATIONS INC.,...

16. A method, comprising:acquiring a current condition indicator of a condition indicator set associated with an operating condition of a vehicle, the condition indicator set indicating sensor readings associated with an operating element of the vehicle under the operating condition;
determining, by a data server, a volatility of a first portion of the condition indicator set, wherein the first portion of the condition indicator set includes the current condition indicator;
determining, by the data server, one or more moving averages of a second portion of the condition indicator set;
determining, by the data server, whether a trend associated with the operating element is indicated according to the one or more moving averages and the volatility; and
generating, by the data server, an alert signal in response to the determining that the trend is indicated.
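
A minimal sketch of one way the moving averages and the volatility could be combined into a trend decision; the window sizes and the multiplier k are invented, and the real method may treat the two portions of the condition indicator set differently.

    import statistics

    def trend_indicated(condition_indicators, short_window=5, long_window=20, k=2.0):
        """True when the short/long moving averages diverge by more than
        k times the recent volatility of the condition indicators."""
        if len(condition_indicators) < long_window:
            return False
        short_avg = statistics.fmean(condition_indicators[-short_window:])
        long_avg = statistics.fmean(condition_indicators[-long_window:])
        volatility = statistics.stdev(condition_indicators[-short_window:])
        return abs(short_avg - long_avg) > k * volatility

    readings = [1.0] * 18 + [1.4, 1.5, 1.6, 1.7, 1.8]   # rising condition indicator
    print(trend_indicated(readings))                     # -> True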

US Pat. No. 10,657,734

VEHICLE RUNNING TEST SYSTEM, PROGRAM FOR VEHICLE RUNNING TEST SYSTEM, AND VEHICLE RUNNING TEST METHOD

HORIBA, LTD., Kyoto (JP)...

1. A vehicle running test system comprising:a vehicle speed pattern display apparatus configured to display a prescribed speed pattern and current vehicle speed on a graph with one axis as vehicle speed and the other axis as time or running distance;
a display control part configured to, while a vehicle is being driven, simultaneously display (i) information separate from the vehicle speed based on a driving index indicating a driving state of the vehicle and (ii) the graph; and
a calculation part configured to calculate the driving index at a predetermined time interval or a predetermined distance interval,
wherein the display control part successively displays information based on the driving index,
wherein the driving index is at least one of a first driving index, a second driving index, a third driving index, a fourth driving index, a fifth driving index, or a sixth driving index,
wherein the first driving index is a difference or a ratio between a reference integrated workload of the vehicle calculated on a basis of the speed pattern and an actual integrated workload of the vehicle, wherein the reference integrated workload is work that would be done by the vehicle and wherein the actual integrated workload is actual work done by the vehicle,
wherein the second driving index is a difference or a ratio between a reference integrated distance calculated on a basis of the speed pattern and an actual integrated distance of the vehicle,
wherein the third driving index is a difference or a ratio between (the reference integrated distance/the reference integrated workload) and (the actual integrated distance/the actual integrated workload),
wherein the fourth driving index is a difference or a ratio between a reference instantaneous acceleration of the vehicle calculated on a basis of the speed pattern and actual instantaneous acceleration of the vehicle,
wherein the fifth driving index is a difference or a ratio between a reference inertial workload of the vehicle calculated on a basis of the speed pattern and an actual inertial workload of the vehicle, and
wherein the sixth driving index is a square root of a sum of squares of speed differences obtained at intervals of one second.
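
As a concrete reading of the sixth driving index, with $v_{\mathrm{ref}}(t_k)$ the speed prescribed by the speed pattern and $v(t_k)$ the measured vehicle speed sampled once per second (the claim does not spell out which two speeds are differenced, so this pairing is an assumption):

\[
I_6 = \sqrt{\sum_{k} \bigl( v_{\mathrm{ref}}(t_k) - v(t_k) \bigr)^2 }
\]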

US Pat. No. 10,657,733

DYNAMICALLY MANAGING PARKING SPACE UTILIZATION

International Business Ma...

1. A method for dynamically managing parking space utilization comprising:receiving a dynamic feed of available parking spaces across a set of sectors, each parking space associated with a parking control device, each sector having a preferred utilization rate;
utilizing a processor to determine that a parking utilization rate of a first sector exceeds a preferred utilization rate for the first sector; and
providing to a user a list of available parking spaces including associated prices dynamically generated, wherein parking spaces in the first sector are priced higher than parking spaces in other sectors.

US Pat. No. 10,657,732

METHOD AND SYSTEM FOR LEGAL PARKING

OPERR TECHNOLOGIES, INC.,...

1. A computer-implemented method for providing legal parking guidance, the method comprising:storing, on a server, legal parking related data having a data type, wherein the data type is categorized based on at least a vehicle-type and a vehicle-plate-type;
receiving, by the server, from a user computing device, (i) user data comprising a user type corresponding to the data type of the legal parking related data, and (ii) real-time location data associated with one or more locations of a user;
identifying, by the server, one or more potentially available legal parking locations based on the legal parking related data, the location data, and the user type, wherein the user type is categorized based on at least the vehicle-type and the vehicle-plate-type; and
transmitting, by the server, to the user computing device, at least a portion of data associated with at least one of the one or more potentially available legal parking locations.

US Pat. No. 10,657,731

PROCESSING 3D IMAGES TO ENHANCE VISUALIZATION

1. An apparatus comprising:an image processing system that generates a three-dimensional image comprising 3D pixel element type voxels corresponding to image data of a scanned volume, the image processing system comprising visual representation adjustment logic, the visual representation adjustment logic comprising:
selecting a feature for demarcation within a volume of interest;
selecting a first viewpoint for a left eye;
selecting a first viewing angle for the left eye;
performing a first voxel adjustment for demarcation of the feature based on the first viewpoint for the left eye and the first viewing angle for the left eye;
displaying, in a head display unit (HDU), a first image for the left eye based on the first viewpoint for the left eye, the first viewing angle for the left eye and said volume of interest with the first voxel adjustment;
selecting a second viewpoint for a right eye wherein said first viewpoint for the left eye and said second viewpoint for the right eye are different viewpoints;
selecting a second viewing angle for the right eye;
performing a second voxel adjustment for demarcation of the feature based on the second viewpoint for the right eye and the second viewing angle for the right eye wherein the second voxel adjustment is different from the first voxel adjustment; and
displaying, in the HDU, a second image for the right eye based on the second viewpoint for the right eye, the second viewing angle for the right eye and the volume of interest with the second voxel adjustment and wherein the first image for the left eye and the second image for the right eye produce a three-dimensional image to a user.

US Pat. No. 10,657,730

METHODS AND DEVICES FOR MANIPULATING AN IDENTIFIED BACKGROUND PORTION OF AN IMAGE

BlackBerry Limited, Wate...

1. A method, implemented by a processor of an electronic device, for manipulation of an image, the method comprising:receiving image data, the image data including a first image obtained from a first camera and a second image obtained from a second camera, the first camera and the second camera being oriented in a common direction;
identifying a background portion in the image data by analyzing the first image and the second image to obtain depth information, wherein the analyzing includes identifying one or more boundaries associated with an object in the image data based on the depth information; and
displaying a manipulated image based on the image data, wherein the manipulated image includes manipulation of the identified background portion.

US Pat. No. 10,657,729

VIRTUAL VIDEO PROJECTION SYSTEM TO SYNCH ANIMATION SEQUENCES

Trimble Inc., Sunnyvale,...

1. A method of synchronizing an animation sequence with a video footage, the method comprising:obtaining a digital three-dimensional (3D) model of a site captured in the video footage by a video camera installed in the site, the video camera having a set of camera parameters;
identifying a plurality of key frames of the video footage, wherein a moving object is at a respective position in a respective key frame of the plurality of key frames;
placing a first virtual camera in the digital 3D model at a first location and a first orientation corresponding to a location and an orientation of the video camera in the site, the first virtual camera having a first set of virtual camera parameters, at least some virtual camera parameters of the first set of virtual camera parameters are same as some corresponding camera parameters of the set of camera parameters of the video camera;
generating a first set of virtual frames of the animation sequence by projecting the digital 3D model onto a first scene frame from a viewpoint of the first virtual camera using the first set of virtual camera parameters, the first set of virtual frames including a first plurality of virtual key frames, each virtual key frame corresponding to a respective key frame in the video footage;
rendering at least a subset of a set of pixels of the video footage in each virtual frame of the first set of virtual frames;
for each respective virtual key frame, placing a virtual object corresponding to the moving object at a respective location in the digital 3D model that matches with the respective position of the moving object in a corresponding key frame; and
playing the first set of virtual frames of the animation sequence while the video footage is overlaid on the subset of the set of pixels, wherein the animation sequence and the video footage are played simultaneously by stepping through time such that the first plurality of virtual key frames is in synch with the plurality of key frames of the video footage.

US Pat. No. 10,657,728

AUGMENTED REALITY PROJECTION DEVICES, METHODS, AND SYSTEMS

Verizon Patent and Licens...

1. An augmented reality projection device comprising:an image sensor having a field of view into a real-world environment;
a spatial sensor configured to detect a location and an orientation of the augmented reality projection device in the real-world environment;
a projector configured to project content onto a physical surface within the real-world environment; and
a processor that is coupled to the image sensor, the spatial sensor, and the projector, and that is configured to:
determine, based on sensor output from the spatial sensor, the location and orientation of the augmented reality projection device,
determine, based on the determined location and orientation and based on a particular target object profile from a library of predetermined target object profiles, that a virtual target object associated with the particular target object profile is included within the field of view of the image sensor,
identify content associated with the virtual target object, and
direct the projector to project the content onto the physical surface within the real-world environment, the physical surface associated with the virtual target object.

US Pat. No. 10,657,727

PRODUCTION AND PACKAGING OF ENTERTAINMENT DATA FOR VIRTUAL REALITY

WARNER BROS. ENTERTAINMEN...

11. An apparatus for outputting at least one of augmented reality (AR) output or a virtual reality (VR) output, comprising:a processor,
a memory coupled to the processor, and
a stereoscopic display device coupled to the processor,
wherein the memory holds instructions that when executed by the processor, cause the apparatus to perform:
providing a data signal configured for causing the apparatus to output one of an augmented reality (AR) output or a virtual reality (VR) output when the data signal is processed by the processor, the data signal comprising a plurality of scripted events grouped in one or more event groups, and a narrative ruleset defining a chain of event groups, wherein each event group comprises at least one critical event and a number of optional events, the number of optional events being zero or more;
receiving a sensor input from a sensor configured to detect at least one of eye movement or orientation indicating a user navigation, a viewpoint rotation of the user, a user interaction with the immersive content, a view direction of the user, a focus depth of the user, or a combination thereof; and
controlling a narrative pace of the scripted events defined by the narrative ruleset based on the received sensor input, wherein the processor generates continuous video based on when the scripted events occur;
wherein the memory holds further instructions for maintaining a predetermined order of the scripted events according to the narrative ruleset; and
wherein the memory holds further instructions for performing the maintaining at least in part by varying an order of the scripted events based on the sensor input, subject to a narrative hierarchy, wherein the narrative hierarchy defines narrative relationships between groups of events and permits events within each group to occur in any chronological order based on the sensor input.

US Pat. No. 10,657,726

MIXED REALITY SYSTEM AND METHOD FOR DETERMINING SPATIAL COORDINATES OF DENTAL INSTRUMENTS

International Osseointegr...

1. An intra-oral image sensor-based positioning device for determining spatial coordinates of a dental instrument comprising:an algorithm for calculating and processing coordinate data, a processor for implementing the algorithm, a storage for storing coordinate data, a first camera, and a second camera wherein the first camera and the second camera are attached to a bite block, wherein:
the first camera having a first field of view, configured to sense and locate a point on a dental instrument on a first plane corresponding to the first field of view; the point on the dental instrument is assigned a first set of two-dimensional coordinates with respect to the first plane;
the second camera having a second field of view; a second plane corresponding to the second field of view wherein the point on the dental instrument is also located on the second plane and is assigned a second set of two-dimensional coordinates with respect to the second plane;
the algorithm configured to provide three-dimensional coordinates assigned to the point on the dental instrument according to a calculation based on the first set of two-dimensional coordinates and the second set of two-dimensional coordinates;
the first set of two-dimensional coordinate points, the second set of two-dimensional coordinate points, and the three-dimensional coordinate points are transmitted to the processor or the storage.
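
The claim leaves the coordinate calculation abstract. A minimal sketch under a deliberately simplified assumption, with the two bite-block cameras treated as orthographic views onto perpendicular planes; a real system would use calibrated projective triangulation.

    def triangulate(point_cam1, point_cam2):
        """Combine (x, y) from the first camera with (x, z) from the second."""
        x1, y1 = point_cam1
        x2, z2 = point_cam2
        x = (x1 + x2) / 2.0        # x is observed by both cameras; average them
        return (x, y1, z2)

    print(triangulate((10.0, 4.0), (10.2, 7.5)))   # -> (10.1, 4.0, 7.5)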

US Pat. No. 10,657,725

AUGMENTED VIRTUAL REALITY OBJECT CREATION

Flow Immersive, Inc., Au...

1. A method comprising:converting real world data into one or more augmented virtual reality (AVR) objects;
enhancing the one or more AVR objects to include at least one of processed data visualization and multiuser controls;
positioning the enhanced one or more AVR objects in a virtual space-time;
making available, as AVR media, a scene tree including the virtual space-time in which the enhanced one or more AVR objects are positioned, wherein:
each scene in the scene tree is capable of embedding an unlimited number of additional scenes therein;
each scene in the scene tree is capable of being shown in relation to other scenes in the scene tree;
applying a skyscape to a boundary of the virtual space-time, the skyscape including features that do not change as an audience member of the AVR media moves toward or away from the boundary of the virtual space-time.

US Pat. No. 10,657,724

KEY LIGHTS DIRECTION DETECTION

THOMSON Licensing, Cesso...

1. A method of generating virtual lighting for an object according to an input image captured by a camera, the method comprising:dividing the input image in blocks of pixels;
for at least one block, determining a pixel of the block with a highest luminance and associating coordinates and luminance of the determined pixel with the block;
determining a final block with a highest luminance by iterating:
generating a new input image using said blocks associated with a luminance and coordinates as pixels of the new input image;
dividing the new input image in second blocks; and
for each second block, determining a pixel of said second block associated with a highest luminance and associating coordinates and luminance of the determined pixel with the second block until said new input image is a one-pixel image; and
generating virtual lighting for said object from a main lighting source for which a main lighting direction is determined according to coordinates associated with said final block by mapping the input image on a portion of a sphere determined according to a field of view of the camera.
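
A minimal sketch of the iterative reduction that yields the final brightest block: each pass keeps, per block, the luminance and original coordinates of its brightest pixel until a single pixel remains. The block size is invented and the final mapping onto the sphere is omitted.

    import numpy as np

    def brightest_pixel(luma, block=4):
        """Return ((row, col), luminance) of the key light in a luminance image."""
        h, w = luma.shape
        # coords[y, x] carries the original (row, col) of each surviving pixel.
        coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
        while luma.size > 1:
            h, w = luma.shape
            bh, bw = (h + block - 1) // block, (w + block - 1) // block
            new_luma = np.empty((bh, bw))
            new_coords = np.empty((bh, bw, 2), dtype=int)
            for by in range(bh):
                for bx in range(bw):
                    tile = luma[by * block:(by + 1) * block, bx * block:(bx + 1) * block]
                    iy, ix = np.unravel_index(np.argmax(tile), tile.shape)
                    new_luma[by, bx] = tile[iy, ix]
                    new_coords[by, bx] = coords[by * block + iy, bx * block + ix]
            luma, coords = new_luma, new_coords
        return tuple(int(v) for v in coords.reshape(2)), float(luma[0, 0])

    img = np.zeros((16, 16))
    img[11, 5] = 1.0
    print(brightest_pixel(img))   # -> ((11, 5), 1.0)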

US Pat. No. 10,657,723

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

Canon Kabushiki Kaisha, ...

1. An image processing apparatus that embeds additional information relating to an object that is displayed in a superimposing manner on a captured image, in an original image, the image processing apparatus comprising:a determining unit configured to determine whether a direction of the original image is a landscape or a portrait; and
an embedment unit configured to embed the additional information in the original image,
wherein the additional information is information capable of at least specifying a type of the object and a display direction, of the object with respect to a display screen, in a case of displaying the object in a superimposing manner on the captured image, and
wherein the embedment unit embeds the additional information based on the determination by the determining unit, such that the display direction changes in accordance with whether the direction of the original image is the landscape or the portrait.

US Pat. No. 10,657,722

TRANSMISSIVE DISPLAY DEVICE, DISPLAY CONTROL METHOD, AND COMPUTER PROGRAM

SEIKO EPSON CORPORATION, ...

1. A transmissive display device comprising:a first display section having a light transmissive property, and adapted to display a first pointer image so as to be superimposed on an external world visually recognized through the first display section;
a second display section having a light transmissive property, and adapted to display a second pointer image so as to be superimposed on the external world visually recognized through the second display section;
a display control section adapted to control a display configuration of the first pointer image and the second pointer image as a parallax image; and
an imaging section adapted to take a first image of the external world and a second image of the external world by a stereo camera,
wherein the display control section is configured to, when each of the first image and the second image includes an object, a position of the object in the first image is different from a position of the object in the second image, and the first pointer image and the second pointer image overlap the object in the first image and the object in the second image, respectively:
change at least one of a color of the first pointer image and a pattern of the first pointer image based on a feature quantity of the object in the first image, and
change at least one of a color of the second pointer image and a pattern of the second pointer image based on a feature quantity of the object in the second image.

US Pat. No. 10,657,721

SYSTEMS AND METHODS FOR PROVIDING AUGMENTED REALITY SUPPORT FOR VEHICLE SERVICE OPERATIONS

PACCAR INC, Bellevue, WA...

1. A method of providing an augmented reality presentation of a component of a vehicle using a virtual reality/augmented reality (VR/AR) device, the method comprising:receiving, by the VR/AR device, an indication of a vehicle type;
retrieving, by the VR/AR device, a virtual object model file associated with the vehicle type;
presenting, by the VR/AR device, an instruction to attach a physical marker to the vehicle at a predetermined location specified in the virtual object model file;
capturing, by a camera of the VR/AR device, at least one image, the at least one image including the physical marker;
receiving, by the VR/AR device, a selection of an assembly of the vehicle;
determining, by the VR/AR device and based on the at least one image, a position and an orientation of the VR/AR device with respect to the physical marker; and
presenting, by the VR/AR device and based on the determined position and orientation of the VR/AR device, an augmented reality depiction of the vehicle including both an output of the camera and an assembly detail model that is included in the virtual object model file and is associated with the selected assembly, wherein the selected assembly comprises a portion of the vehicle that is not visible via the camera of the VR/AR device from the determined position.

US Pat. No. 10,657,720

VIRTUAL REALITY SYSTEM HAVING ADAPTIVE CONTROLLING FUNCTION AND CONTROLLING METHOD THEREOF

ACER INCORPORATED, New T...

1. A controlling method of a virtual reality system, comprising:obtaining a sensing signal by a head-mounted display device worn on a user; and
adaptively controlling a procedure of transmitting a virtual reality content to the head-mounted display device by a host or a portable device according to the sensing signal;
wherein in the step of adaptively controlling the procedure of transmitting the virtual reality content, a compression ratio, a frame rate or a saturation of the virtual reality content is controlled, and
wherein when the sensing signal indicates the user is dazed, a frame shaking of the virtual reality content is reduced.

US Pat. No. 10,657,719

HOUSEHOLD APPLIANCE CONTROLLED BY USING A VIRTUAL INTERFACE

ARCELIK ANONIM SIRKETI, ...

1. A household appliance comprising:a projector configured to project an image of a virtual interface which enables control of household appliances within a projection area on a surface,
a movement sensor configured to detect gestures on the projection area,
a temperature sensor configured to measure a temperature of surfaces or objects on the projection area,
a control unit that wirelessly communicates with the household appliances to receive and send data, that includes a memory for storing data, and that is configured to:
control the projector and the household appliances to which the projector is connected, based on data received from the movement sensor,
monitor the temperature of the projection area, and
draw a temperature map of the projection area based on data received from the temperature sensor, and
control content of the image projected on the projection area according to the temperature of the projection area.

US Pat. No. 10,657,718

FACIAL EXPRESSION TRACKING DURING AUGMENTED AND VIRTUAL REALITY SESSIONS

Wells Fargo Bank, N.A., ...

1. A visual computing device that can be worn by a user, the visual computing device comprising:a processing unit;
system memory;
a display unit on which one or more virtual images can be projected; and
a plurality of cameras, the plurality of cameras being oriented in an inwardly-facing direction towards a face of the user to capture facial reactions of the user;
wherein the system memory encodes instructions that, when executed by the processing unit, cause the visual computing device to:
display, by the display unit, an image to the user, the image being of an item the user may be interested in purchasing;
capture, using one or more of the plurality of cameras, a facial expression of the user as a result of the user viewing the image on the display unit, the facial expression comprising a reaction of the user to the image;
send the captured facial expression to an electronic computing device;
receive, from the electronic computing device, a message identifying an emotion of the user corresponding to the facial expression, the emotion of the user being identified via a comparison of the captured facial expression with a previous facial expression of the user obtained during a training period; and
when the emotion corresponding to the captured facial expression is a negative emotion, display, by the display unit, an image of a different item the user may be interested in purchasing that may create a positive emotional response for the user, wherein the image of the item includes a street scene and the item is a home on the street scene, and wherein the image of the different item includes a different street scene in a different geographical location than the image of the item.

US Pat. No. 10,657,717

SIMULATOR WITH MULTIPLE RECONFIGURABLE THREE-DIMENSIONAL COCKPIT VIEWS RENDERED IN REAL-TIME

Lockheed Martin Corporati...

1. A method comprising:maintaining, by a computing device comprising a processing device and at least one graphics processing unit (GPU), during a simulation, a cockpit model comprising a plurality of cockpit model parts that collectively correspond to a simulated cockpit in a simulated vehicle; and
for each frame of a plurality of frames:
determining, by the processing device, a plurality of cockpit view frustums, each cockpit view frustum corresponding to a different cockpit view of a plurality of cockpit views of the simulated cockpit;
based on the plurality of cockpit view frustums, generating shared cockpit scene information comprising a set of cockpit model parts that are within any of the plurality of cockpit views;
submitting, by the processing device to the at least one GPU, the shared cockpit scene information and GPU instructions that direct the at least one GPU to generate a plurality of cockpit view images that correspond to the plurality of cockpit views from the shared cockpit scene information; and
generating, by the at least one GPU, the plurality of cockpit view images.

US Pat. No. 10,657,716

COLLABORATIVE AUGMENTED REALITY SYSTEM

CALIFORNIA INSTITUTE OF T...

1. A method for controlling navigation of a three-dimensional (3D) computer aided design (CAD) model in an augmented reality space comprising:(a) rendering the 3D CAD model in the augmented reality space, wherein the 3D CAD model appears as if the 3D CAD model is present in a physical space at true scale; and
(b) for each rendered frame of the rendering:
(1) defining a virtual camera fixed to a current pose of a user's head;
(2) constructing a virtual line segment S coincident with a ray R from a center of projection P of the virtual camera and a center pixel of the virtual camera wherein the virtual line segment S starts at a pre-defined minimum distance from the projection P and ends at a pre-defined maximum distance from the projection P;
(3) checking for geometric intersections between the virtual line segment S and surfaces of scene elements, wherein the scene elements comprise one or more parts of the 3D CAD model and user interface objects;
(4) determining that there is a geometric intersection with one or more parts of the 3D CAD model; and
(5) rendering a gaze cursor at an intersection point C closest to the center of projection P, wherein the gaze cursor comprises a visually distinguishable indicator.
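
A minimal sketch of steps (2) through (5), with bounding spheres standing in for the surfaces of scene elements (a real implementation would intersect the segment with the CAD part meshes); the clip distances are invented.

    import numpy as np

    NEAR, FAR = 0.2, 50.0   # assumed pre-defined min/max distances of segment S

    def gaze_cursor_point(cam_pos, cam_forward, spheres):
        """Intersect the gaze segment with (center, radius) spheres and return
        the hit closest to the center of projection P, or None."""
        p = np.asarray(cam_pos, dtype=float)
        d = np.asarray(cam_forward, dtype=float)
        d /= np.linalg.norm(d)
        best_t = None
        for center, radius in spheres:
            oc = p - np.asarray(center, dtype=float)
            b = np.dot(oc, d)
            disc = b * b - (np.dot(oc, oc) - radius * radius)
            if disc < 0.0:
                continue                      # ray misses this element
            t = -b - np.sqrt(disc)            # nearer of the two intersection roots
            if NEAR <= t <= FAR and (best_t is None or t < best_t):
                best_t = t
        # the gaze cursor is rendered at this point (None means no hit)
        return None if best_t is None else p + best_t * d

    print(gaze_cursor_point([0, 0, 0], [0, 0, 1], [([0, 0, 5.0], 1.0)]))   # -> [0. 0. 4.]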

US Pat. No. 10,657,715

SYSTEMS AND METHODS FOR VISUALIZING AND ANALYZING CARDIAC ARRHYTHMIAS USING 2-D PLANAR PROJECTION AND PARTIALLY UNFOLDED SURFACE MAPPING PROCESSES

St. Jude Medical, St. Pa...

1. A method of rendering a graphical representation from a 3D surface geometry of a chamber, the method comprising:obtaining a 3D surface geometry of a chamber;
identifying a first surface section on the 3D surface geometry having a first feature of the chamber;
selecting first and second points on the 3D surface geometry to form a cutting curve;
unfolding the 3D surface geometry at the cutting curve to render a 2D representation of the chamber;
generating a conformal map from the 3D surface geometry, wherein the first and second points are selected on the conformal map; and
adaptively resampling the conformal map before selecting the first and second points; and
wherein the first and second points are selected to optimize representation of the first feature in the graphical representation.

US Pat. No. 10,657,714

METHOD AND SYSTEM FOR DISPLAYING AND NAVIGATING AN OPTIMAL MULTI-DIMENSIONAL BUILDING MODEL

HOVER, Inc., San Francis...

1. A method of calculating an optimal camera position within a multi-dimensional building model, the method comprises:defining a look angle based at least partially on information obtained from a camera used during image capture of a building;
defining a field of view by defining up, down, left and right angles which define an extent of the building;
calculating a camera first main axis and a second main axis of the building to define camera orientation within the multi-dimensional building model;
calculating the optimal camera position based on the look angle, the field of view and the camera orientation; and
storing the optimal camera position in computer storage.

US Pat. No. 10,657,713

METHODS AND SYSTEMS FOR WIREFRAMES OF A STRUCTURE OR ELEMENT OF INTEREST AND WIREFRAMES GENERATED THEREFROM

Pointivo, Inc., Atlanta,...

1. A method for generating verified wireframes of structures or elements from at least one 2D image or a 3D representation of a scene, comprising:a. presenting, to a user, at least one 2D image or a 3D representation of a scene having a plurality of regions;
b. identifying, by either or both of user interaction or by a computer, a region of interest from the plurality of regions;
c. identifying, by either or both of user interaction or by the computer, a first structure or element in the region of interest, thereby providing a first identified structure or element;
d. generating, by either or both of user interaction or by the computer, an unverified wireframe for the first identified structure or element;
e. verifying, by user interaction, the unverified wireframe as corresponding to the first identified structure or element, thereby providing a user-verified wireframe for the first identified structure or element; and
f. recording, by the computer, information associated with user interactions verification for use in a machine learning system configured to simulate user action in subsequent wireframe verification processes.

US Pat. No. 10,657,712

SYSTEM AND TECHNIQUES FOR AUTOMATED MESH RETOPOLOGY

1. A method of retopologizing an object model comprising:maintaining a plurality of retopologized object models, each respective retopologized model of the retopologized object models associated with a respective mesh data, each respective mesh data composed of a plurality of regions which make up the respective retopologized model;
receiving a first object model comprising 3D data associated with an object;
segmenting the first object model into the plurality of regions by assigning, to each of the plurality of regions, separate portions of the 3D data of the first object model;
identifying, for each of the plurality of regions, a closest matching corresponding region from the plurality of regions which make up at least one second object model of the retopologized object models;
determining, for each closest matching corresponding region, mesh data associated with that closest matching corresponding region; and
generating, from the mesh data for each closest matching corresponding region, a retopologized object model by combining the mesh data determined for each closest matching corresponding region.

US Pat. No. 10,657,711

SURFACE RECONSTRUCTION FOR INTERACTIVE AUGMENTED REALITY

Intel Corporation, Santa...

1. An electronic processing system, comprising:a processor;
a depth sensor communicatively coupled to the processor; and
logic communicatively coupled to the processor and the depth sensor to:
perform depth sensor fusion to determine depth information for a surface,
smooth the depth information for the surface and preserve edge information for the surface based on adaptive smoothing with self-tuning band-width estimation,
iteratively remove holes from the surface based on conditional iterative manifold interpolation,
reduce one or more of a file size and an on-memory storage size of data corresponding to the surface based on triangular edge contraction, and
construct at least a portion of a 3D model based on data corresponding to a visible portion of the surface.

US Pat. No. 10,657,710

DISPLACEMENT DIRECTED TESSELLATION

SONY INTERACTIVE ENTERTAI...

1. A method for rendering computer graphics, the method comprising:creating a displacement map for a plurality of surfaces;
sampling the plurality of surfaces in the created displacement map;
initializing a tessellation process;
determining a tessellation density for a first set of the surfaces based on the displacement map;
assigning a texture to the tessellation factor scale based on identifying that a tessellation factor scale corresponding to the tessellation density is associated with an amount of computational resources above a performance threshold;
modifying the first set of surfaces by applying the assigned texture to each surface of the first set of surfaces;
determining to decrease the tessellation density for a second set of the surfaces based on the displacement map; and
modifying the second set of surfaces by decreasing a tessellation factor scale for each surface of the second set of surfaces.

US Pat. No. 10,657,709

GENERATION OF BODY MODELS AND MEASUREMENTS

Fit3D, Inc., Redwood Cit...

1. A system for generating measurements of a human body, comprising:an image capturing device configured to capture one or more images of the human body;
a data store configured to store pre-processed fiducial maps of human bodies;
one or more processors configured to:
receive the one or more images of the human body from the image capturing device;
identify body landmarks for the human body based on the one or more images;
generate a fiducial map of the human body using the body landmarks and body landmark measurements, wherein the fiducial map corresponds to a 3D body model of the human body;
compare the fiducial map of the human body to the pre-processed fiducial maps of human bodies stored in the data store;
identify, based on the comparison, a plurality of pre-processed fiducial maps of human bodies when a correlation value between the fiducial map and the pre-processed fiducial map exceeds a defined threshold;
generate a silhouette image of the human body based on the one or more images of the human body;
compare the silhouette image of the human body to pre-processed silhouette images associated with the plurality of pre-processed fiducial maps of human bodies;
identify, based on the comparison, one of the pre-processed silhouette images associated with one of the plurality of pre-processed fiducial maps when a correlation value between the silhouette image of the human body and the pre-processed silhouette images exceeds a defined threshold; and
identify measurements of the human body based on a pre-processed 3D model in the data store associated with the one of the pre-processed silhouette images and the one of the plurality of pre-processed fiducial maps; and
a user interface configured to receive the one or more images of the human body, and display the measurements of the human body.
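
A minimal sketch of the two correlation gates in the claim, first shortlisting stored fiducial maps and then picking the best silhouette match among the shortlisted bodies; the correlation measure and thresholds are assumptions rather than Fit3D's implementation.

    import numpy as np

    def correlation(a, b):
        """Pearson correlation between two arrays flattened to vectors."""
        a, b = np.ravel(a).astype(float), np.ravel(b).astype(float)
        return float(np.corrcoef(a, b)[0, 1])

    def match_body(fiducial_map, silhouette, database, fid_thresh=0.9, sil_thresh=0.9):
        # database entries: (stored_fiducial_map, stored_silhouette, stored_3d_model)
        shortlist = [e for e in database if correlation(fiducial_map, e[0]) > fid_thresh]
        best = None
        for stored_fid, stored_sil, model in shortlist:
            c = correlation(silhouette, stored_sil)
            if c > sil_thresh and (best is None or c > best[0]):
                best = (c, model)
        return None if best is None else best[1]   # measurements come from this model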

US Pat. No. 10,657,708

IMAGE AND POINT CLOUD BASED TRACKING AND IN AUGMENTED REALITY SYSTEMS

Snap Inc., Santa Monica,...

1. A method for reducing augmented reality perspective position error comprising:accessing three-dimensional (3D) point cloud data describing an environment associated with a client device and a first position estimate for an image sensor of a companion device associated with the client device;
accessing a first image of the environment captured by the image sensor of the companion device, wherein the companion device is separate from the client device and associated with a different location than the first position estimate;
processing the first image to match at least a portion of a set of key points of the 3D point cloud to the first image;
determining, based on the match of the portion of the set of key points of the 3D point cloud to the first image, a position error associated with the first position estimate along with a second position estimate for the image sensor of the companion device;
generating a model of a virtual object within the 3D point cloud; and
generating a first augmented reality image comprising the virtual object in the environment using the second position estimate for the client device, the model of the virtual object within the 3D point cloud, and the match of the portion of the set of key points of the 3D point cloud to the first image.

US Pat. No. 10,657,707

PHOTO DEFORMATION TECHNIQUES FOR VEHICLE REPAIR ANALYSIS

STATE FARM MUTUAL AUTOMOB...

1. A server device for using photo deformation techniques for vehicle repair analysis, the server device comprising:one or more processors;
a non-transitory computer-readable memory coupled to the one or more processors, and storing thereon instructions that, when executed by the one or more processors, cause the server device to:
obtain a set of training data for previously damaged vehicle parts including actual repair times to repair the previously damaged vehicle parts, wherein the set of training data includes a plurality of subsets, each subset corresponding to a different actual repair time;
for each subset, determine vehicle identification information and a plurality of previously damaged part characteristics for the previously damaged vehicle parts within the subset of the training data;
when a vehicle part is damaged in a vehicle crash, receive, from a client device, vehicle identification information and currently damaged part characteristics for the currently damaged vehicle part, including:
receiving three-dimensional image data depicting the currently damaged vehicle part; and
analyzing the three-dimensional image data to identify the currently damaged part characteristics for the currently damaged vehicle part, including identifying a depth of a dent to the currently damaged vehicle part based on a difference in depth values within a portion of the three-dimensional image data corresponding to the dent;
compare the vehicle identification information and the currently damaged part characteristics to the set of training data to determine a repair time for repairing the currently damaged vehicle part; and
cause an indication of the repair time for repairing the currently damaged vehicle part to be displayed on a user interface of the client device.
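
As a rough illustration of the comparison step, the sketch below matches current damaged-part characteristics against training subsets keyed by actual repair times using a nearest-centroid rule; the feature choice and names are hypothetical, not the patented photo-deformation analysis.

```python
# Hypothetical sketch: estimate a repair time by comparing current damage
# characteristics against training subsets, each labelled with an actual repair time.
import numpy as np


def estimate_repair_time(current_features, training_subsets):
    """current_features: 1-D array (e.g. dent depth, dent area).
    training_subsets: list of (repair_time_hours, 2-D array of feature rows)."""
    best_time, best_dist = None, np.inf
    for repair_time, features in training_subsets:
        centroid = features.mean(axis=0)          # summary of this subset's damage profile
        dist = np.linalg.norm(current_features - centroid)
        if dist < best_dist:
            best_time, best_dist = repair_time, dist
    return best_time


# Example: three subsets keyed by actual repair times of 2, 5 and 9 hours.
subsets = [(2.0, np.array([[0.3, 10.0], [0.4, 12.0]])),
           (5.0, np.array([[1.1, 40.0], [0.9, 35.0]])),
           (9.0, np.array([[2.5, 90.0], [2.2, 80.0]]))]
print(estimate_repair_time(np.array([1.0, 38.0]), subsets))  # -> 5.0
```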

US Pat. No. 10,657,706

3D RENDERING METHOD AND APPARATUS

Samsung Electronics Co., ...

1. A three-dimensional (3D) rendering method comprising:determining respective color values of vertices shaded by a direct virtual light source;
establishing a polygonal area bounded by the vertices;
determining, based on the vertices, an indirect virtual light source at a location within the polygonal area;
establishing, based on the respective color values of the vertices, the location; and
rendering a 3D scene based on the location of the indirect virtual light source,
wherein the determining the indirect virtual light source comprises:
determining a number of additional indirect virtual light sources for the polygonal area based on a size of the polygonal area from a viewpoint of a virtual camera and the respective color values of the vertices, and
wherein each color value of the vertices of the additional indirect virtual light sources is a function of the location of the indirect virtual light source, a location of a vertex, a normal of the vertex, a color value of the indirect virtual light source, and a color value of the vertex.
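
A minimal sketch of how the number and placement of indirect virtual light sources might follow from the polygon's screen-space size and the vertex color values; the Dirichlet-weighted placement and parameter names are assumptions, not the claimed method.

```python
# Hypothetical sketch: derive indirect virtual light sources for a triangle from
# its screen-space area and the colour values of its directly lit vertices.
import numpy as np


def indirect_lights(vertices, vertex_colors, screen_area, lights_per_unit=4):
    """vertices: (3, 3) triangle positions; vertex_colors: (3, 3) RGB in [0, 1]."""
    luminance = vertex_colors.mean(axis=1)                # per-vertex brightness
    count = max(1, int(round(screen_area * lights_per_unit * luminance.mean())))
    rng = np.random.default_rng(0)
    lights = []
    for _ in range(count):
        w = rng.dirichlet(1.0 + luminance)                # barycentric weights biased to bright vertices
        position = w @ vertices
        color = w @ vertex_colors
        lights.append((position, color))
    return lights
```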

US Pat. No. 10,657,705

SYSTEM AND METHOD FOR RENDERING SHADOWS FOR A VIRTUAL ENVIRONMENT

MZ IP Holdings, LLC, Pal...

1. A method, comprising:adjusting depth values for pixels in a first region of a digital image to render a first shadow in the first region,
wherein a depth buffer is provided for the digital image,
wherein the depth buffer comprises a depth value for each pixel in the digital image, and
wherein the adjusted depth values comprise an indication that shading has been applied to the pixels in the first region; and
rendering a second shadow in a second region that partially overlaps the first region by (i) identifying a shadow rendering region to be within the second region but outside the first region, based on the indication, and (ii) adjusting depth values for pixels in the shadow rendering region.
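
The sketch below illustrates the idea of treating adjusted depth values as a "shading applied" indication, so a second, overlapping shadow is rendered only outside the first; the specific offset encoding is a hypothetical convention, not the patented one.

```python
# Hypothetical sketch: use an adjusted depth value as an "already shaded" flag so an
# overlapping second shadow only darkens pixels outside the first shadow.
import numpy as np

SHADE_OFFSET = 1000.0   # assumed marker added to depths of shaded pixels


def render_shadow(depth_buffer, color_buffer, region_mask, darken=0.5):
    """region_mask: boolean array of pixels the new shadow covers."""
    already_shaded = depth_buffer >= SHADE_OFFSET            # indication from earlier shadows
    shadow_region = region_mask & ~already_shaded            # render only outside prior shadows
    color_buffer[shadow_region] *= darken
    depth_buffer[shadow_region] += SHADE_OFFSET              # mark these pixels as shaded
    return shadow_region


# Example: two overlapping rectangular shadow regions on a 4x4 image.
depth = np.zeros((4, 4)); color = np.ones((4, 4))
first = np.zeros((4, 4), bool); first[0:3, 0:3] = True
second = np.zeros((4, 4), bool); second[1:4, 1:4] = True
render_shadow(depth, color, first)
render_shadow(depth, color, second)   # the overlap keeps its single darkening
```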

US Pat. No. 10,657,704

MARKER BASED TRACKING

Facebook Technologies, LL...

1. A method comprising:mapping of labels, via a neural network, to one or more representations in a depth map using a model of a portion of a body that wears a wearable item, wherein the wearable item includes the markers and the representations in the depth map are of the markers;
determining a joint parameter using the mapped labels; and
updating the model with the joint parameter, wherein content provided to a user of the wearable item is based in part on the updated model.

US Pat. No. 10,657,703

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus comprising:an obtaining unit configured to obtain image data based on capturing of a plurality of image capturing apparatuses;
a generating unit configured to generate shape data representing a three-dimensional shape of an object based on the image data obtained by the obtaining unit; and
a processing unit configured to process the shape data generated by the generating unit, based on information representing a number of image capturing apparatuses that capture an area corresponding to at least a part of the shape data,
wherein the processing unit is configured to delete the at least a part of the shape data based on the number of image capturing apparatuses that capture the area corresponding to the at least a part of the shape data being lower than a threshold value.
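
A small sketch of the deletion rule described above: parts of the shape observed by fewer image capturing apparatuses than a threshold are removed. The voxel representation and names are assumptions.

```python
# Hypothetical sketch: drop parts of a reconstructed shape that too few cameras observe.
import numpy as np


def prune_shape(voxels, camera_counts, threshold=3):
    """voxels: (N, 3) voxel centres of the shape data.
    camera_counts: (N,) number of image capturing apparatuses seeing each voxel."""
    keep = camera_counts >= threshold
    return voxels[keep]


voxels = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0]])
counts = np.array([5, 2, 4])
print(prune_shape(voxels, counts, threshold=3))   # the voxel seen by only 2 cameras is deleted
```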

US Pat. No. 10,657,702

SYSTEMS AND METHODS FOR CONTENT STREAMING

Facebook, Inc., Menlo Pa...

1. A computer-implemented method comprising:determining, by a computing device, a viewport interface for accessing a content item, the content item being composed using a set of content streams that capture at least one scene from a plurality of different positions;
predicting, by the computing device, at least one change in a direction of the viewport interface while accessing the content item, wherein the predicted change in the direction corresponds to a second viewport interface for accessing at least one different content stream of the content item, the predicting further comprising:
determining, by the computing device, the change in the direction to correspond to the second viewport interface for accessing the at least one different content stream based at least in part on aggregated user data indicating playback times at which viewport interfaces of different users transitioned to the at least one different content stream while accessing the content item; and
buffering, by the computing device, the at least one different content stream corresponding to the second viewport interface prior to the viewport interface transitioning to the second viewport interface.

US Pat. No. 10,657,701

DYNAMIC ENTERING AND LEAVING OF VIRTUAL-REALITY ENVIRONMENTS NAVIGATED BY DIFFERENT HMD USERS

Sony Interactive Entertai...

1. A computer-implemented method for processing operations for head mounted display (HMD) users to join virtual reality (VR) scenes, comprising:providing a first perspective of a VR scene to a first HMD of a first user;
receiving an indication that a second user of a second HMD is requesting to join the VR scene provided to the first HMD;
obtaining real-world position and orientation data of the second HMD relative to the first HMD, the real-world position and orientation data is usable to determine a relative location of the second HMD from the first HMD in a real-world space; and
providing, based on the real-world position and orientation data, a second perspective of the VR scene to join the VR scene by the second user, the second perspective is shifted from the first perspective corresponding to a difference in the relative location of the second HMD from the first HMD in real-world space;
wherein the first perspective and the second perspective of the VR scene are each controlled by respective position and orientation changes of the first HMD and the second HMD, wherein the first user of the first HMD controls a progression of the VR scene and the progression causes a movement of the first perspective toward a direction within the VR scene, and said movement causes a corresponding movement toward the direction within the VR scene for the second perspective while substantially maintaining the shifted perspective between the first perspective and the second perspective.

US Pat. No. 10,657,700

SYSTEMS AND METHODS FOR DISTRIBUTED SCALABLE RAY PROCESSING

Imagination Technologies ...

1. A machine-implemented method of processing rays, comprising:at a first computation unit of a plurality of computation units,
selecting a group of rays to be processed for intersection with an element of an acceleration structure, wherein each element of the acceleration structure bounds a respective selection of geometry located in a 3-D space, and the element of the acceleration structure is identifiable with an identifier,
indicating the identifier of the element of the acceleration structure to the other computation units of the plurality of computation units,
initiating retrieval, from a memory, of data defining the element of the acceleration structure,
obtaining data defining the rays of the group of rays from a memory local to the first computation unit,
determining whether each of the rays hits or misses the element of the acceleration structure, and
at the other computation units of the plurality of computation units,
indexing a memory using the identifier of the element of the acceleration structure indicated by the first computation unit to determine whether a respective local memory to that computation unit contains definition data for a group of rays to be tested for intersection with the element of the acceleration structure identifiable with the identifier;
in dependence on the determination of whether a respective local memory to that computation unit contains definition data for a group of rays to be tested for intersection with the element of the acceleration structure identifiable with the identifier, determining whether to schedule testing of that group of rays in that computation unit for intersection with the element of the acceleration structure identifiable with the identifier, or to schedule another group of rays, for which definition data is stored in the respective local memory of that computation unit, for intersection with one or more other acceleration structure elements.

US Pat. No. 10,657,699

PERFORMING TEXTURING OPERATIONS FOR SETS OF PLURAL EXECUTION THREADS IN GRAPHICS PROCESSING SYSTEMS

Arm Limited, Cambridge (...

1. A method of operating a graphics processor, the graphics processor comprising:a programmable fragment shader operable to execute graphics fragment shading programs to perform fragment shading operations; and
a texture mapper operable to perform graphics texturing operations in response to requests for graphics texturing operations from the fragment shader;
wherein:
the fragment shader processes graphics fragments by executing fragment shader programs using respective execution threads for sampling positions of a render output being generated by the graphics processor; and
the fragment shader is operable to, when it encounters a graphics texturing instruction in a fragment shader program that it is executing for a thread:
request the texture mapper to perform a graphics texturing operation for a set of plural threads that are executing the graphics texturing instruction in the fragment shader program; and
the texture mapper is operable to, in response to a request from the fragment shader to perform a texturing operation for a set of plural execution threads that are executing a graphics texturing instruction in a shader program:
perform the texturing operation for the set of plural execution threads together;
the method comprising:
when the texture mapper is to perform a texturing operation for a set of plural execution threads together, the texture mapper:
determining whether the texturing operation for the set of plural threads can be performed together with the texturing operation for another set of plural execution threads for which a texturing operation is required; and
when it is determined that the texturing operations for the sets of plural execution threads can be performed together, performing the texturing operations for the sets of plural threads together; and
when it is determined that the texturing operation for the set of plural threads cannot be performed together with the texturing operation for another set of plural execution threads for which a texturing operation is required, performing the texturing operation for the set of plural execution threads alone.

US Pat. No. 10,657,698

TEXTURE VALUE PATCH USED IN GPU-EXECUTED PROGRAM SEQUENCE CROSS-COMPILATION

MICROSOFT TECHNOLOGY LICE...

1. A computing system configured to execute instructions for a first graphical processing unit (GPU) on a second GPU, the computing system comprising:the second GPU; and
a processor configured to:
receive second GPU state data that indicates one or more global properties of the second GPU;
receive one or more binary instructions for texture operations configured for the first GPU, wherein:
the one or more binary instructions include one or more texture fetches, and
each texture fetch includes a texture fetch constant having a respective sign for each of one or more channels that encode coordinates for the texture fetch;
determine a texture value patch type based at least in part on the one or more signs included in the one or more binary instructions, wherein the texture value patch type is incompatible signs, gamma, depth, or bias; and
based on the second GPU state data, apply a texture value patch of the texture value patch type to the one or more binary instructions, wherein applying the texture value patch translates the one or more binary instructions into one or more translated binary instructions configured to be executed on the second GPU.

US Pat. No. 10,657,697

METHOD FOR THE COMPUTER ANIMATION OF CAPTURED IMAGES

1. A method of computer animating captured images, comprising:obtaining a first set of rules that define one or more first nodes as a function of two or more first body parts of one or more animatable captured images;
obtaining a timed data file of a moving body having one or more tracked coordinates of two or more second body parts of said moving body;
evaluating the one or more tracked coordinates against the first set of rules to determine a respective tracked coordinate for each first node; and
applying synchronized movement to each first node by mirroring each respective tracked coordinate to produce animation control of the one or more animatable captured images;
the first set of rules comprise defining a relationship between two adjacent first nodes as a vector, wherein each vector establishes a relative directional orientation of the two adjacent first nodes when applying synchronized movement.

US Pat. No. 10,657,696

VIRTUAL REALITY SYSTEM USING MULTIPLE FORCE ARRAYS FOR A SOLVER

DeepMotion, Inc., Redwoo...

1. An interactive avatar display system providing at least a visual output on a device that is viewable by a user, wherein the visual output is a computer-generated view of a virtual space from a camera position and a camera angle, the computer-generated view including an avatar that moves in the virtual space in response to movements of the user, the interactive avatar display system comprising:a processor;
a data memory, coupled to the processor, for storing data accessible by the processor;
sensors for sensing a plurality of user body part positions, in a real-world coordinate system of the user, of tracked body parts; and
a program memory, coupled to the processor, containing program instructions for execution by the processor, and the program memory comprising:
a) program code for determining positions and orientations of the tracked body parts from sensor outputs, wherein a tracked body part is a user body part that affects one or more of the sensors;
b) program code for generating motion targets of tracked elements of the avatar, wherein a motion target is a position or orientation in the virtual space corresponding to a position or orientation, in the real-world coordinate system of the user, of one or more of the tracked body parts, and wherein at least one body part of the avatar is an untracked avatar body part that corresponds to a body part untracked by the sensor outputs;
c) program code for computing a first array of force values, wherein a force value in the first array of force values is an inverse dynamics force value that corresponds to a force on a particular avatar body part of avatar body parts that corresponds to a particular tracked body part wherein the first array of force values are such that when applied to drive the avatar body parts, they drive the particular avatar body part towards alignment with the motion targets of the tracked body parts and the first array of force values corresponds to inverse dynamics force values of a plurality of avatar degrees of freedom, including at least one degree of freedom that is unspecified by the motion targets and thus independent of the tracked body parts;
d) program code for computing a second array of force values, wherein a force value in the second array of force values is either a balance control force value or a locomotion control force value and the second array of force values corresponds either to balance control forces needed to maintain a balance of the avatar in the virtual space or locomotion control forces needed to move the avatar as a whole to (1) maintain alignment of the tracked elements of the avatar with the motion targets corresponding to the position or the orientation of the one or more of the tracked body parts and (2) move the untracked avatar body part consistent with the second array of force values;
e) program code for combining the first array of force values and the second array of force values into a summed force array, wherein a summed force in the summed force array represents a combination of (1) the force value in the first array of force values that is the inverse dynamics force value that corresponds to the force on the particular avatar body part, and (2) the force value in the second array of force values that is either the balance control force value or the locomotion control force value;
f) program code for determining a set of constraints for the avatar;
g) program code for computing a solution to equations of motions of elements of the avatar given the summed force array and the set of constraints for the avatar, wherein the solution comprises an array of accelerations wherein an acceleration value in the array of accelerations is an acceleration to be applied to a particular avatar body part as a rigid body system; and
h) program code to update positions and/or velocities of the avatar body parts based on the array of accelerations.

US Pat. No. 10,657,695

ANIMATED CHAT PRESENCE

Snap Inc., Santa Monica,...

1. A method comprising:detecting an initiation of a communication session at a first client device, the communication session between the first client device and a second client device;
causing the first client device to capture image data that depicts a face of a user of the first client device, and audio data in response to the detecting the initiation of the communication session;
generating a mesh representation of the face of the user based on the image data, the mesh representation including a point-of-gaze indicator;
selecting a set of avatar features from among a collection of avatar features based on a user account associated with the first client device, the user account including an identification of the set of avatar features;
generating an avatar to be associated with the user of the first client device based on the set of avatar features and the mesh representation;
causing display of a communication interface at the second client device in response to the initiation of the communication session, the communication interface comprising a presentation of the avatar associated with the user of the first client device and a display of a chat transcript that comprises a plurality of messages sent during the communication session between the first client device and the second client device;
orienting the presentation of the avatar associated with the user within the communication interface based on the point-of-gaze indicator of the mesh representation;
transcribing the audio data to a text string;
translating the text string from a first language to a second language, the second language based on a language preference associated with the second client device; and
presenting the text string within the communication interface at a position based on the presentation of the avatar associated with the user.

US Pat. No. 10,657,694

ACTIVITY SURFACE DETECTION, DISPLAY AND ENHANCEMENT OF A VIRTUAL SCENE

Tangible Play, Inc., Pal...

1. A method for monitoring user activity in a physical activity scene, the method comprising:displaying, on a display of a computing device, a graphical user interface embodying a virtual scene and including an animated character;
capturing, using a video capture device coupled to the computing device, a video stream of the physical activity scene proximate to the computing device, the video stream including an image depicting a tangible work;
generating, using a processor of the computing device, a visualization of the tangible work in the virtual scene;
determining, using the processor of the computing device, a shape of the visualization of the tangible work;
determining, using the processor of the computing device and based on the shape of the visualization of the tangible work, an interaction routine executable to animate an interaction between the animated character and the visualization of the tangible work in the virtual scene, the interaction routine causing the animated character to modify the shape of the visualization of the tangible work; and
executing, using the processor of the computing device, the interaction routine to animate, on the display of the computing device, the interaction between the animated character and the visualization of the tangible work.

US Pat. No. 10,657,693

METHOD FOR SCRIPTING INTER-SCENE TRANSITIONS

Smarter Systems, Inc., N...

1. A computer-implemented method of creating a motion picture experience of a realm based on a series of digitally stored images of the realm, the method comprising:receiving a user's definition of a series of locations in the realm, the series of locations comprising at least a first location and a second location that immediately succeeds the first location in the series of locations, wherein the first location is associated with a first image and the second location is associated with a second location image;
processing the first and second images to determine a 3D geometry;
creating at least one virtual image, each virtual image representing a virtual camera view of the realm from a distinct point in a three-dimensional space between the first location and the second location, and each virtual image comprising a portion of the first image and a portion of the second image projected in the 3D geometry;
storing, in a tangible non-transitory computer readable medium, for subsequent playback on a display, data describing the series of images and the at least one virtual image, such that playback of the first image, the at least one virtual image, and the second image simulates motion in three-dimensional space from the first location to the second location;
wherein storing further includes storing an annotation associated with a specified location in the series of locations, and/or a pointer to an annotation, the pointer linking the annotation to a specified location in the series of locations.

US Pat. No. 10,657,692

DETERMINING IMAGE DESCRIPTION SPECIFICITY IN PRESENTING DIGITAL CONTENT

INTERNATIONAL BUSINESS MA...

1. A computer-implemented method comprising:obtaining digital content to be presented on a device to a user, the digital content comprising a digital image, of which a description is to be provided to the user, and text;
analyzing the digital content and ascertaining a context under which the description of the digital image is to be provided to the user;
determining, based at least in part on the context, a level of specificity for describing the digital image to the user; and
presenting the digital content, including the digital image, to the user in accordance with the determined level of specificity for describing the digital image, wherein based on the level of specificity of the description of the digital image being as specific or more specific than a portion of text of the digital content, the presenting the digital content includes, in lieu of reading the portion of text, reading a modified version of the portion of text, the modified version of the portion of text being less specific than the portion of text.

US Pat. No. 10,657,691

SYSTEM AND METHOD OF AUTOMATIC ROOM SEGMENTATION FOR TWO-DIMENSIONAL FLOORPLAN ANNOTATION

FARO TECHNOLOGIES, INC., ...

1. A system of generating a two-dimensional (2D) map of an environment, the system comprising:a coordinate measurement scanner comprising a light source, a first image sensor and a controller, the light source emits a beam of light to illuminate object points in the environment, the first image sensor is arranged to receive light reflected from the object points, the controller being operable to determine a distance value to at least one of the object points;
one or more processors operably coupled to the scanner, the one or more processors being responsive to executable instructions for generating a 2D image of the environment in response to an activation signal from an operator and based at least in part on the distance value;
a portable computing device having a second image sensor, the portable computing device being coupled for communication to the one or more processors, wherein the one or more processors are responsive to correlate a location captured by a first image from the portable computing device with the location in the 2D image of the environment in response to the first image being acquired by the second image sensor; and
a mapping system configured to:
generate a 2D map based on the 2D image of the environment;
apply image recognition to the first image to identify and label an object in the first image, the applying including utilizing an image classifier that was trained using images of objects and their corresponding known labels;
update the 2D map based at least in part on the label of the object in the first image; and
based on the label of the object being one of a door and a window:
extract a wall line from the 2D map based at least in part on a location of the object; and
perform automatic segmentation of the 2D map based at least in part on the extracted wall line.

US Pat. No. 10,657,690

INTELLIGENT AUGMENTED REALITY (IAR) PLATFORM-BASED COMMUNICATION SYSTEM

1. A method for providing real-time augmented reality (AR) data, the method comprising:receiving, in real-time at a computer device, a stream of visual data;
generating the real-time AR data by integrating the stream of received visual data, AR input data, information input, and knowledge input, based on one or more criteria comprising a user preference, a system setting, an integration parameter, a characteristic of an object or a scene of the stream of visual data, an interactive user control, or a combination thereof, wherein:
the information input comprises a real time extracted portion of the stream of visual data, extracted in real-time as the stream of visual data is being received, at a plurality of time points based on one or more criteria comprising a user preference, a system setting, an integration parameter, a characteristic of an object or a scene of the stream of visual data, an interactive user control, or a combination thereof;
the knowledge input is learned cumulatively based on the information extracted from the visual data at the plurality of time points and a user's behavior learned from the real-time extracted portion of the visual data;
the real-time AR data comprise information data corresponding to the information input, the received visual data, and knowledge data corresponding to the knowledge input; and
representing at least a portion of the information data or knowledge data of the real-time AR data, including replacing background image data by AR data based on the information data or knowledge data, with a plurality of sets of data parameters, wherein each set of data parameters comprises text, one or more codes, one or more numbers, one or more matrixes, one or more images, one or more audio signals, one or more sensor signals, or combinations thereof, wherein said representing comprises making a personalized decision based on the learned knowledge input.

US Pat. No. 10,657,689

METHOD AND APPARATUS FOR POINT CLOUD COLOR PROCESSING

SONY CORPORATION, Tokyo ...

1. A method for reducing color leaking artefacts in an image formed by projection processing from a 3D point cloud, the method comprising:receiving an input image comprising the 3D point cloud;
classifying the 3D point cloud into a plurality of 3D surface patches;
projecting the 3D surface patches onto a 2D image plane to form a first 2D image;
processing the first 2D image, by coding, transmitting and decoding, to form a final 2D image; and
providing the final 2D image as an output;
wherein processing comprises coding comprising background filling of pixels between patches to reduce inter-patch color leakage in the final 2D image;
wherein background filling of pixels between patches comprises:
for each patch, displacing active pixels at a patch border by a safeguard distance towards the patch interior; and
for each patch, filling the safeguard distance with a background fill.
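
As an approximation of the safeguard step, the sketch below erodes each patch mask by the safeguard distance and fills the vacated band with a background value; the erosion-based treatment of the border pixels and the SciPy dependency are assumptions, not the claimed coding.

```python
# Hypothetical sketch: shrink each projected patch by a safeguard distance and fill
# the vacated band with a background colour to limit inter-patch colour leakage.
import numpy as np
from scipy.ndimage import binary_erosion


def apply_safeguard(image, patch_mask, safeguard=2, background=128):
    """image: (H, W, 3) projected 2-D image; patch_mask: boolean mask of one patch."""
    shrunk = binary_erosion(patch_mask, iterations=safeguard)   # move the active border inward
    band = patch_mask & ~shrunk                                 # pixels within the safeguard distance
    image[band] = background                                    # background filling of the band
    return shrunk
```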

US Pat. No. 10,657,688

MULTI-DIMENSIONAL VISUALIZATION AND RESOURCE EVALUATION SYSTEM

ACCENTURE GLOBAL SOLUTION...

1. A system comprising:a geographic data storage layer configured to store:
geographic data for a pre-determined geography; and
time zone boundaries that divide the pre-determined geography into different time zones;
a resource data storage layer configured to store:
resource identifiers for available resources in the pre-determined geography;
resource locations for the available resources in the pre-determined geography; and
resource descriptors for the available resources;
virtual reality interface circuitry in communication with the geographic data storage layer and the resource data storage layer, the virtual reality interface circuitry comprising a VR processor configured to:
control one or more control peripheral interfaces to connect with a virtual reality control peripheral;
control view rendering circuitry to generate a resource analysis visualization comprising the available resources rendered on the pre-determined geography and path waypoints rendered on the pre-determined geography;
control view guidance circuitry to:
accept selection inputs from the virtual reality control peripheral that specify selected resources among the available resources for designated waypoints among the path waypoints;
communication circuitry comprising a communication processor configured to control transmission of resource selection messages to the selected resources for the designated waypoints; and
path planning circuitry comprising a path planning processor configured to:
determine a waypoint distribution of at least one of the path waypoints across a selected time zone boundary and into a different time zone than other path waypoints to define a segment of continuous project flow.

US Pat. No. 10,657,687

DYNAMIC CHAINING OF DATA VISUALIZATIONS

SAP SE, Walldorf (DE)

1. A computing system comprising:a memory storing instructions for outputting a user interface; and
a processor configured to execute the instructions, wherein, when executed, the instructions are configured to cause the processor to:
display the user interface comprising a display area where one or more values of a data source are visually depicted;
detect, via the user interface, a selection of a point source location via the display area which is associated with a first data value from among the one or more data values;
generate a point visualization that comprises additional information associated with the first data value associated with the point source location; and
display, via the display area of the user interface, the point visualization which includes the additional information associated with the first data value, and a visual link connecting the point visualization to the point source location.

US Pat. No. 10,657,686

GRAGNOSTICS RENDERING

Two Six Labs, LLC, Arlin...

1. In an analytics environment having graph data responsive to rendering for visual recognition and comparison of statistical trends defined in a plurality of graphs, a scalable method of visualizing graph data, comprising:receiving a plurality of graphs, each graph defining associations between data entities and renderable in a visual form having a plurality of vertices connected by one or more edges;
computing, for each graph, a plurality of features based on the edges and vertices interconnected by the edges, including computing a feature value for each of the features in a linear computability time such that the feature value is computable in a time that varies linearly with at least one of a number of nodes or a number of vertices;
normalizing the computed features into a predetermined range;
for each graph, arranging each of the normalized features into a feature vector, the feature vector having ordered values for each of the features, the ordered values in the feature vector including a tree feature and a linearity feature for each graph, further comprising:
determining the tree feature in linear time by:
traversing each of the vertices in the graph;
accumulating, based on the traversal, a number of edges; and
determining a number of edges the removal of which would result in a tree by removing cyclic paths; and
comparing the determined number of edges with a number of the traversed vertices; and
determining the linearity feature in linear time by:
traversing each of the vertices in the graph;
determining, at each vertex, if a number of edges emanating from the vertex is consistent with a linear graph;
accumulating the number of vertices consistent with a linear graph; and
comparing the accumulated vertices with the number of traversed vertices;
computing a multidimensional distance between each of the feature vectors for determining a similarity between the graphs corresponding to the feature vectors;
computing a two dimensional position corresponding to each of the feature vectors based on a projection of the computed multidimensional distance;
displaying the position of each vector onto a visualized two dimensional rendering;
rendering a visualization of the feature vectors; and
determining similarity of the graphs based on a distance between the corresponding visualized feature vectors by classifying, based on a distance on the visualized two dimensional rendering, groups of graphs, the classification defined by visual clusters of the positions on the two dimensional rendering.
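
A minimal sketch of two of the linear-time features named in the claim: the tree feature (how many edges exceed a spanning tree) and the linearity feature (fraction of vertices whose degree is consistent with a path); the exact normalizations into a shared range are assumptions.

```python
# Hypothetical sketch: compute the tree and linearity features of a graph in time
# linear in the number of vertices and edges, then normalise into [0, 1].
def graph_features(num_vertices, edges):
    """edges: iterable of (u, v) pairs; vertices are 0..num_vertices-1."""
    degree = [0] * num_vertices
    edge_count = 0
    for u, v in edges:                       # single traversal: accumulate edges and degrees
        degree[u] += 1
        degree[v] += 1
        edge_count += 1
    # Tree feature: a connected tree has exactly num_vertices - 1 edges, so count
    # how many edges would have to be removed to break all cycles.
    excess = max(0, edge_count - (num_vertices - 1))
    tree = 1.0 - min(1.0, excess / max(1, edge_count))
    # Linearity feature: fraction of vertices whose degree is consistent with a path graph.
    linear_vertices = sum(1 for d in degree if d <= 2)
    linearity = linear_vertices / max(1, num_vertices)
    return {"tree": tree, "linearity": linearity}


print(graph_features(4, [(0, 1), (1, 2), (2, 3)]))          # path: tree=1.0, linearity=1.0
print(graph_features(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # cycle: tree=0.75, linearity=1.0
```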

US Pat. No. 10,657,685

METHODS AND DEVICES FOR ADJUSTING CHART MAGNIFICATION

Tableau Software, Inc., ...

1. A method, comprising:at an electronic device with a touch-sensitive surface and a display:
displaying a line chart on the display, wherein the line chart includes a first plurality of data marks and a first plurality of line segments connecting adjacent data marks of the first plurality of data marks;
detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart;
while detecting the first touch input:
expanding the line chart horizontally according to the touch input, including expanding a first line segment of the first plurality of line segments; and
adding a second plurality of data marks, distinct from the first plurality of data marks, on the first line segment, thereby subdividing the first line segment into a second plurality of line segments, which are initially collinear; and
after expanding the line chart horizontally and adding the second plurality of data marks:
determining an ordinate value for each of the second plurality of data marks;
animatedly moving each of the second plurality of data marks from a respective position on the first line segment to a respective vertical location defined by the respective determined ordinate value; and
concurrently with moving each of the second plurality of data marks,
animatedly moving each of the second plurality of line segments according to movement of respective data marks at endpoints of the respective line segment.

US Pat. No. 10,657,684

MATCHED ARRAY ALIGNMENT SYSTEM AND METHOD

EffectiveTalent Office LL...

1. A method for displaying optimized values in a two-dimensional array, the method, implemented on a system including at least one display device and at least one input device, comprising:displaying a grid of cells of an array on the at least one display device;
displaying, on the display device, an X-axis of proxy values adjacent to the grid and a Y-axis of proxy values adjacent to the grid,
wherein the range and interval of the proxy values of the X-axis is the same as the range and interval of the proxy values of the Y-axis;
receiving, by the at least one input device, an input of a first metric;
scaling the first metric to convert the first metric into a first proxy value;
receiving, by the at least one input device, an input of a second metric;
scaling the second metric to convert the second metric into a second proxy value; and
on the at least one display device, displaying, in a cell of the array that corresponds to an intersection of the first proxy value and the second proxy value on the X-axis and the Y-axis, an indicator that visually indicates a distance between the cell and an alignment vector of the grid,
wherein the alignment vector is defined by cells of the two-dimensional array for which the first proxy value equals the second proxy value.
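
A small sketch of scaling two metrics onto the shared proxy range and measuring a cell's distance from the alignment vector, i.e. the diagonal where the X proxy equals the Y proxy; the ranges and names are hypothetical.

```python
# Hypothetical sketch: scale two metrics onto a shared proxy axis and report how far
# the resulting cell sits from the alignment vector (the diagonal where X proxy == Y proxy).
import numpy as np


def to_proxy(metric, metric_min, metric_max, proxy_min=0, proxy_max=10):
    """Linearly rescale a raw metric onto the shared proxy range."""
    t = (metric - metric_min) / (metric_max - metric_min)
    return proxy_min + t * (proxy_max - proxy_min)


def alignment_distance(first_metric, second_metric, first_range, second_range):
    x = to_proxy(first_metric, *first_range)
    y = to_proxy(second_metric, *second_range)
    # Perpendicular distance of the cell (x, y) from the diagonal y = x.
    return abs(x - y) / np.sqrt(2)


print(alignment_distance(75, 30, (0, 100), (0, 60)))  # proxies 7.5 and 5.0 -> distance ~1.77
```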

US Pat. No. 10,657,683

HAZARD WARNING POLYGONS CONSTRAINED BASED ON END-USE DEVICE

HERE Global B.V., Eindho...

1. A method for generating warning polygons constrained to an end-use device, the warning polygons indicative of hazard events in a geographic region, the method comprising:receiving measurement data from one or more sensors associated with the geographic region;
identifying at least one location from the measurement data;
identifying at least one map tile within a predetermined distance to the at least one location;
generating a map tile cluster based on analysis of the at least one map tile;
accessing an end-use constraint of the end-use device, the end-use constraint indicative of a constraint in processing on the end-use device, wherein the end-use constraint comprises a maximum vertices limit; and
calculating, by a processor, a warning polygon based on the map tile cluster and based on the end-use constraint, wherein the warning polygon intersects the geographic region, wherein the warning polygon comprises a set of vertices as geographic coordinates for defining a hazard event and a number of vertices in the set of vertices is constrained to be no more than the maximum vertices limit.
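
As a rough illustration of the end-use constraint, the sketch below decimates a warning polygon so its vertex count never exceeds the device's maximum vertices limit; the even subsampling of the boundary is an assumption, not the claimed calculation from map tile clusters.

```python
# Hypothetical sketch: reduce a warning polygon to at most the end-use device's
# maximum vertices limit while roughly preserving its shape.
import numpy as np


def constrain_polygon(vertices, max_vertices):
    """vertices: (N, 2) polygon vertices as (lat, lon); returns at most max_vertices of them."""
    n = len(vertices)
    if n <= max_vertices:
        return vertices
    # Keep a roughly even subsample of the boundary so the overall shape is preserved.
    keep = np.linspace(0, n - 1, max_vertices).round().astype(int)
    return vertices[np.unique(keep)]


ring = np.array([(np.cos(a), np.sin(a)) for a in np.linspace(0, 2 * np.pi, 64, endpoint=False)])
print(len(constrain_polygon(ring, max_vertices=16)))   # -> 16
```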

US Pat. No. 10,657,682

DRAWING CURVES IN SPACE GUIDED BY 3-D OBJECTS

Adobe Inc., San Jose, CA...

1. In a digital medium environment including a computing device executing a 3-D drawing module to perform an improved method for drawing a 3-D curve relative to a 3-D object, the method comprising:receiving, by the computing device, an input defining a 2-D curve that has been drawn relative to the 3-D object;
converting, by the computing device, the 2-D curve into a plurality of 2-D points;
discovering, by the computing device, candidate 3-D vertices for each of the plurality of 2-D points, the candidate 3-D vertices defining a potential location in a 3-D space for each of the plurality of 2-D points;
building, by the computing device, a point-to-point graph having the candidate 3-D vertices;
estimating, by the computing device, distances of the candidate 3-D vertices from the 3-D object;
processing, by the computing device, the point-to-point graph to define a plurality of vertex segments, each vertex segment of the plurality of vertex segments including a set of the candidate 3-D vertices;
constructing, by the computing device, from the estimated distances of the candidate 3-D vertices from the 3-D object and the plurality of vertex segments, a segment-to-segment graph;
processing, by the computing device, the segment-to-segment graph to define multiple different 3-D curves relative to the 3-D object; and
selecting, by the computing device, from the multiple different 3-D curves, a final 3-D curve having a topology that includes curve portions that flow behind and in front of the 3-D object.

US Pat. No. 10,657,681

METHOD OF AND APPARATUS FOR PROCESSING GRAPHICS

ARM NORWAY AS, Trondheim...

1. A graphics processing system in which a scene to be rendered is divided into a plurality of sub-regions for rendering, each sub-region including a respective area of the scene to be rendered, the system comprising:primitive list building circuitry configured to prepare lists of primitives for rendering, each list of primitives listing primitives for rendering for one or more sub-regions of the scene;
the primitive list building circuitry further comprising an index generator configured to generate indices for primitives that are included in the primitive lists by the primitive list building circuitry; and
the primitive list building circuitry further being configured to associate with each graphics primitive that it lists in a primitive list, an index for the primitive;
the graphics processing system further comprising:
primitive selection circuitry configured to, when rendering a sub-region of the scene, select from a primitive list a next primitive to be rendered for that sub-region of the scene;
wherein the primitive selection circuitry is configured to:
receive plural different primitive lists simultaneously;
determine which primitive from the plural different primitive lists should be rendered next for a sub-region of the scene using the indices associated with the graphics primitives in the primitive lists; and
select the primitive determined to be the primitive that should be rendered next as the primitive to render next for that sub-region of the scene.

US Pat. No. 10,657,680

SIMPLIFIED POINT-IN-POLYGON TEST FOR PROCESSING GEOGRAPHIC DATA

SPLUNK INC., San Francis...

1. A computer-implemented method for displaying geographic data, comprising:receiving a query to be processed, wherein the query is associated with a set of geographic regions;
extracting a set of data points from raw machine data;
for each data point in the set of data points, performing simplified point in polygon (PIP) tests to determine zero or more geographic regions, in the set of geographic regions, that bound the data point, the simplified PIP tests using a coordinate of the data point to look up previously identified segments of the set of geographic regions that would intersect parallel rays cast from points in an identified range within which the coordinate falls; and
causing display of at least a subset of the set of geographic regions based on counts of the data points determined to fall within corresponding geographic regions.
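
A minimal sketch in the spirit of the simplified PIP test: polygon edges are bucketed by the horizontal range of y values they span, so each data point is tested only against the previously identified segments for the range its coordinate falls in. The strip width and names are assumptions.

```python
# Hypothetical sketch: bucket polygon edges by the horizontal strip of y values they span,
# then test a point by counting ray crossings against only the edges in its strip.
from collections import defaultdict
import math


def build_strips(polygon, strip_height=1.0):
    """polygon: list of (x, y) vertices. Returns {strip index: [edges]}."""
    strips = defaultdict(list)
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        lo = math.floor(min(y1, y2) / strip_height)
        hi = math.floor(max(y1, y2) / strip_height)
        for s in range(lo, hi + 1):                 # an edge may cross several strips
            strips[s].append(((x1, y1), (x2, y2)))
    return strips


def point_in_polygon(point, strips, strip_height=1.0):
    px, py = point
    crossings = 0
    for (x1, y1), (x2, y2) in strips.get(math.floor(py / strip_height), []):
        if (y1 > py) != (y2 > py):                  # edge straddles the ray cast to the right
            x_at_y = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if x_at_y > px:
                crossings += 1
    return crossings % 2 == 1


square = [(0, 0), (4, 0), (4, 4), (0, 4)]
strips = build_strips(square)
print(point_in_polygon((2, 2), strips), point_in_polygon((5, 2), strips))  # True False
```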

US Pat. No. 10,657,679

MULTI-ENERGY (SPECTRAL) IMAGE DATA PROCESSING

KONINKLIJKE PHILIPS N.V.,...

1. A method, comprising:generating a material landmark image in a low and high energy image domain, wherein the material landmark image estimates a change of a value of an image pixel caused by adding a small amount of a known material to the pixel;
generating an air values image in the low and high energy image domain, wherein the air values image estimates a value for each image pixel where a value of a pixel is replaced by a value representing air;
extracting from de-noised low and high energy images generated from low and high energy line integrals, a material composition of each image pixel based on the material landmark images and air values image; and
generating a signal indicative of the extracted material composition.

US Pat. No. 10,657,678

METHOD, APPARATUS AND DEVICE FOR CREATING A TEXTURE ATLAS TO RENDER IMAGES

Alibaba Group Holding Lim...

1. A method, comprising:obtaining, by one or more processors, image data comprised in a data frame to be rendered, the image data comprising data for a plurality of objects in the data frame;
determining, by the one or more processors, a first set of the plurality of objects to be rendered using a texture atlas, and a second set of the plurality of objects to be rendered from the corresponding image data without use of the texture atlas, the determining of the first set and the second set comprising determining, for an object in the plurality of objects in the data frame, whether the image data for the corresponding object exceeds a preset size threshold;
determining, by the one or more processors, one or more dimensions for the texture atlas based at least in part on one or more dimensions of the image data for the first set of the plurality of objects;
creating, by the one or more processors, the texture atlas based at least in part on the determined dimensions; and
rendering, by the one or more processors, the data frame based at least in part on the texture atlas.
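
A small sketch of splitting frame objects by a preset size threshold, sizing a texture atlas from the small set, and leaving the large set to render from its own image data; the single-row layout and field names are hypothetical.

```python
# Hypothetical sketch: split frame objects by a size threshold, size an atlas for the
# small ones, and leave the large ones to be rendered from their own image data.
def plan_texture_atlas(objects, size_threshold=256):
    """objects: list of dicts with 'name', 'width', 'height' (pixel dimensions)."""
    atlas_set = [o for o in objects
                 if o["width"] <= size_threshold and o["height"] <= size_threshold]
    direct_set = [o for o in objects if o not in atlas_set]
    # One simple atlas layout: a single row of the atlased images.
    atlas_width = sum(o["width"] for o in atlas_set)
    atlas_height = max((o["height"] for o in atlas_set), default=0)
    return atlas_set, direct_set, (atlas_width, atlas_height)


objects = [{"name": "icon", "width": 64, "height": 64},
           {"name": "badge", "width": 32, "height": 48},
           {"name": "background", "width": 1920, "height": 1080}]
atlas_set, direct_set, dims = plan_texture_atlas(objects)
print([o["name"] for o in atlas_set], dims)   # ['icon', 'badge'] (96, 64)
```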

US Pat. No. 10,657,677

COGNITIVE SITUATION-AWARE VISION DEFICIENCY REMEDIATION

INTERNATIONAL BUSINESS MA...

1. A computer-implemented method for remediating a vision deficiency, the method comprising:receiving, by a processor, time dependent location information for a user, wherein the time dependent location information comprises a route for the user;
receiving, by the processor, images of a plurality of objects, wherein each of the plurality of objects corresponds to the time dependent location information and, for each of the plurality of objects, identifying an object type and an object color;
determining a number of distinguishable colors required to remediate a color vision deficiency and a number of available colors; and
overlaying one of the plurality of objects with an available color responsive to a determination that the number of distinguishable colors does not exceed the number of available colors.

US Pat. No. 10,657,676

ENCODING AND DECODING A STYLIZED CUSTOM GRAPHIC

Snap Inc., Santa Monica,...

1. A method of encoding a bit string via a custom graphic having a defined style, comprising:receiving, via hardware processing circuitry, the bit string;
generating, via the hardware processing circuitry, using an encoder, image data encoding the bit string, the encoder trained using a plurality of training bit strings to minimize a loss when the plurality of encoded training bit strings are decoded by a decoder, the decoder also trained to minimize the loss when decoding the encoded training bit strings;
transferring, via the hardware processing circuitry, the defined style from a sample image having the defined style to the image data;
generating, via the hardware processing circuitry, a graphic image;
filling, via the hardware processing circuitry, the graphic image with the image data encoding the bit string and having the defined style; and
writing, via the hardware processing circuitry, the filled graphic image to an output device.

US Pat. No. 10,657,675

TECHNIQUES FOR IMPROVED PROGRESSIVE MESH COMPRESSION

GOOGLE LLC, Mountain Vie...

1. A computer-implemented method of progressive mesh compression, comprising:generating, at an encoder, a first plurality of levels of detail (LODs) and associated first type of vertex split records, the first type of vertex split records associated with an LOD of the first plurality of LODs are generated using a first type of collapse operator;
switching from using the first type of collapse operator to a second type of collapse operator in response to a switching condition being satisfied, the switching condition being based at least on identifying a branching point, to balance distortion and bits per vertex, from a plurality of branching points; and
generating, at the encoder, after the switching, a second plurality of LODs and associated second type of vertex split records, the second type of vertex split records associated with a LOD of the second plurality of LODs are generated using the second type of collapse operator.

US Pat. No. 10,657,674

IMAGE COMPRESSION METHOD AND APPARATUS

IMMERSIVE ROBOTICS PTY LT...

1. A method of compressing image data representing one or more images, the method including:a) obtaining pixel data from the image data, the pixel data representing an array of pixels within the one or more images;
b) applying a transformation to the pixel data to determine a set of frequency coefficients indicative of frequency components of the array of pixels;
c) selecting one of a plurality of bit encoding schemes, wherein each of the plurality of bit encoding schemes selectively encodes different frequency coefficients with respective different numbers of bits to provide a different degree of compression and wherein the bit encoding scheme is selected at least in part based on:
i) a desired degree of compression; and,
ii) a position of the array of pixels in the one or more images;
d) selectively encoding at least some of the frequency coefficients using the selected bit encoding scheme to thereby generate a set of encoded frequency coefficients, wherein the bit encoding scheme defines the number of bits used to encode each of the frequency coefficients so that when the frequency coefficients are selectively encoded:
i) at least some of the encoded frequency coefficients are encoded with different numbers of bits; and,
ii) at least one frequency coefficient is discarded so that the set of encoded frequency coefficients is smaller than the set of frequency coefficients; and,
e) generating compressed image data using the encoded frequency coefficients.
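
For illustration, the sketch below transforms a pixel block to frequency coefficients and applies a bit encoding scheme whose per-coefficient bit counts depend on a desired compression and the block's position in the image; the DCT choice, the bit-allocation formula, and the SciPy dependency are assumptions, not the claimed schemes.

```python
# Hypothetical sketch: transform a pixel block to frequency coefficients, then encode each
# coefficient with a bit budget chosen from the desired compression and the block position.
import numpy as np
from scipy.fft import dctn


def encode_block(block, desired_compression, block_centre_offset):
    """block: (8, 8) pixel array; block_centre_offset: 0 (image centre) .. 1 (image edge)."""
    coeffs = dctn(block.astype(float), norm="ortho")           # frequency components of the block
    # Bit allocation: low frequencies keep more bits; compression and eccentricity reduce them.
    base_bits = np.maximum(8 - np.add.outer(np.arange(8), np.arange(8)), 0)
    bits = np.floor(base_bits * (1 - desired_compression)
                    * (1 - 0.5 * block_centre_offset)).astype(int)
    encoded = []
    for c, b in zip(coeffs.ravel(), bits.ravel()):
        if b <= 0:
            continue                                           # coefficient discarded entirely
        step = np.abs(coeffs).max() / (2 ** (b - 1)) or 1.0    # quantisation step for b bits
        encoded.append((int(round(c / step)), int(b)))
    return encoded


block = np.tile(np.arange(8), (8, 1)) * 4.0
print(len(encode_block(block, desired_compression=0.5, block_centre_offset=0.8)))
```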

US Pat. No. 10,657,673

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM TO CORRECT PIXELS USING SINGULAR VALUE DECOMPOSITION

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus comprising:a processor; and
a memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising:
performing development processing on an image,
acquiring a singular value by carrying out singular value decomposition with respect to a pixel value contained in a partial region in the image on which the development processing has been performed,
correcting the singular value according to a value of a parameter in the development processing, and
calculating a corrected pixel value in the partial region with use of the corrected singular value,
wherein the singular value decomposition is a processing of decomposing a first matrix into a second matrix, a third matrix, and a fourth matrix,
the second matrix is an orthogonal matrix,
the third matrix is a matrix in which an off-diagonal element is zero and a diagonal element has a non-negative value, and
the fourth matrix is an orthogonal matrix.
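
A minimal sketch of correcting a partial region via singular value decomposition, damping the singular values according to a development-parameter value before reconstruction; the thresholding rule is an assumption, not the claimed correction.

```python
# Hypothetical sketch: decompose an image patch with an SVD, shrink the smaller singular
# values according to a development parameter (e.g. noise-reduction strength), and rebuild.
import numpy as np


def correct_patch(patch, strength):
    """patch: 2-D array of pixel values; strength: development parameter in [0, 1]."""
    u, s, vt = np.linalg.svd(patch, full_matrices=False)   # second, third and fourth matrices
    threshold = strength * s.max()
    s_corrected = np.where(s >= threshold, s, s * (1 - strength))   # damp weak components
    return u @ np.diag(s_corrected) @ vt


rng = np.random.default_rng(1)
noisy = np.outer(np.arange(8.0), np.ones(8)) + 0.1 * rng.standard_normal((8, 8))
print(np.round(correct_patch(noisy, strength=0.3), 2).shape)   # (8, 8)
```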

US Pat. No. 10,657,672

IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND STORAGE MEDIUM

NEC CORPORATION, Minato-...

1. An image processing device comprisinga memory that stores a set of instructions; and
at least one processor configured to execute the set of instructions to:
estimate similarity of an input image to particle images including areas of particles at relative positions, true positions of the particles in the particle images being given;
categorize the particle images into groups based on distances between the true positions of the particles included in the particle images;
determine a group from the groups based on the similarity of the input image to the particle images grouped into the group; and
calculate positions of particles in the input image based on the similarity and the true particle positions of the particle images grouped into the determined group.

US Pat. No. 10,657,671

SYSTEM AND METHOD FOR NAVIGATION TO A TARGET ANATOMICAL OBJECT IN MEDICAL IMAGING-BASED PROCEDURES

Avent, Inc., Alpharetta,...

1. A method for providing navigational directions to a user to locate a target anatomical object during a medical procedure via a medical imaging system, wherein the method is computer-implemented, the method comprising:selecting an anatomical region surrounding the target anatomical object;
generating a plurality of real-time two-dimensional images of scenes from the anatomical region surrounding the target anatomical object and providing the plurality of real-time two-dimensional images to a controller;
developing and training a deep learning network to automatically detect and identify the scenes from the anatomical region surrounding the target anatomical object via ground truth data by scanning and collecting a dataset of a plurality of images of scenes from the anatomical region surrounding the target anatomical object from each of a plurality of patients; annotating the dataset of images based on user input to create the ground truth data; dividing the dataset of images and the ground truth data into a training dataset and a validation dataset; and utilizing the training dataset to train the deep learning network, wherein the deep learning network comprises a feed-forward neural network, a recurrent neural network, or a combination thereof;
automatically mapping each of the plurality of real-time two-dimensional images from the anatomical region surrounding the target anatomical object based on a relative spatial location and a relative temporal location of each of the identified scenes in the anatomical region via the deep learning network; and
providing directions to the user to locate the target anatomical object during the medical procedure based on the relative spatial location and the relative temporal location of each of the identified scenes.

US Pat. No. 10,657,670

INFORMATION PROCESSING APPARATUS

Hitachi Automotive System...

1. An information processing apparatus, comprising:a far information acquisition unit which acquires far information regarding a far region;
a near information acquisition unit which acquires near information regarding a near region;
a far information processing unit which performs far information processing using the far information;
a near information processing unit which performs near information processing using the near information; and
an allocated resource changing unit which changes the magnitudes of resources allocated to the far information processing unit and the near information processing unit, according to priorities of the far information processing and the near information processing,
wherein the allocated resource changing unit allocates the resources of the magnitudes determined according to the priority to the far information processing unit, even when the priority of the near information processing is higher than the priority of the far information processing.

US Pat. No. 10,657,669

DETERMINATION OF A GEOGRAPHICAL LOCATION OF A USER

BEIJING KUANGSHI TECHNOLO...

1. A method for determining a geographical location of a user, comprising:extracting characters or icons in an image taken at a place where the user is located, the image reflecting an environment where the user is located;
analyzing the extracted characters or icons to determine a meaning of the characters or the icons, analyzing the extracted characters or icons to determine a meaning of the characters or the icons including:
organizing the extracted characters into a character string in a row or column order,
analyzing the character string to determine one or more words with a specific meaning by performing word segmentation on the character string, and
matching the extracted icons with pre-stored icons with specific meanings to determine the meaning of the extracted icons; and
determining the geographical location of the user based on the meaning of the characters or the icons, determining the geographical location of the user based on the meaning of the characters or the icons including:
selecting geographical locations which are searched for in a predetermined geographical region on a map and are associated with the meaning of the one or more words or the icons as candidate geographical locations, and
determining the geographical location of the user based on the candidate geographical locations.

US Pat. No. 10,657,668

SYSTEMS AND METHODS FOR DETECTING AND TRACKING A MARKER

Tata Consultancy Services...

1. A method for detecting and tracking a marker, the method comprising:performing shape based segmentation of at least one object detected in a first frame from a sequence of frames, the at least one object having a shape in line with shape of the marker to define a region of interest (ROI) surrounding an object of interest corresponding to the marker, the ROI comprising a plurality of pixels, wherein performing shape based segmentation comprises:
receiving the first frame containing the at least one object;
performing shape based feature extraction on the first frame to detect the at least one object;
eliminating false objects from the at least one object to identify the object of interest, wherein eliminating the false objects comprises use of a color density based band-pass filter; and
defining the region of interest (ROI) surrounding the object of interest; and
iteratively performing until a last frame from the sequence of frames is received:
dynamically training and updating a marker detection model based on sampling points from the plurality of pixels in and around the ROI, wherein dynamically training and updating a marker detection model comprises:
classifying the plurality of pixels in the ROI as marker pixels and pixels around the ROI as non-marker pixels;
training and updating the marker detection model being a support vector machine (SVM), using the marker pixels, the non-marker pixels and velocity of the marker corresponding to one or more frames under consideration in relation to a previous frame in the sequence of frames; and
tracking the marker in real-time based on projected ROI in subsequent frames of the sequence of frames and the marker detection model.
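
An illustrative sketch, in Python with scikit-learn, of the pixel-labelling step recited above: pixels inside the ROI are treated as marker pixels and a band of pixels around it as non-marker pixels, and an SVM is refit on them. The function name update_marker_model, the (x0, y0, x1, y1) ROI convention, the margin parameter and the linear kernel are assumptions for illustration; the claim additionally uses the marker velocity as a training input, which this sketch omits.

import numpy as np
from sklearn.svm import SVC

def update_marker_model(frame, roi, margin=10):
    # roi = (x0, y0, x1, y1); frame is an H x W x C colour image.
    x0, y0, x1, y1 = roi
    h, w = frame.shape[:2]
    # Pixels inside the ROI are labelled as marker pixels.
    inner = frame[y0:y1, x0:x1].reshape(-1, frame.shape[-1]).astype(float)
    # A band of pixels around the ROI (excluding the ROI itself) is labelled as non-marker.
    band = np.zeros((h, w), dtype=bool)
    band[max(0, y0 - margin):min(h, y1 + margin), max(0, x0 - margin):min(w, x1 + margin)] = True
    band[y0:y1, x0:x1] = False
    outer = frame[band].astype(float)
    features = np.concatenate([inner, outer])
    labels = np.concatenate([np.ones(len(inner)), np.zeros(len(outer))])
    model = SVC(kernel="linear")
    model.fit(features, labels)          # refit on every frame, as in the iterative update
    return model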

US Pat. No. 10,657,666

SYSTEMS AND METHODS FOR DETERMINING COMMERCIAL TRAILER FULLNESS

Symbol Technologies, LLC,...

1. A three-dimensional (3D) depth imaging system for use in commercial trailer loading, the 3D depth imaging system comprising:a 3D-depth camera configured to capture 3D image data, the 3D-depth camera oriented in a direction to capture 3D image data of an interior of a vehicle storage area; and
a depth-detection application executing on one or more processors, the depth-detection application determining, based on the 3D image data, at least a wall data region and a non-wall data region,
wherein the determination of the wall data region and the non-wall data region causes the depth-detection application to generate a wall indicator, the wall indicator indicating that a wall is situated at a discrete depth within the interior of the vehicle storage area, and
wherein the determination of the wall data region and the non-wall data region comprises:
calculating a histogram of depths of the 3D image data;
identifying a plurality of histogram peaks;
identifying a highest peak from the plurality of histogram peaks and filtering the plurality of histogram peaks having a height that falls outside of a threshold percentage of the highest peak; and
identifying a peak from the plurality of histogram peaks having a furthest distance from the 3D-depth camera and filtering the remaining plurality of histogram peaks located at a distance from the peak having the furthest distance from the 3D-depth camera.
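
The histogram-peak filtering recited above can be illustrated with a short Python sketch. The function find_wall_depth, its bin_width and peak_height_pct parameters, and the assumption that the depths arrive as a flat array with some spread are all hypothetical; only the filtering logic (drop peaks far below the highest one, then take the surviving peak farthest from the camera) follows the claim.

import numpy as np

def find_wall_depth(depth_samples, bin_width=0.1, peak_height_pct=0.5):
    depths = np.asarray(depth_samples, dtype=float)
    # Histogram of the depths reported by the 3D-depth camera.
    bins = np.arange(depths.min(), depths.max() + bin_width, bin_width)
    hist, edges = np.histogram(depths, bins=bins)
    # Local peaks: bins at least as tall as both neighbours.
    peaks = [i for i in range(1, len(hist) - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1] and hist[i] > 0]
    if not peaks:
        return None
    highest = max(peaks, key=lambda i: hist[i])
    # Filter out peaks whose height falls outside the threshold percentage of the highest peak.
    strong = [i for i in peaks if hist[i] >= peak_height_pct * hist[highest]]
    # The wall indicator corresponds to the surviving peak farthest from the camera.
    wall = max(strong)
    return 0.5 * (edges[wall] + edges[wall + 1])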

US Pat. No. 10,657,665

APPARATUS AND METHOD FOR GENERATING THREE-DIMENSIONAL INFORMATION

ELECTRONICS AND TELECOMMU...

1. An apparatus for generating three-dimensional information, the apparatus comprising:a laser light source providing laser light to an object to be reconstructed in three-dimensional information;
a coordinate reference mechanism unit provided between the laser light source and the object and having a plurality of protrusions reflecting the laser light;
a camera unit outputting an image capturing the coordinate reference mechanism unit and the object simultaneously; and
a three-dimensional information processing unit generating three-dimensional information of the object by identifying a projection plane formed by the laser light and using the projection plane, considering a relationship between a plurality of actual protrusion reflection points at which the laser light is reflected by the plurality of protrusions respectively and a plurality of protrusion reflection points displayed in the image,
wherein the coordinate reference mechanism unit includes a base plate on which the plurality of protrusions are fixed, and a fixing arm connecting the laser light source and the base plate to each other.

US Pat. No. 10,657,664

METHOD AND SYSTEM FOR IMAGE-BASED IMAGE RENDERING USING A MULTI-CAMERA AND DEPTH CAMERA ARRAY

1. A method of rendering new images, comprising:acquiring a set of images of a scene using a set of image cameras, each of the image cameras in the set of image cameras being positioned within a multi-array grid towards the scene;
acquiring a set of depth values of the scene using a set of depth cameras, each of the depth cameras in the set of depth cameras being positioned within the multi-array grid towards the scene and the set of image cameras and the set of depth cameras comprise different camera types;
generating, based on the set of depth values, a geometry of the scene;
determining, a new camera position of the scene that is different than positions of the set of image cameras in relation to the scene and positions of the set of depth cameras in relation to the scene;
generating, based on the acquired set of images of the scene and the geometry of the scene, pixel values associated with the new camera position of the scene, wherein generating a pixel value of the pixel values comprises:
determining a ray associated with the new camera position of the scene, wherein the ray intersects a first portion of the geometry of the scene;
determining, based upon a location of the first portion of the geometry of the scene, a first plurality of rays, each ray in the first plurality of rays intersects the first portion of the geometry of the scene;
acquiring, based on the set of depth values of the scene, a depth value associated with each of the first plurality of rays;
filtering, based on the depth values associated with each of the first plurality of rays, rays from the first plurality of rays that do not have a same depth value relative to a common plane to generate a second plurality of rays; and
interpolating one or more rays of the second plurality of rays to generate the pixel value; and
rendering the pixel values to generate an image of the scene from the new camera position.
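
A minimal sketch of the per-pixel ray filtering and interpolation steps, assuming the first plurality of rays has already been gathered as colour/depth pairs. The median-based consensus depth, the depth_tolerance parameter and the plain averaging are illustrative choices rather than the claimed procedure.

import numpy as np

def pixel_value_from_rays(ray_colors, ray_depths, depth_tolerance=0.05):
    # ray_colors: (N, 3) colours sampled along the candidate rays; ray_depths: (N,) depths.
    colors = np.asarray(ray_colors, dtype=float)
    depths = np.asarray(ray_depths, dtype=float)
    reference = np.median(depths)                           # consensus depth relative to a common plane
    keep = np.abs(depths - reference) <= depth_tolerance    # filter rays that disagree on depth
    if not keep.any():
        keep[:] = True                                      # degenerate case: keep everything
    return colors[keep].mean(axis=0)                        # simple average as the interpolation step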

US Pat. No. 10,657,663

LOCALISATION AND MAPPING

Sony Interactive Entertai...

1. A method of generating a three-dimensional map of a region from successive images of the region captured from different camera poses, the map comprising a set of landmark points each defined by a three dimensional spatial position and image information associated with the three dimensional spatial position, the method comprising:capturing successive images of the region using a camera;
detecting, by a processor, feature points within the captured images;
designating, by the processor, a subset of the captured images as a set of keyframes each having respective sets of image position data representing image positions of landmark points detected as feature points in that image;
detecting, by the processor, an angular velocity of the camera between a previous image and a current image of the successive images using at least one of (i) one or more accelerometers, or (ii) one or more gyroscopes;
in respect of a newly captured image, detecting, by the processor, a pose of the camera by detecting the position of landmark points in the newly captured image, the detecting step comprising:
integrating the angular velocity of the camera since the previous image was captured to obtain an estimated rotation; and
predicting the positions of the landmark points in the newly captured image from the estimated rotation and the position of the landmark points in the previous image.
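
The integrate-then-predict step can be sketched as follows, assuming a pinhole camera with intrinsic matrix K, a world-to-camera pose (R_prev, t_prev) for the previous image, and landmark points given as an (N, 3) array; rodrigues and predict_landmark_positions are hypothetical helper names.

import numpy as np

def rodrigues(rvec):
    # Rotation matrix from an axis-angle vector (Rodrigues' formula).
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    kx = np.array([[0.0, -k[2], k[1]],
                   [k[2], 0.0, -k[0]],
                   [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * kx + (1.0 - np.cos(theta)) * (kx @ kx)

def predict_landmark_positions(omega, dt, R_prev, t_prev, landmarks, K):
    # Integrate the gyroscope rate over the inter-frame interval to get the estimated rotation,
    # then reproject the landmark points to predict where they appear in the new image.
    R_pred = rodrigues(omega * dt) @ R_prev
    cam_pts = (R_pred @ landmarks.T + t_prev.reshape(3, 1)).T
    uv = (K @ cam_pts.T).T
    return uv[:, :2] / uv[:, 2:3]          # predicted pixel positions of the landmark points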

US Pat. No. 10,657,662

DISPLAY METHOD AND SYSTEM FOR ENABLING AN OPERATOR TO VISUALIZE AND CORRECT ALIGNMENT ERRORS IN IMAGED DATA SETS

D4D Technologies, LLC, R...

1. A method of visualizing and correcting alignment errors between 2D and 3D data sets in an implant planning tool, wherein the 3D data set comprises one or more spatial points each having an intensity, comprising:displaying a textured view that is generated by providing a given display indication on a surface of the 2D data set as a function of intensities of one or more spatial points of the 3D data set at which the surface intersects the 3D data set, where the textured view is generated in software executing in a hardware element responsive to receipt of operator-entered data indicating one or more common points on the 2D data set and the 3D data set and illustrates an alignment between the 2D and 3D data sets;
in association with the textured view, displaying one or more slices of the 3D data set, wherein each of the slices of the 3D data set is overlaid with a wireframe projection from the 2D data set to form a visualization; and
responsive to receipt of operator-entered data indicating a rotation or movement of the 2D data set, updating the visualizations as the 2D and 3D data sets are transformed relative to each other.

US Pat. No. 10,657,661

INFORMATION PROCESSING APPARATUS, IMAGING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM CAUSING COMPUTER TO EXECUTE INFORMATION PROCESSING

Canon Kabushiki Kaisha, ...

1. An image processing system for medical images, comprising:a non-transitory memory that stores executable instructions; and
at least one processor coupled to the non-transitory memory, wherein responsive to executing the instructions, the at least one processor is configured to:
acquire a plurality of first positions in a three-dimensional image of a subject as positions of a sternum in the three-dimensional image of the subject;
acquire a plurality of first body surface positions of the subject as positions of the sternum in the subject;
calculate a first sternum direction vector as a direction of the sternum in the three-dimensional image based on the plurality of first positions;
calculate a second sternum direction vector as a direction of the sternum in the subject based on the plurality of first body surface positions; and
calculate a first correspondence relationship in a position of the sternum between the three-dimensional image and the subject, by (a) matching a first position of the sternum in the three-dimensional image and a first body surface position of the sternum in the body of subject and (b) matching the first sternum direction vector and the second sternum direction vector,
wherein the at least one processor is configured to further acquire a plurality of second positions in the three-dimensional image of the subject as positions of ribs in the three-dimensional image of the subject,
wherein the at least one processor is configured to further acquire a plurality of second body surface positions of the subject as positions of the ribs in the subject,
wherein the at least one processor is configured to further calculate a second correspondence relationship in a direction of rotation with the sternum as an axis between the subject and the three-dimensional image, based on the second positions in the three-dimensional image and the second body surface positions in the subject, and
wherein the at least one processor is configured to generate a tomographic image on the basis of a cross section in the three-dimensional image, the cross section being specified using the first correspondence relationship and the second correspondence relationship.

US Pat. No. 10,657,660

SEARCH ASSIST SYSTEM, SEARCH ASSIST APPARATUS, AND SEARCH ASSIST METHOD

TOYOTA JIDOSHA KABUSHIKI ...

1. A search assist system comprising:a first storage configured to store characteristic information on one or more subjects of identification detected from images or videos picked up by one or more fixed cameras installed at predetermined locations, respectively, and information about the one or more fixed cameras;
a second storage configured to store location information on a plurality of vehicles each of which has an on-board camera;
one or more first controllers configured to:
receive characteristic information on a subject of search;
determine a search target area based on information, stored in the first storage, about a fixed camera among the one or more fixed cameras that has picked up an image or a video from which characteristic information on a subject among the one or more subjects of identification matching at least part of the characteristic information on the subject of search is detected; and
send, to each vehicle that is present within the search target area, a search instruction to search for the subject of search by using the on-board camera; and
a second controller configured to output an image or a video picked up by the on-board camera from which first characteristic information matching at least part of the characteristic information on the subject of search is detected.

US Pat. No. 10,657,659

VISUAL SIMULTANEOUS LOCALIZATION AND MAPPING SYSTEM

Slightech, Inc.

1. A processor-implemented method for visual simultaneous localization and mapping, the method comprising the steps:receiving a sequence of image frames from a camera;
extracting edges for each image frame;
calculating an intensity gradient value for each edge point;
building a 3D map of the surrounding environment by initializing 3D points in 3D spaces and continuously optimizing the 3D coordinates of each 3D map point;
generating a synthetic gradient field based on an edge image;
tracking the pose of said camera by computing transformation parameters for each image frame, wherein the transformation parameters further comprise parameters for aligning each synthetic gradient field to the 3D map of the environment;
wherein computing the transformation parameters and optimizing the map comprises using an iterative process to identify corresponding points from both the synthetic gradient field and the 3D map points and optimizing an error metric applied to the identified corresponding points; and
wherein initializing a map of the surrounding environment comprises extracting edges from an image frame and back projecting 2D edge points into 3D space with random initial depth values, wherein each 3D map point keeps the intensity gradient value computed at corresponding 2D edge point.

US Pat. No. 10,657,658

TRANSFORMATION MATRIX DERIVING DEVICE, POSITION ESTIMATION APPARATUS, TRANSFORMATION MATRIX DERIVING METHOD, AND POSITION ESTIMATION METHOD

Kabushiki Kaisha Toshiba,...

1. A transformation matrix deriving device position estimation apparatus, comprising:a first trajectory generator configured to generate a first trajectory that is a motion trajectory of a moving object in a first coordinate system, the moving object being detected from a video;
a second trajectory generator configured to generate a second trajectory from time-series data of positional information output from a position sensor, the second trajectory being a motion trajectory of a moving object having the position sensor in a second coordinate system;
a trajectory matcher configured to associate the first trajectory with the second trajectory, the first trajectory and the second trajectory being estimated to be a motion trajectory of a same moving object based on a similarity between the first trajectory and the second trajectory;
a deriver configured to derive a transformation matrix that transforms the second coordinate system into the first coordinate system by using the first trajectory and the second trajectory associated with each other; and
an estimator configured to estimate a position of the moving object in the first coordinate system by mapping the second trajectory in the first coordinate system by using the transformation matrix,
wherein the estimator integrates the first trajectory with the second trajectory mapped in the first coordinate system for each moving object and estimates a position of the moving object in the first coordinate system at each instant of time based on an integrated motion trajectory of the moving object,
the deriver further calculates a time difference between a time stamp of the video captured by a camera and a time stamp of positional information output from the position sensor, from an amount of shift by which the first trajectory or the second trajectory is shifted along a time axis to minimize a distance between the first trajectory and the second trajectory associated with each other, and
the estimator shifts the first trajectory or the second trajectory mapped in the first coordinate system along the time axis in accordance with the time difference, and integrates the first trajectory with the second trajectory mapped in the first coordinate system for each moving object.
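
One common way to derive such a transformation matrix from two associated trajectories is a least-squares (Umeyama-style) similarity fit over corresponding points, sketched below. It assumes the trajectories have already been associated and resampled to equal length, and it omits the time-shift estimation the claim also recites; fit_similarity_transform is a hypothetical name.

import numpy as np

def fit_similarity_transform(src, dst):
    # src: (N, d) points in the second (sensor) coordinate system,
    # dst: (N, d) corresponding points in the first (video) coordinate system.
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(dst_c.T @ src_c)
    D = np.eye(src.shape[1])
    if np.linalg.det(U @ Vt) < 0:
        D[-1, -1] = -1.0                  # avoid a reflection
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / (src_c ** 2).sum()
    t = mu_d - scale * R @ mu_s
    T = np.eye(src.shape[1] + 1)          # homogeneous transformation matrix
    T[:-1, :-1] = scale * R
    T[:-1, -1] = t
    return T

A point p from the position sensor is then mapped into the first coordinate system as the first d entries of T applied to the homogeneous vector (p, 1).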

US Pat. No. 10,657,657

METHOD, SYSTEM AND APPARATUS FOR DETECTING A CHANGE IN ANGULAR POSITION OF A CAMERA

Canon Kabushiki Kaisha, ...

1. A method of detecting a change in angular position of a camera, the method comprising:receiving a plurality of frames captured by the camera, each frame being associated with brightness and acceleration metadata, the brightness metadata associated with a frame being representative of brightness of a scene captured by the frame;
selecting a contiguous set of candidate panning motion frames from the plurality of frames, based on similarity of acceleration characteristics, to determine a change of brightness metadata in the selected contiguous set of frames;
determining an angular movement score for the selected contiguous set of frames using the change of brightness metadata; and
detecting a change in the angular position of the camera based on the angular movement score to adjust a manner in which the plurality of frames is displayed by a display device coupled with the camera,
wherein the determining includes classifying panning motion associated with the selected contiguous set of frames as either a smooth pan or a snap pan based on the acceleration metadata and the change of brightness metadata, and frames in the snap pan are adjusted not to be selected for display as video highlights.

US Pat. No. 10,657,656

VIRTUAL GENERATION OF LABELED MOTION SENSOR DATA

INTERNATIONAL BUSINESS MA...

9. A computer-implemented method, comprising:tracking, by a system operatively coupled to a processor, virtual location data corresponding to a feature of a computer animated character in a virtual environment;
based on the virtual location data, generating, by the system, virtual motion sensor data;
based on the virtual motion sensor data, employing, by the system, machine learning to train a predictive model to identify one or more movement activities of an entity, wherein the predictive model is trained to dynamically control an amount of random variation to apply to the virtual motion sensor data to identify the one or more movement activities of the entity within a defined range of acceptable variation from the virtual motion sensor data.

US Pat. No. 10,657,655

VR CONTENT SICKNESS EVALUATING APPARATUS USING DEEP LEARNING ANALYSIS OF MOTION MISMATCH AND METHOD THEREOF

Korea Advanced Institute ...

1. A virtual reality (VR) content sickness evaluating apparatus, the apparatus comprising:a memory; and
at least one processor connected to the memory, and configured to execute computer readable instructions included in the memory,
wherein the at least one processor is configured to
analyze visual recognition information according to a visual recognition motion feature based on a change in motion of VR content;
analyze posture recognition information according to a posture recognition motion feature based on a change in motion of a user, the change being received from a sensing module; and
determine a degree of sickness induced by the VR content from a difference between the visual recognition information and the posture recognition information,
wherein the at least one processor is configured to:
analyze a change in motion of the VR content the user recognizes with his or her eyes, using a convolutional neural network (CNN) and a convolutional long short-term memory (conv LSTM); and
extract the visual recognition information of motion information according to the visual recognition motion feature of a temporal factor and a spatial factor using the CNN and the conv LSTM.

US Pat. No. 10,657,654

ABNORMALITY DETECTION DEVICE AND ABNORMALITY DETECTION METHOD

DENSO TEN Limited, Kobe-...

1. An abnormality detection device, comprising:an estimator configured to estimate an amount of movement of a mobile body based on an image taken by a camera mounted on the mobile body; and
a determiner configured to determine an abnormality in the camera by obtaining
estimated information on the amount of movement of the mobile body as obtained in the estimator and
actually observed information on movement of the mobile body as detected by an external sensor, other than the camera, mounted on the mobile body.

US Pat. No. 10,657,653

DETERMINING ONE OR MORE EVENTS IN CONTENT

Comcast Cable Communicati...

1. A method comprising:receiving, by a first computing device, an indication of an occurrence of an event within content;
determining, based on the indication of the occurrence of the event within the content:
an expected motion of objects associated with the event; and
a portion, of the content, in which the event is expected to occur;
determining, after receiving the indication of the occurrence of the event within the content, and based on comparing the expected motion of objects with a motion of objects in the portion, a subset, of the portion, in which the event occurs; and
sending, to a second computing device, the subset of the portion.

US Pat. No. 10,657,652

IMAGE MATTING USING DEEP LEARNING

Adobe Inc., San Jose, CA...

1. A computer-implemented method comprising:receiving an image, the image having a corresponding trimap that indicates a blended region of the image;
identifying, by a neural network system, structure or texture information corresponding to an object in the image;
determining, by the neural network system, percentages of foreground information for pixels in the blended region of the image using the structure or texture information, the blended region of the image indicated using the trimap; and
generating, by the neural network system, a matte for the image using the percentages of foreground information for the pixels in the blended region.

US Pat. No. 10,657,651

SYSTEMS AND METHODS FOR DETECTION OF SIGNIFICANT AND ATTRACTIVE COMPONENTS IN DIGITAL IMAGES

THE PENN STATE RESEARCH F...

1. A method of electronically assessing a visual significance of pixels or regions in an electronic image, the method comprising:constructing, by the processing device, an attributed composition graph comprising a plurality of nodes, wherein each node of the plurality of nodes corresponds to a segment of a plurality of segments of the electronic image or a part of a plurality of parts of the electronic image and wherein each node of the plurality of nodes comprises one or more attributes;
modeling, by the processing device, the visual significance of the electronic image based on the attributed composition graph using a statistical modeling process or a computational modeling process to obtain a plurality of values; and
constructing, by the processing device, a composition significance map comprising a significance score for each segment of the plurality of segments or each part of the plurality of parts according to the values obtained from the statistical modeling process or the computational modeling process.

US Pat. No. 10,657,650

SYSTEM AND METHOD FOR FINDING AND CLASSIFYING LINES IN AN IMAGE WITH A VISION SYSTEM

Cognex Corporation, Nati...

1. A system for finding line features in an acquired image based upon one or more cameras comprising:a vision system processor;
an interface associated with the vision system processor, that allows creation of discrete labels with respect to relevant lines located by a line-finding process in a training image of the object;
a runtime line-finding process that locates lines in an acquired image;
a neural net process that employs classifiers, based on the labels, to determine a probability map for line features relative to the labels; and
a runtime result-generation process that provides labels and probability scores for at least one of the relevant lines.

US Pat. No. 10,657,649

METHOD, APPARATUS AND SYSTEM FOR ANALYZING MEDICAL IMAGES OF BLOOD VESSELS

AGFA HEALTHCARE, Vienna ...

1. A method for segmentation and/or shape detection of blood vessels in medical images, the method comprising the steps of:classifying a surrounding of a vessel in a medical image by applying a first classifier to the medical image, the surrounding of the vessel being assigned to one of at least two surrounding classes; and
segmenting the vessel depending on the surrounding class to which the surrounding of the vessel has been assigned; wherein
a method used to perform the step of segmenting is selected from at least two predefined segmentation methods based on the surrounding class of the vessel assigned in the step of classifying;
the at least two surrounding classes are different in a concentration of bone structures in proximity of the vessel;
the at least two surrounding classes include a first surrounding class and a second surrounding class, a concentration of bone structures in proximity of the vessel in the first surrounding class is higher than a concentration of bone structures in proximity of the vessel in the second surrounding class; and
if the surrounding of the vessel in the medical image is assigned to the first surrounding class, the segmentation of the vessel includes applying a learning-based ray casting algorithm to the medical image.

US Pat. No. 10,657,648

BORDER TRACING

String Limited, Surrey (...

1. A method for finding borders in an image represented by pixels for processing of borders found in the image, the method including a first operation comprising:estimating an attribute for each of a first set of two adjacent pixel positions in the image;
storing the estimated attributes for each of the two adjacent pixel positions of the first set;
assessing whether a predetermined binary condition differs in respect of the first set of two adjacent pixel positions, the assessing step comprising determining the predetermined binary condition by comparing a scalar attribute of each of the first set of two adjacent pixel positions against a predetermined value, and if the predetermined binary condition differs in respect of the first set of two adjacent pixel positions determining that a border is present in a part of the image represented by pixels at those positions; and
if a border is determined to be present in that part of the image, estimating a direction of the border as being perpendicular to a line joining the first set of two adjacent pixel positions; and
initiating tracing the border in that direction, wherein tracing the border comprises steps of:
selecting a second set of two adjacent pixel positions, each of those pixel positions being adjacent to a respective one of the two adjacent pixel positions of the first set and offset from that pixel position of the first set in the estimated direction;
estimating an attribute for each of the two adjacent pixel positions of the second set;
estimating a new direction of the border in dependence on the estimated attributes of the second set of two adjacent pixel positions and the stored estimated attributes of the first set of two adjacent pixel positions;
and subsequently further tracing the border;
the method further comprising storing as part of a set of locations representing a location of the border in the image the location where a linear interpolation of position with respect to the scalar attributes of the two adjacent pixel positions of the first set intersects a predetermined value.
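
The final storing step, locating the border by linear interpolation of position with respect to the scalar attributes of the two adjacent pixel positions, reduces to a small helper like the one below; the name subpixel_border_location and the (x, y) tuple convention are assumptions.

def subpixel_border_location(p0, p1, a0, a1, threshold):
    # p0, p1: (x, y) of the two adjacent pixel positions; a0, a1: their scalar attributes.
    if (a0 - threshold) * (a1 - threshold) > 0:
        return None                        # the binary condition does not differ: no border here
    if a1 == a0:
        return (0.5 * (p0[0] + p1[0]), 0.5 * (p0[1] + p1[1]))
    t = (threshold - a0) / (a1 - a0)       # where the attribute crosses the predetermined value
    return (p0[0] + t * (p1[0] - p0[0]), p0[1] + t * (p1[1] - p0[1]))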

US Pat. No. 10,657,647

IMAGE PROCESSING SYSTEM TO DETECT CHANGES TO TARGET OBJECTS USING BASE OBJECT MODELS

CCC INFORMATION SERVICES,...

1. An image processing system that detects changes in a target object depicted in one or more target object images, comprising:one or more processors;
one or more computer readable memories;
one or more base object models stored on the one or more computer readable memories, each base object model of the one or more base object models representing a respective base object from which changes are to be detected;
one or more statistically based change detection routines stored on the one or more computer readable memories, each of the statistically based change detection routines configured to identify a respective change in the target object as compared to a base object associated with the target object; and
a routine stored on the one or more computer readable memories for execution on the one or more processors, that when executed on the one or more processors:
determines a particular base object model that corresponds to the base object associated with the target object and that is stored within the one or more computer readable memories;
compares the particular base object model to the target object image to detect one or more components of the target object within the target object image; and
scans each surface area segment of a plurality of surface area segments of the one or more detected components of the target object within the target object image by applying at least one of the one or more statistically based change detection routines to the each surface area segment to produce a respective change parameter value for the each surface area segment to thereby determine one or more changes in the target object as depicted in the target object image, wherein:
the application of the at least one of the one or more statistically based change detection routines to the each surface segment to produce the respective change parameter value for the each surface area segment includes an application of a particular change detection routine of the one or more statistically based change detection routines to each pixel of a particular area of the target object image, and
the each pixel of the particular area is used as input into the particular change detection routine to produce a respective change value parameter for the each pixel that is included in an output of the particular change detection routine.

US Pat. No. 10,657,646

SYSTEMS AND METHODS FOR MAGNETIC RESONANCE IMAGE RECONSTRUCTION

UIH AMERICA, INC., Husto...

1. A method implemented on a magnetic resonance imaging (MRI) system, the MRI system including a magnetic resonance (MR) scanner and a computing apparatus, the computing apparatus including at least one processor and at least one storage device, the method comprising:acquiring, by the MR scanner, MR signals;
generating, by the at least one processor, image data in a k-space according to the MR signals;
classifying, by the at least one processor, the image data in the k-space into a plurality of phases, each of the plurality of phases having a first count of spokes, a spoke being defined by a trajectory for filling the k-space;
classifying, by the at least one processor, the plurality of phases of the image data in the k-space into a plurality of groups, each of the plurality of groups including at least one of the plurality of phases of the image data in the k-space;
determining, by the at least one processor, reference images based on the plurality of groups, each of the reference images corresponding to the at least one of the plurality of phases of the image data in the k-space; and
reconstructing, by the at least one processor, an image sequence based on the reference images and the plurality of phases of the image data in the k-space.

US Pat. No. 10,657,645

VOXEL-BASED METHODS FOR ASSESSING SUBJECTS USING MOLECULAR MAGNETIC RESONANCE IMAGING

THE TRUSTEES OF COLUMBIA ...

1. A computer-implemented method for diagnosing or determining risk of Alzheimer's disease in a subject, the method comprising:(a) determining the presence of a magnetic resonance signal from a contrast agent in at least one individual in a control group and at least one individual in a reference group to generate primary brain scan image voxel data of contrast agent distribution in a brain of at least one individual in the control group and at least one individual in the reference group,
(b) generating secondary brain scan image data for the individuals in the control and reference groups, wherein the secondary scan brain image data is generated using magnetic resonance imaging,
(c) generating a probability-corrected time-activity curve data for each voxel in the primary brain scan image of the at least one individual in the control and reference group,
(d) processing the probability-corrected time-activity curve data of the at least one individual in the control and reference group to generate a voxel binding outcome map data of the at least one individual in the control and reference group,
(e) transforming the voxel binding outcome map data of the at least one individual in the control and reference group into a normalized space to generate a normalized voxel binding outcome map data of the at least one individual in the control and reference group
(f) processing the normalized voxel binding outcome map data of the at least one individual in the control and reference group using statistical analysis to identify one or more voxels of interest (VOI) in the normalized voxel binding outcome map data to generate a VOI map data for differentiating of the at least one individual in the control and reference group, and
(g) applying the VOI map data to the voxel binding outcome map data of the test subject to generate a mean masked binding value to diagnose or determine risk of Alzheimer's disease in the subject wherein the voxel binding outcome map data of said test subject is generated from Magnetic Resonance Imaging (MRI) using a MRI scanner.

US Pat. No. 10,657,644

BREAST CANCER DETECTION

International Business Ma...

1. A computer-implemented method for a hybrid detection model for use in breast cancer detection, the method comprising:applying, to each level of grey for each pixel in a region of interest in a digital image, a predetermined morphological filter, wherein the morphological filter reduces grey levels of the digital image to generate a first set of values;
applying, to each level of grey for each pixel in the region of interest in the digital image, a predetermined entropy filter to capture a set of maximum grey values, wherein the maximum grey values reflect a local maximum relative to other grey values in the digital image;
generating a hybrid result that is a combination of the outputs of the predetermined morphological filter and the predetermined entropy filter; and
segmenting, using the hybrid result, the digital image into potential problem areas.
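
A loose sketch of combining a morphological filter with an entropy filter over a grey-level region of interest. The particular choices here (a grey-level opening from scipy.ndimage, a 16-bin local Shannon entropy, a fixed window size, and an element-wise product as the hybrid result) are illustrative assumptions, not the patented combination.

import numpy as np
from scipy import ndimage

def hybrid_detection_map(image, size=9):
    image = image.astype(float)
    # Morphological filter: a grey-level opening suppresses small bright structures,
    # so the residual (image - opened) keeps what the opening removed.
    opened = ndimage.grey_opening(image, size=size)

    # Entropy filter: local Shannon entropy of the grey levels in each neighbourhood.
    def local_entropy(values):
        hist, _ = np.histogram(values, bins=16)
        p = hist[hist > 0] / hist.sum()
        return float(-(p * np.log2(p)).sum())

    entropy = ndimage.generic_filter(image, local_entropy, size=size)
    # Hybrid result: element-wise combination of the two filter outputs,
    # which can then be thresholded to segment potential problem areas.
    return (image - opened) * entropy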

US Pat. No. 10,657,643

MEDICAL IMAGE ANALYSIS FOR IDENTIFYING BIOMARKER-POSITIVE TUMOR CELLS

Ventana Medical Systems, ...

1. A system for scoring an assay, the system comprising:one or more processors; and
one or more memories coupled to the processor, the memories to store computer-readable instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
identifying a plurality of nuclei in a portion of a first image of a biological sample; and
determining whether regions surrounding each identified nuclei of the plurality of identified nuclei may be associated with a membrane; wherein a positive association of one of the identified nuclei with the membrane results in a determination of a positively stained cell; and wherein the determining of whether the regions surrounding each identified nuclei may be associated with a membrane comprises performing a stroke detection operation and/or a spoke detection operation.

US Pat. No. 10,657,642

SYSTEM, METHOD AND COMPUTER-ACCESSIBLE MEDIUM FOR THE DETERMINATION OF ACCELERATED BRAIN ATROPHY AND AN OPTIMAL DRAINAGE SITE FOR A SUBDURAL HEMATOMA USING COMPUTED TOMOGRAPHY

New York University, New...

1. A non-transitory computer-accessible medium having stored thereon computer-executable instructions for determining at least one attribute of a brain of a patient, wherein, when a computer arrangement executes the instructions, the computer arrangement is configured to perform procedures comprising:receiving information obtained from at least one computed tomography (CT) scan of at least one portion of the intracranial cavity;
generating at least one CT image based on the information; and
determining the at least one attribute of the area(s) of interest identified based on the at least one CT image by segmenting an intracranial space (ICS) in the at least one CT image.

US Pat. No. 10,657,641

SOURCE AND MASK OPTIMIZATION BY CHANGING INTENSITY AND SHAPE OF THE ILLUMINATION SOURCE AND MAGNITUDE AND PHASE OF MASK DIFFRACTION ORDERS

ASML Netherlands B.V., V...

1. A method for obtaining a lithographic process for creating an image of a pattern at an image plane using a patterning device, the method comprising:by a hardware computer system,
based on intensities determined for the subset of diffraction orders, selecting one or more illumination points in an illumination system and determining a transmission mask for the pattern, for which minimum image log slope is maximized in the image at each of a plurality of points, wherein determining the transmission mask comprises determining mask diffraction orders, and determining mask diffraction orders comprises performing a non-linear optimization to find optimal diffraction orders, and
performing a linear optimization by selecting quantized mask transmission to match the optimal diffraction orders,
where the (a) illumination points, (b) the transmission mask, and/or (c) information derived from (a) and/or (b), is configured to design, control and/or modify the physical lithographic process involving the pattern and/or design, control and/or modify a physical object or apparatus to be used in the physical lithographic process.

US Pat. No. 10,657,640

SYSTEM AND METHOD FOR GENERATING IMAGES FOR INSPECTION

Advanced Vision Technolog...

1. A method for generating a test set for inspection of a printed embodiment of a design as printed by a printing press, said printing press including a plurality of color units, each one of said color units configured to print a respective color, said design defined by a computer file comprising information relating to a plurality of original layers, each of said original layers including a topography associated with a color respective thereof, said inspection at least including determining the origin of at least one printed defect in said printed embodiment of the design, the method comprising the steps of:generating, for each member of the test set, at least one defective layer of said design by introducing at least one synthesized defect to at least one selected original layer in at least one selected location;
combining layers using a trained synthesis neural network, said layers including said at least one defective layer and remaining ones of said original layers, said trained synthesis neural network providing as an output thereof a plurality of features respective of each pixel; and
generating said test set from said output of said synthesis neural network, said test set including at least one synthesized test image corresponding to each member of the test set, said at least one synthesized test image including said at least one synthesized defect at said at least one selected location.

US Pat. No. 10,657,639

DETECTING POTENTIALLY DEFECTIVE PACKAGED RADIO-FREQUENCY MODULES

Skyworks Solutions, Inc.,...

1. A method of identifying potentially defective individual packaged modules, the method comprising:receiving a Printed Circuit Board (PCB) including a set of individual module substrates;
capturing an image of a first face of the PCB;
determining, using the image, whether the set of individual module substrates includes potentially defective individual module substrates based on a first set of markings included on the potentially defective individual module substrates; and
in response to determining that the set of individual module substrates includes potentially defective individual module substrates, creating a map of the marked individual module substrates based on the first set of markings, forming an overmold over at least a portion of the first face of the PCB, the overmold covering the first set of markings, and marking locations on the PCB corresponding to potentially defective individual module substrates to create a second set of markings, the locations identified via the map.

US Pat. No. 10,657,638

WAFER MAP PATTERN DETECTION BASED ON SUPERVISED MACHINE LEARNING

Mentor Graphics Corporati...

1. One or more non-transitory computer-readable media storing computer-executable instructions, the computer-executable instructions, when executed, causing one or more processors to perform a method, the method comprising:generating defect pattern variants of one or more defect patterns, the one or more defect patterns being extracted from wafer maps of wafers having at least systematic defects, a wafer map of a wafer being a map generated based on results of one or more measurements performed on the wafer;
superimposing each of the defect pattern variants on wafer maps of wafers having no systematic defects to generate positive training data of wafer maps to be included in a training dataset; and
deriving a trained machine-learning model for recognizing known defect patterns on wafer maps based on the training dataset, the known defect patterns comprising the one or more defect patterns.
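
The variant-generation and superimposition steps can be pictured with binary wafer maps of a common shape: rotations and mirror flips serve as the variants, and a pixel-wise union superimposes a variant onto a clean map. The helper names and the choice of variants are assumptions; real systems would use richer augmentations.

import numpy as np

def generate_variants(pattern):
    # Simple variants of a binary defect pattern: the four rotations and their mirror images.
    variants = []
    for k in range(4):
        rot = np.rot90(pattern, k)
        variants.extend([rot, np.fliplr(rot)])
    return variants

def build_positive_training_data(defect_patterns, clean_wafer_maps):
    # Superimpose each variant onto each defect-free wafer map to create labelled positives.
    positives = []
    for pattern in defect_patterns:
        for variant in generate_variants(pattern):
            for clean in clean_wafer_maps:
                positives.append(np.maximum(clean, variant))   # union of random and systematic defects
    return positives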

US Pat. No. 10,657,637

SYSTEM FOR INSPECTING OBJECTS USING AUGMENTED REALITY

FARO TECHNOLOGIES, INC., ...

1. A method of comparing measured three-dimensional (3D) measurement data to an object comprising:obtaining 3D coordinates on the object with a 3D measurement device to measure at least one dimensional characteristic;
associating an AR marker with the object;
reading the AR marker with an AR-marker reader and, in response, transmitting the first collection of 3D coordinates and an electronic dimensional representation of the object to a computing device having a camera and a display;
positioning the camera to view an area and to generate a camera image of the area;
displaying the camera image on the display;
displaying the at least one dimensional characteristic and a virtual object in the camera image on the display, the virtual object represents the object;
comparing the 3D coordinates to a predetermined specification;
determining that the at least one dimensional characteristic is outside of the predetermined specification; and
displaying a symbol on the virtual object indicating the at least one dimensional characteristic is outside of the predetermined specification.

US Pat. No. 10,657,636

METHODS AND SYSTEMS TO ENSURE CORRECT PRINTING PLATE USAGE FOR ENCODED SIGNALS

Digimarc Corporation, Be...

1. A substrate for a retail package, hang tag or label, the substrate comprising:a barcode including a check digit;
a printed control icon, the printed control icon comprising a design printed with each of four (4) ink separations, the design comprising a plurality of design elements, the plurality of design elements comprising a central element, a first rotation indicator corresponding to the check digit, and a plurality of registration indicators, the four (4) ink separations comprising a Cyan (C) color channel, a Magenta (M) color channel, a Yellow (Y) color channel and a Black (K) color channel; and
in which the printed control icon indicates a printing plate mismatch when it comprises the first rotation indicator and a second rotation indication, and in which authenticity of the substrate can be determined based on a displacement from a predetermined center of the central element or a printing plate mismatch can be determined based on the central element.

US Pat. No. 10,657,635

INSPECTION APPARATUS, INSPECTION METHOD AND STORAGE MEDIUM

Ricoh Company, Ltd., Tok...

1. An inspection apparatus, comprising:circuitry configured to
acquire a reference image used as a reference for inspecting a printed matter;
acquire a scanned image by scanning the printed matter;
generate a difference image indicating a difference between the acquired reference image and the acquired scanned image based on the acquired reference image and the acquired scanned image;
extract an edge region from the acquired reference image;
detect a proximity region having pixels located within a pixel range surrounding the extracted edge region in the acquired reference image;
correct an inspection threshold to a corrected first inspection threshold to be applied to a pixel in the extracted edge region, and correct the inspection threshold to a second corrected inspection threshold to be applied to a pixel in the detected proximity region, wherein
the first corrected inspection threshold to be applied to the pixel in the edge region is calculated based on (1) density differences between the pixel in the edge region and each one of proximity influencing pixels set in the proximity region, respectively, and (2) distances between the pixel in the edge region and each one of the proximity influencing pixels set in the proximity region, respectively, and
the second corrected inspection threshold to be applied to the pixel in the proximity region is calculated based on (1) density differences between the pixel in the proximity region and each one of edge influencing pixels set in the edge region, respectively, and (2) distances between the pixel in the proximity region and each one of the edge influencing pixels set in the edge region, respectively; and
inspect the printed matter based on the generated difference image and the corrected first and second inspection thresholds.

US Pat. No. 10,657,634

SYSTEMS AND METHODS FOR IMAGE PROCESSING

Indiana University Resear...

1. A method for processing an image comprising:assigning a value to a plurality of pixels of an image;
placing the values of the plurality of pixels in an array;
determining sorting indices to sort the array;
determining output indices to sort the sorting indices;
sorting the array with the sorting indices;
sorting the array with the output indices; and
altering the image based on the sorted array.
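
In array terms, the sorting indices and output indices of this claim correspond to an argsort and to the argsort of that argsort (the inverse permutation), as in the sketch below; process_image and the placeholder processing step are illustrative.

import numpy as np

def process_image(image):
    values = image.ravel().astype(float)
    sorting_indices = np.argsort(values)            # indices that sort the pixel array
    output_indices = np.argsort(sorting_indices)    # indices that undo the sort
    sorted_values = values[sorting_indices]         # the array sorted with the sorting indices
    # ... any alteration performed on the sorted array would go here ...
    restored = sorted_values[output_indices]        # the array sorted back with the output indices
    return restored.reshape(image.shape)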

US Pat. No. 10,657,633

SUBJECT INFORMATION PROCESSING APPARATUS AND IMAGE DISPLAY METHOD

Canon Kabushiki Kaisha, ...

1. A subject information processing apparatus comprising:a signal processing unit configured to generate image data of a subject based on electrical signals output from a probe detecting acoustic waves propagated from the subject; and
a display control unit configured to control a display unit to display an image based on the image data,
wherein the signal processing unit combines at least two of i-th image data to (i+m)-th image data (i and m are natural numbers) with a first inclined weighting manner so that weights in association with the i-th image data to the (i+m)-th image data increase in this order, to generate first combined image data,
wherein the signal processing unit combines at least two of (i+n)-th image data to (i+n+m)-th image data (n is a natural number) with a second inclined weighting manner so that weights in association with the (i+n)-th image data to the (i+n+m)-th image data increase in this order, to generate second combined image data, and
wherein the display control unit controls the display unit to display an image based on the first combined image data, and further updates the image displayed on the display unit to an image based on the second combined image data.

US Pat. No. 10,657,632

APPARATUS HAVING A USER INTERFACE FOR ENHANCING MEDICAL IMAGES

KONINKLIJKE PHILIPS N.V.,...

1. An apparatus for processing a medical image of a structure of interest, comprising:a first processor configured for decomposing the medical image into a plurality of band pass images and a low-pass image;
a user interface configured for specifying a plurality of enhancement curves for the medical image based on at least one of: a metric structure length, a structure selectivity, and a structure enhancement strength;
a second processor configured for applying the plurality of enhancement curves to the plurality of band pass images to generate a plurality of enhanced band pass images; and
a third processor configured for composing an enhanced medical image based on the plurality of enhanced band pass images and the low pass image;
wherein the plurality of enhancement curves is based on a decomposition level dependent enhancement parameter, and wherein the decomposition level ranges from 0 to n−1, where n denotes the number of band pass images of the plurality of band pass images, and where a separate enhancement curve of the plurality of enhancement curves is specified for each decomposition level.

US Pat. No. 10,657,631

APPARATUS AND METHOD FOR CONTROLLING CONTRAST RATIO OF CONTENT IN ELECTRONIC DEVICE

Samsung Electronics Co., ...

1. A method for controlling a contrast ratio of content in an electronic device, the method comprising:receiving the content, the content being standard dynamic range (SDR) content including one or more frames;
identifying one or more highlight regions based on luminance information of each of the one or more frames included in the SDR content;
deciding thresholds based on the one or more highlight regions in the each of the one or more frames;
generating one or more masks corresponding to the one or more highlight regions based on the thresholds for the each of the one or more frames;
generating a contrast ratio-controlled frame based on the one or more masks and one or more boosting factors for the each of the one or more frames; and
generating high dynamic range (HDR) content based on the contrast ratio-controlled frame generated for the each of the one or more frames,
wherein the thresholds comprise a hard threshold for identifying a boundary of a highlight region and a soft threshold for each of the one or more highlight regions, and
wherein the hard threshold is decided based on an average pixel luminance and a luminance highlight distribution in the luminance information, is greater than the average pixel luminance and smaller than a maximum luminance, and is determined as a number indicating a specific percentage with respect to a total pixel count of one frame.
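
One way to realise such a hard threshold is sketched below: take the luminance value at a given percentile of the frame's pixels (a number tied to a specific percentage of the total pixel count) and clamp it between the average and the maximum luminance. The percentile value and the clamping are assumptions.

import numpy as np

def decide_hard_threshold(luma, percentage=99.0):
    # luma: per-pixel luminance of one SDR frame.
    avg, peak = float(luma.mean()), float(luma.max())
    t = float(np.percentile(luma, percentage))     # luminance below which the given percentage of pixels fall
    return min(max(t, avg), peak)                  # keep it above the average and below the maximum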

US Pat. No. 10,657,630

IMAGE DISPLAY APPARATUS

LG ELECTRONICS INC., Seo...

1. An image display apparatus comprising:a display;
an image receiver to receive a high dynamic range image; and
a controller to display at least one of a luminance setting object for setting luminance of the high dynamic range image, an automatic setting object for automatically setting the high dynamic range image, and a contrast setting object for setting contrast of the high dynamic range image,
wherein the controller is configured to:
extract brightness information of the high dynamic range image,
extract maximum luminance information from brightness information of the high dynamic range image, and
perform control to vary a saturation section upon luminance setting based on the maximum luminance information.

US Pat. No. 10,657,629

IMAGE READING DEVICE

Mitsubishi Electric Corpo...

1. An image reading device comprising:a light source to illuminate an illumination area with light;
sensors arranged in a line and including imaging elements, the sensors being configured to generate, from images formed on the imaging elements, image data containing a component data piece for each of color components;
optical systems provided for the corresponding sensors and arranged along a main scanning direction that is a direction in which the sensors are arranged, the optical systems being configured to cause light being emitted by the light source and scattered on a scan target object in a scanning area in the illumination area to form images on the imaging elements included in the sensors, wherein scanning areas of adjoining optical systems overlap each other;
a width detector to detect a width of a duplicate area along the main scanning direction, the duplicate area being an area in which images indicated by the image data generated by adjoining sensors of the sensors overlap each other;
a displacement detector to detect, for each of the optical systems, a displacement of the scan target object along an optical axis direction relative to a focus position of each of the optical systems based on the width of the duplicate area along the main scanning direction;
a first blur corrector to perform blur correction on each component data piece using a point spread function for each of the color components, the point spread function being dependent on the displacement of the scan target object;
an adjuster to adjust, based on a transfer magnification dependent on the displacement of the scan target object, a size of an image for each of the color components indicated by the respective component data piece; and
a combiner to combine images by superimposing portions of the component data pieces generated by the adjoining sensors, having undergone blur correction by the first blur corrector, and adjusted by the adjuster.

US Pat. No. 10,657,628

METHOD OF PROVIDING A SHARPNESS MEASURE FOR AN IMAGE

FotoNation Limited, Galw...

1. A method of providing a sharpness measure for an image comprising:detecting an object region within an image;
obtaining meta-data for the image;
scaling the object region;
calculating a gradient map for the scaled object region;
comparing the gradient map to a threshold determined for the image;
obtaining, based at least in part on the comparing, a filtered gradient map of values exceeding the threshold;
determining a sharpness measure for the object region as a function of the filtered gradient map of values, the sharpness measure being proportional to at least one of the filtered gradient map values; and
tracking the object region over multiple image frames based at least in part on the sharpness measure,
wherein the threshold for the image is a function of at least: a contrast level for the detected object region, a distance to a subject, and an ISO value used for image acquisition.
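
A sketch of such a sharpness measure: gradient magnitudes over the detected object region, thresholded by a value that depends on the contrast level, the subject distance and the ISO, with the mean of the surviving gradients as the score. The specific threshold formula below is hypothetical; the claim only states that the threshold is a function of those three quantities.

import numpy as np

def sharpness_measure(region, contrast, distance_m, iso):
    gy, gx = np.gradient(region.astype(float))
    grad = np.hypot(gx, gy)                                   # gradient map of the scaled object region
    # Hypothetical threshold model combining contrast, subject distance and ISO.
    threshold = 0.1 * contrast * (1.0 + 0.01 * distance_m) * (iso / 100.0)
    filtered = grad[grad > threshold]                         # filtered gradient map
    return float(filtered.mean()) if filtered.size else 0.0  # proportional to the surviving values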

US Pat. No. 10,657,627

TEMPORAL SMOOTHING IN IMAGE CAPTURE SYSTEMS

GoPro, Inc., San Mateo, ...

1. A system comprising:an image sensor configured to capture a sequence of images; and
a processing apparatus configured to:
access the sequence of images from the image sensor;
determine a sequence of parameters for respective images in the sequence of images based on the respective images;
store the sequence of images in a buffer;
determine an average of parameters over a window of time that includes the sequence of parameters;
determine a temporally smoothed parameter for a current image in the sequence of images based on the sequence of parameters, wherein the sequence of parameters includes parameters for images in the sequence of images that were captured after the current image, and wherein the temporally smoothed parameter is determined based on the average of parameters over the window of time, and wherein the window of time includes times when older images were captured, before the current image was captured, and the parameters for these older images are accessed for determining the average of parameters over the window of time after these older images have been deleted from the buffer; and
apply image processing to the current image based on the temporally smoothed parameter to obtain a processed image.
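
The parameter-smoothing idea, decoupled from the image buffer, can be sketched with a simple sliding window: per-frame parameters are retained and averaged even after the corresponding images have been dropped, and the window may extend past the current image to frames captured later. TemporalSmoother and its window_size are illustrative names.

from collections import deque

class TemporalSmoother:
    # Sliding-window average of per-frame parameters (for example exposure or white-balance gains).
    def __init__(self, window_size=9):
        self.window = deque(maxlen=window_size)   # parameters outlive the image buffer

    def add_frame_parameter(self, value):
        self.window.append(value)

    def smoothed_parameter(self):
        # Average over the window; with look-ahead, the "current" image sits mid-window,
        # so the average mixes parameters of older (already deleted) and newer frames.
        return sum(self.window) / len(self.window)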

US Pat. No. 10,657,626

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING SYSTEM, AND RECORDING MEDIUM

OMRON Corporation, Kyoto...

1. An image processing apparatus that performs image measurement, the image processing apparatus comprising a processor configured to:access a data storage device;
execute an image measurement processing on image data of a first data format based on predetermined measurement conditions and output an image measurement result;
generate image data of a second data format by irreversibly compressing the image data of the first data format; and
restore the image data of the second data format to the image data of the first data format,
wherein the processor:
generates second image data of the second data format from first image data of the first data format acquired by imaging an object;
restores the second image data to third image data of the first data format; and
stores the second image data and the image measurement result acquired by executing the image measurement processing on the third image data in the data storage device in association with each other,
wherein the image measurement processing comprises a process of calculating a feature amount from the image data of the first data format and a process of generating the image measurement result by comparing the feature amount with a predetermined threshold, and
wherein the processor further:
determines whether or not a first feature amount calculated from the third image data is within a predetermined range including the predetermined threshold;
outputs the image measurement result acquired by executing the image measurement processing on the first image data in accordance with a result of the determining; and
stores a first image measurement result acquired by executing the image measurement processing on the first image data and the first image data in the data storage device in association with a second image measurement result acquired by executing the image measurement processing on the third image data.

US Pat. No. 10,657,625

IMAGE PROCESSING DEVICE, AN IMAGE PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM

NEC CORPORATION, Tokyo (...

1. An image processing device comprising:a feature extraction unit that extracts features from scaled samples generated from a given region of interest, after normalizing the samples;
a maximum likelihood estimation unit that derives an estimated probability score of the scaled samples by maximizing the likelihood of a given scaled sample and a parameter of the probability distribution model;
an estimation unit that combines the previous estimates of the object and its features into a single template which represents the object appearance, and that removes samples which have a probability score below the threshold; and
a feature matching unit that obtains a similarity between a given template and a scaled sample and selects the sample with the maximum similarity as the final output.

US Pat. No. 10,657,624

IMAGE SYNTHESIS METHOD FOR SYNTHESIZING IMAGES TAKEN BY CAMERAS ON OPPOSITE SIDES AND SMART DEVICE USING THE SAME

Acer Incorporated, New T...

1. An image synthesis method adapted for a smart device, comprising:respectively capturing a first image and a second image by using a first camera and a second camera, wherein the first camera and the second camera are disposed on two opposite sides of the smart device;
recognizing at least one first object in the first image;
finding a specific area in the first image, wherein the specific area is not overlapped with the at least one first object, comprising:
dividing the first image into a plurality of grids;
finding from the plurality of grids at least one first grid set respectively corresponding to the at least one first object;
adjusting the second image to a second size according to a first size of each of the at least one first grid set and characterizing the second size as a second grid set;
finding in a first region at least one candidate region sufficient for accommodating the second grid set, wherein the at least one candidate region is not overlapped with the at least one first grid set; and
taking one of the at least one candidate region as the specific area; and
embedding the second image into the specific area of the first image for generating a third image.
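
A rough Python/NumPy sketch of the grid-based placement described above: mark the grids covered by detected objects, scan for a free block of grids large enough for the (resized) second image, and paste it there. The grid size, the required block dimensions, and the naive cropping in `embed` are assumptions made for illustration.

```python
import numpy as np

def find_specific_area(h, w, object_boxes, grid=8, need_rows=2, need_cols=3):
    """Mark grids covered by detected objects, then scan for a free
    need_rows x need_cols block of grids (the 'specific area')."""
    rows, cols = h // grid, w // grid
    occupied = np.zeros((rows, cols), dtype=bool)
    for (y0, x0, y1, x1) in object_boxes:                  # object bounding boxes
        occupied[y0 // grid:(y1 - 1) // grid + 1,
                 x0 // grid:(x1 - 1) // grid + 1] = True
    for r in range(rows - need_rows + 1):
        for c in range(cols - need_cols + 1):
            if not occupied[r:r + need_rows, c:c + need_cols].any():
                return (r * grid, c * grid,
                        (r + need_rows) * grid, (c + need_cols) * grid)
    return None  # no candidate region large enough

def embed(first, second, area):
    """Paste the second image (cropped to fit) into the specific area."""
    y0, x0, y1, x1 = area
    third = first.copy()
    third[y0:y1, x0:x1] = second[: y1 - y0, : x1 - x0]
    return third

first = np.zeros((64, 64), dtype=np.uint8)                 # front-camera image
second = np.full((32, 32), 200, dtype=np.uint8)            # rear-camera image
area = find_specific_area(64, 64, object_boxes=[(0, 0, 40, 40)])
if area is not None:
    third = embed(first, second, area)
    print("embedded at", area)
```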

US Pat. No. 10,657,623

TWO STAGE MULTI-SCALE PROCESSING OF IMAGE DATA

Apple Inc., Cupertino, C...

1. An apparatus for processing image signal data, comprising:first stage processing circuitry configured to:
generate an unscaled single color version of a received image comprising a plurality of color components, and
generate an unscaled single color high frequency component, based in part on the unscaled single color version;
a scaler circuit configured to generate a first downscaled version of the received image, the first downscaled version comprising the plurality of color components and having a first pixel resolution lower than a pixel resolution of the received image; and
second stage processing circuitry configured to:
process the first downscaled version of the received image,
generate a plurality of sequentially downscaled images based on the first downscaled version, each of the sequentially downscaled images comprising the plurality of color components,
process the plurality of sequentially downscaled images to generate processed versions of sequentially downscaled images, and
generate a processed version of the first downscaled version of the received image using the processed first downscaled version and the processed versions of sequentially downscaled images, the processed version of the first downscaled version merged with the unscaled single color high frequency component.
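
The two-stage flow above (an unscaled single-color high-frequency component plus a chain of downscaled color images that are processed and merged back up) can be approximated in a few lines of NumPy. The sketch below uses 2x2 averaging for downscaling, nearest-neighbour upsampling, an identity `process` placeholder, and an arbitrary 50/50 merge weight; none of these specifics come from the claim.

```python
import numpy as np

def luminance(rgb):
    """Unscaled single-color (luma) version of the full-resolution image."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def downscale(img):
    """Halve resolution by 2x2 averaging (works per color channel)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    x = img[:h, :w]
    return (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2]) / 4.0

def upscale(img, shape):
    """Nearest-neighbour upsample back to a target height/width."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)[: shape[0], : shape[1]]

def process(img):
    """Placeholder per-scale processing (e.g., noise reduction)."""
    return img  # identity here; a real pipeline would filter each scale

rgb = np.random.rand(64, 64, 3)

# first stage: unscaled luma and its high-frequency component
luma = luminance(rgb)
luma_hf = luma - upscale(downscale(luma), luma.shape)

# scaler + second stage: sequentially downscaled color pyramid
pyramid = [downscale(rgb)]                    # first downscaled version
while min(pyramid[-1].shape[:2]) > 8:
    pyramid.append(downscale(pyramid[-1]))

# process coarsest-to-finest, merging each processed level upward
merged = process(pyramid[-1])
for level in reversed(pyramid[:-1]):
    merged = 0.5 * process(level) + 0.5 * upscale(merged, level.shape[:2])

# merge the processed first downscaled version with the unscaled luma high-frequency detail
output = upscale(merged, luma.shape) + luma_hf[..., None]
print(output.shape)
```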

US Pat. No. 10,657,622

CONTROLLING PROJECTED IMAGE FRAME RATE IN RESPONSE TO DETERMINED PROJECTION SURFACE CURVATURE

Canon Kabushiki Kaisha, ...

14. An information processing method comprising:acquiring surface change information on magnitude of change of a projection surface onto which an image is projected;
determining, based on the surface change information, (a) frame rate control information on a frame rate of image processing for maintaining correspondence between each position in the image and each position on the projection surface, (b) gradation control information on a number of bits of an image for which the image processing is performed, and (c) lattice point control information on a number of lattice points that are used for the image processing; and
allowing a user to select followability of the image processing, gradation of the image processing, or transformation accuracy of the image processing,
wherein in a case where the followability is selected by the user, in the determining, the frame rate control information, the gradation control information, and the lattice point control information, which are determined based on the surface change information, are changed so that the frame rate is increased, the number of bits is decreased, and the number of lattice points is decreased,
wherein in a case where the gradation is selected by the user, in the determining, the frame rate control information, the gradation control information, and the lattice point control information, which are determined based on the surface change information, are changed so that the number of bits is increased, the frame rate is decreased, and the number of lattice points is decreased, and
wherein in a case where the transformation accuracy is selected by the user, in the determining, the frame rate control information, the gradation control information, and the lattice point control information, which are determined based on the surface change information, are changed so that the number of lattice points is increased, the frame rate is decreased, and the number of bits is decreased.
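
The three user-selectable priorities above trade frame rate, bit depth, and lattice-point count against one another. A toy selector is shown below; the factor-of-two adjustments and the base values are arbitrary illustrations of the direction of each change, not values from the patent.

```python
def adjust_controls(base, priority):
    """Given base control values derived from the surface-change magnitude,
    bias them according to the user's selected priority."""
    frame_rate, bits, lattice_points = base
    if priority == "followability":
        return frame_rate * 2, bits // 2, lattice_points // 2
    if priority == "gradation":
        return frame_rate // 2, bits * 2, lattice_points // 2
    if priority == "transformation_accuracy":
        return frame_rate // 2, bits // 2, lattice_points * 2
    return base

# base values that a surface-change estimator might have produced (illustrative)
print(adjust_controls((60, 8, 1024), "followability"))   # -> (120, 4, 512)
```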

US Pat. No. 10,657,621

MOVING STRUCTURE MOTION COMPENSATION IN IMAGING

KONINKLIJKE PHILIPS N.V.,...

1. A method, comprising:manipulating segmented structure of interest, which is segmented from first reconstructed image data at a reference motion phase of interest, that is registered to second reconstructed image data at one or more other motion phases;
updating initial motion vector fields corresponding to the registration of the segmented structure of interest to the second reconstructed image data based on the manipulation; and
reconstructing projection data with a non-iterative motion-compensated reconstruction algorithm employing the updated motion vector fields.

US Pat. No. 10,657,620

POOLING METHOD, DEVICE, AND SYSTEM, COMPUTER-READABLE STORAGE MEDIUM

THINKFORCE ELECTRONIC TEC...

1. A pooling method, comprising:acquiring pixel data of each row where a pooling window is located row by row, each time after the pooling window is moved vertically, wherein a size of the pooling window is N×N, N is a positive integer;
marking the acquired pixel data, to obtain a row number of the pixel data;
writing the acquired pixel data of the first N−1 rows or a pre-pooling result thereof into a cache;
performing a pooling operation on the pixel data of a last row where the pooling window is located and pixel data in the cache, when the pixel data of the last row is acquired; and
outputting a pooling result of the pooling operation.
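
A compact NumPy sketch of the row-streamed pooling above: pre-pool and cache the first N−1 rows of each window, then finish the pooling when the window's last row arrives. It assumes non-overlapping windows (stride N) and max pooling; both are simplifications of the claim.

```python
import numpy as np

def pool_rows(image, N=2, op=np.max):
    """Row-streamed NxN pooling (stride N): cache a pre-pooled result of the
    first N-1 rows of each window; complete the pooling on the last row."""
    h, w = image.shape
    out = []
    for top in range(0, h - N + 1, N):                 # window moved vertically
        cache = op(image[top:top + N - 1], axis=0)     # pre-pool first N-1 rows
        last = image[top + N - 1]                      # last row of the window
        merged = op(np.stack([cache, last]), axis=0)   # combine cache + last row
        out.append(op(merged[: w // N * N].reshape(-1, N), axis=1))  # pool horizontally
    return np.stack(out)

x = np.arange(16, dtype=float).reshape(4, 4)
print(pool_rows(x, N=2))   # 2x2 max pooling -> [[ 5.  7.] [13. 15.]]
```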

US Pat. No. 10,657,619

TASK EXECUTION ON A GRAPHICS PROCESSOR USING INDIRECT ARGUMENT BUFFERS

Apple Inc., Cupertino, C...

1. A computer-implemented method for task execution on a graphics processor, the method comprising:creating a data structure for grouping data resources;
populating the data structure with two or more data resources for encoding into a graphics processing language by an encoding object, wherein the data structure is ordered based on an index associated with each data resource, and wherein the encoding object is configured to reorder the data structure by changing the index associated with at least one of the data structure's data resources;
passing the data structure to at least one programming interface command, the at least one programming interface command configured to access the data structure's data resources; and
triggering execution of at least one function on the graphics processor in response to passing the data structure to the at least one programming interface command.

US Pat. No. 10,657,618

COARSE GRAIN COHERENCY

Intel Corporation, Santa...

1. An electronic device comprising:a general-purpose processor including a first cache memory and a first coherency module;
a general-purpose graphics processor including a second cache memory and a second coherency module, wherein the first coherency module and the second coherency module enable heterogeneous coherency between the first cache memory and the second cache memory, the heterogeneous coherency enabled at multiple cache line granularity; and
a first memory module to store a superline directory table, the superline directory table to track ownership for each superline owned by the general-purpose processor and the general-purpose graphics processor, wherein each superline is a sub-page address region that spans multiple cache lines of the first cache memory and the second cache memory.
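
A minimal sketch of a superline directory as described above, assuming 1 KiB superlines inside a 4 KiB page and a plain dictionary keyed by superline tag. The `acquire` method and the agent names are illustrative; a real directory would also drive cache-line invalidations in the other agent's cache.

```python
SUPERLINE_BITS = 10          # 1 KiB sub-page superlines, each spanning many 64-byte lines

class SuperlineDirectory:
    """Minimal sketch of a directory tracking ownership at multiple-cache-line
    (superline) granularity, shared by a CPU and a GPU."""

    def __init__(self):
        self.owner = {}                       # superline tag -> "cpu" | "gpu"

    @staticmethod
    def superline(addr):
        return addr >> SUPERLINE_BITS         # drop the offset within the superline

    def acquire(self, addr, agent):
        """Record ownership; returns the previous owner so the caller can
        flush or invalidate the other agent's lines when ownership changes."""
        tag = self.superline(addr)
        previous = self.owner.get(tag)
        self.owner[tag] = agent
        return previous

directory = SuperlineDirectory()
print(directory.acquire(0x1234, "cpu"))       # None: superline newly tracked
print(directory.acquire(0x1345, "gpu"))       # "cpu": ownership moved within the same superline
```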

US Pat. No. 10,657,617

METHOD AND APPARATUS FOR MEMORY ACCESS MANAGEMENT FOR DATA PROCESSING

GM Global Technology Oper...

1. A system for dynamically processing a datafile, comprising:a central processing unit (CPU), an accelerator, a communication bus, and a system memory device, wherein the system memory device is configured to store the datafile;
the accelerator including a local memory buffer, a data transfer scheduler, and a plurality of processing engines; and
the local memory buffer including an input buffer and an output buffer, wherein the input buffer is in communication with an input register set for one of the processing engines and the output buffer is in communication with an output register set for the one of the processing engines;
wherein the data transfer scheduler is arranged to manage data transfer between the system memory device and the local memory buffer, wherein the data transfer includes data associated with the datafile;
wherein the local memory buffer is configured as a circular line buffer;
wherein the data transfer scheduler includes a ping-pong buffer for transferring output data from the one of the processing engines to the system memory device; and
wherein the local memory buffer is configured to execute cross-layer usage of the data associated with the datafile.

US Pat. No. 10,657,616

SYSTEM AND METHOD FOR MONITORING AND REPORTING A PERSON'S PHONE USAGE WHILE DRIVING

Copernicus, LLC, Minneap...

1. A method of determining the probability of whether the user of a mobile phone is texting while operating a vehicle, including the steps of:downloading a superseding keyboard on to the mobile phone, the superseding keyboard configured to monitor usage data of the mobile phone;
setting the superseding keyboard as a default keyboard on the mobile phone such that the superseding keyboard is used instead of the mobile phone's original operating system keyboard;
storing the usage data for each mobile phone usage occurrence using the superseding keyboard, the usage data including data relating to usage of the mobile phone while operating the vehicle at various speeds, the usage data including situational data unique to the user and environmental data relating to the physical environment of the mobile phone, the situational data further including keystroke specific data, word specific data, and session specific data, the environmental data including global positioning data;
creating a baseline user profile of the mobile phone user's vehicle keyboard usage pattern and non-vehicle keyboard usage pattern, the baseline user profile created by analyzing the stored usage data of the mobile phone usage occurrences;
monitoring a real time mobile phone usage profile of the mobile phone user via the superseding keyboard, the real time mobile phone usage profile including current situational data and current environmental data; and
comparing the real time mobile phone usage profile of the mobile phone user to the baseline user profile;
determining, based on the comparison between the real time mobile phone usage profile and the baseline user profile, whether the user is operating a motor vehicle;
creating a report based on the determination of the probability of whether the user is operating a motor vehicle;
sending the report to an acknowledged third party recipient, the user, or the acknowledged third party recipient and the user.

US Pat. No. 10,657,615

SYSTEM AND METHOD OF ALLOCATING COMPUTING RESOURCES BASED ON JURISDICTION

Bank of America Corporati...

1. A system, comprising:a hardware processor; and
a memory medium that is coupled to the processor and that includes instructions executable by the processor;
wherein the system is in a first jurisdiction and as the processor executes the instructions, the system:
receives a request for a transaction that involves a citizen of a second jurisdiction from a second computer system in the second jurisdiction, wherein:
the transaction comprises at least one of a sales transaction, a property transaction, and a commercial transaction, the transaction between a provider and a customer, wherein the system belongs to the provider and the citizen of the second jurisdiction is the customer;
executing the transaction comprises receiving private information from the citizen of the second jurisdiction and using the private information;
the request for the transaction does not comprise the private information; and
the second jurisdiction is subject to a privacy law prohibiting a transfer of the private information to the first jurisdiction, the privacy law comprising at least one of a privacy statute and a privacy regulation;
determines that the second computer system is in the second jurisdiction, wherein determining that the second computer system is in the second jurisdiction comprises receiving a network address of the second computer system;
determines that the second jurisdiction is subject to the privacy law;
in response to determining that the second jurisdiction is subject to the privacy law:
selects, from a plurality of computer systems in the second jurisdiction, a third computer system, in the second jurisdiction, to execute the transaction;
provides, to the third computer system, in the second jurisdiction, at least one of a container and a virtual machine that includes program instructions to execute the transaction;
provides an instruction to the third computer system to execute the program instructions that executes the transaction, wherein:
in response to executing the program instructions that execute the transaction, the third computer system receives the private information from the second computer system and uses the private information to complete the transaction; and
the instructions included in the memory medium prevent the system from receiving the private information;
receives, from the third computer system, metadata associated with the transaction, wherein:
the metadata comprises an indication that the transaction was successful; and
in response to providing the metadata to the system, the third computer system deletes the private information; and
in response to receiving the metadata, provides at least one of one or more goods and one or more services to the citizen of the second jurisdiction, based on the metadata.

US Pat. No. 10,657,614

LOCATOR DIAGNOSTIC SYSTEM FOR EMERGENCY DISPATCH

1. A computer-implemented method to assist a dispatcher when communicating with a caller via telephone regarding an incident requiring an emergency dispatch response, the computer-implemented method comprising:presenting, on a dispatch center computer, a plurality of pre-scripted interrogatories for the dispatcher to ask the caller to collect information regarding the incident;
receiving, on the dispatch center computer, input representative of the collected information;
determining automatically on the dispatch center computer a determinant value indicative of priority of the incident from one of a plurality of pre-established determinant values based on the collected information;
providing the determinant value from the dispatch center computer to a computer aided dispatch system;
initiating a diagnostic tool on the dispatch center computer;
the diagnostic tool presenting to the dispatcher a user interface;
the user interface providing a plurality of pre-scripted location questions for the dispatcher to relay to the caller over the telephone to guide the caller in describing caller location information, wherein the diagnostic tool is configured to traverse a logical tree to determine which of the plurality of pre-scripted location questions will be relayed to the caller;
the user interface receiving dispatcher-entered input indicative of caller location information relayed from the caller, wherein the caller relays observations over the telephone to the dispatcher, and wherein the logical tree is traversed based on the dispatcher-entered input,
the diagnostic tool storing the dispatcher-entered input for later recall;
providing the caller location information to the computer aided dispatch system;
the computer aided dispatch system receiving location and availability information from a plurality of emergency response unit devices corresponding to a plurality of emergency response units;
the computer aided dispatch system automatically dispatching at least one of the plurality of emergency response units based on the determinant value, the location and availability information received from the plurality of emergency response unit devices, and the caller location information received from the dispatch center computer;
the diagnostic tool prompting the dispatcher to ask the caller if the caller hears a siren of an emergency response unit;
upon receiving confirmation that the caller hears a siren, the diagnostic tool displaying instructions for the dispatcher to relay to the caller to assist emergency responders in locating the caller, the instructions including actions to make the caller seen or heard by the emergency responders; and
upon receiving confirmation that the caller does not hear a siren, the diagnostic tool providing an option for the dispatcher to end the diagnostic tool.

US Pat. No. 10,657,613

IDENTITY MATCHING OF PATIENT RECORDS

Koninklijke Philips N.V.,...

1. A method of determining whether a first patient record and a second patient record stored in a computer-readable form relate to the same patient, comprising:extracting a first plurality of clinical properties from clinical information in the first patient record;
extracting a second plurality of clinical properties from clinical information in the second patient record;
distinguishing persistent clinical properties of the first plurality of clinical properties from non-persistent clinical properties of the first plurality of clinical properties;
for each persistent property of the first plurality of clinical properties:
determine if the persistent property is incompatible with at least one clinical property of the second plurality of clinical properties, wherein an attribute type of the at least one clinical property of the second plurality of clinical properties is different from an attribute type of the persistent property;
if the persistent property is incompatible with the clinical property of the second plurality of clinical properties, determine that the first and second patient records do not correspond to the same patient;
else, determine and store a matching score for the persistent property and the second plurality of clinical properties, and repeat for the next persistent property;
for each non-persistent property of the first plurality of clinical properties:
determine and store a matching score for the non-persistent property and the second plurality of clinical properties, and repeat for the next non-persistent property;
extracting a first plurality of demographic properties from demographic information in the first patient record;
extracting a second plurality of demographic properties from demographic information in the second patient record;
for each demographic property of the first plurality of demographic properties:
determine and store a matching score for the demographic property and the second plurality of demographic properties, and repeat for the next demographic property;
determining whether the first patient record and the second patient record relate to the same patient based on the matching scores of the first plurality of persistent properties, the first plurality of non-persistent properties, and the first plurality of demographic properties; and
if the first patient record and the second patient record relate to the same patient, recording this relationship in a non-transitory computer-readable medium so that a subsequent access to records of this same patient includes access to the first patient record and the second patient record.
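
The matching logic above combines a hard veto from persistent clinical properties with accumulated matching scores. The sketch below assumes dictionary-shaped records, an exact-match scoring rule, and a sex/pregnancy-test pair as the cross-attribute incompatibility example; all of these are illustrative choices, not the patented scoring scheme.

```python
def records_match(rec1, rec2, score_threshold=0.8):
    """Minimal sketch: persistent clinical properties can veto a match outright;
    otherwise clinical and demographic matching scores are accumulated."""
    scores = []
    for prop in rec1["clinical"]:
        candidates = rec2["clinical"]
        if prop["persistent"] and any(incompatible(prop, other) for other in candidates):
            return False                      # incompatibility rules out the same patient
        scores.append(best_score(prop, candidates))
    for prop in rec1["demographic"]:
        scores.append(best_score(prop, rec2["demographic"]))
    return sum(scores) / len(scores) >= score_threshold

def incompatible(a, b):
    """Illustrative cross-attribute rule: a persistent property may be
    incompatible with a clinical property of a *different* attribute type."""
    return (a["name"] == "sex" and a["value"] == "male"
            and b["name"] == "pregnancy_test" and b["value"] == "positive")

def best_score(prop, candidates):
    """Illustrative matching score: 1.0 for an exact value match, else 0.0."""
    return max((1.0 for c in candidates
                if c["name"] == prop["name"] and c["value"] == prop["value"]), default=0.0)

rec_a = {"clinical": [{"name": "sex", "value": "male", "persistent": True}],
         "demographic": [{"name": "surname", "value": "Smith"}]}
rec_b = {"clinical": [{"name": "pregnancy_test", "value": "positive"}],
         "demographic": [{"name": "surname", "value": "Smith"}]}
print(records_match(rec_a, rec_b))   # False: vetoed by the persistent property
```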

US Pat. No. 10,657,612

CLAIM PROCESSING VALIDATION SYSTEM

CERNER INNOVATION, INC., ...

1. A system including at least one hardware processing device for simulating and validating integrated system performance for changes to processing rules used for processing a claim form and data related to reimbursement for provision of healthcare to a patient by specific payer organizations, comprising:at least one repository of information including:
a validation output data set comprising validation claim result data derived from processing a plurality of validation claims according to a validation set of claim data processing rules, the validation output data set comprising a validation processing time duration corresponding to a processing time required to process the plurality of validation claims; and
a set of modified claim data processing rules comprising at least one claim data processing rule that is different from at least one claim data processing rule of the validation set of claim data processing rules;
a rules processor, in a simulated healthcare reimbursement system, for:
processing the plurality of validation claims according to the set of modified claim data processing rules and generating modified claim result data;
based on the modified claim result data, generating a modified output data set, the modified output data set comprising:
a modified processing time duration corresponding to a processing time required to process the plurality of validation claims according to the set of modified claim data processing rules;
a comparator for validating the set of modified claim data processing rules, the comparator being configured to:
compare the modified output data set with the validation output data set; and
determine a change in rules processing duration by comparing the modified processing time duration to the validation processing time duration; and
an output processor for providing a validated output form that includes an indication of the change in rules processing duration, and storing the validated output form in a validated output form directory.

US Pat. No. 10,657,611

NEGOTIATION PLATFORM IN AN ONLINE ENVIRONMENT

eBay Inc., San Jose, CA ...

1. A system comprising:a network-based negotiation platform having one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
detecting, by a negotiation eligibility detector, that a listing associated with a seller is a failed listing;
identifying, by a potential buyer detector, a user as a potential buyer to participate in a negotiation with the seller regarding the failed listing, the identifying comprising associating a score with the potential buyer, the score being adjusted based on user actions performed by the potential buyer with respect to the failed listing; and
in response to a selection to start a negotiation session, causing, by the negotiation manager on a device of the seller or the device of the potential buyer, display of an offer interface presenting one or more input features each for receiving a new term in the negotiation session.

US Pat. No. 10,657,610

MANAGING BUILDING INFORMATION AND RESOLVING BUILDING ISSUES

LGHORIZON, LLC, Denver, ...

1. A computer-implemented method comprising:receiving, at a computer system and from a client computing device, information that describes an issue with a building, wherein the information includes first information corresponding to the issue and second information identifying the building or a user associated with the building, the first information including at least (i) issue identification information that describes a symptom of the issue observed by a user of the client computing device and (ii) detected location information that identifies a location within the building at which the symptom of the issue is observed by the user of the client computing device;
accessing, by the computer system and based on the second information identifying the building or the user associated with the building, data that identifies (i) a plurality of components that are included in the building and (ii) features of the plurality of components, wherein the data includes (i) component attribute information that identifies one or more of: component types for the plurality of components, ranges of potential issues with the plurality of components, and common issues with the plurality of components, and (ii) component location information that identifies locations within the building where the plurality of components are installed;
identifying, by the computer system and based on a comparison of the first information with the data for the building, one or more candidate components that have at least a threshold likelihood of being, at least partially, a cause of the issue, wherein the identifying comprises:
determining, by the computer system, issue matching scores for each of the plurality of components based on comparisons of the issue identification information with the component attribute information;
determining, by the computer system, location matching scores for each of the plurality of components based on comparisons of the detected location information with the component location information; and
selecting, by the computer system, the one or more candidate components from among the plurality of components based on one or more of the issue matching scores and the location matching scores;
selecting, by the computer system, a candidate service provider from among a plurality of service providers based on a comparison of (i) the one or more candidate components and (ii) information identifying technical qualifications for the plurality of service providers;
transmitting, by the computer system, a service request for the issue to the candidate service provider;
receiving, at the computer system, a response from the candidate service provider; and
scheduling, by the computer system and based on the response, a service appointment with the candidate service provider to resolve the issue.
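
The candidate-selection step above can be approximated with two simple scores, one against a component's known issues and one against its installed location. The keyword matching, equal weighting, and threshold below are assumptions made for illustration only.

```python
def rank_candidates(issue, components, threshold=1.0):
    """Minimal sketch: score each installed component against the reported
    symptom and the reported location, then keep the likelier candidates."""
    candidates = []
    for comp in components:
        issue_score = sum(1.0 for kw in comp["common_issues"] if kw in issue["symptom"])
        location_score = 1.0 if comp["location"] == issue["location"] else 0.0
        total = issue_score + location_score
        if total >= threshold:
            candidates.append((total, comp["name"]))
    return [name for _, name in sorted(candidates, reverse=True)]

issue = {"symptom": "no hot water in upstairs bathroom", "location": "upstairs bathroom"}
components = [
    {"name": "water heater", "location": "basement", "common_issues": ["no hot water"]},
    {"name": "bathroom faucet", "location": "upstairs bathroom", "common_issues": ["leak"]},
]
print(rank_candidates(issue, components))   # both clear the threshold in this toy case
```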

US Pat. No. 10,657,609

SMART SWITCH WITH STOCHASTIC OPTIMIZATION

Promanthan Brains LLC, Se...

1. A switching apparatus comprising an electronic processor and a switch for an energy consumer, said processor being adapted to read an input from values provided by an electronic sensor through an electronic input,
define a function in an electronic memory mapping a proposed control strategy to a cost of said proposed control strategy, said cost comprising a merit or a demerit for attaining at least one predetermined goal as well as a price of energy where said price of said energy depends upon time and said price varies more frequently than once per twenty-four hour interval, to
evaluate on an electronic processor a plurality of said proposed control strategies which are different against said function in a stochastic optimization method chosen to converge toward a global minimum or a global maximum even if said function optimized possesses numerous local minima or maxima, and to
control through said switch actuated by said processor said energy consumer according to a selected control strategy found by said stochastic optimization method, said predetermined goal being characterized by some of said values read from said electronic sensor being preferable to other of said values according to a preference function stored in said electronic memory which maps said values read from said electronic sensor to a value indicating a smaller or a greater preference, said merit expressing said greater preference and said demerit expressing said smaller preference,
said different proposed control strategies being strategies for controlling said energy consumer, and said energy consumer changing through its operation said input read from said electronic sensor.
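
The claim above calls for a stochastic optimizer that can escape local minima of a cost built from a time-varying energy price plus merits and demerits for a goal. Below is a simulated-annealing-style sketch over a 24-hour on/off schedule; the price curve, the target of 8 on-hours, and the simplified acceptance rule are all illustrative assumptions.

```python
import random

PRICE = [0.30 if 17 <= h < 21 else 0.10 for h in range(24)]   # $/kWh, varies within the day
TARGET_ON_HOURS = 8                                           # illustrative goal (e.g. heating)

def cost(strategy):
    """Cost of a proposed 24-hour on/off strategy: energy price paid, plus a
    demerit for missing the goal expressed by the preference function."""
    energy_cost = sum(PRICE[h] for h, on in enumerate(strategy) if on)
    demerit = abs(sum(strategy) - TARGET_ON_HOURS) * 0.5
    return energy_cost + demerit

def stochastic_optimize(iterations=5000, temperature=1.0, cooling=0.999):
    """Simulated-annealing-style search: occasionally accepting worse strategies
    early on lets the search escape local minima of the cost function."""
    current = [random.random() < 0.5 for _ in range(24)]
    best = current[:]
    for _ in range(iterations):
        proposal = current[:]
        proposal[random.randrange(24)] ^= True               # flip one hour on/off
        # simplified acceptance rule (not the full Metropolis criterion)
        if cost(proposal) < cost(current) or random.random() < temperature:
            current = proposal
        if cost(current) < cost(best):
            best = current[:]
        temperature *= cooling
    return best

best = stochastic_optimize()
print(sum(best), "on-hours at total cost", round(cost(best), 2))
```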

US Pat. No. 10,657,608

ENGINEERING CHANGE MANAGEMENT SYSTEM

Shem, LLC, Hagerstown, I...

1. A computer-implemented management system comprising:at least one memory configured for storing information;
at least one computer processor connected to the at least one memory;
a plurality of sub-systems operated by the at least one computer processor, comprising:
a sales configuration system configured for receiving, at the at least one computer processor, a request for a project for implementing a new engineering component, wherein the new engineering component comprises a tangible product;
a design system configured for receiving design input data to create a computer-implemented model of the new engineering component using the at least one computer processor;
a material requirements planning system configured for identifying components necessary for implementing the new engineering component using the at least one computer processor;
a manufacturing instructions system configured for creating a plurality of instructions for implementing the new engineering component, including instructions for manufacturing the tangible product, using the at least one computer processor; and
a service system including a database stored in the memory, the service system configured for receiving a project information at the at least one computer processor from at least one other sub-system and storing the project information in the database; and
a control system operated by the at least one computer processor and connected to all of the plurality of sub-systems, wherein the control system is configured for controlling exchange of information between two or more of the plurality of sub-systems for completion of the project,
wherein the control system is configured for receiving the information for completion of the project, analyzing the information based on a set of rules, and determining an action to be taken by one or more of the plurality of sub-systems, and
wherein the control system is further configured such that when the information is received from a first sub-system of the plurality of sub-systems, and the information complies with the rules, the control system is configured for determining that the action is to be taken by a subsequent sub-system of the plurality of sub-systems to advance completion of the project, and when the information does not comply with the rules, the control system is configured for determining that the action is to be taken by the first sub-system to supplement the information to comply with the rules.

US Pat. No. 10,657,607

IMPLEMENTATION OF PAYROLL SMART CONTRACT ON A DISTRIBUTED LEDGER

ADP, LLC, Roseland, NJ (...

1. A method comprising:improving security and accessibility of data relating to payment of wages by:
storing a first smart contract on a blockchain maintained in a computer network, wherein:
the first smart contract contains a first clause in an event of authorized changes to the first smart contract; and
the first smart contract contains a second clause to pay first wages to an employee upon occurrence of a trigger event, the second clause different than the first clause;
storing a modification to the first smart contract as a second smart contract on the blockchain prior to executing instructions to pay the first wages, the second smart contract containing a third clause to pay second wages to the employee, wherein the second smart contract is different than the first smart contract; and
responsive to receiving the trigger event:
executing the first smart contract, wherein the first clause is executed prior to the second clause; and
after executing the first clause, executing the second smart contract, whereby the second wages are paid to the employee;
wherein:
the blockchain is maintained on non-transitory, computer-readable storage media comprising a distributed ledger implemented by the computer network;
the first smart contract and the second smart contract are recorded on the blockchain such that the first smart contract and the second smart contract cannot be modified;
the second wages comprise updated wages; and
payroll security is improved by paying the updated wages using the blockchain,
wherein the first clause is a redirection clause.

US Pat. No. 10,657,606

COMPUTERIZED-METHODS AND SYSTEMS FOR IDENTIFYING DUPLICATE ENTRIES IN A DATABASE OF MERCHANT DATA

MASTERCARD INTERNATIONAL ...

1. A non-transitory computer readable storage medium including executable instructions that, when executed by at least one processor, cause the at least one processor to:access a data structure indicative of entities, each of the entities associated with a location designation;
determine, from the data structure, a location designation for an identified entity;
identify a plurality of entities based on the location designation for the identified entity, each of the plurality of entities including at least one attribute in common with the location designation of the identified entity;
query multiple users as to which of the location designations of the plurality of entities is the location of the identified entity, wherein each of the multiple users is associated with a region of the identified entity;
receive a response from ones of the multiple users, each response selecting one of the location designations of the plurality of entities;
generate a score indicative of a probability that one of the location designations of the plurality of entities is the location of the identified entity, based on the responses from the ones of the multiple users; and
when the score fails to satisfy a predefined threshold, query at least one additional user as to which of the location designations of the plurality of entities is the location of the identified entity, wherein the at least one additional user is associated with a location within a predefined distance from at least one of the location designations of the plurality of entities.
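
The crowd-scoring step above reduces to computing the vote share of the leading location designation and re-querying when it falls short of a threshold. A minimal sketch, assuming responses keyed by user id and an illustrative 0.7 threshold:

```python
from collections import Counter

def location_consensus(responses, threshold=0.7):
    """Minimal sketch: responses map a user id to the location designation that
    user picked; the score is the vote share of the leading designation."""
    votes = Counter(responses.values())
    location, count = votes.most_common(1)[0]
    score = count / sum(votes.values())
    if score >= threshold:
        return location, score
    return None, score          # below threshold: query additional nearby users

print(location_consensus({"u1": "5th Ave", "u2": "5th Ave", "u3": "Main St"}))
# (None, 0.666...) -> below 0.7, so at least one additional nearby user is queried
```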

US Pat. No. 10,657,605

METHOD FOR CONTROLLING AN ACCOUNTING PROCESS FOR FINANCIAL INSTRUMENTS

SAP SE, Walldorf (DE)

1. A method for controlling an accounting process for financial instruments:receiving information associated with a financial instrument, wherein receiving the information comprises creating a corresponding entry in a worklist in a database;
registering the received information in the database;
deriving a work package of relevant accounting steps based on the registered information;
determining a plurality of dates having an open-type status, wherein the dates indicate when the relevant accounting steps are to be executed, wherein determining the plurality of dates having an open-type status comprises determining all open-type statuses beginning with a first open-type status, and wherein the open-type statuses comprise a “to be executed” status or a “roll-up” status; and
triggering execution of the relevant accounting steps according to the determined dates having an open-type status, wherein execution of the relevant accounting steps comprises:
retrieving the registered information and the associated financial instrument from the database;
receiving the worklist entry;
performing the work package to generate execution result data; and
storing the execution result data in the database.

US Pat. No. 10,657,604

SYSTEMS, METHODS, AND PLATFORM FOR ESTIMATING RISK OF CATASTROPHIC EVENTS

Aon Global Operations Ltd...

1. A system comprising:processing circuitry; and
a non-transitory computer readable memory coupled to the processing circuitry, the memory storing machine-executable instructions, wherein the machine-executable instructions, when executed on the processing circuitry, cause the processing circuitry to
receive catastrophic risk models representing risk to a plurality of locations,
wherein each of the catastrophic risk models is associated with one of a plurality of types of catastrophic events, and
wherein each of the catastrophic risk models includes a plurality of data points, each data point of the plurality of data points including at least two dimensions of data including, for each location of the plurality of locations,
a) a first dimension of the at least two dimensions corresponding to geographic coordinates of the respective location, and
b) a second dimension of the at least two dimensions corresponding to a measure associated with the respective location,
for each of the catastrophic risk models, compress the respective catastrophic risk model into a respective compressed risk model, wherein compressing the respective catastrophic risk model includes
identifying, from the plurality of data points in the respective catastrophic risk model, a first portion of data points that can be estimated from one or more surrounding data points within a predetermined error tolerance,
wherein the first portion of data points is identified based in part on a density of the geographic coordinates for the respective locations of the first portion of data points and an amount of variation in the measures for the respective locations of the first portion of data points,
removing, from the respective catastrophic risk model, the first portion of data points, and
storing, within a non-transitory database storage region, the respective compressed risk model, wherein a plurality of data points in the respective compressed risk model include a remaining second portion of data points from the respective catastrophic risk model, and
compute, in real-time responsive to receiving a risk score request for a location due to a type of catastrophic event identified in the request, a catastrophic risk score for the location,
wherein the catastrophic risk score corresponds to a weighted estimation of one or more of the respective data points in the respective stored compressed risk model for the type of catastrophic event,
wherein the geographic coordinates for the one or more of the respective data points are located within a predetermined distance of the location, and
wherein the request is received from a second remote computing device via the network.
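
The compression step above drops data points whose measure can be estimated from surrounding points within an error tolerance. The sketch below uses the mean of the two nearest already-kept points as the estimator; the tolerance, the neighbour count, and the toy risk surface are illustrative assumptions, not the patented compression scheme.

```python
import numpy as np

def compress_risk_model(points, tolerance=0.05):
    """Minimal model-thinning sketch: drop a data point when its measure can be
    estimated from its nearest surviving neighbours within the tolerance.
    points: array of (lat, lon, measure)."""
    kept = [points[0]]
    for p in points[1:]:
        coords = np.array([k[:2] for k in kept])
        dists = np.linalg.norm(coords - p[:2], axis=1)
        nearest = np.argsort(dists)[:2]                        # two closest kept points
        estimate = np.mean([kept[i][2] for i in nearest])
        if abs(estimate - p[2]) > tolerance:                   # cannot be reconstructed
            kept.append(p)
    return np.array(kept)

# a densely sampled, slowly varying risk surface compresses well (illustrative data)
grid = np.array([(lat, lon, 0.2 + 0.01 * lat)
                 for lat in range(10) for lon in range(10)], dtype=float)
compressed = compress_risk_model(grid)
print(len(grid), "->", len(compressed), "points")
```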

US Pat. No. 10,657,603

INTELLIGENT ROUTING CONTROL

Progressive Casualty Insu...

1. A method of classifying insurance documents having insurance data, the method comprising:receiving a request to access a plurality of documents in an assessment queue stored in a memory of a device; each of the plurality of documents is made up of pages containing insurance data and has an associated predefined destination and associated metadata before the plurality of documents are read;
converting a set of documents encoded in a first file format for storage in a non-transitory computer media into a second file format that removes all of the metadata associated with each document of the set of documents;
partitioning each document of the set of documents into separate stand-alone documents such that each partitioned document represents no more than a physical page;
converting each of the partitioned documents into separate recognition vectors that represent information conveyed in each of the partitioned documents;
classifying the partitioned documents through machine learning algorithms comprising a plurality of learning models that are combined to generate a summed output;
the plurality of learning models includes a successive learning model that minimizes a plurality of residuals generated from a preceding learning model;
processing the summed output as an input to a subsequent learning model separate from the plurality of learning models that embeds routing data in second metadata within and associated with each of the partitioned documents that reduces a prediction error;
merging the classified partitioned documents in response to a plurality of rules based at least in part on the second metadata; and
causing the merged documents to be routed to a remote destination independent of the predefined destination and the associated metadata.
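
The combination of learning models above, where each successive model minimizes the residuals of the preceding one and the outputs are summed, reads like gradient boosting. A from-scratch sketch with one-feature decision stumps is shown below; the stump learner, learning rate, and toy data are illustrative, not the classifier used in the patent.

```python
import numpy as np

def fit_stump(x, residual):
    """One weak learner: the best threshold split on a single feature,
    predicting the mean residual on each side of the split."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def boost(x, y, rounds=20, lr=0.3):
    """Each successive model fits the residuals left by the sum of the
    preceding models; their outputs are summed into the final score."""
    pred = np.zeros_like(y, dtype=float)
    models = []
    for _ in range(rounds):
        residual = y - pred                 # what the preceding models missed
        stump = fit_stump(x, residual)
        models.append(stump)
        pred += lr * stump(x)               # summed output
    return lambda z: sum(lr * m(z) for m in models)

# toy recognition-vector component and document labels
x = np.array([0.1, 0.2, 0.3, 0.7, 0.8, 0.9])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
score = boost(x, y)
print(np.round(score(x)))                   # -> [0. 0. 0. 1. 1. 1.]
```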

US Pat. No. 10,657,602

DYNAMICALLY TRIGGERED INSURANCE SYSTEM BASED ON A FLOATING RECOVERABLE BASIS AND CORRESPONDING METHOD

Swiss Reinsurance Company...

1. A dynamically triggered, multi-tier risk-transfer system based on an automatically steered, floating recoverable basis, the dynamically triggered, multi-tier risk-transfer system comprising:circuitry configured to
trigger, signal, and mutually activate a coupled first and second insurance system providing a self-sufficient risk protection for a variable number of defined risk exposure components by a first and second resource pooling system;
implement interfaces comprising a plurality of payment transfer modules to connect to the risk exposure components, the plurality of payment transfer modules configured to receive and store first payments from the risk exposure components for pooling of their risk exposures, wherein the first insurance system provides automated risk protection for each of the connected risk exposure components based on received and stored first payment parameters;
implement second payment transfer modules to connect the first insurance system to the second resource pooling system, the second payment transfer modules being configured to receive and store second payment parameters from the first insurance system for adopting of a portion of the risk exposures accumulated by the first insurance system, wherein, in case of an occurrence of one of the defined risk events, loss is automatically covered by the first insurance system;
implement a trigger system that comprises a first trigger module triggering a variable loss ratio parameter by an alterable loss ratio threshold value, wherein the trigger system comprises an aggregation module for automatically aggregating captured loss parameters of a measured occurrence of risk events over all risk exposure components within a predefined time period by incrementing an associated stored aggregated loss parameter and for automatically aggregating the received and stored first payment parameters over all risk exposure components within the predefined time period by incrementing an associated stored, aggregated payment parameter, and wherein the variable loss ratio parameter is generated dynamically based on a ratio of the aggregated loss parameter and the aggregated payment parameter;
by triggering the variable loss ratio parameter exceeding said loss ratio threshold value, activate a second trigger module of the trigger system, dynamically set a floating activation value to a value of the variable loss ratio parameter and/or subject to the aggregated loss parameter, and trigger the floating activation value by an adjustable minimum activation threshold trigger, and
if said floating activation value exceeding the minimum activation threshold trigger is triggered, automatically activate the second insurance system by transferring activation signaling to the second insurance system covering, upon activation, said adopted portion of the risk exposures accumulated by the first insurance system.
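
The two-stage trigger above (a loss-ratio trigger that sets a floating activation value, which is in turn compared against a minimum activation threshold) can be sketched compactly. The threshold values and the per-event bookkeeping below are illustrative assumptions.

```python
class LossRatioTrigger:
    """Minimal sketch of the two-stage trigger: aggregate losses and payments
    over a period, and when the loss ratio crosses a threshold, arm a second
    trigger that activates the second-tier risk transfer."""

    def __init__(self, loss_ratio_threshold=1.2, activation_threshold=1.5):
        self.aggregated_loss = 0.0
        self.aggregated_payment = 0.0
        self.loss_ratio_threshold = loss_ratio_threshold
        self.activation_threshold = activation_threshold
        self.floating_activation = None

    def record_payment(self, amount):
        self.aggregated_payment += amount

    def record_loss(self, amount):
        self.aggregated_loss += amount
        ratio = self.aggregated_loss / max(self.aggregated_payment, 1e-9)
        if ratio > self.loss_ratio_threshold:
            self.floating_activation = ratio          # first trigger fired
        if (self.floating_activation is not None
                and self.floating_activation > self.activation_threshold):
            return "activate second insurance system"
        return "covered by first insurance system"

trigger = LossRatioTrigger()
trigger.record_payment(100.0)
print(trigger.record_loss(110.0))   # loss ratio 1.1 -> first tier covers the loss
print(trigger.record_loss(50.0))    # loss ratio 1.6 -> second tier activated
```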

US Pat. No. 10,657,601

INSURANCE PREMIUM GAP ANALYSIS

Allstate Insurance Compan...

1. A method comprising:(a) using a processor associated with a computer, retrieving, through a data transaction manager, historical premium data from a data source, wherein the data transaction manager is configured to:
i. format a request for the historical premium data into a markup language document; and
ii. store the historical premium data in a data storage system for future retrieval without requiring a further request to the data source;
(b) using the processor, calculating a first set of future state-wide industry insurance premiums based on projected annual premium growth rates and a historical state-wide industry insurance premium specified in the retrieved historical premium data;
(c) using the processor, calculating a second set of future state-wide industry insurance premiums based on information related to a total number of households per zip code, a total number of auto-owning households per zip code, and average insurance expenditures per household per zip code;
(d) using the processor, calculating an adjustment factor based on the calculated first set and second set of future state-wide industry insurance premiums;
(e) using the processor, applying the calculated adjustment factor to a total expenditure of insurance per zip code to determine an adjusted total expenditure of insurance per zip code; and
(f) using the processor, calculating an expected market share for a given insurance provider based on the adjusted total expenditure of insurance per zip code.
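
Steps (b) through (f) above reconcile a growth-rate projection with a household-level projection through an adjustment factor. The toy numbers below (a two-ZIP "state", a 3% annual growth rate, a 12% target market share) are invented purely to show the arithmetic, not real market data.

```python
def adjustment_factor(growth_projection, household_projection):
    """Step (d): reconcile two independent estimates of future state-wide
    industry premium (growth-rate based vs. household/zip-code based)."""
    return growth_projection / household_projection

historical_statewide = 15.0e6                                      # illustrative two-ZIP "state"
growth_based = historical_statewide * (1 + 0.03) ** 2              # step (b): 3% a year, 2 years
households_by_zip = {"60601": (12000, 0.55, 1400.0),               # (households, auto-owning share, avg spend)
                     "60602": (8000, 0.60, 1250.0)}
household_based = sum(n * share * spend for n, share, spend in households_by_zip.values())  # step (c)

factor = adjustment_factor(growth_based, household_based)          # step (d)
adjusted_by_zip = {z: n * share * spend * factor                   # step (e)
                   for z, (n, share, spend) in households_by_zip.items()}
expected_share = {z: 0.12 * v for z, v in adjusted_by_zip.items()} # step (f): 12% target share
print(round(factor, 2), {z: round(v) for z, v in expected_share.items()})
```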

US Pat. No. 10,657,600

SYSTEMS AND METHODS FOR MOBILE IMAGE CAPTURE AND PROCESSING

KOFAX, INC., Irvine, CA ...

1. A computer-implemented method for processing insurance claims, the method comprising:capturing image data using a mobile device, the image data depicting a document; and
using at least one processor:
defining, based on the image data, a plurality of candidate edge points corresponding to the document;
defining four sides of a tetragon based on at least some of the plurality of candidate edge points; and
determining whether the document is relevant to an insurance claim; and
in response to determining the document is relevant to the insurance claim, submitting the image data, information extracted from the image data, or both to a remote server for claims processing; and
associating the image data, the information extracted from the image data, or both with a claim number prior to submitting the image data, the information extracted from the image data, or both to the remote server.

US Pat. No. 10,657,599

VIRTUAL COLLABORATION

Allstate Insurance Compan...

1. A method, comprising:determining, by a virtual collaboration server, a queue position of a first claims adjuster in a first queue, the queue position being based on a first amount of time the first claims adjuster has been waiting in the first queue, wherein the first queue is associated with a first property type;
determining, by the virtual collaboration server, a queue position of a second claims adjuster in a second queue, the queue position of the second claims adjuster being based on a second amount of time the second claims adjuster has been waiting in the second queue, wherein the second queue is associated with a second property type;
receiving, from a mobile computing device comprising a camera, a microphone, and a speaker, and by the virtual collaboration server, a request to initiate a communication session, wherein the request is associated with the second property type;
modifying, by the virtual collaboration server and based on determining that the second amount of time is below a threshold, the queue position of the first claims adjuster in the first queue to a modified queue position in the second queue;
selecting, by the virtual collaboration server based on the modified queue position, a computing device associated with the first claims adjuster;
transmitting, by the virtual collaboration server and to the computing device of the first claims adjuster, the request to initiate the communication session;
responsive to receiving an indication that the first claims adjuster has accepted the request to initiate the communication session, transmitting video and audio bidirectionally between the mobile computing device and the computing device of the first claims adjuster, wherein the video comprises video of damaged property for evaluation;
based at least in part on the transmitted video and audio, determining an amount of compensation to provide to an owner of the damaged property; and
transferring the determined amount of compensation for the damaged property displayed within the video to the owner of the damaged property.

US Pat. No. 10,657,598

SYSTEM AND METHOD FOR USE OF CARBON EMISSIONS IN CHARACTERIZING DRIVER PERFORMANCE

Scope Technologies Holdin...

1. A method for remotely determining vehicle driver performance for a specific vehicle and driver using acceleration measurements and estimated carbon emission patterns for a plurality of vehicle behaviors and conditions derived from said acceleration measurements, said specific vehicle being one of a defined vehicle type, the method comprising:developing a carbon emission pattern prediction model for the vehicle type comprising:
collecting data over time from one or more carbon emission sensors and
one or more accelerometers from a plurality of vehicles of said vehicle type driven in a variety of manners and in a variety of conditions,
observing patterns in the carbon emission data and the acceleration data
for each of the plurality of vehicles driven and for each of the manners and conditions driven,
developing statistical comparisons between the observed carbon emissions patterns and the associated acceleration patterns, and
storing representative patterns and statistical comparisons in a remote computer memory along with identifying information for the vehicle type, driving manner and conditions for each pattern;
identifying specific vehicle and driver carbon emission patterns for the specific vehicle and driver by:
collecting acceleration and corresponding carbon emissions data over time for the specific vehicle while operated by the driver from one or more on-board carbon emission sensors and accelerometers;
developing specific vehicle and driver statistical comparisons between collected acceleration and corresponding carbon emissions data for the specific vehicle and driver, and
storing, as specific vehicle and driver acceleration patterns, the acceleration and corresponding carbon emissions data, and said specific vehicle and driver statistical comparisons in a specific vehicle on-board computer memory;
subsequent to said identifying and on-board the specific vehicle, calculating carbon emissions for the specific vehicle and driver by:
measuring acceleration for the specific vehicle with one or more on-board accelerometers while operated by the driver,
comparing the acceleration measurements for the specific vehicle with the stored specific vehicle and driver acceleration patterns accessed from the on-board computer memory, and
upon finding one or more matching acceleration patterns, querying the on-board computer memory to return one or more corresponding carbon emissions;
transmitting the returned corresponding carbon emissions, along with identifying information for the specific vehicle and driver, to a remote user;
comparing, at a location remote from the specific vehicle, the transmitted carbon emissions with said representative patterns and statistical comparisons stored in the remote computer memory for the vehicle type corresponding to the transmitted specific vehicle identifying information; and
assigning the driving manner for each representative pattern matching the transmitted carbon emissions as the driver performance for the specific driver corresponding to the driver identifying information.

US Pat. No. 10,657,597

SYSTEMS AND METHODS FOR DYNAMIC INSURANCE PREMIUMS

United Services Automobil...

1. A method of calculating an insurance premium for a driverless vehicle, the method comprising:detecting a first occupant of the driverless vehicle;
establishing data communication, utilizing a processor based machine, between an insurance premium calculator and an onboard vehicle system of the driverless vehicle, the insurance premium calculator hosted by an insurance company and remote to the onboard vehicle system;
causing, by the processor based machine, the onboard vehicle system to provide at least one signal indicative of a first amount of time the first occupant is in active control of the driverless vehicle to the insurance premium calculator during a period;
causing, by the processor based machine, the onboard vehicle system to provide at least one signal indicative of a second amount of time that the vehicle is operated without a driver to the insurance premium calculator during the period; and
modifying, using the insurance premium calculator, an insurance rate for a subsequent period based on:
the first amount of time the first occupant is in active control of the driverless vehicle; and
the second amount of time that the vehicle is operated without a driver.

US Pat. No. 10,657,596

SYSTEMS AND METHODS FOR PROVIDING CURRENCY EXCHANGE INFORMATION VIA AUGMENTED REALITY

United Services Automobil...

1. A system, comprising:an image database comprising a plurality of images, wherein each of the plurality of images is linked to a respective currency of a plurality of currencies;
a financial database comprising financial information associated with the plurality of currencies;
a processor configured to:
receive image data representative of a plurality of cash notes, wherein the image data is acquired via one or more image sensors;
determine a currency associated with the plurality of cash notes based on a comparison of the image data with at least one image of the plurality of images in the image database and a correlation between the at least one image and the image data representative of the plurality of cash notes;
retrieve financial data regarding the currency from the financial database;
determine a value of the plurality of cash notes in the currency;
determine an equivalent value of the plurality of cash notes in another currency based on the value and the financial data;
generate one or more visualizations comprising the equivalent value of the plurality of cash notes in the other currency based on the financial data;
overlay the one or more visualizations on the image data; and
display the one or more visualizations overlaid on the image data via an electronic display.

US Pat. No. 10,657,595

METHOD OF TOKENIZATION OF ASSET-BACKED DIGITAL ASSETS

RESPONSIBLE GOLD OPERATIO...

1. A method performed by a computing device for issuing an asset-backed asset token whose provenance can be tracked in a distributed ledger that is a blockchain, the method comprising:receiving from a buyer a purchase order for an asset token, the purchase order specifying a quantity and quality of an asset that is to back the asset token;
reserving the asset by:
recording in the distributed ledger an asset token that identifies the quantity and quality of the asset and an issuer of the asset token as owner and that is associated with computer code of a smart contract that controls transferring ownership of the asset token by recording transactions to transfer ownership of the asset token in the distributed ledger, the owner identified based on a public key of a public/private key pair of the buyer;
confirming whether an inventory of assets includes sufficient assets to fulfill the quantity and quality of the asset specified in the purchase order for an asset token;
when the inventory does not include sufficient assets to fulfill the purchase order, acquiring and adding to the inventory assets so that the inventory includes sufficient assets to fill the purchase order; and
under control of the computer code of the smart contract, recording in the distributed ledger a reserve transaction for the asset token as reserved;
approving the purchase order;
receiving a payment for the asset token;
after receiving the payment, issuing the asset token to the buyer by:
creating an issue transaction that identifies the asset token, that is signed using a private key of a public/private key pair of the issuer, and that identifies the buyer as the owner based on a public key of a public/private key pair of the buyer;
recording ownership of the asset; and
under control of the computer code of the smart contract, recording the issue transaction that identifies the asset token in the distributed ledger,
wherein transactions to transfer the asset token that are recorded in the distributed ledger establish provenance of the asset including each owner of the asset.

US Pat. No. 10,657,594

METHOD AND SYSTEM FOR INTELLIGENT ROUTING OF INSIGHTS

MASTERCARD INTERNATIONAL ...

1. A method for providing insights based on merchant bidding, comprising:communicating, by a receiving device of a processing server, with specialized infrastructure associated with a payment network, and receiving therefrom payment transaction information associated with a plurality of processed payment transactions;
storing, in an account database of the processing server, an account profile, wherein the account profile includes at least an account identifier and a plurality of transaction data entries related to the plurality of processed payment transactions, each transaction data entry including respective payment transaction information received from the payment network including at least a merchant identifier and transaction data;
receiving, by the receiving device of the processing server, a plurality of merchant bids, wherein each merchant bid is received from a different merchant and includes at least a bid amount and a corresponding merchant identifier;
identifying, by an analytical module of the processing server, a transaction metric based on at least the transaction data included in each transaction data entry stored in the account profile, wherein the transaction metric is at least a predetermined value;
identifying, by the analytical module of the processing server, a propensity to transact for each corresponding merchant identifier included in the plurality of merchant bids based on at least the transaction data included in each transaction data entry stored in the account profile that includes the respective corresponding merchant identifier;
determining, by a determination module of the processing server, a winning bid of the plurality of merchant bids based on at least one of (1) a score that is determined on a basis of the propensity to transact for the included corresponding merchant identifier and the included bid amount, (2) a weighting of the propensity to transact for the corresponding merchant identifier and a weighting of the bid amount; and
electronically transmitting, by a transmitting device of the processing server, at least the account identifier included in the account profile to the merchant from which the winning bid was received.
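
The winning-bid step can be read as ranking bids by a score that combines each merchant's propensity to transact with its bid amount, or by weighting the two against each other. A short Python sketch with invented weights and bids (one plausible scoring, not the claimed scoring itself):

# Hedged sketch of the bid-selection step: the claim allows either a combined
# score from propensity-to-transact and bid amount, or a weighted comparison.
# The weights and example bids are hypothetical.

bids = [  # merchant_id, bid_amount, propensity_to_transact (0..1)
    {"merchant": "M1", "bid": 0.50, "propensity": 0.80},
    {"merchant": "M2", "bid": 0.90, "propensity": 0.30},
]

def score(bid, w_propensity=0.6, w_bid=0.4):
    # One plausible weighting of propensity against bid amount.
    return w_propensity * bid["propensity"] + w_bid * bid["bid"]

winning = max(bids, key=score)
print("winning bid:", winning["merchant"], round(score(winning), 3))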

US Pat. No. 10,657,592

SYSTEM AND METHOD FOR PROCESSING A TRADE ORDER

Intercontinental Exchange...

1. A computer device comprising:one or more processors executing computer-executable instructions; and
an input receiving a constant stream of live data reflective of live market data conditions from a plurality of third party computing devices over one or more networks,
the computer-executable instructions causing the computer device to generate a graphical user interface comprising a display window, a plurality of dynamic graphical indicators that move, within said display window, to reflect fluctuations in said stream of live data, and a plurality of trade buttons, within said display window, that display information reflective of the fluctuations in said stream of live data, wherein interaction with at least one trade button automatically initiates a trade at a then-current price which is displayed in said at least one trade button,
said plurality of dynamic graphical indicators comprising:
a theoretical price indicator associated with a theoretical price that is calculated based on at least one proposed order price, at least one pricing parameter, and the retrieved market data for at least one type of asset traded on at least one electronic exchange, and
one or more market data indicators displayed relative to the theoretical price indicator, the computer-executable instructions further causing the one or more processors to:
construct a first proposed trade based on at least one proposed order quantity and on the at least one proposed order price,
construct a second proposed trade based on the constant stream of live data reflective of live market conditions,
position the one or more market data indicators at a location within the display window relative to the theoretical price indicator such that a distance between the one or more market data indicators and the theoretical price indicator reflects a level of profitability of the first and second proposed trades,
update a first trade button of the plurality of trade buttons to display information relating to the first proposed trade and a second trade button of the plurality of trade buttons to display information relating to the second proposed trade,
dynamically and continuously move, in real time, the one or more market data indicators relative to the theoretical price indicator to reflect live fluctuations in said level of profitability that are caused by the live fluctuations in said constant stream of live data, and
dynamically and continuously update, in real time, the second trade button to reflect live price fluctuations in said constant stream of live data, wherein interaction with the second trade button initiates a trade at a live market price.

US Pat. No. 10,657,591

COMPUTER-IMPLEMENTED SYSTEMS AND METHODS FOR REAL-TIME RISK-INFORMED RETURN ITEM COLLECTION USING AN AUTOMATED KIOSK

COUPANG CORP., Seoul (KR...

1. An automated kiosk for collecting return items based on a real time risk decision, comprising:one or more memory devices storing instructions;
an imaging device;
one or more containers, each container associated with an identifier associated with a status of empty or occupied;
a network interface;
a display screen;
one or more processors configured to execute the instructions to perform operations comprising:
responding to user input by capturing, with the imaging device, return item information representing a return item;
transmitting, via the network interface, the return item information and a request for a return risk level relating to the return item to a server operable to execute a machine learning model trained on historical information to predict a risk score, wherein the server is configured to prepare the return risk level in response to the request by:
predicting a risk score of the return request based on the captured return item information by deploying the machine learning model;
determining a risk level based on the predicted risk score;
transmitting the determined risk level to the kiosk;
receiving the transmitted risk level through the network interface from the server;
displaying a return result on the display screen based on the received risk level; and
accepting the return item based on the received risk level, and
wherein the operations are performed in real-time.
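
Server-side, the claim pairs a model-predicted risk score with a mapping from score to a discrete risk level that the kiosk acts on. A minimal Python sketch with a stubbed model and hypothetical thresholds (illustrative only):

# Minimal sketch of the server-side step that maps a predicted risk score to a
# discrete risk level returned to the kiosk. The model call is replaced by a
# stub and the thresholds are hypothetical.

def predict_risk_score(return_item_info):
    # Stand-in for deploying the machine learning model on the item information.
    return 0.12

def risk_level(score, low=0.3, high=0.7):
    if score < low:
        return "LOW"       # kiosk accepts the return automatically
    if score < high:
        return "MEDIUM"    # kiosk may ask for additional verification
    return "HIGH"          # return routed to manual review

level = risk_level(predict_risk_score({"sku": "ABC123", "image": "..."}))
print("risk level sent to kiosk:", level)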

US Pat. No. 10,657,590

SYSTEM AND METHOD FOR AN ELECTRONIC LENDING SYSTEM

Branch Banking and Trust ...

1. A method of interfacing with a financial institution using a computer interface, the method comprising the steps of:(a) receiving an interface request from a customer after the customer has reached, via one of a plurality of paths through a computer network, a predetermined first webpage for the financial institution;
(b) determining a first path upon which the customer reached the first webpage, wherein the first path includes a first website external to the financial institution;
(c) determining, based on the determined first path, a first list of products and/or services offered by the financial institution;
(d) presenting on the first webpage the determined first list of products and/or services offered by the financial institution, wherein the determined first list for the first path is different than a determined second list for a second path of the plural paths, wherein the second path includes a second website external to the financial institution different than the first website;
(e) receiving a first input from the customer wherein the first input comprises an indication of a choice of at least one of the products and/or services in the determined first list offered by the financial institution;
(f) authenticating the customer using a predetermined client identification profile for the condition where the customer is an existing online client of the financial institution, otherwise identifying the customer using either the predetermined client identification profile for the condition where the customer is an existing off-line client of the financial institution or challenge questions for the condition where the customer is not an existing off-line client of the financial institution;
(g) after the customer is authenticated or identified in step (f), receiving from the customer a first set of information, wherein the first set of information is login information and includes at least one of: an online client user identification and password for the customer; the customer's last name; and the last four digits of the customer's social security number;
(h) receiving from the customer a second set of information, wherein the second set of information includes at least one of mortgage information, borrowing information, collateral information, employment information, income information, financial information, asset information, liability information, and combinations thereof;
(i) presenting to the customer a set of terms and conditions and receiving from the customer an authorization for the financial institution to perform a credit check on the customer and an application for review;
(j) performing a risk analysis on the customer;
(k) approving the application based at least in part on the risk analysis;
(l) receiving from the customer a third set of information; and
(m) providing to the customer a fourth set of information.

US Pat. No. 10,657,589

DIGITAL BANK BRANCH

JPMORGAN CHASE BANK, N.A....

1. A digital bank branch system for associating a customer communicating over the Internet with a banking host system with a physical bank branch, the digital bank branch system comprising:at least one computer memory storing customer information and instructions; and
at least one computer processor accessing and executing the stored instructions to perform steps including:
providing a banking web site from the bank host computing system accessible to mobile device users over the Internet, the banking web site offering customers a selection interface enabling changing of a default branch by selecting as a local branch at least one physical bank branch from multiple selectable physical bank branches;
providing a selectable link to a digital branch web page connecting the customer with the selected local branch over the Internet;
providing a local branch information area at the digital branch web page, the local branch information area including information relevant to the local bank branch;
implementing an alert generator for generating alerts relevant to the local bank branch over a distribution channel;
selecting the distribution channel for distribution of the alerts relevant to the selected local bank branch from a plurality of available channels based on a determination from a customer monitor that the customer is using a particular mobile device for communicating with the bank host system over the Internet, the available channels including at least email, text message, and social media channels; and
distributing the alert over the selected channel to the mobile device.

US Pat. No. 10,657,588

METHOD AND SYSTEM FOR FUNDING A FINANCIAL ACCOUNT

EFUNDS CORPORATION, Jack...

1. A system for transferring funds comprising:a memory storing instructions; and
at least one processor configured to execute the instructions to:
receive, from a client device, a request for a transaction between a first account and a second account;
generate an electronic funding application;
transmit the electronic funding application to a client device, thereby causing the client device to generate a graphical user interface displaying the electronic funding application;
receive, from the client device, financial data entered into the funding application via the graphical user interface, comprising a magnetic ink character recognition (MICR) line, a currency amount, an account identifier associated with the first account, and a customer name;
validate at least a portion of the financial data by:
searching a database having a plurality of records using the received financial data; and
matching a record of the database based on at least a portion of the entered MICR line;
convert the entered MICR line to an item compatible with an automated clearing house (ACH) network;
based on whether the conversion is successful, submit the item and the currency amount to the ACH network;
transfer the currency amount from the first account to a custodial account, wherein the custodial account is valid during a predetermined holding window;
settle the transaction through the ACH network; and
upon settlement, transfer the currency amount from the custodial account to the second account.
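
The validation and conversion steps hinge on parsing the entered MICR line and matching it against stored records before building an ACH item. One plausible sketch in Python, checking the 9-digit ABA routing-number checksum and looking up the routing/account pair in a hypothetical records table (the checksum test is an assumption added for illustration; the claim itself only requires matching a database record):

# Sketch of one plausible validation of the entered MICR data before ACH
# conversion: verify the ABA routing-number checksum, then confirm the
# routing/account pair against a hypothetical records database.

def valid_routing_number(routing: str) -> bool:
    if len(routing) != 9 or not routing.isdigit():
        return False
    d = [int(c) for c in routing]
    # Standard ABA checksum: weighted digit sum must be divisible by 10.
    return (3 * (d[0] + d[3] + d[6]) + 7 * (d[1] + d[4] + d[7]) + (d[2] + d[5] + d[8])) % 10 == 0

RECORDS = {("021000021", "123456789"): "John Doe"}  # hypothetical database

def validate(micr_routing, micr_account, customer_name):
    if not valid_routing_number(micr_routing):
        return False
    return RECORDS.get((micr_routing, micr_account)) == customer_name

print(validate("021000021", "123456789", "John Doe"))  # True -> eligible for ACH conversion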

US Pat. No. 10,657,587

LISTING AND EXPIRING CASH SETTLED ON-THE-RUN TREASURY FUTURES CONTRACTS

Chicago Mercantile Exchan...

1. A computer-implemented method for use with a data transaction processing system which implements an exchange computing system in which data items are transacted by a hardware matching processor that matches electronic data transaction request messages for the same one of the data items based on multiple transaction parameters from different client computers over a data communication network, the method comprising:receiving, electronically from a database external to the exchange computing system, by a swap rate value determination processor of the exchange computing system, the database coupled therewith via the data communications network, a plurality of swap spread quotes and a plurality of swap rate quotes of a plurality of contributing dealers contributed to the database during a polling interval, the plurality of swap spread quotes and the plurality of swap rate quotes including outlier values as compared to the others of the plurality of swap spread quotes and the plurality of swap rate quotes, and processing the obtained plurality of swap spread quotes and plurality of swap rate quotes to remove the outlier values by using a trimmed means function;
deriving, by a present value calculator and settlement price calculator of the exchange computing system, a settlement price based on a present value of the most recently issued U.S. Treasury note of a selected tenor of a plurality of tenors determined by calculating the difference between a swap spread value and a swap rate value therefor, the swap spread value and the swap rate value having been determined based on the processed plurality of swap spread quotes and plurality of swap rate quotes obtained from the database;
computing, by a scheduling processor of the exchange computing system, a listing date and an expiration date of a cash-settled futures contract for delivery of the underlying most recently issued (“an on-the-run”) U.S. Treasury note for the selected tenor based on a U.S. Treasury note auction cycle for a next-to-be-issued U.S. Treasury note for the selected tenor which is announced but not-yet-auctioned (“when issued”) having the derived settlement price, wherein the listing date is in a “when issued” period, the multiple transaction parameters based upon which the hardware matching processor matches the electronic data transaction request messages comprising the computed listing date and expiration date;
listing, via the data communications network, the cash-settled futures contract by an exchange processor of the exchange computing system coupled with the scheduling processor, to enable a market participant to submit an electronic data transaction request message for a transaction in the cash-settled futures contract to the data transaction processing system;
receiving, by the exchange computing system, the electronic data transaction request message from the market participant and matching, by a matching function, the transaction of the electronic data transaction request message with a previously received transaction counter thereto received from another market participant;
novating, by the exchange computing system, the transaction and the counter transaction so as to bifurcate and transform both from being between the market participant and the other market participant into separate transactions between the exchange computing system and each of the market participant and the other market participant; and
executing, by the exchange computing system, at least one of the separate transactions between the exchange computing system and each of the market participant and the other market participant when the other of the separate transactions between the exchange computing system and each of the market participant and the other market participant cannot be completed.
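
Two of the computations above have a simple arithmetic shape: a trimmed mean that discards outlier dealer quotes, and a settlement price taken as the present value of the on-the-run note at a yield derived from the swap rate and swap spread (the usual sign convention is swap rate minus swap spread; the claim states it as the difference between the two values). A Python sketch with invented quotes, trim fraction, coupon, and tenor (illustrative only; the exchange's actual conventions are not specified here):

def trimmed_mean(values, trim_fraction=0.2):
    """Drop the lowest and highest trim_fraction of quotes, then average the rest."""
    values = sorted(values)
    k = int(len(values) * trim_fraction)
    kept = values[k:len(values) - k] if k else values
    return sum(kept) / len(kept)

swap_rate_quotes   = [2.41, 2.43, 2.44, 2.45, 2.80]   # percent, with an outlier
swap_spread_quotes = [0.25, 0.26, 0.27, 0.28, 0.60]   # percent, with an outlier

swap_rate   = trimmed_mean(swap_rate_quotes)
swap_spread = trimmed_mean(swap_spread_quotes)
treasury_yield = (swap_rate - swap_spread) / 100.0    # on-the-run note yield (assumed convention)

def present_value(coupon_rate, yield_rate, years, face=100.0):
    periods = years * 2                       # semiannual coupons
    c = face * coupon_rate / 2
    y = yield_rate / 2
    return sum(c / (1 + y) ** t for t in range(1, periods + 1)) + face / (1 + y) ** periods

settlement_price = present_value(coupon_rate=0.0225, yield_rate=treasury_yield, years=10)
print(round(settlement_price, 4))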

US Pat. No. 10,657,586

SYSTEM AND METHOD FOR DYNAMIC OFFERING DEPLOYMENT

ORACLE AMERICA, INC., Re...

1. A method for deploying an offering through cooperative functioning of a customer-based offering platform and a remote offering platform, comprising:by a processor on an offering platform computer system, which is provided within a customer data processing system, running a customer-based offering platform to communicate with a customer device in the customer data processing system and to communicate a request for an offering to a remote offering platform running on a computer device outside of the customer data processing system;
receiving, by the customer-based offering platform, an offering deployment package, the offering deployment package including an offering associated with an asset associated with the customer device, the offering deployment package being received from the remote offering platform operatively connected to the customer-based offering platform through a distributed offering network;
customizing, via the processor and the customer-based offering platform, the offering based on profile information of a customer associated with the asset to produce a customized offering;
deploying, from the processor and the customer-based offering platform, the customized offering to the asset; and
responsive to deployment of the customized offering, registering the customized offering with a memory of the customer-based offering platform disposed on the offering platform computer system.

US Pat. No. 10,657,585

ON-LINE AUCTION SALES LEADS

eBay Inc., San Jose, CA ...

1. A method comprising:performing operations, using one or more processors of a computing service, to insert information about prospective buyers into a data file to supplement information about expressed offers in the data file with respect to a classification of an item, the operations comprising:
setting up a memory field in a memory, using the one or more processors of the computing service, for a seller of at least one item associated with the classification of the item so that when a prospective buyer accesses a screen for the item, the information about the prospective buyer is stored in the memory; and
delivering contents of the memory, using the one or more processors of the computing service, as the data file to the seller of the at least one item associated with the classification of the item, the delivering including transferring the data file from the memory at the computing service to a memory of a machine of the seller, wherein the data file is made accessible at the machine of the seller as a complete data file of the expressed offers supplemented with the information about prospective buyers with respect to the classification of the item.

US Pat. No. 10,657,584

METHOD AND DEVICE FOR GENERATING SAFE CLOTHING PATTERNS FOR RIDER OF BIKE

Stradvision, Inc., Pohan...

1. A method for generating one or more safe clothing patterns to be used for a human-like figure, comprising steps of:(a) if at least one image of the human-like figure is acquired, a safe clothing-pattern generating device performing a process of generating at least one specific clothing pattern having an initial value, and a process of inputting the specific clothing pattern and the image of the human-like figure into a clothing composition network, to thereby allow the clothing composition network to combine the specific clothing pattern with a clothing of the human-like figure on the image of the human-like figure, and thus to generate at least one composite image corresponding to the image of the human-like figure;
(b) the safe clothing-pattern generating device performing a process of inputting the composite image into an image translation network, to thereby allow the image translation network to generate at least one translated image by translating a surrounding environment on the composite image, and a process of inputting the translated image into an object detector, to thereby allow the object detector to output detection information on the human-like figure representing the human-like figure detected in the translated image; and
(c) the safe clothing-pattern generating device performing a process of instructing a 1-st loss layer to calculate one or more losses by referring to the detection information on the human-like figure and at least one GT corresponding to the image of the human-like figure, and a process of updating the initial value of the specific clothing pattern by using the losses such that the losses are minimized.
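
The loop in steps (a) through (c) is an iterative optimization: render the pattern onto the figure, run the detector, score the detections against ground truth, and nudge the pattern so the loss shrinks. The toy Python sketch below replaces the compositing, translation, and detector with a stub loss and uses numerical gradients, so it only illustrates the update loop, not the claimed networks:

import numpy as np

rng = np.random.default_rng(0)
pattern = rng.uniform(0.0, 1.0, size=8)      # hypothetical low-dimensional pattern
TARGET = np.full(8, 0.9)                     # stand-in for a pattern the detector favors

def detection_loss(p):
    # Stub for: composite pattern onto figure -> translate scene -> run detector
    # -> compare detections with ground truth. Lower is better.
    return float(np.mean((p - TARGET) ** 2))

def numerical_gradient(f, p, eps=1e-4):
    g = np.zeros_like(p)
    for i in range(p.size):
        step = np.zeros_like(p)
        step[i] = eps
        g[i] = (f(p + step) - f(p - step)) / (2 * eps)
    return g

for _ in range(200):                         # update the initial value to reduce the loss
    pattern -= 0.5 * numerical_gradient(detection_loss, pattern)

print("final loss:", round(detection_loss(pattern), 6))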

US Pat. No. 10,657,583

PHARMACEUTICAL LOCATOR AND INVENTORY ESTIMATION

Mylan, Inc., Morgantown,...

1. A non-transitory machine-readable storage medium encoded with instructions for execution by an application server, the non-transitory machine-readable storage medium comprising:instructions for receiving, from a user device, a request for pharmaceutical availability, the request including a pharmaceutical identification and a location of the user device;
instructions for identifying a pharmacy near the received location of the user device that is known to stock a specific non-branded pharmaceutical identified by the pharmaceutical identification that is manufactured by a specified pharmaceutical manufacturer; and
instructions for transmitting the identified pharmacy to the user device, whereby the user device receives a list of at least one nearby pharmacy that carries the specific non-branded pharmaceutical manufactured by the specified pharmaceutical manufacturer;
instructions for displaying a display area on the user device including a map-style view, an information window including an availability indication, wherein the availability indication is configured to display different colors depending on an inventory condition of the specific nonbranded pharmaceutical at the identified pharmacy, and the availability indication when displaying one of the different colors is selectable by the user to initiate a phone call to the identified pharmacy;
instructions for querying the identified pharmacy to retrieve a first number for quantity on hand and a second number for average prescription quantity from the identified pharmacy; and
instructions for generating a table based on inputting the first number and the second number for the availability indication, the table comprising an inventory table based on directly accessing a pharmacy's inventory system, the inventory table comprising a result when y>=Rz, a second result when Mz;
instructions for querying the identified pharmacy to retrieve a third number for a prescription frequency for the specific non-branded pharmaceutical and a fourth number for a time interval since a last sale was entered; and
instructions for generating a second table based on inputting the third number and the fourth number, wherein the second table comprises a first result when x<=F1, a second result when x>F1 and xF2, a fourth result when x<=F2, a fifth result when x>F2 and x=F3, a seventh result when x<=F3, an eighth result when x>F3 and x=F4, wherein x is the fourth number and F1, F2, F3 and F4 are positive numbers such that F4>F3>F2>F1,
wherein the instructions for identifying the pharmacy include server instructions for identifying pharmacies that are proximate to the user device by a spatial engine such that the user device displays a first color representing the first result when the specific non-branded pharmaceutical is available to the user, a second color representing the second result, and a third color representing the third result.

US Pat. No. 10,657,582

METHOD, USER TERMINAL, AND SERVICE TERMINAL FOR PROCESSING SERVICE DATA

Tencent Technology (Shenz...

1. A computer-implemented method performed at a service terminal having one or more processors and memory and a short-range wireless signal transceiver, wherein the service terminal is communicatively coupled to a remote server, the method comprising:repeatedly broadcasting, via the short-range wireless signal transceiver, a service message including a first user account identifier of a social networking application and service-related information;
receiving, via the short-range wireless signal transceiver, a service purchase request message from a mobile phone adjacent the service terminal, wherein the mobile phone transmits the service purchase request message via a short-range wireless signal transceiver in the mobile phone and the service purchase request message includes a second user account identifier of the social networking application and service-purchasing authorization information corresponding to the service-related information via the mobile phone;
in response to receiving the service purchase request message:
transmitting the first user account identifier of the social networking application, the second user account identifier of the social networking application, and the service-purchasing authorization information to the remote server, wherein the remote server performs an operation between the first user account identifier and the second user account identifier according to the service-purchasing authorization information;
receiving a purchase confirmation message from the remote server; and
transmitting, via the short-range wireless signal transceiver, the purchase confirmation message to the mobile phone.

US Pat. No. 10,657,581

METHODS AND SYSTEMS FOR ORDER PROCESSING

BEIJING DIDI INFINITY TEC...

1. A system for operating an online transportation platform to interact with service receivers and service providers through order processing, comprising:at least one storage medium including a set of instructions; and
logic circuits connected to the at least one storage medium, wherein during operation, the logic circuits load the set of instructions and:
obtain electronic signals from a bus, the electronic signals encoding orders from terminals of the service receivers via a network;
extract order information based on the orders;
extract service provider information;
obtain features of the service providers using one or more trained machine learning models operating in real time on the online transportation platform, wherein each of the service providers is associated with a vehicle, the features of the service providers include characteristics of responding to orders, and the characteristics of responding to orders of each of the service providers indicate characteristics of change with time of probabilities that the service provider responds to the orders;
determine a result as to whether the order information matches the features of the service providers or whether the features of the service providers satisfy a preset condition;
rank the service providers based on the result;
generate orders to be allocated; and
send out, via the network, electronic signals encoding the orders to be allocated to terminals of the service providers based on the ranking.
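
The matching and ranking steps can be pictured as scoring each provider with a time-dependent response probability, filtering by a preset condition, and sorting before dispatch. A small Python sketch with a stubbed model and invented numbers (not the platform's actual model):

from datetime import datetime

providers = [
    {"id": "D1", "base_rate": 0.7},
    {"id": "D2", "base_rate": 0.4},
    {"id": "D3", "base_rate": 0.9},
]

def response_probability(provider, hour):
    # Stand-in for the trained model: response likelihood varies with time of day.
    night_penalty = 0.5 if hour < 6 or hour > 22 else 1.0
    return provider["base_rate"] * night_penalty

def rank_providers(providers, order_time, min_probability=0.3):
    hour = order_time.hour
    scored = [(response_probability(p, hour), p["id"]) for p in providers]
    eligible = [s for s in scored if s[0] >= min_probability]   # preset condition
    return sorted(eligible, reverse=True)                       # ranking sent with the order

print(rank_providers(providers, datetime(2020, 5, 1, 23, 30)))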

US Pat. No. 10,657,580

SYSTEM FOR IMPROVING IN-STORE PICKING PERFORMANCE AND EXPERIENCE BY OPTIMIZING TOTE-FILL AND ORDER BATCHING OF ITEMS IN RETAIL STORE AND METHOD OF USING SAME

WALMART APOLLO, LLC, Ben...

1. A system comprising:a server computing device comprising one or more processors, the server computing device being configured to provide output to a plurality of first user devices of a plurality of first users, wherein the plurality of first users comprise workers at a plurality of retail stores, the server computing device being further configured to communicate with the plurality of first user devices and a plurality of second user devices and to perform:
receiving, using an item locator system of the server computing device, a plurality of orders having one or more items, wherein the plurality of orders are received from the plurality of second user devices of a plurality of customers, and wherein the plurality of orders are associated with the plurality of retail stores;
separating, by the one or more processors of the server computing device, the plurality of orders by a plurality of vehicle load numbers, wherein each vehicle load number of the plurality of the vehicle load numbers is associated with a weight and a due time;
batching, by the one or more processors of the server computing device, the plurality of orders, as separated, into different commodities, wherein the different commodities comprise different temperatures of the one or more items;
generating, by the one or more processors of the server computing device, rebatched orders according to an optimization algorithm that generates the rebatched orders based at least in part on both a distance and a first volume of each respective item of the one or more items within each of the rebatched orders of the different commodities;
sorting, by the one or more processors of the server computing device, the one or more items within the rebatched orders by sequence numbers based on at least a second volume of a container of each respective one of a number of containers to be used by the plurality of first users to retrieve the one or more items of each of the rebatched orders; and
sending instructions, by the one or more processors of the server computing device, to display on user interfaces of the plurality of first user devices information for filling the each respective one of the number of containers with the one or more items of each of the rebatched orders.
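
The batching and sequencing steps group items by commodity (for example, temperature zone) and then assign them to totes so that each tote's volume limit is respected. A simplified Python sketch with hypothetical volumes, zones, and tote size; the claimed optimization also weighs distance, which is omitted here:

from collections import defaultdict

items = [
    {"sku": "milk",   "zone": "chilled", "volume": 4},
    {"sku": "ice",    "zone": "frozen",  "volume": 6},
    {"sku": "cereal", "zone": "ambient", "volume": 5},
    {"sku": "yogurt", "zone": "chilled", "volume": 3},
    {"sku": "bread",  "zone": "ambient", "volume": 4},
]

def batch_and_sequence(items, tote_volume=8):
    by_zone = defaultdict(list)
    for item in items:                       # batch by commodity / temperature zone
        by_zone[item["zone"]].append(item)
    plan = []
    for zone, zone_items in by_zone.items():
        tote, used, tote_no = [], 0, 1
        for item in sorted(zone_items, key=lambda i: i["volume"], reverse=True):
            if used + item["volume"] > tote_volume:   # start a new tote when full
                plan.append((zone, tote_no, tote))
                tote, used, tote_no = [], 0, tote_no + 1
            tote.append(item["sku"])
            used += item["volume"]
        plan.append((zone, tote_no, tote))
    return plan

for zone, tote_no, skus in batch_and_sequence(items):
    print(zone, f"tote {tote_no}:", skus)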

US Pat. No. 10,657,579

SYSTEM AND METHOD FOR DETERMINING AND ENABLING ACCESS TO A SERVICE PROVIDER SYSTEM FOR PRINT ORDER PROCESSING

CANON KABUSHIKI KAISHA, ...

1. A system including a client terminal capable of ordering merchandise of an online store service from a service providing system of a plurality of service providing systems each configured to provide the online store service and process an order of merchandise of the online store service, and an information processing apparatus capable of communicating with the client terminal, the client terminal comprises at least one processor to execute instructions to operate as:
a creation unit to create print data, including at least one image designated by a user, for printing the at least one image based on the image data, using a print data creation application, and
an upload unit to upload the created print data to the information processing apparatus,
the information processing apparatus comprises at least one processor to execute instructions to operate as:
a first reception unit to receive the uploaded print data;
a determination unit that, when the first reception unit receives the uploaded print data, determines a service providing system, of the plurality of service providing systems, which is associated with the print data creation application used to create the print data received by the first reception unit;
a first transmission unit to transmit access information used to make the client terminal access the determined service providing system;
the at least one processor of the client terminal further executes instructions to operate as:
a second reception unit to receive the access information from the information processing apparatus; and
an order unit to access the determined service providing system using the received access information and perform an order process to order a printed product created based on the print data uploaded to the information processing apparatus,
wherein, when the order process is performed, the printed product is created based on the uploaded print data.

US Pat. No. 10,657,578

ORDER PROCESSING SYSTEMS AND METHODS

WALMART APOLLO, LLC, Ben...

1. A method being implemented via execution of computing instructions configured to run at one or more processors and stored at one or more non-transitory computer-readable media, the method comprising:receiving a request to create a single order containing a plurality of items for a plurality of people, wherein the request comprises a set period of time when the single order remains open, and each respective person of the plurality of people enters a respective order at different times from each other during the set period of time, wherein the respective order of the each respective person of the plurality of people is transmitted directly to an online marketplace while the single order remains open;
sending instructions for display on each respective user interface of each respective mobile device of the each respective person of the plurality of people, the each respective user interface comprises a simultaneous display of a first window and a second window, wherein the first window presents a single list of the plurality of items included in all of the respective orders of the plurality of people, wherein the single list includes an indication of the each respective person associated with each respective one of the plurality of items based on the respective order of the each respective person, and wherein the second window presents (i) a list of additional items that can be selected to be added to the single order, and (ii) a respective selection for the each respective person of the plurality of people participating in the single order for adding one or more of the additional items to the respective order of the each respective person, wherein the each respective person enters the respective order for one or more of the plurality of items by using the each respective user interface of the each respective mobile device associated with the each respective person of the plurality of people while the single order remains open;
persisting in a database a state of the single order during the set period of time associated with the plurality of people;
automatically closing the single order after the set period of time expires;
receiving identification of the plurality of items in the single order;
receiving identification of the each respective person of the plurality of people associated with each of the plurality of items in the single order;
requesting respective payment information associated with the each respective person of the plurality of people;
receiving respective payments for the plurality of items in the single order using the respective payment information for the each respective person of the plurality of people and respective items of the plurality of items in the single order;
identifying a physical store to deliver the single order in a single shipment;
after receiving the respective payments, transmitting order processing instructions for the single order that identifies the plurality of items in the single order associated with the plurality of people; and
after transmitting the order processing instructions, generating order delivery instructions for the single shipment of the single order based on the physical store, as identified.

US Pat. No. 10,657,577

METHOD AND SYSTEM FOR AUTOMATIC END-TO-END PREPARATION AND MANAGEMENT OF FOOD

VISHNU GURUSAMY SUNDARAM,...

1. An automatic food preparation and management system that is distributed across a plurality of geographic locations, the system comprising:a hardware processor;
a memory module, wherein the memory module comprises a plurality of digital data storage devices for storing digital data for automatic food preparation and management;
an analytics module, wherein the analytics module is stored in the memory module;
a plurality of end-point devices, wherein the plurality of end-point devices comprise kitchen appliances and food processing machines located in a plurality of geographical locations, and wherein the plurality of end-point devices comprise kitchen appliances that are connected to the cloud computing module, and wherein the plurality of end-point devices are configured to receive a plurality of instructions from the analytics module and cook food, and wherein the kitchen appliance has a plurality of processing chambers, and wherein the plurality of processing chambers are configured to process the ingredients to perform heating, boiling, cooling, spraying, baking, cooking, cleaning, dish washing operations, and move the ingredients from one chamber to the next, based on conditions, and wherein the conditions include time, input from a plurality of sensors in a sensor module, and wherein all the end-point devices are configured to work independently and in coordination with other end-point devices, and the kitchen appliance has a user interface where the user is enabled to order what he wants, and wherein the kitchen machine enables the user to browse through all the items that are to be cooked out of the available ingredients during the ordering process, and wherein the kitchen appliance has a machine configurator to decide on what is to be cooked accommodating constraints that include Quantity required, Ingredients available, Time available, and Storage required, wherein the kitchen appliance has a camera and biometric capabilities to identify a user to recognize his preferences in suggesting items;
a cloud computing module, wherein the cloud computing module connects the plurality of end-point devices, and wherein the system enables gifting recipes or items so that a mom is enabled to gift a special cake by picking a recipe (or creating it) and pushing it to the target ID of her son who lives several thousand kilometers away, and wherein the system even enables the mom to pay for the ingredients that are ordered by the machine to cook the item she forwarded, thereby enabling her to send the cooked food remotely, and wherein the cloud computing module helps in supporting models thereby enabling a restaurant to order or replenish the used ingredients by covering the cost of ordering them to the machine, which is batched/grouped in batches;
a communication module, wherein the communication module is configured to establish communication between the plurality of end-point devices;
a plurality of sensors connected to the analytics module, the cloud computing module and the communication module;
an inventory management module, wherein the inventory management module is connected to the analytics module, the cloud computing module and the communication module, and wherein the inventory management module keeps a record of a perishable time either from its internal database or from the input from the store, and wherein the inventory management module is configured to dispose of a perishable item and place an order based on a record of time; and
a recipe and menu building module, wherein the recipe and menu building module is connected to the analytics module, the cloud computing module and the communication module, and wherein the recipe and menu building module is run on the hardware processor, and wherein the recipe and menu building module is configured to receive information related to a food intake pattern of a user, an amount of nutrients taken by the user and a plurality of vital health parameters of the user through the plurality of sensors that are remotely located and connected to the cloud computing module;
a Cooking Initiation module configured to receive inputs from a Historical Information module and a Live Information module, and wherein the Historical Information module comprises information comprising Ingredient Properties, Recipe Properties, Results of Cooking Process and Effects of Storage, and wherein the Live Information module comprises information from a plurality of live processes comprising Optical Inspection, Thermal Inspection, Vapour Analysis, Texture Analysis, Conduction, Endoscopy and Chemical Reactive test of a small sample.

US Pat. No. 10,657,576

METHOD AND SYSTEM FOR MAINTAINING INTEGRITY OF A USER'S LIFE STATE INFORMATION

1. A computer-implemented method, comprising:receiving, by an information delivery system, information about and relevant to a user's life from a user who is a registered member of the information delivery system, wherein the information about and relevant to the user's life includes at least (a) demographic information, (b) ethnic information, (c) social information, and (d) psychological information;
granting permissions to a partner to access the information about and relevant to the user's life, wherein the partner is also a registered member of the information delivery system, and wherein the granted permissions control visibility of the information about and relevant to the user's life to the partner; and
receiving, by a portal application of the information delivery system, a definition of a layout of a portal associated with the user;
receiving, by an information filtering application of the information delivery system, the filtered information from the partner, wherein the filtered information is generated by the partner based on applying the information about and relevant to the user's life for information of the partner, wherein the information about and relevant to the user's life is provided by the user using a client computing system associated with the information delivery system, and wherein the information about and relevant to the user's life is received by a server computing system associated with the information delivery system;
communicating, by the portal application, the portal to the client computing system in accordance with the received layout definition, the communicated portal incorporating the filtered information received by the information filtering application within the layout of the communicated portal wherein the filtered information is a subset of information of the partner;
providing, by a marketplace application of the information delivery system, a marketplace through which partners procure the information about the user's life;
establishing, by the information delivery system, a data structure to enable the user to provide the information about and relevant to the user's life, the data structure including fields related to one another, wherein each of the fields is associated with a value, and wherein the data structure includes multiple levels such that a field at a lower level is to provide more detailed information and value than a corresponding field at a higher level; and
automatically populating, by a life state application of the information delivery system, the fields of the data structure with a set of baseline values; and
receiving, from the user by the information delivery system, updated values of the baseline values and non-baseline values to accurately reflect the user's life.

US Pat. No. 10,657,575

PROVIDING RECOMMENDATIONS BASED ON USER-GENERATED POST-PURCHASE CONTENT AND NAVIGATION PATTERNS

WALMART APOLLO, LLC, Ben...

1. A system comprising:one or more processors; and
one or more non-transitory computer-readable media storing computing instructions configured to run on the one or more processors and perform:
generating, for a first user, a weighting vector that stores a respective weight corresponding to each feature of a plurality of features, wherein the plurality of features represent purchasing criteria that are common to each item in a category of items;
in response to receiving a request from the first user to view details for a selected item, updating, in real-time after receiving the request, a graphical user interface to display to the first user the details for the selected item and recommendations for one or more other items that are different from the selected item, wherein the selected item and the one or more other items are in the category of items that is the same, wherein the plurality of features are common to each item in the category of items, wherein sentiment data comprises a respective sentiment score for each feature for each item in the category of items, and wherein the one or more other items are recommended based on the respective sentiment score of one or more first features of the plurality of features for each of the one or more other items exceeding the respective sentiment score of a respective corresponding one of the one or more first features for the selected item;
receiving a new request from the first user to view new details for a new selected item based on the first user selecting one of the one or more other items in the graphical user interface;
in response to receiving the new request for the new selected item, updating the respective weight that corresponds to each of the one or more first features for the new selected item in the weighting vector for the first user based on the new request of the new selected item; and
updating the graphical user interface, receiving the new request for the new selected item, and updating the respective weight that corresponds to each of the one or more first features are iteratively performed in an interactive navigation pattern of the graphical user interface such that, for each subsequent iteration, the new request becomes the request, the new selected item becomes the selected item, and the new details become the details.
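
The recommendation and weight-update logic can be read as: recommend items whose per-feature sentiment scores beat the selected item's scores, then bump the user's weighting vector for the features behind the next click. A compact Python sketch with invented items, features, and scores (illustrative only):

sentiment = {                     # item -> {feature: sentiment score}
    "camera_A": {"battery": 0.6, "zoom": 0.8, "weight": 0.5},
    "camera_B": {"battery": 0.9, "zoom": 0.7, "weight": 0.5},
    "camera_C": {"battery": 0.5, "zoom": 0.9, "weight": 0.7},
}
weights = {"battery": 1.0, "zoom": 1.0, "weight": 1.0}   # per-user weighting vector

def recommend(selected, sentiment):
    recs = {}
    for item, scores in sentiment.items():
        if item == selected:
            continue
        better = [f for f, s in scores.items() if s > sentiment[selected][f]]
        if better:
            recs[item] = better                 # features where this item wins
    return recs

def update_weights(weights, winning_features, bump=0.1):
    for f in winning_features:                  # reinforce features that drove the click
        weights[f] += bump
    return weights

recs = recommend("camera_A", sentiment)
print(recs)                                     # e.g. camera_B wins on battery
update_weights(weights, recs.get("camera_B", []))
print(weights)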

US Pat. No. 10,657,574

ITEM RECOMMENDATION TECHNIQUES

Adobe Inc., San Jose, CA...

1. A method, performed by a computing device, for identifying interest items for a user based on positive interest of many users in many items, the method comprising:identifying positive interest information identifying individual items in which individual users have demonstrated positive interest;
determining values of a similarity matrix quantifying similarities between items or similarities between users, wherein determining the values of the similarity matrix comprises:
determining a model of the similarity matrix that models the similarity matrix as a product of a first matrix and a second matrix;
selecting a rank parameter k based on accuracy of test predictions of user interest made using the rank parameter;
applying a rank constraint in the model to restrict dimensions of the first matrix and the second matrix, wherein applying the rank constraint includes:
applying the rank parameter k to limit the first matrix to k columns, and
applying the rank parameter k to limit the second matrix to k rows;
determining values of the first matrix and values of the second matrix based on the positive interest information; and
determining the product of the first matrix and second matrix to determine the values of the similarity matrix; and
identifying interest items for display to the user based on the values of the similarity matrix.
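
The rank constraint means the similarity matrix is never stored at full rank; it is represented as the product of an n-by-k and a k-by-n factor. The Python sketch below fits such a factorization to a tiny co-interest target by alternating least squares; the data, the choice of target, and k are hypothetical, and the claim leaves the fitting procedure open:

import numpy as np

rng = np.random.default_rng(1)
# rows = users, cols = items; 1 means the user showed positive interest
R = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 1]], dtype=float)

n_items, k = R.shape[1], 2                 # rank parameter k limits the first factor to k columns
A = rng.normal(size=(n_items, k))          # first matrix: n_items x k
B = rng.normal(size=(k, n_items))          # second matrix: k x n_items

target = R.T @ R                           # a simple co-interest similarity target
for _ in range(500):                       # alternating least squares on S ~ A @ B
    A = target @ np.linalg.pinv(B)
    B = np.linalg.pinv(A) @ target

S = A @ B                                  # rank-k similarity matrix
user = R[0]                                # recommend items similar to user 0's items
scores = user @ S
print(np.argsort(-scores))                 # items ordered by predicted interest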

US Pat. No. 10,657,573

NETWORK SITE TAG BASED DISPLAY OF IMAGES

Houzz, Inc., Palo Alto, ...

1. An apparatus comprising:a display device;
one or more inertial sensors;
a processor configured to perform operations comprising:
display, on the display device, a user interface including a network site image from a network site, the network site image depicting a plurality of physical objects;
determine a true gravity vector using the one or more inertial sensors to detect true gravity;
determine image plane gravity by generating a vector component of the true gravity vector in an image plane of the network site image and store the vector component as the image plane gravity;
display, in the user interface, an editorial tag depicted as hanging from one of the plurality of physical objects towards the image plane gravity;
generate, in the user interface, a user tag depicted as hanging from another of the plurality of physical objects towards the image plane gravity, the user tag generated by a browsing user of the network site, wherein the user tag is visually distinguishable from the editorial tag on the network site image;
detect, using the one or more inertial sensors, movement of the apparatus while the network site image is displayed on the display device;
determine that an updated vector component of the true gravity vector in the image plane is zero due to the movement causing the image plane to be perpendicular to the true gravity vector;
in response to the updated vector component being zero, set the image plane gravity from the component of the true gravity vector in the image plane to a bottom of the network site image as displayed on the display device; and
in response to the movement, animate the editorial tag and the user tag as undergoing pendulum motion with respect to image plane gravity as set to the bottom of the network site image due to the updated vector component being zero.
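
The gravity handling projects the sensed gravity vector onto the image plane, and when that projection vanishes (device held flat, plane perpendicular to gravity), falls back to the bottom of the displayed image so tags still hang somewhere sensible. A short Python sketch of the projection with illustrative axes:

import numpy as np

def image_plane_gravity(true_gravity, image_right, image_down, eps=1e-6):
    # Express the gravity component lying in the image plane in image coordinates
    # (x to the right, y toward the bottom of the displayed image).
    g2d = np.array([np.dot(true_gravity, image_right), np.dot(true_gravity, image_down)])
    if np.linalg.norm(g2d) < eps:
        return np.array([0.0, 1.0])          # fallback: straight toward the image bottom
    return g2d / np.linalg.norm(g2d)

g = np.array([0.0, 0.0, -9.81])              # true gravity from the inertial sensors

# Device upright: the image "down" axis points along world -z, so gravity maps to (0, 1).
print(image_plane_gravity(g, image_right=np.array([1.0, 0.0, 0.0]),
                          image_down=np.array([0.0, 0.0, -1.0])))
# Device flat on a table: both image axes are horizontal, the projection is ~0,
# so the function falls back to the bottom of the network site image.
print(image_plane_gravity(g, image_right=np.array([1.0, 0.0, 0.0]),
                          image_down=np.array([0.0, -1.0, 0.0])))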

US Pat. No. 10,657,572

METHOD AND SYSTEM FOR AUTOMATICALLY GENERATING A RESPONSE TO A USER QUERY

Wipro Limited, Bangalore...

1. A method for automatically generating a response to a user query, the method comprising:receiving, by a response generating system, the user query from a computing device associated with an end user;
determining, by the response generating system, whether the user query belongs to at least one domain from a plurality of predefined domains;
determining, when the user query belongs to the at least one domain, by the response generating system, goal data and a problem category of the user query, wherein determining the goal data comprises determining one or more features of one or more tokens based on Part-Of-Speech (POS) tags, wherein the one or more features comprise satisfying features that indicate steps successfully performed by the end user for resolving a problem;
detecting, by the response generating system, a problem node and one or more problem sub-nodes associated with the user query by parsing a predefined knowledge graph corresponding to a category of the at least one domain, based on the goal data and the problem category;
comparing, by the response generating system, each of the one or more problem sub-nodes with the satisfying features for semantic similarity;
removing, upon comparison, by the response generating system, the one or more problem sub-nodes that are semantically similar to the satisfying features;
providing, upon removal, by the response generating system, at least one of open-ended questions and closed-ended questions based on the one or more problem sub-nodes of the problem node to the computing device to receive a feedback for at least one of the open-ended questions and the closed-ended questions from the end user; and
displaying, by the response generating system, the response to the user query extracted from one of the one or more problem sub-nodes to the end user based on the feedback;
wherein determining the problem category comprises:
creating, by the response generating system, a vocabulary file comprising each of one or more words in the user query and an Identifier (ID) corresponding to each of one or more words in the user query;
assigning, by the response generating system, a weightage for each of the one or more words in the user query;
filtering, by the response generating system, the one or more words from the user query based on the weightage;
generating, by the response generating system, one or more feature vectors by assigning a feature vector weightage to each of the one or more filtered words based on one or more parameters; and
comparing, by the response generating system, each of the one or more feature vectors with each of the one or more predefined feature vectors related to the at least one domain to determine the problem category.
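
The problem-category determination builds a vocabulary with word identifiers, weights and filters the query words, forms a feature vector, and compares it with predefined category vectors. A minimal Python sketch using an invented stop-word list, unit weights, and cosine similarity as the comparison (the claim does not fix the similarity measure):

import math

STOPWORDS = {"my", "the", "is", "not"}
CATEGORY_VECTORS = {                     # predefined feature vectors per category (hypothetical)
    "connectivity": {"wifi": 1.0, "router": 0.8, "internet": 0.9},
    "billing":      {"invoice": 1.0, "charge": 0.9, "refund": 0.7},
}

def feature_vector(query):
    words = query.lower().split()
    vocabulary = {w: i for i, w in enumerate(dict.fromkeys(words))}   # word -> ID, as in the vocabulary file
    weights = {w: (0.0 if w in STOPWORDS else 1.0) for w in vocabulary}
    return {w: weight for w, weight in weights.items() if weight > 0}  # filtered, weighted words

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in keys)
    norm = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

query_vec = feature_vector("my wifi router is not connecting to the internet")
category = max(CATEGORY_VECTORS, key=lambda c: cosine(query_vec, CATEGORY_VECTORS[c]))
print(category)    # -> "connectivity"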

US Pat. No. 10,657,571

METHOD AND APPARATUS FOR FACILITATING COMPREHENSION OF USER QUERIES DURING INTERACTIONS

IntelliResponse Systems I...

1. A computer-implemented method, comprising:receiving a query input provided by a user, the query input comprising one or more query terms;
comparing, by a processor, the one or more query terms with respective content elements in a plurality of content records stored in a database for at least a partial match in relevancy, wherein each content record from among the plurality of content records is associated with a respective comparison result subsequent to the comparison;
generating, by the processor, a score for the each content record based on the respective comparison result and a measure of comprehension between the query input and the plurality of content records, wherein the measure of comprehension is determined using respective comparison results for the plurality of content records, and wherein generation of the score for the each content record configures a set of scores;
determining, by the processor, a confidence state corresponding to the set of scores based on the score for the each content record in the set of scores; and
providing a reply to the user in response to the query input based on the confidence state, wherein providing the reply to the user comprises:
generating an instruction based on the determined confidence state; and
selecting a text snippet from among a plurality of text snippets stored in the database based on the generated instruction, wherein the text snippet configures, at least in part, the reply provided to the user, said snippet comprising either a confidence level corresponding to the confidence state, said confidence state associated with the set of scores indicating relative to a pre-set threshold whether the search results address the query, or said text snippet comprising information regarding further user action to be taken in connection with the query.
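
The reply assembly can be pictured as reducing the per-record scores to a confidence state against a preset threshold and then picking a canned text snippet for that state. A small Python sketch with invented scores, thresholds, and snippets (illustrative only):

SNIPPETS = {
    "HIGH":   "Here is the answer that best matches your question:",
    "MEDIUM": "These results may address your question:",
    "LOW":    "We could not confidently match your question; try rephrasing it.",
}

def confidence_state(scores, high=0.75, low=0.4):
    best = max(scores) if scores else 0.0
    if best >= high:
        return "HIGH"
    if best >= low:
        return "MEDIUM"
    return "LOW"

scores = [0.82, 0.41, 0.15]          # one score per compared content record
state = confidence_state(scores)
print(SNIPPETS[state])               # snippet that configures the reply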