US Pat. No. 10,339,821

PLATFORM AND KNOWLEDGE-BASED INSTRUCTION ENGINE IMPLEMENTED SKILL-IMPROVEMENT SYSTEM THROUGH CLOUD COMPUTING

1. A method for generating feedback to a human practitioner practicing a skill, comprising:
providing a local platform for acquiring physical parameter data pertaining to motion and position of said human practitioner and motion and position of a golf club and a golf ball struck by said golf club during a golf swing by said human practitioner, wherein providing the local platform comprises:
providing a laser grid, said laser grid acquiring said physical parameter data of said motion and position of said golf club and said golf ball struck by said golf club during said golf swing; and
providing a video camera to acquire said physical parameter data associated with said motion and position of said human practitioner during said golf swing, wherein said physical parameter data associated with said motion and position of said human practitioner is processed to identify body parts of said human practitioner during said golf swing and to divide said physical parameter data associated with said motion and position of said human practitioner during said golf swing into a plurality of individual video frames over a predetermined timeframe, wherein each individual video frame is divided into a plurality of subframes, each subframe of the plurality of subframes focused on a predetermined body part, wherein the plurality of video frames are processed to remove background imagery;
transmitting via the internet at least a portion of said physical parameter data of said motion and position of said golf club and said golf ball struck by said golf club during said golf swing and said physical parameter data associated with said motion and position of said human practitioner during said golf swing from said local platform to a cloud-based analysis engine, said cloud-based analysis engine being implemented via a cloud-based paradigm and located remotely relative to said local platform, said cloud-based analysis engine comparing said physical parameter data of said motion and position of said golf club and said golf ball struck by said golf club during said golf swing and said physical parameter data associated with said motion and position of said human practitioner during said golf swing to predefined parameters and generating feedback;
receiving via the internet said feedback from said cloud-based analysis engine; and
providing said feedback to said human practitioner using at least one of an audio device operatively coupled with said local platform and a visual display operatively coupled with said local platform, wherein said visual display shows and compares at least one corresponding sub-frame from said video frames to a base model showing proper position of the predetermined body part during said golf swing.
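The sub-frame step of this claim can be sketched as follows. This is a minimal illustration, not the patented implementation; the per-body-part bounding boxes and the list-of-lists frame representation are assumptions made for the example.

```python
def split_into_subframes(frame, body_part_boxes):
    """Crop one sub-frame per predetermined body part from a video frame.

    frame: 2D list of pixel values (rows of columns).
    body_part_boxes: dict mapping body-part name -> (top, left, bottom, right).
    Returns a dict mapping body-part name -> cropped 2D list.
    """
    subframes = {}
    for part, (top, left, bottom, right) in body_part_boxes.items():
        subframes[part] = [row[left:right] for row in frame[top:bottom]]
    return subframes

# Example: a 6x6 frame with two hypothetical body-part regions.
frame = [[r * 10 + c for c in range(6)] for r in range(6)]
boxes = {"head": (0, 2, 2, 4), "hands": (3, 0, 6, 3)}
subs = split_into_subframes(frame, boxes)
print(len(subs["head"]), len(subs["head"][0]))   # 2 2
print(len(subs["hands"]), len(subs["hands"][0])) # 3 3
```

Each cropped sub-frame could then be compared against the corresponding region of a base-model frame, as the claim's display step describes.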

US Pat. No. 10,339,819

UNMANNED AERIAL VEHICLE BEACON POD

Amazon Technologies, Inc....

1. A method of operation of a beacon pod for unmanned aerial vehicle (UAV) navigation, the method comprising:
determining, by a beacon pod, its own location by using at least one of GPS signals, communications with other beacon pods, or signals from mobile telephone network antennas;
storing, at the beacon pod and by the beacon pod, the location of the beacon pod;
establishing, by the beacon pod, a secure communication between a UAV and the beacon pod;
receiving, by the beacon pod and via the secure communication, a request from the UAV for the location of the beacon pod; and
transmitting, by the beacon pod, via the secure communication, one or more of a synthetic Global Positioning System (GPS) signal indicating the location of the beacon pod, or a homing signal to enable the UAV to determine the location of the beacon pod.
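The pod's request/response behavior can be sketched as a small handler. This is an illustrative sketch only; the message names, fields, and the assumption that the secure channel is already established are not from the patent.

```python
from dataclasses import dataclass

@dataclass
class BeaconPod:
    """A beacon pod that has determined and stored its own location."""
    latitude: float
    longitude: float

    def handle_request(self, request_type: str) -> dict:
        """Answer a UAV location request received over the secure channel."""
        if request_type == "synthetic_gps":
            # Synthetic GPS payload indicating the pod's stored location.
            return {"type": "synthetic_gps",
                    "lat": self.latitude, "lon": self.longitude}
        if request_type == "homing":
            # Homing signal: the UAV derives the pod's location itself.
            return {"type": "homing"}
        raise ValueError("unknown request type")

pod = BeaconPod(latitude=47.6, longitude=-122.3)
reply = pod.handle_request("synthetic_gps")
print(reply["lat"], reply["lon"])  # 47.6 -122.3
```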

US Pat. No. 10,339,817

FLIGHT MANAGEMENT SYSTEM AND FLIGHT PLAN ALERT INTEGRATION SYSTEMS AND METHODS

ROCKWELL COLLINS, INC., ...

1. A flight management system, comprising:
a display configured to display a user interface; and
at least one processor with a non-transitory processor-readable medium storing processor-executable code for causing the at least one processor to:
receive flight plan information regarding a flight plan of an aircraft;
receive aircraft status information from an aircraft monitoring system;
generate an alert indicating a deviation from the flight plan based on a comparison of the aircraft status information to the flight plan information; and
provide the alert on a portion of the user interface by emphasizing an interactive element displayed on the user interface, the interactive element displaying information related to the aircraft status information,
the interactive element configured to be selected by a pilot to at least one of view additional information regarding the alert and modify the flight plan of the aircraft to change the aircraft status information.
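The alert-generation step (compare status to plan, flag a deviation) can be sketched as follows. The fields and tolerances here are hypothetical examples, not values from the patent.

```python
def check_deviation(plan, status, tolerances):
    """Return a list of alert strings for fields outside their tolerance.

    plan, status: dicts of planned vs. actual values per field.
    tolerances: dict of allowed absolute deviation per field.
    """
    alerts = []
    for field, tol in tolerances.items():
        if abs(status[field] - plan[field]) > tol:
            alerts.append(f"deviation in {field}: "
                          f"planned {plan[field]}, actual {status[field]}")
    return alerts

plan = {"altitude_ft": 35000, "heading_deg": 90}
status = {"altitude_ft": 34200, "heading_deg": 91}
tolerances = {"altitude_ft": 500, "heading_deg": 5}
alerts = check_deviation(plan, status, tolerances)
print(alerts)  # one alert: altitude is 800 ft off against a 500 ft tolerance
```

In the claimed system, each alert would then be surfaced by emphasizing the interactive element tied to the deviating status value.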

US Pat. No. 10,339,816

AUTOMATIC AIRCRAFT MONITORING AND OPERATOR PREFERRED REROUTING SYSTEM AND METHOD

THE BOEING COMPANY, Chic...

1. An automatic monitoring and proposed in-flight rerouting system for an airborne aircraft traveling to a destination via a current route, comprising:
at least one computer processor; and
at least one memory storing a plurality of components of an application, the plurality of components executable by the at least one computer processor and comprising:
a route optimization function executable to: (i) receive updated information selected from updated airline information, updated aircraft information, updated airspace information, and updated traffic information; and (ii) responsive to receiving the updated information, automatically and proactively compute at least a first in-flight reroute to the destination for the airborne aircraft, by at least in part communicating with an operational control system specific to an airline associated with the airborne aircraft in order to consider reservations, airframe usage and movement, crew movement, and high-value passenger connection data;
a conflict detection function executable to automatically check the first in-flight reroute against traffic trajectories of other aircraft and airspace constraints for conflicts; and
a conflict resolution function executable to, upon detection of one or more conflicts in the first in-flight reroute by the conflict detection function, automatically and proactively compute a second in-flight reroute in accordance with preferences selected from airline preferences, flight crew preferences, and air navigation service provider preferences, in order to resolve the detected one or more conflicts in the first in-flight reroute, wherein the second in-flight reroute is selected from a cost optimal reroute, a fuel optimal reroute, a time optimal reroute, an environmentally beneficial reroute, an airspace constrained reroute, and an airport constrained reroute, wherein the second in-flight reroute is characterized by a resource usage improvement relative to the current route, wherein the resource usage improvement is selected from cost saved, fuel saved, time saved, environmental impact, airspace impact, and airport impact;
wherein upon no conflict being detected in the second in-flight reroute by the conflict detection function, and further upon receipt of clearance from air traffic control to reroute the airborne aircraft based on the second in-flight reroute, the airborne aircraft is rerouted to the destination based on the second in-flight reroute, wherein the rerouted aircraft arrives at the destination after traveling according to the second in-flight reroute, which causes the resource usage improvement to be attained.

US Pat. No. 10,339,815

APPARATUS AND METHOD FOR CONTROLLING LAMP OF PLATOONING VEHICLE

Hyundai Motor Company, S...

1. An apparatus for controlling lamps of platooning vehicles including a leading vehicle and a plurality of following vehicles, the apparatus comprising:
a memory configured to store information related to the following vehicles; and
a controller configured to collectively control lamps of the leading vehicle and following vehicles in response to events occurring in the leading vehicle and following vehicles.

US Pat. No. 10,339,813

DISPLAY CONTROL SYSTEMS AND METHODS FOR A VEHICLE

GM GLOBAL TECHNOLOGY OPER...

11. A display control method for a vehicle, comprising:
generating a surround view based on:
(i) first images of an outward front view in front of the vehicle captured using a first camera;
(ii) second images of an outward right side view to the right of the vehicle captured using a second camera;
(iii) third images of an outward left side view to the left of the vehicle captured using a third camera; and
(iv) fourth images of an outward rear view behind the vehicle captured using a fourth camera,
wherein the surround view includes a predetermined top view image of the vehicle and one or more features located within a predetermined area around the vehicle captured in the first, second, third, and fourth images;
selectively generating a parking signal when, for at least a predetermined period, a vehicle speed is less than a predetermined speed and:
(i) an accelerator pedal position is less than a predetermined position;
(ii) a brake pedal position indicates a request for vehicle braking; and
(iii) a magnitude of a steering wheel angle is greater than a predetermined steering wheel angle; and
in response to the generation of the parking signal, displaying the surround view on a display within the vehicle.
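The parking-signal condition (speed below a threshold, plus conditions (i)-(iii), sustained for a predetermined period) can be sketched as follows. The thresholds and the sample-based treatment of "for at least a predetermined period" are assumptions for illustration, not values from the patent.

```python
def parking_signal(samples, speed_max=5.0, accel_max=0.05,
                   steer_min=180.0, min_samples=3):
    """Return True when the last `min_samples` samples all satisfy:
    speed below speed_max, accelerator position below accel_max,
    brake requested, and |steering angle| above steer_min.

    samples: list of dicts with keys speed, accel_pos, brake_requested,
    steer_deg (newest last).
    """
    if len(samples) < min_samples:
        return False
    return all(
        s["speed"] < speed_max
        and s["accel_pos"] < accel_max
        and s["brake_requested"]
        and abs(s["steer_deg"]) > steer_min
        for s in samples[-min_samples:]
    )

# Three consecutive samples of a slow, braking, hard-steering vehicle.
samples = [{"speed": 2.0, "accel_pos": 0.0,
            "brake_requested": True, "steer_deg": 270.0}] * 3
print(parking_signal(samples))  # True
```

When the function returns True, the claimed method would display the surround view on the in-vehicle display.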

US Pat. No. 10,339,812

SURROUNDING VIEW CAMERA BLOCKAGE DETECTION

DENSO International Ameri...

1. A camera monitoring system for a vehicle, the system comprising:
a plurality of cameras disposed around an exterior of a vehicle;
an image processing module communicating with the plurality of cameras and generating overhead view images from raw images taken by at least one of the plurality of cameras;
a histogram module communicating with the image processing module and generating at least one histogram from the overhead view images;
a likelihood module communicating with the histogram module, determining a blackout ratio and a whiteout ratio for the at least one histogram, and determining a likelihood of blockage for at least one of the plurality of cameras, the likelihood module summing the blackout ratio and the whiteout ratio to create a Blackout+Whiteout ratio and determining whether the Blackout+Whiteout ratio is greater than a first predetermined threshold, indicating a possible blockage of at least one of the plurality of cameras;
a line alignment module communicating with the likelihood module and the image processing module and detecting feature points in the overhead view image from a selected camera of the plurality of cameras and detecting the same feature points in the overhead image from an adjacent camera, determining trajectories for the feature points, and determining whether a trajectory of detected feature points in the overhead view image from the selected camera aligns with the trajectory of detected feature points in the overhead view image from the adjacent camera of the plurality of cameras; and
a reporting module communicating with the line alignment module and reporting a camera blockage status to at least one vehicle system or controller.
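The likelihood module's Blackout+Whiteout test can be sketched directly from the claim: count near-black and near-white pixels, sum the two ratios, and compare against a threshold. The bin boundaries and the threshold value are hypothetical.

```python
def blockage_likely(pixels, black_max=16, white_min=240, threshold=0.6):
    """Compute blackout and whiteout ratios from 8-bit pixel intensities
    of an overhead-view image and test their sum against a threshold.

    Returns (possible_blockage, blackout_plus_whiteout_ratio).
    """
    pixels = list(pixels)
    total = len(pixels)
    blackout = sum(1 for p in pixels if p <= black_max) / total
    whiteout = sum(1 for p in pixels if p >= white_min) / total
    ratio = blackout + whiteout
    return ratio > threshold, ratio

# A mostly-dark image: 80% near-black pixels suggests a covered lens.
pixels = [0] * 80 + [128] * 20
likely, ratio = blockage_likely(pixels)
print(likely, ratio)  # True 0.8
```

A flagged camera would then go to the line-alignment check, which compares feature-point trajectories against an adjacent camera before any blockage is reported.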

US Pat. No. 10,339,811

IMAGE PROCESSING APPARATUS, ELECTRONIC APPARATUS, AND IMAGE PROCESSING METHOD

FUJITSU TEN LIMITED, Kob...

1. An image processing apparatus that processes an image to be displayed on a display apparatus mounted on a vehicle, the image processing apparatus comprising:
a receiving section that receives an instruction signal;
a synthetic image generating section that generates a plurality of first synthetic images based on a plurality of images of a periphery of the vehicle captured by a plurality of cameras which are provided so as to be capable of capturing a 360° view of the periphery of the vehicle, the first synthetic images being viewed towards the vehicle from a plurality of virtual viewpoints disposed around the vehicle; and
an output section that sequentially outputs the first synthetic images to the display apparatus so that the virtual viewpoint of the first synthetic images moves in a circle around the vehicle at an angle that is less than 360° while being directed towards the vehicle in response to the instruction signal.
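The output section's viewpoint sweep (virtual viewpoints moving around the vehicle through less than 360 degrees, each directed toward the vehicle) can be sketched geometrically. The radius, arc, and step values are illustrative assumptions.

```python
import math

def viewpoint_sweep(radius=5.0, arc_deg=270.0, step_deg=45.0):
    """Return (x, y, heading_deg) viewpoints on an arc around the vehicle
    at the origin, each heading looking back toward the vehicle."""
    points = []
    angle = 0.0
    while angle < arc_deg:
        x = radius * math.cos(math.radians(angle))
        y = radius * math.sin(math.radians(angle))
        # Heading that looks from (x, y) back toward the vehicle at (0, 0).
        heading = math.degrees(math.atan2(-y, -x)) % 360.0
        points.append((round(x, 3), round(y, 3), round(heading, 1)))
        angle += step_deg
    return points

pts = viewpoint_sweep()
print(len(pts))  # 6 viewpoints over a 270-degree arc at 45-degree steps
print(pts[0])    # (5.0, 0.0, 180.0): first viewpoint faces the vehicle
```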

US Pat. No. 10,339,810

MANAGEMENT OF MOBILE OBJECTS

INTERNATIONAL BUSINESS MA...

1. A computer-implemented method comprising:
identifying, by an event agent (EA) of a plurality of EAs, an event occurring in a geographic space in which a plurality of mobile objects move, wherein the event agent handles all events occurring in a region of a plurality of regions of the geographic space, wherein identifying, by the event agent (EA), the event occurring in the geographic space in which the plurality of mobile objects move includes: generating the event based on information comprising information from outside a system containing the EA and detection information that is detected by at least one mobile object among the plurality of mobile objects, wherein identifying, by the event agent (EA), the event occurring in the geographic space in which the plurality of mobile objects move further includes determining an expected event to be handled by at least one of a plurality of predictive environment agents (PEAs), wherein each of the plurality of PEAs is assigned a region of the geographic space, and, upon determining that the expected event is to be handled by at least one of the plurality of PEAs positioned in an adjacent region to the region of the EA, transmitting the expected event to an adjacent PEA assigned to the adjacent region;
determining the event is the expected event based on predicting time-series changes of the event handled by the EA, wherein determining the event is the expected event includes determining a plurality of the events correspond respectively to a plurality of future timings, wherein determining the event is the expected event includes updating the plurality of PEAs in response to reaching an earliest future timing among the plurality of future timings, according to passage of time, wherein updating the plurality of PEAs includes shifting the plurality of PEAs to be earlier in time series order, and reassigning the plurality of PEAs to the plurality of future timings updated with a future timing following the earliest future timing as an origin, wherein the EA and the plurality of PEAs are assigned for every same region in the geographic space; and
determining a derived expected event that is predicted to derive from the generated expected event;
transmitting, in response to generation of one expected event, the one expected event, generated based on determining the expected event to be handled by each of at least one of the plurality of PEAs, to the adjacent PEA via an adjacent EA assigned to the adjacent region;
deleting, from the PEA, an expected event that was completed; and
managing, by the predictive environment agent (PEA), the expected event.

US Pat. No. 10,339,809

SAFETY CONFIRMATION ASSIST DEVICE

Yazaki Corporation, Mina...

1. A safety confirmation assist device comprising:
an optical displaying device which presents visible information, at a position, visually recognizable to a driver of a vehicle;
an object detector which detects an object approaching the vehicle from a front left side or a front right side;
a control unit which controls the optical displaying device, if the object detector detects the object approaching the vehicle from the front left side or the front right side, to present the visible information moving in the same direction as a left-right component of a direction in which the object approaches the vehicle; and
a line-of-sight detector which detects a line of sight of the driver, wherein:
the control unit controls the optical displaying device so that a movement range of the visible information is located on only a left side or a right side, which is the same side of the vehicle as the object approaching the vehicle from the front left side or the front right side is located, of a boundary that is a plane perpendicular to a road surface, wherein the boundary defines the left side and the right side using, as a reference, the line of sight of the driver detected by the line-of-sight detector, and
the control unit controls the optical displaying device so that distances between the boundary and a starting end where movement of the visible information is started, and a finishing end where the movement is finished, are kept constant, and
wherein when a line-of-sight direction of the driver is changed from a driving direction, the control unit moves the movement range of the visible information according to a movement amount of the line-of-sight direction in such a manner that the movement range does not cross the boundary.

US Pat. No. 10,339,808

PREDICTING PARKING VACANCIES BASED ON ACTIVITY CODES

HERE Global B.V., Eindho...

1. A method for predicting occupancy of a parking area, the method comprising:
receiving, by a processor, a request for data relating to occupancy of the parking area;
identifying, by the processor, one or more first entities within a first predefined distance of the parking area;
receiving, by the processor, at least one activity classification code for each of the one or more first entities, wherein the at least one activity classification code describes a type of economic activity by each of the one or more first entities;
calculating, by the processor, a first predicted occupancy for the parking area as a function of the activity classification codes without sensor data or real time observation data; and
transmitting, by the processor, the first predicted occupancy for the parking area.
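The prediction step (occupancy as a function of activity classification codes alone, with no sensor or real-time data) can be sketched as a weighted lookup. The per-code, per-hour weights below are hypothetical, loosely imitating economic-activity classes.

```python
def predict_occupancy(activity_codes, hour, weights):
    """Average the per-code occupancy weights for the given hour, capped
    at 1.0. Codes with no weight for the hour fall back to 0.1.

    activity_codes: codes of entities near the parking area.
    weights: dict code -> {hour -> occupancy weight in [0, 1]}.
    """
    if not activity_codes:
        return 0.0
    score = sum(weights.get(code, {}).get(hour, 0.1) for code in activity_codes)
    return min(1.0, score / len(activity_codes))

# Hypothetical weights: restaurants fill the lot at 19:00, offices at 10:00.
weights = {
    "restaurant": {10: 0.2, 19: 0.9},
    "office":     {10: 0.8, 19: 0.1},
}
print(predict_occupancy(["restaurant", "office"], 19, weights))  # 0.5
```

The claimed method would transmit this predicted occupancy in response to the original request.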

US Pat. No. 10,339,807

APPARATUS USING SYNC AND BALANCED V2V COMMUNICATION

Ford Global Technologies,...

1. A vehicle comprising:
a wireless interface to communicate with an offsite wireless interface; and
a controller to:
identify a location of the vehicle;
determine when the location is within one of multiple concentric intersection zones extending outwardly from an intersection area;
responsive to determining a traffic signal within the intersection area is malfunctioning, generate a traffic optimization plan based on the location being within the one of the zones; and
communicate the traffic optimization plan.
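The zone-determination step (which of multiple concentric zones around the intersection contains the vehicle) can be sketched as a distance test. The zone radii are hypothetical values for illustration.

```python
import math

def zone_of(location, intersection, radii=(30.0, 60.0, 100.0)):
    """Return the index of the innermost concentric zone containing
    `location`, or None if the vehicle is outside all zones.

    location, intersection: (x, y) positions in meters.
    radii: ascending radii of the concentric zones, in meters.
    """
    dx = location[0] - intersection[0]
    dy = location[1] - intersection[1]
    distance = math.hypot(dx, dy)
    for i, r in enumerate(radii):
        if distance <= r:
            return i
    return None

print(zone_of((40.0, 30.0), (0.0, 0.0)))   # distance 50 m -> zone 1
print(zone_of((200.0, 0.0), (0.0, 0.0)))   # outside all zones -> None
```

In the claim, the generated traffic optimization plan depends on which zone the vehicle occupies when the traffic signal is malfunctioning.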

US Pat. No. 10,339,806

TRAFFIC CONTROL SERVER AND SYSTEM

Hitachi Construction Mach...

1. A traffic control server that performs traffic control of a first vehicle, which is traveling in an approach lane, and a second vehicle, which is traveling in a return lane, the second vehicle being different in attribute from the first vehicle, and the approach lane and the return lane being provided in parallel to each other in a mine site, comprising:
a central processing unit (CPU);
a storage device; and
a communication device,
wherein the storage device and the communication device are connected to the CPU,
wherein the storage device stores map information of a travel route and zone information that indicates travel permitted zones given to the respective vehicles,
wherein the zone information includes information indicating a first travel permitted zone as a partial zone, where the first vehicle is allowed to travel, in the approach lane, a travel permission restricted zone that is set in the return lane in association with the first travel permitted zone, where the second vehicle is restricted to travel, and a second travel permitted zone as another partial zone, where the second vehicle is allowed to travel, in the return lane, respectively, and
wherein the storage device stores one or more programs that when executed by the CPU configure the CPU to:
upon receiving a new zone setting request from at least one of the first vehicle and the second vehicle via the communication device, execute a new setting and a cancellation of at least one of the first travel permitted zone, the travel permission restricted zone and the second travel permitted zone, based on the zone information, and store the new setting and the at least one cancellation in the storage device as new zone information,
upon setting the first travel permitted zone, set a partial zone parallel to the first travel permitted zone in the return lane as the travel permission restricted zone,
after a new first travel permitted zone is set in the approach lane based on the travel of the first vehicle:
set a new travel permission restricted zone, the new travel permission restricted zone being set in association with the new first travel permitted zone,
upon determining a distance from a current position of the first vehicle to a rear end of the first travel permitted zone that the first vehicle has already passed is a predetermined travel permission canceling distance or more, cancel the first travel permitted zone, through which the first vehicle has already passed, and
cancel the travel permission restricted zone in association with the first travel permitted zone that the first vehicle has already passed,
upon receiving a request for setting a new second travel permitted zone from the second vehicle:
upon determining that the travel permission restricted zone is set to a zone in the return lane that the second vehicle requests to set the new second travel permitted zone, restrict the setting of the new second travel permitted zone in the zone of the return lane, and
upon determining that the travel permission restricted zone that is set to the zone of the return lane that the second vehicle requests to set the new second travel permitted zone is cancelled, set the new second travel permitted zone in the return lane, and
instruct the communication device to transmit setting information including the new zone information indicating each zone newly set to the first vehicle and the second vehicle.

US Pat. No. 10,339,805

TRAFFIC LIGHT RECOGNITION DEVICE AND TRAFFIC LIGHT RECOGNITION METHOD

Nissan Motor Co., Ltd., ...

1. A traffic light recognition device, comprising:
a camera mounted on a vehicle and configured to capture an image around the vehicle;
a processor; and
a memory coupled to the processor, the memory storing instructions which, when executed by the processor:
acquire map information around the vehicle;
detect a current position on a map of the vehicle;
estimate a position on an image of a traffic light on the basis of the current position and the map information;
set an imaging direction of the camera on the basis of the position on the image of the traffic light and of a moving direction in the future on the image of the traffic light;
change the imaging direction of the camera to the set imaging direction;
recognize the traffic light from an image captured in the imaging direction by the camera;
set a change point or region at which the imaging direction of the camera is changed, on the basis of the position on the image of the traffic light and of the moving direction in the future on the image of the traffic light; and
change the imaging direction when the vehicle reaches the change point or region.

US Pat. No. 10,339,804

SIGN TO VEHICLE IDENTIFICATION SYSTEM

3M Innovative Properties ...

1. A system comprising:
a sign comprising sign communication information;
a marker positioned at a predetermined location relative to the sign, wherein the marker comprises retroreflective material defining marker communication information; and
a vehicle information system comprising a light source configured to emit light to the marker, a reader configured to receive marker communication information, and a processor configured to:
generate output information based on the received marker communication information and positional information related to at least one of a position of the autonomous vehicle or the predetermined location of the marker;
send, to an internet-based information storage system, the generated output information that is based on the marker communication information and the positional information;
receive additional information relating to the generated output information; and
in response to determining that the received marker communication information is relevant to the autonomous vehicle, control a function of the autonomous vehicle based at least in part on the additional information received from the internet-based information storage system that relates to the generated output information,
wherein the received marker communication information comprises retroreflected light from the light source that is retroreflected by the retroreflective material.

US Pat. No. 10,339,803

METHOD FOR OPERATING AN ASSISTANCE SYSTEM OF A MOTOR VEHICLE AND ASSISTANCE SYSTEM

Conti Temic microelectron...

1. A method for operating an assistance system of a motor vehicle comprising:
detecting an image with a camera of the assistance system;
determining a traffic sign within the detected image;
determining an alignment of the determined traffic sign with respect to the motor vehicle, wherein the alignment indicates an angle formed between a straight line parallel to a longitudinal direction of the motor vehicle and a plane parallel to the traffic sign, and
triggering a signal device of the assistance system as a function of the alignment, wherein the signal device is triggered if the alignment of the traffic sign with respect to the motor vehicle is within a determined angular range;
wherein the alignment of the determined traffic sign is determined by dividing an image of the traffic sign within the detected image into a number of areas and assigning a particular value to each area depending on at least one of a brightness and a color of the area,
wherein a gradient line extends between two of said areas that differ from each other by a determined value and that are above a predetermined threshold, the gradient line representing an edge and thus an angle of the traffic sign relative to the straight line parallel to the longitudinal direction of the motor vehicle.
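The area-and-gradient approach of the claim can be sketched as follows: divide the sign image into a grid of areas, assign each a brightness value, locate the edge where adjacent values jump beyond a threshold, and take the angle of the line through those edge points. The grid size, threshold, and line-fitting shortcut (first and last edge point) are simplifying assumptions.

```python
import math

def sign_edge_angle(area_values, threshold=100):
    """area_values: 2D grid (rows x cols) of per-area brightness values.
    In each row, find the first horizontal jump exceeding `threshold`;
    return the angle (degrees from vertical) of the line through the first
    and last such edge points, or None if fewer than two rows have one."""
    points = []
    for r, row in enumerate(area_values):
        for c in range(len(row) - 1):
            if abs(row[c] - row[c + 1]) > threshold:
                points.append((r, c + 0.5))  # edge lies between the two areas
                break
    if len(points) < 2:
        return None
    (r0, c0), (r1, c1) = points[0], points[-1]
    # Slope of edge column versus row index gives the edge's inclination.
    return math.degrees(math.atan2(c1 - c0, r1 - r0))

# A bright sign face on a dark background, edge shifting one area per row:
grid = [
    [200, 10, 10, 10],
    [200, 200, 10, 10],
    [200, 200, 200, 10],
]
print(sign_edge_angle(grid))  # 45.0: the edge is inclined 45 deg from vertical
```

In the claimed method, the resulting angle is compared against a determined angular range to decide whether to trigger the signal device.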

US Pat. No. 10,339,802

METHOD AND SYSTEM FOR MANAGING A PARKING LOT

Wistron Corp., New Taipe...

1. A method for managing a parking lot, comprising:
capturing, by a first video camera, an entrance image of a vehicle;
determining whether the entrance image satisfies a first condition based on information stored in a database;
raising a first barrier in response to the entrance image satisfying the first condition;
detecting whether there is only the one vehicle between the first barrier and a second barrier; and
raising the second barrier to enable the vehicle to enter the parking lot in response to detecting that there is only the one vehicle between the first barrier and the second barrier;
wherein the step of determining whether the entrance image satisfies the first condition further comprises:
determining whether vehicle information of the vehicle in the entrance image matches the information stored in the database; and
raising the first barrier in response to the vehicle information matching the information stored in the database;
wherein the method further comprises:
detecting whether the vehicle enters within a first distance from a parking space;
raising a parking-space barrier corresponding to the parking space in response to detecting that the vehicle enters within the first distance from the parking space;
detecting whether a wheel blocking structure corresponding to the parking space is triggered to limit the vehicle in the parking space;
detecting whether a person has left the parking space in response to the wheel blocking structure corresponding to the parking space being triggered; and
lowering the parking-space barrier in response to detecting that the person has left the parking space.

US Pat. No. 10,339,800

METHOD AND DEVICE FOR PROCESSING TRAFFIC ROAD INFORMATION

HANGZHOU HIKVISION DIGITA...

1. A method for processing traffic road information, comprising:
obtaining acquired traffic parameters of a first target road section and/or the reliability of the traffic parameters within a first preset period, wherein the traffic parameters at least comprise any one or more of the following parameters: a vehicle time occupancy rate, flow saturation of vehicle flow, and a vehicle speed;
selecting a first fuzzy rule matrix table from a pre-stored set of fuzzy rule matrix tables based on the number of the traffic parameters of the first target road section and/or the reliability of the traffic parameters, wherein the fuzzy rule matrix tables comprise any one of the following types of matrix tables: a one-dimensional fuzzy rule matrix table, a two-dimensional fuzzy rule matrix table, and a three-dimensional fuzzy rule matrix table;
determining a membership degree for each type of traffic conditions contained in the first fuzzy rule matrix table by calling a membership function, wherein the traffic conditions at least comprise the following types: Unblocked, Slow and Congested; and
comparing the membership degrees of all types of traffic conditions contained in the first fuzzy rule matrix table to determine a real-time traffic condition for the first target road section within the first preset period.
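The membership-and-compare steps can be sketched with standard fuzzy-logic machinery: evaluate a membership function per traffic condition and pick the condition with the highest degree. The triangular membership functions over vehicle speed below are hypothetical, standing in for the patent's fuzzy rule matrix lookup.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b over the support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def classify_speed(speed_kmh):
    """Return (condition, memberships) for a vehicle speed, using
    illustrative triangular functions for Congested/Slow/Unblocked."""
    memberships = {
        "Congested": triangular(speed_kmh, -1, 0, 25),
        "Slow":      triangular(speed_kmh, 10, 30, 50),
        "Unblocked": triangular(speed_kmh, 40, 70, 200),
    }
    return max(memberships, key=memberships.get), memberships

label, m = classify_speed(20)
print(label)  # Slow: membership 0.5 beats Congested at 0.2 and Unblocked at 0
```

With multiple reliable parameters, the claimed method would instead index a two- or three-dimensional fuzzy rule matrix table before the same compare-and-select step.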

US Pat. No. 10,339,798

INFRARED REMOTE CONTROL APPARATUS AND TERMINAL

Huawei Technologies Co., ...

1. An infrared remote control apparatus, comprising an audio codec chip, a transfer switch, and an infrared transmitter, wherein
the audio codec chip comprises a pair of differential output pins and a pair of differential input pins, and the infrared transmitter is connected to the differential output pins and the differential input pins by using the transfer switch; wherein
the transfer switch is configured to set up a connection between the infrared transmitter and the differential output pins;
the infrared transmitter is configured to obtain an infrared learning signal; and
the audio codec chip is configured to:
obtain an infrared remote control parameter, wherein the infrared remote control parameter comprises an envelope length and a carrier frequency of an infrared remote control signal;
generate the infrared remote control signal according to the envelope length and the carrier frequency;
drive, by using the differential output pins, the infrared transmitter to transmit the infrared remote control signal when the connection between the infrared transmitter and the differential output pins is established; and
when the transfer switch sets up a connection between the infrared transmitter and the differential input pins:
read the infrared learning signal by using the differential input pins;
calculate an envelope length and a carrier frequency of the infrared learning signal; and
use the envelope length and the carrier frequency of the infrared learning signal as the envelope length and the carrier frequency of the infrared remote control signal, respectively.
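The generate-from-envelope-and-carrier step can be sketched as gating a square-wave carrier by an envelope of the given length. The sample rate and the 38 kHz carrier are common IR-remote values used here for illustration, not taken from the patent.

```python
def ir_signal(envelope_ms, carrier_hz=38000, sample_rate=304000):
    """Return a list of 0/1 samples: a square-wave carrier gated on for
    `envelope_ms` milliseconds, followed by silence of equal length."""
    samples_per_half = sample_rate // (2 * carrier_hz)  # half carrier period
    n_on = int(sample_rate * envelope_ms / 1000)
    burst = []
    level = 1
    for i in range(n_on):
        if i % samples_per_half == 0 and i > 0:
            level ^= 1  # toggle the carrier every half period
        burst.append(level)
    return burst + [0] * n_on  # envelope on, then envelope off

sig = ir_signal(0.1)  # 0.1 ms envelope
print(len(sig))       # 60 samples: 30 carrier samples + 30 silence
```

In the learning direction, the claimed apparatus does the inverse: it measures the envelope length and carrier frequency of a received signal and reuses them for later transmission.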

US Pat. No. 10,339,794

SMOKE DETECTOR AND METHOD FOR DETERMINING FAILURE THEREOF

Google LLC, Mountain Vie...

1. A method for determining an operational state of a smoke detector having an illuminator and a light sensor, comprising:
measuring a first clean-air voltage indicative of a first intensity of a first electromagnetic signal detected by a first photodetector of the light sensor at a first center wavelength, wherein the illuminator comprises a first light source that emits light at the first center wavelength;
determining a first signal drift value from the first clean-air voltage and a first reference voltage for the first center wavelength;
measuring a second clean-air voltage indicative of a second intensity of a second electromagnetic signal detected by a second photodetector of the light sensor at a second center wavelength that exceeds the first center wavelength by at least twenty percent thereof, wherein the illuminator comprises a second light source that emits light at the second center wavelength;
determining a second signal drift value from the second clean-air voltage and a second reference voltage for the second center wavelength; and
determining the operational state from both the first signal drift value of the first center wavelength and the second signal drift value of the second center wavelength, wherein determining the operational state comprises:
determining a signal-drift threshold based on the second signal drift value;
comparing the first signal drift value to the signal-drift threshold that was determined based on the second signal drift value to determine the operational state; and
producing a failure signal when the smoke detector is in a degraded operational state.
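The two-wavelength drift check in this claim reduces to a small computation: each drift value is the deviation of a clean-air voltage from its reference, and the threshold applied to the first wavelength is derived from the second wavelength's drift. A minimal sketch, in which the scaling constant `k` and the absolute-difference drift definition are assumptions not stated in the claim:

```python
def check_operational_state(v1_clean, v1_ref, v2_clean, v2_ref, k=1.5):
    """Return (state, failure_signal) for a two-wavelength smoke
    detector.  `k` scales the second wavelength's drift into the
    signal-drift threshold for the first (an illustrative choice)."""
    drift1 = abs(v1_clean - v1_ref)   # first signal drift value
    drift2 = abs(v2_clean - v2_ref)   # second signal drift value
    threshold = k * drift2            # threshold based on drift2
    state = "degraded" if drift1 > threshold else "normal"
    return state, state == "degraded"
```

Comparing the short-wavelength drift against a threshold derived from the long-wavelength channel lets a common-mode shift (e.g. LED aging affecting both channels) be distinguished from a single-channel fault.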

US Pat. No. 10,339,792

SYSTEM, METHOD, AND RECORDING MEDIUM FOR EMERGENCY IDENTIFICATION AND MANAGEMENT USING SMART DEVICES AND NON-SMART DEVICES

INTERNATIONAL BUSINESS MA...

1. An emergency system, comprising an actuation and discovery device configured to create an ad hoc network between a first device and second devices, as a result of discovering an emergency situation in a same location as the first device, the second devices including non-smart devices comprising at least two of a consumer device and an appliance that receives contextual and localized information from the first device including a smart device such that the ad hoc network performs a mitigation action that combines a utility of the second devices based on data received from the first device in the same location as the second devices, via the non-smart devices, as instructed by the actuation and discovery device to mitigate and stop the emergency situation detected by the first device.

US Pat. No. 10,339,786

SAFETY REMINDING DEVICE AND METHOD BASED ON BICYCLE-SHARING

Fu Tai Hua Industry (Shen...

1. A safety reminding device applied in a bicycle-sharing system, the bicycle-sharing system comprising at least one bicycle, the at least one bicycle comprising a basket, the safety reminding device comprising:a detecting unit to detect whether a person is sitting in the basket of the bicycle when the bicycle is unlocked;
a processor; and
a storage device storing one or more programs, when executed by the processor, the one or more programs cause the processor to:
detect whether the bicycle is unlocked;
control the detecting unit to detect whether a person is sitting in the basket of the bicycle when determining that the bicycle is unlocked;
determine whether a person is sitting in the basket according to a detection of the detecting unit; and
output a warning signal to inform a user that the basket is forbidden for sitting when determining that a person is sitting in the basket when the bicycle is unlocked.

US Pat. No. 10,339,785

FAILURE DIAGNOSIS SYSTEM

SUMITOMO HEAVY INDUSTRIES...

1. A failure diagnosis system comprising:a first sensor that ascertains, from a first diagnosis target device in a facility, first diagnosis target information;
a second sensor that ascertains, from a second diagnosis target device in the facility, second diagnosis target information;
a terminal device that causes a display unit to display, on a screen, an image of the first diagnosis target device, an image of the second diagnosis target device, and a layout of the facility; and
a processing unit that executes a diagnosis process on the first diagnosis target information to determine an occurrence of an abnormality in the first diagnosis target device, and executes the diagnosis process on the second diagnosis target information to determine an occurrence of an abnormality in the second diagnosis target device,
wherein when the diagnosis process determines the occurrence of the abnormality in the first diagnosis target device, the terminal device causes the display unit to display a first indicator on the screen,
wherein when the diagnosis process determines the occurrence of the abnormality in the second diagnosis target device, the terminal device causes the display unit to display a second indicator on the screen,
wherein the terminal device causes the display unit to display a setting screen to create a facility layout and set disposition of the first diagnosis target device and the second diagnosis target device in the facility layout,
wherein the terminal device causes the display unit to display, on the setting screen, a part region, in which a part image of the facility is displayed, and a facility layout display region, and create the facility layout by receiving an operation of disposing, in the facility layout display region, the part image displayed in the part region,
wherein the terminal device receives an operation of disposing images of the first diagnosis target device and the second diagnosis target device in the created facility layout on the setting screen,
wherein the image of the first diagnosis target device depicts, on the screen, a location in the layout where the first diagnosis target device is sited, and
wherein the image of the second diagnosis target device depicts, on the screen, a location in the layout where the second diagnosis target device is sited.

US Pat. No. 10,339,784

METHOD AND SYSTEM FOR MONITORING SENSOR DATA OF ROTATING EQUIPMENT

SIEMENS AKTIENGESELLSCHAF...

1. A method for detecting a failure in rotating equipment based on monitoring sensor data of the rotating equipment, wherein the method comprises:collecting, during an online phase, by a plurality of sensors of the rotating equipment, a sensor data stream, wherein the data stream consists of an ordered sequence of feature vectors, each feature vector representing measurements of at least one sensor of the plurality of sensors of the rotating equipment at a certain point in time,
providing the sensor data stream to a processor,
processing, by the processor, the sensor data stream by:
representing the sensor data stream with a set of microclusters, each microcluster defining a subspace,
for each new feature vector of the sensor data stream, updating the set of microclusters by
calculating a correlation distance measure between the new feature vector and each microcluster,
assigning the new feature vector to a microcluster with a smallest value for the correlation distance measure if the value is below a range parameter and updating the microcluster based on the new feature vector, or
creating a new microcluster based on the new feature vector if all values for the correlation distance measure are above the range parameter,
creating, during an offline phase, a macrocluster model containing macroclusters based on the microclusters by calculating a comparison measure between each pair of microclusters and grouping microclusters in a macrocluster if their value of the comparison measure is below a threshold, and
comparing the macrocluster model with historical models by calculating a similarity measure, with each historical model representing either a standard operation or a failure state,
choosing the historical model with the highest value of the similarity measure, and
detecting a failure if the chosen historical model represents a failure state.
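The online (microcluster) phase of this claim is a standard stream-clustering step: for each new feature vector, either absorb it into the nearest microcluster or open a new one. A minimal sketch of that update, with plain Euclidean distance standing in for the patent's correlation distance measure (an assumption) and a running-mean centroid as the microcluster summary:

```python
import math

class MicroCluster:
    """Toy microcluster summarized by a count and a running-mean centroid."""
    def __init__(self, vec):
        self.n = 1
        self.centroid = list(vec)

    def absorb(self, vec):
        self.n += 1
        # Incremental mean update keeps the centroid without storing points.
        self.centroid = [c + (x - c) / self.n
                         for c, x in zip(self.centroid, vec)]

def update(microclusters, vec, range_param):
    """One online step: assign `vec` to the nearest microcluster if its
    distance is below `range_param`, otherwise create a new microcluster."""
    if microclusters:
        best = min(microclusters,
                   key=lambda m: math.dist(m.centroid, vec))
        if math.dist(best.centroid, vec) < range_param:
            best.absorb(vec)
            return microclusters
    microclusters.append(MicroCluster(vec))
    return microclusters
```

The offline phase described in the claim would then merge microclusters whose pairwise comparison measure falls below a threshold into macroclusters, and match the resulting model against stored historical models labeled as standard operation or failure states.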

US Pat. No. 10,339,783

REMOTE COOKING SYSTEMS AND METHODS

Weber-Stephen Products LL...

1. A remote unit for monitoring a temperature status of a food item, the remote unit comprising:a power supply;
a screen;
a receptacle that receives a connector of a temperature sensor;
a communication unit for wireless communication; and
one or more processors, the one or more processors being adapted and configured to receive sensed data from the temperature sensor, display the sensed data on the screen, use the communication unit to establish a wireless connection, and transmit the sensed data over the wireless connection;
wherein the remote unit further comprises hardware, firmware and software adapted and configured for installation of a computer application; and
wherein the remote unit is further configured to receive a user input representing a cooking parameter and to display, on the screen, an estimated time said cooking parameter will be reached.

US Pat. No. 10,339,782

METHODS AND SYSTEMS FOR METRICS ANALYSIS AND INTERACTIVE RENDERING, INCLUDING EVENTS HAVING COMBINED ACTIVITY AND LOCATION INFORMATION

Fitbit, Inc., San Franci...

1. A server system, the server system comprising:a network interface configured to communicate with an activity tracking device via a computer network;
a memory device configured to store a rules database including inference rules of the activity tracking device configured to be worn by a user; and
one or more processors communicatively coupled to the memory device and configured to:
receive geo-location data from the activity tracking device via the network interface;
receive motion data from the activity tracking device, the motion data generated by the activity tracking device while the user is performing a motion, the motion data is associated with a time interval of occurrence of the motion and with the geo-location data;
identify a type of activity performed by the user based at least on a comparison between the motion data and identifiable activity patterns corresponding to a plurality of predefined activity types;
determine based on the inference rules, that the identified type of activity is consistent with the one or more geo-locations, the inference rules correlating the motion data and the geo-location data of the activity tracking device to the predefined activity types;
determine, based at least on the geo-location data and the time interval, that the user is performing the identified type of activity at a place of occurrence;
determine a first graphical identifier that identifies the place of occurrence;
determine a second graphical identifier that identifies the type of the activity;
store the first graphical identifier, the second graphical identifier, and the time interval of occurrence of the motion in a database such that the first graphical identifier is associated with the second graphical identifier, and the time interval of occurrence of the motion is associated with the first and second graphical identifier; and
cause display of the first and second graphical identifiers corresponding to a distinct time segment of an electronic-based graphical timeline, the electronic-based graphical timeline comprising a first axis representing time and the first and second graphical identifiers such that the electronic-based graphical timeline provides a graphical representation of the type of activity that was performed at the place of occurrence during the time interval of occurrence of the motion.

US Pat. No. 10,339,780

SYSTEMS, DEVICES, AND/OR PROCESSES FOR TRACKING BEHAVIORAL AND/OR BIOLOGICAL STATE

ARM Ltd., Cambridge (GB)...

1. An apparatus for tracking behavioral content for a particular user, comprising:at least one processor to track signals and/or states representative of the behavioral profile content for the particular user, the behavioral profile content to include a plurality of parameters representative of a current behavioral state of the particular user;
at least one memory to store the tracked signals and/or states representative of the behavioral content;
wherein the at least one processor to perform one or more machine learning operations to determine one or more relationships between the tracked signals and/or states representative of the behavioral profile content and bioavailability or balance, or a combination thereof, of one or more particular substances within the particular user's body.

US Pat. No. 10,339,779

MONITORING SYSTEM

MSA Europe GmbH, Jona (C...

1. A mobile monitoring system for monitoring a deployed person, the mobile monitoring system comprising:a base station comprising:
at least one antenna which provides a radio cell, wherein the radio cell uses a first communication standard, and
a wireless interface which uses a second communication standard; and
at least one mobile monitoring apparatus comprising:
a transceiver configured for receiving monitoring data from at least one equipment object and sending the monitoring data to the base station via the radio cell, and
a signaling tag having participant information,
wherein the wireless interface of the base station is configured to automatically read out the participant information from the signaling tag of the at least one mobile monitoring apparatus using the second communication standard when the at least one mobile monitoring apparatus is within a minimum predetermined distance from the base station to automatically register the at least one mobile monitoring apparatus as at least one participant in the radio cell if the at least one mobile monitoring apparatus is not registered as the at least one participant and to deregister the at least one mobile monitoring apparatus if the at least one mobile monitoring apparatus is registered as the at least one participant.

US Pat. No. 10,339,777

IDENTIFYING AN INDIVIDUAL BASED ON AN ELECTRONIC SIGNATURE

Lenovo (Singapore) PTE. L...

1. An apparatus comprising:one or more sensors;
a processor;
a memory that stores code executable by the processor to:
detect an individual based on input from the one or more sensors, the one or more sensors comprising a camera;
determine an electronic signature associated with the detected individual, the electronic signature comprising a wireless signal emitted from an electronic device associated with the detected individual;
compare the determined electronic signature to a predefined list of known electronic signatures;
report the determined electronic signature to one or more remote devices in response to determining that the determined electronic signature is not on the predefined list of known electronic signatures, which indicates that the detected individual is an unknown individual;
query the one or more remote devices regarding the reported electronic signature to determine whether the one or more remote devices recognizes the reported electronic signature as being associated with a known individual; and
in response to the one or more remote devices not recognizing the reported electronic signature:
request that the one or more remote devices monitor for and further report the presence of the reported electronic signature in response to the one or more remote devices detecting the electronic signature;
perform one or more security actions to increase security where the electronic signature was detected, the one or more security actions selected from the group consisting of turning on lights, sounding an alarm, and locking entrances;
receive a location of the electronic signature from at least one of the one or more remote devices, the location comprising an address; and
report, to a law enforcement agency, the determined address, one of a picture and a video of the individual captured using the camera, and a timeframe indicating when the individual was at the determined address.
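The decision flow of this claim — check a detected electronic signature locally, report it to remote devices, and escalate to security actions if no device recognizes it — can be sketched as a small function. The signature format (e.g. a Wi-Fi MAC string) and the data-structure shapes are illustrative assumptions:

```python
def assess_signature(signature, known_signatures, remote_known):
    """Return whether `signature` is known and which actions follow.
    `known_signatures` is the local predefined list; `remote_known`
    aggregates signatures recognized by the remote devices.
    (Hypothetical structure; the claim does not fix a representation.)"""
    if signature in known_signatures:
        return {"known": True, "actions": []}
    # Unknown locally: report to remote devices and query them.
    actions = ["report_to_remote"]
    if signature not in remote_known:
        # No remote device recognizes it: security actions from the claim.
        actions += ["request_monitoring", "turn_on_lights",
                    "sound_alarm", "lock_entrances"]
    return {"known": False, "actions": actions}
```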

US Pat. No. 10,339,774

WEARABLE CHARM ANTI-THEFT SYSTEM WITH AN ENVIRONMENTALLY SENSITIVE SENSORY ALERT

Charm Alarm LLC, Beverly...

1. An anti-theft proximity alert system comprising:a wearable smart charm and an object monitor;
said wearable smart charm having an ornamental charm housing;
wherein said ornamental charm housing contains an environmental sensor, a sensory alert, an alarm controller, an alarm communicator capable of receiving a radio frequency proximity signal transmitted from said object monitor, at least one operating instruction to determine a measure of said radio frequency proximity signal, at least one alarm operating instruction to determine if said measure satisfies a threshold alert criterion, and at least one environmental operating instruction to adjust the output of said sensory alert based upon an environmental condition detected by said environmental sensor.

US Pat. No. 10,339,773

HOME SECURITY SYSTEM WITH AUTOMATIC CONTEXT-SENSITIVE TRANSITION TO DIFFERENT MODES

GOOGLE LLC, Mountain Vie...

1. A computer-implemented method, comprising:determining, by a home security system, that a user is not on a premises of a home within a period of time based on a location of the user determined by the home security system from an event in an email received in an email account of the user;
based on determining that the user is not on the premises of the home within the period of time, placing the home security system in a vacation mode, wherein the vacation mode defines a vacation mode response for a security event, wherein the vacation mode response for the security event is different from an away mode response for the security event defined by an away mode of the home security system; and
selectively activating one or more devices when the home security system is in the vacation mode.

US Pat. No. 10,339,772

SOUND TO HAPTIC EFFECT CONVERSION SYSTEM USING MAPPING

IMMERSION CORPORATION, S...

1. A non-transitory computer-readable medium having instructions stored thereon that, when executed by a processor, cause the processor to perform the operations of:receiving an audio signal;
pre-processing the audio signal to establish a set of audio regions according to actuator characteristics such that each audio region of the set of audio regions has corresponding actuator characteristics;
separating the audio signal into a plurality of sub-signal sets such that each sub-signal set is associated with a corresponding one of the set of audio regions and includes corresponding sub-signals;
mapping a sub-signal of one of the sub-signal sets of the plurality of sub-signal sets to a haptic signal; and
sending the haptic signal to an actuator having the actuator characteristics corresponding to the audio region associated with the one sub-signal set.
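The mapping step of this claim — each audio region tied to actuator characteristics, each region's sub-signals reduced to a haptic signal routed to the matching actuator — can be sketched with an RMS-amplitude mapping. The region/actuator data layout and the RMS choice are assumptions, not the patent's prescription:

```python
import math

def rms(xs):
    """Root-mean-square amplitude of a sample list."""
    return math.sqrt(sum(x * x for x in xs) / len(xs)) if xs else 0.0

def sound_to_haptic(sub_signal_sets):
    """Map each audio region's sub-signals to one haptic amplitude and
    route it to the actuator with matching characteristics.
    `sub_signal_sets` maps region name -> {"actuator": str,
    "sub_signals": [sample lists]} (an assumed structure)."""
    out = {}
    for region, info in sub_signal_sets.items():
        # Strongest sub-signal in the region drives the haptic amplitude.
        amp = max((rms(s) for s in info["sub_signals"]), default=0.0)
        out[info["actuator"]] = min(1.0, amp)   # clamp to actuator range
    return out
```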

US Pat. No. 10,339,771


THREE-DIMENSIONAL HOLOGRAPHIC VISUAL AND HAPTIC OBJECT WARNING BASED ON VISUAL RECOGNITION ANALYSIS

International Business Ma...

7. A computer-implemented method for presenting a haptic hologram warning, the computer-implemented method comprising:receiving, by a computer, an indication that a first individual who needs supervision is approaching a situation;
presenting, by the computer, a haptic hologram to the first individual who needs supervision prior to the first individual reaching the situation;
receiving, by the computer, image data of an area surrounding an object from an image capturing device;
performing, by the computer, an analysis of the image data using a visual recognition analysis component of the computer;
identifying, by the computer, the object, a current status of the object, a set of individuals in the area surrounding the object, age of each individual in the set, activity of each individual in the set, and direction of movement of each individual in the set based on the analysis of the image data;
determining, by the computer, whether generating a three-dimensional (3D) holographic visual and haptic warning is pertinent to the object based on the current status of the object and the age, activity, and direction of movement of each individual in the set of individuals within the area surrounding the object; and
responsive to the computer determining that generating a 3D holographic visual and haptic warning is pertinent to the object based on the current status of the object and the age, activity, and direction of movement of each individual in the set of individuals within the area surrounding the object, determining, by the computer, a size and a shape of the 3D holographic visual and haptic warning to generate between the object and one or more individuals in the set of individuals.

US Pat. No. 10,339,770

HAPTIC ENABLED STRAP FOR WEARABLE ELECTRONIC DEVICE

IMMERSION CORPORATION, S...

1. A system, comprising:a wearable electronic device;
a strap, operatively connected to the wearable electronic device, to allow a user to wear the wearable electronic device on a body part;
an actuator connected to the strap; and
a processor, in signal communication with the wearable electronic device and the actuator, configured to:
receive an input signal including directional information,
determine an output signal, based on the directional information, to generate a haptic effect that conveys the directional information to the user, and
send the output signal to the actuator to provide the haptic effect to the user through movement of the strap.

US Pat. No. 10,339,768

METHODS AND SYSTEMS FOR AUGMENTATIVE AND ALTERNATIVE COMMUNICATION

UNIVERSITY OF IOWA RESEAR...

1. A method, comprising:receiving a first input signal from a first sensor configured to detect only a single gesture;
receiving a second input signal from the first sensor after receiving the first input signal;
classifying the first input signal and the second input signal as intentional signals;
determining timing information indicative of a time difference between receiving the first input signal and receiving the second input signal;
determining whether the first input signal and the second input signal are relevant to a single command or multiple commands based on the timing information;
selecting a device from a plurality of devices to provide one or more output signals to based on a count of input signals received during a first gesture time window initiated in response to receiving the first input signal, wherein each device of the plurality of devices is associated with a respective number and a number of the device matches the count of the input signals; and
providing, to the device, the one or more output signals based on determining whether the first input signal and the second input signal are relevant to the single command or the multiple commands.
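The timing logic in this claim — grouping activations of a single-gesture sensor by a time window and selecting a device by the activation count — can be sketched directly. The window length and the device list are illustrative assumptions:

```python
def interpret_gestures(timestamps, window=0.5,
                       devices=("lamp", "tv", "door")):
    """Count intentional activations of a single-gesture sensor that
    fall inside a gesture time window opened by the first activation,
    and select the device whose associated number matches that count.
    Returns (selected_device, count)."""
    if not timestamps:
        return None, 0
    t0 = timestamps[0]                    # first input opens the window
    count = sum(1 for t in timestamps if t - t0 <= window)
    idx = min(count, len(devices)) - 1    # device numbered by the count
    return devices[idx], count
```

So two quick activations and one late activation would be read as a two-count command (second device) rather than three separate single commands.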

US Pat. No. 10,339,767

SENSOR SYSTEMS AND METHODS FOR ANALYZING PRODUCE

Walmart Apollo, LLC, Ben...

1. A system for identifying and tracking ripeness levels of produce items purchased at a store to manage inventory, the system comprising:a server configured to execute a produce analysis module;
a database for storing data sensed at a point-of-sale (POS) system, the database communicatively coupled with the server; and
a sensor disposed at the POS system and in communication with the server, the sensor configured to:
sense data measurements of a plurality of produce items scanned at the POS system,
transmit the sensed data measurements to the server along with a timestamp indicating at least one of a time and date when the data was sensed by the sensor,
wherein the produce analysis module is configured to:
store the sensed data measurements and the timestamp in the database,
analyze a plurality of sensed data measurements stored in the database taken over a period of time for a produce item, so as to determine a ripeness level from the sensed data measurements for the produce item for each timestamp stored in the database over the period of time,
determine a customer-preferred ripeness level for the produce item for at least one of a particular time and date based on the determined ripeness level for the produce item over the period of time, and
adjust programmatically at least one of inventory orders or display times for the produce item based on the customer-preferred ripeness level.

US Pat. No. 10,339,766

METHODS OF PLAYING WAGERING GAMES AND RELATED SYSTEMS

Bally Gaming, Inc., Las ...

1. A computer-implemented method of managing play of a wagering game by a control processor and memory in communication with a user device, the user device including a video display and player input controls, the method comprising:receiving, at a control processor from the player input controls, an indication of an ante wager to participate in the wagering game from a user device;
storing the indication of the ante wager in the memory;
the control processor providing to each user device associated with an ante wager, the display of virtual playing cards on the video display from a set of randomized playing cards to define a partial player hand associated with the user device;
configuring the control processor to apply a set of rules for:
(1) receiving, at the memory, from the player input controls, a first play option including a fold or a first additional wager from the user device associated with a partial player hand;
(2) providing, in response to the received first additional wager and to the associated user device, at least one first additional virtual card from the set of randomized playing cards to be added to the partial player hand and displayed on the video display of the user device and which is insufficient to complete the partial player hand;
(3) responsive to receiving the first additional wager and providing the at least one first additional virtual card, receiving, at the memory, from the player input controls, a second play option including a fold or a second additional wager;
(4) providing, in response to the received second additional wager, at least one second additional virtual card from the set of randomized playing cards to be added to the partial player hand and displayed on the video display of the user device;
(5) identifying a complete player hand including the partial player hand, the at least one first additional virtual card added to the partial player hand, and the at least one second additional virtual card added to the partial player hand and storing the complete player hand in the memory;
(6) comparing the complete player hand against a plurality of predetermined winning outcomes stored in the memory; and
(7) in response to comparing the complete player hand against the plurality of predetermined winning outcomes, resolving the ante wager, first additional wager, and second additional wager solely against the plurality of predetermined winning outcomes and not against a dealer hand.

US Pat. No. 10,339,765

DEVICES, SYSTEMS, AND RELATED METHODS FOR REAL-TIME MONITORING AND DISPLAY OF RELATED DATA FOR CASINO GAMING DEVICES

1. In an environment including a communication network and a plurality of casino table games which have associated electronic card handling devices, each adapted to generate card handling performance data and an area identifier indicating an area including the respective card handling device within the environment, a monitoring system comprising:a monitoring server in communication with the communication network and programmed to receive the area identifiers and the card handling performance data from the electronic card handling devices in real-time during operation thereof, the card handling performance data selected from the group consisting of one or more of shuffling data, game hand data, card dealing/distribution data, game round data, and game outcome data; and
an operator station in communication with the monitoring server and including a user input device and a video display;
wherein at least one of the monitoring server or the operator station is configured to:
associate each electronic card handling device's received performance data and the respective area identifier with the corresponding electronic card handling device generating the performance data;
receive, via the user input device, user input including a selected area of interest within the environment and one or more user-defined settings;
in response to at least some of the user input, identify the electronic card handling devices associated with a common area identifier corresponding to the selected area of interest; and
control, based at least partially on the one or more user-defined settings, the operator station video display to display a graphical user interface including simultaneously displaying graphical representations of the performance data associated with the identified electronic card handling devices, the performance data including at least one report depicting a comparison of real-time operational data for at least two card handling devices of the identified electronic card handling devices,
wherein the performance data associated with the electronic card handling devices having area identifiers different from the common area identifier is filtered from the display in response to the user selection.

US Pat. No. 10,339,764

SOCIAL MEDIA LOTTERY GAME WITH PLAYER PROFILE WAGERING SELECTIONS

IGT Global Solutions Corp...

1. A method, comprising:causing at least one processor to execute a plurality of instructions stored in at least one memory device to:
establish a pool of available profile icons, wherein the pool comprises a plurality of profile icon pool entries received from players;
randomly select a first subset of the pool for displaying on a user interface of a player computing device as player-selectable profile icons;
execute a first game by comparing a player-selected subset of the first subset with a randomly selected game subset of the first subset, wherein the player-selected subset of the first subset is received via the user interface;
randomly select a second subset of the pool for displaying on the user interface as player-selectable profile icons; and
execute a second game by comparing a player-selected subset of the second subset with a randomly selected game subset of the second subset, wherein the player-selected subset of the second subset is received via the user interface.
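Each round in this claim follows the same pattern: randomly display a subset of the profile-icon pool, take the player's selection from the displayed icons, randomly draw a game subset from the same displayed icons, and compare. A minimal sketch; the subset sizes, seeding, and match-count scoring are illustrative assumptions:

```python
import random

def play_round(pool, display_n, pick_n, player_pick, seed=None):
    """One game round: display `display_n` icons drawn from `pool`,
    validate the player's picks against them, draw a `pick_n`-icon
    game subset, and return the number of matches."""
    rng = random.Random(seed)
    displayed = rng.sample(pool, display_n)       # player-selectable icons
    if any(p not in displayed for p in player_pick):
        raise ValueError("player may only pick displayed icons")
    game_subset = set(rng.sample(displayed, pick_n))
    return len(set(player_pick) & game_subset)    # matched icons
```

The second game in the claim is simply a second call with a freshly drawn subset of the pool.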

US Pat. No. 10,339,763

COMPUTER GAME OF CHANCE

1. A system for a computer game of chance, the system comprising:one or more processors; and
a non-transitory computer readable medium storing a plurality of instructions, which when executed, cause the one or more processors to:
display, via a user interface, at least a part of a field of play associated with a game of chance;
receive, via the user interface, a user selection of a region within the field of play;
evaluate a probability that an object will traverse the user-selected region during any of a plurality of random movements by the object from a beginning area to an ending area during a round of play;
display, via the user interface, at least one of the probability that the object will traverse the user-selected region during any of the plurality of random movements by the object from the beginning area to the ending area during the round of play, and a success value based on the probability, prior to displaying the object moving in the field of play during at least some of the plurality of random movements;
display, via the user interface, the object moving in the field of play during at least some of the plurality of random movements;
determine whether the object traversed the user-selected region during any of the plurality of random movements during the round of play; and
credit the success value, based on the probability, to a player of the game of chance in response to a determination that the object traversed the user-selected region during any of the plurality of random movements during the round of play.
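The probability evaluation in this claim can be approximated by Monte Carlo simulation of the object's random movements. The plinko-style movement model below (a token falling row by row, drifting randomly left or right) is an assumption for illustration; the patent does not specify the movement mechanics:

```python
import random

def traverse_probability(width, height, region, trials=2000, seed=42):
    """Estimate the probability that a token starting at the top-center
    column traverses the user-selected cell `region = (col, row)` on
    its way from the beginning area (row 0) to the ending area."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        col = width // 2
        for row in range(height):
            if (col, row) == region:
                hits += 1
                break                      # region traversed this round
            # Random lateral movement, clamped to the field of play.
            col = max(0, min(width - 1, col + rng.choice((-1, 0, 1))))
    return hits / trials
```

The estimated probability would feed the displayed success value, which the claim credits to the player when the object actually traverses the selected region.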

US Pat. No. 10,339,762

GAME WITH CHANCE ELEMENT AND STRATEGY COMPONENT THAT CAN BE COPIED

CFPH, LLC, New York, NY ...

1. An apparatus comprising:a computing device; and
non-transitory medium having stored thereon a plurality of instructions that when executed by the computing device cause the apparatus to:
receive an indication of a first plurality of decisions made by a first player under a first set of circumstances in a first set of games, in which each decision includes a choice of an action from a respective set of possible actions that each have an effect on a respective outcome of a respective game of the first set of games;
present first information about (a) a first performance metric of a first set of rules, in which the first performance metric indicates a first level of success achieved from following the first set of rules at a first time, and in which the first set of rules is based on the first plurality of decisions, and (b) a second performance metric of a second set of rules, in which the second performance metric indicates a second level of success from following the second set of rules at the first time; and
present second information about third and fourth performance metrics to the second player, in which each performance metric of the third and fourth performance metrics indicates a respective level of success achieved from following a respective one of the first set of rules and the second set of rules at a second time; and
offer the second player a choice between having an automated gameplay follow the first set of rules and having the automated gameplay follow the second set of rules.

US Pat. No. 10,339,761

SYSTEM AND METHOD FOR PROVIDING A FEATURE GAME

ARISTOCRAT TECHNOLOGIES A...

1. A gaming machine, comprising:a credit input mechanism for receiving a physical item representing a monetary value to establish a credit balance;
a display device for displaying a plurality of virtual reels;
a player interface for selecting a wager funded by the credit balance and initiating play of a base game; and
a game controller configured to:
spin the plurality of virtual reels to select and display a plurality of symbols at a plurality of corresponding display positions on the display device, wherein the plurality of symbols are selected from configurable symbols and non-configurable symbols, and wherein each of the configurable symbols is configured to display an award value; and
initiate a feature game when a triggering number of configurable symbols are displayed;
wherein, during the feature game, the game controller is further configured to:
set a quantity of games remaining in the feature game to an initial quantity;
hold each configurable symbol at its corresponding display position;
select and display replacement symbols for non-configurable symbols in the plurality of corresponding display positions;
reduce the quantity of games remaining when the replacement symbols do not include at least one configurable symbol;
reset the quantity of games remaining in the feature game to at least the initial quantity when the replacement symbols include at least one configurable symbol;
repeat the hold, the select and display, the reduce, and/or the reset until the quantity of games remaining is none or until a threshold number of configurable symbols are held; and
award a feature game award based on the configurable symbols held in the plurality of corresponding display positions.
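The hold-and-respin loop recited above can be sketched in a few lines. This is an illustrative model only: the grid encoding (ints for configurable award values, `'X'` for non-configurable symbols), the 20% landing chance, and the replacement values are invented assumptions, not values from the patent.

```python
import random

def feature_game(grid, initial_games=3, max_held=15, rng=None):
    """Hold-and-respin: configurable symbols (ints = award values) are held
    at their positions; non-configurable cells ('X') are respun.  Landing at
    least one new configurable symbol resets the games-remaining counter;
    landing none reduces it."""
    rng = rng or random.Random(0)
    held = {i: v for i, v in enumerate(grid) if isinstance(v, int)}
    remaining = initial_games
    while remaining > 0 and len(held) < max_held:
        new_symbols = False
        for i in range(len(grid)):
            if i in held:
                continue                      # hold each configurable symbol
            if rng.random() < 0.2:            # replacement lands a configurable
                held[i] = rng.choice([1, 2, 5])
                new_symbols = True
        if new_symbols:
            remaining = initial_games         # reset on at least one new symbol
        else:
            remaining -= 1                    # reduce when none landed
    return sum(held.values())                 # award based on held symbols
```

The loop terminates either when games remaining reaches none or when the threshold number of configurable symbols (here the full grid) is held, mirroring the claimed repeat condition.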

US Pat. No. 10,339,759

WAGERING GAME CONTENT BASED ON LOCATIONS OF PLAYER CHECK-IN

Bally Gaming, Inc., Las ...

1. A method of operating a gaming system, the method comprising:receiving, at at least one of one or more remote servers from an application on a mobile device, a check-in message indicating a physical location external to a wagering game establishment and an identification of a wagering game player account;
detecting log-in of the wagering game player account on a wagering game machine in the wagering game establishment, wherein the wagering game machine is configured to present a wagering game and establish a credit balance upon detection of a physical item associated with value;
in response to detecting the log-in, determining modified game content associated with the physical location; and
transmitting the modified game content from at least one of the one or more remote servers to the wagering game machine for use in presenting the wagering game.

US Pat. No. 10,339,758

ENHANCED ELECTRONIC GAMING MACHINE WITH GAZE-BASED DYNAMIC MESSAGING

IGT CANADA SOLUTIONS ULC,...

1. An electronic gaming machine, comprising:a data storage unit to store game data for a game played by a player and comprising wagering and payout elements;
a display unit to display, via a graphical user interface, a plurality of graphical game components including a message in accordance with the game data, the message conveying information to the player;
a data capture unit to collect player movement data representative of movement of an eye of the player, the data capture unit comprising a camera to capture images of the player, wherein the player movement data is based on the images;
a processor circuit; and
a memory comprising computer usable instructions that, when executed by the processor circuit:
cause the processor circuit to analyze the player movement data to determine the movement of the eye of the player;
cause the processor circuit to determine a reading pace of the player based on the movement of the eye of the player and determine a fatigue level of the player based on the reading pace of the player;
cause the processor circuit to determine, based on the movement of the eye of the player, that the player has read a certain part of the message;
cause the processor circuit to select, based on whether the player has read the certain part of the message and based on the fatigue level of the player, a message presentation rule comprising an instruction that causes the processor circuit to modify a graphical game component of the plurality of graphical game components;
cause the processor circuit to select an available food service associated with a location of the electronic gaming machine, the available food service comprising a caffeinated beverage; and
cause the processor circuit to modify the graphical game component of the plurality of graphical game components based on the message presentation rule to indicate to the player that the available food service is available.
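The gaze-driven message logic can be sketched as below. Everything here is an invented approximation: the saccade threshold, the normal reading pace, the fatigue formula, and the rule names are illustrative assumptions, not the patented algorithm.

```python
def reading_pace(gaze_x, duration_s):
    """Approximate reading pace: count rightward saccades (jumps of more
    than 10 px, an assumed threshold) in the horizontal eye-position trace."""
    saccades = sum(1 for a, b in zip(gaze_x, gaze_x[1:]) if b - a > 10)
    return saccades / duration_s

def fatigue_level(pace, normal_pace=3.0):
    """0.0 (fresh) to 1.0 (fatigued), rising as pace drops below normal."""
    return max(0.0, min(1.0, 1.0 - pace / normal_pace))

def select_rule(read_fraction, fatigue, threshold=0.5):
    """Pick a message presentation rule from what was read, and decide
    whether to surface the venue's caffeinated-beverage offer when the
    player appears fatigued."""
    rule = "highlight_unread" if read_fraction < 1.0 else "dismiss_message"
    offer_beverage = fatigue >= threshold
    return rule, offer_beverage
```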

US Pat. No. 10,339,757

MOBILE SECONDARY BETTING USER INTERFACE

Bally Gaming, Inc., Las ...

1. A method of operating a wagering game system, the wagering game system including a primary content controller and a secondary content controller, the secondary content controller being independent from and coupled to the primary content controller, the primary content controller configured to present primary wagering game content on a display of a wagering game machine, the method comprising:establishing, by one or more processors, a wireless link between a mobile device and the secondary content controller; and
in response to the primary wagering game content being in a state that permits a secondary wager thereon, authorizing, by one or more processors, the mobile device to place the secondary wager on the primary wagering game content.

US Pat. No. 10,339,756

METHOD AND SYSTEM FOR SYNCHRONOUS MOVEMENT OF GAMING MACHINES

1. A gaming system comprising:a first gaming machine;
a first means for actuating configured to move said first gaming machine;
a second gaming machine;
a second means for actuating configured to move said second gaming machine; and
a controller comprising a processor and at least one memory, said controller configured to generate and send movement instructions to said first and second means for actuating in order to move said first and second gaming machines relative to each other.

US Pat. No. 10,339,755

USING A TABLE AND PROGRESSIVE METER IN SIDE EVENTS

Bally Gaming, Inc., Las ...

1. A method of operating a table in communication with a game controller operatively connected to a progressive jackpot meter and using at least one physical deck of cards, the table having a surface illustrated at each of a plurality of player positions with a first token area for an underlying event and a second token area for a side event, the method comprising:operating the table in a round of the method according to a set of rules applied in an ordered combination of steps,
the set of rules comprising rules of:
a participant must place a main wager to participate in the round;
the participant may place a side wager to participate in multiple side events;
an amount of the side wager is not limited to a fixed amount or percentage;
the side wager funds the multiple side events; and
increasing the amount of the side wager increases the potential payout on the side wager; and
the ordered combination of steps comprising:
applying the rule that the participant must place the main wager to participate in the round, comprising receiving, at the first token area and from the participant, at least one token, as the main wager, to initiate administration of the underlying event;
applying the rule that the participant may place the side wager to participate in multiple side events, applying the rule that the amount of the side wager is not limited to the fixed amount or percentage, and applying the rule that the side wager funds the multiple side events, comprising:
receiving, at the second token area and from the participant, at least one additional token forming the amount of the side wager at at least a predefined threshold amount, the amount of the side wager selected by the participant during the round from a predefined set of amounts comprising:
 the predefined threshold amount to initiate, in addition to administration of the underlying event, administration of a nonprogressive event, a progressive event, and an envy event; and
 a predefined minimum to initiate administration of only the nonprogressive event and the progressive event and not the envy event; and
a game controller detecting, by an electronic sensor at the second token area, the presence of the at least one additional token; and
the game controller incrementing the progressive jackpot meter to increase, by a predetermined portion of the amount of the side wager, a progressive amount displayed by the progressive jackpot meter;
administering the underlying event, comprising:
distributing, from the at least one physical deck of cards, cards for a participant hand; and
resolving the underlying event based at least in part on the cards for the participant hand; and
applying the rule that increasing the amount of the side wager increases the potential payout on the side wager, comprising:
administering the nonprogressive event, comprising applying a nonprogressive pay table to the cards for the participant hand, the nonprogressive pay table defining nonprogressive payouts of multiples applied to the amount of the side wager;
administering the progressive event, comprising applying a progressive pay table to the cards for the participant hand, the progressive pay table defining at least one hand rank qualifying the participant for a percentage of the progressive amount displayed by the progressive jackpot meter; and
administering the envy event comprising applying another pay table to cards of another participant in the round.

US Pat. No. 10,339,753

GAMING SYSTEM, GAMING DEVICE AND METHOD FOR MODERATING REMOTE HOST INITIATED FEATURES FOR MULTIPLE CONCURRENTLY PLAYED GAMES

IGT, Las Vegas, NV (US)

1. A gaming system comprising:at least one display device;
at least one processor; and
at least one memory device which stores a plurality of instructions, which when executed by the at least one processor, cause the at least one processor to:
cause the at least one display device to concurrently display a plurality of poker games,
randomly select one of the concurrently displayed poker games,
cause the at least one display device to display a poker game outcome for each of the concurrently displayed poker games, and
cause the at least one display device to display a determined award associated with each of the concurrently displayed poker game outcomes, wherein:
responsive to the displayed poker game outcome for the randomly selected poker game being a designated poker game outcome, the determined award associated with said displayed poker game outcome is a progressive award, and
responsive to the displayed poker game outcome for any of the plurality of poker games except the randomly selected poker game being the designated poker game outcome, the determined award associated with said displayed poker game outcome is not the progressive award.
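The moderation rule in this claim — the progressive is paid only when the designated outcome lands on the one pre-selected game — can be sketched as follows. The outcome names and pay table are illustrative assumptions.

```python
import random

def settle_awards(outcomes, pay_table, progressive, rng=None):
    """Pay each concurrently displayed poker game from the base pay table,
    but award the progressive only when the designated outcome (here assumed
    to be a royal flush) lands on the one randomly pre-selected game."""
    rng = rng or random.Random(0)
    selected = rng.randrange(len(outcomes))   # random selection before outcomes pay
    awards = []
    for i, outcome in enumerate(outcomes):
        if outcome == "royal_flush" and i == selected:
            awards.append(progressive)        # designated outcome on selected game
        else:
            awards.append(pay_table.get(outcome, 0))
    return selected, awards
```

Even if every concurrent game shows the designated outcome, exactly one (the randomly selected one) pays the progressive.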

US Pat. No. 10,339,752

METHOD OF GAMING, A GAMING SYSTEM AND A GAME CONTROLLER

Aristocrat Technologies A...

1. An electronic method of gaming in a gaming system, wherein the gaming system comprises an input device, a display, and a game controller, the game controller comprising a processor and memory, the memory storing instructions and reel data defining at least a plurality of reels, each of the plurality of reels having a plurality of symbol positions, wherein the plurality of symbol positions include a plurality of designated symbol positions, the electronic method of gaming comprising:selecting, by the game controller, in response to receiving an instruction from the input device, a selected game option selected from one of a plurality of game options;
forming, by the game controller, in response to the selected game option, a set of symbols for each of the plurality of reels to be used in generating at least one game outcome, wherein a symbol of the set of symbols of each of the plurality of reels is located at a symbol position of the plurality of symbol positions, and wherein, for each of the plurality of reels, one or more of the symbols of the set of symbols at at least one of the plurality of symbol positions is fixed and the symbols of the set of symbols that are not fixed at one or more of the plurality of designated symbol positions are selected, and wherein
for a first game option, forming, by the game controller, the set of symbols includes randomly selecting at least one mystery symbol from a set of mystery symbols for use at a first subset of the plurality of designated symbol positions and using predefined symbols at the designated symbol positions not used at the first subset of the plurality of designated symbol positions, and
for a second game option, forming, by the game controller, the set of symbols includes randomly selecting at least one of the mystery symbols for use at the first subset of the plurality of designated symbol positions and a second subset of the plurality of designated symbol positions;
generating, by the game controller, a game outcome by selecting subsets of the symbols from the respective plurality of reels of the formed set of symbols for display on the display; and
evaluating the symbols displayed at the plurality of symbol display positions in the game outcome to determine if the game outcome corresponds to a winning combination of symbols.
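The two game options differ only in which designated positions receive mystery symbols, which can be sketched per reel as below. The symbol alphabet, subset encoding, and function name are illustrative assumptions.

```python
import random

def form_reel(predefined, designated_first, designated_second,
              mystery_pool, option, rng=None):
    """Build one reel strip: fixed symbols stay in place; designated
    positions receive randomly selected mystery symbols — the first subset
    only for option 1, both subsets for option 2 — while designated
    positions outside the chosen subsets keep their predefined symbols."""
    rng = rng or random.Random(0)
    targets = set(designated_first)
    if option == 2:
        targets |= set(designated_second)
    return [rng.choice(mystery_pool) if i in targets else sym
            for i, sym in enumerate(predefined)]
```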

US Pat. No. 10,339,750

LOCKING SYSTEM FOR EXCHANGE OF ITEMS, SERVICES, AND/OR FACILITIES

1. A method for facilitating a transactional exchange comprising:receiving a request, from a requestor, for access to an item secured within a containment component by a locking component;
generating an unlocking code that is valid within a time range and is valid for either a single use or a plurality of uses;
providing the unlocking code to the requestor;
receiving a user input of a code through a code entry component associated with the containment component; and
modifying the locking component into an unlocked state for providing the requestor with access to the item in response to the code being validated as corresponding to the unlocking code and being used within the time range; the method further comprising:
receiving imagery from a device of the requestor; and
evaluating the imagery to validate whether the requestor has returned the item into a return containment component in a locked state in an acceptable condition.
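The code-generation and validation portion of this method can be sketched as a small state machine. This is a minimal illustration under assumed names; a real locking component would also handle the imagery-based return validation, which is omitted here.

```python
import time

class LockingComponent:
    """Time-ranged, use-limited unlock codes for a containment component."""

    def __init__(self):
        self.codes = {}       # code -> [valid_from, valid_to, uses_left]
        self.unlocked = False

    def generate(self, code, valid_from, valid_to, uses=1):
        """Register an unlocking code valid within [valid_from, valid_to]
        for a single use or a plurality of uses."""
        self.codes[code] = [valid_from, valid_to, uses]

    def enter(self, code, now=None):
        """Move to the unlocked state only if the entered code matches a
        registered code, is used within its time range, and has uses left."""
        now = now if now is not None else time.time()
        entry = self.codes.get(code)
        if entry and entry[0] <= now <= entry[1] and entry[2] > 0:
            entry[2] -= 1
            self.unlocked = True
        return self.unlocked
```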

US Pat. No. 10,339,749

SYSTEM AND METHOD FOR PARKING VEHICLES USING A MOBILE COMPUTING DEVICE

Rapid Lane, LLC, San Fra...

1. A Garage Server-based system for providing valet and self-parking services, the system comprising:a network of participating Valets and Garages;
a network of independent Garage Servers wherein each independent Garage Server contains data related to a specific garage and participating Valets and their availability, pricing and address; and
User and Valet software applications that operate on a mobile computing device, cell phone, tablet or laptop and perform the following functions:
(A) provide an interface for a User and an interface for a Valet;
(B) access real-time data from the Server;
(C) allows a User to search for a Valet based on address, city or price for a particular day and time;
(D) allows a User to summons a Valet to a particular location at a certain day and time for a predetermined price;
(M) upon receiving a request for Valet, sends alerts to a predetermined number of Valets simultaneously and regularly over a predetermined period of time;
(J) sends a Valet confirmation to the User;
(F) receive a map and driving instructions to meet the Valet at the particular location;
(P) once the User and Valet arrive at the particular destination and have a visual of each other, the User and the Valet are each provided with an “I'm here” electronic confirmation button to initiate the handover of the User's vehicle to the Valet;
(G) initiate a financial transaction to pay for the Valet or other parking services from the User's in-App financial account;
(Q) the User app generates and opens a QR Code which the Valet is able to scan;
(R) allows the Valet to upload parking location information including but not limited to Garage, address, floor and space number to the Server;
(S) sends the parking location information to the User;
(T) when the User requests its vehicle be returned by the Valet, the Server sends the User a request confirmation from a Valet;
(U) the Valet is provided with a map and GPS guidance to a return destination requested by the User;
(V) when the Valet and User arrive at the return destination and have a visual of each other, the User and the Valet are each provided with an “I'm here” electronic confirmation button to conclude the handover of the User's vehicle from the Valet back to the User;
(W) the User app generates and opens a second QR Code which the Valet is able to scan;
(X) the transaction on the User app ends, and User funds for payment of the parking services are dispensed from the User's financial account to the Valet account, the Administrator account and the Garage account;
(Y) the Valet service data is added to the Server;
(Z) the User is sent a receipt, and the User's history file in the Server is updated; and
(H) conclude the transaction picked up by the Valet.
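Step (M) above — alerting a predetermined number of Valets simultaneously and regularly over a period of time — can be sketched as a simple fan-out loop. The batch size, round count, and callback interface are invented assumptions.

```python
import time

def summon_valet(valets, notify, batch_size=3, rounds=2, interval=0.0):
    """Fan out alerts to a predetermined number of available Valets
    simultaneously, repeating at a regular interval until one accepts.
    `notify` is an assumed callback returning True on acceptance."""
    for _ in range(rounds):
        for valet in valets[:batch_size]:
            if notify(valet):
                return valet          # first acceptance wins
        time.sleep(interval)          # regular re-alert interval
    return None                       # no Valet accepted within the period
```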

US Pat. No. 10,339,747

CARD READER

NIDEC SANKYO CORPORATION,...

1. A card reader for use with a system comprising a host device without a wireless communication interface, and a mobile terminal, the card reader comprising:a host connecting interface structured to connect with the host device of the system; and
a wireless communication interface structured to connect with the mobile terminal;
a control unit connected to the host connecting interface and the wireless communication interface;
wherein said control unit is structured to exchange information with said mobile terminal through said wireless communication interface and to exchange the information with said host device through said host connecting interface.

US Pat. No. 10,339,746

MOBILE DEVICE FOR MAKING A MOBILE PAYMENT

GoFigure Payments, LLC, ...

1. A mobile communications device used by a user for making a purchase from a first merchant, said device comprising:an internal memory in which is stored an encoded number that corresponds to said mobile communications device and is associated with a payment account of the user;
a transmitter operable to establish a communications interface between said mobile communications device and an electronic system used by the first merchant;
a receiver, wherein information indicative of a monetary amount to be paid by the user to the first merchant for the purchase from the first merchant is received by said mobile communications device via said communications interface;
a display, wherein said received information that is indicative of a monetary amount to be paid is displayed on said display of said mobile communications device;
an input device operable to receive from the user an input that indicates the user's authorization of a payment to the first merchant, using said payment account, of said monetary amount to be paid for the purchase from said first merchant,
wherein said transmitter transmits information indicative of said authorization to said electronic system used by said first merchant,
wherein said mobile communications device is further used by the user to make a payment for a purchase transaction at a point of sale terminal in a retail store of a second merchant,
a transponder operable to wirelessly transmit to said point of sale terminal said encoded number to make a payment for said purchase transaction at said retail store of said second merchant when (i) said mobile communications device is within a field generated at said point of sale terminal, (ii) said transponder has been excited in response to said generated field, and (iii) said mobile communications device has received required input by the user of said mobile communications device;
a plurality of output devices each operable to emit a corresponding output to confirm to the user of said mobile communications device that a step in said purchase transaction at said point of sale terminal in said retail store of said second merchant is complete;
wherein, after said purchase transaction at said point of sale terminal in said retail store of said second merchant is complete, said mobile communications device wirelessly receives, with said receiver, information indicative of said purchase transaction at said retail store of said second merchant that comprises a monetary amount of said payment for said purchase transaction at said point of sale terminal in said retail store of said second merchant; and
wherein a date of said purchase transaction at said second merchant, information indicative of a time of said purchase transaction at said second merchant, information indicative of a location of said purchase transaction at said second merchant, and said information indicative of a monetary amount of said payment for said purchase transaction at said point of sale terminal in said retail store of said second merchant are stored in said mobile communications device.

US Pat. No. 10,339,745

BANKNOTE STORING DEVICE AND BANKNOTE HANDLING MACHINE

GLORY LTD., Himeji-Shi, ...

1. A banknote storing device comprising:a casing; and
a plurality of banknote storing mechanisms arranged inside the casing side by side along a depth direction of the casing, each of which is capable of storing banknotes sent from outside of the casing to inside thereof and feeding stored banknotes from the inside of the casing to the outside thereof, wherein
each of the banknote storing mechanisms includes:
a rotating member that rotates around an axis orthogonal to the depth direction;
a plurality of belt-shaped winding members with each first end of two ends thereof connected to the rotating member; and
a plurality of winding member accommodating units to which each second end of the two ends of the winding members is connected and that is capable of accommodating the winding members, and
each of the banknote storing mechanisms stores therein the banknotes by winding the banknotes on the rotating member together with the plurality of winding members and feeds the banknotes one by one by unwinding the plurality of winding members wound on the rotating member from the rotating member, and
at least one of the rotating members is arranged at a position at which the rotating member does not overlap with any of the winding member accommodating units when viewed in a predetermined direction orthogonal to the depth direction of the casing and also orthogonal to an axis direction of the rotating members,
a plurality of insertion inlets for sending the banknote to each of the banknote storing mechanisms from the outside of the casing are arranged on a front side of the casing in a depth direction thereof such that the plurality of insertion inlets are aligned vertically,
in at least one of the banknote storing mechanisms, the plurality of winding member accommodating units are arranged further inside in the depth direction of the casing than the rotating member to which the first end of each winding member is connected while the second end thereof is connected to the winding member accommodating units,
an outer peripheral edge of each winding member wound on the rotating member when the maximum number of the banknotes are stored in the banknote storing mechanism overlaps at least partially with at least some of the winding member accommodating units in the depth direction of the casing, and
a plurality of banknote transport mechanisms corresponding to each insertion inlet is disposed inside the casing, with each banknote transport mechanism sending the banknote from the insertion inlet to the banknote storing mechanism; and an opening is arranged on a side of at least one of the banknote transport mechanisms in the casing such that an operator can see the banknote transport mechanism through the opening.

US Pat. No. 10,339,744

PORTABLE CASSETTE HOLDER

NAUTILUS HYOSUNG INC., S...

1. A portable cassette holder, comprising:a frame detachably coupled to an outer surface of an automated teller machine having a medium entrance, the frame including a frame medium entrance formed at a position corresponding to the medium entrance of the automated teller machine, the frame configured to mount a cassette thereon;
a conveying roller unit coupled to the frame at a location where the frame medium entrance is provided, the conveying roller unit configured to convey a medium passing through the frame medium entrance; and
a power supply connection unit provided in the frame and configured to be electrically connected to the automated teller machine so as to receive electric power;
wherein the frame includes:
a front frame having the frame medium entrance and the conveying roller unit; and
a lower frame rotatably hinged to a lower portion of the front frame and configured to mount the cassette thereon.

US Pat. No. 10,339,743

COIN BIN

1. A coin bin, comprising:a body defining a chamber to receive coins, the body further defining a receiving area to releasably receive part of a steering handle and the body comprising a moulded plastic;
a lid to close the chamber, the lid being lockable and removable when unlocked, wherein the lid defines a plurality of recesses positioned to partly receive respective wheels of another coin bin when the another coin bin is stacked on top of the coin bin; and
a plurality of wheels affixed to the body to support the body, wherein each of the wheels is coupled to the body via a fixed axle that defines an axis of rotation of the wheel, wherein each of the wheels is partly received in a wheel recess defined by the body;the coin bin further comprising one or more of:(a) a coin release door disposed at one end of the body;
(b) the steering handle;
(c) opposed recessed portions to enable the coin bin to be manually grasped and lifted;
(d) a brake mechanism actuable to resist wheeled movement of the coin bin; and
(e) an internal wall in the body that includes an inwardly bulging or projecting area.

US Pat. No. 10,339,742

INPUT DEVICE, ELECTRONIC LOCK INCLUDING THE SAME, AND METHOD FOR CONTROLLING THE ELECTRONIC LOCK

Hyundai Motor Company, S...

1. An input apparatus comprising:a touch pad including a transmissive material;
a touch detector disposed at a rear portion of the touch pad;
a printed circuit board connected to the touch detector;
a plurality of input elements disposed on a back surface of the touch detector, each input element having at least two symbols that are formed in overlap with each other and in mutually different colors; and
a plurality of light emitting devices respectively emitting light to the input elements.

US Pat. No. 10,339,741

SYSTEMS AND METHODS FOR ADDING A TRAINABLE TRANSCEIVER TO A VEHICLE

GENTEX CORPORATION, Zeel...

1. A remote button module for controlling remote devices, comprising:a transceiver; and
a control circuit configured to:
receive control information for controlling a remote device for storage onto memory;
pair with a mobile communications device;
transmit the control information to a trainable transceiver base station for training;
receive, via the pairing, a first control signal for activating the remote device from the mobile communications device;
transmit, responsive to the receipt of the first control signal, a second control signal to the trainable transceiver base station to cause the trainable transceiver base station to transmit a third control signal to activate the remote device, wherein the third control signal is formatted based on the control information and the second control signal.

US Pat. No. 10,339,740

ACCELERATION AND GRAVITY DATA BASED SYSTEM AND METHOD FOR CLASSIFYING PLACEMENT OF A MOBILE NETWORK DEVICE ON A PERSON

GM GLOBAL TECHNOLOGY OPER...

1. A system comprising:a data module configured to receive at least one of acceleration data and gravity data from an accelerometer in a mobile network device, wherein the acceleration data and the gravity data are indicative of accelerations experienced by the mobile network device;
a classification module configured to
classify a location where the mobile network device is on a person based on the at least one of the acceleration data and the gravity data and generate a location classification output indicating where on the person the mobile network device is located relative to a first bodily feature of the person,
by determining whether the mobile network device is within one of a pocket, a backpack, a clothing article, and a waist pack;
wherein when the mobile network device is determined to be in a pocket, further determining whether the pocket is a front pocket, a front pant pocket, a back pant pocket or a front shirt pocket; and
classify the location as being at least one of the front pocket, the backpack, the clothing article, the waist pack, the front pant pocket, the back pant pocket or the front shirt pocket; and
a control module configured to perform an operation based on the location classification output.
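The two-stage classification recited in the claim — coarse location first, then pocket sub-type — can be sketched as a toy rule-based classifier. The features and every threshold below are invented; the patent does not disclose this decision logic.

```python
def classify_placement(accel_var, gravity_z, step_energy):
    """Toy placement classifier from accelerometer-derived features:
    coarse location first (backpack vs. pocket vs. waist pack), then the
    pocket sub-type from the gravity vector's orientation.  All thresholds
    are illustrative assumptions."""
    if accel_var < 0.1:
        return "backpack"                 # well damped, little motion coupling
    if step_energy > 5.0:                 # strong leg-motion coupling: a pant pocket
        return "front_pant_pocket" if gravity_z > 0 else "back_pant_pocket"
    return "front_shirt_pocket" if gravity_z > 0.5 else "waist_pack"
```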

US Pat. No. 10,339,738

SYSTEMS AND METHODS OF ACCESS CONTROL IN SECURITY SYSTEMS WITH AUGMENTED REALITY

ADEMCO INC., Golden Vall...

9. A user device comprising:an imaging device;
a database device;
a display device; and
a programmable processor,
wherein the imaging device captures a static image of an exterior of a secured area,
wherein the display device displays the static image,
wherein the programmable processor determines whether the static image includes a predetermined portion of the exterior of the secured area,
wherein the predetermined portion of the exterior of the secured area corresponds to a location within the static image where at least one computer generated image of at least one artifact associated with the static image is to be displayed,
wherein, when the static image includes the predetermined portion of the exterior of the secured area, the programmable processor identifies the at least one artifact associated with the static image from the database device,
wherein the programmable processor identifies a current distance of the user device from the secured area, retrieves a predetermined distance from the database device, and determines whether the current distance of the user device from the secured area is less than or equal to the predetermined distance by comparing the current distance to the predetermined distance, and
wherein, responsive to the current distance of the user device from the secured area being less than or equal to the predetermined distance and the static image including the predetermined portion of the exterior of the secured area, the programmable processor superimposes the at least one computer generated image of the at least one artifact at the location within the static image as displayed on the display device.
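The conjunctive condition in this claim — superimpose the artifact only when the image contains the predetermined portion AND the device is within the predetermined distance — can be sketched as below. The dict-based image model and all names are illustrative assumptions.

```python
def should_superimpose(region_found, current_distance, max_distance):
    """Both conditions must hold: the static image includes the
    predetermined portion of the exterior, and the device is within the
    predetermined distance of the secured area."""
    return region_found and current_distance <= max_distance

def render(static_image, artifact, location, show):
    """Return the displayed frame: the computer generated artifact drawn at
    `location` within the image when `show` is True, otherwise the plain
    static image (the input is never mutated)."""
    frame = dict(static_image)
    if show:
        frame[location] = artifact
    return frame
```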

US Pat. No. 10,339,737

AUTOMATED ATTENDEE MONITORING AND GUIDANCE SYSTEM

Carrier Corporation, Pal...

1. A method for monitoring a location of a group of individuals comprising:receiving a first signal at a server, the first signal including an individual identification (ID) corresponding to a specific individual, and at least one beacon ID corresponding to a unique beacon from a set of beacons;
cross-checking the individual ID and the at least one beacon ID with an allowed individuals list, and unlocking an automated lock on an entry-way corresponding to a unique beacon ID in the at least one beacon ID in response to the unique beacon ID corresponding to an entry-way beacon at a room which the individual ID is authorized to access, wherein the automated lock is part of a campus security system in communication with the server; and
cross-checking the individual ID and the at least one beacon ID with an attendance list, and updating an attendance monitoring file on said server in response to the unique beacon ID in the at least one beacon ID corresponding to an interior room beacon.
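The two cross-checking steps of this claim can be sketched with plain dictionaries standing in for the server's allowed-individuals and attendance lists. Names and data shapes are assumptions for illustration only:

```python
def process_beacon_signal(individual_id, beacon_id, allowed, interior_beacons, attendance_file):
    """Unlock when the beacon guards an entry-way this individual may access;
    log attendance when the beacon is an interior room beacon."""
    unlocked = beacon_id in allowed.get(individual_id, set())
    if beacon_id in interior_beacons:
        attendance_file.setdefault(individual_id, []).append(beacon_id)
    return unlocked

allowed = {"alice": {"door-1"}}       # alice may open door-1
attendance = {}                       # attendance monitoring file
opened = process_beacon_signal("alice", "door-1", allowed, {"room-7"}, attendance)
logged = process_beacon_signal("alice", "room-7", allowed, {"room-7"}, attendance)
```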

US Pat. No. 10,339,736

REMOTE APPLICATION FOR CONTROLLING ACCESS

Honeywell International I...

1. A system for providing access to a secure room, the system comprising:a server comprising:
a processor;
memory in communication with the processor, the memory storing a requester database and a supervisor database;
wherein the server is configured to:
receive a request for access to a secured room from a first device associated with a first user;
attempt to authenticate the first user of the first device, wherein:
when the first user is authenticated by the server:
the server is configured to send a request for approval to a second device and to receive approval from a second device for granting access to the secured room, where the second device is associated with a second user listed in the supervisor database;
the server is configured to send an instruction to provide access to the secured room in response to receiving approval for granting access to the secured room; and
when the first user is not authenticated by the server, the server is configured to not send the request for approval to the second device and deny access to the secured room for the first user without receiving input from the second user.
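The two-stage decision in this claim (authenticate the requester first, then ask a supervisor only on success) can be sketched as follows; the callable stands in for the round trip to the second device, and all names are hypothetical:

```python
def request_access(user, requester_db, supervisor_approves):
    """Grant access only when the requester authenticates AND a listed
    supervisor approves; an unauthenticated requester is denied without
    any approval request being sent."""
    if user not in requester_db:
        return "denied"            # not authenticated: supervisor never contacted
    if supervisor_approves(user):  # request sent to the second (supervisor) device
        return "granted"
    return "denied"

result = request_access("bob", {"bob"}, lambda u: True)
```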

US Pat. No. 10,339,735

GATE CONTROL METHOD, AUTHENTICATION DEVICE, AND GATE CONTROL SYSTEM

TECHFIRM INC., Tokyo (JP...

1. A method for controlling a gate for entering a facility comprising:acquiring a card number of a credit card of a user via a card reader of a gate control device of the facility;
transmitting an authentication request from the gate control device to an authentication device, the authentication request comprising the card number;
authenticating the card number of the credit card with the authentication device by performing a credit inquiry of the credit card using the card number;
transmitting an authentication result from the authentication device to the gate control device, the authentication result comprising an indication of whether the user's use of the credit card is authorized or not, an indication of whether the user's use of the credit card was authorized or not during a first period before a time when the card number was acquired, and an indication of whether the credit card is expired or not; and
opening the gate of the facility with the gate control device when the authentication result indicates the user's use of the credit card is authorized, the user's use of the credit card was authorized during the first period, and the credit card is not expired.
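The gate-opening decision combines the three indications carried in the authentication result. A minimal sketch, with an assumed dictionary shape for the result:

```python
def gate_should_open(auth_result):
    """Open only when all three indications from the authentication result
    are favorable: current use authorized, use during the first period
    authorized, and card not expired."""
    return (auth_result["authorized"]
            and auth_result["authorized_during_first_period"]
            and not auth_result["expired"])

ok = gate_should_open({"authorized": True,
                       "authorized_during_first_period": True,
                       "expired": False})
```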

US Pat. No. 10,339,734

INTERNET-CONNECTED GARAGE DOOR CONTROL SYSTEM

GENTEX CORPORATION, Zeel...

1. An in-vehicle remote garage door opener for receiving signals through the internet from a garage door opener, the remote garage door opener comprising:an RF receiver for receiving an RF signal during a training mode;
an RF transmitter for transmitting an RF signal to the garage door opener in an operating mode;
an interface configured to communicate with an internet-connected mobile device that is mobile relative to the vehicle; and
a controller coupled to:
said interface; and
at least one of the RF receiver and the RF transmitter;
wherein said controller is configured to receive signals via the mobile device and said interface that are sent through the internet from the garage door opener and that indicate an open or closed status of a garage door.

US Pat. No. 10,339,733

MOBILE DEVICE ATTENDANCE VERIFICATION

Saudi Arabian Oil Company...

1. An attendance verification system comprising:an attendance server comprising:
a device mapping comprising a mapping of international mobile equipment identities (IMEIs) to respective personal identifiers; and
an attendance record comprising a listing of attendance events documenting attendance of persons at events; and
a mobile electronic device comprising:
a memory;
an international mobile equipment identity (IMEI) of the mobile electronic device stored in the memory of the mobile electronic device;
characteristics of a fingerprint of a person associated with the mobile electronic device stored in the memory of the mobile electronic device;
attendance region data defining geographic extents of an attendance region associated with the event stored in the memory of the mobile electronic device; and
a fingerprint scanner,
the mobile electronic device configured to:
determine whether the mobile electronic device is located in the attendance region associated with the event;
acquire, in response to determining that the mobile electronic device is located in the attendance region associated with the event, a fingerprint of a user of the mobile electronic device by way of the fingerprint scanner;
determine a time and date of the acquisition of the fingerprint of the user by way of the fingerprint scanner;
determine a location of the mobile electronic device at the time of the acquisition of the fingerprint of the user by way of the fingerprint scanner;
compare characteristics of the fingerprint of the user acquired to the characteristics of the fingerprint of the person associated with the mobile electronic device stored in the memory of the mobile electronic device to determine whether the user is the person associated with the mobile electronic device; and
in response to determining that the user is the person associated with the mobile electronic device, send to the attendance server, attendance data comprising:
the IMEI of the mobile electronic device;
the time and date of the acquisition of the fingerprint of the user by way of the fingerprint scanner; and
the location of the mobile electronic device at the time of the acquisition of the fingerprint of the user by way of the fingerprint scanner, and
the attendance server configured to:
in response to receiving the attendance data:
determine a personal identifier of the person associated with the mobile electronic device based on a mapping of the IMEI of the mobile electronic device to a personal identifier of the person associated with the mobile electronic device in the device mapping; and
generate, in the attendance record, an attendance event associating the person associated with the mobile electronic device with an event associated with the time and date of the acquisition of the fingerprint of the user by way of the fingerprint scanner and the location of the mobile electronic device at the time of the acquisition of the fingerprint of the user by way of the fingerprint scanner, to document attendance of the person associated with the mobile electronic device at the event.
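The device-side gating in this claim (region check, then fingerprint match, then report) can be sketched as a single function; the IMEI, fingerprint encoding, and payload shape are illustrative assumptions:

```python
def build_attendance_data(imei, stored_fp, scanned_fp, in_region, timestamp, location):
    """Return the attendance payload only when the device is inside the
    attendance region AND the scanned fingerprint matches the fingerprint
    stored for the person associated with the device; otherwise None."""
    if not in_region:
        return None                      # no scan is even attempted
    if scanned_fp != stored_fp:
        return None                      # user is not the associated person
    return {"imei": imei, "time": timestamp, "location": location}

payload = build_attendance_data("356938035643809", "fp-A", "fp-A",
                                True, "2019-07-02T09:00", (24.7, 46.7))
```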

US Pat. No. 10,339,732

VEHICLE OPERATOR PERFORMANCE HISTORY RECORDING, SCORING AND REPORTING SYSTEMS

SmartDrive Systems, Inc.,...

7. A method for determining a vehicle operator performance measure, the method comprising:receiving operator identity information that identifies periods of time a vehicle operator operates a vehicle, the operator identity information including a vehicle operator identifier, a vehicle identifier, and time ranges that correspond to the periods of time the vehicle operator operates the vehicle;
receiving event records that include information related to vehicle operation during vehicle events from a vehicle event recorder;
associating event records for vehicle events that occur while the vehicle operator operates the vehicle with the operator identity information;
forming datasets comprising the operator identity information and the associated event records;
transmitting the datasets to a data store;
analyzing the datasets to determine a vehicle operator performance measure for the vehicle operator, the analysis of the datasets based on a mathematical algorithm determined based on one or more of the operator identity information or the associated event records;
generating vehicle operator performance counseling information based on the vehicle operator performance measure, wherein the vehicle operator performance counseling information includes progressive discipline information determined based on one or more of:
(i) a quantity of vehicle events during the periods of time the vehicle operator operates the vehicle,
(ii) a type of vehicle events during the periods of time the vehicle operator operates the vehicle,
(iii) a severity of vehicle events during the periods of time the vehicle operator operates the vehicle, and/or
(iv) a number of prior counseling sessions,
wherein the progressive discipline information includes an assessment of whether performance of the vehicle operator improved in response to the prior counseling sessions; and
facilitating presentation of the vehicle operator performance counseling information to the vehicle operator.
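The four progressive-discipline factors listed in this claim could feed a scoring rule along these lines. This is a toy sketch under assumed thresholds, not the patent's actual algorithm:

```python
def discipline_level(event_count, max_severity, prior_sessions, improved):
    """Toy progressive-discipline score from the claim's four factors:
    event quantity, event severity, prior counseling sessions, and whether
    performance improved after those sessions."""
    level = 0
    if event_count > 3:                       # (i) quantity of vehicle events
        level += 1
    if max_severity >= 2:                     # (iii) severity of vehicle events
        level += 1
    if prior_sessions > 0 and not improved:   # (iv) sessions without improvement
        level += 1
    return level

score = discipline_level(event_count=5, max_severity=2, prior_sessions=1, improved=False)
```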

US Pat. No. 10,339,730

FAULT DETECTION USING HIGH RESOLUTION REALMS

UNITED TECHNOLOGY CORPORA...

1. An article of manufacture including a tangible, non-transitory computer-readable storage medium having instructions stored thereon for detecting a fault in a gas turbine engine that, in response to execution by a controller, cause the controller to perform operations comprising:receiving, by the controller, a first data output from a first flight data source, the first flight data source comprising a first sensor configured to detect a first operating condition of the gas turbine engine, wherein the first data output correlates to the first operating condition of the gas turbine engine;
receiving, by the controller, a second data output from a second flight data source, the second flight data source comprising at least one of a second sensor or an avionics unit configured to detect a second operating condition of the gas turbine engine, wherein the second data output correlates to the second operating condition of the gas turbine engine;
determining, by the controller, a first flight realm of the first data output based on whether a rate of variance of the second data output is at least one of above or below a predetermined rate of variance threshold, wherein the first flight realm defines an engine transient state;
identifying, by the controller, a first fault threshold for the first data output in the first flight realm, wherein the first fault threshold is different from a second fault threshold for the first data output in a second flight realm; and
comparing, by the controller, the first data output to the first fault threshold.
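The realm-dependent thresholding in this claim (variance of the second signal selects the flight realm, which selects the fault threshold for the first signal) can be sketched as:

```python
def select_threshold(rate_of_variance, variance_limit, transient_threshold, steady_threshold):
    """Pick the fault threshold for the realm implied by the second data
    output: high variance indicates the engine-transient realm (assumed
    mapping for illustration), which carries its own threshold."""
    if rate_of_variance > variance_limit:
        return transient_threshold   # first flight realm: engine transient state
    return steady_threshold          # second flight realm

def fault_detected(first_data_output, threshold):
    return first_data_output > threshold

threshold = select_threshold(5.0, 1.0, transient_threshold=100.0, steady_threshold=80.0)
```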

US Pat. No. 10,339,729

METHOD FOR DETECTING AN INCREASE IN THE RATING OF A LOW-PRESSURE TURBINE OF AN AIRCRAFT REACTOR DURING A CRUISING FLIGHT PHASE, AND ASSOCIATED DEVICE AND METHOD FOR REGULATING THE COOLING AIR FLOW RATE OF A LOW-PRESSURE TURBINE

SAFRAN AIRCRAFT ENGINES, ...

1. A method for detecting an increase in a rotor speed of a low-pressure turbine of an aircraft reactor during a cruising phase, the method comprising:measuring a first rotor speed of the low-pressure turbine at a first time via a sensor;
measuring a second rotor speed of the low-pressure turbine at a second time subsequent to the first time via the sensor;
determining a rotor speed gradient of the low-pressure turbine as the ratio between the difference between the first rotor speed and the second rotor speed, and the time interval between the first time and second time;
comparing said determined rotor speed gradient to a reference rotor speed gradient;
determining a positive or negative indication that the aircraft is under cruising phase conditions from flight parameters of the aircraft;
activating an alarm when the determined rotor speed gradient is higher than the reference rotor speed gradient and when said determined indication is positive;
and reducing a cooling air flow rate applied to the low-pressure turbine when the determined rotor speed gradient is higher than the reference rotor speed gradient and when said determined indication is positive;
wherein reducing the cooling air flow rate comprises emitting a minimal opening signal of a cooling air valve to command the closing of said valve to a minimal opening of the valve.
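The gradient computation and alarm condition above can be sketched as follows. The sign convention is chosen so that an increasing rotor speed yields a positive gradient (the claim's wording leaves the order of subtraction ambiguous):

```python
def rotor_speed_gradient(speed1, time1, speed2, time2):
    """Speed gradient between two samples taken at time1 < time2."""
    return (speed2 - speed1) / (time2 - time1)

def should_alarm(gradient, reference_gradient, cruising):
    """Alarm (and reduce cooling air flow) only when the gradient exceeds
    the reference AND the aircraft is in the cruising phase."""
    return cruising and gradient > reference_gradient

# 100 -> 110 rpm over 5 s gives a gradient of 2 rpm/s
g = rotor_speed_gradient(100.0, 0.0, 110.0, 5.0)
```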

US Pat. No. 10,339,728

DETECTING MOTOR VEHICLE DAMAGE

United Services Automobil...

1. A system for detecting damage, comprising:a sensor in visual communication with a motor vehicle, wherein the sensor is configured to detect damage to a component of the motor vehicle, wherein the damage to the component is detected by determining a change in distance between two or more focal points of the motor vehicle; and
a network interface configured to transmit an indication of the damage.

US Pat. No. 10,339,727

WIRELESS COMMUNICATION DEVICES

TOMTOM TELEMATICS B.V., ...

1. A wireless communication device for collecting vehicle on-board diagnostics (OBD) data, the device comprising:a connector for connecting the device to a vehicle OBD port to receive OBD data;
a processor configured to aggregate the OBD data into risk profile data comprising at least one of: (i) one or more scalar indicators and (ii) one or more histogram indicators, wherein each scalar indicator represents a single average value for a particular category of OBD data in a given time period;
a memory for storing the risk profile data; and
a wireless transceiver for pairing with an external mobile telecommunications device to wirelessly transmit the risk profile data,
wherein the wireless communication device is arranged to store the risk profile data in the memory until the wireless communication device is paired with a mobile telecommunications device and a data transmission instruction is received.
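The two indicator types recited in this claim can be sketched directly: a scalar indicator reduces one OBD category to a single average for the period, while a histogram indicator bins the samples. Bin edges and sample values are illustrative:

```python
def scalar_indicator(samples):
    """Single average value for one category of OBD data in a time period."""
    return sum(samples) / len(samples)

def histogram_indicator(samples, bin_edges):
    """Counts of samples per [edge_i, edge_i+1) bin; values at or above the
    last edge are not counted in this simplified sketch."""
    counts = [0] * (len(bin_edges) - 1)
    for s in samples:
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= s < bin_edges[i + 1]:
                counts[i] += 1
                break
    return counts

avg_speed = scalar_indicator([10, 20, 30])
speed_bins = histogram_indicator([1, 2, 5], [0, 3, 6])
```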

US Pat. No. 10,339,725

ROAD TOLL SYSTEM

TELIT AUTOMOTIVE SOLUTION...

1. A road toll system comprising:a first data processor to receive satellite navigation data from a satellite navigation receiver implementing a position tracking function, determine a route taken based on the received satellite navigation data, the satellite navigation data being associated with a variable identity, and determine a road toll level, the first data processor then to output the determined road toll level; and
a second data processor to receive, from the first data processor, the determined road toll level output by the first data processor, the second data processor to output the determined road toll level, wherein the second data processor provides the determined road toll level using the variable identity to the satellite navigation receiver.

US Pat. No. 10,339,724

METHODS AND APPARATUSES TO PROVIDE GEOFENCE-BASED REPORTABLE ESTIMATES

UNITED PARCEL SERVICE OF ...

1. A method for recording telematics data for a vehicle based on one or more geofences and updating an electronic map, the method comprising:receiving, at a monitoring server, an indication, via a vehicle sensor, of an initiation of an ignition of the vehicle;
detecting, by a fuel sensor positioned within a fuel tank of the vehicle, a first fuel amount, wherein the fuel sensor sends the first fuel amount to the monitoring server to store as a first electronic record in a database of the monitoring server;
automatically monitoring, by a global positioning system located within the vehicle, the location of the vehicle, as the vehicle travels at least one travel path; wherein the global positioning system sensor determines when the vehicle enters or exits one or more geofenced areas during one or more time periods and the one or more geofenced areas are stored in the database of the monitoring server;
responsive to determining that the vehicle has entered a first geofenced area of the one or more geofenced areas based at least in part on first geo-coordinates detected by the global positioning system sensor during the one or more time periods, automatically detecting, by the fuel sensor, second fuel data indicating a second fuel amount, of the fuel tank of the vehicle, to be stored as a second electronic record in the database of the monitoring server, the first geofenced area comprising a non-public traversable area;
responsive to determining that the vehicle has exited the first geofenced area of the one or more geofenced areas based at least in part on second geo-coordinates detected by the global positioning system sensor during the one or more time periods, automatically detecting, by the fuel sensor, third fuel data indicating a third fuel amount, of the fuel tank of the vehicle, to be stored as a third electronic record in the database of the monitoring server;
responsive to determining that the vehicle has entered a second geofenced area of the one or more geofenced areas based at least in part on third geo-coordinates detected by the global positioning system sensor during the one or more time periods, automatically detecting, by the fuel sensor, fourth fuel data indicating a fourth fuel amount, of the fuel tank of the vehicle, to be stored as a fourth electronic record in the database of the monitoring server, the second geofenced area comprising a public traversable area;
responsive to determining that the vehicle has exited a second geofenced area of the one or more geofenced areas based at least in part on fourth geo-coordinates detected by the global positioning system sensor during the one or more time periods, automatically detecting, by the fuel sensor, fifth fuel data indicating a fifth fuel amount of the fuel tank of the vehicle, to be stored as a fifth electronic record in the database of the monitoring server;
receiving, at the monitoring server, an indication, via the vehicle sensor, of a termination of the ignition of the vehicle;
detecting, by the fuel sensor, sixth fuel data indicating a sixth fuel amount, of the fuel tank of the vehicle, to be stored as a sixth electronic record in the database of the monitoring server;
providing, by the fuel sensor, the first fuel data, the second fuel data, the third fuel data, the fourth fuel data, the fifth fuel data, and the sixth fuel data to the monitoring server for processing;
determining, by the monitoring server, a non-public traversed area amount of fuel consumed by the vehicle while operated (a) during the one or more time periods and (b) within the first geofenced area, wherein the non-public traversed area amount of fuel is determined based on the second and the third fuel data;
determining, by the monitoring server, a public traversed area amount of fuel consumed by the vehicle while operated within (a) the one or more time periods, and (b) the second geofenced area, wherein the public traversed area amount of fuel is determined based on the fourth and the fifth fuel data;
determining, by the monitoring server, a total amount of fuel consumed by the vehicle during the one or more time periods based on the non-public traversed area amount of fuel and the public traversed area amount of fuel;
determining, by the monitoring server, a reportable amount of fuel consumed by the vehicle during the one or more time periods, wherein determining the reportable amount of fuel consumed by the vehicle during the one or more time periods comprises (a) excluding the non-public traversed area amount of fuel consumed by the vehicle within the first geofenced area from the total amount of fuel and (b) including the public traversed area amount of fuel consumed by the vehicle within the second geofenced area with the total amount of fuel;
generating, by the monitoring server, a trip record for display via a display device, wherein the trip record includes at least the total amount of fuel, the reportable amount of fuel, and a total distance traveled in the first geofenced area and the second geofenced area;
determining, by the monitoring server, that the at least one travel path traveled by the vehicle is not plotted, or is incorrectly plotted, in a previously generated electronic map of a geographic area comprising the first geofenced area and the second geofenced area based on travel path detected by the global positioning system sensor of the vehicle; and
plotting, by the monitoring server, the at least one travel path and corresponding geolocations associated with the travel path, in the geographic area, to update and save the electronic map in the monitoring server for display via the display device.
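The fuel arithmetic in the claim reduces to entry-minus-exit readings per geofence, with the non-public portion excluded from the reportable total. A minimal sketch, using the second through fifth fuel readings as the claim does:

```python
def fuel_consumption_report(entry_nonpublic, exit_nonpublic, entry_public, exit_public):
    """Fuel consumed in each geofence is the entry-time fuel level minus the
    exit-time level (readings 2/3 for the non-public area, 4/5 for the
    public area in the claim). The reportable amount excludes the
    non-public portion."""
    nonpublic = entry_nonpublic - exit_nonpublic
    public = entry_public - exit_public
    total = nonpublic + public
    reportable = total - nonpublic   # exclude non-public travel
    return {"total": total, "reportable": reportable}

# Levels in liters: 50 -> 46 inside the non-public area, 40 -> 35 inside the public area
report = fuel_consumption_report(50.0, 46.0, 40.0, 35.0)
```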

US Pat. No. 10,339,723

GENERATING VIRTUAL NOTATION SURFACES WITH GESTURES IN AN AUGMENTED AND/OR VIRTUAL REALITY ENVIRONMENT

GOOGLE LLC, Mountain Vie...

1. A method, comprising:detecting, from one of a first user or a second user in a shared virtual environment, a selection input corresponding to a selection of a virtual notation surface, of a plurality of virtual notation surfaces available to the first user and the second user;
displaying the selected virtual notation surface as a first virtual object in the shared environment in response to the selection input;
detecting a first gesture input from the first user;
annotating the selected virtual notation surface with a first annotation in response to the detection of the first gesture;
detecting a second gesture input from the second user;
annotating the selected virtual notation surface with a second annotation in response to the detection of the second gesture; and
displaying the annotated virtual notation surface including the first annotation and the second annotation in the shared virtual environment.

US Pat. No. 10,339,722

DISPLAY DEVICE AND CONTROL METHOD THEREFOR

Samsung Electronics Co., ...

1. A display device for providing a virtual reality service, the display device comprising:a display;
a user interface; and
a processor configured to:
based on a user command being input through the user interface, adjust a three-dimensional effect for a stereoscopic image to correspond to the user command,
control the display to display an enlarged image or a shrunk image in which the three-dimensional effect is adjusted, and
based on the user command being a command for enlarging the stereoscopic image, move a left-eye image of the stereoscopic image leftward, move a right-eye image of the stereoscopic image rightward, generate a left-eye stereoscopic space image and a right-eye stereoscopic space image based on the left-eye image moved leftward and the right-eye image moved rightward, and obtain images corresponding to the user command from each of the left-eye stereoscopic space image and the right-eye stereoscopic space image.

US Pat. No. 10,339,721

DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR SYSTEM-WIDE BEHAVIOR FOR 3D MODELS

APPLE INC., Cupertino, C...

1. A method, comprising:at a device having a display generation component, one or more input devices, and one or more cameras:
receiving a request to display a virtual object in a first user interface region that includes at least a portion of a field of view of the one or more cameras;
in response to the request to display the virtual object in the first user interface region, displaying, via the display generation component, a representation of the virtual object over at least a portion of the field of view of the one or more cameras that is included in the first user interface region, wherein the field of view of the one or more cameras is a view of a physical environment in which the one or more cameras are located, and wherein displaying the representation of the virtual object includes:
in accordance with a determination that object-placement criteria are not met, wherein the object-placement criteria require that a placement location for the virtual object be identified in the field of view of the one or more cameras in order for the object-placement criteria to be met, displaying the representation of the virtual object with a first set of visual properties and with a first orientation that is independent of which portion of the physical environment is displayed in the field of view of the one or more cameras; and
in accordance with a determination that the object-placement criteria are met, displaying the representation of the virtual object with a second set of visual properties that are distinct from the first set of visual properties and with a second orientation that corresponds to a plane in the physical environment detected in the field of view of the one or more cameras;
detecting first movement of the one or more cameras while the representation of the virtual object is displayed with the first set of visual properties and the first orientation over a first portion of the physical environment captured in the field of view of the one or more cameras; and
in response to detecting the first movement of the one or more cameras, displaying the representation of the virtual object with the first set of visual properties and the first orientation over a second portion of the physical environment captured in the field of view of the one or more cameras, wherein the second portion of the physical environment is distinct from the first portion of the physical environment.

US Pat. No. 10,339,720

METHOD OF GROUND ADJUSTMENT FOR IN-VEHICLE AUGMENTED REALITY SYSTEMS

1. A computer program product comprising a non-transitory computer-usable medium including a computer-readable program, wherein the computer-readable program when executed on a computer of a vehicle causes the computer to:wirelessly receive pointcloud data describing a road surface from an external device, wherein the pointcloud data is wirelessly received via a message communicated via Dedicated Short Range Communication;
determine six-dimensional pose data of the vehicle;
analyze the pointcloud data and the six-dimensional pose data to identify data points that fail to describe an elevation of the road surface;
generate a subset of the pointcloud data that excludes the data points that fail to describe the elevation of the road surface;
determine, based on the subset of the pointcloud data, a terrain for the road surface;
construct an elevation grid that includes a plurality of cells, wherein one or more cells in the plurality of cells are assigned an elevation value based on a data point in the subset of pointcloud data corresponding to a similar road surface position of the one or more cells in the plurality of cells;
for each cell of the plurality of cells with no elevation value assigned, determine an interpolated elevation value based on elevation values of neighboring cells; and
display the elevation grid on a three-dimensional heads-up display (3D HUD) based on the elevation values and one or more interpolated elevation values.
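The interpolation step above (fill each unassigned cell from the elevation values of its neighbors) can be sketched with a single-pass 4-neighbor average; the grid representation is an assumption:

```python
def fill_elevation_grid(grid):
    """Single-pass sketch: each None cell receives the mean elevation of
    its assigned 4-neighbors (up/down/left/right). Cells with no assigned
    neighbors are left unfilled."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] is None:
                neighbors = []
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] is not None:
                        neighbors.append(grid[rr][cc])
                if neighbors:
                    out[r][c] = sum(neighbors) / len(neighbors)
    return out

filled = fill_elevation_grid([[1.0, None],
                              [3.0, 5.0]])
```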

US Pat. No. 10,339,719

SYSTEM AND METHOD FOR PROJECTED TOOL TRAJECTORIES FOR SURGICAL NAVIGATION SYSTEMS

SYNAPTIVE MEDICAL (BARBAD...

1. A method for communicating a distance of a surgical instrument from an object in a surgical area of interest, comprising:determining a relative position and orientation between at least one non-contact distance acquiring device, having a known relative position and orientation, and the surgical instrument, thereby providing a determined relative position and orientation between the at least one non-contact distance acquiring device and the surgical instrument;
acquiring a first distance, between said at least one non-contact distance acquiring device, having the known relative position and orientation, and the object in the surgical area of interest;
computing, using the determined relative position and orientation between the at least one non-contact distance acquiring device and the surgical instrument and the first distance, a second distance between the surgical instrument and the object;
communicating the second distance; and
overlaying onto the image feed a projected trajectory of the surgical instrument on a visual display.
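The computation of the second distance can be sketched by treating the sensed object position and the known instrument position as vectors in the distance-acquiring device's frame; this simplification ignores orientation handling and uses assumed names:

```python
def instrument_to_object_distance(sensor_to_object, sensor_to_instrument):
    """Second distance (instrument-to-object) from the first distance
    (sensor-to-object) and the determined sensor-to-instrument offset,
    both expressed as 3-D vectors in the sensor frame."""
    diff = [a - b for a, b in zip(sensor_to_object, sensor_to_instrument)]
    return sum(d * d for d in diff) ** 0.5

# Object 10 units along the sensor axis, instrument tip 4 units along it
d = instrument_to_object_distance([0.0, 0.0, 10.0], [0.0, 0.0, 4.0])
```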

US Pat. No. 10,339,718

METHODS AND SYSTEMS FOR PROJECTING AUGMENTED REALITY CONTENT

Verizon Patent and Licens...

1. A method comprising:directing, by an augmented reality presentation system, a sensor included within an augmented reality projection device to capture an image of a portion of a real-world environment, the portion of the real-world environment included within a field of view of the sensor;
determining, by the augmented reality presentation system as the sensor captures the image, that a virtual target object located within the real-world environment is included within the field of view of the sensor, the determining including
determining, based on sensor output from one or more sensors included within the augmented reality projection device, a location and orientation of the augmented reality projection device with respect to a global coordinate system associated with the real-world environment, and
determining, based on the determined location and orientation and based on a target object profile associated with the virtual target object in a library of predetermined target object profiles, that the virtual target object is included within the field of view of the sensor;
identifying, by the augmented reality presentation system, augmented reality content associated with the virtual target object; and
directing, by the augmented reality presentation system as the sensor captures the image, a projector included within the augmented reality projection device to project the augmented reality content onto a physical surface within the real-world environment and associated with the virtual target object, the physical surface physically detached from the augmented reality projection device and included within the field of view of the sensor while the augmented reality content is projected onto the physical surface.

US Pat. No. 10,339,717

MULTIUSER AUGMENTED REALITY SYSTEM AND METHOD

1. A method for providing a collaborative augmented reality spanning multiple sites, the method comprising the steps of:detecting a first position of a first anchor at a first site with a sensor of a first augmented reality device at the first site, wherein a portal is defined at the first site with respect to the first anchor and the portal joins the first site to a different second site in a shared augmented reality;
receiving with the first augmented reality device, from a second augmented reality device at the second site, a first information representative of a second position of a first object relative to the portal, wherein the portal is defined at the second site with respect to a third position of a second anchor at the second site;
displaying for a first user, by the first augmented reality device, a first representation of the first object, wherein the first representation is displayed relative to the portal on the basis of the first information and the first position.

US Pat. No. 10,339,714

MARKERLESS IMAGE ANALYSIS FOR AUGMENTED REALITY

A9.COM, INC., Palo Alto,...

1. A computer-implemented method, comprising:determining a sensor bias value associated with at least one of an accelerometer or gyroscope of a computing device;
acquiring, using at least one camera of the computing device, a first image of a real-world environment;
determining a first planar surface in the real-world environment represented in the first image having a minimum threshold amount of features;
acquiring, using the at least one camera, at least a second image and a third image of the first planar surface in the real-world environment, the second image and the third image comprising different views of the first planar surface acquired at different positions;
determining a distance and orientation of the first planar surface in the real-world environment based at least on the sensor bias value and positional data associated with the second image and the third image;
determining scale information of the first planar surface in relation to the at least one camera based on the positional data associated with the second image and third image and the sensor bias value; and
displaying a representation of a first object in a camera view on a display screen of the computing device,
the camera view including an image of the real-world environment and the representation of the first object appearing to be situated on the first planar surface at a first size in accordance with the determined distance and orientation and the determined scale information.

US Pat. No. 10,339,713

MARKER POSITIONING FOR AUGMENTED REALITY OVERLAYS

International Business Ma...

1. A method comprising:
receiving, by one or more processors, an input image, that includes an element, from an augmented reality input buffer, wherein the input buffer is a digital representation of information, including the element, taken from a camera;
receiving, by one or more processors, an output image from an augmented reality output buffer, wherein:
the output buffer comprises an overlay; and
the overlay comprises information about the element;
scanning, by one or more processors, the output buffer image for one or more markers, wherein the one or more markers are associated with one or more marker overlays;
receiving, by one or more processors, a first user input, wherein the first user input indicates a user selection of a first marker of the one or more markers; and
displaying, by one or more processors, a first marker overlay, wherein the first marker overlay:
is associated with the first marker; and
includes additional information about the element.

US Pat. No. 10,339,712

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:
a processor programmed to function as the following units:
a first acquisition unit adapted to acquire a position and orientation of an image capturing apparatus;
a second acquisition unit adapted to acquire an image of a physical space including a user's hand, the image being captured by the image capturing apparatus;
a third acquisition unit adapted to acquire a region of the user's hand in the image of the physical space;
a specifying unit adapted to specify a vicinity region of a contour of the region of the user's hand in the image of the physical space as a region to be blurred;
a virtual image generation unit adapted to generate an image of a virtual space based on the position and orientation of the image capturing apparatus;
a generation unit adapted to generate a synthesized image based on the image of the physical space and the image of the virtual space, wherein the specified region includes a first region, wherein the synthesized image is the image of the physical space at the first region and the image of the physical space is not combined with the image of the virtual space at the first region, and wherein the synthesized image is blurred at the first region; and
an output unit adapted to output the synthesized image.
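The specifying unit above marks a band around the hand's contour as the region to blur. A hedged sketch of one way such a "vicinity region" could be found on a binary mask (a generic morphological approach, not Canon's implementation; the grid representation is an assumption):

```python
# Illustrative sketch: mark mask cells on the contour, i.e. cells of the
# hand region that have at least one 4-neighbour outside the region.
# A real pipeline would dilate this band to a configurable width.

def contour_band(mask):
    """Return a boolean grid marking cells adjacent to the mask boundary."""
    h, w = len(mask), len(mask[0])
    band = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            # A mask cell is on the contour if any 4-neighbour is outside.
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                    band[y][x] = True
    return band

mask = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
band = contour_band([[bool(v) for v in row] for row in mask])
# Every cell of this small 2x2 region touches the boundary, so all four
# region cells land in the band.
```

The synthesized image would then be blurred only where `band` is true, softening the hard edge between the real hand and the virtual content.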

US Pat. No. 10,339,711

SYSTEM AND METHOD FOR PROVIDING AUGMENTED REALITY BASED DIRECTIONS BASED ON VERBAL AND GESTURAL CUES

Honda Motor Co., Ltd., T...

1. A method for providing augmented reality based directions, comprising:
receiving a voice input based on verbal cues provided by a passenger in a vehicle;
receiving a gesture input and a gaze input based on gestural cues and gaze cues provided by the passenger in the vehicle, wherein the gaze input of the passenger includes an object in a surrounding environment and an associated focal plane for the object, wherein the gaze input of the passenger is indicative of where the passenger is looking;
determining directives based on the voice input, the gesture input and the gaze input;
associating the directives with the surrounding environment of the vehicle;
generating augmented reality graphical elements based on the directives and the association of the directives with the surrounding environment of the vehicle; and
displaying, for a driver of the vehicle, the augmented reality graphical elements on a heads-up display system of the vehicle using an actuator to move a projector based on a location of the object in the surrounding environment to project the augmented reality graphical elements at the focal plane associated with the object, wherein the augmented reality graphical elements are displayed at the focal plane associated with the object from the gaze input of the passenger where the passenger is looking.

US Pat. No. 10,339,710

MEDICAL IMAGE SYSTEM AND METHOD

KONINKLIJKE PHILIPS N.V.,...

1. A medical image system for enabling a user to navigate through a three-dimensional (3D) image of an anatomical structure, the system comprising:
a display configured to simultaneously display a set of three views of the 3D image defined by an associated set of three respective mutually orthogonal planes intersecting the 3D image wherein each view is associated with one of the three mutually orthogonal planes and displays the other two planes of the three mutually orthogonal planes as lines while the associated mutually orthogonal plane is not shown; and
a computer programmed to:
obtain a further set of three respective mutually orthogonal planes intersecting the 3D image corresponding to a spatial configuration indicated in a navigation command performed in one view of the three views and comprising dragging a line representing one of the two mutually orthogonal planes represented as a line to input a rotation of the plane represented by the dragged line around an intersection point between the plane represented by the dragged line and the other plane of the two mutually orthogonal planes represented by a line; and
derive a further set of three views of the 3D image defined by the further set of three respective mutually orthogonal planes by:
(i) determining a spatial difference between a first plane of the further set of three mutually orthogonal planes associated with a first view of the further set of three views and a reference plane of a reference view defined by the reference plane and having a reference anatomical orientation of the anatomical structure,
(ii) adjusting a rotation of the first view in the first plane of the further set of three mutually orthogonal planes in dependence on the spatial difference for more closely aligning an anatomical orientation of the anatomical structure in the first view of the further set of three views to the reference anatomical orientation in the reference view; and
(iii) adjusting the second view and the third view of the further set of three views in dependence on the adjusted rotation of the first view of the further set of three views.

US Pat. No. 10,339,708

MAP SUMMARIZATION AND LOCALIZATION

1. A method, comprising:generating, at an electronic device, a first set of data representative of one or more objects in an environment of the electronic device, wherein the first set of data is based on images captured from one or more visual sensors and non-visual data from one or more non-visual sensors;
identifying, based on the first set of data, a first set of one or more groups of objects, wherein each group comprises objects appearing together in a configuration in the environment;
identifying a first scene comprising the first set of one or more groups of objects based on a consistency with which the configuration appears over time or over a plurality of motion tracking sessions being above a first threshold;
identifying a utility weight of each of the one or more objects for identifying the first scene, the utility weight indicating a predicted likelihood that the corresponding object will be persistently identifiable by the electronic device in the first scene over time and based at least in part on verification by one or more mobile devices;
generating a three-dimensional representation of the first scene based on a subset of the first set of data representative of a subset of the one or more objects in the first scene, wherein the subset of the one or more objects comprises objects having utility weights above a threshold, wherein the subset of the first set of data is limited to a threshold amount of data; and
localizing, at the electronic device, an estimated current pose of the electronic device based on the three-dimensional representation of the first scene.
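The summarization step above keeps only objects whose utility weight clears a threshold, capped at a data budget. A minimal sketch of that selection, assuming a simple `(name, utility_weight, data_size)` record (the record layout and greedy trimming strategy are assumptions, not the patent's method):

```python
# Illustrative sketch: keep objects above the utility-weight threshold,
# then greedily fill a data budget starting with the highest-utility ones.

def summarize_scene(objects, weight_threshold, data_budget):
    """objects: list of (name, utility_weight, data_size) tuples."""
    kept = [o for o in objects if o[1] > weight_threshold]
    # Prefer high-utility objects when trimming to the data budget.
    kept.sort(key=lambda o: o[1], reverse=True)
    out, used = [], 0
    for name, weight, size in kept:
        if used + size <= data_budget:
            out.append(name)
            used += size
    return out

scene = [("desk", 0.9, 40), ("chair", 0.7, 30),
         ("cup", 0.2, 5), ("lamp", 0.6, 50)]
print(summarize_scene(scene, weight_threshold=0.5, data_budget=80))
# ['desk', 'chair']
```

Localization would then run only against this compact representation rather than the full sensor record.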

US Pat. No. 10,339,706

METHOD AND APPARATUS FOR ESTIMATING BODY SHAPE

BROWN UNIVERSITY, Provid...

1. A method comprising:
obtaining data representing a body of an individual in a first pose and a second pose, wherein the data comprises one of image data of the body captured via a camera and partial depth information of the body captured via a range sensor, and wherein the first pose differs from the second pose; and
estimating a consistent body shape of the individual across the first pose and the second pose by fitting a parametric body model of the body to the data to generate a set of pose parameters and a set of shape parameters, wherein the estimating separately factors (1) changes in a shape of the body due to changes between the first pose and the second pose to yield the set of pose parameters from (2) changes in the body of the individual due to identity to yield the set of shape parameters.

US Pat. No. 10,339,705

MAINTAINING 3D LABELS AS STABLE OBJECTS IN 3D WORLD

Microsoft Technology Lice...

15. An electronic device, comprising:
a processor; and
a memory coupled to the processor and storing instructions that, when executed by the processor, causes the processor to:
render a plurality of text labels on a three-dimensional map within a mapping application that is rendered on a display of the electronic device;
determine an initial orientation of each of the plurality of text labels based, at least in part, on a viewing angle and a viewing elevation of a virtual camera associated with the three-dimensional map;
receive input that changes at least one of the viewing angle and the viewing elevation of the virtual camera; and
update the initial orientation of each of the plurality of text labels based, at least in part, on the received input, wherein updating the initial orientation of each of the plurality of text labels comprises continuously animating changes to each of the plurality of text labels from the initial orientation to a new orientation as long as at least one of the viewing angle and the viewing elevation of the virtual camera is altered by the received input, wherein continuously animating changes to each of the plurality of text labels from the initial orientation to the new orientation comprises continuously animating a rotation of each of the plurality of text labels around an axis.

US Pat. No. 10,339,703

DEVICE AND METHOD TO COMPUTE SHADOW IN A 3D SCENE

Vidon Patents and Strateg...

1. A graphics processing device configured to compute shadow in a three-dimensional scene lighted by at least one light source, based on a plurality of depth maps, a depth map of said plurality of depth maps comprising a plurality of elements, said depth map being associated with a two-dimensional bounded space adapted to render an image of the scene according to a point of view of said light source, said graphics processing device comprising at least one processor configured for implementing the following acts, for a depth map of said plurality of depth maps:
storing information representative of a depth consisting of the distance between said point of view and a closest visible surface from said point of view for each of said elements of said depth map;
storing in each element of at least one part of said depth map, coordinates of vertices of at least one surface element of a closest visible surface, the coordinates being expressed in said two-dimensional bounded space of said depth map;
storing in each element of said at least one part of said depth map, information representative of local depth variation of said at least one surface element in said two-dimensional bounded space of said depth map;
for a pixel of a plurality of pixels of said image, testing ray intersection in said at least one part of said depth map, with said at least one surface element having depth computed for said pixel from said information representative of depth and of local depth variation, taking into account said coordinates of vertices;
computing one minimum depth associated with said at least one surface element for each element of said at least one part of said depth map, said minimum depth being computed from said coordinates of vertices and said information representative of depth and of local depth variation;
comparing depth associated with said pixel of said image, located in a depth map element, with said minimum depth associated with said depth map element and with said at least one surface element;
classifying said pixel as lit if the depth associated with said pixel is lower than said minimum depth;
classifying said pixel as shadowed if the depth associated with said pixel is greater than said minimum depth, and if a ray intersection is found for said pixel with said at least one surface element from said ray intersection testing.
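The final two classification acts reduce to a comparison against the per-element minimum depth plus a ray-intersection test. A hedged sketch of just that decision, assuming the minimum depth and the intersection result are already computed (function and parameter names are hypothetical):

```python
# Illustrative sketch of the lit/shadowed classification only; computing
# min_depth and the ray intersection is the substance of the claim and is
# assumed to have happened upstream.

def classify_pixel(pixel_depth, min_depth, ray_hits_surface):
    if pixel_depth < min_depth:
        return "lit"
    if pixel_depth > min_depth and ray_hits_surface:
        return "shadowed"
    return "undetermined"  # borderline case: needs the full intersection test

print(classify_pixel(1.0, 2.0, False))  # lit
print(classify_pixel(3.0, 2.0, True))   # shadowed
```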

US Pat. No. 10,339,702

METHOD FOR IMPROVING OCCLUDED EDGE QUALITY IN AUGMENTED REALITY BASED ON DEPTH CAMERA

NATIONAL TSING HUA UNIVER...

1. A method of improving occluded edge quality in an augmented reality system, which comprises a camera set and a computer device, the method comprising:
a data input step of using the camera set, which comprises one or multiple cameras, to capture a scene and an object set, which comprises one or multiple objects, in the scene to obtain an original image;
an occluded region extracting step executed by the computer device, wherein a first virtual plane and a second virtual plane are set in the scene, depth buffer calculations are performed according to the first virtual plane and the original image to obtain a first image, depth buffer calculations are performed according to the second virtual plane and the original image to obtain a second image, and an extracted image is obtained by way of extracting according to a difference operation performed according to the first image and the second image, where a distance from the first virtual plane to the camera set is k, a distance from the second virtual plane to the camera set is (k-h), and k and h are positive integers greater than 0, wherein the second virtual plane is disposed between the first virtual plane and the camera set; and
an occluded image generating step executed by the computer device, wherein a third virtual plane is set between the first virtual plane and the second virtual plane, a virtual object is inserted into the original image according to the extracted image, the third virtual plane and the original image to obtain an occluded image, and the virtual object is partially occluded by the object set, wherein each of the original image, the first image and the second image contains depth data and visible light image data.
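The extraction step above keeps the imagery that lies between the two virtual planes (depths `k - h` and `k`). A minimal sketch of that difference operation on a per-pixel depth map (the grid representation is an assumption for illustration):

```python
# Illustrative sketch: a pixel survives the difference of the two
# depth-buffer tests when its depth lies between the second virtual
# plane (k - h) and the first virtual plane (k).

def extract_band(depths, k, h):
    """Keep pixels whose depth falls between the two virtual planes."""
    near, far = k - h, k
    return [[near <= d < far for d in row] for row in depths]

depths = [[3.0, 6.0],
          [4.5, 9.0]]
mask = extract_band(depths, k=5, h=2)   # band is [3, 5)
# 3.0 and 4.5 fall inside the band; 6.0 and 9.0 do not.
```

Pixels inside this band are the ones that can occlude a virtual object inserted at the third plane between the two.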

US Pat. No. 10,339,700

MANIPULATING VIRTUAL OBJECTS ON HINGED MULTI-SCREEN DEVICE

MICROSOFT TECHNOLOGY LICE...

1. A mobile computing device comprising:
a housing having a first part and a second part coupled by a hinge, the first part including a first display and the second part including a second display;
a sensor mounted in the housing, coupled to the hinge, and configured to detect a hinge angle between the first and second parts of the housing; and
a processor mounted in the housing, wherein the processor is configured to render and cause to be displayed on a display a two-dimensional view of a virtual object on the first display and a three-dimensional view of the virtual object at an orientation on the second display, and the three-dimensional view of the virtual object is rendered at the orientation based on the detected hinge angle between the first and second parts of the housing,
wherein a type of three-dimensional view for the three-dimensional view is a virtual camera angle that captures a view of a three-dimensional virtual environment, the virtual camera angle being determined based upon the detected hinge angle.
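The claim ties the virtual camera angle to the detected hinge angle but does not give a formula; one simple assumed mapping (purely illustrative, not Microsoft's) is a clamped linear interpolation:

```python
# Hypothetical mapping from hinge angle to virtual camera tilt.
# The [90, 180] degree hinge range and linear mapping are assumptions.

def camera_angle_from_hinge(hinge_deg, min_cam=0.0, max_cam=90.0):
    """Linearly map a hinge angle in [90, 180] degrees to a camera tilt."""
    t = min(max((hinge_deg - 90.0) / 90.0, 0.0), 1.0)
    return min_cam + t * (max_cam - min_cam)

print(camera_angle_from_hinge(135.0))  # 45.0
```

As the user folds the device, the 3D view on the second display would smoothly re-orient under such a mapping.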

US Pat. No. 10,339,699

ADAPTIVE USAGE OF AVAILABLE STORAGE FOR ACCELERATED VOLUME RENDERING

Siemens Healthcare GmbH, ...

1. A method for adaptively storing distance information for volume rendering a first volumetric dataset, the first volumetric dataset including voxels having respective values, the method comprising:
generating, by a processor, a second volumetric dataset, the second volumetric dataset identifying a first subset of the voxels of the first volumetric dataset representing empty space and a second subset of the voxels of the first volumetric dataset representing non-empty space;
generating, by the processor, a distance field based on the second volumetric dataset, the distance field identifying, for each voxel of the first subset of voxels, a distance to a closest voxel of the second subset of voxels;
identifying, by the processor, a third volumetric dataset, the third volumetric dataset identifying, for each of the voxels, to which portion or portions of the first volumetric dataset the respective voxel belongs;
selecting, by the processor, at least one compression method;
compressing, by the processor, the distance field based on the at least one compression method; and
merging, by the processor, the compressed distance field with the third volumetric dataset.
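The distance-field step records, for every empty voxel, how far away the nearest non-empty voxel is. A small sketch of that computation on a 2D grid using breadth-first search (Manhattan distance; a generic textbook approach, not Siemens' implementation):

```python
# Illustrative sketch: multi-source BFS from all occupied voxels gives
# each empty voxel its distance to the closest occupied voxel.

from collections import deque

def distance_field(occupied):
    h, w = len(occupied), len(occupied[0])
    dist = [[None] * w for _ in range(h)]
    queue = deque()
    for y in range(h):
        for x in range(w):
            if occupied[y][x]:
                dist[y][x] = 0          # non-empty voxels are distance 0
                queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                queue.append((ny, nx))
    return dist

grid = [[1, 0, 0],
        [0, 0, 0]]
print(distance_field(grid))  # [[0, 1, 2], [1, 2, 3]]
```

A volume renderer can use such a field for empty-space skipping: at a voxel with distance `d`, a ray can safely advance `d` steps without hitting content, which is why the field is worth compressing and storing.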

US Pat. No. 10,339,698

METHOD FOR DISCRIMINATION AND IDENTIFICATION OF OBJECTS OF A SCENE BY 3-D IMAGING

THALES, Courbevoie (FR) ...

1. A method for discriminating and identifying, by 3D imaging, an object in a complex scene comprising the following steps:
A) generating a sequence of images called two-dimensional (2D) maximum intensity projection (MIP) images of the object, from a three-dimensional (3D) voxel volume of the complex scene, this volume being predetermined and visualized by an operator by using an iterative process of “MIP” type from a projection plane and an intensity threshold determined by the operator on each iteration,
B1) automatically extracting from the sequence of 2D MIP images, coordinates of a reduced volume corresponding to the sequence of 2D MIP images,
B2) choosing one of the intensity thresholds used during the iterations of the step A), this choice being made by the operator,
C) automatically extracting, from the 3D voxel volume of the complex scene, from the coordinates and from the chosen intensity threshold, a reduced 3D voxel volume containing the object,
D) automatically generating from the reduced volume, by intensity threshold optimization, an optimized intensity threshold and an optimized voxel volume of the object, a color being associated with each intensity,
E) automatically generating, from the coordinates of the voxels whose intensity is represented by the 2D MIP images and from the chosen intensity threshold, a 3D cloud of points of the object,
F1) automatically generating, from the 3D volume of the complex scene and from the chosen intensity threshold, a raw 3D cloud of points of the complex scene,
F2) automatically generating, from the optimized 3D voxel volume and from the optimized intensity threshold, an optimized 3D cloud of points of the object,
G) automatically generating, from an overlaying of the 3D cloud of points of the object, and of the optimized 3D cloud of points of the object, and of the raw 3D cloud of points of the complex scene, an optimized global 3D cloud of points of the object included in the complex scene,
visualizing the optimized global 3D cloud,
identifying the object by the visualization, and
if the visualization is not satisfactory, iterating the previous steps to obtain a densification of the 3D clouds of points.
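Step A relies on maximum intensity projection. A minimal sketch of a thresholded MIP of a voxel volume along one axis (a standard definition of MIP; the list-of-lists volume layout is an assumption for illustration):

```python
# Illustrative sketch: project volume[z][y][x] to a 2D image by taking,
# per (y, x), the maximum intensity along z among voxels at or above
# the operator's threshold (0 where no voxel qualifies).

def mip(volume, threshold):
    depth, h, w = len(volume), len(volume[0]), len(volume[0][0])
    return [[max((volume[z][y][x] for z in range(depth)
                  if volume[z][y][x] >= threshold), default=0)
             for x in range(w)] for y in range(h)]

vol = [[[1, 5], [2, 0]],     # z = 0 slice
       [[4, 3], [9, 1]]]     # z = 1 slice
print(mip(vol, threshold=2))  # [[4, 5], [9, 0]]
```

Varying `threshold` interactively, as the operator does in step A, changes which structures survive into the 2D projection.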

US Pat. No. 10,339,697

MEDICAL IMAGE PROCESSING APPARATUS

Toshiba Medical Systems C...

1. A medical image processing apparatus configured to capture images of an observation object over time to obtain a three-dimensional image, the observation object including a first hard tissue and a second hard tissue adjacent to each other, the apparatus comprising:
a control circuit configured to
extract the first hard tissue and the second hard tissue based on the three-dimensional image; and
generate, as an image for observation, a cross-sectional image of a reference plane including the first hard tissue and at least part of the second hard tissue or an image projected on a parallel plane parallel to the reference plane including the first hard tissue and at least the part of the second hard tissue based on the three-dimensional image, wherein the control circuit is further configured to
generate the image for observation so that the reference plane includes a first axis of the first hard tissue,
extract a second axis of the second hard tissue,
generate the image for observation so that the reference plane includes at least a point on the second axis as the part of the second hard tissue,
rotate the second hard tissue about the point such that the second axis is placed on the reference plane, and
generate a cross-sectional image as the image for observation by cutting out the first hard tissue and the rotated second hard tissue along the reference plane.

US Pat. No. 10,339,696

ON DEMAND GEOMETRY AND ACCELERATION STRUCTURE CREATION WITH DISCRETE PRODUCTION SCHEDULING

Imagination Technologies ...

1. A method of 3-D geometry processing for graphics rendering, comprising:
producing final geometry from source geometry by applying one or more geometry modification processes to the source geometry, the producing characterized by a plurality of discrete productions, each producing final geometry limited to a subset of final geometry located in a 3-D scene; and
scheduling the plurality of discrete productions of the final geometry by collecting requests for particular sub-sets of final geometry into groups based on a scheduling criteria, and relatively ordering the plurality of discrete productions according to the scheduling criteria,
wherein the requests are collected from a rasterization sub-system and a ray tracing subsystem.

US Pat. No. 10,339,695

CONTENT-BASED MEDICAL IMAGE RENDERING BASED ON MACHINE LEARNING

Siemens Healthcare GmbH, ...

1. A method for content-based rendering based on machine learning in a rendering system, the method comprising:
loading, from memory, a medical dataset representing a three-dimensional region of a patient;
applying, by a machine, the medical dataset to a machine-learned model, the machine-learned model trained with deep learning to extract features from the medical dataset and trained to output values for two or more volume rendering parameters that correspond to input of the medical dataset, the two or more volume rendering parameters being settings of a volume renderer, the settings used by the volume renderer to control rendering, from three dimensions to two-dimensions, an image of the three-dimensional region of the patient;
rendering, by the volume renderer, the image of the three-dimensional region of the patient from the medical dataset using the output values resulting from the applying as the settings to control the rendering from the medical dataset, the rendering of the medical dataset of the three-dimensional region being to the image in the two-dimensions; and
transmitting the image.

US Pat. No. 10,339,694

RAY TRACING APPARATUS AND METHOD

Samsung Electronics Co., ...

1. A ray tracing apparatus comprising:
a ray generator configured to generate a ray and transmit the ray; and
a unified traversal (TRV)/intersection test (IST) processor configured to
receive the ray,
determine one of a ray-node intersection test, an intersection distance test, and a hit point test to be performed based on a state of the ray thereto, and
perform the determined test with respect to the ray,
wherein the unified TRV/IST processor is configured to perform the ray-node intersection test, the intersection distance test, and the hit point test with respect to the ray through a same pipeline, the pipeline including a plurality of stages, each of the stages including at least one arithmetic unit configured to perform processing for each of the ray-node intersection test, the intersection distance test, and the hit point test of the corresponding stage, and
determine the at least one arithmetic unit to be used at each of the stages based on a hardware area occupied by each of the at least one arithmetic unit.
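The ray-node intersection test this unified pipeline performs is, in the usual bounding-volume-hierarchy setting, a ray/axis-aligned-box test. A sketch of the standard slab method (a generic textbook version, not Samsung's hardware pipeline):

```python
# Illustrative sketch of a ray vs. axis-aligned bounding box ("ray-node")
# test. inv_dir holds 1/direction per axis; a large constant stands in
# for infinity on axes the ray does not travel along, avoiding division.

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    tmin, tmax = -float("inf"), float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))   # latest entry across all slabs
        tmax = min(tmax, max(t1, t2))   # earliest exit across all slabs
    return tmax >= max(tmin, 0.0)

# Ray from the origin along +x hits a box spanning x in [2, 3].
print(ray_aabb_hit((0, 0, 0), (1.0, 1e9, 1e9), (2, -1, -1), (3, 1, 1)))  # True
```

The claim's point is that this test, the intersection-distance test, and the hit-point test share one pipeline's arithmetic units rather than each having dedicated hardware.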

US Pat. No. 10,339,693

ELECTRONIC DEVICE, STORAGE MEDIUM, PROGRAM, AND DISPLAYING METHOD

Semiconductor Energy Labo...

1. A displaying method of an electronic device comprising the steps of:
displaying a first part, a second part, and a third part on a display screen of the electronic device, the display screen having flexibility and the second part being between the first part and the third part;
displaying a first object in the first part on the display screen;
calculating a three-dimensional shape of the display screen; and
moving the first object from the first part to the third part without passing through the second part only when the three-dimensional shape of the display screen satisfies a first condition,
wherein the first condition is that a portion of the display screen on which the second part is displayed protrudes downward and the first part and the third part are brought close to each other.
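The first condition gates the move on the screen's folded shape. A toy model of that gate (purely illustrative; the geometric quantities and thresholds are assumptions, not the patent's shape calculation):

```python
# Hypothetical gate: the object may jump from the first part to the third
# only when the middle of the flexible screen protrudes downward and the
# two end parts have been folded close together.

def may_move_object(middle_height, end_distance, flat_distance):
    folded_down = middle_height < 0                   # middle dips below ends
    ends_close = end_distance < 0.5 * flat_distance   # ends brought together
    return folded_down and ends_close

print(may_move_object(middle_height=-2.0, end_distance=3.0, flat_distance=10.0))  # True
print(may_move_object(middle_height=1.0, end_distance=3.0, flat_distance=10.0))   # False
```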

US Pat. No. 10,339,692

FOVEAL ADAPTATION OF PARTICLES AND SIMULATION MODELS IN A FOVEATED RENDERING SYSTEM

Sony Interactive Entertai...

1. A method for implementing a graphics pipeline, comprising:
generating a system of particles creating an effect in a virtual scene, the system of particles including a plurality of particle geometries;
determining a subsystem of particles from the system of particles, the subsystem of particles comprising a subset of particle geometries taken from the plurality of particle geometries;
determining a foveal region when rendering an image of the virtual scene, wherein the foveal region corresponds to where an attention of a user is directed;
determining that at least one portion of the effect is located in a peripheral region for the image;
rendering the subsystem of particles to generate the effect;
wherein the determining the subsystem of particles includes:
determining at least one particle of the system of particles is in the peripheral region;
generating a plurality of clusters of particles from the system of particles, wherein each of the clusters is represented by a corresponding aggregated particle, wherein the subsystem of particles includes the aggregated particles of the plurality of clusters of particles; and
simulating the effect using the subsystem of particles;
determining an aggregated location of the corresponding aggregated particle by averaging locations of particles in a corresponding cluster of particles;
determining an aggregated mass of the corresponding aggregated particle by summing masses of particles in the corresponding cluster of particles;
scaling a size of the corresponding aggregated particle based on a reduction ratio corresponding to the subsystem of particles; and
determining aggregated visual properties of the corresponding aggregated particle.
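The aggregation the claim describes, averaging locations and summing masses per cluster, is a standard level-of-detail reduction. A minimal sketch (particle layout and names are illustrative, not Sony's data structures):

```python
# Illustrative sketch: replace a cluster of peripheral particles with one
# aggregated particle at the average location carrying the summed mass.

def aggregate(cluster):
    """cluster: list of (x, y, mass) particles."""
    n = len(cluster)
    total_mass = sum(m for _, _, m in cluster)
    cx = sum(x for x, _, _ in cluster) / n
    cy = sum(y for _, y, _ in cluster) / n
    return (cx, cy, total_mass)

cluster = [(0.0, 0.0, 1.0), (2.0, 0.0, 1.0), (1.0, 3.0, 2.0)]
print(aggregate(cluster))  # (1.0, 1.0, 4.0)
```

Simulating the effect over the aggregated particles is cheaper than over the full system, which is acceptable in the peripheral region where the user's attention is not directed.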

US Pat. No. 10,339,691

STORAGE OF TIME-VARYING DATA AS PARAMETRIZED FUNCTIONS

Mapbox, Inc., San Franci...

1. A method comprising:
for a geographic area comprising a plurality of unit areas subdividing the geographic area:
receiving records obtained from a plurality of reporting devices when the reporting devices were located within the geographic area, the records including the unit areas in which the reporting devices were located and time stamps indicating when the reporting devices were located in those unit areas;
determining, for each of the plurality of unit areas, a relationship of raster density of the unit area as a function of time in a time window, wherein the raster density represents a density of records within the unit area and the relationship is represented by a vector of parameters each corresponding to a respective coefficient of the function;
generating a table, wherein each row of the table corresponds to a respective one of the plurality of unit areas, and each column of the table corresponds to a respective coefficient of the functions;
for at least one of the plurality of unit areas:
determining that a particular coefficient of the function is below a threshold coefficient value, and
responsive to determining that the particular coefficient is below the threshold coefficient value, performing at least one of setting the particular coefficient to zero and deleting the particular coefficient from the table; and
storing the determined table.
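The thresholding step above sparsifies the stored table by zeroing (or dropping) small coefficients of each unit area's density-vs-time function. A small sketch of that step, assuming the table is a mapping from unit area to a coefficient vector (structure and names are illustrative):

```python
# Illustrative sketch: zero any coefficient whose magnitude falls below
# the threshold, shrinking what must be stored per unit area.

def sparsify(table, threshold):
    """table: {unit_area_id: [c0, c1, c2, ...]} function coefficients."""
    return {area: [c if abs(c) >= threshold else 0.0 for c in coeffs]
            for area, coeffs in table.items()}

table = {"tile_a": [12.0, 0.003, 1.5],
         "tile_b": [0.001, 7.0, 0.2]}
print(sparsify(table, threshold=0.01))
# {'tile_a': [12.0, 0.0, 1.5], 'tile_b': [0.0, 7.0, 0.2]}
```

Storing a handful of coefficients per unit area in place of raw time-stamped records is what makes the parametrized representation compact.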

US Pat. No. 10,339,690

IMAGE RECOGNITION SCORING VISUALIZATION

Ricoh Co., Ltd., Tokyo (...

1. A method comprising:
receiving a request to view a job in a graphical user interface;
requesting, using one or more processors, the job by sending a token related to the job to a server storing data associated with the job via a network;
receiving, using the one or more processors, the job related to the token, wherein the job includes a panoramic image and metadata;
processing, using the one or more processors, the metadata to extract image recognition information, the image recognition information comprising features of a plurality of objects depicted within the panoramic image;
creating, using the one or more processors, a visualization of the job, wherein the visualization includes the panoramic image with the image recognition information overlaid on the panoramic image to highlight an object of the plurality of objects;
determining, using the one or more processors, an identity of the object of the plurality of objects depicted within the panoramic image by matching the features of the object of the plurality of objects with a potential candidate in a product database that has a highest recognition score;
determining, using the one or more processors, a correctness of the determined identity of the object of the plurality of objects based on the matching;
updating, using the one or more processors, the visualization of the job by removing portions of the visualization that do not include the plurality of objects; and
displaying in the graphical user interface, the updated visualization of the job and a correctness indicator signaling the correctness of the object of the plurality of objects.

US Pat. No. 10,339,688

SYSTEMS AND METHODS FOR RENDERING EFFECTS IN 360 VIDEO

CYBERLINK CORP., Shindia...

1. A method implemented in a computing device for inserting an effect into a 360 video, comprising:
receiving the effect from a user;
receiving a target region from the user, the target region corresponding to a location within the 360 video for inserting the effect, wherein receiving the target region from the user comprises:
positioning the effect in a front view of the spherical model; and
receiving input from the user for rotating the spherical model while a location of the effect remains static, wherein the spherical model is rotated until the effect is located in the target region within the spherical model; and
for each frame in the 360 video:
inserting the effect on a surface of a spherical model based on the target region;
generating two half-sphere frames containing the effect from the spherical model;
stitching the two half-sphere frames to generate a panoramic representation of the effect; and
blending the panoramic representation of the effect with an original source panorama to generate a modified 360 video frame with the effect.

US Pat. No. 10,339,687

IMAGE PROCESSING APPARATUS, METHOD FOR CONTROLLING SAME, IMAGING APPARATUS, AND PROGRAM

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:
an acquisition unit configured to obtain a plurality of viewpoint images and imaging information corresponding to the plurality of viewpoint images;
an image processing unit configured to apply image processing to image data based on the viewpoint images;
a setting unit configured to set a parameter of the image processing by the image processing unit based on a user operation; and
a limitation unit configured to limit the parameter settable by a user via the setting unit in the image processing based on the imaging information,
wherein a parameter of blur shift processing for shifting a position of a blur in an image is a composition ratio in synthesizing the viewpoint images, the blur shift processing being applied by the image processing unit, the parameter being set by the setting unit.

US Pat. No. 10,339,686

IMAGE PROCESSING DEVICE, ENDOSCOPE APPARATUS, AND IMAGE PROCESSING METHOD

OLYMPUS CORPORATION, Tok...

12. An image processing method comprising:acquiring, from an imaging section having an optical system that can capture a plurality of images that differ from each other as to an in-focus object plane position, a plurality of captured images that have been captured by the imaging section in an i-th (wherein i is an integer that satisfies 1≤i≤N) frame and that differ from each other as to the in-focus object plane position, the imaging section performing a frame sequential imaging process while sequentially irradiating illumination lights that differ in spectral characteristics from an illumination section, and one cycle of the frame sequential imaging process including first to N-th (wherein N is an integer equal to or larger than 2) frames;
calculating a first synthesis map with respect to the i-th frame based on an evaluation value which represents an in-focus degree of each of the captured images in the plurality of captured images that have been captured in the i-th frame and that differ from each other as to the in-focus object plane position;
acquiring, from the imaging section, a plurality of captured images that have been captured by the imaging section in a k-th (wherein k is an integer that satisfies 1≤k≤N and k≠i) frame and that differ from each other as to the in-focus object plane position;
calculating the first synthesis map with respect to the k-th frame based on an evaluation value which represents an in-focus degree of each of the captured images in the plurality of captured images that have been captured in the k-th frame and that differ from each other as to the in-focus object plane position;
calculating a second synthesis map including information that determines a pixel value of each pixel when synthesizing the plurality of captured images that have been captured in the i-th frame and that differ from each other as to the in-focus object plane position, based on the first synthesis map calculated with respect to the i-th frame, and the first synthesis map calculated with respect to the k-th frame that differs from the i-th frame;
generating an i-th synthesized image which brings an object that is distributed over a wider distance range than the plurality of captured images before being synthesized into focus, by performing a synthesis process that synthesizes the plurality of captured images that have been captured in the i-th frame and that differ from each other as to the in-focus object plane position based on the second synthesis map; and
generating a color image by synthesizing first to N-th synthesized images that have been generated with respect to the first to N-th frames.

US Pat. No. 10,339,685

SYSTEM FOR BEAUTY, COSMETIC, AND FASHION ANALYSIS

Northeastern University, ...

1. A system for analyzing an image of a human face for the presence of makeup, comprising:one or more processors and memory, including a dataset comprising images of human faces, the images comprising facial images of multiple human subjects, and including multiple images associated with a single human subject showing steps of makeup application including a face with no makeup, a face with an intermediate stage of makeup application, and a face with a final makeup application;
wherein the one or more processors have been trained using the dataset to predict an image of a human face without makeup from an input image of a human face wearing makeup;
machine-readable instructions stored in the memory, that upon execution by the one or more processors cause the system to carry out operations comprising:
receiving from an input device an input image of a human face wearing makeup;
detecting the presence of the makeup on the input image;
decomposing the input image to remove the makeup from the input image by applying a mapping from makeup features to non-makeup features; and
providing to an output device an output image of the human face with the makeup removed from the image.

US Pat. No. 10,339,684

VISUALIZATION OF CONNECTED DATA

International Business Ma...

1. A computer-implemented method comprising:receiving a request to create a graphics object by rendering a set of generic data according to at least two or more visualization types among a plurality of possible visualization types;
providing a plurality of data retrieval plugins for fetching additional data bearing connection information about how to structure the generic data during rendering without having to alter the generic data relative to the computer system, wherein the at least two or more visualization types are configured to render a plurality of data points that are at least one of connected and non-connected, wherein the plurality of connected data points is rendered hierarchically, and wherein the plurality of non-connected data points is rendered in a horizontal list;
fetching the generic data and the additional data from at least one data store by:
fetching the generic data using inbuilt data retrieval logic;
selecting and loading at least one of the data retrieval plugins; and
fetching the additional data by running one or more of the loaded data retrieval plugins that satisfy the request from each of the at least two or more visualization types, wherein the additional data includes connection information about how to structure the generic data; and
rendering the fetched data set into the requested graphics object by:
receiving the generic data and the additional data that has been fetched;
rendering the generic data, using a visualization logic relative to the visualization unit, and the additional data according to the at least two or more visualization types to create the requested graphics object, wherein the rendered additional data is passed to a visualization unit, via at least one visualization plugin of a plurality of visualization plugins, without having to further process the additional data, wherein the visualization unit is operable to:
select and load the at least one visualization plugin; and
render, at least in part, by running the at least one visualization plugin, wherein at least a subset of the generic data and the additional data are rendered jointly by at least one of the selected and loaded visualization plugins to create the requested graphics object; and
outputting the graphics object in reply to the request; and
storing a list of supported visualization types, each of the supported visualization types comprising an annotation specifying the additional data that is associated with that visualization type.

US Pat. No. 10,339,683

PERFORMANCE DIAGNOSTIC FOR VIRTUAL MACHINES

VMWARE, INC., Palo Alto,...

1. A virtual machine system with improved performance diagnostic for virtual machines, comprising:networked host computers running the virtual machines, the virtual machines running applications;
a virtual machine manager to manage the virtual machines, the virtual machine manager comprising:
a performance analyzer to provide a plurality of highest ranking regions in a regions list comprising data points of a performance metric for a virtual machine, the performance analyzer comprising:
a region abstractor to create regions of various time intervals in the regions list, each region being a parent, a child, or a neighbor to a number of other regions in the regions list;
a region sorter to sort the regions in the regions list by variance and mean;
a region pruner, comprising:
a child-parent region pruner to remove any child region from the regions list when its parent region has a variance that substantially represents the child region; and
a neighbor region merger to process the regions list after the child-parent region pruner, wherein the neighbor region merger is to replace any two neighboring regions in the regions list with a merged region comprising the two neighboring regions when the merged region has a variance that substantially represents the two neighboring regions;
a statistics subsystem to:
collect the data points from the host computers and provide them to the performance analyzer;
generate a chart of the data points;
receive the plurality of highest ranking regions in the regions list from the performance analyzer and visually indicate them in the chart;
display the chart or transmit it over a computer network; and
display recorded events and recorded alarms corresponding to the plurality of highest ranking regions in the regions list or transmit them over the computer network; and
an events and alarms subsystem to record events and alarms for the virtual machine and provide the recorded events and the recorded alarms corresponding to the plurality of highest ranking regions in the regions list to the statistics subsystem.
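The child-parent pruning step of this claim can be sketched as below. The claim only says the parent's variance must "substantially represent" the child's; the relative `tolerance` threshold and the dict-of-intervals representation are assumptions made for illustration, not the patented implementation.

```python
def prune_child_regions(regions, tolerance=0.1):
    """Remove any region whose parent's variance 'substantially
    represents' its own (here: within a relative tolerance).

    regions: dict mapping (start, end) time intervals to variance.
    A parent is any other region whose interval contains the child's.
    """
    def is_parent(p, c):
        return p[0] <= c[0] and c[1] <= p[1] and p != c

    kept = {}
    for child, var in regions.items():
        represented = any(
            is_parent(parent, child)
            and abs(regions[parent] - var) <= tolerance * max(var, 1e-9)
            for parent in regions
        )
        if not represented:
            kept[child] = var
    return kept
```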

US Pat. No. 10,339,681

INTERACTIVE MULTIMEDIA PROCESS FLOW CHART BUILDER

RAYTHEON COMPANY, Waltha...

1. A method for building an interactive multimedia process flow chart, the method comprising:outputting a process flow chart in an interactive graphically editable format on a user interface by an interactive multimedia process flow chart builder tool executable on a processing subsystem of an interactive multimedia process flow chart system, the process flow chart comprising a plurality of nodes each located entirely in a single node cell of a node grid, wherein the user interface is interactively displayed on a first client interface of a first client system configured to communicate with the interactive multimedia process flow chart system across a network;
linking, by the interactive multimedia process flow chart builder tool responsive to user input from the first client interface, a pair of output hubs on a decision node of the process flow chart to a pair of nodes defining a yes-path and a no-path to form a troubleshooting tree;
linking, by the interactive multimedia process flow chart builder tool responsive to user input from the first client interface, an output hub of a process step node of the process flow chart to a single node defining a next process step in the troubleshooting tree;
establishing, by the interactive multimedia process flow chart builder tool responsive to user input from the first client interface, a link to an end node of the process flow chart absent any output links from the end node in the troubleshooting tree, wherein the decision node, the process step node, and the end node all have a same common shape including a display region illustrating node information and an editing command region depicting at least one node editing command;
saving the process flow chart to a data storage system of the interactive multimedia process flow chart system;
reading the process flow chart from the data storage system by an interactive multimedia process flow chart analysis tool executable on the processing subsystem of the interactive multimedia process flow chart system, wherein the reading is performed responsive to a second client interface of a second client system configured to communicate with the interactive multimedia process flow chart system across the network; and
executing a troubleshooting process, by the interactive multimedia process flow chart analysis tool responsive to user input from the second client interface, based on the process flow chart to control traversal through a sequence of troubleshooting steps comprising the plurality of nodes and a plurality of paths to identify a root cause and an associated remedy, wherein the interactive multimedia process flow chart analysis tool prevents access to the at least one node editing command that is accessible during execution of the interactive multimedia process flow chart builder tool.

US Pat. No. 10,339,680

GRAPHICS CONTROL DATA FOR PERFORMING SKELETON-BASED MODIFICATIONS OF A TYPEFACE DESIGN

Adobe Inc., San Jose, CA...

1. A method for generating graphics control data used in performing skeleton-based modifications of a typeface design, the method comprising:accessing, by a typeface processing application executed by one or more processing devices, a character graphic for a character from a typeface, the character graphic comprising (i) a character skeleton that includes a set of control points and a set of curves defined by the set of control points and (ii) a character outline including one or more shapes that surround the character skeleton;
computing, for a design parameter of a computer-implemented typeface design application, a graphics control dataset based on a particular control point from the set of control points, wherein computing the graphics control dataset comprises:
identifying a pair of positions of the particular control point that correspond, respectively, to a pair of design parameter values of the design parameter,
identifying a pair of expansions of the character outline with respect to the particular control point, wherein the pair of expansions correspond, respectively, to the pair of design parameter values, and
generating the graphics control dataset that includes (i) intermediate positions of the particular control point between the pair of positions and (ii) intermediate expansions of the character outline between the pair of expansions; and
outputting the graphics control dataset from the typeface processing application to the typeface design application, wherein the typeface design application is configured to display, responsive to a selection of a design parameter value, a modified character design that includes modified curves generated from a portion of the graphics control dataset.
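The "intermediate positions" and "intermediate expansions" of this claim amount to interpolating between the two design-parameter extremes. A minimal sketch, assuming linear interpolation and scalar expansions (the patent does not specify either):

```python
def control_point_samples(p0, p1, e0, e1, steps):
    """Generate intermediate control-point positions and outline
    expansions between two design-parameter extremes.

    p0, p1: (x, y) positions of the control point at the two
    design parameter values; e0, e1: corresponding expansions.
    Returns a list of (position, expansion) samples.
    """
    out = []
    for i in range(steps + 1):
        t = i / steps  # interpolation parameter in [0, 1]
        pos = (p0[0] + t * (p1[0] - p0[0]),
               p0[1] + t * (p1[1] - p0[1]))
        exp = e0 + t * (e1 - e0)
        out.append((pos, exp))
    return out
```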

US Pat. No. 10,339,679

DYNAMIC PATH MODIFICATION AND EXTENSION

Adobe Inc., San Jose, CA...

1. In a digital medium environment to dynamically modify an existing path in a user interface, a method implemented by a computing device, the method comprising:receiving, by the computing device, an un-parameterized input originated by user interaction with a user interface to specify a path to be drawn;
fitting, by the computing device, a parameterized path as a mathematical ordering representation of the path;
determining, by the computing device, the parameterized path is logically suitable to modify an internal segment of the existing path, the determining based on detecting acceleration of the un-parameterized input as less than a threshold amount; and
modifying, by the computing device, the internal segment of the existing path in the user interface by blending the internal segment with the parameterized path in response to the determining that the parameterized path is to modify the existing path.

US Pat. No. 10,339,678

SYSTEM AND METHOD FOR MOTION ESTIMATION AND COMPENSATION IN HELICAL COMPUTED TOMOGRAPHY

University of Central Flo...

1. A method for estimating and compensating for motion by reducing motion artifacts in an image reconstruction from helical computed tomography (CT) scan data of an object of interest, the method comprising:collecting helical computed tomography (CT) scan data of an object of interest, wherein the scan data is acquired using a radiation source to generate a cone beam and a radiation detector to detect the cone beam;
selecting a plurality of center-points along a trajectory of the radiation source;
identifying a plurality of pairs of sections along the trajectory of the radiation source, wherein each of the plurality of pairs of sections is associated with one of the plurality of center-points and wherein a first section of each of the pairs of sections and a second section of each of the pairs of sections are positioned on opposite sides of the center-point;
selecting a subset of the plurality of pairs of sections;
reconstructing, for each pair of the subset, a first partial image from the scan data of the first section and a second partial image from the scan data of the second section;
performing image registration of the first partial image and the second partial image for each pair of the subset to estimate a deformation that transforms the first partial image into the second partial image, wherein the deformation is representative of motion of the object of interest during the scan; and
generating a motion compensated image by reconstructing the object of interest using the scan data and the estimated deformation.

US Pat. No. 10,339,677

SYSTEMS AND METHODS FOR IMAGE DATA PROCESSING IN COMPUTERIZED TOMOGRAPHY

SHANGHAI UNITED IMAGING H...

1. A system, comprising:a scanner having a collimator, a collimation width of the collimator being adjustable;
one or more non-transitory storage devices including a set of instructions for image data processing; and
at least one processor configured to communicate with the one or more non-transitory storage devices, wherein when executing the set of instructions, the at least one processor is configured to cause the system to:
obtain a relationship between correction coefficients and collimation widths;
obtain a target collimation width of the collimator; and
determine a target correction coefficient based on the target collimation width and the relationship between correction coefficients and collimation widths, wherein to obtain the relationship between correction coefficients and collimation widths, the at least one processor is configured to cause the system to:
obtain a first correction coefficient, the first correction coefficient corresponding to a first collimation width of the collimator;
obtain a relationship between scattered radiation intensities and collimation widths; and
determine the relationship between correction coefficients and collimation widths based on the first correction coefficient, the first collimation width, and the relationship between scattered radiation intensities and collimation widths.
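Once the relationship between correction coefficients and collimation widths is known, determining a target coefficient reduces to a table lookup. The sketch below is a simplified stand-in that interpolates directly over sampled (width, coefficient) pairs; the claim instead derives the relationship from one measured coefficient plus a scattered-radiation-intensity relationship, which is not reproduced here.

```python
def correction_for_width(width_coeff_pairs, target_width):
    """Linearly interpolate a correction coefficient for a target
    collimation width from known (width, coefficient) samples."""
    pairs = sorted(width_coeff_pairs)
    for (w0, c0), (w1, c1) in zip(pairs, pairs[1:]):
        if w0 <= target_width <= w1:
            t = (target_width - w0) / (w1 - w0)
            return c0 + t * (c1 - c0)
    raise ValueError("target width outside calibrated range")
```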

US Pat. No. 10,339,675

TOMOGRAPHY APPARATUS AND METHOD FOR RECONSTRUCTING TOMOGRAPHY IMAGE THEREOF

SAMSUNG ELECTRONICS CO., ...

1. A tomography method for generating a computed tomography (CT) image, the tomography method comprising:generating a first tomography image based on first raw data corresponding to a received X-ray comprising acquired photons;
determining second raw data by generating a second tomography image having an increased resolution in comparison with the first tomography image and performing forward projection on the second tomography image;
determining third raw data based on a first parameter, the first raw data, and the second raw data; and
generating a third tomography image based on the third raw data,
wherein the determining of the third raw data is based on information about a distribution of the acquired photons and a number of the acquired photons, the information being included in at least one from among the first raw data and the second raw data,
wherein the second tomography image has the increased resolution in comparison with the first tomography image in an arrangement direction of a detector cell detecting the X-ray.

US Pat. No. 10,339,674

QUANTITATIVE DARK-FIELD IMAGING IN TOMOGRAPHY

KONINKLIJKE PHILIPS N.V.,...

1. An image signal processing apparatus configured for improving fidelity or accuracy of dark field images in differential X-ray phase contrast imaging, the image signal processing apparatus comprising:a signal input port configured to receive from a detector interferometric measurement data detected by the detector in response to an X-ray beam incident on said detector after projection of said beam through a specimen to be imaged, the data comprising a phase contrast signal and a dark field signal;
a processor configured to execute an image reconstruction algorithm to reconstruct at least the dark field signal into a dark field image, wherein the reconstructing of at least the dark field signal into the dark field image is based on a forward model that incorporates a model component configured to capture cross-talk of the phase contrast signal into the dark field signal; and
an image output port configured to output to a display device at least the reconstructed dark field image.

US Pat. No. 10,339,673

DUAL-ENERGY RAY IMAGING METHODS AND SYSTEMS

Tsinghua University, Bei...

1. A dual-energy ray imaging method comprising the steps of:performing a dual-energy transmissive scanning on an object to be inspected to acquire high-energy projection data and low-energy projection data for at least a part of the object to be inspected;
determining whether the high-energy projection data and low-energy projection data correspond to a combination of two base materials by using a lookup table for single base materials;
for each pixel that is determined to correspond to a combination of the two base materials, searching a high and low energy projection database for mass thicknesses of the two base materials by using the high-energy projection data and the low-energy projection data according to an equation that is determined by using a surface fitting method, the two base materials comprising a first base material and a second base material;
calculating a first high and low energy data set corresponding to the first base material and a second high and low energy data set corresponding to the second base material based on respective mass attenuation coefficients and the mass thicknesses of the two base materials; and
performing a substance recognition by using the first high and low energy data set and second high and low energy data set.

US Pat. No. 10,339,672

METHOD AND ELECTRONIC DEVICE FOR VERIFYING LIGHT SOURCE OF IMAGES

Samsung Electronics Co., ...

1. An electronic device comprising:an image sensor comprising a pixel array; and
an image processor electrically connected with the image sensor, wherein the image processor is configured to:
acquire a first image of a subject generated based on a first group of optical paths and acquire a second image of the subject generated based on a second group of optical paths, the second group of optical paths having a phase difference with the first group of optical paths from the pixel array,
normalize pixel values included in the first image and pixel values included in the second image in units of a pixel, and
verify a light source of light reflected by the subject and/or a light source of light produced from the subject, based on a parameter associated with changes in the normalized pixel values.

US Pat. No. 10,339,670

3D TOOL TRACKING AND POSITIONING USING CAMERAS

Trimble Inc., Sunnyvale,...

1. A system for detecting a tool in a scene, the system comprising:a camera unit for acquiring an image of a scene; and
one or more processors configured to:
divide the image into a first number of patches to generate a first set of patches, wherein a patch is an area of the image,
perform object detection on the first set of patches to identify the tool in a patch of the first set of patches,
determine that the tool is not detected in the first set of patches,
divide the image into a second number of patches to generate a second set of patches, after determining that the tool is not detected in the first set of patches,
perform object detection on the second set of patches to identify the tool in a patch of the second set of patches, and
determine that the tool is detected in a patch of the second set of patches.
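The coarse-to-fine search in this claim can be sketched as follows. The grid sizes, the square-tiling assumption, and the `detect` callback interface are illustrative; the patent only requires a first and a second number of patches, with the second tiling tried after detection fails on the first.

```python
def detect_in_patches(image_w, image_h, detect, patch_counts=(4, 16)):
    """Tile the image into successively more patches and stop at the
    first tiling in which `detect` finds the tool.

    detect: callable taking a patch box (x0, y0, x1, y1) and returning
    True if the tool is detected in that patch.
    Returns the winning patch box, or None if never detected.
    """
    for n in patch_counts:
        side = int(n ** 0.5)              # side x side grid of patches
        pw, ph = image_w / side, image_h / side
        for r in range(side):
            for c in range(side):
                box = (c * pw, r * ph, (c + 1) * pw, (r + 1) * ph)
                if detect(box):
                    return box
    return None                           # tool not found at any scale
```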

US Pat. No. 10,339,669

METHOD, APPARATUS, AND SYSTEM FOR A VERTEX-BASED EVALUATION OF POLYGON SIMILARITY

HERE Global B.V., Eindho...

1. A computer-implemented method for evaluating polygon similarity using a computer vision system comprising:processing, by the computer vision system, an image to generate a first set of vertices of a first polygon representing an object depicted in the image;
for each vertex in the first set of vertices, determining a closest vertex in a second set of vertices of a second polygon, and determining a distance between said each vertex in the first set of vertices and the closest vertex in the second set of vertices;
calculating a polygon similarity of the first polygon with respect to the second polygon based on a total of the distance determined for said each vertex in the first set of vertices normalized to a number of vertices in the first set of vertices; and
transmitting the polygon similarity over a network to a mapping platform,
wherein the mapping platform processes the polygon similarity to localize a vehicle within a geographic area.
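The similarity measure in this claim — the total closest-vertex distance normalized to the number of vertices in the first polygon — is straightforward to sketch. The function name and tuple representation are assumptions; lower values indicate more similar polygons.

```python
def polygon_similarity(first, second):
    """Mean closest-vertex distance from each vertex of `first`
    to the vertex set of `second` (lower = more similar)."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    # For each vertex of the first polygon, find its nearest vertex in
    # the second polygon, sum the distances, and normalize.
    total = sum(min(dist(v, w) for w in second) for v in first)
    return total / len(first)
```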

US Pat. No. 10,339,668

OBJECT RECOGNITION APPARATUS

FANUC CORPORATION, Yaman...

1. An object recognition apparatus comprising:a two-dimensional sensor for acquiring two-dimensional information of an object at a first clock time;
a three-dimensional sensor for acquiring three-dimensional information of the object at a second clock time;
a storage unit that associates and stores a first position of the two-dimensional sensor at the first clock time and the two-dimensional information and that associates and stores a second position of the three-dimensional sensor at the second clock time and the three-dimensional information; and
an arithmetic operation unit that calculates an amount of change in orientation between the orientation of the two-dimensional sensor at the first position and the orientation of the three-dimensional sensor at the second position on the basis of the first position and the second position stored in the storage unit, that converts the three-dimensional information acquired at the second position into three-dimensional information acquired at the first position on the basis of the calculated amount of change in orientation, and that calculates a state of the object on the basis of the converted three-dimensional information and the two-dimensional information.

US Pat. No. 10,339,665

POSITIONAL SHIFT AMOUNT CALCULATION APPARATUS AND IMAGING APPARATUS

Canon Kabushiki Kaisha, ...

1. A positional shift amount calculation apparatus that calculates a positional shift amount, which is a relative positional shift amount between a first image based on a luminous flux that has passed through a first imaging optical system, and a second image, the apparatus comprising:at least one processor operatively coupled to a memory to function as:
(a) a calculation unit adapted to calculate a positional shift amount based on data within a predetermined area out of first image data representing a first image and second image data representing a second image; and
(b) a setting unit adapted to set a relative size of the area to the first and second image data,
wherein (i) the calculation unit is adapted to calculate a first positional shift amount using the first image data and the second image data in the area having a first size that is preset, (ii) the setting unit is adapted to set a second size of the area based on the size of the first positional shift amount and an optical characteristic of the first imaging optical system, and (iii) the calculation unit is adapted to calculate a second positional shift amount using the first image data and the second image data in the area having the second size, and
wherein, when an absolute value of the first positional shift amount is greater than a predetermined threshold, the setting unit sets the second size to be larger as the absolute value of the first positional shift amount is greater.

US Pat. No. 10,339,664

SIGNAL DETECTION, RECOGNITION AND TRACKING WITH FEATURE VECTOR TRANSFORMS

Digimarc Corporation, Be...

1. A method of object recognition comprising:receiving a sequence of images captured of a scene by an image sensor;
using a hardware processor of a computer system, performing a feature vector transform on plural images in the sequence of the images to produce an N-dimensional feature vector per pixel of the plural images, the feature vector transform producing, for each pixel in an array of pixels, a first vector component corresponding to plural comparisons between a center pixel and pixels at plural directions around the center pixel for a first scale, and a second vector component corresponding to plural comparisons between the center pixel and pixels at plural directions around the center pixel for a second scale,
wherein N is a number of dimensions of the N-dimensional feature vector;
wherein the plural comparisons at the first and second scales comprise quantized differences, and the quantized differences are encoded in a first data structure representing magnitude and direction of the quantized differences at each of the first and second scales; and
using a hardware processor of a computer system, deriving a second data structure characterizing geometry of an object in the scene from N-dimensional feature vectors represented using the first data structure, obtaining a pixel patch geometrically registered to the geometry of the object, and identifying the object by processing the registered pixel patch with a digital watermark reader to extract an identifier, a barcode reader to extract an identifier, or a trained classifier to identify the object according to training images for the object.

US Pat. No. 10,339,663

GENERATING GEOREFERENCE INFORMATION FOR AERIAL IMAGES

SKYCATCH, INC., San Fran...

1. A computer-implemented method comprising:accessing a first plurality of aerial images of a site, the first plurality of aerial images comprising georeference information for the site;
building an initial three-dimensional representation of the site from the first plurality of aerial images;
generating, using at least one processor, a transformation based on the initial three-dimensional representation of the site and the georeference information for the site;
accessing a plurality of new aerial images of the site without georeference information;
building, by the at least one processor, a new unreferenced three-dimensional representation of the site based, at least in part, on the plurality of new aerial images of the site without georeference information; and
applying, by the at least one processor, the transformation generated based on the initial three-dimensional representation of the site and the georeference information for the site to the new unreferenced three-dimensional representation built based on the plurality of new aerial images of the site without georeference information to create a new georeferenced three-dimensional representation of the site.
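The last step of this claim applies a previously learned transformation to the unreferenced model. A minimal 2-D sketch, assuming the transformation is a similarity transform (scale, rotation, translation) acting on point coordinates; how the transformation is generated from the georeferenced model is outside this sketch.

```python
from math import cos, sin

def apply_georeference(points, scale, theta, tx, ty):
    """Apply a 2-D similarity transform (derived elsewhere from a
    georeferenced model) to points of an unreferenced model,
    producing georeferenced coordinates."""
    c, s = cos(theta), sin(theta)
    return [
        # Rotate, scale, then translate each point.
        (scale * (c * x - s * y) + tx, scale * (s * x + c * y) + ty)
        for x, y in points
    ]
```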

US Pat. No. 10,339,662

REGISTERING CAMERAS WITH VIRTUAL FIDUCIALS

Microsoft Technology Lice...

1. A multi-camera imager (MCI) comprising:first and second cameras each having a center of projection, an optical axis extending from the center of projection and an image plane intersected by the optical axis at a principal point, wherein the first camera comprises a time of flight (TOF) range camera and the second camera comprises an RGB picture camera;
memory having stored therein:
a fundamental matrix based on a polynomial transform configured to map image plane coordinates of images of features of a calibration target imaged by the TOF range camera on the image plane of the TOF range camera to image plane coordinates of images of the same features of the calibration target as imaged by the RGB picture camera on the image plane of the RGB picture camera; and instructions executable to register images of a scene acquired by the first and second cameras to each other based on the fundamental matrix; and
a processor configured to execute the instructions to register the images of the scene based on the fundamental matrix.

US Pat. No. 10,339,661

MOVEMENT DIRECTION DETERMINATION METHOD AND MOVEMENT DIRECTION DETERMINATION DEVICE

PANASONIC INTELLECTUAL PR...

1. A movement-direction determination method of causing a processor to determine a movement direction of a camera that images a polygonal recognition target, the method comprising:causing the processor to acquire an orientation of the camera acquired by a sensor included in the camera;
causing the processor to acquire an image of the recognition target imaged by the camera;
causing the processor to determine a number of corners of the recognition target included in the image;
causing the processor to determine the movement direction of the camera based on the orientation of the camera and the number of corners; and
causing the processor to determine coordinates of a corner which is not included in the image, and to determine the movement direction of the camera based on the orientation of the camera and the coordinates of the corner which is not included in the image, in a case where the number of corners of the recognition target included in the image is one less than an actual number of corners of the recognition target.
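
The one-corner-missing case in the claim can be illustrated for a four-corner (parallelogram) target: reconstruct the missing corner from the three visible ones and steer toward it. This toy omits the camera-orientation input the claim also uses; the frame sizes and direction labels are illustrative.

```python
def missing_corner(visible):
    # Three visible corners (in order) of an assumed-parallelogram target:
    # the fourth corner is a + c - b for consecutive corners a, b, c.
    (ax, ay), (bx, by), (cx, cy) = visible
    return (ax + cx - bx, ay + cy - by)

def movement_direction(frame_w, frame_h, corners_in_frame, total_corners=4):
    # Toy version of the claimed decision: when exactly one corner of the
    # recognition target lies outside the image, estimate its coordinates
    # and move the camera toward it.
    if len(corners_in_frame) != total_corners - 1:
        return None                      # claim covers the one-missing case
    mx, my = missing_corner(corners_in_frame)
    dx, dy = mx - frame_w / 2, my - frame_h / 2
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```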

US Pat. No. 10,339,660

VIDEO FINGERPRINT SYSTEM AND METHOD THEREOF

SHANGHAI XIAOYI TECHNOLOG...

1. A method for generating a transformed representation of a quantity of video data structured as a plurality of frames including arrays of rows and columns of pixels having pixel properties, comprising:generating first representations of the video data based on a plurality of the rows;
wherein generating the first representations includes determining first pluralities of statistical values based on pixel property values of pixels in the rows;
generating second representations of the video data based on a plurality of the columns;
wherein generating the second representations includes determining second pluralities of statistical values based on pixel property values of pixels in the columns;
generating frame representations corresponding to the frames and based on the first and second representations; and
combining the frame representations to form the transformed representation of the video data.
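
The row/column-statistics fingerprint reads naturally as: per frame, compute statistical values over every row (first representations) and every column (second representations), then concatenate across frames. A minimal sketch, using mean and standard deviation as the (unspecified) statistics:

```python
import numpy as np

def frame_representation(frame):
    # First representations: statistics over each row; second
    # representations: statistics over each column.
    rows = np.concatenate([frame.mean(axis=1), frame.std(axis=1)])
    cols = np.concatenate([frame.mean(axis=0), frame.std(axis=0)])
    return np.concatenate([rows, cols])

def video_fingerprint(frames):
    # Combine per-frame representations into the transformed representation.
    return np.concatenate([frame_representation(f) for f in frames])
```

Because only row/column aggregates survive, the representation is far smaller than the video while remaining stable for identical content.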

US Pat. No. 10,339,654

SYSTEMS, DEVICES, AND METHODS FOR TRACKING MOVING TARGETS

Kineticor, Inc., Honolul...

1. A motion compensation system for tracking and compensating for subject motion during a magnetic resonance (MR) scan, the motion compensation system comprising:a magnetic resonance (MR) scanner;
at least two detectors positioned so as to view an optical landmark on a subject from different directions with each of the at least two detectors being configured to record two dimensional images of the optical landmark, wherein the at least two detectors are affixed to an exterior surface of a head coil configured to be used in conjunction with the magnetic resonance (MR) scanner;
one or more computer readable storage devices configured to store a plurality of computer executable instructions; and
one or more hardware computer processors in communication with the one or more computer readable storage devices and configured to execute the plurality of computer executable instructions in order to cause the system to determine a position of the subject, wherein the determining the position of the subject comprises:
identifying the optical landmark and displacement of the optical landmark based on optical images collected by the at least two detectors;
utilizing an iteration procedure, wherein the iteration procedure comprises testing an approximate first-order solution against the identified target point to determine residual errors and dividing the determined residual errors by local derivatives with respect to rotation and translation to determine an iterative correction;
repeating the iteration procedure until the residual errors are within predetermined levels of accuracy; and
utilizing the repeated iteration procedure to determine the position of the subject at rates of at least 100 times per second.
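
The claimed iteration (test a first-order solution, form residual errors, divide by local derivatives with respect to rotation and translation, correct, repeat) is essentially Gauss-Newton. The sketch below shows it for a 2-D rotation-plus-translation toy problem rather than the full 6-DOF head pose; everything here is illustrative.

```python
import numpy as np

def iterate_pose(landmarks, observed, theta=0.0, t=(0.0, 0.0), iters=20):
    # Gauss-Newton flavour of the claimed loop: test the current first-order
    # pose against the observations, form residual errors, divide them by
    # local derivatives w.r.t. rotation/translation (a least-squares solve),
    # and apply the iterative correction until the residuals are tiny.
    theta, t = float(theta), np.asarray(t, dtype=float)
    P = np.asarray(landmarks, dtype=float)
    for _ in range(iters):
        c, s = np.cos(theta), np.sin(theta)
        pred = P @ np.array([[c, -s], [s, c]]).T + t
        r = (observed - pred).ravel()                  # residual errors
        dR = np.array([[-s, -c], [c, -s]])             # d(rotation)/d(theta)
        J = np.column_stack([(P @ dR.T).ravel(),       # local derivatives
                             np.tile([1.0, 0.0], len(P)),
                             np.tile([0.0, 1.0], len(P))])
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        theta, t = theta + step[0], t + step[1:]
        if np.abs(r).max() < 1e-12:                    # accuracy level reached
            break
    return theta, t

# Recover a known 2-D pose from exact observations.
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 1.0]])
c0, s0 = np.cos(0.3), np.sin(0.3)
observed = P @ np.array([[c0, -s0], [s0, c0]]).T + np.array([2.0, -1.0])
theta_hat, t_hat = iterate_pose(P, observed)
```

For well-conditioned landmark sets this converges in a handful of iterations, which is what makes the claimed 100 Hz update rate plausible.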

US Pat. No. 10,339,652

IMAGE RECONSTRUCTION

Shenyang Neusoft Medical ...

1. A method of image reconstruction, comprising:performing a pilot image scanning to generate a pilot image;
determining a first primary scanning condition;
determining an event indicating a need for an auxiliary scanning occurs using at least one of the pilot image or the first primary scanning condition;
obtaining an auxiliary scanning condition according to the event;
performing the auxiliary scanning on an object using the auxiliary scanning condition to generate an auxiliary image; and
in response to a determination to perform a primary scanning based on the auxiliary image, determining a second primary scanning condition based on the auxiliary scanning condition and performing the primary scanning on the object using the second primary scanning condition to generate a primary image.

US Pat. No. 10,339,643

ALGORITHM AND DEVICE FOR IMAGE PROCESSING

NIKON CORPORATION, (JP)

1. A method for evaluating an image to identify areas of the image that are suitable for point spread function estimation, the method comprising:selecting a first image region from the image with a control system that includes a processor, the first image region including a plurality of pixels;
estimating gradients in at least a portion of the first image region with the control system by analyzing each of the pixels in the at least a portion of the first image region;
identifying a first region feature of the first image region with the control system, the first region feature being a low-level feature that is related to an accuracy of the point spread function estimation;
calculating a first feature value for the first region feature with the control system utilizing the estimated gradients in the at least a portion of the first image region;
transforming the first feature value into a first feature score for the first image region with the control system;
computing a first region score for the first image region with the control system that is based at least in part on the first feature score; and
evaluating the first region score with the control system to determine if the first image region is suitable for point spread function estimation.
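
The feature-to-score-to-decision pipeline can be sketched with gradient strength as the low-level region feature. The squashing function and both thresholds below are illustrative stand-ins, not values from the patent.

```python
import numpy as np

def region_score(region, lo=5.0, hi=60.0):
    # Estimate gradients over the region, derive a feature value from them,
    # and transform it into a 0..1 score (linear squash between lo and hi).
    gy, gx = np.gradient(region.astype(float))
    feature = np.hypot(gx, gy).mean()          # feature value from gradients
    return float(np.clip((feature - lo) / (hi - lo), 0.0, 1.0))

def suitable_for_psf(region, threshold=0.2):
    # Evaluate the region score against a suitability threshold.
    return region_score(region) >= threshold
```

Flat regions score near zero (no gradient information for PSF estimation), while strongly structured regions clear the threshold.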

US Pat. No. 10,339,642

DIGITAL IMAGE PROCESSING THROUGH USE OF AN IMAGE REPOSITORY

Adobe Inc., San Jose, CA...

1. In a digital medium environment to transform a target digital image using image processing, a method implemented by at least one computing device, the method comprising:obtaining, by the at least one computing device, a plurality of candidate digital images from an image repository;
generating, by the at least one computing device, a plurality of transformations to be applied to the target digital image, each said transformation based on a respective said candidate digital image;
filtering, by the at least one computing device, the plurality of transformations to remove semantically incorrect transformations;
generating, by the at least one computing device, a plurality of transformed target digital images based at least in part through application of the filtered plurality of transformations to the target image; and
outputting, by the at least one computing device, the plurality of transformed target digital images.

US Pat. No. 10,339,641

IMAGE PROCESSING APPARATUS AND METHOD, AND DECODING APPARATUS

SAMSUNG ELECTRONICS CO., ...

1. An image processing apparatus comprising:a receiver configured to receive an image; and
an image processor configured to divide the image into a plurality of regions, and to generate an enhanced image by iteratively applying at least one filter to each of the plurality of regions in the image,
wherein the at least one filter comprises an asymmetric filter that uses an asymmetric filtering window having a first height and a first width different from the first height,
wherein the image processor applies a different kind of filter according to a focus state of the image, and
wherein the image processor is further configured to determine a plurality of focus levels, each of the plurality of focus levels respectively corresponding to one of the plurality of regions in the image.

US Pat. No. 10,339,640

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM

Canon Kabushiki Kaisha, ...

1. An image processing apparatus comprising:one or more processors; and
memory storing instructions that, when executed by the one or more processors, causes the image processing apparatus to perform operations including:
acquiring a plurality of pieces of image data obtained by temporally successively capturing images of an object,
obtaining, as a first correction, first correction data by performing fluctuation correction on processing target image data using a plurality of pieces of temporally neighboring image data among the acquired plurality of pieces of image data,
calculating, as a first calculation and as first displacement data, an amount of displacement between reference image data and the processing target image data or reference image data and the first correction data,
determining a moving object region contained in the plurality of pieces of image data based on the first correction data, the reference image data, and the first displacement data,
calculating, as a second calculation and based on the first displacement data or the first correction data, second displacement data by interpolating the first displacement data in the determined moving object region, and
obtaining, as a second correction, second correction data by correcting, based on the second displacement data, the processing target image data or the first correction data.

US Pat. No. 10,339,639

METHOD AND SYSTEM OF CALIBRATING A MULTISPECTRAL CAMERA ON AN AERIAL VEHICLE

Konica Minolta Laboratory...

1. A method of calibrating multispectral images from a camera on an unmanned aerial vehicle, the method comprising:capturing multispectral images of an area of land at a plurality of intervals with a multispectral imaging camera;
simultaneously capturing sunlight radiance data for each of the captured images with a solar radiance sensor mounted on an upper portion of the unmanned aerial vehicle and monitoring an area above the unmanned aerial vehicle;
correlating the images with the sunlight radiance data;
calibrating the multispectral images based on the sunlight radiance data to normalize the multispectral images to one or more previous images of the area;
monitoring an intensity level of the sunlight radiance data for detection of one or more clouds; and
changing the direction of flight of the unmanned aerial vehicle to reduce an impact of shadows from the one or more clouds.
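
The calibration and cloud-monitoring steps above reduce to simple arithmetic on the radiance trace. In this sketch the drop-fraction threshold is an illustrative choice, not from the claim:

```python
import numpy as np

def calibrate_images(images, radiances, reference_radiance):
    # Scale each capture by its simultaneously recorded sunlight radiance so
    # every image is normalized to the same reference illumination level.
    return [img * (reference_radiance / r) for img, r in zip(images, radiances)]

def cloud_suspected(radiances, drop_fraction=0.3):
    # Toy cloud monitor: flag a sudden drop in the radiance intensity trace.
    r = np.asarray(radiances, dtype=float)
    return bool((r[1:] < (1.0 - drop_fraction) * r[:-1]).any())
```

A field imaged under sun and then under shadow yields the same calibrated values, which is exactly the normalization the claim seeks before comparing images across flights.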

US Pat. No. 10,339,638

IMAGE PROCESSING DEVICE, IMAGING DEVICE, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING PROGRAM

FUJIFILM Corporation, To...

1. An image processing device, comprising:an image acquisition unit that acquires first image data indicating a visible light image captured with sensitivity to a visible light wavelength band using an optical system, and second image data including a near-infrared light image captured with sensitivity to the visible light wavelength band and a near-infrared light wavelength band using the optical system;
a first point image restoration processing unit that performs a first point image restoration process on the acquired first image data using a first point image restoration filter based on a first point spread function with respect to visible light of the optical system; and
a second point image restoration processing unit that performs a second point image restoration process on the acquired second image data using a second point image restoration filter based on a second point spread function with respect to near-infrared light of the optical system, the second point image restoration processing unit causing restoration strength in the second point image restoration process for the second image data captured with radiation of near-infrared light to be higher than restoration strength in the first point image restoration process performed by the first point image restoration processing unit.

US Pat. No. 10,339,637

IMAGE PROCESSING APPARATUS, IMAGE PICKUP APPARATUS, IMAGE PROCESSING METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM FOR CORRECTING DETERIORATION OF IMAGE

CANON KABUSHIKI KAISHA, ...

1. An image processing apparatus comprising:at least one processor configured to execute a plurality of tasks, including:
an image acquiring task that acquires an image;
a PSF acquiring task that acquires a point spread function relating to each of a plurality of pixel positions of an image sensor;
a correction data determining task that determines correction data for each of the plurality of pixel positions of the image sensor based on the acquired point spread function relating to each of the plurality of pixel positions of the image sensor; and
an image restoring task that:
calculates image correction data for each position of the acquired image based on the plurality of point spread functions acquired by the PSF acquiring task; and
repeats a predetermined image processing using the image correction data N times, where N is a positive integer, to perform an image restoration processing for the each position to restore the acquired image,
wherein the predetermined image processing includes:
processing of generating an (n+1)-th intermediate image based on an n-th image (1≤n≤N), and processing of generating an (n+1)-th image based on the (n+1)-th intermediate image, the n-th image, and the image correction data,
wherein the image correction data are coefficient data for a difference between the (n+1)-th intermediate image and the n-th image; and
an outputting task that outputs the restored acquired image.

US Pat. No. 10,339,636

IMAGE PROCESSING APPARATUS THAT SPECIFIES EDGE PIXEL IN TARGET IMAGE BY CALCULATING EDGE STRENGTH

Brother Kogyo Kabushiki K...

1. An image processing apparatus comprising:a processor; and
a memory storing a set of computer-readable instructions therein, the set of computer-readable instructions, when executed by the processor, causing the image processing apparatus to perform:
acquiring target image data representing a target image including a plurality of pixels, the target image data including a plurality of pixel values corresponding to respective ones of the plurality of pixels, each of the plurality of pixel values having a plurality of component values;
generating first image data representing a first image, the first image data being one of first component data and second component data, the first component data including a plurality of first pixel values corresponding to respective ones of the plurality of pixels, each of the plurality of first pixel values being related to a maximum value among the plurality of component values of corresponding one of the plurality of pixels, the second component data including a plurality of second pixel values corresponding to respective ones of the plurality of pixels, each of the plurality of second pixel values being related to a minimum value among the plurality of component values of corresponding one of the plurality of pixels;
calculating a plurality of first edge strengths corresponding to respective ones of the plurality of pixels using the first image data to generate first edge strength data including the plurality of first edge strengths; and
specifying a plurality of edge pixels included in the target image, wherein the specifying includes binarizing the first edge strength data to generate first binary image data.
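
The max/min-component then edge-strength then binarize sequence can be sketched as follows; the forward-difference gradient and the threshold are illustrative, the claim does not fix either.

```python
import numpy as np

def first_image(rgb, use_max=True):
    # "First component data": per-pixel maximum (or minimum) across the
    # colour component values.
    return rgb.max(axis=2) if use_max else rgb.min(axis=2)

def edge_pixels(rgb, thresh=30.0):
    # First edge strengths from simple forward differences on the
    # single-channel image, then binarized into an edge map.
    g = first_image(rgb).astype(float)
    gx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))
    gy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))
    return (gx + gy) > thresh              # first binary image data
```

Working on the per-pixel max (or min) channel rather than luminance keeps edges between saturated colours of similar brightness detectable.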

US Pat. No. 10,339,635

IMAGE PROCESSING METHOD AND IMAGE PROCESSING APPARATUS

Realtek Semiconductor Cor...

1. An image processing method for processing an input image, comprising following steps:selecting a pixel of the input image;
determining if the pixel is a first image edge according to at least one first calibrated pixel and at least one second pixel in the input image, wherein the first calibrated pixel corresponds to at least one first pixel in the input image; and
replacing a high frequency component of at least one channel of the pixel with a first calibrating high frequency component if the pixel is not the first image edge, to generate a calibrated pixel, and maintaining the pixel as the calibrated pixel if the pixel is the first image edge.

US Pat. No. 10,339,634

SYSTEM AND METHOD FOR IMAGE RECONSTRUCTION

SHANGHAI UNITED IMAGING H...

1. A method implemented on a computing device having at least one processor and a non-transitory storage medium for image reconstruction, the method comprising:receiving raw data relating to a subject;
generating a first image based on the raw data;
constructing a noise model that indicates a noise distribution of the first image;
generating a second image by reducing noise from the first image based on the noise model;
generating a third image by subtracting the second image from the first image, wherein the third image indicates the noise in the first image;
generating a fourth image by improving a luminance or color of at least a portion of the second image; and
generating a fifth image by combining the fourth image and the third image to add an amount of noise to the fourth image.
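
The five-image pipeline above composes cleanly as functions: denoise, extract the noise residual, enhance, then add back a controlled amount of noise. In this sketch `noise_fraction` is an illustrative knob; the claim only says "an amount of noise".

```python
import numpy as np

def reconstruct(first, denoise, enhance, noise_fraction=0.5):
    # second: noise-reduced image (via the noise model in the claim).
    second = denoise(first)
    # third: the noise itself, obtained by subtraction.
    third = first - second
    # fourth: luminance/colour-improved version of the denoised image.
    fourth = enhance(second)
    # fifth: recombine, restoring some noise to avoid an over-smoothed look.
    return fourth + noise_fraction * third
```

Adding a fraction of the extracted noise back is a common trick to keep the enhanced result from looking artificially flat.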

US Pat. No. 10,339,633

METHOD AND DEVICE FOR SUPER-RESOLUTION IMAGE RECONSTRUCTION BASED ON DICTIONARY MATCHING

Peking University Shenzhe...

1. A method of super-resolution image reconstruction based on dictionary matching, comprising:establishing a matching dictionary library;
inputting an image block to be reconstructed into a multi-layer linear filter network, and extracting a local characteristic of the image to be reconstructed;
searching the matching dictionary library for a local characteristic of a low-resolution image block having highest similarity with the local characteristic of the image block to be reconstructed, wherein the step for extracting a local characteristic of the image comprises:
Step 1: wherein a multi-layer linear filter network comprises a filter layer, filtering an input image block to be reconstructed by a first-stage filter of the filter layer using N linear filter windows with different sizes to obtain corresponding N filtered images and output to the next stage filter, wherein the filtered image includes a line characteristic of the image, where N is an integer greater than one;
Step 2: filtering the N filtered images output from the first-stage filter by a second-stage filter of the filter layer using M linear filter windows with different sizes to obtain corresponding M×N filtered images, where M is an integer greater than one;
Step 3: outputting all the filtered images obtained by each stage filter to a next stage filter repeatedly; filtering all filtered images output from the previous stage filter by the next stage filter using multiple linear filter windows with different sizes until filtering by the last stage filter is completed; outputting all filtered images to the mapping layer of the multi-layer linear filter network;
Step 4: performing binarization on all the filtered images of the filter layer by the mapping layer to output to the output layer of the multi-layer linear filter network; and
Step 5: if the input image of the multi-layer linear filter network is a local image block, concatenating and outputting, by the output layer, the binarized filtered image output by the mapping layer to obtain a local characteristic of the image; if the input image of the multi-layer linear filter network is a whole image, making, by the output layer, a block histogram for each binarized filtered image output by the mapping layer, and then performing convergence for output to obtain a local characteristic of the image;
searching the matching dictionary library for a residual of a combined sample in which the local characteristic of the low-resolution image block having the highest similarity is located; and
performing interpolation amplification on the local characteristic of the low-resolution image block having the highest similarity, and adding the residual to a result of the interpolation amplification to obtain a reconstructed high-resolution image block.

US Pat. No. 10,339,632

IMAGE PROCESSING METHOD AND APPARATUS, AND ELECTRONIC DEVICE

GUANGDONG OPPO MOBILE TEL...

1. An image processing method, configured to process a color-block image output by an image sensor to output a simulation image, wherein the image sensor comprises an array of photosensitive pixel units and an array of filter units arranged on the array of photosensitive pixel units, each filter unit corresponds to one photosensitive pixel unit, each photosensitive pixel unit comprises a plurality of photosensitive pixels, the color-block image comprises image pixel units arranged in a preset array, each image pixel unit comprises a plurality of original pixels, each photosensitive pixel corresponds to one original pixel, and the image processing method comprises:dividing the color-block image into a plurality of frequency analysis regions;
calculating a space frequency value of each of the plurality of frequency analysis regions;
merging frequency analysis regions each with the space frequency value conforming to a preset condition into the high-frequency region;
converting the color-block image into the simulation image, wherein the simulation image comprises simulation pixels arranged in a preset array, the simulation pixel comprises a current pixel, the original pixel comprises an association pixel corresponding to the current pixel, the converting the color-block image into the simulation image comprises:
determining whether the association pixel is within the high-frequency region;
when the association pixel is within the high-frequency region, determining whether a color of the current pixel is identical to that of the association pixel;
when the color of the current pixel is identical to that of the association pixel, determining a pixel value of the association pixel as a pixel value of the current pixel;
when the color of the current pixel is different from that of the association pixel, determining a pixel value of the current pixel according to a pixel value of an association pixel unit using a first interpolation algorithm, wherein the image pixel unit comprises the association pixel unit, the association pixel unit comprises a plurality of original pixels each with the same color as the current pixel and adjacent to the association pixel;
when the association pixel is beyond the high-frequency region, calculating a pixel value of the current pixel using a second interpolation algorithm, wherein, a complexity of the second interpolation algorithm is less than that of the first interpolation algorithm.

US Pat. No. 10,339,631

IMAGE DEMOSAICING FOR HYBRID OPTICAL SENSOR ARRAYS

MICROSOFT TECHNOLOGY LICE...

1. A method for an imaging device, comprising:receiving, from a hybrid optical sensor array, a first set of data for a scene captured by a first set of pixels at a first resolution and a second set of data for a scene captured by a second set of pixels at a second resolution, the first and second sets of pixels having differing spectral sensitivities;
demosaicing the first set of data based on at least the second set of data by executing one or more of:
a) interpolating the first set of data independent of the second set of data; and
filtering the interpolated first set of data based on the second set of data; and
b) interpolating the first set of data based on the second set of data; and
outputting, using the demosaiced first set of data, a third set of data for the scene at a third resolution, greater than the first resolution.
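
Option (b) of the demosaicing claim, interpolating the first set of data guided by the second, can be sketched for single-channel arrays: upsample the low-resolution set, then add back local detail from the higher-resolution set with the other spectral sensitivity. The nearest-neighbour upsampling and the `blend` weight are illustrative assumptions.

```python
import numpy as np

def demosaic_guided(low, guide, blend=0.5):
    # Upsample the first (low-resolution) set of data, then transfer
    # high-frequency detail from the second (guide) set of data.
    k = guide.shape[0] // low.shape[0]
    up = np.kron(low, np.ones((k, k)))                    # plain interpolation
    coarse = guide.reshape(low.shape[0], k, -1, k).mean(axis=(1, 3))
    detail = guide - np.kron(coarse, np.ones((k, k)))     # guide's fine detail
    return up + blend * detail                            # third set of data
```

The output has the guide's resolution (the claimed "third resolution, greater than the first") while keeping the low-resolution set's spectral content.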

US Pat. No. 10,339,629

METHOD FOR PROVIDING INDICATION IN MULTI-DIMENSIONAL MEDIA IN ELECTRONIC DEVICE

Samsung Electronics Co., ...

1. A method for use in an electronic device, the method comprising:receiving an image data of a multi-dimensional media comprising at least one of 360 degrees video, three-dimensional (3D) video, 360 degrees image or 3D image;
controlling a display to display a first field of view of the image data on the display;
determining at least one region of interest (ROI) in at least one second field of view of the image data; and
controlling the display to provide an indication aiding a user to navigate towards a direction of the at least one second field of view comprising the at least one ROI from the first field of view of the image data.

US Pat. No. 10,339,628

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD AND COMPUTER READABLE MEDIUM

KONICA MINOLTA, INC., To...

1. An image processing apparatus comprising:a hardware processor that
detects an isolated pixel pattern at multiple locations in an original image, some of the multiple locations being adjacent to each other and some of the multiple locations not being adjacent to each other, each black pixel in the isolated pixel pattern having sides that are adjacent to a white pixel and isolated such that the sides do not adjoin a side of another black pixel in the isolated pixel pattern, and
outputs a converted image in which the isolated pixel pattern, at each of the multiple locations in the original image, has been converted to a different pattern that is different from the isolated pixel pattern, wherein the different pattern is the same at the multiple locations in the original image, and black pixels in the different pattern are aggregated such that a side of each black pixel in the different pattern adjoins a side of another black pixel,
wherein the hardware processor detects the isolated pixel pattern by a unit of M×N pixels and converts the isolated pixel pattern into the different pattern by the unit of detected M×N pixels, and the hardware processor includes any of an application specific integrated circuit, a field-programmable gate array, and a processor running software, wherein each of M and N is a natural number of three or more, and
wherein the hardware processor outputs the converted image in which only the isolated pixel pattern, at each of the multiple locations in the original image, has been converted to the different pattern.

US Pat. No. 10,339,627

APPARATUS AND METHODS FOR THE OPTIMAL STITCH ZONE CALCULATION OF A GENERATED PROJECTION OF A SPHERICAL IMAGE

GoPro, Inc., San Mateo, ...

1. A computerized apparatus configured to generate a projection of a captured image, the apparatus comprising:a processing apparatus; and
a storage apparatus in data communication with the processing apparatus, the storage apparatus having a non-transitory computer readable medium comprising instructions which are configured to, when executed by the processing apparatus, cause the computerized apparatus to:
obtain a plurality of images, the plurality of images configured to represent a panoramic image;
map the plurality of images into a spherical collection of images;
re-orient the spherical collection of images in accordance with a desired stitch line for a desired projection; and
map the re-oriented spherical collection of images into the desired projection comprising the desired stitch line.

US Pat. No. 10,339,626

METHOD FOR PRODUCING FRAMING INFORMATION FOR A SET OF IMAGES

FotoNation Limited, Galw...

1. A method for producing framing information for a set of source images, each comprising an object region, comprising the steps of:a) one or more of: scaling, translating and rotating images of said set of N source images so that said object region is aligned within said set of source images;
b) for a given image of said set of object aligned source images, at a given frame size, a given frame angle for a frame relative to said set of object aligned images and at a first candidate boundary position for said frame, determining if there is at least one position for a second boundary of said frame orthogonal to said first boundary where said frame lies within said image and said frame encloses said object region;
c) responsive to said determining, incrementing counters associated with said first candidate boundary position for each position for said second boundary where said frame lies within said image and said frame encloses said object region;
d) responsive to any counter meeting a threshold value, K≤N, for said set of source images, indicating that framing is possible at said given frame size, said frame angle, said first candidate boundary position and any position for said second boundary associated with said threshold meeting counter; and
e) responsive to no counter meeting said threshold value, K, repeating steps b) to e) for another image of said set of source images.

US Pat. No. 10,339,625

COMMAND SCHEDULER FOR A DISPLAY DEVICE

INTEL CORPORATION, Santa...

1. An apparatus, comprising:a processor;
a display device, to generate a tearing effect signal to indicate that a frame is to be displayed on the display device;
a host controller coupled to the display device, to operate the display device;
a command scheduler executable on the processor, wherein the command scheduler includes:
a dynamic queue to store a first set of commands that are to be executed on the display device to display the frame, wherein the command scheduler, in response to execution by the processor, is to flush the first set of commands from a first queue upon execution of the first set of commands on the display device; and
a static queue to store a second set of commands that are to be executed on the display device, wherein the command scheduler, in response to execution by the processor, is to keep the second set of commands stored in a second queue upon execution of the second set of commands on the display device,
wherein the command scheduler, when executed on the processor, in response to a receipt of the tearing effect signal from the display device, is to retrieve from the dynamic queue or static queue and provide one of the first or second sets of commands to the host controller for execution on the display device, wherein the command scheduler is to select a command from the first set of commands or second set of commands to be sent to the host controller for execution on the display device according to a priority scheme.
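
The dynamic/static two-queue behaviour reduces to: frame commands are consumed and flushed once executed, while static commands persist for reuse on every tearing-effect signal. A minimal sketch, with all class and method names invented for illustration:

```python
from collections import deque

class CommandScheduler:
    # Dynamic queue: flushed after execution.  Static queue: retained.
    def __init__(self, static_cmds):
        self.dynamic = deque()
        self.static = list(static_cmds)

    def queue_frame(self, cmds):
        # Stage the first set of commands for the next frame.
        self.dynamic.extend(cmds)

    def on_tearing_effect(self):
        # On the display's TE signal, prefer pending frame commands
        # (a stand-in priority scheme); fall back to static commands.
        if self.dynamic:
            return self.dynamic.popleft()   # removed once executed
        return self.static[0]               # kept for future frames
```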

US Pat. No. 10,339,624

REVERSED POLISH NOTATION PROCESSING ELEMENTS

Intel Corporation, Santa...

1. A computing system comprising:a data interface including one or more of a network controller, a memory controller or a bus, the data interface to obtain one or more shader instructions;
a compiler to receive the one or more shader instructions and compile the one or more shader instructions into a Reverse Polish Notation (RPN) program stream including a first set of operands and a first set of operations; and
a first register stack;
a program streamer interface to receive the RPN program stream;
a first stack allocator to populate the first register stack with one or more operands in the first set of operands; and
a first power gating unit to selectively power off one or more registers in the first register stack based on a stack depth of the first register stack.
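
The operand/operation program stream at the heart of the claim is classic Reverse Polish Notation: operands push onto the register stack, operations pop their arguments and push the result. A minimal software interpreter (the hardware claim's stack allocator and power gating are of course not modeled):

```python
def eval_rpn(stream):
    # Evaluate an RPN token stream: operands push onto the stack,
    # binary operations pop two values and push the result.
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    stack = []
    for tok in stream:
        if tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack[0]
```

Note the stack depth at any point bounds how many registers a hardware implementation must keep powered, which is the hook for the claimed power gating unit.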

US Pat. No. 10,339,623

PHASE ROTATION WATERMARKING FOR PHASE MODULATION

Harris Corporation, Melb...

1. A method, comprising:generating a sequence of phase modulated host symbols having continuous, antipodal phase transitions between adjacent ones of the host symbols representing different states;
receiving a sequence of overlay symbols each spanning a respective set of the host symbols;
rotating the continuous, antipodal phase transitions between the adjacent ones of the host symbols in each set of the host symbols in a same rotation direction according to a symbol state of the respective overlay symbol spanning the set of the host symbols; and
generating a phase modulated transmit signal that conveys the continuous, antipodal phase transitions rotated according to the symbol states of the overlay symbols.
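A minimal sketch of the overlay idea, under simplifying assumptions: the host symbols are modeled as nominal BPSK phases (0 or pi), each overlay bit spans a fixed set of host symbols, and the phases of that set are all rotated by a small angle in the same direction per overlay bit. The rotation size, the BPSK model, and the span are illustrative, not Harris's parameters.

```python
import math

def watermark_phases(host_bits, overlay_bits, span=4, delta=math.pi / 16):
    """Rotate the nominal phase of each host symbol by +delta or -delta
    according to the overlay bit whose span covers that symbol."""
    phases = [0.0 if b == 0 else math.pi for b in host_bits]
    out = []
    for i, phi in enumerate(phases):
        overlay = overlay_bits[i // span]          # overlay bit spanning symbol i
        rot = delta if overlay == 1 else -delta    # same direction for whole set
        out.append(phi + rot)
    return out
```

A receiver aware of the span and rotation angle could estimate the common rotation over each set to recover the overlay bits while a legacy receiver still demodulates the host symbols.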

US Pat. No. 10,339,622

SYSTEMS AND METHODS FOR ENHANCING MACHINE VISION OBJECT RECOGNITION THROUGH ACCUMULATED CLASSIFICATIONS

CAPITAL ONE SERVICES, LLC...

1. A machine vision system comprising:
a handheld camera configured to capture a plurality of image frames during movement of the handheld camera, each image frame of the plurality of image frames comprising a representation of a first object, wherein the first object comprises a vehicle, and wherein the movement of the handheld camera is based on a movement of a user;
a classification module in communication with a memory, the classification module configured to process the plurality of image frames to generate a corresponding plurality of object classification scores associated with the first object, wherein the object classification scores represent confidence that the first object matches one or more representations of objects in a trained machine vision model;
an accumulation module in communication with the classification module, the accumulation module configured to accumulate the plurality of classification scores;
a discernment module in communication with the accumulation module, the discernment module configured to output classification information of the first object corresponding to a highest accumulated classification score and responsive to a dynamically adjusted threshold, wherein the dynamically adjusted threshold is set based on: a percentage of accumulated classification counts, one or more differences between accumulated classification counts, and a minimum value; and
a display configured to output an indication of the classification information, wherein the classification information comprises vehicle model information.
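The accumulate-then-discern loop the claim describes can be sketched as follows, with the dynamic threshold approximated as a mix of the three claimed factors: a percentage of the accumulated total, a margin over the runner-up, and a fixed minimum. The specific weights and data shapes are illustrative assumptions.

```python
from collections import defaultdict

def accumulate_and_discern(frame_scores, min_value=2.0):
    """Sum per-frame classification scores across frames and output the
    top label only if it clears a dynamically adjusted threshold."""
    acc = defaultdict(float)
    for scores in frame_scores:           # one dict of label -> score per frame
        for label, score in scores.items():
            acc[label] += score
    ranked = sorted(acc.items(), key=lambda kv: kv[1], reverse=True)
    best_label, best = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else 0.0
    total = sum(acc.values())
    # Dynamic threshold: share of the total, margin over runner-up, and a floor.
    if best >= 0.5 * total and best - runner_up >= 1.0 and best >= min_value:
        return best_label
    return None

frames = [{"sedan": 0.9, "truck": 0.2},
          {"sedan": 0.8, "truck": 0.3},
          {"sedan": 0.9, "truck": 0.1}]
```

Accumulating across frames smooths out single-frame misclassifications that occur as the handheld camera moves around the vehicle.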

US Pat. No. 10,339,621

OPERATOR MANAGEMENT DEVICE, OPERATOR MANAGEMENT SYSTEM, AND OPERATOR MANAGEMENT METHOD

Nissan Motor Co., Ltd., ...

1. An electronic car sharing determination method for a plurality of shared vehicles allocated to stations using a shared vehicle management device in communication with an onboard device of a first shared vehicle and an operator terminal device, the method comprising:
determining, by the onboard device, a state information of the first shared vehicle including an energy amount sensed by an energy amount sensor of the first shared vehicle;
acquiring, by the shared vehicle management device, the state information of the first shared vehicle of the plurality of shared vehicles from the onboard device;
calculating, by the shared vehicle management device, a utilization rate of the first shared vehicle;
determining, by the shared vehicle management device, that a transportation object vehicle to be transported to an energy supply facility for restoring is the first shared vehicle so that the state information changes to a predetermined target value, the determination being made on the basis of the energy amount of the first shared vehicle being less than a predetermined remaining amount threshold and the utilization rate being less than a predetermined rate, among the plurality of shared vehicles of which the remaining amount of energy is more than the predetermined remaining amount threshold;
calculating, by the onboard device, the state information further including a remaining amount of energy used to drive the first shared vehicle to the energy supply facility;
acquiring, from the onboard device of the first shared vehicle that further includes a Global Positioning System (GPS) receiver, a first current position of the shared vehicle;
acquiring, from a second GPS receiver in the operator terminal device, a second current position of the operator terminal device;
determining, by the shared vehicle management device and using the first current position and the second current position, that the operator terminal device is within a short distance of the transportation object vehicle and is therefore the operator device to receive an electronic task instruction to transport the transportation object vehicle;
transmitting, by the shared vehicle management device, the electronic task instruction to the operator terminal device to transport the transportation object vehicle to the energy supply facility; and
receiving, by the shared vehicle management device, execution progress information of the electronic task instruction from the operator terminal device until the transportation object vehicle arrives at the energy supply facility.
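The two selection rules at the heart of the claim can be sketched briefly: flag vehicles whose energy is below the threshold and whose utilization rate is also below the rate threshold, then pick the operator terminal nearest the flagged vehicle. Field names, units, and the use of Euclidean distance as a stand-in for GPS distance are illustrative assumptions.

```python
import math

def pick_transport_vehicles(vehicles, energy_threshold=20, rate_threshold=0.5):
    """Select transportation object vehicles: low remaining energy AND
    low utilization rate."""
    return [v["id"] for v in vehicles
            if v["energy"] < energy_threshold
            and v["utilization"] < rate_threshold]

def pick_operator(vehicle_pos, operators):
    """Choose the operator terminal closest to the transportation object
    vehicle (Euclidean distance as a stand-in for GPS distance)."""
    return min(operators, key=lambda o: math.dist(vehicle_pos, o["pos"]))["id"]

fleet = [{"id": "A", "energy": 10, "utilization": 0.2},
         {"id": "B", "energy": 50, "utilization": 0.1},
         {"id": "C", "energy": 15, "utilization": 0.8}]
```

Vehicle C is skipped despite its low energy because its high utilization suggests it will reach a station on its own, matching the claim's two-condition test.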

US Pat. No. 10,339,619

METHOD AND APPARATUS FOR PRESENTING SUPPLY CHAIN INFORMATION TO A CONSUMER

1. A method for presenting pallet trip data associated with delivering a manufactured product to a retail consumer through a social media application created by a manufacturer and downloaded by the consumer onto a personal smart phone comprising the steps:
the product manufacturer affixing a machine readable identifier onto the product,
the manufacturer hiring a pallet owner to supply a smart pallet to ship the product through a designated supply chain for delivery to the consumer,
the pallet owner supplying the smart pallet with a unique identifier and monitoring and reporting capabilities and a connection to a nation-wide wireless network,
the pallet owner using a first e-commerce business application layer to pre-configure the smart pallet for the manufacturer's proposed trip in the designated supply chain using the wireless network and thereafter transferring custody of the trip enabled smart pallet together with an access code to a dashboard available through an internet portal to the manufacturer,
the manufacturer using the access code to remotely access a second e-commerce business application layer to configure the programmable condition monitoring and reporting capabilities of the smart pallet for the purpose of documenting a pallet trip record and associating product trip data on the dashboard at a remote work station with an internet connection to generate an electronic supply chain pedigree,
physically loading the product on the smart pallet and thereafter electronically associating the unique identifier of the smart pallet with the machine readable identifier of the product on the smart pallet through the designated supply chain,
the smart pallet monitoring conditions and reporting conditions in response to instructions pre-programmed in the first and second e-commerce business application layers during the pallet's trip, and subsequently accessing the condition reports of the second e-commerce business application layer by the manufacturer using the access code to view updates on the dashboard,
electronically disassociating the product from the smart pallet when the smart pallet reaches its intended destination and custody of the smart pallet is transferred from the manufacturer to the pallet owner,
the pallet owner providing the manufacturer with an invoice for using the smart pallet along with a pallet rental trip file containing information documented through the first e-commerce business application layer used by the smart pallet owner to manage the smart pallet for rental purposes and information documented through condition reports of the second e-commerce business application layer used by the manufacturer to provide an electronic supply chain pedigree of the product,
the manufacturer using a third e-commerce business application layer to extract information from aggregated data records of the first and second layers,
presenting the extracted information in the social media application sponsored by the manufacturer to influence purchasing decisions of the consumer, and
the consumer bringing the smart phone into communication range of the machine readable identifier on the product to launch a proprietary social media application for displaying the product's supply chain pedigree and other useful information of interest about the product to the consumer.

US Pat. No. 10,339,618

SECURITIZING AND TRADING HEALTH QUALITY CREDITS

Cerner Innovation, Inc., ...

1. A system for improving health care provided by a health care entity using health quality credit exchange and electronic health record (EHR) systems, comprising:
one or more computer processors configured to:
invest and trade health quality offset credits associated with the health quality credit exchange;
broker trading of the set of health quality offset credits;
facilitate reporting market data, trading, and retiring the set of health quality offset credits;
verify an emission count associated with the health care entity; and
store health care related information and identify patient treatment outcomes for the health care entity; and
at least one software agent running on each of the one or more processors, the software agents working cooperatively to implement a method comprising:
determining the existence of one or more potentially avoidable mortalities (PAMs) or potentially avoidable complications (PACs) from a set of patient health records associated with the health care entity;
in response to determining the existence of one or more PAMs or PACs, determining an emission count for the health care entity;
communicating an indication to retire a set of health quality offset credits equivalent to the emission count, the set of health quality offset credits associated with a health quality account associated with the health care entity;
communicating an order to retire the one or more health quality offset credits;
accessing the health quality credit account associated with the health care entity and determining whether the health quality account has sufficient health quality offset credits to fulfill the order to retire the one or more health care quality credits;
if there are a sufficient number of health quality offset credits in the health care quality account, applying the set of health quality offset credits to the emissions count thereby reducing the emission count;
retiring the one or more health quality offset credits by annotating a unique credit identifier associated with each health quality offset credit of the set of health quality offset credits as retired;
reducing the number of health care quality offset credits in the health quality account by the determined emissions count, wherein a retired health quality offset credit becomes no longer available for trading on a health quality exchange; and
if there are not sufficient health care quality offset credits in the health care quality account, providing a notification that the health care quality account has insufficient credits.
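The retire-against-emissions step can be sketched as a small ledger operation: if the account holds enough unretired credits, annotate that many as retired (removing them from trading) and reduce the available count; otherwise report insufficiency. The data shapes are illustrative assumptions, not Cerner's schema.

```python
def retire_credits(account, emission_count):
    """Retire `emission_count` health quality offset credits from the
    account, or report that the account has insufficient credits."""
    available = [c for c in account["credits"] if not c["retired"]]
    if len(available) < emission_count:
        return {"status": "insufficient",
                "needed": emission_count,
                "available": len(available)}
    for credit in available[:emission_count]:
        credit["retired"] = True   # annotate unique credit id as retired
    return {"status": "retired", "count": emission_count}

account = {"credits": [{"id": i, "retired": False} for i in range(5)]}
```

A retired credit stays in the ledger with its unique identifier, so audits can trace which credits offset which emission counts, but it is no longer tradeable.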

US Pat. No. 10,339,614

WASTE ANALYSIS SYSTEM AND METHOD

International Business Ma...

1. A computer-implemented method for identifying waste in a process, comprising:
receiving an image of one or more discarded products from a camera;
performing an object count process on the received image to identify an amount of the one or more discarded products within the image;
acquiring metadata relating to the identified amount of the one or more discarded products from the received image;
obtaining an amount of used product;
recording the metadata;
determining an overage amount of product as a function of the acquired metadata and a factor of the amount of used product;
deriving a suggestion for waste reduction based on the determination; and
generating a report based on the recorded metadata, wherein the report includes the suggestion for waste reduction.
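The overage determination the claim recites — an amount of waste as a function of the counted discards and a factor of the used product, feeding a suggestion — can be sketched as follows. The tolerance factor and the suggestion text are illustrative assumptions.

```python
def overage(discarded_count, used_amount, factor=0.05):
    """Compute waste beyond an allowed fraction of the used product and
    derive a waste-reduction suggestion from the result."""
    allowed = factor * used_amount                  # tolerated discard amount
    over = max(0.0, discarded_count - allowed)
    suggestion = ("reduce batch size" if over > 0
                  else "waste within tolerance")
    return over, suggestion
```

For example, 10 discarded units against 100 used units with a 5% tolerance yields an overage of 5 units and triggers the reduction suggestion in the generated report.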

US Pat. No. 10,339,613

VIEWING SHOPPING INFORMATION ON A NETWORK BASED SOCIAL PLATFORM

eBay Inc., San Jose, CA ...

1. A system comprising:
one or more computer processors;
one or more computer memories;
one or more modules deployed into the one or more computer memories via a computer-implemented deployment process, the one or more modules configuring the one or more computer processors to perform operations for communicating content pertaining to items listed on a network-based marketplace based on disclosure information and item filter information maintained in a social network, the operations comprising:
receiving a request to view a list of items from a user of a client device;
identifying a type of the list of items;
requesting information pertaining to the items from a network-based marketplace based on a determination that the user is authorized to view the list;
identifying a subset of the items based on the item filter information;
identifying the content pertaining to the subset of the items based on the disclosure information; and
communicating the content pertaining to the subset of the items for presentation on the client device in user interface elements, each of the user interface elements being configured to, upon selection, do nothing or display additional content pertaining to a corresponding one of the subset of the items.

US Pat. No. 10,339,612

MULTI-DIMENSIONAL JOB TITLE LOGICAL MODELS FOR SOCIAL NETWORK MEMBERS

Microsoft Technology Lice...

1. A social networking system comprising:
one or more processors; and
a computer readable medium storing instructions that, when executed by the one or more processors, cause the system to perform operations comprising:
receiving an unstructured job title record from a member profile or a job posting in the social networking system;
extracting terms from the unstructured job title record;
searching a job role taxonomy database with the extracted terms to identify one or more job roles in the unstructured job title record;
for a job role identified in the unstructured job title record, extracting a plurality of additional terms appearing prior to and after the identified job role in the unstructured job title record;
mapping each additional term of the plurality of additional terms to a standardized modifier by searching one or more of a job seniority taxonomy database, a job specialty taxonomy database, a job accreditation taxonomy database, and a job status taxonomy database, thereby identifying one or more of a job seniority modifier, a job specialty modifier, a job accreditation modifier, and a job status modifier for each additional term; and
creating a multi-dimensional standardized job title for the member profile or job posting by writing the job role identified in the unstructured job title record, the job seniority modifier, the job specialty modifier, the job accreditation modifier, and the job status modifier to a standardization record in a standardization database, the standardization record associated with the member profile or the job posting.
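The multi-dimensional mapping can be sketched with tiny stand-in taxonomies: tokenize the unstructured title, locate a known job role, then classify the surrounding terms as seniority, specialty, or status modifiers. The taxonomy contents and record shape are illustrative assumptions, not Microsoft's databases.

```python
# Stand-in taxonomy databases (illustrative contents only).
JOB_ROLES = {"engineer", "manager", "nurse"}
SENIORITY = {"senior", "junior", "lead", "chief"}
SPECIALTY = {"software", "marketing", "pediatric"}
STATUS    = {"interim", "acting", "retired"}

def standardize_title(raw_title):
    """Map an unstructured job title to a multi-dimensional standardized
    record: one role plus optional seniority/specialty/status modifiers."""
    terms = raw_title.lower().split()
    record = {"role": None, "seniority": None,
              "specialty": None, "status": None}
    for term in terms:
        if term in JOB_ROLES and record["role"] is None:
            record["role"] = term
        elif term in SENIORITY:
            record["seniority"] = term
        elif term in SPECIALTY:
            record["specialty"] = term
        elif term in STATUS:
            record["status"] = term
    return record
```

Writing the result to a standardization record keyed by member profile or job posting then lets queries match "Senior Software Engineer" and "Sr. SWE"-style variants along each dimension independently.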

US Pat. No. 10,339,610

METHOD AND SYSTEM FOR MAKING A TARGETED OFFER TO AN AUDIENCE

MASTERCARD INTERNATIONAL ...

1. A method for making a targeted offer to an audience of a second plurality of entities, the method comprising:
retrieving, by a processor, from one or more financial transaction databases of a payment card system, a first set of information including activities and characteristics attributable to a first plurality of entities, wherein the first set of information comprises financial transactions and geographic and demographic information from payment card transaction data;
determining, by the processor, behavioral variable information of the first plurality of entities;
extracting, by the processor, an intent of the first plurality of entities from the behavioral variable information;
generating, by the processor, a plurality of interaction associations based on (a) at least one of selected activities criteria and selected characteristics criteria from the first set of information and (b) the behavioral variable information and the intent of the first plurality of entities;
developing, by the processor, audiences of the second plurality of entities from one of the plurality of interaction associations using at least one methodology that is selected from the group consisting of: Decision Trees, Chi-Squared Automatic Interaction Detection (CHAID), Correlation Analysis, and Market Basket Analysis;
generating prediction rules containing one or more of the interaction associations for predicting a target audience, wherein the target audience is a dependent variable and the one or more interaction associations are the independent variable, wherein the prediction rules are configured to, when run by a processor:
match activities and characteristics of the second set of information to the activities and characteristics of one of the plurality of interaction associations;
predict behavior and intent of the second plurality of entities to carry out certain activities based on (a) the activities criteria and/or characteristics criteria and (b) the behavioral variable information and the intent of the first plurality of entities used in forming the interaction associations, thus yielding predicted behavior and intent; and
present the targeted offer to the audience of the second plurality of entities based on the predicted behavior and intent of the second plurality of entities;
defining a format for the prediction rules that is conveyable to a third party web-based social network or API vendor;
conveying to the third party, by the processor and using the defined format of the web-based social network or API vendor, the prediction rules configured to enable the third party to identify a second set of information including activities and characteristics attributable to the second plurality of entities.

US Pat. No. 10,339,609

SYSTEMS AND METHODS FOR PAGE RECOMMENDATIONS BASED ON ONLINE USER BEHAVIOR

Facebook, Inc., Menlo Pa...

1. A computer-implemented method comprising:
determining, by a computing system, features of content items based on online behavior of a first user, wherein the online behavior of the first user is associated with a seed content item and one or more candidate content items;
receiving, by the computing system, an indication of approval from a second user regarding the seed content item;
determining, by the computing system, using at least one machine learning technique, a probability that the second user will interact with a candidate content item from the one or more candidate content items based on features found in the seed content item;
determining, by the computing system, whether the probability that the second user will interact with the candidate content item satisfies a threshold value; and
selecting, by the computing system, the candidate content item for presentation to the second user based on the probability that the second user will interact with the candidate content item.
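The final two steps — thresholding the predicted interaction probability and selecting candidates — reduce to a short filter. The upstream machine learning model that produces the probabilities is out of scope here; the threshold value and data shapes are illustrative assumptions.

```python
def select_candidates(probabilities, threshold=0.6):
    """Keep candidate content items whose predicted interaction
    probability meets the threshold, highest probability first."""
    picked = [(item, p) for item, p in probabilities.items()
              if p >= threshold]
    return [item for item, p in sorted(picked, key=lambda t: -t[1])]
```

Given model outputs such as `{"pageA": 0.9, "pageB": 0.4, "pageC": 0.7}`, only the candidates clearing the threshold are surfaced to the second user, best first.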

US Pat. No. 10,339,608

SELECTABLE PAYROLL AMOUNTS FOR INSTANT PAYROLL DEPOSITS

Square, Inc., San Franci...

1. A system comprising:
one or more processors; and
one or more computer-readable media storing instructions executable by the one or more processors, wherein the instructions program the one or more processors to implement a payroll service to:
receive employee payroll information indicating an employee account at which an employee is to receive payroll payments from an employer;
receive employer information indicating an employer account, wherein the employer account is associated with the payroll service, and wherein the employer account is an account from which the employer is to pay the payroll payments;
receive, from a first device operated by the employee, compensation information indicating compensation to be received by the employee for performing work for the employer, the first device having installed thereon an employee application for, at least, sending the compensation information to the payroll service via a network;
determine that the employer account has not received payroll funds, from a bank account associated with the employer, corresponding to the compensation to be received by the employee;
analyze employee information to calculate a reliability level of the employee, wherein the employee information includes at least one of a previous employer of the employee, an amount of time worked by the employee, an amount of sales made by the employee, or an upcoming schedule for the employee;
determine that the reliability level satisfies a threshold indicating that the employee is eligible for instant deposits;
receive transaction information associated with a plurality of transactions between the employer and customers, wherein the transaction information for an individual transaction of the plurality of transactions comprises at least an amount of the individual transaction and a type of payment instrument used for the individual transaction;
determine, based at least in part on the employer information and the transaction information, a level of risk associated with the employer;
determine an amount of funds to deposit into the employee account based at least in part on the level of risk and the reliability level, wherein the amount of funds is equal to a first portion of the compensation to be received by the employee;
generate, based at least in part on determining that the reliability level satisfies the threshold, a notification to inform the employee that the employee is eligible for instant deposits;
send the notification to the first device operated by the employee, the notification being presented by the employee application via a user interface enabling the employee to initiate an instant deposit, wherein the instant deposit causes the amount of funds to be deposited into the employee account;
receive, responsive to sending the notification to the first device, an instruction to initiate the instant deposit;
effectuate, at a first time that is before the employer account has received the payroll funds from the bank account associated with the employer, and responsive to the instruction to initiate the instant deposit, the deposit of the amount of funds into the employee account; and
effectuate, at a second time after the first time, the deposit of an additional amount of funds equal to a second portion of the compensation into the employee account, wherein a sum of the first portion and the second portion is equal to a total amount of the compensation.

US Pat. No. 10,339,607

TIME DATA ANALYSIS

CERNER INNOVATION, INC., ...

1. A computerized method, carried out by at least one server having one or more processors, the method comprising:
receiving an indication from a user interface that one or more EMRs is active;
in response to the indication, tracking a set of active indications generated by at least one input device;
capturing time information for each active indication of the set of active indications;
based on the set of active indications and the time information, generating time data representing a total amount of time spent in one or more electronic medical records (EMRs) by a plurality of clinicians;
segmenting the time data to a per-clinician time data level such that the time data illustrates the total amount of time spent in the one or more EMRs by each clinician individually;
segmenting the per-clinician time data such that the per-clinician time data illustrates one or more activities performed by each clinician individually while in the one or more EMRs;
identifying one or more clinicians associated with a total amount of time spent in the one or more EMRs that exceeds a predetermined threshold amount of time; and
utilizing the per-clinician time data obtained over a predetermined period of time, creating a predicted pathway for a first clinician, the predicted pathway automatically redirecting a default starting view associated with opening the one or more EMRs from a first view to a second view without having to navigate from the first view to the second view, wherein the first clinician has navigated from the first view of the one or more EMRs to a second view of the one or more EMRs a number of times greater than a predetermined threshold.

US Pat. No. 10,339,606

SYSTEMS AND METHODS FOR AN AUTOMATICALLY-UPDATING FRAUD DETECTION VARIABLE

AMERICAN EXPRESS TRAVEL R...

1. A method, comprising:
receiving, by a processor, a plurality of transactions for a plurality of consumers, wherein each respective transaction of the plurality of transactions is between a consumer of the plurality of consumers and a merchant of a plurality of merchants;
automatically inputting, by the processor, the plurality of transactions into a neural network;
automatically analyzing, by the processor using the neural network, the plurality of transactions over a plurality of iterations, wherein an iteration of the plurality of iterations comprises cycling, by the processor using the neural network, through a consumer transaction history associated with the consumer, wherein the consumer transaction history has a consumer transaction sequence associated with the consumer,
wherein the cycling through the consumer transaction history comprises:
retrieving, by the processor, for each transaction of the plurality of transactions, a sliding window number of transactions preceding, in the consumer transaction sequence, a transaction in the consumer transaction history, wherein the sliding window number of transactions are retrieved from a previous iteration of the plurality of iterations, wherein the sliding window number of transactions is a positive integer of transactions;
inputting, by the processor, the sliding window number of transactions preceding the transaction into the neural network as a set of transaction inputs for the transaction;
designating, by the processor and the neural network, the transaction in the consumer transaction history as a desired transaction output of the neural network associated with the set of transaction inputs for the transaction;
analyzing, by the processor and the neural network, the set of transaction inputs for the transaction to produce a generated transaction output of the neural network; and
comparing, by the processor, the generated transaction output and the desired transaction output; and
automatically updating, by the processor using the neural network, over the plurality of iterations, a previous fraud detection variable associated with at least one of the consumer or the merchant to generate updated fraud detection variables, in response to the analyzing the plurality of transactions.
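The windowing step the claim describes can be sketched directly: for each transaction in the consumer's sequence, the preceding `window` transactions form the neural network's input set and the transaction itself is the desired output. The network and the fraud-variable update are elided; the window size is an illustrative assumption.

```python
def sliding_windows(history, window=3):
    """Yield (inputs, desired_output) training pairs: each transaction is
    predicted from the sliding window of transactions preceding it in the
    consumer transaction sequence."""
    pairs = []
    for i in range(window, len(history)):
        pairs.append((history[i - window:i], history[i]))
    return pairs
```

Comparing the network's generated output for each window against the desired output is what drives the iterative update of the fraud detection variable for the consumer or merchant.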

US Pat. No. 10,339,604

SYSTEMS AND METHODS FOR MODIFYING RESOURCES TO MANAGE LOSS EVENTS

STATE FARM MUTUAL AUTOMOB...

1. A computer-implemented method of modifying resources to manage loss events, the method comprising:
receiving loss event data related to a loss event, the loss event data (i) recorded by at least one sensor associated with at least one property and (ii) received in real time during occurrence of the loss event;
ingesting the loss event data using dedicated stream processing hardware connected to federated database hardware, the loss event data ingested from a front office cluster to a back office cluster of the dedicated stream processing hardware;
routing the loss event data to a high-speed memory store;
performing, by one or more processors, complex event processing on the loss event data in the high-speed memory store to identify (i) a particular type of the loss event, and (ii) a degree associated with the particular type and indicating an expected amount of damage from the loss event;
accessing historical data associated with the particular type of the loss event and the degree associated with the particular type, the historical data indicating a level of computing resources needed to manage at least one previous loss event of the particular type and of the degree;
comparing, with the one or more processors, the loss event data to the historical data;
based on the comparison of the loss event data to the historical data, determining, (i) in real time as the loss event data is received and (ii) during occurrence of the loss event, that the loss event data indicates a different amount of damage than that expected by the historical data; and
modifying, according to the different amount of damage, a level of the computing resources to employ in managing the loss event.

US Pat. No. 10,339,602

POWER ADJUSTMENT SYSTEM, POWER ADJUSTMENT METHOD, AND COMPUTER PROGRAM

PANASONIC INTELLECTUAL PR...

1. A power adjustment system configured to make a deal with a trading device about supplying power to a power grid from a power supply apparatus of a customer facility in accordance with a trade term,
the power supply apparatus including a power generation apparatus, and a power storage apparatus including a storage battery,
the power adjustment system comprising:
a first estimator configured to estimate first power to be generated by the power generation apparatus during an interested period;
a second estimator configured to estimate second power to be consumed by an electric load of the customer facility during the interested period;
a power purchasing cost calculator configured to, when there is a shortfall in the first power estimated by the first estimator compared to the second power estimated by the second estimator, calculate a cost to be paid by the customer facility for receiving third power for compensating for the shortfall from the power grid;
a controller configured to select one of a first state of supplying power from the power storage apparatus to the power grid and a second state of supplying power from the power storage apparatus to the electric load; and
a determiner configured to compare an amount of first money to be paid to the customer facility in accordance with the trade term when the first state is selected, with an amount of second money equal to the cost calculated by the power purchasing cost calculator,
the controller being configured to select the second state when a comparison result made by the determiner indicates that the amount of the first money is equal to or less than the amount of the second money,
the controller being configured to predict, based on the first power estimated by the first estimator and the second power estimated by the second estimator, an occurrence of a power shortfall period, which is an interested period during which the first power is expected to become smaller than the second power,
the controller being configured to control the power storage apparatus to be charged, before a start time of the power shortfall period, so that an amount of remaining power thereof exceeds the third power in response to the predicted occurrence of the power shortfall period.
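The determiner/controller decision can be sketched as a comparison of two money amounts: revenue from selling stored power to the grid under the trade term (first state) versus the grid-purchase cost avoided by covering the load shortfall locally (second state). Prices, units, and the linear cost model are illustrative assumptions.

```python
def choose_state(first_power, second_power, trade_price, grid_price):
    """Select the first state (sell to grid) or the second state (supply
    the local load) by comparing the trade revenue against the avoided
    purchase cost for the shortfall (the "third power")."""
    shortfall = max(0.0, second_power - first_power)   # third power
    first_money = trade_price * shortfall              # paid if power is sold
    second_money = grid_price * shortfall              # cost avoided locally
    return "second" if first_money <= second_money else "first"
```

When the trade price is at or below the grid purchase price, keeping the stored power for the local load (second state) is the better outcome, matching the claim's tie-breaking rule.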

US Pat. No. 10,339,601

CONNECTED DEVICE-TRIGGERED FAILURE ANALYSIS

The Toronto-Dominion Bank...

1. A system comprising:
a memory;
at least one hardware processor interoperably coupled with the memory and configured to:
monitor operations of at least one monitored device using at least one connected device, the at least one monitored device associated with a user;
determine a projected life span of the at least one monitored device based on the monitored operations;
in response to determining that the projected life span of the at least one monitored device is less than a threshold amount, determine a corrective action to be performed; and
generate a proposal to be presented, via a user interface, based on the determined corrective action, wherein generating the proposal includes:
estimating a cost of the determined corrective action;
analyzing at least one of a financial or transactional account associated with the user, wherein analyzing the at least one of the financial or transactional account includes determining whether funds sufficient to cover the estimated cost of the determined corrective action are available in accounts associated with the user;
in response to determining that funds sufficient to cover the estimated cost of the determined corrective action are not available in accounts associated with the user, performing an automated credit worthiness determination based on a credit history of the user; and
creating the proposal associated with the determined corrective action based on the projected life span of the at least one monitored device, the estimated cost of the determined corrective action, and the analysis of the account.
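The decision chain behind the proposal can be sketched as follows: act only when projected life span falls below the threshold, fund the repair from the user's accounts when the balance suffices, and otherwise fall back to the credit-worthiness result. Field names and thresholds are illustrative assumptions.

```python
def propose_corrective_action(projected_days, threshold_days,
                              repair_cost, balance, creditworthy):
    """Build a corrective-action proposal, or return None when the
    monitored device's projected life span is still acceptable."""
    if projected_days >= threshold_days:
        return None                     # no corrective action needed yet
    if balance >= repair_cost:
        return {"action": "repair", "funding": "account"}
    if creditworthy:
        return {"action": "repair", "funding": "credit"}
    return {"action": "repair", "funding": "unfunded"}
```

The proposal presented via the user interface would then bundle the projected life span, the estimated cost, and the chosen funding path.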

US Pat. No. 10,339,600

APPLICATION PLATFORM REVERSE AUCTION

EMC IP Holding Company LL...

1. A method for software application management comprising:
receiving, from a client device, a software application;
generating a software application manifest for the software application, the software application manifest comprising:
an expected number of communications with the software application,
an amount of data storage for the software application, and
an amount of processing cycles for the software application;
transmitting the software application manifest and bid constraints associated with the software application to a plurality of vendors, wherein each of the plurality of vendors comprises computing system resources for hosting the software application;
receiving a plurality of bids from the plurality of vendors;
selecting, from the plurality of bids, a winning bid, the winning bid from a vendor of the plurality of vendors; and
transmitting the software application and payment information to the vendor,
wherein the plurality of bids from the plurality of vendors are received prior to any portion of the software application being transmitted to the plurality of vendors.
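The claim leaves the award criterion open; a plausible sketch filters bids against the manifest's resource requirements and awards the lowest price. Field names and the lowest-price rule are assumptions for illustration:

```python
def select_winning_bid(bids, constraints):
    """bids: list of dicts like {"vendor": ..., "price": ..., "storage_gb": ...,
    "cpu_cycles": ...}. Keep bids that satisfy the manifest-derived constraints,
    then award the cheapest eligible bid (assumed criterion)."""
    eligible = [b for b in bids
                if b["storage_gb"] >= constraints["storage_gb"]
                and b["cpu_cycles"] >= constraints["cpu_cycles"]]
    if not eligible:
        return None
    return min(eligible, key=lambda b: b["price"])
```

Consistent with the claim, this selection happens before any part of the application is transmitted to a vendor: only the manifest is shared at bidding time.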

US Pat. No. 10,339,598

METHOD, APPARATUS, AND SYSTEM FOR DISPLAYING A WEARABLE ARTICLE INTERFACE ON AN ELECTRONIC DEVICE

1. An electronic device comprising:
a computing system including a memory and at least one processor, wherein the computing system is configured to:
receive a selection of one or more preferences for wearable articles,
receive a selection of a first wearable article of a first article type displayed on a display screen,
identify one or more additional wearable articles each having a different article type from the first article type, and
generate for display on the display screen, with the first wearable article, a second wearable article of a second article type from the one or more additional wearable articles based on at least one preference of the one or more preferences, wherein the first wearable article is generated at a first portion on the display screen based on the first article type, the second article of the second article type, different from the first article type, is generated at a second portion of the display screen, different from the first portion, based on the second article type.
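The selection step — pick a second article of a different type that matches at least one user preference — can be sketched as a simple filter. The tag-based matching rule and all field names are assumptions, since the claim does not specify how a preference "matches" an article:

```python
def pick_companion_article(first_article, catalog, preferences):
    """Return a catalog article of a different type than first_article whose
    tags intersect the user's preference set, or None if nothing matches."""
    for article in catalog:
        if article["type"] != first_article["type"] and \
           preferences & set(article["tags"]):
            return article
    return None
```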

US Pat. No. 10,339,597

SYSTEMS AND METHODS FOR VIRTUAL BODY MEASUREMENTS AND MODELING APPAREL

1. A system, comprising:
a server including one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the server to:
derive body measurements of a human subject from a plurality of two-dimensional (2D) images representative of the human subject's body, wherein the plurality of 2D images are obtained from a single user device;
generate a virtual model representative of the human subject based on the derived measurements such that the virtual model comprises a body having one or more portions thereof that are equivalent to that of the human subject;
obtain information regarding characteristics of one or more actual items of clothing, wherein the information regarding the characteristics of the one or more actual items of clothing is obtained by deriving measurements of the one or more actual items of clothing from a plurality of images representative of the one or more actual items of clothing being worn by one or more physical mannequins, the plurality of images having been obtained by capturing images of the actual items of clothing by one or more camera-equipped unmanned vehicles traversing one or more routes about the one or more physical mannequins; and
present a virtual three-dimensional (3D) rendering of the virtual model outfitted with one or more images of actual items of clothing, the virtual 3D rendering representing the characteristics of the one or more actual items of clothing relative to the virtual model.

US Pat. No. 10,339,596

VENDOR WEBSITE GUI FOR MARKETING GREETING CARDS

Minted, LLC, San Francis...

1. A method comprising:
displaying, by a display screen, a front view of a card, the front view including (i) a front plan view in which a front surface of the card appears to be parallel with a front surface of the screen and (ii) the card being portrayed on the screen as rotating oscillatorily about an axis while remaining less than 90 degrees from the front plan view, wherein the oscillatory rotating causes the card's front face to appear nonparallel to the screen's front surface, and the card has negligible thickness such that side surfaces of the card do not embody a design or information that is revealed by the rotation;
displaying, by the display screen, a list of one or more card features, wherein a viewer can select one or more features to be applied to the card;
wherein the top, bottom, and side surfaces of the card are substantially paper thin;
wherein the front surface of the card comprises one or more images;
wherein the front surface of the card comprises a message in foil, wherein the foil has a raised texture;
wherein the oscillatory rotating displays reflectivity and shine of the foil;
wherein the axis is a vertical rotational axis;
wherein the axis is on the card, so that one section of the card appears to swing toward the viewer while another section of the card appears to swing away from the viewer;
wherein the card appears to be levitating in the air without support; and
wherein the displaying is implemented as part of a graphical user interface (GUI) of a vendor website of a merchant that sells cards, wherein instructions for implementing the GUI are received at the display screen over the Internet from a server that hosts the website.

US Pat. No. 10,339,594

TOUCH SENSATION INTERACTION METHOD AND APPARATUS IN SHOPPING

HUAWEI TECHNOLOGIES CO., ...

1. A touch sensation interaction method in shopping, wherein the method comprises:
collecting an image of a first object, wherein the first object is a part of a human body;
acquiring information about a second object, wherein the second object is a wearable item worn by the first object;
obtaining parameter information of a touch sensation signal using the image of the first object and the information about the second object, wherein the touch sensation signal is applied to the first object and is used to simulate a touch sensation caused by the second object to the first object when the second object is worn by the first object;
generating the touch sensation signal using the parameter information of the touch sensation signal based on a comparison of a plurality of lengths of the first object in a plurality of different directions and a plurality of lengths of the second object in the plurality of different directions; and
applying the touch sensation signal to the first object.
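The parameter-generation step compares the body part's lengths with the wearable item's lengths in the same set of directions. A minimal sketch, assuming a hypothetical linear pressure model (the claim only requires a comparison of lengths per direction):

```python
def touch_signal_parameters(body_lengths, item_lengths, stiffness=1.0):
    """For each measured direction, the simulated pressure is proportional to
    how much the body exceeds the item's length; zero where the item is larger.
    The linear model and `stiffness` constant are illustrative assumptions."""
    return {direction: max(0.0, stiffness * (body_lengths[direction] -
                                             item_lengths[direction]))
            for direction in body_lengths}
```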

US Pat. No. 10,339,592

CONFIGURING A VIRTUAL STORE BASED ON INFORMATION ASSOCIATED WITH A USER BY AN ONLINE SYSTEM

Facebook, Inc., Menlo Pa...

1. A method comprising:
generating a virtual store for presentation to a user of an online system via a virtual world generated by the online system, the virtual store rendered using graphics imitating appearance of a physical store;
generating an avatar rendered in the virtual world and representing the user;
identifying an opportunity to present an object to the user via the virtual store, the object associated with an organic appearance of the object designated by the online system;
retrieving information associated with the user by the online system;
identifying an additional user connected to the user via the online system from the information associated with the user;
determining, from an edge store, a first affinity between the user and the additional user;
determining, from the edge store, a second affinity between the additional user and the object;
determining an object score for the object based on the first and second affinities;
selecting the object for inclusion in the virtual store in response to the object score exceeding a threshold score;
ranking the object in a list of objects based on the object score;
obtaining a ranking of positions within the virtual store in accordance with relative levels of prominence within a layout of the virtual store;
determining a placement of the object at a position within the layout of the virtual store relative to other objects and relative to a position of the avatar representing the user in the virtual store based at least in part on the information associated with the user and the ranking of the object relative to the ranking of the position, the determined placement corresponding to an eye level of the avatar representing the user; and
receiving a sponsorship request to sponsor an object and a bid value associated with the sponsorship request, the sponsorship request including a sponsored appearance for the object;
determining a first appearance score for the organic appearance based on a third affinity between the user and the organic appearance of the object;
determining a second appearance score for the sponsored appearance of the object based on the bid value and a fourth affinity between the user and the sponsored appearance of the object;
selecting between the organic appearance and the sponsored appearance based on the first and second appearance scores to determine a selected appearance for the object; and
providing a graphical rendering of the virtual store including a rendering of the avatar and a rendering of the selected object according to the selected appearance in the determined placement at the eye level of the avatar representing the user to a client device for presentation to the user.
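The two scoring steps — combining the user-to-friend and friend-to-object affinities into an object score, and weighing an organic appearance against a bid-backed sponsored appearance — can be sketched as below. The multiplicative combinations are assumptions; the claim fixes the inputs but not the formulas:

```python
def object_score(first_affinity, second_affinity):
    """Combine the user->additional-user and additional-user->object
    affinities; a simple product is one assumed combination."""
    return first_affinity * second_affinity


def select_appearance(organic_affinity, sponsored_affinity, bid_value):
    """Organic score from the user's affinity for the organic appearance;
    sponsored score from the bid-weighted affinity (assumed linear weighting)."""
    organic_score = organic_affinity
    sponsored_score = sponsored_affinity * bid_value
    return "sponsored" if sponsored_score > organic_score else "organic"
```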

US Pat. No. 10,339,590

METHODS, SYSTEMS, AND PRODUCTS FOR GIFT GIVING

1. A method comprising:
receiving, by a server, an electronic selection specifying a gift for a recipient;
identifying, by the server, information about a device associated with the recipient;
determining, by the server based at least in part on the electronic selection and the information about the device associated with the recipient, that the gift for the recipient specified in the electronic selection received by the server is incompatible with the device associated with the recipient; and
generating, by the server, a message indicating that the recipient does not have a device that is compatible with the gift for the recipient specified in the electronic selection received by the server.
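The server-side compatibility check can be sketched as a platform lookup; the platform-based matching rule and the message wording are illustrative assumptions:

```python
def gift_compatibility_message(gift, recipient_devices):
    """Return None when some recipient device supports the gift's required
    platform; otherwise return the claimed incompatibility message."""
    if any(gift["required_platform"] in d["platforms"] for d in recipient_devices):
        return None
    return "Recipient does not have a device that is compatible with " \
           f"{gift['name']}."
```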

US Pat. No. 10,339,589

CONNECTED CONSUMABLES PREPARATION AREA

MASTERCARD INTERNATIONAL ...

4. A system for maintaining an electronically stored consumables inventory, the system comprising:
a consumables preparation area including:
a consumables preparation surface and an electronically stored consumables inventory update initiator, said update initiator comprising a barcode and/or quick response (QR) code scanner directed at the consumables preparation surface, a Radio-Frequency Identification (RFID) tag reader, and a digital scale;
wherein the update initiator is configured for use within a dwelling or establishment in which said consumables are to be transformed for use;
electronic data processing means, connected to the update initiator, configured to process data received from the update initiator and accordingly update the consumables inventory; and
electronic data storage means configured to store the consumables inventory;
wherein said electronic data processing means is configured to initiate placement of an order with a merchant for at least one consumable, through an electronic merchant ordering system, in response to determining that the consumables inventory in the electronic data storage means does not match a predetermined list of consumable items.
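The reorder trigger — compare the stored inventory against the predetermined list and order whatever is missing — reduces to a dictionary diff. A minimal sketch with assumed data shapes:

```python
def items_to_order(inventory, required_list):
    """inventory and required_list map item name -> quantity. Return the
    quantities needed to bring the inventory up to the predetermined list;
    an empty dict means the inventory matches and no order is placed."""
    return {item: needed - inventory.get(item, 0)
            for item, needed in required_list.items()
            if inventory.get(item, 0) < needed}
```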

US Pat. No. 10,339,588

SYSTEMS AND METHODS FOR PRICE SEARCHING AND INTELLIGENT SHOPPING LISTS ON A MOBILE DEVICE

United Services Automobil...

1. A system comprising:
at least one processor configured to:
receive input data containing product identification information and a price of a product;
determine a plurality of stores that offer the product identified by the product identification information;
obtain, from a location system associated with a mobile device of a user, a current location of the mobile device;
determine a route from the current location of the mobile device to one of the plurality of stores based at least in part on a prioritized list of factors, wherein:
the prioritized list of factors includes whether a lowest overall cost for obtaining the product is within a predetermined percentage difference from the price included in the received input data, and whether the product is available at a location within a certain distance of the current location of the mobile device;
automatically load data representing the determined route into the location system of the mobile device;
based on the loading, cause dynamic display, on the location system, of visual representations of locations of the user while the user is moving along the route determined from the current location of the mobile device to one of the plurality of stores based at least in part on the prioritized list of factors including whether a lowest overall cost for obtaining the product is within a predetermined percentage difference from the price included in the received input data and whether the product is available at a location within a certain distance of the current location of the mobile device; and
reserve the product at the one of the plurality of stores for a period of time until the user arrives at the store to purchase the product.
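The prioritized list of factors maps naturally onto lexicographic tuple comparison: cost-within-tolerance first, then proximity, then tiebreakers. A sketch with assumed field names and straight-line distance in place of real routing:

```python
def choose_store(stores, listed_price, current_location, pct_tolerance,
                 max_distance):
    """Pick the store best satisfying the prioritized factors. Each store is a
    dict with "x", "y", "total_cost". Python compares the key tuples element
    by element, so earlier factors dominate later ones."""
    def distance(s):
        return ((s["x"] - current_location[0]) ** 2 +
                (s["y"] - current_location[1]) ** 2) ** 0.5

    def key(s):
        cost_ok = s["total_cost"] <= listed_price * (1 + pct_tolerance / 100)
        near = distance(s) <= max_distance
        # False sorts before True, so satisfied factors rank first.
        return (not cost_ok, not near, s["total_cost"], distance(s))

    return min(stores, key=key)
```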

US Pat. No. 10,339,587

METHOD, MEDIUM, AND SYSTEM FOR CREATING A PRODUCT BY APPLYING IMAGES TO MATERIALS

FUJIFILM Corporation, To...

1. An image processing apparatus comprising:
a product material storage configured to store a plurality of product materials therein;
an instruction acquiring section configured to acquire an instruction input by a user;
a group-of-image acquiring section configured to acquire a group of images in accordance with an instruction of the user;
a first product material selector configured to select a first product material from among the plurality of product materials in accordance with an instruction of the user;
a second product material selector configured to select a second product material that is different from the first product material from among the plurality of product materials;
a product creator configured to create a recommended product by applying a first image constituting at least part of the group of images to the second product material;
a display controller configured to, when second images constituting at least part of the group of images are displayed on a display of a terminal device of the user in accordance with an instruction of the user, cause the recommended product to be displayed, together with the second images, on the display at least once;
a product material selection history recorder configured to record thereon a history of selection of the first product material in accordance with an instruction input by each of a plurality of users; and
a product material correlation storage configured to store therein a number of users, who ordered the recommended product together with a user-selected product, for each of the plurality of product materials based on the history of selection of the first product materials, the user-selected product being created by applying a third image constituting at least part of the group of images to the first product material,
wherein the second product material selector selects, from among the plurality of product materials, a product material for which the number of users is not lower than a threshold, as the second product material.
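The second-material selection rule — pick a material whose co-order count with the first material meets the threshold — can be sketched directly; breaking ties by highest count is an added assumption:

```python
def select_second_material(first_material, co_order_counts, threshold):
    """co_order_counts[m] = number of users who ordered a product on material m
    together with one on first_material. Return a material (other than the
    first) whose count is not lower than the threshold; here, the highest."""
    candidates = {m: n for m, n in co_order_counts.items()
                  if m != first_material and n >= threshold}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)
```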

US Pat. No. 10,339,586

TECHNIQUES FOR IDENTIFYING SIMILAR PRODUCTS

Amazon Technologies, Inc....

1. A computer-implemented method for making product recommendations in a computing network, comprising:
identifying, using one or more computing devices operating in the network, a reference product set based in part on one or more actions of a user associated with a remote device, the reference product set including a first product and a second product, wherein the one or more actions include one or more of viewing information representing the first and/or second product on the remote device, selecting the first and/or second product on the remote device, or purchasing the first and/or second product;
transmitting, using the one or more computing devices, a first product interface control for presentation on the remote device;
receiving, using the one or more computing devices, first selection data representing activation of the first product interface control from the remote device;
retrieving, using the one or more computing devices, a first product vector associated with the first product from a data store in response to the first selection data, the first product vector comprising a first plurality of values corresponding to a plurality of product attributes, the plurality of product attributes defining a vector space, the first plurality of values defining a first point in the vector space;
retrieving, using the one or more computing devices, a second product vector associated with the second product from the data store in response to the first selection data, the second product vector comprising a second plurality of values corresponding to the plurality of product attributes, the second plurality of values defining a second point in the vector space;
identifying a third point in the vector space, the third point representing a third product having an associated third product vector, the third product vector comprising a third plurality of values corresponding to the plurality of product attributes, the third plurality of values defining the third point in the vector space, the third product not being in the reference product set;
determining, using the one or more computing devices, that the third product is similar to the reference product set by:
calculating a first Euclidean distance between the first point and the third point in the vector space;
calculating a second Euclidean distance between the second point and the third point in the vector space;
determining a degree to which the first product and the second product in the reference product set are similar by determining a third Euclidean distance between the first point and the second point;
determining, based on the third Euclidean distance, a programmable threshold Euclidean distance for the reference product set beyond which products not in the reference product set are not considered similar to the reference product set; and
determining a sum of the first Euclidean distance and the second Euclidean distance;
determining that the sum of the first Euclidean distance and the second Euclidean distance is within the programmable threshold Euclidean distance for the reference product set;
generating a detail page that includes the first product in the reference product set, the second product in the reference product set, and the third product; and
transmitting, using the one or more computing devices, information representing the detail page for presentation on the remote device, thereby indicating that the third product is a recommended product based on the reference product set.
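The similarity test in this claim is fully specified up to how the threshold is derived from the reference pair's mutual distance. A sketch in plain Python, where the `scale` factor standing in for the "programmable threshold" derivation is an assumption:

```python
import math


def euclidean(p, q):
    """Euclidean distance between two points in the product-attribute space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))


def is_similar_to_reference_set(first_vec, second_vec, candidate_vec, scale=2.0):
    """The candidate is similar when the sum of its distances to both
    reference points stays within a threshold derived from how close the
    reference products already are (scale factor is an assumption)."""
    d13 = euclidean(first_vec, candidate_vec)
    d23 = euclidean(second_vec, candidate_vec)
    threshold = scale * euclidean(first_vec, second_vec)
    return d13 + d23 <= threshold
```

Note the geometric effect: summing the two distances and bounding the sum sweeps out an ellipse with the reference points as foci, so a tight reference pair yields a tight similarity region.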

US Pat. No. 10,339,585

COMBINED BOOTSTRAP DISTRIBUTION AND MIXTURE SEQUENTIAL PROBABILITY RATIO TEST APPLICATIONS TO ONLINE ECOMMERCE WEBSITES

WALMART APOLLO, LLC, Ben...

1. A system comprising:
one or more processing modules; and
one or more non-transitory storage modules storing computing instructions configured to run on the one or more processing modules and perform acts of:
receiving an online search query entered into a search field of an online ecommerce website by a user using the online ecommerce website;
determining a query response to the online search query by combining a nonparametric bootstrap distribution and a mixture sequential probability ratio test (SPRT), the query response comprising one or more products and being based on one of:
(1) a first metric comprising a query success rate per user session of a plurality of previous user sessions, or
(2) a second metric comprising a revenue per user session of the plurality of previous user sessions, wherein the first metric comprises a ratio of a total number of successful queries per user session of the plurality of previous user sessions to a total number of queries per user session of the plurality of previous user sessions; and
coordinating a display of the query response to the user using the online ecommerce website, wherein:
determining the query response to the online search query further comprises:
dividing data from the plurality of previous user sessions into a plurality of blocks of data; and
determining studentized plug-in statistics for the query success rate per user session of the plurality of previous user sessions on each block of data of the plurality of blocks of data.
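The per-block studentized plug-in statistic can be sketched as the sample mean of the block's per-session success rates, centered and scaled by its standard error. Anchoring the statistic at a null reference rate is an assumption about how the blocks feed the mixture SPRT:

```python
import math


def studentized_statistic(block, reference_rate):
    """Studentized plug-in statistic for one block of per-session query
    success rates: (sample mean - reference) / standard error."""
    n = len(block)
    mean = sum(block) / n
    var = sum((x - mean) ** 2 for x in block) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return (mean - reference_rate) / se
```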

US Pat. No. 10,339,583

OBJECT RECOGNITION AND ANALYSIS USING AUGMENTED REALITY USER DEVICES

Bank of America Corporati...

1. An augmented reality system comprising:
an augmented reality user device for a user comprising:
a display configured to overlay virtual objects onto tangible objects in a real scene in real-time;
a camera configured to capture images of tangible products;
a global positioning system (GPS) sensor configured to provide a geographic location of the user;
one or more processors operably coupled to the display, the camera, and the GPS sensor, and configured to implement:
an object recognition engine configured to identify tangible products;
a virtual assessment engine configured to:
authenticate the user based on a user input;
identify a user identifier for the user in response to authenticating the user;
identify a vendor based on the geographic location of the user;
capture an image of a product;
perform object recognition on the image to identify the product;
determine a price of the identified product;
generate a token comprising:
 the user identifier,
 a vendor identifier of the identified vendor,
 a product identifier of the identified product, and
 the price of the identified product;
send the token to a remote server;
receive virtual assessment data in response to sending the token, wherein the virtual assessment data comprises a recommendation identifying a selected account for the user and one or more new prequalified accounts for the user; and
a virtual overlay engine configured to present the recommendation identifying the selected account and the one or more new prequalified accounts as virtual objects overlaid with the product; and
the remote server comprising a product analysis engine configured to:
receive the token;
identify account information comprising one or more existing accounts for the user based on the user identifier;
prequalify the user for one or more new accounts based on at least one of the account information, the vendor identifier and the product identifier;
select an account from the one or more existing accounts and the one or more prequalified new accounts from the one or more new accounts for the user based on the price of the identified product;
generate the recommendation that identifies the selected account and the selected one or more prequalified new accounts;
generate the virtual assessment data identifying the recommendation; and
send the virtual assessment data to the augmented reality user device.
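The remote server's account-selection step can be sketched as a filter over existing plus newly prequalified accounts; the "tightest fit" rule and the field names are assumptions, since the claim only requires selection based on the product's price:

```python
def select_account(existing_accounts, prequalified_accounts, price):
    """Pick, from the user's existing and newly prequalified accounts, one
    whose available balance or limit covers the price; prefer the tightest
    fit (assumed selection rule)."""
    candidates = [a for a in existing_accounts + prequalified_accounts
                  if a["available"] >= price]
    if not candidates:
        return None
    return min(candidates, key=lambda a: a["available"])
```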

US Pat. No. 10,339,582

SYSTEM AND METHOD FOR INCREASING LOCATION AWARENESS OF ORGANIZATIONS

GOOGLE LLC, Mountain Vie...

1. A method for measuring location awareness of organizations on a map display, the method comprising:
for a particular organization, identifying, by one or more processors, a plurality of organization locations to present on a map display of a geographic area;
applying, by the one or more processors, a random selection function to select a subset of the plurality of organization locations, wherein the subset includes fewer locations than the plurality of organization locations;
causing, by the one or more processors, the map display of the geographic area to be presented to a user including placing an indication of the organization at each of the subset of organization locations on the map display;
when an organization location within the geographic area has been presented to the user a predetermined threshold number of times:
causing, by the one or more processors, an icon to be presented in place of the indication of the organization at the organization location, wherein the icon does not identify the organization;
in response to receiving a selection of the organization location represented by the icon, causing, by the one or more processors, a request to be presented for the user to identify the organization corresponding to the organization location, wherein the request includes an indication of the organization and an indication of at least one other organization;
providing, by the one or more processors to an organization computing device, a location awareness metric for the organization location based on the user's response to the request to identify the organization, wherein the location awareness metric is indicative of commercial content presented on the map display; and
receiving, by the one or more processors, commercial content for placement at specific geographic locations on the map display in accordance with the location awareness metric.
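The random-selection step — present a strict subset of the organization's locations — maps onto sampling without replacement. A sketch; the cap ensuring the subset is smaller than the full plurality is an assumption made explicit:

```python
import random


def choose_locations_to_show(locations, k, seed=None):
    """Randomly select a subset of the organization's locations for the map.
    The claim requires fewer locations than the full plurality, so k is
    capped at len(locations) - 1."""
    rng = random.Random(seed)
    k = min(k, len(locations) - 1)
    return rng.sample(locations, k)
```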