US Patent 10152676 Distributed training of models using stochastic gradient descent

Patent 10152676 was granted and assigned to Amazon on December 11, 2018 by the United States Patent and Trademark Office.

Is a
Patent

Patent attributes

Current Assignee
Amazon
Patent Jurisdiction
United States Patent and Trademark Office
Patent Number
10152676
Patent Inventor Names
Nikko Strom
Date of Patent
December 11, 2018
Patent Application Number
14/087,852
Date Filed
November 22, 2013
Patent Citations Received
US Patent 12137123 Rapid predictive analysis of very large data sets using the distributed computational graph
US Patent 11468263 Technology for building and managing data models
US Patent 11989651 Method and system for on-the-fly object labeling via cross modality validation in autonomous driving vehicles
US Patent 12093821 Method and system for closed loop perception in autonomous driving vehicles
US Patent 11468492 Decentralized recommendations using distributed average consensus
US Patent 10380503 Distributed online learning for privacy-preserving personal predictive models
US Patent 10706352 Training action selection neural networks using off-policy actor critic reinforcement learning
...
Patent Primary Examiner
Eric Nilsson
Patent abstract

Features are disclosed for distributing the training of models over multiple computing nodes (e.g., servers or other computing devices). Each computing device may include a separate copy of the model to be trained, and a subset of the training data to be used. A computing device may determine updates for parameters of the model based on processing of a portion of the training data. A portion of those updates may be selected for application to the model and synchronization with other computing devices. In some embodiments, the portion of the updates is selected based on a threshold value. Other computing devices can apply the received portion of the updates such that the copy of the model being trained in each individual computing device may be substantially synchronized, even though each computing device may be using a different subset of training data to train the model.
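The scheme the abstract describes — each node holds a replica of the model, trains on its own shard of the data, and exchanges only those parameter updates that exceed a threshold — can be illustrated with a short in-process simulation. The sketch below is an assumption-laden illustration, not the patent's implementation: the `Worker` class, the threshold name `tau`, the residual buffer, the least-squares objective, and the learning rate are all invented for the example.

```python
# Minimal sketch (not the patent's reference implementation) of
# threshold-selected sparse update synchronization for distributed SGD,
# simulated in-process with NumPy. All names here are illustrative.
import numpy as np

class Worker:
    def __init__(self, shard_x, shard_y, dim, tau):
        self.x, self.y = shard_x, shard_y   # this node's subset of the training data
        self.w = np.zeros(dim)              # local copy of the model
        self.residual = np.zeros(dim)       # updates too small to send yet
        self.tau = tau                      # selection threshold (assumed name)

    def local_update(self, lr):
        # Plain least-squares gradient on this worker's shard.
        grad = self.x.T @ (self.x @ self.w - self.y) / len(self.y)
        self.residual += -lr * grad         # accumulate the full update locally
        # Select only components whose accumulated update exceeds the threshold.
        mask = np.abs(self.residual) >= self.tau
        sparse = np.where(mask, self.residual, 0.0)
        self.residual[mask] = 0.0           # keep the remainder for later rounds
        return sparse                       # this is all that is communicated

    def apply(self, update):
        self.w += update                    # apply updates received from any worker


rng = np.random.default_rng(0)
dim, n = 8, 256
w_true = rng.normal(size=dim)
X = rng.normal(size=(n, dim))
y = X @ w_true

# Two workers, each holding a disjoint subset of the training data.
workers = [Worker(X[:n // 2], y[:n // 2], dim, tau=0.01),
           Worker(X[n // 2:], y[n // 2:], dim, tau=0.01)]

for step in range(200):
    # Each worker selects its above-threshold updates...
    sparse_updates = [wk.local_update(lr=0.1) for wk in workers]
    # ...and every replica applies every selected update.
    for wk in workers:
        for u in sparse_updates:
            wk.apply(u)

# Replicas stay synchronized despite exchanging only sparse updates.
assert np.allclose(workers[0].w, workers[1].w)
print("parameter error:", np.linalg.norm(workers[0].w - w_true))
```

Because every replica applies the same set of selected updates, the model copies remain substantially synchronized even though each node trains on different data; the unsent remainder accumulates in each worker's local residual until it crosses the threshold, so small updates are deferred rather than lost.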

Timeline

No Timeline data yet.

Further Resources

No Further Resources data yet.
