US Patent 11625436 Systems and methods for query autocompletion

Patent 11625436 was granted and assigned to Salesforce on April 11, 2023 by the United States Patent and Trademark Office.

Is a: Patent

Patent attributes

Patent Applicant: Salesforce
Current Assignee: Salesforce
Patent Jurisdiction: United States Patent and Trademark Office
Patent Number: 11625436
Patent Inventor Names: Wenhao Liu, Young Mo Kang, Yingbo Zhou
Date of Patent: April 11, 2023
Patent Application Number: 17119941
Date Filed: December 11, 2020
Patent Citations

  • US Patent 10565305 Adaptive attention model for image captioning
  • US Patent 10565306 Sentinel gate for modulating auxiliary information in a long short-term memory (LSTM) neural network
  • US Patent 10565318 Neural machine translation with latent tree attention
  • US Patent 10565493 Pointer sentinel mixture architecture
  • US Patent 10573295 End-to-end speech recognition with policy learning
  • US Patent 10592767 Interpretable counting in visual question answering
  • US Patent 10699060 Natural language processing using a neural network
  • US Patent 10747761 Neural network based translation of natural language queries to database queries
  • ...
Patent Citations Received

  • US Patent 12067037 System, method, and computer program for performing natural language searches for documents in a database using alternate search suggestions

Patent Primary Examiner: Loc Tran

CPC Codes

  • G06F 16/90324
  • G06F 16/9027
  • G06F 16/90344

Embodiments described herein provide a query autocompletion (QAC) framework that operates at the subword level. Specifically, the QAC framework employs a subword encoder that converts the sequence of input alphabet letters into a sequence of output subwords. An n-gram language model then performs beam search over the subword candidate sequences generated by the subword encoder. Because user queries submitted to search engines are generally short, e.g., ranging from 10 to 30 characters, an n-gram language model at the subword level is well suited to modeling such short contexts and outperforms traditional language models in both completion accuracy and runtime speed. Furthermore, key computations are performed before runtime to prepare the segmentation candidates that the subword encoder uses to generate subword candidate sequences, eliminating significant computational overhead.
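To illustrate the decoding step the abstract describes, the following is a minimal sketch of beam search with a subword-level n-gram language model. The subword vocabulary, the bigram counts, and the smoothing scheme are all toy assumptions for illustration; they are not the patent's actual model or data.

```python
# Toy sketch: subword-level bigram language model + beam search for
# query autocompletion (QAC). All counts and subwords are hypothetical.
import math

# Toy bigram counts over subwords, assumed to be learned offline.
BIGRAMS = {
    ("<s>", "sal"): 5, ("sal", "es"): 4, ("es", "force"): 4,
    ("force", "</s>"): 3, ("sal", "ad"): 1, ("ad", "</s>"): 1,
}
UNIGRAMS = {}
for (a, _b), c in BIGRAMS.items():
    UNIGRAMS[a] = UNIGRAMS.get(a, 0) + c

def bigram_logprob(prev, nxt):
    """Add-one smoothed bigram log-probability P(nxt | prev)."""
    vocab = len(UNIGRAMS) + 1
    num = BIGRAMS.get((prev, nxt), 0) + 1
    return math.log(num / (UNIGRAMS.get(prev, 0) + vocab))

def beam_search(prefix_subwords, beam_width=3, max_steps=5):
    """Extend a subword-encoded prefix with the highest-scoring subwords."""
    candidates = ["sal", "es", "force", "ad", "</s>"]
    beams = [(0.0, list(prefix_subwords))]
    for _ in range(max_steps):
        expanded = []
        for score, seq in beams:
            if seq[-1] == "</s>":          # completed query: keep as-is
                expanded.append((score, seq))
                continue
            for sw in candidates:          # extend with each candidate subword
                expanded.append((score + bigram_logprob(seq[-1], sw), seq + [sw]))
        beams = sorted(expanded, key=lambda x: -x[0])[:beam_width]
        if all(seq[-1] == "</s>" for _, seq in beams):
            break
    return beams

# The subword encoder would map the typed prefix "sal" to ["<s>", "sal"];
# here that encoding is assumed as input.
best_score, best_seq = beam_search(["<s>", "sal"])[0]
completion = "".join(t for t in best_seq if not t.startswith("<"))
print(completion)  # -> salesforce
```

Scoring each hypothesis over whole subwords rather than single characters is what keeps the n-gram context short, which is the efficiency argument the abstract makes for short queries.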

