
Commit a8c575b

Adds section about the different options to use NLP in the stack (elastic#2679)
* Adds section about the different options to use NLP in the stack.
* Apply suggestions from code review
* [DOCS] Addresses feedback.

Co-authored-by: David Kyle <[email protected]>
1 parent a253fa2 commit a8c575b

File tree

1 file changed: +31 -2 lines changed

docs/en/stack/ml/nlp/ml-nlp-overview.asciidoc

Lines changed: 31 additions & 2 deletions
@@ -4,6 +4,35 @@
{nlp-cap} (NLP) refers to the way in which we can use software to understand
natural language in spoken word or written text.

+[discrete]
+[[nlp-elastic-stack]]
+== NLP in the {stack}
+
+Elastic offers a wide range of possibilities to leverage natural language
+processing.
+
+You can **integrate NLP models from different providers** such as Cohere,
+HuggingFace, or OpenAI and use them as a service through the
+{ref}/inference-apis.html[{infer} API]. You can also use <<ml-nlp-elser,ELSER>>
+(the retrieval model trained by Elastic) and <<ml-nlp-e5,E5>> in the same way.
+This {ref}/semantic-search-inference.html[tutorial] walks you through the
+process of using the various services with the {infer} API.
+
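To make that flow concrete, here is a minimal Python sketch of creating and calling an {infer} endpoint. The cluster URL, credentials, endpoint name `my-cohere-embeddings`, and the Cohere `service_settings` keys are assumptions for illustration; the provider-specific settings are covered in the linked tutorial.

```python
import requests

ES_URL = "http://localhost:9200"      # assumed local cluster
AUTH = ("elastic", "<password>")      # placeholder credentials

# Create an inference endpoint backed by a third-party provider (Cohere here).
# The service_settings keys are provider-specific; check the inference API docs.
requests.put(
    f"{ES_URL}/_inference/text_embedding/my-cohere-embeddings",
    auth=AUTH,
    json={
        "service": "cohere",
        "service_settings": {
            "api_key": "<cohere-api-key>",
            "model_id": "embed-english-v3.0",
        },
    },
).raise_for_status()

# Generate embeddings through the endpoint, used as a service.
response = requests.post(
    f"{ES_URL}/_inference/text_embedding/my-cohere-embeddings",
    auth=AUTH,
    json={"input": "Elastic offers several ways to run NLP."},
)
print(response.json())
```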
+You can **upload and manage NLP models** using the Eland client and the
+<<ml-nlp-deploy-models,{stack}>>. Find the
+<<ml-nlp-model-ref,list of recommended and compatible models here>>. Refer to
+<<ml-nlp-examples>> to learn more about how to use {ml} models deployed in your
+cluster.
+
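A rough sketch of that upload step with the Eland Python client is shown below. The connection details are placeholders, the Hugging Face model ID is just one compatible example, and the exact class signatures vary across eland versions; eland also ships an `eland_import_hub_model` command-line script for the same workflow.

```python
from pathlib import Path

from elasticsearch import Elasticsearch
from eland.ml.pytorch import PyTorchModel
from eland.ml.pytorch.transformers import TransformerModel

# Placeholder connection details for an assumed local cluster.
es = Elasticsearch("http://localhost:9200", basic_auth=("elastic", "<password>"))

# Pull a NER model from the Hugging Face Hub and trace it to TorchScript,
# the representation required by the Elastic Stack ML features.
tm = TransformerModel(
    model_id="elastic/distilbert-base-cased-finetuned-conll03-english",
    task_type="ner",
)
tmp_dir = Path("models")
tmp_dir.mkdir(exist_ok=True)
model_path, config, vocab_path = tm.save(str(tmp_dir))

# Upload the traced model into the cluster under its derived model ID.
PyTorchModel(es, tm.elasticsearch_model_id()).import_model(
    model_path=model_path,
    config_path=None,
    vocab_path=vocab_path,
    config=config,
)
```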
+You can **store embeddings in your {es} vector database** if you generate
+{ref}/dense-vector.html[dense vector] or {ref}/sparse-vector.html[sparse vector]
+model embeddings outside of {es}.
+
+
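For example, a short sketch of storing externally generated dense vectors might look like this. The index name, field names, vector dimension, and the stand-in embedding values are all assumptions; the real vectors would come from whatever model you run outside {es}.

```python
import requests

ES_URL = "http://localhost:9200"      # assumed local cluster
AUTH = ("elastic", "<password>")      # placeholder credentials

# Map a dense_vector field sized to match the external model's output.
requests.put(
    f"{ES_URL}/my-embeddings-index",
    auth=AUTH,
    json={
        "mappings": {
            "properties": {
                "text": {"type": "text"},
                "text_embedding": {"type": "dense_vector", "dims": 384},
            }
        }
    },
).raise_for_status()

# Index a document whose embedding was computed outside of Elasticsearch.
external_vector = [0.12, -0.07, 0.33] + [0.0] * 381  # stand-in for real model output
requests.post(
    f"{ES_URL}/my-embeddings-index/_doc",
    auth=AUTH,
    json={"text": "an example passage", "text_embedding": external_vector},
).raise_for_status()
```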
+[discrete]
+[[what-is-nlp]]
+== What is NLP?
+
Classically, NLP was performed using linguistic rules, dictionaries, regular
expressions, and {ml} for specific tasks such as automatic categorization or
summarization of text. In recent years, however, deep learning techniques have
@@ -24,8 +53,8 @@ which is an underlying native library for PyTorch. Trained models must be in a
TorchScript representation for use with {stack} {ml} features.

As in the cases of <<ml-dfa-classification,classification>> and
-<<ml-dfa-regression,{regression}>>, after you deploy a model to your cluster, you
-can use it to make predictions (also known as _{infer}_) against incoming
+<<ml-dfa-regression,{regression}>>, after you deploy a model to your cluster,
+you can use it to make predictions (also known as _{infer}_) against incoming
data. You can perform the following NLP operations:

* <<ml-nlp-extract-info>>
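As a minimal sketch of that prediction step, a deployed model can be called through the infer trained model API; the model ID below is a placeholder for whichever model you have deployed.

```python
import requests

ES_URL = "http://localhost:9200"      # assumed local cluster
AUTH = ("elastic", "<password>")      # placeholder credentials

# Run inference against a deployed trained model (placeholder model ID).
response = requests.post(
    f"{ES_URL}/_ml/trained_models/my-deployed-ner-model/_infer",
    auth=AUTH,
    json={"docs": [{"text_field": "Elastic was founded in Amsterdam."}]},
)
print(response.json())
```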
