Python Elasticsearch Client, Elasticsearch 7.15.2 documentation (http://elasticsearch-py.readthedocs.io)
Python Elasticsearch Client¶

Official low-level client for Elasticsearch. Its goal is to provide common ground for all Elasticsearch-related code in Python; because of this it tries to be opinion-free and very extendable.

Installation¶

Install the elasticsearch package with pip:

$ python -m pip install elasticsearch

If your application uses async/await in Python you can install with the async extra:

$ python -m pip install elasticsearch[async]

Read more about how to use asyncio with this project.

Compatibility¶

Language clients are forward compatible, meaning that clients support communicating with greater or equal minor versions of Elasticsearch. Elasticsearch language clients are backwards compatible only with default distributions, and without guarantees.

If you need to have multiple versions installed at the same time, older versions are also released as elasticsearch2, elasticsearch5 and elasticsearch6.

Example Usage¶
from datetime import datetime
from elasticsearch import Elasticsearch

es = Elasticsearch()

doc = {
    "author": "kimchy",
    "text": "Elasticsearch: cool. bonsai cool.",
    "timestamp": datetime.now(),
}
res = es.index(index="test-index", id=1, document=doc)
print(res["result"])

res = es.get(index="test-index", id=1)
print(res["_source"])

es.indices.refresh(index="test-index")

res = es.search(index="test-index", query={"match_all": {}})
print("Got %d Hits:" % res["hits"]["total"]["value"])
for hit in res["hits"]["hits"]:
    print("%(timestamp)s %(author)s: %(text)s" % hit["_source"])

Features¶

This client was designed as a very thin wrapper around Elasticsearch’s REST API to allow for maximum flexibility. This means that there are no opinions in this client; it also means that some of the APIs are a little cumbersome to use from Python. We have created some Helpers to help with this issue as well as a more high level library (elasticsearch-dsl) on top of this one to provide a more convenient way of working with Elasticsearch.

Persistent Connections¶

elasticsearch-py uses persistent connections inside of individual connection pools (one per each configured or sniffed node). Out of the box you can choose between two http protocol implementations. See Transport classes for more information.

The transport layer will create an instance of the selected connection class per node and keep track of the health of individual nodes - if a node becomes unresponsive (throwing exceptions while connecting to it) it’s put on a timeout by the ConnectionPool class and only returned to circulation after the timeout is over (or when no live nodes are left). By default nodes are randomized before being passed into the pool and a round-robin strategy is used for load balancing.
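The shuffle-then-round-robin behavior with a dead-node timeout can be sketched in a few lines of plain Python. This is a toy illustration with hypothetical names (TinyPool, mark_dead, get_node), not the library's actual ConnectionPool implementation:

```python
import itertools
import random
import time

class TinyPool:
    """Toy round-robin pool; a failed node sits out until its timeout expires."""

    def __init__(self, nodes, dead_timeout=60.0):
        self.nodes = list(nodes)
        random.shuffle(self.nodes)      # randomize once before pooling
        self.dead_timeout = dead_timeout
        self.dead_until = {}            # node -> clock time it may return
        self._rr = itertools.cycle(self.nodes)

    def mark_dead(self, node):
        # bench the node for dead_timeout seconds
        self.dead_until[node] = time.monotonic() + self.dead_timeout

    def get_node(self):
        now = time.monotonic()
        for _ in range(len(self.nodes)):
            node = next(self._rr)
            if self.dead_until.get(node, 0) <= now:
                return node             # live, or its timeout has expired
        # no live nodes left: fall back to the one closest to revival
        return min(self.nodes, key=lambda n: self.dead_until.get(n, 0))

pool = TinyPool(["node1:9200", "node2:9200"], dead_timeout=30)
first = pool.get_node()
pool.mark_dead(first)
print(pool.get_node())  # the remaining live node
```

The real pool does the same bookkeeping per connection object and grows the bench time on repeated failures, as described under Automatic Retries below.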

You can customize this behavior by passing parameters to the Connection Layer API (all keyword arguments to the Elasticsearch class will be passed through). If what you want to accomplish is not supported, you should be able to create a subclass of the relevant component and pass it in as a parameter to be used instead of the default implementation.

Automatic Retries¶

If a connection to a node fails due to connection issues (raises ConnectionError) it is considered to be in a faulty state. It will be placed on hold for dead_timeout seconds and the request will be retried on another node. If a connection fails multiple times in a row the timeout will get progressively larger to avoid hitting a node that’s, by all indications, down. If no live connection is available, the connection that has the smallest timeout will be used.
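The progressively larger timeout is essentially exponential backoff. A sketch of the idea, where the base value, the cap, and the exact formula are illustrative assumptions rather than the library's internals:

```python
def dead_timeout(failures, base=60.0, cutoff=5):
    """Each consecutive failure doubles the bench time, capped at base * 2**cutoff.

    failures: number of consecutive failures observed for the node (>= 1).
    """
    return base * (2 ** min(failures - 1, cutoff))

# a node failing repeatedly is benched for progressively longer
print([dead_timeout(n) for n in (1, 2, 3, 4)])  # [60.0, 120.0, 240.0, 480.0]
```

The cap keeps a flapping node from being benched effectively forever; once it responds again, its failure count resets.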

By default retries are not triggered by a timeout (ConnectionTimeout); set retry_on_timeout to True to also retry on timeouts.

Sniffing¶

The client can be configured to inspect the cluster state to get a list of nodes upon startup, periodically and/or on failure. See Transport parameters for details.

Some example configurations:

from elasticsearch import Elasticsearch

# by default we don't sniff, ever
es = Elasticsearch()

# you can specify to sniff on startup to inspect the cluster and load
# balance across all nodes
es = Elasticsearch(["seed1", "seed2"], sniff_on_start=True)

# you can also sniff periodically and/or after failure:
es = Elasticsearch(
    ["seed1", "seed2"],
    sniff_on_start=True,
    sniff_on_connection_fail=True,
    sniffer_timeout=60,
)

Thread safety¶

The client is thread safe and can be used in a multi threaded environment. Best practice is to create a single global instance of the client and use it throughout your application. If your application is long-running consider turning on Sniffing to make sure the client is up to date on the cluster location.

By default we allow urllib3 to open up to 10 connections to each node; if your application calls for more parallelism, use the maxsize parameter to raise the limit:

# allow up to 25 connections to each node
es = Elasticsearch(["host1", "host2"], maxsize=25)

Note

Since we use persistent connections throughout the client, the client doesn’t tolerate fork very well. If your application calls for multiple processes, make sure you create a fresh client after a call to fork. Note that Python’s multiprocessing module uses fork to create new processes on POSIX systems.

TLS/SSL and Authentication¶

You can configure the client to use SSL for connecting to your Elasticsearch cluster, including certificate verification and HTTP auth:

from elasticsearch import Elasticsearch

# you can use RFC-1738 to specify the url
es = Elasticsearch(["https://user:secret@localhost:443"])

# ... or specify common parameters as kwargs
es = Elasticsearch(
    ["localhost", "otherhost"],
    http_auth=("user", "secret"),
    scheme="https",
    port=443,
)

# SSL client authentication using client_cert and client_key
from ssl import create_default_context

context = create_default_context(cafile="path/to/cert.pem")
es = Elasticsearch(
    ["localhost", "otherhost"],
    http_auth=("user", "secret"),
    scheme="https",
    port=443,
    ssl_context=context,
)

Warning

elasticsearch-py doesn’t ship with a default set of root certificates. To have working SSL certificate validation you need to either specify your own as cafile, capath or cadata, or install certifi, which will be picked up automatically.

See class Urllib3HttpConnection for a detailed description of the options.

Connecting via Cloud ID¶

Cloud ID is an easy way to configure your client to work with your Elastic Cloud deployment. Combine the cloud_id with either http_auth or api_key to authenticate with your Elastic Cloud deployment.

Using cloud_id enables TLS verification and HTTP compression by default and sets the port to 443, unless overridden via the port parameter or by a port value encoded within cloud_id. Using Cloud ID also disables sniffing.

from elasticsearch import Elasticsearch

es = Elasticsearch(
    cloud_id="cluster-1:dXMa5Fx...",
    http_auth=("elastic", "password"),
)
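For illustration, the shape of a Cloud ID can be sketched with the stdlib: it is a label followed by a base64-encoded string of the form domain$es_uuid$kibana_uuid, from which the client derives the Elasticsearch host name. parse_cloud_id below is a hypothetical helper, not a library function, and the deployment values are synthetic:

```python
import base64

def parse_cloud_id(cloud_id):
    """Sketch of Cloud ID decoding: "label:base64(domain$es_uuid$kibana_uuid)"."""
    label, _, encoded = cloud_id.partition(":")
    domain, es_uuid, *_ = base64.b64decode(encoded).decode("utf-8").split("$")
    # the Elasticsearch endpoint is the cluster uuid prefixed onto the domain
    return label, "%s.%s" % (es_uuid, domain)

# synthetic example (not a real deployment)
encoded = base64.b64encode(b"us-east-1.aws.example.io$abc123$def456").decode()
print(parse_cloud_id("cluster-1:" + encoded))
# → ('cluster-1', 'abc123.us-east-1.aws.example.io')
```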
API Key Authentication¶

You can configure the client to use Elasticsearch’s API Key for connecting to your cluster. Please note this authentication method was introduced with the release of Elasticsearch 6.7.0.

from elasticsearch import Elasticsearch

# you can use the api key tuple
es = Elasticsearch(
    ["node-1", "node-2", "node-3"],
    api_key=("id", "api_key"),
)

# or you can pass the base64-encoded token
es = Elasticsearch(
    ["node-1", "node-2", "node-3"],
    api_key="base64encoded tuple",
)
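The two forms are equivalent: the base64 token is the colon-joined id:api_key pair, which the client sends in an Authorization: ApiKey header. api_key_token below is a hypothetical helper used only to show the relationship:

```python
import base64

def api_key_token(key_id, api_key):
    """Build the base64 token equivalent to passing the (id, api_key) tuple."""
    pair = "%s:%s" % (key_id, api_key)
    return base64.b64encode(pair.encode("utf-8")).decode("ascii")

print(api_key_token("id", "api_key"))  # aWQ6YXBpX2tleQ==
```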
Logging¶

elasticsearch-py uses the standard logging library from Python to define two loggers: elasticsearch and elasticsearch.trace. elasticsearch is used by the client to log standard activity, depending on the log level. elasticsearch.trace can be used to log requests to the server in the form of curl commands using pretty-printed json that can then be executed from the command line. Because it is designed to be shared (for example to demonstrate an issue) it also just uses localhost:9200 as the address instead of the actual address of the host. If the trace logger has not been configured already it is set to propagate=False, so it needs to be activated separately.
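A minimal activation sketch using only the stdlib logging module; the handler choice and levels here are just examples:

```python
import logging
import sys

# the trace logger does not propagate, so give it its own
# handler and level to activate it
tracer = logging.getLogger("elasticsearch.trace")
tracer.setLevel(logging.INFO)
tracer.addHandler(logging.StreamHandler(sys.stderr))

# standard client activity: warnings and errors only
logging.getLogger("elasticsearch").setLevel(logging.WARNING)
```

With this in place, every request the client makes is echoed to stderr as a runnable curl command.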

Type Hints¶

Starting in elasticsearch-py v7.10.0 the library ships with type hints and supports basic static type analysis with tools like Mypy and Pyright.

If we write a script that has a type error, like using request_timeout with a str argument instead of float, and then run Mypy on the script:

# script.py
from elasticsearch import Elasticsearch

es = Elasticsearch(...)
es.search(
    index="test-index",
    request_timeout="5"  # type error!
)

# $ mypy script.py
# script.py:5: error: Argument "request_timeout" to "search" of "Elasticsearch" has
#                     incompatible type "str"; expected "Union[int, float, None]"
# Found 1 error in 1 file (checked 1 source file)

For now many parameter types for API methods aren’t specific to a type (i.e. they are of type typing.Any) but in the future they will be tightened for even better static type checking.

Type hints also allow tools like your IDE to check types and provide betterauto-complete functionality.

Warning

The type hints for API methods like search don’t match the function signature that can be found in the source code. Type hints represent optimal usage of the API methods. Using keyword arguments is highly recommended, so all optional parameters and body are keyword-only in type hints.

JetBrains PyCharm will use the warning Unexpected argument to denote that theparameter may be keyword-only.

Environment considerations¶

When using the client there are several limitations of your environment thatcould come into play.

When using an HTTP load balancer you cannot use the Sniffing functionality - the cluster would supply the client with IP addresses to directly connect to the cluster, circumventing the load balancer. Depending on your configuration this might be something you don’t want, or it might break completely.

Compression¶

When using capacity-constrained networks (low throughput), it may be handy to enable compression. This is especially useful when doing bulk loads or inserting large documents. Compression is configured with the http_compress parameter:

from elasticsearch import Elasticsearch

es = Elasticsearch(hosts, http_compress=True)

Compression is enabled by default when connecting to Elastic Cloud via cloud_id.

Customization¶

Custom serializers¶

By default, JSONSerializer is used to encode all outgoing requests. However, you can implement your own custom serializer:

from elasticsearch.serializer import JSONSerializer

class SetEncoder(JSONSerializer):
    def default(self, obj):
        if isinstance(obj, set):
            return list(obj)
        if isinstance(obj, Something):
            return CustomSomethingRepresentation
        return JSONSerializer.default(self, obj)

es = Elasticsearch(serializer=SetEncoder())
Elasticsearch-DSL¶

For a more high level client library with a more limited scope, have a look at elasticsearch-dsl - a more pythonic library sitting on top of elasticsearch-py.

elasticsearch-dsl provides a more convenient and idiomatic way to write and manipulate queries by mirroring the terminology and structure of the Elasticsearch JSON DSL, while exposing the whole range of the DSL from Python, either directly using defined classes or queryset-like expressions.

It also provides an optional persistence layer for working with documents asPython objects in an ORM-like fashion: defining mappings, retrieving and savingdocuments, wrapping the document data in user-defined classes.

Contents¶

API Documentation
Global Options
Elasticsearch
Async Search
Autoscaling
Cat
Cross-Cluster Replication (CCR)
Cluster
Dangling Indices
Enrich Policies
Event Query Language (EQL)
Snapshottable Features
Fleet
Graph Explore
Index Lifecycle Management (ILM)
Indices
Ingest Pipelines
License
Logstash
Migration
Machine Learning (ML)
Monitoring
Nodes
Rollup Indices
Searchable Snapshots
Security
Shutdown
Snapshot Lifecycle Management (SLM)
Snapshots
SQL
TLS/SSL
Tasks
Text Structure
Transforms
Watcher
X-Pack
Exceptions
Using Asyncio with Elasticsearch
Getting Started with Async
ASGI Applications and Elastic APM
Frequently Asked Questions
Async Helpers
API Reference
Connection Layer API
Transport
Connection Pool
Connection Selector
Urllib3HttpConnection (default connection_class)
API Compatibility HTTP Header
Transport classes
Product check on first request
Connection
Urllib3HttpConnection
RequestsHttpConnection
Helpers
Bulk helpers
Scan
Reindex
Release Notes

License¶

Copyright 2021 Elasticsearch B.V. Licensed under the Apache License, Version 2.0.
