---

copyright:
  years: 2020, 2021
lastupdated: "2021-03-12"

subcollection: discovery-data

---

{:shortdesc: .shortdesc}
{:external: target="_blank" .external}
{:tip: .tip}
{:note: .note}
{:pre: .pre}
{:important: .important}
{:deprecated: .deprecated}
{:codeblock: .codeblock}
{:screen: .screen}
{:download: .download}
{:hide-dashboard: .hide-dashboard}
{:apikey: data-credential-placeholder='apikey'}
{:url: data-credential-placeholder='url'}
{:curl: .ph data-hd-programlang='curl'}
{:javascript: .ph data-hd-programlang='javascript'}
{:java: .ph data-hd-programlang='java'}
{:python: .ph data-hd-programlang='python'}
{:ruby: .ph data-hd-programlang='ruby'}
{:swift: .ph data-hd-programlang='swift'}
{:go: .ph data-hd-programlang='go'}

# Using a Cloud Pak for Data custom crawler plug-in with the Discovery tooling
{: #crawler-plugin-tooling}

After you build and deploy a crawler plug-in, you can configure your {{site.data.keyword.discoveryshort}} collection to use your plug-in to process documents.
{: shortdesc}

**{{site.data.keyword.icp4dfull_notm}} only**

This information applies only to installed deployments.
{: note}

You can create and manage collections as described in Creating and managing collections. When you create or manage a collection, you can select any crawler plug-in that was deployed successfully. For more information, see Crawler plug-in settings. You can also deploy a crawler plug-in package to a testing environment.