Migration Examples

This document is a list of examples of how to migrate plugin code from legacy APIs to their Kibana Platform equivalents.

Configuration

Declaring config schema

Declaring the schema of your configuration fields is similar to the Legacy Platform, but uses the @kbn/config-schema package instead of Joi. This package has full TypeScript support out-of-the-box.

Legacy config schema

import Joi from 'joi';

new kibana.Plugin({
  config() {
    return Joi.object({
      enabled: Joi.boolean().default(true),
      defaultAppId: Joi.string().default('home'),
      index: Joi.string().default('.kibana'),
      disableWelcomeScreen: Joi.boolean().default(false),
      autocompleteTerminateAfter: Joi.number().integer().min(1).default(100000),
    })
  }
});

Kibana Platform equivalent

import { schema, TypeOf } from '@kbn/config-schema';

export const config = {
  schema: schema.object({
    enabled: schema.boolean({ defaultValue: true }),
    defaultAppId: schema.string({ defaultValue: 'home' }),
    index: schema.string({ defaultValue: '.kibana' }),
    disableWelcomeScreen: schema.boolean({ defaultValue: false }),
    autocompleteTerminateAfter: schema.duration({ min: 1, defaultValue: 100000 }),
  })
};

// @kbn/config-schema is written in TypeScript, so you can use your schema
// definition to create a type to use in your plugin code.
export type MyPluginConfig = TypeOf<typeof config.schema>;

Using Kibana config in a new plugin

After declaring the config schema for your plugin, you might want to read configuration values in your plugin code. The configuration is provided as part of the PluginInitializerContext passed to the plugin’s constructor:

plugins/my_plugin/(public|server)/index.ts

import type { PluginInitializerContext } from 'kibana/server';
import { MyPlugin } from './plugin';

export function plugin(initializerContext: PluginInitializerContext) {
  return new MyPlugin(initializerContext);
}

plugins/my_plugin/(public|server)/plugin.ts

import type { Observable } from 'rxjs';
import { first } from 'rxjs/operators';
import { CoreSetup, Logger, Plugin, PluginInitializerContext, PluginName } from 'kibana/server';
import type { MyPluginConfig } from './config';

export class MyPlugin implements Plugin {
  private readonly config$: Observable<MyPluginConfig>;
  private readonly log: Logger;

  constructor(private readonly initializerContext: PluginInitializerContext) {
    this.log = initializerContext.logger.get();
    this.config$ = initializerContext.config.create();
  }

  public async setup(core: CoreSetup, deps: Record<PluginName, unknown>) {
    const config = await this.config$.pipe(first()).toPromise();
  }
}

Additionally, some plugins need to access the runtime env configuration.

export class MyPlugin implements Plugin {
  constructor(private readonly initializerContext: PluginInitializerContext) {}

  public async setup(core: CoreSetup, deps: Record<PluginName, unknown>) {
    const { mode: { dev }, packageInfo: { version } } = this.initializerContext.env;
  }
}

Creating a Kibana Platform plugin

For example, if you want to move the legacy demoplugin plugin’s configuration to the Kibana Platform, you could create the Kibana Platform plugin with the same name in plugins/demoplugin with the following files:

plugins/demoplugin/kibana.json

{
  "id": "demoplugin",
  "server": true
}

plugins/demoplugin/server/index.ts

import { schema, TypeOf } from '@kbn/config-schema';
import type { PluginInitializerContext } from 'kibana/server';
import { DemoPlugin } from './plugin';

export const config = {
  schema: schema.object({
    enabled: schema.boolean({ defaultValue: true }),
  }),
};

export const plugin = (initContext: PluginInitializerContext) => new DemoPlugin(initContext);

export type DemoPluginConfig = TypeOf<typeof config.schema>;
export { DemoPluginSetup } from './plugin';

plugins/demoplugin/server/plugin.ts

import type { PluginInitializerContext, Plugin, CoreSetup } from 'kibana/server';
import type { DemoPluginConfig } from '.';
export interface DemoPluginSetup {};

export class DemoPlugin implements Plugin<DemoPluginSetup> {
  constructor(private readonly initContext: PluginInitializerContext) {}

  public setup(core: CoreSetup) {
    return {};
  }

  public start() {}
  public stop() {}
}

HTTP Routes

In the legacy platform, plugins have direct access to the Hapi server object, which gives full access to all of Hapi’s API. In the Kibana Platform, plugins have access to the HttpServiceSetup interface, which is exposed via the CoreSetup object injected into the setup method of server-side plugins.

This interface has a different API with slightly different behaviors.

  • All input (body, query parameters, and URL parameters) must be validated using the @kbn/config-schema package. If no validation schema is provided, these values will be empty objects.
  • All exceptions thrown by handlers result in 500 errors. If you need a specific HTTP error code, catch any exceptions in your handler and construct the appropriate response using the provided response factory. While you can continue using the Boom module internally in your plugin, the framework does not have native support for converting Boom exceptions into HTTP responses.
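
For example, to return a specific HTTP error code, catch the failure inside the handler and use the response factory. The following is a minimal sketch: it assumes router was created with core.http.createRouter(), schema comes from @kbn/config-schema, and findItem is a hypothetical data-access helper that throws when nothing matches.

router.get(
  {
    path: '/api/demoplugin/item/{id}',
    validate: {
      // URL parameters and query string must be validated too
      params: schema.object({ id: schema.string() }),
      query: schema.object({ pretty: schema.maybe(schema.boolean()) }),
    },
  },
  async (context, req, res) => {
    try {
      const item = await findItem(req.params.id); // hypothetical helper
      return res.ok({ body: item });
    } catch (e) {
      // turn the failure into a 404 instead of letting it surface as a 500
      return res.notFound({ body: `item ${req.params.id} not found` });
    }
  }
);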

Migrate legacy route registration: legacy/plugins/demoplugin/index.ts

import Joi from 'joi';

new kibana.Plugin({
  init(server) {
    server.route({
      path: '/api/demoplugin/search',
      method: 'POST',
      options: {
        validate: {
          payload: Joi.object({
            field1: Joi.string().required(),
          }),
        }
      },
      handler(req, h) {
        return { message: `Received field1: ${req.payload.field1}` };
      }
    });
  }
});

to the Kibana platform format: plugins/demoplugin/server/plugin.ts

import { schema } from '@kbn/config-schema';
import type { CoreSetup } from 'kibana/server';

export class DemoPlugin {
  public setup(core: CoreSetup) {
    const router = core.http.createRouter();
    router.post(
      {
        path: '/api/demoplugin/search',
        validate: {
          body: schema.object({
            field1: schema.string(),
          }),
        }
      },
      (context, req, res) => {
        return res.ok({
          body: {
            message: `Received field1: ${req.body.field1}`
          }
        });
      }
    )
  }
}

If your plugin still relies on throwing Boom errors from routes, you can use router.handleLegacyErrors as a temporary solution until error migration is complete:

plugins/demoplugin/server/plugin.ts

import { schema } from '@kbn/config-schema';
import { CoreSetup } from 'kibana/server';
import Boom from '@hapi/boom';

export class DemoPlugin {
  public setup(core: CoreSetup) {
    const router = core.http.createRouter();
    router.post(
      {
        path: '/api/demoplugin/search',
        validate: {
          body: schema.object({
            field1: schema.string(),
          }),
        }
      },
      router.handleLegacyErrors((context, req, res) => {
        throw Boom.notFound('not there'); // will be converted into proper Platform error
      })
    )
  }
}

Accessing Services

Services in the Legacy Platform were typically available via methods on either server.plugins.*, server.*, or req.*. In the Kibana Platform, all services are available via the context argument to the route handler. The type of this argument is the RequestHandlerContext. The APIs available here will include all Core services and any services registered by plugins this plugin depends on.

legacy/plugins/demoplugin/index.ts

new kibana.Plugin({
  init(server) {
    const { callWithRequest } = server.plugins.elasticsearch.getCluster('data');

    server.route({
      path: '/api/my-plugin/my-route',
      method: 'POST',
      async handler(req, h) {
        const results = await callWithRequest(req, 'search', query);
        return { results };
      }
    });
  }
});

plugins/demoplugin/server/plugin.ts

export class DemoPlugin {
  public setup(core) {
    const router = core.http.createRouter();
    router.post(
      {
        path: '/api/my-plugin/my-route',
      },
      async (context, req, res) => {
        const results = await context.core.elasticsearch.client.asCurrentUser.search(query);
        return res.ok({
          body: { results }
        });
      }
    )
  }
}
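
Other core services are exposed through the same context. For instance, here is a minimal sketch of a handler reading a saved object with the request-scoped Saved Objects client; schema is assumed to be imported from @kbn/config-schema, and 'my-type' is a hypothetical saved object type.

router.get(
  {
    path: '/api/my-plugin/my-object/{id}',
    validate: {
      params: schema.object({ id: schema.string() }),
    },
  },
  async (context, req, res) => {
    // Saved Objects client scoped to the incoming request
    const soClient = context.core.savedObjects.client;
    const myObject = await soClient.get('my-type', req.params.id);
    return res.ok({ body: myObject.attributes });
  }
);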

Migrating Hapi pre-handlers

In the Legacy Platform, routes could provide a pre option in their config to register a function that should be run before the route handler. These pre handlers allow routes to share some business logic that may do some pre-work or validation. In Kibana, these are often used for license checks.

The Kibana Platform’s HTTP interface does not provide this functionality. However, it is simple enough to port over using a higher-order function that can wrap the route handler.

Simple example

In this simple example, a pre-handler is used to either abort the request with an error or continue as normal. This is a simple gate-keeping pattern.

// Legacy pre-handler
const licensePreRouting = (request) => {
  const licenseInfo = getMyPluginLicenseInfo(request.server.plugins.xpack_main);
  if (!licenseInfo.isOneOf(['gold', 'platinum', 'trial'])) {
    throw Boom.forbidden(`You don't have the right license for MyPlugin!`);
  }
}

server.route({
  method: 'GET',
  path: '/api/my-plugin/do-something',
  config: {
    pre: [{ method: licensePreRouting }]
  },
  handler: (req) => {
    return doSomethingInteresting();
  }
})

In the Kibana Platform, the same functionality can be achieved by creating a function that takes a route handler (or factory for a route handler) as an argument and either successfully invokes it or returns an error response.

This is a higher-order handler, similar to the higher-order component pattern common in the React ecosystem.

// Kibana Platform higher-order handler
const checkLicense = <P, Q, B>(
  handler: RequestHandler<P, Q, B, RouteMethod>
): RequestHandler<P, Q, B, RouteMethod> => {
  return (context, req, res) => {
    const licenseInfo = getMyPluginLicenseInfo(context.licensing.license);

    if (licenseInfo.hasAtLeast('gold')) {
      return handler(context, req, res);
    } else {
      return res.forbidden({ body: `You don't have the right license for MyPlugin!` });
    }
  }
}

router.get(
  { path: '/api/my-plugin/do-something', validate: false },
  checkLicense(async (context, req, res) => {
    const results = doSomethingInteresting();
    return res.ok({ body: results });
  }),
)

Full Example

In some cases, the route handler may need access to data that the pre-handler retrieves. In this case, you can utilize a handler factory rather than a raw handler.

// Legacy pre-handler
const licensePreRouting = (request) => {
  const licenseInfo = getMyPluginLicenseInfo(request.server.plugins.xpack_main);
  if (licenseInfo.isOneOf(['gold', 'platinum', 'trial'])) {
    // In this case, the return value of the pre-handler is made available on
    // whatever the 'assign' option is in the route config.
    return licenseInfo;
  } else {
    // In this case, the route handler is never called and the user gets this
    // error message
    throw Boom.forbidden(`You don't have the right license for MyPlugin!`);
  }
}

server.route({
  method: 'GET',
  path: '/api/my-plugin/do-something',
  config: {
    pre: [{ method: licensePreRouting, assign: 'licenseInfo' }]
  },
  handler: (req) => {
    const licenseInfo = req.pre.licenseInfo;
    return doSomethingInteresting(licenseInfo);
  }
})

In many cases, it may be simpler to duplicate the function call to retrieve the data again in the main handler. In other cases, you can utilize a handler factory rather than a raw handler as the argument to your high-order handler. This way, the high-order handler can pass arbitrary arguments to the route handler.

// Kibana Platform higher-order handler
const checkLicense = <P, Q, B>(
  handlerFactory: (licenseInfo: MyPluginLicenseInfo) => RequestHandler<P, Q, B, RouteMethod>
): RequestHandler<P, Q, B, RouteMethod> => {
  return (context, req, res) => {
    const licenseInfo = getMyPluginLicenseInfo(context.licensing.license);

    if (licenseInfo.hasAtLeast('gold')) {
      const handler = handlerFactory(licenseInfo);
      return handler(context, req, res);
    } else {
      return res.forbidden({ body: `You don't have the right license for MyPlugin!` });
    }
  }
}

router.get(
  { path: '/api/my-plugin/do-something', validate: false },
  checkLicense(licenseInfo => async (context, req, res) => {
    const results = doSomethingInteresting(licenseInfo);
    return res.ok({ body: results });
  }),
)

Chrome

In the Legacy Platform, the ui/chrome import contained APIs for a very wide range of features. In the Kibana Platform, some of these APIs have changed or moved elsewhere. See Core services.

Updating an application navlink

In the legacy platform, the navlink could be updated using chrome.navLinks.update.

uiModules.get('xpack/ml').run(() => {
  const showAppLink = xpackInfo.get('features.ml.showLinks', false);
  const isAvailable = xpackInfo.get('features.ml.isAvailable', false);

  const navLinkUpdates = {
    // hide by default, only show once the xpackInfo is initialized
    hidden: !showAppLink,
    disabled: !showAppLink || (showAppLink && !isAvailable),
  };

  npStart.core.chrome.navLinks.update('ml', navLinkUpdates);
});

In the Kibana Platform, navlinks should not be updated directly. Instead, it is now possible to add an updater when registering an application to change the application or the navlink state at runtime.

// my_plugin has a required dependency on the `licensing` plugin
interface MyPluginSetupDeps {
  licensing: LicensingPluginSetup;
}

export class MyPlugin implements Plugin {
  setup({ application }, { licensing }: MyPluginSetupDeps) {
    const updater$ = licensing.license$.pipe(
      map(license => {
        const { hidden, disabled } = calcStatusFor(license);
        if (hidden) return { navLinkStatus: AppNavLinkStatus.hidden };
        if (disabled) return { navLinkStatus: AppNavLinkStatus.disabled };
        return { navLinkStatus: AppNavLinkStatus.default };
      })
    );

    application.register({
      id: 'my-app',
      title: 'My App',
      updater$,
      async mount(params) {
        const { renderApp } = await import('./application');
        return renderApp(params);
      },
    });
  }
}

Chromeless Applications

In Kibana, a chromeless application is one where the primary Kibana UI components, such as the header or navigation, can be hidden. In the legacy platform, these were referred to as hidden applications and were set via the hidden property in a Kibana plugin. Chromeless applications are also not displayed in the left navbar.

To mark an application as chromeless, specify chromeless: true when registering your application to hide the chrome UI when the application is mounted:

application.register({
  id: 'chromeless',
  chromeless: true,
  async mount(context, params) {
    /* ... */
  },
});

If you wish to render your application at a route that does not follow the /app/${appId} pattern, this can be done via the appRoute property. Doing this currently requires you to register a server route where you can return a bootstrapped HTML page for your application bundle.

application.register({
  id: 'chromeless',
  appRoute: '/chromeless',
  chromeless: true,
  async mount(context, params) {
    /* ... */
  },
});

Render HTML Content

You can return a blank HTML page bootstrapped with the core application bundle from an HTTP route handler via the httpResources service. You may wish to do this if you are rendering a chromeless application with a custom application route or have other custom rendering needs.

httpResources.register(
  { path: '/chromeless', validate: false },
  (context, request, response) => {
    //... some logic
    return response.renderCoreApp();
  }
);

You can also exclude user data from the bundle metadata. User data comprises all user-provided UI Settings that are injected into the page. You may wish to exclude fetching this data if the user is not authorized, or to slim down the page size.

httpResources.register(
  { path: '/', validate: false, options: { authRequired: false } },
  (context, request, response) => {
    //... some logic
    return response.renderAnonymousCoreApp();
  }
);

Saved Objects types

In the legacy platform, saved object types were registered using static definitions in the uiExports part of the plugin manifest.

In the Kibana Platform, all these registrations are performed programmatically during your plugin’s setup phase, using the core savedObjects service’s registerType setup API.

The most notable difference is that in the Kibana Platform, the type registration is performed in a single call to registerType, passing a new SavedObjectsType structure that is a superset of the legacy schema, migrations, mappings, and savedObjectsManagement definitions.

Concrete example

Suppose you have the following in a legacy plugin:

legacy/plugins/demoplugin/index.ts

import mappings from './mappings.json';
import { migrations } from './migrations';

new kibana.Plugin({
  init(server){
    // [...]
  },
  uiExports: {
    mappings,
    migrations,
    savedObjectSchemas: {
      'first-type': {
        isNamespaceAgnostic: true,
      },
      'second-type': {
        isHidden: true,
      },
    },
    savedObjectsManagement: {
      'first-type': {
        isImportableAndExportable: true,
        icon: 'myFirstIcon',
        defaultSearchField: 'title',
        getTitle(obj) {
          return obj.attributes.title;
        },
        getEditUrl(obj) {
          return `/some-url/${encodeURIComponent(obj.id)}`;
        },
      },
      'second-type': {
        isImportableAndExportable: false,
        icon: 'mySecondIcon',
        getTitle(obj) {
          return obj.attributes.myTitleField;
        },
        getInAppUrl(obj) {
          return {
            path: `/some-url/${encodeURIComponent(obj.id)}`,
            uiCapabilitiesPath: 'myPlugin.myType.show',
          };
        },
      },
    },
  },
})

legacy/plugins/demoplugin/mappings.json

{
  "first-type": {
    "properties": {
      "someField": {
        "type": "text"
      },
      "anotherField": {
        "type": "text"
      }
    }
  },
  "second-type": {
    "properties": {
      "textField": {
        "type": "text"
      },
      "boolField": {
        "type": "boolean"
      }
    }
  }
}

legacy/plugins/demoplugin/migrations.js

export const migrations = {
  'first-type': {
    '1.0.0': migrateFirstTypeToV1,
    '2.0.0': migrateFirstTypeToV2,
  },
  'second-type': {
    '1.5.0': migrateSecondTypeToV15,
  }
}

To migrate this, you have to regroup the declarations per type.

First type: plugins/demoplugin/server/saved_objects/first_type.ts

import type { SavedObjectsType } from 'kibana/server';

export const firstType: SavedObjectsType = {
  name: 'first-type',
  hidden: false,
  namespaceType: 'agnostic',
  mappings: {
    properties: {
      someField: {
        type: 'text',
      },
      anotherField: {
        type: 'text',
      },
    },
  },
  migrations: {
    '1.0.0': migrateFirstTypeToV1,
    '2.0.0': migrateFirstTypeToV2,
  },
  management: {
    importableAndExportable: true,
    icon: 'myFirstIcon',
    defaultSearchField: 'title',
    getTitle(obj) {
      return obj.attributes.title;
    },
    getEditUrl(obj) {
      return `/some-url/${encodeURIComponent(obj.id)}`;
    },
  },
};

Second type: plugins/demoplugin/server/saved_objects/second_type.ts

import type { SavedObjectsType } from 'kibana/server';

export const secondType: SavedObjectsType = {
  name: 'second-type',
  hidden: true,
  namespaceType: 'single',
  mappings: {
    properties: {
      textField: {
        type: 'text',
      },
      boolField: {
        type: 'boolean',
      },
    },
  },
  migrations: {
    '1.5.0': migrateSecondTypeToV15,
  },
  management: {
    importableAndExportable: false,
    icon: 'mySecondIcon',
    getTitle(obj) {
      return obj.attributes.myTitleField;
    },
    getInAppUrl(obj) {
      return {
        path: `/some-url/${encodeURIComponent(obj.id)}`,
        uiCapabilitiesPath: 'myPlugin.myType.show',
      };
    },
  },
};

Registration in the plugin’s setup phase: plugins/demoplugin/server/plugin.ts

import { firstType, secondType } from './saved_objects';

export class DemoPlugin implements Plugin {
  setup({ savedObjects }) {
    savedObjects.registerType(firstType);
    savedObjects.registerType(secondType);
  }
}

Changes in structure compared to legacy

The Kibana Platform registerType API’s expected input is very close to the legacy format. However, there are some minor changes:

  • The schema.isNamespaceAgnostic property has been renamed to SavedObjectsType.namespaceType. It no longer accepts a boolean but instead an enum of single, multiple, or agnostic (see SavedObjectsNamespaceType).
  • schema.indexPattern accepted either a string or a (config: LegacyConfig) => string. SavedObjectsType.indexPattern only accepts a string, as you can access the configuration during your plugin’s setup phase.
  • The savedObjectsManagement.isImportableAndExportable property has been renamed to SavedObjectsType.management.importableAndExportable.
  • The migration function signature has changed: In legacy, it used to be
`(doc: SavedObjectUnsanitizedDoc, log: SavedObjectsMigrationLogger) => SavedObjectUnsanitizedDoc;`

In Kibana Platform, it is

`(doc: SavedObjectUnsanitizedDoc, context: SavedObjectMigrationContext) => SavedObjectUnsanitizedDoc;`

With context being:

export interface SavedObjectMigrationContext {
  log: SavedObjectsMigrationLogger;
}

The change is very minor, though. The legacy migration:

const migration = (doc, log) => {...}

Would be converted to:

const migration: SavedObjectMigrationFn<OldAttributes, MigratedAttributes> = (doc, { log }) => {...}

UiSettings

UiSettings defaults registration is performed during the setup phase via the core.uiSettings.register API.

legacy/plugins/demoplugin/index.js

uiExports: {
  uiSettingDefaults: {
    'my-plugin:my-setting': {
      name: 'just-work',
      value: true,
      description: 'make it work',
      category: ['my-category'],
    },
  }
}

plugins/demoplugin/server/plugin.ts

setup(core: CoreSetup){
  core.uiSettings.register({
    'my-plugin:my-setting': {
      name: 'just-work',
      value: true,
      description: 'make it work',
      category: ['my-category'],
      schema: schema.boolean(),
    },
  })
}

Elasticsearch client

The new Elasticsearch client is a thin wrapper around @elastic/elasticsearch’s Client class. Although the API is quite close to that of the legacy client Kibana was previously using, there are some subtle changes to take into account during migration.

Official client documentation

Client API Changes

Refer to the Breaking changes list for more information about the changes between the legacy and new client.

The most significant changes on the Kibana side for the consumers are the following:

User client accessor

The internal and current user client accessors have been renamed and are now properties instead of functions:

  • callAsInternalUser('ping') is now asInternalUser.ping()
  • callAsCurrentUser('ping') is now asCurrentUser.ping()
  • The API now reflects the Client’s methods instead of leveraging the string-based endpoint names the LegacyAPICaller was using.

Before:

const body = await client.callAsInternalUser('indices.get', { index: 'id' });

After:

const { body } = await client.asInternalUser.indices.get({ index: 'id' });

Response object

Calling any ES endpoint now returns the whole response object instead of only the body payload.

Before:

const body = await legacyClient.callAsInternalUser('get', { id: 'id' });

After:

const { body } = await client.asInternalUser.get({ id: 'id' });

Note that more information from the ES response is available:

const {
  body,        // response payload
  statusCode,  // http status code of the response
  headers,     // response headers
  warnings,    // warnings returned from ES
  meta         // meta information about the request, such as request parameters, number of attempts and so on
} = await client.asInternalUser.get({ id: 'id' });

Response Type

All API methods are now generic to allow specifying the response body type.

Before:

const body: GetResponse = await legacyClient.callAsInternalUser('get', { id: 'id' });

After:

// body is of type `GetResponse`
const { body } = await client.asInternalUser.get<GetResponse>({ id: 'id' });
// fallback to `Record<string, any>` if unspecified
const { body } = await client.asInternalUser.get({ id: 'id' });

The new client doesn’t provide exhaustive typings for the response object yet. You might have to copy response type definitions from the legacy Elasticsearch library until they are provided by the new client.

// Kibana provides a few typings for internal purposes
import type { SearchResponse } from 'kibana/server';
type SearchSource = {...};
type SearchBody = SearchResponse<SearchSource>;
const { body } = await client.search<SearchBody>(...);
interface Info {...}
const { body } = await client.info<Info>(...);

Errors

The returned error types changed.

There are no longer specific errors for every HTTP status code (such as BadRequest or NotFound). A generic ResponseError with the specific statusCode is thrown instead.

Before:

import { errors } from 'elasticsearch';
try {
  await legacyClient.callAsInternalUser('ping');
} catch(e) {
  if(e instanceof errors.NotFound) {
    // do something
  }
  if(e.status === 401) {}
}

After:

import { errors } from '@elastic/elasticsearch';
try {
  await client.asInternalUser.ping();
} catch(e) {
  if(e instanceof errors.ResponseError && e.statusCode === 404) {
    // do something
  }
  // also possible, as all errors have a `name` property containing the class name,
  // so this is slightly better in terms of performance
  if(e.name === 'ResponseError' && e.statusCode === 404) {
    // do something
  }
  if(e.statusCode === 401) {...}
}

Parameter naming format

The parameter property names changed from camelCase to snake_case.

Although the JavaScript client technically accepts both formats, the TypeScript definitions only define snake_case properties.

Before:

legacyClient.callAsCurrentUser('get', {
  id: 'id',
  storedFields: ['some', 'fields'],
})

After:

client.asCurrentUser.get({
  id: 'id',
  stored_fields: ['some', 'fields'],
})

Request abortion

The request abortion API changed.

All promises returned from the client API calls now have an abort method that can be used to cancel the request.

Before:

const controller = new AbortController();
legacyClient.callAsCurrentUser('ping', {}, {
  signal: controller.signal,
})
// later
controller.abort();

After:

const request = client.asCurrentUser.ping();
// later
request.abort();

Headers

It is now possible to override headers when performing specific API calls.

Note that doing so is strongly discouraged because of potential side effects on the Elasticsearch service’s internal behavior when scoping requests as the internal or as the current user.

const request = client.asCurrentUser.ping({}, {
  headers: {
    authorization: 'foo',
    custom: 'bar',
  }
});

Functional tests

Functional tests must be migrated to the new client as well.

Before:

const client = getService('legacyEs');

After:

const client = getService('es');
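
The es service exposes the new client API described above, so calls return the whole response object. A minimal usage sketch, assuming the test data lives in a hypothetical my-index index:

const { body } = await client.search({
  index: 'my-index',
  body: {
    query: { match_all: {} },
  },
});
// body.hits contains the matching documents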

Accessing the client from a route handler

Apart from the API format change, accessing the client from within a route handler did not change. As with the legacy client, a preconfigured scoped client bound to the incoming request is accessible via the core context provider:

router.get(
  {
    path: '/my-route',
  },
  async (context, req, res) => {
    const { client } = context.core.elasticsearch;
    // call as current user
    const pingResult = await client.asCurrentUser.ping();
    // call as internal user
    const searchResult = await client.asInternalUser.search(options);
    return res.ok({ body: 'ok' });
  }
);

Accessing the client from a collector’s fetch method

At the moment, the fetch method’s context receives preconfigured scoped clients for Elasticsearch and SavedObjects. To help in the transition, both the legacy (callCluster) and new clients are provided, but we strongly discourage using the deprecated legacy ones for any new implementation.

usageCollection.makeUsageCollector<MyUsage>({
  type: 'my-collector',
  isReady: async () => true, // Logic to confirm the `fetch` method is ready to be called
  schema: {...},
  async fetch(context) {
    const { callCluster, esClient, soClient } = context;

    // Before:
    const result = await callCluster('search', options);

    // After:
    const { body: result } = await esClient.search(options);

    return result;
  }
});

Regarding the soClient, you are encouraged to use it instead of the plugin-owned SavedObjects repository that was used in the past.

Before:

function getUsageCollector(
  usageCollection: UsageCollectionSetup,
  getSavedObjectsRepository: () => ISavedObjectsRepository | undefined
) {
  usageCollection.makeUsageCollector<MyUsage>({
    type: 'my-collector',
    isReady: () => typeof getSavedObjectsRepository() !== 'undefined',
    schema: {...},
    async fetch() {
      const savedObjectsRepository = getSavedObjectsRepository();

      const { attributes: result } = await savedObjectsRepository.get('my-so-type', 'my-so-id');

      return result;
    }
  });
}

After:

function getUsageCollector(usageCollection: UsageCollectionSetup) {
  usageCollection.makeUsageCollector<MyUsage>({
    type: 'my-collector',
    isReady: () => true,
    schema: {...},
    async fetch({ soClient }) {
      const { attributes: result } = await soClient.get('my-so-type', 'my-so-id');

      return result;
    }
  });
}

Creating a custom client

Note that the plugins option is no longer available on the new client. As the API is now exhaustive, adding custom endpoints using plugins should no longer be necessary.

The API to create custom clients did not change much:

Before:

const customClient = coreStart.elasticsearch.legacy.createClient('my-custom-client', customConfig);
// do something with the client, such as
await customClient.callAsInternalUser('ping');
// custom clients are closable
customClient.close();

After:

const customClient = coreStart.elasticsearch.createClient('my-custom-client', customConfig);
// do something with the client, such as
await customClient.asInternalUser.ping();
// custom clients are closable
customClient.close();

If, for any reason, you still need to reach an endpoint not listed on the client API, using transport.request is still possible:

const { body } = await client.asCurrentUser.transport.request({
  method: 'get',
  path: '/my-custom-endpoint',
  body: { my: 'payload'},
  querystring: { param: 'foo' }
})