This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

[Issue HHS#2089] Setup opensearch locally (#39)
Fixes HHS#2089

Set up a search index to run locally via Docker

Updated the Makefile to automatically initialize the index, and added a script
that waits for the index to start up before proceeding.

Set up a very basic client for connecting to the search index (it will be
expanded in subsequent PRs)

Added a basic test and test utilities to verify it is working (these will also be expanded)

This is the first step in getting the search index working locally: it gets
the cluster running and the client connecting, but we aren't doing anything
meaningful with it yet besides tests.

This doesn't yet create an index that we can use, except in the test.
However, if you want to try out the search index, you can go to
http://localhost:5601/app/dev_tools#/console (after running `make init`)
and run queries against the single-node cluster.
https://opensearch.org/docs/latest/getting-started/communicate/#sending-requests-in-dev-tools
provides examples of how to create and use indexes.
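
For example, requests like the following (sketched from the linked guide; the index name `my-first-index` is arbitrary) can be pasted into the Dev Tools console:

```
PUT /my-first-index

PUT /my-first-index/_doc/1
{
  "title": "example document"
}

GET /my-first-index/_search
{
  "query": {
    "match": { "title": "example" }
  }
}
```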
chouinar authored and acouch committed Sep 18, 2024
1 parent 433a06a commit acc5451
Showing 12 changed files with 315 additions and 10 deletions.
15 changes: 14 additions & 1 deletion api/Makefile
@@ -100,7 +100,7 @@ start-debug:
run-logs: start
docker-compose logs --follow --no-color $(APP_NAME)

init: build init-db
init: build init-db init-opensearch

clean-volumes: ## Remove project docker volumes (which includes the DB state)
docker-compose down --volumes
@@ -179,6 +179,19 @@ create-erds: # Create ERD diagrams for our DB schema
setup-postgres-db: ## Does any initial setup necessary for our local database to work
$(PY_RUN_CMD) setup-postgres-db

##################################################
# Opensearch
##################################################

init-opensearch: start-opensearch
# TODO - in subsequent PRs, we'll add more to this command to set up the search index locally

start-opensearch:
docker-compose up --detach opensearch-node
docker-compose up --detach opensearch-dashboards
./bin/wait-for-local-opensearch.sh



##################################################
# Testing
31 changes: 31 additions & 0 deletions api/bin/wait-for-local-opensearch.sh
@@ -0,0 +1,31 @@
#!/bin/bash
# wait-for-local-opensearch

set -e

# Color formatting
RED='\033[0;31m'
NO_COLOR='\033[0m'

MAX_WAIT_TIME=30 # seconds
WAIT_TIME=0

# Curl the healthcheck endpoint of the local opensearch
# until it returns a success response
until curl --output /dev/null --silent http://localhost:9200/_cluster/health;
do
echo "waiting on OpenSearch to initialize..."
sleep 3

WAIT_TIME=$(($WAIT_TIME+3))
if [ $WAIT_TIME -gt $MAX_WAIT_TIME ]
then
echo -e "${RED}ERROR: OpenSearch appears to not be starting up, running \"docker logs opensearch-node\" to troubleshoot.${NO_COLOR}"
docker logs opensearch-node
exit 1
fi
done

echo "OpenSearch is ready after ~${WAIT_TIME} seconds"
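
The script's retry-with-timeout loop is a general pattern. A Python sketch with the health probe injected (names here are illustrative, not part of the codebase) makes it easy to exercise without a running cluster:

```python
import time


def wait_for_healthy(probe, max_wait_seconds=30, interval_seconds=3):
    """Poll `probe` until it returns True, failing after max_wait_seconds.

    `probe` is any zero-argument callable, e.g. one that GETs
    http://localhost:9200/_cluster/health and returns True on success.
    """
    waited = 0
    while not probe():
        if waited >= max_wait_seconds:
            raise TimeoutError(f"service not ready after ~{waited} seconds")
        time.sleep(interval_seconds)
        waited += interval_seconds
    return waited


# Simulate a service that becomes healthy on the third check.
calls = {"count": 0}

def fake_probe():
    calls["count"] += 1
    return calls["count"] >= 3

print(wait_for_healthy(fake_probe, interval_seconds=1))  # prints 2
```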


37 changes: 37 additions & 0 deletions api/docker-compose.yml
@@ -12,6 +12,41 @@ services:
volumes:
- grantsdbdata:/var/lib/postgresql/data

opensearch-node:
image: opensearchproject/opensearch:latest
container_name: opensearch-node
environment:
- cluster.name=opensearch-cluster # Name the cluster
- node.name=opensearch-node # Name the node that will run in this container
- discovery.type=single-node # Run as a single-node cluster (skip multi-node discovery)
- bootstrap.memory_lock=true # Disable JVM heap memory swapping
- "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m" # Set min and max JVM heap sizes to at least 50% of system RAM
- DISABLE_INSTALL_DEMO_CONFIG=true # Prevents execution of bundled demo script which installs demo certificates and security configurations to OpenSearch
- DISABLE_SECURITY_PLUGIN=true # Disables Security plugin
ulimits:
memlock:
soft: -1 # Set memlock to unlimited (no soft or hard limit)
hard: -1
nofile:
soft: 65536 # Maximum number of open files for the opensearch user - set to at least 65536
hard: 65536
volumes:
- opensearch-data:/usr/share/opensearch/data # Creates volume called opensearch-data and mounts it to the container
ports:
- 9200:9200 # REST API
- 9600:9600 # Performance Analyzer

opensearch-dashboards:
image: opensearchproject/opensearch-dashboards:latest
container_name: opensearch-dashboards
ports:
- 5601:5601 # Map host port 5601 to container port 5601
expose:
- "5601" # Expose port 5601 for web access to OpenSearch Dashboards
environment:
- 'OPENSEARCH_HOSTS=["http://opensearch-node:9200"]'
- DISABLE_SECURITY_DASHBOARDS_PLUGIN=true # disables security dashboards plugin in OpenSearch Dashboards

grants-api:
build:
context: .
@@ -28,6 +63,8 @@ services:
- .:/api
depends_on:
- grants-db
- opensearch-node

volumes:
grantsdbdata:
opensearch-data:
9 changes: 9 additions & 0 deletions api/local.env
@@ -59,6 +59,15 @@ DB_SSL_MODE=allow
# could contain sensitive information.
HIDE_SQL_PARAMETER_LOGS=TRUE

############################
# Opensearch Environment Variables
############################

OPENSEARCH_HOST=opensearch-node
OPENSEARCH_PORT=9200
OPENSEARCH_USE_SSL=FALSE
OPENSEARCH_VERIFY_CERTS=FALSE

############################
# AWS Defaults
############################
66 changes: 57 additions & 9 deletions api/poetry.lock

Some generated files are not rendered by default.

7 changes: 7 additions & 0 deletions api/pyproject.toml
@@ -22,6 +22,7 @@ gunicorn = "^22.0.0"
psycopg = { extras = ["binary"], version = "^3.1.10" }
pydantic-settings = "^2.0.3"
flask-cors = "^4.0.0"
opensearch-py = "^2.5.0"

[tool.poetry.group.dev.dependencies]
black = "^23.9.1"
@@ -43,6 +44,12 @@ sadisplay = "0.4.9"
ruff = "^0.4.0"
debugpy = "^1.8.1"
freezegun = "^1.5.0"
# This isn't the latest version of types-requests
# because the latest depends on urllib3 v2, while opensearch-py
# still needs urllib3 v1. This should be temporary: opensearch-py
# has an unreleased change to switch to v2, so once that ships
# we should be able to move back to the latest types-requests.
types-requests = "2.31.0.1"

[build-system]
requires = ["poetry-core>=1.0.0"]
4 changes: 4 additions & 0 deletions api/src/adapters/search/__init__.py
@@ -0,0 +1,4 @@
from src.adapters.search.opensearch_client import SearchClient, get_opensearch_client
from src.adapters.search.opensearch_config import get_opensearch_config

__all__ = ["SearchClient", "get_opensearch_client", "get_opensearch_config"]
36 changes: 36 additions & 0 deletions api/src/adapters/search/opensearch_client.py
@@ -0,0 +1,36 @@
from typing import Any

import opensearchpy

from src.adapters.search.opensearch_config import OpensearchConfig, get_opensearch_config

# More configuration/setup coming in:
# TODO - https://github.com/navapbc/simpler-grants-gov/issues/13

# Alias the OpenSearch client so that it doesn't need to be imported everywhere
# and to make it clear it's a client
SearchClient = opensearchpy.OpenSearch


def get_opensearch_client(
opensearch_config: OpensearchConfig | None = None,
) -> SearchClient:
if opensearch_config is None:
opensearch_config = get_opensearch_config()

# See: https://opensearch.org/docs/latest/clients/python-low-level/ for more details
return opensearchpy.OpenSearch(**_get_connection_parameters(opensearch_config))


def _get_connection_parameters(opensearch_config: OpensearchConfig) -> dict[str, Any]:
# TODO - we'll want to add the AWS connection params here when we set that up
# See: https://opensearch.org/docs/latest/clients/python-low-level/#connecting-to-amazon-opensearch-serverless

return dict(
hosts=[{"host": opensearch_config.host, "port": opensearch_config.port}],
http_compress=True,
use_ssl=opensearch_config.use_ssl,
verify_certs=opensearch_config.verify_certs,
ssl_assert_hostname=False,
ssl_show_warn=False,
)
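
Since `_get_connection_parameters` is a pure translation from config fields to client keyword arguments, the mapping can be checked without opensearch-py or a live cluster. In this sketch a plain dataclass stands in for the real pydantic `OpensearchConfig` (the stand-in class and function name are illustrative):

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class StubOpensearchConfig:
    # Stand-in for the pydantic OpensearchConfig shown above.
    host: str
    port: int
    use_ssl: bool
    verify_certs: bool


def get_connection_parameters(config: StubOpensearchConfig) -> dict[str, Any]:
    # Mirrors _get_connection_parameters: translate our config object
    # into the kwargs the opensearchpy.OpenSearch constructor expects.
    return dict(
        hosts=[{"host": config.host, "port": config.port}],
        http_compress=True,
        use_ssl=config.use_ssl,
        verify_certs=config.verify_certs,
        ssl_assert_hostname=False,
        ssl_show_warn=False,
    )


params = get_connection_parameters(
    StubOpensearchConfig(host="localhost", port=9200, use_ssl=False, verify_certs=False)
)
print(params["hosts"])  # [{'host': 'localhost', 'port': 9200}]
```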
33 changes: 33 additions & 0 deletions api/src/adapters/search/opensearch_config.py
@@ -0,0 +1,33 @@
import logging

from pydantic import Field
from pydantic_settings import SettingsConfigDict

from src.util.env_config import PydanticBaseEnvConfig

logger = logging.getLogger(__name__)


class OpensearchConfig(PydanticBaseEnvConfig):
model_config = SettingsConfigDict(env_prefix="OPENSEARCH_")

host: str # OPENSEARCH_HOST
port: int # OPENSEARCH_PORT
use_ssl: bool = Field(default=True) # OPENSEARCH_USE_SSL
verify_certs: bool = Field(default=True) # OPENSEARCH_VERIFY_CERTS


def get_opensearch_config() -> OpensearchConfig:
opensearch_config = OpensearchConfig()

logger.info(
"Constructed opensearch configuration",
extra={
"host": opensearch_config.host,
"port": opensearch_config.port,
"use_ssl": opensearch_config.use_ssl,
"verify_certs": opensearch_config.verify_certs,
},
)

return opensearch_config
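
`PydanticBaseEnvConfig` itself isn't shown in this diff, but with `env_prefix="OPENSEARCH_"` each field is read from the environment variable formed by upper-casing the field name under that prefix. A stdlib-only approximation of that behavior (illustrative, not the real implementation):

```python
import os


def load_opensearch_config(environ=None) -> dict:
    """Rough approximation of OpensearchConfig: read each field from an
    OPENSEARCH_-prefixed variable, with the same defaults as above."""
    env = os.environ if environ is None else environ

    def as_bool(value: str) -> bool:
        return value.strip().lower() in ("true", "1", "yes")

    return {
        "host": env["OPENSEARCH_HOST"],       # required (no default)
        "port": int(env["OPENSEARCH_PORT"]),  # coerced to int
        "use_ssl": as_bool(env.get("OPENSEARCH_USE_SSL", "TRUE")),
        "verify_certs": as_bool(env.get("OPENSEARCH_VERIFY_CERTS", "TRUE")),
    }


# The values from api/local.env:
print(load_opensearch_config({
    "OPENSEARCH_HOST": "opensearch-node",
    "OPENSEARCH_PORT": "9200",
    "OPENSEARCH_USE_SSL": "FALSE",
    "OPENSEARCH_VERIFY_CERTS": "FALSE",
}))
```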
29 changes: 29 additions & 0 deletions api/tests/conftest.py
@@ -11,6 +11,7 @@
import src.adapters.db as db
import src.app as app_entry
import tests.src.db.models.factories as factories
from src.adapters import search
from src.constants.schema import Schemas
from src.db import models
from src.db.models.lookup.sync_lookup_values import sync_lookup_values
@@ -143,6 +144,34 @@ def test_foreign_schema(db_schema_prefix):
return f"{db_schema_prefix}{Schemas.LEGACY}"


####################
# Opensearch Fixtures
####################


@pytest.fixture(scope="session")
def search_client() -> search.SearchClient:
return search.get_opensearch_client()


@pytest.fixture(scope="session")
def opportunity_index(search_client):
# TODO - will adjust this in the future to use utils we'll build
# for setting up / aliasing indexes. For now, keep it simple

# create a random index name just to make sure it won't ever conflict
# with an actual one, similar to how we create schemas for database tests
index_name = f"test_{uuid.uuid4().int}_opportunity"

search_client.indices.create(index_name, body={})

try:
yield index_name
finally:
# Try to clean up the index at the end
search_client.indices.delete(index_name)
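
The `opportunity_index` fixture combines two ideas worth seeing in isolation: a randomized index name so runs can't collide, and cleanup in `finally` so the index is deleted even when a test fails. A self-contained sketch, with a stub standing in for `search_client.indices` (the stub is illustrative, not the real client):

```python
import contextlib
import uuid


class StubIndices:
    """Records create/delete calls; stands in for search_client.indices."""

    def __init__(self):
        self.created = []
        self.deleted = []

    def create(self, index_name, body=None):
        self.created.append(index_name)

    def delete(self, index_name):
        self.deleted.append(index_name)


@contextlib.contextmanager
def temporary_index(indices):
    # Same shape as the pytest fixture: a random name so parallel test
    # runs can't collide, and delete in `finally` so cleanup happens
    # even if the test body raises.
    index_name = f"test_{uuid.uuid4().int}_opportunity"
    indices.create(index_name, body={})
    try:
        yield index_name
    finally:
        indices.delete(index_name)


indices = StubIndices()
with temporary_index(indices) as name:
    assert name.startswith("test_") and name.endswith("_opportunity")
assert indices.created == indices.deleted == [name]
```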


####################
# Test App & Client
####################
Empty file.

