Content Sources
What is it?
Content Sources is an application for storing information about external content (currently YUM repositories) in a central location as well as creating snapshots of those repositories, backed by a Pulp server.
To read more about Content Sources use cases, see:
- Introspection
- Snapshots
Developing
Requirements:
- podman & podman-compose installed, or docker & docker-compose installed (and docker (Orbstack for mac) running)
- This is used to start a set of containers that are dependencies for content-sources-backend
- yaml2json tool installed (pip install json2yaml).
Create your configuration
Create a config file from the example:
cp ./configs/config.yaml.example ./configs/config.yaml
Add pulp.content to /etc/hosts for integration tests and client access
echo "127.0.0.1 pulp.content" | sudo tee -a /etc/hosts
Start dependency containers
make compose-up
Import Public Repos
make repos-import
For local development, if you want fewer Red Hat repos, try:
OPTIONS_REPOSITORY_IMPORT_FILTER=small make repos-import
Run the server!
make run
Hit the API:
curl -H "$( ./scripts/header.sh 9999 1111 )" http://localhost:8000/api/content-sources/v1.0/repositories/
Stop dependency containers
When it's time to shut down the running containers:
make compose-down
To clean up the volumes they use (this stops the containers first if they are running):
make compose-clean
There are other make rules that could be helpful; run make help
to list them. Some are highlighted below.
Playwright testing
- Ensure that the backend server is running
- Ensure the correct Node version is installed and in use: cd _playwright-tests, then nvm use
- Copy the example env file to _playwright-tests/.env. For local development, only BASE_URL (http://127.0.0.1:8000) is required, and it is already set in the example config.
make playwright
OR
cd _playwright-tests \
&& yarn install \
&& yarn playwright install \
&& yarn playwright test
HOW TO ADD NEW MIGRATION FILES
You can add new migration files, with a date prefix attached to the file name, by running:
go run cmd/dbmigrate/main.go new <name of migration>
Database Commands
Migrate the Database
make db-migrate-up
Get an interactive shell:
make db-shell
Or open a postgres client directly by running:
make db-cli-connect
Kafka commands
You can open an interactive shell with:
make kafka-shell
You can run kafka-console-consumer.sh for a given KAFKA_TOPIC:
make kafka-topic-consume KAFKA_TOPIC=my-kafka-topic
make kafka-topic-consume # Uses the first topic in the KAFKA_TOPICS list
Start / Stop prometheus
Create the prometheus configuration, starting from the example one. Update the configs/prometheus.yaml
file to set your hostname instead of localhost at scrape_configs.job_name.targets:
# Note that the targets object cannot reference localhost, it needs the name of your host where
# the prometheus container is executed.
cat ./configs/prometheus.example.yaml | sed "s/localhost/$(hostname)/g" > ./configs/prometheus.yaml
To start prometheus run:
make prometheus-up
To stop prometheus container run:
make prometheus-down
To open the prometheus web UI, once the container is up, run the below:
make prometheus-ui
Start / Stop the RBAC mock
Configuration requirements
- To use this you need to enable RBAC in the configs/config.yaml file:
clients:
  rbac_enabled: True
  rbac_base_url: http://localhost:8800/api/rbac/v1
  rbac_timeout: 30
mocks:
  rbac:
    user_read_write: ["jdoe@example.com", "jdoe"]
    user_read: ["tdoe@example.com", "tdoe"]
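As a sketch of how the mock might map those two user lists to access levels (illustrative only; the service's actual lookup logic may differ):

```go
package main

import "fmt"

// These lists mirror the mocks.rbac section of the config above.
var (
	userReadWrite = []string{"jdoe@example.com", "jdoe"}
	userRead      = []string{"tdoe@example.com", "tdoe"}
)

// accessFor returns the access level the mock would grant a user:
// read-write, read, or none.
func accessFor(user string) string {
	for _, u := range userReadWrite {
		if u == user {
			return "read-write"
		}
	}
	for _, u := range userRead {
		if u == user {
			return "read"
		}
	}
	return "none"
}

func main() {
	fmt.Println(accessFor("jdoe@example.com")) // read-write
	fmt.Println(accessFor("tdoe"))             // read
}
```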
Running it
- Run the application with make run, or directly with ./release/content-sources api consumer instrumentation mock_rbac.
- Make some requests using ./scripts/header.sh 12345 jdoe@example.com
for the admin user, or ./scripts/header.sh 12345 tdoe@example.com
for the viewer.
The RBAC mock service is started automatically by make run. When running the service directly, add the mock_rbac option: ./release/content-sources api consumer instrumentation mock_rbac
Migrate your database (and seed it if desired)
make db-migrate-up
Run the server!
make run
Hit the API:
curl -H "$( ./scripts/header.sh 9999 1111 )" http://localhost:8000/api/content-sources/v1.0/repositories/
Generating new openapi docs:
make openapi
Generating new mocks:
make mock
Live Reloading Server
This is a completely optional way of running the server that is useful for local development. It rebuilds the project after every change you make, so you always have the most up-to-date server running.
To set this up, all you need to do is install the "Air" go tool. The recommended way is:
go install github.com/air-verse/air@latest
After that, just run air; it will automatically use the config defined for this project (.air.toml).
air
Configuration
The default configuration file in ./configs/config.yaml.example shows all available config options. Any of these can be overridden with an environment variable. For example "database.name" can be passed in via an environment variable named "DATABASE_NAME".
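The key-to-variable mapping described above can be sketched as follows (assuming the usual uppercase-and-underscore convention the database.name example implies):

```go
package main

import (
	"fmt"
	"strings"
)

// envVarFor converts a dotted config key into the name of the
// environment variable that overrides it, e.g.
// database.name -> DATABASE_NAME.
func envVarFor(key string) string {
	return strings.ToUpper(strings.ReplaceAll(key, ".", "_"))
}

func main() {
	fmt.Println(envVarFor("database.name"))         // DATABASE_NAME
	fmt.Println(envVarFor("clients.rbac_base_url")) // CLIENTS_RBAC_BASE_URL
}
```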
Linting
To use golangci-lint:
make install-golangci-lint
make lint
To use the pre-commit linter: make install-pre-commit
Code Layout
| Path | Description |
| ---------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| api | Openapi docs and doc generation code |
| db/migrations | Database Migrations |
| pkg/api | API Structures that are used for handling data within our API Handlers |
| pkg/config | Config loading and application bootstrapping code |
| pkg/dao | Database Access Object. Abstraction layer that provides an interface and implements it for our default database provider (postgresql). It is separated out for abstraction and easier testing |
| pkg/db | Database connection and migration related code |
| pkg/handler | Methods that directly handle API requests |
| pkg/middleware | Holds all the middleware components created for the service |
| pkg/event | Event message logic. More info here |
| pkg/models | Structs that represent database models (Gorm) |
| pkg/seeds | Code to help seed the database for both development and testing |
| pkg/candlepin_client | Candlepin client |
| pkg/pulp_client | Pulp client |
| pkg/tasks | Tasking system. More info here |
| scripts | Helper scripts for identity header generation and testing |
More info