book-list
More detailed documentation is available here: https://mobica-workshops.gitlab.io/examples/go/gin/book-list/
Requirements
This project needs to be cloned into the ~/go/src/gitlab.com/mobica-workshops/examples/go/gin folder when $GOPATH is set to ~/go.
This project requires the following tools for development:
- Brew - optional
- Docker Engine - required
- Docker Compose - required
- Ansible - required
- Helm - optional
- K3d - optional
- Trivy - optional
- Grype - optional
as well as Go, installed and configured.
Go - static analysis & code style
This project uses the staticcheck tool to run static analysis on the source code.
The tool installation process is described at this link.
The project code style check is based on the gocheckstyle tool.
The tool installation process is described at this link.
The configuration for the gocheckstyle tool is defined in the go_style file.
Both the static analysis and the code style check can be triggered by running the pre-build.sh script.
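For illustration, staticcheck reports issues such as SA4006 (a value assigned but never read); the following is a minimal, hypothetical example of the kind of code it flags, not code from this project:

```go
package main

import "fmt"

func main() {
	x := 1 // staticcheck SA4006: this value is never read...
	x = 2  // ...because it is overwritten here before any use
	fmt.Println(x)
}
```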
Configuration procedure
Prepare configuration:
./local-configure.sh
Because this project is for presentation purposes only, the password is: ThisIsExamplePassword4U
After running this script you should have files in the secret folder, which are used by all the development methods described below.
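If you need to consume these files from Go code, a minimal sketch could look like the following; the secret/db_password file name is an assumption for illustration, so use whichever files local-configure.sh actually produces:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	// Hypothetical file name: adjust to whatever local-configure.sh
	// actually writes into the secret folder.
	raw, err := os.ReadFile("secret/db_password")
	if err != nil {
		fmt.Fprintln(os.Stderr, "cannot read secret:", err)
		os.Exit(1)
	}
	password := strings.TrimSpace(string(raw))
	fmt.Println("secret loaded, length:", len(password)) // never print the value itself
}
```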
Development
Service Development and Continuous Integration
The standard development method requires starting the local Docker Compose stack with the command:
docker-compose -f docker-compose-local.yml up
With all dependent services available, you can use the prepared shell scripts:
- build.sh - to build your microservice
- serve.sh - to serve it
- local-test.sh - to run integration and functional tests (a test sketch follows this list)
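For orientation, a functional test of the kind local-test.sh runs could look like the sketch below; the BASE_URL variable and the /v1/books endpoint are assumptions for illustration, not the project's actual test code:

```go
package booklist_test

import (
	"net/http"
	"os"
	"testing"
)

// TestServiceResponds is a hypothetical functional test: it runs only when
// BASE_URL points at a served instance (e.g. one started with serve.sh).
func TestServiceResponds(t *testing.T) {
	base := os.Getenv("BASE_URL") // assumption: the local service address
	if base == "" {
		t.Skip("BASE_URL not set; start the service with serve.sh first")
	}
	resp, err := http.Get(base + "/v1/books") // endpoint name is an assumption
	if err != nil {
		t.Fatalf("request failed: %v", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		t.Fatalf("expected 200 OK, got %d", resp.StatusCode)
	}
}
```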
Continuous Delivery and end-to-end testing
This environment is most useful for checking the application with tools such as Postman before pushing your work to the repository.
You just need to start it with this command:
docker-compose up
Then you can use the integrated Swagger UI, Swagger Editor, and other tools that consume our OpenAPI files (a sketch of serving these files follows the list):
public/v1/openapi.json
public/v1/openapi.yaml
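As a sketch of how a Gin service can expose these files to such tools (the routes and port here are assumptions, not necessarily what this service uses):

```go
package main

import "github.com/gin-gonic/gin"

func main() {
	r := gin.Default()
	// Serve the OpenAPI documents so Swagger UI / Swagger Editor can load them.
	r.StaticFile("/v1/openapi.json", "./public/v1/openapi.json")
	r.StaticFile("/v1/openapi.yaml", "./public/v1/openapi.yaml")
	r.Run(":8080") // port is an assumption for this sketch
}
```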
Continuous Deployment Testing
Please remember to read the documentation, which shows in detail how to use our k3d integration.
If your k3d and Helm are correctly configured, just run:
./deploy-k3s.sh
Documentation Development
To work with the documentation, which is generated from AsciiDoc files, you need Ruby >= 3.0 installed on your desktop.
To see a locally generated version of this documentation, you need Python >= 3.7 installed on your desktop.
The documentation will be served on port 8880.
The following scripts automate building and serving this documentation:
./build-doc.sh
./serve-doc.sh
Production
Here should be a description of your production procedures.
Readiness checklist
General Rules
- No shared database between different services - a DB instance should only be used by one service exclusively.
- Not breaking the one-hop rule - “By default, a service should not call other services to respond to a request, except in exceptional circumstances.”. The exception can be for example Backend For Frontend (like GraphQL) which can compose and aggregate data on top of other services.
- Prefer APIs to shared SDKs. Try to avoid using SDKs between services; it is not needed.
Documentation
- README.md - self-explanatory service name, how to run the service locally, and a description of its domain/subdomain and bounded context
- Project documentation - if possible, should be kept with the code
- Architecture docs / C4 Model diagrams
- Development docs - a more detailed version of the service development documentation than README.md, used by new developers to start developing the service and by other teams to cooperate with the development team.
- OpenAPI Specification file (openapi.yaml) in the root directory or another location known by everyone
- API versioning - if needed
Testing and Quality
- Linters (with reports that can be exported to e.g. SonarQube)
- Automatic code Formatter or code Format Checkers (e.g. gofmt, ktfmt)
- Test coverage above 70% (use common sense; just reaching the required coverage number is not the goal here - see the test sketch after this list)
- Functional/e2e/acceptance tests in place
- Load Tests (at least basic ones) especially if higher traffic is expected
- Contract Tests are recommended if there is service-to-service communication via HTTP (example: PACT tests)
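For reference, "coverage" here means ordinary unit tests such as the table-driven sketch below; the pluralBooks helper is hypothetical and included only to keep the example self-contained:

```go
package booklist

import (
	"fmt"
	"testing"
)

// pluralBooks is a hypothetical helper, present only so this sketch compiles.
func pluralBooks(n int) string {
	if n == 1 {
		return "1 book"
	}
	return fmt.Sprintf("%d books", n)
}

func TestPluralBooks(t *testing.T) {
	cases := []struct {
		n    int
		want string
	}{
		{0, "0 books"},
		{1, "1 book"},
		{2, "2 books"},
	}
	for _, c := range cases {
		if got := pluralBooks(c.n); got != c.want {
			t.Errorf("pluralBooks(%d) = %q, want %q", c.n, got, c.want)
		}
	}
}
```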
Observability
- Logging in general https://12factor.net/logs
  - All logs are written to STDOUT / STDERR.
  - Logs are written in JSON (see the sketch after this list).
  - No sensitive data is logged.
- Monitoring
  - Integration with monitoring platforms and dashboards in place.
  - Business metrics added to the dashboards.
- Tracing
  - Distributed tracing configured.
  - Error tracking configured.
- Alerts are configured.
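A minimal sketch of JSON logging to STDOUT using Go's standard log/slog package (Go >= 1.21); the field names are illustrative:

```go
package main

import (
	"log/slog"
	"os"
)

func main() {
	// JSON logs on STDOUT, per https://12factor.net/logs; let the platform
	// collect and route them instead of writing to files.
	logger := slog.New(slog.NewJSONHandler(os.Stdout, nil))
	slog.SetDefault(logger)

	// Log identifiers, never secrets or other sensitive data.
	slog.Info("book created", "id", 42, "title", "Example")
}
```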
Operations and Resiliency
- Staging environment exists
- There is autoscaling in place (based on CPU, memory, traffic, events/messages e.g. HPA with K8S)
- Graceful shutdown: The application understands SIGTERM and other signals and will gracefully shut down itself after processing the current task. https://12factor.net/disposability (a combined sketch of graceful shutdown, environment configuration, and health checks follows this list)
- Configuration via environment: All important configuration options are read from the environment and the environment has higher priority over configuration files (but lower than the command line arguments). https://12factor.net/config
- Health Checks: Readiness and Liveness probes
- Define SLO/SLI/SLA
- Build applications with Multi-tenancy in mind (sites, regions, users, etc.)
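A minimal sketch combining three of the items above: configuration via environment, health check endpoints, and graceful shutdown on SIGTERM. The PORT variable and the /healthz and /readyz paths are common conventions assumed for illustration, not necessarily this service's actual setup:

```go
package main

import (
	"context"
	"log"
	"net/http"
	"os"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	// Configuration via environment (https://12factor.net/config);
	// the PORT variable name is an assumption for this sketch.
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}

	mux := http.NewServeMux()
	// Liveness and readiness probes; the paths are common conventions.
	mux.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	mux.HandleFunc("/readyz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK) // report 503 here while dependencies are down
	})

	srv := &http.Server{Addr: ":" + port, Handler: mux}

	// Graceful shutdown on SIGTERM/SIGINT (https://12factor.net/disposability).
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM, syscall.SIGINT)
	defer stop()

	go func() {
		if err := srv.ListenAndServe(); err != nil && err != http.ErrServerClosed {
			log.Fatalf("server error: %v", err)
		}
	}()

	<-ctx.Done() // wait for a termination signal
	shutdownCtx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	if err := srv.Shutdown(shutdownCtx); err != nil {
		log.Printf("graceful shutdown failed: %v", err)
	}
}
```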
Security and Compliance