
9.1 Relational Database Installation (PostgreSQL / MySQL) and Environment Configurations

This phase covers connecting the database: the heart and circulatory system every Spring Boot application needs to survive. Let's proceed with an industry-standard, lightweight containerized deployment strategy.


🐋 1. The Most Elegant DB Deployment leveraging Docker Compose

Tired of reinstalling databases every time you reset your PC? Use Docker. With a single docker-compose.yml file in your project's root directory, anyone on your team can spin up an identical DB environment in seconds with one terminal command: docker compose up -d.

docker-compose.yml (PostgreSQL Example)

version: '3.8'

services:
  postgres:
    image: postgres:15-alpine   # Lightweight Alpine-based PostgreSQL 15 image
    container_name: boot-postgres
    ports:
      - "5432:5432"             # Host port 5432 -> container port 5432
    environment:
      POSTGRES_USER: root
      POSTGRES_PASSWORD: password123!
      POSTGRES_DB: local_db
      TZ: Asia/Seoul            # Align to KST timezone
    volumes:
      - ./postgres-data:/var/lib/postgresql/data   # Persist data on the host so it survives container restarts!
      # Optional: automated schema initialization
      - ./init/schema.sql:/docker-entrypoint-initdb.d/init.sql
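With the file above in place, the whole lifecycle is a handful of commands (illustrative; the service and container names match the compose file above):

```shell
# Start the database in the background
docker compose up -d

# Check container status and follow the startup logs
docker compose ps
docker compose logs -f postgres

# Stop and remove the container; the bind-mounted ./postgres-data
# directory (and therefore your data) stays on the host
docker compose down
```

Because the data lives in ./postgres-data on your machine, `docker compose down` followed by `docker compose up -d` brings the database back with everything intact.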

init/schema.sql (handy for pre-creating core tables before Spring Boot starts)

CREATE TABLE IF NOT EXISTS members (
    id       BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    email    VARCHAR(255) NOT NULL UNIQUE,
    nickname VARCHAR(50)  NOT NULL
);
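To confirm the init script actually ran, you can list the tables directly inside the container (assuming the container_name boot-postgres from the compose file above):

```shell
# List tables in local_db as the root user;
# you should see the members table from init/schema.sql
docker exec -it boot-postgres psql -U root -d local_db -c '\dt'
```

One gotcha: the postgres image only runs docker-entrypoint-initdb.d scripts on the very first startup against an empty data directory. If ./postgres-data already exists, delete it (or run the SQL manually) before expecting the script to fire.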

💽 2. Testing Connections with DBeaver

Once the database container is up, perform a live connection test locally using a SQL client such as DBeaver or DataGrip.

  • Host: localhost
  • Port: 5432
  • Database: local_db
  • Username: root
  • Password: password123!

If the connection succeeds, all that's left is to hand Spring Boot these exact same coordinates!


⚙️ 3. Establishing the Spring Boot DataSource Connection

The application.yml (or application.properties) is the central configuration file read the moment the Spring Boot application boots.

spring:
  datasource:
    # 1. Which exact DB coordinate are we dialing?
    url: jdbc:postgresql://localhost:5432/local_db
    driver-class-name: org.postgresql.Driver
    # 2. Authentication credentials
    username: root
    password: password123!

    # HikariCP connection pool tuning (for heavy traffic)
    hikari:
      maximum-pool-size: 15    # Maximum number of physical connections
      connection-timeout: 3000 # Fail fast if no connection is handed out within 3 seconds (prevents infinite waiting)

  jpa:
    # 3. Print the underlying Hibernate SQL on the console, neatly formatted
    show-sql: true
    properties:
      hibernate:
        format_sql: true
    # [CRITICAL] 4. Should Hibernate automatically rewrite physical DB tables to match Java entity schemas?
    hibernate:
      ddl-auto: update

🚨 4. Pro Tips for Modern Engineering

💡 The Culprit Behind Catastrophes: The Terror of the ddl-auto command

spring.jpa.hibernate.ddl-auto tells Hibernate to analyze your Java @Entity class structure and decide whether to automatically create the matching database tables, alter them, or demolish them outright.
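As a concrete example, a minimal sketch of an entity matching the members table from init/schema.sql might look like this (class and field names are illustrative; uses Jakarta Persistence annotations):

```java
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.Table;

// Maps to the "members" table created by init/schema.sql.
// With ddl-auto: update, adding a new field here triggers an ALTER TABLE at startup.
@Entity
@Table(name = "members")
public class Member {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY) // matches GENERATED ALWAYS AS IDENTITY
    private Long id;

    @Column(nullable = false, unique = true, length = 255)
    private String email;

    @Column(nullable = false, length = 50)
    private String nickname;

    protected Member() {} // JPA requires a no-arg constructor
}
```

This is exactly the blueprint ddl-auto compares against the live tables.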

  • create: Every time the server boots, it DROPS the existing tables (even 1 million rows of users) and recreates them as empty shells.
  • create-drop: Like create, but also drops everything when the server shuts down. (Strictly for local testing.)
  • update: Keeps existing data and only adds (ALTERs) newly mapped columns from Java. (Still risky.)
  • validate: "Does the Java blueprint exactly match the live DB tables? If anything differs, refuse to start the server!" (Pure verification.)
  • none: Disables the mechanism entirely.

[The First Commandment of Operations] Even if you casually use update in local development for convenience, on a live Production / Staging server handling real concurrent traffic you MUST lock this down to strictly validate or none. The moment a production column is inadvertently ALTERed (added or dropped), the database can take a long exclusive table lock while it rewrites a huge table, blocking queries for several grueling minutes and effectively bringing the entire service down!
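One common way to enforce this commandment is per-profile configuration, a sketch (file and profile names follow Spring Boot's standard application-{profile}.yml convention):

```yaml
# application-local.yml -- convenience wins
spring:
  jpa:
    hibernate:
      ddl-auto: update

# application-prod.yml -- safety wins
spring:
  jpa:
    hibernate:
      ddl-auto: validate  # refuse to start if entities and tables disagree
```

Run with `--spring.profiles.active=prod` (or the SPRING_PROFILES_ACTIVE environment variable) and the dangerous setting can never leak into production by accident.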