Creating gRPC Server Based on SQL
Overview
The "Create gRPC server based on SQL" solution provides a complete gRPC backend service workflow from development to deployment. Its core feature: developers only need to connect to a database to automatically generate standardized CRUD API interfaces, without writing any Go code, achieving "low-code development" of gRPC services.
Tips
Another gRPC service development solution, "Create gRPC server based on protobuf" (see Based on Protobuf), allows using either the built-in or a custom ORM component, while this solution only uses the built-in ORM component. This is the biggest difference between the two solutions.
Applicable Scenarios: Suitable for gRPC service projects primarily based on standardized CRUD APIs.
Prerequisites
Note
Sponge supports database types: mysql, mongodb, postgresql, sqlite
The following operations use mysql as an example to introduce the gRPC service development steps. The steps for the other database types are the same.
Environment Requirements:
- sponge is installed
- mysql database service
- database table structure
Note
Generating code requires a mysql service and database tables. You can start a mysql service using the docker script and then import the example SQL.
Create gRPC Server
Execute the command `sponge run` in the terminal to enter the code generation UI:
- Click the left menu bar [SQL] → [Create gRPC Server];
- Select the database `mysql`, fill in the `Database DSN`, click the `Get Table Names` button, and select table names (multiple selections allowed);
- Fill in the other parameters. Hover the mouse over the question mark `?` to view parameter explanations.
Tips
If you enable the `Large Repository Type` option when filling in parameters, this setting must be kept for all subsequent related code generation.
After filling in the parameters, click the `Download Code` button to generate the complete gRPC service project code, as shown in the figure below:

Equivalent Command
sponge micro rpc --module-name=user --server-name=user --project-name=edusys --db-driver=mysql --db-dsn="root:123456@(192.168.3.37:3306)/school" --db-table=teacher
Tips
The directory name of the generated gRPC service code defaults to the format `service-name-type-time`; you can rename it as needed.
Sponge automatically records each successful code generation. The next time you generate code, if the `Database DSN` is unchanged, the available table list is loaded automatically after the page is refreshed or reopened, without clicking the `Get Table Names` button again.
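The `--db-dsn` value shown in the equivalent command follows the Go MySQL driver's DSN shape, `user:password@(host:port)/dbname`. As a purely illustrative sketch (this parser and its regex are not part of sponge), the parts can be picked apart like this:

```go
package main

import (
	"fmt"
	"regexp"
)

// parseDSN splits a DSN of the form "user:password@(host:port)/dbname"
// into its parts. Illustrative only; sponge uses the MySQL driver's own
// DSN handling, not this regex.
func parseDSN(dsn string) (user, pass, addr, db string, ok bool) {
	re := regexp.MustCompile(`^([^:]+):([^@]*)@\(([^)]+)\)/(.+)$`)
	m := re.FindStringSubmatch(dsn)
	if m == nil {
		return "", "", "", "", false
	}
	return m[1], m[2], m[3], m[4], true
}

func main() {
	user, _, addr, db, ok := parseDSN("root:123456@(192.168.3.37:3306)/school")
	fmt.Println(ok, user, addr, db) // true root 192.168.3.37:3306 school
}
```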
Directory Structure
.
├─ api
│ ├─ types
│ └─ user
│ └─ v1
├─ cmd
│ └─ user
│ ├─ initial
│ └─ main.go
├─ configs
├─ deployments
│ ├─ binary
│ ├─ docker-compose
│ └─ kubernetes
├─ docs
├─ internal
│ ├─ cache
│ ├─ config
│ ├─ dao
│ ├─ ecode
│ ├─ model
│ ├─ server
│ └─ service
├─ scripts
└─ third_party
Code Structure Diagram
The created gRPC service code adopts the classic "Egg Model" architecture:

Code Call Chain Explanation
The gRPC service code generated by sponge adopts a layered architecture. The complete call chain is as follows:
cmd/user/main.go → internal/server/grpc.go → internal/service → internal/dao → internal/model
The service layer is primarily responsible for API handling. If more complex business logic is needed, it is recommended to add a dedicated business logic layer (e.g., `internal/biz`) between the service and dao layers. For details, see the Layered Code Architecture section.
Test gRPC Service API
Unzip the code file, open a terminal, navigate to the gRPC service code directory, and execute the commands:
# Generate and merge api related code
make proto
# Compile and run the service
make run
`make proto` Command Explanation
Usage recommendation: run this command only when the API definitions in the proto files have changed; otherwise skip it and run `make run` directly.
This command performs the following automated operations in the background:
- Generates `*.pb.go` files
- Generates error code definitions
- Generates gRPC client test code
- Generates API template code
- Automatically merges the API template code
Safety Mechanism
- Code merging preserves existing business logic.
- Before each merge, the code is automatically backed up to the following locations:
  - Linux/Mac: `/tmp/sponge_merge_backup_code`
  - Windows: `C:\Users\[username]\AppData\Local\Temp\sponge_merge_backup_code`
Testing Method One: IDE (Recommended)
1. Open the project: load it in an IDE such as `Goland` or `VSCode`.
2. Execute the tests:
   - Go to the `internal/service` directory and open a file with the `_client_test.go` suffix. The file contains test and benchmark functions for each API defined in the proto file.
   - Modify the request parameters (similar to testing in Swagger UI).
   - Run the tests through the IDE, as shown in the figure below:
Testing Method Two: Command Line
1. Navigate to the directory:
cd internal/service
2. Modify the parameters: open the `xxx_client_test.go` file and fill in the request parameters for the gRPC API.
3. Execute the test:
go test -run "TestFunctionName/gRPCMethodName"
Example:
go test -run "Test_service_teacher_methods/GetByID"
Add CRUD API
If you need to generate CRUD API code for new mysql tables, the CRUD API generated by sponge can be seamlessly added to the gRPC service code without writing any Go code.
Generate Service CRUD Code
- Click the left menu bar [Public] → [Generate Service CRUD Code];
- Select the database `mysql`, fill in the `Database DSN`, click the `Get Table Names` button, and select MySQL tables (multiple selections allowed);
- Fill in the other parameters.
After filling in the parameters, click the `Download Code` button to generate the service CRUD code, as shown in the figure below:

Equivalent Command
# Full command
sponge micro service --module-name=user --db-driver=mysql --db-dsn="root:123456@(192.168.3.37:3306)/school" --db-table=course,teach
# Simplified command (use --out to specify the service code directory, generated code is automatically merged into the specified service directory)
sponge micro service --db-driver=mysql --db-dsn="root:123456@(192.168.3.37:3306)/school" --db-table=course,teach --out=user
Code Directory Structure
The directory structure for the generated service CRUD code is as follows:
.
├─ api
│ └─ user
│ └─ v1
└─ internal
├─ cache
├─ dao
├─ ecode
├─ model
└─ service
Test CRUD API
Unzip the code, move the `internal` and `api` directories into the gRPC service code directory, and execute the following commands:
# Generate and merge api related code
make proto
# Compile and run the service
make run
To test the CRUD API, please refer to the section above: Test gRPC Service API.
Tips
The `List` interface in the CRUD API supports powerful custom conditional pagination queries. Click to view the detailed usage rules: Custom Conditional Pagination Query.
Develop Custom API
Projects usually have custom APIs in addition to standardized CRUD APIs. sponge adopts a "definition generates code" development model, allowing rapid development of custom APIs, which mainly involves the following three steps:
1. Define the API: declare the request/response structures of the API in the `.proto` file.
2. Write the business logic: fill in the core business logic in the generated code template; there is no need to hand-write the complete gRPC server/client code.
3. Test and verify: the test code is generated automatically and can be run directly, with no need for third-party gRPC client tools such as Postman.
The following takes adding a "Login" API as an example to explain the development process in detail.
1. Define API
Go to the project directory `api/user/v1`, edit the `teacher.proto` file, and add the definition of the Login API:
service teacher {
  // ...

  // Login, describe the specific implementation logic here to tell sponge's built-in AI assistant how to generate the business logic code
  rpc Login(LoginRequest) returns (LoginReply) {}
}

message LoginRequest {
  string email = 1 [(validate.rules).string.email = true];
  string password = 2 [(validate.rules).string.min_len = 6];
}

message LoginReply {
  uint64 id = 1;
  string token = 2;
}
After adding the API description information, execute the command in the terminal:
# Generate and merge api related code
make proto
2. Implement Business Logic
There are two ways to implement business logic code:
Manually write business logic code
Open the code file `internal/service/teacher.go` and fill in the specific business logic in the `Login` method.
Note
When developing custom APIs, you may need to perform database CRUD operations or cache operations; you can reuse the code already generated in the `internal/dao` and `internal/cache` directories.
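As a minimal sketch of what the `Login` method body might do, the snippet below checks a password and signs a token. All names and the HMAC scheme here are illustrative stand-ins; a real implementation would verify a stored bcrypt hash via the dao layer and issue a proper JWT.

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"errors"
	"fmt"
)

// secretKey is an assumption for this sketch; in a real service it
// would come from configuration, not a hard-coded value.
var secretKey = []byte("replace-with-a-real-secret")

// checkPassword is a stand-in for comparing the given password against
// a stored hash loaded through the dao layer.
func checkPassword(stored, given string) error {
	if stored != given {
		return errors.New("invalid email or password")
	}
	return nil
}

// signToken derives a deterministic HMAC-SHA256 token from the user id
// and email; a real Login would typically issue a JWT instead.
func signToken(id uint64, email string) string {
	mac := hmac.New(sha256.New, secretKey)
	fmt.Fprintf(mac, "%d:%s", id, email)
	return hex.EncodeToString(mac.Sum(nil))
}

func main() {
	if err := checkPassword("123456", "123456"); err != nil {
		panic(err)
	}
	token := signToken(1, "teacher@example.com")
	fmt.Println(len(token)) // 64: a SHA-256 digest rendered as hex
}
```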
Automatically generate business logic code
Sponge provides a built-in AI assistant that can generate business logic code; see the AI Assistant Generates Code section.
Note that the business logic generated by the AI assistant may not fully meet actual requirements and should be adjusted to fit your situation.
3. Test Custom API
After implementing the business logic code, execute the command in the terminal:
# Compile and run the service
make run
To test the custom API, please refer to the section above: Test gRPC Service API.
Cross-Service gRPC API Call
In a microservice architecture, the current service may need to call APIs provided by other gRPC services. These target services may be implemented in different languages but must use the Protocol Buffers protocol. The following is a complete description of the calling process:
Generate gRPC Service Connection Code
Operation Steps:
- Access the sponge UI interface.
- Navigate to [Public] → [Generate gRPC Service Connection Code].
- Fill in parameters:
- Module Name (Required)
- gRPC Service Names (Supports multiple services, separated by commas)
Click the button [Download Code] to generate the gRPC service connection code, as shown in the figure below:

Equivalent Command
# Full command
sponge micro rpc-conn --module-name=user --rpc-server-name=community
# Simplified command (use --out to specify the service code directory, generated code is automatically merged into the specified service directory)
sponge micro rpc-conn --rpc-server-name=community --out=edusys
Generated Code Structure:
.
└─ internal
└─ rpcclient # Includes complete client configuration for service discovery, load balancing, etc.
Unzip the code and move the `internal` directory into the current service's code directory.
In actual use, you may need to call APIs of multiple gRPC services to obtain data. An initialization example of the generated gRPC connection code is as follows:
package service

import (
	userV1 "edusys/api/user/v1"
	relationV1 "edusys/api/relation/v1"
	creationV1 "edusys/api/creation/v1"
	// ......
)

type user struct {
	userV1.UnimplementedTeacherServer

	relationCli relationV1.RelationClient
	creationCli creationV1.CreationClient
}

// NewUserServer creates a server that holds clients for multiple gRPC services
func NewUserServer() edusysV1.UserLogicer {
	return &user{
		// Instantiate the client interfaces of multiple gRPC services
		relationCli: relationV1.NewRelationClient(rpcclient.GetRelationRPCConn()),
		creationCli: creationV1.NewCreationClient(rpcclient.GetCreationRPCConn()),
	}
}

// ......
Configure Target gRPC Service Connection Parameters
Add the following configuration to the configuration file `configs/service-name.yml`:
grpcClient:
  - name: "user"                # grpc service name
    host: "127.0.0.1"           # grpc service address; ignored if service discovery is enabled
    port: 8282                  # grpc service port; ignored if service discovery is enabled
    registryDiscoveryType: ""   # service discovery, disabled by default; supports consul, etcd, nacos
Multiple Service Configuration Example:
grpcClient:
  - name: "user"
    host: "127.0.0.1"
    port: 18282
    registryDiscoveryType: ""
  - name: "relation"
    host: "127.0.0.1"
    port: 28282
    registryDiscoveryType: ""
  - name: "creation"
    host: "127.0.0.1"
    port: 38282
    registryDiscoveryType: ""
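Conceptually, the generated connection code selects an entry from this `grpcClient` list by service name. A simplified sketch, assuming hypothetical types (sponge's generated config package defines the real ones):

```go
package main

import "fmt"

// GRPCClientCfg mirrors one entry of the grpcClient list above.
// Illustrative only; the real struct lives in the generated config package.
type GRPCClientCfg struct {
	Name                  string
	Host                  string
	Port                  int
	RegistryDiscoveryType string
}

// findClientCfg returns the config entry matching the target service name.
func findClientCfg(cfgs []GRPCClientCfg, name string) (GRPCClientCfg, bool) {
	for _, c := range cfgs {
		if c.Name == name {
			return c, true
		}
	}
	return GRPCClientCfg{}, false
}

func main() {
	cfgs := []GRPCClientCfg{
		{Name: "user", Host: "127.0.0.1", Port: 18282},
		{Name: "relation", Host: "127.0.0.1", Port: 28282},
		{Name: "creation", Host: "127.0.0.1", Port: 38282},
	}
	if c, ok := findClientCfg(cfgs, "relation"); ok {
		fmt.Printf("%s:%d\n", c.Host, c.Port) // prints 127.0.0.1:28282
	}
}
```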
Tips
For complete configuration options, refer to the description of the `grpcClient` field in the configuration file `configs/[service name].yml`.
Call Target gRPC Service API
After connecting to the target gRPC service, you need to import the Go stub code generated from its proto files in order to know which APIs can be called. Depending on the service architecture, there are two main situations:
Mono-repo Architecture
If the target gRPC service was created by sponge and belongs to the same microservice mono-repository as the current service (the `Large Repository` type was selected when creating the service), you can call its APIs directly without cross-repository dependency issues.
Multi-repo Architecture
If the target gRPC service lives in an independent code repository, you need to solve the problem of referencing its proto files and Go stub code across repositories. The following are two common solutions.
Solution One: Use a Public Protobuf Repository
For a multi-repo microservice system, it is recommended to create a dedicated public Git repository (e.g., `public_protobuf`) to centrally manage proto files and their generated Go stub code. A typical directory structure is as follows:
.
├── api
│   ├── serverName1
│   │   └── v1
│   │       ├── serverName1.pb.go
│   │       └── serverName1_grpc.pb.go
│   └── serverName2
│       └── v1
│           ├── serverName2.pb.go
│           └── serverName2_grpc.pb.go
├── protobuf
│   ├── serverName1
│   │   └── serverName1.proto
│   └── serverName2
│       └── serverName2.proto
├── go.mod
└── go.sum
Calling Steps:
1. Copy the `protobuf` directory from the public repository into the `third_party` directory of your local service:
.
├── third_party
│   └── protobuf
│       ├── serviceA
│       │   └── serviceA.proto
│       └── serviceB
│           └── serviceB.proto
2. Import the target proto in your local proto file via `import` (e.g., `import "protobuf/serviceA/serviceA.proto";`).
3. Call the target gRPC service's API in your local service code.
Note
Ensure that the proto files under `third_party/protobuf` stay synchronized with the public repository.
Solution Two: Copy the Target Service's Proto Files into This Service and Generate Go Stub Code
Different flows apply depending on how the target service was created:
Service not created by sponge
- Manually copy the target service's proto files into the `api/target-service-name/v1` directory.
- Manually adjust the `go_package` path defined in the proto files.
Service created by sponge
Integration is completed via automated commands:
# Copy the target service's proto files (supports multiple service directories, separated by commas)
make copy-proto SERVER=../target_service_dir
# Generate Go stub code
make proto
Advanced Options:
- Specify proto files: `PROTO_FILE=file1,file2`
- Automatic backup: overwritten files can be found in `/tmp/sponge_copy_backup_proto_files`
Test Cross-Service gRPC API Call
1. Start the dependent services:
   - gRPC services created by sponge: execute `make run`
   - Other gRPC services: start them according to their actual startup commands
2. Start the current service by executing the commands:
# Generate and merge api related code
make proto
# Compile and run the service
make run
To test the cross-service gRPC API call, please refer to the section above: Test gRPC Service API.
Service Configuration Explanation
The gRPC service created by sponge provides a rich set of configurable components. You can flexibly manage these components by modifying the `configs/service-name.yml` configuration file.
Component Management Explanation
You can add or replace custom gRPC interceptors in `internal/server/grpc.go`.
Default Enabled Components
| Component | Function Description | Configuration Documentation |
|---|---|---|
| logger | Logging component • terminal output by default • supports console/json formats • supports log file rotation and retention | Logger Configuration |
| enableMetrics | Prometheus metrics collection • default route `/metrics` | Monitoring Configuration |
| enableStat | Resource monitoring • records CPU/memory usage every minute • automatically saves a profile when a threshold is exceeded | Resource Statistics |
| database | Database support • MySQL/MongoDB/PostgreSQL/SQLite | gorm Configuration, mongodb Configuration |
Default Disabled Components
| Component | Function Description | Configuration Documentation |
|---|---|---|
| cacheType | Cache support (Redis/in-memory) | redis Configuration |
| enableHTTPProfile | Performance analysis (pprof) • default route `/debug/pprof/` | - |
| enableLimit | Adaptive request rate limiting | Rate Limiting Configuration |
| enableCircuitBreaker | Circuit breaker protection | Circuit Breaker Configuration |
| enableTrace | Distributed tracing | Tracing Configuration |
| registryDiscoveryType | Service registration and discovery • Consul/Etcd/Nacos | Service Registration and Discovery Configuration |
Configuration Update Process
If you add fields or change field names in the configuration file `configs/service-name.yml`, you need to update the corresponding Go code by executing the following command in the terminal:
# Regenerate configuration code
make config
Note
If you only modify field values in the configuration file, you do not need to run `make config`; simply recompile and run.
Tips
To learn more about components and configuration details, click to view the Components and Configuration section.