Fine-Tune Model

Fine-tunes a model with the specified training data set.

Options 

  • Account Name is a text expression for the name of the AI account to use. In the current file, use the Configure AI Account script step to set up the account and assign it this name any time before this script step runs.

  • Base Model is a text expression for the text generation model to fine-tune. For supported models, see FileMaker technical specifications.

  • Training Data specifies the source of training data for fine-tuning:

    • Table: Uses data from the specified table occurrence based on the current found set of records. For training, data from Completion Field is used as the assistant response; data from all other fields is used as the corresponding user prompt.

    • File: Uses a JSONL file containing training data. Each line must be a valid JSON object containing a user prompt and its corresponding assistant response.

  • Response Target specifies the field or variable where the model provider returns a JSON object with information about the fine-tuning job started by this script step.

  • Fine-Tune Parameters is a text expression for a JSON object that consists of key-value pairs for fine-tuning options that are supported by the model provider.

Options available only when Training Data is Table:

  • The value for Table specifies a table occurrence to use for training data.

  • Completion Field specifies the field in Table containing expected assistant responses or answers for training.

Options available only when Training Data is File:

  • The value for File specifies a list of one or more paths for the JSONL file containing training data. Paths must use one of the file path prefixes. The script step searches the list and sends the first JSONL file it successfully locates. See Creating file paths.

Compatibility 

Product Supported
FileMaker Pro Yes
FileMaker Go Yes
FileMaker WebDirect Yes
FileMaker Server Yes
FileMaker Cloud Yes
FileMaker Data API Yes
Custom Web Publishing Yes

Originated in version 

22.0

Description 

This script step sends training data to a supported model provider to create a fine-tuned version of a base AI model using low-rank adaptation (LoRA) techniques. Fine-tuning allows you to customize a model's behavior for specific tasks, domains, or response styles by training it on your own data while preserving the base model's general capabilities. For example, you can fine-tune a model to respond better using your company's specific terminology, writing style, or domain expertise. LoRA is an efficient method that adds small trainable parameters to the model without modifying the original weights. This approach requires significantly less computational resources and memory compared to full model fine-tuning while achieving comparable performance improvements.

The AI account specified by Account Name must be configured for one of the following model providers:

  • OpenAI

  • The AI Model Server provided with FileMaker Server on an Apple silicon Mac

Other model providers and operating systems aren't supported for fine-tuning. See FileMaker technical specifications.

Training Data

Training Data can be provided either from a FileMaker table or from a JSON Lines (JSONL) file. Each training example consists of a user prompt and the desired assistant response. The model is then able to generate responses similar to your training examples when given similar prompts.

Training Data from Does this

Table

For the specified table occurrence, sends data from Completion Field as the assistant response and data from all other fields as the corresponding user prompt. For each record in the current found set (or for each related record if specifying a related table), the script step creates a JSON object in the following format, then sends them all as a JSONL file to the model provider.

Copy
{
  "messages" :
  [
    {
      "content" : "<FieldName1>=<Data1>, <FieldName2>=<Data2>, ... ",
      "role" : "user"
    },
    {
      "content" : "<Completion_Field_Data>",
      "role" : "assistant"
    }
  ]
}

Note  This option doesn't allow you to specify which fields are sent in the user prompt. To specify the fields to include in the user prompt, use the Save Records as JSONL script step to create a JSONL file, then use the File option in this script step to send that file as the training data.
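For illustration only, the per-record conversion can be sketched in Python. The field names and data below are hypothetical, and the script step performs this mapping internally; this sketch just shows how the Completion Field is separated from the other fields:

```python
import json

def record_to_example(fields: dict, completion_field: str) -> str:
    """Build one JSONL line from a record, mimicking the format above.

    `fields` holds all field names and values for one record;
    `completion_field` names the field used as the assistant response.
    (Hypothetical helper -- the script step does this internally.)
    """
    # All fields except the completion field become the user prompt
    prompt = ", ".join(
        f"{name}={value}"
        for name, value in fields.items()
        if name != completion_field
    )
    example = {
        "messages": [
            {"content": prompt, "role": "user"},
            {"content": fields[completion_field], "role": "assistant"},
        ]
    }
    return json.dumps(example)  # one line per record in the JSONL file

line = record_to_example(
    {"Question": "How do I reset my password?",
     "Category": "Account",
     "Answer": "Choose Settings > Reset Password."},
    completion_field="Answer",
)
```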

File

Sends the first JSONL file successfully located in the path list specified by File. Each line of the JSONL file must contain a JSON object that has at least the user prompt and the corresponding assistant response in this format:

Copy
{
  "messages" :
  [
    {
      "content" : "<User_Prompt>",
      "role" : "user"
    },
    {
      "content" : "<Assistant_Prompt>",
      "role" : "assistant"
    }
  ]
}

Note  Though shown here on multiple lines for clarity, each JSON object must be on a single line in the JSONL file.

You can use the Save Records as JSONL script step with the Format for fine-tuning option turned On to create this file from record data.
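As a rough sketch of how such a file can be produced outside FileMaker, this Python snippet (with made-up example data) writes one JSON object per line in the required format:

```python
import json

# Hypothetical training pairs: (user prompt, assistant response)
examples = [
    ("How do I reset my password?", "Choose Settings > Reset Password."),
    ("What are your support hours?", "Support is available 9am-5pm weekdays."),
]

with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for prompt, completion in examples:
        record = {
            "messages": [
                {"content": prompt, "role": "user"},
                {"content": completion, "role": "assistant"},
            ]
        }
        # json.dumps emits no newlines, so each object stays on one line
        f.write(json.dumps(record) + "\n")
```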

Response Target

To get information about the fine-tuning job that is started by this script step, specify a variable or field for the Response Target option. Fine-tuning can take considerable time, so you will need to check with the model provider to determine when the process is complete and the fine-tuned model is ready to use.

For example, the AI Model Server sends a response like that shown below when:

  • The JSONL file specified by File is my-training-data.jsonl.

  • Base Model is google/codegemma-7b-it.

  • The value of the fine_tuned_model_name key specified in Fine-Tune Parameters is my-fine-tuned-model-name. The value of the fine_tuned_model key in Response Target is the full name assigned by the AI Model Server.

Copy
{
  "result": {
    "object": "fine_tuning.job",
    "id": "fm-ftjob-1753297022103",
    "file_id": "fm-ft-train-1753297022070",
    "model": "google/codegemma-7b-it",
    "created_at": 1753297022103,
    "fine_tuned_model": "fm-mlx-my-fine-tuned-model-name",
    "status": "queued",
    "training_file": "my-training-data.jsonl",
    "tags": [
      "fine-tune"
    ]
  }
}

To determine when the AI Model Server has completed fine-tuning, sign in to FileMaker Server Admin Console. See Notes.
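For illustration, here is how the job details could be read from such a response outside FileMaker (within FileMaker itself, use JSONGetElement on the response target, as shown in Example 2 below). The response text is the sample shown above, abbreviated:

```python
import json

# Abbreviated copy of the sample response returned in Response Target
response = """
{ "result": { "object": "fine_tuning.job",
              "id": "fm-ftjob-1753297022103",
              "model": "google/codegemma-7b-it",
              "fine_tuned_model": "fm-mlx-my-fine-tuned-model-name",
              "status": "queued" } }
"""

result = json.loads(response)["result"]
job_id = result["id"]                      # identifies the fine-tuning job
model_name = result["fine_tuned_model"]    # full name to use once training finishes
is_done = result["status"] == "succeeded"  # "queued" here, so not yet
```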

Fine-Tune Parameters

The Fine-Tune Parameters option can be used to specify fine-tuning parameters that are supported by the model provider. Refer to the model provider's documentation for key names of supported parameters and their valid ranges.

For the AI Model Server provided with FileMaker Server, you can use the following keys and values to adjust fine-tuning, if needed. If a key isn't specified or the Fine-Tune Parameters option isn't used, the script step uses the default values.

Parameter Description Default value

max_steps

Total number of training steps (or iterations). More iterations require more machine memory and time but risk overfitting.

1000

learning_rate

Number that controls how much the model adjusts during each training step. Higher values mean faster training but risk overshooting optimal performance. Lower values mean more stable training but slower convergence.

2e-4 (0.0002)

lora_layers

Number of layers of the model that will have LoRA adapters applied during fine-tuning. Lower values mean faster training with lower memory usage, which is good for simple model changes. Higher values mean slower training with higher memory usage, which is better for more complex model changes.

4

batch_size

Number of training examples processed together before updating model weights during fine-tuning. Lower values mean lower memory usage and slower training. Higher values mean more memory usage and faster training.

1

fine_tuned_model_name

Controls the custom name of your fine-tuned model after training. For the root name you specify in this parameter, the AI Model Server converts spaces to hyphens and adds the fm-mlx- prefix to produce the full name you should use in other script steps and functions when you want to use this fine-tuned model. For example, if you specify my support model, the full name is fm-mlx-my-support-model. If no value is specified, the AI Model Server appends a timestamp after the prefix (for example, fm-mlx-1753215585244).

timestamp

The full fine-tuned model name is returned in Response Target.

Warning  If the full name matches that of an existing fine-tuned model, the existing fine-tuned model is deleted before training for the new fine-tuned model starts.

For example, this JSON object sets keys in the Fine-Tune Parameters option:

Copy
{
  "max_steps": 2000,
  "learning_rate": 1e-4,
  "batch_size": 2,
  "lora_layers": 6,
  "fine_tuned_model_name": "customer-support-v1"
}

Notes 

  • Fine-tuning requires significant computational resources. Running more than one fine-tuning job at a time on the AI Model Server may negatively affect system performance.

  • To monitor the status of fine-tuned models on the AI Model Server (for example, to tell when training is complete), open FileMaker Server Admin Console. There you can also create fine-tuned models directly by uploading a JSONL file. See Creating AI fine-tuned models in FileMaker Server Help.

Example 1 - Basic fine-tuning using table data

Fine-tunes an OpenAI model with data from a table. The script configures an AI account for OpenAI and goes to the Support_QA layout. The Support_QA table contains Question and Answer fields among others. The script performs a find to get the desired found set, then sends data from those records to fine-tune a GPT-4.1 model, specifying the Answer field for Completion Field (data from all other fields is sent as the user prompt).

Copy
Configure AI Account [ Account Name: "my-account" ; Model Provider: OpenAI ; API key: Global::OpenAI_API_Key ]

Go to Layout [ "Support_QA" (Support_QA) ]

Perform Find [ Restore ]

Fine-Tune Model [ Account Name: "my-account" ; Base Model: "gpt-4.1-2025-04-14" ; Training Data: Table ; "Support_QA" ; Completion Field: Support_QA::Answer ; Response Target: $$responseTarget ]

When OpenAI notifies you that training is complete and provides the fine-tuned model name, you can use the model in script steps like Generate Response from Model:

Copy
Generate Response from Model [ Account Name: "my-account" ; Model: "ft:gpt-4.1-2025-04-14:my-org::LBNO71Qq" ; User Prompt: $question ; Agentic mode ; Response: $$response ]

Example 2 - Fine-tuning using a JSONL file

Fine-tunes a model on the AI Model Server by sending the JSONL file created in Example 2 of the Save Records as JSONL script step. The script configures an AI account for the AI Model Server and sets the $trainingFile variable to the path of the JSONL file in the Documents folder.

The script then sets the $parameters variable to a JSON object containing the key-value pairs to use for fine-tuning parameters, including setting the root name to give the fine-tuned model.

Finally, the script sends the JSONL file specified by $trainingFile to the AI Model Server, indicating the base model to be fine-tuned, the parameters to use, and the $response variable to store the response. To have the full fine-tuned model name available to use later, the script gets the name from $response and stores it in a global field.

Copy
Configure AI Account [ Account Name: "AI_Model_Server" ; Model Provider: Custom ; Endpoint: "https://myserver.example.com/llm/v1/" ; API key: Global::Fine_Tuning_API_Key ; Verify SSL Certificates ]

Set Variable [ $trainingFile ; Value: Get(DocumentsPath) & "training_data.jsonl" ]

Set Variable [ $parameters ; Value: 
  Let ( [
    json = "{}" ;
    json = JSONSetElement ( json; "max_steps"; 1500; JSONNumber ) ;
    json = JSONSetElement ( json; "learning_rate"; 1e-4; JSONNumber ) ;
    json = JSONSetElement ( json; "batch_size"; 2; JSONNumber ) ;
    json = JSONSetElement ( json; "fine_tuned_model_name"; "product-expert-v2" ; JSONString )
  ] ;
    json
  )
]

Fine-Tune Model [ Account Name: "AI_Model_Server" ; Base Model: "google/codegemma-7b-it" ; Training Data: File ; "$trainingFile" ; Response Target: $responseTarget ; Fine-Tune Parameters: $parameters ]

Set Field [ Global::Fine_Tuned_Model ; JSONGetElement ( $responseTarget ; "result.fine_tuned_model" ) ]