Fireworks AI: how do I configure fireworks.json for Llama-3?
In Short
To configure `fireworks.json` for Llama-3 on Fireworks AI, set `base_model` to `accounts/fireworks/models/llama-v3-8b` or `accounts/fireworks/models/llama-v3-70b`. Add a `conversation_config` so the model can be served through the chat completions API, and use JSON mode at request time for structured outputs. An example configuration is below.
Configuration steps
Base model
Set `base_model` to the Llama-3 model you want to serve:

```json
{
  "base_model": "accounts/fireworks/models/llama-v3-8b"
}
```

or, for the 70B variant:

```json
{
  "base_model": "accounts/fireworks/models/llama-v3-70b"
}
```

This tells Fireworks which Llama-3 model to use.
Conversation configuration
Add a `conversation_config` so the model can be called through the chat completions API. The exact fields depend on the current Fireworks schema, so verify against the official docs; one common form specifies a chat `style`:

```json
{
  "conversation_config": {
    "style": "llama-chat"
  }
}
```

This configures the model for chat-based interactions.
JSON mode
To enforce structured outputs, use JSON mode. This is enabled per request rather than in `fireworks.json`: pass a `response_format` parameter to the chat completions API:

```json
{
  "response_format": {
    "type": "json_object"
  }
}
```

This constrains the model's output to valid JSON; if you need the output to follow a specific JSON schema, Fireworks also accepts a `schema` field inside `response_format`.
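As a sketch of how JSON mode is used at inference time: the snippet below builds a request body for Fireworks' OpenAI-compatible chat completions endpoint (`https://api.fireworks.ai/inference/v1/chat/completions`). The prompt and helper function are illustrative; the model path is the 8B base model named above.

```python
import json

def build_json_mode_request(model: str, prompt: str) -> dict:
    """Build a chat completions request body for Fireworks AI.

    Setting response_format to {"type": "json_object"} enables
    JSON mode, so the completion is constrained to valid JSON.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {"type": "json_object"},
        "max_tokens": 256,
    }

payload = build_json_mode_request(
    "accounts/fireworks/models/llama-v3-8b",
    "List three primary colors as a JSON array under the key 'colors'.",
)
print(json.dumps(payload, indent=2))
```

Sending this body with an `Authorization: Bearer <API key>` header to the chat completions endpoint returns a completion whose message content is valid JSON.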
Example configuration
Putting the pieces together, a complete `fireworks.json` for Llama-3 might look like the following (the `conversation_config` fields are illustrative; check them against the current Fireworks schema):

```json
{
  "base_model": "accounts/fireworks/models/llama-v3-8b",
  "conversation_config": {
    "style": "llama-chat"
  }
}
```

This sets up Llama-3 for chat interactions; structured JSON output is then requested per call via `response_format`.
Conclusion
Configure `fireworks.json` for Llama-3 by setting `base_model` and adding a `conversation_config`; for structured outputs, enable JSON mode per request with `response_format`.