Getting Started
Installation
Ensure you have Node.js installed on your system. Install LinguaLinkAI globally using npm:
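A typical global install would look like the following. The package name `lingua-link-ai` is an assumption based on the repository name; check the npm registry for the published name.

```shell
# Install the CLI globally (package name assumed from the repo name)
npm install -g lingua-link-ai
```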
Configuration
Run LinguaLinkAI to configure your preferred translation model:
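A sketch of the configuration step is shown below. The binary name `lingualinkai` and the `configure` subcommand are assumptions; the actual command may differ.

```shell
# Launch the interactive configuration (binary and subcommand names assumed)
lingualinkai configure
```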
You will be prompted to select a language model from the following options:
- Ollama
- Amazon Bedrock
- OpenAI
Follow the on-screen instructions to complete the configuration for the selected model.
Usage
Once configured, start a translation job by specifying the source and destination paths, along with the target language:
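A translation job along these lines could be started as sketched below. The binary name and the `--source`, `--destination`, and `--language` flags are illustrative assumptions, not confirmed CLI options.

```shell
# Translate files from ./docs into ./docs-fr, targeting French
# (binary name and flag names are assumed for illustration)
lingualinkai --source ./docs --destination ./docs-fr --language fr
```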
LinguaLinkAI will process the files in the source location and output the translated files to the destination location.
Contributing
Feedback and contributions are highly appreciated. If you have ideas for new features, encounter bugs, or want to add support for additional language models, please visit our GitHub repository:
https://github.com/DCoderAI/lingua-link-ai
Disclaimer
The translations are powered by AI and, while highly effective, may not always be perfect. Depending on context, some nuances, idioms, or cultural expressions might not be fully captured.