Choosing an AI Model
LrGeniusAI supports multiple AI models for analyzing and tagging your photos. Each model has its own strengths and requirements. Here’s a brief overview to help you choose the right one for your needs.
Privacy and Pricing Considerations
The first question to ask yourself is: Do you want to use a cloud-based AI model, or do you prefer to run an AI model locally on your own hardware for privacy reasons? Also consider cost: cloud-based models typically charge per use, while local models are free to run.
Cloud-based AI Models
Cloud-based AI models like Google Gemini and OpenAI's ChatGPT generally deliver the best results without requiring powerful local hardware. However, using these services involves sending your photos to external servers, which may raise privacy concerns depending on the sensitivity of your images.
- Google Gemini: Requires setting up an API key and activating billing at Google. Costs are incurred per image analyzed (see the sketches after this list). Available Google Gemini models (as of 2025-12-12):
  - gemini-2.5-flash-lite - Cheapest and fastest, but less intelligent.
  - gemini-2.5-flash - Good compromise between speed, intelligence, and cost.
  - gemini-2.5-pro - Best-in-class model.
  - gemini-3-pro-preview - Successor to gemini-2.5-pro, currently in preview; Google limits it to 300 images per day.
- ChatGPT: Requires setting up an API key at OpenAI and activating billing. Costs are incurred per image analyzed (see the sketches after this list). Available ChatGPT models (as of 2025-12-12):
  - gpt-4.1 - Older flagship model.
  - gpt-4.1-mini - Reduced-size model: faster and less expensive, but not as intelligent.
  - gpt-5 - Previous flagship model.
  - gpt-5-mini - Good compromise between performance and cost.
  - gpt-5-nano - Cheap and fast model.
  - gpt-5.1 - Current flagship model from OpenAI.
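To make "costs are incurred per image analyzed" concrete, here is a minimal Python sketch of a single Gemini vision request, assuming an API key with billing enabled. The file name, prompt, and model choice are illustrative only; LrGeniusAI performs these requests for you once the key is configured in the plugin.

```python
import base64
import requests

# Assumptions for illustration: a Gemini API key with billing enabled,
# and "example.jpg" as the photo to analyze.
API_KEY = "YOUR_GEMINI_API_KEY"
MODEL = "gemini-2.5-flash"

with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# One generateContent call per image -- this is the unit Google bills for.
response = requests.post(
    f"https://generativelanguage.googleapis.com/v1beta/models/{MODEL}:generateContent",
    params={"key": API_KEY},
    json={
        "contents": [{
            "parts": [
                {"text": "Suggest 10 descriptive keywords for this photo."},
                {"inline_data": {"mime_type": "image/jpeg", "data": image_b64}},
            ]
        }]
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["candidates"][0]["content"]["parts"][0]["text"])
```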
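The equivalent request against OpenAI looks like this. This is a sketch assuming the official openai Python package and an API key exported as OPENAI_API_KEY; the model and prompt are again only examples taken from the list above.

```python
import base64
from openai import OpenAI

# Assumptions for illustration: OPENAI_API_KEY is set in the environment,
# billing is activated, and "example.jpg" is the photo to analyze.
client = OpenAI()

with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# One request per image -- OpenAI bills for the tokens each image consumes.
response = client.chat.completions.create(
    model="gpt-5-mini",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Suggest 10 descriptive keywords for this photo."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```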
Local AI Models
If privacy is a concern or if you want to avoid ongoing costs, consider using local AI models with LrGeniusAI. This requires installing and setting up software like Ollama or LM Studio on your own computer.
- Ollama: A local AI model hosting solution that allows you to run vision-capable models on your own hardware (a minimal usage sketch follows this list). See our Ollama setup guide.
- LM Studio: Another local AI model hosting solution that supports various vision-capable models. See our LM Studio setup guide.
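As a rough idea of what happens under the hood with a local setup, here is a minimal sketch of a request against Ollama's local REST API. The llava model and the prompt are assumptions for illustration; LrGeniusAI issues these requests for you once Ollama is configured.

```python
import base64
import requests

# Assumptions for illustration: Ollama is running locally and a vision-capable
# model (here "llava") has already been pulled with `ollama pull llava`.
with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Ollama listens on port 11434 by default; the photo never leaves your machine.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llava",
        "prompt": "Suggest 10 descriptive keywords for this photo.",
        "images": [image_b64],
        "stream": False,
    },
    timeout=600,
)
response.raise_for_status()
print(response.json()["response"])
```

LM Studio works similarly but exposes an OpenAI-compatible server (by default at http://localhost:1234/v1), so the same request format shown above for ChatGPT can be used locally by pointing the client's base_url at that address.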
Choosing the Right Model
The choice is a very personal one and depends on your specific needs regarding privacy, cost, and performance; the hardware you have available also plays a big role when choosing a local AI model. For cloud-based models, weigh the cost per image against the quality of results you require. For local models, make sure your hardware can run the chosen AI model effectively. The best approach is usually to try out different models and see which one fits your workflow and requirements best.
Model comparisons from the community
There are two German-language YouTube videos by Andreas Pott (channel link) in which he compares different AI models for photo analysis and tagging. Credit to Andreas for running these tests and sharing his insights with the community!