We are delighted to introduce Marqtune, the embedding model training platform that allows you to train highly specialized, billion-parameter embedding models that improve search, recommendations, and RAG applications.
We’ve built Marqtune on the foundation of our new training framework, Generalized Contrastive Learning (GCL). With GCL, you can fine-tune embedding models to rank search results not only by semantic relevance but also by a ranking system defined by your search team. This means better, more relevant search results that cater to your business.
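To make this concrete, here is a minimal sketch of what GCL-style training data could look like: each row pairs a query with a document and a business-defined score (for example, derived from clicks or purchases) that tells the model how strongly the pair should rank. The column names below are illustrative placeholders, not the exact schema Marqtune expects; see the documentation for the real format.

```python
import csv

# Illustrative only: query-document pairs with a business-defined
# relevance score. Higher scores pull the query and document closer
# together in embedding space during fine-tuning.
rows = [
    {"query": "retro surf t-shirt", "title": "Vintage Sunset Surf Tee",
     "image": "https://example.com/img/101.jpg", "score": 0.9},
    {"query": "retro surf t-shirt", "title": "Plain White Tee",
     "image": "https://example.com/img/205.jpg", "score": 0.2},
    {"query": "galaxy phone case", "title": "Nebula Print Phone Case",
     "image": "https://example.com/img/310.jpg", "score": 0.8},
]

# Write the rows to a CSV file that can later be uploaded as a training dataset.
with open("training_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["query", "title", "image", "score"])
    writer.writeheader()
    writer.writerows(rows)
```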
Marqtune has been developed in response to feedback from our customers. Every vector search system in production needs to have its models continuously retrained and updated. Doing this manually is simply not feasible. Marqtune introduces a user-friendly process for fine-tuning embedding models and allows users to achieve significant improvements in search relevance with minimal engineering effort.
If you've ever dealt with search systems, you know how frustrating it can be when the results just don’t quite hit the mark. They might technically be correct, but they often miss the true intent behind what you're searching for. That's where Marqtune comes in! By using advanced embedding models fine-tuned with Generalized Contrastive Learning (GCL), you ensure that search results are not only spot-on in terms of relevance but also perfectly aligned with your business needs.
Another huge pain point for many businesses is the tedious task of maintaining and updating vector search models. Manually retraining these models can be a real headache – it’s time-consuming and requires a lot of technical know-how. Marqtune takes the hassle out of this process. With our platform, you can fine-tune embedding models with just a few lines of code, making it super easy to keep your search systems up-to-date and performing at their best.
In short, Marqtune solves these common problems by delivering superior search experiences tailored to your unique needs, without requiring extensive engineering resources. It's all about making life easier and more efficient for you and your business.
We’re thrilled to collaborate with incredible customers worldwide and share our success stories. In 2023, Redbubble began working with Marqo to leverage vector search and continue improving the online search experience for their customers. After a successful implementation, the team saw dramatic improvements in core metrics such as add-to-cart rate, conversions, and latency.
While open-source CLIP models provided relevant results, they were not aligned with Redbubble customer intent; Redbubble’s audience and content necessitate specific definitions of relevance. Traditional CLIP training provided some improvements but proved unable to capture what customers actually wanted.
With Marqtune, new models are now trained on Redbubble’s historical sales data to align the model with user behavior. Unlike keyword search, these models can generalize, meaning that previously unsold works do not need a sales history to surface in search; they simply need to fit the style of works that perform well on Redbubble’s platform. As a result, A/B experiments showed that models fine-tuned with Marqtune increased add-to-cart rate by 12% for 3+ word queries (representing a third of all search volume) compared to the existing keyword search.
It has been great to see the continued success of our product with customers like Redbubble. To find out more about the future of this partnership, read our case study.
We've seen Marqtune succeed in production, so let's get you set up so you can start using it to its full potential. First, sign up for Marqo Cloud if you haven’t already. Once you're in, select ‘Marqtune’ from the left-hand navigation and request access.
Fine-tuning your own embedding models can be done with just a few lines of code. Check out our Getting Started with Marqtune guide that walks you through the process step-by-step.
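As a rough sketch of what that flow looks like, the snippet below uploads a training dataset and starts a fine-tuning job with the Marqtune Python client. The method and parameter names here are illustrative assumptions rather than the exact API; the Getting Started guide has the authoritative example.

```python
# Sketch only: class, method, and parameter names below are assumed
# placeholders for the Marqtune Python client -- follow the
# Getting Started with Marqtune guide for the real calls.
from marqtune.client import Client  # assumed import path

client = Client(url="https://marqtune.marqo.ai", api_key="<your-api-key>")

# 1. Upload the training data prepared earlier (illustrative call).
dataset = client.create_dataset(
    dataset_name="my-training-data",
    file_path="training_data.csv",
)

# 2. Launch a fine-tuning job on top of an open-source base model
#    (illustrative parameters).
model = client.train_model(
    dataset_id=dataset.dataset_id,
    model_name="my-fine-tuned-clip",
    base_model="ViT-B-32",
)
```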
We’re Marqo, a vector search platform equipped with the machine learning capabilities and infrastructure you need to deploy next-gen AI-powered search. We handle everything from vector generation to storage and retrieval, enabling you to implement multimodal, multilingual search through a single API. Our proprietary inference engine converts unstructured data into high-performance vectors, returning hyper-relevant search results in real time.
Some of our customers include Redbubble and Temple & Webster, and we have countless developers adopting our platform for end-user search, retrieval-augmented generation, and more. We’re based in San Francisco, Melbourne, and London, and we’re backed by top-tier investors like Lightspeed and Blackbird.
Got questions? Join our Slack community to chat with other members and the Marqo team!