Build, train, and track your machine learning projects. Layer versions your ML models, datasets, and all associated project metadata with semantic versioning, extensive artifact logging, and dynamic reporting, and lets you move seamlessly between local and cloud training.
To get started, you can simply run our Quickstart Example!
https://lnkd.in/eUJGPhXx
How is Layer different from other tools?
Although there are plenty of ML and DS tooling products, we believe that there is still a large gap around collaboration. Many data science projects are hosted on GitHub, which, in our experience, does not provide sufficient depth and abstractions for ML/DS projects.
We don't want you to change how you develop your ML projects. No need to use a special remote notebook, no need to create YAML files or learn a new scripting language just to integrate a new tool into your stack. You can use Layer in your local notebook or Python script.
We aimed for a sweet spot in the level of abstraction for complex ML pipelines. Models and datasets are first-class citizens in Layer.
Layer integrates easily into your existing codebase: just add Layer decorators on top of your existing functions.
What can you do with Layer?
Build, train and track your machine learning projects.
Use remote GPU-enabled containers (free for 30 hours/week) to train your models from your local notebook with the @fabric decorator.
Create dynamic Project Cards by inserting comparison metrics, parameters, plots, images, and tables.
Load models or datasets from publicly shared community projects with a single line of code.
What’s under the hood?
At Layer, there are two modes for executing your projects: local and remote. To run your training function in local mode, you simply add a model decorator to it. This decorator attaches special metadata that Layer uses. When you call the decorated function as you normally would, it executes on your machine; Layer then uses the attached metadata and the function's return value to register the returned model in our model catalog. To run a function in remote mode, you decorate it with the model decorator and pass it to `layer.run([...])`. This executes the function remotely in a container on the Layer infrastructure. It works by leveraging the metadata attached by the decorator and pickling/unpickling your function in the dedicated container; the function is then called there and its returned result is saved into your project.
“ No need to use a special remote notebook, no need to create YAML files or learn a new scripting language just to integrate a new tool into your stack. You can use Layer in your local notebook or Python script.”
But I still need to import the layer package and learn how to use it. How is that different from learning a new scripting language or learning how to write a specific YAML file?
@pakapica Hey! Thanks for the great question. I'm one of the makers of Layer AI.
Ultimately, any tool of any sort (whether it is YAML-based, no-code, or has its own DSL) requires _some_ learning! =) However, we firmly believe that providing intuitive primitives within Python itself allows for the most expressive approach while keeping the barrier to entry low for developers and data scientists. Also, you don't have to leave your environment of choice (or even your notebook/source file, for that matter!), which we think is super helpful for a smooth UX and for keeping the "flow" going. Hope that answers your question; happy to provide more depth if you need it.
Layer AI