What is the Allonia Platform?

According to several studies, most enterprise data science projects "never make it to production". There are several reasons for this, and the Allonia Platform has been designed to tackle the main pain points that data teams face when creating, deploying, and maintaining AI applications, making the industrialization of an AI stack easier.

"Fast Track to AI" is both the slogan and the purpose of Allonia.

The Allonia Platform is a fully integrated, off-the-shelf software solution that supports data teams all along the data valorization pipeline:

  • Data ingestion

  • Data preparation

  • Model training

  • Model engineering

  • MLOps services (model deployment, pipelines, monitoring)

[Image: AI lifecycle overview]

The Allonia Platform's key principles follow this lifecycle, and our breadcrumb navigation is organized accordingly:

[Image: navigation menu]

Details about the sections:

  • Factory: lets you manage all activities related to project development.

    • Datasets: manage the data of your project. See the related section for further details.

    • Notebooks: manage the notebooks that form the core development of your project. See the related section for further details.

    • Models: manage all models created or imported in your project. See the related section for further details.

    • Modules: manage all modules (reusable Python code you can import in notebooks and pipelines) of your project. See the related section for further details.

    • Pipelines: manage all pipelines (descriptions of multiple data or processing tasks) of your project. See the related section for further details.

    • Libraries: manage all Python libraries that should be set up by default for your project. See the related section for further details.

  • Launchpad: lets you manage all activities related to project deployment and monitoring.

    • Jobs: manage and monitor all job instances (such as pipeline executions) of your project. See the related section for further details.

    • Services: manage and monitor all service instances (such as real-time serving) of your project. See the related section for further details.
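To illustrate what a Module might contain, the sketch below shows a small piece of reusable Python code of the kind you could import in notebooks and pipelines. The module name `text_utils` and the function are hypothetical examples, not part of the platform itself.

```python
# text_utils.py -- hypothetical example of reusable module code
# (the module name and function are illustrative, not part of the platform)

def normalize_text(value: str) -> str:
    """Lowercase a string and collapse runs of whitespace."""
    return " ".join(value.split()).lower()
```

In a notebook or pipeline step, you would then simply write `from text_utils import normalize_text` to reuse this code.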

Platform’s core components

[Image: platform core components]

Useful information

  • The number of files you can upload through SFTP is limited to roughly 1,000 at a time.

  • When you activate a service, the toggle shows immediate availability, but actual activation may take a few minutes. Check the user-service logs to follow deployment progress.

  • When opening a notebook in the Jupyter interface, expect some delay while the Jupyter interface launches, the kernel starts, and your notebook loads.

  • When working on a notebook inside JupyterLab, you will likely be disconnected after about an hour of inactivity. If so, close the JupyterLab tab and open it again.

  • The platform's time zone is UTC: in the application, job execution times are displayed in UTC.
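Given the SFTP limit above, one workaround is to split a large upload into batches of at most 1,000 files. The sketch below shows only the batching logic; the upload step is a placeholder (`upload_via_sftp` is a hypothetical helper), since connection details depend on your SFTP client and environment.

```python
# Sketch: split a large file list into batches of <= 1000 for SFTP upload.
# The upload call itself is a placeholder; adapt it to your SFTP client.

from pathlib import Path
from typing import Iterable, List

BATCH_SIZE = 1000  # rough per-transfer limit mentioned above

def batched(paths: List[Path], size: int = BATCH_SIZE) -> Iterable[List[Path]]:
    """Yield successive batches of at most `size` paths."""
    for start in range(0, len(paths), size):
        yield paths[start:start + size]

# Usage sketch (placeholder upload):
# for batch in batched(sorted(Path("data").glob("*.csv"))):
#     upload_via_sftp(batch)  # hypothetical helper wrapping your sftp client
```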
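Since job times are displayed in UTC, you may want to convert them to your local time zone when reading logs or schedules. A minimal sketch using Python's standard `zoneinfo` module (the timestamp and the `Europe/Paris` zone are illustrative):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A job execution time as displayed by the platform (UTC, illustrative value)
utc_time = datetime(2024, 3, 15, 13, 30, tzinfo=timezone.utc)

# Convert to a local zone for easier reading
local_time = utc_time.astimezone(ZoneInfo("Europe/Paris"))
print(local_time.isoformat())  # 2024-03-15T14:30:00+01:00
```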