
Introduction

Tecton stands ready to enter the ML market with enterprise-scale features that can be served online to support real-time predictions.

Highlights

  • Only time will tell whether Tecton can thrive in a market dominated by the global hyperscale AI innovators Google, Amazon, and Microsoft. Omdia believes, however, that Tecton’s distinctive approach to building and maintaining enterprise ML solutions at scale will give customers a solid footing upon which to build a scalable ML practice.

Features and Benefits

  • Analyzes Tecton’s unique approach to the creation, management, and deployment of enterprise-grade ML features.
  • Evaluates the importance of building a centralized ML feature repository as a means of driving CI/CD for AI outcomes.
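The idea of a centralized feature repository can be sketched in generic Python. The class and method names below are hypothetical illustrations of the concept, not Tecton’s actual SDK: the point is that registering versioned feature definitions in one place lets training and serving share a single code path, which is what makes CI/CD-style practices possible for ML features.

```python
# Illustrative sketch only (hypothetical API, not Tecton's SDK):
# a minimal central registry of versioned feature definitions.

from dataclasses import dataclass, field
from typing import Callable


@dataclass
class FeatureDefinition:
    """A named, versioned feature with its transformation logic."""
    name: str
    version: int
    transform: Callable[[dict], object]  # maps a raw record to a feature value


@dataclass
class FeatureRepository:
    """Central registry shared by training pipelines and online serving."""
    _features: dict = field(default_factory=dict)

    def register(self, feature: FeatureDefinition) -> None:
        # Key on (name, version) so a new version never silently
        # changes the behavior of models pinned to an old one.
        self._features[(feature.name, feature.version)] = feature

    def compute(self, name: str, version: int, record: dict):
        # The same transform runs for offline (training) and online
        # (real-time serving) use, avoiding training/serving skew.
        return self._features[(name, version)].transform(record)


repo = FeatureRepository()
repo.register(FeatureDefinition(
    name="transaction_amount_usd",
    version=1,
    transform=lambda r: round(r["amount_cents"] / 100.0, 2),
))

print(repo.compute("transaction_amount_usd", 1, {"amount_cents": 1999}))  # 19.99
```

In Tecton’s product the repository additionally handles materialization to offline and online stores and monitoring of feature pipelines (see Figures 1–5 in the report), but the core contract is the same: features are defined once, versioned centrally, and consumed everywhere.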

Key questions answered

  • How should Tecton seek to both compete with and complement global hyperscale AI innovators Google, Amazon, and Microsoft?
  • How does Tecton use a feature repository to help enterprise ML practitioners build a scalable ML practice?

Table of contents

Omdia view

  • Summary
  • Background
  • Key traits
  • Figure 1: Tecton’s Python SDK in action
  • Figure 2: Retrieving and viewing a feature package with Python
  • Figure 3: Tecton materialization status report
  • Figure 4: Tecton feature pipeline visualization
  • Figure 5: Tecton feature package monitoring
  • Findings and future development

Appendix

  • Further reading
  • Author