All content for attributes annotation

Design Article

The modern AI workforce part 2: In-house or outsource?

A walkthrough of the costs, benefits, and risks of outsourcing annotation work. We also give you our recommended approach.

Vladimir

Design Article

The modern AI workforce part 1: Why you need expert annotators

We go through the need for expert annotators and how you can minimize the cost of having expensive experts annotate for you.

Vladimir

Monitor Deploy Development Article

Automated quality control, opening up the AI black box, and more

We’ll go through the most critical parts of the new release and give you a bit more insight into what we’ve built and how it might benefit you.

Alex Wennman

Development Deploy Monitor Article

Uncovering hidden biases in your data

This post is a hands-on guide on how you can use Hasty's tooling to de-bias your data.

Tobias Schaffrath Rosario

Design Article

Infrastructure that is never noticed

Practical machine learning is about much more than statistics and model architectures. Resilient and reliable infrastructure is the foundation …

Andriy Borodiychuk

Development Article

Building a Xmas GAN

Here's a fun little side project about training a GAN to Christmasify images.

Hasnain Raza

Development Article

From start to pro in Hasty

Over the last few months, Hasty has seen continued, accelerating growth. To keep the momentum going and deliver on our promise of built for and by …

Tristan Rouillard

All stages Article

We need "Agile" in machine learning

It’s time for a non-linear, iterative approach that builds highly performant vision AI applications.

Tobias Schaffrath Rosario

Deploy Development MP Wiki

Adadelta

Adadelta was proposed with the aim of solving the diminishing learning rate problem seen in Adagrad. Adagrad uses the knowledge of all the …
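The Adadelta rule described here can be sketched in a few lines. This is a minimal scalar sketch of the standard update (running averages of squared gradients and squared updates, no global learning rate), not Hasty's implementation; the rho and eps values are illustrative defaults.

```python
import math

def adadelta_step(w, eg2, edx2, grad, rho=0.95, eps=1e-6):
    """One Adadelta update for a scalar parameter w."""
    eg2 = rho * eg2 + (1 - rho) * grad ** 2  # running average of squared gradients
    # Step is scaled by the ratio of past update size to past gradient size,
    # so no hand-tuned global learning rate is needed.
    step = -math.sqrt(edx2 + eps) / math.sqrt(eg2 + eps) * grad
    edx2 = rho * edx2 + (1 - rho) * step ** 2  # running average of squared updates
    return w + step, eg2, edx2

# Toy run: minimize f(w) = w^2, whose gradient is 2w.
w, eg2, edx2 = 5.0, 0.0, 0.0
for _ in range(500):
    w, eg2, edx2 = adadelta_step(w, eg2, edx2, grad=2 * w)
```

Note how the step size adapts on its own: it starts tiny (driven by eps) and grows as the accumulator of past updates fills in.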

Development Deploy MP Wiki

Adagrad

Adagrad, short for adaptive gradient, is a gradient-based optimizer that automatically tunes its learning rate during training. The learning …
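The automatic tuning works by accumulating squared gradients per parameter and dividing each step by the square root of that sum. A minimal scalar sketch, not Hasty's implementation; the learning rate and eps are illustrative:

```python
import math

def adagrad_step(w, grad_sq_sum, grad, lr=0.5, eps=1e-8):
    """One Adagrad update for a scalar parameter w."""
    grad_sq_sum += grad ** 2  # accumulate all past squared gradients
    # Effective learning rate shrinks as the accumulator grows.
    w -= lr * grad / (math.sqrt(grad_sq_sum) + eps)
    return w, grad_sq_sum

# Toy run: minimize f(w) = w^2, whose gradient is 2w.
w, acc = 5.0, 0.0
for _ in range(100):
    w, acc = adagrad_step(w, acc, grad=2 * w)
```

Because the accumulator only ever grows, the effective learning rate shrinks monotonically — the diminishing-learning-rate problem that Adadelta was designed to fix.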

Development Deploy MP Wiki

Adam

Adam solvers are the hassle-free standard for optimizers. Empirically, Adam solvers converge faster and are more robust to hyper-parameter …
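Adam's robustness comes from tracking bias-corrected running estimates of the gradient's first and second moments. A minimal scalar sketch of the standard update, not Hasty's implementation; lr and the beta values are illustrative defaults:

```python
import math

def adam_step(w, m, v, t, grad, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter w at step t (t starts at 1)."""
    m = b1 * m + (1 - b1) * grad       # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2  # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)          # bias correction for the zero init
    v_hat = v / (1 - b2 ** t)
    w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# Toy run: minimize f(w) = w^2, whose gradient is 2w.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    w, m, v = adam_step(w, m, v, t, grad=2 * w)
```

The bias correction is what makes early steps well-scaled even though m and v start at zero.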

Development Deploy MP Wiki

AdaMax

Adam can be understood as updating weights inversely proportional to the scaled L2 norm (squared) of past gradients. AdaMax extends this to the …
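In the AdaMax variant, the second-moment (L2-style) accumulator is replaced by an exponentially weighted infinity norm — a running maximum of gradient magnitudes. A minimal scalar sketch of the standard rule, not Hasty's implementation; lr and the beta values are illustrative:

```python
def adamax_step(w, m, u, t, grad, lr=0.1, b1=0.9, b2=0.999):
    """One AdaMax update for a scalar parameter w at step t (t starts at 1)."""
    m = b1 * m + (1 - b1) * grad  # first-moment estimate, as in Adam
    u = max(b2 * u, abs(grad))    # infinity norm replaces Adam's squared-L2 accumulator
    # No epsilon is needed: once a nonzero gradient is seen, u stays positive.
    w -= (lr / (1 - b1 ** t)) * m / u
    return w, m, u

# Toy run: minimize f(w) = w^2, whose gradient is 2w.
w, m, u = 5.0, 0.0, 0.0
for t in range(1, 201):
    w, m, u = adamax_step(w, m, u, t, grad=2 * w)
```

Using the max rather than a decaying average makes the denominator less sensitive to occasional huge gradients.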


Get AI confident. Start using Hasty today.

Our platform is completely free to try. Sign up today to start your two-month trial.

Tuple

Hasty.ai helped us improve our ML workflow by 40%, which is fantastic. It reduced our overall investment by 90% to get high-quality annotations and an initial model.