All content for data scientists and ML engineers

Development Deploy MP Wiki

Adagrad

Adagrad, short for adaptive gradient, is a gradient-based optimizer that automatically tunes its learning rate during training. The learning …
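As a rough illustration of the idea behind the snippet above, here is a minimal single-list Adagrad sketch (hypothetical helper names, not Hasty's implementation): each parameter accumulates its squared gradients, and the effective step size shrinks as that accumulator grows.

```python
def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
    """One Adagrad update over a list of scalar parameters (illustrative sketch)."""
    new_w, new_accum = [], []
    for wi, gi, ai in zip(w, grad, accum):
        ai = ai + gi * gi                       # accumulate squared gradient
        wi = wi - lr * gi / ((ai ** 0.5) + eps)  # per-parameter adaptive step
        new_w.append(wi)
        new_accum.append(ai)
    return new_w, new_accum

# With a constant gradient, each successive step is smaller than the last,
# because the accumulator in the denominator keeps growing.
w, accum = [1.0], [0.0]
for _ in range(3):
    w, accum = adagrad_step(w, [1.0], accum)
```

This is why Adagrad needs no manual learning-rate schedule, but can also stall late in training: the accumulator only ever grows.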

Development Deploy MP Wiki

Adam

Adam solvers are the hassle-free standard for optimizers. Empirically, Adam solvers converge faster and are more robust to hyper-parameter …

Development Deploy MP Wiki

AdaMax

Adam can be understood as updating weights inversely proportional to the scaled L2 norm (squared) of past gradients. AdaMax extends this to the …
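The snippet above contrasts Adam's scaled L2 norm of past gradients with AdaMax's extension. A minimal single-parameter sketch of the AdaMax rule (hypothetical names, not Hasty's code): the second-moment term is replaced by an infinity-norm running maximum.

```python
def adamax_step(w, g, m, u, t, lr=0.002, b1=0.9, b2=0.999):
    """One AdaMax update for a single scalar parameter (illustrative sketch)."""
    m = b1 * m + (1 - b1) * g             # first-moment estimate, as in Adam
    u = max(b2 * u, abs(g))               # infinity norm replaces Adam's v_t
    w = w - (lr / (1 - b1 ** t)) * m / u  # bias-correct only the momentum term
    return w, m, u
```

Because `u` is a running maximum rather than a decaying sum of squares, no epsilon term is needed in the denominator.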

Development Deploy MP Wiki

AdamW

AdamW is very similar to Adam. It differs only in how weight decay is implemented. The way it's implemented in Adam came from the …
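The difference hinted at above can be sketched in two lines (hypothetical helper names, not the actual library code): Adam-style L2 regularization folds the decay into the gradient, where it then gets rescaled by the adaptive denominator, while AdamW decays the weights directly, decoupled from the gradient statistics.

```python
def adam_l2_grad(g, w, wd):
    # "Adam with L2": weight decay is added to the gradient, so it is
    # later rescaled by the adaptive second-moment denominator.
    return g + wd * w

def adamw_update(w, adam_step, lr, wd):
    # AdamW: decay is subtracted from the weights directly,
    # decoupled from the adaptive gradient scaling.
    return w - adam_step - lr * wd * w
```

With AdamW, every weight shrinks at the same relative rate regardless of its gradient history, which is the behavior classic L2 regularization was meant to give.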

Development User documentation

Advanced options

In advanced options, we currently have only one option available to users. The "Automated tools generate..." toggle lets you decide whether our …

Development User documentation

AI assistants

AI assistants are what we call our AI tooling that you can use to automate parts - or all - of your annotation work. The concept behind them is …

Development User documentation

AI assistants status overview

Many of the questions we get from our users concern the status and training of our AI assistant models. We’re the first to admit this has …

Development Deploy MP Wiki

AMSgrad Variant (Adam)

While the Adam optimizer, which made use of momentum as well as RMSProp, was efficient in adjusting the learning rates and finding the optimal …
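The fix AMSGrad introduces on top of Adam can be shown in a short sketch (hypothetical names, not Hasty's implementation): the denominator uses a running maximum of the second-moment estimate, so the effective step size can never grow back after shrinking.

```python
def amsgrad_step(w, g, m, v, vmax, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One AMSGrad update for a single scalar parameter (illustrative sketch)."""
    m = b1 * m + (1 - b1) * g          # first moment, as in Adam
    v = b2 * v + (1 - b2) * g * g      # second moment, as in Adam
    vmax = max(vmax, v)                # the AMSGrad change vs. plain Adam
    w = w - lr * m / (vmax ** 0.5 + eps)
    return w, m, v, vmax
```

In plain Adam, a run of small gradients can shrink `v` and suddenly inflate the step; keeping `vmax` monotone rules that out, which is what the convergence argument for AMSGrad relies on.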

Development Deploy MP Wiki

ASGD

Average Stochastic Gradient Descent, abbreviated as ASGD, averages the weights that are calculated in every iteration. $$ w_{t+1}=w_t-\eta \nabla …
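The averaging described above can be sketched as follows (hypothetical function, not Hasty's code): run ordinary SGD, but also keep a running mean of every iterate visited, and report that mean as the final answer.

```python
def sgd_with_averaging(grad_fn, w, lr=0.1, steps=100):
    """Plain SGD iterates w_{t+1} = w_t - lr * grad(w_t);
    ASGD additionally maintains the running average of those iterates."""
    avg = w
    for t in range(1, steps + 1):
        w = w - lr * grad_fn(w)
        avg = avg + (w - avg) / t   # running mean of the visited weights
    return w, avg

# Minimizing f(w) = w^2 (gradient 2w): both the last iterate and the
# averaged iterate approach the minimum at 0, but the average smooths
# out the noise of the individual steps.
final_w, averaged_w = sgd_with_averaging(lambda w: 2 * w, 1.0)
```

On noisy objectives, the averaged iterate typically has lower variance than the last iterate, which is the motivation behind ASGD.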

Development User documentation

ATOM segmenter

ATOM is an AI-powered segmentation tool that can be used for both instance and semantic segmentation. It is available from the start of every …

Development User documentation

Attribute Prediction

Attribute Prediction dashboard widgets: GPU Consumption, Running time, Inference time, Hamming Score vs. Number of Iterations, Loss vs. Number of Iterations …

Deploy User documentation

Attribute Prediction

Sample inference script for a TorchScript-exported image-tagger. The following sample code should be run from the export directory: import torch import …
