
The Basics of Machine Learning: Loss Iteration, Gradient Descent, and Learning Rate

In this article, we provide a basic overview of loss iteration, gradient descent, and learning rates. Concisely: loss iteration is the process of repeatedly trying different values to find increasingly accurate weights for a model. Gradient descent is the idea that we can find more accurate weights by computing the gradient of the loss at our current weights and stepping in the opposite direction, repeating until we reach the “bottom” of a convex loss surface. The learning rate is the size of each of those steps: how much each “guess” changes as we move toward the best values for the model.
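To make that concrete, here is a minimal sketch of those three ideas in Ruby. It is illustrative rather than production code: the data, the learning rate of 0.01, and the 1000-step loop are all assumptions chosen so the loop converges, and the loss is mean squared error for a single-weight model `prediction = w * x`.

```ruby
# Minimal gradient descent sketch. Loss is mean squared error; its
# gradient with respect to w is the average of 2 * (w * x - y) * x.
def gradient_descent(xs, ys, learning_rate: 0.01, steps: 1000)
  w = 0.0
  steps.times do
    # Gradient of the loss at the current weight
    grad = xs.zip(ys).sum { |x, y| 2 * (w * x - y) * x } / xs.length
    # Step against the gradient; the learning rate controls step size
    w -= learning_rate * grad
  end
  w
end

# The true relationship here is y = 3x, so w should converge near 3.0
xs = [1.0, 2.0, 3.0, 4.0]
ys = xs.map { |x| 3.0 * x }
w = gradient_descent(xs, ys)
```

Each pass of the loop is one “loss iteration”: evaluate the gradient, step downhill, repeat. A learning rate that is too large would overshoot the bottom and diverge; one that is too small would converge very slowly.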

Blocks, Procs, and Lambdas in Ruby

Blocks, Procs, and Lambdas are Ruby’s approach to closures.

Blocks are anonymous functions that can be passed into methods. They’re the sections of code enclosed in either braces `{}` or `do…end` syntax. In short: you’ve probably used them a lot before. Procs and lambdas are both objects that wrap a block so it can be assigned to a variable and reused.
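A short sketch of all three side by side (the variable names are illustrative):

```ruby
# A block passed directly to a method with braces
doubled = [1, 2, 3].map { |n| n * 2 }   # => [2, 4, 6]

# The same logic captured as a Proc and as a lambda, both
# assignable to variables and reusable via &
double_proc   = Proc.new { |n| n * 2 }
double_lambda = ->(n) { n * 2 }

doubled_with_proc   = [1, 2, 3].map(&double_proc)
doubled_with_lambda = [1, 2, 3].map(&double_lambda)

# One practical difference: a lambda enforces its argument count,
# while a proc silently ignores extras
double_proc.call(1, 99)    # extra argument ignored => 2
# double_lambda.call(1, 99) # would raise ArgumentError
```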

Presenters and Decorators in Ruby on Rails

In presenters, you can access view helpers (e.g. `h.link_to`) in a way you can’t should you put this logic in the model. Ultimately, they allow you to separate out complex view logic from templates, while still having access to Rails’ view helpers.

Presenters are the view-focused implementation of decorators, which serve the broader purpose of adding extra functionality to a specific instance of a class when (and only when) the decorator is instantiated.
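A hypothetical sketch of the pattern (the class and method names are illustrative, not from a real app): the presenter wraps a model plus the view context, conventionally exposed as `h`, so helpers like `h.link_to` stay available outside the template. In Rails the view context would supply `link_to`; here a tiny stub stands in so the sketch runs on its own.

```ruby
class UserPresenter
  def initialize(user, view_context)
    @user = user
    @h = view_context   # conventionally exposed as `h` in presenters
  end

  # Complex view logic moves here, out of the template
  def profile_link
    @h.link_to(@user.name, "/users/#{@user.id}")
  end
end

# Stand-in for the Rails view context, just for this sketch
class FakeView
  def link_to(text, url)
    %(<a href="#{url}">#{text}</a>)
  end
end

User = Struct.new(:name, :id)
user = User.new("Ada", 1)
link = UserPresenter.new(user, FakeView.new).profile_link
# => '<a href="/users/1">Ada</a>'
```

Note the decorator shape: only this wrapped instance gains `profile_link`; the `User` class itself is untouched.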

What Is Scope?

In programming, scope is the region of a program where variables, functions, and other name bindings can be used. In almost all languages, a variable can be accessed by any function or class nested inside the function or class where it’s defined, but not from outside or above it.
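A quick illustration in Ruby (names are illustrative): a local variable defined inside a method is visible there but ceases to exist outside it.

```ruby
def greet
  message = "hello"   # local to greet
  message.upcase      # visible here, inside the defining scope
end

greet               # => "HELLO"
defined?(message)   # => nil — message is out of scope here
```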