-
May 4, 2022
Empirical Rademacher Complexity and Its Implications for Deep Learning
In machine learning, Rademacher complexity measures the capacity of the hypothesis class from which a learning algorithm chooses its predictors; it was arguably first proposed in [1]. In computational learning theory, the generalization loss can be bounded in terms of the data-dependent Rademacher complexity under certain assumptions.
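As a rough illustration (not taken from the post itself), the empirical Rademacher complexity of a finite hypothesis class can be estimated by Monte Carlo: draw random sign vectors and average the supremum of the signed empirical correlations. The sketch below assumes the class is given as a matrix of per-sample predictions; the function name and toy data are hypothetical.

```python
import numpy as np

def empirical_rademacher(predictions, n_trials=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_S(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i * f(x_i) ],
    where `predictions` is a (|F|, n) array holding f(x_i) for each
    hypothesis f in a finite class F evaluated on the sample S."""
    rng = np.random.default_rng(seed)
    n = predictions.shape[1]
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=n)   # i.i.d. Rademacher signs
        total += np.max(predictions @ sigma) / n  # sup over the class
    return total / n_trials

# Toy class of two constant predictors (+1 and -1) on n = 4 points;
# the exact value here is E|sum_i sigma_i| / 4 = 0.375.
preds = np.array([[1.0, 1.0, 1.0, 1.0],
                  [-1.0, -1.0, -1.0, -1.0]])
print(empirical_rademacher(preds))
```

For richer (e.g. parametric) classes the supremum is no longer a finite max, which is why the bounds in the post rely on analytic estimates rather than direct computation.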
-
February 1, 2020
Types of Documentation in the Software Field
A library, module, or *framework* with comprehensive documentation is more likely to be adopted by its users. Given the value it produces, this important pillar of software development should be standardized. The Divio documentation system has been proposed for classifying the technical texts written for the software industry.
-
March 8, 2019
Maximum Mean Discrepancy (MMD) in Machine Learning
Maximum mean discrepancy (MMD) is a kernel-based statistical test, proposed in [1], used to determine whether two given distributions are the same. MMD can also serve as a loss/cost function in various machine learning algorithms such as density estimation and generative models, as shown in [2], [3], and in invertible neural networks applied to inverse problems, as in [4].
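To make the idea concrete (this is a generic sketch, not code from the post), the biased squared-MMD estimator with an RBF kernel compares the average within-sample and cross-sample kernel values; it is near zero when both samples come from the same distribution. The bandwidth `gamma` and the toy samples below are illustrative choices.

```python
import numpy as np

def mmd_rbf(X, Y, gamma=1.0):
    """Biased estimator of squared MMD between samples X and Y
    using an RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    def kernel(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    # MMD^2 = mean k(X,X) + mean k(Y,Y) - 2 * mean k(X,Y)
    return kernel(X, X).mean() + kernel(Y, Y).mean() - 2 * kernel(X, Y).mean()

rng = np.random.default_rng(0)
same = mmd_rbf(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd_rbf(rng.normal(size=(200, 2)), rng.normal(3.0, 1.0, size=(200, 2)))
# `same` is close to 0; `diff` is clearly larger because the second
# sample is shifted away from the standard normal.
```

Used as a loss, the same quantity is minimized over the parameters of a generative model so that its samples become statistically indistinguishable from the data.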
-
February 8, 2019
NICE: Non-linear Independent Components Estimation
NICE is a deep learning framework that transforms high-dimensional complex data into non-linear independent components. This post summarizes the paper and explains some omitted mathematical concepts, in particular why the Jacobian of the transformation has a constant unit determinant and how to derive it. In addition, a PyTorch implementation of the experiments is given.
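The unit-determinant property can be verified numerically on NICE's additive coupling layer, y1 = x1, y2 = x2 + m(x1): its Jacobian is lower triangular with ones on the diagonal, so the determinant is exactly 1 regardless of m. The sketch below uses NumPy with a hypothetical fixed nonlinearity in place of the learned network m.

```python
import numpy as np

def additive_coupling(x, split=2):
    """Additive coupling layer from NICE: y1 = x1, y2 = x2 + m(x1).
    Here m is an arbitrary fixed nonlinear map standing in for the
    learned network (a hypothetical stand-in for illustration)."""
    x1, x2 = x[:split], x[split:]
    m = np.tanh(x1 @ np.ones((split, x.size - split)))
    return np.concatenate([x1, x2 + m])

# Numerical Jacobian at a random point via central differences:
# lower triangular with a unit diagonal, hence determinant 1.
rng = np.random.default_rng(0)
x0, eps = rng.normal(size=4), 1e-6
J = np.stack(
    [(additive_coupling(x0 + eps * e) - additive_coupling(x0 - eps * e)) / (2 * eps)
     for e in np.eye(4)],
    axis=1,
)
print(np.linalg.det(J))  # ≈ 1.0
```

This is exactly why the change-of-variables log-likelihood in NICE needs no expensive determinant computation for the coupling layers.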
-
January 22, 2019
A Brief Introduction to Machine Learning for Engineers 2 Part III
In this post, the Bayesian learning framework is covered in detail, including the Bayesian predictive distribution and the marginal likelihood, together with coding exercises.
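As a minimal example of the two quantities mentioned above (a generic conjugate-model sketch, not an exercise from the post), the Beta-Bernoulli model admits both the marginal likelihood and the posterior predictive in closed form; the function names below are illustrative.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    """log of the Beta function B(a, b) via log-gamma."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def marginal_likelihood(k, n, a=1.0, b=1.0):
    """Marginal likelihood p(D) of observing k successes in n Bernoulli
    trials under a Beta(a, b) prior on the success probability:
    p(D) = C(n, k) * B(a + k, b + n - k) / B(a, b)."""
    return comb(n, k) * exp(log_beta(a + k, b + n - k) - log_beta(a, b))

def predictive(k, n, a=1.0, b=1.0):
    """Posterior predictive p(next = 1 | D) = (a + k) / (a + b + n)."""
    return (a + k) / (a + b + n)

# Under a uniform Beta(1, 1) prior, every count k is equally likely
# a priori, so p(D) = 1 / (n + 1) for any k:
print(marginal_likelihood(3, 10))  # 1/11 ≈ 0.0909
print(predictive(3, 10))           # 4/12 ≈ 0.3333
```

The marginal likelihood integrates the prior against the likelihood and is what model selection compares; the predictive averages the Bernoulli likelihood over the posterior rather than plugging in a point estimate.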