MATLAB Training - Toolbox Finance & Econometrics in Geneva, Zurich, Houston, San Antonio, Dallas, Los Angeles, San Diego, New York, Washington, Chicago, San Francisco, and anywhere in Switzerland, the USA, Great Britain, and Germany.
Price: 625.-/day — ID: 986
Goal: This training introduces applied optimization in the MATLAB environment, focusing on the Optimization Toolbox and Global Optimization Toolbox using small academic examples.
Audience: R&D engineers, supply chain engineers, financial engineers, practitioners in artificial intelligence.
Prerequisites: Knowledge of the mathematical aspects, limitations, and parameters of Master/PhD-level optimization models (no mathematics will be explained during the training!).
Syllabus:
- Introduction
- Search for the global maximum/minimum of a vector- or matrix-defined function or plot (find)
- Search for all local maxima/minima of a defined function or plot (find)
- LP simplex maximization/minimization with inequality constraints (linprog)
- LP simplex maximization/minimization with equality constraints and no target value
- LP simplex maximization/minimization with equality constraints and a target value
- Constrained function optimization with linear constraints and no specific target (fmincon)
- Constrained function optimization with linear constraints and a specific target (fmincon)
- Constrained function optimization with non-linear constraints and no specific target (fmincon)
- Quadratic optimization (Q-CG) using the conjugate gradient method (fminunc)
- Quadratic optimization (Q-IPC) using the interior-point-convex method (quadprog)
- Genetic algorithm optimization (ga)
- Training conclusion
Pedagogical method: A certificate will be awarded to each participant who has attended at least 80% of the training.
Suggested duration (days): 1
Daily price: 625 CHF
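As a taste of the toolbox functions covered in the training, here is a minimal linear-programming sketch using linprog; the objective and constraint values are illustrative placeholders, not course material:

```matlab
% linprog minimizes f'*x subject to A*x <= b and bounds,
% so we negate the objective to maximize 3*x1 + 5*x2 (illustrative data).
f  = [-3; -5];          % negated objective coefficients
A  = [1 0; 0 2; 3 2];   % inequality constraint matrix
b  = [4; 12; 18];       % inequality right-hand sides
lb = [0; 0];            % non-negativity bounds x >= 0

[x, fval] = linprog(f, A, b, [], [], lb);  % no equality constraints
maxValue  = -fval;      % recover the maximized objective value
```

The same problem shapes carry over to fmincon (general nonlinear constraints) and quadprog (quadratic objectives), which are treated during the training.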
Price per day per trainee, excluding course material, certificate, evaluation, training room, and computer.
Book title: MATLAB
Author: Vincent Isoz
Pages: 1337
ISBN :
Tags: matlab training, matlab course, matlab optimization, genetic algorithms, simplex method, conjugate gradient method, Newton method.