40,95 € (incl. VAT)
Available immediately via download
Format: PDF

Product description
This book discusses unconstrained optimization with R, a free, open-source computing environment that runs on several platforms, including Windows, Linux, and macOS. It covers methods such as the steepest descent method, the Newton method, the conjugate direction method, conjugate gradient methods, and quasi-Newton methods (including the rank-one correction formula and the DFP and BFGS methods), together with their algorithms, convergence analysis, and proofs. Each method is accompanied by worked examples and R scripts. To help readers apply these methods in real-world situations, the book features a set of exercises at the end of each chapter. Primarily intended for graduate students of applied mathematics, operations research, and statistics, it is also useful for students of mathematics, engineering, management, economics, and agriculture.
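To give a flavor of the kind of R script the description refers to, the following is a minimal sketch (not material from the book) of the steepest descent method with a backtracking Armijo line search, applied to the Rosenbrock function; the function, tolerances, and step-size constants are illustrative choices.

```r
# Illustrative steepest-descent sketch in R (not taken from the book's scripts).
# Minimizes the Rosenbrock function using a backtracking (Armijo) line search.

f <- function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
grad_f <- function(x) c(
  -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2),
  200 * (x[2] - x[1]^2)
)

steepest_descent <- function(x0, tol = 1e-6, max_iter = 10000) {
  x <- x0
  for (k in seq_len(max_iter)) {
    g <- grad_f(x)
    if (sqrt(sum(g^2)) < tol) break          # stop when the gradient norm is small
    d <- -g                                   # steepest-descent direction
    t <- 1
    # Backtracking until the Armijo sufficient-decrease condition holds
    while (f(x + t * d) > f(x) + 1e-4 * t * sum(g * d)) t <- 0.5 * t
    x <- x + t * d
  }
  list(minimizer = x, value = f(x), iterations = k)
}

steepest_descent(c(-1.2, 1))  # moves (slowly) toward the minimizer c(1, 1)
```

The slow progress of this run on the Rosenbrock function is exactly the behavior that motivates the conjugate gradient and quasi-Newton methods treated later in the book.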

For legal reasons, this download can only be delivered to customers with a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.

About the authors
Shashi Kant Mishra, Ph.D., D.Sc., is Professor at the Department of Mathematics, Institute of Science, Banaras Hindu University, Varanasi, India. With over 20 years of teaching experience, he has authored six books, including textbooks and monographs, and has been on the editorial boards of several respected international journals. He has guest edited special issues of the Journal of Global Optimization and Optimization Letters (both Springer Nature) and Optimization (Taylor & Francis). A DST Fast Track Fellow (2001-2002), Prof. Mishra has published over 150 papers and supervised 15 Ph.D. students. He has visited around 15 institutes/universities in countries such as France, Canada, Italy, Spain, Japan, Taiwan, China, Singapore, Vietnam, and Kuwait.

Bhagwat Ram is a Senior Research Fellow at the DST Centre for Interdisciplinary Mathematical Sciences, Institute of Science, Banaras Hindu University, Varanasi. He holds an M.Sc. in Computer Science and co-authored the book Introduction to Linear Programming with MATLAB with Prof. Shashi Kant Mishra. He is currently developing generalized gradient methods for solving unconstrained optimization problems and teaches graduate MATLAB practicals at the Centre for Interdisciplinary Mathematical Sciences at Banaras Hindu University. He received an international travel grant from the Council of Scientific and Industrial Research, Government of India, to attend a summer school on linear programming at the University of New South Wales, Australia, in January 2019.