Many problems arising in science and engineering aim to find a
function that optimizes a specified functional. Examples include
optimal control, inverse analysis, and optimal shape design. Only
some of these variational problems can be solved analytically, and
the only general technique is to approximate the solution using
direct methods. Unfortunately, variational problems are very
difficult to solve, and it becomes necessary to innovate in the
field of numerical methods in order to overcome these difficulties.
The objective of this PhD thesis is to develop a conceptual theory
of neural networks from the perspective of functional analysis and
the calculus of variations. Within this formulation, learning means
solving a variational problem by minimizing an objective functional
associated with the neural network. The choice of objective
functional depends on the particular application. On the other
hand, its evaluation may require the integration of functions,
ordinary differential equations, or partial differential equations.
As will be shown, neural networks can deal with a wide range of
applications in mathematics and physics.
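The idea of learning as minimizing an objective functional can be sketched with a toy example. The snippet below is an illustration of the general approach, not the thesis's own method or notation: a small one-hidden-layer network parameterizes a trial function for the classic problem of minimizing J[y] = ∫₀¹ y′(x)² dx subject to y(0)=0 and y(1)=1, whose exact minimizer is y(x) = x. The network architecture, trial form, and the use of finite-difference gradients are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer network N(x; theta) with H tanh units (illustrative choice).
H = 5
theta = 0.1 * rng.standard_normal(3 * H)  # hidden weights, biases, output weights

def net(x, p):
    w, b, v = p[:H], p[H:2 * H], p[2 * H:]
    return np.tanh(np.outer(x, w) + b) @ v

def trial(x, p):
    # Trial function with the boundary conditions y(0)=0, y(1)=1 built in:
    # the correction term x*(1-x)*N(x) vanishes at both endpoints.
    return x + x * (1.0 - x) * net(x, p)

xs = np.linspace(0.0, 1.0, 101)

def J(p):
    # Objective functional: integral of y'(x)^2 over [0, 1],
    # approximated with finite differences and the trapezoidal rule.
    y = trial(xs, p)
    dy = np.gradient(y, xs)
    f = dy ** 2
    return 0.5 * np.sum((f[1:] + f[:-1]) * np.diff(xs))

# Plain gradient descent on the parameters, with central-difference
# gradients of J (a simple stand-in for any optimizer).
eps, lr = 1e-6, 0.05
for _ in range(200):
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (J(theta + d) - J(theta - d)) / (2 * eps)
    theta -= lr * g

# The exact minimizer is y(x) = x, for which J = 1.
```

After training, the trial function approaches the straight line y(x) = x and the functional value approaches its minimum of 1. Problems whose functional involves differential equations follow the same pattern, with the ODE or PDE solved inside the evaluation of J.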