Bibliographic record
The algebra and the geometry aspect of Deep learning
- Authors
- Tsemo Aristide
- Publication year
- 2025
- OA status
- oa_green
Print
Need access?
Ask circulation staff for physical copies or request digital delivery via Ask a Librarian.
Digital copy
Unavailable in your region (PD status unclear).
Abstract
This paper investigates the foundations of deep learning through the lenses of
geometry, algebra, and differential calculus. At its core, artificial
intelligence relies on the assumption that data and its intrinsic structure can
be embedded into vector spaces, allowing for analysis through geometric and
algebraic methods. We trace the development of neural networks from the
perceptron to the transformer architecture, emphasizing the underlying
geometric structures and differential processes that govern their behavior. Our
original approach highlights how the canonical scalar product on matrix spaces
naturally leads to the backpropagation equations, yielding a coordinate-free
formulation. We explore how classification problems can be reinterpreted using
tools from differential and algebraic geometry, suggesting that manifold
structure, the degree of a variety, and homology may inform both the
convergence and the interpretability of learning algorithms. We further examine
how neural networks can be interpreted via their associated directed graphs,
drawing a connection to a Quillen model defined in [1] and [13] to describe
memory as a homotopy-theoretic property of the associated network.
Copies & availability
Realtime status across circulation, reserve, and Filipiniana sections.
Self-checkout (no login required)
- Enter your student ID, system ID, or full name directly in the table.
- Provide your identifier so we can match your patron record.
- Choose Self-checkout to send the request; circulation staff are notified instantly.
| Barcode | Location | Material type | Status | Action |
|---|---|---|---|---|
| No holdings recorded. | | | | |
Digital files
Preview digitized copies when embargo permits.
- View digital file (original) · APPLICATION/PDF · 391 KB
Links & eResources
Access licensed or open resources connected to this record.
- OA Direct