Research
Centre Scientific Computing: Applied Linear Algebra
sc:ala seminar
2012 - Spring semester

June 14, 11:00 AM, Room #60
A polygon representing a transportation network is given, together with a point p in its interior. We aim to extend the network by inserting a line segment, called a feed-link, which connects p to the boundary of the polygon. The geometric dilation of a point q on the boundary is the ratio between the length of the shortest path from p to q through the extended network and their Euclidean distance. The utility of a feed-link is inversely proportional to the maximal dilation over all boundary points. We give a linear-time algorithm for computing the feed-link with the minimum overall dilation, thus improving upon the previously known algorithm of almost O(n log n) complexity.

June 12, 1:00 PM, Amphitheatre I

June 7, 12:00 PM, Room #60

May 10, 12:00 PM, Room #60

April 26, 12:00 PM, Room #60
As is well known, the class of strictly diagonally dominant (SDD) matrices is intimately related to the Jacobi iterative method; in a similar fashion, the class of Nekrasov matrices arises from the Gauss-Seidel method. There is a connection between these two classes, which enables several interesting applications.

April 12, 12:00 PM, Room #60
Based on the book by Fuzhen Zhang, The Schur Complement and Its Applications, we present the early development of the Schur complement and illustrate its power as a rich and basic tool in mathematical research and applications. Special emphasis is on results concerning the eigenvalues of the Schur complement for some matrix classes based on diagonal dominance. We also analyze matrix classes that enjoy closure properties under Schur complementation, or, in other words, we examine which properties remain invariant under transformations of this type.

April 5, 12:00 PM, Room #60
We will look at a variation of one of the most important problems in
computer science - devising a comparison-based algorithm for sorting n given values in ascending order. Our goal is to explore the possibilities and obstacles in producing a (deterministic or randomized) algorithm that outputs an order that is "not far" from the ascending order and, hopefully, runs faster than the classical sorting algorithms.

March 29, 12:00 PM, Room #60
Consider the following problem: we are given a countably infinite
base structure S in a finite relational language; the goal is to
determine its reducts, i.e., all relational structures T on the domain of
S which have a first-order definition in S. When doing so, we identify reducts
T, T' of S when they are first-order interdefinable.

March 22, 12:00 PM, Room #60
Neural network models of the human brain are large-scale, multi-time-scale nonlinear dynamical systems that capture neuron activity and the synaptic changes of the formed cognitive maps. These systems are the basis for every single cognitive task, and their complex dynamical behavior is being investigated mathematically more and more thoroughly. These biological networks inherently undergo many parametric perturbations, so understanding the instabilities of their dynamical behavior is an important task. In this talk, we will first develop conditions on the activation functions, synaptic weights, and impact weights of external stimuli, relative to the neural timescales, that guarantee global stability of a network, and then investigate the influence of certain perturbations that occur in nature.

March 8, 12:00 PM, Room #60
The simple and nice idea of localizing the eigenvalues of a given matrix by circles, as done by Geršgorin in his famous theorem, can be combined with a similarity transformation that corresponds to the first step of Gaussian elimination. The resulting area can localize the eigenvalues much better than the original Geršgorin set.
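The Geršgorin idea from the March 8 talk can be sketched numerically. The snippet below is a minimal illustration, not the speaker's construction: the function names and the particular elimination-based similarity B = L A L^{-1} are my own choices. Every eigenvalue of A lies in the union of its Geršgorin disks; since B is similar to A, the (often smaller) disks of B localize the same eigenvalues.

```python
import numpy as np

def gershgorin_disks(A):
    """(center, radius) pairs; every eigenvalue of A lies in the union of the disks."""
    A = np.asarray(A)
    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)
    return list(zip(centers, radii))

def eliminate_similarity(A):
    """B = L A L^{-1} with L = I - m e_1^T, where m holds the multipliers
    m_i = a_i1 / a_11 of the first Gaussian-elimination step.  B has the same
    eigenvalues as A, but its Gershgorin disks may give a tighter localization."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L, Linv = np.eye(n), np.eye(n)
    m = A[1:, 0] / A[0, 0]
    L[1:, 0] = -m
    Linv[1:, 0] = m
    return L @ A @ Linv

def in_disks(lam, disks):
    return any(abs(lam - c) <= r + 1e-9 for c, r in disks)

A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.5],
              [0.5, 0.5, 5.0]])
B = eliminate_similarity(A)
# every eigenvalue of A lies both in A's disks and in B's disks
for lam in np.linalg.eigvals(A):
    assert in_disks(lam, gershgorin_disks(A))
    assert in_disks(lam, gershgorin_disks(B))
```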
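The two matrix classes from the April 26 abstract can be checked directly. This is a sketch under my own naming and example matrix, using the standard recursive Nekrasov condition: strict diagonal dominance guarantees convergence of the Jacobi iteration, and every SDD matrix is also a Nekrasov matrix.

```python
import numpy as np

def is_sdd(A):
    """Strict diagonal dominance: |a_ii| > sum_{j != i} |a_ij| for every row."""
    d = np.abs(np.diag(A))
    return bool(np.all(d > np.sum(np.abs(A), axis=1) - d))

def is_nekrasov(A):
    """Nekrasov condition: |a_ii| > h_i(A) for all i, where
    h_1 = sum_{j>1} |a_1j|  and
    h_i = sum_{j<i} |a_ij| h_j / |a_jj| + sum_{j>i} |a_ij|."""
    A = np.abs(np.asarray(A, dtype=float))
    n = A.shape[0]
    h = np.zeros(n)
    for i in range(n):
        h[i] = (sum(A[i, j] * h[j] / A[j, j] for j in range(i))
                + sum(A[i, j] for j in range(i + 1, n)))
        if A[i, i] <= h[i]:
            return False
    return True

def jacobi(A, b, iters=100):
    """Jacobi iteration x <- D^{-1} (b - (A - D) x); converges whenever A is SDD."""
    D = np.diag(A)
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - (A @ x - D * x)) / D
    return x

A = np.array([[10.0, 2.0, 1.0],
              [1.0, 8.0, 2.0],
              [2.0, 1.0, 9.0]])
b = A @ np.ones(3)            # exact solution is [1, 1, 1]
```

A by-hand induction on the definition of h_i shows h_i <= r_i (the off-diagonal row sum) whenever A is SDD, which is why the SDD check implies the Nekrasov check here.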
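Two classical facts behind the April 12 talk are easy to verify on a small example (the matrix and names below are my own illustration): Schur's determinant formula det(M) = det(A) det(M/A), and one closure property, namely that the Schur complement of a symmetric positive-definite matrix is again symmetric positive definite.

```python
import numpy as np

def schur_complement(M, k):
    """Schur complement M/A = D - C A^{-1} B of the leading k-by-k block A
    in the 2x2 block partition M = [[A, B], [C, D]]."""
    A, B = M[:k, :k], M[:k, k:]
    C, D = M[k:, :k], M[k:, k:]
    return D - C @ np.linalg.solve(A, B)

# symmetric and strictly diagonally dominant with positive diagonal,
# hence positive definite
M = np.array([[4.0, 1.0, 0.0, 1.0],
              [1.0, 5.0, 1.0, 0.0],
              [0.0, 1.0, 6.0, 1.0],
              [1.0, 0.0, 1.0, 7.0]])
S = schur_complement(M, 2)
```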
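The dilation notion from the June 14 abstract can be made concrete with a brute-force sketch. This is only an illustration under simplifying assumptions of mine: the feed-link anchor and the evaluated points q are restricted to polygon vertices, and the evaluation is quadratic overall, not the linear-time algorithm of the talk.

```python
import math

def perimeter_dists(poly):
    """Cumulative distance along the polygon boundary, starting at vertex 0;
    the last entry is the full perimeter."""
    n = len(poly)
    cum = [0.0]
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    return cum

def max_dilation(poly, p, f_idx):
    """Max over vertices q of (|p f| + boundary distance f -> q) / |p q|
    for a feed-link from the interior point p to vertex f_idx."""
    cum = perimeter_dists(poly)
    per = cum[-1]
    feed = math.hypot(p[0] - poly[f_idx][0], p[1] - poly[f_idx][1])
    worst = 1.0
    for q in range(len(poly)):
        if q == f_idx:
            continue
        along = abs(cum[q] - cum[f_idx])
        along = min(along, per - along)      # shorter way around the boundary
        eucl = math.hypot(p[0] - poly[q][0], p[1] - poly[q][1])
        worst = max(worst, (feed + along) / eucl)
    return worst

square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
p = (1.0, 1.0)
d = max_dilation(square, p, 0)   # worst case is the vertex opposite the anchor
```

Minimizing `max_dilation` over all candidate anchors then picks the best feed-link among the vertices; by symmetry, every vertex of the square is equally good for its centre.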