Roberval



LABEX MS2T Seminars

LABEX MS2T SEMINAR, 12 July 2016

At the Labex MS2T seminar of 12 July 2016, we had the pleasure of hearing a talk by Anand Sanchez, professor at CINVESTAV, on the topic:

Control and navigation of quadrotors performing aggressive maneuvers

The seminar (open to all) took place at the Université de Technologie de Compiègne, Room GI 042, Bâtiment Blaise Pascal (Génie Informatique).

Abstract of the talk:

Nowadays, quadrotors have great potential in a wide variety of applications. In recent years their degree of autonomy and performance have increased significantly, mainly thanks to technological innovations that facilitate their construction and control. However, due to their small size and limited payload, it is necessary to find more efficient solutions for such systems. In this talk I will present the synthesis of estimation and control algorithms that are robust with respect to endogenous and exogenous disturbances, for the autonomous flight of multiple quadrotors. In particular, we consider disturbances that are not necessarily differentiable in the usual (integer-order) sense. The proposed approach is based on fractional-order differintegral operators that provide uniformly continuous control signals in order to exactly compensate for these disturbances. In addition, the proposed algorithms allow interactions between quadrotors for tracking aggressive maneuvers.
Experimental results will be presented.
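The fractional-order differintegral operators mentioned in the abstract can be approximated numerically with the classical Grünwald–Letnikov scheme. The sketch below is a minimal illustration of that scheme only, not the speaker's implementation; the function name and step size are illustrative assumptions.

```python
import math

def gl_fractional_derivative(f, t, alpha, h=1e-3):
    """Grunwald-Letnikov approximation of the fractional derivative
    D^alpha f(t), using the function's full history on [0, t]."""
    n = int(t / h)
    w = 1.0       # weight (-1)^k * binom(alpha, k) for k = 0
    acc = 0.0
    for k in range(n + 1):
        acc += w * f(t - k * h)
        w *= (k - alpha) / (k + 1)   # recurrence for the next weight
    return acc / h ** alpha

# Sanity check: for alpha = 1 the formula reduces to a backward difference.
d1 = gl_fractional_derivative(lambda t: t ** 2, 1.0, 1.0)    # ~ 2.0

# Half-order derivative of f(t) = t; the exact value is 2*sqrt(t/pi).
d_half = gl_fractional_derivative(lambda t: t, 1.0, 0.5)     # ~ 1.128
```

Note that, unlike an integer-order derivative, the value at time t depends on the whole past of f, which is exactly what gives fractional-order controllers their memory.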

LABEX MS2T SEMINAR, 20 June 2016

At the Labex MS2T seminar of 20 June 2016, we had the pleasure of hearing a talk by Elias Cueto, professor at the University of Zaragoza, on the topic:

Model order reduction techniques for computational surgery

The seminar (open to all) took place at the Université de Technologie de Compiègne, Room GI 042, Bâtiment Blaise Pascal (Génie Informatique).

Abstract of the talk:

Numerical simulation for surgery training or planning is a formidable task that, despite current computational capabilities, has had limited success in real surgical practice. This is due to the obvious complexity of human anatomy and to the different phenomena taking place at different time and length scales during surgery.

Research in my group has been driven by the need for model order reduction of this formidable task. This seminar will give an overview of recent results in the field, working towards the construction of our own surgical simulator. It will also include some very recent results on the development of patient-specific models (the so-called patient “avatars”). These include the use and development of state-of-the-art linear and non-linear model order reduction techniques.
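As a rough illustration of what linear model order reduction does, the toy sketch below (an assumption-laden example, not the group's simulator code) extracts the dominant proper orthogonal decomposition (POD) mode from a set of simulation snapshots by power iteration, then reconstructs a snapshot from that single mode.

```python
import math

def dominant_pod_mode(snapshots, iters=200):
    """Power iteration on the snapshot correlation matrix C = S S^T,
    returning the dominant (unit-norm) POD mode."""
    n = len(snapshots[0])          # spatial dimension
    v = [1.0] * n
    for _ in range(iters):
        # C v computed without forming C: C v = sum_j (s_j . v) s_j
        cv = [0.0] * n
        for s in snapshots:
            coef = sum(si * vi for si, vi in zip(s, v))
            for i in range(n):
                cv[i] += coef * s[i]
        norm = math.sqrt(sum(x * x for x in cv))
        v = [x / norm for x in cv]
    return v

# Synthetic snapshots all dominated by one spatial shape phi:
phi = [math.sin(math.pi * i / 9) for i in range(10)]
snapshots = [[a * p for p in phi] for a in (1.0, 2.0, 3.0)]
mode = dominant_pod_mode(snapshots)

# Each snapshot is recovered from its projection onto the single mode:
coef0 = sum(m * s for m, s in zip(mode, snapshots[0]))
recon0 = [coef0 * m for m in mode]
err = max(abs(a - b) for a, b in zip(recon0, snapshots[0]))
```

In a real simulator the snapshots would be finite-element solution fields and several modes would be kept, but the principle is the same: solve in a low-dimensional basis instead of the full mesh.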

LABEX MS2T SEMINAR, 18 April 2016

At the Labex MS2T seminar of 18 April 2016, we had the pleasure of hearing a talk by Stéphane Régnier, professor at Université Pierre et Marie Curie and head of the INTERACTION team of the Institut des Systèmes Intelligents et de Robotique (ISIR), on the topic:

Microrobotics and microrobots: interactive systems for small scales

The seminar (open to all) took place at the Université de Technologie de Compiègne, Room GI 042, Bâtiment Blaise Pascal (Génie Informatique).

Abstract of the talk:

In this presentation we will see how systems at small scales differ from macroscopic systems, and detail the challenges associated with miniaturization. I will focus in particular on the design of micromanipulation platforms and magnetic microrobots. I will discuss issues related to dynamic sensors for visual monitoring and force measurement, as well as the teleoperation of microrobotic systems with force feedback.
I will illustrate the presentation with experimental examples developed at ISIR.

LABEX MS2T SEMINAR, 13 October 2015

At the Labex MS2T seminar of 13 October 2015, we had the pleasure of hearing a talk by Alexis Tsoukias, CNRS Research Director and Director of LAMSADE, Université Paris Dauphine, on the topic:

What is a Decision Problem?

The seminar took place at the Université de Technologie de Compiègne, Amphi Bessel, Centre de Recherches de Royallieu.

Abstract of the talk:

The talk introduces a general framework through which any type of decision problem should be characterisable. The aim of the framework is to show that only a finite number of primitive characteristics describe a decision problem, thus reducing the number of archetypes to a finite number of combinations. The talk presents some of these characteristics in some detail. We also provide a number of simple examples to show how this framework helps in categorising current decision problems, in reformulating decision problems, and in constructing formal arguments for the decision-aiding process.

LABEX MS2T SEMINAR, 8 September 2015

At the Labex MS2T seminar of 8 September 2015, we had the pleasure of hearing a talk by Dan Istrate, holder of the E-Biomed Chair (BMBI Laboratory, UTC), on the topic:

EBiomed Chair: Ambient assisted living and biomedical connected objects

Abstract of the talk:

The main objective of the EBiomed Chair is to accelerate the development of the eHealth research program within the IUIS (Institut Universitaire d’Ingénierie pour la Santé), more specifically through its scientific program: developing connected biomedical tools (targets: chronic diseases, pregnant women, the elderly, functional rehabilitation) while integrating the human sciences (usability, acceptability, ergonomics, …). The two main axes of the chair’s program are biomedical connected tools and robotics for eHealth. An integrated living-lab platform is being developed within the Innovation Center of UTC.
The two main topics of the presentation are sound environment analysis, in order to recognize everyday life sounds using pattern recognition algorithms (GMM, HMM), and multimodal data fusion through two different approaches: fuzzy logic and evidential networks.
The system developed in the framework of the EMonitor’age project will be presented as an example.
The main scientific aspects to be developed are biomedical signal processing, pattern recognition, multimodal data fusion taking uncertainties into account, and multisystem collaboration.
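The GMM-based sound recognition mentioned above can be illustrated with a deliberately tiny sketch: one 1-D Gaussian per sound class (a single-component GMM), with classification by maximum log-likelihood. Real systems use multidimensional acoustic features (e.g. MFCCs) and multi-component GMMs or HMMs over frame sequences; all names and values below are illustrative assumptions, not the EMonitor'age system.

```python
import math

def fit_gaussian(samples):
    """Maximum-likelihood fit of a 1-D Gaussian (single-component GMM)."""
    mu = sum(samples) / len(samples)
    var = sum((x - mu) ** 2 for x in samples) / len(samples)
    return mu, var

def log_likelihood(x, mu, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

# Toy scalar "acoustic feature" per sound class.
training = {
    "door_slam": [0.9, 1.1, 1.0, 0.95],
    "speech":    [3.0, 3.2, 2.8, 3.1],
}
models = {label: fit_gaussian(xs) for label, xs in training.items()}

def classify(x):
    """Assign a new frame to the class with the highest log-likelihood."""
    return max(models, key=lambda lbl: log_likelihood(x, *models[lbl]))
```

For example, `classify(1.05)` picks the class whose Gaussian best explains the feature value; the same maximum-likelihood rule, applied per frame and chained through state transitions, is what an HMM recognizer does.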

LABEX MS2T SEMINAR, 6 May 2015

At the Labex MS2T seminar of 6 May 2015, we had the pleasure of hearing a talk by Hermann Matthies, Professor in the Carl-Friedrich-Gauss Faculty and Director of the Institute of Scientific Computing at Technische Universität Braunschweig, Germany, on the topic:

Identification, Uncertainty Quantification and Bayesian Updating

Abstract of the talk:

In inverse problems and identification procedures, there are unknown, and usually not directly observable, quantities which have to be inferred from observations that are only indirectly linked to them. If these unknown quantities, and our knowledge/information about them, are modelled with the methods of probability theory, then it is possible to use methods based on Bayes's theorem to condition this knowledge description on the indirect observations.
The mathematical framework for such a computational task will be sketched. It turns out that the crucial component is the ability to predict, in mathematical models, the effect of "input" uncertainties on "output"/observable quantities. This "uncertainty quantification" can in principle be done with Monte Carlo methods, but these are often very slow. Here we use "functional approximations", where the unknown random quantities are expressed as functions of known random variables. The mathematical framework for this will be explained.
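The simplest concrete instance of Bayesian updating from an indirect observation is the scalar Gaussian-conjugate case: a Gaussian prior on the unobservable quantity, a linear observation corrupted by Gaussian noise, and a closed-form posterior. This is a standard textbook example, not the speaker's formulation; all values are illustrative.

```python
def bayes_update(m, s2, y, h, r2):
    """Conjugate update for prior q ~ N(m, s2) and observation
    y = h*q + e with noise e ~ N(0, r2). Returns the posterior (mean, var)."""
    k = s2 * h / (h * h * s2 + r2)   # gain weighing prior vs. data
    m_post = m + k * (y - h * m)     # shift mean toward the observation
    s2_post = (1 - k * h) * s2       # observing always shrinks the variance
    return m_post, s2_post

m, s2 = 0.0, 4.0                           # vague prior on the unknown
m, s2 = bayes_update(m, s2, y=2.1, h=1.0, r2=1.0)
# posterior: m = 1.68, s2 = 0.8
```

The functional-approximation methods discussed in the talk generalize exactly this conditioning step to fields and nonlinear models, replacing the scalar unknown by an expansion in known random variables.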

LABEX MS2T SEMINAR, 7 April 2015

At the Labex MS2T seminar of 7 April 2015, we had the pleasure of hearing a talk by Peter Hehenberger, Deputy Head of the Institute of Mechatronic Design and Production, Johannes Kepler University Linz, Austria, on the topic:

Mechatronic Design: A Review and Outlook

Abstract of the talk:

Mechatronics makes use of the disciplines of mechanical engineering, electrical engineering/electronics and information technology in the design of successful products. The integration and interaction of these disciplines in the design process set mechatronic systems apart from other multidisciplinary systems. The increased complexity of mechatronic systems, resulting from the positive interaction of system elements from various disciplines, makes it necessary to use an adequate design methodology. In order to master the mechatronic design approach and benefit from it as much as possible, a hierarchical system design process is proposed, in which the discipline-specific design tasks need not be integrated as a whole at the mechatronic level of the design task. Consequently, the system models should cover the different views on a system as well as the different degrees of detail, leading from a hierarchy of models to a hierarchy of design parameters.

The presentation focuses on the proper use of models in mechatronic design, a very important instrument for carrying out complex activities such as engineering design. From the viewpoint of the engineering design process, models capture the knowledge gained during the project, and simulations and other model-based analyses yield information that may improve product knowledge and, potentially, the quality of many decisions made in the design process.

The current trend in mechatronics involves networked mechatronic systems, or cyber-physical systems (CPS), which can also be considered a sub-part of Systems of Systems (SoS). This couples the virtual world with the real world, e.g. when machines are controlled from a different location.

LABEX MS2T SEMINAR, 18 February 2015

At the Labex MS2T seminar of 18 February 2015, we had the pleasure of hearing a talk by Didier Caudron, Director Global Process Control, and Paul Baduel, Director Global Technology Innovation, at SANOFI Pasteur, on the topic:

Contribution of systems modelling to Sanofi Pasteur's innovation process for the development of vaccines.

Abstract of the talk:

Developing and improving vaccines and their manufacturing processes requires numerous experiments. These are time-consuming and demand extensive equipment and complex analytical resources. Moreover, these experiments generate a large flow of data that must be managed and interpreted.
It therefore seems clear that systems modelling and simulation tools can help reduce the number of experiments to be carried out in the laboratory or at industrial scale, and make better use of the data. The challenge is to provide reliable numerical tools that can be used alongside experimentation, or in place of certain experiments.
The first part of the talk presents Sanofi Pasteur's innovation process, in order to set the context and allow participants to see at which stages systems modelling and simulation tools could be integrated. The second part presents an experiment carried out by Sanofi Pasteur, with the help of an external laboratory, on multi-agent system technology. Sanofi Pasteur's goal was to verify that this technology can make a "macroscopic" behaviour emerge from the description of elementary behaviours.
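The idea of macroscopic behaviour emerging from elementary agent rules can be illustrated with a minimal consensus sketch (illustrative only, unrelated to Sanofi Pasteur's actual study): each agent repeatedly averages its state with its two neighbours on a ring, and a global agreement emerges from this purely local rule.

```python
def step(states):
    """One round in which every agent averages with its ring neighbours."""
    n = len(states)
    return [(states[(i - 1) % n] + states[i] + states[(i + 1) % n]) / 3
            for i in range(n)]

# Elementary behaviours: local averaging only; no agent sees the whole system.
states = [0.0, 10.0, 4.0, 6.0, 0.0]
for _ in range(200):
    states = step(states)
# Emergent macroscopic behaviour: all agents converge to the global
# mean (4.0), which no individual rule computes explicitly.
```

The equal-weight averaging matrix is doubly stochastic, so the global mean is conserved at every round; that conservation is what makes the emergent consensus value predictable.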

LABEX MS2T SEMINAR, 8 June 2012

At the Labex MS2T seminar of Friday 8 June 2012, we had the pleasure of hearing a talk by Francisco CHINESTA, Professor at the Ecole Centrale de Nantes and head of the Materials and Manufacturing Processes group of the GeM (Institut de Recherche en Génie Civil et Mécanique, UMR CNRS - Ecole Centrale de Nantes - Université de Nantes).

Francisco CHINESTA holds the EADS Corporate Foundation Chair, whose aim is to set up research projects in advanced modelling and numerical simulation of manufacturing processes for composite structures in the aeronautics and space industries and, more generally, to help meet the many major technological challenges of these sectors.

Abstract of the talk:

Today many problems in science and engineering remain intractable, in spite of the impressive progress attained in mechanical modelling, numerical analysis, discretization techniques and computer science during the last decade, because their numerical complexity is simply unimaginable. We can distinguish several challenging scenarios:

• The first one concerns models that are defined in high-dimensional spaces, usually encountered in quantum chemistry and in kinetic theory descriptions of complex fluids. Models defined in high-dimensional spaces suffer from the so-called curse of dimensionality.

• The second problem category involves time-dependent problems, not necessarily defined in high-dimensional spaces, whose spectrum of characteristic times is so wide that standard incremental time-discretization techniques cannot be applied.

• Real-time simulators are needed in many applications, e.g. surgical simulators. Control, malfunction identification and reconfiguration of systems also need to run in real time.

• Problems of the fourth category are defined in degenerate geometrical domains, such as plate- or shell-like domains. Standard grid-based 3D discretization methods then quickly become impractical, given the compulsory discretization of the small length scales, which yields extremely fine meshes.

• Many problems in process control, parametric modeling, inverse identification, and process or shape optimization usually require, when approached with standard techniques, the direct computation of a very large number of solutions for particular values of the problem parameters.

• Traditionally, Simulation-based Engineering Sciences relied on static data inputs to perform simulations. A new paradigm has emerged: Dynamic Data-Driven Application Systems (DDDAS), which entail the ability to dynamically incorporate data into an executing application.

• Augmented reality is another area in which efficient (fast and accurate) simulation is urgently needed. The idea is to supply, in real time, appropriate information overlaid on the reality perceived by the user.

• Light computing platforms (tablets or smartphones) are appealing alternatives to heavy computing platforms, which are generally expensive and whose use requires technical knowledge.

The main challenge is to address the modeling and simulation of "real" models encountered in science and engineering in all their geometrical and constitutive complexity, some of which have never been solved until now because of their computational complexity. These models should be solved very fast, in some cases in real time, using light computing platforms. Classical simulation techniques fail to fulfill these requirements.

An appealing alternative consists of computing off-line solutions of parametric models, in which all sources of variability (loads, boundary conditions, material parameters, geometrical parameters, etc.) are considered as extra coordinates. Thus, by solving the resulting multidimensional model only once, we have access to the solution of the model for any value of the parameters considered as extra coordinates. From this general solution, computed only once and off-line, we can then perform on-line, in real time, post-processing, optimization, inverse analysis, sensitivity analysis, stochastic analysis and so on, using very light computing platforms such as smartphones. We can also adapt the model on-line while its simulation is running, within the framework of Dynamic Data-Driven Application Systems (DDDAS).

The price to be paid is the solution of parametric models defined in high-dimensional spaces that may involve hundreds of coordinates. The Proper Generalized Decomposition that we recently proposed, and are intensively developing, allows such a solution: thanks to the separated representation of the unknown fields, the computational complexity scales linearly with the dimensionality, instead of growing exponentially, as is characteristic of mesh-based discretization techniques. This off-line/on-line, Proper Generalized Decomposition based, Dynamic Data-Driven Application Systems approach could constitute a new paradigm in computational sciences and engineering.
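The linear-versus-exponential scaling claim behind separated representations can be made concrete with a back-of-the-envelope storage count. The values of n, d and the number of modes R below are illustrative assumptions, not figures from the talk.

```python
def full_grid_size(n, d):
    """Entries needed to tabulate u(x1,...,xd) on a full grid with
    n points per coordinate: grows exponentially with d."""
    return n ** d

def separated_size(n, d, modes):
    """Entries for a separated (PGD-style) representation
    u ~ sum_{r=1}^{R} prod_{i=1}^{d} F_i^r(x_i): one 1-D function per
    coordinate and per mode, so storage grows linearly with d."""
    return modes * d * n

n, modes = 100, 20
for d in (3, 10):
    print(d, full_grid_size(n, d), separated_size(n, d, modes))
```

With n = 100 points per axis, a full grid in d = 10 dimensions would need 10^20 entries, while 20 separated modes need only 20,000; this is the sense in which the curse of dimensionality is circumvented.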