Category Archives: Seminar

2014 nov 12 wednesday, 14h30 room C115

Tomer Kotek: Shape and Content: A database-theoretic perspective on the analysis of
data structures

The verification community has studied dynamic data structures
primarily in a bottom-up way by analyzing pointers and the shapes
induced by them. Recent work in fields such as separation logic has
made significant progress in extracting shapes from program source
code. Many real-world programs, however, manipulate complex data whose
structure and content are most naturally described by formalisms from
object-oriented programming and databases. In recent work, we attempt
to bridge the conceptual gap between these two communities. Our
approach is based on Description Logics (DLs), a widely used knowledge
representation paradigm which gives a logical underpinning for diverse
modeling frameworks such as UML and ER. We show how DLs can be used on
top of an existing shape analysis to add content descriptions to the
shapes.

Joint work with Diego Calvanese, Mantas Simkus, Helmut Veith and
Florian Zuleger.
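
For flavor, the kind of axioms involved might look as follows (an invented toy example; the concept and role names are hypothetical, not taken from the talk). Suppose a shape analysis has established that the next fields form an acyclic list; a DL TBox can then describe, in the spirit of UML/ER models, what the list cells contain:

    \mathit{Node}    \sqsubseteq \forall \mathit{next}.\mathit{Node}       % shape: next stays inside the list
    \mathit{Node}    \sqsubseteq \exists \mathit{owner}.\mathit{Customer}  % content: each cell records a customer
    \mathit{VIPNode} \equiv \mathit{Node} \sqcap \forall \mathit{owner}.\mathit{GoldCustomer}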

2014 sep 16 tuesday, 14h room C115

Julien Henry (Verimag): Improving static analysis with decision procedures

A widely used approach to static program analysis is Abstract Interpretation, which consists in computing an over-approximation of the set of reachable program states, for any possible program execution. The main challenge is then to compute an approximation precise enough to prove the desired properties (e.g., no division by zero, no arithmetic overflow, etc.). In this work, we propose several ways of using Satisfiability Modulo Theories (SMT) to improve the precision of the abstract interpretation framework. We present a static analyzer for C implementing these new methods, called PAGAI, which can use any SMT solver supporting SMT-LIB 2. Finally, we describe an interesting application of SMT to the estimation of Worst-Case Execution Time, which leads to formulas that are hard to solve for any production-grade solver based on DPLL(T). We show how a simple static analysis of the program dramatically helps the SMT solver decide unsatisfiability faster in this particular case.
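
A minimal sketch of the basic query such an SMT-aided analysis can pose (using the z3 Python bindings as a stand-in solver; the loop and candidate invariant below are invented for illustration, and PAGAI itself emits SMT-LIB 2 rather than Python): checking that a candidate invariant is inductive by searching for a one-step counterexample.

    # Toy loop, invented for illustration:
    #   while (x < 100) x = x + 1;   candidate invariant: 0 <= x <= 100
    from z3 import Int, Solver, And, Not, unsat

    x, x_next = Int('x'), Int('x_next')
    inv      = And(x >= 0, x <= 100)        # candidate invariant at loop head
    guard    = x < 100                      # loop guard
    step     = x_next == x + 1              # effect of one iteration
    inv_next = And(x_next >= 0, x_next <= 100)

    s = Solver()
    # A counterexample to inductiveness would be a state satisfying the
    # invariant and the guard whose successor violates the invariant.
    s.add(inv, guard, step, Not(inv_next))
    print("inductive" if s.check() == unsat else s.model())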

2013 sep 12 thursday, 11h room C05

Mehdi Mhalla and Frédéric Prost: Proofs and the game of chess

Determining the game-theoretic value of popular games such as checkers, chess, Othello or Go is a particularly interesting computational challenge. Given the size of the search spaces involved (around 10^45 legal positions for chess, for example), brute-force search cannot succeed; subtler proof strategies are therefore needed. Recently, in 2007, checkers was proved to be a draw. Building that proof took more than ten years and required hundreds of computers and a gigantic amount of memory. Using a new approach, we were able to show that chess on a 5 x 5 board (Gardner Chess) is also a draw. While the search spaces of the two games (checkers and 5 x 5 chess) are of the same order of magnitude (around 10^19 legal positions), our proof is human-readable (about ten pages) and was carried out with the help of personal computers. We will show how this approach can be automated and generalized, and discuss the perspectives we envision.

2013 may 7 thursday, 10h room C115

Vincent Nesme: A two-part introduction to quantum cellular automata, part one

In this first tutorial I will only discuss classical cellular automata, giving the definitions and main theorems, and illustrating the discussion with a program developed by Guillaume Theyssier of LAMA. The aim is really just to introduce the basic concepts, but if the audience's background allows, I may venture to explain the positive rates conjecture and how Gacs brilliantly refuted it.
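
To make the basic object concrete before the tutorial's definitions (a toy illustration, not Guillaume Theyssier's program): a one-dimensional cellular automaton updates a line of cells synchronously with a single local rule, here the elementary rule 110 with periodic boundary conditions.

    # Minimal 1D cellular automaton: elementary rule 110, periodic boundary.
    def step(config, rule):
        """One synchronous update of every cell from its radius-1 neighbourhood."""
        n = len(config)
        return [rule[(config[(i - 1) % n], config[i], config[(i + 1) % n])]
                for i in range(n)]

    # Local rule: neighbourhood (a, b, c) reads bit 4a + 2b + c of the rule number.
    rule110 = {(a, b, c): (110 >> (4 * a + 2 * b + c)) & 1
               for a in (0, 1) for b in (0, 1) for c in (0, 1)}

    config = [0] * 30 + [1] + [0] * 30    # a single live cell
    for _ in range(15):
        print(''.join('#' if x else '.' for x in config))
        config = step(config, rule110)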

2013 feb 21 thursday, 14h room C115

Frédéric Prost: On the Impact of Information Technologies on Society: an Historical Perspective through the Game of Chess

The game of chess has always been viewed as an iconic representation of intellectual prowess. Since the very beginning of computer science, the challenge of programming a computer capable of playing chess and beating humans has been alive, used both as a yardstick of hardware and software progress and as an ongoing programming challenge leading to numerous discoveries. In the early days of computer science it was a topic for specialists, but as computers were democratized and the strength of chess engines began to increase, chess players started to appropriate these new tools for themselves. We show how these interactions between the world of chess and information technologies have been a herald of the broader social impacts of information technologies. The game of chess, and more broadly the chess world (chess players, literature, computer software and websites dedicated to chess, etc.), turns out to be a particularly sharp indicator of the changes induced in our everyday life by information technologies. Moreover, in the same way that chess is a model of war that captures the raw features of strategic thinking, the chess world can be seen as a small society, making the impact of information technologies easier to analyze and to grasp.

2012 dec 19 wednesday, 11h room C115

Vincent Aravantinos: On the Formal Analysis of Physical Systems using Theorem Proving

Formal methods have classically been used for the verification of software, circuits or protocols. In most cases, verification deals with an abstraction of the underlying physical system (e.g., software verification does not involve modeling a computer, circuit verification does not model semiconductors). We propose to verify such physical systems themselves, focusing in particular on optical systems, which have several applications (from fiber optics to quantum computer implementations). This requires the formalization of physical concepts (e.g., ray of light, lens, electromagnetic wave or photon), of the physical principles that govern them (e.g., Fermat's principle or Maxwell's equations), and the derivation, from these principles, of various laws that are useful for the verification of concrete systems (e.g., Snell's law or the uncertainty principle). We will explain how these concepts and principles can be successfully formalized (in HOL-Light) as well as the limitations of this approach.
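
A classical instance of the kind of derivation mentioned above (a textbook sketch of the informal argument, not the HOL-Light development itself): Fermat's principle states that light follows a path of stationary travel time, and Snell's law drops out of a single differentiation. For a source at height a above a flat interface, a target at depth b below it, horizontal separation d and crossing point x,

    T(x) = \frac{n_1\sqrt{a^2 + x^2}}{c} + \frac{n_2\sqrt{b^2 + (d-x)^2}}{c},
    \qquad
    \frac{dT}{dx} = 0
    \;\Longrightarrow\;
    \frac{n_1 x}{\sqrt{a^2 + x^2}} = \frac{n_2 (d-x)}{\sqrt{b^2 + (d-x)^2}},

which is exactly n_1 \sin\theta_1 = n_2 \sin\theta_2 for the angles \theta_1, \theta_2 measured from the normal.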

2012 nov 30 friday, 11h30 room C05

Thomas Cuvillier: 2-categories for protocols with mixed states

In this presentation, I'll show how one can encode quantum protocols into a 2-category, where classical data is seen as a correlation between quantum systems, and thus arises in a meaningful fashion. This idea was first applied to Hilbert spaces, resulting in a category where one sees quantum states as pure and quantum operations as reversible. I will show how one can extend this construction to a 2-category where quantum states can be mixed and quantum operations are no longer reversible. I will focus in particular on measurement, which encodes the correlation between quantum systems and classical data, and highlights the key difference between the two approaches.

2012 nov 20 tuesday, 10h30 room C115

Raphael Dias da Silva: Compact quantum circuits from one-way quantum computation (and applications for circuit optimisation)

In this talk I will introduce a new method to translate measurement patterns to the circuit model and discuss some of its applications. We start by giving a straightforward circuit representation of any 1WQC, at the cost of introducing many ancilla wires. We then propose a set of four simple circuit identities that exploit the relationship between the entanglement resource and the correction structure of a 1WQC, allowing one to obtain equivalent circuits acting on fewer qubits. In the second part of the talk, I will show how to optimize arbitrary quantum circuits by using back-and-forth translation between the QC and 1WQC models.
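
For readers unfamiliar with the correspondence, the standard one-step building block from the measurement-calculus literature (background material, not necessarily the construction of the talk): preparing an ancilla qubit 2 in |+>, entangling it with the input qubit 1 by a controlled-Z, measuring qubit 1 at angle -\alpha and applying the outcome-conditioned correction X_2^{s_1} implements, on the remaining wire,

    J(\alpha) \;=\; \frac{1}{\sqrt{2}}
    \begin{pmatrix} 1 & e^{i\alpha} \\ 1 & -e^{i\alpha} \end{pmatrix},

so each measurement round of a pattern becomes one ancilla wire, one CZ, one rotated measurement and one classically controlled X in the circuit picture.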

2012 nov 13 tuesday, 10h30 room C115

Jan Bouda: Weak randomness seriously limits the security of quantum key distribution

In usual security proofs of quantum protocols the adversary (Eve) is expected to have full control over any quantum communication between the communicating parties (Alice and Bob). Eve is also expected to have full access to an authenticated classical channel between Alice and Bob. Unconditional security against any attack by Eve can be proved even in the realistic setting of device and channel imperfections. In this paper we show that the security of QKD protocols is ruined if one allows Eve even very limited access to the random sources used by Alice, provided the participants use the standard highly efficient sublinear eavesdropper checking. Such knowledge should always be expected in realistic experimental conditions, via side channels.
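
A toy illustration of the general threat (an invented simulation, not the paper's model or its quantitative bounds): if Eve's partial knowledge of Alice's randomness reveals which positions will be used for eavesdropper checking, an intercept-resend attack on the remaining positions goes entirely undetected.

    import random

    N = 1000        # number of transmitted qubits
    CHECK = 32      # sublinear number of positions tested for errors
    TRIALS = 2000

    def detection_rate(eve_knows_checks):
        """Fraction of runs in which an intercept-resend attack is caught."""
        detected = 0
        for _ in range(TRIALS):
            checks = set(random.sample(range(N), CHECK))
            if eve_knows_checks:
                # Eve attacks exactly the positions that will not be tested.
                attacked = set(range(N)) - checks
            else:
                # A blind Eve attacks half of the positions at random.
                attacked = set(random.sample(range(N), N // 2))
            # Intercept-resend on a tested position yields a detectable
            # error with probability 1/4 (BB84).
            if any(random.random() < 0.25 for _ in checks & attacked):
                detected += 1
        return detected / TRIALS

    print("blind Eve:       ", detection_rate(False))   # close to 1
    print("Eve knows checks:", detection_rate(True))    # exactly 0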

2012 nov 8 thursday, 15h30 room C05

David Cattanéo: A Study of the Parameterized Complexity of Domination Problems and an Application to Linear Codes

We have studied the parameterized complexity of domination problems.
(sigma,rho)-domination is a general framework introduced by Telle to unify domination problems: a set D of vertices of a graph G is (sigma,rho)-dominating if for every v in D, |N(v) ∩ D| belongs to sigma, and for every v in V\D, |N(v) ∩ D| belongs to rho.
Our main result is that for all recursive sigma and rho, the (sigma,rho)-domination problem remains in W[2] when parameterized by the size of the dominating set. This result is optimal, given that certain particular subproblems of (sigma,rho)-domination are W[2]-complete. We also proved that its dual parameterization, i.e. when the parameter is the size of the dominated set, remains in W[2], as does a generalization of these problems that includes Connected Dominating Set.
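
The definition above has a direct executable reading (an illustrative sketch; the example graph and predicates are ours, not the talk's):

    def sigma_rho_dominating(G, D, sigma, rho):
        """Telle's definition: G maps each vertex to its neighbour set,
        D is a vertex set, sigma and rho are predicates on |N(v) & D|."""
        return (all(sigma(len(G[v] & D)) for v in D) and
                all(rho(len(G[v] & D)) for v in set(G) - D))

    # Classical problems as special cases of (sigma, rho)-domination:
    dominating_set  = lambda G, D: sigma_rho_dominating(
        G, D, sigma=lambda k: True,   rho=lambda k: k >= 1)
    independent_set = lambda G, D: sigma_rho_dominating(
        G, D, sigma=lambda k: k == 0, rho=lambda k: True)
    perfect_code    = lambda G, D: sigma_rho_dominating(
        G, D, sigma=lambda k: k == 0, rho=lambda k: k == 1)

    # The 4-cycle: {0, 2} is an independent dominating set but not a perfect code.
    C4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
    print(dominating_set(C4, {0, 2}),   # True
          independent_set(C4, {0, 2}),  # True
          perfect_code(C4, {0, 2}))     # False: vertex 1 has two D-neighbours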

We also considered problems from coding theory that are related to domination problems with parity constraints. In particular, we showed that the minimum distance problem over Fq is in W[2] for both its standard and dual parameterizations, and is W[1]-hard for its dual parameterization.

For the W[2]-membership proofs, we extended Cesati's Turing machine model for parameterized complexity by introducing a new kind of non-deterministic machine that can perform blind transitions, i.e. transitions that do not depend on the contents of the tapes. We proved that the associated problem, Short Blind Multi-Tape Non-Deterministic Turing Machine, is W[2]-complete. We also believe that this model can be used for other kinds of problems unrelated to domination.