Persons
Ing. Peter Jung
All publications
Quantified Neural Markov Logic Networks
- Authors: Ing. Peter Jung, Marra, G., Ing. Ondřej Kuželka, Ph.D.
- Publication: International Journal of Approximate Reasoning. 2024, 171. ISSN 0888-613X.
- Year: 2024
- DOI: 10.1016/j.ijar.2024.109172
- Link: https://doi.org/10.1016/j.ijar.2024.109172
- Department: Department of Computer Science, Intelligent Data Analysis
Annotation:
Markov Logic Networks (MLNs) are discrete generative models in the exponential family whose potential functions are defined by weighted first-order logic rules. However, specifying these rules requires considerable expertise and can pose a significant challenge. To overcome this limitation, Neural MLNs (NMLNs) have been introduced, enabling the specification of potential functions as neural networks. Thanks to the compact representation of their neural potential functions, NMLNs have shown impressive performance in modeling complex domains such as molecular data. Despite this superior performance, the theoretical expressiveness of NMLNs is still equivalent to that of MLNs without quantifiers. In this paper, we propose a new class of NMLNs, called Quantified NMLNs, that extends the expressivity of NMLNs to the quantified setting. Furthermore, we show how to leverage the neural nature of NMLNs to employ learnable aggregation functions as quantifiers, increasing expressivity even further. We demonstrate the competitiveness of Quantified NMLNs over the original NMLNs and state-of-the-art diffusion models in molecule generation experiments.
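The idea of a learnable aggregation function acting as a quantifier can be illustrated with a generalized power mean. This is a sketch of the general principle only, not the paper's exact construction: one parameter interpolates between a soft universal and a soft existential quantifier, and in a neural model that parameter could be learned.

```python
import numpy as np

def power_mean(values, p):
    """Generalized power mean M_p over truth degrees in (0, 1].

    As p -> -inf, M_p -> min (a soft "forall");
    as p -> +inf, M_p -> max (a soft "exists");
    p = 1 is the plain average. Making p learnable lets a
    neural model choose its own quantifier-like aggregation.
    """
    v = np.clip(np.asarray(values, dtype=float), 1e-6, 1.0)
    return float(np.mean(v ** p) ** (1.0 / p))

truths = [0.9, 0.8, 0.1]                   # truth degrees of a formula over a domain
soft_forall = power_mean(truths, -20.0)    # dominated by the smallest value, ~0.1
soft_exists = power_mean(truths, 20.0)     # dominated by the largest value, ~0.9
```

With a large negative exponent the aggregate tracks the worst-satisfied grounding, as a universal quantifier would; with a large positive exponent it tracks the best-satisfied one.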
On Discovering Interesting Combinatorial Integer Sequences
- Authors: Svatoš, M., Ing. Peter Jung, Ing. Jan Tóth, Wang, Y., Ing. Ondřej Kuželka, Ph.D.
- Publication: Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence. International Joint Conferences on Artificial Intelligence Organization, 2023. p. 3338-3346. ISBN 978-1-956792-03-4.
- Year: 2023
- DOI: 10.24963/ijcai.2023/372
- Link: https://doi.org/10.24963/ijcai.2023/372
- Department: Department of Computer Science, Intelligent Data Analysis
Annotation:
We study the problem of generating interesting integer sequences with a combinatorial interpretation. For this, we introduce a two-step approach. In the first step, we generate first-order logic sentences which define some combinatorial objects, e.g., undirected graphs, permutations, matchings, etc. In the second step, we use algorithms for lifted first-order model counting to generate integer sequences that count the objects encoded by the first-order logic formulas generated in the first step. For instance, if the first-order sentence defines permutations, then the generated integer sequence is the sequence of factorial numbers n!. We demonstrate that our approach is able to generate interesting new sequences by showing that a non-negligible fraction of the automatically generated sequences can actually be found in the On-Line Encyclopedia of Integer Sequences (OEIS), while it also generates many similar sequences which are not present in OEIS and which are potentially interesting. A key technical contribution of our work is the method for generating first-order logic sentences, which drastically prunes the space of sentences by discarding a large fraction of sentences that would lead to redundant integer sequences.
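The counting step can be illustrated on the permutation example from the annotation: a first-order sentence stating that a binary relation R is a bijection has exactly n! models on a domain of size n. The paper relies on lifted model counting for efficiency; the naive enumeration below (a sketch, with illustrative names) only demonstrates the semantics.

```python
from itertools import product
from math import factorial

def count_bijection_models(n):
    """Count interpretations of a binary relation R on {0..n-1} that
    satisfy the first-order sentence saying R is (the graph of) a
    bijection: every x has exactly one R-successor, and every y has
    exactly one R-predecessor. The model count is n!.
    """
    domain = range(n)
    count = 0
    # Each interpretation of R is a subset of domain x domain,
    # encoded here as a tuple of n*n booleans.
    for bits in product([False, True], repeat=n * n):
        R = lambda x, y: bits[x * n + y]
        rows_ok = all(sum(R(x, y) for y in domain) == 1 for x in domain)
        cols_ok = all(sum(R(x, y) for x in domain) == 1 for y in domain)
        if rows_ok and cols_ok:
            count += 1
    return count

print([count_bijection_models(n) for n in range(4)])  # -> [1, 1, 2, 6]
```

Enumerating all 2^(n²) interpretations is exactly what lifted model counting avoids, which is why the paper's pipeline scales to domain sizes where this sketch would be hopeless.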
Graph Generation with Graphon Generative Adversarial Networks
- Authors: Ing. Peter Jung, Ing. Ondřej Kuželka, Ph.D.
- Publication: Proceedings of the 31st International Conference on Inductive Logic Programming. Proceedings of Machine Learning Research, 2022.
- Year: 2022
- Department: Department of Computer Science, Intelligent Data Analysis
Annotation:
Graphons are limits of converging sequences of graphs with a particularly simple representation: a graphon is simply a symmetric function of two variables on [0,1]². In this work, we develop an elegant GAN model, called GraphonGAN, which uses graphons implemented by neural networks as generators and graph neural networks as discriminators. We show that GraphonGAN is a decent model for modelling real-world networks. All source code will be available at https://github.com/kongzii/gangraphon
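The generator side of such a model rests on the standard W-random graph procedure: sample a latent position in [0,1] per vertex, then connect pairs with probability given by the graphon. The sketch below uses a hand-written W; in GraphonGAN the graphon is instead implemented by a neural network.

```python
import random

def sample_from_graphon(W, n, seed=0):
    """Sample an n-vertex graph from a graphon W: [0,1]^2 -> [0,1].

    Each vertex i receives a latent position u_i ~ Uniform[0,1];
    the edge {i, j} is then included independently with
    probability W(u_i, u_j).
    """
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < W(u[i], u[j]):
                edges.add((i, j))
    return edges

# An Erdos-Renyi G(n, 0.3) graph is the special case of a constant graphon.
edges = sample_from_graphon(lambda x, y: 0.3, 200)
```

Non-constant symmetric functions, e.g. W(x, y) = x * y, already produce graphs with heterogeneous degrees, which is what makes graphons a useful target for learning real-world network structure.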
Learning to Generate Molecules From Small Datasets Using Neural Markov Logic Networks
- Authors: Svatoš, M., Ing. Peter Jung, prof. Ing. Filip Železný, Ph.D., Marra, G., Ing. Ondřej Kuželka, Ph.D.
- Publication: International Joint Conference on Learning & Reasoning. Cham: Springer, 2022.
- Year: 2022
- Department: Department of Computer Science, Intelligent Data Analysis
Annotation:
Neural Markov Logic Networks are a statistical relational model capable of generating relational structures. In this paper, we investigate how this particular model behaves in a few-shot learning setting and show that Neural Markov Logic Networks are able to learn to generate small molecules from a handful of training examples without any pre-training.