Ben Goertzel

SingularityNET Foundation

Ben's lectures

AGI-20

Guiding Symbolic Natural Language Grammar Induction via Transformer-Based Sequence Probabilities

A novel approach to automated learning of the syntactic rules governing natural languages is proposed, based on using probabilities assigned to sentences (and potentially longer word sequences) by transformer neural network language models to guide symbolic learning processes such as clustering and rule induction. This method exploits the linguistic knowledge learned by transformers without any reference to their inner representations; hence, the technique is readily adaptable to the continuous appearance of more powerful language models.
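The core signal can be illustrated in a few lines of code. The sketch below assumes a HuggingFace GPT-2 checkpoint (not necessarily the model used in the paper): it scores sentences by transformer-assigned log-probability and uses the score change under word substitution as interchangeability evidence of the kind that could guide symbolic word clustering and rule induction. The substitution test itself is an illustrative stand-in, not the paper's procedure.

# Sketch: transformer sentence probabilities as guidance for symbolic
# grammar induction (illustrative; assumes the `transformers` library).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sentence_log_prob(sentence: str) -> float:
    """Total log-probability the language model assigns to the sentence."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels=input_ids, the model returns the mean token NLL as `loss`.
        loss = model(ids, labels=ids).loss
    return -loss.item() * (ids.shape[1] - 1)

def substitution_cost(word_a, word_b, templates):
    # Average log-probability drop when word_a is replaced by word_b in the
    # same contexts; small drops suggest the two words share a syntactic
    # category, evidence usable for symbolic clustering of words.
    drops = [sentence_log_prob(t.format(word_a)) - sentence_log_prob(t.format(word_b))
             for t in templates]
    return sum(drops) / len(drops)

templates = ["The {} sat on the mat.", "A {} was sleeping nearby."]
print(substitution_cost("cat", "dog", templates))      # small: same category
print(substitution_cost("cat", "quickly", templates))  # large: different category

Because only sentence probabilities are consumed, swapping in a stronger language model requires no changes beyond the two loading lines.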

AGI-20

Embedding Vector Differences Can Be Aligned With Uncertain Intensional Logic Differences

The DeepWalk algorithm is used to assign embedding vectors to nodes in the Atomspace weighted, labeled hypergraph that is used to represent knowledge in the OpenCog AGI system, in the context of an application to probabilistic inference regarding the causes of longevity based on data from biological ontologies and genomic analyses. It is shown that vector difference operations between embedding vectors are, under appropriate conditions, approximately alignable with “intensional difference” operations between the hypergraph nodes corresponding to the embedding vectors. This relationship hints at a broader functorial mapping between uncertain intensional logic and vector arithmetic, and opens the door to using embedding vector algebra to guide intensional inference control.
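A minimal DeepWalk-style sketch of the embedding side of this construction, using networkx and gensim; an ordinary toy graph with illustrative gene/ontology node names stands in for the Atomspace weighted labeled hypergraph, and the symbolic intensional-difference side of the alignment is not computed here.

# Sketch: DeepWalk-style node embeddings and embedding-vector differences.
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walks(graph, walks_per_node=10, walk_len=8, seed=0):
    """Truncated random walks over the graph, as in DeepWalk."""
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for start in graph.nodes:
            walk = [start]
            while len(walk) < walk_len:
                nbrs = list(graph.neighbors(walk[-1]))
                if not nbrs:
                    break
                walk.append(rng.choice(nbrs))
            walks.append([str(n) for n in walk])
    return walks

# Toy knowledge graph standing in for the Atomspace hypergraph.
g = nx.Graph()
g.add_edges_from([
    ("FOXO3", "longevity"), ("FOXO3", "stress_response"),
    ("SIRT1", "longevity"), ("SIRT1", "metabolism"),
    ("TP53", "apoptosis"), ("TP53", "stress_response"),
])

# The walks are treated as "sentences" and fed to skip-gram Word2Vec.
model = Word2Vec(sentences=random_walks(g), vector_size=16, window=3,
                 min_count=1, sg=1, epochs=50, seed=0)

# Embedding-vector differences: the quantity the abstract argues is
# approximately alignable with intensional differences between nodes.
diff = model.wv["FOXO3"] - model.wv["SIRT1"]
print("||v(FOXO3) - v(SIRT1)|| =", float((diff ** 2).sum() ** 0.5))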

AGI-20

What Kind of Programming Language Best Suits Integrative AGI?

What kind of programming language would be most appropriate to serve the needs of integrative, multi-paradigm, multi-software-system approaches to AGI? This question is broached by exploring the more particular question of how to create a more scalable and usable version of the “Atomese” programming language that forms a key component of the OpenCog AGI design (an “Atomese 2.0”).

AGI-20

Combinatorial Decision Dags: A Natural Computational Model for General Intelligence

A novel computational model, the Combinatorial Decision Dag (CoDD), which utilizes combinatory logic to create higher-order decision trees, is presented. A theoretical analysis of general intelligence in terms of the formal theory of pattern recognition and pattern formation is outlined, and shown to take an especially natural form in the case where patterns are expressed in the CoDD language. Relationships between logical entropy and algorithmic information, and between Shannon entropy and runtime complexity, are shown to be elucidated by this approach. Extension to the quantum computing case is also briefly discussed.
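The CoDD formalism itself is not reproduced here, but the two entropy notions that the abstract relates to algorithmic information and runtime complexity can be made concrete with a short numerical sketch (standard definitions, not taken from the paper):

# Sketch: Shannon entropy vs. logical entropy of a discrete distribution.
from math import log2

def shannon_entropy(p):
    """H(p) = -sum_i p_i * log2(p_i): expected bits to identify an outcome."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def logical_entropy(p):
    """h(p) = 1 - sum_i p_i^2: the probability that two independent draws
    from p fall in different outcomes (Ellerman's logical entropy)."""
    return 1.0 - sum(pi * pi for pi in p)

dist = [0.5, 0.25, 0.125, 0.125]
print("Shannon entropy:", shannon_entropy(dist))  # 1.75 bits
print("Logical entropy:", logical_entropy(dist))  # 0.65625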
