The one about conversation, super tiny language models, Apple intelligence, museum AR, and blockchain obstacles.


The Science of a Deep Conversation

We love conversation at Singular XQ. Part of it is because I founded the org, and I love deep, meaningful conversations. I am also a trained ethnographer, so asking questions and hearing and seeing others in a deep way is one of my superpowers. The podcast actually began as a research project: I turned the informational interviews I was doing for a book into a podcast. Marketing-type people frequently told me that the umbrella, digital transformation, was too big and that I should serve a smaller niche. However, I wanted to have more conversations with more people, not necessarily all tech "experts." So that's how one of our taglines became: "Doing the work of digital transformation one conversation at a time."

So we feel obliged to recommend this Wired article about the science of a good conversation. We haven't read the book, but we would love to have the author on for an interview.

The Science of Having a Great Conversation
Forming meaningful bonds with others can improve your health, make you mentally sharper, and fuel creativity. Making friends can feel daunting, but research shows there are many ways to build better connections.

Super Tiny Language Models holding hands. Like shiny, happy people.

You might know that we believe in the Pandora Effect here. In short, it's the idea that first-mover advantage is a myth that creates some random, lottery-style venture capital returns but little else. There are many, many case studies where the first-mover advantage was a destructive liability, but the best and easiest one for people to grasp is Pandora, the seeming first mover into the space Spotify now inhabits. Spotify was the dark horse that learned from Pandora's public mistakes. You might also know that we have been researching and creating our own cross-functional data products in a research initiative called Project Vitruvius. Microsoft already seemed to lap us in our understanding of what the breakthrough would be. We predicted the breakthrough would be small language models living on multi-node networks, using virtual isolation and endpoint isolation to enable cross-functional data sharing while still making it easy to control for breaches and model instability.

A research team at Nanyang Technological University and the National University of Singapore introduced something called Super Tiny Language Models to address the glaring problems with LLMs. These models are more secure, more resource-efficient, and less prone to overfitting. While we would have loved to get there first, we are relieved to see more intelligent research and innovation outside what I call the great AI Panic of 2024. Of course, we aren't even as well resourced as a single bathroom at one of those universities yet. But things are looking up. More on that soon. :)

Super Tiny Language Models
The rapid advancement of large language models (LLMs) has led to significant improvements in natural language processing but also poses challenges due to their high computational and energy demands. This paper introduces a series of research efforts focused on Super Tiny Language Models (STLMs), which aim to deliver high performance with significantly reduced parameter counts. We explore innovative techniques such as byte-level tokenization with a pooling mechanism, weight tying, and efficient training strategies. These methods collectively reduce the parameter count by 90% to 95% compared to traditional models while maintaining competitive performance. This series of papers will explore various subproblems, including tokenizer-free models, self-play based training, and alternative training objectives, targeting models with 10M, 50M, and 100M parameters. Our ultimate goal is to make high-performance language models more accessible and practical for a wide range of applications.
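One technique the abstract names, weight tying, is easy to see in back-of-the-envelope numbers: a small model's input-embedding matrix and its output projection back to the vocabulary are the same shape, so sharing one matrix for both jobs deletes an entire vocabulary-sized block of parameters. Here is a minimal parameter-counting sketch with toy sizes we chose for illustration; the architecture and numbers are our own assumptions, not the paper's.

```python
def param_count(vocab_size, d_model, n_layers, d_ff, tie_embeddings):
    """Rough parameter count for a toy transformer language model."""
    # Token embedding matrix: vocab_size x d_model
    embed = vocab_size * d_model
    # Each block: attention projections (~4 * d_model^2) + MLP (2 * d_model * d_ff)
    block = 4 * d_model**2 + 2 * d_model * d_ff
    # Output projection back to the vocabulary; free when tied to the embedding
    head = 0 if tie_embeddings else vocab_size * d_model
    return embed + n_layers * block + head

untied = param_count(vocab_size=50_000, d_model=256, n_layers=4, d_ff=1024,
                     tie_embeddings=False)
tied = param_count(vocab_size=50_000, d_model=256, n_layers=4, d_ff=1024,
                   tie_embeddings=True)

# With a 50k vocabulary and a tiny d_model, the two vocabulary-sized matrices
# dominate the budget, so tying removes roughly 45% of this toy model's parameters.
print(f"untied: {untied:,}  tied: {tied:,}")
```

The point of the exercise: the smaller the model, the larger the share of its budget spent on vocabulary matrices, which is why tricks like weight tying and byte-level tokenization matter far more at 10M parameters than at 10B.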

Apple asks to join the reindeer games.

I guess they had to. Apple finally showed up with its offering, Apple Intelligence, which will optimize its hardware offerings. Cook keeps proudly talking about the data privacy Apple has offered from the beginning. Indeed, we here have always admired this about Apple, but the company has been backsliding from these core principles and is not always as forthcoming as it could be about its vulnerabilities.

Two things concern us here, and we'd love to hear from anyone with an authoritative explanation of why we shouldn't be concerned. There is a whole peanut gallery, even from within our own membership, telling us not to worry about the current data-privacy fiasco on display daily. But remember: we work for the public. It's our JOB to be concerned and relentless in asking the questions that scream from between the AI-generated headlines.

Apple claims data will be hosted locally on the device, with "burst" cloud technology that is stateless. This is a bit of a maneuver on the uneducated masses. "Burst" means processing pops over to a public cloud when the device hits capacity limits. "Stateless" means that data won't be attached to a user. The problem is that bursting increases the surface area for cyberattacks. Guess what else increases surface area? Compressed files, like the kind used in AI. Taking this explanation as a guarantee of data privacy is kind of like saying, "Yes, this sheet has holes in it, but don't worry. We've plugged them up with moth eggs. We're good."

We also have deep concerns about what happens when compressed files start to become compressions of compressions of compressions. I guess we are all about to find out. Look out there, Apple. I hope the cover on your shiny nose doesn't fall off. I've always admired your data-privacy stewardship.


Museums up their AR game.

Web 3.0 seems so 2022, but the push continues as museums build up their augmented-reality game, which promises to improve user engagement and speak to a younger generation of museum-goers. We encourage arts and higher-ed communities to go against the grain and hire old-school human-factors engineers, humanity-centered designers, and service designers to consider the overall experience, with human experiences and needs driving the innovation. We have a developed offering for arts organizations in this area, so email us at info@singularxq.com if you'd like a free consultation.

How Museums are using Augmented Reality
Museums have been quick to adopt AR to create engaging interactives for their visitors. In this article, we highlight the best uses of augmented reality in big-name museums.

Lack of global rules for trading assets broadly across blockchain is slowing progress.

It appears that a lot of pieces have to move into place before blockchain can be used to trade all assets, including stocks and fiat currency. For those still working in the blockchain space, follow along with us as we continue to track the shifting regulatory and legal concerns for blockchain. Our current strategy is to stay current with regulatory developments, remain knowledgeable about data privacy on blockchains, and track blockchain's growing use across the philanthropic sector. Before we move into blockchain ourselves, we are rapidly building expertise in anti-money laundering (AML) on blockchains. We have ambitious blockchain plans for Singular XQ, but to move forward, we need assurances that we can adequately control for money laundering. The FTX collapse is a concerning case study that we hope is not more pervasive than it currently appears. We are tracking it here and will keep you informed as we learn more and as our own project develops.

Thanks for being curious,

JP
Founder, Singular XQ


Singular XQ is a nonprofit that relies on public support and educational offerings to keep working in the public interest on technology. Please consider supporting our podcast on Patreon as well.