It Came From the Workshop
Projects and documentation for selected items from my workshop
Friday, 16 December 2022
Just a link here to the Washington Post, which has an excellent semi-interactive article, browsable in a few minutes, that does a great job of explaining how AI art generators work: https://www.washingtonpost.com/technology/interactive/2022/ai-image-generator/
Thursday, 8 December 2022
My Inter-dimensional Camera Takes Photos of an Alternate Reality
I had an idea recently that I quickly prototyped: The Interdimensional Camera.
The idea is that the camera takes images not of the reality directly in front of the lens, but of a neighboring dimension (or parallel universe, if you prefer).
It uses the DALL-E API from OpenAI to turn what the camera sees into an alternate-reality version that is close to the original, but also entirely different.
The original image -- what the camera ACTUALLY sees -- is never shown to the user, which feeds the narrative that the camera is somehow seeing into an alternate reality.
What keeps this from being just a crappy Snapchat filter knock-off is the fact that it works on ANYTHING. Not just people, not just faces, but ANYTHING: objects, scenery, animals, plants, and more.
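For a rough idea of how the core of such a camera could work, here is a minimal sketch (not the actual project code) that sends a captured frame to OpenAI's image-variations REST endpoint, one plausible way to get a "same but different" image back. It uses only the Python standard library; the file name, function name, and parameter choices are my own assumptions.

```python
import json
import urllib.request
import uuid

# OpenAI's image-variations REST endpoint: one plausible way to get
# an "alternate reality" version of a captured frame.
API_URL = "https://api.openai.com/v1/images/variations"

def alternate_reality(image_path: str, api_key: str,
                      size: str = "512x512") -> str:
    """Upload a camera capture and return the URL of the
    alternate-dimension version the API generates."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    # The endpoint expects multipart/form-data; build it by hand so
    # the sketch needs only the standard library.
    boundary = uuid.uuid4().hex
    parts = []
    for name, value in (("n", "1"), ("size", size)):
        parts.append(
            (f"--{boundary}\r\nContent-Disposition: form-data; "
             f'name="{name}"\r\n\r\n{value}\r\n').encode()
        )
    parts.append(
        (f"--{boundary}\r\nContent-Disposition: form-data; "
         'name="image"; filename="capture.png"\r\n'
         "Content-Type: image/png\r\n\r\n").encode()
        + image_bytes + b"\r\n"
    )
    parts.append(f"--{boundary}--\r\n".encode())
    request = urllib.request.Request(
        API_URL,
        data=b"".join(parts),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["data"][0]["url"]
```

A camera loop would then just grab a frame, call `alternate_reality()`, and show the user only the result, never the original capture.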
Check out the code and details if you're interested.
Wednesday, 16 November 2022
How-To's for Stable Diffusion (Art-Generating AI)
Art-generating AIs are fantastic and developments are happening at a breakneck pace. While experimenting with Stable Diffusion (via the fantastic Automatic1111 WebUI) to generate images, I made a document to keep track of what I learned and how I was fixing problems.
Here is my HOW TO guide, which includes my recommended settings and techniques for:
- Using inpainting to redraw a small messed-up part of a generated image
- Using inpainting to remove unwanted parts of an image
- Using inpainting to redraw areas or add details
- Generating same-but-different variations on an image
- And more!
Saturday, 25 June 2022
All The Good VR Ideas Came From The 1960s
I'd like to highlight an article of mine that was published a short time ago: all the truly good VR ideas were dreamt up in the 1960s.
The core of the article is that VR is a combination of simulation and interaction, and that combining the two effectively had to wait until the 1960s, when the digital revolution and computers provided the right tools.
Here is an excerpt from my article:
In 1965 Ivan Sutherland, a computer scientist, authored an essay entitled The Ultimate Display (PDF) in which he laid out ideas far beyond what was possible with the technology of the time. One might expect The Ultimate Display to be a long document. It is not. It is barely two pages, and most of the first page is musings on burgeoning interactive computer input methods of the 60s.
The second part is where it gets interesting, as Sutherland shares the future he sees for computer-controlled output devices and describes an ideal “kinesthetic display” that served as many senses as possible. Sutherland saw the potential for computers to simulate ideas and output not just visual information, but to produce meaningful sound and touch output as well, all while accepting and incorporating a user’s input in a self-modifying feedback loop. This was forward-thinking stuff; recall that when this document was written, computers weren’t even generating meaningful sounds of any real complexity, let alone visual displays capable of arbitrary content.
I round out the article with a list of ideas from the 60s that happened, as well as some that have not happened yet. I also highlight the difference between "important" features and "cool" ones.
Give it a read, and maybe you'll come away with some new ideas or perspectives on VR, and where it is going.
Thursday, 19 May 2022
Using Natural Language AI in Your Next Project is Easier Than You Think
I've been playing around a lot with OpenAI's natural language API, which is an interface to GPT-3, and it's opening all kinds of doors for me lately. I wrote an article published on Hackaday covering the basics of what it can do and how to get started. The short version? If you can code some basic Python, write a curl command, or even just use the command line, you already have all the tools you need to get started!
Here's a fun little utility I wrote to learn the ropes. The "technobabbler" converts otherwise boring log messages or status reports into technobabble. Here are a few examples.
Note that results are never quite the same and cannot be predicted. I find this kind of uncertainty fascinating; the way that responses are unpredictable yet not actually random is deeply interesting to me.
To demonstrate this, multiple responses are shown below each command. (The technobabbler actually only returns one response at a time.)
echo "Connection lost" | ./technobabbler.py
All systems are down. We are cut off.
Subspace carrier has lost coherence.
echo "Connection established" | ./technobabbler.py
Accessing subspace frequency...
We have a green light
Subspace carrier has locked on.
echo "Power low. Please charge now." | ./technobabbler.py
Insufficient energy levels. Please provide additional power.
Power reserves are running low. Please recharge as soon as possible.
echo "Display not found" | ./technobabbler.py
Monitor not active.
There is no display device on the specified port
No compatible output devices detected
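The actual technobabbler code isn't shown here, but a utility like it can be sketched with nothing but the Python standard library and OpenAI's completions REST endpoint. Everything below (the prompt wording, the model name, the temperature) is my own guess at one way to do it, not the script behind the examples above.

```python
#!/usr/bin/env python3
# Hypothetical sketch of a technobabbler-style utility; not the
# actual script from the post.
import json
import os
import sys
import urllib.request

def build_prompt(message: str) -> str:
    """Wrap a plain status message in an instruction asking the
    model to restate it as sci-fi technobabble."""
    return ("Rewrite the following status message as dramatic "
            "science-fiction technobabble:\n\n"
            f"{message}\n\nRewritten:")

def technobabble(message: str, api_key: str) -> str:
    """Send the prompt to OpenAI's completions endpoint and return
    the model's rewrite."""
    payload = json.dumps({
        "model": "text-davinci-002",  # a GPT-3-era model; an assumption
        "prompt": build_prompt(message),
        "max_tokens": 60,
        "temperature": 0.9,  # higher temperature = more varied output
    }).encode()
    request = urllib.request.Request(
        "https://api.openai.com/v1/completions",
        data=payload,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["choices"][0]["text"].strip()

if __name__ == "__main__":
    key = os.environ.get("OPENAI_API_KEY")
    if key:  # only call the API when a key is actually configured
        print(technobabble(sys.stdin.read().strip(), key))
```

Piped input then works just like the examples above, e.g. `echo "Connection lost" | ./technobabbler.py`.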
Friday, 10 December 2021
Does a Meta Quest 2 VR System Need a Computer or Phone to Work? Simple Answers
This question is coming up a lot for those new to VR, and here are the simple answers.
In short, the answer to "do you need a computer or phone to use a Quest 2?" is yes and no. That's because that question can mean different things. I'll explain.
First of all, a brand new Quest 2 straight from the box needs to go through a setup process before it can be used. This setup process needs a phone with an app installed that will pair with the headset. Once this setup is done, the Quest can be used without the phone. In short, a phone is required for initial setup of a Quest headset, but is NOT required to use the Quest afterwards.
If you mean: "Can I set up the Quest 2 without a phone?" the answer is NO. The Quest 2 requires you to install an app on a phone, which is then used to complete the setup process (which involves creating an account).
If you mean: "Do I still need the phone to use the Quest 2 after setup?" the answer is NO. Once the Quest 2 is set up, the headset can be used without the phone being involved. All you need is the Quest headset. You can pack it up and bring it somewhere, and it will play games without the phone being around. (Some features only work with the app, but the headset will work and play games just fine by itself.)
If you mean: "Can I use a computer to setup the Quest 2?" the answer is NO. A computer cannot be used to set up the Quest. A phone with the app is needed.
If you mean: "Can I use the Quest 2 to play PC VR games?" the answer is YES, but it's optional. By itself, the Quest works like a games console, like a Nintendo Switch, Sony PlayStation, or Microsoft Xbox, and does not require a computer. However, if you wish to use the Quest 2 to play SteamVR games instead of using it like a console, that can be done.
More info on playing PC-based VR content using the Quest
There is another, optional way to use the Quest: as a headset for PC-based VR content.
To do this, you will need a Link Cable or wireless Airlink. They both do the same job, one over a cable (the Link Cable) and one wirelessly (Airlink).
When using the Quest in this way, the PC runs the VR content, and the Quest is used to display it to the player.
If you don't plan to use the Quest to play SteamVR games or PC-based Oculus VR games, you can ignore this entirely.
Monday, 4 October 2021
A Quick Word About Estimating Costs
Making a budget and estimating costs is critical to the success of any endeavor, especially crowdfunded products. I am always looking to learn more about this process, and wanted to share a brief bit that is not often talked about. I was reading a recent guide, and the numbers it provided matched my own experiences, so I'll repeat them here.
Overestimating costs to provide a buffer for the unexpected is important, and while exactly how much extra you budget is up to you, here are some simple and reasonable numbers to use as a guide:
- Add a 10%-15% buffer onto any costs that are not shipping or manufacturing.
- Add a 30% buffer to shipping and manufacturing costs.
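As a quick worked example, here is how those buffers shake out in practice (the dollar amounts below are made up for illustration):

```python
def buffered_budget(other_costs, ship_mfg_costs,
                    other_buffer=0.15, ship_buffer=0.30):
    """Apply a 15% buffer to general costs and a 30% buffer to
    shipping/manufacturing costs, per the guideline above."""
    return other_costs * (1 + other_buffer) + ship_mfg_costs * (1 + ship_buffer)

# e.g. $2,000 of general costs plus $5,000 of manufacturing and shipping:
total = buffered_budget(2000, 5000)
print(round(total, 2))  # prints 8800.0  (2300 + 6500)
```

In other words, a project that looks like $7,000 on paper should really be budgeted at about $8,800.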
You'll ultimately have to decide how much of a buffer works best for your project, but these numbers aren't unreasonable. This is especially true for creators who are crowdfunding on Kickstarter or similar platforms, where your budget needs to be cemented ahead of time, and manufacturing and shipping might be some time in the future.