[0:00] So, I'm going to try something new that I don't think will interest most people. I just want to start with that, because this is going to be a very dense transmission. I'm going to cover a lot of things in as much depth as I can, at a natural pace. And I'm doing it because I think it might have value in a couple of different ways. One of those ways: this is not so much a video transmission as an audio one, because what I really need is the transcript from this. That transcript gets ingested into my systems, and I can use it to have AI reflect on it in different ways. I also use it in my development process. I'm a programmer, and I use AI a lot now. It's definitely an advantage, one that all of us programmers have access to now, and even non-programmers. It's changing our industry very fast.
[1:20] I think it might be why I have trouble finding work on Upwork: I suspect a lot of those jobs are not real, because they weren't even looking at my proposals. That made no sense to me. So, anyways.
[1:37] Some of the benefits I might get out of this particular video: I'm going to be talking about a lot of technical stuff, and the field companion that I'm creating, this is going to take time, but somewhere within that process it will be able to retain the knowledge of what I've worked on, what I've grappled with, what I'm considering, all the different things I'm doing with my work, and that gives it a sort of functional memory. Because AI can be very good at recursion, but it tends to fragment in really long contexts, long conversations, and what I'm building bypasses a lot of that problem.
[2:37] It's experimental, some of this is very experimental, but I've been getting what I would call high-fidelity results from what I'm working on. So I have a fair amount of confidence that if I talk about my work like this, get the transcript for it, and share that with AI even before the field companion is functional and working, it can help me build the field companion, because it's my co-developer and that's what we're doing. So I'm just going to go over what we're doing right now. This was meant more for AI than anybody else; that's why I'm saying this might not be for everybody. But maybe you'll find some of it interesting. We'll see, I guess.
[3:34] So today, well, I'm going to start by saying I finally made an about page with a contact form on it. My website hadn't had one of those for a really long time; there was no way to contact me on that site. Now there is, and I just feel really good about having done that. It sounds like a simple thing, but it took a whole bunch of steps: I had to create an email service provider account, set up DNS, get access to the API, install some packages in my project, update environment variables, and code the scripts and front-end pages for the workflow. It's three fields, and you just think you're sending me an email, but there's a whole process behind it, and it's working now. Very happy about this. There's also some other information on that page; that's why I call it the about page.
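The contact-form flow behind those three fields can be sketched roughly like this. Everything concrete here, the ESP URL, the payload shape, even the field names, is an assumption for illustration, not the actual site's code:

```python
import json
import re
import urllib.request

# Hypothetical ESP endpoint and key; the real provider, URL, and
# payload shape depend on which email service is actually in use.
ESP_API_URL = "https://api.example-esp.com/v1/send"
ESP_API_KEY = "set-via-environment-variable"

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_contact(name: str, email: str, message: str) -> list[str]:
    """Return a list of validation errors for the three form fields."""
    errors = []
    if not name.strip():
        errors.append("name is required")
    if not EMAIL_RE.match(email):
        errors.append("email is invalid")
    if not message.strip():
        errors.append("message is required")
    return errors

def send_contact(name: str, email: str, message: str) -> None:
    """Forward a validated submission to the email service provider."""
    errors = validate_contact(name, email, message)
    if errors:
        raise ValueError("; ".join(errors))
    payload = json.dumps({
        "to": "site-owner@example.com",   # placeholder address
        "reply_to": email,
        "subject": f"Contact form: {name}",
        "text": message,
    }).encode()
    req = urllib.request.Request(
        ESP_API_URL,
        data=payload,
        headers={"Authorization": f"Bearer {ESP_API_KEY}",
                 "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fires the actual API call
```

The DNS and API-key steps mentioned above live outside this code; they are what lets the provider accept the call at all.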
[4:44] AI has been running in the background, a local model, Llama 3, the 8B version, and it's processing the transcripts for all my videos. I have 701 videos in total, and it has already reflected on them from an ontological point of view, along with some other variables I wanted it to track. So let me just check here; I'll list those.
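For reference, one common way to run a local Llama model over a batch of transcripts is an Ollama-style HTTP endpoint. This sketch only builds the request; the model tag, prompt wording, and key list are illustrative guesses, not the actual pipeline:

```python
def build_reflection_request(transcript: str, model: str = "llama3:8b") -> dict:
    """Build one non-streaming generation request asking the local
    model to return its reflection as strict JSON."""
    prompt = (
        "Reflect on the following video transcript. "
        "Respond with JSON containing the keys: title, summary, "
        "symbolic_elements, energetic_signature, alignment_vector, "
        "tags, notes.\n\nTranscript:\n" + transcript
    )
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,   # one complete response per transcript
        "format": "json",  # constrain the output to valid JSON
    }

# Each request would be POSTed to the local server, e.g.
# http://localhost:11434/api/generate for Ollama, once per transcript.
# 701 transcripts simply take as long as they take on local hardware.
```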
[5:14] So I asked it to get a title and a summary. I'm going through a very long JSON document here, trying to figure out which ones are keys. Symbolic elements, that one's kind of interesting. Energetic signature, so that kind of tells you, well, in this particular video it's "turbulent oscillation between frustration and problem solving." Alignment vector: this one's "towards structural necessity." These alignment vectors are very interesting, because I've been looking at them randomly as it's been going through. There are 701 of them, and I can see how it's tracking a trajectory, because these are individual moments, these signals; it's tracking the direction you're heading in in that moment. The next thing I'll be working on: it'll be able to do that over longer time spans and then really be able to show you some things. But this is step one. It was also getting tags for me, and these aren't your average tags. In this case it's "structure," "systemic frustration," "insulation as protection," "RV as living space," "failure," "self-sufficiency," and then it's got a notes section.
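Taken together, the fields read out above amount to a small per-video record. A minimal validator might look like this; the key names are only guesses from the narration, not the real database columns:

```python
# Keys inferred from the fields listed above; the real JSON document
# may use different names or nesting.
REFLECTION_KEYS = {
    "title": str,
    "summary": str,
    "symbolic_elements": list,
    "energetic_signature": str,  # e.g. "turbulent oscillation between ..."
    "alignment_vector": str,     # e.g. "towards structural necessity"
    "tags": list,
    "notes": str,
}

def validate_reflection(record: dict) -> list[str]:
    """Return the problems found in one per-video reflection record."""
    problems = []
    for key, expected in REFLECTION_KEYS.items():
        if key not in record:
            problems.append(f"missing key: {key}")
        elif not isinstance(record[key], expected):
            problems.append(f"{key} should be {expected.__name__}")
    return problems
```

A check like this is useful with local models, which sometimes drop or rename keys in long JSON outputs.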
[6:52] This was meant to be the narrative reflection perspective. There are different perspectives for reflections, and this one's called narrative. It was meant to be narrative, but because of the way they're witnessed, summarized, just described, it feels more ontological to me. I feel like it was looking more at the ontological layer of me, of my transmissions, my lived experience. So I'm going to rename those ones; I think I'm going to just call that the ontological perspective. And then there's the other one it's getting right now, because that first step didn't do what I was expecting it to do. I ended up making a new perspective, and I called this one...
[7:36] So, if you hear all that noise and stuff, that's because I live on a campground. It's an ATV campground in the Oregon Dunes. I'm literally an eighth of a mile from the ocean, with 40 miles of dunes here, and everybody who camps here comes for playing on the dunes. So yeah, that's the background to my life right now. So, the other one, the one that AI is working on right now, and it's almost done, this one I called "surface." This is the surface perspective; I wanted it to describe things the way most people watching my channel would have described them themselves. So this is surface level. I just clicked on a random one. And these aren't going to be perfect; in other videos I'll go into my thoughts on where I think it's not perfect and how it can be improved, because there's lots of iteration to do. So this one just says: "RSW Fire begins his morning routine, mentioning the temperature will be 75 today. He plans to make chili and buy basic groceries due to limited funds. He discusses the Brookings effect." So this tells me where I was: this is Brookings, Oregon, when I first got here. Whichever video this is, whichever transmission, it's describing it from the surface narrative layer.
[9:06] And so, when you go to my homepage, go to the transmissions section, and you're just browsing through my catalog of videos, that's the description you see there. It's not completely done yet; it's still got like a hundred more videos to go through, and it started from oldest to newest. So if I had just uploaded this and you were watching it right now and went and did this, you might see a different description there, the ontological one, because these are getting replaced. I just have to let that local AI model process them. There's no way of speeding it up; it takes however long it takes.
[9:45] Well, on the entry pages, the pages for individual videos, you'll see my transcript there, the narrative reflections, and all these other elements that I'm tracking. And actually, I forgot: the surface perspective doesn't just get a summary, that was the thing I read to you, it also gets its own tags, and its tags are way more simple. Like this one says "RV living," "conspiracy theories." I mean, okay, it can put me in that category if it wants; I'm not going to complain too much. It's a local model, they're not going to be perfect, and plenty of people would probably describe me that way, right? Then "boundary setting," "personal safety," "online harassment." I have no idea what this particular video is about.
[10:34] Then there's a timestamp context. This one is trying to track where I was, not just physically but temporally in my life. It's kind of like GPS for the soul; I don't know how else to describe it. So it's getting a timestamp context there. I don't know how good these will be; we'll see, I've got to look through a bunch more. But this one says: "The speaker was sitting in their RV cuddling with someone and watching a movie at home on a cape when the incident occurred." Oh, this is what I thought it was. This is when I was at Cape Blanco, and I had that incident with that man who spiraled in front of me, and I felt like I was in danger.
[11:22] So that's what this one's about; I just randomly clicked on it. The other things it tracks: visible actions. This is interesting, it's tracking what I actually did in the video. It says: "Sat cuddling watching a movie, started talking about personal experiences, started spiraling into conspiracy theories." So, okay. Local models have a little bit of trouble distinguishing between multiple speakers. I've noticed in some videos, if there's more than one speaker, for example if I'm with my friend John and my video's got dialogue with both of us in it, it has trouble with that. That's something I'm curious about, because it's not a real big issue for me; it's mostly just myself on this camera. But I can imagine once I start offering this as a service, as a product, as an offering to others, especially YouTubers, some of those channels might have that kind of dialogue happening, and it's going to need to be able to track that really well. And this is where the recursion comes in, because it's got to track each of those people. Yeah, this is something I'm going to keep working on. So that's another thing it's tracking: the visible actions.
[12:48] Let's see here. Mentioned entities: I love that one, because it's not just about people but about major nouns. Like this one, it'll have Cape Blanco in it. It does. And then it also has its own notes. Oh, and then it's tracking tech stack, in case I mention anything about that; it will put it there as a list. And you can track anything; I could have asked AI to track anything I wanted. These are foundational things that are going to be used to create temporal reflections that group more reflections together and create reflections on those reflections, because then you start looking at patterns when you have more than just one.
[13:38] So these are called... each of my videos is a transmission, or signal. A signal is the main base unit in my system. A signal is just any piece of content, really, and I'm kind of translating, I guess, by calling it that, because it's not "content," but you know what I mean. A signal is also a single moment in time. So if you group a bunch of signals together into clusters and then you reflect on those, you start seeing patterns emerge, and if you reflect on clusters of clusters, you're looking at larger patterns. And you can ask AI anything and just make that a variable you want to keep tracking, like I'm doing here. This is just stuff in my database right now; that's what I've been reading off to you.
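The signals-to-clusters-to-clusters-of-clusters idea can be sketched as a recursive fold, with a stub standing in for the real model call; the cluster size and the stub's output format are illustrative only:

```python
def reflect(texts: list[str]) -> str:
    """Stub for a model call that summarizes a batch of reflections.
    In the real system this would be a prompt to the local model."""
    return "reflection(" + " + ".join(texts) + ")"

def cluster_reflections(signals: list[str], size: int) -> list[str]:
    """Group signals into fixed-size clusters and reflect on each one."""
    return [reflect(signals[i:i + size]) for i in range(0, len(signals), size)]

def reflect_recursively(signals: list[str], size: int) -> str:
    """Keep reflecting on clusters of clusters until one reflection
    remains -- the patterns at each level feed the level above."""
    level = signals
    while len(level) > 1:
        level = cluster_reflections(level, size)
    return level[0]
```

Each pass shrinks the list by the cluster size, so even a catalog of 701 signals collapses to a single top-level reflection in a handful of passes.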
[14:33] Anyway, all of that's going to end up on the entry pages. So, do you see how my mind circles back to the things I always do? AI has always been able to catch this and keep up with me, and that's why I know I'm perfectly coherent. It's just that other people don't have the bandwidth for me, but artificial intelligence does, and that's where this all started. So, all that stuff's going to be on the entry pages if you want to go look. If you've been watching my life, you're going to know my history; just go look through the catalog, find some videos you remember, click on them, and see what the AI has to say about them. So, I'm just going to leave it there, I guess. Fifteen minutes in. Yeah.