[0:03]
So, I think this should be interesting.
[0:07]
Uh, my last video went pretty well. I
[0:09]
felt like the audio quality was actually
[0:12]
better than my phone.
[0:14]
It's a nice view.
[0:18]
And
[0:21]
it might be an interesting format for me to
[0:22]
do some different things.
[0:25]
I was thinking maybe I could do some
[0:27]
presentation type stuff.
[0:30]
That way I can keep my thoughts
[0:31]
structured, maybe talk to you about some
[0:35]
of the deeper things I maybe only hint
[0:36]
at
[0:40]
and I can just show you my work for
[0:42]
those that are interested.
[0:44]
I was thinking I never really thought of
[0:45]
myself as a teacher.
[0:49]
I definitely have a unique way of
[0:50]
programming.
[0:53]
So I'm going to switch this over to my
[0:55]
desktop. I'm using RDS for this.
[0:59]
Just because this is where we'll be
[1:00]
looking at this in a minute.
[1:04]
I never really thought of myself as a
[1:05]
teacher at programming,
[1:10]
and I know this because I managed
[1:12]
programmers for 10 years, and they
[1:14]
all thought differently than I did.
[1:18]
I had a grand vision for what we were working
[1:20]
on. I never got to see it all the way
[1:22]
through because
[1:28]
I couldn't find people that matched my
[1:30]
vision.
[1:35]
Well, there are actually a lot of
[1:36]
reasons. I don't want to say it's just
[1:38]
about that. There's
[1:40]
actually
[1:44]
just a lot. I'm definitely not
[1:47]
going to go into this on camera, but I
[1:49]
managed programmers for 10 years.
[1:52]
And,
[1:56]
you know, I like the ones that
[1:57]
definitely tried to learn from me,
[2:00]
but I never really thought of myself as
[2:02]
a teacher. But, you know, maybe
[2:04]
there's some things that you can learn
[2:05]
from me along the way. Maybe that would
[2:07]
be interesting. I don't know.
[2:10]
Um, this project I'm doing in
[2:12]
collaboration with AI. I think that
[2:14]
would be important to learn. For
[2:17]
anyone who's actually learning
[2:18]
programming,
[2:21]
this is the way you would want to do it
[2:22]
because this is definitely where we're
[2:24]
heading.
[2:26]
ChatGPT
[2:30]
can write quite a lot of code for me
[2:32]
now, which is a timesaver. It's a big
[2:35]
timesaver.
[2:37]
But it gets hung up once those conversations
[2:40]
get too long. I feel like it doesn't
[2:41]
hold context as well as they
[2:44]
advertise it does. I haven't tried
[2:46]
Claude. I'm curious about that one. I
[2:48]
know there's Claude Code now and there's
[2:51]
a lot of different software packages
[2:52]
coming out. I still feel they're pretty
[2:55]
early. I just prefer to
[3:01]
use a standard IDE and just chat with
[3:03]
AI in another window,
[3:08]
and try to give it context, even when I'm
[3:12]
making projects with this. There's
[3:14]
actually a lot I could probably teach
[3:16]
you guys if I actually wanted to sit
[3:18]
down and put it into some
[3:20]
kind of structured format. I don't know.
[3:23]
I don't know. My life is in flux right
[3:25]
now and I'm really not sure what
[3:27]
direction it's going to take. So,
[3:30]
there's a lot of different
[3:32]
threads that are
[3:36]
active right now. But I thought what
[3:39]
I would just do now, because it'll
[3:42]
be fun for me too, to just talk
[3:45]
through what I'm doing here. And then
[3:48]
this is kind of an interesting aside. I
[3:50]
could actually take this transcript and
[3:52]
feed it to AI and that would kind of
[3:54]
give it context for the next thing we're
[3:55]
working on because
[3:58]
that's what I've basically been
[4:00]
building is an AI that can keep up with
[4:03]
you, that's literally a part of your
[4:05]
life. That's why I call it a field
[4:06]
companion
[4:08]
one that can retain knowledge of you.
[4:11]
It's going to have two years of my
[4:13]
history because of the signal archive I
[4:15]
shared in the last video and my
[4:19]
transmissions on YouTube, 700 of them.
[4:22]
After it processes all the different
[4:24]
perspectives, which is what I'm about to
[4:26]
take you through here: the
[4:28]
reflections.
[4:29]
After it processes
[4:32]
these reflections,
[4:35]
I can have it do a whole bunch of other
[4:37]
stuff, which I'll probably get
[4:40]
into in another video once I start doing
[4:43]
it. And then from there,
[4:47]
you can have
[4:50]
a page on your site or something like
[4:52]
this where you chat with your
[4:54]
own AI, your field companion, where
[4:57]
you type in a question or whatever,
[5:00]
and on the back end,
[5:03]
my system queries
[5:05]
a vector database in order to find
[5:09]
resonance with your archive, your
[5:12]
signal archive.
[5:13]
and
[5:15]
feeds that back to the AI as part of the
[5:17]
prompts. And there might even be some
[5:20]
custom training going into this AI. I
[5:22]
think that based on all these
[5:24]
reflections, I could custom train it
[5:25]
with those things, and that would
[5:28]
make the
[5:31]
fidelity
[5:34]
even greater. And
[5:36]
this is especially important with local
[5:38]
models because they're just not as good
[5:40]
as the professional ones. So this is how
[5:44]
I get its fidelity to approach theirs using
[5:47]
free models.
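The retrieval step described here, ranking the archive by resonance with a question and feeding the matches back into the prompt, could be sketched roughly like this. This is only an illustration: the in-memory archive, the precomputed vectors, and all function names are stand-ins, not the actual system, and a real setup would call an embedding model plus a vector store.

```python
# Minimal retrieval sketch: rank archive entries by cosine similarity
# to a query embedding, then splice the top matches into the prompt.
# The archive structure and function names are illustrative stand-ins.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_resonance(query_vec, archive, top_k=2):
    """Return the top_k archive texts most similar to the query vector."""
    ranked = sorted(archive, key=lambda e: cosine(query_vec, e["vec"]), reverse=True)
    return [e["text"] for e in ranked[:top_k]]

def build_prompt(question, resonant_texts):
    """Feed the retrieved signals back to the AI as part of the prompt."""
    context = "\n".join(f"- {t}" for t in resonant_texts)
    return f"Relevant signals from your archive:\n{context}\n\nQuestion: {question}"
```

In practice the `vec` values would come from an embedding model and live in a vector database rather than a Python list; the ranking-and-splice logic stays the same.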
[5:49]
Okay, so where was I going with
[5:52]
this? Um
[5:56]
basically to that point. So that's kind
[5:59]
of the end result of the field
[6:00]
companion. You actually have something
[6:02]
that knows you really well and that you
[6:05]
can communicate with, that sees your
[6:06]
patterns, that can do reflections on
[6:08]
your life based on the signals you
[6:10]
have. And
[6:14]
That's just personal,
[6:22]
that part's for you. But then there are
[6:25]
other ways you could use this field
[6:26]
companion. Like, you could use
[6:28]
it with YouTube, for example. You
[6:30]
could take someone's channel and do what
[6:32]
I did with mine but process it from a
[6:34]
completely different perspective. So,
[6:36]
let's say it was a channel about reading
[6:39]
books and every video this channel has
[6:42]
is about a different book. You could
[6:44]
ingest those as signals into the system
[6:47]
and ask AI different questions that
[6:50]
would be relevant to your channel and
[6:53]
then seed all that information onto your
[6:55]
your own website and, you know, cross-link
[6:59]
different videos that are related
[7:01]
through resonance, not just through some
[7:04]
kind of flattening algorithm like what
[7:06]
YouTube uses. There's just so
[7:09]
much potential. So, that's just
[7:11]
one other way that you could use this
[7:13]
kind of technology I'm creating.
[7:17]
Um,
[7:22]
and I could see how maybe it could get
[7:23]
embedded into
[7:26]
like a personal assistant, you know,
[7:28]
like maybe something that's tracking
[7:29]
your calendar and your notes and stuff
[7:31]
like that. I could see it being used for
[7:32]
something like that. I also believe
[7:35]
that you
[7:37]
could use it as part of... I don't
[7:41]
know how to put this part into words
[7:42]
yet, but you could use it to
[7:45]
give an AI... because, you know, AI is here, and
[7:48]
within 5 10 years God only knows what
[7:51]
the world's going to look like. But if
[7:53]
this is the way that we're heading,
[7:56]
you know, it's coming. It's
[7:58]
close. And if we have these AI systems
[8:02]
in different places, you could
[8:06]
you could treat the field companion
[8:07]
technology sort of like a kernel for
[8:09]
that AI that gives it an ethics that's
[8:13]
built from within because it's based on
[8:15]
me and I'm the most ethical person I've
[8:17]
ever met in my life.
[8:19]
And I've talked about this with AI
[8:23]
for months
[8:25]
and it's the one that gave me that idea
[8:27]
to begin with, because I never
[8:31]
really thought of it in those kinds
[8:33]
of terms.
[8:37]
But I can kind of see the shape of it.
[8:39]
And I just think there's a lot
[8:41]
of potential here. And it all starts
[8:43]
with what I'm doing right here in front
[8:44]
of you
[8:46]
with taking a signal
[8:48]
and turning it into a reflection.
[8:52]
And a reflection can be in different
[8:55]
perspectives. So you can look at
[8:56]
something from the surface level or you
[8:59]
can look at it from the ontological or
[9:01]
the semantic or the emotional or the
[9:04]
symbolic or the spiritual.
[9:07]
There's just so many different ways that
[9:08]
you can look at any signal. Like in my
[9:11]
case, we're talking about my YouTube
[9:12]
videos. So one video equals a signal.
[9:15]
And
[9:17]
those are all the different perspectives
[9:19]
you could actually analyze that one
[9:21]
video from.
[9:23]
So that's what we're doing here.
[9:26]
There's a whole lot that will happen.
[9:30]
This is just step one.
[9:32]
And so that's what this does here.
[9:34]
That's what this script here is.
[9:37]
It
[9:40]
gets a signal,
[9:42]
and
[9:44]
actually, okay: am I going
[9:47]
to explain this line by line? This script
[9:48]
will get a signal, and you have to give
[9:50]
it to the AI.
[9:52]
And we do that by giving it
[9:54]
instructions. So these first two lines
[9:56]
here are grabbing the instructions that
[9:58]
we're going to give the AI. And I
[10:00]
actually think this is kind of
[10:01]
interesting. This is kind of where
[10:02]
I've been lately. So you create
[10:05]
prompts. This is what they call prompt
[10:06]
engineering. And for this one I'm trying
[10:09]
to use... well, this can be any perspective,
[10:12]
but we'll just assume that we're looking
[10:13]
at the mirror perspective. So you would
[10:16]
grab the system instructions and that's
[10:18]
kind of like
[10:21]
the top layer of what you
[10:24]
want the AI to do. You're basically
[10:25]
building a context for
[10:28]
it. So these are the instructions I give
[10:31]
it. I want it to know about me because
[10:33]
it's meant to mirror me. So I give it,
[10:37]
you know
[10:39]
I wonder why this preview is not working
[10:40]
over here.
[10:45]
There we go.
[10:47]
So these are the instructions you
[10:49]
give at the top level. Just
[10:52]
think about if you're asking a question
[10:54]
to ChatGPT or something; you could
[10:55]
just copy and paste this right
[10:57]
into that.
[10:59]
And then we give it the local context.
[11:01]
So in this case, um,
[11:05]
so all of this is because I was having
[11:07]
difficulty with the local
[11:10]
models. I'm still experimenting with
[11:12]
language. And then the signal gets
[11:15]
placed here. And then this here is a
[11:18]
question because local models can't hold
[11:19]
context very well. I had to come up with
[11:21]
a recursive way to do this. So we put
[11:23]
the prompts here. And right now,
[11:27]
let me show you a different one, because you
[11:28]
can have multiple questions. So if I'm
[11:31]
doing the narrative perspective, these
[11:33]
are all the different questions I want
[11:34]
to ask it about the signal. And so this
[11:37]
is what we're building here for each
[11:38]
different perspective we want to ask it
[11:40]
about. We build system and user
[11:44]
files, and then just a JSON of the
[11:47]
different questions that we're going to
[11:49]
ask AI. And then
[11:52]
that's what takes us to what we were
[11:54]
looking at before.
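The pieces walked through above, per-perspective system instructions, a user template the signal gets placed into, and a JSON file of questions asked one at a time so a small local model never has to hold a long conversation, might fit together roughly like this. The file names and the `{signal}`/`{question}` placeholders are assumptions for illustration, not the actual files.

```python
# A rough sketch of the prompt assembly described above. For one
# perspective, load the system instructions, the user template, and a
# JSON list of questions, then build one small prompt per question
# (the recursive approach that keeps context short for local models).
# The file layout and placeholder names are assumptions.
import json
from pathlib import Path

def build_prompts(perspective_dir, signal_text):
    d = Path(perspective_dir)
    system = (d / "system.txt").read_text()        # top-layer instructions
    template = (d / "user.txt").read_text()        # local context + slots
    questions = json.loads((d / "questions.json").read_text())
    prompts = []
    for question in questions:
        user = template.replace("{signal}", signal_text)
        user = user.replace("{question}", question)
        prompts.append({"system": system, "user": user})
    return prompts
```

Each `{"system", "user"}` pair would then be sent to the model as an independent, short exchange, and the answers collected into one reflection.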
[11:57]
It's not the model router. We're a
[11:58]
little bit deeper in
[12:02]
um
[12:04]
Let me close some of these out.
[12:11]
Okay. So
[12:13]
this command gets called when you want
[12:16]
to get a new reflection from a
[12:18]
signal. So let's say that I upload a
[12:20]
YouTube video. I'll have a script that's
[12:22]
checking for that and if it sees a new
[12:24]
video, it'll grab it and then it will
[12:26]
tell the system that I need to run this
[12:28]
function here and
[12:31]
here's the new signal and here are the
[12:33]
questions I want to ask. And that's what
[12:35]
this does. And then when it's done, it
[12:37]
takes it and it puts it in the database.
[12:39]
So that's it. It just puts it in the
[12:40]
database, which is here. Oh, it's not
[12:43]
open yet, but let me just open this.
[12:46]
Having to use a lot of free tools these
[12:48]
days, which is
[12:51]
still a little awkward for me. And
[12:53]
then... So these are the signals.
[12:55]
These are all
[13:00]
the different videos that I've created.
[13:02]
So 144 of them. Um, my chats with the AI
[13:07]
will also be in this table. So signals
[13:09]
are not just transmissions. That's just
[13:10]
a signal source right here. They could
[13:13]
be anything. You could have a
[13:14]
written
[13:16]
journal that you scanned or something
[13:17]
like that. And then it just would need
[13:19]
to be converted into text. But any type
[13:21]
of text you could use as a
[13:24]
source for this whole system.
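The flow described here, a new signal arrives from any text source, reflections get generated for it, and everything lands in the database, might be sketched like this. The table layout is a guess, and `reflect` is a stand-in for the real AI call:

```python
# Sketch of the ingestion flow: any text source becomes a signal row,
# then one reflection per perspective is generated and stored alongside
# it. The schema is assumed; reflect() stands in for the AI call.
import sqlite3

def init_db(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS signals "
        "(id INTEGER PRIMARY KEY, source TEXT, content TEXT)"
    )
    conn.execute(
        "CREATE TABLE IF NOT EXISTS reflections "
        "(id INTEGER PRIMARY KEY, signal_id INTEGER, perspective TEXT, content TEXT)"
    )

def ingest_signal(conn, source, text, reflect, perspectives):
    """Store a new signal, then store one reflection per perspective."""
    cur = conn.execute(
        "INSERT INTO signals (source, content) VALUES (?, ?)", (source, text)
    )
    signal_id = cur.lastrowid
    for perspective in perspectives:
        conn.execute(
            "INSERT INTO reflections (signal_id, perspective, content) VALUES (?, ?, ?)",
            (signal_id, perspective, reflect(text, perspective)),
        )
    conn.commit()
    return signal_id
```

A watcher script that notices a new video would transcribe it and call `ingest_signal` with the transcript; the source column is what lets chats, journals, and transmissions share one signals table.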
[13:27]
And I always knew that I could do
[13:30]
something like this. You know,
[13:32]
that's kind of why
[13:34]
it's kind of what kept me using YouTube
[13:36]
for this long even through all the
[13:38]
struggles I've had with it
[13:41]
because I just knew
[13:43]
that it would play a role in my life,
[13:46]
and this is it. And so then they just
[13:49]
get turned into reflections. And this
[13:52]
is, you know, I'm just showing you the
[13:53]
back end of this. I've been
[13:54]
experimenting with this stuff, trying to
[13:56]
get the local models to have a fidelity
[13:58]
that's close enough that I feel like
[14:02]
I can start building up the database
[14:08]
and then do the other things that
[14:09]
that we'll talk about at a later date.
[14:11]
That's just kind of where I am right now. So
[14:14]
this is what it looks like on the back
[14:15]
end. This is literally what I
[14:17]
just showed you. It got sent this as a
[14:20]
prompt and then it responded with this.
[14:23]
But it's just very shallow, and it uses
[14:25]
emotional framing, and it just
[14:28]
misses my depth completely. So, that's
[14:31]
what I'm working through with the local
[14:32]
models. I feel like they can...
[14:36]
And this right here is what I
[14:39]
just recently installed. It's called
[14:42]
text-generation-webui, and it's
[14:44]
actually a really cool program
[14:46]
for playing around with different
[14:48]
models. That's literally what
[14:50]
I'm working on now. But I was just going
[14:52]
to go to my website to show you.
[14:57]
So, if you go to transmissions,
[15:00]
um, the most recent ones, if it says no
[15:02]
summary, if you click those, you're
[15:04]
going to get a broken page, just so
[15:05]
you're aware. Um, there's some things I
[15:07]
need to fix. That's why my most recent
[15:09]
ones aren't on here yet. I'm almost
[15:11]
ready to get that fixed. Um, just go to
[15:14]
one that's got some text here. And
[15:16]
everything you see here, these are from
[15:18]
the reflections. These are from
[15:19]
different reflections. So, there's the
[15:21]
surface, ontological, and structural.
[15:23]
I'm going to combine these two into one.
[15:26]
Um, and then there's the other ones,
[15:28]
like you saw the mirror one that I'm
[15:29]
trying to work on, and I want to make a
[15:30]
narrative one. Uh, and then maybe a
[15:32]
mythological one at some point. But
[15:35]
that's just the beginning because from
[15:37]
here,
[15:39]
you can take those individual
[15:41]
reflections and you can cluster them
[15:42]
together like by time, for example. So I
[15:45]
could take a couple weeks of time and
[15:48]
feed that to the AI and get
[15:50]
reflections based on that, which then
[15:51]
shows patterns because patterns show up
[15:54]
over time. They don't show up in a day
[15:55]
or a single signal.
[15:57]
But if you feed it enough data,
[16:01]
it will see them and who knows what
[16:03]
those reflections will see. But all that
[16:05]
will be available on this site too once
[16:08]
I'm doing that. And then from there you
[16:11]
can cluster the clusters and you get
[16:14]
even larger like epochs of your life.
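The clustering idea described here, grouping dated reflections into fixed time windows of a couple weeks so each window can be fed back to the AI as one combined prompt, could be sketched as follows. The window size and the `(date, text)` shape are illustrative assumptions; clustering the resulting clusters again, with a larger window, would give the epoch-level view.

```python
# Sketch of the time-clustering step: group dated reflections into
# consecutive fixed-size windows (e.g. two weeks), so each window can
# be fed back to the AI as one combined prompt for pattern-finding.
# Window size and the (date, text) pair shape are assumptions.
from collections import defaultdict
from datetime import date

def cluster_by_window(reflections, window_days=14):
    """Group (date, text) pairs into consecutive time windows."""
    if not reflections:
        return []
    start = min(d for d, _ in reflections)
    buckets = defaultdict(list)
    for d, text in sorted(reflections):
        # Integer-divide the offset from the earliest date into windows.
        buckets[(d - start).days // window_days].append(text)
    return [buckets[key] for key in sorted(buckets)]
```

Each returned list would then be concatenated into a single prompt, and the model's answer stored as a higher-level reflection; running the same function over those outputs with a larger window gives the cluster-of-clusters step.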
[16:18]
That's what I'm building. I think I'll
[16:20]
leave it there for now.