Week Dec 11

Still didn't do any work on the program, but I tried to help Khoi. He got this weird hat for the RPi that has an analog-to-digital converter on it. I got the libraries working on my RPi 2, but then switched to the RPi 4 Khoi had, where the pin numbers are all weird, so the tutorial used a different weird BCM numbering library. The farthest I got in two classes was getting an LED to blink from one of the pins on the board. The next step would be to try and see if the board can read the output from a potentiometer; the problem is the tutorial is 100% in Korean, so it's hard to understand what they are explaining.
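Roughly what the blink test boiled down to, as a sketch in Python with RPi.GPIO and BCM numbering (the pin number here is just an example, not necessarily one the hat uses):

import time
import RPi.GPIO as GPIO

LED_PIN = 18                      # example BCM pin, not necessarily the hat's

GPIO.setmode(GPIO.BCM)            # BCM numbering, like the RPi 4 tutorial used
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    while True:                   # blink forever until Ctrl-C
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()                # release the pins on exit

Reading the potentiometer will depend on whatever ADC chip is actually on the hat, so that part is still TBD.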

0 comments

Week Dec 4

This week all I did was the presentation that I totally knew was due this week.
Other than that, the only thing I think you would find interesting is getting OpenGL to work on Ubuntu and drawing some triangles.
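The triangle part is basically this; a minimal sketch with PyOpenGL and GLUT (assumes pip install PyOpenGL and freeglut on Ubuntu, and isn't necessarily how I actually did it):

from OpenGL.GL import *
from OpenGL.GLUT import *

def draw():
    glClear(GL_COLOR_BUFFER_BIT)
    glBegin(GL_TRIANGLES)              # old fixed-function pipeline, simplest way to get a triangle up
    glColor3f(1, 0, 0); glVertex2f(-0.5, -0.5)
    glColor3f(0, 1, 0); glVertex2f(0.5, -0.5)
    glColor3f(0, 0, 1); glVertex2f(0.0, 0.5)
    glEnd()
    glutSwapBuffers()

glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB)
glutInitWindowSize(500, 500)
glutCreateWindow(b"triangle")
glutDisplayFunc(draw)
glutMainLoop()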

0 comments

Week Nov 15

Ok so even worse stuff happened last week. I got back to school and plugged my computer in, and it didn't even boot up; the amber light flashed in a 2-pause-2-pause-2... pattern, and I couldn't really do anything since people online said that pattern means the power supply malfunctioned. So I just didn't use my PC and used my Mac laptop instead.

Then on Wednesday my Mac shut down out of nowhere, and when I booted it back up the fans were on full blast and the battery wasn't charging. I was scared that it had limited battery left, and with my other computer broken I would have been screwed. The Mac just didn't show the battery symbol at the top of the screen, the setting to show battery percentage automatically unchecked itself when I clicked it, the fans were on full blast, and the screen didn't turn off when I closed the lid, so I assumed the computer's sensors were all broken: the battery voltage sensor, the thermometer, and the magnet switch for the lid. I decided the battery was probably still charging, and it lasted until the next day in math class, where I unplugged it for 10 seconds (I had kept it plugged in 24/7 since it broke) and it turned off after those 10 seconds, so I guess it was drawing power directly from the charger instead of the battery. Somehow, after it shut off from having no battery, it fixed itself and everything went back to normal.

So within the span of two weeks my Ubuntu SSD broke so I couldn't boot into Linux, then the PC that still had my Windows installation broke its power supply, then my Mac almost broke too, so I almost lost all three operating systems, but the Mac fixed itself somehow. Also, my backpack zippers broke, the plastic on the bottom of my power strip broke, and I shocked my arm with 110 V when I went to pick it up and my fingers touched the metal, so I don't know why everything is breaking these past three weeks.

0 comments

Week Nov 6

Ok so I got some example photos from the garden of the weeds and plants. Last weekend I brought the Jetson Nano home to try to get TensorFlow running and see if it runs faster than on my computer, but I never actually did because of procrastination, and the internet is so bad at HPA that apt-get fails to download like 10% of the stuff. Then also last week my Ubuntu SSD was reporting invalid sectors, so I assume the SSD broke, and then this week on Tuesday the power supply on my big good computer broke, so now my $700 GTX 1070 is out of service until I get a new one. I gave my extra RAM sticks to Alex, so at least someone won.

0 comments

Week Oct 23

I didn't really do anything this week other than the proposal.

0 comments

Week Oct 16

I got the Jetson Nano from you and spent a while in the elab trying to get it to work, but for some reason the internet didn't work. Then I took it back down to my room and tried again, and it still didn't work. I thought the reason was that the dorm wifi sucks, but after failing I went to sleep since the internet cuts out anyway, and the next day I forgot about it and did regular stuff. The day after that I tried again, and after googling it turns out the Nano doesn't even have onboard wifi since it is supposed to be "mobile" or whatever, so I had to steal Khoi's USB wifi adapter from his laptop. With that I got it to connect to DORM_RAD, but the wifi still sucked, so when I tried to install pip using apt it failed to connect to the servers. I instead stole one of Lawrence's ethernet cables and finally got a relatively good connection in the dorms.

I checked the /bin/ folder and it had Python 2.7 and 3.6 installed. I wanted 3.9 since I thought that was the latest version of Python TensorFlow ran on, so I tried to uninstall 3.6 using apt, but it said it wasn't installed. I figured some previous guy had installed it without apt, which would be why apt didn't detect it, so I did sudo rm /bin/python3.6 and then the screen blacked out. I didn't know that 3.6 comes preinstalled with Ubuntu and a bunch of the GUI stuff uses Python to run, so I had to go into terminal mode and run an automatic repair thing that fixes everything. That means I had to install Python 3.9 alongside 3.6, which is dumb, and since apt's latest supported Python is 3.6, I had to add some weird "snake" repository of programs (I think it was the deadsnakes PPA) and got 3.9 and pip installed.

But then when I did pip install tensorflow it said there wasn't a version available. I googled the error and people said I had the wrong version of Python, that it only works between 3.6 and 3.8, but I had 3.6 and 3.9, so I downgraded to 3.8. It still didn't work. Some random thread said you have to have 64-bit Python, not 32-bit, so I had to figure out which I had, but the example code to check was from 2009 and none of it works with Python's current libraries. I finally found a way, and it actually was 64-bit. Then I thought maybe pip was wrong, so I ran pip --version and it said it was using Python 3.6 even though I had 3.8, so I had to uninstall pip and force-reinstall it for 3.8. It still didn't work, so I upgraded back to 3.9 (maybe the old version was messing it up) along with pip for 3.9, and it still didn't work. Then I tried to manually install pip using get-pip.py on 3.9, and it still didn't work, and then the ethernet cut out since it was after 10:30 in the dorms. The next day I went home, where maybe the internet is better, but I still haven't worked on it since.
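For reference, the 64-bit check that actually works on a modern Python (instead of the 2009 examples) is pretty short; a sketch, nothing Jetson-specific assumed:

import sys
import platform
import struct

print(sys.version)                 # exact interpreter version, e.g. 3.8.x vs 3.6.x
print(struct.calcsize("P") * 8)    # pointer size in bits; 64 means a 64-bit build
print(platform.machine())          # CPU architecture, e.g. aarch64 on the Jetson Nano

Also, running pip as "python3.8 -m pip install ..." ties the install to that exact interpreter, which would have avoided the whole pip-pointing-at-3.6 mess.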

0 comments

Week Sep 25

All I did this week was write the script for my pitch. I thought it was due Thursday since you sent those emails, so I accidentally wrote it like a week early, which is good since I would have procrastinated until Monday otherwise.
I am going to try to put some of the thresholding stuff in the pitch to show how it works, but other than that I don't really know what I am going to say for the slides.
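By "thresholding stuff" I mean something along these lines; a sketch with OpenCV in Python, where the green HSV bounds are rough guesses and the file name is just an example:

import cv2
import numpy as np

# Keep only the green-ish pixels so the plants stand out from the dirt.
img = cv2.imread("garden.jpg")                     # example file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, np.array([35, 40, 40]), np.array([85, 255, 255]))
plants_only = cv2.bitwise_and(img, img, mask=mask)
cv2.imwrite("plants_only.png", plants_only)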




1 comment

Week Sep 18

This week Schorn showed us a website he is making for students to discuss things on, as a capstone that he is sort of doing. It is an example of a good capstone that solves a problem.

I also talked to Mr. Quail in the garden. I found in the FDA guidelines that ionizing radiation isn't organic, and the garden is supposed to be organic without artificial fertilizer and pesticides, but it was never mentioned whether it actually follows the FDA guidelines or just their own standards. I asked Mr. Quail if he was OK with ionizing radiation on the garden (lasers), and he didn't really know, so I guess they aren't that exact on their definition of organic. But Mr. Quail did say my capstone was a good idea.

I didn't really work on the capstone that much since I am doing college things, but I did spend a day fiddling around with reading in .jpeg files in C++. I tried OpenCV and Qt, and OpenCV was way faster: I ran it on like 50 files and it took about a quarter of the time Qt did. Qt probably isn't built for speed though; it's more like Python, where it's easy to use with APIs and stuff.
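The timing itself was nothing fancy, basically this shape (shown here in Python with OpenCV just to illustrate; my actual test was the C++ loaders, and the folder name is made up):

import glob
import time
import cv2

paths = glob.glob("test_images/*.jpeg")   # made-up folder of test files

start = time.perf_counter()
for path in paths:
    img = cv2.imread(path)                # decode each file once
elapsed = time.perf_counter() - start

print(f"decoded {len(paths)} files in {elapsed:.3f} s")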

1 comment

Log Sep 16

Schorn showed us his website that is supposed to be like high-quality Reddit threads with honesty and stuff.
Also, about the elevator pitch: use the English language to compress 15 minutes of talking into 2 minutes with minimum loss of information.

0 comments

Capstone 9/11

This week I turned in the problem statement.
The previous week I had tested training an AI on plant seedlings using this dataset (https://www.kaggle.com/c/plant-seedlings-classification).
That week I had my crappy 2012 hand-me-down MacBook train a convolutional network where I just kind of randomly set the layers. It trained for like 2 hours, had barely completed 1 epoch, and was only 20% accurate because it was so slow. There were 8 types of plants, so even guessing randomly would get 12.5%.

This week I brought the home computer that I use to play games on to my dorm; it has an Nvidia GeForce 1070 card. The hardest thing was trying to install Ubuntu, which I had to do by getting a CD and using my MacBook and a CD reader to burn the installer, after like 4 hours of trying at 1 am on Sunday night. I also had to figure out how to mess with the boot settings to turn off RAID or some weird setting until it worked. Then I ran the AI and it was still super slow, since you are supposed to install a driver for TensorFlow to use Nvidia graphics cards. I accidentally bricked the Ubuntu installation while installing the drivers, since I had installed the wrong version and then tried to manually delete it through the terminal but messed something up. After reinstalling Ubuntu, eventually getting the driver to work after finding the correct version, and then realizing I had forgotten to reboot the computer after installing the driver, I got it to train using the GPU.

Before, one epoch took like 30 minutes on a Mac CPU with a hard drive; it took like 20 seconds on the Dell with a GeForce 1070 GPU and an SSD. After ~40 seconds the AI was already more accurate than the MacBook training for an hour, and the code was exactly the same, just copied over through Google Drive. I really quickly put a while(true) around the training code in Python so that it would run overnight, then forgot about it since I woke up at like 8:25 the next morning (my alarm was muted for some reason) and didn't remember the AI until statistics class. I went back up during lunch and the following free period, and after training for like 12 hours the AI was still only 95%. Also, I didn't add the code to save the AI, so I lost all the training and several cents worth of the dorm's electricity, but at least in concept it would work, and if I actually thought about how the AI layers should be laid out instead of just randomly adding convolutional and dense layers, it would probably do better than 95%. The 95% was on the training data; 10% of the images are set aside and not trained on to detect overfitting, and on that validation data it was only 90%, which is closer to the real accuracy if it were used on real-world data.
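The training code was roughly this shape; a sketch with Keras on dummy data, where the layer sizes and class count are placeholders rather than the layout I actually used, and the two lines I should have had from the start are validation_split and model.save:

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Dummy stand-in for the seedling images so this runs on its own:
# 100 fake RGB images, 12 classes (adjust to the real label count).
x = np.random.rand(100, 128, 128, 3).astype("float32")
y = np.random.randint(0, 12, size=(100,))

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(128, 128, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(12, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# validation_split holds out 10% of the data to catch overfitting,
# and saving afterwards is the step I forgot overnight.
model.fit(x, y, epochs=5, validation_split=0.1)
model.save("seedling_model.h5")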

But the highest entry on there had 97% on the validation data and used a bunch of extra image-processing code ahead of time, while I just slapped my code together in 30 minutes and got 90% validation accuracy after 12 hours, so it's probably easier than I thought if I do some more optimizing.

0 comments

Capstone 9/3

Useless no-till

I looked online and the highest people get on seedling plant classification is 97.103% across 12 different plant categories and 60,000 images, which isn't very good.
I found another one with plant disease detection based on photos of a leaf, and one guy got 96.74% accuracy, which also isn't very good.
There is corn recognition where some people got within 0.00001 standard deviations on the softmax (basically 99.99999% accurate), but there are only ~2000 images, so that is probably just overfitting and doesn't really count.

I tried to copy the first example on seedling classification, and within like 4 epochs I got like 20%, but it runs so slowly on my laptop that each epoch takes like a minute, and it would take thousands to get up to >95%.


If the precision application (by drones or whatever) were used on the football field or the clovers on the field behind the track, it would need to do better than those numbers for Mr. Perry or whoever to consider it practical, or at least comparable to the accuracy of humans.

Both fields get sprayed like 2 times a year.

0 comments

attachment test


Download file "Screen Shot 2021-08-15 at 8.37.42 PM.png"

external experts: herbicide farmers, Quail for lasers
doesn't actually have to be practical, so idk, but they say to make money or whatever
brain thing: if the difference in word or picture perception isn't actually seeable by humans, use AI or something, idk

0 comments

Problem statement Andy

problem:
tilling to bury weeds causes more erosion; no-till is easier but uses more herbicide
solutions: manually pick weeds; GMO, but that needs patents and stuff; cover the entire field in herbicide; intermediate crop, but that uses water so no; or just no-till normally, but no one likes that since corps suck

idea 1:
community: farmers
what they need: money (cheaper wages and less environmental impact)
effects: erosion; herbicide mostly lands on dirt and friendly plants; also, if no-till is easier = less erosion
importance: food is important, duh

actual idea:
$15 stereoscopic camera -> 2 images like 3 inches apart -> OpenCV or block matching to build a depth map -> depth map and image combined -> thresholding -> cut the image into individual plants -> AI -> know if a plant is a good one or a weed -> decide whether to spray, with precision herbicide, so it uses less and is more environmentally friendly (rough depth-map sketch below)
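A rough sketch of the depth-map step with OpenCV's block matcher in Python; the file names and matcher parameters are placeholders, and this skips the calibration/rectification a real stereo rig needs:

import cv2

left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)    # example file names
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# Block matching: numDisparities must be a multiple of 16, blockSize must be odd.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)

# Scale to 0-255 so the depth map can be viewed and thresholded.
depth_vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX, cv2.CV_8U)
cv2.imwrite("depth.png", depth_vis)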

research:
computer vision scientist
farmers

idea 2:

main problem: the drone costs a lot of money

community: mappers and farmers
what they need: money (more efficient growing)
effects: calibrate moisture and stuff, idk
importance: ???

actual idea:
$10,000 drone -> measure stuff about farms -> ??? -> profit

research:

0 comments

Final Video 2

0 comments

5/3/2021

5/4/2021 15:34:56 HST Gregorian Calendar
This day I took the Calc BC AP test in the morning, so I was not operating at standard capacity. I tried to continue with OpenCV in C++, but eventually gave up since there are like zero resources online; it is akin to doing TensorFlow in C++. I switched back to Python, which I barely know, and then had to continue on the problem where the stereoscopic thing doesn't actually work. I'm still pretty sure it's the camera calibration. Calibrating the camera is done the same way, with the chessboard algorithm for the camera distortion and stuff, but I don't really remember what happened after I started doing it, other than that I never got it to work.

5/6/2021 17:34:04 HST Gregorian Calendar
This day I took the US History AP test in the morning, so I was not operating at standard capacity. I gave up on the soft body simulation with springs for now and repurposed the code to simulate n-body gravitational systems, which was easy; it only took about an hour. Then I wanted it to calculate the conic for a 2-body system from an initial condition, because I wanted to simulate trajectories of sub-orbital ICBMs and do a feedback loop like PID or something to guide them. I still don't know physics and eventually ended up on the "Orbit equation" Wikipedia page (https://en.wikipedia.org/wiki/Orbit_equation), which says it is for a super small body and a really big one, so the acceleration on the bigger body is negligible. The equation for the conic is in polar form: r = (L^2 / (m^2 * u)) * (1 / (1 + e*cos(theta))), where r is a function of theta and everything else is constant. Theta is the angle from the periapsis (I assume periapsis is what I think it is from Kerbal Space Program), m is the mass of the smaller body, and u is the standard gravitational parameter, which equals the gravitational constant times the mass of the bigger body; that part is easy since in my code the gravitational constant = 1. L is the angular momentum, which the page says is m*r^2 times the rate of theta, and the problem is theta is measured from the periapsis, which you need to solve the conic to get in the first place; but another Wikipedia page says L is equal to r*m*v, where v is the velocity perpendicular to the center of the system (since one body is way larger, the center of the system is just assumed to be the larger body).

Assuming the equation for L is correct, the only variable left is e, the eccentricity of the orbit. It says that e = sqrt(1 + (2*E*L^2)/(m^3 * u^2)), where E is the "energy of the orbit". I had no idea what the "energy of the orbit" means or how to get it, but another page called "Orbital eccentricity" has a similar equation with different letters and calls E the "orbital energy", and it links to another page called "Specific orbital energy", where E = (v^2 / 2) - (u/r), with u = gravitational constant * (m1 + m2), but m1 is so small that u stays the same as before, and v is the orbital speed. Then yet another page says the orbital speed at an instant is v = sqrt(u * (2/r - 1/a)), where a is the length of the semi-major axis; but to get the semi-major axis you need to know the conic, which needs e, which needs E, which needs v, which needs the semi-major axis? So I guess the 2-body problem isn't actually solved? There is an alternate equation for E, which is -u/(2a), but a is still the semi-major axis.
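For what it's worth, here is a sketch of how the conic could come straight out of one position/velocity sample, using per-unit-mass ("specific") quantities so the small mass cancels, with G = 1 like in my sim; the numbers are made up and this is untested, just a translation of those Wikipedia formulas:

import numpy as np

G = 1.0                        # same convention as the sim
M = 1000.0                     # made-up mass of the big body
mu = G * M                     # standard gravitational parameter

pos = np.array([10.0, 0.0])    # made-up position of the small body
vel = np.array([0.0, 8.0])     # made-up velocity

r = np.linalg.norm(pos)
v = np.linalg.norm(vel)

energy = v**2 / 2 - mu / r                   # specific orbital energy, no semi-major axis needed
h = pos[0] * vel[1] - pos[1] * vel[0]        # specific angular momentum (2D cross product)
e = np.sqrt(1 + 2 * energy * h**2 / mu**2)   # eccentricity
a = -mu / (2 * energy)                       # semi-major axis (bound orbits have negative energy)

print(f"e = {e:.3f}, a = {a:.3f}, periapsis = {a * (1 - e):.3f}")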

So this week Wikipedia didn't tell me how to do the 2-body problem.

0 comments

4/26/2021

4/26/0021 -31:255:255 EGT Julian Calendar
Ok so after bringing the Mac up to the room and harassing the guys trying to present their capstone, I continued the soft body thing with the weird classes holding pointers to each other. I did a thing last week where the balls keep a pointer to a list of every spring they are connected to, and each spring has a pointer to every ball it is connected to. I fixed all the stuff where they blow up when classes have data of each other, but this time, for some reason, the pointers weren't matching up. It was asserting false when checking if a pointer is part of the spring, and it took like an hour for me to find.

After I left class I continued, and I don't even know how this works, but because the ball and spring classes have functions that call each other and one has to be initialized first, I did a partial initialization of the spring by declaring the class, then the ball class was initialized, which declares a vector of pointers to springs and all the other functions that don't need the spring, then the spring class is initialized with everything (except one function that requires a function in the ball which requires a function of the spring, but that isn't important), then the functions of the ball that need the spring are initialized. But because the vector in the ball was initialized and had its element size set while the spring wasn't initialized yet, it would store the pointer data in the wrong format, so pushing a pointer onto that vector stores it off by a couple of bit shifts, and when that pointer is used to get another pointer, it is shifted by the same amount, which grabs its pointers wrong too. I am going to have to do some really weird stuff to fix this; it's such an obscure issue with pointers that there is like no one online to help.

4/28/2021 15:48:33 HST Gregorian Calendar
The previous day my advisor said I hadn't done the second college counseling meeting, so I signed up for tomorrow, which was this day, since if I split it over 2 days I would probably forget or something. I listened for 20 minutes on how to screw computer science majors for minimum wage, which was actually very insightful, then had to go to college counseling until 3 o'clock, and I think I made a little kid late for his piano practice by staying too long. When I came back I stole a drill to give to Will, then spent the rest of the time extracting information about isochrone fitting from Will while he almost fell and broke the telescope. It's something like taking the luminosity, color, and type of a star to find its age.

4/30/2021 15:55:00 HST Gregorian Calendar
This day I got back to the block matching thing. I decided to try to use OpenCV's algorithm instead of the crappy one I wrote in 20 minutes that gets it wrong half the time. I was going to try to extract images from the .MOV video of Nick's drone, but decided to first try to get OpenCV working on some online test images. It took a while to find a tutorial on how to do OpenCV stereoscopic stuff in C++, and you have to do this weird calibration thing first. The calibration, or at least what the tutorial was saying, uses images of a chessboard since it has hard edges between the black squares and white squares and the math is easy to do on squares, except I don't actually know the math, because hypothetically OpenCV should handle all the hard stuff. Also, by the way, I don't know if I said this yet, but the calibration is to measure the camera lens and distortion and stuff so the stereoscopic algorithm actually works. So then I spent a while looking for a chessboard to take pictures of, then gave up, then tried taking pictures of a chessboard image from Google displayed on my computer screen, but the glare in the conference room kept messing it up since the screen is shiny, and I had to do it under the table pointed towards the window. Then I actually didn't know the format to pass the image into the function; the tutorial said monochrome, or 8-bit RGB, but the input is an array of integers?

Not that that was the bottleneck, since I copied the drone video into the Documents folder where I was writing the program, it got copied to the cloud, and I had two copies of the drone footage for some reason, so it was 3.8 GB * 2 = 7.6 GB, and I spent a long time wondering why the text message with the image wasn't going through. Also, another thing I just realized is that if I apply OpenCV to every frame of the video and Qt sucks in the whole video to convert to images, even if it's just one timestamp, it would use at least 3.8 GB of RAM (technically virtual memory, but who cares) out of 8 GB, which is a lot, and if it fills all the way up it will spill to storage, which will make it super slow, but it will technically still work.
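The calibration step the tutorial was describing looks roughly like this; a sketch in Python (the C++ API uses the same function names), with a made-up folder of chessboard photos and an assumed 9x6 inner-corner board:

import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # inner corners of the chessboard, assumed
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib/*.jpg"):              # made-up folder of chessboard photos
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)  # 8-bit grayscale is what the detector wants
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]              # (width, height)

# The camera matrix K and distortion coefficients are what the stereo step needs.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("camera matrix:\n", K)
print("distortion coefficients:", dist.ravel())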

So this week an unintentionally bit-shifted pointer to a pointer messed things up, then I watched Will fail at fixing the telescope, then I had no idea what I was doing programming OpenCV in C++.

0 comments

4/19/2021

4/19/2021 17:02:44 HST Gregorian Calendar
Like a week ago I saw a video on how to do soft body physics, and it's just a bunch of balls connected with springs. I haven't taken physics yet, but apparently the force a spring makes is linearly proportional to how far it is stretched from its rest length, so I figured I could program it in like 15 minutes. I got an old C++ program that already had some ball colliding physics and set it up: 2 classes, a ball class and a spring class. The ball class would have a list of springs, and the spring class would have its 2 balls. But that meant the 2 classes contained each other, which would use ∞ memory, and the compiler said something weird, so I changed it so that instead of a copy, each one just has a pointer to the ball/spring in the spring/ball class. But they had to use each other's names, and one of them had to be compiled before the other, so one class name had to be declared with none of the structure, then the other class initialized with pointers to it, then the original class initialized with pointers to the other class passed in. It was the first time I had to do this weird class nesting, so it took a really long time to figure out, but there were like a million people on Google who had the same problem.
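The balls-and-springs idea in its smallest form is something like this; a toy sketch in Python with made-up numbers, just Hooke's law plus Euler steps (the real thing is in C++ with the classes described above):

import numpy as np

pos = np.array([[0.0, 0.0], [2.0, 0.0]])      # two balls, one spring between them
vel = np.zeros_like(pos)
rest_len, k, mass, dt = 1.0, 5.0, 1.0, 0.01   # made-up spring constant, mass, timestep

for _ in range(1000):
    d = pos[1] - pos[0]
    length = np.linalg.norm(d)
    direction = d / length
    # Force proportional to how far the spring is from its rest length,
    # pulling the balls together when stretched and pushing apart when compressed.
    force = -k * (length - rest_len) * direction
    vel[1] += force / mass * dt
    vel[0] -= force / mass * dt
    pos += vel * dt

print(pos)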

4/21/2021 15:28:00 HST Gregorian Calendar
I was about to continue the soft body simulation, but then the white-haired guy made everyone go outside to look at Will's telescope, and then instead I had to do the solar panel counting thing. I thought of just using black thresholding, but when I saw the video there were big shadows, and the road is black colored and so was water, so it wouldn't actually work as easily as I thought. So instead I thought of using the block matching thing I wrote for the farm bot and just applying it to the drone footage, except it would actually work better, since block matching was made for videos, not just 2 frames. The block matching would basically be able to tell how far away something is from the drone, so if there is black color on a roof rather than on the ground, it will see the roof as higher. I don't know if that should be another input to an AI, or whether to do the black thresholding plus the depth from block matching. I started to download Nick's video, but it was like 4.8 GB, so it would take like an hour on the crappy laptop. I messed around with the old block matching code (I don't really remember what exactly), and I tried googling how to convert .MOV files to images in C++ but ended up reading about one of the founders of Reddit and the Sahara Desert.
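Extracting frames from the .MOV is the kind of thing OpenCV can do one frame at a time, without sucking the whole 4.8 GB into memory; a sketch in Python (the C++ VideoCapture API works the same way), with made-up file names:

import os
import cv2

os.makedirs("frames", exist_ok=True)
cap = cv2.VideoCapture("drone.MOV")        # made-up file name
i = 0
while True:
    ok, frame = cap.read()                 # streams one frame at a time
    if not ok:
        break
    if i % 30 == 0:                        # keep roughly one frame per second at 30 fps
        cv2.imwrite(f"frames/frame_{i:05d}.jpg", frame)
    i += 1
cap.release()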

This week I messed around with classes in C++, then got distracted trying to do block matching on drone footage.

0 comments

4/12/2021

4/12/2021 Gregorian Calendar 16:11:56 HST
This day I spent the whole time trying to learn Python to do the solar panel counting AI. I have the MNIST AI working, but the way it loads the data is through Keras, which puts it into a numpy array and everything for you. If I wanted to make an AI that takes an image, I would have to do something similar. The MNIST thing is in a 60000x1x28x28 numpy array (the 60 thousand is how many images) or whatever it's called, but I had to first figure out how to allocate an array with that structure, and mine would be like 60000x1x128x128x3 (3 because RGB) or whatever size the image is going to be (and probably not 60000, more like 100, since there isn't that much data). Python doesn't do memory stuff the same way; my usual worst case in C is to just manually push the memory into the array, but Python can't do individual register shenanigans, so I didn't know how to put an image into an array format.
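The part I was stuck on ends up looking something like this; a sketch where the folder name and the 128x128 size are placeholders:

import glob
import cv2
import numpy as np

paths = glob.glob("solar_panels/*.jpg")                # made-up folder of images
images = np.zeros((len(paths), 128, 128, 3), dtype=np.float32)

for i, path in enumerate(paths):
    img = cv2.imread(path)                             # already an (h, w, 3) numpy array
    img = cv2.resize(img, (128, 128))
    images[i] = img / 255.0                            # scale to 0-1 like the MNIST example

print(images.shape)                                    # (N, 128, 128, 3), ready for Keras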

4/14/2021 Gregorian Calendar 15:29:06 HST
Today I fixed the weird bug in the block matching, like 5 weeks later. It was just the threading overlapping: in the for loop it counted up to the height of the image divided by the block size, but if that isn't a whole number it returns a float, and I was accidentally saving it as an integer, so it would truncate. Then the for loop allocating the threads would make the last 2 collide and overwrite each other's output, messing up the memory in the array, which when displayed looked like a white pixel instead of red since it's just garbage.
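The fix boils down to rounding up instead of truncating when splitting the rows into blocks; a quick illustration in Python with made-up numbers (the real code is the C++ threaded version):

import math

height, blocksize = 1080, 64               # made-up image height and block size

n_blocks = math.ceil(height / blocksize)   # 17, not the truncated 16
ranges = [(b * blocksize, min((b + 1) * blocksize, height)) for b in range(n_blocks)]

# Every row belongs to exactly one block, so threads can't stomp on each other.
assert sum(stop - start for start, stop in ranges) == height
print(ranges[-1])                          # the last block is just shorter: (1024, 1080)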

So this week I failed to do an AI because Python is weird, then messed around with multithreading until I fixed a minor bug in a previous program that will also be an input into an AI.

0 comments

3/29/2021

3/29/2021 17:12:50 HST Gregorian Calendar
Today I found out about the gamma function while reading about fractional derivatives and integrals on Wikipedia, and instead of doing useful work I wanted to graph it on the weird 4D imaginary-number grapher I made in OpenGL last year. I went to the old computer and, unsurprisingly, all the stuff was still there. I tried to compile, but it kept saying that Qt was messed up, so I reinstalled Qt for like 10 minutes, but then it said that OpenGL couldn't be included, and I had to dig out my old .pro file and add QT += QOpenGL or something like that to the include directory to fix the makefile, but it said that couldn't be found. Eventually I gave up, since it was kind of useless anyways, and just watched Will fail at navigating SkyX menus for 10 minutes.

3/31/2021 17:11:09 HST Gregorian Calendar
Today I helped Will with the telescope by recording the obstructions on it. He decided to use like a 50 W laser, and I thought a 500 mW laser was powerful. I don't know why he didn't just use some regular red-dot laser anyway. I just recorded what Nick shouted out. There was something about pitch and orientation, which I guess is good enough to represent a unit vector, since it is only a direction and you don't care about length.

4/02/2021 13:26:47 HST Gregorian Calendar
Today I tried to do the PID with some motor data I had from home. Obviously I couldn't actually test it, but I could at least do the outline in C++ and try to feed the data in later. The problem happened when I tried to graph the position of the motor against the target position: for some reason the scrolling messed up the scene and the application wasn't clearing the window for each new frame, which, while looking cool, made it so I couldn't really see what was going on. I decided to reinstall Qt since I hadn't updated it in like 6 months and this computer was running like a 2008 version of macOS, but while installing, brew kept saying that files had the wrong hash and stuff was disconnecting client-side, using curl. I never thought that would actually happen; maybe the internet was just bad and too many packets were dropped (idk networking), or there was an impostor server sending viruses. Anyways, I decided to just wait until I got to better internet, and when I tried to write a weblog the server loaded super slow and I had to connect through HTTPS with a bad certificate since HTTP didn't work (I thought that if HTTP doesn't work HTTPS shouldn't either, but I guess not).
Download file "Screen Shot 2021-04-02 at 3.20.42 PM.png"

So this week I failed to get OpenGL to work, stole a drill, copied down some orientation numbers, and failed to do a simple pixel-by-pixel graph.

0 comments

3/22/2021

3/22/2021 15:24:09 HST Gregorian Calendar
Today I tried to get serial working on the Arduinos at the elab. I plugged one in and had to fish out the device name from /dev/, but for some reason, no matter what, the computer wasn't getting what the Arduino sent. It would get some random bytes, but they weren't what was supposed to be sent. I thought it might be a bad connection because the bytes looked like some were missing, so I plugged it into a different USB port on the computer, but then I forgot the device name changes and spent a while confused until I realized I had to re-find the name in /dev/. I didn't actually get the serial port working today.

3/24/2021 15:03:47 HST Gregorian Calendar
I got my previous Arduino code, and it turns out it was just the baud rate being set to 921600 instead of 115200: when I switched over to C++, the weird library didn't have a setting for 921600, only up to 115200 (maybe it was written in like 1980?), but after that it started working. The Arduino at home for some reason took about a second before the serial port would work and stuff could be received on the Arduino, but at school for some reason it doesn't do that and works immediately. I still don't know if it is the Arduino or the computer, because I am using a different one for both. Then I realized I couldn't really work on the PID because I didn't actually have the motor, H-bridge, and power supply, so I can't collect data to feed into the PID, but at least I know the weird 1-second bug, which isn't actually that big of a deal, is localized to Macs and 168P Arduinos only (basically not 328P Arduinos).
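The computer side of the serial link is short; a sketch here in Python with pyserial just to show the shape (my actual code uses the C++ library, the device path is only an example of the kind of name that shows up in /dev/, and the baud rate has to match the Arduino sketch exactly, which was the whole bug):

import serial   # pyserial

port = serial.Serial("/dev/cu.usbmodem1101", baudrate=115200, timeout=1)
line = port.readline()       # read one newline-terminated message from the Arduino
print(line)
port.close()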

So basically this week I failed to get a serial port to work, then got it to work, but then didn't actually do any PID because I have no motor. I could technically do PWM, because online it says the 328P has the right timer registers and pins set up.

0 comments