Posts

Project update 5 - Conclusion

After many weeks of testing and reviewing the source code, a small improvement has been made. The information from the previous tests revealed that the hot function was butteraugli::Convolution, and it also revealed that most of the work in the entire application was performed in butteraugli.cc. I will first cover some of my failed attempts and then move on to explain the actual improvement made. One of the first things I tried, before even touching the code, was compiler optimizations; the package already came set at the -O3 level. I played around with all of the settings available, both with GCC and Visual Studio, and everything I did on that front seemed to be notably worse, so I quickly abandoned those attempts, as it was quite clear someone had already worked out the best configuration for this. The next thing I tried was modifying the algorithm within the convolution function. I wanted to convert the four separate loops into one single pass over the data passed in. This ended horribly as it becam
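To illustrate the kind of change I was attempting, here is a minimal sketch of fusing separate passes over a buffer into a single pass. The function and variable names are hypothetical and are not taken from butteraugli.cc; it only shows the general idea of replacing several full traversals with one.

```cpp
#include <cstddef>
#include <vector>

// Original shape: several independent passes, each walking the whole buffer.
void SeparatePasses(std::vector<float>& data) {
    for (std::size_t i = 0; i < data.size(); ++i) data[i] *= 0.5f;    // pass 1
    for (std::size_t i = 0; i < data.size(); ++i) data[i] += 1.0f;    // pass 2
    for (std::size_t i = 0; i < data.size(); ++i) data[i] *= data[i]; // pass 3
}

// Fused shape: one traversal that applies every step to an element
// while it is still hot in cache.
void FusedPass(std::vector<float>& data) {
    for (std::size_t i = 0; i < data.size(); ++i) {
        const float v = data[i] * 0.5f + 1.0f;
        data[i] = v * v;
    }
}
```

The sketch only works because each step here is element-wise and independent; the real convolution loops are not, which is presumably why my attempt went so badly.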

Project update 4 - Testing

Hi, in this post I will be covering my findings during the testing phase of the project. I initially built the application on an AArch64 Linux system; I had no issues building it as per the GitHub instructions under the POSIX section. I used perf report to profile the application. Before I continue I should note that all data obtained from the tests used for this post, on all platforms, was gathered with the flags --verbose --quality 84 on the application run. I tested using a sample PNG file that was provided, and I successfully converted and compressed the file from PNG to JPEG multiple times on both Windows x86_64 and Linux AArch64 platforms. [Before and after images.] I should note that during my testing I did test with several other PNG images and some were rejected by the application for reasons I have not yet narrowed down; this is an example of one such image. The choice of the specific image has no relevance to the test, it just happened to be on my computer so I used it as test dat

Project update 3, change of course

Hi, this update is to let you know that I have moved to another project, guetzli, which on a basic level is an image compressor. The reason I am switching projects is due to time constraints: the previous project had a build system that I did not have the time to sit down and read everything necessary to know how to build, modify and test in a meaningful way. With the new project it only took me a few hours to set up, build, install, attempt to tweak the source, and even do some benchmarking on two platforms. I plan to take a different approach than with the other project: I want to either implement an algorithm change if possible, or alternatively I am open to investigating compiler changes, or even the possibility of inline assembler as an unlikely but not off-the-table option. The code appears to be well written from the portions I have read, so I think I have to take as many angles as possible to have a shot at improving it. Now, I have already identified the hot locations of t

Lab 2

The first project I built for this exercise is bazaar, on an AArch64 system. The build process was fairly simple: the first step was to make sure I had the correct version of Python installed, and in this case the system already had the required version, so I did not have to do anything. I then located the build directory and ran the setup.py file included to build the software. This package was very quick and easy, as it was small and did not have many build dependencies. The next package I built for the exercise is glibc. One of the most notable differences from the previous package is that it has a directory for the build and also one for the source code; this makes it easy to clean things out and rebuild if needed, and just keeps things more organized. The first thing I did was git clone the repository, then I made a new directory for the build. I had no issues running the initial configure and make commands, which built th

Project update 2

I have found a new package at https://github.com/google/guetzli, and it is very likely that I will be switching to this application. I have been able to build and test it on Windows so far, and I am currently putting together a new plan to attempt to optimize the software. My initial observation is that there are a lot of nested loops; it might be possible to gain some performance with an algorithm adjustment in processor.cc, as it appears this file contains the code doing the bulk of the work. I have not yet zeroed in on a specific function to work on.
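As a rough illustration of the sort of nested-loop adjustment I have in mind, here is a minimal sketch; the function and data names are made up and do not come from processor.cc. It just shows hoisting work that does not depend on the inner index out of the inner loop.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Before: the expensive std::exp() call depends only on y, but is
// recomputed for every x in the inner loop.
void ScaleRowsSlow(std::vector<std::vector<float>>& img) {
    for (std::size_t y = 0; y < img.size(); ++y) {
        for (std::size_t x = 0; x < img[y].size(); ++x) {
            img[y][x] *= std::exp(-static_cast<float>(y) * 0.01f);
        }
    }
}

// After: the y-only factor is hoisted out, so it is computed once per
// row instead of once per pixel.
void ScaleRowsFast(std::vector<std::vector<float>>& img) {
    for (std::size_t y = 0; y < img.size(); ++y) {
        const float row_factor = std::exp(-static_cast<float>(y) * 0.01f);
        for (float& pixel : img[y]) {
            pixel *= row_factor;
        }
    }
}
```

Whether anything like this actually applies to processor.cc is something I still need to confirm by reading the hot loops more carefully.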

Project Update

So far the project is not progressing smoothly; however, this was anticipated. The root of the issue seems to be that I am having trouble identifying exactly how the build system is being used and exactly where the assembler code I am interested in is being called, so that I can reproduce a test case. I have transferred the files over to one of the Linux systems, where it was easier to track down the locations I should be paying attention to. I used the first half of the exported symbol to grep search all of the files in the project and found several matches, and I think I have partly identified how the setup works. I have missed several of the labs that I should have completed by now, so it's possible that this would be very clear to me if I had done them. So as things stand, my current plan of action is to revisit the labs to look for information I may have missed out on, and, to prepare for the event that I might not find what I need, I will also in parallel seek out a new project a

My experience so far.

Since the start of taking the course SPO600 I have learned a lot and tried a lot of new things, but to date I have been rather silent about it on the blog, so I'm going to dedicate this post to some of the things that stuck out to me. One of the first things we did was generate a public and private SSH key pair to use for the class servers, and prior to my first class I knew nothing about SSH keys. I am surprised I did not know anything about them before, considering how commonly they are used. An SSH key is basically a long randomly generated number used in a math formula to either encrypt data or decrypt it. SSH keys are normally generated in pairs: the public key is responsible for encrypting data, and only the corresponding private key can decrypt that data. That means even if the public key gets into the hands of an attacker, as long as the private key is kept safe they can't decrypt the data without spending a lot of time brute-force calculating what the private key is
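As a toy illustration of that public/private key idea, here is textbook RSA with tiny numbers. This is not what ssh-keygen actually produces (real keys are vastly larger and the tooling is different); the primes and exponents below are just standard example values used to show the asymmetry.

```cpp
#include <cstdint>
#include <iostream>

// Modular exponentiation: computes (base^exp) mod m.
std::uint64_t ModPow(std::uint64_t base, std::uint64_t exp, std::uint64_t m) {
    std::uint64_t result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1) result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}

int main() {
    // Classic textbook key pair: n = 61 * 53, public exponent e, private exponent d.
    const std::uint64_t n = 3233, e = 17, d = 2753;

    std::uint64_t message = 65;
    std::uint64_t encrypted = ModPow(message, e, n);   // anyone holding (e, n) can do this
    std::uint64_t decrypted = ModPow(encrypted, d, n); // only the holder of d can undo it

    std::cout << "message=" << message
              << " encrypted=" << encrypted
              << " decrypted=" << decrypted << "\n";
    return 0;
}
```

Real SSH keys use much larger numbers and different algorithms, but the asymmetry between the two halves of the pair is the same idea.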