Papa's got a brand new bag
Ramblings of
Alex Lovett
Tags: - Misc - Web

Now I hate Twitter for various... nay, numerous reasons. Chief amongst them is the fact that, for the very few things I do follow, I want to make sure I read every single thing they say and essentially mark things as read, like how email or RSS feeds work. And so I have finally found a way to get Twitter into an RSS feed to use with a service like Feedly. This used to work in the past, but Twitter stopped it because, well... Twitter are dicks.

This is really easy to set up too:

Link: www.labnol.org --- 28149



There is also this service that can pull links from your feeds into an RSS feed, but I didn't try it:

Link: www.siftlinks.com



Just more ways I can get stuff from my RSS reader Feedly over to my 'read it later' service Pocket... this is why I got this badge of honour after all ;-P




Show comments for 'Twitter to RSS'
Tags: - Games - Unity

The way Unity will work on the web is the most confusing bastard stuff in the world

Unity uses Mono for its scripting
Mono is a custom, partially open-source implementation of Microsoft's C# spec
The part Unity uses is free thanks to the open-source LGPL license it enjoys

Then mobile phones happened, and Apple do not allow the on-the-fly compilation that Mono lets you do for free
This is done via a JIT VM (just-in-time virtual machine), which is typically how C# is run: compiled on the fly using magic

Instead they have to compile beforehand (AOT, ahead of time), which Mono wants moneys for... lots and lots of moneys
So they held Unity to ransom (probably) as Unity became more successful
Unity said FU... people are happy with the 5-year-old version you already sold us
But people were not happy: they cried about weird memory problems, about features that came with C# 4 and 5 that they could only dream of using, and about the debugging issues on top of that.

And now, to get out of this ongoing problem and finally be able to upgrade to a modern C# with better memory management, they will do the following:

C# is compiled to an intermediate language (IL) using the free part of Mono; this is normally then run in real time on the device via the JIT VM (which Apple won't allow)
But instead Unity will convert this intermediate code into C++ source code, which is then compiled, using the Emscripten asm.js thingy made by the Mozilla / Firefox guys, into a weird JavaScript soup that somehow runs fast but looks like computer vomit

Thus taking advantage of a shit ton of free and open source work at either end, work that is even used, and probably contributed to, by their competitor Unreal Engine and by Mono

They just have to take care of this middle step themselves... now do we trust them to do this part properly?

Already they are claiming "What is so exciting about IL2CPP is the big performance speedups however. We have seen 2-3x speedups on math heavy code."

Still, it is a weird place we find ourselves in, where (C# -> IL -> C++ -> JS -> machine code) is a fast, intelligent way of making anything... ever...



Link: blogs.unity3d.com --- the-future-of-scripting-in-unity



Link: forum.unity3d.com --- 247039-The-collected-il2cpp-forum-topic



Link: blogs.unity3d.com --- on-the-future-of-web-publishing-in-unity




Show comments for 'Unity Future'
Tags: - Rant - Tech - Misc

Plex is quite brilliant ( and free )



If you have your own collection of movies (say you got them from the equally brilliant Yify website yts.re), or just home videos or photos or whatever, Plex will run through them and find the ratings, descriptions and poster art, and present them to your TV either via DLNA or via the various native clients it has, such as for the Roku (also brilliant, if soon to be obsolete due to the new and upcoming Amazon Fire TV and Android TV and Apple TV etc). You can also get a native Plex client for iOS (iPad/iPhone) and Android. The source code is available, which leads to things like Rarflix, an enhanced Roku client. And it has plugin support for adding your own channels to navigate various sites, or adding things like OMDb support (also brilliant, a site that gives you an API to access movie information).

So using all this, combined with my new 10 megabytes per second internet (wooyay), I wrote a script that takes my IMDB watch list and magically downloads all the movies. Boom, 300 movies... which naturally I legally own already... *cough* ahem. I have said this before: the movie industry needs to be really careful and stop the infighting. I want to pay monthly for access to all TV and movies and music. I do not want to pay £10 per movie to RENT or buy it. I do not want to have to subscribe to 5 different services, all with varying degrees of support for my devices and even differing quality or limited availability. And they could do all of that for a reasonable price... but they do not. So apps like Popcorn Time (also brilliant if you don't mind the lower quality) emerge, become open source, and suddenly people can freely stream almost any movie instantly for free, and then they get no money. Zip. Diddly. Squat.

I am a paying user of Netflix (also brilliant), but it periodically loses films I wanted to watch due to license issues or whatever. And the selection of films in the UK is a joke compared to the US. But it is still the most convenient way to get at most of the TV series I watch... well, given the 50/50 chance they are even on Netflix anyway. And the Netflix original creations such as House of Cards are significant draws, good for them, as at any point the film studios can, and have, pulled significant numbers of films in an instant out of greed.



*sigh* all alone

Also, after a decade of using AppleScript I finally started using external functions, so you can have common functionality in separate AppleScript files and use it from anywhere else.

It is achieved quite simply with:

set SomeVariable to load script POSIX file "/SomePath/SomeFolder/SomeScript.scpt"
tell SomeVariable to someFunction(someStuff)

Embarrassingly simple, yet I never thought to check it could do that till now. Also, you can use a property instead of a variable to store the script in; then it is saved with the file. So if you make changes to the originally loaded script, it won't affect other scripts you saved that used an older version (till you open them up and resave them). This would prevent, say, you changing something in the loaded script that inadvertently broke many other scripts. The con is obviously that any improvements you make won't propagate either.

Also, I have implemented some multi-threaded-type stuff by using this:
set scptf to "theScriptYouWantToRun.scpt"
set arglist to "arg1 arg2" -- A space-separated string of arguments you want to send to the script
do shell script "/usr/bin/osascript " & quoted form of scptf & " " & arglist & " >/dev/null 2>&1 &"

Which will run a script and send it some arguments / variables. And it doesn't sit around and wait for it to finish (so you can't have any callbacks as such), but it is good for spitting out 100s of processes that are happy to work independently of each other.

-

And in other news, I repaired my TimeBomb / Apple Time Capsule: soldered new capacitors onto it, fixed a resistor onto the fan to keep it permanently running, and cut a hole in the bottom for airflow. On performing this, however, I found someone had already changed the capacitors themselves

:-D

then remembered vaguely that I had in fact bought a refurbished/repaired capsule off eBay.

Thankfully the new AC capsules do not have the sudden death problems any more. This has all been a huge pain in the ass simply because I cannot use my 3TB drive in any external enclosure due to legacy bit issues, ugh. Nothing but a nightmare all round.

It is no wonder I never get any work done... I need to work on my priorities, just after I uhm... write a script to remind me to do things

:-P


Show comments for 'Films and Plex'
Tags: - Photoshop - Games - Unity

Decided to enter the "make a T-shirt design for Unity because they can't be bothered to do it themselves" contest. You can win $300 of Asset Store credit, which might be handy, but I just wanted to enter for fun and because, quite frankly, it only took 1 hour 20 minutes of my time and I am better than anyone else... in my mind

Well, then another 1 hour 20 fiddling with it and making it high-res, but you get the idea; I'm a badass is the point I am trying to make



First Draft below:



How can I not win? Anyone who is clever enough to both deconstruct the original Unity logo from a cube into the 3 arrows / vectors it is made from and use the cube representation as a mathematical exponent should clearly be worshipped as the god they are

Though something to do with angry birds will probably win instead...

In case the symbolism escapes you: the person at the top (who should really be outside the circle, but T-shirt designs are limited to a square area and I have no room) represents one person going in, 'you' effectively, who is then multiplied, or cubed, to the power of Unity, resulting in 3 people all looking rather relaxed. It's you x 3 if you use Unity, basically. Using Unity is thus like having 3 cooler versions of you.

Nerd / programmer humour, I know, but hey, you don't get much more nerdy than the Unity in-crowd now do you

;-)



And in White:





Show comments for 'Unity Shirt'

Warning: serious NERD article ahoy!

Some major changes to the 'blog', fixing a few dozen problems, all very exciting stuff. The most confusing aspect was getting my lazily loaded images (only the visible images load till you scroll the page) working with read-it-later services like Pocket and Readability. I found a workaround in the end for Pocket using PHP's $_SERVER['HTTP_USER_AGENT'] and detecting if it is PHP Pear grabbing my site. Nasty, I know, but the only way to do it for now. Pocket appears to use PHP behind the scenes to parse pages, and so its identity is returned as 'HTTP_Request2/2.2.0 (pear.php.net/package/http_request2) PHP/5.3.28'. Readability is a bit nicer in that it reports as 'Readability/6534b2 - readability.com/about/', and Readability has the handy class for marking content to ignore: 'class="entry-unrelated"'
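
For reference, the sniff is roughly this shape (a minimal sketch only; the helper name is mine, the strings are the user agents above):

<?php
// Rough sketch of the user-agent check described above (helper name is illustrative).
// If the request comes from Pocket's PHP Pear fetcher or Readability's bot,
// serve plain <img> tags instead of lazy-load placeholders, since their parsers never run the JavaScript.
function is_read_later_service() {
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    return stripos($agent, 'HTTP_Request2') !== false   // Pocket (PHP Pear HTTP_Request2)
        || stripos($agent, 'Readability') !== false;     // Readability
}

if (is_read_later_service()) {
    // emit normal <img src="..."> markup, no lazy loading
} else {
    // emit the usual lazy-load placeholder markup
}
?>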

And I just added a comment system, finally! I keep seeing people use Disqus so I gave it a try. Turns out it cannot show more than one comment box per page, which is no good, but I found a workaround... after a few hours

:-/

Nothing is ever simple.

In any case, here is a zoomed-out view of the code that runs the blog. The white code is the pre-processor, which is made of very squirrelly-looking AppleScript and runs on my Mac. The grey code is a mix of PHP, HTML and CSS, which runs on the server side and reads in the XML produced by the AppleScript. As you can see, it is all a horribly ultra-nested sprawling mess, though in my defense I did start this a decade ago and hadn't ever read a single thing on programming.

The whole blog is generated automagically out of folders with TextEdit documents in them, just some RTF files that can hold images and videos. So I simply write an entry in a normal text editor and copy-paste an image or video in. Done. And my AppleScript massages this into XML, exports all the images and videos, uploads the videos to YouTube, syncs the whole site to the web host and so on.

Ah, what kind of crazy bastard sets out and writes his own blog platform and website from scratch? Well, WordPress didn't exist when I started, and I like images a lot on a blog, so copy-pasting and ease of authorship were a must. Otherwise I wouldn't bother. But admittedly, scary amounts of hours have probably been sunk into this over the years, bit by bit adding things.

I just made the whole AppleScript side multi-threaded. Take that, crappy-performing AppleScript code! It will now spin off a separate thread for each RTF file it is processing, speeding the whole thing up by 100s of times. So that's nice. It is also no longer such a monolithic single block of code, as I broke it up into several independent pieces. Go team me.



Below. See, makes perfect sense. The joy of escaping escape characters that are escaped. This abomination is what helps clean up an RTF file into something more legible. It somehow works



I also added caching for PHP page rendering using:

Link: www.phpfastcache.com --- www


My site does a stupid amount of heavy string searching, replacing and other guff that is scary and abhorrent. So if, say, I became really popular and everyone liked me so much as to visit my site 100 times a second, well, the server would explode, and then my adoring fans would be deprived of access to my wisdom.
Typical processing of a page takes 0.2 seconds on a bad day. With caching... 0.001.
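
The idea is just a full-page cache: render once, store the resulting HTML, and replay it until it goes stale. A minimal sketch of the concept (plain file-based, not phpfastcache's actual API; the path, TTL and render_page() helper are made up):

<?php
// Minimal full-page cache sketch (concept only, not phpfastcache's API).
// Cache path, TTL and render_page() are illustrative.
$cacheKey  = md5($_SERVER['REQUEST_URI']);
$cacheFile = '/tmp/pagecache_' . $cacheKey . '.html';
$ttl       = 600; // seconds before a cached page is considered stale

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    // Cache hit: skip all the expensive string mangling and replay the stored HTML
    readfile($cacheFile);
    exit;
}

// Cache miss: render the page as normal, capturing the output
ob_start();
render_page();                        // hypothetical stand-in for the real page-building code
$html = ob_get_clean();

file_put_contents($cacheFile, $html); // store it for next time
echo $html;
?>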

I wrote an AppleScript to spawn curl processes to assault my site as a test and total up the page rendering times with and without caching.
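
(The real thing is AppleScript spawning curl processes, but the gist is just firing off a batch of concurrent requests and summing their timings. A rough PHP equivalent using curl_multi would look something like the sketch below, with the URL and request count made up, and firing the whole batch at once rather than trickling requests in like my script does.)

<?php
// Rough PHP equivalent of the stress-test idea using curl_multi
// (my actual version is AppleScript spawning curl). URL and count are illustrative.
$url      = 'http://localhost/index.php';
$requests = 250;

$multi   = curl_multi_init();
$handles = array();
for ($i = 0; $i < $requests; $i++) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30); // give up on any request after 30 seconds
    curl_multi_add_handle($multi, $ch);
    $handles[] = $ch;
}

// Drive all the transfers concurrently
$running = 0;
do {
    curl_multi_exec($multi, $running);
    curl_multi_select($multi);
} while ($running > 0);

// Sum up the per-request load times reported by curl
$total = 0.0;
foreach ($handles as $ch) {
    $total += curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);

printf("%.3f seconds for sum of all load times for %d page loads\n", $total, $requests);
printf("%.6f seconds avg per download\n", $total / $requests);
?>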

So for example with 5 users hitting the site every 0.1 seconds for a duration of 5 seconds you get:

39 seconds taken to run 5 seconds of access
5 concurrent accesses every 0.1 sec for 5 sec
556.525 seconds for sum of all load times for 250 page loads
2.2261 seconds avg per download

Without caching it takes 2.2261 seconds to serve up a page that took 0.2 seconds to render when only 1 render was being done at once; this will be due to contention for disk access that is only revealed when multiple accesses happen simultaneously

So it basically took 40 seconds to serve up 550 seconds' worth of page renders, so it appears to be effectively achieving 13x at once (on an 8-core Mac)

Now let's try with caching on:

7 seconds taken to run 5 seconds of access
5 concurrent accesses every 0.1 sec for 5 sec
17.886 seconds for sum of all load times for 250 page loads
0.071544 seconds avg per download

Nice. So now it took 7 seconds to run 5 seconds of access; bear in mind I am using the same machine to download the pages as to serve them, so it probably means it is coping 100% now
A difference of total page rendering time of 18 seconds versus the uncached 556 seconds of CPU time
It now takes 0.07 seconds to serve up a page that took 0.001 seconds to render when only 1 render was being done at once

Let's see how far I can push the cached load; given the overhead of downloading from this many threads, that is likely the biggest issue now

20 seconds taken to run 60 seconds of access
5 concurrent accesses every 0.05 sec for 60 sec
8.178 seconds for sum of all load times for 6000.0 page loads
0.001363 seconds avg per download

CPU load while serving up the cached pages is on the left half, and uncached on the right half; plenty of room to spare for the cached. I suspect the red is the server and the green will be me downloading the results. And my machine pretty much locks up while serving the uncached, so...
Imagine how a poor typical 1 x 2GHz server would handle the uncached load instead of a Mac Pro with 8 x 3GHz; it would not be pretty.


It would appear, in any case, that the majority of the CPU use is due to the overhead of downloading, not the serving of the page. But it was a 'fun' exercise to stress test.

Now, all of the above was tested on my local web server; let's try the remote Amazon EC2 instance

With it uncached it hung the server really badly, with some requests taking 40 seconds and many just getting a blank return; only around 370 out of 2000 came back! And those averaged 13 seconds per load

462 seconds taken to run 60 seconds of access
10 concurrent accesses every 0.3 sec for 60 sec
4818.08 seconds for sum of all load times for 373 out of 2000.0 page loads
12.917104557641 seconds avg per download

Let's run it again, as I think my machine is struggling a bit with max process limits

360 seconds taken to run 60 seconds of access
10 concurrent accesses every 0.5 sec for 60 sec
5007.7 seconds for sum of all load times for 370 out of 1200 page loads
13.534324324324 seconds avg per download

Interesting: it craps out at around the same 370 downloads; maybe some kind of spam protection on the server, as I am loading all this from one IP

After thrashing the Amazon server, trying to access my site from a mobile phone (so a different IP) fails completely. It is basically shut down. Nice one, Amazon. And the whole time the Amazon status of the server shows as OK and not under much use.

Definitely odd, as if I run even a few small requests again it takes 30 seconds per request (I time out curl after 30 seconds):
88 seconds taken to run 1 seconds of access
10 concurrent accesses every 0.5 sec for 1 sec
560.322 seconds for sum of all load times for 18 out of 20 page loads
31.129 seconds avg per download

Below are some cached run-throughs; the first run-through is still showing hurt from the prior uncached stress test, after that they are fine again

11 seconds taken to run 1 seconds of access
10 concurrent accesses every 0.5 sec for 1 sec
31.288 seconds for sum of all load times for 20 out of 20 page loads
1.5644 seconds avg per download

8 seconds taken to run 1 seconds of access
10 concurrent accesses every 0.5 sec for 1 sec
0.288 seconds for sum of all load times for 20 out of 20 page loads
0.0144 seconds avg per download

9 seconds taken to run 1 seconds of access
10 concurrent accesses every 0.5 sec for 1 sec
0.389 seconds for sum of all load times for 20 out of 20 page loads
0.01945 seconds avg per download

9 seconds taken to run 1 seconds of access
10 concurrent accesses every 0.5 sec for 1 sec
0.336 seconds for sum of all load times for 20 out of 20 page loads
0.0168 seconds avg per download

And now for fun let's try a free web host ( opened host ) using the cached copy

11 seconds taken to run 1 seconds of access
10 concurrent accesses every 0.5 sec for 1 sec
0.104 seconds for sum of all load times for 20 out of 20 page loads
0.0052 seconds avg per download

OK, let's ramp it up for 60 seconds
152 seconds taken to run 60 seconds of access
10 concurrent accesses every 0.5 sec for 60 sec
3.476 seconds for sum of all load times for 1117 out of 1200 page loads
0.003111906893 seconds avg per download

OK, so that's actually better than Amazon... bastards
Let's try uncached now:

Webhost refuses to respond now

:-P

OK, maybe not better than Amazon... wait, that does appear to be DoS prevention, as I can still access the site from my phone but not from the computer I was using to thrash the server

Anyway

I could just have used a service like LoadImpact

:-P

which lets you test for free up to 250 users (not connections, but users with some kind of average page downloads per minute) and pay for more based on use:

Link: loadimpact.com



But it doesn't hit the server as hard as my own script does

When using LoadImpact at 250 users, the graph looks like this, resulting in 1478 page loads over 2 minutes (when not caching the page)




Ouch, over 7 seconds load time; though my internet is running off crappy ADSL at the moment, waiting for fibre to be installed, and this doesn't include serving up the actual HTML and images, that's just to process the page and return a number

And now with it cached:





OK, let's test my Amazon EC2 hosting



huh it is still loading stuff from my domain... *tries again*



Ah that is better

And with it uncached:



Identical, but I suspect the test was still cached

CPU graph on Amazon:



And for future reference, here is what the site currently looks like:




Show comments for 'Blog Engine'