PHP 7.1 on Debian for NGINX

Upgrading to PHP 7.1 was easy; the only issue I encountered was figuring out where the new FPM socket was located. After working that out, I felt writing an article about it would be a good idea and a big time saver for others.

apt-get install apt-transport-https lsb-release ca-certificates
wget -O /etc/apt/trusted.gpg.d/php.gpg
echo "deb $(lsb_release -sc) main" > /etc/apt/sources.list.d/php.list
apt-get update
apt-get install php7.1-fpm php7.1 php7.1-cli php7.1-curl php7.1-mysql php7.1-sqlite3 php7.1-gd php7.1-xml php7.1-mcrypt

Edit the Nginx config and replace the older FPM socket path with the new one.

fastcgi_pass unix:/run/php/php7.1-fpm.sock;

Confused? Replace the whole PHP location block with this.

location ~ \.php$ {
    fastcgi_pass unix:/run/php/php7.1-fpm.sock;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    include fastcgi_params;
}
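Since the socket location was the one thing that tripped me up, a quick check helps before reloading Nginx. This is my own sketch in Python, not part of the original setup; the path below is the Debian default for PHP 7.1 and may differ on your system, so confirm it against the `listen` directive in your FPM pool config.

```python
import os
import stat

def is_unix_socket(path: str) -> bool:
    """Return True if `path` exists and is a Unix domain socket."""
    try:
        return stat.S_ISSOCK(os.stat(path).st_mode)
    except FileNotFoundError:
        return False

# Debian's default PHP 7.1 FPM socket path; adjust if your pool config differs.
print(is_unix_socket("/run/php/php7.1-fpm.sock"))
```

If this prints False, check the `listen` line in /etc/php/7.1/fpm/pool.d/ before blaming Nginx.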

How to score 100/100 on Google PageSpeed Insights

Who does not like blazing fast websites that take no time to load and always serve great content? There are websites with great design and content that still tend to be slow due to many factors, for example images and other assets that are not optimized for the web.

I recently had issues with my internet connection: a 25 Mbps connection dropping below 500 Kbps. Pages took ages to load, and my own site felt slow.

It was a painful four hours, but an eye opener. I felt the discomfort of a user on a slow connection who has to repeatedly refresh to prevent a server timeout or an incorrectly rendered page.

Since then I have decided to do the highest level of optimization for personal and client projects.

When you have a fast internet connection, you don’t care about loading time; all the content renders in less than 3 seconds. The only time you feel the pain is when the connection is slow.

PageSpeed Insights is an industry-standard performance testing tool used by programmers and non-programmers alike. What makes it so great is how it presents information in a simplified way and, when a score does not land in the good (green) zone, recommends changes that follow good standards and practices, making it friendly to non-programmers. Plus, it comes from a tech giant.

Imagine visiting a website and staring at a white screen for 6 seconds even on a fast connection; most people would quit. Not everyone has a fast connection, or the time to watch a page load partially and render in pieces. Then you realize the page has not rendered correctly because only one of its 6 stylesheets has loaded and there are 5 more to go. Feel the burn yet? I almost forgot to mention the high-quality image that has yet to load.

Later you find out it was powered by WordPress and the admin installed every plugin he desired, without thinking about bloating the front end with unused and unnecessary stylesheets and scripts.


According to Google, most users will still be on 3G until 2020, which is slow, but fast for developing countries.

If you score 100/100, you will reach a wider audience, and who would not want visitors from around the globe?

I will cover every area that will help your website or web app score 100/100, but before we do that, here is a list of the advantages.

  • Faster web page.
  • Faster asset loading.
  • Better search engine indexing.
  • Better user experience.

The faster a web page renders, the better; textual content becomes readable much earlier. Instead of staring at a blank screen, waiting, and quitting the page, the user can start reading while the rest of the content loads on a slow connection.

Faster asset loading.

Assets like images, GIFs, and videos are very important parts of today’s web and give the user more visual content instead of a wall of text, but these assets can be heavy at times and take a long time to load due to various factors.

Better Search Engine indexing.

There are millions of web pages a search engine bot has to crawl and index under the right search terms. If a site is slow and takes time to load, it might not be indexed well, and some engines rank sites based on page speed.

Better user experience

The page loads fast, assets load fast, and the website is completely optimized for the device; what more does a user want? Once we have achieved every area of optimization, we can focus completely on content.

The optimization level you can achieve depends on how much control you have over the site.

Image Optimization

As I have said before, images play a big role in our content, but using too many images can slow down the site and consume large amounts of data. We have many image formats, which is a good thing: we get a wider variety to choose from and work with.

For the web, the best formats are JPG and PNG, as they offer compression, and GIF; these are common and widely used formats. You should avoid TIFF and BMP (Bitmap), as they’re not web formats.

You can compress images using tools like Photoshop, GIMP, and others. If you don’t already use any of these, or don’t prefer them for compression, I would highly recommend ImageOptim, an image compressor for Mac. Don’t use a Mac? No problem, there are websites like Optimizilla and TinyPNG that compress images and do the job well.

I personally recommend Optimizilla: it produces the smallest file size after compression and does a great job. Second is TinyPNG, which supports APNG but produces a slightly bigger file than Optimizilla, though that’s just a few KB. Go with whatever is available, as these are online services.

Compress Stylesheets

A lot of sites use multiple stylesheets, especially those powered by WordPress. Having multiple stylesheets in development can be considered fine for debugging purposes, but in production it is plainly render-blocking for a web browser.

I would recommend using a build pipeline to manage stylesheets. That way you can work with multiple stylesheets in development and end up with a single one in production, which will reduce the number of render-blocking requests and the load time.
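As a sketch of what such a pipeline step does, here is a minimal concatenation in Python. This only illustrates the idea; in practice a build tool like gulp or webpack handles it, and the function name here is my own.

```python
from pathlib import Path

def bundle_stylesheets(inputs, output):
    """Concatenate several CSS files into one bundle file."""
    css = "\n".join(Path(f).read_text() for f in inputs)
    Path(output).write_text(css)
    return css
```

Instead of linking each stylesheet separately, the page then links one bundled file.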

With the pipeline in place, we can add another level of optimization that reduces the file size drastically. This process is called minification; here is an example.


body {
    font-family: sans-serif;
    background-color: #fff;
    margin: 0 auto;
}

.container {
    max-width: 1200px;
}

body{font-family:sans-serif;background-color:#fff;margin:0 auto;}.container{max-width:1200px;}

Minification is mostly the removal of whitespace, but removing that whitespace reduces the file size and the amount of blank space the browser has to parse.
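To make the idea concrete, here is a toy whitespace-stripping minifier in Python. It is deliberately naive (it would mangle CSS strings and comments); real minifiers such as cssnano or clean-css do much more.

```python
import re

def minify_css(css: str) -> str:
    """Collapse whitespace runs and drop spaces around CSS punctuation."""
    css = re.sub(r"\s+", " ", css)                 # newlines/indentation -> single space
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)   # no spaces around punctuation
    return css.strip()
```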

Another optimization tip I have seen developers ignore is removing unnecessary selectors, classes, attributes, and declarations. Once a project reaches a certain phase of development, a lot of changes have taken place; in simple words, there are a lot of unnecessary elements that have to be removed in a clean-up phase. For example, if you don’t use tables in any part of the page, it is recommended that you don’t style them in your stylesheet; it is unnecessary information that is not being used by the browser and only increases the file size.
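A very rough sketch of how such a check could work, in Python. It only looks at class selectors and string-matched class attributes, so treat it as an illustration of the idea rather than a real tool; real ones, like PurgeCSS, parse the documents properly.

```python
import re

def unused_classes(css: str, html: str) -> set:
    """Class selectors defined in `css` that never appear in a class attribute of `html`."""
    defined = set(re.findall(r"\.([\w-]+)", css))
    used = set()
    for attr in re.findall(r'class="([^"]*)"', html):
        used.update(attr.split())
    return defined - used
```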

For a better workflow, use pre-processors like Sass, Less, Stylus, or PostCSS.

If you are that dev who uses Bootstrap, then for the love of God, use the pre-processor, include only those partials that are actually used, and minify the output.

I have seen some websites that use Bootstrap 3 and include both bootstrap.css and bootstrap.min.css; not sure why, maybe the dev has reached the next level.

Compress scripts

Minify JavaScript too; here is an example.


var message = "Hello Web, I am the unminified script";
document.write(message);


var message = "Hello Web, I am minified script";document.write(message);

If you use CoffeeScript or TypeScript as a JavaScript pre-processor, you could set the output to be minified and have a pipeline similar to the one I mentioned for stylesheets.


For simple tasks, always use vanilla JavaScript; don’t pull in a framework just for a simple task.


Modern browsers support async scripts; this means you can specify scripts to load asynchronously. However, you should use async only for scripts that don’t have dependencies.

Don’t make dependencies async, as the browser will throw console errors and scripts depending on them will fail to run.

You can achieve this by adding async to the script tag.

<script async src="/js/index.js"></script>

Async does not work with inline JavaScript, so inline scripts like the example below don’t support async.

<script> var message = "Hello Web, I am unminified script"; document.write(message); </script>

Compress web page

Minify HTML. This is not a common practice and the majority of websites miss it. As I mentioned earlier, whitespace is extra space ignored by the browser and an increase in file size; minifying the HTML should make a big impact.

Compressed web page example

<meta charset="utf-8"/><title>Example</title><div class="container"></div>
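Minification like the example above can be sketched with a couple of regular expressions in Python. This is a naive illustration (it would break whitespace-sensitive content such as pre blocks); real minifiers like html-minifier handle those cases.

```python
import re

def minify_html(html: str) -> str:
    """Drop whitespace between tags and collapse remaining runs (naive sketch)."""
    html = re.sub(r">\s+<", "><", html)   # remove inter-tag whitespace
    return re.sub(r"\s+", " ", html).strip()
```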


Pug/Jade is an indent-based HTML pre-processor which by default minifies the HTML output.

Pug/Jade example.

doctype html
html(lang="en")
    head
        meta(charset="utf-8")
        link(rel="stylesheet" href="/css/main.css")
        title Example

Ultimate optimization.
After doing all of this you would score around 90/100, which is a good score, but there would still be render blocking.

Load the complete minified stylesheet inside a style tag in the head; this will remove the stylesheet render block.
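As a sketch, a build step could inline the minified stylesheet into the head. This Python snippet is my own assumption about how you might wire that up, not part of any specific tool; it also leaves any existing link tags in place, which you would remove yourself.

```python
def inline_stylesheet(html: str, css: str) -> str:
    """Insert an inline <style> tag just before the closing head tag."""
    return html.replace("</head>", "<style>" + css + "</style></head>", 1)
```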


You could do the same with scripts if they’re very important; otherwise, if you have scripts after the opening body tag or inside the head, move them to just before the closing body tag.

Leverage browser caching.

Leveraging browser caching is a way to inform the browser to keep specific files on the client, so the next visit is much lighter and faster. These files are assets, stylesheets, and scripts that don’t update frequently.

If you have access to the server, you can set response headers to tell the browser what to cache and for how long.

If you are on shared hosting that uses Apache2 and don’t have SSH access, you can do this by specifying it in the .htaccess file; if you do have server access, don’t use the .htaccess, edit the server config instead.
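For Nginx (which this blog recommends anyway), a caching rule could look roughly like this; the 30-day lifetime is an arbitrary example, tune it to how often your assets actually change.

```nginx
# Cache static assets client-side for 30 days (example value).
location ~* \.(css|js|png|jpg|jpeg|gif|svg|ico|woff2)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```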

Server Optimization

This is for the folks who have server-level control. If the site you are hosting runs WordPress or Drupal, update to the latest stable version.

CMS users should do a plugin compatibility check and then switch to PHP 7; it is faster than PHP 5 and is currently the latest stable version of PHP. Be careful with plugins, as production does not display errors, and you don’t want to risk security or see the white screen of death.

If you are an Apache2 user, SWITCH TO NGINX!!!

Doing all of this optimization correctly should give you 100/100 on Google PageSpeed Insights. If not, or if you run into an issue, drop me a line. We can work together.

Dice roll in Python


Python is a fun language; the syntax is what feels so nice about it, along with how indentation is used to format the code. After using Jade and Sass, this comes very naturally to me.

I was learning about modules and found random to be a good place to start; it turned out to be a very interesting built-in module. To understand how it works I created a dice roll game: with just a few lines of code it generates a random number, which in this context is the side of the die.

As you can see, it runs five times and generates a random number each time.

"""Simple console game that returns random sides of a die 5 times."""

# Importing random module
import random

# Adding initial attempt value
attempt = 1

# Run until five rolls have been made
while attempt <= 5:
    number = random.randint(1, 6)
    print("Your die roll is: " + str(number))
    attempt = attempt + 1
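The same loop can also be wrapped in a small function, which makes the roll count and die size easy to change; the seed parameter is my own addition so that runs can be reproduced while experimenting.

```python
import random

def roll_dice(rolls: int = 5, sides: int = 6, seed=None) -> list:
    """Return `rolls` random die results, each between 1 and `sides`."""
    rng = random.Random(seed)
    return [rng.randint(1, sides) for _ in range(rolls)]

for roll in roll_dice():
    print("Your die roll is: " + str(roll))
```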

This was a great exercise and a very interesting way to learn Python.

BitPay is now available for Windows Phone

Earlier today, I received an email from the BitPay team saying “the BitPay wallet is now available on Windows Phone”.

As we all know, the Copay wallet was available for Windows Phone, but in my opinion it was not usable due to a tremendously slow startup and UI, and there was no good alternative on Windows Phone.

Now with BitPay, we can finally enjoy their service on Windows devices.

I will be publishing a review soon.

Rust on Mac

Rust is a systems programming language gaining popularity as a “safe, concurrent, practical language”. Being memory safe while maintaining performance is the reason so many people are adopting it as their systems language of choice, and it won first place for “most loved programming language” in the Stack Overflow Developer Survey in both 2016 and 2017.

It is open source and sponsored by Mozilla Research. There are many interesting projects, like Piston, a modular game engine; you can find more projects and packages in the Rust package registry.

Install instructions.

Open Terminal and run these commands.

curl -sSf | sh

source $HOME/.cargo/env

Time to write some Rust code.

fn main() {
    print!("Hello World!");
    print!("Rust coding day {}", 1);
}

Rust files are saved with the extension .rs.

To compile this file, we need to run it through the Rust compiler.

The compiler is known as rustc on the command line.


Compiling the file (assuming it was saved as hello.rs):

rustc hello.rs

It will build almost instantly and create a binary executable named hello.

Running hello:

./hello

If you have any questions, leave a comment below.

Blender on Raspberry Pi

In hot weather, with multiple laptops running in the same room, the last thing I wanted to do was render. I did it anyway; the cooling fans instantly turned on and, as a result, the room turned uncomfortably warm.

Ten minutes later, I gave up on the render as I did not want to sit in that oven, and decided to fire up my Raspberry Pi 2 and check whether a build of Blender was available in the repository. It turns out there is a build, not the latest one, but it has everything I need for the render.

I highly recommend doing this on a Raspberry Pi 2 or higher, as it has a sufficient amount of RAM for a decent render and a CPU that can handle BVH (Bounding Volume Hierarchy) building without any issues.

Announcement: I will publish the benchmark results once they are complete; you can subscribe via email to get notified, or leave a comment.

Installing it is like any other application.

Arch Linux

sudo pacman -S blender


Raspbian / Debian

sudo apt-get install blender

The good thing about Blender is that it has CLI support, which lets you render without firing up the GUI.

blender -b file.blend -o /render/frame_##### -F PNG -f -2

Let’s break down the flags used above.

  • -b (render in the background without the GUI)
  • file.blend (path of the blend file to be rendered)
  • -o (location where the render will be saved)
  • -F (override the image format specified in the blend file and save as a PNG image)
  • -f (the frame to render)

To learn more about the command line you can visit the official documentation.

You could also do preview renders, which are quicker than full offline renders.

Digital Sculpting

The first post of my new series of Notes, to help you understand computer graphics better.

Digital Sculpting is one of the most artistic ways of manipulating a mesh and the most preferred way of creating organic forms, for example characters, environment props, rocks, stones, and trees, to name a few. It has given artists the freedom to interact with the mesh like traditional clay.

In programs like ZBrush, you get brushes which are similar to traditional tools like clay, rake, and move.

Before the advancement of dynamic mesh generation, things were very different, and I don’t think anyone who has done digital sculpting would want to go back. Back then you had to draw a cage with NURBS (non-uniform rational basis spline) curves which were filled with loft or birail functions, a process known as patch modeling. It would take hours, and not getting the desired shape meant redoing it. I have never done it myself, but knowing it was the earlier way of the creative process makes me appreciate the development of new-age tools and the advancement of computer graphics.

Today, software like Blender comes with a Sculpt mode, and Modo and Maya come with sculpting toolsets.

I try to publish content like this every Sunday.

RuTorrent on Raspberry Pi

After publishing RTorrent on Raspberry Pi, I did not have enough time to try anything new with the Raspberry Pi, but I later got requests to get RuTorrent working with RTorrent on the Raspberry Pi.
I decided to use a bash-script-based setup instead of step-by-step instructions, because it ends up with the same result but with better security measures, as the script is written by a much more experienced person and is actively maintained.

I would recommend doing this on a clean install for best results.

This setup will contain

  1. libtorrent/rtorrent (Latest/Compiled)
  2. SSH port reassigned
  3. VSFTPD (FTP client) random port assigned
  4. Webmin (Optional) (Admin interface)
  5. autodl-irssi
  6. rutorrent (Web UI)

Download the script

sudo bash -c "$(wget --no-check-certificate -qO -)"

Run the script

sudo bash

The script responds as shown below.

Raspbian GNU/Linux 8.0 (jessie)
Your Server IP/Name is
Is this correct y/n?

Choose y.

The next prompt asks you to set a password to secure the web interface for the user pi.

Set Password for RuTorrent web client
Enter a password (6+ chars) or leave blank to generate a random one
Please enter the new password:

The next step can take some time, depending on the SD card speed and active tasks running in the background.

No more user input required, you can complete unattended
It will take approx 10 minutes for the script to complete
Updating package lists

The final response after the process

crontab entries made. rtorrent and irssi will start on boot for pi
ftp client should be set to explicit ftp over tls using port 48915
If enabled, access https downloads at
rutorrent can be accessed at
rutorrent password as set by user
to change rutorrent password enter: rtpass
IMPORTANT: SSH Port set to 26828 - Ensure you can login before closing this session
The above information is stored in in your home directory. To see contents enter: cat /home/pi/
To install webmin enter: sudo rtwebmin
PLEASE REBOOT YOUR SYSTEM ONCE YOU HAVE NOTED THE ABOVE INFORMATION

Note down the changed SSH and FTP ports before exiting the SSH session; the script writes this information to a file in your home directory for later access.

ftp client should be set to explicit ftp over tls using port 43915
If enabled, access https downloads at
rutorrent can be accessed at
rutorrent password as set by user
to change rutorrent password enter: rtpass
ssh port changed to 21828

The script does even more: it adds various options and update functionality with the help of rtadmin.

sudo rtadmin

It would respond with these options

Select from the following options:
1.) rtgetscripts - update the rtinst scripts
2.) rtadduser - add new user
3.) rtremove - delete a user
4.) rtdload enable - enable http downloads
5.) rtupdate - up/down-grade rtorent/libtorrent
6.) rutupgrade - upgrade to latest version of RuTorrent
7.) rtwebmin - install webmin
Enter option or q to quit

Doing this whole process manually would take a lot of steps and a lot more time; this script does more and simplifies things like updates, upgrades, and plugin management. Thanks to arakasi72; the repository of the script is on GitHub.
If you have more requests for posts like this, let me know in a comment, or contact me directly with feedback.