dev

I’ve been using Jekyll for years to generate static sites for my blogs. In many cases it’s enough to just push the Markdown document, but sometimes it helps to preview it beforehand to catch broken links or simply to review it. So I decided to set up a dev environment for my blogs.

The Easy Way

In the past it used to be as easy as running the commands below, taken from the Jekyll docs:

gem install jekyll bundler

Then navigate to the blog folder in a terminal and run:

bundle exec jekyll serve --drafts

The Hard Way

It turns out that on macOS Mojave (10.14) this doesn’t work anymore. You have to run the command as root, which is not advised. So I followed these commands instead:

echo '# Install Ruby Gems to ~/gems' >> ~/.bashrc
echo 'export GEM_HOME=$HOME/gems' >> ~/.bashrc
echo 'export PATH=$HOME/gems/bin:$PATH' >> ~/.bashrc
source ~/.bashrc

After this I was able to run the command without root, but I was still getting errors. I found the solution on a forum thread:

open /Library/Developer/CommandLineTools/Packages/macOS_SDK_headers_for_macOS_10.14.pkg

I ran the command and followed the wizard. After that I was able to install Jekyll and run it on my local machine.
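With the gem path exported and the SDK headers installed, the full sequence from the top works without root (the blog path below is a placeholder):

gem install jekyll bundler
cd path/to/blog
bundle exec jekyll serve --drafts

Jekyll then serves the site locally, at http://localhost:4000 by default.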


linux

To enable WSL, run the following command in an elevated PowerShell prompt and confirm the restart after the installation:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux

Now we have to install a Linux distro. Launch the Microsoft Store app and search for Ubuntu. In my case I used the Ubuntu 18.04 LTS version.

Once it’s downloaded, the installation starts automatically.

And finally, set a user when prompted on first launch.
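Once at the Ubuntu shell, a standard Ubuntu command confirms which release is running:

lsb_release -a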


docker

To install Docker on a Raspberry Pi, run the following commands:

sudo apt-get install apt-transport-https ca-certificates software-properties-common -y

curl -fsSL get.docker.com -o get-docker.sh && sh get-docker.sh

sudo usermod -aG docker pi

curl -fsSL https://download.docker.com/linux/raspbian/gpg | sudo apt-key add -

Then edit sources.list

sudo nano /etc/apt/sources.list

and add the following line at the end:

deb https://download.docker.com/linux/raspbian/ buster stable

Make sure to replace the version name (buster above) with the one matching your OS release.

Update and upgrade the OS:

sudo apt-get update
sudo apt-get upgrade -y

and start Docker service:

sudo systemctl start docker.service

Reboot the Pi and, when it comes back online, run the line below to test the installation:

docker info

It should print diagnostic information about Docker.
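For a more end-to-end check, the standard hello-world image pulls and runs a tiny test container:

docker run hello-world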


linux

Insert the SD/Micro SD card and run:

diskutil list

Then

sudo dd bs=1m if=path_of_your_image.img of=/dev/rdiskn conv=sync

where n is the disk number found in the first step.

If you get a “resource is busy” error, run the following first:

diskutil unmountDisk /dev/diskn
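For example, if diskutil list showed the card as /dev/disk2 (a hypothetical disk number), the full sequence would be:

diskutil unmountDisk /dev/disk2
sudo dd bs=1m if=path_of_your_image.img of=/dev/rdisk2 conv=sync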

tips & tricks

In many cases I need to run applications as administrator in Windows. I also like pressing the Windows key on the keyboard and starting to type the name of the application.

The problem is that, to run the application as an administrator, I have to switch to the mouse: right-click on the name of the application and select Run as administrator.

To avoid this I found a shortcut: Shift + Ctrl + Enter. This displays the privilege escalation confirmation dialog, and I can simply Tab to it and press Enter. All of this without reaching for the mouse!

tips & tricks

Like many people, I use Chrome as my main browser, and I like the experience both as a user and as a developer. But as a developer I constantly bump into caching issues: I fail to see my changes because everything is cached in the browser.

One way of clearing cache is:

  1. Open Developer tools
  2. Go to the Application tab
  3. Clear all application data

This was my preferred method before I got a tip from a colleague:

  1. Open Developer tools
  2. Right-click on the refresh icon

It shows three options to clear the cache: Normal Reload, Hard Reload, and Empty Cache and Hard Reload. Note that these options are only available while the developer tools are open.


tips & tricks

Using multiple monitors is great, but quite often I have to disconnect them and move my laptop around. When I connect them back, some windows act weirdly; for example, a minimized window restores to a now non-existent monitor when I try to bring it back up.

To fix that issue, I got this quick tip from a colleague:

Instead of clicking on the application in the taskbar, Shift + right-click it. The context menu has a Maximize option, which brings the lost window back to the main monitor.

tips & tricks

I like LINQPad for prototyping C# applications and trying out short snippets. In many scenarios I need to see the output of what I’m trying out. I used to treat my snippets as if they were small console applications and used Console.WriteLine() statements to display the output.

Not anymore! In a tech video on YouTube I saw this option and loved it:

In LINQPad there’s a generic extension method called Dump(). It writes the value to LINQPad’s results panel, just like Console.WriteLine output, but in a much more concise way.

For example:

var nums = new[] { 1, 2, 3, 4, 5, 6 };
var sum = nums.Aggregate((a,b) => a + b);
Console.WriteLine(sum);

This displays 21, and it can be shortened with the Dump() method:

var nums = new[] { 1, 2, 3, 4, 5, 6 };
nums.Aggregate((a,b) => a + b).Dump();

It shows the same result but in a shorter way.
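One nice detail: Dump() returns the value it was called on and also accepts an optional description string, so intermediate values can be inspected inline. A small sketch:

var nums = new[] { 1, 2, 3, 4, 5, 6 };
// Dump() returns its input, so it can sit in the middle of a chain
var sum = nums.Dump("input").Aggregate((a, b) => a + b).Dump("sum");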


devops

Some time ago I developed a script to back up my GitHub account and blogged about it. The idea is to always have a backup copy, for redundancy.

But what if, instead of a local copy, we had two online copies? I’ve learned that this is very easy to achieve, as Git supports pushing to multiple remote repositories at the same time.

For this I created a repository in AWS CodeCommit, which is free up to 50 GB.

The key to achieving this is the following command (note that adding a push URL this way overrides the default push URL, which is why the original URL has to be re-added first):

git remote set-url --add --push

Steps to push to multiple repositories

01. First, list the current remotes to find the original push URL:

git remote -v

02. Copy the push URL and run the following command with it:

git remote set-url --add --push origin {push URL from above}

03. Then run the same command with the 2nd remote, like this:

git remote set-url --add --push origin ssh://git-codecommit.eu-west-2.amazonaws.com/v1/repos/{repo_name}

Testing the new repo

When you run git remote -v again you should see something like this:

origin	git@github.com:volkanx/{repo_name}.git (fetch)
origin	git@github.com:volkanx/{repo_name}.git (push)
origin	ssh://git-codecommit.eu-west-2.amazonaws.com/v1/repos/{repo_name} (push)

Now put a test file in the local repo and push it with:

git push -u origin master

It should appear in both remote repositories. It also works well with GUI tools (I use SourceTree).

This way I get an online backup without much hassle, and I can change providers whenever I want.


dev, aws, ses

It’s important to read the titles on AWS console pages. The reason I decided to post this is that I saw other people in forums making the same mistake. In this case, I’m talking about SES rule sets.

The problem with the user interface design is that the “Create a New Rule Set” button is the one you see most easily. You can create your rules inside a new rule set, save it, and expect it to take effect instantly, but the new rule set would be inactive. So if you want your rule to start working right away, make sure to click the blue button first, which adds your rule to the Active Rule Set.

No other AWS service makes this kind of distinction in the console, which I guess is what causes the confusion for me as well as for other people. Hopefully this post comes in handy for someone having the same issue, or at least raises some awareness.

dev, aws, iam

Normally the sign-in URL for IAM users is in this format:

https://{Account Id}.signin.aws.amazon.com/console

But it is possible to make this URL more memorable and user-friendly.

To achieve this, click the Customize link shown to the right of the sign-in URL on the IAM dashboard.

This brings up the Create Account Alias dialog box. Since the name we provide is used in the URL, it needs to be globally unique.

If you click Customize again, it asks whether you would like to delete the alias and go back to using the account number, so the change is easy to revert.
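The alias can also be managed from the AWS CLI; a quick sketch, with a hypothetical alias name:

aws iam create-account-alias --account-alias my-memorable-name
aws iam list-account-aliases
aws iam delete-account-alias --account-alias my-memorable-name

After creating the alias, the sign-in URL becomes https://my-memorable-name.signin.aws.amazon.com/console.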


dev, csharp

Today I was looking at a code sample and I noticed a rather unusual syntax in a while condition. It looked like this:

int T = Int32.Parse(Console.ReadLine());
while(T-->0)
{
    // Do stuff
}

I started looking around for a --> operator, assuming it was a new addition to the language, but it wasn’t listed in the C# operators list.

Playing around with this code in a sample console application, I realized that it’s just a decrement operator and a greater-than comparison grouped together. So it’s essentially equivalent to this:

int n = 4;
while ( n-- > 1 )
{
    Console.WriteLine($"Hello World! {n}");
}

But it reads better when it looks like a single operator. The same can be done with the increment operator and a less-than comparison, but it looks a bit uglier:

int n = 4;
while ( n ++< 8 )
{
    Console.WriteLine($"Hello World! {n}");
}

It can also be used in a for loop:

var items = new int[] { 1, 2, 3, 4, 5 }; 
var m = 4;
for (; m --> 1 ;)
{
    Console.WriteLine($"Hello World! {m} {items[m]}");
}

which would print

Hello World! 3 4
Hello World! 2 3
Hello World! 1 2

I don’t think I would use this notation in my own applications, but it might be helpful to know when reading somebody else’s code.


sysops

Yesterday I had to find the count of objects in a folder in an S3 bucket. I only had access to AWS via the command line and was working on a Windows Server.

After a bit of digging around, I found a solution using PowerShell’s Measure-Object cmdlet.

The solution to get the object count was:

aws s3 ls s3://{bucket}/path/to/files | Measure-Object

This works on local folders as well. Measure-Object can also compute the minimum / maximum / average / total of the sizes, which is quite handy for getting some quick stats about a folder or bucket; see the sketch below.
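For example, a rough sketch for size stats, assuming the default aws s3 ls output format (date, time, size, key) with the size in the third column:

aws s3 ls s3://{bucket}/path/to/files --recursive |
  ForEach-Object { [long]($_ -split '\s+')[2] } |
  Measure-Object -Minimum -Maximum -Average -Sum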

While trying this out again, I had to install the AWS CLI on my Mac. Just for future reference, on macOS you can install the AWS CLI with Homebrew as follows:

brew install awscli


sysops

Today I learned an easy way to retrieve stored Wi-Fi passwords on a Windows machine without installing any external application. I picked this up from an article online.

All it takes is two commands.

First, list all the previously saved Wi-Fi profiles with:

netsh wlan show profiles

Then copy the profile name and replace ConnectionName in the command below:

netsh wlan show profile name="ConnectionName" key=clear

The password is shown in the Security settings section of the output, in the Key Content field.


utility

I often need to combine PDFs when I do my weekly planning. As a Windows user I didn’t have an easy way of doing it (free applications generally come with bloatware), so I developed a small console application in C# to merge PDFs into one. When I decided to switch to a Mac for daily usage, I needed to achieve the same task again. Fortunately, the solution is already built into the Mac.

Apparently, the Preview tool allows you to drag and drop other documents in as pages, so that’s one easy way of doing it. But I wanted to automate the process and found an article showing how to do it via the command line.

The solution

So here’s how it works: basically, there’s a Python script shipped with macOS that does the job:

"/System/Library/Automator/Combine PDF Pages.action/Contents/Resources/join.py" -o Output.pdf /SomePath/Input1.pdf /SomePath/Input2.pdf /SomePath/*.pdf

That’s essentially it. To make it a bit easier, a symbolic link can be created:

cd /usr/local/bin
sudo ln -s "/System/Library/Automator/Combine PDF Pages.action/Contents/Resources/join.py" PDFconcat

The article doesn’t mention the symbolic-link parameter (-s), and without it I was getting an “Operation not permitted” error. I found the parameter in the comments section, and it worked this way.

So now the usage becomes much simpler:

./PDFconcat -o Week.pdf ./Days/*.pdf

It looks like the order of the parameters matters. For instance, the following doesn’t work:

./PDFconcat  ./Days/*.pdf -o week.pdf

It doesn’t show any errors, but it doesn’t produce the output either.


devops, aws, route53

One of my favourite AWS services is Route53. Going through my hosted zones on Route53, I decided to take a closer look at routing policies.

Especially after I read Loggly’s blog post on using Route53 as a load balancer, I thought I should give Weighted Routing a shot.

If you assign an equal weight to all nodes, it works in a round-robin fashion and distributes the load equally. The nice thing is that you can customize the weights to fit your configuration.

Basically, the idea is to create duplicate record sets (e.g. an A record for test.domain.com pointing to 1.2.3.4 and another for test.domain.com pointing to 5.6.7.8), each with a unique identifier and a weight; a sketch of one such record is below.
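As a sketch (the hosted zone ID and values here are hypothetical), one of the two weighted records could be created from the AWS CLI like this:

aws route53 change-resource-record-sets --hosted-zone-id Z1EXAMPLE --change-batch '{
  "Changes": [{
    "Action": "CREATE",
    "ResourceRecordSet": {
      "Name": "test.domain.com",
      "Type": "A",
      "SetIdentifier": "node-1",
      "Weight": 50,
      "TTL": 60,
      "ResourceRecords": [{ "Value": "1.2.3.4" }]
    }
  }]
}'

Repeating the command with SetIdentifier node-2, the same weight, and Value 5.6.7.8 gives the even 50/50 split.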


dev

This is a very old feature really, but I never bothered to check it out before: in Visual Studio you can paste any of the last 20 items copied to the clipboard. Instead of Ctrl + V, pressing Ctrl + Shift + V repeatedly cycles through those items, starting from the most recent and going backwards.

Also, as SQL Server Management Studio is based on Visual Studio, it works in SSMS too!

It’s a handy feature and I’ll try to utilise it more often…


dev, security

I’ve been playing around with SSL certificates for years, but it never occurred to me to ask what the purpose of Extended Validation is, primarily because EV certificates are so far out of my price range that I never even considered them. Today I learned that one benefit of having an EV certificate is having the company name displayed next to the green padlock.

I don’t think it’s worth the effort and the whole lot of extra money, but it sure looks cooler than a small padlock icon alone!


dev, database

A few weeks ago I needed to migrate data from one SQL Server instance on Azure to another. I looked around a bit but couldn’t find a satisfactory solution, so I ended up exporting the data as INSERT scripts and running them on the target server. Yesterday a colleague showed me a far better way to do this using Visual Studio 2015: SQL Server Object Explorer.

It’s a fantastic tool for such scenarios. All you have to do is create two connections, one for the source and one for the target.

Then you right-click on one of them and select Data Comparison, which pops up a comparison dialog.

The rest is just following the comparison wizard: select the source, select the destination, and compare. It displays all the differences, lets you select which ones to apply to the destination, and runs the update. That’s all.