Sunday Nerding: Soyuz Clock

Posted in Technology

A while back, I published a link to a series of videos about the restoration of an Apollo Guidance Computer (AGC) – the computer that allowed man to travel to and land on the moon in 1969.

Now the same team have been working on a Soviet-era clock from a Soyuz spacecraft.

The clock is in pretty good condition, so it won’t take the time and effort the AGC restoration did, but for electronics and space geeks, this is pretty cool too.

I will add to this playlist as more installments of their work are released.

Bitly’s V4 API – Sample PHP Code


Recently Bitly have rolled out version 4 of their API, and if you’re using the old API, it is quite a change.

While users were asked to move from older versions to the new version before March 1st 2020, the version 3 API was to hang around until March 31st 2020 to give people a little extra time to move.

Given the current COVID-19 situation, this discontinuation has been put on hold for the time being as per an email sent out to API users.

Nevertheless, it is absolutely time to move to the new version to ensure that you can still automate the shortening of your URLs.

In my professional working life I’ve dealt with very similar API implementations, so changing over wasn’t a big deal for me.

Here is the simple function I wrote for PHP, to do a basic URL shorten. Your PHP installation will need to have cURL and JSON libraries installed and enabled.

function shorten_url ($access_token, $group_guid, $shorten_domain, $shorten_url) {

  # set up the JSON payload
  $json_payload = json_encode(array(
    "group_guid" => $group_guid,
    "domain"     => $shorten_domain,
    "long_url"   => $shorten_url
  ));

  # initialise cURL handle
  $curl_handle = curl_init();

  # define cURL parameters
  curl_setopt_array($curl_handle, array(
    CURLOPT_URL => "https://api-ssl.bitly.com/v4/shorten",
    CURLOPT_RETURNTRANSFER => TRUE,
    CURLOPT_ENCODING => "",
    CURLOPT_MAXREDIRS => 10,
    CURLOPT_TIMEOUT => 0,
    CURLOPT_FOLLOWLOCATION => TRUE,
    CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
    CURLOPT_CUSTOMREQUEST => "POST",
    CURLOPT_POSTFIELDS => $json_payload,
    CURLOPT_HTTPHEADER => array(
      "Host: api-ssl.bitly.com",
      "Authorization: Bearer ".$access_token,
      "Content-Type: application/json",
    ),
  ));

  # execute the cURL request
  $json_output = curl_exec($curl_handle);

  # decode the output and get the HTTP response code
  $json_decoded = json_decode($json_output);
  $http_code = curl_getinfo($curl_handle, CURLINFO_HTTP_CODE);

  # close the handle, then return results for further processing
  curl_close($curl_handle);
  return(array("json" => $json_decoded, "http" => $http_code));

}

The inputs are relatively straightforward:

$access_token – this is the same as your existing OAuth application access token – this has not changed, at least in my case.

$group_guid – this is the only new piece of information you might need. While you can query for this value via the API, for most people the simplest way to get it is to log into your Bitly account in a web browser, where you will find the GUID in the URL, as per this image:
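If you would rather fetch the GUID programmatically, here is a minimal sketch against the v4 “/groups” endpoint, using the same cURL pattern as the shorten function above. The function name is my own, and it assumes the response shape documented by Bitly – a “groups” array whose entries each carry a “guid” field:

```php
# minimal sketch: list the group GUIDs available to an access token
function list_group_guids ($access_token) {

  $curl_handle = curl_init();
  curl_setopt_array($curl_handle, array(
    CURLOPT_URL => "https://api-ssl.bitly.com/v4/groups",
    CURLOPT_RETURNTRANSFER => TRUE,
    CURLOPT_HTTPHEADER => array(
      "Authorization: Bearer ".$access_token,
    ),
  ));

  $json_decoded = json_decode(curl_exec($curl_handle));
  curl_close($curl_handle);

  # collect the "guid" field from each entry in the "groups" array
  $guids = array();
  foreach ($json_decoded->groups as $group) {
    $guids[] = $group->guid;
  }
  return($guids);
}
```

Most personal accounts will only have one group, so the first entry returned is usually the one you want.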

$shorten_domain – if you have a custom domain – (for example, I have the domain “mwyr.es” for shortening purposes) – this is the value you need here. If you don’t have a custom domain, just use “bit.ly”.

$shorten_url – this is the URL you wish to shorten – simple!

So now you can call the function with the required information.
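As a sketch of what a call might look like – the token, GUID, and URL below are placeholders you would substitute with your own values:

```php
# hypothetical placeholder values – substitute your own
$access_token   = "YOUR_ACCESS_TOKEN";
$group_guid     = "YOUR_GROUP_GUID";
$shorten_domain = "bit.ly";

$result = shorten_url($access_token, $group_guid, $shorten_domain,
                      "https://example.com/a/very/long/url");

# per the v4 documentation, a 200 or 201 response indicates success,
# with the generated short link returned in the "link" field
if ($result["http"] == 200 || $result["http"] == 201) {
  echo "Short link: ".$result["json"]->link."\n";
} else {
  echo "Error: HTTP ".$result["http"]."\n";
}
```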

Firstly, the input data required by the API is converted to JSON via the json_encode() function.

Secondly, we set up the cURL handle, and then supply it with the required data. Note the JSON data is passed within the “CURLOPT_POSTFIELDS” variable, and that the access token is passed as the “Authorization: Bearer” header in the “CURLOPT_HTTPHEADER” variable.

Then we execute with curl_exec() – and store and return the results for further processing.

That’s it!

To understand the returned JSON – (including the generated short link) – and the HTTP response codes, refer to the excellent Bitly API v4 documentation.

Sunday Nerding: Machine Learning


In today’s Sunday Nerding, we’re looking at what has become an increasing trend in data and data analysis – machine learning.

In simple terms, as described on Wikipedia, machine learning is:

Machine learning (ML) is the scientific study of algorithms and statistical models that computer systems use to perform a specific task without using explicit instructions, relying on patterns and inference instead.

Watch this video from former NASA engineer Mark Rober on what machine learning is in a practical sense, and how it can be used in a real-world scenario – in this case, sign stealing in baseball.

Sunday Nerding: The Birth of BASIC


Many people working in the Information Technology industry today probably started their journey when they learnt the computer language BASIC – (Beginner’s All-purpose Symbolic Instruction Code) – on computers such as the Apple II or the Commodore 64.

What many people won’t know is the fascinating story of how BASIC started, at Dartmouth College in 1964. In today’s “Sunday Nerding”, learn about the origins of this once ubiquitous programming language.



Turnbull’s NBN Legacy of Failure


Malcolm Turnbull has left Australia with quite a legacy with the National Broadband Network (NBN).

Back when the NBN was first mooted in 2008 – (though one could argue its origins go back to the OPEL Networks plan from 2006) – everyone was supposed to be on one of three different technologies – 93% of the population with Fibre-to-the-Premises (FTTP, with up to 100Mbps), 4% with Fixed Wireless (FW, with up to 25Mbps), and 3% Satellite Broadband (SB, with up to 12Mbps).

Tmthetom [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)]

When then Communications Minister Stephen Conroy cancelled the OPEL plan in 2008, what has become known as the NBN was formulated, with the 93/4/3% split described above.

Enough capacity was to be put into the ground in the FTTP footprint to support 6 separate services – (4 data, and 2 voice) – at every single premises in those areas. The fibre going into the ground was to support services of up to 40Gbps.

To achieve those speeds – (over and above the standard 100Mbps offered initially) – all that would be required is an upgrade to the electronics at each end of the fibre connection.

Yes – the original 2008 NBN plan would have allowed for 40Gbps, dependent on CVC and backhaul capacity to be provided by individual ISPs.

Leading up to the 2013 election – and the change of government – then opposition communications spokesweasel, Malcolm Turnbull, and opposition leader Tony Abbott had other ideas.

Simply to oppose on politically ideological grounds, they decided that Conroy’s plan was “too expensive”, and would take “too long”.

Their alternative was to be “cheaper” and “faster to deliver” – neither of which has been proven, and both of which have in fact been widely debunked. Their plan called for all areas in which FTTP had not already been deployed to change to Fibre-to-the-Node (FTTN, with up to 100Mbps), using existing copper.

The status of the existing copper was questionable at best.

The rollout has proven to be no faster to deliver – (and in fact has taken longer) – and sustainable speeds of 100Mbps have been so difficult to reach that most ISPs no longer even offer 100Mbps plans – including on the parts of the network that are deployed with FTTP.

What we in fact end up with is what Turnbull called the “Multi Technology Mix” (MTM) – which would leave Australia covered with FTTP in areas where it had already been rolled out, Hybrid Fibre Coaxial (HFC) cable in areas where HFC was already rolled out, FTTN in the remaining areas where FTTP had not already been committed and there wasn’t already HFC, and finally FW and SB in much the same areas as originally planned.

The FTTN areas were later broken up further to include some Fibre-to-the-Curb (FTTC) deployments when they realised FTTN, in particular, wasn’t cutting it. Many areas which they earmarked for existing HFC later switched back to FTTN or FTTC, because many of the existing HFC networks they purchased couldn’t be made suitable.

And 100Mbps? Not even remotely likely unless you’re in an FTTP area, and with an ISP that has purchased enough CVC and backhaul capacity.

Rare.

So what does the MTM end up looking like? Take a look at this small area in the western part of Geelong, Victoria, with mapping provided by NBN MTM Alpha:

The purple dots represent locations that are serviced by FTTP; the yellow dots locations that are serviced by FTTN; the green dots locations that are serviced by FTTC; the pink dots locations that are serviced by FW; the orange dots locations serviced by SB; and finally the blue dots locations that are serviced by fibre from a non-NBN provider.

This is pretty stunning – and stunningly stupid.

You’ll see in the bottom right a patch of FTTC – (green) – where some premises right next door to green dots are getting FTTN – (yellow).

In the same street.

In the middle of the map, you’ll see the hamlet of Fyansford – where at the southern end of town you have a non-NBN fibre provider – (blue) – and at the northern end of town you have FTTP – (purple) – with a blob of FIXED WIRELESS in between. This band of fixed wireless is about 10 house blocks wide – or around 120 metres.

Apparently nobody thought that this area – (which is the newest part of that residential estate) – right next door to two fibre areas should get any kind of fixed-line service – not even FTTN or FTTC.

Stupidity.

Finally, zooming into the area just to the right of Fyansford – (which is on the side of a hill) – we see this:

Locations on the eastern side of Hunt Road get FTTN – (yellow) – locations on the western side get SATELLITE – (orange) – and just a little way down the hill, locations get Fixed Wireless – (pink).

And just to the north? A purple dot of FTTP.

I mean, what the hell?

Australia will one day rue this shemozzle of a “multi-technology mess”.

Trouble is, that day has already come, and Turnbull should hang his head in shame.

Sunday Nerding: Getting To The Moon


This month marks the 50th anniversary of the Apollo 11 mission – the first moon landing, and that one small step for mankind that changed the world forever.

Apollo 8 Crew
Jim Lovell, William Anders, and Frank Borman.

In 1961, President John F Kennedy had set his nation the goal of “landing a man on the moon, and returning him safely to the earth“.

A while back I wrote about the amazing – (and ongoing) – restoration of an Apollo Guidance Computer – one of the very first digital computers, developed at MIT for NASA, which was crucial to achieving the goal.

While everyone remembers Apollo 11 – (and to a lesser extent Apollo 13, due to the problems it struck) – very little thought is given to Apollo 8 – the mission where NASA figured out how to do two of the four most important tasks of a successful moon landing – getting there and getting back.

The following video discusses the pivotal role Apollo 8 played in making Apollo 11, and all of the subsequent moon landings possible.



Sunday Nerding: The Clock That Changed The World


Before there was GPS, people still needed to sail the oceans of the world and know precisely where they were.

With a sextant, you could figure out your latitude, but not your longitude. Enter the Longitude Rewards, a British government program to encourage someone – anyone – to find an accurate way to determine longitude.

Enter John Harrison, and his incredibly accurate clock.

Russian GNSS Spoofing


A recently released report from C4ADS, following a year of research, appears to confirm the hacking and/or spoofing of GNSS transmissions by Russia’s Federal Protective Service (FSO).


GNSS is the collective term for “global navigation satellite systems“, of which the common GPS system is one. Russia and China are known to operate their own GNSS systems, alongside the GPS system developed by the US military.

The activities of the FSO – (in which it is apparent that false signals are deliberately broadcast to confuse GPS receivers, such as those you might have in your car, or those found in commercial ships or commercial aircraft) – are reputedly designed to keep attack drones away from Russian president, Vladimir Putin.

While this might seem like a not unreasonable use of such techniques, the report presents evidence that they are also using these techniques in Syria, possibly to confuse enemy military systems. There is of course a long running military conflict in the region.

It is therefore logical to assume that such techniques can and have been used all over the world at some time – past, present and future.

These techniques could be used to disrupt navigation in all sorts of transportation systems and infrastructures.

The Soviet Union shot down a Korean Air passenger jet in 1983 after an issue with the configuration of the navigation system on that Boeing 747. While this was found at the time to be the fault of the pilots, faulty navigation data could be used to initiate similar incidents, but with plausible deniability.

Quoting the report’s Executive Summary:

In this report, we present findings from a year-long investigation ending in November 2018 on an emerging subset of EW activity: the ability to mimic, or “spoof,” legitimate GNSS signals in order to manipulate PNT data. Using publicly available data and commercial technologies, we detect and analyze patterns of GNSS spoofing in the Russian Federation, Crimea, and Syria that demonstrate the Russian Federation is growing a comparative advantage in the targeted use and development of GNSS spoofing capabilities to achieve tactical and strategic objectives at home and abroad. We profile different use cases of current Russian state activity to trace the activity back to basing locations and systems in use.

The full report can be found here.

DARPA’s Open Source eVoting Initiative


I’ve never been a fan of the concept of electronic voting. I’m still not a fan of electronic voting.

For the most part the idea that I might cast my vote, walk away from the machine that contains my vote, and not know what happens with that machine afterwards scares me.

How do I know my vote eventually gets counted?

It could be argued that a paper ballot in a ballot box might “go missing” too. Most systems have certain kinds of vulnerabilities, whether they be electronic or otherwise.

But can eVoting be made reliable and verifiable?

With this initiative from DARPA, I’ve moved into the “maybe” column. I’m not convinced, but this is the best concept I’ve heard to date.

It’s worth discussing, though we do need to understand that DARPA is part of the US Department of Defense. How much can we trust that?

As an open source initiative, their work can be closely scrutinised by any interested party. This perhaps means the eventual product they develop can be trusted.

It contains a lot of verification mechanisms to instill confidence in it.

Here is their plan as discussed on the most recent episode of Security Now!

