
JavaScript Is Always Changing

JavaScript Melting Pot

https://increment.com/development/the-melting-pot-of-javascript/

So no, it’s not the dependency iceberg itself that is worrying me.  It’s the proliferation of configuration options.

— Dan Abramov

Old Coder

When I started professional development in the early '90s, I had only a few software development languages and tools at my disposal.  C++ was just gaining steam and was available to me on my PC in MS-DOS via Borland Turbo C++ (which I bought at a student discount).  NCSU provided very nice Sun SPARCstations in their computer labs.  I spent many a late night in the labs, but no student could afford to buy a SPARCstation for personal use.  Microsoft Visual Studio was evolving on Windows and was sometimes unstable.  Linux was spanking new.  Java was also new (and controlled by Sun), but didn't start rolling hard until about 1995.  The primary source for learning new development-related skills was books (not yet the Internet).  The pace of change was slow and controlled by the companies who built the compilers, just as Abramov indicates.

When Microsoft rolled out .NET and Windows XP in the early 2000s, it took about three years for .NET to gain full uptake in the Windows ecosystem.  Even I shied away from it until .NET hit 2.0.  .NET's growth required a huge advertising and educational push from Microsoft to encourage developers to adopt it, and yet they still had to keep around older APIs like Microsoft Foundation Classes, COM, ATL, and WTL because of the massive quantity of legacy code and developer lock-in.

Nowadays…

Does this globulous menagerie make you feel anxious?

Gulp, Bower, Grunt, Yeoman, NPM

What about today?  We should expect at least one new software development language to appear and gain some traction every year.  How many transpiled-to-JavaScript languages have appeared in the last few years?  Expect a whole new paradigm of JavaScript dev tools at least once or twice a year.  I think this proliferation comes from the beauty of open source and the abundance of developers.  There are many more developers now than when I got started, and many smart people in the ecosystem are tired of waiting around to convince someone else to fix their problems.  Smart folks argue best by making stuff.  The modern JavaScript ecosystem both scares me and makes me giddy.  If I work on something in June, then come back to it in October, there's a good chance what I was using is now outdated.

This pace is frightening and frustrating for developers who have a nagging desire to always be using the best tool while craving stability.  When the tooling keeps changing, there is no "best" tool.  A particular challenge in the modern JavaScript environment is creating longer-life corporate web-based products, which need to be around for a few years to make a good return on the development investment.  Thus, finding peace requires a change of mindset.  I liken it to grabbing a morphing cyborg by the hand and learning to dance.

Good Practice Makes Good Play

The core challenge with all the new tools (and all the old tools) is figuring out how to apply good development practice and methodology.  Good practice and design concepts are timeless.  An amusing part of entering the JavaScript environment as a mature developer is watching the ecosystem work through the well-known growing pains experienced by all maturing development environments.  A good example of this truth is the rising popularity of TypeScript.  Duck typing is great for smaller single-developer projects.  But when you need to scale a large application and involve lots of developers, it causes problems.  This was known over 40 years ago, but JavaScript was not initially intended for such large-scale development.  Now it is used that way.  The community responded.  Awesomesauce!
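To make that point concrete, here's a minimal TypeScript sketch of my own (the `Order` type and `applyDiscount` function are hypothetical, not from any particular project) showing the kind of mistake that duck typing only reveals at runtime, but that static types catch at compile time:

```typescript
// Hypothetical shape shared by many developers across a large codebase.
interface Order {
  id: string;
  total: number;
}

function applyDiscount(order: Order, percent: number): Order {
  return { ...order, total: order.total * (1 - percent / 100) };
}

// In plain JavaScript this typo would only surface at runtime as a NaN total;
// TypeScript rejects it at compile time:
//   applyDiscount({ id: "A-1", totel: 99.5 }, 10);
//   // error: 'totel' does not exist in type 'Order'

const discounted = applyDiscount({ id: "A-1", total: 99.5 }, 10);
console.log(discounted.total); // 89.55
```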

Nowadays Was Yesterday

My most recent efforts have used a React-style dev environment with Inferno (because React's patent clause freaked many folks out and caused confusion).  But, wait, Facebook is dropping the patent clause for React.  Maybe we can move to full React.  Time to change…


Backup, Schmackup!

New Backup Hardware

After putting off a data backup overhaul and almost suffering catastrophe (read below), I recently invested in a Synology DS216+II with two 4 TB drives.  It was a significant investment, but it provides:

  • Always-on access (it will power down into a sleep mode to conserve energy when not active)
  • A rich set of apps (Android, iOS, Apple TV, web) for connecting to my videos, audio, and pictures wherever I am.  The apps are top notch.
  • Dynamic DNS (Synology provides the service), usable by all the apps, which lets me connect from anywhere
  • Very easy setup and configuration

As home NAS comparisons go, the top two manufacturers (as of my own recent survey) are QNAP and Synology.  An easy analogy between the two: Synology is to QNAP as iOS is to Android (in terms of interfaces, not legal approaches).  QNAP gives you more tweaks but isn't quite as intuitive.  I probably would have been fine either way.  Do your own research.

Synology DiskStation lets you use a simple file hierarchy for your photos, videos, and music and still take advantage of their apps for easy perusal.  Of course, you can also just use it like a network disk.  I like this plain-file approach because it's the most universal, least proprietary layout for your files.

Offsite Backup

I use Backblaze to back up my individual computers offsite.  I want security if my house burns down.  In addition, I have a Microsoft Office 365 family subscription (my wife needs the latest MS Office for her contract work), and it comes with 1 TB of OneDrive storage for each user.  I use one of the accounts as a "backup" account and have Synology's easy built-in cloud sync app push my videos and photos folders to that OneDrive account (I've set it to push-only so there are no accidental deletes if Microsoft borks my data).  I also replicate the videos/photos folders to my Mac via Synology's built-in sync-folder software, so they are doubly backed up to Backblaze.  Now I have an "I don't have to think about it anymore" solution.

Paranoid!?  I’m not Paranoid!

Here's why I go to all that trouble for offsite backup: I almost lost major data because I didn't fully understand the fine print.

A Long Short Story

Before the NAS, I had an old iMac which ran the Server app and served as the Time Machine backup for all my home Macs.  It also served up the music and video files via iTunes.  They were all locked into Apple's formats.  I didn't have a way of sharing photos easily with the immediate family other than the occasional post to my SmugMug account.  I was using iPhoto and its proprietary format and had not yet migrated to the new "Photos" yet-another-proprietary photo-storage app.  I didn't want to fork over my moolah for iCloud hosting, as I don't trust Apple when it comes to cloud stuff.  Cloud storage isn't Apple's core competency, and a very technically competent friend of mine had Apple lose lots of his critical files in iCloud through no fault of his own.  I was already using Backblaze for offsite backup of all the Macs.

Flash back to several months ago, when we started a renovation at our house.  I disconnected the iMac and didn't reconnect it for three months, hedging my bets that the offsite Backblaze backup would suffice for my and my wife's laptops.  Upon reconnecting the iMac, I saw that the internal secondary hard drive had died.  No problem, I thought, as I still had an offsite backup of those files.  I checked to verify the files were still remotely archived (in the April time frame) and figured I could restore whenever.  I ordered a new hard drive and didn't get around to installing it until July, confident that all my laptops were being remotely backed up and that I could restore the old photos and videos from the old server later.

Upon adding the new hard drive to the server, I logged into Backblaze's interface to restore the files and, whoops, they had vanished.  Long story short: they hadn't seen my machine for a long period of time, and since they aren't an "archival" service, their policy is to start dumping files after 30 or more days of not seeing them.  In addition, their software seemed to have wrongly treated my secondary internal drive as a temporary external drive.  Thankfully, I was able to talk with their tech support, and they restored my files from some way-back cavern of wherever they stored them (with only minor loss).

I almost lost my most important files (home videos and photos) and would have had to eat the huge cost of attempting a forensic hard-drive recovery to get them back.  As much as I like Backblaze, their fine print was not very clear regarding secondary hard drives.  Furthermore, they don't really specify which files they are deleting when they start the purge.

So, I trust no one.  I keep my own multiple backups:

  • All Macs backed up via Time Machine to the RAID-1 Synology NAS
  • Photos and videos replicated on the NAS and a local (iMac) machine
  • Files on the Macs backed up offsite using Backblaze
  • Photos and videos synced to OneDrive

Two offsite backups is a bit of overkill.  One other option I could have considered was dropping Backblaze and buying a duplicate NAS.  Synology can set up a replication service between NASes, and I could have had a friend with adequate bandwidth host one at their house.  It would have paid for itself in about five years, but I felt that was too much effort.

I also considered Amazon Glacier and Backblaze's own B2 service.  But I was concerned about the administrative effort of picking and choosing the files to back up on the Macs, and felt the additional cost of the brain-dead simple solution was worth it.
