Category Archives: Homelab

Updates about my homelab

Migrating my VPS to Linode and Sentora

Background

If you’ve been following my recent ventures at all, you know that not too long ago I decided to move my virtual private server to DigitalOcean. I also decided to run CentOS 7 and use CentOS Web Panel (CWP) to give anyone using my VPS a nice graphical interface to manage their services. I have not taken the time to formally review either of these services, but I did experience some issues with CentOS Web Panel. Some of the key issues I had with the product were:

  • Obfuscated Code. I know that some people like to obfuscate their code in order to keep other parties from editing it; however, if I cannot inspect the code of my web panel to see exactly what it’s doing, I worry that it is doing something its developers don’t want me to see. On top of this, obfuscated code makes it very, very difficult to quickly edit files when I want to make changes.
  • Support is not good. The CWP forums, which should be a central source of information for everyone who adopts CWP on their own servers, are riddled with posts describing common issues. Almost all of the threads contain an administrator response prompting the original poster to file a ticket, which makes the forum essentially useless since no actual problem solving takes place there. On top of this, when an administrator does take the time to answer a question, the answer usually only partially addresses it. To be fair, for a free product, support is not something that has to be offered; however, a community-driven forum is important, and CWP is not running theirs properly.
  • Things break. More often than not, users make changes that unintentionally break things, and these breakages are then reported as “random occurrences” by the very users who caused them. Maybe I am just a really poor user; however, during my time with CWP, things did seem to randomly break, and people on the forums seemed to agree: every issue I had was also reported by other users. For instance, after a CWP update, no host could navigate to its associated “/roundcube” domain to access webmail. The problem was reported on the forums but left unanswered.
  • Untemplatable. I don’t only use my VPS to run my own web services and applications; I also let other people host their websites, services, and applications on it. When they log in to their web panel, however, I want them to be reminded that they are using my VPS, so branding the web panel is essential. CWP does not support this in any friendly manner. Most web panels have some sort of template feature, where one can define their own template for the panel to use. CWP has no such thing, and even when hunting for source images to replace and source files to edit, nothing can be easily found; anything that is found is obfuscated. Most people would consider this a minor issue, but the default CWP template is also slow and unresponsive. That unresponsiveness was probably the straw that broke the camel’s back, so to speak.

With all of these small issues combined, and with a growing favoritism toward Linode’s VPS services, I decided that it was once again time to transition to a new stack of services. This time, I decided to use a Linode VPS plan, run CentOS 7 with the LAMP stack, and finish it off with the free and open-source Sentora web panel.

The Process

Preparing the VPS

Setting up the actual VPS with Linode is quite simple. I went with the Linode 4096 plan (which is the same price as a DigitalOcean plan with fewer features) and deployed a CentOS 7 image onto it. From there, once the machine has booted, the “Remote Access” tab provides all the information necessary to SSH into the machine and install the required software.

Installing Sentora

The Sentora installation process is fairly straightforward and fairly well documented. Starting from a fresh Linode VPS, all of the requirements regarding a minimally installed machine were already met. However, installing Sentora requires a valid subdomain pointing at the server machine; this subdomain is later used to access the Sentora web panel.

At the time of installing Sentora, I did not have a spare domain lying around to point at the new machine, and knowing very little about DNS and DNS records, I eventually discovered that what I needed to do was create a new “A” record that pointed a subdomain of an existing domain at the new machine. I edited my DNS zone file to contain a record that looked like this:

www.panel.    1800    IN    A    123.456.789.246

This mapped the “panel” subdomain to the IP address (123.456.789.246) of the new machine. After waiting about 15 minutes for the new DNS information to propagate, I was able to ping panel.domain.com and receive responses from the new machine’s IP address.

Note that at the time, this process was two-fold: since I was transferring from DigitalOcean, I changed the DNS zone file on my DigitalOcean host to contain an NS entry that told the DigitalOcean host to “look for the server that is serving the panel subdomain here.” (Strictly speaking, an NS record should point at a hostname rather than an IP address, but this is the entry as I recorded it.) That entry looked like this:

panel    1800    IN    NS    123.456.789.246

Then, now that the DigitalOcean host knew where to look, the “A” record was added on the Linode host, which basically said, “the IP address for the panel subdomain is this.”

And with that, the pre-installation steps for Sentora were complete, and the real installation could begin. As per the documentation, this consists of downloading and running a Sentora installation script. The script is extremely straightforward and asks only for the timezone, hostname (the subdomain that was just declared), and IP address.

Configuring Sentora

I decided that the best user architecture for my server would be one that treats every domain as an individual user, even the domains that belong to me. Although I could have attached my own domains to the admin account, I wanted a separation layer that would allow each website to have its own configuration. On top of this, configuring my own websites as “clients” of my hosting would help me better support those who actually are clients of my hosting.

Thus, I added two new reseller packages to take care of this. The BasicUser package was limited to 10 domains, subdomains, MySQL databases, email accounts, and so on, but was given unlimited storage and bandwidth; I figured 10 of each was more than enough for the clients of my web hosting. I also created the ElevatedUser package, which had unlimited everything. This, of course, was for my own domains.

Before moving on to actually creating the users, however, I wanted to customize the Sentora panel so that my brand would be reflected across it for my clients. I am a fan of the base Sentora theme, so I decided to essentially reuse that theme but replace all of the Sentora images with my own logos. This was done in the following steps:

  1. Copy the “Sentora_Default” folder (The base template) to another folder (Your new template) in /etc/sentora/panel/etc/styles
  2. Replace all images in the img/ folder with your logo
  3. Replace information in /etc/sentora/panel/etc/static/errorpages with your information
  4. Continue searching for things to replace in /etc/sentora/panel/etc

Then, once everything is replaced, the new theme is visible in the Sentora theme manager. Select it and voila!
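The re-branding steps above can be sketched as a small shell script. This is a hedged sketch rather than an official Sentora procedure: it runs against a scratch directory so it is safe to try, and on a real server STYLES would be /etc/sentora/panel/etc/styles and MYLOGO your own logo file (both variable names are my own placeholders).

```shell
# Re-branding sketch: copy the base template, then swap every image for
# your own logo. STYLES and MYLOGO are placeholders; on a real install
# STYLES would be /etc/sentora/panel/etc/styles.
STYLES=$(mktemp -d)
MYLOGO=$(mktemp)
printf 'my logo bytes' > "$MYLOGO"

# Fake a minimal Sentora_Default template for the demonstration
mkdir -p "$STYLES/Sentora_Default/img"
printf 'sentora logo' > "$STYLES/Sentora_Default/img/logo.png"

# Step 1: copy the base template to a new template folder
cp -r "$STYLES/Sentora_Default" "$STYLES/MyBrand"

# Step 2: replace all images in img/ with your logo
for img in "$STYLES/MyBrand/img/"*; do
  cp "$MYLOGO" "$img"
done
```

After the equivalent steps on the server, the new folder shows up in Sentora’s theme manager under its directory name.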

User Creation and Migration

I created each user’s account using the Sentora admin panel. I would set up a generic username and password for them to use, and before alerting them of their new account information, I would do the following:

  1. Login to the account and create a new domain, which was their domain.
  2. Edit the DNS Records for that domain (The default DNS Records are perfect)
  3. Add a new FTP account (with a generic username and password), set to the root directory for their account.

These three things allowed each user’s information to be transferred from the old server to the new server relatively easily. Transferring the files themselves was also a relatively easy process. First, I logged in with the root SFTP account on my old DigitalOcean/CWP configuration and downloaded everyone’s files onto my local machine. This, of course, took some time, and it let me discover that some of my users were hosting a lot more than I previously thought (one user was hosting a 10GB MongoDB instance). I was also able to go through and delete files that I didn’t think would be used anymore, such as log files, backups from my previous server migration (from HostGator to DigitalOcean), and old files no longer in use.

Once all of the files were downloaded to my local machine, I went ahead and uploaded all of them using the newly created user FTP accounts, being sure to put all of the public_html files into the new public_html directory.

I also logged into phpMyAdmin using the root login credentials, which allowed me to find all of the MySQL accounts attached to each user. For each user, I logged into the Sentora web panel and created MySQL databases with names matching the old database names. Although the data could have been moved by simply exporting the tables, databases created outside of Sentora would not appear in the users’ Sentora panels; creating them in Sentora ensures that they do. I also created a MySQL user for each user that existed on the old MySQL server. Unfortunately, Sentora doesn’t allow users to be named in the same format as my old setup, so I had to change the naming format a bit.

Now that all of the databases were created, I could go into the old phpMyAdmin and export all of the tables using its EXPORT functionality, taking care to make sure that the “CREATE TABLE” statements were included. I saved the .sql files and then used phpMyAdmin on the new Linode host to import them.

With all database information transferred successfully, the final change I made was re-establishing users’ subdomains. I was able to look through the DNS zone file on the old DigitalOcean/CWP host and find the subdomains for each user. I then logged into each user’s Sentora panel and created the subdomain using Sentora’s subdomain functionality. Unfortunately, that functionality only creates a directory for the subdomain, so I also had to edit the DNS information, adding an entry for each subdomain. Since the target here is an IP address, this needs to be an A record (a CNAME can only point at another hostname); the entry looked like:

subdomain     IN    A    123.456.789.246

 

More Database Migration

The most glaring roadblock when transferring MySQL databases is that there is no way to actually view a user’s password. Although this is a great security feature, it causes problems when migrating servers because one is unable to view the old users’ passwords in order to recreate them as new users.

The solution I chose uses mysqldump. When run on the old VPS host, this tool dumps all of the important data pertaining to the MySQL installation itself, including users and password hashes. The full command looked like this:

mysqldump -u root -p mysql > mysql.sql

After typing in the MySQL root user’s password, this creates a mysql.sql file that can be downloaded and then uploaded to the new host, where it can be restored using

mysql -u root -p mysql < mysql.sql

This restores all of the users, their passwords, and their permissions to the server. The only problem is that, for me at least, it also changed the credentials of important service users such as the MySQL root user, Postfix user, ProFTPd user, and Roundcube user. This was because those database users happened to have the same names as they did on my previous server, so they were overwritten. The root password can be put back in check from inside the mysql client, with the mysql database selected:

USE mysql;
UPDATE user SET password=PASSWORD('<rootPassword>') WHERE user='root';
FLUSH PRIVILEGES;

Luckily, Sentora stores a text file containing all of the default MySQL service passwords in /root/passwords.txt. Using this file as a reference, I went ahead and updated all of the affected users’ passwords accordingly. For example:

mysql -u root -p
USE mysql;
UPDATE user SET password=PASSWORD('<MySQL Postfix Password>') WHERE user='postfix';
UPDATE user SET password=PASSWORD('<MySQL ProFTPd Password>') WHERE user='proftpd';
UPDATE user SET password=PASSWORD('<MySQL Roundcube Password>') WHERE user='roundcube';
FLUSH PRIVILEGES;

After these updates, everything on the system was working properly. However, I will admit that it took me a long time to figure out that importing my mysqldump file actually changed these passwords.

And with that, all MySQL user accounts were restored and working!

Small Adjustments

Now that all of the major elements were transferred from the old server to the new one, I sat back and casually navigated to as many parts of the hosted websites as I could. Almost everything worked. I also followed certain steps to transfer MyBB instances; those steps can be found here.

Another issue involved a PHP script I had written to fetch the name of the song I was currently listening to (via Last.FM) for my blog. For whatever reason, the script was not executing and was simply being served as plain text. After several hours of troubleshooting, I realized that the default Sentora installation disallows PHP’s short open tags (so <?php is required instead of <?). This was fixed by editing /etc/php.ini and setting the short_open_tag field to “On”.
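Hedging on the exact file layout, the php.ini change can be made with a one-line sed. The sketch below edits a scratch copy; on the server the target would be /etc/php.ini, followed by a web server restart.

```shell
# Flip short_open_tag from Off to On. PHP_INI is a scratch copy here;
# on the real server it would be /etc/php.ini.
PHP_INI=$(mktemp)
printf 'short_open_tag = Off\n' > "$PHP_INI"

sed -i 's/^short_open_tag = Off/short_open_tag = On/' "$PHP_INI"
# then restart the web server so the change takes effect,
# e.g. systemctl restart httpd
```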

Finally, since I thought some of my users would dislike navigating to panel.mydomain.com when they wanted to edit information about theirdomain.com, I created a simple HTML file with a JavaScript redirect, called it index.html, and put it in the public_html/panel directory of each primary domain hosted on my server. That way, theirdomain.com/panel takes them to the right place without forcing them to type my domain into the browser.
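The redirect page itself is only a few lines of HTML. Here is a hedged sketch, written from the server side with a heredoc; PANEL_URL and TARGET are my own placeholders for the panel address and the client’s public_html/panel directory.

```shell
PANEL_URL="https://panel.mydomain.com"   # placeholder panel address
TARGET=$(mktemp -d)                      # stands in for public_html/panel

# Write a minimal index.html that bounces visitors to the panel
cat > "$TARGET/index.html" <<EOF
<!DOCTYPE html>
<html>
  <head><title>Redirecting...</title></head>
  <body>
    <script>window.location.replace("$PANEL_URL");</script>
    <noscript><a href="$PANEL_URL">Continue to the panel</a></noscript>
  </body>
</html>
EOF
```

The noscript fallback link covers visitors with JavaScript disabled.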

The final step for users’ websites was to direct their domains to the new host. I had users go to their domain registrars and register nameservers for their domains, ns1.theirdomain.com and ns2.theirdomain.com, pointing at the new host’s IP address. They then directed their domains to these newly registered nameservers. With this, the domains pointed to the new host, and any DNS configuration changes made on the new host would be picked up and reflected in the domains.

The Final Moment

With that, the migration was complete. To make sure everything worked properly, I shut off my DigitalOcean host and went to bed. Thankfully, when I woke up in the morning, all of the websites still worked, and pinging them returned a fancy IPv6 address.

I ended up encountering some issues with SSL, Antivirus, and Malware later on, but I will cover those in a later post.

Reverse DNS (rDNS) and DigitalOcean

As you may know from my previous post, I recently made the switch to Digital Ocean for my VPS needs. So far, everything has been great. However, I encountered a problem the other day regarding reverse DNS (rDNS) for my VPS. The error occurred when I attempted to send an email to someone with an @cox.net address: I got a reply containing a failure message with details about an invalid rDNS name.

After searching the Digital Ocean forums, I learned that by default, Digital Ocean configures rDNS for all hosting accounts. I couldn’t, for the life of me, figure out why mine was not working.

Then I read a small post that stated something along the lines of

In order for rDNS to be configured properly, the name of your droplet must be the same name as the URL that is being used to point to the droplet’s IP.

After changing the name of the droplet to reflect the primary domain name, everything worked out and emails started sending.

So now you know: Name your droplets with their corresponding URLs.

The Switch from HostGator to DigitalOcean

Introduction and History

About 6 years ago, I was in desperate need of a reliable webhost. I had a small personal website that I needed to be online most of the time, and I preferred to have something that was cheap, yet customizable. The solution that I found was HostGator’s shared hosting. The service was fairly cheap, had full access to MySQL servers, mail servers, and everything else that I needed. I stayed on their shared hosting plan for about two years. Then, I upgraded to a HostGator VPS.

I stayed on this VPS until about 2 weeks ago. The VPS was running CentOS and had WHM and cPanel installed. It was mostly managed, meaning that everything was installed and managed by the HostGator team and changes that I needed to make were usually much faster if I contacted the HostGator team directly instead of attempting to make the changes myself. This served my purposes very well since at the time I knew next-to-nothing about web server management. However, as I learned more about the way that web servers and UNIX systems worked, I desired to manage my own web server. Thus, I made the decision to switch to an unmanaged Digital Ocean VPS. There were also a few benefits of doing this.

The Benefits of Making the Switch

  1. The most notable benefit of switching away from any service to a Digital Ocean droplet is the decrease in price. I was able to nearly triple all of the specs of my webserver and still pay less per month. I could use this extra money to donate to some of the Open Source projects that I decided to use or to pay more skilled individuals to build a new website for me. Whatever I decide to use it for, paying less for more is something that should never be passed up.
  2. Speed is also a great benefit. Since Digital Ocean’s droplets all run on SSDs, there is a noticeable speed difference when working with my Digital Ocean VPS compared to my HostGator VPS. I’m not saying my HostGator VPS was unbearably slow in any fashion, but rather that the Digital Ocean VPS is just so much faster.
  3. An unmanaged VPS is great. Instead of relying on other people to set up and maintain my system, with a Digital Ocean VPS I am in control of everything: I choose the operating system, the software installed, and how frequently things get updated. It’s a nice shift from having people manage the entire webserver. Of course, there are drawbacks. For instance, when a customer’s product spontaneously malfunctions, it is very difficult to work on a solution without knowing anything about the webserver itself. In that situation it is definitely useful to have people who know the server better than you; however, the goal is to learn through these experiences and ultimately know as much as the server managers do.
  4. The cool factor is, well, cool. When you tell people that your VPS is from Digital Ocean, there is some sort of coolness to it. People seem to respect those who have an unmanaged VPS, especially from Digital Ocean. It’s a plus, I guess.

The Process of Switching

The great part about setting up a Digital Ocean VPS is that they charge by the hour; thus, there is no reason to fear wasting your money on a server only to spend hours setting it up (and ultimately failing). On top of this, they allow their servers to be dynamically resized and upgraded, which allows you to simply purchase the cheap server, mess around with it, and then upgrade when everything is ready.

With that being said, the first step of switching to a Digital Ocean VPS is to set one up, and I did exactly that. I chose the smallest VPS available, perused the software options, and picked what I was comfortable with: CentOS 6.5. Despite running Arch Linux on my own machine and having previous experience with Ubuntu, I knew that CentOS had a lot of options available to it as far as webserver management software was concerned. The next task was to pick that webserver management software.

The Web Panel

Like I said previously, my old VPS had WHM installed to manage all of the server software, packages, and cPanel accounts. Then, cPanel was used to manage individual websites. This software setup was perfect for my needs. In addition to myself, I also had several clients who paid for hosting on my VPS. These clients, of course, needed to manage their own website, so I needed some sort of web panel with strong admin/client separation.

The first web panel I tried was Sentora, an open-source fork of the once-legendary zPanel. Upon installing Sentora and having a look, however, I found that it seemed to lack some of the features I wanted in a web panel. Namely, Sentora lacked the ability to assign domain names to user accounts (or maybe I simply did not see the option), which I needed in order to keep track of all the domains hosted on my server. Although it had a beautiful design and seemed functional, I had to leave it.

What I ended up going with instead was CentOS Web Panel, which was very similar in design and functionality to WHM and cPanel’s Paper Lantern theme. The installation of the software was very straightforward, with the steps explicitly outlined on the CentOS Web Panel website. After installing the software, it is very easy to set up the nameservers, FTP accounts, and other domain names on the server. With CWP, everything just seemed to work, which was a plus.

The web panel also featured decent admin/client separation in its design: when logged in with an admin account, the options available were very different from those available to a client account, which is exactly what I was looking for. There are a few drawbacks, however. Since the software is free and in its early stages, it is not the most efficient web panel available and feels slow or clunky when typing in information or editing fields. On top of this, the team is a little slow to update the bundled versions of PHP, MySQL, phpMyAdmin, etc. But you can’t ask for too much from free software. That is, after all, the best part of CWP: it’s free.

CentOS Web Panel Admin Page


The NameServer Switch

Since I knew nothing about how nameservers work, this was the hardest part of the setup for me. I essentially had to switch my nameservers from the preconfigured HostGator setup to my own custom setup. Luckily, CWP has native support for this. The first thing I had to do was go to my domain name registrar and configure my root domain (http://brandonsoft.com) to act as a nameserver, pointing both the ns1 and ns2 subdomains at the IP address of my new Digital Ocean VPS. I then configured the ns1 and ns2 A records in Digital Ocean’s DNS management panel and changed CWP’s nameserver options to reflect the new server’s IP address as well. After waiting about a day for all of these changes to take effect, everything worked perfectly: since all of the domains hosted on my HostGator account were already pointing at my nameserver (whose IP address had just changed), they automatically began to point to the new location.

The Data Migration

Now that all of the software was installed and all of the domains pointed to the right location, the process of migrating data could begin. This was a daunting task, of course, since I had about 10 websites’ worth of files and databases that needed to stay intact, preferably without any noticeable change. Luckily, the process was actually rather easy. Of course, I made many mistakes along the way, so here are the steps that ended up working properly:

  1. In the individual cPanels of all of the accounts hosted on my HostGator VPS (which I could still access through the old IP address of the server since the domain names were no longer attached), I performed a full backup and sent the .tar.gz files to the /home directory of the new Digital Ocean VPS using cPanel’s built-in “backup-over-SCP” with the root account. This took a while, but I got a handy-dandy email notification when each account finished backing up.
  2. After all backups were completed, the .tar.gz files were named something like “-.tar.gz”. In order to quickly migrate the accounts into my new CWP setup, I had to change the naming scheme to “-cpmove.tar.gz”. Once this was complete, I could login to the admin account of CWP and create new accounts using the “migrate from cPanel” functionality, which automatically filled in the necessary information.
  3. Unfortunately, at the time of my migration, CWP’s “migrate from cPanel” functionality was not working properly, so the migration didn’t actually transfer any data. However, it did set up a MySQL user and a home directory for the username provided in the .tar.gz backup file. With the home directory created, the rest of the backup process could be completed. Still logged in with the root account, I used chown to change the ownership of the .tar.gz backup files and then used cp to copy them into the users’ home directories. After this was complete, I logged into CWP and made sure that SSH shell access was enabled for all of the users on my VPS so that the next step could be completed.
  4. The next step was to actually process the backup file. This could be done by SSHing into the server using the newly created user’s SSH credentials. Once logged in, I extracted the backup file using tar -xvf. This put everything into a nice backup folder. Inside of the backup folder, the entire user’s mail, MySQL data, and file hierarchy from the old HostGator VPS was maintained. In order to get their site up and running as fast as possible, I copied backup/homedir/public_html into their ~/public_html directory. This put their website online and made it viewable to others.
  5. Finally, all I had to do was restore the user’s MySQL data. There were two processes to complete. The first involved restoring all of their tables, which could be done by navigating to the ~/backup/mysql folder and using MySQL to process all of the *.create files and then all of the *.sql files. This created all of the tables and then populated them with data. The command calls looked like
mysql -u <username> -p < TABLENAME.create
mysql -u <username> -p TABLENAME < TABLENAME.sql
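The two command calls above can be wrapped in a pair of loops over the backup directory. The sketch below is a dry run in a scratch directory: it only prints the commands it would run, since actually executing them needs a live MySQL server, and <username> stands in for the account's MySQL user as above.

```shell
cd "$(mktemp -d)"                 # scratch stand-in for ~/backup/mysql
touch blog.create blog.sql shop.create shop.sql   # fake backup files

{
  for f in *.create; do           # first pass: CREATE statements
    echo "mysql -u <username> -p < $f"
  done
  for f in *.sql; do              # second pass: the data for each database
    echo "mysql -u <username> -p ${f%.sql} < $f"
  done
} > restore_commands.txt

cat restore_commands.txt
```

On a real server, dropping the echo (and the scratch setup) would execute the restore in the right order: schemas first, then data.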

Once this was completed for all of the files in the ~/backup/mysql directory, the MySQL user accounts and passwords from the old HostGator VPS needed to be restored so that it would seem like nothing had changed. This could be completed by finding the file ~/backup/mysql.sql and running its queries as the root MySQL user, as such:

mysql -u root -p < mysql.sql

After this is completed, everything should be set up.

The Final Moments

After all of these steps were completed, the entire server was successfully migrated. There were a few issues I encountered; mainly, the sites would stop working after scheduled backups ran, since the backup files ate up disk space and prevented the MySQL server from running properly. After backups were disabled, this issue was fixed.

After running the server for around 2 weeks without any issues, I decided to close my HostGator account. So far, I have encountered no issues and everyone on the VPS seems to be happy with CWP (Some people even said it was less confusing than cPanel!). The next steps I am going to take are customizing CWP to be branded for my company, and as I complete this task, I will post a guide.

Hopefully this post has helped you migrate your server or inspired you to do so!

On Dropbox’s Recent Announcement

First off, if you haven’t heard of Dropbox’s new Pro Plan announcement, see it for yourself here.

The Overview

In short, Dropbox has revamped its pro plan to add a couple new things.

  • There is now only one plan option: 1TB for $10/month
  • Shared links can now be password protected
  • Shared links can now be set with expiration dates
  • Shared links can now limit access to ReadOnly/Read+Write
  • Shared folders can now limit access to ReadOnly/Read+Write
  • Unlimited version history for a year

Analysis

It is clear that most of the items on this list are good for the consumer. Let’s take a closer look.

New Price Point

Dropbox seems to have lost their minds just a little bit. Previously, Dropbox charged $10/month for 100GB of storage; the new plan keeps the price but offers 1TB, a 10x increase in storage space for users who were already paying for the standard 100GB Pro plan. This is a great deal for those who rely on Dropbox’s services.

I believe Dropbox was able to do this because they realized that a large majority of their customers actually use far less than their allocated limits. Thus, they figured that upgrading everyone’s limits would not hurt too much, as most people won’t use anywhere near 1TB of space. I doubt that Dropbox has enough storage to actually house 1TB of files from every single user – but they were confident that not everyone would utilize the full 1TB of cloud storage. It is a little insane to put that much in the cloud, after all.

The interesting thing about this price point is that it does not change their base price at all. They still have their free version, but to upgrade from the free version you still need to fork over the $10/month. Thus, this recent pro plan change does not do anything to separate them from their competitors who might offer 100GB of storage for $5/month. Although 1TB/$10 is a pretty sweet deal, Dropbox is not going to be gaining any new customers who are hesitant to pay the $10/month fee.

The main competitor Dropbox was going after with this price revision, I think, is Google Drive, which is known for its notoriously low price points. Combine those price points with Dropbox’s speed and native clients, and that’s enough to make any existing Dropbox customer drool. But it makes me wonder how many old pictures are now going to be hogging all of the hard disk space in Dropbox’s headquarters.

Upgraded Sharing

When sharing a file on Dropbox, it was obvious that there was a gap. When sharing from Google Drive, for example, the sharer had the ability to give the receiver specific permissions (whether they could only view, edit, or comment). Dropbox did not have this functionality, causing it to slump a bit when it came to sharing. Either you shared the file or you didn’t. Those were your only options.

With this new update, Dropbox is leveling the playing field just a tad. Just like Google Drive, sharers who use Dropbox to share their files can now specify what people can do with these files. This makes it much easier to collaborate on documents, something that Dropbox definitely didn’t have the functionality to do previously.

Expiration dates on links are a great touch, too. There have been numerous times when I wanted to share something with a coworker in the moment but wanted the link to expire so the public could not see it after about 30 minutes – a quick interaction between my colleague and me. Since I usually share code, I previously did this through Pastebin. Now that Dropbox has the functionality natively, I will definitely start using Dropbox for this.

I think this is a major win for Dropbox. The only downside? Maybe sharing files will take a few extra seconds to make sure that all the permission and expiration settings are correct. I think it’s worth the sacrifice, though.

Version History

Previously, Dropbox’s version history capabilities were only available to those who subscribed to their Packrat service, which provided unlimited version history for all files in the user’s Dropbox folder. Now, this functionality is available to all subscribers for a year.

Of course, Dropbox didn’t eliminate the Packrat service. Those who still want unlimited version history can opt in to Packrat again, even though it has essentially disappeared from the public eye.

As a Packrat subscriber, I know that it can definitely come in handy. I think making this feature available to everyone is great, as it is a perfect fallback for when you’re coding, you forget to set up a git repository, and something breaks. Dropbox has your back! (As long as you had an internet connection while creating and editing the files.)

 

Summary

Overall, this upgrade is a major win for Dropbox customers. Although the number of new customers it may lure in is questionable, it still provides existing customers with a huge slew of new features. I honestly don’t know how Dropbox can handle this much data (1TB for each Pro subscriber as well as a year of version history), but I think the primary reason is that not every subscriber will utilize the full 1TB of cloud storage. I know that I probably won’t.

If you’ve been hesitant to join the Dropbox Pro club, though, maybe this recent pro plan revision will convince you otherwise.

Home Automation

The latest addition to my code page is my first project dealing with hardware other than standard computers: the Chromecast. Take a look at the project here.

One of the goals of this project is to make life with roommates easier. In my opinion, it’s the modern alternative to keeping a grocery list pinned to the fridge. I want to make an app in which everyone can add tasks to a task list and everyone can see them just as easily as they appear on the television screen.

This, I believe, is my first step into the realm of “home automation”. Ever since I started toying with making the TV screen display my things when I want them displayed, I have been thinking about the possibilities of home automation. One thing I would really like is a system in which I tap my phone on an NFC tag and everything turns on: the TV turns on and displays tasks that I need to complete, my computer turns on and becomes ready to use, coffee is brewed. I imagine a place where I am truly the master of the house.

As I gain more and more knowledge in software development, this is exactly what I will be experimenting with. Right now, I am working on getting my Chromecast to display a task list that is editable from anywhere. The concept is really simple and pretty straightforward, but not knowing most of the technologies behind it makes it challenging. Basically, the following process needs to happen:

“Launcher App” –> Chromecast App –> Node.js App –> Data from MongoDB

This launcher app needs to be launched manually for now. However, as long as the launcher exists as a mobile app, I can make it launch when the phone detects a specific NFC tag. Adding tasks is a little more straightforward.

AngularJS –> ExpressJS –> Node.js –> MongoDB

Also known as the MEAN stack, this is the process that I plan to use to power the backend of the task management. As the project progresses, hopefully I will be smart enough to make it into a deployable package so that anyone with a Chromecast and roommates has the ability to setup a communal living room task list.

During the development of this small system, I’ve been listening to a lot of older songs, especially those featured on Radio New Vegas. Here’s one of my favorites: