Scaling Technology in a Startup



A lot of people say programming is hard. They are not wrong. To programmers, though, it is not the coding that is hard but solving the actual problem. Coding is like expressing your thoughts in words, except the set of words you can use is limited and has a specific structure. So when someone finally solves a problem with a program, they are ecstatic, yet their solution might not be the best one. Most computer science majors may not remember their data structures classes, but data structures are the foundation of programming and are connected to virtually every problem solved with code.


A key constraint programmers tend to forget when coding is time complexity. Time complexity describes how long a program takes to run in its entirety, usually measured relative to the size of its input. For instance, an algorithm can run in linear time, meaning it touches every element of the input once, or in constant time, meaning it takes the same number of steps no matter how large the input is. Many naive algorithms ignore time complexity and iterate through the input over and over again. The answers these algorithms produce may be perfectly correct, but the approach is terrible for a product that consumers are going to use.
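
As a rough sketch of the difference, here is how a linear scan compares with a constant-time hash lookup in Python (illustrative names, not code from any particular product):

```python
def contains_linear(items, target):
    """O(n): walks the list one element at a time until it finds the target."""
    for item in items:              # one step per element of the input
        if item == target:
            return True
    return False


def contains_constant(item_set, target):
    """O(1) on average: a single hash-table membership check."""
    return target in item_set


if __name__ == "__main__":
    data = list(range(1_000_000))
    data_set = set(data)            # one-time O(n) cost to build the set

    print(contains_linear(data, 999_999))        # scans ~1,000,000 elements
    print(contains_constant(data_set, 999_999))  # a single lookup, regardless of input size
```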


As technology grows each year, consumers expect instant results and are disappointed when anything denies them that instant gratification. So if a program has n^n time complexity, the input is only 10 items, and each step takes 1 second, the algorithm will take approximately 316 years to complete. I do not know about you, but there is no way I am waiting 316 years for a program to finish. Clearly, anyone working on a client-facing application should think twice about the speed of their program before releasing it. To learn more about how time complexity actually works, here is a good resource to get you started.
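
For the curious, the arithmetic behind that figure works out roughly like this:

```python
# Sanity check of the n**n example: 10 items, 1 second per step.
n = 10
steps = n ** n                           # 10**10 = 10,000,000,000 steps
seconds_per_year = 60 * 60 * 24 * 365    # ~31.5 million seconds
print(steps / seconds_per_year)          # ~317 years (the ~316-year figure, up to rounding)
```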



Another factor to consider when developing algorithms is computational expense. Algorithms that demand a lot of computational power can cause problems on the device they run on. Machine learning algorithms in particular are generally computationally expensive and need lots of memory; most of the time, machine learning developers run their models on CPUs or GPUs with nothing but the model running. For example, just last week I was testing my multilayer convolutional neural network on my laptop with 16GB of RAM, and my computer could not take it and froze for an hour! While there may be no way around computationally expensive machine learning models, it is definitely possible to design everyday algorithms that require little computational power. Another example of a computationally expensive operation is checking a device's location frequently. On most smartphones, getting a geolocation fix requires the device to power up its GPS receiver, acquire signals from multiple satellites, and process them into a position before responding to the app. This is power hungry and can heat up the phone and/or drain the battery. A smarter solution is to check the geolocation only every so often and dynamically adjust the timeout between checks based on the needs of the program, as in the sketch below.
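
A minimal sketch of that polling idea in Python, assuming a hypothetical get_location() stand-in for whatever geolocation API the platform actually provides:

```python
import random
import time


def get_location():
    """Hypothetical stand-in for the platform's geolocation call.
    Here it just simulates a user who occasionally moves between two spots."""
    return random.choice([(39.9566, -75.1899), (39.9570, -75.1890)])


def poll_location(min_interval=30, max_interval=600):
    """Poll the device location with a dynamically adjusted interval.

    The wait doubles while the fix is unchanged (the user looks stationary)
    and resets to the minimum as soon as it changes, so the power-hungry
    GPS request happens only as often as the app actually needs it.
    """
    interval = min_interval
    last_fix = None
    while True:
        fix = get_location()
        if fix == last_fix:
            interval = min(interval * 2, max_interval)  # stationary: back off
        else:
            interval = min_interval                     # movement: poll sooner again
            last_fix = fix
        time.sleep(interval)
```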


Although I have described the consequences users face from poorly written algorithms, companies face consequences that are just as costly. Quite literally: each API call a service makes typically costs the company a few pennies, which does not sound so bad, but if a poorly structured algorithm calls that API every 5 seconds for every user, it becomes extreme. For instance, say there are 1,000 customers using the service. A call every 5 seconds is 17,280 calls per customer per day; at, say, five cents a call, that is $864 a day for each customer, and $864,000 a day across all 1,000 customers. Those few pennies per call add up to quite a costly sum. Datastores and virtual machines have much the same issue: when using a third party to store data or run servers, the pricing can depend on time used, the amount of data stored, or, quite often, both. As a result, a company that keeps querying a database for results it could simply have cached is wasting money that could easily have been saved had the developers been smart about their algorithms. It is therefore critical that developers design their algorithms intelligently, keeping both the customer and the company in mind.
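
One rough sketch of that kind of caching, where fetch stands in for whatever billable database query or API call the service actually makes:

```python
import time


def make_ttl_cache(fetch, ttl_seconds=60):
    """Wrap an expensive, billable fetch so that repeat requests for the
    same key within `ttl_seconds` reuse the cached value instead of
    triggering another paid call."""
    cache = {}  # key -> (timestamp, value)

    def cached_fetch(key):
        now = time.time()
        if key in cache:
            stored_at, value = cache[key]
            if now - stored_at < ttl_seconds:
                return value          # still fresh: no billable call made
        value = fetch(key)            # the call that actually costs money
        cache[key] = (now, value)
        return value

    return cached_fetch


# Usage (hypothetical): wrap whatever paid lookup the service performs.
# get_profile = make_ttl_cache(fetch_profile_from_api, ttl_seconds=300)
```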


Overall, the goal of most programming is not just to find a solution; if it is, you are probably in theoretical research. If you are, keep at it and keep breaking down doors to pave the way for new technologies. If not, be mindful of your algorithms and keep them efficient. I bet you do not like the buffering we all experience, so prevent the buffer and be better.


Written by Daniel Schwartz, VyB's Technology Leader.
