Tuesday, October 30, 2018

How to debug a web page on iOS - Part 2

In this post, we will continue our pursuit of debugging a web page on iOS. Wouldn't it be nice if we had a Chrome Developer Tools kind of interface for iOS too?

The second way to investigate webpage errors on iOS achieves exactly that, but using Safari!

The Solution

Apple has natively supported remote debugging of webpages on iOS since iOS 6. Here are the steps necessary to enable it.
  • Open Settings app on iOS device
  • Click on Safari

  • Here, click on Advanced

  • Next, enable the Web Inspector switch.
  • Now, head over to your Mac and open Safari.
  • Click on Safari --> Preferences

  • Click on the Advanced tab and enable "Show Develop menu in menu bar"

  • This will show a new menu item called Develop, right next to the Window menu item.
  • Now, connect your iOS device to the Mac using a cable.
  • Next, open the Safari browser on the iOS device and open the webpage that you want to investigate.
  • Click on Develop and you should see your iOS device listed there.
  • Clicking on your iOS device name under the Develop menu should list all the web pages open in the iOS Safari browser.

  • Click on the webpage that you want to debug; this opens up the Safari Web Inspector.
  • Here, you can select elements, manipulate CSS, investigate console errors, look at all the network calls, profile your app and more.
  • Notice that when we inspect a div in the Web Inspector, it automatically gets highlighted on the iOS device.

Nice, isn't it?

    Sunday, September 30, 2018

    How to debug a web page on iOS - Part 1

One of our customers recently wrote an email to our support team about a problem they were facing. They were not able to view the "Printables" section of Monster Math Games. This section shows a list of free and paid printable activities. The page is designed as a web page instead of a native screen and loads in a web view inside the app. The customer was only getting a blank white page.

I started investigating the issue. Initially, I thought it might be some sort of network latency issue or something related to HTTP vs HTTPS. But it turned out that the problem was with one of the JavaScript APIs we were using.

    The Problem

Debugging a web page on an iOS device is not as straightforward as I had initially thought. I learned two very interesting ways of debugging webpage errors on iOS. This post is an attempt at documenting these two approaches so that they can help someone facing a similar problem.

    The Solution

The first thing I wanted to check was the actual source code of the web page. Here are the steps we need to follow to view the HTML source of a web page.

    • Open this post on the iOS safari browser
    • Click the upload icon button on the menu bar, then click the "Add Bookmark" button.
    • Change the Title field to "View Source" and hit "Save"
• Next, we need to edit this bookmark and change the Address field to the following JavaScript code. Copy all the text from the following gist.

• It needs to be done this way because iOS Safari doesn't let us edit the Address field while creating the bookmark.
• Click on the "Book Icon" in the menu bar to bring up your bookmarks.
• Click on "Favorites"; you should see the newly created bookmark "View Source".
• Click "Edit" and then tap on the "View Source" bookmark.

    • Now select the Address field and paste the copied code.
• Hit "Done" to save the bookmark.
    • Visit the web page whose HTML source you want to view.
    • Once the page opens, click the newly created bookmark "View Source". 
• This will open up a popup with details of the web page.

    • This Popup has a "View Source" tab that shows the HTML source of the web page.
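For illustration, here is a sketch of the kind of javascript: address such a "View Source" bookmarklet typically contains (not necessarily the exact gist from this post). It reads the current page's HTML, escapes it so the markup displays as text, and shows it in a new tab:

```javascript
javascript:(function () {
  /* Grab the full HTML of the current page and escape the characters
     that would otherwise render as markup. */
  var source = document.documentElement.outerHTML
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  /* Show the escaped source in a new tab inside a <pre> block. */
  var viewer = window.open('about:blank');
  viewer.document.write('<pre>' + source + '</pre>');
  viewer.document.close();
})();
```

This only runs inside the browser, since it relies on document and window.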

    Pretty cool, huh!

    Friday, August 31, 2018

    How to find APNS Device Token of a Production iOS app

I wanted to test out the look and feel of a push notification on the production iOS app. For that, I needed to know the device token of my device.

    The Problem

Quick googling suggests that we can get the device token from the app delegate callback method. However, this method doesn't work for production apps. So how do we get the device token of a production iOS app?

    The Solution

The solution is pretty straightforward and extremely low-tech :D. We can get the device token from Xcode using the following steps:

    • Connect your device with the Mac
• Open Xcode and click on Window -> Devices and Simulators
• When the Devices and Simulators window opens up, click on "Open Console"
• The new and shiny device console opens up
• Launch the app and accept the popup to receive notifications.
• Now, in the device console, search for "Request per-app token with token identifier"
• You should get the device token, which is in the format "4D2338E0-1D8F-490A-9C8E-F5A4FEA2CFFF"
Just use the device token with any push notification sending service to send a push notification to your device!

    Tuesday, July 31, 2018

    How to paginate faster in PostgreSQL with big offset values

I was surprised to learn how inefficient pagination can be when it's done with LIMIT and OFFSET.

Everything is fine and dandy as long as the OFFSET value is in the hundreds and you are dealing with a relatively small dataset. With huge datasets (5-10 million+ records), the performance degrades pretty fast as the offset values increase.

    The Problem

Offset inefficiency creeps in because of the delay incurred in skipping over a large number of rows. Even in the presence of an index, the database must scan through storage to count rows. To utilise an index, we would have to filter a column by a value, but in this case we require a certain number of rows irrespective of their column values.

Moreover, rows can be of different sizes in storage, and some may be marked for deletion, so the database cannot use simple arithmetic to find a location on disk to begin reading results.

    The Solution

It's best to demonstrate the solution with an example. Let's say we have a table called "events" with a primary key column "id". We are fetching 30 records per page and now want to skip 100,000 records and get the next 30 records. The query to do this would look like this:
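Using the "events" table and "id" column from the example, such a query would look something like:

```sql
-- Fetch a page after skipping 100000 rows: PostgreSQL must still walk
-- past all 100000 rows before returning the 30 we asked for.
SELECT *
FROM events
ORDER BY id
LIMIT 30 OFFSET 100000;
```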

This query would be pretty slow because of the reasons mentioned above. To get around this problem, we can tweak the query as follows, and it should start running unbelievably fast.
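The tweak replaces the OFFSET with a WHERE clause on the primary key. As a sketch (assuming, for illustration, that the ids are sequential and the previous page ended at id 100000):

```sql
-- Jump straight past the last id seen on the previous page via the
-- primary key index, then read the next 30 rows.
SELECT *
FROM events
WHERE id > 100000
ORDER BY id
LIMIT 30;
```

In practice, the application remembers the last id of the current page and passes it into the next page's query (keyset pagination).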

The reason for this significant improvement in performance is the WHERE clause on "id". The database can use the primary key index to go right to the given row and then fetch the next 30 records!

There you have it: a simple tweak to a query can be the difference between a "crawling query" and a "blazing fast query"!

    Saturday, June 30, 2018

How to map a PostgreSQL JSON column with a Hibernate value type and Kotlin

At makkajai, there has been no dearth of challenging problems :). Recently, we moved to a new analytics partner. I will not bore you with the details of why we had to move, but what is significantly more interesting is how we executed the move. Some key requirements for the move were:

• Migrate all the data collected by the previous analytics partner, i.e. around 40 million events, to the new partner.
• Honour the concurrency limits of the old and new analytics partners. If we didn't honour them, they would stop responding for a period of 10 minutes (which would be a costly 10 minutes).
• The old analytics partner had a limit of 3 concurrent requests.
• The new analytics partner had a limit of 1000 events per second.
• The migration had to be reliable and fault tolerant, e.g. we should be able to run the migration multiple times during the migration window.
I am not going to go into the details of how we solved the whole problem (maybe some other time); in this blog I am going to focus on a very small part of the problem.

    The Problem

The PostgreSQL JSON column type has great querying features, and I wanted to use it to save parts of the events JSON response received from our old analytics partner. For this to happen, I needed to map the PostgreSQL JSON column type to a Hibernate value type. This blog post documents the steps needed to achieve this using Kotlin.

    The Solution

    There are 4 steps involved to make things work.

    • Adding a custom PostgreSQL dialect to register the JSON column type with Kotlin String.
    • Registering the custom PostgreSQL dialect in application.properties.
• Adding a custom user type class to map a Kotlin String to the PostgreSQL JSON column.
    • Annotating the model classes, to use the custom user type class.
Here is the code needed to achieve all four steps mentioned above.
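A sketch of step 1's custom dialect (the class and package names here are illustrative, and a Hibernate 5.x base dialect is assumed):

```kotlin
package com.makkajai.config // illustrative package name

import org.hibernate.dialect.PostgreSQL94Dialect
import java.sql.Types

// Register the "json" column type so Hibernate treats it as JAVA_OBJECT.
class JsonPostgreSQLDialect : PostgreSQL94Dialect() {
    init {
        registerColumnType(Types.JAVA_OBJECT, "json")
    }
}
```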

Above is the custom PostgreSQL dialect that registers the JSON column type with a Kotlin String.
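Assuming a Spring Boot application with JPA (the property key and the dialect's fully qualified name are assumptions), registering the dialect might look like:

```properties
spring.jpa.properties.hibernate.dialect=com.makkajai.config.JsonPostgreSQLDialect
```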

Sample application.properties entries to register the custom dialect.
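A sketch of step 3's user type class, assuming Hibernate 5's org.hibernate.usertype.UserType interface (the class name is illustrative):

```kotlin
import org.hibernate.engine.spi.SharedSessionContractImplementor
import org.hibernate.usertype.UserType
import java.io.Serializable
import java.sql.PreparedStatement
import java.sql.ResultSet
import java.sql.Types

// Maps a Kotlin String property to a PostgreSQL JSON column.
class JsonStringUserType : UserType {
    override fun sqlTypes() = intArrayOf(Types.JAVA_OBJECT)
    override fun returnedClass(): Class<*> = String::class.java
    override fun equals(x: Any?, y: Any?) = x == y
    override fun hashCode(x: Any?) = x?.hashCode() ?: 0

    // Read the JSON column back as a plain String.
    override fun nullSafeGet(
        rs: ResultSet, names: Array<String>,
        session: SharedSessionContractImplementor, owner: Any?
    ): Any? = rs.getString(names[0])

    // Write the String with Types.OTHER so the driver sends it as json.
    override fun nullSafeSet(
        st: PreparedStatement, value: Any?, index: Int,
        session: SharedSessionContractImplementor
    ) {
        if (value == null) st.setNull(index, Types.OTHER)
        else st.setObject(index, value, Types.OTHER)
    }

    override fun deepCopy(value: Any?) = value
    override fun isMutable() = false
    override fun disassemble(value: Any?) = value as Serializable?
    override fun assemble(cached: Serializable?, owner: Any?) = cached
    override fun replace(original: Any?, target: Any?, owner: Any?) = original
}
```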

Above is the custom user type mapping class. It will be used to map a Kotlin String to the PostgreSQL JSON column.
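A sketch of step 4's annotated model class (the entity's fields beyond "properties" are assumptions for illustration):

```kotlin
import org.hibernate.annotations.Type
import org.hibernate.annotations.TypeDef
import javax.persistence.Column
import javax.persistence.Entity
import javax.persistence.Id

@Entity
@TypeDef(name = "JsonString", typeClass = JsonStringUserType::class)
class UserEvent(
    @Id
    var id: Long = 0,

    // A plain String in Kotlin, stored as a JSON column in PostgreSQL.
    @Type(type = "JsonString")
    @Column(columnDefinition = "json")
    var properties: String = "{}"
)
```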

Above is the simple UserEvent model class that uses the String property "properties" and maps it to the PostgreSQL JSON column data type.

That's about it! When we create an instance of the UserEvent class and set the value of properties, it will be correctly saved in the PostgreSQL JSON column. PostgreSQL will also validate that it is a valid JSON string before saving the information.

    Tuesday, May 29, 2018

How To Print a 1x1 Shipping Label on a 2x2 A4 Generic Sticker Paper

When you are running a startup, you will face numerous business problems on a daily basis. Some problems are within your core competency and some fall outside your comfort zone. Being a startup founder, you really can't afford not to solve a problem just because it falls outside your comfort zone. Also, at times you only need to be street smart to solve the problem and move on :)

I recently faced one such not-so-interesting problem, but it was essential for me to solve it.


We ship books to our customers in India, and we recently moved to Delhivery as our delivery partner. Have you noticed the stickers on the packaging when you receive deliveries from Amazon/Flipkart/Delhivery etc.? Those stickers are called shipping labels. A shipping label has a bunch of information like:

• Who the package is for.
• Where it is coming from.
• Contact details of the client.
• Contents of the package and its approximate value.
• And many other things.
    This is how a Shipping Label looks:

When we create a shipment in the Delhivery portal, they generate the shipping label for us. Shippers are supposed to print it and affix it to the shipment. So far this feels normal and business as usual, so what's the problem?

    The Problem

We usually ship in bulk, to hundreds of our customers in one batch. Delhivery generates a PDF with one shipping label on every page; hence, if we are shipping 100 shipments, Delhivery will generate a PDF with 100 pages, i.e. one label on each page in a 1x1 format.

If we had access to a specialised printer which could print these stickers on a sticker roll, we would be sorted. But unfortunately, we didn't have such a printer.

There are generic sticker papers available in the market to print shipping labels. However, each A4-size generic sticker paper costs around Rs. 5. It's not optimal, from a cost as well as a resources perspective, to print just one sticker on the entire A4 page.

Considering the size of the shipping label, we could easily print 4 shipping labels on one A4-size paper. If we were able to do that, the cost of printing one label would drop four-fold. Something like this:

Initially you might wonder why saving a few bucks is so important. That's because of a simple concept called unit economics. If you ever want to get your startup into the successful zone, you need to get the unit economics right :D!

    The Solution

Now that we know what the problem is and why we need to solve it, let's focus on how I got it done.

Speed of execution is everything in the startup world. I had to solve this in a way that is easily doable by any non-tech operations person; at the same time, I didn't have the luxury of building a sophisticated custom solution.

So, what did I do? I broke the problem down into smaller steps and solved each of them.

• The first thing I observed in the shipping labels PDF was that there was some extra information around the shipping labels, like the footer of the page and some other unimportant stuff.
• In order to arrange the labels in a 2x2 format, I needed to trim the unimportant stuff. For this, I looked for a site that could help me trim all the PDF pages in one go. Sejda was perfect for this. The free plan has some restrictions, but we could live with those.
• You can upload the PDF and crop all pages with a mask in one go. What I got after that was a PDF in which every page had only the important stuff.
• Next, I needed to export each page as a separate image in either PNG or JPEG format. This was necessary so that I could use mail merge to arrange the shipping labels in a 2x2 format.
• I exported the PDF pages to PNG using the pdf2png site. The result was a zip file with all the PDF pages exported as PNG files.
• The final step was to use Microsoft Word's mail merge to arrange these images in a 2x2 grid. I followed this nice article to get that done.
• Once I followed the steps, I finally got a 2x2 grid of all my shipping labels, which could be printed on generic A4 paper with a 2x2 sticker grid.
• In the end, I had a very low-tech solution (one that could be easily followed by any operations person) to a business problem.
The entire solution is in line with the theory suggested by Mark Watney from The Martian :)
You solve one problem... and you solve the next one... and then the next. And if you solve enough problems, you get to come home!

    VoilĂ , my job here was done!

    Friday, April 27, 2018

    How to get Snowplow-Mini running on AWS

While looking at various analytics engines, we came across Snowplow Analytics. We wanted to give it a shot and experience it first-hand. Luckily, they have something called Snowplow-Mini. It's an easily deployable, single-instance version of Snowplow. It essentially gives us a taste of what Snowplow can do as far as data collection, processing and analytics are concerned!

We started with the quick start guide and usage guide and performed all the steps mentioned there to get the Snowplow-Mini instance working. However, we did face two annoying issues; investigating and fixing them wasted a few hours. This post is about those two issues, so that my fellow developers do not have to waste any time investigating and fixing them.

    Unable to: Generate a pair of read/write API keys for the local Iglu schema registry

We followed all the steps mentioned in the usage guide, but we were unable to generate the keys:
    • Navigate to http://<public dns>/iglu-server
• Input the super API key set up in the previous step in the input box in the top right corner
• Expand the keygen section
• Expand the POST /api/auth/keygen operation
• Input the appropriate vendor_prefix for this API key
• Click Try it out!
At this point, it should have generated the read and write keys for us. But all it did instead was show a progress bar that ran forever without returning.

Investigating it in the Chrome Developer Console revealed that the calls were failing with 401 Unauthorized. After googling for this error a bit, I found that someone else was also facing a similar problem. Their solution was to do the HTTP POST via curl, and that seemed to work for them. However, it didn't work for us.

    I looked around for ways to debug the problem.
    • I connected to the Showplow-Mini instance via SSH (refer to AWS documentation on how to do this)
• Checked the config under the "snowplow" directory on the instance. I could not spot anything unusual there -- not that I knew much about it anyway :D
• Checked the logs under the "/var/logs" directory. Found a few things, but could not really solve the problem.
    • Connected to PostgreSQL DB on the instance using the following command
      • psql --host=localhost --port=5432 --username=snowplow --dbname=iglu
        # Password is "snowplow"
    • Ran the query to check the API key
      • select * from apikeys;
• What I saw next made my jaw drop in disbelief!
• The API key is case-sensitive, and the key Snowplow-Mini had saved was all lowercase, even though I had entered the key in all caps.
• Passing the key in lowercase and making the following call did generate the read/write API keys for the local Iglu schema registry
      • curl http://<IP address of your server>/api/auth/keygen -X POST -H "apikey: <your case sensitive API key>" -d "vendor_prefix=com.makkajai"
• Duh! Yeah, I know.

• I got to know how to connect to the Snowplow-Mini PostgreSQL DB from here
    I must have easily wasted an hour trying to fix this problem. I hope others can save that time!

    Unable to: See events in Kibana Dashboard

This was a tricky one. After sending sample events, I was unable to see them in the Kibana dashboard. This happens mainly because "snowplow_stream_enrich" is not able to connect to the Elasticsearch service.

How did I figure it out?
    • ssh into the Snowplow-Mini instance
    • I checked the logs under "/var/logs" directory. 
• The logs seemed to be filled with exceptions like:
  • Exception in thread "main" java.net.UnknownHostException: ip-xx-xx-xx-xx: ip-xx-xx-xx-xx: unknown
    • Googled it a bit, found the solution here 
    • Edit the file "/etc/hosts" and add the IP address information in that file as follows.
      • sudo vim /etc/hosts 
      • xx.xx.xx.xx ip-xx-xx-xx-xx localhost
    • xx.xx.xx.xx being the AWS local IP address.
• Save, exit, and restart all services from the Snowplow-Mini console.
• Generate a few events, open the Kibana dashboard, and it worked this time!
    After these two problems were out of the way, my Snowplow-Mini instance was fully up and running on AWS!
    Have some Fun!