Building the New Standard in Time Tracking

By Muneeb Sahaf, QA Engineer

I’m Muneeb Sahaf, a Mechanical Engineer turned Software Developer with a few good years of corporate sales experience under my belt. I have a thirst for knowledge, and a hunger to improve daily, both of which eventually led me to join Jibble. I aspire to provide genuine value, and writing this candid review was a small step in that direction. Enjoy…

The Jibble Story

Building software that stands out, is robust, and makes an impact is difficult. A lot of teams can do one, a few can do two, but rarely do we see a team come together and do all three.

That is what I’ve witnessed at Jibble.

A quick Google search will validate my point: the company’s software has an impressive 4.8 rating on GetApp and Capterra, and a 4.7 rating on the Apple App Store. I’m sure there are many things the company can improve, but the team is doing something right if these are our ratings. I wanted to understand exactly what they were doing, so I decided to try to make sense of how and why the company was achieving these impressive results.

I realise timesheet software and time tracking software aren’t really that enticing; it’s definitely not the same as working for a gaming studio like Insomniac Games, which is developing the new Spider-Man game.

However, Jibble bills itself as “The New Standard In Time Tracking”, a bold claim, but one that I found to be rooted in fact. This is the result of grueling efforts to make something mundane yet significant genuinely interesting.

You see, legacy industries are what most startups will be changing, given that most immediate, in-your-face innovation has been achieved. This is why I believe Jibble finds itself in a unique position to capture a market that has been long neglected by young founders and seasoned venture capitalists.

At a glance, Jibble may look like a run-of-the-mill concept that runs along the lines of ‘OK, you found a problem, let’s mash up some code, create a UI, attach a payment gateway, and throw it out into the world.’ That, at least, was my understanding before I joined the team, but as I quickly learned, SaaS companies that develop complex products like timesheet software work differently.

There are a LOT of processes that need to be followed, and while Jibble is no Google, it certainly boasts a very impressive team of individuals that helm the development side of things. These heroes often go unnoticed.

You may have heard that Software Engineers are not great communicators — they are generally introverts, masters of their own domains, and rarely do they want to be disturbed by human interaction. This stereotype is true to some extent. 

Most engineers, to be good at their craft, have to spend countless hours huddled over dimly lit screens, hammering away at their keyboards, and our team is no different.

Having been on both sides of the company’s operation, the client-facing consumer role and the dev-facing product role, I find myself in a unique position to comment on why things here just…work, and why keeping an eye on this startup is worth your time.

Our Development Process

Our team has spent countless hours refining every inch of its development process, which is as follows:

1. Ideation: Finding a Pain Point

Our team spends an excruciating amount of time before the first line of code is written, to make sure we don’t spend double that once we’ve started to build. As Abraham Lincoln said:

“Give me six hours to chop down a tree and I will spend the first four sharpening my axe.”

[Image: A team working and brainstorming on creating winning time tracking software together.]

2. External Research

Once a pain point has been identified within the domain of our time-tracking software, the team is quick to compare existing customer feedback, current and projected market trends, and relevant technologies, looking for commonalities that help assemble the data needed to make an informed decision.

3. Internal Research: Discussions with the Team

Once the data has been gathered, the relevant stakeholders are brought in, including Product Owners, Product Managers, CTOs, Lead Designers, and Team Leads from all sections. After the team has debated the pros and cons, resources versus time, and the expected ROI of the decision we are about to take, everyone is asked to take time to voice their concerns and express which direction they feel we should take, and why or why not.

4. Decision Making: Do It Now or Later, and How Will It Affect the Business?

After the dust has settled and a conclusion has been reached, the team picks the decision that aligns with the company vision and its immediate or expected goals. If a decision does not meet those criteria, it doesn’t make the cut.

Decisions are often (but not always) related to improvements to existing features or the development of brand-new ones. After either of those has been given the green light, it’s a matter of where to put it in the roadmap: do we do it now, next month, next quarter, or super backlog it?

Here, we look at what impact our decision will bring to existing customers, future prospects, and current capacity, and then we decide.

5. Design Research

Once the feature timeline has been decided, we need to come up with ideas for how it will look. This is something a lot of new entrepreneurs and software product owners mess up. They’re too focused on how it should work when they also need to be concerned about making sure all features fit together in their visual puzzle.

[Image: A chart showing the design research process for making time tracking software.]

6. Specification

The next step is the bane of Software Developers: documentation. The specification is where we define the scope of the project we’re developing. It’s where the idea is written up and developed into a document that serves as the source of truth for development, testing, and bug checks.

It states what the whole feature will be, explains how it will look, and describes the end goal (at the moment, we don’t look at the metrics much). There is also technical documentation, written by our developers. This is how we currently go about writing specifications:

  • Initial draft from PM
  • Team leads go over initial specs and leave comments regarding technical feasibility or alternative suggestions
  • Edits made by PM
  • Another discussion (steps 2-3 can be repeated a few times depending on the complexity of the project)

The team makes sure all edge cases are considered during this time, but once the design has been finalized, we put a hard stop to spec discussions. Until then, iterations are possible.

7. Product Design

Our brilliant designers build various mockups with an assortment of display options. Everything is generated keeping Jibble’s color palette in mind.

Every display conceivable (mobile, web, tablet) and a wide array of screen sizes are taken into consideration at this stage.

8. Back-end Implementation and Unit Tests

Most features start from the backend. Our backend needs to support the feature, and our API needs to expose it, before we add UI and logic on the client side. So grooming sessions are held with the BE team to refine tickets and tasks based on the specs, and tickets are then assigned during backend sprint planning.

The BE team is also responsible for developing unit tests for the code they write to make sure all functionality is working as expected. At this point, feature architecture is created. A good feature model will ease the development process for clients.
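To make that concrete, here is a minimal sketch of what such a backend unit test could look like. The calculateDurationMinutes helper, the TypeScript language, and Node’s built-in test runner are assumptions made purely for illustration; they are not Jibble’s actual stack or code.

```typescript
// Hypothetical example: a unit test for a backend helper that computes
// the duration of a time entry. Names and stack are illustrative only.
import { test } from "node:test";
import assert from "node:assert/strict";

// A tiny piece of backend logic a time-tracking feature might need.
function calculateDurationMinutes(clockIn: Date, clockOut: Date): number {
  if (clockOut < clockIn) {
    throw new Error("clock-out cannot be earlier than clock-in");
  }
  return Math.round((clockOut.getTime() - clockIn.getTime()) / 60000);
}

test("computes the duration of a normal working day", () => {
  const start = new Date("2024-01-15T09:00:00Z");
  const end = new Date("2024-01-15T17:30:00Z");
  assert.equal(calculateDurationMinutes(start, end), 510);
});

test("rejects entries where clock-out precedes clock-in", () => {
  const start = new Date("2024-01-15T09:00:00Z");
  const end = new Date("2024-01-15T08:00:00Z");
  assert.throws(() => calculateDurationMinutes(start, end));
});
```

The value of a test like this is that it pins down the expected behaviour of the core logic before any client code depends on it.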

9. Client Implementation and Unit Tests

Now that our BE has been implemented, the mobile and web development teams start their respective client implementations. The process is about the same as for the BE; however, ticket refinement is mainly done outside of planning calls.

Clients mainly refer to the specs, the designs, and the BE model structure of the feature to implement their code. On mobile, we have two different UI teams following one shared logic set, which helps optimize delivery and reduce redundancy. Initial tests are run by the dev team and then passed to QA for manual checks.
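As a rough illustration of that “one shared logic set” idea, the sketch below shows a single shared module consumed by two thin, platform-specific UI layers. Every name here is hypothetical and TypeScript is used purely for illustration; this is not a description of Jibble’s actual codebase.

```typescript
// Hypothetical sketch: one shared logic module, two thin UI layers.
// All names are illustrative, not Jibble's actual code.

// --- Shared logic: written once, owned in one place ---
export interface TimeEntry {
  clockIn: Date;
  clockOut: Date | null;
}

export function statusLabel(entry: TimeEntry): string {
  // A single source of truth for the business rule.
  return entry.clockOut === null ? "Clocked in" : "Clocked out";
}

// --- Platform UIs: each team renders the same result its own way ---
export function renderStatusBadgeIOS(entry: TimeEntry): string {
  // In reality this would be a native badge component.
  return `● ${statusLabel(entry)}`;
}

export function renderStatusChipAndroid(entry: TimeEntry): string {
  // In reality this would be a Material chip.
  return `[ ${statusLabel(entry)} ]`;
}
```

Because both UI layers call the same statusLabel function, the rule only ever has to be fixed or changed in one place, which is the redundancy reduction described above.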

10. QA Acceptance Testing

As features are developed, they are deployed into our test environments, where the QA team does rigorous Acceptance testing. That’s a fancy way of saying we test developed features of the time-tracking software to make sure they perform according to expectations. If issues are found, it’s back to the workshop for overhauls and improvements. 

11. QA Regression Testing

If all features work as intended, they are then sent to production, where they are once again tested, this time to make sure that the new changes don’t break what was working before.

Regression Testing is a type of software testing that confirms a recent program or code change has not adversely affected existing features. Here at Jibble, we perform two types of Regression Testing: one is informal, or minor, Regression Testing, done within the scope of Acceptance Testing tickets that have issues and need to be re-fixed; the other is full-cycle Regression Testing, performed towards the end of the sprint (known internally as the cycle) with broader coverage, focusing on the critical path of the selected features.

At the moment, we’re executing it manually until our Automation scripts are fully ready.
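For a sense of what one of those automation scripts might eventually look like, here is a hedged sketch of an automated regression check for a critical path, written with Playwright. The URL, selectors, test account, and button labels are invented for illustration and do not describe Jibble’s actual test suite or environments.

```typescript
// Hypothetical regression check: the existing clock-in flow still works
// after a new release. URL, selectors, and credentials are invented.
import { test, expect } from "@playwright/test";

test("existing clock-in flow still works after new changes", async ({ page }) => {
  // Log in to a staging environment with a dedicated QA account.
  await page.goto("https://staging.example.com/login");
  await page.fill('input[name="email"]', "qa-regression@example.com");
  await page.fill('input[name="password"]', process.env.QA_PASSWORD ?? "");
  await page.click('button[type="submit"]');

  // The pre-existing behaviour we must not break: one click to clock in.
  await page.click('button:has-text("Clock In")');

  // A visible, verifiable outcome guards against silent regressions.
  await expect(page.locator('[data-testid="timer-status"]')).toHaveText(/clocked in/i);
});
```

A script like this covers the critical path once and can then be re-run at the end of every cycle, which is exactly where the manual effort currently goes.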

12. Exploratory Testing

As the team is agile in developing its timesheet software, all of the above is done in cycles of two weeks. Outside of those cycles, or during periods when the workload is relatively low, the team does exploratory testing – which just means keeping an eye on everything and flagging any issues that need to be hotfixed.

We also perform exploratory testing while working through Acceptance Testing tickets, covering the extended scope of the ticket itself.

13. Release

Finally, after all that dedicated effort, our creation is set loose upon the world. Weren’t expecting it to take this long to make timesheet software, were you?

The Takeaway

To create a superb product, you need a superb vision, and you must not deviate from it when the winds change. Jibble’s time-tracking team understands the wisdom of not giving in to trends too fast; they are purposeful yet constantly iterating for improvement. At the same time, they are executing strategies to stay committed to their vision of what should be “The New Standard In Time Tracking”. THIS, I think, plays a big part in their recent success.

The final takeaway for teams looking to replicate Jibble’s success in building reliable time-tracking software is this: create strong, lean teams that move fast, put in the time to do data-driven research before making decisions, and employ brilliant engineers who use best practices and global collaboration to build, test, improve, and ship features like clockwork. Then sit back and watch the magic happen.