
Hyperlocal Shopping Search Portal

Brief about the Client and Product

The client is a startup headquartered in California, United States.

The client’s product is an e-commerce portal that aggregates products, store by store and across various categories, helping users find merchandise from different brands at top local retail stores.

The portal enables the user to search for specific merchandise on sale, geographically pinpointing the nearest store location carrying those items.

This makes local shopping easy: shoppers can print their shopping list and plan ahead of their trip instead of searching store to store for sales.

Kind of Engagement

AFour started the engagement with the client in 2015 for product engineering services.

The client approached us to improve the performance of the data processing engine, which was very slow, and in parallel to revamp the user experience.

When we started the engagement, the data processing engine, developed in Java, took more than 20 hours to ingest millions of products into the portal.

The user experience was built on the Magento platform, which was bulky and written in PHP.

These technology choices created several shortcomings that hampered the user experience and the adoption of the product.

Key Highlights and Achievements

We rearchitected the data processing engine using Hadoop distributed processing and improved its performance by over 200%.

Moreover, the new data processing engine was completely cloud-native and cost one-fifth of the original solution.

The user experience was revamped and rebuilt using React.js to enable an instant search experience with response times under 500 ms. This greatly enhanced product search for users.

We developed an entirely new faceted search experience using Elasticsearch.

Additionally, the AFour team owned product validation and managed the staging and production infrastructure in AWS.

Thus, we owned the end-to-end development, testing, and delivery of the product.

Team Composition

The team was cross-functional, with a Technical Project Manager owning delivery planning and management.

The team consisted of 2 engineers building the user experience, 2 engineers on API development and data engineering, and 1 on QA.

Additionally, 1 product validation engineer verified each release before it went into production.

Project Delivery Process and Communication

We developed the product's features iteratively.

Initially, we built the data processing engine and then the faceted search experience.

Then, we revamped the UI, moving from PHP to React.js, and built a richer experience that was mobile-friendly and responsive.

Later, we added features to the product sprint by sprint based on the business needs of the customer.

We had 2 scheduled meetings every week with the client to showcase the progress of development.

Sprints typically lasted 2 weeks and involved story analysis, development, testing, and deployment.

In addition to the sprint report, weekly status reports were shared with the client to apprise them of progress made, milestones achieved, and any risks to the project.

This ensured that risks were identified very early and mitigated effectively.

Tools and Technologies

We used React.js, Next.js, and Bootstrap 3 for the frontend.

We used Elasticsearch, hosted on AWS, for the faceted search experience.
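To illustrate the idea of faceted search, the sketch below builds an Elasticsearch-style query body with terms aggregations for facet counts. The field names (name, category, brand) are hypothetical, not the portal's actual schema, and the body is shown as a plain Python dict rather than a live client call.

```python
def faceted_search_query(text, category=None, brand=None):
    """Build an illustrative Elasticsearch query body with terms
    aggregations for faceting. Field names are assumptions, not
    the actual portal schema."""
    filters = []
    if category:
        filters.append({"term": {"category": category}})
    if brand:
        filters.append({"term": {"brand": brand}})
    return {
        "query": {
            "bool": {
                # Full-text match on the product name,
                # narrowed by any selected facet filters.
                "must": [{"match": {"name": text}}],
                "filter": filters,
            }
        },
        # Terms aggregations produce the per-facet counts
        # shown alongside the search results.
        "aggs": {
            "by_category": {"terms": {"field": "category"}},
            "by_brand": {"terms": {"field": "brand"}},
        },
    }

body = faceted_search_query("sneakers", brand="Acme")
```

With the official Elasticsearch client, a body like this would be passed to a search call against the product index; selecting a facet value simply adds another filter clause and re-runs the query.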

To process millions of product imports periodically, we used Hadoop for distributed processing and ingestion of product data.

Java was our language of choice for writing MapReduce programs. To reduce infrastructure costs, we used AWS EC2 Spot Fleet.
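The MapReduce pattern behind the ingestion engine can be illustrated with a toy, single-process sketch (in Python for brevity; the production jobs were Java on Hadoop). The record shape and the counting task here are purely illustrative.

```python
from collections import defaultdict

def map_phase(records):
    # Mirror a Hadoop Mapper: emit a (store_id, 1) pair for
    # each product record seen.
    for record in records:
        yield record["store_id"], 1

def reduce_phase(pairs):
    # Mirror a Hadoop Reducer: group pairs by key and sum
    # the values, yielding products-per-store counts.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

records = [
    {"store_id": "s1", "sku": "A"},
    {"store_id": "s2", "sku": "B"},
    {"store_id": "s1", "sku": "C"},
]
counts = reduce_phase(map_phase(records))
```

In Hadoop, the framework handles the shuffle between the two phases and runs many mappers and reducers in parallel across the cluster, which is what brought the multi-hour import times down.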

On the data storage layer, we used MySQL and MongoDB. For caching, we used Redis.
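A common way to combine a database with Redis is the cache-aside pattern: read the cache first, fall back to the database on a miss, then populate the cache with a TTL. The sketch below uses a plain dict as a stand-in for Redis; the source does not describe the actual caching logic, so this is an assumption about a typical setup.

```python
import time

class CacheAside:
    """Toy cache-aside store. A dict stands in for Redis;
    with a real client the get/set-with-expiry calls would
    be analogous."""

    def __init__(self, db, ttl_seconds=300):
        self.db = db          # source of truth (e.g. MySQL/MongoDB)
        self.ttl = ttl_seconds
        self.cache = {}       # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self.cache.get(key)
        if entry and entry[1] > time.time():
            return entry[0]   # cache hit: serve without touching the db
        value = self.db[key]  # cache miss: read the source of truth
        self.cache[key] = (value, time.time() + self.ttl)
        return value

db = {"product:42": {"name": "Sneaker", "price": 59.99}}
store = CacheAside(db)
first = store.get("product:42")   # miss: reads the db, fills the cache
second = store.get("product:42")  # hit: served from the cache
```

The TTL bounds staleness: once an entry expires, the next read falls through to the database and refreshes the cached copy.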

The staging and production infrastructure was maintained entirely in AWS, and we used Amazon CloudFront as the content delivery network for the image-intensive e-commerce portal.

In a nutshell, the e-commerce portal was built entirely with open-source technologies and hosted 100% in the AWS cloud.