Google Hash Code 2020 — Practice Round
I decided to take part in Google Hash Code 2020; it will be my first time participating in a competition like this. To build some confidence beforehand, I tried out the practice problem that Google provides before the competition actually begins.
I will assume that readers of this article are already familiar with the problem statement. Here is the link to the practice problem: https://hashcodejudge.withgoogle.com/#/rounds/4684107510448128/
Let’s dive into solving this problem. In short, we have to pick a set of pizzas whose total number of slices is as large as possible while staying less than or equal to a number M given in the problem statement.
Here are the constraints of the input data:
● an integer M (1 ≤ M ≤ 10^9) — the maximum number of pizza slices to order
● an integer N (1 ≤ N ≤ 10^5) — the number of different types of pizza
The second line contains N integers: the number of slices in each type of pizza, in non-decreasing order:
● 1 ≤ S(0) ≤ S(1) ≤ … ≤ S(N−1) ≤ M
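For reference, reading this two-line input format can be sketched as follows (the helper name is my own, and the sample values are just a tiny illustrative instance):

```python
def parse_input(text):
    """Parse the two-line input: 'M N' on the first line,
    then N slice counts in non-decreasing order on the second."""
    lines = text.strip().splitlines()
    m, n = map(int, lines[0].split())
    slices = list(map(int, lines[1].split()))
    assert len(slices) == n, "second line must contain exactly N values"
    return m, slices

m, slices = parse_input("17 4\n2 5 6 8\n")
# m == 17, slices == [2, 5, 6, 8]
```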
I tried 3 different approaches, let’s look at each of them and see which one performed the best.
Approach 1
As provided in the input, the number of slices in each type of pizza are provided in non-decreasing order.
So a simple greedy approach is to keep picking slices as long as the running sum of the collected slices does not exceed the amount M.
You can iterate either from the start to the end or from the end to the start.
This is the result I got from this submission.
The total time complexity of this approach will be O(N).
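The greedy pass above can be sketched like this (function and variable names are my own), iterating from the end so the largest pizzas are tried first:

```python
def greedy(m, slices):
    """Pick pizzas from largest to smallest while the running total stays within m.
    `slices` is assumed to be in non-decreasing order, as in the input."""
    total = 0
    chosen = []
    for i in range(len(slices) - 1, -1, -1):  # largest pizzas first
        if total + slices[i] <= m:
            total += slices[i]
            chosen.append(i)
    return total, sorted(chosen)

total, chosen = greedy(17, [2, 5, 6, 8])
# total == 16, chosen == [0, 2, 3]  (slices 2 + 6 + 8)
```

A single pass over the N items gives the O(N) complexity mentioned above.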
Approach 2
For a problem of picking items whose weights sum as close as possible to a given value M, a dynamic programming approach is the obvious choice. I tried writing the code for it but ran into a few problems because of the input size.
1. The time complexity is around O(N*M), which is not feasible with the given constraints.
2. To work around the space complexity, we certainly can’t use a 2-D array, so I tried using maps, allocating only as much memory as needed to store the valid previously computed values.
Even so, this approach timed out on the last two large inputs.
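One way to write the map/set-based DP described above is to track only the sums that are actually reachable, instead of a full 2-D table (this is my reconstruction, not the exact submitted code). With M up to 10^9 the set of reachable sums can still grow enormously, which is consistent with the timeouts on the large inputs:

```python
def subset_sum_dp(m, slices):
    """Subset-sum DP over the set of achievable totals not exceeding m.
    Memory grows with the number of distinct reachable sums, which can
    explode for large m -- hence the timeouts on the big test files."""
    reachable = {0}
    for s in slices:
        reachable |= {r + s for r in reachable if r + s <= m}
    return max(reachable)

best = subset_sum_dp(17, [2, 5, 6, 8])
# best == 16
```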
Approach 3
This was my best and final approach to date. Here is the algorithm:
- Iterate over the items one by one.
- For each item i, assume it is definitely included in our bag.
- If item i is picked, the remaining capacity is left_over_weight = M − slice_count[i].
- We also keep a binary search tree of (value, index) pairs, one for each item.
- We then search the tree for the largest value that does not exceed left_over_weight, which takes O(log N) per lookup.
- We keep doing this inside a while loop until left_over_weight reaches zero (or no remaining item fits).
- We repeat this for every item i, each time treating the i’th item as definitely present in our bag.
- Finally, we take the best combination found, i.e. the one whose total weight is nearest to M.
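The steps above can be sketched in Python as follows. This is my reconstruction of the described algorithm: a sorted list with `bisect` stands in for the binary search tree (a `std::set` in C++ would give true O(log N) deletions, whereas `list.pop` here is O(N)), and all names are my own:

```python
import bisect

def best_fill(m, slices):
    """For each starting item i, greedily fill the remaining capacity with
    the largest remaining slice count that still fits, then keep the best
    total over all choices of i. `slices` is in non-decreasing order."""
    best = 0
    for i in range(len(slices)):
        if slices[i] > m:
            break  # later items are even larger
        remaining = sorted(slices[:i] + slices[i + 1:])  # every item except i
        total = slices[i]
        left = m - slices[i]
        while remaining and left > 0:
            # index of the largest remaining value <= left
            j = bisect.bisect_right(remaining, left) - 1
            if j < 0:
                break  # nothing fits any more
            total += remaining[j]
            left -= remaining[j]
            remaining.pop(j)
        best = max(best, total)
    return best

# best_fill(17, [2, 5, 6, 8]) == 16
```

Note that this greedy filling is still a heuristic, which is why the result below falls slightly short of the true optimum.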
Let’s look at the results achieved from this approach.
Now, based on the input provided to us, the best possible answer we could achieve would be:
17 + 100 + 4500 + 1000000000 + 505000000 = 1505004617.
So the difference between the maximum achievable value and the output of our approach is 1505004617 − 1505004544 = 73.
I would appreciate any other efficient approach that gets closer to the maximum possible output. Please feel free to share your ideas as well.
Thanks for reading the article!