Monday, May 13, 2019

A fastest week

A TopCoder SRM was also the first event of the Apr 22 - Apr 28 week, more precisely SRM 756 (problems, results, top 5 on the left, analysis). Stonefeang was not the fastest on any problem, but very fast on all three, and therefore held a commanding coding phase lead that he defended during the challenges. Well done!

The next Open Cup 2018-19 round, the Grand Prix of Minsk, took place on Sunday (top 5 on the left). Many teams solved everything this time, but team Past Glory did this under 3 hours, and therefore earned a clear first place. Congratulations!

Finally, Google Code Jam 2019 Round 1B wrapped up the week (problems, results, top 5 on the left). Benq was a lot faster than everybody else this time, and even one incorrect attempt did not really put his first place in doubt. Great performance!

Thanks for reading, and check back for more.

A forethought week

The Apr 15 - Apr 21 week was quite busy. TopCoder SRM 755 kicked things off on Monday (problems, results, top 5 on the left, analysis). The problems were quite straightforward, therefore the coding speed and the challenge phase came to the forefront. Tourist did very well on both, and got a well-deserved first place. Congratulations!

TopCoder hosted a second round this week, the TCO 2019 Round 1A (problems, results, top 5 on the left, analysis). Most high-rated contestants got a bye to Round 2, therefore it was a chance to see some new names on the scoreboard. Congratulations to mrho888 on the great challenge phase performance and the win!

Codeforces ran the Elimination Round for the Forethought Future Cup right after the TCO round (problems, results, top 5 on the left, analysis). There were eight problems and therefore the round was slightly longer at 2:30, but this did not stop tourist from solving everything in just over an hour. Well done!

Finally, the Open Cup 2018-19 returned with the Grand Prix of Baltic Sea on Sunday (results, top 5 on the left). The team Gifted Infants solved 10 problems in just under three hours and nothing in the remaining two, and yet nobody could catch them. Congratulations on the victory!

Thanks for reading, and check back for more!

A postponed week

The Apr 8 - Apr 14 week was light on contests from the usual suspects, but Google Code Jam Round 1A filled the void (problems, results, top 5 on the left). Gennady.Korotkevich required just 26 minutes to wrap things up, with others following in approximately 5-minute increments. Congratulations to everybody who made it to Round 2!

In my previous summary, I have mentioned two problems. The first one came from the ICPC World Finals: you are given a street with n<=500 traffic lights on it, arranged from left to right. The i-th traffic light stays red for r_i seconds, then green for g_i seconds, then repeats this cycle infinitely. The cycles of all traffic lights start at the same time 0, r_i and g_i are integers, and r_i+g_i does not exceed 100. Suppose you approach this street from the left at a random moment in time, and drive until you encounter a red light for the first time, or until you pass the entire street. What is the probability that you see red for the first time at the 1st, 2nd, ..., n-th traffic light?

The solution gives a whole new meaning to the term "meet in the middle". First of all, suppose all periods r_i+g_i are pairwise relatively prime. In this case seeing red at each of the traffic lights are independent random events, and therefore our answers are r_1/(r_1+g_1), g_1/(r_1+g_1)*r_2/(r_2+g_2), ...
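
For this independent case, the chain of probabilities can be multiplied out directly. Here is a small sketch (the class and method names are mine, not from any actual contest solution), assuming the periods are pairwise relatively prime as described above:

```java
public class TrafficLights {
    // Probability of seeing red for the first time at each light, assuming all
    // periods r[i] + g[i] are pairwise relatively prime, so that the events
    // "light i is red when we arrive" are independent.
    static double[] firstRedProbabilities(int[] r, int[] g) {
        int n = r.length;
        double[] answer = new double[n];
        double passedSoFar = 1.0; // probability of passing all earlier lights
        for (int i = 0; i < n; ++i) {
            double red = (double) r[i] / (r[i] + g[i]);
            answer[i] = passedSoFar * red;
            passedSoFar *= 1.0 - red;
        }
        return answer;
    }

    public static void main(String[] args) {
        // Periods 2 and 3 are relatively prime: answers are 1/2 and (1/2)*(1/3).
        for (double p : firstRedProbabilities(new int[]{1, 1}, new int[]{1, 2})) {
            System.out.println(p);
        }
    }
}
```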

Moreover, an only slightly more complex solution exists when any two periods are either relatively prime, or one divides the other. In each group of periods where one divides another we pick the largest one, and go from left to right tracking which units of time modulo that largest period have already seen red; the probabilities for different groups can still simply be multiplied since they are independent.

However, we have arbitrary periods up to 100 in this problem, so the above condition does not hold. In order to overcome this, let's split the time into m parts: first part will have units 0, m, 2m, ..., the second part will have units 1, m+1, 2m+1, ..., and so on. Each of the parts can be "compressed" m times to obtain an instance of the same problem, but a period of p gets replaced by a period of p/gcd(p,m).

The only remaining task is to find a value of m which guarantees that for any two numbers p and q up to 100, the numbers p'=p/gcd(p,m) and q'=q/gcd(q,m) are always relatively prime or one divides the other. We can write a small program for that, and it turns out that m can be as small as 2520 = 2^3*3^2*5*7.
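
That small program can look roughly like this (a sketch with class and method names of my own choosing; it simply brute-forces the condition for all pairs up to 100):

```java
public class FindM {
    static long gcd(long a, long b) {
        return b == 0 ? a : gcd(b, a % b);
    }

    // Does this m guarantee that for all p, q up to limit, the reduced numbers
    // p' = p/gcd(p,m) and q' = q/gcd(q,m) are relatively prime or one divides
    // the other?
    static boolean works(long m, int limit) {
        for (int p = 1; p <= limit; ++p) {
            for (int q = 1; q <= limit; ++q) {
                long pp = p / gcd(p, m), qq = q / gcd(q, m);
                if (gcd(pp, qq) != 1 && pp % qq != 0 && qq % pp != 0) return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(works(2520, 100)); // 2520 = 2^3 * 3^2 * 5 * 7
    }
}
```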

More generally, if the numbers were up to x instead of 100, m could be equal to the product of the largest powers of primes not exceeding sqrt(x). Indeed, if a prime divides p', then the power of that prime in p exceeds its largest power not exceeding sqrt(x), and is therefore itself bigger than sqrt(x). So if p' had two different primes in its factorization, p would be a product of two prime powers each exceeding sqrt(x), which is impossible for p<=x. Hence each p' is a prime power, and two prime powers are always either relatively prime or one divides the other.

We have essentially decomposed the problem into m parts at the top, and that allowed us to split it into independent problems of size at most x at the bottom, hence the "meet in the middle" mention above. The running time of this solution is O(n*m*x), which is fast enough.

The second problem I mentioned came from Codeforces: you have n game units, each capable of causing 1 unit of damage per second. You need to merge them into m bigger game units, and if a_i units were merged into the i-th bigger unit, it will cause a_i units of damage per second. The m bigger units will then attack an enemy army also consisting of m units, with the j-th enemy unit having b_j hit points. The attack will happen second-by-second, and in each second each of your big units attacks one enemy unit. It is allowed for multiple units to attack one enemy unit in one second. What is the fastest way to decrease the hit points of all enemy units to zero or negative? You can choose the way you merge the n units into m big units, but you can do it only once, before all attacks.

The key idea in solving this problem is to postpone the choice of ai. Let's imagine we start with a single big unit of size n. Let it keep attacking the first enemy unit, until at some second its hit points drop to zero or below. If they dropped below zero, it means that we have wasted some attacking power; to avoid that, let us split this big unit into two now, one that can exactly finish the first enemy unit, and the other can now attack the second enemy unit during that second. In the following seconds, both units would keep attacking the second enemy unit, and when its hit points drop below zero, we will split one of our units again to avoid wasting attacking power.

We will end up with at most m+1 units after all splitting is done, since we split once per enemy unit, but we can notice that the last split was not necessary, so we can skip doing it and end up with m units as required. And we have clearly spent the smallest possible time, since we never wasted a single unit of damage during all seconds except the last one.
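
The argument above boils down to a simple formula: no damage is wasted except possibly in the last second, so the minimal time is just the total hit points divided by n, rounded up. Here is a sketch (helper names are hypothetical) that also cross-checks this against a direct second-by-second simulation of the greedy:

```java
public class MergeUnits {
    // Per the splitting argument: no damage is wasted except possibly in the
    // last second, so the answer is ceil(total hit points / n).
    static long minSeconds(long n, long[] b) {
        long total = 0;
        for (long hp : b) total += hp;
        return (total + n - 1) / n;
    }

    // Brute-force check: each second the full power n is directed at the first
    // still-alive enemy, spilling over to the next one within the same second
    // (the splits described above make this legal).
    static long simulate(long n, long[] b) {
        long[] hp = b.clone();
        int j = 0;
        long seconds = 0;
        while (j < hp.length) {
            long power = n;
            seconds++;
            while (j < hp.length && power > 0) {
                long dealt = Math.min(power, hp[j]);
                hp[j] -= dealt;
                power -= dealt;
                if (hp[j] == 0) j++;
            }
        }
        return seconds;
    }

    public static void main(String[] args) {
        long[] b = {5, 7, 3};
        // 15 total hit points at 4 damage per second: 4 seconds.
        System.out.println(minSeconds(4, b));
        System.out.println(simulate(4, b));
    }
}
```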

In some sense, the main obstacle in solving this problem is not realizing that it is actually easy :)

Thanks for reading, and check back for more!

Sunday, May 12, 2019

A double Moscow week

The first week of April was of course focused on the ICPC 2019 World Finals (problems, results, top 12 on the left, official broadcast, our stream, analysis). It feels strange to write a short summary given the amount of coverage available for the contest, but I'll do it anyway :) Warsaw took a huge early lead, but almost stopped after two hours, spending an eternity on the technically tricky problem C and still not solving it. The three pre-tournament favorites, Moscow, MIT and Tokyo, overtook them, but only Moscow could solve all 10 solvable problems. Congratulations to the winners and to all medalists!

Problem K was the most exciting for me, even though I didn't actually solve it myself. It went like this: you are given a street with n<=500 traffic lights on it, arranged from left to right. The i-th traffic light stays red for r_i seconds, then green for g_i seconds, then repeats this cycle infinitely. The cycles of all traffic lights start at the same time 0, r_i and g_i are integers, and r_i+g_i does not exceed 100. Suppose you approach this street from the left at a random moment in time, and drive until you encounter a red light for the first time, or until you pass the entire street. What is the probability that you see red for the first time at the 1st, 2nd, ..., n-th traffic light?

Codeforces Global Round 2 took place on Saturday, when most of the teams were already back home or on the way there (problems, results, top 5 on the left, analysis). ICPC gold medalist ecnerwala got enough points for the first place in less than an hour, even though our entire streaming team was trying to catch him as you can see :) Well done!

Problem G in this round was very nice. You have n game units, each capable of causing 1 unit of damage per second. You need to merge them into m bigger game units, and if a_i units were merged into the i-th bigger unit, it will cause a_i units of damage per second. The m bigger units will then attack an enemy army also consisting of m units, with the j-th enemy unit having b_j hit points. The attack will happen second-by-second, and in each second each of your big units attacks one enemy unit. It is allowed for multiple units to attack one enemy unit in one second. What is the fastest way to decrease the hit points of all enemy units to zero or negative? You can choose the way you merge the n units into m big units, but you can do it only once, before all attacks.

Thanks for reading, and check back for more!

Sunday, May 5, 2019

An order statistic week

The weekend of the Mar 25 - Mar 31 week was already the time to travel to Porto for ICPC World Finals for participants, but it still had a couple of rounds. TopCoder SRM 754 was the first one (problems, results, top 5 on the left, analysis). tourist's solution for the medium was challenged, which doesn't happen very often, but he more than made up for that by being the only contestant to solve the hard. Congratulations on the win!

Codeforces Round 549 followed a few hours later (problems, results, top 5 on the left, analysis). It was Um_nik's turn to dominate: he was the only one to solve all problems, but was already leading the field by a sizable margin after solving 4 out of 5. Well done!

In the previous summary, I have mentioned an AtCoder problem: we pick n independent uniformly random points on a circle, and then find two points the (circular/angular) distance between which is as close to 1/3 of the circle as possible. What is the expected value of the difference between that distance, expressed as a fraction of the circle, and 1/3? n is up to 10^6, and you need to compute the answer using division modulo 10^9+7.

The official editorial has a similar but not exactly the same explanation that does not require any googling to solve the problem. However, I'd like to share my approach as well.

Let's collapse the circle 3 times: in other words, project each point x between 1/3 and 2/3 to x-1/3, and each point x between 2/3 and 1 to x-2/3. The original question of finding two points with distance as close to 1/3 as possible almost translates into finding the two closest points in this projection: if two points x and y are close here, then they are either also close on the original circle (with probability 1/3), or the distance between them on the original circle is close to 1/3 (with probability 2/3), since a distance of 2/3 is also close to 1/3 when viewed from the other direction. Moreover, these probabilities of 1/3 are almost independent for pairs of consecutive points: it's impossible that all pairs of consecutive points are close on the original circle as well, but apart from that, the probability that any k<n consecutive pairs are each close on the original circle as well is equal to 1/3^k.

Now, in order to find the expected value of the difference between the sought distance and 1/3, define f(x) as the probability that this difference is >=x, then our answer is the integral of f(x).

Consider g(x,k), the probability that the k-th smallest distance (in sorted order, the so-called order statistic) between consecutive points out of n uniformly random ones is >=x. Then the probability that exactly k distances between consecutive points are less than x is g(x,k+1)-g(x,k) (we define g(x,0)=0 here). If the points are picked on a circle of size 1/3 instead of the unit circle, that probability is equal to g(3x,k+1)-g(3x,k). Finally, if we have k such distances on the 1/3 circle less than x, then in order to have our answer >=x, we need all of those small distances to be "also close" on the original circle, the probability of which is 1/3^k. Therefore we have established that f(x)=sum_k((g(3x,k+1)-g(3x,k))/3^k).

Since we're interested in the integral of f(x), we can swap summation and integration in the above formula, and learn that we're interested in sum_k((int(g(3x,k+1))-int(g(3x,k)))/3^k)=sum_k((int(g(x,k+1))-int(g(x,k)))/3^(k+1)), where substituting x for 3x in the integral contributes an extra factor of 1/3.

Finally, int(g(x,k)) is simply the expected value of the k-th order statistic of distances between consecutive points when n uniformly random points are chosen on a circle. It turns out it's possible to google it: int(g(x,k))=1/n*(1/(n-k+1)+1/(n-k+2)+...+1/n).

Most of the terms cancel out, and we get int(g(x,k+1))-int(g(x,k))=1/(n*(n-k)), and int(f(x))=sum_k(1/(n*(n-k)*3^(k+1))), where the summation is done over all values of k from 0 to n-1, which solves our problem, and the actual code we submit just computes this sum.
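
Computing that sum modulo 10^9+7 is straightforward with modular inverses. A sketch (names are my own; for the real constraint of n up to 10^6 one would precompute all the inverses in batch instead of calling modular exponentiation n times):

```java
public class ClosestToThird {
    static final long MOD = 1_000_000_007L;

    // Modular exponentiation by squaring.
    static long power(long a, long e) {
        long r = 1;
        a %= MOD;
        while (e > 0) {
            if ((e & 1) == 1) r = r * a % MOD;
            a = a * a % MOD;
            e >>= 1;
        }
        return r;
    }

    // Modular inverse via Fermat's little theorem (MOD is prime).
    static long inverse(long a) {
        return power(a, MOD - 2);
    }

    // int(f(x)) = sum over k = 0..n-1 of 1 / (n * (n-k) * 3^(k+1)), mod 10^9+7.
    static long answer(int n) {
        long sum = 0;
        long invN = inverse(n);
        long inv3 = inverse(3);
        long pow3 = inv3; // 1 / 3^(k+1), starting at 1/3 for k = 0
        for (int k = 0; k < n; ++k) {
            sum = (sum + invN * inverse(n - k) % MOD * pow3) % MOD;
            pow3 = pow3 * inv3 % MOD;
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(answer(1)); // 1/3 modulo 10^9+7
    }
}
```

As a sanity check, for n=2 the sum is 1/12 + 1/18 = 5/36, so the modular answer times 36 should be 5 modulo 10^9+7.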

As I was coming up with this solution, I did not just invent it from top to bottom. Instead, I was digging from both ends: first, I've noticed that the setting is similar to, but not exactly the same as, finding the smallest distance between n random points on a circle of size 1/3. Then, I've googled that such distance has a known closed-form expression, and the same is true for the k-th order statistic. Then, I've noticed that knowing how the k-th order statistic behaves allows us to split the probability space into areas where we just need to multiply by 1/3^k. Finally, I've put the pieces together, verified that swapping summation and integration checks out, implemented the solution, and got wrong answer on all samples: I forgot the extra 1/3 that comes from the fact that our circle is of size 1/3 instead of 1, and thus had 3^k instead of 3^(k+1) in the formula. The contest time was running out, and I had almost given up hope, but still started to make small changes to the code expecting an issue in that vein; 3^(k-1) didn't help but 3^(k+1) did, and I very happily submitted with just 1 minute left in the round :)

Thanks for reading, and check back for more!

Thursday, May 2, 2019

A close to third week

TopCoder SRM 753 opened the Mar 18 - Mar 24 week in competitive programming (problems, results, top 5 on the left, analysis). Just ecnerwal and mnbvmar solved all three problems, and ecnerwal was so much faster that the challenge phase did not really matter. Congratulations on the victory!

AtCoder Grand Contest 032 took place on Saturday (problems, results, top 5 on the left, analysis). ksun48 solved problem F in just 30 minutes, and tourist solved problem E in just 15 minutes, but nobody was able to solve both during the round. Therefore the advantage went to contestants who went for F, and ksun48 was by far the fastest of those. Well done!

The hardest problem F had a very simple-looking statement. We pick n independent uniformly random points on a circle, and then find two points the (circular/angular) distance between which is as close to 1/3 of the circle as possible. What is the expected value of the difference between that distance, expressed as a fraction of the circle, and 1/3? n is up to 10^6, and you need to compute the answer using division modulo 10^9+7.

Open Cup 2018-19 Grand Prix of Moscow was the last Open Cup round before the ICPC 2019 World Finals (results, top 5 on the left). The Moscow SU team set the problems, but many other top ICPC teams competed. Three future gold medalists appear on the screenshot on the left :) Team Past Glory was head and shoulders above everybody, being the only team to finish all 13 problems, and doing it in just 3:40. Amazing dominance!

Thanks for reading, and check back for more.

Sunday, April 28, 2019

A week of 715

AtCoder Grand Contest 031 was the main event of the Mar 11 - Mar 17 week (problems, results, top 5 on the left, analysis). I was stuck for a long time trying to solve any of the last three problems, finally getting F in with some serious squeezing into the time limit. eatmore solved that problem and had enough time and energy left for problem D, thus winning a clear first place. Congratulations!

Said squeezing was required because I over-complicated a simple step of the solution. There was an odd number m up to 10^6, and up to 10^5 queries (a, b), and I needed to check for each query whether there exists a number k such that a*4^k = b modulo m. I ended up grouping queries (a, b) by gcd(a, b); then in each group we need to answer discrete logarithm queries "is this number a power of 4 modulo q=m/gcd(a,b)", which can be done using the baby-step giant-step algorithm. This was still not fast enough, but we can actually do the giant steps only once per group, and the baby steps once per query; then the optimal size of a giant step (and the overall complexity of answering one query) becomes not sqrt(q) but sqrt(q/c), where c is the number of queries in the group, and the solution runs in time.

Of course, all this was completely unnecessary and I should've just found connected components in the graph where edges connect x with 4*x modulo m :)
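
Since m is odd, 4 is invertible modulo m, so multiplication by 4 is a permutation of the residues, and the connected components are exactly its cycles. A sketch of that simple approach (class and method names are mine):

```java
import java.util.*;

public class PowerOfFour {
    // Label each residue modulo m with the cycle of the permutation
    // x -> 4*x mod m that it belongs to. A query (a, b) then asks whether
    // comp[a % m] == comp[b % m].
    static int[] componentLabels(int m) {
        int[] comp = new int[m];
        Arrays.fill(comp, -1);
        int label = 0;
        for (int start = 0; start < m; ++start) {
            if (comp[start] != -1) continue;
            // Walk the cycle containing start, labeling every residue on it.
            int x = start;
            while (comp[x] == -1) {
                comp[x] = label;
                x = (int) (4L * x % m);
            }
            label++;
        }
        return comp;
    }

    public static void main(String[] args) {
        // Modulo 9, the cycle of 1 is 1 -> 4 -> 7 -> 1, so (1, 7) is a yes
        // (indeed 1*4^2 = 16 = 7 mod 9), while (1, 2) is a no.
        int[] comp = componentLabels(9);
        System.out.println(comp[1] == comp[7]);
        System.out.println(comp[1] == comp[2]);
    }
}
```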

In the previous summary, I have mentioned an Open Cup problem: let's define f(x) as the smallest non-negative number that can be obtained by placing + or - before each digit in the decimal representation of x and computing the resulting sum. What is the sum of all numbers x between l and r that have f(x) equal to 0, 1, ..., 9, modulo 10^9+7? l and r have at most 100 digits, and there are 10000 testcases to solve in 2 seconds.

The first step is pretty typical for such "between l and r" problems: let's compute the answers for segments [0,r] and [0,l-1] and get the final answer by subtraction. The second step is as typical: let's build our number x from left to right, then as soon as the next digit is smaller than the corresponding digit of r, the remaining digits can be arbitrary and we can use dynamic programming to share computations within testcase and between testcases.

The dynamic programming state will consist of two parts: the number of remaining digits, and something describing the digits already chosen. However, it's not entirely clear what that something should be :) As a first approximation, we would need to store the set of numbers achievable by placing + or - before all digits already chosen. However, with 100 digits the achievable numbers would be up to 900, so we'd have something like 2^901 states.

Here we need to become more practical. First, it seems natural to expect that there's no need to use very big intermediate sums, so we can take a guess and introduce a cap of, say, 100. This still leaves us with at most 2^100 states. Most of those states are not reachable, though, so we can write a small program that generates all reachable states, and learn that there are only 23108 states with a cap of 100, and 65618 states with a cap of 200 (the program uses BigIntegers as bitmasks):

import java.math.BigInteger;
import java.util.*;

public class PlusMinusSums {
    static final int BUBEN = 200;
    static final BigInteger MASK = BigInteger.ONE.shiftLeft(BUBEN).subtract(BigInteger.ONE);

    public static void main(String[] args) {
        ArrayDeque<BigInteger> queue = new ArrayDeque<>();
        Map<BigInteger, List<BigInteger>> stateToMoves = new HashMap<>();
        queue.push(BigInteger.ONE);
        stateToMoves.put(BigInteger.ONE, null);
        while (!queue.isEmpty()) {
            BigInteger cur = queue.poll();
            List<BigInteger> moves = new ArrayList<>();
            for (int digit = 0; digit < 10; ++digit) {
                BigInteger next = makeMove(cur, digit);
                if (!stateToMoves.containsKey(next)) {
                    queue.add(next);
                    stateToMoves.put(next, null);
                }
                moves.add(next);
            }
            stateToMoves.put(cur, moves);
        }
        System.err.println(stateToMoves.size());
    }

    private static BigInteger makeMove(BigInteger cur, int digit) {
        BigInteger ifAdd = cur.shiftLeft(digit);
        BigInteger ifSub = cur.shiftRight(digit);
        BigInteger ifSubMinus = BigInteger.valueOf(Integer.reverse(cur.intValue() & ((1 << digit) - 1)) >>> (31 - digit));
        return ifAdd.or(ifSub).or(ifSubMinus).and(MASK);
    }
}

This is already quite promising, but still not enough to fit in the time limit. But we can take this idea of automatically discovering the state space further, and notice that many of the states we have are actually equivalent. All that matters in the end is the lowest bit set in the final state, and that bit is always between 0 and 9. Therefore, we can do the following process, which is essentially automaton minimization: first, color all states in 10 colors based on the lowest bit set in them. Then repeatedly iterate over all states, and create a new coloring based on the following 11-tuple of colors for each state: the color of this state itself, and the colors of the 10 states we can reach from it. After some number of iterations the number of colors stops changing, which means we have found the equivalence classes of states. It turns out we have only 715 different states in this problem! Here's the code to be inserted into the main() method above:

Map<BigInteger, Long> stateToKey = new HashMap<>();
for (BigInteger x : stateToMoves.keySet()) {
    long key = x.getLowestSetBit();
    if (key >= 10) throw new RuntimeException();
    stateToKey.put(x, key);
}
int numStates = 10;
Random random = new Random(54815353151L);
while (true) {
    long base = random.nextLong() * 2 + 1;
    Map<BigInteger, Long> newStateToKey = new HashMap<>();
    Set<Long> newKeys = new HashSet<>();
    for (BigInteger x : stateToMoves.keySet()) {
        long key = stateToKey.get(x);
        for (BigInteger dest : stateToMoves.get(x)) {
            long childKey = stateToKey.get(dest);
            key = key * base + childKey;
        }
        newKeys.add(key);
        newStateToKey.put(x, key);
    }
    if (newKeys.size() == numStates) break;
    if (newKeys.size() < numStates) throw new RuntimeException();
    numStates = newKeys.size();
    stateToKey = newStateToKey;
    System.err.println(numStates);
}

Having just 715 states makes the overall solution run in time. Moreover, now we can also check the validity of our original assumption: we can notice that the number of different states does not change if we restrict the intermediate sums to 100 or 200, strongly suggesting that we have found all of them.

Thanks for reading, and check back for more!