## Monday, October 26, 2009

### A difficult problem ten years later

Here's another Russian IOI camp problem, this time one of the most difficult problems from the 1999 summer camp. Only one contestant was able to solve it back then. I wasn't. How does it look ten years later? I have test data in case someone wants to actually write the solution :)

You are given eight pairs of numbers, x1, y1, x2, y2, ..., x8, y8. You need to find eight numbers z1, z2, ..., z8 such that (x1,y1,z1), (x2,y2,z2), ..., (x8,y8,z8) are the coordinates of the vertices of a cube with an integer side length, or to report that such numbers do not exist.

Apart from interesting problems and breathtaking competition, algorithm contests have their own astonishing personalities. Let me introduce today's hero: Gennady Korotkevich. His name is also sometimes spelled Henadzi Karatkevich.

He is the 7th-ranked TopCoder Algorithm contestant. He won IOI 2009 (achieving the highest score, not just a gold medal, which he had already done several times before). He beat my team (which included another TopCoder ex-target!) at an ICPC-style contest this September. And he topped all Belarusian teams in the Western Quarterfinal of NEERC 2009.

And yet he's only 14. That means he has 2 or 3 more IOIs to go. That also means I have three more TopCoder Open tournaments to fight for before he arrives :)

What feels so good about Gena is that he doesn't seem to lose the rest of his life to programming contest training, at least according to the above interview. He reminds me of myself, but of course on a greater scale. While I'm sure he trains a lot, it's important that becoming a world-class master doesn't badly affect the other aspects of his life.

Gennady, if you're reading this, I challenge you for a game of soccer or table tennis :)

More generally, I think this example supports the idea that the skillset (or maybe talent? that's probably a question for a separate discussion) that programming challenges require is unique and only partially related to a higher education in CS or mathematics. And seeing how many big companies value that skillset in job interviews, it seems important enough that the education of future software engineers could borrow something from algorithm contests. Another possibility, of course, is that algorithm contest puzzles were just lucky to get into the limelight, and will simply go away (or become a pure sport) at some point, when the big companies and universities discover a better way to educate and recognize future engineers - somewhat like Formula 1, which is becoming less and less relevant to the development of ordinary cars.

## Sunday, October 25, 2009

### Torrent for screencasts

Okay, time to revive the blog.

SRM 446: Torrent, Direct Streaming, Direct AVI, Direct MOV. First of two consecutive SRMs where I didn't manage to solve all three problems, so you get to see me frantically trying to finish the medium problem until the end of the contest, only to need several more minutes :( A nice successful challenge as well - the medium problem solutions were of the kind where you know they're wrong, but coming up with a challenge case is a problem in itself. Try staring at the same solutions as I did and challenge them yourself :)

SRM 447 was a real disaster. Ironically, I was in such bad form that I even forgot to turn on the screencast recording.

SRM 448: Torrent, Direct Streaming, Direct AVI, Direct MOV. Back to the alarmingly usual 2nd-place form. The most interesting part to watch is probably squeezing the hard problem under the time limit (the final version ran in 1.8s in the worst case; the first one couldn't even solve the 40-card case within 2s). And maybe handing the victory to ACRush in the challenge phase by making an unnecessary unsuccessful challenge.

Later, the laptop I've been using for recording screencasts got soaked in water and no longer works. While I'm waiting for it to be repaired, there will probably be no new screencasts, sorry.

Now, time to talk about something more serious. Stay tuned for philosophical posts about Gennady Korotkevich, Oleg Hristenko and OpenCup, and Moscow State University ACM preparation.

## Thursday, July 9, 2009

### SRM 444 + answers to previous questions

Here's the SRM 444 screencast: AVI from narod.ru.

Summary: this screencast may seem a bit "slow motion". Somehow it took me much more time than usual to solve the problems (which I think were nice, although probably too heavy on "standard" ideas, and I couldn't understand the statement of the easy problem for quite some time) - heh, it's strange to be sleepy at 3pm - but of course it's ACRush's superb performance that we should blame for the 500+ point margin :) Good job!

I wonder if there exists a non-exponential solution for the medium problem. Intuitively, it seems like with only one kind of block there might be some matching-style algorithm to solve it, but maybe the intuition is wrong. A reduction of an NP-complete problem to this problem would also be of interest, though :)

And here are the answers for the questions posted previously on this blog:

1) Why did the SRM 442 screencast get interrupted in the middle? Imagine that you've coded the solution and submitted it almost without testing. Then you realize you're not sure whether you should exchange i and j in one place in your code. You make that change, and see that the code still passes all sample tests. You compile it in the arena, and find a test where it seems to break. You then change the indexes back and compile in the arena again to see that the old version works on that test case. After all these changes, you realize that you don't remember which of the two versions you actually submitted! How to find out? Easy! Stop the screencast, open the recorded video and find the submission time :)

2) How to find the last 10000 digits of x=a/b in base 31? First, if b ends with 0, then a must also end with 0, and we can remove both zeros without changing the answer. Repeat until the last digit of b is nonzero. Then, how do we find the last digit x0 of the answer? Obviously, when multiplied by the last digit of b, it should give the last digit of a. And exactly one digit satisfies that equation, since 31 is prime! Then we subtract b*x0 from a, and we can find the next-to-last digit x1 of the answer from the fact that, multiplied by the last digit of b, it should give the next-to-last digit of a-b*x0 (the last digit of a-b*x0 is zero). We then repeat this process k times. The final note is that one doesn't need to compute a-b*x0 completely, since only its last k digits matter for the rest of the process; thus each step of the algorithm takes O(k), and the entire algorithm takes O(k^2).

Here's sample code that assumes a and b are given as arrays of digits with 0-th item representing the least significant digit (so the numbers are 'reversed'):
```
int start = 0;
while (b[start] == 0)
    ++start;
for (int i = 0; i < k; ++i) {
    int x;
    for (x = 0; x < 31; ++x)
        if (x * b[start] % 31 == a[start + i])
            break;
    res[i] = x;

    // Subtraction
    int carry = 0;
    for (int j = start + i; j < start + k; ++j) {
        carry = carry + a[j] - b[j - i] * x;
        if (carry < 0) {
            int ncarry = (carry - 31 + 1) / 31;
            a[j] = carry - ncarry * 31;
            carry = ncarry;
        } else {
            a[j] = carry;
            carry = 0;
        }
    }
}
```

## Friday, July 3, 2009

### A funny problem

Here's a funny problem that I've composed for the Russian IOI training camp that's happening now.

Given two numbers a and b in base 31 with at most one million digits each such that a is a multiple of b, find the last k digits (again, in base 31) of a/b, where k is not more than 10000.

## Wednesday, June 24, 2009

### SRM 443

Here's the screencast: Streaming, AVI, MOV, AVI from narod.ru.

Once again the easy-hard-medium strategy worked quite well, although it was a bit more nervous this time (a submission with about 2 minutes to go in the coding phase, and a successful challenge with about 20 seconds to go in the challenge phase). It's funny how I chose over-complicated solutions for the easy and hard problems (well, for the hard the main idea is common to most solutions, but my implementation is too long and it took me too many iterations to even get that one), but got the easiest-to-implement one for the medium under time pressure :)

The hard problem of the round is a reasonably complex example of how matrix multiplication can chime in to speed up DP. If you want to learn more advanced DP techniques, this problem might give you a level-up :)
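To illustrate the underlying trick on a toy recurrence (not the actual round's problem): when a DP state depends linearly on the previous state, the transition can be packed into a matrix and raised to a power in O(log n) multiplications. A minimal Python sketch, using Fibonacci as the example:

```python
def mat_mult(A, B, mod):
    """Multiply two matrices modulo mod."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) % mod
             for j in range(p)] for i in range(n)]

def mat_pow(M, e, mod):
    """Raise a square matrix to power e by binary exponentiation."""
    n = len(M)
    R = [[int(i == j) for j in range(n)] for i in range(n)]  # identity
    while e:
        if e & 1:
            R = mat_mult(R, M, mod)
        M = mat_mult(M, M, mod)
        e >>= 1
    return R

def fib(n, mod=10**9 + 7):
    # f(n) = f(n-1) + f(n-2), computed in O(log n) matrix multiplications
    if n == 0:
        return 0
    M = mat_pow([[1, 1], [1, 0]], n - 1, mod)
    return M[0][0]
```

The same skeleton handles any fixed-size linear DP transition; only the matrix changes.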

## Sunday, June 14, 2009

### A lot more screencasts

I haven't been updating the blog with the more recent TopCoder screencasts. I can't remember any spectacular parts of them except the last one, since the matches were long ago, but here they are anyway:

TopCoder Open 2009 Round 4 (Streaming, AVI, MOV, AVI from narod.ru). Resubmissions on all 3 problems (if I remember correctly); I was just lucky that nobody else got all three.

TopCoder Open 2009 Round 5 (Streaming, AVI, MOV, AVI from narod.ru). Tried my best and made almost no mistakes, but ACRush spent so little time on the hard. No challenges either, so the screencast is probably not so spectacular.

SRM 437 (Streaming, AVI, MOV, AVI from narod.ru). Couldn't figure out all the cases in the hard problem, although the approach I chose should be doable. Also underperformed during the challenge phase - but to be fair (or self-assuring), I didn't even have as many wrong 250 solutions in my room as ACRush challenged, so there was little or no chance to fight for the win.

SRM 440 (Streaming, AVI, MOV, AVI from narod.ru). Wrote a reasonably logical solution for the hard - but rem invented a brilliant way to express it in very simple terms, and thus his score is almost twice mine. Just one challenge with a seemingly good pre-prepared case - too small a boundary in a binary search - and then a couple of unsuccessful challenges on possible precision problems.

SRM 441 (Streaming, AVI, MOV, AVI from narod.ru). Once again my score on the hard is significantly lower than the best one, due to me implementing an easy-to-invent but difficult-to-write solution - but luckily, ACRush resubmitted the medium. Several good challenges on a pre-prepared case (forgetting about isolated vertices) - one more confirmation that such cases can be crucial when they exist.

SRM 442 (Streaming, AVI, MOV, AVI from narod.ru). Implementing the "TCO resolution": try unusual strategies many times to see how they perform in the long run. Easy-Hard-Medium this time; it turned out not to matter much, since it was rather easy to solve all three (although rather hard not to make a bug in the medium). The screencast has an unusual feature at 48:11 - I had to stop the recording and restart it. Can anybody guess why? (The screencast is no spoiler for this puzzle, so you can look at its nearby parts while figuring out the answer.)

## Friday, June 5, 2009

### TopCoder Open 2009 - Going Back

So that's how it goes. I've fallen into the same trap for the second consecutive year: trying to solve the medium problem when the hard problem is actually easier, and lacking some 5-10 more minutes to finish the hard problem after that. Well, better luck next time, and congratulations to crazyb0y, UdH-WiNGeR and marek.cygan!

What are your impressions of the Final round and overall of the TopCoder Open?

## Thursday, June 4, 2009

### TopCoder Open 2009 - First Two Days

So how does it go here at the TCO? Well, I have to admit that the atmosphere is quite a bit more 'neutral' than usual. There are few people talking to the sponsors, there are just one or two groups talking to each other at any given moment, and people spend quite a bit of time with their own laptops. Why? Maybe it's the reduced number of Algorithm competitors, who might be more talkative than others; maybe it's that most competitors don't actually compete here, since the results for most tracks were ready before the start of the event, so they just don't come to the Arena and explore Las Vegas attractions instead. I'm sure that TopCoder will find a way to address this problem, since every TopCoder event has always been very exciting, and they should become even better, not worse :)

Speaking of yesterday's semifinal round (Results) - all went well. I didn't make mistakes, I had time to stress-test my solutions for both the second and third problems, and I was lucky to find a reasonably easy-to-write approach for the third problem (I used Arena compiling and testing a lot to run dummy programs that helped me see which approaches would run in reasonable time - it's probably the first time I've done that at an onsite round).

And here are a couple more lousy videos. The first one shows the announcement of the semifinal's results (press the "HD" button if you want to see the handles):

And this one shows the breakfast room today - just to give you a picture of a TCO place other than the Arena. Notice how people are grouped by language (Spanish, English, Chinese, Russian):

## Tuesday, June 2, 2009

### TopCoder Open 2009 - Marathon Introduction

Before the start of each onsite competition, there's a brief introduction for each competitor. Here's how it happened for the first competition of the TCO, the Marathon Match track:

Any suggestions for future movies?

### TopCoder Open 2009 - Arena Room Overview

Hi all,

here's what the arena room looked like yesterday during the welcome reception. Lousy quality of both video and audio, but hopefully still interesting for those of you who're not in Vegas now :)

## Saturday, May 30, 2009

### TopCoder Open 2009

The TopCoder Open onsite finals are a unique opportunity where one can blog about daily events and still stay in the style of this blog (which I hope to keep exclusively about programming competitions). I'll try to post news/thoughts/videos here, although of course the official http://www.topcoder.com/tco09/blog/ will be more interesting and diverse :)

At all my previous tournaments the Algorithm track participants were the majority, but that will not be the case this year, so the atmosphere may turn out quite different. Anyway, I'm sure it will be fun.

[posted from New York - JFK]

## Tuesday, April 21, 2009

### ICPC 2009 World Finals

Here're the problem statements: http://cm2prod.baylor.edu/resources/pdf/2009Problems.pdf (mirrored at http://77.41.63.3/icpc2009/statements/problems.pdf)
Here's the SnarkNews coverage: http://acm.math.spbu.ru/~snark/finals/

I'll try to post some analysis here.

Problem A.
First, we loop over the 8! possible orderings for the landings of the planes. Then we binary search on the answer. Having fixed the answer, we simply land each plane, in the chosen order, at the earliest possible time.
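If I recall the statement correctly, each plane has a time window for landing and we maximize the minimum gap between consecutive landings; under that assumption, the approach above can be sketched like this (the window format and search bounds are my own):

```python
from itertools import permutations

def max_min_gap(windows):
    """windows: list of (earliest, latest) landing times per plane.
       Returns the largest achievable minimum gap between landings."""
    def feasible(order, gap):
        t = None
        for i in order:
            lo, hi = windows[i]
            # land as early as allowed, but at least `gap` after the previous plane
            t = lo if t is None else max(lo, t + gap)
            if t > hi:
                return False
        return True

    best = 0.0
    for order in permutations(range(len(windows))):
        lo, hi = 0.0, 1e9
        for _ in range(60):  # binary search on the answer
            mid = (lo + hi) / 2
            if feasible(order, mid):
                lo = mid
            else:
                hi = mid
        best = max(best, lo)
    return best
```

For example, three planes that can all land anywhere in [0, 10] are best spread as 0, 5, 10, for a minimum gap of 5.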

Problem B.
This problem requires careful checking of all boundary cases, and doesn't seem to require any particular insight. One just verifies the output of the original scheme on all given inputs, as well as the output of all "wrong" schemes that can be obtained from the given one by introducing one mistake (since there are no more than 19 gates, there are no more than 57 such schemes), and then checks if the output can be uniquely determined. But there surely is some trickiness to implementing that.

Problem C.
This is a very well-known problem if the ant moves along the surface of the cube; the octahedron doesn't change much. We try all possible ways to lay out the octahedron's sides on the plane (first place the first side, then place any of its neighbors next to it, and so on). The number of possible layouts should be on the order of hundreds or thousands. And in at least one layout, the shortest path will be just a straight segment - so we need to check those segments in all layouts, throw away those that go outside the layout, and then find the shortest one of the remaining. This solution should work as long as no shortest path passes through the same side twice; I don't have a proof for that, but intuitively it seems true. The implementation will require a LOT of accuracy, though.

Problem D.
One thing that comes to mind is to draw all possible layouts on paper, and figure out the formulas for the answer in each of them. But that's very error-prone, so I hope there's a more concise solution.

Problem E.
Take any vertex in the graph. Let's say it has a "nice left" if every path from the start to that vertex has the same cost. Let's say it has a "nice right" if every path from that vertex to the end has the same cost.
Both properties can be determined with DFS.
If there's a vertex without both nice left and nice right, then there's no solution: at least one of the roads to the right of it will have to be modified, and at least one of the roads to the left of it will have to be modified - so there will be a path with two modified roads.
Otherwise, the vertices split into 3 sets: A - vertices with a nice left but no nice right, B - vertices with both a nice left and a nice right, and C - vertices with a nice right but no nice left. If both A and C are empty, the constraints are already satisfied and there's no need to change anything. Otherwise, A will always contain the start vertex, and C will always contain the end vertex.
Let's assume for now that B is empty. The above argument about no path with two modified roads tells us that the only way to achieve what's required is to modify the roads that lead from A to C. More specifically: take all pairs of vertices a from A and c from C where there exists an edge from a to c. The cost of that edge plus the cost of the path from start to a and from c to end is a lower bound for the answer. Take the highest of those lower bounds, say H, and change the cost of each edge between a from A and c from C to H-cost(start,a)-cost(c,end). This is quite obviously the minimal possible answer, and every path will have the same cost of H, since it passes along exactly one edge from A to C.
Now, how to deal with B? It turns out that we can simply put all vertices from B into C, and the above solution will still work - since it always leaves at least one route from start to end unchanged, it can't produce a worse-than-optimal answer; and it will always produce a correct answer, since the only thing it requires is for all vertices in A to have a "nice left" and all vertices in C to have a "nice right".
Does that make sense? :)

Problem F.
First, let's learn how to calculate the length of one fence required to enclose a given set of points. To find that fence, we compute the convex hull of the set of points and then "extend" it outward by the given margin. The result may have a complex structure of straight and circular segments, but its total length is easily computed: the total length of the straight segments equals the perimeter of the convex hull, and the total length of the circular segments equals one full circle, 2*pi*margin (because you have to turn by a total of 2*pi to complete the cycle, and your direction doesn't change while moving in a straight line).
The rest of the problem is standard: we do a DP that calculates the minimum possible cost for each subset of points, by choosing a subset of that subset to be contained by one fence, and taking the previously-computed DP value for the rest of the points.
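A sketch of this in Python, under the assumption that a fence's cost is just its length (hull perimeter plus one full circle of the margin radius); all helper names are mine:

```python
import math

def hull_perimeter(pts):
    """Perimeter of the convex hull via the monotone chain (0 for one point)."""
    pts = sorted(set(pts))
    if len(pts) == 1:
        return 0.0
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    def half(points):
        h = []
        for p in points:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    hull = half(pts)[:-1] + half(pts[::-1])[:-1]
    return sum(math.dist(hull[i], hull[(i+1) % len(hull)])
               for i in range(len(hull)))

def min_total_fence(points, margin):
    """DP over subsets: cheapest way to enclose all points with fences."""
    n = len(points)
    def fence_len(mask):
        sub = [points[i] for i in range(n) if mask >> i & 1]
        return hull_perimeter(sub) + 2 * math.pi * margin
    best = [math.inf] * (1 << n)
    best[0] = 0.0
    for mask in range(1, 1 << n):
        low = mask & -mask  # the lowest remaining point always joins the new fence
        sub = mask
        while sub:
            if sub & low:
                best[mask] = min(best[mask], fence_len(sub) + best[mask ^ sub])
            sub = (sub - 1) & mask
    return best[(1 << n) - 1]
```

Forcing the lowest set bit into the chosen sub-subset avoids counting the same partition twice, giving the usual 3^n submask enumeration.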

Problem G.
I think this problem just requires one to implement the game's rules carefully. The number of possible states in the game is reasonably small: a position is characterized by the number of cards still lying in the row, the cards held by both players, if any, and the layout of the "top line" of the house of cards - which will always include no more than 8 cards. This gives us a rough estimate of C(26,10)*26 states, which is roughly 100 million - but in reality far fewer states will be reachable, since the top line only contains 8 cards in one case, and so on.
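For the record, the estimate evaluates to about 138 million, on the order of the rough figure above:

```python
import math

# the rough state-count estimate C(26,10) * 26 from the analysis
states = math.comb(26, 10) * 26
print(states)  # 138105110
```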

Problem H.
If we introduce boolean variables x1,x2,... for the decisions on the bills, then satisfying all ministers can be formulated via statements of the form "if x5 is false, then x7 is true" (since if at least one decision contradicts a minister's choices, then all his other choices must be respected). That gives us an instance of the 2-CNF satisfiability (2-SAT) problem, whose solution is well-known. Checking which variables have the same value in every solution is also quite routinely added to that well-known solution (for example, we fix that variable's value and run the satisfiability check again).
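A compact SCC-based 2-SAT solver in Python, sketched from the standard construction (the clause encoding and names are mine, not the judges'):

```python
import sys

def two_sat(n, clauses):
    """clauses: list of pairs of literals; literal +i means x_{i-1} is true,
       -i means it is false. Returns a satisfying assignment or None."""
    sys.setrecursionlimit(10000)
    adj = [[] for _ in range(2 * n)]  # node 2v: x_v true, node 2v+1: x_v false
    def node(lit):
        v = abs(lit) - 1
        return 2 * v if lit > 0 else 2 * v + 1
    for a, b in clauses:  # (a or b) gives implications ~a -> b and ~b -> a
        adj[node(-a)].append(node(b))
        adj[node(-b)].append(node(a))
    # Tarjan's SCC; components get numbered in reverse topological order
    index = [None] * (2 * n)
    low = [0] * (2 * n)
    comp = [-1] * (2 * n)
    stack, onstack = [], [False] * (2 * n)
    counter, ncomp = [0], [0]
    def dfs(u):
        index[u] = low[u] = counter[0]; counter[0] += 1
        stack.append(u); onstack[u] = True
        for v in adj[u]:
            if index[v] is None:
                dfs(v); low[u] = min(low[u], low[v])
            elif onstack[v]:
                low[u] = min(low[u], index[v])
        if low[u] == index[u]:
            while True:
                w = stack.pop(); onstack[w] = False; comp[w] = ncomp[0]
                if w == u:
                    break
            ncomp[0] += 1
    for u in range(2 * n):
        if index[u] is None:
            dfs(u)
    if any(comp[2 * v] == comp[2 * v + 1] for v in range(n)):
        return None  # x and ~x in one SCC: unsatisfiable
    # a variable is true iff its "true" literal lies closer to a sink
    return [comp[2 * v] < comp[2 * v + 1] for v in range(n)]
```

To find variables that take the same value in every solution, one can then add a unit clause forcing the opposite value and re-run the check, as described above.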

Problem I.
Another implementation problem. One should understand the problem statement, and then simulate all resizes going from the outermost window inward.

Problem J.
We can try solving this with binary search + DP. We binary search over the answer, then do a DP on subtrees. For each subtree, consider all possible roundings that don't contradict the chosen answer (that is, the paths completely inside the subtree don't change by more than that answer), and among those roundings, we're interested in the lowest and highest differences for paths from the root of that subtree to its vertices (these two values are the only ones that affect paths passing through the root of that subtree). So the DP can be: for a subtree T and a lower bound L on the difference on the paths from its root, what is the lowest possible upper bound U on that difference? This gives us only at most 100*30*100 states, and I think all states can be processed in at most 100*30*100 time (since there are only 100 edges, and "updating" along each edge takes one scan of a 30*100 array holding the U's for each L), thus even a binary search outside still leaves the running time reasonable.
I know this is very unclear, but I think the only way to figure out the details is to actually start coding the solution, and I'm too lazy for that :)
A shot at clarifying: for a subtree, there're 3 parameters:
1) the maximum modulus of delta for paths inside that subtree
2) the maximum delta for paths from the root of that subtree into the subtree
3) the minimum delta for paths from the root of that subtree into the subtree
Without the binary search, we'd be forced to add the 1st parameter to the DP, and that would make it too slow. That's why we use binary search outside.

Problem K.
Consider the chain of transformations in the optimal answer. Consider the change(s) that affect the leftmost character of the string. Between those changes, the transformation splits into independent parts. That enables the following DP: consider the set of all suffixes of length L of all given strings (S, T, and all transformation strings). There are at most 202 of those for each L. Now, let's calculate the best way to transform each of those strings into each other, given that we know such answers for smaller values of L. First, we account for the transformations that don't involve all L characters - we can just take the value from the L-1 step. Then there are transformations involving all L characters - they are given by rules of length exactly L. Finally, we combine those two kinds with a shortest-paths algorithm, like Floyd-Warshall. That gives 202^3 running time for each L, so about 160 million very simple operations for each test case - that should be enough.
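The final combination step is an ordinary Floyd-Warshall relaxation over the (at most 202) suffix strings; as a reminder, on a generic cost matrix it looks like:

```python
def floyd_warshall(dist):
    """In-place all-pairs shortest paths on a dense cost matrix
       (use float('inf') for missing edges)."""
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # relax the path i -> k -> j
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```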

So that makes about 170 minutes for all solutions with some interruptions :) It's nice that they don't have problems that are theoretically impossible to solve during the contest.

I think the problems are nice and continue the usual trend of World Finals problems. Thanks to all the judges for composing this contest, and to KTH and the ACM for organizing it!

## Sunday, March 22, 2009

### TCO R1, R2 and R3 screencasts

This is another interesting-only-for-TopCoder-players post.

I've uploaded the screencasts of TopCoder Open 2009 Elimination Rounds 1 (Streaming, AVI, MOV, AVI from narod.ru), 2 (Streaming, AVI, MOV, AVI from narod.ru) and 3 (Streaming, AVI, MOV, AVI from narod.ru). The R1 screencast is quite usual, with a resubmit because of a slightly too slow solution; the R2 screencast features my frustration as I'm unable to fix my solution for the medium (I knew it was wrong because it had failed a stress test against a dumb solution) as time runs out; and the R3 screencast features a new challenging strategy I've discovered: look at solutions challenged in other rooms to see possible mistakes to look for in the solutions in your room. This sounds obvious, but I'd never used it successfully before yesterday.

I liked the hard problem from R3 a lot. I think it is an excellent example of how dynamic programming problems can be very non-standard (and the medium problem from the same round provides an example of how they can be very standard :)) and beautiful. The regular appearance of such problems makes me sure that we'll never run out of ideas in algorithm competitions. Although I'm biased, for sure.

## Saturday, March 14, 2009

### A couple recent FAILs

I've recently lost my 1st place in the TopCoder ratings to ACRush and have already fallen quite a bit behind. This is mostly due to his superb performance (great job Lou!), but it also highlighted several weaknesses of the choices I've made for my competing style: namely, using C# and expecting the time limit to have some margin, and not having any pre-written code to use in competition.

The last 3 matches all made me think again about those choices. In TCO09 Round 1 the hard problem turned out to have a rather strict time limit, and thus my asymptotically correct solution timed out slightly (I believe it ran within 3 seconds) and I had to resubmit, placing only 10th. SRM 436's hard was about fast multiplication of polynomials, and as I don't understand the Fast Fourier Transform well enough to code it and don't have it pre-written, I resorted to the Karatsuba algorithm; again, my solution was just barely over the time limit, so I spent a lot of time before finally getting it under (and again, it turned out that this solution was expected to be correct by the problem authors), scored very low and placed only 15th. Today, failing the medium problem was totally my fault (I couldn't invent the correct solution) - but even if I hadn't failed it, I would have placed below Lou, because he had a pre-written implementation of Dinic's maximum flow algorithm for the hard problem, and it turned out that one could run it 2500 times under the time limit. This was not possible with the naive maximum flow algorithm I used, and thus I had to spend a bit more time implementing a slightly more complex approach.

I still believe that all this happening in a row is just a bad coincidence, and in the long term my strategy has more pros than cons. Next couple of months will show :)

## Monday, March 9, 2009

### Self-referential sentence solution

OK, here's the sentence:

This sentence has seven hundred and seventeen letters, one hundred and fifty-six words, forty-six commas, one dot, six dashes, one hundred and forty-nine spaces, eighty-two quotes, two words "This", five words "and", two words "commas", one word "dash", two words "dashes", two words "dot", one word "eight", two words "eighteen", two words "eighty", one word "eleven", one word "fifteen", two words "fifty", two words "five", three words "forty", four words "four", one word "fourteen", two words "has", four words "hundred", two words "letters", two words "nine", one word "nineteen", one word "ninety", eighteen words "one", two words "quotes", two words "sentence", two words "seven", two words "seventeen", one word "seventy", four words "six", one word "sixteen", one word "sixty", two words "spaces", one word "ten", two words "thirteen", two words "thirty", three words "three", one word "twelve", two words "twenty", twenty-one words "two", thirteen words "word" and thirty-one words "words".

Now, how to build that one? Amazingly, the following very simple approach works. Start with some string. Build a sentence that describes it. Now build a sentence that describes the previous sentence. And so on, until at some point you find that the sentences stop changing. True, this might never happen - but it turns out that trying several initial strings is enough to find one that leads to a solution. Also true, the solution will be excessively long (e.g., all the [one word "smth"] clauses look really unnecessary, don't they?), but who cares?
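To make the idea concrete, here's a toy version in Python that searches for a much simpler self-describing sentence, one that only counts its own letters; the full problem iterates in exactly the same way, just tracking many more counts:

```python
def spell(n):
    """English spelling of 1..99, hyphenated (e.g. 31 -> 'thirty-one')."""
    ones = ["", "one", "two", "three", "four", "five", "six", "seven",
            "eight", "nine", "ten", "eleven", "twelve", "thirteen",
            "fourteen", "fifteen", "sixteen", "seventeen", "eighteen",
            "nineteen"]
    tens = ["", "", "twenty", "thirty", "forty", "fifty",
            "sixty", "seventy", "eighty", "ninety"]
    if n < 20:
        return ones[n]
    return tens[n // 10] + ("-" + ones[n % 10] if n % 10 else "")

def describe(n):
    return "This sentence has %s letters." % spell(n)

def letters(s):
    return sum(c.isalpha() for c in s)

def find_fixed_point():
    # Iterate "describe the previous sentence" from several seeds
    # until the description stops changing (i.e., describes itself).
    for seed in range(1, 100):
        n, seen = seed, set()
        while n not in seen:
            seen.add(n)
            m = letters(describe(n))
            if m == n:
                return n  # describe(n) is a true self-referential sentence
            n = m
    return None
```

The iteration settles almost immediately: the fixed point is 31, and indeed "This sentence has thirty-one letters." contains exactly thirty-one letters.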

Here's the code:
http://77.41.63.3/blog/self-referential/SelfReferentialSentence.html
http://77.41.63.3/blog/self-referential/NumberSpelling.html

## Friday, February 27, 2009

### Self-referential sentence

Here's one funny problem back from the 1998 summer training camp for Russian IOI team hopefuls. Construct a sentence of the form:

This sentence has one hundred and ninety letters, twenty one word, twelve commas, one dot, seventy five spaces, fifteen quotes, one word "This", seven words "word" and two words "twenty".

Which is completely true and has nothing missing. At the camp there were only so many participants, so the results were checked by hand with the help of a computer program, and thus some leeway was allowed in interpreting the problem statement. However, the problem doesn't suffer if we specify what's needed precisely: the sentence should be like the above, but obviously with different numbers (still spelled in the "canonical" English way) and with all words listed in correct amounts in a case-sensitive way, with no word from the sentence missing.

I couldn't solve it back in 1998, and the solution looked like a very innovative idea to me at that time. It doesn't anymore, but I hope some of you will enjoy thinking about it. Any suggestions on the approach? Any solutions? :)

// I know that this is a well-known idea, and Googling the title of this post reveals a lot of examples and solutions :)

// To give you some context: here's me at a similar camp a year later, in 1999:

## Saturday, February 21, 2009

### Screencasts since fall

This will probably only be interesting for TopCoder players, and they've probably already seen this on the forum, but anyway. Here are the screencasts for the SRMs that have happened since my next-to-last post.

With problems not being very difficult, everything came down to the challenge phase, and I was lucky to emerge in 1st place in the end :) There was nothing particularly interesting about my challenges - I had a pre-crafted case for the medium where I felt many could hit integer overflow, but only found one solution to challenge with it; and I found a timeout case for a backtracking solution in the hard.

Nothing exceptional, except a stupid switch to C++ to use upper_bound, which cost me quite a lot of time and points compared to just writing a binary search :)

This SRM highlights one of the techniques that I believe has improved my TopCoder results quite a bit - stress-testing against a dumb solution on random test cases. It's thanks to it that I managed to resubmit the hard problem correctly.

SRM 433 - no screencast - but just a comment.
Wasn't it stupid to do a last-second challenge on a possible hash collision when I was in first place with a margin of less than 25 points? Yes, it was.

With problems that seemed to suit me and a late resubmission by ACRush, there was no need for much challenge activity. And since I chose the wrong problem to challenge on (who would've thought there were so many ways to fail the easy? :)), the screencast should be rather boring in the challenge phase.

Very nice problems, but too slow thinking at 5AM. And very thorough example cases.

### Remembering Petr Mitrichev Contest 1

Naturally, after taking part in so many programming contests, one accumulates enough ideas to start being a problemsetter. My first ACM-style contest was Petr Mitrichev Contest 1, which I set in August 2006 at the Petrozavodsk training camp, and which was later replayed at http://acm.sgu.ru. I'll try to remember how I came up with those problems.

Problem A - Balance was created when I went to a website with a lot of mathematical puzzles in search of inspiration; various "find the fake coin" problems didn't seem well suited to programming competitions, but requiring the contestant to output the script, and placing rather low constraints, turned it from a purely mathematical problem into quite an interesting DP requiring some accuracy.

Problem B - Cipher was created (or found in some paper, I don't know exactly) by a friend of mine as a side effect of his research.

Problem C - Hyperboloid Distance was created in search of a geometrical problem requiring an algorithmic idea. The more I thought about whether it's possible to solve it exactly, or at least reduce it to the computation of some integral, the more I liked it (since a mathematical solution, if possible at all, seemed way more difficult than the approximate algorithmic one). The precision of 0.1 was there to be absolutely certain that my solution was correct and to make the problem a little easier, although I believe my solution output many more correct digits after the decimal point.

Problem D - Real Fun was a special case in a research article that I was reading; I was amazed at the beauty and simplicity of the solution, and I still believe this problem follows the spirit of ACM ICPC-styled contests the most (among the problems of this contest).

Problem E - Hippopotamus was created as the "first problem to solve" for this contest, by adapting the idea of a simpler problem requiring exactly k iron boards among every m. This problem is quite standard, and stands out a little in terms of difficulty, but without such a problem the contest could've become even more boring :)

Problem F - Ice-cream Tycoon was created because I wanted at least one problem requiring log(n)-style data structures. There's nothing particularly interesting about it, as almost every similar question can be solved in a similar way using almost the same data structures. This is the kind of problem that's very easy to come up with.

Problem G - 4-3 King was one of the problems I'd had in mind for a long time (maybe a couple of years), but it was difficult to choose the exact variation of the idea (separate the given not-necessarily-convex n-gon in the given way) that would be both solvable and interesting to solve. I like the final version of the problem, and I thought it would be the second easiest problem in the contest, but somehow the contestants didn't prove that :)

Problem H - Circular Railway was again a special simple case from a research article that I was reading. Luckily, it allowed several different solutions, and thus I think it was quite appropriate for the format.

Problem I - Shortest Paths was directly based on an entire research paper by David Eppstein, and I didn't expect anyone to solve it during the contest - the sole purpose was to share with the participants the amazing fact that such an efficient solution exists :)