And the thing is - when I look at it now, it is painfully obvious. The authors seem to expect an O(N²) solution, with N equal to 5000, and give us 64MB of memory (restrictions come from http://olympiads.win.tue.nl/ioi/ioi2002/contest/day1/overview1.pdf). But hey, there's the obvious O(N³) solution: for every pair of marks (a,b) we check whether the path that starts at a and has b as its first hit continues correctly until it leaves the field (and that stepping to the left of a gets us outside immediately). To avoid counting the same path twice, we can order all marks by x-coordinate and then by y-coordinate, and only consider pairs with a less than b. It's not clear whether this algorithm, implemented with hashtables, would actually fit in the time limit and get full score; but there's no need to check, because it transforms very easily into an O(N²) DP: we compute the same boolean D[a,b] as above (does the path continue correctly beyond b until the edge of the field?) for each pair (a,b) with a less than b, but without requiring anything about what precedes a on that path (we only take that into account when constructing the answer). To compute D[a,b], we find the next mark c on the path a-b-...; if c is outside the field, or c is a mark and D[b,c] is true, then D[a,b] is true. The only issue is finding c in constant time. This can be done with hashtables, but again, there's no need: if we traverse the values of b (with a fixed) in our order (first by x-coordinate, then by y-coordinate), then the corresponding values of c appear in the same order, so a single forward-moving pointer gives O(N) operations for each a, and O(N²) in total. In terms of memory this uses only 25M bools, which could be compressed to about 3M, but there's no need as we have 64M. And the code should be so clean and simple that I won't even write it here (does anyone want me to?). In other words, I would spend at most 20-30 minutes on such a problem today.
But one might say, wasn't this expected? Don't contestants' knowledge and problem difficulty constantly rise, each chasing the other? True. But it amazed me that there's no (I mean, totally no) clever algorithm involved, just a simple DP with a simple optimization that reduces O(N) to amortized O(1). I can swear we used to do much more complicated DPs in high school without any trouble. Is the amortization idea so new that it was very difficult 6 years ago? I doubt that's the only reason. Looking back, I can think of at least 2 more:
1) Perhaps we were simply too nervous to find it at the actual IOI, and even afterwards. But the point is, we didn't feel that way! At least I remember being fairly confident and determined then.
2) We were reluctant to inject ideas we knew into algorithms we knew, changing them significantly. This is more of a mathematical weakness, so both university education and solving problems under the tighter timeframe of ICPC trainings should have eliminated that difficulty.
Any more ideas?