Editorial for WC '17 Contest 4 S3 - Guardians of the Cash
Submitting an official solution before solving the problem yourself is a bannable offence.
Let S_1, ..., S_N be the Southern skyline (S_i is the height of the tallest stack in column i), and W_1, ..., W_N be the Western skyline (W_i is the height of the tallest stack in row i). Let M_i = min(S_i, W_i). All stacks in row i and column i must be pruned down to a height of at most M_i. Let's refer to this reduction of stacks as "processing index i". Note that it's not a simple matter of processing each index between 1 and N once. When an index is processed, it may affect the heights of stacks in other rows/columns, which may in turn affect other S, W, and M values.
This may suggest the following approach: process all of the indices, then process them all again, and so on, until a full iteration of processing all indices results in no further changes to the collection of coins. Processing a single index takes O(N) time, and it can be shown that this algorithm will terminate within O(N) iterations of processing all N indices, resulting in a time complexity of O(N^3). Processing the indices in random orders may be tempting, but the time complexity will remain at O(N^3), and sufficiently strong test data can be constructed such that the algorithm will require roughly N iterations.
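The iterate-until-stable approach can be sketched as follows (a sketch under the same assumptions as above: grid of heights h, M_i = min(S_i, W_i); the function name `naiveSolve` is hypothetical):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>
using namespace std;

// Naive fixpoint: repeatedly process every index until one full pass
// removes no coins. Each pass costs O(N^2), and up to roughly N passes
// may be needed, for O(N^3) overall.
long long naiveSolve(vector<vector<long long>> h) {
    int n = h.size();
    long long removed = 0;
    bool changed = true;
    while (changed) {
        changed = false;
        for (int i = 0; i < n; i++) {
            long long W = 0, S = 0;  // current row i / column i maxima
            for (int j = 0; j < n; j++) {
                W = max(W, h[i][j]);
                S = max(S, h[j][i]);
            }
            long long M = min(S, W);
            for (int j = 0; j < n; j++) {  // prune row i and column i
                if (h[i][j] > M) { removed += h[i][j] - M; h[i][j] = M; changed = true; }
                if (h[j][i] > M) { removed += h[j][i] - M; h[j][i] = M; changed = true; }
            }
        }
    }
    return removed;
}
```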
A key observation is that, when an index i is processed, it can only affect the M values of other indices j when M_j > M_i. Furthermore, when such an index j is affected, its M_j value may be reduced down to M_i, but not lower than that. This suggests that, if we process all N indices in non-decreasing order of their M values, the M values of already-processed indices will never be affected, and so a single iteration through all N indices will be sufficient.
That being said, it's insufficient to sort all indices by their initial M values and process them in that static order, as their M values can change during the algorithm, which may change the order in which they should be processed. Therefore, we'll need to dynamically keep track of the S, W, and M values. At each step of the algorithm, we'll choose to process the remaining unprocessed index with the minimum M value (in a fashion reminiscent of Dijkstra's shortest path algorithm). While processing an index, we'll then need to update the other S, W, and M values more efficiently than iterating over all N^2 stacks in the grid. Overall, it's possible to implement this approach in O(N^2 log N) time, for example by maintaining a set (balanced binary search tree) of (M_i, i) pairs as well as a multiset of stack heights for each row and each column.
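One way the final approach might be implemented (a sketch, not the official solution; the function name `minCoinsRemoved` and the grid representation are assumptions) uses a `set` of (M_i, i) pairs as the priority queue and a `multiset` of heights per row and per column to answer current-skyline queries:

```cpp
#include <algorithm>
#include <cassert>
#include <set>
#include <vector>
using namespace std;

long long minCoinsRemoved(vector<vector<long long>> h) {
    int n = h.size();
    vector<multiset<long long>> rowH(n), colH(n);
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            rowH[i].insert(h[i][j]);  // heights in row i (W_i = max)
            colH[j].insert(h[i][j]);  // heights in column j (S_j = max)
        }
    auto Mval = [&](int i) { return min(*rowH[i].rbegin(), *colH[i].rbegin()); };

    long long removed = 0;
    // Prune stack (r, c) down to cap, keeping the multisets in sync.
    auto lower = [&](int r, int c, long long cap) {
        if (h[r][c] <= cap) return;
        removed += h[r][c] - cap;
        rowH[r].erase(rowH[r].find(h[r][c]));
        colH[c].erase(colH[c].find(h[r][c]));
        h[r][c] = cap;
        rowH[r].insert(cap);
        colH[c].insert(cap);
    };

    set<pair<long long, int>> pq;  // (M_i, i) for unprocessed indices
    vector<bool> processed(n, false);
    for (int i = 0; i < n; i++) pq.insert({Mval(i), i});

    while (!pq.empty()) {
        auto [M, i] = *pq.begin();  // unprocessed index with minimum M_i
        pq.erase(pq.begin());
        processed[i] = true;
        lower(i, i, M);
        for (int j = 0; j < n; j++) {
            if (j == i) continue;
            // j's key must be removed from pq before its multisets change
            if (!processed[j]) pq.erase({Mval(j), j});
            lower(i, j, M);  // stack in row i, column j
            lower(j, i, M);  // stack in row j, column i
            if (!processed[j]) pq.insert({Mval(j), j});
        }
    }
    return removed;
}
```

Each of the N extractions touches O(N) stacks, and every multiset/set operation costs O(log N), giving the O(N^2 log N) bound.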