Task 1

  1. Let $R$ be an array of positive integers of length $n$, and let $d$ be the minimum distance between picked items. Define $OPT(i)$ to be the maximum revenue obtainable by choosing from the revenues $R[0], \dots, R[i]$.
  2. $OPT(i) = \begin{cases} 0 & \text{if } i < 0\\ \max(R[i] + OPT(i - d), \ OPT(i - 1)) & \text{otherwise} \end{cases}$
  3. The base case is when $i < 0$. In this case $OPT(i) = 0$, as choosing from no revenues results in no profit.
  4. The overall optimal value is given by the subproblem $OPT(n - 1)$, choosing from all revenues in $R$.
  5. As selecting from no revenues results in no selections and no profit, the base case is sufficient.

    Let $OPT(k)$ be the optimal solution for every $k < i$; in particular $OPT(i - 1)$ and $OPT(i - d)$ are optimal.

    Consider $OPT(i)$: we can either select the revenue at $i$ or skip it.
    If we select the revenue at $i$, then the next item we can select must be at least $d$ positions before $i$. As $d \ge 1$, we have $i - d < i$, so by the hypothesis the solution for the remaining selections, $OPT(i - d)$, is optimal.
    If we skip the revenue at $i$, the next revenue we can select may be directly before it, at $i - 1$, and $OPT(i - 1)$ is likewise optimal.
    As the optimal solution for the remaining selections is defined whether we select or skip, choosing the maximum of these two options maximises the profit (a short code sketch of this recurrence is given after this task's analysis).
  6. Consider $OPT(i)$ for $i = n - 1$. Expanding the recursion, the distinct subproblems required at each recursion depth are:
    Depth 0: $OPT(i)$
    Depth 1: $OPT(i - 1)$, $OPT(i - d)$
    Depth 2: $OPT(i - 2)$, $OPT(i - 1 - d)$, $OPT(i - 2d)$
    Depth 3: $OPT(i - 3)$, $OPT(i - 2 - d)$, $OPT(i - 1 - 2d)$, $OPT(i - 3d)$
    It can be seen that at a recursive depth of $j$ the required number of solutions is at most $j + 1$: the number of distinct combinations of subtracting $1$ or $d$ from $i$, $j$ times (i.e. the offsets $j$, $j - 1 + d$, $j - 2 + 2d$, $\dots$, $jd$). The maximum recursive depth at which non-base solutions are required occurs when the smallest offset reduces the argument to $0$:
    $i - j = 0$
    $n - 1 - j = 0$
    $j = n - 1$
    The maximum number of solutions to solve will be $1 + 2 + 3 + \dots + n = \cfrac{n}{2}(n + 1)$.
    Hence the algorithm will run in $O(n^2)$ time.

    Solutions at each recursion depth depend only on the solutions from the next depth; consequently, at most $n$ values are required at any one time.
    Hence the algorithm will require $O(n)$ space.
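
    A minimal memoised sketch of the recurrence from part 2 is given below, written in Python for illustration; the function name `max_revenue` and its argument names are assumptions rather than part of the task. Its work stays within the $O(n^2)$ time and $O(n)$ space bounds derived above.

    ```python
    # A sketch of the Task 1 recurrence OPT(i) with memoisation.
    # `revenues` plays the role of R; `d` is the minimum distance between picks.
    def max_revenue(revenues, d):
        n = len(revenues)
        memo = {}

        def opt(i):
            # Base case: i < 0 means no revenues remain, so the profit is 0.
            if i < 0:
                return 0
            if i not in memo:
                # Either pick R[i] (forcing the next pick to be at least d earlier)
                # or skip it and consider i - 1 instead.
                memo[i] = max(revenues[i] + opt(i - d), opt(i - 1))
            return memo[i]

        # The overall answer is OPT(n - 1), choosing from all revenues.
        return opt(n - 1)


    # Example: with d = 2 the best selection from [5, 1, 1, 6] is 5 + 6 = 11.
    print(max_revenue([5, 1, 1, 6], 2))
    ```

    Replacing the dictionary with a length-$n$ list, or filling the table bottom-up from $i = 0$ to $n - 1$, keeps the same $O(n)$ storage described above.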

Task 2

  1. Let $S$ be a 2D array with $n$ rows (the height) and $m$ columns (the width), let $B$ be the budget of costs, and let $c_{up}$ and $c_{right}$ be the costs of moving up and right respectively. Define $OPT(i, j)$ to be the maximum score of a path starting at the location $(i, j)$, where $i \ge 0$ and $j \ge 0$.
  2. $OPT(i, j) = \begin{cases} 0 & \text{if } ic_{up} + jc_{right} > B \ \lor\ i \ge n \ \lor\ j \ge m \\ S[i, j] + \max(OPT(i + 1, j), \ OPT(i, j + 1)) & \text{otherwise} \end{cases}$
  3. The base case is when $(i, j)$ is outside of $S$ or the cost to reach that location exceeds the budget (i.e. $ic_{up} + jc_{right} > B \lor i \ge n \lor j \ge m$); in this case $OPT(i, j) = 0$.
  4. The overall optimal value is given by the subproblem $OPT(0, 0)$: the optimal score of a path starting at the bottom-left corner.
  5. As a path starting outside the budget range or outside of $S$ is impossible, the maximum score that can be achieved from such a location is $0$; hence the base case is sufficient.

    Let $OPT(i + 1, j)$ and $OPT(i, j + 1)$ be the optimal solutions, where $(i + 1, j)$ lies within the grid and the budget whenever $$i + 1 \le \cfrac{B - jc_{right}}{c_{up}} \land i + 1 < n$$ and $(i, j + 1)$ lies within the grid and the budget whenever $$j + 1 \le \cfrac{B - ic_{up}}{c_{right}} \land j + 1 < m$$ At the location $(i, j)$ a move can be made either up to $(i + 1, j)$ or right to $(i, j + 1)$. As $OPT(i + 1, j)$ and $OPT(i, j + 1)$ are both defined, the optimal move is the one leading to the larger of the two solutions. Therefore $OPT(i, j) = \max(S[i, j] + OPT(i + 1, j), \ S[i, j] + OPT(i, j + 1))$ $$OPT(i, j) = S[i, j] + \max(OPT(i + 1, j), \ OPT(i, j + 1))$$

    To obtain optimal values for the subproblems, construct a queue of locations in which each location's solution requires only the solutions of locations with higher priority in the queue. In this way the locations with the highest priority require only base-case values. Solve the subproblems from highest priority to lowest; a short code sketch of the recurrence is given at the end of this task.
  6. The algorithm can be broken into three stages:
    1. Constructing the solution table and populating it with the value $-2$ will run in $O(nm)$ time and require $O(nm)$ space.
    2. Constructing the queue will require visiting, and storing the location of, at most every element in the grid, starting at $(0, 0)$. This stage also updates the solution table, changing the value of queued locations to $-1$ so that no location appears in the queue more than once. Hence this stage will run in $O(nm)$ time and require $O(nm)$ space.
    3. Solving the values in the queue will require looking at every location in the queue and storing every solution in the solution table. Hence this stage will run in $O(nm)$ time and require $O(nm)$ space.
    As each stage runs in $O(nm)$ time and requires $O(nm)$ space, the entire algorithm runs in $O(nm)$ time and requires $O(nm)$ space.
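
    A minimal Python sketch of the recurrence from part 2 of this task is given below; the function name `max_score` and its argument names are assumptions for illustration. It uses memoised recursion rather than the explicit priority queue described in part 5, but it fills the same $O(nm)$ table of solutions within the stated bounds.

    ```python
    # A sketch of the Task 2 recurrence OPT(i, j) with memoisation.
    # S is the n x m grid of scores, B the budget, c_up and c_right the move costs.
    def max_score(S, B, c_up, c_right):
        n, m = len(S), len(S[0])
        memo = {}

        def opt(i, j):
            # Base case: outside the grid, or (i, j) cannot be reached within the budget.
            if i >= n or j >= m or i * c_up + j * c_right > B:
                return 0
            if (i, j) not in memo:
                # Collect S[i, j], then move either up to (i + 1, j) or right to (i, j + 1).
                memo[(i, j)] = S[i][j] + max(opt(i + 1, j), opt(i, j + 1))
            return memo[(i, j)]

        # The overall answer is OPT(0, 0), a path starting at the bottom-left corner.
        return opt(0, 0)


    # Example: a 2 x 2 grid where the budget B = 1 allows only one paid move,
    # so the best path collects S[0][0] + S[0][1] = 1 + 5 = 6.
    S = [[1, 5],
         [2, 3]]
    print(max_score(S, B=1, c_up=1, c_right=1))
    ```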