    "### Case 2\n",
    "---\n",
    "R(s) = -0.4 in all states except in terminal states"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "sequential_decision_environment = GridMDP([[-0.4, -0.4, -0.4, +1],\n",
    "                                           [-0.4, None, -0.4, -1],\n",
    "                                           [-0.4, -0.4, -0.4, -0.4]],\n",
    "                                          terminals=[(3, 2), (3, 1)])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      ">   >      >   .\n",
      "^   None   ^   .\n",
      "^   >      ^   <\n"
     ]
    }
   ],
   "source": [
    "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))\n",
    "from utils import print_table\n",
    "print_table(sequential_decision_environment.to_arrows(pi))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This is exactly the output we expected\n",
    "![title](images/-0.4.jpg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As the reward for each state is now more negative, life is certainly more unpleasant.\n",
    "The agent takes the shortest route to the +1 state and is willing to risk falling into the -1 state by accident."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Case 3\n",
    "---\n",
    "R(s) = -4 in all states except terminal states"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "sequential_decision_environment = GridMDP([[-4, -4, -4, +1],\n",
    "                                           [-4, None, -4, -1],\n",
    "                                           [-4, -4, -4, -4]],\n",
    "                                          terminals=[(3, 2), (3, 1)])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      ">   >      >   .\n",
      "^   None   >   .\n",
      ">   >      >   ^\n"
     ]
    }
   ],
   "source": [
    "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))\n",
    "from utils import print_table\n",
    "print_table(sequential_decision_environment.to_arrows(pi))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This is exactly the output we expected\n",
    "![title](images/-4.jpg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The living reward for each state is now lower than the least rewarding terminal. Life is so _painful_ that the agent heads for the nearest exit as even the worst exit is less painful than any living state."
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Case 4\n",
    "---\n",
    "R(s) = 4 in all states except terminal states"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "sequential_decision_environment = GridMDP([[4, 4, 4, +1],\n",
    "                                           [4, None, 4, -1],\n",
    "                                           [4, 4, 4, 4]],\n",
    "                                          terminals=[(3, 2), (3, 1)])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      ">   >      <   .\n",
      ">   None   <   .\n",
      ">   >      >   v\n"
     ]
    }
   ],
   "source": [
    "pi = best_policy(sequential_decision_environment, value_iteration(sequential_decision_environment, .001))\n",
    "from utils import print_table\n",
    "print_table(sequential_decision_environment.to_arrows(pi))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In this case, the output we expect is\n",
    "![title](images/4.jpg)\n",
    "<br>\n",
    "As life is positively enjoyable and the agent avoids _both_ exits.\n",
    "Even though the output we get is not exactly what we want, it is definitely not wrong.\n",
    "The scenario here requires the agent to anything but reach a terminal state, as this is the only way the agent can maximize its reward (total reward tends to infinity), and the program does just that.\n",
    "<br>\n",
    "Currently, the GridMDP class doesn't support an explicit marker for a \"do whatever you like\" action or a \"don't care\" condition.\n",
    "You can however, extend the class to do so.\n",
    "<br>\n",
    "For in-depth knowledge about sequential decision problems, refer **Section 17.1** in the AIMA book."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## POMDP\n",
    "---\n",
    "Partially Observable Markov Decision Problems\n",
    "\n",
    "In retrospect, a Markov decision process or MDP is defined as:\n",
    "- a sequential decision problem for a fully observable, stochastic environment with a Markovian transition model and additive rewards.\n",
    "\n",
    "An MDP consists of a set of states (with an initial state $s_0$); a set $A(s)$ of actions\n",
    "in each state; a transition model $P(s' | s, a)$; and a reward function $R(s)$.\n",
    "\n",
    "The MDP seeks to make sequential decisions to occupy states so as to maximise some combination of the reward function $R(s)$.\n",
    "\n",
    "The characteristic problem of the MDP is hence to identify the optimal policy function $\\pi^*(s)$ that provides the _utility-maximising_ action $a$ to be taken when the current state is $s$.\n",
    "\n",
    "### Belief vector\n",
    "\n",
    "**Note**: The book refers to the _belief vector_ as the _belief state_. We use the latter terminology here to retain our ability to refer to the belief vector as a _probability distribution over states_.\n",
    "\n",
    "The solution of an MDP is subject to certain properties of the problem which are assumed and justified in [Section 17.1]. One critical assumption is that the agent is **fully aware of its current state at all times**.\n",
    "\n",
    "A tedious (but rewarding, as we will see) way of expressing this is in terms of the **belief vector** $b$ of the agent. The belief vector is a function mapping states to probabilities or certainties of being in those states.\n",
    "\n",
    "Consider an agent that is fully aware that it is in state $s_i$ in the statespace $(s_1, s_2, ... s_n)$ at the current time.\n",
    "\n",
    "Its belief vector is the vector $(b(s_1), b(s_2), ... b(s_n))$ given by the function $b(s)$:\n",
    "\\begin{align*}\n",
    "b(s) &= 0 \\quad \\text{if }s \\neq s_i \\\\ &= 1 \\quad \\text{if } s = s_i\n",
    "\\end{align*}\n",
    "\n",
    "Note that $b(s)$ is a probability distribution that necessarily sums to $1$ over all $s$.\n",
    "\n"
   ]
  },
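  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal sketch (plain Python, independent of the `mdp` module; the state names are illustrative), the belief vector of a fully aware agent is simply a one-hot distribution:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def one_hot_belief(states, current_state):\n",
    "    \"\"\"Belief vector of a fully observable agent:\n",
    "    probability 1 on the known state, 0 elsewhere.\"\"\"\n",
    "    return [1.0 if s == current_state else 0.0 for s in states]\n",
    "\n",
    "states = ['s1', 's2', 's3', 's4']\n",
    "b = one_hot_belief(states, 's2')\n",
    "print(b)       # [0.0, 1.0, 0.0, 0.0]\n",
    "print(sum(b))  # a belief vector always sums to 1"
   ]
  },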
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": true
   },
   "source": [
    "### POMDPs - a conceptual outline\n",
    "\n",
    "The POMDP really has only two modifications to the **problem formulation** compared to the MDP.\n",
    "\n",
    "- **Belief state** - In the real world, the current state of an agent is often not known with complete certainty. This makes the concept of a belief vector extremely relevant. It allows the agent to represent different degrees of certainty with which it _believes_ it is in each state.\n",
    "\n",
    "- **Evidence percepts** - In the real world, agents often have certain kinds of evidence, collected from sensors. They can use the probability distribution of observed evidence, conditional on state, to consolidate their information. This is a known distribution $P(e\\ |\\ s)$ - $e$ being an evidence, and $s$ being the state it is conditional on.\n",
    "\n",
    "Consider the world we used for the MDP. \n",
    "\n",
    "![title](images/grid_mdp.jpg)\n",
    "\n",
    "#### Using the belief vector\n",
    "An agent beginning at $(1, 1)$ may not be certain that it is indeed in $(1, 1)$. Consider a belief vector $b$ such that:\n",
    "\\begin{align*}\n",
    "    b((1,1)) &= 0.8 \\\\\n",
    "    b((2,1)) &= 0.1 \\\\\n",
    "    b((1,2)) &= 0.1 \\\\\n",
    "    b(s) &= 0 \\quad \\quad \\forall \\text{ other } s\n",
    "\\end{align*}\n",
    "\n",
    "By horizontally catenating each row, we can represent this as an 11-dimensional vector (omitting $(2, 2)$).\n",
    "\n",
    "Thus, taking $s_1 = (1, 1)$, $s_2 = (1, 2)$, ... $s_{11} = (4,3)$, we have $b$:\n",
    "\n",
    "$b = (0.8, 0.1, 0, 0, 0.1, 0, 0, 0, 0, 0, 0)$ \n",
    "\n",
    "This fully represents the certainty to which the agent is aware of its state.\n",
    "\n",
    "#### Using evidence\n",
    "The evidence observed here could be the number of adjacent 'walls' or 'dead ends' observed by the agent. We assume that the agent cannot 'orient' the walls - only count them.\n",
    "\n",
    "In this case, $e$ can take only two values, 1 and 2. This gives $P(e\\ |\\ s)$ as:\n",
    "\\begin{align*}\n",
    "    P(e=2\\ |\\ s) &= \\frac{1}{7} \\quad \\forall \\quad s \\in \\{s_1, s_2, s_4, s_5, s_8, s_9, s_{11}\\}\\\\\n",
    "    P(e=1\\ |\\ s) &= \\frac{1}{4} \\quad \\forall \\quad s \\in \\{s_3, s_6, s_7, s_{10}\\} \\\\\n",
    "    P(e\\ |\\ s) &= 0 \\quad \\forall \\quad \\text{ other } s, e\n",
    "\\end{align*}\n",
    "\n",
    "Note that the implications of the evidence on the state must be known **a priori** to the agent. Ways of reliably learning this distribution from percepts are beyond the scope of this notebook."
   ]
  },
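  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To see the evidence at work, here is a minimal sketch (the helper `update_belief` and the data layout are our own, not part of the `mdp` module) of conditioning a belief vector on a percept via $b'(s) = \\alpha P(e\\ |\\ s) b(s)$. Starting from a uniform belief, observing $e=2$ leaves a posterior of $\\frac{1}{7}$ on each of the seven two-wall states:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def update_belief(b, e, e_prob):\n",
    "    \"\"\"Condition a belief vector on evidence e: b'(s) = alpha * P(e|s) * b(s).\"\"\"\n",
    "    unnormalised = [e_prob[e][i] * b[i] for i in range(len(b))]\n",
    "    alpha = 1.0 / sum(unnormalised)\n",
    "    return [alpha * p for p in unnormalised]\n",
    "\n",
    "# deterministic wall-counting sensor from above (0-indexed positions of s1..s11)\n",
    "two_wall = {0, 1, 3, 4, 7, 8, 10}\n",
    "e_prob = {2: [1.0 if i in two_wall else 0.0 for i in range(11)],\n",
    "          1: [0.0 if i in two_wall else 1.0 for i in range(11)]}\n",
    "\n",
    "b_uniform = [1.0 / 11] * 11\n",
    "print(update_belief(b_uniform, 2, e_prob))  # 1/7 on each two-wall state, 0 elsewhere"
   ]
  },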
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### POMDPs - a rigorous outline\n",
    "\n",
    "A POMDP is thus a sequential decision problem for for a *partially* observable, stochastic environment with a Markovian transition model, a known 'sensor model' for inferring state from observation, and additive rewards. \n",
    "\n",
    "Practically, a POMDP has the following, which an MDP also has:\n",
    "- a set of states, each denoted by $s$\n",
    "- a set of actions available in each state, $A(s)$\n",
    "- a reward accrued on attaining some state, $R(s)$\n",
    "- a transition probability $P(s'\\ |\\ s, a)$ of action $a$ changing the state from $s$ to $s'$\n",
    "\n",
    "And the following, which an MDP does not:\n",
    "- a sensor model $P(e\\ |\\ s)$ on evidence conditional on states\n",
    "\n",
    "Additionally, the POMDP is now uncertain of its current state hence has:\n",
    "- a belief vector $b$ representing the certainty of being in each state (as a probability distribution)\n",
    "\n",
    "\n",
    "#### New uncertainties\n",
    "\n",
    "It is useful to intuitively appreciate the new uncertainties that have arisen in the agent's awareness of its own state.\n",
    "\n",
    "- At any point, the agent has belief vector $b$, the distribution of its believed likelihood of being in each state $s$.\n",
    "- For each of these states $s$ that the agent may **actually** be in, it has some set of actions given by $A(s)$.\n",
    "- Each of these actions may transport it to some other state $s'$, assuming an initial state $s$, with probability $P(s'\\ |\\ s, a)$\n",
    "- Once the action is performed, the agent receives a percept $e$. $P(e\\ |\\ s)$ now tells it the chances of having perceived $e$ for each state $s$. The agent must use this information to update its new belief state appropriately.\n",
    "\n",
    "#### Evolution of the belief vector - the `FORWARD` function\n",
    "\n",
    "The new belief vector $b'(s')$ after an action $a$ on the belief vector $b(s)$ and the noting of evidence $e$ is:\n",
    "$$ b'(s') = \\alpha P(e\\ |\\ s') \\sum_s P(s'\\ | s, a) b(s)$$ \n",
    "\n",
    "where $\\alpha$ is a normalising constant (to retain the interpretation of $b$ as a probability distribution.\n",
    "\n",
    "This equation is just counts the sum of likelihoods of going to a state $s'$ from every possible state $s$, times the initial likelihood of being in each $s$. This is multiplied by the likelihood that the known evidence actually implies the new state $s'$. \n",
    "\n",
    "This function is represented as `b' = FORWARD(b, a, e)`\n",
    "\n",
    "#### Probability distribution of the evolving belief vector\n",
    "\n",
    "The goal here is to find $P(b'\\ |\\ b, a)$ - the probability that action $a$ transforms belief vector $b$ into belief vector $b'$. The following steps illustrate this -\n",
    "\n",
    "The probability of observing evidence $e$ when action $a$ is enacted on belief vector $b$ can be distributed over each possible new state $s'$ resulting from it:\n",
    "\\begin{align*}\n",
    "    P(e\\ |\\ b, a) &= \\sum_{s'} P(e\\ |\\ b, a, s') P(s'\\ |\\ b, a) \\\\\n",
    "                  &= \\sum_{s'} P(e\\ |\\ s') P(s'\\ |\\ b, a) \\\\\n",
    "                  &= \\sum_{s'} P(e\\ |\\ s') \\sum_s P(s'\\ |\\ s, a) b(s)\n",
    "\\end{align*}\n",
    "\n",
    "The probability of getting belief vector $b'$ from $b$ by application of action $a$ can thus be summed over all possible evidences $e$:\n",
    "\\begin{align*}\n",
    "    P(b'\\ |\\ b, a) &= \\sum_{e} P(b'\\ |\\ b, a, e) P(e\\ |\\ b, a) \\\\\n",
    "                  &= \\sum_{e} P(b'\\ |\\ b, a, e) \\sum_{s'} P(e\\ |\\ s') \\sum_s P(s'\\ |\\ s, a) b(s)\n",
    "\\end{align*}\n",
    "\n",
    "where $P(b'\\ |\\ b, a, e) = 1$ if $b' = $ `FORWARD(b, a, e)` and $= 0$ otherwise.\n",
    "\n",
    "Given initial and final belief states $b$ and $b'$, the transition probabilities still depend on the action $a$ and observed evidence $e$. Some belief states may be achievable by certain actions, but have non-zero probabilities for states prohibited by the evidence $e$. Thus, the above condition thus ensures that only valid combinations of $(b', b, a, e)$ are considered.\n",
    "\n",
    "#### A modified rewardspace\n",
    "\n",
    "For MDPs, the reward space was simple - one reward per available state. However, for a belief vector $b(s)$, the expected reward is now:\n",
    "$$\\rho(b) = \\sum_s b(s) R(s)$$\n",
    "\n",
    "Thus, as the belief vector can take infinite values of the distribution over states, so can the reward for each belief vector vary over a hyperplane in the belief space, or space of states (planes in an $N$-dimensional space are formed by a linear combination of the axes)."
   ]
  },
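  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before looking at the class, here is a minimal sketch of `FORWARD` and $\\rho(b)$ as plain Python functions, a direct transcription of the two equations above (the data layout and the small two-state numbers, which foreshadow the example further below, are our own):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def forward(b, a, e, t_prob, e_prob):\n",
    "    \"\"\"b'(s') = alpha * P(e|s') * sum_s P(s'|s,a) * b(s)\"\"\"\n",
    "    n = len(b)\n",
    "    # prediction step: push the belief through the transition model\n",
    "    predicted = [sum(t_prob[a][s][s1] * b[s] for s in range(n)) for s1 in range(n)]\n",
    "    # update step: weight by the sensor model, then normalise\n",
    "    unnormalised = [e_prob[e][s1] * predicted[s1] for s1 in range(n)]\n",
    "    alpha = 1.0 / sum(unnormalised)\n",
    "    return [alpha * p for p in unnormalised]\n",
    "\n",
    "def rho(b, rewards):\n",
    "    \"\"\"Expected reward of a belief vector: rho(b) = sum_s b(s) * R(s)\"\"\"\n",
    "    return sum(b[s] * rewards[s] for s in range(len(b)))\n",
    "\n",
    "t_prob = [[[0.9, 0.1], [0.1, 0.9]],  # P(s'|s, action 0)\n",
    "          [[0.1, 0.9], [0.9, 0.1]]]  # P(s'|s, action 1)\n",
    "e_prob = {0: [0.6, 0.4], 1: [0.4, 0.6]}  # sensor is right 60% of the time\n",
    "b = [0.5, 0.5]\n",
    "print(forward(b, 1, 1, t_prob, e_prob))  # [0.4, 0.6]\n",
    "print(rho(b, [0.0, 1.0]))                # 0.5"
   ]
  },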
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Now that we know the basics, let's have a look at the `POMDP` class."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<!DOCTYPE html PUBLIC \"-//W3C//DTD HTML 4.01//EN\"\n",
       "   \"http://www.w3.org/TR/html4/strict.dtd\">\n",
       "\n",
       "<html>\n",
       "<head>\n",
       "  <title></title>\n",
       "  <meta http-equiv=\"content-type\" content=\"text/html; charset=None\">\n",
       "  <style type=\"text/css\">\n",
       "td.linenos { background-color: #f0f0f0; padding-right: 10px; }\n",
       "span.lineno { background-color: #f0f0f0; padding: 0 5px 0 5px; }\n",
       "pre { line-height: 125%; }\n",
       "body .hll { background-color: #ffffcc }\n",
       "body  { background: #f8f8f8; }\n",
       "body .c { color: #408080; font-style: italic } /* Comment */\n",
       "body .err { border: 1px solid #FF0000 } /* Error */\n",
       "body .k { color: #008000; font-weight: bold } /* Keyword */\n",
       "body .o { color: #666666 } /* Operator */\n",
       "body .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
       "body .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
       "body .cp { color: #BC7A00 } /* Comment.Preproc */\n",
       "body .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
       "body .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
       "body .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
       "body .gd { color: #A00000 } /* Generic.Deleted */\n",
       "body .ge { font-style: italic } /* Generic.Emph */\n",
       "body .gr { color: #FF0000 } /* Generic.Error */\n",
       "body .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
       "body .gi { color: #00A000 } /* Generic.Inserted */\n",
       "body .go { color: #888888 } /* Generic.Output */\n",
       "body .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
       "body .gs { font-weight: bold } /* Generic.Strong */\n",
       "body .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
       "body .gt { color: #0044DD } /* Generic.Traceback */\n",
       "body .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
       "body .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
       "body .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
       "body .kp { color: #008000 } /* Keyword.Pseudo */\n",
       "body .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
       "body .kt { color: #B00040 } /* Keyword.Type */\n",
       "body .m { color: #666666 } /* Literal.Number */\n",
       "body .s { color: #BA2121 } /* Literal.String */\n",
       "body .na { color: #7D9029 } /* Name.Attribute */\n",
       "body .nb { color: #008000 } /* Name.Builtin */\n",
       "body .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
       "body .no { color: #880000 } /* Name.Constant */\n",
       "body .nd { color: #AA22FF } /* Name.Decorator */\n",
       "body .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
       "body .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
       "body .nf { color: #0000FF } /* Name.Function */\n",
       "body .nl { color: #A0A000 } /* Name.Label */\n",
       "body .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
       "body .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
       "body .nv { color: #19177C } /* Name.Variable */\n",
       "body .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
       "body .w { color: #bbbbbb } /* Text.Whitespace */\n",
       "body .mb { color: #666666 } /* Literal.Number.Bin */\n",
       "body .mf { color: #666666 } /* Literal.Number.Float */\n",
       "body .mh { color: #666666 } /* Literal.Number.Hex */\n",
       "body .mi { color: #666666 } /* Literal.Number.Integer */\n",
       "body .mo { color: #666666 } /* Literal.Number.Oct */\n",
       "body .sa { color: #BA2121 } /* Literal.String.Affix */\n",
       "body .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
       "body .sc { color: #BA2121 } /* Literal.String.Char */\n",
       "body .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
       "body .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
       "body .s2 { color: #BA2121 } /* Literal.String.Double */\n",
       "body .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
       "body .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
       "body .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
       "body .sx { color: #008000 } /* Literal.String.Other */\n",
       "body .sr { color: #BB6688 } /* Literal.String.Regex */\n",
       "body .s1 { color: #BA2121 } /* Literal.String.Single */\n",
       "body .ss { color: #19177C } /* Literal.String.Symbol */\n",
       "body .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
       "body .fm { color: #0000FF } /* Name.Function.Magic */\n",
       "body .vc { color: #19177C } /* Name.Variable.Class */\n",
       "body .vg { color: #19177C } /* Name.Variable.Global */\n",
       "body .vi { color: #19177C } /* Name.Variable.Instance */\n",
       "body .vm { color: #19177C } /* Name.Variable.Magic */\n",
       "body .il { color: #666666 } /* Literal.Number.Integer.Long */\n",
       "\n",
       "  </style>\n",
       "</head>\n",
       "<body>\n",
       "<h2></h2>\n",
       "\n",
       "<div class=\"highlight\"><pre><span></span><span class=\"k\">class</span> <span class=\"nc\">POMDP</span><span class=\"p\">(</span><span class=\"n\">MDP</span><span class=\"p\">):</span>\n",
       "\n",
       "    <span class=\"sd\">&quot;&quot;&quot;A Partially Observable Markov Decision Process, defined by</span>\n",
       "<span class=\"sd\">    a transition model P(s&#39;|s,a), actions A(s), a reward function R(s),</span>\n",
       "<span class=\"sd\">    and a sensor model P(e|s). We also keep track of a gamma value,</span>\n",
       "<span class=\"sd\">    for use by algorithms. The transition and the sensor models</span>\n",
       "<span class=\"sd\">    are defined as matrices. We also keep track of the possible states</span>\n",
       "<span class=\"sd\">    and actions for each state. [page 659].&quot;&quot;&quot;</span>\n",
       "\n",
       "    <span class=\"k\">def</span> <span class=\"fm\">__init__</span><span class=\"p\">(</span><span class=\"bp\">self</span><span class=\"p\">,</span> <span class=\"n\">actions</span><span class=\"p\">,</span> <span class=\"n\">transitions</span><span class=\"o\">=</span><span class=\"bp\">None</span><span class=\"p\">,</span> <span class=\"n\">evidences</span><span class=\"o\">=</span><span class=\"bp\">None</span><span class=\"p\">,</span> <span class=\"n\">rewards</span><span class=\"o\">=</span><span class=\"bp\">None</span><span class=\"p\">,</span> <span class=\"n\">states</span><span class=\"o\">=</span><span class=\"bp\">None</span><span class=\"p\">,</span> <span class=\"n\">gamma</span><span class=\"o\">=</span><span class=\"mf\">0.95</span><span class=\"p\">):</span>\n",
       "        <span class=\"sd\">&quot;&quot;&quot;Initialize variables of the pomdp&quot;&quot;&quot;</span>\n",
       "\n",
       "        <span class=\"k\">if</span> <span class=\"ow\">not</span> <span class=\"p\">(</span><span class=\"mi\">0</span> <span class=\"o\">&lt;</span> <span class=\"n\">gamma</span> <span class=\"o\">&lt;=</span> <span class=\"mi\">1</span><span class=\"p\">):</span>\n",
       "            <span class=\"k\">raise</span> <span class=\"ne\">ValueError</span><span class=\"p\">(</span><span class=\"s1\">&#39;A POMDP must have 0 &lt; gamma &lt;= 1&#39;</span><span class=\"p\">)</span>\n",
       "\n",
       "        <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">states</span> <span class=\"o\">=</span> <span class=\"n\">states</span>\n",
       "        <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">actions</span> <span class=\"o\">=</span> <span class=\"n\">actions</span>\n",
       "\n",
       "        <span class=\"c1\"># transition model cannot be undefined</span>\n",
       "        <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">t_prob</span> <span class=\"o\">=</span> <span class=\"n\">transitions</span> <span class=\"ow\">or</span> <span class=\"p\">{}</span>\n",
       "        <span class=\"k\">if</span> <span class=\"ow\">not</span> <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">t_prob</span><span class=\"p\">:</span>\n",
       "            <span class=\"k\">print</span><span class=\"p\">(</span><span class=\"s1\">&#39;Warning: Transition model is undefined&#39;</span><span class=\"p\">)</span>\n",
       "        \n",
       "        <span class=\"c1\"># sensor model cannot be undefined</span>\n",
       "        <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">e_prob</span> <span class=\"o\">=</span> <span class=\"n\">evidences</span> <span class=\"ow\">or</span> <span class=\"p\">{}</span>\n",
       "        <span class=\"k\">if</span> <span class=\"ow\">not</span> <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">e_prob</span><span class=\"p\">:</span>\n",
       "            <span class=\"k\">print</span><span class=\"p\">(</span><span class=\"s1\">&#39;Warning: Sensor model is undefined&#39;</span><span class=\"p\">)</span>\n",
       "        \n",
       "        <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">gamma</span> <span class=\"o\">=</span> <span class=\"n\">gamma</span>\n",
       "        <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">rewards</span> <span class=\"o\">=</span> <span class=\"n\">rewards</span>\n",
       "\n",
       "    <span class=\"k\">def</span> <span class=\"nf\">remove_dominated_plans</span><span class=\"p\">(</span><span class=\"bp\">self</span><span class=\"p\">,</span> <span class=\"n\">input_values</span><span class=\"p\">):</span>\n",
       "        <span class=\"sd\">&quot;&quot;&quot;</span>\n",
       "<span class=\"sd\">        Remove dominated plans.</span>\n",
       "<span class=\"sd\">        This method finds all the lines contributing to the</span>\n",
       "<span class=\"sd\">        upper surface and removes those which don&#39;t.</span>\n",
       "<span class=\"sd\">        &quot;&quot;&quot;</span>\n",
       "\n",
       "        <span class=\"n\">values</span> <span class=\"o\">=</span> <span class=\"p\">[</span><span class=\"n\">val</span> <span class=\"k\">for</span> <span class=\"n\">action</span> <span class=\"ow\">in</span> <span class=\"n\">input_values</span> <span class=\"k\">for</span> <span class=\"n\">val</span> <span class=\"ow\">in</span> <span class=\"n\">input_values</span><span class=\"p\">[</span><span class=\"n\">action</span><span class=\"p\">]]</span>\n",
       "        <span class=\"n\">values</span><span class=\"o\">.</span><span class=\"n\">sort</span><span class=\"p\">(</span><span class=\"n\">key</span><span class=\"o\">=</span><span class=\"k\">lambda</span> <span class=\"n\">x</span><span class=\"p\">:</span> <span class=\"n\">x</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">],</span> <span class=\"n\">reverse</span><span class=\"o\">=</span><span class=\"bp\">True</span><span class=\"p\">)</span>\n",
       "\n",
       "        <span class=\"n\">best</span> <span class=\"o\">=</span> <span class=\"p\">[</span><span class=\"n\">values</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">]]</span>\n",
       "        <span class=\"n\">y1_max</span> <span class=\"o\">=</span> <span class=\"nb\">max</span><span class=\"p\">(</span><span class=\"n\">val</span><span class=\"p\">[</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"k\">for</span> <span class=\"n\">val</span> <span class=\"ow\">in</span> <span class=\"n\">values</span><span class=\"p\">)</span>\n",
       "        <span class=\"n\">tgt</span> <span class=\"o\">=</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">]</span>\n",
       "        <span class=\"n\">prev_b</span> <span class=\"o\">=</span> <span class=\"mi\">0</span>\n",
       "        <span class=\"n\">prev_ix</span> <span class=\"o\">=</span> <span class=\"mi\">0</span>\n",
       "        <span class=\"k\">while</span> <span class=\"n\">tgt</span><span class=\"p\">[</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"o\">!=</span> <span class=\"n\">y1_max</span><span class=\"p\">:</span>\n",
       "            <span class=\"n\">min_b</span> <span class=\"o\">=</span> <span class=\"mi\">1</span>\n",
       "            <span class=\"n\">min_ix</span> <span class=\"o\">=</span> <span class=\"mi\">0</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">i</span> <span class=\"ow\">in</span> <span class=\"nb\">range</span><span class=\"p\">(</span><span class=\"n\">prev_ix</span> <span class=\"o\">+</span> <span class=\"mi\">1</span><span class=\"p\">,</span> <span class=\"nb\">len</span><span class=\"p\">(</span><span class=\"n\">values</span><span class=\"p\">)):</span>\n",
       "                <span class=\"k\">if</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"n\">i</span><span class=\"p\">][</span><span class=\"mi\">0</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">tgt</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">]</span> <span class=\"o\">+</span> <span class=\"n\">tgt</span><span class=\"p\">[</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"n\">i</span><span class=\"p\">][</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"o\">!=</span> <span class=\"mi\">0</span><span class=\"p\">:</span>\n",
       "                    <span class=\"n\">trans_b</span> <span class=\"o\">=</span> <span class=\"p\">(</span><span class=\"n\">values</span><span class=\"p\">[</span><span class=\"n\">i</span><span class=\"p\">][</span><span class=\"mi\">0</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">tgt</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">])</span> <span class=\"o\">/</span> <span class=\"p\">(</span><span class=\"n\">values</span><span class=\"p\">[</span><span class=\"n\">i</span><span class=\"p\">][</span><span class=\"mi\">0</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">tgt</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">]</span> <span class=\"o\">+</span> <span class=\"n\">tgt</span><span class=\"p\">[</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"n\">i</span><span class=\"p\">][</span><span class=\"mi\">1</span><span class=\"p\">])</span>\n",
       "                    <span class=\"k\">if</span> <span class=\"mi\">0</span> <span class=\"o\">&lt;=</span> <span class=\"n\">trans_b</span> <span class=\"o\">&lt;=</span> <span class=\"mi\">1</span> <span class=\"ow\">and</span> <span class=\"n\">trans_b</span> <span class=\"o\">&gt;</span> <span class=\"n\">prev_b</span> <span class=\"ow\">and</span> <span class=\"n\">trans_b</span> <span class=\"o\">&lt;</span> <span class=\"n\">min_b</span><span class=\"p\">:</span>\n",
       "                        <span class=\"n\">min_b</span> <span class=\"o\">=</span> <span class=\"n\">trans_b</span>\n",
       "                        <span class=\"n\">min_ix</span> <span class=\"o\">=</span> <span class=\"n\">i</span>\n",
       "            <span class=\"n\">prev_b</span> <span class=\"o\">=</span> <span class=\"n\">min_b</span>\n",
       "            <span class=\"n\">prev_ix</span> <span class=\"o\">=</span> <span class=\"n\">min_ix</span>\n",
       "            <span class=\"n\">tgt</span> <span class=\"o\">=</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"n\">min_ix</span><span class=\"p\">]</span>\n",
       "            <span class=\"n\">best</span><span class=\"o\">.</span><span class=\"n\">append</span><span class=\"p\">(</span><span class=\"n\">tgt</span><span class=\"p\">)</span>\n",
       "\n",
       "        <span class=\"k\">return</span> <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">generate_mapping</span><span class=\"p\">(</span><span class=\"n\">best</span><span class=\"p\">,</span> <span class=\"n\">input_values</span><span class=\"p\">)</span>\n",
       "\n",
       "    <span class=\"k\">def</span> <span class=\"nf\">remove_dominated_plans_fast</span><span class=\"p\">(</span><span class=\"bp\">self</span><span class=\"p\">,</span> <span class=\"n\">input_values</span><span class=\"p\">):</span>\n",
       "        <span class=\"sd\">&quot;&quot;&quot;</span>\n",
       "<span class=\"sd\">        Remove dominated plans using approximations.</span>\n",
       "<span class=\"sd\">        Resamples the upper boundary at intervals of 100 and</span>\n",
       "<span class=\"sd\">        finds the maximum values at these points.</span>\n",
       "<span class=\"sd\">        &quot;&quot;&quot;</span>\n",
       "\n",
       "        <span class=\"n\">values</span> <span class=\"o\">=</span> <span class=\"p\">[</span><span class=\"n\">val</span> <span class=\"k\">for</span> <span class=\"n\">action</span> <span class=\"ow\">in</span> <span class=\"n\">input_values</span> <span class=\"k\">for</span> <span class=\"n\">val</span> <span class=\"ow\">in</span> <span class=\"n\">input_values</span><span class=\"p\">[</span><span class=\"n\">action</span><span class=\"p\">]]</span>\n",
       "        <span class=\"n\">values</span><span class=\"o\">.</span><span class=\"n\">sort</span><span class=\"p\">(</span><span class=\"n\">key</span><span class=\"o\">=</span><span class=\"k\">lambda</span> <span class=\"n\">x</span><span class=\"p\">:</span> <span class=\"n\">x</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">],</span> <span class=\"n\">reverse</span><span class=\"o\">=</span><span class=\"bp\">True</span><span class=\"p\">)</span>\n",
       "\n",
       "        <span class=\"n\">best</span> <span class=\"o\">=</span> <span class=\"p\">[]</span>\n",
       "        <span class=\"n\">sr</span> <span class=\"o\">=</span> <span class=\"mi\">100</span>\n",
       "        <span class=\"k\">for</span> <span class=\"n\">i</span> <span class=\"ow\">in</span> <span class=\"nb\">range</span><span class=\"p\">(</span><span class=\"n\">sr</span> <span class=\"o\">+</span> <span class=\"mi\">1</span><span class=\"p\">):</span>\n",
       "            <span class=\"n\">x</span> <span class=\"o\">=</span> <span class=\"n\">i</span> <span class=\"o\">/</span> <span class=\"nb\">float</span><span class=\"p\">(</span><span class=\"n\">sr</span><span class=\"p\">)</span>\n",
       "            <span class=\"n\">maximum</span> <span class=\"o\">=</span> <span class=\"p\">(</span><span class=\"n\">values</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">][</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">][</span><span class=\"mi\">0</span><span class=\"p\">])</span> <span class=\"o\">*</span> <span class=\"n\">x</span> <span class=\"o\">+</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">][</span><span class=\"mi\">0</span><span class=\"p\">]</span>\n",
       "            <span class=\"n\">tgt</span> <span class=\"o\">=</span> <span class=\"n\">values</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">]</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">value</span> <span class=\"ow\">in</span> <span class=\"n\">values</span><span class=\"p\">:</span>\n",
       "                <span class=\"n\">val</span> <span class=\"o\">=</span> <span class=\"p\">(</span><span class=\"n\">value</span><span class=\"p\">[</span><span class=\"mi\">1</span><span class=\"p\">]</span> <span class=\"o\">-</span> <span class=\"n\">value</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">])</span> <span class=\"o\">*</span> <span class=\"n\">x</span> <span class=\"o\">+</span> <span class=\"n\">value</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">]</span>\n",
       "                <span class=\"k\">if</span> <span class=\"n\">val</span> <span class=\"o\">&gt;</span> <span class=\"n\">maximum</span><span class=\"p\">:</span>\n",
       "                    <span class=\"n\">maximum</span> <span class=\"o\">=</span> <span class=\"n\">val</span>\n",
       "                    <span class=\"n\">tgt</span> <span class=\"o\">=</span> <span class=\"n\">value</span>\n",
       "\n",
       "            <span class=\"k\">if</span> <span class=\"nb\">all</span><span class=\"p\">(</span><span class=\"nb\">any</span><span class=\"p\">(</span><span class=\"n\">tgt</span> <span class=\"o\">!=</span> <span class=\"n\">v</span><span class=\"p\">)</span> <span class=\"k\">for</span> <span class=\"n\">v</span> <span class=\"ow\">in</span> <span class=\"n\">best</span><span class=\"p\">):</span>\n",
       "                <span class=\"n\">best</span><span class=\"o\">.</span><span class=\"n\">append</span><span class=\"p\">(</span><span class=\"n\">tgt</span><span class=\"p\">)</span>\n",
       "\n",
       "        <span class=\"k\">return</span> <span class=\"bp\">self</span><span class=\"o\">.</span><span class=\"n\">generate_mapping</span><span class=\"p\">(</span><span class=\"n\">best</span><span class=\"p\">,</span> <span class=\"n\">input_values</span><span class=\"p\">)</span>\n",
       "\n",
       "    <span class=\"k\">def</span> <span class=\"nf\">generate_mapping</span><span class=\"p\">(</span><span class=\"bp\">self</span><span class=\"p\">,</span> <span class=\"n\">best</span><span class=\"p\">,</span> <span class=\"n\">input_values</span><span class=\"p\">):</span>\n",
       "        <span class=\"sd\">&quot;&quot;&quot;Generate mappings after removing dominated plans&quot;&quot;&quot;</span>\n",
       "\n",
       "        <span class=\"n\">mapping</span> <span class=\"o\">=</span> <span class=\"n\">defaultdict</span><span class=\"p\">(</span><span class=\"nb\">list</span><span class=\"p\">)</span>\n",
       "        <span class=\"k\">for</span> <span class=\"n\">value</span> <span class=\"ow\">in</span> <span class=\"n\">best</span><span class=\"p\">:</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">action</span> <span class=\"ow\">in</span> <span class=\"n\">input_values</span><span class=\"p\">:</span>\n",
       "                <span class=\"k\">if</span> <span class=\"nb\">any</span><span class=\"p\">(</span><span class=\"nb\">all</span><span class=\"p\">(</span><span class=\"n\">value</span> <span class=\"o\">==</span> <span class=\"n\">v</span><span class=\"p\">)</span> <span class=\"k\">for</span> <span class=\"n\">v</span> <span class=\"ow\">in</span> <span class=\"n\">input_values</span><span class=\"p\">[</span><span class=\"n\">action</span><span class=\"p\">]):</span>\n",
       "                    <span class=\"n\">mapping</span><span class=\"p\">[</span><span class=\"n\">action</span><span class=\"p\">]</span><span class=\"o\">.</span><span class=\"n\">append</span><span class=\"p\">(</span><span class=\"n\">value</span><span class=\"p\">)</span>\n",
       "\n",
       "        <span class=\"k\">return</span> <span class=\"n\">mapping</span>\n",
       "\n",
       "    <span class=\"k\">def</span> <span class=\"nf\">max_difference</span><span class=\"p\">(</span><span class=\"bp\">self</span><span class=\"p\">,</span> <span class=\"n\">U1</span><span class=\"p\">,</span> <span class=\"n\">U2</span><span class=\"p\">):</span>\n",
       "        <span class=\"sd\">&quot;&quot;&quot;Find maximum difference between two utility mappings&quot;&quot;&quot;</span>\n",
       "\n",
       "        <span class=\"k\">for</span> <span class=\"n\">k</span><span class=\"p\">,</span> <span class=\"n\">v</span> <span class=\"ow\">in</span> <span class=\"n\">U1</span><span class=\"o\">.</span><span class=\"n\">items</span><span class=\"p\">():</span>\n",
       "            <span class=\"n\">sum1</span> <span class=\"o\">=</span> <span class=\"mi\">0</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">element</span> <span class=\"ow\">in</span> <span class=\"n\">U1</span><span class=\"p\">[</span><span class=\"n\">k</span><span class=\"p\">]:</span>\n",
       "                <span class=\"n\">sum1</span> <span class=\"o\">+=</span> <span class=\"nb\">sum</span><span class=\"p\">(</span><span class=\"n\">element</span><span class=\"p\">)</span>\n",
       "            <span class=\"n\">sum2</span> <span class=\"o\">=</span> <span class=\"mi\">0</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">element</span> <span class=\"ow\">in</span> <span class=\"n\">U2</span><span class=\"p\">[</span><span class=\"n\">k</span><span class=\"p\">]:</span>\n",
       "                <span class=\"n\">sum2</span> <span class=\"o\">+=</span> <span class=\"nb\">sum</span><span class=\"p\">(</span><span class=\"n\">element</span><span class=\"p\">)</span>\n",
       "        <span class=\"k\">return</span> <span class=\"nb\">abs</span><span class=\"p\">(</span><span class=\"n\">sum1</span> <span class=\"o\">-</span> <span class=\"n\">sum2</span><span class=\"p\">)</span>\n",
       "</pre></div>\n",
       "</body>\n",
       "</html>\n"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "psource(POMDP)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `POMDP` class includes all variables of the `MDP` class and additionally also stores the sensor model in `e_prob`.\n",
    "<br>\n",
    "<br>\n",
    "`remove_dominated_plans`, `remove_dominated_plans_fast`, `generate_mapping` and `max_difference` are helper methods for `pomdp_value_iteration` which will be explained shortly."
   ]
  },
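  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To build intuition for dominated-plan removal, here is a minimal sketch of our own that mirrors the sampling idea of `remove_dominated_plans_fast`: each plan is a line over the one-dimensional belief space, given as its values at $b=0$ and $b=1$, and we keep only the lines that contribute to the upper surface (the example vectors are hypothetical):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def upper_surface_plans(vectors, samples=100):\n",
    "    \"\"\"Keep only the vectors (v0, v1) whose line (v1 - v0) * b + v0\n",
    "    attains the maximum at some sampled belief b in [0, 1].\"\"\"\n",
    "    kept = set()\n",
    "    for i in range(samples + 1):\n",
    "        b = i / samples\n",
    "        kept.add(max(vectors, key=lambda v: (v[1] - v[0]) * b + v[0]))\n",
    "    return kept\n",
    "\n",
    "print(upper_surface_plans([(0.1, 1.9), (0.9, 1.1), (0.2, 0.3)]))\n",
    "# {(0.1, 1.9), (0.9, 1.1)} -- (0.2, 0.3) lies below the surface everywhere"
   ]
  },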
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To understand how we can model a partially observable MDP, let's take a simple example.\n",
    "Let's consider a simple two state world.\n",
    "The states are labelled 0 and 1, with the reward at state 0 being 0 and at state 1 being 1.\n",
    "<br>\n",
    "There are two actions:\n",
    "<br>\n",
    "`Stay`: stays put with probability 0.9 and\n",
    "`Go`: switches to the other state with probability 0.9.\n",
    "<br>\n",
    "For now, let's assume the discount factor `gamma` to be 1.\n",
    "<br>\n",
    "The sensor reports the correct state with probability 0.6.\n",
    "<br>\n",
    "This is a simple problem with a trivial solution.\n",
    "Obviously the agent should `Stay` when it thinks it is in state 1 and `Go` when it thinks it is in state 0.\n",
    "<br>\n",
    "The belief space can be viewed as one-dimensional because the two probabilities must sum to 1."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let's model this POMDP using the `POMDP` class."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {},
   "outputs": [],
   "source": [
    "# transition probability P(s'|s,a)\n",
    "t_prob = [[[0.9, 0.1], [0.1, 0.9]], [[0.1, 0.9], [0.9, 0.1]]]\n",
    "# evidence function P(e|s)\n",
    "e_prob = [[[0.6, 0.4], [0.4, 0.6]], [[0.6, 0.4], [0.4, 0.6]]]\n",
    "# reward function\n",
    "rewards = [[0.0, 0.0], [1.0, 1.0]]\n",
    "# discount factor\n",
    "gamma = 0.95\n",
    "# actions\n",
    "actions = ('0', '1')\n",
    "# states\n",
    "states = ('0', '1')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {},
   "outputs": [],
   "source": [
    "pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We have defined our `POMDP` object."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## POMDP VALUE ITERATION\n",
    "Defining a POMDP is useless unless we can find a way to solve it. As POMDPs can have infinitely many belief states, we cannot calculate one utility value for each state as we did in `value_iteration` for MDPs.\n",
    "<br>\n",
    "Instead of thinking about policies, we should think about conditional plans and how the expected utility of executing a fixed conditional plan varies with the initial belief state.\n",
    "<br>\n",
    "If we bound the depth of the conditional plans, then there are only finitely many such plans and the continuous space of belief states will generally be divided inte _regions_, each corresponding to a particular conditional plan that is optimal in that region. The utility function, being the maximum of a collection of hyperplanes, will be piecewise linear and convex.\n",
    "<br>\n",
    "For the one-step plans `Stay` and `Go`, the utility values are as follows\n",
    "<br>\n",
    "<br>\n",
    "$$\\alpha_{|Stay|}(0) = R(0) + \\gamma(0.9R(0) + 0.1R(1)) = 0.1$$\n",
    "$$\\alpha_{|Stay|}(1) = R(1) + \\gamma(0.9R(1) + 0.1R(0)) = 1.9$$\n",
    "$$\\alpha_{|Go|}(0) = R(0) + \\gamma(0.9R(1) + 0.1R(0)) = 0.9$$\n",
    "$$\\alpha_{|Go|}(1) = R(1) + \\gamma(0.9R(0) + 0.1R(1)) = 1.1$$"
   ]
  },
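  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check, the four utilities above can be reproduced by direct substitution (a throwaway sketch with $\\gamma = 1$, matching the hand calculation):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "R = [0.0, 1.0]  # rewards for states 0 and 1\n",
    "P_stay = [[0.9, 0.1], [0.1, 0.9]]  # P(s'|s, Stay)\n",
    "P_go = [[0.1, 0.9], [0.9, 0.1]]    # P(s'|s, Go)\n",
    "\n",
    "def alpha(P, s, gamma=1.0):  # gamma = 1, as in the hand calculation\n",
    "    \"\"\"One-step plan utility: R(s) + gamma * sum_s' P(s'|s) * R(s')\"\"\"\n",
    "    return R[s] + gamma * sum(P[s][s1] * R[s1] for s1 in range(2))\n",
    "\n",
    "print(alpha(P_stay, 0), alpha(P_stay, 1))  # 0.1 1.9\n",
    "print(alpha(P_go, 0), alpha(P_go, 1))      # 0.9 1.1"
   ]
  },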
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The utility function can be found by `pomdp_value_iteration`.\n",
    "<br>\n",
    "To summarize, it generates a set of all plans consisting of an action and, for each possible next percept, a plan in U with computed utility vectors.\n",
    "The dominated plans are then removed from this set and the process is repeated till the maximum difference between the utility functions of two consecutive iterations reaches a value less than a threshold value."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/markdown": [
       "### AIMA3e\n",
       "__function__ POMDP-VALUE-ITERATION(_pomdp_, _&epsi;_) __returns__ a utility function  \n",
       "&emsp;__inputs__: _pomdp_, a POMDP with states _S_, actions _A_(_s_), transition model _P_(_s&prime;_ &vert; _s_, _a_),  \n",
       "&emsp;&emsp;&emsp;&emsp;&emsp;&emsp;sensor model _P_(_e_ &vert; _s_), rewards _R_(_s_), discount _&gamma;_  \n",
       "&emsp;&emsp;&emsp;&emsp;&emsp;_&epsi;_, the maximum error allowed in the utility of any state  \n",
       "&emsp;__local variables__: _U_, _U&prime;_, sets of plans _p_ with associated utility vectors _&alpha;<sub>p</sub>_  \n",
       "\n",
       "&emsp;_U&prime;_ &larr; a set containing just the empty plan \\[\\], with _&alpha;<sub>\\[\\]</sub>_(_s_) = _R_(_s_)  \n",
       "&emsp;__repeat__  \n",
       "&emsp;&emsp;&emsp;_U_ &larr; _U&prime;_  \n",
       "&emsp;&emsp;&emsp;_U&prime;_ &larr; the set of all plans consisting of an action and, for each possible next percept,  \n",
       "&emsp;&emsp;&emsp;&emsp;&emsp;a plan in _U_ with utility vectors computed according to Equation(__??__)  \n",
       "&emsp;&emsp;&emsp;_U&prime;_ &larr; REMOVE\\-DOMINATED\\-PLANS(_U&prime;_)  \n",
       "&emsp;__until__ MAX\\-DIFFERENCE(_U_, _U&prime;_) &lt; _&epsi;_(1 &minus; _&gamma;_) &frasl; _&gamma;_  \n",
       "&emsp;__return__ _U_  \n",
       "\n",
       "---\n",
       "__Figure ??__ A high\\-level sketch of the value iteration algorithm for POMDPs. The REMOVE\\-DOMINATED\\-PLANS step and MAX\\-DIFFERENCE test are typically implemented as linear programs."
      ],
      "text/plain": [
       "<IPython.core.display.Markdown object>"
      ]
     },
     "execution_count": 35,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "pseudocode('POMDP-Value-Iteration')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let's have a look at the `pomdp_value_iteration` function."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<!DOCTYPE html PUBLIC \"-//W3C//DTD HTML 4.01//EN\"\n",
       "   \"http://www.w3.org/TR/html4/strict.dtd\">\n",
       "\n",
       "<html>\n",
       "<head>\n",
       "  <title></title>\n",
       "  <meta http-equiv=\"content-type\" content=\"text/html; charset=None\">\n",
       "  <style type=\"text/css\">\n",
       "td.linenos { background-color: #f0f0f0; padding-right: 10px; }\n",
       "span.lineno { background-color: #f0f0f0; padding: 0 5px 0 5px; }\n",
       "pre { line-height: 125%; }\n",
       "body .hll { background-color: #ffffcc }\n",
       "body  { background: #f8f8f8; }\n",
       "body .c { color: #408080; font-style: italic } /* Comment */\n",
       "body .err { border: 1px solid #FF0000 } /* Error */\n",
       "body .k { color: #008000; font-weight: bold } /* Keyword */\n",
       "body .o { color: #666666 } /* Operator */\n",
       "body .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
       "body .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
       "body .cp { color: #BC7A00 } /* Comment.Preproc */\n",
       "body .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
       "body .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
       "body .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
       "body .gd { color: #A00000 } /* Generic.Deleted */\n",
       "body .ge { font-style: italic } /* Generic.Emph */\n",
       "body .gr { color: #FF0000 } /* Generic.Error */\n",
       "body .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
       "body .gi { color: #00A000 } /* Generic.Inserted */\n",
       "body .go { color: #888888 } /* Generic.Output */\n",
       "body .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
       "body .gs { font-weight: bold } /* Generic.Strong */\n",
       "body .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
       "body .gt { color: #0044DD } /* Generic.Traceback */\n",
       "body .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
       "body .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
       "body .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
       "body .kp { color: #008000 } /* Keyword.Pseudo */\n",
       "body .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
       "body .kt { color: #B00040 } /* Keyword.Type */\n",
       "body .m { color: #666666 } /* Literal.Number */\n",
       "body .s { color: #BA2121 } /* Literal.String */\n",
       "body .na { color: #7D9029 } /* Name.Attribute */\n",
       "body .nb { color: #008000 } /* Name.Builtin */\n",
       "body .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
       "body .no { color: #880000 } /* Name.Constant */\n",
       "body .nd { color: #AA22FF } /* Name.Decorator */\n",
       "body .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
       "body .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
       "body .nf { color: #0000FF } /* Name.Function */\n",
       "body .nl { color: #A0A000 } /* Name.Label */\n",
       "body .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
       "body .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
       "body .nv { color: #19177C } /* Name.Variable */\n",
       "body .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
       "body .w { color: #bbbbbb } /* Text.Whitespace */\n",
       "body .mb { color: #666666 } /* Literal.Number.Bin */\n",
       "body .mf { color: #666666 } /* Literal.Number.Float */\n",
       "body .mh { color: #666666 } /* Literal.Number.Hex */\n",
       "body .mi { color: #666666 } /* Literal.Number.Integer */\n",
       "body .mo { color: #666666 } /* Literal.Number.Oct */\n",
       "body .sa { color: #BA2121 } /* Literal.String.Affix */\n",
       "body .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
       "body .sc { color: #BA2121 } /* Literal.String.Char */\n",
       "body .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
       "body .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
       "body .s2 { color: #BA2121 } /* Literal.String.Double */\n",
       "body .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
       "body .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
       "body .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
       "body .sx { color: #008000 } /* Literal.String.Other */\n",
       "body .sr { color: #BB6688 } /* Literal.String.Regex */\n",
       "body .s1 { color: #BA2121 } /* Literal.String.Single */\n",
       "body .ss { color: #19177C } /* Literal.String.Symbol */\n",
       "body .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
       "body .fm { color: #0000FF } /* Name.Function.Magic */\n",
       "body .vc { color: #19177C } /* Name.Variable.Class */\n",
       "body .vg { color: #19177C } /* Name.Variable.Global */\n",
       "body .vi { color: #19177C } /* Name.Variable.Instance */\n",
       "body .vm { color: #19177C } /* Name.Variable.Magic */\n",
       "body .il { color: #666666 } /* Literal.Number.Integer.Long */\n",
       "\n",
       "  </style>\n",
       "</head>\n",
       "<body>\n",
       "<h2></h2>\n",
       "\n",
       "<div class=\"highlight\"><pre><span></span><span class=\"k\">def</span> <span class=\"nf\">pomdp_value_iteration</span><span class=\"p\">(</span><span class=\"n\">pomdp</span><span class=\"p\">,</span> <span class=\"n\">epsilon</span><span class=\"o\">=</span><span class=\"mf\">0.1</span><span class=\"p\">):</span>\n",
       "    <span class=\"sd\">&quot;&quot;&quot;Solving a POMDP by value iteration.&quot;&quot;&quot;</span>\n",
       "\n",
       "    <span class=\"n\">U</span> <span class=\"o\">=</span> <span class=\"p\">{</span><span class=\"s1\">&#39;&#39;</span><span class=\"p\">:[[</span><span class=\"mi\">0</span><span class=\"p\">]</span><span class=\"o\">*</span> <span class=\"nb\">len</span><span class=\"p\">(</span><span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">states</span><span class=\"p\">)]}</span>\n",
       "    <span class=\"n\">count</span> <span class=\"o\">=</span> <span class=\"mi\">0</span>\n",
       "    <span class=\"k\">while</span> <span class=\"bp\">True</span><span class=\"p\">:</span>\n",
       "        <span class=\"n\">count</span> <span class=\"o\">+=</span> <span class=\"mi\">1</span>\n",
       "        <span class=\"n\">prev_U</span> <span class=\"o\">=</span> <span class=\"n\">U</span>\n",
       "        <span class=\"n\">values</span> <span class=\"o\">=</span> <span class=\"p\">[</span><span class=\"n\">val</span> <span class=\"k\">for</span> <span class=\"n\">action</span> <span class=\"ow\">in</span> <span class=\"n\">U</span> <span class=\"k\">for</span> <span class=\"n\">val</span> <span class=\"ow\">in</span> <span class=\"n\">U</span><span class=\"p\">[</span><span class=\"n\">action</span><span class=\"p\">]]</span>\n",
       "        <span class=\"n\">value_matxs</span> <span class=\"o\">=</span> <span class=\"p\">[]</span>\n",
       "        <span class=\"k\">for</span> <span class=\"n\">i</span> <span class=\"ow\">in</span> <span class=\"n\">values</span><span class=\"p\">:</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">j</span> <span class=\"ow\">in</span> <span class=\"n\">values</span><span class=\"p\">:</span>\n",
       "                <span class=\"n\">value_matxs</span><span class=\"o\">.</span><span class=\"n\">append</span><span class=\"p\">([</span><span class=\"n\">i</span><span class=\"p\">,</span> <span class=\"n\">j</span><span class=\"p\">])</span>\n",
       "\n",
       "        <span class=\"n\">U1</span> <span class=\"o\">=</span> <span class=\"n\">defaultdict</span><span class=\"p\">(</span><span class=\"nb\">list</span><span class=\"p\">)</span>\n",
       "        <span class=\"k\">for</span> <span class=\"n\">action</span> <span class=\"ow\">in</span> <span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">actions</span><span class=\"p\">:</span>\n",
       "            <span class=\"k\">for</span> <span class=\"n\">u</span> <span class=\"ow\">in</span> <span class=\"n\">value_matxs</span><span class=\"p\">:</span>\n",
       "                <span class=\"n\">u1</span> <span class=\"o\">=</span> <span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">matmul</span><span class=\"p\">(</span><span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">matmul</span><span class=\"p\">(</span><span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">t_prob</span><span class=\"p\">[</span><span class=\"nb\">int</span><span class=\"p\">(</span><span class=\"n\">action</span><span class=\"p\">)],</span> <span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">multiply</span><span class=\"p\">(</span><span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">e_prob</span><span class=\"p\">[</span><span class=\"nb\">int</span><span class=\"p\">(</span><span class=\"n\">action</span><span class=\"p\">)],</span> <span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">transpose</span><span class=\"p\">(</span><span class=\"n\">u</span><span class=\"p\">))),</span> <span class=\"p\">[[</span><span class=\"mi\">1</span><span class=\"p\">],</span> <span class=\"p\">[</span><span class=\"mi\">1</span><span class=\"p\">]])</span>\n",
       "                <span class=\"n\">u1</span> <span class=\"o\">=</span> <span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">add</span><span class=\"p\">(</span><span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">scalar_multiply</span><span class=\"p\">(</span><span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">gamma</span><span class=\"p\">,</span> <span class=\"n\">Matrix</span><span class=\"o\">.</span><span class=\"n\">transpose</span><span class=\"p\">(</span><span class=\"n\">u1</span><span class=\"p\">)),</span> <span class=\"p\">[</span><span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">rewards</span><span class=\"p\">[</span><span class=\"nb\">int</span><span class=\"p\">(</span><span class=\"n\">action</span><span class=\"p\">)]])</span>\n",
       "                <span class=\"n\">U1</span><span class=\"p\">[</span><span class=\"n\">action</span><span class=\"p\">]</span><span class=\"o\">.</span><span class=\"n\">append</span><span class=\"p\">(</span><span class=\"n\">u1</span><span class=\"p\">[</span><span class=\"mi\">0</span><span class=\"p\">])</span>\n",
       "\n",
       "        <span class=\"n\">U</span> <span class=\"o\">=</span> <span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">remove_dominated_plans_fast</span><span class=\"p\">(</span><span class=\"n\">U1</span><span class=\"p\">)</span>\n",
       "        <span class=\"c1\"># replace with U = pomdp.remove_dominated_plans(U1) for accurate calculations</span>\n",
       "        \n",
       "        <span class=\"k\">if</span> <span class=\"n\">count</span> <span class=\"o\">&gt;</span> <span class=\"mi\">10</span><span class=\"p\">:</span>\n",
       "            <span class=\"k\">if</span> <span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">max_difference</span><span class=\"p\">(</span><span class=\"n\">U</span><span class=\"p\">,</span> <span class=\"n\">prev_U</span><span class=\"p\">)</span> <span class=\"o\">&lt;</span> <span class=\"n\">epsilon</span> <span class=\"o\">*</span> <span class=\"p\">(</span><span class=\"mi\">1</span> <span class=\"o\">-</span> <span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">gamma</span><span class=\"p\">)</span> <span class=\"o\">/</span> <span class=\"n\">pomdp</span><span class=\"o\">.</span><span class=\"n\">gamma</span><span class=\"p\">:</span>\n",
       "                <span class=\"k\">return</span> <span class=\"n\">U</span>\n",
       "</pre></div>\n",
       "</body>\n",
       "</html>\n"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "psource(pomdp_value_iteration)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This function uses two aptly named helper methods from the `POMDP` class, `remove_dominated_plans` and `max_difference`."
   ]
  },
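  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a rough illustration of the convergence check (a minimal sketch under the `{action: [alpha vectors]}` layout built inside `pomdp_value_iteration`, not the `POMDP` class's actual implementation), `max_difference` can be thought of as the largest absolute gap between corresponding values in two successive utility estimates:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Minimal sketch of a convergence test between two utility dicts of the\n",
    "# form {action: [alpha vectors]} -- an assumption, not the class's method\n",
    "def max_difference_sketch(U1, U2):\n",
    "    values1 = [v for action in U1 for vector in U1[action] for v in vector]\n",
    "    values2 = [v for action in U2 for vector in U2[action] for v in vector]\n",
    "    return max(abs(a - b) for a, b in zip(values1, values2))"
   ]
  },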
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let's try solving a simple one-dimensional POMDP using value-iteration.\n",
    "<br>\n",
    "Consider the problem of a user listening to voicemails.\n",
    "At the end of each message, they can either _save_ or _delete_ a message.\n",
    "This forms the unobservable state _S = {save, delete}_.\n",
    "It is the task of the POMDP solver to guess which goal the user has.\n",
    "<br>\n",
    "The belief space has two elements, _b(s = save)_ and _b(s = delete)_.\n",
    "For example, for the belief state _b = (1, 0)_, the left end of the line segment indicates _b(s = save) = 1_ and _b(s = delete) = 0_.\n",
    "The intermediate points represent varying degrees of certainty in the user's goal.\n",
    "<br>\n",
    "The machine has three available actions: it can _ask_ what the user wishes to do in order to infer his or her current goal, or it can _doSave_ or _doDelete_ and move to the next message.\n",
    "If the user says _save_, then an error may occur with probability 0.2, whereas if the user says _delete_, an error may occur with a probability 0.3.\n",
    "<br>\n",
    "The machine receives a large positive reward (+5) for getting the user's goal correct, a very large negative reward (-20) for taking the action _doDelete_ when the user wanted _save_, and a smaller but still significant negative reward (-10) for taking the action _doSave_ when the user wanted _delete_. \n",
    "There is also a small negative reward for taking the _ask_ action (-1).\n",
    "The discount factor is set to 0.95 for this example.\n",
    "<br>\n",
    "Let's define the POMDP."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {},
   "outputs": [],
   "source": [
    "# transition function P(s'|s,a)\n",
    "t_prob = [[[0.65, 0.35], [0.65, 0.35]], [[0.65, 0.35], [0.65, 0.35]], [[1.0, 0.0], [0.0, 1.0]]]\n",
    "# evidence function P(e|s)\n",
    "e_prob = [[[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]], [[0.8, 0.2], [0.3, 0.7]]]\n",
    "# reward function\n",
    "rewards = [[5, -10], [-20, 5], [-1, -1]]\n",
    "\n",
    "gamma = 0.95\n",
    "actions = ('0', '1', '2')\n",
    "states = ('0', '1')\n",
    "\n",
    "pomdp = POMDP(actions, t_prob, e_prob, rewards, states, gamma)"
   ]
  },
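  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before solving it, here is a minimal sketch of how a belief state evolves in this model, assuming the index layout used above (`t_prob[a][s][s1]` = _P(s'|s, a)_ and `e_prob[a][s1][e]` = _P(e|s')_): after taking action _a_ and observing evidence _e_, the updated belief is _b'(s') ∝ P(e|s') Σₛ P(s'|s, a) b(s)_."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch of a Bayesian belief update (assumes the index layout of the\n",
    "# t_prob and e_prob lists defined above)\n",
    "def belief_update(b, a, e):\n",
    "    n = len(b)\n",
    "    # predict: push the current belief through the transition model\n",
    "    b_pred = [sum(t_prob[a][s][s1] * b[s] for s in range(n)) for s1 in range(n)]\n",
    "    # correct: weight by the likelihood of the evidence, then normalize\n",
    "    unnorm = [e_prob[a][s1][e] * b_pred[s1] for s1 in range(n)]\n",
    "    total = sum(unnorm)\n",
    "    return [u / total for u in unnorm]\n",
    "\n",
    "# from a uniform belief, ask (action 2) and hear 'save' (evidence 0)\n",
    "belief_update([0.5, 0.5], 2, 0)  # -> roughly [0.73, 0.27]"
   ]
  },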
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We have defined the `POMDP` object.\n",
    "Let's run `pomdp_value_iteration` to find the utility function."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [],
   "source": [
    "utility = pomdp_value_iteration(pomdp, epsilon=0.1)"
   ]
  },
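  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The returned `utility` maps each action to a list of alpha vectors, one utility value per state (this is the dict structure built inside `pomdp_value_iteration` above). A quick way to inspect its shape:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Peek at the piecewise-linear utility: one list of alpha vectors per\n",
    "# action, each vector holding one value per state (save, delete)\n",
    "for action, vectors in utility.items():\n",
    "    print(action, len(vectors), 'alpha vector(s), first:', vectors[0])"
   ]
  },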
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYEAAAD8CAYAAACRkhiPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzsnXd81dX9/5+fm733JiEBEkYYYRNkrwABW9yjWq2tP7WuVq24v0WtiKNaF1K0al3ggNIEhIDKEhlhRCBkD7L33rnn98eH+ykrZN2bm3Gej0cekuQzziee+359zjnv83orQggkEolEMjDRmbsBEolEIjEfUgQkEolkACNFQCKRSAYwUgQkEolkACNFQCKRSAYwUgQkEolkACNFQCKRSAYwUgQkEolkACNFQCKRSAYwluZuwPl4enqK4OBgczdDIpFI+hTx8fElQgivrpzbq0QgODiYI0eOmLsZEolE0qdQFCWrq+fK6SCJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgGMFAGJRCIZwEgRkEgkkgHMgBaBF198kfDwcMaOHUtERAQHDx40d5MkfYBNmzahKApnzpy54nGOjo491CJJW1hYWBAREUF4eDjjxo3j9ddfR6/XX/GczMxMRo8e3e4xn3/+uTGbajYGrAgcOHCAmJgYjh49SkJCAjt37iQwMNDczZL0Ab744gtmzJjBl19+ae6mSNrBzs6O48ePc+rUKeLi4ti6dSt//etfu31dKQL9gPz8fDw9PbGxsQHA09MTf39/Vq1axeTJkxk9ejR33303QggSExOZMmWKdm5mZiZjx44FID4+ntmzZzNx4kSioqLIz883y/NIeoaamhr279/PBx98oIlAfn4+s2bNIiIigtGjR7N3794LzikpKSEyMpLY2FhzNFlyDm9vb9atW8fbb7+NEILW1lYee+wxJk+ezNixY3n//fcvOaetY1auXMnevXuJiIjg73//e4eu1WsRQvSar4kTJ4qeorq6WowbN06EhoaKe++9V/z4449CCCFKS0u1Y37zm9+ILVu2CCGEGDdunEhLSxNCCLF69Wrx/PPPi6amJhEZGSmKioqEEEJ8+eWX4s477+yxZ5D0PP/+97/F7373OyGEEJGRkSI+Pl68+uqr4oUXXhBCCNHS0iKqqqqEEEI4ODiIgoICMWXKFLFjxw6ztXkg4+DgcMnPXF1dRUFBgXj//ffF888/L4QQoqGhQUycOFGkp6eLjIwMER4eLoQQbR7zww8/iOjoaO2abR3XUwBHRBfjrlGspBVF+RBYBhQJIUaf+5k7sAEIBjKBG4QQ5ca4nzFwdHQkPj6evXv38sMPP3DjjTeyevVqnJycWLNmDXV1dZSVlREeHs7y5cu54YYb2LhxIytXrmTDhg1s2LCBpKQkTp48ycKFCwH1rcHPz8/MTyYxJV988QUPP/wwADfddBNffPEFy5cv53e/+x3Nzc38+te/JiIiAoDm5mbmz5/PO++8w+zZs83ZbMl5qDETduzYQUJCAl9//TUAlZWVpKSkEBYWph3b1jHW1tYXXLOt40JCQnrikbpHV9Xj/C9gFjABOHnez9YAK8/9eyXwcnvXCRsdJhpbGk2ilO3x1VdfiQULFghvb2+RnZ0thBDiueeeE88995wQQojU1FQxfvx4kZSUJCZMmCCEECIhIUFMmzbNLO2V9DwlJSXC1tZWBAUFicGDB4tBgwaJwMBAodfrRW5urli3bp0YPXq0+Pjjj4UQQtjb24vbb79dPPHEE2Zu+cDl4pFAWlqacHd3F3q9XlxzzTXiu+++u+Sc80cCbR1z8UigreN6gvz87o0EjLImIITYA5Rd9ONfAR+f+/fHwK/bu05yaTKeazy5buN1fHT8IwprCo3RvMuSlJRESkqK9v3x48cZPnw4oK4P1NTUaKoOMHToUCwsLHj++ee58cYbARg+fDjFxcUcOHAAUN/8Tp06ZbI2S8zL119/ze23305WVhaZmZmcPXuWkJAQ9uzZg7e3N3/4wx+46667OHr0KACKovDhhx9y5swZVq9ebebWS4qLi7nnnnu4//77URSFqKgo3nvvPZqbmwFITk6mtrb2gnPaOsbJyYnq6up2jzMlubnw0EPQ3cGGKSuL+Qgh8gGEEPmKoni3d8JQ96HMGz2P2JRYvkn8BgWFyQGTWRa6jOiwaMb7jkdRFKM0rqamhgceeICKigosLS0ZNmwY69atw9XVlTFjxhAcHMzkyZMvOOfGG2/kscceIyMjAwBra2u+/vprHnzwQSorK2lpaeHhhx8mPDzcKG2U9C6++OILVq5cecHPrr32Wu644w4cHBywsrLC0dGRTz75RPu9hYUFX375JcuXL8fZ2Zn77ruvp5s9oKmvryciIoLm5mYsLS257bbb+POf/wzA73//ezIzM5kwYQJCCLy8vNi8efMF57d1zNixY7G0tGTcuHHccccdPPTQQ+1ey1hkZ8PLL8P69aDXw223wb/+1fXrKeLc/Fh3URQlGIgR/1sTqBBCuJ73+3IhhNtlzrsbuBsgKChoYlZWFkIIjhccJyY5htiUWA7lHkIg8HfyZ+mwpSwLW8aCIQtwsHYwStslEomkt5OZCS+99L+Af+edsHKlOhJQFCVeCDGpK9c1pQgkAXPOjQL8gB+FEMOvdI1JkyaJy9UYLqwpZFvqNmJTYtmeup3qpmpsLGyYEzyHZWHLiA6NJsStDyzAmIGdZeos3QJ3dzO3RCLpOgO5H6elwd/+Bp98Ajod/P738PjjEBT0v2N6qwi8ApQKIVYrirIScBdC/OVK12hLBM6nqbWJfdn7iEmOISY5hpQydV5/lNcookOjWRa2jOmB07HUmXKmq+8w59gxAH4cP97MLZFIus5A7MdJSWrw/+wzsLKCu++Gv/wFAgIuPdbsIqAoyhfAHMATKASeAzYDG4EgIBu4Xghx8eLxBXREBC4muTSZ2ORYYlNi2Z21mxZ9C662riwetphloctYPGwxHvYeXXiq/sFA/PBI+h8DqR+fPg0vvghffgk2NnDvvfDoo3Cl7PPuiIBRXpeFEDe38av5xrj+lQjzCCMsMow/Rf6JqsYq4tLiiEmJYWvKVr48+SU6RUfkoEhtlDDae7TRFpclEonEWPzyC7zwAnz1Fdjbq4H/kUfAu92Umu7Rr+ZMnG2cuXbUtVw76lr0Qs+RvCPa4vKT3z/Jk98/SZBLkCYIc4PnYmdlZ+5mSySSAczx4/D88/Dtt+DkBE88AX/6E3h69sz9e5UItOfu1xl0io4pAVOYEjCFVXNXkVedx9aUrcQkx/DxiY9578h72FnaMX/IfC0FdZDzIKPdXyKRSK7EkSNq8N+yBVxc4Nln1bz/nl77NtrCsDFQFEUA6HQ6fH19iYqK4qmnnmLo0KFGvU9DSwO7M3eri8spMWRWZAIwzmecNkqYEjAFC52FUe9rDpLq6gAYbm9v5pZIJF2nP/Xjn3+
GVatg2zZwc1Pf+h94AFxd2z+3Lcy+MGwsDCLQFs7OzkRERPDII4+wfPlyo8ztCyFILEnUpo32Z++nVbTiae/JkmFLWBa2jEVDF+Fq243/QxKJZMCzb58a/OPiwMNDnfO/7z5wdu7+tfuNCLi6uopp06aRnZ1NWloaTU1N7Z5jZWVFSEgIt9xyC3/6059w7uZftLy+nO1p24lJjmFb6jbK6suwUCyYOXimNkoY7jG8zywu/7ekBIDlPTXBKJGYgL7aj4WA3bvV4P/DD+oi72OPwT33gDFrDvUbETh/JGBjY4OXlxe+vr5YW1tTX19PRkYGlZWVtNdmRVHw9PRk9uzZPPfcc+1WCWqLVn0rP+f8TGxKLDHJMfxS9AsAQ9yGaOsIswfPxsbSpkvX7wkGUmqdpP/S1/qxELBrlxr89+4FX191g9fdd6uZP8am34hASEiIWLRoEQkJCWRkZFBWVqYZMhmwtrbGw8MDV1dXdDod5eXlFBYW0tra2u717ezsGD16NA888AC33HILFhadm/PPrswmNjmWmJQYvs/4noaWBhysHFg0dBHRodEsDV2Kn1PvspLuax8eieRy9JV+LARs364G/wMH1I1dK1fCXXeBnQkTEfuNCFxus1hZWRlxcXHs2bOH48ePk5GRQUlJySXiYGVlhYuLC/b29uj1ekpLS6mvr2/3nhYWFvj7+3Pttdfy7LPP4uZ2ib3RZalrruP7jO81UcipygFgot9EloUtY1nYMib4TUCnmLd4W1/58EgkV6K392MhIDZWDf6HD6uWDk88ofr72PTAREG/FoG2qKqqYufOnezZs4djx46RlpZGSUkJjY2NFxxnaWmJg4MDNjY21NfXU1tb26FUVFdXVyIjI3n22WeZNm3aFY8VQpBQmKBNG/2c8zMCgY+DD9Gh0USHRbNwyEKcbJw69GzGpLd/eCSSjtBb+7Fer6Z4rloFx46pZm5PPgm33w4X1Z0xKQNSBNqitraW77//nt27d3P06FFSU1MpLi6moaHhguMsLCywtbVFp9NRX19PS0tLu9e2sbFh2LBh3H333dx3331YWl5+m0VxbTHfpX5HbEos36V+R2VjJVY6K+YEz9EWl4e6GzfttS1664dHIukMva0f6/Xq5q7nn4eEBBg2DJ56Cm69VfX56WmkCHSAhoYGdu/ezY8//kh8fDwpKSkUFhZeMmWk0+mwsrJCr9dfMuV0OXQ6Hd7e3ixdupQXXnjhkvKSza3N/HT2J21PwpmSMwAM9xiuOaDOCJqBlYVpes7Zc+IXaGtrkutLJD1Bb+nHra2wcaNq73D6NAwfDk8/DTfdBG28E/YIUgS6QVNTE/v27eOHH37gyJEjJCcnU1BQQN25zSkGFEVBp9Oh1+vbzU4CcHBwYMKECTz11FNERUVpP08rSyM2RTW8+zHzR5pam3C2cSZqaBTLwpaxZNgSvBy8jP6cEomk67S0wBdfqMZuSUkwahQ88wxcfz10Mr/EJPQbEXBxcRGPP/44V199NeHh4WbNxW9paeHgwYPs3LmTw4cPk5ycTF5e3mVLximK0iFhsLS0JCgoiN/+9rc8/vjjNNHEzvSdmigU1BSgoDB10FQtBXWcz7hu/R02FBUBcKOpXagkEhNirn7c3AyffqoG/7Q0GDtWDf7XXKN6+/cW+o0IXGnHsJWVFa6urvj7+zN27FiWL1/OwoULce3OXusuoNfriY+PZ+fOnRw8eJAzZ86Ql5d3Qb3RzqAoCi4uLsybN49bH7qVX/S/EJMSw5E8dUQ0yHmQurgcGs38IfOxt+pcknFvm0uVSLpCT/fjpib4+GPVzz8zEyZMUIP/1Vf3ruBvYECIQEfQ6XTY2dnh6elJSEgIM2bMYMWKFYwbN67TewI6i16vJyEhgbi4OA4ePEhiYiI5OTlUV1d3aJRwMVbWVvgM9sF3iS9nfM9Q01SDraUtc4PnamsJg10Ht3sdKQKS/kBP9ePGRvjwQ7WM49mzMHkyPPccLF0KvdkkoF+JgCFrx9bWFmtray0Dp7GxkZqaGhobGzu0MexKGAqC+/v7M3r0aBYvXszVV1+Nuwns+4QQJCYmsmPHDn7++WdOnz7N2bNnqaqq6rxrqgI6Bx36kXqYD6ODRmvTRtMGTbtsNTUpApL+gKn7cX29Wrj95ZchNxciI9Xgv2hR7w7+BvqNCDg4OAhfX18qKyupra2lqanpioHS0tISKyurC8SiqamJhoYGWlpauvQGbkBRFGxtbXFzc2PIkCFMnTqVa6+9lsmTJ7eZGtpZkpOTiYuL48CBA5w6dYrs7GwqKio6Lw6WoAvUMf+e+dy55E6ihkXhbqcKmhQBSX/AVP24rg7efx/WrIGCApg5Uw3+8+b1jeBvoN+IwOWygxobG0lOTubMmTOkpKSQlZVFTk4OhYWFlJaWUlVVpQnGlZ7FwsICS0tLLYC3tLTQ0tLS7VGFhYUFjo6O+Pn5MXLkSKKiorjmmmvw8up6hk9WVhbbt29n//79nDx5kuzsbMrLyzvXVh24ertif+2vCPntH9k3eXKX2yORmBtji0BNDbz3Hrz6KhQVqUH/2Wdh9myjXL7H6dci0Bnq6+s5c+YMiYmJpKWlkZmZSV5eHgUFBZSVlVFVVUVdXR3Nzc3tCoZhDaG1tbXDaaFtoSgKVlZWuLm5MXjwYCZPnsy1117LzJkzOzWqyM3NJS4ujn379pGQkEBmZiZlZWWdEgdnZ2eioqJ4++238ZYZQ5I+Qsk5R2HPbm7DraqCd96B116D0lJ1uueZZ2DGDGO00nxIEegC1dXVJCYmkpSURGpqKtnZ2eTk5FBUVERZWRnV1dWaYFwJ3blUASFEt4TCcC07Ozt8fHwYOXIk8+bN44YbbmDQoCtXPCsqKiIuLo69e/eSkJCgWWh0dFrJysqK4cOHs3r1aqKjo7v1DBJJb6SiAt56C/7+dygvVxd6n3kG2nGE6bVUVFTw1VdfsWvXLk6fPs0vv/wiRcCUlJeXc/r0aZKSkkhLSyM7O5u8vDyKioooLy+nurqa+vr6Du0wNgaWlpY4OzszePBgJkyYwIoVK4iKirpkVFFWVsZTGzeSfPAg9clJnDx9kuqKjqeyenp6cvPNN/Paa69hZY698BLJOT7KzwfgDr/OufSWlcGbb6pflZVqiuczz8CkLoXLnqG5uZm9e/eyefNm4uPjtenghoaGK436pQj0FoqLizXBSE9P1wSjuLhYEwzDwrUpURQFGxsbWp2csAsO5rGrr+bWW28lJCSEyspK/vXtv/h629ecOHqCmuwa6KB+2djYMHHiRN577z3Gjh1r0meQSAx0dk2gpER963/rLaiuVjd3Pf009Ib8iIyMDDZu3MjevXtJSUmhqKiI2traDiezWFpaYm9vj4eHB8HBwQwdOpT169f3DxFQFEVYWVlhZWWFjY0Ntra22Nvb4+joiKOjIy4uLri6uuLm5oanpydeXl54e3vj6+uLv78/fn5+2NnZ9YmqX0IICgoKNMHIyMjQBKOkpITy8nJqamraU/9uY2FhgZ2dHU4eTuj8dRRZFdHc2AzpQC
nQgRklRVHw8fHhkUce4dFHHzVZWyUDl46KQFGROt//zjtq5s/116vBf8yYnmgl1NXVsWXLFrZv384vv/xCTk4OlZWV7WY6GtDpdNjY2ODg4ICzs7MW7ywsLCgtLaW4uJiKigrq6uoufpHsHyJga2srvL29qa+vp7Gxkaampi5l8Oh0OiwsLC4REwcHh0vExMPDAy8vL3x8fPD19cXPzw8/Pz8cHBx6jZjo9XrOnj3L6dOnSUlJISMjg7Nnz5Kfn09JSQkVFRVG20PRJjrQWejQt+ihg13G2tqa6dOn8/nnn19irCeRdIb2RCA/H155BdauVTd83XST6uo5apTx2iCE4MiRI3z77bccOnSI9PR0rW5JR0b2Bv8xQ1q7ra0tlpaWtLS0UFtb290Xvt4rAoqiZALVQCvQcqWGXmk6qLm5mZKSEvLz88nPz6ewsJDi4mJKSkooLS2lvLycyspKqqurqampoa6ujrq6ugvEpLNZPoqiXLAX4XJi4uLigru7uyYm3t7e+Pj44OfnR0BAAI6Ojj0qJnq9noyMDE0wXjt0iMbCQkIbGjTBqK2tpbGxsfP7EYyAoih4eHjw2GOP8eijj2oL6xLJlWhLBHJy1Bz/detUk7dbb1WDf1hY5+9RXFzMV199xffff09iYiKFhYVUV1e3m01owBDkDZ93vV5v9M+Y4QXX1tYWJycnfHx8GDZsGF999VWvF4FJQoiS9o7tiTWBlpYWSktLLxCToqIiSkpKKCsr08SkqqqKmpoaamtrLxiZNDc3d0lMzh+Z2NjYtDkycXd3v0BM/P398ff3x9nZuUticqU3qJaWFlJTU0lMTCQlJYXMzExycnIoKCigtLSUyspKampq2t2DYQysrKwYO3Ysr7/+OjNmzJDiILmAi/txdjasXg0ffKB6+99+u1rJa9iwy5/f0tJCXFwcMTExHD16VNuY2dDQYJaXofOxsrLC1tYWZ2dnfHx8CA0NZcKECUyePJkJEybg4uLS7jV6dYpobxMBY6HX6ykrKyMvL4+8vLwLRiYGMamoqLhETBoaGi6Y5uqqmBhGJnZ2djg4OODk5ISTk9Ml01wuHh54+/gQHBDAoEGDcHFx6ZKYNDU1kZycTGJiIqmpqWRkZJCYnkja2TRKS0tpqmlSF5dN8Hny9vbm1ltv5emnnzaJtYek91N3bpqkMNuCl16Cjz5Sf/6736k1fJubU9iwYQM//fQTKSkpFBcXa/PmPT3lbWFhgb29Pc7Ozvj6+hIaGsqkSZOYOnUqERERODo6Gv2evV0EMoBy1Jnk94UQ69o6ti+JgLHQ6/WUl5dfMjIpLi6+YJqrLTFpbm7usphYWlpqIxODmDg6Ol6wIOXh4YGnp+cF01z+/v64u7tfICZFtUVsS9lGbEos2xK3UZNbg0WZBcH6YLyavbCutaaqrIrCwkLKy8tpbGw02ofTwcEBb29vJk+ezF133cX8+fNNbhgo6TlqampYu3Yna9e6k5Y2HWjFwuJftLb+DThr8vsbpl9cXFzw8/MjNDSUqVOnEhkZybhx47DtBQWbersI+Ash8hRF8QbigAeEEHvO+/3dwN0AQUFBE7Oyskzanv6KEILKykry8vI0QSkuLmZ7Rga15eX4NzVdMDKpqakxqphYW1trYmLvYA/WUKfUUSbKqNHVgD14eXoxMXQi88bMY96YeQwOHIy1tTVJSUnaLm+DLUhqaio5OTlGW+jW6XQ4OzsTHBzMwoULeeihhwgICDDKtSVdRwjBgQMH2LRpE4cOHSIlJYWysrLzpiDDgKeAW4Em4H3gFSCvy/c0vPwYgnpYWBjTp0/nqquuIjw8HOueLA5sJHq1CFxwM0X5P6BGCPHq5X4/EEcCpqaz+dVCCKqrq8nNzb1kzcSQutqemHR2CH5x1oQmJufSg52dnXFxccHOzk6zBbm4LKixsLS0xMXFhbFjx3Lbbbdxyy23YGNjY5J7DQSys7N588032bJlCzk5OZ0YAY5CDf43AfXAe8CrQOElRxrSKl1cXPD399eC+rx58xg+fLjRDB97M71WBBRFcQB0Qojqc/+OA1YJIb673PFSBIyPuVxEhRDU1tZqI5OCggKKiorIyc/hRMYJknOTyS3KpamuCRrBRthgrbeGFmhtbtVGJuZetGsLa2tr3N3dmTJlCo8//jjTp083d5N6jMrKSj788EO2bt1KQkICZWVlRtz8OAZ4GrgOqMPLayPTp//ML2GuuE6bxpEVK3pN6nZvojeLwBBg07lvLYHPhRAvtnW8FAHj05utpPVCT3xePDHJMcSmxBKfHw9AoHOgVjhnXsg89E168vPzNTNAw8iktLSUsrIyKioqtPTg2tpa6urqqK+vp6amxuQ7s6+EoihYW1vj5eVFZGQkDz/8MNOmTet1mU/19fVs2bKF//znPxw/fpzCwkItjdiYWFtb4+Pjw4QJE5g7dy7R0dEMHToURVE4dgyefx42bQInJ3jwQXj4YfD0VM/tzf24N9BrRaCzSBEwPn3pw5NXncfWlK3EpsQSlxZHbXMtdpZ2zAuZp4lCoEtgl6+vLjCu5b333iMrK8ukO7E7i8GSfPjw4cyZM4fIyEjGjx+Pv79/p32b6urq2L59O9u3byc+Pl4rYtTU1GT0Z9bpdDg4ODBixAjuvfdebrvttk5Nvxw+rAb///4XXFzUwP/QQ+DmduFxfakfmwMpApI26asfnsaWRnZn7SYmOYaY5BgyKjIAGOszVqumNjVgKha67mcBnTp1imeffZbvv/+eysrKTq1nGIKgYa+Hs7MzDg4OVFRUkJaWRlVVlVlHI11Fp9Ph6OjIoEGDCA8PZ86cOVx//fXdqpNxPgcOqMF/2zY14P/5z/DAA6oQXI6+2o97CikCkn6NEIIzJWe0aaN92ftoFa142HmwJHQJy0KXETUsCldbV6Pds7a2ljVr1vDvf/+bs2fPdjqQW1hY4O7uTnBwMGPHjuWqq65i0aJFBAQEUFVVRVxcHDt37uTHH38kPT2dpnN++T2JYUe8g4MDnp6e+Pn5aenBHh4eF+yC9/X1JSAgAF9f326lRO7dC6tWwc6d6lTPI4/AH/+oTgFJuo4UAcmAory+nB1pO4hJiWFrylbK6suwUCyYETRDmzYa4TnC6AuIQgj++9//smbNGo4dO0ZdXZ1Rr9+X0Ol0mqWKwZ/Lzs7uArNHNze3c7vgPaioiGDXrumcOuWNh0cLjzwieOABK0ywb2pAIkVA0iavZmcD8GhQkJlbYhpa9a0czD2ojRISChMAGOI2RJs2mj14NjaWHUvzNLi77tu3jz179mhOkGVlZVoZU3Nx/pu7t7c3YWFhzJw5kxtuuIHAwEAOHjzIp59+yp49e8jJyaG6utpo2VXn+2hZWFhc4o/T2trahtnjAuBZYCZqbv/LwD9R0z6vbPbo5OSkjUyyrKywd3Pj6qFD8fb21jYt+vv7Y29vb5Rn7MtIEZC0yUCbS82uzGZrylZikmPYlbGLhuYGbOtsCW8Mx7nAmfrceory1WJABk8oY38GDEHSWAZijo6OBAcHEx4ezrRp01i4c
CGjRo3q0kinoqKCr7/+mi1btnDixAmKi4tpaGgw2t9ADejW2Nr+moaGx2hoGI+1dSEhIRsJCfkea2v171FdXX2B2aNhr0lzc3OX/LnOF5OLzR7PF5OLnYPP3wXf02aPxkSKgKRN+psI6PV60tLSOHToED/99BMnT54kNzeXsrIyzSvGmBkwhk1shsVfRVEuCVodCfSGdFFbW1taW1s7VVhIUZRLAqKiKDg5OTFo0CBGjhzJlClTWLhwIePGjet2CqoQgmPHjrFhwwZ2795NWloalZWVHayctwz1zX8ykAX8DfgIdbfvhZz/d/Xw8NDWHYKDgxk2bBhhYWF4enpSUlLCbfv20Vxayl329hQXF3fI7LGr/lyWlpYXWKpcSUwutlQZNGgQTk5OPS4mUgQkbdLbRaClpYWkpCQOHTrE4cOHOXXqFDk5OZSXl2s1no2+YUwHWAD2YOliqQYe72DsmuwoLfnqFwU+AAAgAElEQVRf4Y6uVnsaNmwY06ZN47rrrmPMmDHtBoS8vDxeeukltmzZQm5ubodEzGBZfPGxiqLg6OiIv78/I0aMYMqUKSxYsIBJkyYZfX9CbW3tuf0F/2XPHjcKC/+AXh+BWpHoReDfdLhk3RVQFAWsrLCwsyPo3IK1j48PgwYNIiQkhGHDhjFy5EhCQ0Mvm54qhLjA7LGgoMDsZo+GXfDni4mhpolBDDtj9ihFQNImPS0CjY2NnDp1imPHjhEfH8+pU6c4e/asVg3JFEFdURTtrdLDw0ObMw4MDCQkJISAgABycnKIj4/n5MmT5OXlUVlZ2eGpIIMtgaurK4MGDSIiIoKlS5eyZMkSk1lKNDc38/HHH7N27VpOnz7dIZsMw9/BcP7Fz+bg4ICfnx/Dhw9n0qRJLFiwgMjIyC6b7en18M03aqrnL7+oNs5PPw233AKGrQ1CCJKSkti4cSO7d+8mMTHR6AaC53N+ZS4XFxc8PT3x9fUlMDCQ4OBgQkNDGTVqFEOGDOm0KAohKC8v13bBn2+pYti42BNmj/b29tjb21/gHPz1119LEZBcniUJ6kLpti7WA66vrychIYFjx45x4sSJC97U6+vrTfOmzv9qJDs6OuLu7q6lMBre/sLCwhg5ciSBgYHEx8dfUO2prKysU9WerKyssLGzwcrZiibXJmr8aiAchg8ZTnRoNMvCljEjaAZWFp3btGUKDh8+zOrVq9mzZw+lpaUdCiiGRdeWlpbLBl97e3t8fX0JCwtj4sSJzJ07l1mzZrW5Sa21FTZuhBdegNOnYcQINfjfeCN01aanvr6eHTt28J///Efz+6+urjbpHgsLCwtNMFxdXfHy8tIEw9DHRo0aRWBgoFFHURebPRYWFlJYWHhJgSxDPQ/DLvh2xESKgKRj1NTUcPz4cY4dO8bJkyc5ffo0OTk5VFRUmDSow//e0gxvL+cH9uDgYO1DFxQUpH3oioqKtLnpxMRECgoKqKmp6XC1J0MNZXd3dwYPHszkyZP51a9+dcXCNenl6cQmxxKTEsOPmT/S1NqEs40zUUOjWBa2jCXDluDlYJxNU8agpKSEN954gw0bNpCVldWhuXuD572VlRVNTU3U1dVd8v/d1tYWX19fhg0bxsSJE5k5cy6FhXN5+WVrkpMhPByeeQauuw56yrk7NTWVb7/9lh9//JEzZ85QWFjYI4VhDHbSjo6OmmD4+fkRFBRESEgIw4cPZ+TIkfj5+fWYLcj5Zo+jRo2SIjAQEUJQUVGhvaWfPn2axMRE8vLyTP6mbkCn013w4bg4sIeGhhIeHn5BYDfQ0tLC9u3biY2N5dixY52u9mRYtHV2dsbf35/Ro0ezYMECVqxY0aFqTB2hpqmGnek7iU2OJTYllvyafBQUpg6aqo0SxvmM63VZJS0tLXz77be89dZbnDhxgurq6g6dZyhwDurbeXV1Na2tCvAbVFfPYShKAu7u7zBmTCoTJkQwZ84c5s2bh4ODg8mepzM0NDSwZ88eNm/ezJEjR8jKyqKioqLH0nstLS21z4RhFOvv709QUBBDhw4lNDSU0aNHG233Ncg1gX6DEIKSkhKOHj3KiRMnOHPmDMnJyeTm5mpz6oZayabk/GGym5ubtmgVFBSkBfaRI0cSHBx8xbee06dP880331xQ7ckwTdNevzNkadjb2+Pp6cmwYcO46qqruO666xgxwvgbwTqCXug5XnBcs7I4nHcYgACnAE0Q5oXMw8G6dwTDy3Hy5EleeeUV4uLiKCgoaOf/gxXwW+BJIAQbm5O4uPyD1tbNVFaWXzJVYzDLGzJkCOPHj2fWrFksXLhQE5Xu8HxmJgDPBAd3+1oGzp49y5YtW4iLi9NGmbW1td3OLju/b7bXzw2lJZ2cnHBzc8Pb2xt/f38GDx7MkCFDGD58OOHh4bhdbKZ06T2lCPRG9Ho9BQUFxMfH88svv3DmzBnS0tK0oG4IiD1hl3y5+U8fHx9t/rOjgd1AZWUlmzdvJi4ujpMnT5Kfn6+ZlHXkeQztcXV1JSgoiHHjxrF8+XIWLFjQZ/z7C2oKtGpq29O2U9NUg42FDfNC5hEdGk10WDTBrsHmbma7VFVVsXbtWj755BPS0tJoaBDA74CVQBBwEFgFbNXOURQFNzc3fH19sbGxoampidLSUkpLSy+ZjrKyssLT05OQkBAiIiKYOXMmixYt6lSpUHNluTU3N7N//37+85//cPDgQTIyMqioqDDKwrZOp9M+a0KIdsXHysoKOzs7nJ2dtVrkhpTaVatWSRHoCVpbW8nNzeXIkSMkJCSQmppKeno6ubm5VFZWatMvPfU3vTiwn58JYQjsf9XrsfX3Z8/EiR2+bmtrq9bxjxw5QmZmprbY2pG3JMNiq6OjIz4+PowaNUozIPPx8enOI/damlqb2JO1R1tLSC1LBSDcK5xlYctYFraMaYOmYanrvQVO6uvhn/+El1+GvDwYObIMO7tXSE9/n4qK8g5dw9bWlkGDBuHj44O1tTW1tbXk5uZSXFx8yXSMpaUlHh4emr/SjBkziIqKumwf6e2pznl5eWzbto0dO3ZoGWjGsDI/P9UU1BfLNvbCSBHoCk1NTWRlZWmpjKmpqWRmZpKXl3fBm3pP/o2uFNjPT3Hr6Bv75T48Z8+e5ZtvvtEWW4uKiqitre2wgBnmPD08PAgJCWHKlCmsWLGCKVOm9DqvfHORXJqsWVnsydpDi74FN1s3loQuITo0msXDFuNu1/E3YVNSWwvvvw9r1kBhIcyaBc89B3PnwsWzbqmpqbz++uvExsaSn5/foUVoRVHw9PQkNDQULy8vLC0tKSkpIT09naKiokvqFhjM9wYPHsyYMWO46qqrWB8QgI2PT68VgY7Q3NzM4cOHiYmJ4aeffiI9PZ3S0lLq6+uNEWOkCIC6IJSenk58fDynT58mPT2drKwsLagbdmka45k7M+/XkcBumIrpboH0hoYGtm3bxrZt2zhx4gTH0tJoqalB6eACsU6nw9raGmdnZwICAhgzZgxRUVEsX74cJ2n12CUqGyqJS48jJlk1vCuuK0an6Lgq8CptLWGUV9dsILpD
TQ28+y68+ioUF8P8+Wq2z+zZnbtObW0tH3/8MR988AFJSUnU1tZ26DxbW1sGDx5MeHg4vr6+6PV6UlJSSE1N1bJ+LkCnw8PNjaCgIMaMGcP06dOJiooi2IjrBL2B4uJidu7cydatWzlx4gT5+fkd2bHdP0WgtraW1NRULahnZGSQk5Oj/VGM+aZuqHMLalA3fLVFW4H9/F2Mhjf27gZ2A0IIjh8/zubNmzlw4ACpqamUlpZqC8YdeUbDYqvBgGz69OnccMMNWoUniWnRCz2Hcw9ro4RjBepIbbDLYG3aaE7wHGwtu27X3B5VVfD22/D661BaClFRavC/6irj3UMIwe7du3nzzTfZu3cv5eXlHX4J8fDwICwsjAkTJuDn50dlZSX/2rePmqwsOGcPcvE5Li4uBAYGEh4eTmRkJIsWLWL48OHGe6BeRktLCydOnCA2Npa9e/eyc+fO/iECOp1OQPtv1h28lra13mDm1d5uPUO6oyGwG/xMLjcVY6zAfj6lpaVs2rSJXbt2cerUKfLz86murqapqanDOfG2tra4ubkRGBjIhAkTuPrqq5k9e3afWWwdaORU5WjV1Ham76SuuQ57K3sWDFmgLi6HRhPgHGCUe1VUwD/+AX//u/rv6Gg1+E+dapTLd4js7GzefvttNm3aRHZ2dofTNm1tbbURwLRp0wgICCApKYnDhw+TnJxMfn7+JSMQRVFwcXHR/JWmTZvGokWLCA8P73cvPP0mO0hRlMs2xvCWbmFhcYFDY2tra7vFyA0blAx57BcHdsMbe0hIiEkCu4GWlhZ2797Nli1biI+PJysri/LychoaGjq12Ork5ISvry/h4eHMnTuXa665Bm9vb5O1W9JzNLQ08GPmj1oKalZlFgARvhGaLfZk/8mdrqZWVgZvvAFvvqmOAn71KzX4dyJXwKQ0NjbyxRdfsH79ehISEqipqemwnYe7uzuhoaFMmjSJefPm4e7uzt69ezl06BBnzpzRFmjPx2C+FxAQwIgRI5g6dSoLFy4kIiKiz65p9RsRsLCwEJaWlu2mTbYV2C82lBoyZIhJA/v5pKWl8c0337Bv3z6SkpK0xdbOGJDZ2dnh4eHB0KFDmTp1Ktdccw3jx4/vVsd8Ij0dgJeGDOnyNSQ9jxCC08WntWmj/Wf3oxd6vOy9WBq6lOjQaBYNXYSLbdub4kpK1Cmft95S5/+vvVa1d4iI6MEH6SJCCA4cOMC7777Lrl27KCwqQnQwldqQoRQeHs6MGTOIjo6mvr6enTt38vPPP3PmzBlyc3Oprq6+5LPp5OSEn5+fZr43f/58Jk+e3GNxpKv0GxHQ6XTC09PzAuMncwZ2A7W1tcTGxrJt2zYSEhK0lNCO5sQbFlsNQ9Nx48axePFilixZgqOJSyv19tQ6Sccoqy9je+p2YlJi2JayjfKGcix1lswMmqmtJYR5hAFqhs9rr6mLvnV1cMMNavAfPdrMD9ENDP34Sz8/1q5dy1dffUV6evqli8dtoCgK7u7uDB06lEmTJjF//nwWLVpEZmYmO3bs4Oeff9YsVKqqqi5rvufr68uIESO08yMjIy/rWmoO+o0ImGOfQGtrK0ePHmXTpk0cPHjwgrStzhiQOTg44OXlxfDhw5k1axbXXHMNISEhZp97lCLQ/2jRt/Bzzs/aKOFk0UkAgi0icTv6N05vm0lzk46bb1Z46ikYOdLMDTYCV+rHTU1NbNq0iX/+85/Ex8dTWVnZ4XVFW1tb/P39GTVqFNOnT2f58uWEh4eTnJysicOpU6fIzs6msrLykpc+e3t7fHx8NPO9efPmMXPmTKytrbv/0J1AisAVKCgoYNOmTfzwww+cOnWKwsJCampqurTYGhwczMSJE1m+fPkVXRZ7E1IE+j8/n8ph5V/L2Lt5BPpWHYz5FIf5/yBqagjLQpexNHQpPo59e5NeZ/uxoTDO2rVr+e6778jPz+9UER83NzdCQkK0wL5kyRKcnZ1JT09nx44dWkEjg9/Vxet6tra2+Pj4aOZ7c+bMYc6cOdjZ2XXuwTvIgBWB5uZmdu7cSUxMDMeOHdOMohobGzu82GrIiff19dUMyK6++mo8PT278yi9BikC/ZesLFi9Gj78UPX2/+1v4aFH6sjU7dJGCbnVuQBM9p/MsrBlRIdGM95vPDqlby2AGqsfl5WVsX79er788kuSkpI6tVHLxsYGPz8/Ro4cSWRkJMuWLdMquZ09e5YdO3awf/9+fvnlF7KysigrK7skDtnY2ODt7c3QoUOZMGECs2fPZt68ed2eFu7VIqAoymLgTdRaTuuFEKvbOvZ8ERBCcObMGTZt2sS+ffs0AzJDTnxnqj0ZDMimTZvGihUrGDNmTK9f6DEWvzl9GoBPR40yc0skxiI9HV56CT76SN3R+7vfwcqVcPGeKSEEJwpPaFYWB3MOIhD4OfqxNHQpy8KWsWDIAhytTbsuZQxM2Y9bW1v573//ywcffMCBAwc6vKcB/peGGhISwvjx45k3bx5Lly7VDN8KCgqIi4tj7969JCQkkJmZSWlp6RXN9yIiIpg1axYLFizA1dW1o+3onSKgKIoFkAwsBHKAw8DNQojTlztep9MJQ/pnexhy+g2LrREREURHRzN//nyTL7ZKJOYgJQX+9jf497/V4i2//z08/jgEBnbs/OLaYralqoZ336V+R1VjFdYW1swJnqOloA5xk1lkBk6dOsW6deuIiYnh7NmzHayxrGJtba0tJEdGRrJ06dILSnyWlpZq4nD8+HEyMjIoKSm5rPmeh4cHQ4YMYdy4ccycOZOFCxdeMlPRm0UgEvg/IUTUue+fABBCvNTG8eLcfzUDMm9vb0aMGMHs2bO55pprCAwMNPtiq0TSk5w5Ay++CJ9/DtbWcM898Nhj4O/f9Ws2tzazL3sfsSmxxCTHkFSaBMBIz5GalcX0wOm9oppab6KqqopPPvmEzz77jJMnT1JbW9vh6SRFUXB2diY4OJhx48Yxd+5coqOjL6grUFlZyc6dO9mzZw/Hjh0jPT29TfM9d3d3zXxv/fr1vVYErgMWCyF+f+7724CpQoj7L3f8xIkTRXx8vMnaMxB5OCUFgDdCQ83cEklnOXVKLeG4YQPY2cF998Ejj4Cvr/HvlVqWqk0b7c7cTbO+GVdbV62a2uJhi/G0N986WW/ux3q9nri4OD788EN2795NSUlJp2oSWFlZaRlGkZGRLF68mOnTp1+wP6i2tpZdu3axe/dujh07RmpqKsXFxeenyPZaEbgeiLpIBKYIIR4475i7gbsBgoKCJmZlZZmsPQMRuTDc9zhxQg3+X38Njo5w//3w5z+DEQtRXZHqxmri0uO0amqFtYXoFB3TBk3Tpo3GeI/p0RF5X+zHaWlprF+/ns2bN5ORkXGJW2p7ODs7a7U2Zs+ezdVXX32JzXZDQwM//PADS5cu7bUi0KnpoN5eT6Av0hc/PAOVo0fh+edh82ZwdoYHH4SHHwYPD/O1SS/0xOfFa9NG8fnqSD3QOfCCamp2VqZJfTTQX/pxXV0dn3/+OZ999hlHjx697K7lK2FlZYWXlxd
hYWFMnTqVqKgoZs6ciZWVVa8VAUvUheH5QC7qwvAtQohTlzteioDx6S8fnv7MoUNq8I+JAVdXNfA/+CC0U1HQLORX52uGdzvSdlDbXIutpS3zQ+Zr1dSCXIKMft/+3I/1ej379+9n/fr1qkVGYWFXitF0WQRMuudZCNGiKMr9wHbUFNEP2xIAiWSgceAArFoF330H7u7qFND994NL23ZAZsfPyY+7JtzFXRPuorGlkT1Ze1TDuxR1XwJbYazPWG2UMDVgaqcN7wYaOp2OmTNnMnPmzAt+npOTwwcffMCmTZtITk6mvr7eJPfv05vFJO1zd5Ka9bGuH3ur9zX27FHf/HfuBE9PePRRddG3L9fsEUKQVJqkbVLbm7WXVtGKh52HVk0tamgUbnZdG97IfqxSX1/P119/zWeffcbhw4cpLy83TCf1zumgziJFQNJfEQJ+/BH++lfYvRt8fNQ0z3vuAQcHc7fO+FQ0VLA9dTuxKbFsTdlKaX0pFooFM4JmaKOEEZ4jZLq3ERBCoNPppAhIJL0RIdQ3/lWrYN8+8PNTN3j94Q9gb2/u1vUMrfpWDuUe0kYJJwpPABDiGqI5oM4ePBsbS1n4qKv02s1inUWKgPGRw2jzIARs26YG/4MHYdAg1drhrrvA1nSVI/sEZyvPEpuipp/uSt9FfUs9DlYOLBy6kOjQaJaGLsXf6cKdcLIfX5nuiEDvMMOWmIzki+qxSkyLEPDf/6rBPz4eBg+G999Xzd1khU+VQJdA7pl0D/dMuof65np+yPxBGyVsPrMZgAl+E7Q9CZP8J8l+bEKkCEgkRkCvV/P7n38ejh+HIUPggw/gttugDziOmw07KzuWhi5laehShBCcLDqp7Ul4Ye8LrNqzCh8HHxT3qbj7zKRq1FCcbZzN3ex+hZwO6uf05/zq3kBrK3zzjRr8T56E0FC1itctt6gmb5KuU1pXynep3xGTEsPXSbG0NFdjpbNi1uBZ2lrCMPdh5m5mr0CuCUjaRIqAaWhtVT19XngBEhNhxAi1ePuNN8IAcSnvUWbHH6ay/ASLRDKxKbGcLlaNiMM8wrRpoxlBM7C26NmKXr0FuSYgaZMIaattVFpaVDfPF1+E5GS1bu+GDWoRdxn8Tcd4Z1dwns2a0N+zZuEaMsoztGmjtw+/zes/v46zjTNRQ6OIDo1mSegSvB28zd3sPoEcCUgkHaC5WfXxf/FFtajLuHHw7LPw61+Drm8V6ep31DTVsCt9lyYK+TX5KChMCZiiVVOL8I3o13sS5HSQRGIiGhvh44/VYi5ZWTBxohr8ly9Xq3pJehdCCI4VHNNssQ/nHkYgCHAK0LyN5ofMx8G6f+3QkyIgaRNZXrJrNDSotXtXr4azZ2HqVDX4L1kig7856Go/LqwpZFvqNmKSY9iRtoPqpmpsLGyYGzJXW0sIdg02QYt7FrkmIGmTnE56mA906uth3TpYswby8uCqq2D9eli4UAZ/c9LVfuzj6MMdEXdwR8QdNLU2sTdrrzZtdP+2+7l/2/2Ee4VrVhaRgZFY6gZWWBxYTyuRtEFtLaxdC6+8AoWFMHs2fPopzJkjg39/wdrCmvlD5jN/yHxej3qd5NJkbdro9Z9fZ81Pa3CzdWPxsMVaNTV3O3dzN9vkSBGQDGiqq+Hdd+HVV6GkBObPV7N9Zs82d8skpibMI4ywyDD+FPknKhsq1WpqKbHEJsfyxckv0Ck6pgdO16aNwr3C++XishQByYCkshLefhtefx3KymDxYjXPf/p0c7dMYg5cbF24btR1XDfqOvRCz5G8I2qdhOQYVu5aycpdKxnsMlibNpobMhdby/5hAiVFoJ8T2ZsrlJiB8nL4xz/gjTegogKWLVOD/5Qp5m6Z5Er0ZD/WKTqmBExhSsAUVs1dRW5VrlZN7aMTH/HukXext7Jnfsh8LQU1wDmgx9pnbGR2kGRAUFqqBv5//AOqqtT8/qefVlM+JZKO0tDSwO7M3Vo1tcyKTAAifCO0UcJk/8k9Xk1NpohKJG1QXKxO+bz9NtTUwHXXqcF/3Dhzt0zS1xFCkFiSqDmg7s/eT6toxcveiyWhS1gWuoxFQxfhYmv6UYwUAUmbXHvyJADfjB5t5pb0LIWF6mLvu++qaZ833ghPPaXaPEj6Hn2hH5fVl2nV1LalbqOsvgxLnSUzg2Zqo4QwjzCTLC7LfQKSNiltbjZ3E3qUvDw1zXPtWmhqUt08n3pKNXiT9F36Qj92t3Pn5jE3c/OYm2nVt/Jzzs/aKOHRuEd5NO5RhroN1RxQZw2e1SsM76QISPoFOTnw8svwz3+qJm+33QZPPqlaO0skPY2FzoKrgq7iqqCreGnBS2RVZLE1ZSsxKTG8H/8+bx58E0drRxYNXaRVU/N19DVLW6UISPo0WVmqtcOHH6qFXe64A554Qi3qIpH0Fga7Dubeyfdy7+R7qWuu4/uM77VRwreJ3wIw2X+yNm003m88OqVnnAmlCEj6JOnpqqnbxx+rO3rvukut4Tt4sLlbJpFcGXsre21KSAhBQmGCJgh/3f1X/m/3/+Hr6KsJwoIhC3C0Np0lvBSBfs58NzdzN8GopKSods6ffqpW7rrnHvjLXyAw0Nwtk5iS/taPDSiKwjjfcYzzHcdTs56iuLb4f9XUTn/NB8c+wNrCmjnBc1QX1NBohroPNW4bTJUdpCjK/wF/AIrP/ehJIcTWK50js4MkbZGYqAb/L75QC7b/v/8Hjz0G/v7mbplEYhqaW5vZf3a/5m90puQMACM8R2hWFlcFXoWVhVXvTBE9JwI1QohXO3qOFAHJxZw8qZZw3LgR7Ozgj3+ERx4BHx9zt0wi6VnSytI0B9TdWbtpam3CxcaFxcMWs+H6DTJFVHJ5liQkALBt7Fgzt6RznDihFm//5htwdFTn+//0J/DyMnfLJOagr/ZjYzLUfSgPTn2QB6c+SHVjNTvTd6qGdymx3bquqUXgfkVRbgeOAI8IIcpNfD/JRdS3tpq7CZ0iPl4N/v/5Dzg7q74+Dz0EHh7mbpnEnPS1fmxqnGycWDFyBStGrkAv9Fg82nWbim7lICmKslNRlJOX+foV8B4wFIgA8oHX2rjG3YqiHFEU5UhxcfHlDpEMAA4eVM3cJk2C3bvhr39V0z9XrZICIJFcie6mknZrJCCEWNCR4xRF+ScQ08Y11gHrQF0T6E57JH2Pn35SA/327eDuri7+3n+/OgqQSCSmx2S7ERRF8Tvv2xXASVPdS9L32LMHFixQyzceParu9s3MVHf5SgGQSHoOU64JrFEUJQIQQCbw/0x4L0kbLOtFcylCwA8/qG/+u3erGT6vvaamezo4mLt1kt5Mb+rH/Q3pIioxOUJAXJwa/PfvV3P7H38c/vAHNe1TIpF0j+7sE+gZcwrJgEQI2LoVIiMhKkpd6H3nHUhLgwcflAIgkfQGpAj0c+YcO8acY8d69J5CqCmekydDdDQUFMD770NqKtx3H9j2j9Kskh7EHP14oCBFQGI09Hp1c9f48Wr5xooK1d0zJQXuvlu1e5BIJL0LKQKSbtPaChs2wNixavnGujrV3f
PMGbjzTrCyMncLJRJJW0gRkHSZlhb47DO1ZONNN6kjgc8+U83ebr9ddfmUSCS9GykCkk7T0qK+6Y8aBb/5jRrsN25Uzd5uuQUsur6DXSKR9DDyXa2fc4O3t9Gu1dQE//63WswlPR0iIuDbb+FXvwKdfJ2QmBBj9mPJhUgR6OfcFxDQ7Ws0NsJHH8FLL6lpnpMmwRtvqF4/itL9Nkok7WGMfiy5PPL9rZ9T19pKXRcdGBsa1Lz+YcPUCl6+vmre/6FDsHy5FABJz9Gdfiy5MnIk0M9Zes6H/cfx4zt8Tl0d/POfqp9Pfr7q7/Phh6rXjwz8EnPQlX4s6RhSBCQatbWwdi288goUFsKcOWq2z5w5MvhLJP0VKQISqqvVaZ/XXoOSEvWNf+NGmDXL3C2TSCSmRorAAKayEt56C/7+dygrg8WL1Upe06ebu2USiaSnkCIwACkvhzffVL8qKtRF3qefhilTzN0yiUTS00gR6Ofc4eur/bu0VH3rf+stqKqCFSvU4D9hghkbKJF0gPP7scS4SBHo59zh50dxMaxcqc7719aq/j5PP616/UgkfYE7/PzaP0jSJaQI9GMKCuD5l1v5aJ2O+nqFm26Cp56C8HBzt0wi6RwlTU0AeFpbm7kl/Q8pAv2QvDxYs0b18G9o0uGzpJz4V90ZMcLcLZNIusZ1p04Bcp+AKZAi0I84e1bd4LV+vWrydvvtcOpXp7ELamLECHdzN08ikfRCpG1EPyAzU2czTncAAA2TSURBVLV1GDpUffu//XZITlZ3+doFNZm7eRKJpBcjRwJ9mLQ01dTt449VF8/f/14t4D54sLlbJpFI+gpSBPogycnw4ouqpYOlJdx7L/zlLzBokLlbJpFI+hpSBPoQiYlq8P/iC7Ve74MPwmOPwZWy5+6VFrySfoDsx6ZDikAf4Jdf4IUX4KuvwN4eHnlE/fLxaf/cG2UxDkk/QPZj09GthWFFUa5XFOWUoih6RVEmXfS7JxRFSVUUJUlRlKjuNXNgcvw4XHutuqlr2zZ44gl1EXjNmo4JAMDZhgbONjSYtJ0SiamR/dh0dHckcBK4Bnj//B8qijIKuAkIB/yBnYqihAkhZFWIDnDkCDz/PGzZAi4u8Oyz8NBD4N6FLM/bEhMBmV8t6dvIfmw6uiUCQohEAOVSs/lfAV8KIRqBDEVRUoEpwIHu3K+/c/AgrFqlVu9yc1P//cAD4Opq7pZJJJL+iqnWBAKAn8/7PufczySXYf9+NeDv2AEeHmoh9z/+EZydzd0yiUTS32lXBBRF2QlczsLvKSHEf9o67TI/E21c/27gboCgoKD2mtOv2L1bDf7ffw9eXupc/733gqOjuVsmkUgGCu2KgBBiQReumwMEnvf9ICCvjeuvA9YBTJo06bJC0Z8QQg36q1bBnj1q8fbXX4e77wYHB3O3TiKRDDRMNR20BfhcUZTXUReGQ4FDJrpXn0AIdbpn1Sr46Sfw94d//EPd5WtnZ7r7PhIY2P5BEkkvR/Zj09EtEVAUZQXwFuAFxCqKclwIESWEOKUoykbgNNAC/HGgZgYJoS70rloFhw5BYCC8+y7ceSfY2pr+/ss9PU1/E4nExMh+bDq6mx20CdjUxu9eBF7szvX7MkKoKZ6rVsHRoxAcDOvWwW9/Cz1piZ5UVwfAcHv7nrupRGJkZD82HXLHsJHR6+Hbb9UdvidOqM6eH34Iv/kNWFn1fHv+X1ISIPOrJX0b2Y9Nh7SSNhKtrfDll+ru3uuvh/p6+OQTOHNGnfoxhwBIJBJJe0gR6CYtLfDpp2rJxptvVkcCn38Op0/DbbepLp8SiUTSW5Ei0EWam+Gjj2DkSDXYW1vDxo1w8qQqBhYW5m6hRCKRtI98T+0kTU3qNM/f/gYZGTB+PGzaBFdfrRZ2kUgkkr6EFIEO0tgI//qXWskrOxsmTVLz/KOj4VLrpN7D07LMmKQfIPux6ZAi0A4NDWrh9tWrITcXpk1T6/hGRfXu4G9gQVesRyWSXobsx6ZDikAb1NWpef1r1kB+PsyYoa4BzJ/fN4K/gePV1QBEODmZuSUSSdeR/dh0SBG4iJoaWLsWXnkFiopg7lw122f27L4V/A08nJoKyPxqSd9G9mPTIUXgHNXV8M478NprUFICCxfCM8/AzJnmbplEIpGYjgEvApWV8NZb8Pe/Q1kZLFmiBv/ISHO3TCKRSEzPgBWB8nJ44w14801VCJYvV4P/5MnmbplEIpH0HANOBEpL1bf+f/xDnQJasUIN/nKqUSKRDEQGjAgUFanz/e+8o2b+XH89PP00jBlj7paZlr8NGWLuJkgk3Ub2Y9PR70WgoEDN9HnvPXXD1003wVNPwahR5m5ZzzDdxcXcTZBIuo3sx6aj34pAbq6a479unWr18JvfwJNPwvDh5m5Zz/JTZSUgP0SSvo3sx6aj34lAdja8/LK6y1evh9tvhyeegGHDzN0y8/Bkejog86slfRvZj01HvxGBzEzV1+df/1K/v/NOWLkSQkLM2iyJRCLp1fR5EUhLUx09P/lEdfH8wx/g8cchKMjcLZNIJJLeT58VgaQkNfh/9plateu+++Avf4GAAHO3TCKRSPoOfU4ETp+GF19USzna2MBDD8Gjj/L/27vfGCuuMo7j31+pQAh/w9qUWBAaoeFPiVZC2jdVQ6MNUYhNVUwaW20kFOUFEqMNSW3AvrFpNEZrwdigjVr+NBTQEixarTFuBUNKKRUC2BYQslIUX7Si4OOLmXY3ZHfv7M7OzN6Z3ychmd2Ze+6Th3Pvs3PmzBmmTKk6MjOz9tM2ReCll5KHt2/dCmPGJF/8a9bANddUHdnw9p2mXhG3WnE/Ls6wLwIHDsD69cnTu8aNS2b6rF4NHR1VR9YevPSu1YH7cXGGbRHYty/58t+1CyZMgAceSIZ+/GyJgdl7/jzgh3JYe3M/Lk6uIiDpU8CDwGxgYUTsT38/HXgFOJIe2hkRK7K02dkJ69bB7t0waVKyvWoVTJyYJ9Lm+uZrrwH+8Fh7cz8uTt4zgUPAHcCGXvYdj4j3D6Sxo0eTJZwnT07m/K9cCePH54zQzMz6lKsIRMQrABqiR2699Vayzs+KFTB27JA0aWZm/biqwLZnSDog6XeSMj2f68Ybk1k/LgBmZuVoeSYgaS9wbS+71kbEjj5edgaYFhFvSPog8LSkuRHxr17aXw4sB5jm23zNzErVsghExG0DbTQiLgIX0+0/SzoOzAL293LsRmAjwIIFC2Kg72X929C0ZVOtltyPi1PIFFFJ7wbOR8RlSdcDM4ETRbyX9e+GMWOqDsEsN/fj4uS6JiDpk5JOAbcAv5S0J911K3BQ0ovANmBFRJzPF6oNxq5z59h17lzVYZjl4n5cnLyzg7YD23v5/VPAU3natqHxyMmTAHzCt1hbG3M/Lk6Rs4PMzGyYcxEwM2swFwEzswZzETAza7Bhu4qoDY0nZs+uOgSz3NyPi+MiUHNTR4+uOgSz3NyPi+PhoJrb3NXF5q6uqsMwy8X9uDg+E6i5H5w+DcBn/BxOa2Pux8XxmYCZWYO5CJiZNZiLgJlZg
7kImJk1mC8M19y2uXOrDsEsN/fj4rgI1FzHyJFVh2CWm/txcTwcVHObzpxh05kzVYdhlov7cXFcBGpu09mzbDp7tuowzHJxPy6Oi4CZWYO5CJiZNZiLgJlZg7kImJk1mKeI1twz8+dXHYJZbu7HxXERqLkxI0ZUHYJZbu7HxfFwUM09evo0j6bL8Jq1K/fj4rgI1NyWri62+GEc1ubcj4uTqwhIeljSXyQdlLRd0sQe++6XdEzSEUkfyx+qmZkNtbxnAs8C8yJiPnAUuB9A0hxgGTAXuB14VJIH9czMhplcRSAifhURl9IfO4Hr0u2lwJMRcTEi/gocAxbmeS8zMxt6Q3lN4AvA7nT7PcDJHvtOpb8zM7NhpOUUUUl7gWt72bU2Inakx6wFLgE/fftlvRwffbS/HFie/nhR0qFWMTVEB3BuqBrr7T+kjQxpLtpco3NxRT9udC6ucMNgX9iyCETEbf3tl3Q38HFgUUS8/UV/Cpja47DrgL/10f5GYGPa1v6IWJAh7tpzLro5F92ci27ORTdJ+wf72ryzg24HvgYsiYg3e+zaCSyTNErSDGAm8Kc872VmZkMv7x3D3wNGAc9KAuiMiBUR8bKkLcBhkmGiL0XE5ZzvZWZmQyxXEYiI9/Wz7yHgoQE2uTFPPDXjXHRzLro5F92ci26DzoW6h/HNzKxpvGyEmVmDVVIEJN2eLidxTNLXe9k/StLmdP8LkqaXH2U5MuTiK5IOp0tz/FrSe6uIswytctHjuDslhaTazgzJkgtJn077xsuSflZ2jGXJ8BmZJuk5SQfSz8niKuIsmqTHJXX1NY1eie+meToo6aZMDUdEqf+AEcBx4HpgJPAiMOeKY1YCj6Xby4DNZcc5jHLxEWBMun1fk3ORHjcOeJ7kDvUFVcddYb+YCRwAJqU/X1N13BXmYiNwX7o9B3i16rgLysWtwE3AoT72Lya5YVfAzcALWdqt4kxgIXAsIk5ExH+AJ0mWmehpKfDjdHsbsEjp9KOaaZmLiHguuqff9lyao26y9AuA9cC3gH+XGVzJsuTii8D3I+IfABFR1yU2s+QigPHp9gT6uCep3UXE88D5fg5ZCvwkEp3ARElTWrVbRRHIsqTEO8dEsjbRBWByKdGVa6DLa9xL99IcddMyF5I+AEyNiF+UGVgFsvSLWcAsSX+Q1Jnes1NHWXLxIHCXpFPAM8CqckIbdga1XE8VTxbLsqRE5mUn2txAlte4C1gAfKjQiKrTby4kXQV8G7inrIAqlKVfXE0yJPRhkrPD30uaFxH/LDi2smXJxWeBTRHxiKRbgCfSXPyv+PCGlUF9b1ZxJpBlSYl3jpF0NckpXn+nQe0q0/Iakm4D1pLcmX2xpNjK1ioX44B5wG8lvUoy5rmzpheHs35GdkTEfyNZqfcISVGomyy5uBfYAhARfwRGk6wr1DSZl+vpqYoisA+YKWmGpJEkF353XnHMTuDudPtO4DeRXvmomZa5SIdANpAUgLqO+0KLXETEhYjoiIjpETGd5PrIkogY9Jopw1iWz8jTJJMGkNRBMjx0otQoy5ElF68DiwAkzSYpAn8vNcrhYSfwuXSW0M3AhYg40+pFpQ8HRcQlSV8G9pBc+X88kmUm1gH7I2In8COSU7pjJGcAy8qOswwZc/EwMBbYml4bfz0illQWdEEy5qIRMuZiD/BRSYeBy8BXI+KN6qIuRsZcrAF+KGk1yfDHPXX8o1HSz0mG/zrS6x/fAN4FEBGPkVwPWUzy/JY3gc9nareGuTIzs4x8x7CZWYO5CJiZNZiLgJlZg7kImJk1mIuAmVmDuQiYmTWYi4CZWYO5CJiZNdj/AfYEjbWN5IUkAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x2f41e659e80>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "%matplotlib inline\n",
    "plot_pomdp_utility(utility)"
   ]
  },
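  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The plot above is the upper surface of these alpha vectors over the belief space. To read a policy off it, evaluate every vector at a belief point and take the action whose vector attains the maximum. A minimal sketch, again assuming the `{action: [alpha vectors]}` layout of `utility`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch: greedy action at a belief point, maximizing alpha . belief\n",
    "# over every alpha vector of every action\n",
    "def best_pomdp_action(utility, belief):\n",
    "    scored = [(sum(a * b for a, b in zip(alpha, belief)), action)\n",
    "              for action, alphas in utility.items()\n",
    "              for alpha in alphas]\n",
    "    value, action = max(scored)\n",
    "    return action, value\n",
    "\n",
    "best_pomdp_action(utility, [0.5, 0.5])"
   ]
  },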
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "---\n",
    "## Appendix\n",
    "\n",
    "Surprisingly, it turns out that there are six other optimal policies for various ranges of R(s). \n",
    "You can try to find them out for yourself.\n",
    "See **Exercise 17.5**.\n",
    "To help you with this, we have a GridMDP editor in `grid_mdp.py` in the GUI folder. \n",
    "<br>\n",
    "Here's a brief tutorial about how to use it\n",
    "<br>\n",
    "Let us use it to solve `Case 2` above\n",
    "1. Run `python gui/grid_mdp.py` from the master directory.\n",
    "2. Enter the dimensions of the grid (3 x 4 in this case), and click on `'Build a GridMDP'`\n",
    "3. Click on `Initialize` in the `Edit` menu.\n",
    "4. Set the reward as -0.4 and click `Apply`. Exit the dialog. \n",
    "![title](images/ge0.jpg)\n",
    "<br>\n",
    "5. Select cell (1, 1) and check the `Wall` radio button. `Apply` and exit the dialog.\n",
    "![title](images/ge1.jpg)\n",
    "<br>\n",
    "6. Select cells (4, 1) and (4, 2) and check the `Terminal` radio button for both. Set the rewards appropriately and click on `Apply`. Exit the dialog. Your window should look something like this.\n",
    "![title](images/ge2.jpg)\n",
    "<br>\n",
    "7. You are all set up now. Click on `Build and Run` in the `Build` menu and watch the heatmap calculate the utility function.\n",
    "![title](images/ge4.jpg)\n",
    "<br>\n",
    "Green shades indicate positive utilities and brown shades indicate negative utilities. \n",
    "The values of the utility function and arrow diagram will pop up in separate dialogs after the algorithm converges."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.4"
    "001e6c8ed3fc4eeeb6ab7901992314dd": {
     "views": []
    },
    "00f29880456846a8854ab515146ec55b": {
     "views": []
    },