From bf6f1d02b583b05a9c1b84c4dc713e9997990742 Mon Sep 17 00:00:00 2001
From: "Documenter.jl"
Date: Fri, 29 Dec 2023 06:10:32 +0000
Subject: [PATCH] build based on 706dc13

---
 .../assets/2021-10-08-dcgan-mnist/cat_gan.png | Bin 0 -> 1173514 bytes
 .../assets/2021-10-08-dcgan-mnist/output.gif  | Bin 0 -> 182397 bytes
 previews/PR2365/assets/documenter.js          | 331 ++++++++
 previews/PR2365/assets/flux.css               | 111 +++
 previews/PR2365/assets/logo-dark.png          | Bin 0 -> 159356 bytes
 previews/PR2365/assets/logo.png               | Bin 0 -> 106256 bytes
 previews/PR2365/assets/quickstart/loss.png    | Bin 0 -> 62443 bytes
 .../PR2365/assets/quickstart/oneminute.png    | Bin 0 -> 333858 bytes
 previews/PR2365/assets/rnn-basic.png          | Bin 0 -> 10304 bytes
 previews/PR2365/assets/search.js              | 267 +++++++
 .../PR2365/assets/themes/documenter-dark.css  |   7 +
 .../PR2365/assets/themes/documenter-light.css |   9 +
 previews/PR2365/assets/themeswap.js           |  66 ++
 previews/PR2365/assets/warner.js              |  49 ++
 previews/PR2365/data/mlutils/index.html       | 565 ++++++++++++++
 previews/PR2365/data/onehot/index.html        |  76 ++
 previews/PR2365/destructure/index.html        | 115 +++
 previews/PR2365/ecosystem/index.html          |   6 +
 previews/PR2365/gpu/index.html                | 237 ++++++
 previews/PR2365/index.html                    |   6 +
 previews/PR2365/models/activation/index.html  | 433 +++++++++++
 previews/PR2365/models/advanced/index.html    | 106 +++
 previews/PR2365/models/basics/index.html      | 130 ++++
 previews/PR2365/models/functors/index.html    | 196 +++++
 previews/PR2365/models/layers/index.html      | 715 ++++++++++++++++++
 previews/PR2365/models/losses/index.html      | 225 ++++++
 previews/PR2365/models/nnlib/index.html       | 378 +++++++++
 previews/PR2365/models/overview/index.html    |  59 ++
 previews/PR2365/models/quickstart/index.html  |  62 ++
 previews/PR2365/models/recurrence/index.html  | 103 +++
 previews/PR2365/outputsize/index.html         |  84 ++
 previews/PR2365/performance/index.html        |  17 +
 previews/PR2365/saving/index.html             |  62 ++
 previews/PR2365/search/index.html             |   6 +
 previews/PR2365/search_index.js               |   3 +
 previews/PR2365/siteinfo.js                   |   1 +
 previews/PR2365/training/callbacks/index.html |  91 +++
 .../PR2365/training/optimisers/index.html     |  71 ++
 previews/PR2365/training/reference/index.html | 118 +++
 previews/PR2365/training/training/index.html  | 122 +++
 previews/PR2365/training/zygote/index.html    | 197 +++++
 .../2020-09-15-deep-learning-flux/index.html  | 124 +++
 .../tutorials/2021-01-26-mlp/index.html       |  81 ++
 .../tutorials/2021-02-07-convnet/index.html   | 152 ++++
 .../2021-10-08-dcgan-mnist/index.html         | 180 +++++
 .../2021-10-14-vanilla-gan/index.html         | 107 +++
 .../tutorials/linear_regression/index.html    | 109 +++
 .../tutorials/logistic_regression/index.html  | 134 ++++
 previews/PR2365/utilities/index.html          | 164 ++++
 49 files changed, 6075 insertions(+)
 create mode 100644 previews/PR2365/assets/2021-10-08-dcgan-mnist/cat_gan.png
 create mode 100644 previews/PR2365/assets/2021-10-08-dcgan-mnist/output.gif
 create mode 100644 previews/PR2365/assets/documenter.js
 create mode 100644 previews/PR2365/assets/flux.css
 create mode 100644 previews/PR2365/assets/logo-dark.png
 create mode 100644 previews/PR2365/assets/logo.png
 create mode 100644 previews/PR2365/assets/quickstart/loss.png
 create mode 100644 previews/PR2365/assets/quickstart/oneminute.png
 create mode 100644 previews/PR2365/assets/rnn-basic.png
 create mode 100644 previews/PR2365/assets/search.js
 create mode 100644 previews/PR2365/assets/themes/documenter-dark.css
 create mode 100644 previews/PR2365/assets/themes/documenter-light.css
 create mode 100644 previews/PR2365/assets/themeswap.js
 create mode 100644 previews/PR2365/assets/warner.js
 create mode 100644 previews/PR2365/data/mlutils/index.html
 create mode 100644 previews/PR2365/data/onehot/index.html
 create mode 100644 previews/PR2365/destructure/index.html
 create mode 100644 previews/PR2365/ecosystem/index.html
 create mode 100644 previews/PR2365/gpu/index.html
 create mode 100644 previews/PR2365/index.html
 create mode 100644 previews/PR2365/models/activation/index.html
 create mode 100644 previews/PR2365/models/advanced/index.html
 create mode 100644 previews/PR2365/models/basics/index.html
 create mode 100644 previews/PR2365/models/functors/index.html
 create mode 100644 previews/PR2365/models/layers/index.html
 create mode 100644 previews/PR2365/models/losses/index.html
 create mode 100644 previews/PR2365/models/nnlib/index.html
 create mode 100644 previews/PR2365/models/overview/index.html
 create mode 100644 previews/PR2365/models/quickstart/index.html
 create mode 100644 previews/PR2365/models/recurrence/index.html
 create mode 100644 previews/PR2365/outputsize/index.html
 create mode 100644 previews/PR2365/performance/index.html
 create mode 100644 previews/PR2365/saving/index.html
 create mode 100644 previews/PR2365/search/index.html
 create mode 100644 previews/PR2365/search_index.js
 create mode 100644 previews/PR2365/siteinfo.js
 create mode 100644 previews/PR2365/training/callbacks/index.html
 create mode 100644 previews/PR2365/training/optimisers/index.html
 create mode 100644 previews/PR2365/training/reference/index.html
 create mode 100644 previews/PR2365/training/training/index.html
 create mode 100644 previews/PR2365/training/zygote/index.html
 create mode 100644 previews/PR2365/tutorials/2020-09-15-deep-learning-flux/index.html
 create mode 100644 previews/PR2365/tutorials/2021-01-26-mlp/index.html
 create mode 100644 previews/PR2365/tutorials/2021-02-07-convnet/index.html
 create mode 100644 previews/PR2365/tutorials/2021-10-08-dcgan-mnist/index.html
 create mode 100644 previews/PR2365/tutorials/2021-10-14-vanilla-gan/index.html
 create mode 100644 previews/PR2365/tutorials/linear_regression/index.html
 create mode 100644 previews/PR2365/tutorials/logistic_regression/index.html
 create mode 100644 previews/PR2365/utilities/index.html

diff --git a/previews/PR2365/assets/2021-10-08-dcgan-mnist/cat_gan.png b/previews/PR2365/assets/2021-10-08-dcgan-mnist/cat_gan.png
new file mode 100644
index 0000000000000000000000000000000000000000..a3b59c5adb5eb0734eb90d4d7bd4051b68542dc5
GIT binary patch
literal 1173514
z0>1J}3o~59|0Uegk;!Sh?VG+4j%TkwC8C+014#I&r1Ch09}RHhzNpsAOOu{nmMXHD&R`|}j#buk)HHN79s14xXn>AK#mU*J%n(Sf75kr7~YPv!!R ze$J7+$Fx!ZJ4gwD+yU{DrU?a}3Uqovx;OyX|1!e&PEt9PZ;CkrSaZ@CJaUb~1wEZ&T?f>~7 z_C}lTqg%?LXmhMbUrtJ1z!Z2&!6S6lrt6!nz@7k}P@fL^P??{=weq}fg!nnCz$VN; zg7RGAJZ$>jf&%G1D#>_;+Vm~J5Xy8^t`4Ad{dG6mXRg1}mM7<}Nc_G7*W3gbtqGCu zk~Mbp0D6|Kt*6^Y;K1GY<QCb7RpK5CazgUTKU# zae50*ioMETxILw^$`cU>(4a_=nAKJg3Jd+&5$uk+u9wPNG{Vuv!&pMEw1!&wc?H zJ-rT$vT!jQ@&&sKfO+5H2d%ZU4e|Ao9Xznd9hgsI;k~{LutU@>d^LCmpO+W|$(Lmv zE^emgM$xClIrfTT6V_l@MBGNi+0(444xn69r=Zff`hn(MSZz;D&$^DNz_3yt8pznF zPp*bAlF2g96?8Cr5zEU$z61!L9veXyvfi38a4O?lQ(-}#6inJD6nFv(+;wGx{cns@^u6$S zUrJBV#pY!hrTp=yr|c5|!@Ca^yw4-&aWdSJa^srKiex^AZlWx`|LDj00*{P$bp1bn zVAwtaXY#Gz*15gr&2Rv2{Fyry5Ac<81x|#(Oki zKrkzu2H@xWw+dArp#>LWQ^_r8{Yq3pb$G|vbgR@WZR?x3fsIY}9j|$vee5s)0x>mU z3YR}Uy}PZstCwKfDI8&U*(X2t5nG&-FUb!8vt7Z%IRFm2F{P*DyQBiFm{nL=s^r=CdcNn z=0>j)6fUC~BK^4&@+mCQ^8nQxoWe3r&v&6adGz>6oSXMr8%8(-XU{qCX~h}2^b$`# zI0RT6A*Nvtk^ZE$qc1r=v*0la_g@P)c5nhLhQoAc3lcaRpO_l}>p5{b0Ygj?FpaNG zz*8BKx`KL-<3BSG;Fja}ETFs@5I;OPWD}F)ZZK4kgLOEqFnn!(X%c-(xSVhrzb)UmCqqC3`wDTV;Dk3kZ=NBYDje%YOnUB&*R2zSY0#J;{aR4bG zxd~3gb#XL7i_U>{07+h8Qr7EUm+JzOpvS;{1W0;h8v#kcxZFB@+1{a;Fqb z+9wov0t)QIKt>L?AHc}qRAFS!k8 z*H}tRRNRzz-;s69Gff!3MZk@$x5vj3C9lFATBU<&f4jL0V2aRVeGPKS-6Nba0@#+S_$q_aZtFMd1m3> z3HTN*x(BLNaV{u+V6t9qta|HG7>F#ms>@;;}5gWH)Oe2x9jCR_v=x%|tnZ-Y2 z32tEti|M&i749J+8gU*j&~xzMZUE#N_tltHct}LpN4|!eAi!b+B5hfwccTVS%R0{k z2HWUwcx2SW6{h8Y8_<+S6t3$6!G!2Wf3x0lv^#t59DsAi3Rqqja%o$HzQ8SfNMf0e zI9qtE1nrI!UFc;meK{8G5eM3z2*0U*`{U=s3B2t?fY5(T=o!+p0hm)OiI(-OIH+We z4jef;FcdqMIp{fY0IEr7yy)2pGl=@3j*{91okseiegKVIWg5?~qf#Jo^_l}kfyY(2 zyBwj#3O^zgB5@6u04m#QV(!uGcG^?YKB2%fOo7{OyUk~C@4fdv!!vpG@DB_Stp+aQ zlZjUI=+@8OlK(-pnvSNFeb<%X>r=ppc46Il%??)U}*@^ z_AtH}j~qV&H*LxR$b3@@g>UcWUMK;lL}SnakuG5bIRwkSY@|c zYfC5njUY;1wcGE!gD5#u)>Op88Qnl}KzHx%bI05Zh=EZdv?25pv$>RY5>=-MPU#GF z&cRV!!n!+;5zNgu-DqRyhw)oEhuC?|4jsDMDoe|V?+KNND0~AqW+w**5YyuZY!Oc8 z75i)m-L&g-fHMtqNH{ECBDx^wkeVt=nOmV7RHobrVw{6gaX2x-c9 
zednv}`@ipv)`18;#h7w5{#gHZF=7h;Zvt}S{0RsO6awft7bq0K1JUsZR~T@obJbND zi}p3CK84;kJTK~EGsi02VhQU?_mNO5dy8SJ6OGfE*z5BcejS%6*aeUj|WC zIMY??RXLaXfM}=ihwpvQUohYf(yQL}cR!o3A$LH)q-jEdCrg1F_vP%yeHZ$^d9rPv z%k>_Q)%jdr)8oy3JDctC>X_~NnKurYJ*|EL9)B`V`;{0`YT&06O>QkzyEbPI58~ZA z;6Ch?n030EXeAcUj5y6 zdgQF_U9Uq%eA-so^)n#8B8r8bph-Dj%wx5(3~O)-*=!XB!>=AWOn~Ykzy)DQaD+HE zHH#u(8ToArnRcB`fwIO&#z3Gogk3?#%(yAohT|w!%A0r#aSSV%bI+g*m;<40AeZh2 zH7WjKp`i&&l{)hPZWrEt&!ub#`@PwKI!iC5oervqYGhK70p_BLi=4 zD&XXDM!?HjP|y_G5{p*%q>lB;`*>4}%)4i~A+Rgy4?d(#!w#;}&m5s}bQ|Rqvxjgu zQY2i4ZCi`OTj>UE{f^L$jgsG3(5w;zKOnoFigf3UTUsrwvr3zI#`%!Ghhz+SCcvYBgzvzcla66Sfa{=2KmJ#FjP5%9KiJzYRessK*u!>wj?1Q-UehzuK=tcc|Z4VJkF z%CEq7+$`hXEK}dvl@2hsX1}?%OUxty@)8M_R8J_7P#~ee3zh;~IO&o}5OX5G?tXm! zm6>P{c3HD;#_{BJ+~O#PQosTkL59t%Y}VSy`HaA2P^0wWLZi)I{8a6Q1E?U-sUejY z_es_$=LYcJ4M=`(9l5uC;e^0U43TG1Mk)3T;0W3-(QowVbeh%#4V}or&&;tOAV&ev! zyj8%8)vVa56Q_}NcVO{Sa2fUj3Y&6LA{*+Y>z)0;S z%U;44khp9DTx&E+F~`<)BMvv`;kvEiqX-juOyi|MDf99PK&%|!NQp>P$+nmIfl&dxOP*aJuWK&j=AKmv zU@li3>xzf7!;}EX9TG5Uo>1UHCCUU{$R*u#}4IoIWo00RI?`Qup! 
zu&esW@{uC zcN*3W!paY{w(`%Gxp@ zcEc9t0DoB1ETbGb2Paebe==2Zq9mb6~xH;$)oktq+D7W;H|X3;v@0#H0LfZ`>b zv}M}%0BlRFd4+a`hAd-1izR}61IV>D`5jU)?bX#ZX6|wx3Gm_`2}ouDvpJL}%e1W{ z+)gJ}I%)X<0@PVe1;AVdEH`<)NpbOpu-+LSI`4BT;w@cadFaZMj$2B|Din8+!Te8o zFY<96?BPFY1CFfx>}&w|kwJz85znab;#(`!4cTw_X{Ev}?~ zG`Lcv0HlM+2tY!U03^dvK++W@S!0=d?viZ!-)Ty~&QdI!*#F1@sB!xN*_N%fu!mNOea?4IZ}Z-C<${kd?z4q<{aF=;@X0G~BVn$BrV# z9=D~^yo;e507LzV#?Moh2F&%fb=t`6l!FI(@)pt+Ad-I`>GL%<7Xd*jEMWR`2f?c zkV&~Xsj?8NljxPc1%k%#@#8?A@@_bf0-!0br!#!(f2(B&;<+9kidcHT;w7HZ^&LjZxl1Y{{NGOm{AfZ4)flHylV}r*_>0t+TwO^6~ z9NgM9w*x%(WI!LAHm^7Z@EDyrh*Vykg+8>gJ4cdp2N_Y-tr}5*TCn-I;9)$6^Th&s zfAZO*pw1;gL>C4$RX9mYgmzh)nYUHAi~>^sRt-#;B@Mmyn|-|qa!$3tl`*zvq$L>ifdP2lO{wCu@Wa|ah`ad@_LM1 z9zwjl%O1vf-5*PF>Bc(o?}_G{O}YJ zVV9X_2i(qKOuC!l7z!||p6I1@N)!+735om?U}qW-*9f;zhC&rYuI(*2f+PlG8lA{V z#P{N~iE}0+6#?Z%!t2b#&D5Qc&ZD@D(&LmtOd}rDB^IAS^sg}m%G^X8;8_b07x+?= zsd^ottr0&$wfF~;VahBb?<0>q;-PXHn_6wx?!ClH6rVTT^Kk$5$+?3?-ECz#M(07i zEe)D8Ste}G5**mEx5 zLEL43+89nggT(2Kxsa~+C@v(=g@AYvVTYrhnF?T19Af`J;CYtW_fAs+Aa_c^qRR#pM&}j)L-p1R2r4@0DGdm_$S z0r0j6@XG)eD}Yj23d>+bV)_Qe3>}_{+RdK<0tGm z9@tx3+pQ6f?Dpi(fEPsf+n+OmN1lK<56BbH3p$JvIir#w@jOOJvM!%rtOo0LfcK)p z4KZImaN-cw@@qf=!qXhDJfW2Z2i& zI;pHIJyw7u%jY?vB?e4Z8$k8;wOwjH0h544^R#L_PwZe(>hhRs*HbamecvP^a!Cjtw2E4!{6p zAR`rltX05Dh4_an%Q@@AQ+J&#oH2CSMJSl*W5?n00WJXuhff>@SglwyqTB1Qypj+w zxTG1!OE*BS2*9j^vo%j%pv5_T0LBa;L*i(8_7*T&Kv%Jc_=;Ctb-)&;=3ES%%l5j{ z#T8;UmJm00;_9bDSgJV!N=wHMtmWLy6pLKPf*UrddnB(TE+Y(5%ZQ^1@X7&G9at1 zK)`FRFD?PFIB$VN+1Qx2mR$tW?#fzM13|r^RV`rY%MTH%XBM5$jb%GOqOx$(EDtL^O#COU&P-6c?$e}>f}kg z=7yVC8%9Fhk8ra&T$J<(_;V0FXS(0ICj!2*6c4(Q0ob`tBHUN$x60%BBKJ`Gu(Gfg zh!q%&lO8euS)+`8z;Kiikhqp9ka9h5S=zhmf}S}ClVEFsV7&pQD~QyWc`JO8#NiT= zd(!#lkET#`IiCrDyd0_}Z4wG36i6tLP~bU1fvqh$%IToD14-@9AxWedHq}r;FfxEn z-@!xmfD|@cDC1qlh>{bX2N0o5bk6SIh?ns!@%F0teQd#%P7%Ij(xfL54;S9p90vw3TCr)MqqE@-I5eKiP79a-Tn#0R@ zu|R-m0A-$VFSWA5K64Ih>5|MIa#%jCr zcJs>**&*B_y%EpmpZn5)edNp(>x2Rc1riD*6nM5%K)dR803eQz@2o1; zrm9W$!bRvb6zP*Xejf4>tI`{I2_0w?kGoV$fh6x&dAMKA=ovO)5j=!5aQQAQN;k5I 
ziypXk*2pm*y?~ zMRA@M080{iZZKXCB5L?mh;%D}O+va%PIyR~G{AEK|C4DrkSmC@-3fPFhh2C54fgpj ze!&5T)`7oo(gwPjAp}bnlqYeh!oAd@n>S5}n}%Y+mJnfAabSMii*L8X|8l=WnAY}o zJALegi%TUkZU*p8z^PPFYw6}SBc@&iNaYcgw_q4EJ%h_4(9nDZYhgy+x3|sedh&K^ z5I{9ZSRck)g>xvskstc#h`r^lFR?e?eY5?*wKv(_FM7z{@ty(e+1qMY^)%QmFTUIU z=I{R2?mhe<1~?1O0aWZsaq0xi1g6Aktk7S*3~zu_IRN|$fLq2Xx}P$BnITAcnR@Q( z=lBoKESUve005_+72R7$ays~(kB)F19i*v3^meQ!0L?Y$N^-9hcTix`-SKdAKaT)O z8RD#gCLM}y_Ms7cK=0d!)TkOu<@||L?s+YMr03D$B+o+lLBJW;ICb8mnx&C-*!;R{4 z2-kE`X_MO%PQ6`J9$NbeS#1j5R2+r(i?19ZxOwH5p;a5RWW_Cj5wAxU)RKfs zs5^(DNV*=7iP*PPB2+cwU3cS+_K6RFjM#yl0M-UuLX0e3zAD^Bfu{xLsdi*rQiwdnU%pw;jE!NWFi zYRUfdkG~Xi&#GK6{dw}Sqm#{I0emLL+b!D})5YAJ z2M|sFx+ZR?K&|S29Z~`yzYau9x=1LHP#~c|LV;%|1-7+(Q!+h}9d20t0x1 zJ_0zQUo5OkRm)OV2JF;gj3MXU_1RS{z6-o_h|aU^@*GyX)n)7rjv#&n(PM z+tJe}?f&D(?H@mP-VPl6vii$rVF87dB5M?ZAvTf9fFN4cKgou&jp?v#Q_`-B1s1$IUOeWUIC zyKHB3NLoL?Dd5l7<2_{BuEVteFa-+r26||-cA!Zej(TYub|V`orajoj*5u>f!$z(% z^&5IrnE*=x@$a|)+t=9H)8}jiUxXbUUG7_P4RLU>sm*dtiZZfdgs?`}u;L=?28>q5 z0CsY1QzWcl3kDveh?1*-4*^Mq>yuH2AcKQd29vQy3vO$Qyyeno4Uj2Aj0Jp9N`zWL zL|qH#aj^`y5?~=WGVRR`HZ(nNW4Qg1!Hc+#ags|O=|-{$FOmp+L?Ii~W!x6E+L4Ek zxahi$=rJjDClxPy&;QTfd%($6)_dc>-PxJl+1^XC=^>3^Xptgdp(s_kB6_)A#fs=v zUU~K6iybeoSg#_AUH^Ur{fml30s@M{1*Ahr2q7UMWs_|0JF~O({eFMXIkS_pDLY$c zXMblid(L^zQ@_v2KHvGZXVC(QLT%p^!b;rP|^rH_6{z12D12&A&F?ARrx|9nJVT>paH`KOoU^~WDC zXD(ePm%a6P`PTOzlDA!S9z4y9l`7GJCKnD#w0)%4Gq+2m*EqulG2BiJdyVHoO37V! 
zhS0aXaFOonG^6vXW{)UUqzan{?H>J-kbg=5 z--fh^J{`Hz+hl%SeB+wy9alX>Ef>A}g|CSuvq!)oDz`_5fFWQABqHGdflf@bYeQh1 z5bzgY$G7~|)V*|ZeWBNurXq`DlP;00QJG`BD06Z3=5x2Y^U`q(rg_);+)sB;$=nvq zUm$Nf`y7dYWxTp)FN^?pA@E}}Hq73OZCrNCmaSW42ewWj%c68C1o0WSS)o3*6(YX~ zSQ>*rGUiJ}#aoDnqX84KNjCJ!#bB*P_)wf#3h`h*ylt`>R|pJ>0Sjp~$SqvjA?EBr zyqO*KZHQpr3h^Bxf`h7%O>(OkVnZgGkA7tF3>`HIC9iwIe(2T1;E{}+&GkbYPB~%@1!dhUbuxsu3ajQyWw-?xHYj_f=r#v>aK;+S zg7~un`!mw6fLr8Hk4?{-TwqFuOH9c$yNL;mPr02;MrIH(iaMmJzEO=ARs879uAVs0 zd?)ZBlh@Y+36Ms+JM>o#B9X@W$g1fY&Y+gM-3*N)$t>|9gVh*Ec+QQ;-F7zatU7U| z>$|GNlkXf_(+1fGcTe|j4OYqaag7_ro0yZ?ssSaQ(esC8cuw{>ESXHU0V{6^7y^cX zAz%nhB?4}(r9hGEuD9~?j08Vb`^OCY;FlwgF2N#-SjrOaQr zNERP*m6l!#dvdARuEDVzy%Ff6L0CTgEeuTLVKvZY)hwI;*-| zN}zu(hG$C{bS0s5GYL09D>nm6*0^pVVx>(QN=B$AkrKrmiS z4}u*Iz}1)pvY;qWNm2}fCrOI(K`gleHUnDHqlTL|_dcWS}Eu@HDEPGap_>5SM~v2eJ#~ zxRVxvR1Zi?OPdnsFrvCslu5(P82U%j)X!j$z$yg&Hr>R7AeS}Ol_1~wAaXEN1SzJ= zbRWEr+FM&>+pC)q$gx{Vm`*!q66O8(NYA|Y()i^=$s7r!XB0VzGYumB=-@8?4vZnu z4n1v@P?C(3>=74|O_DwIkJn_VB!c*b4fb3->^8JqRfonauLnpDbfc0GYux`%FR9YV$gAus1y|;Z*zxwy_A9Ybzxt`UJ$%o5Xb-s z7WLqC0q^N0(Hqu4i%}315=m8WgP3%{sIUzCFlM1&-0+*kEE76ZM(xglzPM^u7WP%F z*KK3SXBZHYdU$%>Fm_NY_FBxt{XE1ts+cuT-N{23(n*Ww$peqAlh>|ZAwRn3Zh3ik zot*djGk{+d9yPB@NZ{22@1qxfhMf;lQN+=Z=n2C<8H80}X=>bP}h9dO0I zgTR&soKYA$(&H!wTp4k^7rK4!%}7i#GhnRANyfV~0Na2jcr?+Ihh#1@3+ei%ILv+W zsEzX%9y_82!shcm)H4z+l9EPB96@SHRzYsrrjsI~Jle(Uglcfc`#_rtw#dUp!uR#s zF3l%_d>#3o$89iBPD}8F4{u30X_t|y^)ihQlRKs@fzf#*L8g4d5<(wn7{~i!kWCW5UXZgAY1Phg7l0*_=;grJk>k&UL^3PO1%B*^Vt+yOkwG0P z^`cZS3Q1}PUCBWVl^6}dND>4tjCu;;8s3II8c8&12uT8y1+gaM>|~=%H>9XE{G5g0 zkX7h&X-+ZJ%{j7T_fGlk-FM4t=M>3r?uL55wOQIx{=UZ^1Bn*dy>o}UYiGdl@l3d7 z6Z;`dW?!UFIv};AF(jpo*^sQ(?}R!UMw%H2m|1}Bce>$C#Eq;A3*h$6IRv@xhdVp# zAh`qM24TlGG3fsgJfEnJXVi3cZw5);XO_Jwc4QEFNFa$P#;IwP{*hH;jFdDojl@#j zv`HvE2_%voiNu-tDWOF!NPE53STTFZ{^~XJyJzIRbi?MsG4MwZ_Q72sFDu4J2_>(Q zi!XnYM~ab24mbHhC5ed@Hv|j;L%ql30=HWj| zhj1Ow=P%PB1zHcE>6K9@!77qje9xe7Y>d#u14_ecbbp`{QBbX+$E=M_L3D%P^h?FfR7F7ZOPZjbs2x 
z`fEV@tz(P86DX6U2lsSC4??oYhGMX}q=b}hvfKcrenNtVx=?p+P9BIVq^po#RwFWa z7V0PpgygGVxI(V`@}K0%7oU+W%{3CtFO!^n>^q3@_QFGmb5T%OEWJJ0js{Y~{PJR` zgnK~HV)Fcp_43NjCdq|YQVjVifdu;{^4gae>_I6B9YADudh5hMe9#*xHPZ5{DkBa)NQ zA*$h=myB zYNR|1BaH{@^i`3p-jM`yCpf1-`c_e10UNw(*|8I9ZAiVEuyJxL)YjQ}PMbyUk64pe z0}==69tdKfKPWM51CvQd6jVO|sUq{SV%Aw+j%{5?dDsPzpR6DpU_d@!-L+fy@5=<4 zDF<;Xu7cA`38Ig8^h;M;AJpEk2ZWSv!HE?zuNIqo_YBH&kJQ2a8Zm${&}>LpBiNRv zqBvje7z>a$679pjiy&eog=nu2ed&R8ltCl2;lPrYj|k)lBw1Dll80#OIoQfZiV##1 zwtu}{3_ghs#o-W?2|`IJYyeTjd7V{M1qo_EmM@%(SV$%E&y8KOW7kfMA4IyiRLVh& z3-c?n*>sn*LDip&m_1?Cvt$AGjRX$QKk))K8t;Uo4GHCd27~MZrk&^;@eCm^Hw@=| zMPa{5X(Iz?l4x_j!*E*SHajH5bl+yoq3Xgysjiu=?QmK-ze?@=3H8|;)4en$X1E47 z)HHS5Rx;_Nkd*w=Oq{I+BG9EidNeM^`?Dv8(cx!{) zQ8k>6y199DGt{~@2>w`_f#8gJ8M0(mxxCsrBrmLMhEx-QCrR?4b{;@~Hg1YSRo*YB zy?(A-_tjOh`>7rBhkG|kLt{6jc92Tqlc%1{86by2sVc~TXE%ML;ar0bcfdiYwWS5Y zDa$~3A;E;?s}GXGU=|20oP65aXqO2)OA=#BBPkW6jU?*K$)%G^hU{rAk!4vRbVI%J zzJEVgUcKRWvbhb?M2wG;!gk^V^|xwnLC%QP1?L_ouUS+heTbxf&)R1ZD``Lq5oDZ% zpEjk9u-hywE(8IMssj=?!tO;nJ=8;xw07f~(%S(LVLBnjpw=elj8M*gq9&LUbg4N( zHs=ah=`?%u$r~qm#1;Fe4|$=JiIYblK_1mpTcwdInp8w~MfmL?$VlBToIxJKWSJ=iZh3;{#H5HJJ`f%HTm zah2u@nsDzHQe9T}89UQ;4M(cKAd|3gF;{66p6*{t>(Zy%h__gC(RTMdF?0(fb#sfD zJ$rVmdNUs$$F+#4y>QMP4NO>3Q4U+Z0uVKLG=t1=-^5C!<$_d_fYH{EeWIr?9cD2JZmY4VwVi$6v=R9jpQvlRu-K01~~>1 zy^lHmGzkF8`%Nh$k4dP35~&9F&q8I+OIwQNX5vr!KtqLWB-v+ppRa{7a{*w?WD!7_7X z_1s#y`I}eBacm#tc^Cw5nD&$;(jb&WAb9g;70Unm$Xld3Tq^h8@t|z4YgQ>IJ=eLx zcqTl;=?UN5*aExKGGGlyAcX5dXW#?`2`KuP58F%5M<43Tq8%h8qwF{dJTdNt9qP`V z+f+KKL`?~sn^O`;&N+|x9P>)kd3N@XB>c1!RnKc)0}~ew9)*$Dkw3g=$nO&}zI(*V zNer2X&pooJEiwn+O_E83Phwu&`)59LUE>!VUb()KGolNcMpb*Vhe;!oMbOF{0)~Jg zU?-kb<#qi= zMiPo1kgONhJuS~Z`J@a$3RhcM1)IAZ$;m4OAsm9|Fhb?PTRRF{y<#|{ka*CGHy0$c z7?L#7)>wX2Nn`{>iiD#FA*H#->(a$LQQ3 z@Ngnxj7j1eMca%xPL(!oJ}Dh0uj)etcrZPX;Q?R_8&&kry&I@Pue{0Tt#?p5s3CPQ zMKXOY%z{zu zHTou{lPryFw9v6NQQmF=<1cqLgQ-%NdV8^Zg4yEn>C+ zlRHEFtF3?SC%0K^?PwA3CMrH(Tv>6Wi!@I7;UeoEd(Y4W7hoPd@fjC>my0({;{wkV 
zJVhoY6z=Z8G_=*()+(>OvN;NL92tv6AUwqPFPMm`9#TuTKLoxs^#=^UMqKQ>cF-y zJ+ffQQdKK2p1V+1Bktshi)*C^DqwDzQ&9~2N>thePw3XhVOcz{Ru+NZTzu*rd2kE7 z$q`zI?NRy}flXx*Y#^IE7+9P@fOvtNbs-pfPe~7?sUW_6pkbMChC%-_(dRI{#dQl8 z*izCNh(Wvy5DdK=62hW_ko0#n$jh6yO84FYIpxG^DJrgz?9yzx;=Cex>l;_dj;2lo zVQ-P;ixHA`3>3I&?z-@W~*~iFVy{Y4b8eac6n*3+H(@ap@EnZc}g5J zA^Btwb*lU+;icC*rFaa+&QMBJ6=wtcAxK;43`Lc^bBZA;q(hW9-<)&KGUioZCGVcW z#`RrcI&zSmI!Pp@kWRhrODg%;g<|LX!x<*6bYDW)4iElTLdZtYPnU|kyvB!V>;ThA zT=@?2UN_9frS_2z(Kc-{Ttoqkfy^?=!^mWk09tWFzz~?h2=EWH9Vw51l0IE%6T6DZ z^XE?&doIxY=Mr_lLy|+R$Vw#fSfxg+hll3nnJczC+!ddXq%oh`FhUjExOcZS?QMV> z7g9Q?T8ALjBUvoSgGv}d!dqIRQdV9Di5s0*B4{uJ4j^Sx-_WRGfhaZWgKC$XNoT^_ zni4>`a)ab#f{-G74M^J3MGNKR<5x*#B{t5^+$cGfE9A*P+@}&j+5__ET^C*=5C8o| zdE;p(z&qUamG+cDkl0denxw8I;jneYmKhO?`4MK!5<${=% zA`o~Z;!uX69_F5ll!kV}4wF(;w#86ALjxd=kPa4r7!QGDb`Erb*bS=0m0@+bCnJMy zFPS@6p1pUCbRG8^c?My8KKQQVrD)C)*m71WrJ7fQ&9CQ1yd^Dj(nxn!bv8*P2pc7o zJUgi+t3W-pS=4vRd-MEnE&2|*c6jNYq8&DhWRhSOb1pWnEp_Okq|$W@XXa-!=qt!S z$t~KW&iMV+b(AYE7i5wG{P1A9 zhWX)*Cb@L7N!G{T`amndFYj^utmWJS&PAWZuN{PjBqBiah_mYpPlHPSNIr%@rlOG^ zB{5VxN9cS4vZf9mg+q82fW2S0s)hp~pM}^`kX!ZT=jWoHVOf3hNwWOp)za79E>&g6 zNMD%?)p=8x@Nc=p+{V!?b_P*bZLS^w^~`{lfIP6ELQAUw`CY|x!AIT?L$bSan2 z$X=NF?(STs#1h##osQG~Uw} z1%+nRAI7spj7D7Nu&)siau8g4adRt4N?WP6r#E;%>Lwwk`k2Rv^v=C4?)~zwIp>^P zCv8;eBkp-t;t2AlpbgxV}r&eVQLv9ZkftPkZ_IuWC0zq=P%^`aeLa3W8II2*xM*&tlq?7Y& zswEc^v<#|cVWY?xi-VA$4Zzlq6rvZbueq^F_U!FI48$UN>ZNV+-8KKe{Nahe%j%O( zln4L$EIhgca_;JRAaYQpqaI4&5UX6?d({W!=ij! 
zTM;rSU$Q}z+jXYIoR3VopLJ z&6rpvgfuPC0q87aUXjjcn&w^e9UyTgp(OES-;to#qGH3utUdtN_q%RcCRybPX`K{C7mv+|tmaq*37t~W+`2oyOlb{GPN zfFWQAOjiWBQtFcGiG6UVf7rcre^@GNVw#`dUBw?`Id{bp@*u^JOs-G)2#22HE4Y_J zCIJtc04xaU)|>@VB0EBtZR%G0;Tb|E83hC5>Q9dsRzcBI5yWV8aaK29*U zY!}l7M)AznSER9huQWDvOLZye;LvN)-vV^A6&r3B$uWx;%j`LG<>eRGgF$jbXt-^I zJyM*QiI_nfUw%cZt1EQRKza$av^6W!D@4B-n~2fHDe?=z@^WA(00swUl?9O?M1Ay# zd1>obIdgT9tXwfq)^C1U-K_IaCV+elL`fz^4D<7Y(AQUj2n{0!Q435po#A55tQr+D zl4a7k6EKr1q5lXF2{9_gpe)}5F)3ZJ8Ly`ROqF>D!ORPB-NnF{VDV(>y%`aRadUM~ z$3EQ*oEuy_qe(~a+`HgSJ4)b+|1fd-0c#}FC4*Ha$#=kerjv{A(HK)-NTlP+eSGSu zcYKT|-gEz34A~=N{^&j102(;yp?7S@-QqW|n4j#I4AnhhWb`B>h9q8$d(#h6NCP|?O6Gz1ktnh&If#Hw0!cE-efs*4hpyT*gvSWL8`9Xuo5Qt<9qB@eLAlhkaZkAd^7+<+)q5S=^ZHQ&l1>J2cE$ZZn^&4dnnWXo$2;5G8~fHvH;=)#kk9f#5!By9)1j>gQNctCk}#mFlZ$skq`Aa?;dWP-4Cyn z7~%|3Z#%%f{6N;aA#SCYN+{&E=V2NNbDt7PYycj?STF~aj=i$GQBGRXA$P3ZfNgOy zKy*8#1kYyAt_Ci}O0YYr&&KvW)b9_FQ~@)3zEF%woqV*T4>2VnrUf>`5Npz7sTBs0 zB#GR3oP@0$(Z`7yNnjUj1Bzfg*#cVwZe>H-#ZA8lfHn2unIP7Yo=$1)Xp>s>I~Vn` zEXf+D+~Zsy-W;9l69at$?o#bGk#Xv><*VT_c4HCQ^*B92$>k;zE{^)y*H1>g5kAut*Od`Ecjov~wl5IAFK;2e?EwFTgMnXq1C&|E;VQ0qeKOJlf5W64Gu1vFf| zH4lDM;^mfXrg)3Br%4>9FD|aNMlFOZuH0G!Q(2`eLjuC!h=tI<7J-npL9eWakkFmd zn2%(P#*ekuVXUQ9+Tq3w zOF*88;jl-UdbIY{G1jFFMN zIs6E8%Y>LHbYqXf0Al<&BI9r*Hk4)#L0_DMEpbSei1nk-Z^D*3+yJ~3#2LDO5W_Bv zIRMg_35j7o^v|rJALTTHImSB#-FiOAa5oGYi%SX+sU71VShw=FcE7?{+#H~kvmkbW7RJt zBx-ukc^{EJy7z!+;DbE8BAIn$l4VF9NmA+gqA!!Css5RHQ6K|Y^SyOtkq7^@CVj_T zI-jvFFh-Y8n?aAYP7+D9NkYkX@JaUY{|6;okCitBrW}FV+FI>HU0vOj_r*Gp+6egX zi2u%{c5khBzg1wP2cETfuuE8WgL6A!*xAli=*u4B4DJumL1*v2_ zpKN?U2O#0ZhR`vnYZ2UW{R``4=PR2a<%FsjF4UEf;C=9_3#D#PBP4LWvTe&YjkY}u z!^B~zcZXpRIP8(Gx1GCGY7k#12hz#D?hg6U1D&#TPK9iF?pZ_^&y<>)8W=+c&_7UF z5LkM@bkiUaq=kJa;UftoNeMyU&3HKkfcFe+*$)y!>P1y=3`CBkiR6hMFWVd2)Hsq} z;|mrq0C6q=8HL1_1QmEtE!_cvNV5QX9&zu*UbG*D6qHik4j4U>bOuP0P&Zw#8PUB0 zZthV?MrlAwClk8IbMB%bwhUrPPoY+9Z`0V+qGRD6kNuQtg0xXmn+r)VWATs_Ix@&J zYS4R(1~<6Wu~2&KZ<5V)-vCn8+91CAgLJ-0B#Rt7{;@lbJo3i-@C${KOx(x6n=m6k 
zY!fF^MGYf>Guz^i=HPSuaC}U0AHGTKFYegkBo8ByNdjoaQwM=L*z$qLu3fuQr&HFG zltjRPNBnmtB|B-I8dn6kM#m4_=%TADZNd}YD>01c#D!+k#hTCD9T#9+;U87s{J6uV zapCr>afE}5J5-|}Y8`Fuy4N1#^)T2Y2@hk-M7if4&QxKOO!dHrjQ1jBxl3T&RP!VGpmf6*Xa^<_ulH#mfc;)2C!w)BA!c;!V$^ zl8Q1Zp_dIjmFN*f&zv0efo(GYC+!=%;i^rQJf)g++3ttbkpzqlX8zj%)y_RqI_bheOu?9y@N7XUGI;%-MIaShT-vPe=%>8DQ~ z^^ALU4`&(gK$Tt}Twk?nZQE-PueJ6AW5?9!c>EqZ!s*vE_OqMav6r`!hCs?7;FAHL z9HdOgtRpFhfd7v8?@Y>d&^k2k2ynsGgM87y<|P*F1PZB(vPyk4v$r@Sk1ND8KCDdB zNGM%p@o3|Tm1}ONnxy)WM2BwEVc6XDVk7DZHf>hIaq2Y}T`l5l0YJtW4~Wt{?s>?A zDp{5dqTM{Gb}8ZN2YIBrn;SgmA-<0q6{0IW9i0+Ez{EnhAyzY&a{hiFXvH1Kp^*1QJfx)ebvLb@o9N%pN?W zEo1;BR@Km8Uzif)gUJAhFz0a)ZG&Fo!vm^iY)qbb4DlNU-w}NDiv}Jlc|%<;*?X@D z9A77%^gVQ6P?*Vo2mQ0MSmp#q8R&x&Sk~*cgG=537r)JY^Dvuc(QIx`{B{+5_jt^| zjZBUiIxCoJ2>8U{;fEiVk9_1Ksn$K~PYNNxcZ7%k&ZJP+tSjS)fWMF{@8d$uU^H$$ z)>d?(<|<2O!;iPBvYcDQk*K0ZF1{m@AUyQ4ayjPVomqFntv>R8QIbV9Z(B;P{RA@+UAumFlYH&lcgxN_O|oD?4oKRd?A_Z0d&GR%vTcW~TDd~*fApUe zps8IU?FgeFfwPjRf`S8&Dw-joEEVP8T@F+1RVv}@NMwF8}1 zTdk6~7*yJfAWkyX3)09rh`~mZB)g6o{UP8< zms%D|xK=|n|4#)ndDZwS%*sVyi-TA=e&Eg|4EoUpo}S6D(Zfnk@8vFRqe91!VGt%i zUjX@YAoZfX8a=k@#Y|F3PhCdG?&|7;6f=r@*bjnu_CTFmTIQTey0IBFYQxqyQEA2A zgC!til*+MV?I1~=AYa?IZG|*6EUQjlE(;5D$W{5Pd~LDB(Xtq zio)7P3-)K+wtXk0h%sz--3@!eJ~##yO9A@T4OKY%&qkz$IvmO!s+XSQ2xpWHDJQqqVKnhUNKM&y+N4JCob_`H z9Fj>&GY3i9GD$9x4f*^ZXHSSbonOqqp8b1tK*#ClcI1xTVv76Hf;R>)4ys4}xIlh) zptrn467$8ASrYFA?Mbyw^pgj#mCNG1uj<^@=T+s})E5;gQ04RMkaj?88T&6IkYmSg z%1in$$CP&@g*tHc)mO_WKlw?y_10VEyz|bJS+i!1-aGz>vSUUeuxHO6eK$C0zAJX5 zKLY+Dtl?6)aC%4liYinr*6yAQEIwS__dB@w$J4oLb5Ymd9uYthmSU=&Sr7l=%zN7S zapUOHsw$AP9jZoUTu3^ckR%PzrVu2Hp1+hz=^k~Im_mihsN^9?#=;2s)6aeTKr%@v z=z&aK>k!g)=sVF6G zy--oNbigYdZRR2RcNF9>gmVMDwPU!ZBy$+Hi*;~12}1SFa6eSJ69b}0PiIG@p>l`Q zPj?q>JLyG?O{g)3Kyjf;PJ7{@oq=-9PiK#k@qD<6jYRwg5q>`C%^#N1qt*D8xqK?0X;7)3n>9 z=Xy5E4rFG_tl1S(59dn~WRgGC4p}jQ)`9XM%Kw;K5#6GD~L9S_r#K+Jibt zF**UxiE(E_`pW!#FVIFN?n$D9@SGnE;yG+yi4up%LrGx5S>G!qjP%G>wyJsDmleMd zWTzy~j8%z>SO-=3`mSiDsFw{e9kZfs;hOpM;5*ItoPFdw&u1XV>=#L8vWJn$WD%Ug 
z@^}U8*bjj-&pcDkKmUBW_uhNu(n~K@!f@r`q>}s9VYlgt0Hu)pl0hj6-x0nmcFZIM z_!pDJ4vQ#OX)dw;!n)7jQZBk~z193gNU0{0bsc7ccX(O#PV-=aR-UOTsDx!eNWlA) zFXX}jWWl1vP}g>X2(@EF?RxcKhSj7rLxRY5qp+6@Vmlg2*{E7&97wi9lG)!s1Y#M3 z%9nW%h#RWhF4*!>vPQKrScZnFenLV#s<5+e9 z-;FP9Q1ZtuY6ujGW{9{!#a&o5Tb3kD2 zy-RLhdCe3@e$jXIEsTxBB_AX@1J?-nUX))h9R=O+0B=USd!XL$0!A=Pz<8ZB6Xj7i z!weOYI0Fx2PtwQnle99_5dVHrAFng*gp%49WK1BCIy}^eI?*#K?)^jinQ-r?kL7Fx zaxAD>nUqH0)1Usd_Kzg;mRoL-`|rPB&N}NXwSA-g8$-HSEXkCf!lXOLSda|iLnj)3 zX#YqRHV=|V{~bxXA**1TAwVe%x~#Dsv8eKTpMwjnTVnTp=#|4WErM+7OyerkR6lkf zYjBbv)_~kvfRZXmy*>~d4poPXGt4>Bzks$!`$lfKO_EJ7Y^IX{h7hW!6(1)tbBWkNkZ+ylt=BD3W_(U*PWR&@} zRT#-^5J~KygL*UoIKO2;7axm7bYIF|5M~!D$f?*Jf#mK+ppM9@?-mK9zJs`9zq}u% zjeMUFw4EWCD1BrJrt5)|s@2`yn136A95ZxQFjW!YAG@!7XhPU&h8I`$mwoIrYuVuU3ecyeuw{f%V+WD+J@wl@|)t!JqLb(khg$$L5g;`ZI3plxu zM*3c$E|PtcXA(co6C0r9RMp8OamYgnp^`qN4?^uc0FUlIm<%xXXZPOSFkPUu6?u?P z-z}7JcL0zTunv@^Q;s8#?(BPC>wCq4;UDvpOOl{7KBRlCo!l|r2A*SR`W`Z$m+6ad zIukatt?eGwzaQ3|UlFLvthyj@eNg#>eFx1JzAXY5MvJB(HgJg+H$t^v-UsZv@GLsup>}kfw!l@<2jiYY8!44`Ng@EDl2ekx1s{W1B@V z5E4R?J4!+OK;T^bN`x9hz3c_;04Z_H2f^b1t0;ES*^PFhAa|7@k|iLET<&_n>$2UoJCu}h}(8#VmI71Q#uXVPUg)w%*ktZ7x zxmYjSM_*_cn3YuovgyJg(WaARl6aJq)X1^NoeW#f40-XXN8wxq)jY;rUbR?GJAI*) z70;E*@}NBZ_@naOZ+{k~4r*fjFo-#Wk}F|DzZt8NyV!(*PXTbM#9rXluw^a8ZZ+8; zxg^LeM{-9(sRR!FcYCN4F3uVxfan(`u30cipjiUHG*E@^_D*NWAZLrHL{wi6)`y~e zad6K!jigW|l_Zo(Dj|u4)S41XpGZ;yi2|;SUb=Tyf79+!?JOy!-1p!+sLXQT4$*7hGKrbDRw81AL@4$_$>$Vf@=P%sSydKvjd{?MQeRCrlof+M8`Ou9 z$0^>*59ew`>@Kh=>VY#LuA?ZCOormitP%t|gcz_G;zEkG=vhO(cPN_-0V2BQ7PxPh zLT`(R+PE)(_%XXb8v!M|Bo|&cF^Klml^V=;0F1Pxv=C)lrL3x4-I?3lI>0mwq-<6> z_GoN{x-}E}^E^DmX5Q#$7`pSaqGBlqlWnO-&_;*`83~*RnOFf>WImW`6~ubn6R{UD zcrr1DSvA!%e?AyA#FR~Y_h2j>2U*{sw4!~oTk5}yi}PWO*exwE9;HDg87OI+8f#Ki z3oyHThUE2czDQOsEs;Opu~yo;+E8{FtnV24z(+nI`F(YA%gwjSU+eyiu?qCuh3GrP zpNs)S-7pv%=bMJBIb?d!-xtyN6Z-TZylM&xU`&dABFVbcTZrrn2j|x>$3UI>b}$d- zl!mOC0T^$3EHZ?;!eIDh;{B@oM!TqB03ujrhhT>ElyQs_m192oJ*KGoas;Ey;5{CZ zG-?bXepMsXxAd2s7i<3kX{Z0LX) 
z^_sDPsI#V*O9aM#Yz4A4=z4QO@UR6953r3G^xG0biy%k+5fDj`Ji1-e&=Gc&O6*8l z>A}(kVwn#YZyFSLL#LgKeA(fkv^6nKjz|FrCcR@wPAkgG5ty-1>#xTaIMj_(4B7?V zHjf;L+)80|IHz{5)Xtx)hL>~$r%`1Xgsd3)Vj6?Cf*ev;UPwYq4<1BO2Q(1fWk~}_ zYY68e5K4;ED7s^mcIxXx@RnJ9+8d-CZm~oNg#a#NvAKT*pcO&KA+x0{fPTSk*ww=snj|h`Q$66o>JaB=aWR&Cnp-! zYlvDi)E>aS#~jyKj~PrL@9ulgLXmN>1sE&#R?ZMG1PlQ~V6qVKBrmQxP$KSP0VC z3y+*IB6knN$gilN5F`xl+2|HG0B2xEN;ONcFJdcn)TB#{ol}Z{kF#oPV8~et;)URm z;fRD`gh^|_`i8yI4843)OEdJ^&Qpi_@(M)u&Ihr~LSGoTu^$9FoQK#$Ajqwdj*?_$ z!vkp5>Jy{~B=;|mKL#Cq6p}!W}*bsPMhPKN_o2>Zy8kj|{RN&MnU@Lqkm^ zu&9GJ2b5wCfo#V>EGeO6@K8rMS-&HbetEW|Zy(yCK_p%jeOFYPig!cFE(tICk?b*p z$z-z7S$RXi5HJJ`fmB8y7s36wpkgh>;TCWudboC+kQ5J3ltkh=$(!caR4t|DyeppY z*p-Rb!9;!H2mr*C>MqNAC~e|$4$)s+It1cMGDtOcKDL@E#~y;E^&nX=EDV4sP~A-l z6wAgSJ&U5AAf;O%g;aYpRu6kma?ndJ6dK+kjYO7`D(tb14Wzl(BH96w%g=)$BPEm| z4_VpHduJGJFhFGxWGNdTx`g+@^M>StZr+rp^~2qm9yQ$Ho9$7>974V>Yzag6YI?(D zA%;ya2%E+v>H+a0@kD>AK5qm`q*^tKK9*Ee$Pm~@duN-pL(SgQ*Z||o9$Ys|BS<8r ztMshN070cxmspWR4uNRV^Cko2I2UBFh;Gst`1XxkFh)c&2eAvo{a7VkKE^ma?vq_$R@(%g?U)v$|gvkKOHrip7^Kz)=Nw9`6mcAGW zAv!@dG>Kkeeld7sJ}{xNDd*Jnk9|}XGSd1%#G{cGoj1+}4JgAfBJCidrle5|QH|`3 zB}1VcCFoB2$mb-FzGRZ*(Gf_JPV_ zAdY;;aT8Z(^Oa5VFant*fL7cPFa!(%Lm)K~D4@iP2NrOrQugG9ymwD@!6hSf&%L}} z8Al_zB7>CS7s;i2G;-FOGz))`2(T@8R*4|U$Y6$(SW(J|Ra^pC>~q6;hDvHQ#(@l|O`)0&QmA4j=>kC|(WPx4Z7@5aq8~z^S`o247q*RqAZ9`m z+1jCdJ+l72aI0p!vnt9U?d+CJkgX_O!%4m( z0+~(etw;3K%L(Hl`J!BpWKj2`B=R81ByA*~jM~UPs1y*kmNX-vNdhCcmzI=5HQykS ztZsa9fH?sqqYU~;DY7Gt-n&BF&>y;a6K~d~UnFo=@`(Dl)zJ_N^1Wg{Mp4I&%kRH; z1ju_&d8r5b&mSHnzIy3iVNDzBFC&m+MQ&!7`{M=g#Y2r4I9TE34xFdA~uW+_kq_8K^>a~sbapWHkC-g zCNCQjtzk&P*c&?5LFEy(8@?(Huw!`VO+wPe%?oxn$FA7;0!r4TnGu zhd`bRi*ltATfWK98BNh?T!XU>11bHkdQmN)3M6@Js88`=^ zPG_X=P&R^Bg6s_sQQA5P(pQGD!z~;Hwh$_N2A`w^mNt+iUzE_2F!GxAv9xQX$w3DC ztr9-&lL%?;@GvMAJB@>5jYhHE55_uc_8h428#QR;P!2Fh9@c>g)V}FG>LP)qDqBgT zN*$rj#+gz{-ChTI^xg8#EJu>dGWu1C^`ULPo6dz>yMjKtG?k(viI66H7@15KK`UC`7xGY(*LRHSah(I0b=~MDW^2AN6DUBl$p?aCIYbfa&gw!ko 
zFX=20pBBXKp})r<>;k#Db_66Zkd6Hkp}MBs+%P1DQ~~E=n;LacLA~9u+Y1gLFe6Au z1ocr$N{13k6zK__3o=Tw8xH1TGwVL6YNM(?Ch;WcBvI`K(Ilzi_9`K$ed%q@z=}g4 zzq@ws!d{Em>>Ff|D3PqtOPJThAw<v7)%`VVFYr_&{;u4zz{G541rWez^NYD zLq|-wl=|0xzg?QQo?@{bp$JIhJqUu5K>rDgVxKk z{2_@XLCD8`gCzq}4Uc0IHmAm_q6EHtZ3Y36(k}U^&d<1<&*EN(6?IbC!ql%c$8DUC9bBAQxD~;Nh5)eyDMj5f3 zlENNrGoxb%G%<`R2;!CpiDM2NXa+%igCuMqhDv^bJINFGZj69HGBi>Uc8dk2m8u?Q zf0ry@BFmO802zfm5yYGOAogIq&8<6tO`k?(59Q~|{Mqm%uc(#?9E@JwQ7;V*?Xqp_ ztJsBNJ;Dj?lAy#iiaNsy737z~-v{H&#vX+TWmLOtt}HxeF`RmeK@8y-155(&-e!0n z?s~*MGx?6``++|5ndE%b$emy>h033&Vqfv@CD6228&V?o<@yiHg65v@;Ltrcj zeD!nhxlP1u0mg!zl`#ayih$Z8?o-$;r28c4V(U`8bvEI#TYz1HTa>Ym?{|4lh(At9x?`^1; zGLXrNstRm#15a~2?_>W!J~ER2_O|t&sM`0HV)iPDN zd<5c4v{ZTuj2fCdlyH*>=fUBnyu4WBXI3E|<|(V@Am=>D89}5$io1Hc5cUUA(;>C2 zZ)}vt?iT3z&^h$aoX3P~eIJm<_dwqPK4Uqre%!{>alwCR z)qQw@6nBRg5T$UP++k!gIsB}&Az%m?0){}UBf#~Z2iMQUYfT*a+ zGPpGF{=jD=8M*k6NH#~(by?TST?_uA@8@7S9&Yj13wN*SUMjMXd~ke9LiC!1g(Sj# zlw|BfQji^Fgn1kRbwq^uR5A+5C#9n#om5-Xp0ghxNb($(Amb$Uc9DhuK0LUP!Hwm`5rB=m*e)4{B4sE!fR2Lv&Q4;^F} zcd`$}HWN-m0gy)B&k=-o7&eU1bL;+xRCiMj+zqE2ZcNQMl>H!MbXdwO3d`QzEs|Fd z06Bw=DaJu^+}zZxpY5BssO@FV+!|T3a*;d>A{v84Gz1b$B7FAwZ;^p;5u~KBGex{e zdX8rW-~kU(NK**K83YW-sRWoLj<%zeJaUT~5=|v|Zl;t}(yp=(V^Gg(c zZ<6N$m@DXfd2_<&JUACh9668r%uzpvYjShz=8)1+z71LjJE;FS_qgOQeX1wjen}s& zfg}zikV$}NMGXN%Ak7ir-`xXQMVb#fjfTg$b8F|3&!b<)gTbGEC5NnQjE9q2p&JLV zOtgT1{;M{66Yhy}0quh;LDCSvO}Nlhk`DLOJyA)O2Z>*imq$qkuTg-d^sM*3w3BHh zn>Z(s93`M2Gd%l}PLd=_Ih`FRuLnW&;ABH59DMqr3XekK$<4Y4aovkg31-5)l#d?Dv^ive@?Qchsp@Hr4V$+F4@!kNo{% zjj=f!WoOltY5dC&9CtG5eU3hpI8yRR2b(|+osu9a#2D!LDnv;iGP z9lYT^m@V?dxd#q6;c$tDc%n+080#z`QzEH!<(TH`p!pk`Sh<7pYK0RKd+y@ ze_lK%A&6(=#r}ExvPW?C>+#Qi5mxEaoDC+-?euU~oP6db>|fBYAmKc$Vn9D(-_AO6 z#qNItd&YNF#CIrBM54W6+8KKjEmZj}&$>?nyLx#2aqkJ^O#k~ESqJabMYPWnkg zsvnY3UW25?5GE)_^*hKWV^4Ntu~VF1D2o=%lgrNkkPKxP%g$Ym^1{>4z<%^qNHjYjE$xL0o#Ax) z;StXml-M*JUh0gENk^wZmhW`KSej6|>sYBuXADlnV=)8UluZAz%okIsznCTw{6oXa73!nPy1%#k~7J_wT*)0TN38c|tXMIZ8U6FDNE 
zJn|p=_fQ9WLNp~bEUd%^&owW8QM%_}D^Vf=MVc>l<1=29Jd>!>wvwciLBh{_^J_uq zGIeK{#s+NT2=zEgCmnqvkciS#duLZnVz7Dag=&7@vQ-ieoFvCAtdN4Te0lommt@O} zPr=O^s_ep2>{^3;yWs>B#7;j{p_7~qKzd3n8O|p&z}TGJiyOT`h@T;Kw!z#$^5`Uq zju;aZ_2O1M2pGpZLiiLk3@kuZIL1m-0nV_h#gWbu9%J-QOk*9a$w?*MH52@MS5Et( zd07kVWESK}_PF@QHP^%aoAudY2p9r}zUZC}+Q^#3n|`deq4W z(uxCHam>S|`z*xu$uS=vProkB>HYHaGmn>BvlC~|Ie%2=OxA}V-#ZKAN6bVFZt)A!ZbA<=HT{48tXy@sD&4SR=}7ztEnx1?EWRO6JG7N^FOH z(I4#(Z&3zJ*VVyEV;TpNHc;X5toLzmce{92Jlx)Yt zhOCn1&pG!kY;0Mi9x|9#g5+3PgZ za8HqwiVG9-c+Uu9`{vU!Ue3g8zi=V}_0#;b|BUy3 zKm4}5JX$64`CIb%Mg8pl{o&4jwf>p;_Z59=idN&u6EEp)G?MP}yOs%s|46T0MmHZw z;B6c(@7l`}PuIp0elgJH5QES2Kw-MblVh;XdA$)~LT&gMau{#P8$z`NXxQazE6&e_p?gf4oo`{i717Y5KSr$)W!3eN*?)K2_~MAta=1=5K>OI#O-X@x@(tLY=cD9 zU9-ZGHu)^^-nGp@ph_vR(R;Fosk4(s(8?PEqaeWl+=+jTN6BW%hCmu2U~hdIP3E`` z#V3X&HFl&20?AX*^uQq{M`8cXQgT|3xKlg4 zfzYcL*G?B(03(w#?B%rK8UluZAz%m?0)~JgkSqd5Ad^MV${PZPfFWQA7y^cXAu!Vr zxcJ63*BgtRX|rzQH3SR+L%OEvjrGmOszIUzz{G53;{#H5I8sjMkWuAi=`R@ zhJYbp2p9r}fFWQAjDmm>$Wef?WJACZFa!(%L%faw1T* zIt>9szz{G5#sL8f3OEiRS`~)Cgdku9azbFWN(})+zz{G53;{#H5J+PLF1~Ti^~NI8 zc!F(chJYb39S{g#^_`g60!)XQvOXFDhQPrQFfw^?TrAZPFa!(%L%gmM3Qbx) zFanuY(`kb-1PlQ~zz{G5rWOI)XlQEjpRCr6Ku#9Y){Y@y2p9r}fFWQAqz3~3cINgo z%@!a&mqsSzP}qebUy$e1oB8wvOI=>Az%m?0)~JgU*jGAs6tRY|s7y^cX zkAO*DeI)E`2pl~EMj($Kv*b$LfA;_GG+Th=Fq~1PZHb#v(`|#DG6YNtJ7v&Y!)btk zk;yceL>of7Az;f|y3MGK#}F_C41sh*zzAfz&8UsX5HJJ`0YktLFa!*N2}ZyOwY6~(RZ&qaik_6%#+uGKFTO-)T&_ZPnK1(QaO zE6NA2to-y_FEm?#gJUz3Q;kff-|QQiOuspu(IbjRqjKq`mr8SUvs`h-6-p$-VeENh z$BajS?*=8*x8Hudl$V#wop;_T_HNAh*TMjpBm|5=P7=_Q*o?VoPa@dX)Wjn|f} zwdJ+fUTXwl;vt_lEs!koojCQ>Q}rF;yJCkSU-_xl z&+C{kzx;A@o|tj7f8+x`_Sj>E?+6e7ojLN-meUZJY6NU?o9Y>`_8kK0OCW8AXB+~) z7cRZ9?Jxumg@FH#`0vc2^4c>)zz{G541og?FamiXCYCZy5nx1d9*hNOhaqq%1pIfz ze`gMr*Pa;yhQM?~;J&YYY{Ow~0jAraQo6rJCR2KPtaC?$fKLp3qHr{2E$_@iz<)>l zcV=dd-bQN(7y^bs@(37#Oddn4Aw>}IRg*qBNRe(?H&PG*{~ht)nH21zb;%Gg1Pp7hrqqYUurv|$*1PlQ~zz{eP0sGQ$ zASRY#2&4-FMkdo`CT$#sfFWQA7y^cXAz%nh5&}jbCkbe4#t<+B3;{#H5HJJ`fpkH@ 
z2xPj<Mt(yYsdz!V^0WO52Xw;_y=DtA-Db`DYY3zy0!AiNa$2lYhJYbp2p9r} zfFWQAq%HzRAX9g4tY?ORAz%m?0)~JgUMHVv6A zKRWOl>aG6W0(L%Eb?*_qkzN)?tSsU`%Rsj)mvCGOUAj*|2zklxM66x#m=KRPdkRvY%%M5EBtZS-|?|kPw zvU26hsqTigZwMFyhJYbp2p9r}K*}I6t^{&(<9c~!=l_hu;G9H~b+OKa9amm?rO?Kb zpI|U3^XAQykAM8*(%IR0ueceAU;Ro32M1--rcKHsySuyP7r*$$ zLGxRpAu!Phv_13}+4kj6XfqjEK?#;tNd9Y1mg;}~u!I*Z(eyxJvCR4C^%LEdL)Kq* z?%Ra@NcL!6dxx|?`hYC{zdug4Dl2aY7y^cXAux>)7*{c*iyl5c(II?u%DVPfeK>Qj z>Ns&!TY%i$Tv@hknH+oUvGVq}zg^z_?sv=m_uucA6?*l2^PAt4#fuk9I2_h|KmYlN zp1h~0N8al#8AGFLczDynMQE=jElDl~M(H0x`{A20+xOc)TGz1KR2}Pi{w^!@u znr%n=ATX}dNT2wP#T0UXNc}o|g1Z0n#CsCOEnBw8J@?$B`fdN&m%j8Rx$U;wua-~xt-AaJ()8!wNXMgpldPgriEP~{Bz5yXc>|=Vt2Hkrtb2cTi^S^dkf%hdFaMxa zyzSqlw{E-a{>eAA-|PSNjcB__mjCEaG8pNRy6@j8?f?9{WWwI0?9G=-&6S^!jLbm1 ztpMtYYM0*1geLSWCHJ=)v3bLUQ@URzfWhrqZJ$i1CA57*vN zKR;BR4?XmdWMyTkq;Y6yNbCN=4}PHMXf&$Q$2;%5Q_efjEt(4!ERYQwHpowY@)ISH z#l^+)xzBwr-lzHV=gV`?Jtx2U&2NlA#!<10@j#&DthdO{?|kuqswkCg`O6<=#ZUh% zS;eIi+gm3Yc&B;n{^_@*XX83q`rX?kR8cKGn_mPGbk7Dl+hurYP*(ixZaAS0OVi!I zm5wL=AqzhHZJGPYFU#&9e^Xxh+*ML?)|=(H-~I#S@?qKWt1U&kzybZ}eCAPE zdh>53Z}}>yr~fM3zxi2N`70;Q59H;`!Y|$|Idc|B&&%s&%criCycMe?cgaeb{h?1E zpw6zl^{dj?(unf+DVf}I-3OGMk}UG*cZ^6g(q!TL9Rfn=^G=x^!vkbjo^?Gy>zQ&aNpO|8EzEs`KQNPnPGOe_oz=;t9F>>Z|3_ zOD~nHuDVJ=p?zc|5|KB*`OR_f8+zbSD!Fy*);Mrq|N7TebxjpEpWk%TO|o<6_!HrH zgHv>I`#VqGHL5MZ6!q!Q&H6IvL+6@~PlM%^N~}l#6AzNZp;#1Bz)dnd&@b6FwMrbB zhjxdxAN@a&EY5b343bxxOK$BrFx`|Y<& zQ&ZD)>9Y0H5J(9G_ze^$Gn z6a+Zy>3Q@wH%cHkUsbw^BMdKE28rIMAbI?z#CC0mwD3*pc#{zdfpB&UZ7UCUWaSqq zF-&|$M;`yk%npOV?VE>bgV7#8%RXnS!To~-Gy?>cc}ZA%Ha-tk{CA`m{o^s%-2swY zqu2ZY8tmzkq28#3s%FO@g{o_%zquiPA1oW`XFM^5FB%#eiMp_M#yjq zciXyV2&4=G6#DQm0+}*XGAX|?nifRq=aD{inmx-rg>E-+i}gFsQ{~)CbyK+hGV80!NMjg*f-! 
[... base85-encoded GIT binary patch data for previews/PR2365/assets/2021-10-08-dcgan-mnist/cat_gan.png omitted ...]

literal 0
HcmV?d00001

diff --git a/previews/PR2365/assets/2021-10-08-dcgan-mnist/output.gif b/previews/PR2365/assets/2021-10-08-dcgan-mnist/output.gif
new file mode 100644
index 0000000000000000000000000000000000000000..33435edbcc418a6a10f3e3803e2e50a01ae1b95a
GIT binary patch
literal 182397

[... base85-encoded GIT binary patch data for previews/PR2365/assets/2021-10-08-dcgan-mnist/output.gif omitted ...]
z>Nsu}#s}zx8oE<=P>gXIObLcn*mqI<>pG;BWd4(@>fdIcEk@07OsHTP49HVmnn|)C zQwcEuWTxkoYuTI-xJQ|`W}zo42m9y|A%R}fF;UfAlXfsc&kHj~FbT+64_>~*#=RHJ zNW&N8n2mv=EZV_ctuPUTT-*W6u&AcXEZaMpuY*ii!iJnt3h8kBO?x0;UxN~06g>IxS4}F6>Ss(o$ zk(qBWBv;1_$3-2oO-MZGvtS7aF(L<)N~|U`sz-EC373xow_~wyxK-mT^u30;nl6~$ zR0>E!ODFzUxJgaM32^R>x`^(`fG|Q1a~{&F!(fDHqo$ z@XlEhq7YqD@>gi09F>zkY9PeGUS@>GYyeTsXot=nkB7R*B@`*K)S3Opx6xzK9|p9C z6h>nb1_j?$pa-?441%#vmj5hK_*uKY6yFUrqJ!02ECD|aXqS$;w?)|a3i($92@O!K zk^gcHHL&|Hn^Gsch6kWUCn-Qno_<`Q#)d`qHV6cR@Ra#CA{R)dR# z^aLBQl>(aXDbK|`R@hB)9U;zVoSk0{T#ee^sf-aCk$4*V9c*?g z=lqok?OUOzKyWMs+p0x&P!nlJ+F@3cqs1)wg zqaRdh^$cNii@2Z`|MwoMi}s-loJSZVxC9-Ll4Oz$l{W~3d)Xg!h=WYR)(A`2z$<9d zN2OtRWuDuGq^Y1WW?&{X!deZLyYtx)?Q>@iQbM*W*r$+Q!7X1Zcy!P44}$AnZ8vc> zxU48Abi6hl&!t8@bUXcwu)lgv^MM%N-^78K)h8TccClVXqt^&-nu=+qwEBvT8&Dxn zU4Rt8i9B0R?WWf8Dv*stU!%h$kwu{sXRS;IfM+Hu3FY74zDZ>Q%w z1#8b*ItX$cV<1ip<8UkveYpxA;#O~Uf&wjT8^(7~0W6STUX2=L+2(RES;hx+t0jt&D`lpJX56EIMa<^P;2&%WSDC}pn~pN z-P9&S>trnPCv(=+6XU+Yb$NSxdjxMt^prlrM8i3PF*z*gjFPjbVzRqNcl($2X)a(J zPcY(vSXqfnp$|CEK@2QZI!t`De1k+rA=QTS+@H20Qwa-W2R@FzXbHZ07j&i+n#h5K z_{ZCMq}5OCSo)?FCfmg|s)yIAO-l-!pDAu`!X971p5+!je|GW3#7+DGvTjovw4S(d z(z}==y)=+0u_XDg$YmY97p>K-RjP}SDl1%baSstf)G|)G)$b}@4@OIpaHLHoAhefQ zCG1XWRIg&JV?#BA0$bfALM6+0qNE(N>euGFxkdJp=pj9`K*R7@A|9z{h06FwYRJF` z=q^^#d(i#`e%w(Ft|g)gzs27AbRt)9q3=K$j#iL>Nx~gL1rQ0HG}M~60-Y8)J!K;A zFk-C}BMO+S0g=^kTk;U)WlI2Hln#1};5W;97kSRSu zjWuYWVXB3i+XsX)%4DQ!$#N%2T*~mQ0UhU|0j01Z5Xt#Op{I@BOf5S=6XgV0LsVIr zK*3Sc@y^`k4#SKZFM04z6S4g_O2Q~%J+ApzQf>A}!4+Z9giq)yt$4&v^NeDEl}UxuC~ctwhR-&BW?YA~~v&x+@-O>UZ5p6JbD)~#k)Ronc* zX(gthN|{!(6k}qLzfAN|BdjOU`Js~6diwU*pGxR$2kNgBzaXUR`QW}0?UhL;Yb}nA z#xe;sUze!DCd8gqWs>M;!$Mahm~9S2A%j&Pb_Fem`sk%d2+EG_ zv{EN}z$Tt_7dL(h(g907OiSA{=#^3n2Q9-`B=S)eXYCFiF5(zCgl`Ov4tk_g$M?fU8rEYxL=alYj)H#@s47#;Rq*n5ZJebTA2dYLU_e*=6rTu_ zepSI^05WIn6HEEB5(Bd7BKn+sSbs3`T^~EbYf~`YCy0ywuEi@b`WHd&J}F$&?(Ewn z-b4ky#Ceak_Gk1_+cEUBE_4%qNmi{|CN=vdv$IKN^ik;T0U0~86^z-5YP8FaF0)$R 
z$UB@Ew6+Az1E1@i$kK4r;;P4y<_@C}vdvr3@qH=C$g9%>Zh^drz8G_yg}8zUBynIsiZ&B)xWIPeQfQoZ>I*_} zn?zdw-RguDGF0$=GW-u7mg4O7Y{nYX!|74vo%VT^ZsB`i{~(Qw$R8k$lm8fuJ){MF%f2;5O3 zCuE#U*P{{glO>B4XPEwAiKN``Yt8L?lh&niZX;)wnXGQ#*!Q2C&xVaL8w&?Z4{dh1 z%)e|De!V!1`YifNv$9`ZTaRATT!U79{uV8)_!!02n)S8cu;?;p5nYfLj-#$Nw(xpL z<}o**$2A`Yw2yC0@ z4nHZ@qzgD(1Eqd_pNegwg!WT{81g9xVvaUN`N@L^Nru4qS{B5hJ=-=>!Sqx4vSimv zwb-u#*y|YzOmq}`exS-OSAqVa#WWT3gn=gPS!6UaVX-w!;S_x!+o}PSHqtM0rRhQ} z!_Ytn1DFHIiKdEiXt!4I&dBjqDIg>J32iK)pb0vpv2*mVfaQ4u1eT_wcTIgJ@eKZ= zALi1nN@rAgOT1Dqs89*D2G&eH@;H$5I14x(2O=X#z_;!PZlHbh+)5Y%;k~;QFvt@P zV#|{B?ZE(g<1AcAz6u5WAMMXN?%G3Job%O=UdOjoD0};=@8i{PZqeaRIB&t-*?GH> z|1x7V2BzP7-rMcDZ6@UYB;LU3s6NHk%PUO`dD*Qw6IlI2g;x!Jd!Vh`_~ z^X_Q=9{#(PO8XUTWjL@`BoD21?f;OU|I4z5LVPY^@!3{ix{g@1eJp7$2WdqMEhgov z4+~6P9hD~QI-gv#O&$`z+ynMjyJnl7ZgsT;6RoJSOP-}Aqd2j#mG)T$1w-;LS*qN) zS-n12Hpu2E@MF%_f!dOck558&&TAx;9;O3d+>>(wGuBn=GTK)OVLcx{HH;cWS9C}A zv@hIP6*s=hery02&s+QBIy;ALsnnkp+~F%X%w61mP!yJybo%V(&h8`o6)x&)r&ulm zs&L-D@!|U?x@N0)Wb~9vkIgOXJ7S;S(eAoGXmo*r9BJK1ZS;p|Ky;GLZl}_jWl7Z8 z&Fr@3QyG2jr^{w#I%eBl`kO30wgRdmtylJ4Z!QrIdt1O*Jn$sncJ+Pxm6+B$;3WL-DAj^}YrZvv?Fe)=(ehRnzB;mnIc&3NByf{m zd7E*AJ|;{(FK#}%=Vy7?zQj_?H5J(yqRflaZ)7bxggf|!@RfRL(co`o#n3dxjPREA z^k8OZ()kZ-u0bQtH7~dq(NMmL(1G}Gn_H?7+8ZNz<&xpNt{HBR#+-#TPy z+&{N?l#%*AKi_rEafmC!;kJgxVKOy<%09_xR{@@RbzwsfNPy`Wr}*7sUew7v(iXVq7!{%0)DE#}y) z@p>X`r!3({S-6sk#YB6@k}{9VK-jvzyy==i-2!+;8BLG3$b&V|6wBSl=*m|0lv?ji zF{j@Ljf{wEQJy_=Gr0>NfRF6VrlP#bBF-Insy_a6

%aEdF!ksT3u<)McYfL|fWEwx z+27Z7({*`hI0W;D83S+3zO=42(InWI%V=T$?y+4oyoBQOSxg(*tKny#JnvuKG%#a$ ztL?a@4|uCPhn8O(>^V(#G-plmtdVX3CNQoox4l1WwvT)8TJ%q2aR5Kf)W&2x=2+m~ zD?5zhvsGK~jm`KyUn64V*7~%l8R~Yt|@0XUDd%aOW$U z9L{uY^j-hYZG;|V2TLnxUrlT@H04(J7T-^uiq zDW*_hYD$W%Y|=H*w;soc@vHR}=(yi)uY(FL#CU@!&^!Jreqq*X4ZJ!%HDa70H<5xr znHdSjRDVqve>q=Y0*oDbZQ)2~#j}8~wT%>Gxx^E~uV+Lxurz)H?V2oWmM$`2@1R^# z$?`XDavt<0?Ai$P*j&Q?Sh+}mJ9>W;YIsv3E!2u zbXX~p8Vrs4`6gx}3l^%fZHLwkhOLL%7^o5Lw@Z!%Ip>Du(#&Q~V`#>MdoF-3zL2jS z7*#aMmlhy{Y!1f8%SDDX&NF-e-Y(W5b8%NQ*D5?^X25F$5zS3EcOsqs7)&~Yc%^{_ z6CpR{Rp*Vzc7K+yeI64zL^`7y3HV<38d;w9G3Kt<1%rgk95cog@P>zX?OATrNv=TX zqz_`duio)`>G>|FsQT3?hP?eJ$V(Vx zu|3=QoO^dPa7pwRN3XeQGlzrE;Dl3UeYRSWKPcTTS6*pFe1@wrw%nhgGP;9fP^soG$si1C0+!vbz3vwfyi4A&XML9c??d;5mt@T&Yb{A`e>a4fs1 z6fRHQ*7^_waC1^v<_&SNvD4Ac!n`^L z3b`ET)Q8C80p?T(j#A*SID|2S@p*KLgI86#aI|TY-}&*JSGSSDWSJIt ziBi<*da(n-&9+T6o)pM#+Zn6!{hkkz#3uUKC#trY9W2>m@sA5VgYEXkr2pmP@uHFA zDerf2+0X0}*(dajSpo$!qO)CTw>#J=5E34qanDgDo9KJ6{eba#1#IF@jt<^|-ei%C zSghHnvwDoT@Vay`_c4LR7cW(+87v}2?x$q0oWxkuF^n93rMghIJ{&luu?N|Xx0~LU zBVkwq)>!xJ@%17!+%EM$n$o3SHfl;E${<5I))%(MoEDE@>_^- z(g-P|gPhT9n4GO7%s+F(8s{D-k73H3Uc#84)nb`ZpJ4lV;P&l+FKRSHQBn56tANc6 z0{)3XRyD>Rd9S8uZCj9Sp^WSbrM9HkiJZ(j*j{Dp#{XVD9~iV@W2^j11Pi}LM2XM< znYjDrIi-2P%ciaQPK%`{?HlLPqnYw8jVQsc8-OzFDd;?WA>YAMy zhww<*hRS=Ltj{!XJZpnBwp5ONKE@M$Ybf?`OU1_~h521-D42Sy(fe%mu%YvMEOtk` zuX~>KIixLq=h}{bsW3LjDSaC}esG}upB4@`@Ef$WgJEhlnrrc~{wX z0&^{neot#2{j4jhxTs=*w{s-V61tR8n`2is`_(`wn*wH4q@<;)NxoOH^K08Yf4oKH zr`SA^VIx27l$`HB=Op#4w%+a)vEEz?TIm)d9QltFoP0;qU!oW4p^F>mEmSFP1WHgH zTBnh`(4ar^5UpVu8U>!Itb3HA`B<9x40Fv^RGWyUD0srF4EF_Yu5!i_g5gAhc2D@Y zMwEd;yW=H3G8R)3jkAznI!0)f1frrw#4D`4oK6q|w>Fc#ho(;~5YOm{6U761tMKU5 zhCw9~Dn9b1=W2#{*{t5l-0AaqPJoHI&$QgIaP2txtU$}8p=)AanL_I*(SH{6J2e6WvGk>$?oJK)qeb5y@*1j`G4Cpx6fb$> z%HN?w*4z`W@*c|3^PUr|$*y8gZOHj*LER7O3c&fP8k%a9jG#o33(S9EDg>gF75R}X z1?hTbr%`nGL9x`0El~lUUbrR&b(8`J`ieSIXo=GCgkSZ7$Xc<=K`DIOpyh<0UcBGTKHBiJ+pauxYFb^cQ&`dKU7POxPpbi^A&m7JZq45!EF 
zSD6IT#2OV6E}r_QNqDZo*Df9;5(3s0^uED_OL4=!(WM#~*#?Chq03Idn}GZ8%k~1V z-v$9^KC{Xzl4InalcTiTI9MiHP)%Q4;y?-e09vDA8e@Q%V%igcN)F{=NGFag?XvCF zt#c~L=rwNN6~oNi&U6$*3#pVf#!!ev(xQ=JGVs)$A*Y}l5+J153HZVlulFu@_cNO& zX{XRGoaH8mUX$pV!8^rTp@HJBNQIGs@xmn(u*BKl6vCKB<_8MxlzBo5@{bOEMDlOy zL=(8t{}E`tF~+$8+|fF@bOrvW*};IIx-kdtlk9LUkA_ojhGAYhY;aiDSPHOO+;6Lh4>nHe&13B$@1;FY5teou* z%<;oOA^fG{b6A{S1CJ8|GC?Ri7Fk5UVH+5BJ>nn*@l)e9jX*QBF3VGp?LDTWRQRX? z2S@8v*vVZO?8Hd&7JHr-*g?=2D%@_3=-*Q?rUIIIVjI1AH^m$)+t;t<3}1{{Yf*Gj zm0j1AW$}nSO#dc>x*pFCT_P|P^4GJU4C z7O4(fkvkpa%nnR=mDaZSt1SozwML*0S5URd^8=b{t* z(jaS>oc);uBp}aO&1qAi4`qc@z8!m^Mu$_m8`AQn)q+k;G)5tj(d>j#SQ_|0+y(E! z@s}9WF&UB)Ou2!E5SdxJ`Q3DcC^%o2Yu&3s;*8TxXv?~eTdU!=e_NO9*=uBOK*$R8IIzRWVl$?6pvUl5?M4=ci6~Nd;UD zT&X}_7(>TDChTH_FRrrh>x86&x1JDmD$#vvl(K z%B537?HEcp%n0Z<>bKgpl@oY|`(yW5*!2;D)7b*Wp^N>v(76lllA$RSoQ{V++6LX! z@wtgKoXs1OAx>0?p6K}}H9{{=h_TWM&P0{5Nj7aC)8x%EEVuP{gVyyKTtB^$cF!V?P6o1INM{}`t0E=QM zt=|$1E2E%Q3fZZ7+jpTi72J9qBHJCUB3Bt&$2hh^J7mHuD!!izK18J7qSoIxxObDH zQ&-R??UBi(>n;ikRPo>9w4oQXCpo<;n~1IGN9{I-8?e?hKFH?&RyjI0-0D&A#bR(B z1Q3J8Llku8t#G6fSj*rh75bG60uJF3CzE(<)%=aSnX1KHROMGvXMNeP=k{M($_C$$awJ& ztfXYhXyrv3-tJaxU>b!NS_QLl-Xpzmt3HA?VBnsKN9&Zi3;^E&=u~4U%P8E}&gZCE zF9|_dvzURg^eXGk&o&suBHEFhr9$2j*TdwDA2LL)d0=Nd5+)!^m9R%Ce6$jjlCI-2 zv2vAYEHGm!_;XKHOv#@AM~8^Cws%zM7UO|r68^nM9P^3Z8lba##2XbvpOIoF!w)ca zae3))a1Ci-zU)G-|6tcOFr1Ttab7OXg7G!TFs|e?K758z!UCK(B*M6UW>pff?FmJy z{I6=5Ycv^yTy`2fBkcg%hdbFfjJ&rMkyEV4y2;;_%u}rliOBW4F_fd?-cgx(=mlrn zA6}N`agD+s16uBi9&lvgG52%gKp8T{C=nQuUgHCEcd$$QZ@1IVEO&53C343ic$!Dc z%F%nu)4^Kdg^SZUk{VYj{A#`gRst8@8HSSIYZPtLLpMgz5!L+QMR%STg}+g}uuEn) z$fiPC?u&T1dpnqX6)e!QrmVvpKwLj1bd~|O0;=$723inFvDc8d%xM*M7|gqYJ{3hv zR{=hm-L>2LI{|wyuiuv_>ZOnkAM328&{n_;I1$As;3XH%PHO>?1sfDx#i#9ql5bub zH?10W-h7cx zY-g08C0LPUE1g#urGSO>`B=sHLK{|O&{!iZ`D}8W5Z!OcePFQdP`qK&G5L#7J;9k? 
zcFnqrWw7w7OGY&x#SFSRUMonbMSF4K;*YaEF=1aQKk~23W=!|KN-$;JoZnjHv4MMy zLcW2&at z2S}H3^JERP=rd-cwBD{4`xp}csL&b!+iMu*10%#Iwbo1Z6I;vBrPDD)h5nS8R_nN1eJlE@oj_3=wN%1X z&<-JF#7c3UYZ0p)95=GsoxohBwOuOBuyQ22({!@@?n5pW3gJaUSoVJMXlh;=fuRN= zR7FE$=q_gxb%)ZvrJmGshi}1N7P|HV+Q%cZC!Vm{S>NfiK{Y&$;5{^ovN87?nC%rK zkjVsnp(t%F6H(VV+5DOS%nlFvq{1AHqL=vMwWpb%6v7mYxu1^PCZK^)J7*X$A;q{r z3F9<1*6!TP12{r`Hvgn)XO9%WQd!eE+`H|;q3VaT%NftKXvfQ)xDFD_LGi1}kfEJj z4X@Jj4;oYeA`Em>C(FEU!^xM8neV_()n=#HKW zAu8lT4wtQ<<~7{N=CddH>4wif;mA#ju(?+JG9cq#&2#rv-3#T=m)<9Wnp6Ye%YT1v zvKpx=m}IzD5{-MQ3+W;WL!s-5T=fCSoU7?2runU$>$!c$5E*eH^Ot6@dr9-Y^Kd$6 zZA~fi^&ViVp1(e;vmdwi>ZR(Hp-I)D97ls>scGcgfb_doOJ5{uh|z26zy`0SqiqJx zTQKHsc!tfy$ChKMCR0YW-*QTO`Q~9l?%G$f{rzaE>uMR)wCyxw+t$8R_vbHNs>dU04;atl{}78+gUvOldpiZQy0jfaiO z$dyUW)^)~*j)4yEo)jS9Jk8#p0j?i6yKb4?Y`gAxdb90_6HQc(_Y`;6QfbQl+Yo9Y zDLk;YY}04!>!4^dSaqo#OwK`C^UG7>l9@JZjU~8E!Ha5ODM+RP%|!Lxp7hR?L(G&N46m7iP**};P#S@OPy}!Tm+A(=BA7QH#fj2 z+Sfu3)}~DvshU27xb~9U=3+N8<tN*G0!gf%S`n{ zuJT&%w>qPt0l>EAMQ+sxujHA#np>*eD;faD4W~L<6Tntyh?sP}X8v`E&#l|;O(t%RiJh935t5D&HG?z%73%BEA}N-|FufdEJtez%>;;99U{9zq zr9hk}gum5p@TubAj?+9r2;C=BJnpn7fx*LYA25N_a{umT50(+~|500UbIe1nX4~w| zZ(@qI_HPHb&bRYi2q0K!q-?&oG`Rtm&>=evb{m)h{Qv$FG=&BSR?ZmL-;u-H#|$YT z1aFK@Z-e&RB~l4K%W7Q#BMp!6wPYEB{n9!fLVOVq+#P2u82y0pu>Z+jzfRBltgsdC zH0m)y*A!`%`_>rg=^c@q?R8to2Atb%Cc&XBnUom@{ZkzoFwE@`YizHwsR`n$@NB5F zkv|idR-Gz7G$*|qW$qn+)z$lt!GpOpZ+OCF_Mq!JfuoI?ar3yJ+llSajYN;IAV#ht z4-0fU9xZ2R8jIcMu!y18hc6<-Y($nuU{f1lUTj`5O2Hbgj5Q^Oed(zQWM&z=Bgfs- zmLyL+(Y9^EHHJ+^aez^&8|3n*WuWQ534;%{M~>syng)6cViG0?{7H}Xl$#12C+>$8 zDYu4Jm399&zq{>qQQa`9F4BJmLsb~C^Vd5G`n&4&NO`D5;*FxIHI-NWJ??BnPyRA= z-=-T+^di|I>j#f5yZSdq;+Q8D?sLs_MTyJrGJ^(lbA7v2{4p*2Uih%tUn6-@B^kT3`w`sVV&Q$45DH; z6cx*M2eIpR4@E&m4OX_I20_KT1A>C0ymNnk*ZY2a)_T^u{{*tQu9-QH^LPBdL02w{ zYmHodXx7lAYWZ5iHdaCfyO1sU=2#Gi<918%rTd7>Y^ncxVg{f{be;v9KS)N{hgY(0 zuFtpr&#COJDYt0gTmzKv6lV5-kZd-DSdB3>L2bk~4}U@}!0ECXEcSwN_qP4k<#H;$ zv3$>La9rDXN}O46y}LYPMgen-2b~COpC*{~=A1N&? 
z*Jk7AJEuyVNYip#H0m0FtLJS60ooAUypp}N`lf|KR&1$Dut^Io48)_%lJ%!q$Jiv4 zS$k>0n+jR=DLz~khi+`mEY5x}0H*r94P=tgJ_=YEA1#{FgnMBEl7G@LlxcE0-s-@7 z-MD-0_M6@84ze!;_qTDQ|JVh}WDp$i$tQ2kig2hjKcb{Op(r^Lwpi#yzp@lu6;63g z4QY5@mDJSSP?EjZ@-~R4bIgcG<*+l!w`#XIkDA>k!QNq|cKx6k(6m2o3^^t~vH zv6;^EYuUrdsd2Y|%&wfO9GTrR57bh~Iybp_T=rm;WOpikI@>z4Jw1?rdFW}a? zn|__CLMML>$Ihv3iYY0WoWIA(yY*0Dr+eY7^a{&E=?nTFm7!@vyULMRWrM?vmSc$~ zwB(~aEGBkU&eRpg{w2#MVhr#SJ0G>oIu&8-Amt`^!8Rn{WwJGnqHFoYuztSyJ#~Jw zCu_2=Z^IF~qHt}0m(49|dALl!V?(9=W#2OTNI|vLZWV3yt14Htcm{CUaB8bJ%57dyqek_*7up*i)x5)R zdR!>mu~H>W8xgK{?@TRfc=FofqpHy*ey$XXW?3cBk>H7#pXKJrI4GJ)tXcPO=<=89N_eR+Z@4v~_B>rb@t{{e}j{WwljO8osPAub)<>FU#r#V@*^lEO)nR3A*eU`cH5YYr(A-1Kr!j{YajD*eKZJW2PW4OJLl=nt$ zYpt&BFdeqQXjOUbw9Q%8y{&Doqt2?+dxbVv^-n+=C5q?6rfNOQBdu|3Zng*b$Nb>P z%z@1KgL9`pZEcj#D9>7|NwB{1Ai`HR`ohmC9Waao$&>EAJLYABd&#nUXWO@}!TZf^ zouDyaV2RdzDtOXwE+Zb$^{BFFphq>SL+ddpMW8WJ|NpAc1b z1pd0zIH7WV(V~3%d8_$+@r*;#khVX~kDqU1KWcYaGe5JA7nC_6(!FGOTfOyI^KB+Q zEnhPe)pN?uHi2gHs^gn_tEKIEL*9$ySVC2h&7d-Ka=r>(lYMcr2apGL)gpmYnxJMp zVlszWXm9N}%hJZ5rOfbD9RJ#|7haUGRZMEwBInYN-2YF!6>MTWXVc1m^z&h2Ud_BsxtgX}3@G9~V3$cA2WO2aKB^OvmKp-o`B z>$!i*5pkF8pq%fUBva_7{-Q)59EOi1MaMD3Xx!f2^v;M9W{NKU`vJdp9R{m}stD9R zd&ZD_Wo*feC21IASUYzq=%)tFo)w>!3q8WnA4=OnI&ftau%RHOLi|CAeANpDX4AMp zvZq+mZ||j|77?ZBa}2pZq!NAAA>U~7ClGiq#*7rhr=E>{r$-DJ`rpWdCaJ(y2M3)4 zC0f73L}L7EkU=8v;C<;S~$NlT3y{Ccq_2 zV?;tPBe7Ppv{j-q;m{6M~BYw+8He^Jw{PdjjO5_bCu22edbsFn3=++Q{%pf5v zz`#^=snMnoHEdeW^f(=LE9MO`OYp#}=6!dU$C;H?h* zdJOGTEf|ucjsmcVW=@^pdq3tD2bf6Wy>Ujv_2&Jmqp%eErd??IIu(B{n#&MHidUnY z{CxnKdKUdmBVaA!nIc`TLqFoMgc@nRo~hQ8{7^|kqrf}4&pjb%Q-=Lkqcs$O6=0sm z>#b%tCkniofzA&B5>j+wZiRC@zn|vai{D>e!AmSgpZgk*0jG~SOF5dL;pZp*CWJv@ zPz_H~8yoMS<2a=9wv-2#KHy!$jaVA+VyLR5hDLv?5crs}M0x7pU7XTl^qImcUV~gX zxecxc2adwaa=E667AI@&2LdtjWJgt+(y9ja{!{f&Ta?0_ne)& z`t*coS&#}Hkc*2*hzF#w4szA9Hy+3tqY&KC2U*RNjVBSaJ>pOeWUHNh@ew->?B>Q@ z{cX4$Khy0eiN=xBC&jTIYb3}4*%}S{yBbxfYQLYTS{TIFr~c)V3Lnw)2cOFvR3iUR zGAsR`qbAjvF5>^f#X0Q_yfQF_s1s9bITj$FMh@o{aa5P4R#0q$r%}}$@}2eYXR&kl 
ztO4UGrZ*x#pBx!g4F0M=WBG6K#kyI4-@duB7(_SAY%p$=9x8j=IAu3*98yklILLI8 zN-pZyIB00HW0>CGJ9-?gHGD$l9vKFf%Ij9PR|u(wn=mBP|ApZ!ORU5r707VjD-u~j zXt7S-J^i7jL8zRPvE}t}8MvTjI5v~hse@>!nG%j4lm;urk7=QTKgU}mOq8OR$efie z!fs{+>EW~6ATyyjMl{F^1~L_dDQm)2F$2mI)1S zew#wrzU?587D+Fm_jTu&&X&%ngjXoxJ%GH5Ll@YwbPDl5I!T>I@^&KHr<;1}=v}lJ zh$8SrVz7b4{i)P&$c44U+)zLz+qp}>EDk8cF@39LMD3FQn}cQC*DyGrKd6?G}c%qG|k-q73UZsZEP;n z)#$?XFLdhB*@ukSq~A{<%E6F6T0EarNDbU-QhY`wSW+#q^#xw~7bS8dp$f9Vpy2>o zQO&lF=bQM#q%vln(H6N3U#}1{uUtW0S#H zjK5XOi%>nC%{M5}rq8_ZLRqE*pNxO6ASz8Ch0%lSb-d9UG144^U$t0a&LV*P49Nwb zX~EDk*)?diYDf=>}T{TCrIQ@A%W?W~o8c2DSuK{Cq@_>!ke z1AD5ytW*YjDzOsfDZ1opzSH9$T5%e_ix(s?;JkT1!JkTTDc--J z=T3GRBvQ>}0rpd!=(7|sYkt=OQ$Oa1r3rSMfQ#C?Jc311YYq)L%vV45T(e>F7{y=K z4+hxP;1q^7C|>|Q|Fud|N^*?iJy&CoUyZXb(KWARY$OomcEt;n0h(4r3zd9A(H@*D zF>zdTNkIw%^xp=qCf&M0=37nJ%(mF6A>&@S<%jM&TS4-$S=Y1o%^=CW&$|quf<3M=-KHdpuNljCRVnwW&8GGBuc3KfWvoLgQPQD*w(NQ; z_~aA(^!n!CJ|9<~2dNadG5U9Ze5 z@x1k2;O1r{JCtrSbZ9#b$7n2{g4!qs^=G){wo_1>QuY*hr7Dm%j|uAY@K#J?OXeCkM*xL?@tts&;RJZ*>Edw z?B-`FcPfYv&VL=*)O&(=h=b|I&`pbF#OMc9A&NkC(!jEix>t1JX*3+JJ zBv6Ktq}$%hjYl@$v(1F|6GHu!&HAlDY7VGAeq^2F`muJ}i!xo&6!y_O+}oEM zF>dw-4eLMCv6}tqU%(lA)~D{{dvm5W9Ur!rMbnm>dTWmdrF#K^XxU=dNrg)d)e(o# zWVb4t&A1{`ys5u7>w1M{Ik4VjHxW3*{jeBWCD5ltgcawXcvt?@a5?XApvE%-%jGle z-HYyBK2}dz_9q=lhRpSnu0dZmOF4Y3a`=cX;6&c$PpRrMzP|XK!!AVIXuakE#E}|@ zD!x^4-BPPaKC9c&xG!L=`p6<d( z!d*C-W8!m?B$A3q%{~ox20zCPopC$bOnD@7DsgBrV-xS4i+U&yy^6ZN2TP`OCss zyD2D7 zUCTqpzI$x4X_gFFtbg6zPz6@Ke`od=?->yj|HuB?!|6{?gjp}6wr8}#%9s|KXQ#|E zc~ov)C*z4R00kt*qccwc&aimS(2ceF&2S1B|5`2Tun=1I(FY@QyV0es4sxCjAP(I- z)~{)|7i;D((yI&8boNsQnz+(wVCx#s5WbZ&QDM3Yto#&Wf%M~m#>;1G5TMXC`Yn9T zz1^9B@&FK4F+RGXE)8 zzvK3~cvq;O_SaSHL=-n|oC_v`%Ic!@n0Oa;20Np7H4+nQhm3DhkD%l7e)l>6aem!6 zUAsVl={G}Fxofr+XNw|jt?cd8lNMJOZR}M$pI7lbR}uEY)32>wl6m>Mvh694f{jZF z-k5SHAlN%pKCB80Vg)$CJ-Q0Zf2f<|d!>k^JR5wa8;57SR;x0yFmI=d(qYPi2CK<2 z%lzRfY%S?EAGX}#q?>%Ota5W($ngD6^^z(bs2P_HIt;u3rI`_wjt}eY;*Pkf&y-+F 
zK*y7QWM^c$fwj|L3fKfx$go<+K*m4&woD`ubD)D*_}EU>j7(#t;;=(MkZ_YC*b0rHjGqbRp2Q$akHJ#%I4_ABe<< zSp@U!bLHyMhu(wmkUOy~%G)UnNW891gvOH8HPde{va!o%UB6A;=wZ9S0fPU{M*bhn zMy8~s{NELi{9kJPm&`d|@lO!)9L*&-l&9#$bE3$;?T6nRo2Q61Ookg)vIJoV4x#@Mf%?U4zx8 z-=}AZiKC$`h$9T`Cq^9^zh*=7;>x3v1)k0$cbx7Ri9nO!+l97Quo`97vfsVSC&%gE zUf*)tLFS+gIqa3*_e1je-AWtdVXhv+s*u}x#i1rl&D^RpudZ;%N-7DZX7aTSvxR2H zx_HI3OiAxA*e*2=a9}=rz+sZ~!nz(Ob836{C9>mX>j_Ldyfzbl&nvPS{lYB8 z;rY?UlSp9oB-T&s-gJF)`>8y6>GbJEQfrQ`IbfP)FM(QP5kI$HK74Dk-05_L;0PJqh?97>%F7mqDkMB&f~?dznc=u!vB-tbsmE%Y$q6^?PqxO zbw$+*X87#f=Vrk%-0?GGUuy909!$3nL(Bz)mkALHI@m}*R1+3CZ_qmSi^4K=CHZ@xPZ~+Kxr8k85&yld*lQ>wB??^p5Zbiqp{2uQ zZ!fQ1^SLYN)LTBinyfqi&Py65+KD+t!-RLLIaYzcjb~xJIebyogrRgw3V)lDz~WxM z^tCXa_3Bb#0GQh+^!uw-R}MB9D(c!VZrKmWja=zH5d_MI8zyCF)z#y*gyn*zA1&&-s3%?N>Val<;fmj=b&c z6wNmGMtQ}&n>;K1{lG!P>ieF%_{ZqAqNX9R&Ho2G>l@)`NTtAoP8J+4^9&XGy%ulo zH#eJ~6ztm3Rdeq)Ccz^uczoa;6zlCk1pK~~HEmpt*nWBdBLyis^R4&mJ=k;s1O?OV zui2e|O@+h>=@N;hauvDVicpkK5B%Pc&=fOVS@t@Jd-{U{^!KioxHMHI+;1DA?oW3; z6KXpE-Ei{9#HFJj_J@Eb6B@E#v-AFRZgtIl_Zag9p?oEKsm|W|YuCbn2fESuIPS|5 zFN9s>>W}UX;@eITB*QXYJee8}ez>C-7L4A_CyxweFa+7BKA_~&lTKDL72bfCP$Crl!6V;Ijt^qjsye=H)&gWNk7o?re zLaNRQp>$QI^Rb>imzs}`UU%!x!XIbH|6}E6Bz6am3t zCQBB~tevmHrI&I8^i3998bXX3nn{OewG39Kfp_bIa64ulRr4vhwJA@u{XI2yN)0-g z-Y6LRgj~_^dI>fVG@-w`Xw9`O>#vcycDWxgj;h@1cN5RO85Cq(%7QF{z_9qUc^ma# z>^O!xzV4wqp&)gmOENdGO`4fQ5Z3+aEboyu;*HHAX09YHNG4(HE~VXnDfyg?cFDSe zOm>o{F|fB?Uf`ga1WReJ*Xr4i=O@GGsfJ+H$liS|3agYQGVgm7Xy4Wg(8er_kIMR*(bz+<0!N+3(1R0x`o?44J2R&a_)})HXjVju=eO4qy0GoS0#6^`WaV zuqvupR0W%l-cUbovDx-*Vhy+A4jK3*?oKYQj2)>m3;Yq4U(3PGS}%o-DrhOpU}zE+ z*ZS#O+{Jgh>+Os$C4kWwE4nGc(keDH@LF6Xq^-AU(tK7p#uW$Y?JU1j3H}SU(#?Ze z)~E7Gz*q6M=;4#>%S-?19GkVVv*UPqdu||=R18tULpI;5>=@u7id1Vnpzb{tz%zzxCFJM4gIK!OARJH_*cmD$Q&umpoxzFa?S`9QtOvMj7vT&|3= z>&%O$N$x^T);NZ*KdK7qU?$ar!&REJwt`@tBWbyTeV6$P-03=_ zsK@2j|9|ORz~_Kd@r7ClFd@c?iEn_;8nF#xc8Ww(IjDA9ctwH#S5!*@kj}aMP!b zO5j6!E?!r(rW3aLNal{5kwAgU!C*(9JkzA&c?FLMX8f8RfotnFJ7jU3aQ4)nYMy+n 
z(zpZLK6M&+a<8q!Dm9-5TH&{vO&P(`C!H{;{vUDF202wQl~gq2NaUnS=MI5B+#Q?|wg$ zBFqB9zYMJ35rKF&u%c5~IFA|c2hkfRu^h!T=b6TJ?N=eE-`ZVm4e0=L&sb2vy%>Dd zAcu7z`60BMP`$}QpXxTx_T(CrBClFtZV-Lfq1^!WrHvc)?C3d6oCcguMfJgx8-5bR8V zuIeOTb?8r8a#$-|Y~V}<47^2VzeW4>k~3>DnBaF1@?W&*lfwG8av^sp2t14iv0B~= z5*^Z^qt{6{VD641>{Xcfxm*%LSzpp1uQaJ_cVMAFRz{!(9kPUk1o$K)Qmj`9muW*- z1ao~00&5oObIJSTFpmjZ0xOd3;g=*@M*%ij{S>~zvkZx{(N zesD3kuSW}MDMWGimn}PY18_(vOa*(W1aDLv_jb01S|y+}XWyCk7_H&>_n=_%SUGSTNc;nD?Y_kWE*TlI}d zxAnIH*+%fnh+Rii*nbA}rxx}oE)lmtZ&WC42Dj5}+6;&^yodWZcC{9BZ($5|^SiD0 zw5CxePoOO0idb9i`I7&1d@FZQ4$v@qqXiJQYb<(cBuOrtvXkjqH1FAq{?t#LtwsOD zq*9#iLoN=GBNeJbzsLN)336B}*p02tz>193j=w4C9PMJBJIb(GUj;ir!)-FyN^7Ae z+Vd|Qj>geWqHMg{i(u9eh z2zo(8J3x9ByPs!GvZ7e?l!9?2_}+0it7qh|2fVK;j#CR5q7Iwef~VSDYu7>Hgy@c* zm21qLNAW|0=&l}d&`a3{b`g$zRf?7l=^$HOu~{v+fbC_}kRwk@M^m{8Z!?6<^qz#Q zi=j>hT9bwTp#|0(Y+Xp1LqDp~NLh-}X0C!Yq}zn?d7xrO**uXa=!f>X7m0r84Sh4~cAt_gj(Uk14DyFyS`2XorsH1NdKP z)c9R&E5>^yjofD|nX3na)J-O~OmPmAT1r&zgq|zByA_BMld)Badvm3s8njM`_Ue!x zLf%14nFJyV^^lPc1nI>!CG3iLrapjuS1Wp+3yQ&!H$lg{ZA(d&_Np%>TbPsqK;#RabE*8o=&MJ(5Iv*OaqPA3eRIQo@&jPMAmx} zX;cYY43Z7re%%!B;iHqkX@sO%cDVR{NB*!{(QS&o7og2OyOs~_LB46Z#!R=h3fiS& zy24mYGpIZJf`=`}(Q=;=$S;6qWkLHI5IeAytK>U=mDX|)-@WGVG=CYpgWB_45dpX~ z`-u`w%aw|NU_gO7xwC|h=x7;`t0ZsfrH7vI2Ix^%EnurJYM?E8^nCC4*d~CM(U4RF z9aJz`s>Q>izzPkMT11?sk~awl{#XKhA3{BN`s|64YqKOT73ga{Y;y-Q(MT@e5na@e z_0B7mMRD(F>jreE3kYu`OR19vy{nVXB@+DkaFqNCRtx+H@kayFra(R%U2Cf3FV_ge zo2Fv&sekEF)&=Pj8ojBNE*#P$dJWo*i=0OPm${4&S4h;{2?$PKAO3G|CeeV`UYT-x%nrV*TA<&&bEo?)(7=zDtQNIB)Z5Mqe zxg+$a01n-fBddK!mr#2}&&G&T!Da29bb38P3u%hC>lw!61ed91gzM06%2R*=@;4a|&j` zo3buHUE9b~YQ0d3Q@;t=IU8-C)?*I;TboA zKSR!bp9I!}xmsbAm=L`pP-c>G*#V5yvKHz2<*Q^c#$`257UGbR^7-G1f3nETTb1gy zt2Y1s-uB#k%jfYg9xZtBWYdcim13PHIrkhuhp58V9fdC2|3!iXhlq7^a5$D4PtJHCF+@*6P3 zU0uS6G9^;{!WseZ?CPz7o#+eAE0Y#buSNe;pyPjq+pUm})4E-IjP_`dd#X~KTKNEh zjv}OHm}FV0xJ`>ZFo@qNB*QfLU#qM-pJ9OnI>!fXRS2#r7Y8b(Ql((&W#lo2wiA*D z1=>g6U+&oOOMx(o^k)<@NLJgna94koB!R_F1iE#f$XW>{YAtFi(EAGh1&r&gfUM;= 
z^m_CwE>1Q94nx%5rE(v(QurJo*OgFXM9Mb>I;IVaAWk;nW-#Ck7!Zws^;#MFhuo*0 z(UaqDkY4Ee*y4vm7;9j;fUVX4Vi%9HyDlLEhI^QlT`IqARwo%yZWg$SS+mxWRFhQ@ ziy@^aoRMcC(q03?Y83S1lBaUPWbo3o9qE=!nMB}!@>^otQ!iF!ug86zafw=CPH1kQ zD-~?rcaNGYJ%JOU2FVy2v}f2d+bZKikU^!uHkH37QZG&AO___Clj!IdSTHT=(xtiU zuZwHp$0T&`(xdNk5z8G+$Hl8W_s>27D}z2y^2Q{j@WMIhH}ew*!RdjxZ~VX;|BsHCUQK)Oee{@a;4kf;UOd9y5hsfw%M8 zd3`u+)h+m-Kvq{Gf761yM1y7fqB)ubVJ+{rQsAgnAHRoAL0906cV2awTMYUX;^%4E zL8qjJG}1`$-wX#*TC(tpG(wNIYNTA6b5)tvp%l&`AV>o#XQ6-QF|1DVjp{hj#Ju6g zSrnTuD)4HNqpK-*A(&uk3mpCWn@XWR)7kAhWD5}aIz1fL&3db_&Y`xY=#lrzpF&$A z8T&gY4zg!<#e;-x_-F}@dUG9*(&+mk3>O%HNDk%bMepV4L)^D=7qf@Q|E#I=pAHgL zlI42hjG~GSLF;FW+-t#Zy_<0{0=kz2kO{m7?#%cfY7E7+e~sAI9=c8Z{oMBB)561Y z*yHbd992R_(=%5@cJUMPph9R^nd`Uw-L?t+Uq-~n;B%bVmIC3WchiVkp`nLk@J-$M z&xVh=jyXbo*p6$>k3o#4V6;hj#j|d+`x)zUSZ9~p?}8hlxHBuSy@?md3i-e#s6ZT|6lCGDp;UU{BC!Td6*`v(&&cAi`XykV5Ijbnsf~-3Sd{$ehuS%XcW#`2H zUw87vjl&;i}%gtGuFZIYzI~zAdZ5c{a*7-vlNfS1v3Q92?=d z{^E3-zt?^3JlaDYEWdD?B)C3?)gag7eNX$)^QZU!_-E72M4TO6^y;Dn<%sPqLo1kP z4&NcBc(LX6uRk>pkr95$iWl<_X;)8k+$hdM1HX5Y7>i@cC%|{{ZX!i?h@855uxn-! 
zBh`nI;Z}b>o#wSQ|J3Yuv>zPgpzIP{id(_$Fv+WTAvik;`ZBFJ+_+6Q$fP<3d z_F=(T8VI??kpscC&iW$X{3=x)*VDl^C}gCu)QJ7V&~bATe2uyr(BJSdD<|SSp9Y6I zji$5t3ltGTUVQ)kac%+?%lV7kC9@q2UkNkbUMEbal* zDB9Uy6sC{hYQ{F?#SD`VW!r>Vn2(+Avm`@ND~xhz`4Ch&%pqw1!PCa9fcU+Y6ixWvRWaS~W;FKrG&~VB2lQN;H)Hr< z;@Tp66PzSCp>E}wheWnIHAmP+Y!CQZRgJ-XS`-y9WNftJ81_q1lo-3{@$?Bhrre^S z;pZp@PYS2)u+Q@KoS5r(wB9Gc;7TF#VkdSS5LDu8- zjQO+3kv-FskoR3Hmhvuye!D-xl<17&z3+W^-)+~zey-dr+KlkcHaSntK_1tkf&G$8%-DBafSk?zhSCD`8zV9JtwjCAsR5* z+>_<=*qo5^m@ZuST*bD+ea91iG_h}*Q<_b!9U`eOJt{Q8C^H$ixw`#Xj%g1$MgwA+ zZu5Q+Jy1(~g4jYzh_x*{2E!acThp9?x#7jOo!jMCvUI*piY(`3wWYZ~I%eox9ws`N zWj)`WKlX=hA;&z_8Nwi|&7I|I`a6byUYZY8*$!X(=u7ah{F?#aQbiKGM3ATD)z@{` zjtm&VePv$&p-#TlgRqT5x~-!DDspp4u=&zNhO!OkQGWt5GSoUAOU=rPR{P&f-*!n@ zPR?oyx6v`1M{_j=8<;wj%2|GADz`>9Ig^7$>Vau@=aNVRniHQbj!LGTxZDt zTimuRbG_A|A-DVODw3nHGdm#98P`)>v?&^f``u}i6}IM^k|75HKQuAE6Iq-#Wv}!- zF}BscXydgI>#taZqa~yDPzJ%FZjqcBEzN70f7G>YZLpJk2+TK41O+iK-HFF!O2#Ua z+^YvzqsfBRyI^abeRIG)Y0*Y=mDTT$li)Mz-Q!<&o7q*!Sl3BmtTAKGY<<)7A+9K) z%{E2iOk>G3IniAYS>@OA22cDMQ7N#gY69HXSYcByY|WTBVK;f^yfyY#-Tj9nl5T}s zH?PF!J4IOisx2HAodjS711JXIpyrCgU+LkTGaq(xTWhVd%HAZt4&rzOYuJ{mFsn8) zJKnOwvc7DDw-fzDjgit1RS~YZA|_Kzh`XJp2dHXiUdBQLjeXz%hl^(utMtyiNFpXd zMn+Q!HgAY9p3BmIbNL$BE&23(mK^T{ zR;Aj_%hsmhrWURtXJdB06@I_6wPtkM#Z&Vj9EWedIg@^QGHKphW?Y?Lx4QAp!po0v z|2xkkJU3sJrKVgQAyBFX(+toE21S2d{wJrYc+3l{al#xH%Tp^YoV7Re`cnx&n`*TW z+7ql(o!r(&*k!KfM|jqq$Qw0f6*+!>g7rWTFYtr=aYpgaUF=g^g?Y&aLKolYp*4Xj z+ns{ma=x3I^zETN1!ABREw=TIwMR>VXS;c5OhL6Tb z$iKR1ORIgQbPdN}<9=)>Mh=-gPm;G1L$#2^-EZ{+Gb#H{Ys8XZBvEYMdzPf-CWGYD{F@__q&&Figf0! 
z4%2y;vgB$17Ntkm*&ivc?Rw-cN$YgA&J^E*da%qqCw%x7QTQsM9ys;0vfjmWWkMM= zXK$A2aK|RRwaj;Ur`iv0QvdZcs1TWdUiw|d!w?#{@k6YCG)4G%>Fws64Ovh_J`Evb2YrGU%1YI$WnQj76?v}NWHrdJ`{QR96l&VCBi$@(8(ekc z`)#!um0#xHH*_hBncZv8>^}h9O zMM8U9()ZY;z3F>#yO{o$%+uj-?!k`0pH1QZq8VUH+k-GIeGVsN{L@`oW^=R+KKJ5s z_irXG`Y%O{LPE);r6+8j@#R4XSy{xXH(QLh?8kc`WABPt935$~3PRV&Pg@7%^Lc4X z%Dc&N*oY#&QNDkXxmW|33JK+?oE#V}X88WSd0jEN_Se`GfeA!GWxe)X^_AT98{8{^G0wJ7frER}zp5Z}snijm zCDLV#BHGUMg%th9=uM=at5e&#MOf*m@$$E+%ln*Z=<`yt= zPCS=a6_!6C$%9@GoVeQ%Du6kOtd`dFRp5LqNt+S(Oy$JWud$#4QH!O}Y`B zA$PH8q-$I|I^R(?3!trlJbkm8{^Px81-FCyL+>im6kP`i3hZjK`k93otwk{Q1E7Kk_q~x&u&qd zb*-6{rG;+b!x;mWW3kwDw~6Z$z9DbHRos;CHD4MBUK5l1HIh*pD<8e2PK{~};PH9> zUn;Q{7dlhLtsKryHTr#M9JZSJADiQ13M^wv(v%{t0gnA1>?<(g-$HG43uRKsTn?2# zUHGj}x&*gAuk?F9YxnC}F(uJbrZ{JjCU1T5uHE@YGbfH%Jh9|i>aJ-6dsG&st9(n> z#Fj>H1*mRRTI?S$kCsN?V#fHKCJN~%#bdOkW@CYo9xS~Sv=C-r)FaG9Y&Mo>>+Ukb z7L2CELs8Z}dbxL;Et_2W_jzD~oA7YfD9z%}%y5Nboz9gSGkj@8#0&WvrcZDo&e)%9 z?c=~n9TIvhzE{B=?G8O95Nm24ph21zd8ZHt5UkT$_nKZNM>zW64YU?Gan&GAB{Cjb zOk|g#mjnyi#{w}4r4#7ESpxegP{y>Ff%JEbb3qN>$kA7Z!mTAfQ?f?t>?Fe9N?q^kV5 zx4n{ghn`ZR-xFQO-Xs+T8bHhj+d6spQ+gyj4sv>regnLeGtaPcqswsrLt?oudO%lGu;@2nO7lLA*Ab2HW#BXJ{M+d=2nqiwXHfROaz zE4ONAV=dqcCaF|%Hz+u5ipAMw$Uav^ma-wKW|CoDp&06J?T{>OV1w(UNTu zv?DIxw%JR8n||UiAgoK{wv*z2R1&@3|CJJP2p<02%#Jiz#Ol+m?9mpzcqjyXhzf%#gps?3 zDltFJh)Llxju!a|gtK0cTS+^+EI#O3_U90xwh4lbXC6Mw6suz6U>uEyL_8{trCPaN z=WPP`ZcX?Akc_w`3&ky%TFAd~i=PUSBVcXOGO*+_ry4#m0vY3n zHK~}|Etd)S@M{Vg$AH+>uv0F>7YO9gKD6Mi>|fk8>)V>7KJt8rz!0c4$ zAdbXd6usNYTfdq!RE@Onw%kKlcTjN3zx<|POT&7jPq-|(I#p< zhmJnG790TR9(A7Y+s(K9-XM^~KeINVF^S+|NKBko{jV6A4WLlCG@ zqHPA=xngTb;C(ttg__xlH@XJL{%s^%SDKt{7Fw4#~x z&a$mwt5_x?VPpAjn1BSMs&tys4|Veq74HFg)x}ZrX+FRy_8C&(t{B$24NL>b53OV( zc9&lZbPCalWtbW5Tr*UAXUAJ8gAfTnqTjKHh37$pL2yrxth*_8SmHMvRCo@8RZR64 z#w-G$of@9oCd@>Mc1c?RdFuHi^sVZCSuj{iWF{{0i`&g~g7QC5j>%nUkBVhu3!+G* zTO~O|@CuZoS7dy$Q0#%RMgxu&xWlXi=qa>Ub+Il@{5;Y6qvEk~3yY{ivp4y3D7Zi+ zUU$>KPk|P>qQRI`iX6SGMt7b-zTxMNlS{wWPpD&Hp+KaMy%=S||CeNjQCN 
z;Y2?)9fv&i;vS6uco7(<<-Ed;UtzLMI%M8Vj#vdG26%~5((h^Pu?!RIB$gePR~;;F zlXoo&q|22X`>W71;0fF=yy7A8BcM4H@`2_&o(1gmoDW3eaZFaDK`OPZU%W9atvI#s zv7`mO(D{vIg5FcHJq_#?1n1HeaEUbeOx!U7q3?8T zWBJ)%gi~G%uQ&=lsSAK`esCNTFco_sw|KE3mhXQ3`(}}e7Oc=qp5iUXTM60}jQ!()?Y5iP5)hPUzSyh@eM{2H=TH*bEl7XwFOroSyUba{F;i}J(DFyb z>{n($LRrIoma~$;X&|^GrH>lw7DH&=V?)Oq=PT^^V(q(JEB-gBCE^BlZ4HwSn(ATt z`YLX$?4f)!XLq4Lqk-XjSP5AfY;GHTNPX1~(u7|RnA=WE3N!wy;vbOH&D5|&1OqETQc6rdES5SP3)uFDJ0AH zn2fCB;d>7lsP6w3B@*(;ysI~>bMTYA@=U2q6DP3@tevW(H^2_`3PGtXWyJ@@hKv5nZi zTCn5g(%qK&UyB)vd|KIgibdDJ!sL|C;g)F~b-BmViN7?}K;*`>%&=d!HaJ69H$B%V zAX;|$_3)kWW5`d*?QoSXZg=E(&zj&EU zjYSz9Z_Ujo7Js&Z)Yyqb^7^*bX3NK=ce2YjzqQKc4f&tOkG<|Sq0sl<6-N#zt z#QV$E9_SAYt~s{7=aa|;`R3b_9|#$f0A3N$lb1H@I#dO{R-ZhUmN_&3Rv9A?&t5!H zRb7!j%(l-iRhDUvFgQ-fk8ShW+X^H-zVgx zpJ>-AOg){7WEWXtxn@(w=ZLe9zJ8eM@dv0a`w+Rpo%oW;n%3l9k#YNbM|R3+C(~jH zhd~hH#(a8a9!;653ec6e6brf3OEt4i?>B&!8 zru8y*6H9B(>|IDtX=9q{UwE}nZ^wYu`?ih}pM)7ZuEq>FCXYQoS=d{k`s?V;$$JpX zbyLI~*2ctTt3CPkM^5Hs>BT@F6%1Lzq@@`b3NFC-ozi1zDQ-y4^*NI(udF+Ip)z3a z)&7wDDUBxXX4t$@yc*~LcFvAyM z?=~On$rsz0h{~EXKD2mc5Is#~JHXK6vLxL5s^7I%scC zIkakv*^}pY!s>G1G!SBK-f3gekdg_zhk{KnZOmt2d_*LN-Con6Jt}6vibL5&d|QTb zd9K^KP5LO-ZSfAj#d6jt_?jeZcA`R*7`s z^r>^kpJ&54B;7>`>J|SZ55xXP9v=Q*1l>O{F!1&3*TKQTp`oGS;o)!JzKx8GeE8P5C zD9cky&I4DrdT<`JO-^R2$wY*=PB$46-( zdKtg8d4beq3=Fiz^9-Z}-H5ie7;t7O3C7X(>LS0btepIyb1`;OC|%@%2EOhgl`~BW z42ZyAv&8zyVxnqp;5%DQRLI%A!lHna%UVmoH(mUyZLl#GJh7aJ7ucDJFmuV|}*W5rty5RWMY`~EG>q|ZdI#XUM;QI5Y2{I9I@ZKP}a zJTxJ|1uL48up}F@{n%TL2KMy2e+<*}Rqpb-aNhv;`B^&N4Q3;k+7`M>8oG;O<_3xs z8xV7x7gWB=CYN7eo!m(0RWux!M;>a=5BfoDB0?Y~XoTW_`rvwBd!IZ!Mc={!&{%VP zS4c-8^w8SDr_A~eJ^#P{|sv75%5CAxqhYD-^xO# zv_E;|JISp5`1FP`k4I}?zCEy)f}t`WZWFsU}U3#t#4KOqI1l&T$aal2_a^6<-#ytO`as1C7n{x&e81gLlH?-)6 znCY63LJ70uNg?B1puv|NW4aJ}C=xS?n4%-x^^hH!xU4WiUR6=x7w1NoZE#)NfuJG$ zLeAtftSFKSE&A^r(b*zR<*;q+{WFYPPYR1`*alV8XtrWzhv<7~BX}3Al zT;T5$TUp3jXHwu!_>T82z-YEwD7SvPLnFvd#Iu~jXt49uq-ojL#MG5MCm>_rxlO&@ z-aoa>sr3$6A&Y>d#?u|PstE*_(I-(WR=6f>+Wg|>7RIK5gKg;h-X=3WN4Cj8F=}rd 
z3P053I!uQ%twq-6?(4XlU%Lr{3t;2Q&$<3H>PqHVi|8TREbpF%@X_{eV|Sc&2oJ6> z>u59|9AWs?EAl+(09Gjxo{vhwe{oARU&Qn)fU=&-Rl*N)feBqN+jf>@!mzrfwnv|A z`WbKebR?LNv4oJCT=`Da^uaG&$Pr~ljQ23TYFj3Fny;iB2hgOBw!)NIY|Czfc7*qM z9AcWu;9%-c1*U=cDqteNYU&vwzqHkY20w^k+M0gSY~I^t{#ZuK7+ly^_-S^mg&n(8 zKH+i6p}`u**=y!Hj#^a~csdT~e~dBGJ5qLTll4gSsX1Qu`Uft2b_jWXN@|q}4qpug z=E|n2&%N4Dl-QrNw<0ic)ZSFQ(}UITp8{QI_ZEdK#nEbQL# z{YxN=VIthSBep1&=w5HH0M<7k{QXD>&z7oSrUes=NVdfZ)@||8w!I1Cp7Vhla5-%; z0B>xLH-Fd~YF8%Q`IuqG)@4JY-=49INVPL$s_tsrTlFIEu3;hH9ZwsbI%O83qrM2;J?7L3_XXwoCC7 zQsp7miFE}Vbs-j6wX%~O9CdpkLMDh5cD==X2(9si^A>_LpkTXnm)Yx%BLPoEj|2U? znWD-j|HhO;7Z(CBd_sJ#3HNRtV|e@^8b5FVJ0ZPue0xn4?(`z6Xlt#Fd8Mw=gz||k z@(l^85s*76A@&*#XC`U1Vau_!3?} zV$!x&h^D>GvQR|430W`UKjhmY@5gJI_GzjaS(-}ApCcg`I8f1AJhlsq0>172tgyFD zM$UT`zCCU{4>|#@(S_9?ohn46l=wx=!s;6YoBXQ{ALA|F(UC%IzM`N(&2-a~?b|Px z%->YjYWq^L2TRU|C#xk>$0{$D75?Dw)eP>->ppI%a? z)%=h{SIW#H=^|uPNB4RCRKk5gk(FOSnkS1KUgbfXp~_)VnKnG2svFt<7CUl!W0?Kr z{)y=tBGvoTt?_`GV);2vD0`_99Lf?EZIeDdVIPwlXcSw*5eMALXubW}5mq>h5Rcq_ zJ|j}}n>_6&E(p{LVHJN&%)gf`WJv`-$;nKJHK3%uY<6uY(K<33)^K}0F`An5QUzb8 zU`;jaxsundq5uNtJ$0y7m1@uiwGX8H(a-uR=RcT^C17Z`Oizy#bjla$_d{RATspCp@qX>Rat*qTaQsh$-qQK9HSB*?+}0u6QWYANtJjE|JQg8ablf4G zU_is{7_v1a=%-1(mt4DfQFZ2t8Z$_~i$`p@`4Iq;CLd(26Qt&LXz>t8?r$j%` z+f0r`2gGLEMisPm2;b+e_)uS%e`~^2B!qW_HLll~O= zPerZ^$b(2MNRtY7S5rlN`ZU0$hJrkO4k8V}hayasi+j|9e;IqzPGFb3b7u%Z z0TLl+*2vKhYE}rr{(^b$ujJawvL}L7cQNj7p+_vbj~a~^Ge(r?<`8d@g5Ia5>nj*P z2^1A&@73^Y7NFVr^B5T84u!(VQ1=Lg4pyUTxLzkX(7>whJS=NB`>I~{UCi`I;tZ0! 
zI7-1UL)Bn~&;YQ#T9%+gd{xe{=psQ{1%0c(CqD{$tVuiKLd@c?9VRW+x7LaZEFuI5_c;>;0) zSO!zR8e21$W%eO!vmZ8%smppg-enR`yv#BU3ZO;fRqTwzqkb9$PyN9&CEBF5_&f^a zoC3qo3ce}%E9FoqeV?5MiUxpuH~suUEO@R(KPn&CG_f`U?6!8-ye{Y$%`cOI$CP-E z+SWj8rD*;vS0sqtDJyjvbpfnAy7f5&5c=sqO8P@NjJ0VJ_bjkTiBl&Wf%@ zKWN(?Wx_vHXHVd;7#F``e(D!YofP2Fx_Y4lY;A-Ypx_Z&tPGG|(hc@lj&Gr>Y<8IT zsj!=$rOdv;VguwZ2hKXqQw?`@hketSzZog4cLdI+k@Ffs*BJQQON{cG@oGsF< zbomXi_ivzw8f-Vgq^HOWTI{A752+Bhhf{Kg@VbtL1skw8^0ofozbFd+E15SvN`($+__y8Izd%c_Gp|9O^+PZ4%FnwK3At#|49M<6(2CU_R zn=d9>1;qmY`At*cGPUUEu-NT8jgC5QKS1hl#-IZ_sPX2+tLCxpYN&OUOCyC{>leIv z2gcE2TF03ZSDW=nG(!1)UIj>@S!G_8y6!rIL`L?`HDT%av*n^1HP*?L4`Xndj(HfgSI?_S1$#8| z*6q_WF#TRBOm3U^m1dIQg-#_+>_%+M-S#ro#lfwhNRHiB3T);4>w3)Xj%~R{_*sWE ztD1A`c-z0C_x}reLB2usi$sqYKC&snDfuZG0n&i@y)yUQIc^IfxUOaA>EZc0xIhD$ z8KO8k#6V)11BA=j7VM_c3YT*}>5*;Lve^HgxL-C&6EbHPxH-9^-FAm3(Ul5`rKhJy_f$r%uJWAB*A9g_0}ZQ zd$YlcuX8umugt5A-p&29=bn#8=yORHTRJdH5 zuzVNk<>_3VdCfmxx7Tt46ITD}+ttp%5I%e}tk%vPDW5Mw^%0+2o4|bd(5cCPo9l!; zks&hqhg(i?-?;T2`EMrc0E_Rul{sIo0eJE0^SpxVP14Wp@wYTf9>1t?kAHJKd~w7; zVu|WQW==mXyJE8kVuLc_%YY1vbxt)|3+>M*W2S#Qhujr}!PyKJ7qWzot#1w=S>@(z z{jp+1x8M+4-)}KpHeH*2U37qmeaqsyfHfJw8s!TTQfB&Z=x2Spu z#Wsp=>{>wxHl5Lo(A3+n?{!Gd>=BN+zHcq_z{kOEeaeB=^4dy7X{frkW37HPXM`l{ zeZ-FAgB^+5ESOLY-u-&s#*5i{;ro2^3>hH|ZK+*!3Jl9{u%74Ne~eo4@JYk|j4!nj zvXtdV2U(e2mKsWy_8$#7#PY4U0WxNo6l7k(+>UDS z4hue)XKs8r4YFPHeN$`5oqR)zrpNhg9xL0~$dkESxV_u6_AiIJd)acC%YmsxWfbM$ zT${3gC3h0GJL81!7fSiI_AVIS-XR%+H1CYqv*wRoib>hwT8@3_%EGSPMJ-QKc3bQX z$d{=*=RNdE(MGj3C?8fa2Jf2G3ftK=C7I8%>QWEenirLlqU9fI8_YSc13<~T*SmOm zZq<(Dft9ahR8B~Hn&P>J9~){HQKm_E+qBmGoh)I}V1hfRarK~W8zzah)clqkDK?7= zR76sa-OjXq3p+O**;V|rbB@)jxsOLXL%4DZwd7_iJc2g@u=26JD`?HSw5Rjl;X|_C z9^JaM_eP%m!W!QR)anmSypP*#=zET~y*J*DU+v>W+oZinYc;gNkF)ZJN%Q|^Z}_nK z^&;S?%sJr0&RZC*PH>ZSTu}Usu(M!e2g}wMF5m0A%j+LTLU_v(?hwo+bC`^7AZ?ku z7y7@9?tdBG{~OxKNZS6#ZT#`$$NwL<@&9XR!%2#balQ9OdOBP!ka^8-*=dA}I5v!- zZMAWBFwAA)xGJuazjE=kQp8eAh30bjjg>M3#q9c9MMib0j?g@QbQWEwNT%Gsx4ZA2 
zr=NQrma`K=rDzGX1*Q_*2a{oL$zuxxn+tTChgmrY=*OK-SL;ky&qz&8)rOo}`LpMF z*W~2cfFpa1+7Q~ScIiJ_Zg?KkMI00@!Fg$;aNqBoaITeqt^DmE$Mqewq=bQK5cNMDhJYT>R)$0k1mOC6Nw)=m=Zw$`u+iGv%cbgW~aDpBwLEj4Ry^rGgim<#XIYI@~7p)(ous?a;c z`}(JAv4dZqQ0mdo>8u@*4vlTaLES@3V!@1siqzeXMph=Hd2%fFG09mDEd5&sOo9EY z92U1kCZO{De!fn^d~K6MNtbn@B(;Joz+Z*O>Lxm7c}%-n>=)Wz9}BrL*!_|B%a+{@ zNcj|{m-c>Yn=qy==l<1GVvagk5>#I;vK{QrL;|j+!1RL2q|V3-UgphURPW>LJHyr+ zZkq6)HBT-M>}|iN&{}bb&4Ie*dRn-xes&(mjKi?xbcGe;{W=-Xo$s9)Cuak*$NKqU zyp~fTA>zlmN=}XbYH`p{;-HZXVNsOJL;cp6fERjJjNh1~<-n)1%op zZj?zNboAfJkV1Sk=inreYA7x8-di|-r}&&M4fp6`bC7k)_LN^To@f+Zux=dR#zQkh!Ikr8*yIx# znmp$OQ_8xm&1A$>Mz33&`!U%dDOP8j4rD>a>y07yw`CAKlqy&=F5kK?Bp-sS#Vc%* z*jA&d;ThjCcEjmN0BX@SIU!OD@UOf*5nH=Z=>J=v=Ty%DQ2o#quz#+Z>QE*RhT9bGsC>;!c`YpcJJ~XV_ZSu z0`jWZp4`UWs04E+k*s)CLm=s?h4_iqo%wl!LEhyH_kv=FwKPu?TfBa6lGOBBK5#W` zS(g)G>qn-PyT%PLR;t;i|E6=t(EQ>(r-#5E>AAk}siW&}9UR?yd=DqF0D$u+1=(tI}qsJytThsO&oq#RfxW?Ef z8oF9Ce`R!Yrku{c=hhZ^xQDyfeB?n+=x7_s+!FxxFrh=1M_a&}rZ zGco|kS@Mf?gm68QbpmZwc<5V0GxQZwhe?ef*}zMxsu(#wN&uDZF3tQD6~QKG>7AzH zwe=M?Pt;`i=MHqv>`>@D%?mjBp?HldiN89m5h&A&^L~x9^C2VQznn&FaUTM^;R^qs zH^9(=$EV7>s<>vZ_%uMywwq2wK?T)gO@yVkGu0m^t~-M*{o-UH^eSNOW>|$MUP1dV z2KzPF@+VGC2Fo>EzreIChJ9n%!_PwwS5iw27RIRX zMw{n%q6tZ*uG%!I;}tM9#HPQ{et$v~>)`7msn(tKiO)0V#AUDU$+lVG(a1mZ%#~=a zv+3buL7lZlvlEi+-y4!;jh@Ay<|$Zmezr_&;Ek3zS8*gje!jLa#29S-MSjpyZ%Ree z6>{bZyQB+Z6`B$zT`Er0U@b>|>#E&RiVj2YqV;Y&=RP5LgL9OoSqMEK2x)&-9DNd+ z;hQ#*$$;g9IX;GBJfM*MzNOWT^e0G0VZ?OE@mSR11=PyO+-^Zm8bGZ#S&f|n0tc7 zw!V=)!k{@Xo>5U|`jZp%L6K>eaW$t~b(Pz%-!iS|yrYnxJNj&5UVu}M5Him*{p&s@ z@n3L9Ea3#uKaL99w=&a>+{FFW#jEzNuKwxPU?bH-6HX-A)ui`^atUNP0<$K&4g0;FrN5Q6;WE{VkkeP~SOxUvpCv^p%}En~8}df`6VSwQsXo1; zT`boQWD@^bVPo~x34AI)y{z1sRh2mqjH-0^Y7{THmRWwKlG&{KgLZ+R&-~*x@9wZ( zmVNq8>5Xep@l^bb&l(N(QNLufg8v!_Z!*3a3APsAz+G#5dLM3ovy{&2$(e@-aVFmT*b91hjz-> z{L*3`1Lphy;|BNBIq)m!K8XC$%+m1w>y8AdV5yHJW#sgjZLX>x?ri0F9#9{Wjrg;t zAO@bIV|sM$aVO8llqGCFe|Gc5$D7?vfbiEi zOArfQvco$W@RA&B`bT0-h#m7&ve|-h55X#czR{wY3Pb{?u$AHuTBJ(O=N@8ueD1$B zv)&#=-f2d 
z@~RG6rem9HWmf-Xmsv=ZJzNVu(%4qg5>^+^#?g~nSfndJHB%s+qi8bglMEZJ-2j1f zzK);0BeO-q`9bpv29`tFPPpPAHcVS0RzUv{$WD?s$}khT9t9JeI^BpJCnY9Ybjzf% z@zvltB}#P!VOsc^0|?hoxRfVxmkTS^g6ArjBORB@ky|AH54~U_4m~gho;dakXD3I2 za5;KI!Nj*rPXQexmo8IcPt?1a6z(p;3!BK${$Pg^$VVe$+I`n_IU)eL0^!9hCimgL zGLIf=6z?fT=gFdl1m_DL-c2CkYPM?WTJHdeZgPts6O(!G=b~DdPs`m72ug% zTvsRFf+x(NUUQk|zvsI#37E%|Qy_M!)(jj^vEQpp?k6yNz zjQW%GK}&W|W$I6Iz$GBp0bhxiN-&jxOer{vV1IMNZIpuh=Ov?nLzfC&OdS_1gdwF9 z_vLU7sKh6Yg0*_#LnYdQhxgFva|2qZ`-|NK|4SpsfY6)R=aj~~u9qJ1l}L3=BNlBT z&R92z?#lDmdoe*g|E7MzmUN7fufA6yC)FoKO;C;wc}nCvOTZPi*do@1J{mdMAikpu z7+wl~>RmJuARWkJA?z)V><8>pj`=!0-vdnHXc=4?SE`5A3|jk$WT;;J4_zZJVd}~J zV(ZOAJWXVJ{!1g;p%*_`)BioOD)eX9qMo_8&Dkh zvnrUJD|_K*GVUQRP=l3gFXvA3WJOCj%9=07_sgd#tfoo06#L>@9H!WRDS5xfdUtIa z8Xs$7T>?&2Ic<)Ru+-uQnrf@faG7E2JN>+CiJjIA~>9B4iIyutBQAu>j#qmUK>>18K z2KI>RDUCW7YX&$>V(-;MtP(*p;{+to2BYYm0c|7?@QkbmMpz_I*Wpr;mb)t4p+!?6 zppfr461Wir>yRVSR~)*a%M8X$EA-GxE>l>~fCfp{csc?9@)?AbrR1C>7;UBap8|;&?(l3sUMR&2bUC}0qRhfF7a-qB{e)Ohtj4b6(_ka(x=!GT&wPrbZXQ?m ztIq1=_JM^_S6D)+24hrRi8q zpSu+!?ox6Z&#DA+Xe7n?mkHI4YzYPI@tX%_2(y0RI}jm-^=xMSC6$s`sgTFiSxzE& zNMJvIy5G^DwznD5G`z`>*hh+Rz^d1ah$+&!&EoHJPVoh(mSWOi*WLwaAN@$g75~6R zcc1xuC9nzOa4sPXGQo05_A5DdL+v1135FEn3Jq4ScZF%_x)N*DPlssnFOnz7+t!z9~#2TcF5_&a=`;_QGtLPO*w2g#M zj9VSsH8aW}yhQT9w4sMoe-80L zUi0T+gRGT#Ndb3&1oIwnGwm$$axT24N3qh_A2nD&x>>yz8(V>%Barv{{5yuLPO*c( z8=iPPfPU-5@@P@7mXA09dzI*&R=nj-dXBtk$xp=tHI(cE4%67b$|o^#;$NCqGEY!X zpl@*DHWF#2P?~~$!@zQ4!jdR#yYkgW2Q^m>?IWY6QCr&oMc%33n0i9Dl*8Ido=IRF ztb@~OVKXI;A-=>{^9pfkjS1VLC=PK|fB)kOqi1XBJHs9*V68K{9&)l)&WV|0Y#y6G zbv}H+_&n4MI3C*!%i%W!)@CSVdnVXvjAtVqT^~RhIy=L3R$(qyqiNfJ|0#Ds&iYsR zHP{8(ha=}GZf%^j%Y7PVfVfXTM$%|Mp^YyAu3W5_7VX-Ef1C&%X7*A#u%Alga8bdU zr}=%ltL%?gJ_UmXTJ*CP3(Ce`>fwvlz{UV2GkoHAl(PaXX7<%IXp50u`X8`n!*`!7 zl`fvpb1VMA$|hl*_%P)x+){_k9lk-Fi`pKKTpA{7B+z#n>@(RZ)brjO#5ZZ-98Jf* zZr&#x`$P*zYhY=Odz%5?TM4FWkQV@5qlH9(^FWPEJuq~zQLsfNx{V9{wNR*1@R4Ts 
z^0-e4rh49PrI*_Vn+&~+JynSs^~ekNX~Y0P?rT-BibX6(Gt>{+u^^2W86QE&Dz-VhbgONFGSMEBSwVbTUKMkyYQR_i>!tVIaqBh^w5{IQf`` zxyKm(l&{Q1H#j)0ygl}ff^(~T)6}W@Ej8+Y?%;0eS+*5B`7Bes>BG!`wDY%GBJsMA zoc-p0Zuh@y-D_DNdsjHf$fjxqYv|#HyA8YjC-K$fIvpaG2z?Bnk{tHx!q> zDEO;0(btXe4j$6}G`%H5o_luv@9(=l&(t7+8CCf{V%6rCWA<`5-R;doZ5?UX_RrIu z{K0qr#-k_L>P3ep-CK=0K&}bzp~c_d^X$y-IYmNAuR9Jw^h4F$1FLG9a!h8DQOP@3 z8!R3E0=qRS2b}yzl(V4#slj$y{J}dFGVS}&{IHDzhXD3uMZ_MHR1qOs@}()|z}m;2 zy#49i-Nz`NzOmte$Rn7h8s-{!LyVKpzcJ^+jf(xM#WRj>UwP|2hl?CC@}0;@$g3IvDt#9^q#82_p~+{J@XD+*)(`|X!D<@CghX$|&y>kd)>6Q}(zgZp0w*YSVc zlZ(`8|Le&OMzl%;V6&{E_db?`kpDY_OIHF`5yP^yiDHDDn_U!Kl9afF?{H#;7ET3o zqxT3o>*y#W#fPkTjgH>SUbEDz!qF;s#u*JuIg!3**nvwcVqEL=-AyVW(|-7D@=j4< zzwU{)bev}RtcRKrCe8>^c;L$F)wqj`UNLb=DPW)c>0q} zp`jzGMM*(-Sz~KH9$B*cuc?VPnpp6trfV$wUKE+Ya#R@Z1ojhN3OEuu$YKu~Ik!Xs zgZp;_-b-r2f_ie2GspB5I^|kkX;IjRNeV;833^JpYlD9AWu@HjJy$Wi3Ww%v0l%EX zG~h?i*62vry&i9%yq@;QCbMY6}$oZ86(~5@YVn&Z=+s{8wFa)7$>;z8@O7 zxTT_a^sAj+!ksLRexrBjjGT9&Bl|0gBcvyXyhNX!=2IO+o@ogtf?yob+A<44*d1H;?BsYYUt14J%(wh7n3n94EiW5sXY#l z21T=e^}oUVkjt-mKAkPg_WDIJgEFCsw-WqqQXMW2w>x|F?yQc^Q$?mSvyW&+Pyd@Y z0dF;1uh$A(GLQOwteQ9@=xdMh1ONReCR_VyJasqVr#3q~%p%k{pEY1~lEgp@)kw+c zhBkraLYeWNzpU%}?31z#Lhu7NAdtwntsXIL$RXI)IPe?ck?u#Un`5Zqf@p6qj4QFW`E zJzmjhG5HiWq5H1$etn+Bp!Y86%2y@A8`SVKiSQ85T3ag9A=rvU)=Rkdum1CeO}1Q7 zl~@Q%0JNJDkCSXH?W3=osN6Lm*)eM7rwG?@ZDyxs4&eDJ9R6xL`N1ul^GTWyz144t zC27ZiuKIKO%qZZiwGxRQI{#8&({B;GVRfeKiNqZt*R%M~L-R$uYr|}wB+06tM5@zw z4QSvKwH6P2H;%ffz%KNJ*yYK_gE)m0D2az)_LmOXf2-Vgp-Pb#z?Mjr#|!zEwqz zKNh#+X7k~=N;!5z__1nZb?O+j3UtG2Ed{ zU$VmLQ+kBY3|tDev`Xdr;|jSE<}WmeUu$lc-;v~sy$vyw@^@H-$7Q`9P(&{8cWV%? 
z3`N(=OyxIq^S~OVXjauLt0S`7<0#;e>niO3=;j61B?{t~&)(s;Cn6}h`f&p6WZ#=Q zcg~0M!hLEsyqk{qeP&wBQB{I`LK<`i;zZUITKnsp?i6SX3*KC{ZKkgKeaS3dTbM0S zO;hX%wJf#BziNlg8Na8e16B4+a^n48?7e4S6YJl-ea@t}nMvq9K&03RMMWu2=vafj zJ18o)Z9q}6;7}CALI4Z)p{VF~yMtmcI}`zXKvY!J0W7Gf0TBTa?)lv>;s5x02_zw( z$#<>eSlaCYgM*1xckWwQ?!*JH=b;lTgY43z;ZEbGOG9r~*jExYkV}(AQey?bC~61Q zC^{(jW!c|p&&lezjD!^vXMdIj-RX)(<9#Y@jCtXK4f$_(V7PU)Al*A<3u>$LakSC$ z0&fF{)s-Tf+CzM#EBWl;wJ`$_t=kp&O)44E3Cz12(*2HS2WCFNZGShU`=`|I@Q#UQ zJLcUHy?w==u%$43buasdeUom}2-0@ANm#_7AaVg-W4-VZAF6E6uagu(O~k%2WAJQ_ zazSrCa9ekXx!Txf?2BfrLCD7;Zdm>O0rrr@_L1j`1v9nVQozZ9CiO zlM>aq*nXWRl2xfZw6V4Voi2pZ=HOZeWwhzye|KDenU~n@W~(L`2cPaqvRE&C?iTJm zEn5+%yAQetS_{&k3N%rX>=Va-H|n|*CLycFe=$_~UCob6%^=76@No0`_BHD(E6i0w zzN2PiEL{KF6A&#oS{sU(wK6T zxIoV8ipq-c$KjZWoT4!}$!#lN?lF@OZ>il*gOu-2^D4#1rz97BOi>!K&)$$RKCPH`vpKz`Uj{`&*!5db+vjvWq($#t?jHW7N^j8r+Ew#tpCoJwMKF z1Rbn1tmw=6fPwtjqli@HRn;yF%NStNP0y=o&+_6dksf6x>eY7Ea_JGd$2sYIS_2>@ z{6uThuNMxB17D^p>=<*H)}~7fn@0l_a9a}{*hC?B&Cy`~5Mc7R(6 zsFqCPNjZi)=t>+qh@rnR)W?9aw6L0jZ!Y4cwX>{CfIHi=myYhE8KQ!HWXG1kX#O2s zD90v3N>PnUpvk}l@?ooG9*U_PLI&i7h&f?;=L7R)XoR`hVL7@o!?}|Z_;}42$FKoL z?&}elDYneH5=m>otd-*1+tF%fr%L12cuC5PR2|VsS`Csd6ll+rSuhoHj1!r@vnYDh zW2NZ9FJ?rE|HO#=U{WbYe5cd?wcjX12H& zAO&|g=kie>+Jhs9hvD3O66sP)##7K>o#+dV+|mInGca90I8Y{W7XY?~J%t&0V`v!- zmo^};h;6JK;10wl^^63I<8%g2%17^lt-NTipw;=xyU=tel-45J1LzGs8Q#MSiFd=A zCBsKcE%10OetcO9+Iz)YNI(iBr%h|~M=2VL%N7jCy#|&b-zh=R{$TFM2z2aH^rdFCtOWW`zqEDV&jiT+@Qn4C_*RtGBp(KIi(X!u0jU0_Slwj4NuWJrMI{f6CPLY2U_rv zMAyh<4rH*=IFlns&YF3ge#ZhBkmCFY^2h;P{5}{uD|cuw5J?B*jGHZ}3X}QkkPa=+ z5>`4Lan% zjmRPn@FU&F0GXc<&1LTVnfpy@c^z_rTN!w$oi+CCyByF6Gi(7gP=5=UCxx7+-NjUpcpCXBd?xrGz1pnWW$n7x} zowMy?_gz?^MLx?f^DEZ!E5$(}=|JneJCdA(el%_9sX^{_@U|WO~)`=P} zpcmDr+0laeKImbiun|YwfuPNcWPGi!I}@bdH%=rf))g|u(qN{fcgtweLRS_INVy+d z2le3M=aJUUxaC`=HA@Tnrh#*wh|g%yqDLR-h zvG%VK?jT$_N>I1#Sf#w7hlCXTNdWCvSp= zDyDd7g!D_(5QF5GJobBL!a`iqZZPklklCgtCPpYzy!4fl%Zmm!YUu9?7;Z%J4Kr8< z_>KC5VzHyDzoJ@8|U_Fa!lg@NJBygq+kQR>Jfb(HtD_@A) 
zz0d+J;1fbDXQ_Oec!L0-^!0bfzXdw}3&TU(u3a}sDW+fgO9|g%!?DdMlX;HEiElVU z8*iFM!*|Ni3B$0XYO60tvAOiqThs2qAyPiZp>;?9nI_spE^k*pXJ&xMhoMI?4Oh{FcNH42KSN$vUYGzExwNlO@4FIp!@7-PF@|7 zNWvF!%O$oh3biR0L(W{9_nDSSM;%TQzDq)&};x+tT^U;NP_PzgtrKeE2_Y&>!^;Ss9e; zxn#}xQ!IPb-3kZ^VY=)MM+KiT$_J;TT8t-WeX&huU#LY!0hyVW0n#O}4WE81OK0pr zrYv}<%Mpm4@NMIkkY&T!TCt!3{b3|}Fl5~;&8pGTdfIEFvGE}xxuxdB6G+>Sqk=N= znT_Y!gm~d{9jvFP*=D5Fgn2>}|C?n<%hbFObuA)Ai@T9_bjZ zf|)lt7i)GvzoM)U$v-BQV@%{4!vN|s}k2qE_#g$QDqEd8Gla*K?-H%(&*eji5KzgWMfP{v; zM4wULSAG(1Q;lE}MD9uiH_SIvF6vauuZs`YMIj3=C^%8 zkqE6XLoaEi{_*vay3~>RF`=C3R*8l2@69;6*}6Ao zJM)w!c!zy;8}QHBdGG=4(BC|^|J}m=58L|B?(Gk-0-N+3_R8;lo{X_{K))Lc)!O{l ziJ4N!2ujk1i9r<5hd!m@cV)# z=gRC_yI!nx{hG-t;SbRayPoyegQUB2Fw5WaXQQ|N&iez(COF(UH-gh}E&Z}X;M$sJ z?{CcHtsQtc^k>(05hN#d+d8=>?CgI}+)nJMk9Mx>rX3T1(vH7#cQ1Uft-mwBygN7b z;+Deqd_JjIaLa89+F3T}!E+H?CVn`~_0)$w>(bw2H`^z_T?1TS9@|%KzojjTXTkGo zrZZl^d^~+q@2Bt!8?gwXu%Io9p!oHls%`t(Zlnq;dJ*mrHttCT6u%=?;Rlv>pFU9r zlepd1?z{#+ZVu$+!fBHpr=BKb8hs;%FL3FQ{sx>R0grgiVEMpO+=ACGqwg+yCD z8t6WhzW1)RVUWZ6XW27*I@E^ z(#?!wIvJEzADg^jXSm%#?_T+n(i!>6GbWw+!>hJ_t3B8dOtKK4n^sXC)Y~opwmFd@)zSuPNq!o0TJzTU{rXCn{a=TN}H_O|X*|NoMg?35w-6mIP zRYdx+^>i)o#KcXm#oL$em~_QOaNYg$al=2M^q%SM?sv!<(WbQFvGr8N@vU@X%vPW_ zavZjF)DGB0oLq;yRFxg(@BVBt`OvhIB2zE~uX5<4m&NmJ#kxf?d^q!kg34l>dzOuHv(TD9F7J-E28#fy8Ya7eEv#GID{ zsJkJ%|0@d`-L(ucBfgK~5>?UXOW-ys&dV!K-gi`LvX%H}15+Dw3RCfSyzul`u}%>J zw3B#E(Wz`0UP!V*IwH{?0CS|zTRWE!krdFV(}8|s|P2X*Yj@lw?EtgH*K%( zDovE)HW+yIVB}cyl3CBxhKKYz1qqD??0~R-<@1rcXX(wAz2llTGg7m zQ#HPOv2eWLOmqhGvj&eCb-vHlA!vACg)kDT49-%>oP5dxyEE80udB1-PB=6zCsMp2 zk9Ek&GQqdbX26Kcc-^)aq-MR;BDMP;+Pf@VP_%ZE-_K6xEQtlj0Qg>inc}_!SXV9l z!C;acnMC~vQt_@b@!jgx3ccr+_V;`D%NSV;->JZq4CoIzqjmp9F4l)hG3Ve3P=i(w zNM{fqRX1A~ADq8mvVlo2wfbi_!eJJBp5o z1A9dh#enujXn-;I@#vQsKOSnWB}U-#HBE@vN6b>{^SO7LCkq_@lpKud+VX^dfu%B3 zSzmjw)_m+YnY+NJA|w6-~gezU7gW&B5+^m#O7Z z@PBlaR#Mwq@hBna8ZmBkAN^yJi(3)K8T_(7cd}=5q)%2rzjYcu@@QB~DofElK&P=F zOvCg0bao|w!R(MxW~S5h%HErc--L=w3nvGZOZJhYm)$B1$xV9Q zxNGbWP?&0BuEd$@bobiF?ymCtdV5!STJTsp{gkEV8n 
z+JMRbpuKbjI<~*EWjv23mOh@H>ZXliBZ__LW1sCiW3cLLl}!UDidrVQ`b8E2`#+sO zCN71sT?zUG6rvSd1`Oz@@cW39AT2^7ZWj+FEeZ5+FBA zf`i%0l$cf7yHusShWi+^VW6@%MufY}QCDQldo=vbELCJDM+!IUZMz!6nW>;`D2sidw)T1 z$3eZ=B?8lqkG8(3fHO$@eH$Mv{pJPg$n-$}{NQDCE-Hc=(p^`mAMTn(iM?+ut2o+j&l_A*12RSq0FDm@`+_RpXDZR?p|tvOY`T9y z0YsDz&Ooz`uRArFaUkzhGpk=!-- z$mk9gx-nwWDOOWiz&LLiYgVd#_NRit^PW3B$^FjGmOT3>@6e=mu=8wf`243TmQ$mX z!!OG!=}^P)3`=1DOIK~O0!tZPA7uBVhUdE~`}PiuwCdbAB*YSk*Wsreez%5qU6+b{ zCQwE2a>&CNm$kK?W&fH=e-)EFbpAl6hlPv`8r>VMoU4e21WFKKa)G^-T0DcH+kP26 zMU!3LiBcR-1Ka$6)O&(0i;8#ny{nak>Y4XSsvr8fENj!bV2cTAGuOQK-BUu@tlLBg zomqA;$<&z@EYQ#8P&*TAOgOFBYM*&t#}g~jL07cAv6`~vJX^V>Tc7Tqsc~Z+Ewuls z-;P*<1KLVLx+f=te~yHvw6-|>F@|M07jQT4t8kHOs{`IO?ePjxfV9@|5rpwyHx6$9 zGefPy41<%5Ws{KD)PQesh?Sg(hCRpi|FtS@^Uc%+OqtnHzggM*zj0b_|Qdrz5((9_*oUNVADB`8YfTC!Cf@;6r4x5h3G2J3|^SsLVN zn$8AFFsOm$5EGgKOdWGrvE_Ex9p z9y2tR!A|DIyM<1?HSr7=hhEmA0U?#9jTi5i^<0p!#c3d4`P0Y8Ps@3bVt;$1{ya}Q zXXY#g5czH&)tWCBf3R>$czp>!ABb`1T|=bpz<<_?f49^Sa`{!xi_M74^jw8iDD4$y zO%0pQeh(Tls{JbUQdX?Jql@LW+hI6+ucq9tRa3qvQqPaqr(PML!$&Ekdp4Mg?e^#L zts7@>=ERpfyfwZPHIyJx3evej^*;+=md6sdRiQkSd)_cEBKBM)L?;jUT;@gHHUwj< zENY;dsH{N#ak+%%`THxiW_c-g2Z`LU1H|8*D4WGF3nMftdrI_G9{obk>$PxRr9lg# z(cK!gr3#!?GpFh##_iU|ILt_vDd4ZYD0UzjcQVUb1wJYlztJPy48#EH<7Z8Y%l*BntQZzvuGSVe#-XySN zW3eXLazyH^jPjyMEGXl8TJnyDjJ*Th8CuJoIc~1Gvv%^Bbry!O(`-`2#mZ`-$&Q0Y zU4{nf(kr5DkCW&hjU>Yj%b^zt9lwF+r4y`6 z;H7ex^<_4{jmQTrXW}Bj*F!G&+J3EMf2fRW5Vzs%sCY1@A#_%Pc}hO=UN>c2L4;hz zexkSeVvtZHQoa^T-Y>J*%JpAmArqR&mg&#|HLB~DUdGT7_;AH-H*yW;>bfrb4>ae1}@@lH7!eDbOnrl5kS8g^=WP zL>JeCtaA1pB~#YlQimr>^TEqH)M#X%YL!|TQ0j;DDU&Z6^%=q0V4Vj2MWUZ5Nug2n zOs#xtIA~@RO)Yk#1~HB~$-L#W4kRM~7~%7yZ5j98WTmWBzUi@rvx`AU8$_3v`u(K_ zd<}fB6@O`TfH3|`%<@${@|NCw2_G*9=newDzHxCxGPYZfKE`EEdWK6mj5UD^6uK~M zLADBcJaf$(9HE-gkD9d%+$Xq$CNv>YN?@ZG{Z{hrJ(=wuA?rFeP=@}9l)*|#pN^7c zi&gdLdn23Cg+Ij5Dka-o!ycrA=HQ4|2^ngz{#}Y$uP!=z2oav3)iGd zxw<_;m0VZ@!l*$QIx6!>`1_CMm=WA@QCf*-R z;@Kg1M+lZs{3ZhZU=)8g%#o*{Z$awCDju!pjkkcT@$pRnJsl1QXi_0^^dG(GA3lie 
z*kw@*R0QK}E9{}ooR}{KC2XPrH9=r5<3fwY*jl8GFk#DiL(HZ80i9O^vEHi*{YfTs znSg-i2Fm?*P210MWIZ7eeg6n=2-AhVYYC)%5v|Y=)@)6xlGikcyU)9WBb+NRT-;bjUF>^O1zk^1SPhPZ% z6LxBmHRK^-GVdp21k{ZNK%CINtiF)r8O?IsZSm+c+x-gqQOitXA=IK}QpR&jqo1%$ z=VlQ#9X)nNii6otHQd3*GyL|IeHfeFBw5x}HY%F=5Ha6pV$T#P##>dShlAsw%_@mZ z1>PIc4zi3*4gI4LJ2ZiJi$_#)Vw7URfv9&{*dlpGDUKy;;|^_xUj|27M8U5}bQk^C zNT8bZK(=Hzx=exj;5odPm_);~oPaqolZ84&+pZt3k*>wjhf2(WLVoM@ZAYqljFKBO z;KP{U8!Y^Uqrpcp9L%|A1i_HtkMmJ>8T1U>!inaTEgIszRPv9XGv&FotzCR1^N}(UBjHW1V=&x0rj|o@H<1p zB^vad_8OpzzbXx1LVy)ENg#zTn_QCDvi9qjm{9@iy~11uuo1iU?$7lQPouJS1aHa0 zS_b^2B+Xl-H!h)pjTp}WR~xuf*JFPAG4~EgO$~Dt5- zKh$VexD26?zgxGNQf_@(7S9oE;iAt@OBqk|$Q@E2ErOJyFECCyhJKX)6~*$cNntkQ z_`5oVy@a^FU|2uCk&6pYch%W@25u&&rQzrkwV2XLp3ss4`Wn-E2qe~Fo)V|s6JBW0 z3tH(K!&p{$f?R`qBakfvl)sI%yoH)DA5OM9)`x~8kr@JF9c^rl+rHHIA3jV6E62cO!Ro}X&W~Mwt`v~DrTjs4Z zpZs;gsRJ6R6lQ2h(r?=rmug7YH4*4H4iN!px8MiEUG-7~y{9pO$b>iZ&_=@v}at+X&Q zEeWB@D4e%$nj&Ag*He6&guO0sXiU*eL30d(vZKu74fNdtHmaxgLjW;|#w}`POBmGx zZ>pASqH|!{=ABApqBhAI6Zff+G%46jqaVny69c+{-*S}3A4+Q3f9r)ehJx$hJJZL~ z=vE5Vk`5z(q>2}M^~RP}JS&dK=Ck0E73T?r~1=O1@Nti7m?jz8n zx|MGLTDf8hYcc+ger&jZBme($;QD z{U-jbgv}}Tkkn4ga^4&D4_b5J;!@<7^7FC{Xea&M!V~PKQX653;#@)t>#a2e(e8bR z%bzX3fcBr0nrKDZf#i@ZeH}lJb>hV_dxH3&|Hc#nQ^}&=I$rt3ll=yfO}k*gz_-ta zyFf#`riE9_%RG;QxW@+YKOLxlY?=MoA$)=1?v<>$P@0+4+*p@&;u0{EWV{@8uQ?(W z?fPkgPZ>l)*Nm%84z9j;Z$E2NXvH$rCiF0eqxk&s`@EvTk>?v^SF-HS>J4 zz`IORUgS1+5*N1)E}h{cy~_e{_OoH#o0Ozr^@pzP#oCdiOHzOTpC4ELwZFc(Hn8Yv z+4qH|+2YJkPTCpxn^OMcK`S$ty``!+hguIG%vxA*kK-}z0{i3S!o@%4wXd@#=8fIT z3lkE!@}B{-ZPjB~r1o*PO5Oj>2chIMM&CLvdmHFRp|{5^U)z1L8oGAJwxc_{bvHRH zpcI5Oewi8EelV!*TRUqBF{a$I$9`)l{_yS5pJuCHq=sLL&shJ_Qo)Ztw#9Av*#cO4 z7T>q1?M$@||Fd6l%K8W0E3&p2twccpjt#z;VBIQ~KGVK%aoE-nx4MWWv6l1y`tZWl z-fhOb$7zc)x>uNOxf&T+;ri|q@3`mQmFE~F??tu4HUpGj0(q5GAK%c|8WjwdUOnk| zdQ}Y@k#G!D<(7N+#5>O$BI`~h&8YSG2in~p?hu2Q=A3gmPsN?|Uvnhia!a{Y^mWSy za@jEtpRTsM(rw*$*|o1m=H=fTb7{=G;|WvpI!F4Zyj(H`vXAhb`VV1ZHX6Cgzcg+7 
z%q5qeX2S6Q#Z3PfGyPx8^#2?)i7BnTf{B@sEp8>fO8s66@;v$;`z>+98&j56M)&YynIyC`ycSJ(R|Rn29EuoK& z-S(%=lU0$=v-3RQYF_Mam>(#1{MERtTnPnOBPNVnu<#3U=bvPd5&5qz#W{0sV zq#L4EoLsI*UN?$R;9mW{Ci%E^cT1L&EvI%Do71S-?KNncVqY4((`?G{-`ZR$^4rJ? zQZ@DkKi=zh4bJlV&``KHW{5x)z8$`H#W1#K-MXrc6Jm<)26iiOgXI~|@;9NdqIqvd zzcw`}lnqiR@i_uICP>`fnl1X)xtMvRb8nit3-AHkzjsR^877$JY|9mrcdHJg@Q%lPhzBRW`aGlZM-0 zS*8~}X)VkeV=uN%^Y}~$KfKGRHbIl&C&i3xBPpyPaCdj{l#QKV?F_$J*Nk(hX z%G4JV>_Z^G_m-yQ;Ep2a)FpojOJTo$%DcG!)4NwGq2cc?K0cy2T9Nh7j|XdOIG9gr z*4U0!bt^gTX(?LUJzFyj>_-i`lJUsRsac*=2lciJckYR1Uu})>6pM|G~6}~d~FE#wf5BtVuLq*Z9Rx%#L(QB#&-M zBKB@JEtA6_cx*>gdesuW)bZx=F-OaGEtjtvqwYqa;YPs*ebthfj`V=b+38lZ(pX)X zR6M7E^tqa1`#z&eI)5KIS?_E=mx=(rI3h1!q~)Fg@|)49D93%&YLS#BAhwXdboQuWu&4IRl1o z#YaG5Lo%$$mGd@kAJuUZjX$?2@qGQ=@U~{o!fC1Ybw|rSf zAkpEh#%rbQb6*a1J~$ot>+Zngq~0GVz?}%@*ms_^_MxDSB>ExIu%kcMoKsrGHZIrh z^X)9)&nEr*+L4IA6r3h)xUXAg)J(42yl-QA@5llW{w`{$liTuKA@Kw_Sl|GHEYvXo1e^{v0V!mW?a=aKgX-J7+e{0X0ofZNbYO?+JsyN&bxwwtu`EAON-l-(@f6c{J zLOR1TJT2_!)hgd9E$Ak;V+s61HMTr1#K|Y#c5L?~(69M7WT?Uy(u9p|jfPUh_<&_s zE6+biE`{Ey{+A*lD@Fp>+LXgf$4kbv4VKQT$;L+NLjsJ+=RGn;^FQi&ftQKH=qkuG zh5t$=23bk`!ikv&ewe*-X02&uIqY;x=J?D1-d#%CkMIh0H^iadzV6@|fBdYjmsuMq z_&zT$*uNPP=dy9uxCqXOhFWCFJjxN#9zW%+&5WZfY}cpp`NdJ5>kMVWmFhdlxB}@n z2?dVp)BSWqwU9R0;lb8Jfgei5JRA1d?+xshNnrQpn-!+xH;;sFwdEjQ60F-|Ioac#U_sin3K(4$~~L>J)BbU zaKEeS1AvAzx*FJX#%xF88f<_=G1roRAthd4VRxn&Z?`UIeQFFAyDw*roRbVQi4)I4 z|Iw^_j$g((TpC8~H0g>%PT13F-86^IrO9d2hMKfl0q2gWtn(&moSZy8{%9{6rwA5& ziwcEaWWzh&$*tS)uptK^(T*`i`z3KZ#tx3lhH4!7!_VO3tkSXA_s*kU>h6qxier#1 z{N{niOJ*ZKBJ^R|F)P%(p>Qgb?%6>{$zhFo$cVgIAZls`r}*cOR7!8^avLvpdZU%7 zD-wq1(t!8DlEeOSLx!3eBU!@=T)t3J*S)D(tOR6g(O=BQ)=9^Y8r5x65T_nBk1}Cl zqP4(ul+y0SAn8R7dY8aVNM4%(y+|-9j*d2jn3Ct-BzOy)aGKEd!3YYl4v_U-aH$mV2ivh z5tR}e*Xnne@f`w5u@PQZJISn!(_bML;9xby;5>*QgTJX+_l$xL6*f|C_tnDmFpmCG zBX>wi#{gKvKf1CsZ$qyy+;JMVAlsKMJ9;H%fPf1-e`W$0f>V!XW$u_N&8TvXw z2A)D{Wok%8{L+}$Ixv!L5(h05_CG+tq%)ibB3PF`wUtL({ z^t|r8|8aDthAcQy*5RAQNnF@%30y{MnNB{^qs6SUWiJ$>MC^c9_NW=$<=Sn{vpDNF 
zU@Bbra;Ax}XA*BV7Oq1X@W;cE^#S8aqrq(sXN)J@RwQ*8Fo@q*z}DH&5cTk*_0aOG zvgvYjPmK(gGue%#41$_y{SR;oDc<%mTom5iP2~=B^^wN zk7>9YaLEr6O;YjircEs*ZDx?z)(?tM<(g`BHY*=I1tH%s@tK;Cpa_va%|0PV`cu&n zI&3^(ByH%m|4G+`R-RxDlU}M^`jqI81M`^Wuqq?kr*%9PE#=dqU%1#+E;Un!1kFM_ z04mui$u+R!NYNn%P6Lq((`HZZs2q{-fO8PolRl&PfXGA(k?Rtv7OW z#wloGh_6*A$yEGZ10%&YgiLG@9NWPpbujJ-mTOX+f<$Z$c&kBk<$Ip$V=w3-G#{$p z$+cFA21vA#Nh*|rALc@_Zcox^jH?km(ywiNMHKV# z&Wyi&EWw2}mfi?Qf$ubWik$q$h%Te0v!WTn*)*pM=npyC43dPDxSe3O3dMB>giJwO z!Ep=bo_`tbAjFdiskxpzbR71^fOb=xEZRM0>)lP2>~J~1X+L^sAT_wh$*G-ZQO>Yu zE^EH2c{b7Drw)Ch6mG`&-)iBV%5|J%Xj6&6RweG&^A5H{#{nZLLz_c~J=Ag*Dn)Jb z0Cjn8mkFQNo#_T>2QwImp$l|Uo?gb z8srzfy$h`QsYZ_y0H%R2&d#96_Uc)JI4~CHPcM?Vl2DM|@{&#(4@4O_NOYD4tI=Fy zRe~OE((aO!aQ_%Y=Cx2_Df&jw70Q8_7IrJeXDz@XHGW!Y$}zH^8pR{@m`D?Roszn1 zP_5cpG6oB!B$Ey&|0cyL1BayxY4jQ)rQ}Xuh)ke|Gk@f>iEH7y*0mn2hvix|xkEn-4ax}7oM)Zza$^s7kq$Je|OdzIRwBozSIY~-!gZwn# zk=03|6yts*ILyXq5(DIBi+l83b_u&#%ljq;8d}6(izX^JJ2|2QdKa^3QFJo;mPFQ= z&f$b?_^LtXv!(99rd-e1dD%yel62}C6XtZuk@``R4q~BMzU|7(Jj5uI4k&I3^T8n! 
zfeL0`O-5Vv!uvY$Gd*`OZaG9HE{)(?>ly5W<4ar@O5s0p#59W4N+Nxmu>#Eeq5QVB zPEtTrCTI@)Eq_ z(lEqS<4wyQAy7IfA)7Y{?7%85M^|yD=XCCyz&Sg4`Gl%fja7e5(2Dh7Wn-4dE47Tg z{!Jb@Dh!%uD_b)KOeC!0W2Mf@Y6g5MHXs)VJ|hn2wxJa3r2&Z?y4O|<312C)^@u|& zk1=E#n+9y&VCYJu)Z|uBFfodC04@g=xt_&xBU88i}Sj9fqrpl&(3Msw_i4*jVSD%8+C97i@woeY9iB2*r`Ly9J) z2zwUs&{}ZXnDmoi*|&@SdH={lDj2;;v?Likq|sYzrK2_cL$Tn5-07-D%2u?=^RsY~P|3W_iNxlt(=|nXTcDyDRq4NV!;MnV z4j!!dds?Y}sfT00p-*#{=P39N=H1W>CU3!b*rI#5^y7PUN29Bwxx_pQ z8ukSJO7cu784P$2E{WcjflX#88{#N^Q5zK~4?!51xd%u#&0NO#Ng4-Zw`eq~PRdrE z&m*J?9BE_r*1lj*ly*^=RHzNkbpog4P7i=oqUAZy2Kxz*OOYE+s;e37wbg(sE(AHQ zM;yyo4|bn1YlqJ11^HAGtQU0~kZJNb>@|8yYr-;$HiwF^GIke!7V#9nHWz0Dw%o7+ zgQD&UK>ic-Hfg#Yf4+;F?_}TvX|ZFJ#WhRzPXqc|&5SuYT~OLQ9nf&NXOaAPZf4q& zkvequP}2pJ#X-{a?3nnMPg(JJDG!4Br`9v$mM?Mcub%-&F)5pzj!4Bl&HMvfvB#A1 zEiHaYqHmNdY?8r}-#)|3S=AazxE8ZB)H!_eGxY??-j}4Fz(HpwG(sQn=l4kyzILZ& z3Aj;|^V1*(HH>v9Xl3UVDf#pe^(hfP*sf_HQ6gB}J7K z_sF^U@g{l8C!JKQT0d94A>5tU2`ICIO_|658WA+WT^XOyd{ekH19_{ zmCW<2@4{BO#wZXJC$f?0d>|oOTxL?A~xRj zS%^Ux{FX#B3oYJL9TwXqNA4lEJy+Z)$vxd&i$TB#E4-TL(>51O({@&)O(TmrsOs1a|Msh|>#_#b2E_*BS8&3;$qZlmX( zQFHvUBFn)&4iq+g}0U+v6kPlB7=C_IW9YMUE?d~r?TFCWu+-`d14H# zyzNdl1Za>3nfBS5w`kh7zONNS6T`i0-@n;ysgb7*?7n}7dBvx%Stx!01w9$}WA zc0hlBTBiLi_R^<@b$+V0=YMk|Y`2|L*C3nwRL&zrX|=m$`yGEp zN6$wG@q9}u;PH2c`^>i-LfdSs_k1LW7Fb=_UC*XDch;60m<>@Kl<6s&GOzur@VsOA zQO&!O!bpBj?P?2w@=3A-y2mk@G^uPjWuG*V7cq>C^;f)0PGjp%p4c>0H_U3=^{bOb zvt^9KKx2rmyScZZG`;RkL0R<)MGvzzx4H8SFSXuxLe#S~VS~$w**@(1AQ)Bt3#L5t z<=@)yz=Ynps_U++Y{3f|1uedMuxw_GDmZOB|xpOG` zQPcUIE~pR}ZmOD{KWdKu?nR5!OpR3z8?E_XOIG^c(jPM2wbRm(J+I{B(N+Ind++(y z#2UV9KQrlVW)eb`G88G+ps1*nK@hMHh>D6D5ERSOpr|x)=qL!m-VK6^bp;o?z=EL& z*f1z4DoaBVQ4j;7pt2Tw_In)r%lmDA*x!DCLUJVY%yVD&d0x*e{Lb!lo4lNw+);Lc z`Gj9uYn3qH!#`=4TlV?R!!KR*24`F(c@9?b7kIB%rkQokwuQ{9!Cm_oIVKC1MOJNQ z$_M0lmqq96^>1UhG_SPgJehOADWS&y;59I}PkT0M9oOP2XV|1h;&$WiCLj!)HW5s0 zuDEknVPElYB>(E`i)W)OTThQCg1%oo()QH=KOk3|u0t<;WiYJbujZZmiTicP&)&C) 
zSV5tgMagMbZ{44`b0{hdK#a~>&prPR)_VRAZ}$IKtGRC7y7lYV(KIW{Kp@EY}vA9>(;Fbg(4~{YTLGL+qZ9zj*i~3W5>>&J9q8c6%!K^8yg!J7Z)EN zpOBESd-v|d#Kb*&_D~d+ltiOI$tfu*si~=HX=&-{=@}UrnVFe;_wL=dZ(mkcR(5vw z{{8!Na&mHWbCpVEUS8gT0|yQsJb38Pp~Hs{A31X5=+UFcjvYIG{P>9zCrj{(rQZA&xSN0z%ulr~LnS5Gb||YpE+PsLh3wnzh^;g%1x}GJHy@8xepj zUa=wsZwC*v*urrxI$0Qmq?kQD*BD9o6tJv~xZ?UV<=dx+2hPv`^Bpe-a>QI+iIpA+ z*O%IvYkWf76My#>SN;9WA}|fc9l4mi3g6h-pa0L%ud7|qY3IjmR^(dc<;Xl9;B)XL zGwA}lsIXWPmc_zKf3!@^wM!)^y%_lHzsAiQx)xFT(!J8FjD1UVGHdT z{=A{U_iOhZcLqDy1T!I$H~iKww|1l#Got9TBXwG1otETjioQL?zR#&9QVg5g7P(mL z4t?g{E>)4cLt0Xvg)*CD<~);C^u_!JOJ6m!-RVZtY?i94!RzV@ z$J~69uW7H(n_TyMk=xie1fvy;JUdF0GbVn^G;S`-* z86bHftK!cXCZ=wfaV5BapVj_JhjKl)&2^zB{BT)38-ht0&;55}6_0P=)EoZ_i-@;0dW=yJa`QE~yl?}%7YHWV>h%!q&i0wd^^6 z4#<5uldpkBM=0|ix!xMm;;D>Nih7T>NPo=ydrwNXhP^5Y9m86n5V`jkCH*2;K9mNS z#-zNo+m=yqVy5H#LiUQ)wk}}Cplf~}b8kpn<*4rilNT9Gc)W_~@iRJ93mQ_|w}JW} zq!Mn{ZGUF;YKCSpXxhqYD^PiS@DP3a4CftjF?Pf8-1}AV?2sQ7envPv^5aXySAh%V?*g>tKTu4 z89MQVsrf0ay)RNd2J(^9)^>YYd5~M`QtuK)*iN39K%)aPHC)PcSmE0@;E)hVx-yHabtgbG1Z^IAl7Vim(5u){J*t8S{A^P|+7MfM*WULeIxO7*cmXQb(!(RTGNHH3h%DKJQjDJRe6_Yb9N(trjhit`#pVIn2x~}? 
zY(I-0^YdKYCx*!+<6eC`-JKNx9it>=Bs80zqU6w70mGGo>=nu81P*nWXHNMg=H@zT zq-g;Q*1gHg+Uc7Lo&Hsm_B@Yc6iBZ{%4?i&RdDuQ)rl`_eOx=OPdz!6BC=J?GBm-& zu>a)(ftVWi(r&`;`L@VtZN3jdkNW2GEni;t2bXm@vCYW_Z?Rfo|0}RuVHTNh$S;tp7uio90VsX|a*>Yj4XgD#$8g0&?K}v`jQjj%N%4FLs zAfB6@wLo`H#GM5DwpC^eqA2B%S4A?0(=C&^8#^kpCeD{LLR~Xf_viEOnfZX_j(#2k zp}9K~!WbM(16*|3>*nhmZ`RdLEm7?oTUjMu=#e>Us6lWbA#}uINL^#cy_-L^>w^(C z$;*yXw-nD#sb?zE{J)A@GwpO(XK9Ap^w+u#5mx+C8*}#wnG8)HJ2_ZxzT{GkQSun@yD!Bo zKtteh!~rlojL)1zO!_6xT|71wMCfm|v^3-@d^(_R^?*mCGs{d?^}J7&W@;ekY_Ic{ zvM4PyBPu0{)R}3mYi-x<9NhnYNr&-YvAOY&`MEYT4F~* z!3?8UiWV07h#`mMLn}Q3y{OlQcppzb%k_Fn))b!r8&ar&e!CZyc*Nim<|ZBt+zLZnEEGjD`eQv4qjn%9Kxqk(XO@fdR%)boEH!5|G| zK!cw2TJ%U4=OV)pT=ZKp2MV^bb7uea@-z^GG7`p9tQnWxv&%zxID>$2H@zG*XD5K#YL9{PjyTqeck!yd1x2e%q5@O1sYjUJ{ zG1^9QaSRu!_%>`)Z4~TW2yT<89`FQmaNiGlS&)xK_s` zd~#o;pOwJ}dO???bIs{0wf=Z4r>8k*&}X#CXp_FKoPprnA2GQ@$8trDiwvxDehr1t zjq+U4Mr2(LwezrDgR%~9LWe% zaMn6v#@aE|Kt%F+b_b5Up$CL=pZMxHFZ8uZ_zn4n7wJY1Bpit~V96o6Z1^&XpdXk- z*JaDNco zZbvMFGCD$t$LLmduOGb>rm-CUIf7p$pLD37y*?U>SBbu9PQBG}{!mP;Cdn2xaz)3F z*5wIk(m*EIZv)OL&^L+#LkN;`Ia3@(UfV5Z%OF<`^@|YNV$)Akv(?m*IZH1ItB-P759S@3B|mp z;{1rqXROr zhq~-7RtC#~NX>1>CW~;Ashu!HaatN#ZSs(J8sU9~Q7?Iu8^CGN3##<|4$Nt;+%i{d z$y`Rl>_@`K8LSSH>nM3HF4eh4_Ijq{$DN&`M4}Zf$>%`*?b|aV(gQ zO?V&|8|XNXi-Im*M)69_3Y+i^a11oiIE;JglgQo;t8m4}Bng~3Xp2lRCSLkhJGS-` zgd9FIDi5?u1T@TakJx%SA9Xw^D1U*L;$jz-S*>CPq=$(VbQ2xDsW@BpV8uE>7tuL? zM4r$GyK;};h}iwNg3ActTy}n9-T*GQUR_wlT~>mA0fImh4g`q#P`(Uzx z-BWOu z-i}@z4R+y9M{Z-mm|&?PNY>tn%QtJ0ACwe3H>$+A+P+FFX2^NN6!N?0nXCTxdo}ZW z6xbMoPQQy);7;H4EJr2dC1&cO!B`Ye_XgWMb6jPTa1|fqYtVN}8jD|u3P~dt zR1A^KMvZ?B=?)WuL2S(p`nrz?UciI-TDVG$o+aq7gLz)JFAqYu5v)5}#)4R1zB2x? 
z*FShBt5C^JQwrBME|EBMj`*Tmo><+Y;U`i%C5F2v`~xcwydqOSsd$|l(dKfx6b{{V z&-#E)coEN>th5S|i5=B0zjbAB6i|(j<2Z5R1uC;q=3t}NxLYH9gQ4#fiFMQIdQNAl zoL+1Y{e!bIwsGkNhDr;+dm4MYS(u~VwN9%sYGqW4KMd!y?qgh2HQb>!5@*e`$DBs_ zhyyU>lb&l923^-XnD~j8$wzkr69HhB0(v1wn7&YPx#5#EKf*2#JRwD%37DPA%$g8K z==?6rAf|#5n!bVSBF37b38X*eh7ajO0i2`Eqw*&5U&qDl6V_%{MuIlVp zQM{IDP>D_WONxA~gJ)_1&HTRB*ssZ6v{#RO$FGQs;a4*BD=F*7MYcpYCkaSZ{P2Fp zn5~#(g#Kr$*f*8PSGB;ehyEQR?`g(rK6pr5Hwl)ZIlNvcSWqHrDjcBs-PQk%`w_n8$GSZ~BG3NVc>f2S2kzsd1@CibKKoH-=lL7s z2TTBCDA}O=ij_4;rgZ`oTdH3ODZq4+Q>PbXDB!6{Vn_+SQK1(Z5_*&A+z0j*SLi00 zF-(f=WVv%b^XC~{=vDB?DS={i!AH$(-deOaOl+ZK2iDkMB?Ie~RvnaJJbpeU@sos^_jtyvj&sV`FeLRSu&`|U2{L7Wn5YE#jR%a zCY*&wYNofPXPhUDnS9K(@X4{TG9nJ=T9t3TGv9izwZ@jR_&VfO5C&!y#r`|Izk0B@ z@==cIyD#&8$+<>m3-=3$>R(C)eHeEs)P zWjP_Pk$JPO`lx9~QU)ts57+iu-dAvHW}Na3NU9qBq4%lT@U9}N5Mc$sb=ay2hHw7( zF!TJ4R{K4_%*L`YIb~ij{q4V!o!8#EZ{QK=;cbiFkL0{MF!XaTGMc|_mxTV z);g7jziMdUUfiGbSTpfqmq&>A+1=%g1;D+lmF{Rek&j>5soe+M!iMrY%ZmBbMXQs; zuWO+pGaKuZ(eE1Q@yxlQ$4Mvf)~>yKFT{?gLR_jj$AumO-KdgrLkp(sLxQPnTh z!|8Ly`m=dq4Z^YfuD*|_?Y#{?&@#!NPmk{YVQuf3G2~ra^~%WPVdZ^3x!!8aDboyM zs8PWwOL($@bQW~+9t#9aN5EJm8Mdy8+cK-Ij4`rtW9{SnMV~Gz(L0LuBA$@CVBx$c zMRPLl(39Fcr&@Te)t=!`epa!MDh^t{B5Z-xp5C89V01^>db7!~lkkd_#glXvJKov) zS-ENE0kgR5Hiy#>69u~L@Y)Tvj=T58e-R`Odu0@GO!7JyPEv+jRWR zKTuIy-WT#ypU*t^>EfBc34z9rakM(q>qVSsh<-Dqfej-YU*$ z`P6q>?x)(*tK0*cQNl)bC};d;yXfr<;Ta?}iP zD)C0_g5=mAUYwNE-dSN5&!=>}^`eQsaO*l!(%pW3Z2i?v)#l&&a7v(GIp zb}VhP>nS#(ohv0BdF%Eqsy$b5yVs>L!}O+w4P~C#pSRwX|HCqin=q7QS^|5JNs9C5 zN_{PM_c}3IiM4#nsEQfrn_gB7d!J3z1*~l<5OhUYC_FsPwue8jw5voJa@szd{0Z`4 zPI}^({BqQo{L{N4qlR}~GFzT;+9rEM4er0#`+3l`^Il zU0q$DK7IQ9`Ez%7cTZ2xmoHy>dwair{i@gN`~GjzzW-m;SEtUe%E10l)MqkzsZ=4S zYO*X%*<5|GT{W0}ZoadroMIk8Orw{B7jii8efb92^kPOD)0mYR?dr9xJYl%an(;`T z?J|lhx0h?f9?wktS8qB~v)BfmXF6-?)eTBbx*NOBECf$$c8^UMb8~X-j#d!#FdE}( zaPPAlzP>DrKl(I7dtLmO&c;V)12b6+eCWcT@5{J6l)RL_x!E9>dEwgPO$uN-d5I%# zqW9W&XZLUI&l6EeE@A0gYQB`20Hd1@4p`*vny;3g&4CgdS0;vl{=T5#2Jems8n8uhWRNlXds*^H}R2E?ROzX82tkD+; 
(binary patch data omitted)

diff --git a/previews/PR2365/assets/documenter.js b/previews/PR2365/assets/documenter.js
new file mode 100644
index 0000000000..6adfbbbf4b
--- /dev/null
+++ b/previews/PR2365/assets/documenter.js
@@ -0,0 +1,331 @@
+// Generated by Documenter.jl
+requirejs.config({
+  paths: {
+    'highlight-julia': 'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.5.1/languages/julia.min',
+    'headroom': 'https://cdnjs.cloudflare.com/ajax/libs/headroom/0.12.0/headroom.min',
+    'jqueryui': 'https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min',
+    'katex-auto-render': 'https://cdnjs.cloudflare.com/ajax/libs/KaTeX/0.13.24/contrib/auto-render.min',
+    'jquery': 'https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min',
+    'headroom-jquery': 'https://cdnjs.cloudflare.com/ajax/libs/headroom/0.12.0/jQuery.headroom.min',
+    'katex': 'https://cdnjs.cloudflare.com/ajax/libs/KaTeX/0.13.24/katex.min',
+    'highlight': 'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.5.1/highlight.min',
+    'highlight-julia-repl':
'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.5.1/languages/julia-repl.min',
+  },
+  shim: {
+    "highlight-julia": {
+      "deps": [
+        "highlight"
+      ]
+    },
+    "katex-auto-render": {
+      "deps": [
+        "katex"
+      ]
+    },
+    "headroom-jquery": {
+      "deps": [
+        "jquery",
+        "headroom"
+      ]
+    },
+    "highlight-julia-repl": {
+      "deps": [
+        "highlight"
+      ]
+    }
+  }
+});
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery', 'katex', 'katex-auto-render'], function($, katex, renderMathInElement) {
+$(document).ready(function() {
+  renderMathInElement(
+    document.body,
+    {
+      "delimiters": [
+        {
+          "left": "$",
+          "right": "$",
+          "display": false
+        },
+        {
+          "left": "$$",
+          "right": "$$",
+          "display": true
+        },
+        {
+          "left": "\\[",
+          "right": "\\]",
+          "display": true
+        }
+      ]
+    }
+
+  );
+})
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery', 'highlight', 'highlight-julia', 'highlight-julia-repl'], function($) {
+$(document).ready(function() {
+  hljs.highlightAll();
+})
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require([], function() {
+function addCopyButtonCallbacks() {
+  for (const el of document.getElementsByTagName("pre")) {
+    const button = document.createElement("button");
+    button.classList.add("copy-button", "fas", "fa-copy");
+    el.appendChild(button);
+
+    const success = function () {
+      button.classList.add("success", "fa-check");
+      button.classList.remove("fa-copy");
+    };
+
+    const failure = function () {
+      button.classList.add("error", "fa-times");
+      button.classList.remove("fa-copy");
+    };
+
+    button.addEventListener("click", function () {
+      copyToClipboard(el.innerText).then(success, failure);
+
+      setTimeout(function () {
+        button.classList.add("fa-copy");
+        button.classList.remove("success", "fa-check", "fa-times");
+      }, 5000);
+    });
+  }
+}
+
+function copyToClipboard(text) {
+  // clipboard API is only available in secure contexts
+  if (window.navigator && window.navigator.clipboard) {
+    return window.navigator.clipboard.writeText(text);
+  } else {
+    return new Promise(function (resolve, reject) {
+      // declared outside the try block so that the finally clause can see it
+      const el = document.createElement("textarea");
+      try {
+        el.textContent = text;
+        el.style.position = "fixed";
+        el.style.opacity = 0;
+        document.body.appendChild(el);
+        el.select();
+        document.execCommand("copy");
+
+        resolve();
+      } catch (err) {
+        reject(err);
+      } finally {
+        document.body.removeChild(el);
+      }
+    });
+  }
+}
+
+if (document.readyState === "loading") {
+  document.addEventListener("DOMContentLoaded", addCopyButtonCallbacks);
+} else {
+  addCopyButtonCallbacks();
+}
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery', 'headroom', 'headroom-jquery'], function($, Headroom) {
+
+// Manages the top navigation bar (hides it when the user starts scrolling down on the
+// mobile).
+window.Headroom = Headroom; // work around buggy module loading?
+$(document).ready(function() {
+  $('#documenter .docs-navbar').headroom({
+    "tolerance": {"up": 10, "down": 10},
+  });
+})
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery'], function($) {
+
+// Modal settings dialog
+$(document).ready(function() {
+  var settings = $('#documenter-settings');
+  $('#documenter-settings-button').click(function(){
+    settings.toggleClass('is-active');
+  });
+  // Close the dialog if X is clicked
+  $('#documenter-settings button.delete').click(function(){
+    settings.removeClass('is-active');
+  });
+  // Close dialog if ESC is pressed
+  $(document).keyup(function(e) {
+    if (e.keyCode == 27) settings.removeClass('is-active');
+  });
+});
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery'], function($) {
+
+// Manages the showing and hiding of the sidebar.
+$(document).ready(function() {
+  var sidebar = $("#documenter > .docs-sidebar");
+  var sidebar_button = $("#documenter-sidebar-button");
+  sidebar_button.click(function(ev) {
+    ev.preventDefault();
+    sidebar.toggleClass('visible');
+    if (sidebar.hasClass('visible')) {
+      // Makes sure that the current menu item is visible in the sidebar.
+      $("#documenter .docs-menu a.is-active").focus();
+    }
+  });
+  $("#documenter > .docs-main").bind('click', function(ev) {
+    if ($(ev.target).is(sidebar_button)) {
+      return;
+    }
+    if (sidebar.hasClass('visible')) {
+      sidebar.removeClass('visible');
+    }
+  });
+})
+
+// Resizes the package name / sitename in the sidebar if it is too wide.
+// Inspired by: https://github.com/davatron5000/FitText.js
+$(document).ready(function() {
+  var e = $("#documenter .docs-autofit");
+  function resize() {
+    var L = parseInt(e.css('max-width'), 10);
+    var L0 = e.width();
+    if(L0 > L) {
+      var h0 = parseInt(e.css('font-size'), 10);
+      e.css('font-size', L * h0 / L0);
+      // TODO: make sure it survives resizes?
+    }
+  }
+  // call once and then register events
+  resize();
+  $(window).resize(resize);
+  $(window).on('orientationchange', resize);
+});
+
+// Scroll the navigation bar to the currently selected menu item
+$(document).ready(function() {
+  var sidebar = $("#documenter .docs-menu").get(0);
+  var active = $("#documenter .docs-menu .is-active").get(0);
+  if(typeof active !== 'undefined') {
+    sidebar.scrollTop = active.offsetTop - sidebar.offsetTop - 15;
+  }
+})
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery'], function($) {
+
+function set_theme(theme) {
+  var active = null;
+  var disabled = [];
+  for (var i = 0; i < document.styleSheets.length; i++) {
+    var ss = document.styleSheets[i];
+    var themename = ss.ownerNode.getAttribute("data-theme-name");
+    if(themename === null) continue; // ignore non-theme stylesheets
+    // Find the active theme
+    if(themename === theme) active = ss;
+    else disabled.push(ss);
+  }
+  if(active !== null) {
+    active.disabled = false;
+    if(active.ownerNode.getAttribute("data-theme-primary") === null) {
+      document.getElementsByTagName('html')[0].className = "theme--" + theme;
+    } else {
+      document.getElementsByTagName('html')[0].className = "";
+    }
+    disabled.forEach(function(ss){
+      ss.disabled = true;
+    });
+  }
+
+  // Store the theme in localStorage
+  if(typeof(window.localStorage) !== "undefined") {
+    window.localStorage.setItem("documenter-theme", theme);
+  } else {
+    console.error("Browser does not support window.localStorage");
+  }
+}
+
+// Theme picker setup
+$(document).ready(function() {
+  // onchange callback
+  $('#documenter-themepicker').change(function themepick_callback(ev){
+    var themename = $('#documenter-themepicker option:selected').attr('value');
+    set_theme(themename);
+  });
+
+  // Make sure that the themepicker displays the correct theme when the theme is retrieved
+  // from localStorage
+  if(typeof(window.localStorage) !== "undefined") {
+    var theme =
window.localStorage.getItem("documenter-theme");
+    if(theme !== null) {
+      $('#documenter-themepicker option').each(function(i,e) {
+        e.selected = (e.value === theme);
+      })
+    } else {
+      $('#documenter-themepicker option').each(function(i,e) {
+        e.selected = $("html").hasClass(`theme--${e.value}`);
+      })
+    }
+  }
+})
+
+})
+////////////////////////////////////////////////////////////////////////////////
+require(['jquery'], function($) {
+
+// update the version selector with info from the siteinfo.js and ../versions.js files
+$(document).ready(function() {
+  // If the version selector is disabled with DOCUMENTER_VERSION_SELECTOR_DISABLED in the
+  // siteinfo.js file, we just return immediately and not display the version selector.
+  if (typeof DOCUMENTER_VERSION_SELECTOR_DISABLED === 'boolean' && DOCUMENTER_VERSION_SELECTOR_DISABLED) {
+    return;
+  }
+
+  var version_selector = $("#documenter .docs-version-selector");
+  var version_selector_select = $("#documenter .docs-version-selector select");
+
+  version_selector_select.change(function(x) {
+    var target_href = version_selector_select.children("option:selected").get(0).value;
+    window.location.href = target_href;
+  });
+
+  // add the current version to the selector based on siteinfo.js, but only if the selector is empty
+  if (typeof DOCUMENTER_CURRENT_VERSION !== 'undefined' && $('#version-selector > option').length == 0) {
+    var option = $("<option value='#' selected='selected'>" + DOCUMENTER_CURRENT_VERSION + "</option>");
+    version_selector_select.append(option);
+  }
+
+  if (typeof DOC_VERSIONS !== 'undefined') {
+    var existing_versions = version_selector_select.children("option");
+    var existing_versions_texts = existing_versions.map(function(i,x){return x.text});
+    DOC_VERSIONS.forEach(function(each) {
+      var version_url = documenterBaseURL + "/../" + each;
+      var existing_id = $.inArray(each, existing_versions_texts);
+      // if not already in the version selector, add it as a new option,
+      // otherwise update the old option with the URL and enable it
+      if (existing_id == -1) {
+        var option = $("<option value='" + version_url + "'>" + each + "</option>");
+        version_selector_select.append(option);
+      } else {
+        var option = existing_versions[existing_id];
+        option.value = version_url;
+        option.disabled = false;
+      }
+    });
+  }
+
+  // only show the version selector if the selector has been populated
+  if (version_selector_select.children("option").length > 0) {
+    version_selector.toggleClass("visible");
+  }
+})
+
+})
diff --git a/previews/PR2365/assets/flux.css b/previews/PR2365/assets/flux.css
new file mode 100644
index 0000000000..c3817e76d6
--- /dev/null
+++ b/previews/PR2365/assets/flux.css
@@ -0,0 +1,111 @@
+@import url('https://fonts.googleapis.com/css?family=Lato:400,400i');
+
+body {
+  font-family: Lato, "Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif;
+}
+
+nav.toc {
+  padding-top: 0;
+  background: rgb(240, 240, 240);
+  line-height: 2em;
+  cursor: default;
+  user-select: none;
+}
+
+h1+h2 {
+  margin-top: 0;
+}
+
+/* Green banner in ToC */
+nav.toc > h1 {
+  margin-top: 0;
+  padding-top: 0.4em;
+  padding-bottom: 0.5em;
+  border-bottom: 5px solid white;
+  box-shadow: 0px -2px 5px rgb(60,60,60);
+  margin-bottom: 0.5em;
+  background: rgb(60, 150, 60);
+
+  font-style: italic;
+  font-weight: normal;
+  font-size: 50pt;
+  text-transform: lowercase;
+  text-shadow: 2px 2px 5px rgba(0,0,0,0.2);
+  color: white;
+}
+
+/* Reduce ToC font size */
+.toctext {
+  font-size: 10pt;
+}
+
+/* Fade out non-clickable ToC headers */
+nav.toc ul span.toctext {
+  color: rgb(180, 180, 180);
+}
+
+nav.toc ul .toctext {
+  color: rgb(100, 100, 100);
+}
+
+nav.toc ul a.toctext:hover {
+  color: inherit;
+  background: rgb(220, 220, 220);
+  cursor: default;
+}
+
+nav.toc li.current > .toctext {
+  background: linear-gradient(90deg, rgb(245,245,245) 0%, white 90%);
+  font-weight: normal;
+}
+
+nav.toc ul.internal li.toplevel {
+  font-weight: normal;
+}
+
+/* Content */
+
+article { max-width: none; }
+
+article > p, article > ul {
+  max-width: 45em;
+}
+
+/* Links */
+a, a:visited { color: rgb(0, 120, 0); }
+article p a { border-bottom: 1px
solid rgb(200, 230, 200); }
+a:hover, a:visited:hover { color: rgb(0, 80, 0); }
+
+/* Article Links */
+article p a { border-bottom: 1px solid rgb(200, 230, 200); }
+article p a:hover, article a:visited:hover { color: rgb(0, 120, 0); }
+article p a:hover { border-bottom: 1px solid rgb(150, 200, 150); }
+
+/* Docstrings */
+article section.docstring {
+  padding: 0.5em 0;
+  border-left: none;
+  border-right: none;
+  border-bottom: none;
+}
+
+/* Code */
+
+article pre, article p > code {
+  background: rgb(245, 250, 245);
+}
+
+article pre {
+  border: none;
+  max-width: none;
+  padding: 1em;
+  border-radius: 10px 0px 0px 10px;
+}
+
+.hljs-comment {
+  font-style: italic;
+}
+
+.hljs-number {
+  color: rgb(0, 150, 150);
+}
diff --git a/previews/PR2365/assets/logo-dark.png b/previews/PR2365/assets/logo-dark.png
new file mode 100644
index 0000000000000000000000000000000000000000..e10669f892b093de6dcbad99782d92ea3d936285
GIT binary patch
literal 159356
(binary patch data omitted)
z!lDcbS8W`e@b6pdw4GO!n+?CcIKCxB7j+5bFG@jXY|1Ey4)gzWCde>jr9ZN*yH6S>y&6>%ImQ)F!owhGNb4)*>Xp|U!cy1m6q^I+nt3&VXJP4I=XuMD0q83};R68! z@Sq|-E0$uEPWqe{lK6v|iY&+$P(|ZgF7FOG&(oTlrE2OMDvVOGz0`=)*S3$S(@hUL zgFm=SD-{8%@(W%OF!OkSt&sW!S&R5RUioVKH-VA8QI1LmYeH+W`1{MRJ_GvG64e^* z-#l+$OMe3Ml{!*Ch-HTKQy8cYJitxsfI%6f&4YBxenC$#Die*{XzYKHU+lj+wdQoz z=whkH9j0d#7$kDZxuZE=dAA!}ur}jvr*x{;i zDLAMvHgKcn3~0Jte|*sp_3c_E8mN>N2$TH*5}(hqfh5{K6$Y?n6$Vj&C<*9l93ign zB6`F4xBu9>4Q!%eHysDZ?f-F>CUq|~7lO9*G7~Tgm^M6~YdbT0OOj2F!p5iUk@nh{ z$SX-nbT18xnGfpP;__4QV68wD z_OAbIGbpwHk6!y9zfCXOL4hox%Q{Gz`^*$AAwX=VaQBe>-{hhlNK)IV6gbkoaUDKI z6r4+yZ4Ct7Kt3anfkI>5uC5tLm+)NsG!oG&P2O!tu5We)#rhzLyK=EVp5LPV(enoi zU`xql0CaS;sv~FZVv^0{3w^Ui{8Vu_-6%QnmGDUER8e*FFDkRne%b&|>T6kWa%g-?o4``WCM)6T~-O z1CiGmjV`l#B|s*EdVULsSW_9^U17QPofRkON7u zatioKYm;a9_)@Ptc-HB3_|tZRO8o-9B;>>(Mtxn@{gw zfCCt@*&9X-%e(X&=arsIv+gDfndj5~QF3Y(LdsG8a$`{o`Y{W{sb3;glSo720@H%W z$A5iG@E!kVDkgJxoos;!_WsG6nwspmZ&Q>CjF&L)DrTxT$e6#+L~~( zo32!&&Hl$wGVykuJmcMRILPkO1($I9HR>;}EO=MHf3M9#8isFL67$`O8~wpu7ii4A z=TUl9oM?DY$YYHt5%k4``^rf3# z$#|MS7=vE&nCAK|G!)e)@K^3>H7>v}#*<|eYH4lVrcc~mu_&9*220renf@tMRK(ID zH%XrX`?mAcgcrCr?Ze$gW-8z&;Xu#Q?;m$Di0r>8O?(K9Kkm?-tgoBF{p8ofWCK`% ztD}8_+6QSJqhM*n8hhxuuBY@2esIqW$Y8A49;EW^JHpN$1#ML3tIrT>Mbr# z<#Mmo#C0q;V(;m{k>O((+O5(YA7szYX1P%q1sP*eAMyP>AL$32cB@fH@l%^n9JC=D z5W_bSvv!0}FPg#w1iH`K%Fj}z4JXvn!=vwYGR;7(&bN~L~8XZ&wh@OnfW)=+!lHNBfWT_W7b9F z`}A~wMrG<#enJ#B7Gq3D8*ql;u{S|y#8E?1S_bXJpo!1w@9`Sm^|iI;iXFJI0XNJq zq-nr*Eq?0PeV%w17fC9~lKU9ohP7s|(O^ri=jadnC$G4fIiT4`8gx||LyW;^^YRgpjp58R zn-1QEt#Y=e{6x3R{59MSndzq84<9J@9n$2S935}2|NbqsU)Yd(myZx7fh)@xBMf{K z&Wjw^A11J)n|#Auq-wcuM6a)|tc))OX-1UDN_pNN*}AVLa-FaRTs=;wzw^xar6;CT zsW;mY<;$Rz?>`kIWj67i{2nFI1HBfC^zc55rged=t!@6`Z6O8KdudoOOs(9`M(C5X z;Jdi^cRopPF57Qf-+bav{)dj0H}mk&l?x5AGJmA)Fg~m$%TQrB>?M=mV%%)r$~JWb z{$1;@cVEX|!Xyck%en?M5VjZk=2krL*jGT9t8@7X|5P1#aBpAE_@nS`uB%CEVEDD9 zwcT#<*kw}EC^yDn`a$;9&_F@VJt3P1PEp}XHdbzh4A zrjkz0gll(ldU`r;>7Z7R`-Xu~;qi52;lhj_985KQzku=T;yc+=d9r)#*^&OcUf$kG 
zgF?cc$@Mhl+|>qZyh`Cf$W;uxZamVpZswa+p)! zi%5{D!4u=z$*G?uk;hqUgeUaKTQpa42Yv-OgT$Xh;u&79txN?FomQJEqvv}}Xf%H( zdvIJOry#HB%i$U=jy8rFl|cXAq}Y@!z(0;$7j>OaO%Z)wj0o&c zKD*JVi4CHZIK~=0orhC~G~#){vaFv+N;pg0XyHmyIg1c3$PD+x1sfId>NCLB8Khos zn@)GTs_8Tns=6)Ht+)~t{oP`V3@g9S+X@JSUqrtSzFWUID73CFlVpx>;NZ_>Qqb|; ziB|JHFO6T{I;Kuxah<@!lCo|!kN!Q23aQ?eGskjbCJvLGN>|Emeo6|^n+p>rp=5oF zl-HN_=FMFJ*{rX=fOOW|)iLw19ai`PgS2OdhLfN~?nH9O-YB@_mjmu=^7mG$4p*EJaFG#STLSq;kV zvIm8QyTS`}Sk+>|2f~1rz*94lVrVlC<@Lm?Gms-k%AA!ci>^%mN z^rSOy=*)z0vuVhqTVhD|U}olBit?;cgXsBk!Rl~xP-4t2A|F#jdq4#{YD+hWxGA=Sa> z@hL`U{qeYSwCsf;ZKDW%k}@RUN7xmwtS4b)A%#~N_uU2?7Cq&8uu-!`5)Cqg)xA)F ztG;zz+`}PnBdHS@5lty)TwWee&Zg=ANcCGXCiaONP-$Gr{eeR9bjpvO^ip}|>?^!nLJSVw+R}JN; zMjT%BOeevCno50YDqS5(carMGtcs#p-hQrV&Z!t9lKwKVomx(2Wez9L1mzO~M=B%_ z(m0p`(Q0rPk&aWA50{V-mO$2@)WPzj}1Dh=Bg#?U!ry9#9>couLrZFn1(5Y2rhQc3}1`5AdmO$yP0flF&2|x#s znE1h6SUW6Ch6L`n&?m0+MN2bc81SPZb@%9G{6XL66>$Ptm z|2&nHY@h5`zybS!qYT)qz~thxxMn_i%bT~AB=d4IK{B^!n%L4FY#Z(Je3xg8c>yHZ zaS9nekbo$=w?}`QoZ3TEQ3KH1RRbLyD4`zG=3=-n8 zOnUFQfIo*I)oAt2xldWpDdSe@_Kf$94E4)R~PMvcerX{K4rO+Z=07`--eTP?s;~R zZtfxvz~x%)a@2Pau=JA$b#g&hdXsnbSy?%F$+m@rSe;E@Ia_j-nb+FeTA)MDB-tDo zIs{$o9VG6^g+zu!^YfO<0!e-h8IPgz`HXC|-rar|h%H(16Uz?-9uvzu_j`}7r%T(9 z|0kdz%P0<3_5Nhf9fx(?B9n%_oJ2&xADS%!qQPPP1*@7x~0PiejVwg3aM6;E$ijp!C5qD$-P|_mzeI+I3cP056b~MW+ zO^#_u@DVYwCMafR>KmHt-@3V&BuaQgx}g(LWP7@Pe%Yw1De7-=oSZ?jUQ#;Eu5uX5 zSQm9M`#9q*AOvMq$i&3-wDeDFLUUiO!2S&3zh1 zI-8aL2^I$0x{e^S zB}NHCmK?h_!^=a+Dx32&<<^0R)<7|9O8mTiZUJ0m6?1s+_vz`Wns+9voy;43Y?0?P z0ZZNo7l&=|r&-V4{r&B3EuN~ts^6rmj2|mPd0`Qma1mnggn{#0F9D`c1Yx-S^{F8n zH^@gTtdu{2)es_~Z>W(-9wz%8gjU78m1BR^**>>P>+kR1_;C8t>7tDv(XGrt78j+; zz|nVi(L}Il$4Q?C3IZv1r6!ZGs}1tn+PVrRZdpYo`HSd!I!acM{0W3tiA#&~)oSNK z8gAwS(!?_}VqDPtb0Zf$IK)9MDbBvs#?~uU#`B60Cx{MVia7W5ls?^gFWqo|2pqT- zXfSyW>X+O%X1_|%rjFzNo_n_Iwq3Vljo2~(heV3qgLDbeg#Oo` z4U;i4YO?DN4>>J}vXAOG2BScNNWb&A#pwLhKU#pjdYRkDFDr^NhGJUOMY;62V?lAw zH>|8z=jgncTyBuTenn6%+|2NWQC-3YAK6SR31FhDH(hqOtf-{~nWS>| 
zCkM6d-Q4wCZ~rXb`k$<(%w*~34Yx_kBP{FF(-%uj8kF|8SK9_8U9&_0XwOtq!SZ9| zLy`sfjaS!%Y0{C#=pDXWWr?pdAL_Dq)<+mr*(r{=U1U+poMENY(Z(Wv@41zeB=iJz~A#_=JXFm|Hkr&M%;xFklA0z)1INq**bjig2`e`I_b@NK z-&l&=ExH_p{Z{cd?R*=XU?Tb84DDs3_B<+-1pLYGDcZQ^=3|GL&^QJA8+Uzfe=Zo( z(QK*KNGR5y9i!oZ!VK7NqjG_X=n~@Gx?FYq;4om%Wc{UMl~7xMEG|SF1yI)Vd7H}h zrN?t6Mj|N>*Aq0Or;t1B(k=(yA2h{baBg<=rE2UjV_Nhldp&Plr$B%y2mv&BkEyk} zKt7r^VSjBO0X43K&+e9^P^h?p%}5Y5a}6KZkpkQRFvU&X!E;$3URi&(y6NSo1D1?8 z@8I1l(&W$OvbK>WzuG@@qx88wz)|kiR%CGKe0$UD#J$tVrDG#U-D$$-~DxzT^RY8`hT({lFJhF17=AkqCi{#IGAI?;0mcV@eqvR_<6Z z6P6s8+Yd&f8L{g<){GfqV_Oocil)h8a_*b1rOa+$9v&IgRW@;pjnj)e&T+zn$iQ&u zyt4K;nUWf7prcuJ8^j2O#>N3?06ob6;m@Xz0O0iW@N7uWFg`dF|&XzcD zdDI2&%XM7-{hO9E8TXomM)u%|fx-9;$C0faW_Lwg242qa2_9(fTGD0jp|xNp;vU-F zgM-fBl}@Wk^Y^qLdvmiA!vO+B7VnMX(@(hz}d zDTh9Wt!nuAe*R#(e_}AcHYWC>=g5PzRRk*)x2#mu&DD(@+3cbqIhpNk z=C0%3D9oJ0#hs47H`}BhnpoOHa{g~3%0J$5InvQcR^c{=zHJjLe)tGk@hKI?y>tjI@;PM zNxyt7gTP+2)>Z12Hmd1O%apcm^B?`cy4VdvY(Ji+;TXEXPc}m-{*N{PIXyewu-Wu^ zqG#Ra1{DpGPcd7+OA%pb6Xj0k~W zVZ0{I+|Eb2?UCsg}~IP>8eqhW@?l z37>c~#DY1|d~2TAGJ6_$ps3^-#D$~P@FW62^o#)`5ob(>7GqmLIu~bGc1fM0?uc&x zISvy(gmh%k?J&tXGS~Ad-a{{(xByA8_bP8+G%IiwRn_jFNq0nrr5Kd*KQa%+W=mg% z`!zZZ6{xL$Z^#ZHaNEXfxp|?yH++!dG1ci*WXsI-lQ|c#_9Bl#MZ}q zzpX%ne*KBz)5*JuU~B5-4%kjvw-CogoJ;=GbGMQQ%e~3N66slb;jz7h@4WNDUT0(b z{fO9Bkk3IQD5Amz;r*|KeX7AYrt4Hs-CA-YrNVdpcS4lP*ku9_Ay~pqTT#aw{^E}_*KVI(F zO)u~3Jv`_LW%A!;FB7Q}UjF{Get>h_H@^iI>?b_9`^9;) z8!hp)*4T}OENEdX(`qH-_^$wam)}ILl;9V#{mKuuQa!gTD_^H~E_M%+J}t`gF7MU3 z(S>T$u{368Uo!4Zv0x{;`IVyzinwhPx1W?>ePrn1&pWWe(_3Ng@_r=@k!Z6W@o%E! 
z1^D4isr&xD_RhkHX-$8O{DRb-@4-=K)O#FvlW7{pp{H#GDF`ps8Yj!BQIMf;z9*WI z_QY@k)XxihiY-~sB^oQMoDN(792}3euPR;dN>?AdUdInyB51+*OWJ3V%mcs~K$!_W zp9L5rhF&zbt<@(_)bYLJ`#V0a$QMp(QWHKHl01KYF`|^;)LaS5t!=f5k`w-@qe$NK zjy!14xSyE4sVr6cc43N6X_?~bZo$U-C$^YICDY4I5Hd6Y$|4HpKymml{2{<)e;LX`zUyvKF?0-Wz2F_0~> zytoau)LzRK{SP;AA7A$?R-3UE!M{DQg5&^mOicKjW>Z%zFF^Yl_iN88Z>YPtj~gVs zy1vXeN3Qs9QR+W_G*{dAxb!vjo+Ia{`Y%gV;%?5mr%CK?b%|9j>M}!0rz`&k-~WK* zdtYUF^YqNlgRFY|NQLH`R89FQRxMWP<>@*coSe$>EGt;~@FuCh33<2^x`*<@1%+1t z(_gR@R7PB`KeN7y2wNeaf(J>b)Wby)eFm@IMyIe`hu#{A~Q4{Gy6%H?Q}TZT+L&9k?LnwP~3Y7KV60z_>bmhwU{e`rvE28gsGZrZfH@3^n5Ky*0`&M_Tzf>p+t>^ zwZ%|*`lGI3XC&ygP$&man>BPa+$5mEo`Wdn-wVUDQ`N0q-xBn@jdgAAaLVLDY zck5TVFA2>`kGRREpWfj6@syT8Tg^a=)1~E_nyzL>F;zacZDvVca#4sicY9UJvPorh zm%!_1TK;MOYZ${hEM6!?l2=d!L8ak)@0pkJF9mw7y2-X|uHM~}A6gr5_GVFuz@o?x zz+uHDDaBWm7=LoLH@sI0(wuD%fV9xDa3zy-Q3d=3MQa<;J9+!`H~e)_fMoA90HlrY z^M;WkxRNJS!KD-=v67c?2*Trrn1DQ%(C8a-yRr|j5L8_8d zj|BtzHoX#AXXH}mpu_h)vL)8X@PU%{RH(L(r?CZwb>#Sk8_QZJrq6p6^4;L zCTG2_e@8g(7SB-!2G^rec~>|w5_-Nfy+Cfm@Sa+{yJncSx$BY1s+hjzoL7N2eAxC| z!8d=F%~!Y(I$vFluzZa%$@Os&J})ijaHgz&%w?tS?(mC0gA!5DRyGsMd*j33KDcM#i)V;XFkD;=f-MGj4hr@=J^aE zE(a5hQvEA~3(KzL!#Uc70KDW9LSWn~{xQE{<^8&+wnnH^ta*3&3??bYi;ai-6V$&P z8wQq#Ds&pYJciJdMwDMACnrw>I6M{i)I35D zI>Lit0{lF_3?r3Q%hc;~D!;F6rRudM&Gx>&GEkLos4WqmU#W%I>3u=>$rVO?{~X(X zQJYlyld1C>kQ=QxayF(`VRNHkGitFGT2QrNxfS{5?^VmX0&c> z64yVIiga>@$X-;5y3Ap=OP7wSD&Ea35!W;cI;LH;$J)I-;ad)sN#@;S*VwSxhZf1m zrNSYz_cFI3t=p)ALtP~R9ou!ERCCRG`UyjH=|LGhP|MqEGhr7KwX~~VzBPD#Gi+|*Zhd}ft z4~C--8C^)e>q{u!KD3XHu$4;Tnz^^h#1!tD@&!GDLqGl>~+1UHTk#FBZ)--*-f4~ zcLrMx(834ZiW8S#{W(fi_XT8-gaL@n)i$4Oc7=#eRG0Sf&sQTQ*+6%9 z(LdmcpZ_<8uy4_y=Fq_kIR3)i!h)|FRCDXehor*YbFFue<>Vq4xFA=~wO<2F_iox>D=ek_1zF7E>@2O3|j{&9C&u-QI_hPPoBm|1Ku6v$7q(TH`%U z&tCV1>9u!$)3e9K*T3Xeah^YT=n^WZLK$n~$f=bZ20h_?uiD47XB`ng8u;saC||Vp zEypL_+L}vqPck^u(=a^j88V=$#fftwi!v-;A1;=Oe=zXC>U+-HqQ{dJj_z(XzJaq#g{d)RiOu`4n&XR6~RMs1pggKBs2aNr4VEepDF?9_-&|3k!Y z*0OQwWoHUl3w)8?TjPLeDGAgZJv1Kle{5$KcsFkCNxVW8KzbzJ-UjCJjpUIx?&spd 
z)J{qoDk_D|49l9}r*&h+;Nmj&P^FW1A%JHS~`u<-V<|1f1}tn!x^zP;1aF)@H=^5(J${f;>pc~k>nxVDL& zALLJY7I*pXZ|$7RH%@gUN&)XxX^aV~QPauJ>#`j5Rad*-)nznUixipZsBrY}FCLZ; zp>MS|1zj(EVq7{SuVd|WcRG}KD=Wqi>Awg+gg@ytIyTBgIzkdxC6|k?mfVN`$&78S z!kkI`^0KnY=KoTDJL3WUrbj}u^TCfGHLLxKOoJCTrlkqcwDUF*jo;V0PokXx8yd|~ z=W>Pn^Ottj)coGz8%)*7veU}71h}aD*Ew?2F*K=JS{$PzSj&9j7iQiw>sbSL!K7?=iE)T6#20e0(G?m0ENQs^K18crH0G%cl z+b`eAliHSz+$SAL0E#@q2N;GxxlLfLHto*_0_8DRnwyJUT%<5Yk<5bR1bibY2*wO^nX6FcYZ zJXvMbmY+a?*o&SbM^Hkqp!lH9& zc~$Tu`aCWrZewv7(oJ>*Enp*YV1(^_YuLNKz#h86mdZh=KiYI=C9umn_H@c?u*YD& zgl_f!2Z2I>{>$vYRWrL6&7(9qdSz?Scc0QERB|WBt$sU_Go`3v;HwDV1=y`uH&E~o z)IF3l^|I8?*9oY0>B#57mEA9Um%psC;Jw}(dU*?*_DdVxMu|rl#^_b$jN;s>Iu~S; zfwaD%tE&9GzPMyPWR$JC!3AphQ1=8_Dd zK{{x+;!#9d^i})u{CvK2Tf67eWx7XkQ69l(p#DeRV!V@ikzt|f?9{O_QO=$GPIl(H z`y(^MY({v&N^7k4cMG!Ed80p>MRsWGu%Ei*({AHn68wa!c-u|GdEc{GXnSOXV^gHolWx2q z{vcs-&Tvb}K=C7B_IezVq9qa5rAPb|PeRws8s)D=;+~bAJ(!<`A6YMU<>S}zfr(QS z2&>ZC%IYIw9^K+C?EXuS$f2Umr*i0Q5@VVgvuvB1eRmh64)*#EEDDx)dPFw&<=v{U zDsm|)9=^nWR9np$FNKS$Wc?uZypM@_asfdwIR4>5BxJP72>oxqmxwYO8G#MlyuK55cl^q$%Qzc?7R;ciUmE&}3sBhRTHm&$6u6>-}xcNv_zRC#9F0 z(Zdse)oGz(en0>f`i#bJP8f7aTbuj3htcyjVE%_^4jBE91q%Stt}unZ=tdts^@0(~ z8?CjqwU>L8gOb)=U|Oi<=H}KJPP;=9Qa7``y)EZ-tY$)laKN#St_kLPItbLA*MEiy z0iszvp6WYPMgjlgrdE|xZfci|V#=WDqezX+6d&7UE_7+fFP_=&6@lN}B}j^(Gl(Qb z-$w75?1^K@?rNg^Lk4`mf9}~8C$_>4gxE~Nz_W1r8J(Em{UCar6U{#ySu?9~g>lnz z(QpkXm06q>|Bvipbt4>U`zmMpwH0oNHcp8bC$ObXwiPb(B@9qGSBV0P7g*%pBKS;) zP*MmNCrY;cDDCWbwrlKbXxsk%S7g1Ge9{_gR_XS35rAI)Uwa}fb%Fh?#gU2WQw3;3 zQj#q#z43PZXSBW5jD&}=d*{Ur>^MU()lFE@Xg_kQ>99pWOv6om@{znJ*6)ap3_%0F z1>Qv#3fg+VD?~P5^ODwAe{?1XS3M_M=D>Ih=f>g6_}M!xV@?~PSER|0aV8}voAd&A zhnGf>8?9mi@NW|c%X2q9$W8}OwMCy^IymbB#ZNIz_q@6ndBZ;lkOiT)naw{=K^tuL zO~j8M2X_7I3!I^Auld0HD$_b6mktu3tnw>J_ACFw)#XJ%%`+H~eg z#U3}GcKi5|5to+8q0Lo!qEC8VpvN*;^n9Z2LSbUtc83QU5Q+d>xhvAvvAOJeYDUOD z0@ni%xhY&vC^xs~igU{@3am*LM6FfJrVMo*)${ke+0(7C*C)r$&H0-j_2;^fz+c}B zIL|XAtRG9W zoqTM{_cDTgWvO2rx{@>6Y$rYBS5cXd+EYra1U}f_Q!n(ndPhGZ)WLp2-6Mz60h? 
zw@C$zGT`cyLa&H33wQ0FbWhu^ay&??z}VcLBb0-;Yh3QQztIgV8mrXTZ&ekQlm-+o zDgX6cKuSi5a)n+P5TXPxaupODIq;4#yQZVKSDEo6zpCfh zxcEeaiXH>Wg;vjw->`dYu2BjN;vaJj7A<4}-- zp2zj9bIXvuZaB9seG0|XI)c4@1-R~;S-FXCwP!VeBZBuY%<|T6%+d)o$B%*V-1Hkw zM=Ij0X`Sd#R}-bVC_KG^Rb5m?SZ|Y$X^j_Cf)`Q9dYN`luoxUaB49pz&j3Ed7~Iz5wW@ZAF#y?-$>T_q-TCr-_VA&rrTyf8)kV={zo%wK&@{WfM$bl(OF&+F)sk zXXD5IR!Qg~;)T`h5MWt44 zq3T5ixe|7au8Vr~S1Ar;vjP_I=OYLiWAI$#V zCdNEd|Cc|gsYRSMS?X9O56FBP94ma1n4MT&btP&FT-Z%! zS$BZjeNqO`tGn{cgoCh`Jup4OhcX&zzAOe)41GUbeRJZeD}xz>K?VXx%5gePsg4%`x3(up4Kq%_uqcV9}h1_zwF#ui-+SMN>_-Ut*snV zN;ko2+J5uyo%mae1;e37V^ih`Ife1Lh5s+I?}P}~H(El6ypLy2);!kr3%Ib|H{93P z*VW%ImVy*>z!Ssvz@06=hrD!TSpOfStH((wpf0?WE~lEDjuu(&^_{rxi0-Jqw`FDt z)aXxy!X<`J3^u;SP5Hj^_$PUcF04f|g)6frP~Aa)8A#n~!(EIk2W?V+-G9&sTp-dH z_q~L$6;g>PpPJ`IY*zs*-RU4nMrFw*Nv=4i0ja>1D3DfIJ+#7foeuFdU-b)Y5?OL| z3?*>*DveI=O&%x+`6Z~>;3Me`I%GW%L}o(bz=M(KFOAeY+uJh`#yqb=2v1xw1cTf& zW5@mBya}y^`4h36D21i&UWY{?bZsZ%$!`SEJ}dg& zKD%3@tcbnni++a!l~x7)@!gSN>y2zvM1yIo#~*kXGNR*fykh@L`AJ7dmkO5fagKs9 zs&+>m7F?C0N^;L(hFRUM4HmZ!Zb=o{4eI)?es8~J@A76dUJworZ`OPM2Idm!#5BN90&G>jxlHfbg6Jef`lcClKr`u4e)AaJhL6YoHuIy>$ z)O-|9dQN6&*`yph=l?I$Ksrh>)9SEIF zpo4g6{!O6yCiI&?GtkkYmZ)$}F8-MXEs9`3+wXAJww3~9ApIB&re{^_GG6UDOxwkw zX+f##m3v2cv+6A`o`#tfdxmcwoPMkhA*Z(2t*vD{9Tdo%&dv+!QEPgNriD|4&FYts zlwsN$VWtB4cI9N``F~w3zXRM^8_iGhpG@yKXiu73>6dfnSAv%O&lwYU9ddeYaDe{p|5-6(xDQ4uk^0Q~bYK?J0d#lPv4m0^$WgG4_NbGE& z`R@MH43aPau<}sQ#fmcJ>s9}nWxAt+b0R8_f<#-}PsV|Wrc_{bGS)sv!ezRvL~kir zBnR+PoW#;-DZje> z%32w&u-_}Cecd5|zU2-c=J6mCfCOLyWn6n?Vg=BsC7Ttc8LMFhD>J1Lmb2DMs$(%; zM*&3ZhJ6~_dMqC|0Gdt(WFVna@MuUT2w4$MO!1y-_Dc;prcf<-d#0M>=n9sT0WseD zD!}k%SJrc9CKBmhSx1L78s157(!`RRbqdYb{>hyHV}ZZV-#LHIXr6}p-dm;T@3yAg z-s{qb;j)^Hi>jOyo;HvNc~z#i51Ko{p86T$oQVFuHua0q0`naTw>Ou@DwgzqvONL( z5d%yi!G-s5XgopQr+BstM{uT4yS*R&MBp5CoOPTuwFlH%$2>p$h5>F;spjjeyrSwp zJ-88#EaD3<`li^ym%9dYa~o{H+pV{)V)F>PlB4~Tu5;x}2D97j^la>tN z!<}#cP?MMKb0rWz@-l7vR)hZ5^6+(W40ibP5XneV{*dINkxH&qZzV4XeGrcPng11) zuo_hMq2?bapEwe5F~Zn3Y}$F^a4I$|`Q``z;;Geqph5j&rG2LPGG6@{^!u0^VyWg} 
z-({8ZH;)E9DQas+Pynb@N`J0>Za&-^&wQ4WXCT*R=Mr;%@3=@?wNB+NIDR~6yXA{7 zY5tvbzd|%$fsjOeQPQHIoteUkQrpRr9%!LT2dICW_bd(-yGM@HhR5&-py2Mlv~ zO+(row}KZ6A77sM>ry|kgrma*{|UKRrRn1Um+SAT0x>AKdv`n`e&PI542SRGqjuxQ zvLlU_t1!Bq)!mREYC_LhA3>BD7ZGlQ#g%ccpFCsXi>pnF=~6eCLgNm5T!b?Y@=7Er zQo3UHKt3~t>)(6ijguM?m-r$gls|ya#ieoYehi;BlOorQHAn2;;$aDiw%Z&5@32d{ z!h?~K^d@2?3zFq~sVw(bXf<6vLs^+m`@LgMDiN8nW(E&5GeVUn0JMwn!8~}lqA%X! z@rjp#YEOeBax@x$w@*4?`k7qv4BVmWOX|yOH~%!)c09;(Z+O8^8Zd>9^gF}cUb3Kp zxCYbfx4Hbg2=uie{Bk{GCb|R8DVO8LHVn;lsxu?3^qm+((89h(-q#GCf?iQk=Wc0T zW~%tV7c22gzs2X|s$l*)ZBQvi1pM4|lv0$;3POg4JDv5NMN+UyBbyV>i3ihqhss^& zI0mbI58|Lced4!)2Mx9LM<5ycFYMLd#UY@RG&pIf*`wD}3He*!C3^mm(tr3KIuVHr zMXdLaJ7q+4xD`n4m+She#>Hjst7>+Rp?4zpvLva_*f%w24sKu=yO_jx30+jUs&o7L z+Y+k30ZWp_0+`RpCn?U693aCdWIf-@tz`O%Y|GcmZr)^rK`LR@x$Oo;%yXQR!MAk`YXB6%U+Ic(ACes2l(2edlX}F~)4D6VM@~1+9Z=TY>~L zDB4;OO$&iA*@)5Gv#;+R?}JI+8PK&yb7^0uB>$E8-C7r=d>Vg$(&9F&;*<8DDTHLx z-q(0HL3|zqk7`?;g;SvulAObirSHRlb zcdKc3+a~<#(`F}{r1mlVZE*tb3Ch?wg|UFE?yfExFZrV&|0#QPpx38&VL>HqAZws> zq$C*pPa*Z+=tMW+`pgo68P4lj(k#xs8LMA>&w_2vBouBAFKmxy-8~e#Vy@`>?7#rS zBem)j*+$9=Oxj_{@P5gwO!sYPc7J;7{f1IbW?}}Kv!0wdj`nQY-tF%`zHhk(7|e$F`ay{*{;ryHs!qqrQSIeugN&$!n-a+-26bZ!H0XEFYVwN_{f z^V;FeVOg%H1UY+sZW_UmBcaOOU$`HeYFX?gGWU>^!w)nCZhM4UOr|4mrJ0Tlbagb}dTpVH zSDN|dy!Kvu>dIof(~wbqBa7eWJJ_h*eeiaO;`!hh$_uYz=*V07(kX_ga`w=skodX6 z?hra`yCo-Rr1bKMJ_XQn&&rPv8}l>8_%8wRqJyoGIR{oTl`%@a<2TVec}Is zK-nNZti2!ry3ZZ6hi8~x8lB&)E9BJjF$eyv`h#|6qMxBaiE;#IxnpkGF*T8aYRY8) zqqNZ(C=>5EvxNUJu%ta0q=UwcoBmYt!(WwcX|#Vm(NRP^2HH*E+kZ5zA$3h)iM?5V z-aFChhX>NaKlJDQYzYzs5pi$fofy=#akwiXXM zDwJQw=~l;Vb##&J^&{Lf$%PskiNl5P)``%WvS5GVHNchz=s?JCXh6t{w$IK53X4C2 z**Sygr4K*#v<_1s*?=|Y&raO4TH|8S_y+vJ`GRBxLFB0}4CaqR*}#_Sp&YSbu+Kgr zHGeAWj7nA#1gA-#$b8mE>`aMp1l!VH6PvK7X4Mtjdd8adgMh?-5Ze1fYGiCoZi%yYOX?YNMy#1Yc zxa*I67{5y`;;9J;)98|-Gta^>cPf45!FN^q`U&N$hp>?@9?|}!4MTYY>FMiVSqBED zq07_iyR{`2&LPqtAF;fNJBCkWq@xe*#PDn}Og@!M{ss)n+-foo>5)jhudEe?e7nm9 z`!KwSMg>~M-?=MCGgxR(*v~91Na^brm>01CL2B?m50CK)boETmvQkY=^^3we;ztU1 
zNTau{-e7YMIiPW;8tbIyTBT|KD-1Ixq7>Ny-9!31X4BI*C1pH$G^Nvhi%|Z=3nE^d zep8Q=8Grsp6eA{JjQQ2X53*|UbFH0sjs{@DfYcK8F)BJ3I1FvzF0wGeOm)de(XTMu zSJi<5rayC>SMr0ll+;4W*fTR2jb4)RYh6EWFA{r+NxZteq;|R9K}a)#F}86b+E2N@ z5L?$OxU)nS!1g$^tjhb@S1dhkY9l5-D6@qGLpBC0+KR(+e|4v0VpA&q@mj(u`8oLD zcT_$vYqXEq9{*bMB!L?GK-4nkYmYng0yDp)FW46)UcK1s^|YiR-rNr0KP{Xl&G5{7 zocrJUi6x{GdN8<6v2d}10sN|}z&rQ(A6)mvVm6*5l<9{8M^*)()03;h63+W06EoYK zWxKaGA(>T;f72)mws7AEkyqfH;^M*>zRgojl0yJBC}+eq-gqfB)OhGT*7eE#HWY?d z`R9}-WKR#)c-)I}D<=i4VdUeko=)gSkeUD!Co+_g6^^tVj zH{sx_^M}y=`^m)IL-hh?6DVhT$5J-mdD2I0O#gF~2`MwLO?_i+Bc2WwHfdQ*L`ALQ z%SzJgYp%KB54H7#@7%iOP$Fv|U9sz@oqMd8bTjGDPDIrg4E;oc_pkmJbhu_^{RHM^ zgDwyVQ;sCe4K!wl&RW|wkO>W=W+}HQ3_L2_#U&3e2l(oF;JW1;dvN(0)Du}^?LpJB zj3)_eVb0nne=s@MCwF9L6Kq>L2xma)&$Qm$yujJCZKj%Gd|S1`SL60=unQ8z8s^-) zv_Pp0h8w?AE%?q)2}|>pym-!}257zoJZ9|-u4P?OXD=`Zj_Q=E^O4Q4qgoGIxi{|*($$bo5LlMvMNX)?BwoCLN$tYKl+h-pOECd zjGxe$M{>wOIZz-o99S@IGOclh7XTZrfbp8*T95RvJLL(+v6YHF+zBeLlI4wqcZkSh zPVseJUw<)vgoi@VZXDT#qQp~JRtnHBGHdgn#J}dWzW?g)tu%_lfcq!%{D4vJioLOb zwUZQIEKy|99s75wtTs)w=yzr|iHJ~i9E~B{%6YW*o#cBk* zAaBw0=RaruEw9MC_gh{8VV(xm;=zBy{E0w&m9);`I%wxX=!0Y&+@qje!i?f>zJ6GH zw^X8!Q)&lPL|bScQ(outZ$gidaS~ybo*OYcUz$$aUNvyXxqgUpNSv3Np?s}79w#)= zCRAvg`IUUYO`gn%pAJEvl255o{2dV)Z&3MXYUKbBRN&+@s(*{js*&$)M^r$pjfi77 zw;=jmYC2I0XilXFwWGvsgT?#y0xO4CO!m5VJCP_^%<-p){tH#TJF)oK5YH_i#EFgu z%7|bBt45@7K5jxPJ+h4gXEPO!NDtImZ=^;?>W*yWL&|gaUz_QvE(B0Dy%uKPZBc+u zN)JAyw_bGy%Hma(CKq()pTLxD>+cX296^mq%2t8wuzl6|la=cBbwmaQ??>5w`!41(PFzaxv? zU|Hez*S99&*lGy8gZAGF^9sq&>~2|Z%n!cL$;lBBnv$QTA%XkUQ*HdB!62~O#g^vyM-eAkM^i!X5U_OKCaUA>Jw zK*hLRO=THeGo4L0!Ej`X813Gm_eJ%ZB+{=zI^X+VpUy=&0b7H4<#S}i zoU%Ml1KRLFBY6^@XJ~Di%S;UD2wFc=dZS_mZFOC?SAPU!bN_jO{t&wvc^{{#)-U{^ zTfYs|)?kIXe-3Ys-;>6K0p$^a(L3_h5^r96`PU(Z*62RoY!)}U$*&~Hd?gWcmejT> zO8pG;@UUgmIN^yb&lK(mQ1knc&7o{+etFDY7T+!;__HY7aJ{? 
zP6lnyDPZtYt-8iYG)f$3b9F=7exmH(%iXv6_9rVSzJZ!O`yk9vSS4va{5PYSBIgmQ z-BB3wX9Hs$Sh2Os9i(t`!beOGWORoJH+~H3K|ir(m3>llq5T>YFGZ)lKn(Je`<9V}$W=NsH&Hc_apUVD$+)=ii9G=@n;*9!adWJlR0Tp77clW3q3GOkh ztU&w3`QNgFac!>k4B#%``_1q9Wi}yYZ~dhNAlv#jGw6g{B1q@>6~DyrCoFwQ)4*zaNN0O9+~Me z+-*AW>7f+OzV2*)ax0o&kcs&ITA?doWtl*ZYn#dRYbJ0<`$;I;hX{ZC!``$norwAb zR4rmLx;F47EqVJJIylYh4WTqeBD{eX+taKMK~kwOg>%m)SWl}uky_f@qn*d=L!C=M zmJAg_rEN`5MI}5YXJaf%Zu!$^G9Kx#6Ko=IFy8Fo7#L0!+v4r)I`N8TI;5Q zH`XqPf)hi=IYM?8ynz0^Rs`~~ zNQvGt!+6mb)sJL#GmudeSlX9T<$h$&SVE{uOKsg~bOfdSr&IfgtwiviXGP^6kLt9H z;`nvSPD|p-|EQea^<4f%TKKhTFiE@#>509(RO&u|ua!@~O`@zXYD%crsjQcxc9^SV z{u(sV&dXtnxMr^Q6H4~&B_+swYJMCvZUA!8L3mda8a$0qq5xh8on<+4s`L>4=~Zq& zaBIp%4CizIyWaHXb2QF+u28 zj_zlSY>M*Cmwfy`=-|wVW(jmW4F`2Y@pW}~H3AQ$+}>#~XVivOC>F3gR^theJXE2Fr*Xe^a}C{i*C`ga^+$Fn+oRhf*a?%BlGrDjW0 z>*zwA9lo}`%9GpN%pbe?JH*S{oTtKlEzzZIrBfAcEs=807f;hszs%)+I@_rq9?F{I zIDVN~N{GMXkRbS)NE5P!UZ7L+)yfK%+S-h-i#@D^dzFUNsN^HT&Vfy%QIL+W>qa#O zcAFy+IZC$O$N|P&Qtk02^ibg6fDLot_Eq~lh?N_m1(i9XFT#sqFH%R)z7sf73S^7BCJyWwV9WQ1+x&5l! zV~)0FYuzzbwh=nt9&3a;H`GS^#jhehNWbqkWc0eyV!GR&9*GRv3=A|Y9%3d-g=AN= zTE!2J&d`~C>M$~apJsf7ioU6^r=dXMyB#E!@co!E&bW=HpU|p&!`uWZQ=Y@F za+m`|m|EsM)he(lr7p(bTx;|=p3gZKTaiL$=s6$y=sYD(>!$vyJB5GZsS)x!dyY&c zJ-hAr`Ssb-1_pzT02xO0zmbi7iui~HKcf}l6Nt4)OFSA_GXoPU6XNGq#D#OiZBaB?cJ-d8(U$F zf4|p^BatxbsiWUTwX9#v)XBIb)AmW7%Vj2u9`o^_O#Y67kM5%95I_U>>c~Hu({0gQ z9NV*@8MeUp z3)%l*SJx_CF?QsF7KL`D$65?f(ujm#o*J<(M$W`_zlHdUf)>GL{dwx`7ul|MJ#DFO zr7ia@37V@91n4687)A$RAO3X&oa++%EF8U!_W? 
zOJ(M8Z0*)8*igMKxvm(?iZgUEj`cfb@8oo5ZjQ%%=@d^)lV3dPV)Y@nh+g^&Uq|Al zEc(UtibDZ{TV?f0a4AoKAukSQuscQkp$g)U!$lCg#k29(59;-g%s%mRLunbfd|58A zXmj-}J{(#=LcpGy@1q{L;9EelZp5BDCoauw`U;my->o-?*T8qL@AJox_;=V=#dK%X zqC@cPdRL!sFGmWf=@>BMJ;3vCO1%HCNae48RGBy(J^MW@;o0>D(>VPHYX)&aOrhcJ zPiQmS>K9SNsq&YRV#pUSLh(LrX|>(pwy)^BjnDUe$(ajY9bNpCRQhb`b&`mu>xpNu z#!|xvHQ)4NVq>ds^BgXBo3x2tSb)15l_25i`q_)c)X$F&HDEotLvG8S@)B zM{S==dF^|hA_*Tz&RXvI%BHZrd78k!Fgj|iQH)+6R5{@N_7en|9iCkYumOc;Xu5u!hYmzV0x)&ft= zNkkk~9zA!Oz#nFzX(b9f{=-kwp!?!nzzl_8WSA2V#zC?5ssa9!y0KVb;Ia1*=bh)o zX?4}qPvDxI$z#pJZ85=_7g%W-uJm+k4?QbCyGNcd8PtHu!xY)}N5O!JtYL@KHoQ!| zJ`A-z1kQCV6u|}{rD%PLJw1VNpre1(ya$zzRcJq5Xz`;*h2aCWL=*~Z)2Gx7b=WyR zIxg1vMww#0%YNgDEVwM6>QXZcWwvX&PPcn*vHM$RNig9vUQgO z;}=l+si&t$3s7=9XX>w&{1*LQzFVR;Rrsbx(T6r-wtSp&UoPNKyYpfU#wy>&(gC?@ z20j_z5*?c0T0wB6u^7lQyJm-HzVc(Uw8hVoU{l`D?G5^KFn+P!V0vUfprP{gpnjU=TmOK4tovHKrQ7fz%u`dUhS_C>O?Oqc2 z-sao@<737)+;^5gsK19tCAd=4F%%YiFzW_w3o%b)!rc-?&FzzJP72`+iC^p|9g@B{ z(>A|P`4y~H-h-efZ8tGJp4v(NZcmSZb$3>k!?(-9Zek?8PY+oPJSDRxap1*R6K!Z3 zW{(@LC`Oe)(owY}FAte#6ND2V_f#ABmJ>G9Z7b~oRS|s%| z?_DZ=eZP_3B;%2tKIc$QP*Lo$KX$ZfUT3d0*U z-cQq{fkWyPDV2O&Fk|hk6ZK{1+Bn8S`my zT1#k{yBS9e{dy?ovnzBPDkmek9QRldaY`7>6wuo}O6fyh?Io#I=Ju|;)A^t44jFG* zJAt;DQCeC$nDF=wu74@fIGEjjGbkr4-`??UY2QBkPghwf|F+PF{KtH-WJN{FsBNLE zeE*g_OkOg2Rjg8`+i!N~<9uS`iH}hVP1iK|ZWO!I-$Vw~+fddEKfAN;A|aJ0DPp#O z{{hmZqYw6z7%#2`F%0^!*m|PV+p1B=qF>Bh^Y-3b{XDAbOr6J%+NT6x9Rzt9{<-AE zSU2y9nOwJ`UfIbf7Rj}`xH(*TwqSx+_l$0)t|1HVDn!#nG)`4w40@qy4zzE@pecVy zxw8Qft35!hpBY|Gs!+gxXaSz=vwu@Lm7s*YKJ)!rfS7B<^wQGs9one?Cbg+&Z0?y6 zYHFG~XTD@=*5B(08#yKY>90+O8ul6PyY3F(w_0U=eP&)Z;_|9{)UVwJ)o%CdaLaH~ zZu4KtzYH1SQ7{aI{r_OxeS-_xJjB$#od3<(zH zaw#+BjhBCD&Btj+4xh#rtu#3pRstKx+~b6cCX3;VJ>6;zk;=NHO0_@gNAY zFj-+4=3stBf-Gj2&<5Q8zXQBVzEaLvUiktOTxs`8YO<9ucdWU$pr9IW+1O!EPiUy* zj)TqYMxh#HO9K0L4YLL6NzZR`nC+`^QXQta592H(akrf=4fKnwsc`Cwp zvb~>SZUs;BDkCG)u~uq{BWF@01Y>Yw&yE1;gw6T*V_7D9U_>97{|UDbkJD%-NP5Sh z&j}0@^5x0CDk@r!g3HhZB3L)Bs5(<_Y}j0x0zrS=H(iE2$QJho?;eKm_-E#@Aq1h57=)5WM#~lK>UH#IuuJ48v94 
zo;orzH5e|2sSCJSlzC1Ldr36*`uqZt`##*u<+>mHszVKLXftYXTBSH$DLi@dq_(d% zJCYdwXZK6uq|Kcad0btZYS$sTn%%oUqU@+3TCAnme_Vs_?F+R2WHpQ)cu|h|{`Jwj zHCM%rRKs$3FBKmrOQ~vTZak2;^ECCEcaoK6+(5awxFF3thABxt(m|NKJuep|u*HKR zi7O?~112xfZa*3R|03}V(nqpPp3n&iFn6`+IY8V{l3Q9@u5}(rp~)u?^kIAf_qA)+ zSf_j~TxA)BQJTkd7zW;+oqzJoM_bm%Ab2s<`MpS>DHt^WeWxAp{L?!w|0sE@)>p(G zov!BN__F5SA(S&E)oHr0?t~LR=w=f=&r8YQNK!u|?_B|141D4_3DX68!L3sK@k!lR zIXUw!KkjU2<}euDx{gQ1)xggBp6is+$a{SgsPYP@Y&>UXSVaHCrOL^n#>~h8Yr_Vz zo&}Wd%nu#e{l|}qPYZYxFR^c$pMKwSh<|?eG#3u%k)CSzL6=w=-C-Uk&v)Gw0SsCE z3Ppea?T{x$lfKYbW3v^@NHMWaRrc)@#j@X=Pf^j*M?hHp!BBOv=SG!}|^u2UI{AHfcoV2N4kmlyv=ZQ#AXyQO(?Qp<10*Qc_Vy9DJ= zuP^03S13lCy>UBK?_NKO@83Eac+)KKJ~alX*jhZjW5+>hK=M6${vAwP_G0=h^_7>K zQq0uBrk<7hZumpBWr@@;{;UgDyS(s%j*c&5m= zY>e_<=h+rk!G3TOhUI)Hr+YLj=f{tnH-Zw%k-`U_oVl7#xp$#a{EEPV7%x?Q{H2mV zaGS%x$ReTX#mCDSO@HaCoxR#@J@hHEm-O1hX4WQe5KsUATn%Z)16;Wbr4$#7$UA6t z7Ba$drS)Cw)2GgknBZ{hdK}qKEle|z`y}rb_k&hloo_E_x-)U~&dAsJw8S`*i}doC z7l-VpPbWu%vmOzgTB-BVIAUiyUX3dy5HxEb9NUf`cRX?& z-Rb?kNC}$DM|7>%CT#nMQfMtajJ!RoE5>s$IYG(>8>E{#Sa(Cjd-IE9hJpveg5{g7H?7mzho zlpgy^jdMWcApdfX6pCY*3|m>?LL&mBRbHp32fY_8Hp-GQI^ypjObb)vtx%0|(&F^K z2yXciTRBSMZb~*Wcry|$Anh~Yjdc}Qz{GBnJbBDUz)Yaq9LFp@)ux90On+g$!1`x( zWX9s}(gD7ct|B@nOQF=%@&%F-;bHbS3^=$>j3cIe9lgG)eI8l_MeDCx8<8_W+=-hM1-h1t}*8cf@N;B2xF}pQ$XaSMKcT0`vyge`# z(ZprAX1{EduvIcT_oBa4ovq$s{}0KI!re zVa^MBdPQG1R_9+dTHu>tcooc)>5o_Ez@2 z3!p^2PD}ft^5^yKggR{jqRXvO5$0X!N$NpUk%8*;?8?&dlCi|HYlM(gPUIfG$`Fbx zLM`_?YrfgqTtDvM&Mb*qB3rR{+?)x?<_LnJ(UYLE?n0U!3kZl|qU% z)=)n6KNccJaf(E>v?yAwN9~n)z7MW3RSSsz4M<|}D1X^SrCPq|TIQIw-mV@lYeDMO zOV*x?W5MO?HK91lFElt$T6j4tQcFr!zk2qBKJ-)ce?zkmjS`^{LW}2x>G7_h<+9DO zG%f?xTWy%*!ZP;GGzqHJ3 zeJ&Y#OQw21QOf@!ox^aQJa9g+F4c8=w#F{x!Jb?Lwukc3sPe!#cKzTO5&iJ17v36h z<7F(eGJ7xfcRBn{Nc*Sc&BamiH^gCXD87OSlhe<*#?o4IekW?LIr!p_E8q{r>a%&IaNyS>s%cG2C{YdbJDN zjfm{Fs;X)6?zR1NFMIvo1Bo_?B2Zp*%w>A43xEyITT`;GFa zLUSCXS6vUjZe3M)=DFO?XboHmIb|?q9a8TL`B-N?_1P?^S&R(-gyWo+UUf~4{J_XQ zGa(ilf0?)+FWTu3A)^i&fil0#rw>=8XanCrBLK~Z@6TpsLpP|i#CdEVg1_~VI)3cq 
zdcB)?8C72B=rGc*IxU)Q9(dFockQH?4%8M$hmX`8$64cGAS#~(xSCNPA!QO0$9W!w zvozW$eutcV$vSi3rNXG)K*`V9lYg&=2&H%Mkf306$8nvOf|4`wXH2J^Ou;;48ntPL;JT`y%je=-HAa_h;2M@iSUNn`9GmxkVRv&N4|6_m?*|@y4WsZD`IQ z*n-fOJW%07+tLmO3-?^S<5dA8PQ9%r#GD{Lt4JpJz5N|8fQr!pNGWDJkydLJu;&gVpg&i5@^{>cB zgt8>LqZ^|-q0!#ryeOO(m;*X-eP$tNjc>Ynyb{*BUPtuPi?1YngY(aN{{_&5X+mRu)4Z7XEP z+#jkD@?X+4q})Hca?}moF4a1^rYXfyLzX_LV&>$eD!C?^T9K_S9DMc0k5`zxZ5V!o zoI!0R*-!YSkA=mpk3R z#&PJZ((zX#K<}F>OyjuzS{w1|6^4FNbJI9TX7f;aM~vl)uPOyF9N#?4?-X}1O~?pU z_BWRZ5{je5CQnkl)k^X!Q&bgs4-Rv1U4+(FHpxfMkEs}$ujh;MvM-HL;WBo2GIkf*ja=fG2OQPg(EPQWDig(je%YfOq1@yu zorUDqWv>lC#vEwkKctB)j73;dmSd{ocuV^LtKsBvefqF@)!HOF0plfcr2%)B$WfT5 zyM!t?h^sKjN`%)WTa6=PJ1pJf{b{@+(x2fEQ$CjWx{#^u{gd17CO|KN=Vn;@KjJgk z$NY)U-5iKEG8cFk%U!zki~8);TB-({w|vt}6;9x29Yu@%X73zj@p76wYdA6RiN$Bw zc(7aj?SH&Qf*e9RT1>|~0W?^wx46!{Ya$-ThyrFX`~6=2LZQk`>Q>2~d1o}c6~(dD z_tB4Auv;5aM>#g*c}kcE-I2)P%0&}s3~s5-k9Jy7NY79Z=C9l(z`(*Aqb8V%@o{^Y zO@jIPcahYQjluMO`EqL7UNpU&W{`jei;>UHQy(q6uB3tNp~oo9;rc)>)vZD&OIQ4D zaE-pmw(G)^Hl=(mhjzpHeU+MV%7*{3qAoIZ0fETEm`f_v1ySJ!)OFLBb&>rMWDz)gpJ&tRO^d?&^&*%T-^qihL}px@YuF_&zWTX`;$6N@;!)%xwgB8hX)_Cm;M; z)J}hGWAt}_4cwS8ao~(F^&9&GME7BW999-b0RQ})p-yT*>LmFbG%T~K`1R`>Liz>~ znNvs_+lqCatS;*?rZm&7TE+oN7EaqNfE)C>}jhwSFODqLXTpG4c$YjS-)&cDSl{Z>z5U$RK7 zuVGru3jgUjf}8Ze9$K?P)Y)`Zz!V-X7+2U9GJC+t0O}P+;;CytD%*JNWTue7o%i5b zvplx9)@MrE)ru8QZ7gP3V&dJZATb1$wLEVjbn%OZ#XzY#Bk!id zqDuQ@=imA4U-6FqpOW`!`qXhsKVJ~E|D>Q7V7rAVovDujcj8tnuQ1TM-=W0&y{|S# zQdTKRG?gisr+syi?x}q8L|!*q2L=Zn;pXF8uw_KSpgV ze$`9$I8l}I{`1Z@4hEn=(JZ-Mno^Lh4S8J#iI(ca>+IO73xeg_d)#GR-*lgWp;V0I zLdI7Lxqml#gygRjDnZe-Is-GY7JgPUGXDLJLGt(Y4{RvTh%SCN_q2X1;{;IoxPgAq zm(&$Thgu>RDzofcN-lJRU>wVX$8{pgvry~I`IvmY}`ngE;M{00~tqu3ubtMCMn0*1ELry4@P ziqEVa*>|^76gnCc|D%|aCH(_z`#CAgo;>kH)X9BM{*9F59bgl;-6D;VOll%qg*k^I&~{4WsC zk45N&gus-HNy`aHi>JTTC#M(%c|9r585SAXRDS1Q5ukM#T2a8_F%ALn32?*@@>wMZ zbDW9L0aFA|fKzBSI8R0ufg6f&>^D$A@^gU-&qH``IVOYTA!rTVltGz|_B6H=j zSw<)zi|IWEI|Pd{4R!JWqchwS3u8m>lmA`R)uF2`-96lWyXX~croNuPbC$Q{0@Jv5 
zyAFF2b|qmi>z%yIz$T!Mt#szAblK?phD;a~Jvm!1wKYxC6?+h=Rfz8ll>e?(P9E8e zf-`)}zB3OVqAWh}n@k5^wQ7u(h{px^R?-3Ja^L$L-$9BtmA8?RDZcXaMIgFTL6;8E z4-=Xl`LDR{pJtLrsq^f9X^;Zgdq27D`j@r1vn=cR(Vd=U5nbPdR1em^sykv(ng7<7 zWfHWCI65;&nVifr+i#RvJ)PP-1S*&#r;xBDAAdzci*s6M->%apscp{8O0p#d&xa|J zuvYKPFVr(RpluJBIW`&V9VQjEYHoV79~RV^4hZWwtYYR-J$^btqGxpdXy7?Fn5oV9 z>kC_rJiftdeUy|XTAm4uE z`7lnHnZx0@sU%*mdv3Zd+1;x!kP0QpZNg_0ogJC`q24ESFE_xEp}<(8a$5G2UGCiI z7bscNado3p5Zwu24NOSpkO`TC_(`*(C@sw(*5v`G&cs{m?S8VEHPiOmCPauN!GtGD zb&rrMua>1_-+eKy+tmQzc~SZV%!2NF958_RKLriEcIkq~#h)y9`bW6Z(ozhx*{F8n zEl?C%fRVFrw}^MHA{eFlWRGV1eR;H3jyCsodfS1M$m7^?Vh&g&@uZWT>t{oJCey3p zFUCrpo5++P8*gQ*Kgb_dP_I`T9uVaFnAsYpZGQp^50zncR{Tty1NLD?U*+4ZAgs}Lc#+%`QW2AWlg zajB!`ZHt0fjEWt=9nC1(v$S(1MOf=ZcMB;&62HYtOysLIC&3~nv7ONMk;F@IQAJJj zE@jBxqs<@cL9{TipUihMO)6;>7qbqY4QMA;yrYW!F{`5i;yhNo)y}JI2sYLO&wMuk z;ft=c34dA=rp|wWn7C}97p-=CI!uhx->z6>mqc~JF-v@S&}P3pQw~Z>w&2-G5#d>* z;F`PV>bvm=h0lIdRGP?Xh#>4$*%Kjt+|PN-ZcdQLL_hFgA1<1;mnx!<;TNA=@jlyQ!9$0o)nZ_ZPqH zO!?U2)NZw0^i|J(vTK(a~sW9j;I*=mZzUn~;{S$|k2r!)~< zTunRe0Ao|wi_PWaNiePCj#W>bV-Y(6>+**4mM1<$hNdPB#q-Ye6x!ksZ~9Kvte;cU z@z-kRX}`e5CaM-w|7t{3%Q&AY-O2!Q|21d*wAG*>k-r_+CH>YgiDKhT%y!7_`)6dE z!xwhSS4Tl4!B}2aR&%(kt{z;56*A+EddZ#H7IXj1IR3M5b3B>;T1pU9R&N0oxC-!v zgPpzoXi;usfq%j-CJ=;t1Ez&Bkye4Q0=$bh6XWgaoj`Im{$CG$5=n8-T*-P#c7nI$ z^7C`&%mfg!mqLgY0QN;FK^aH8_LqqsA=ftdBwNHf&?a2H^7H~OgD*WiDym)UlD>5H zHEBe!-M|;Rb~?yssJTjf{){eKaj@`iKf@zql><9hikIwEECT(j89BtLTXJN4&|FOa z<)c#wV?#*5^uBO8!hVz@+?TrMHM|&5@`J0}qy5FJ_|o)`@o%XDzhme#gBr}gZf3P6 zy@JWdir4GX*$$9PIc7wdKck=@iaz;f~t?Ywbk1gIL$j_}1c5JoL91K3e-yxVQ2p-XIF0QerI z@Z~f9YRNzYM2|h9HqY%6Z3E9vhO($Gl*eg<-rxuAnK=Ae#8y-Iikt zrliApu&&i!-|I?v6mDY1q)I1$?Oumh$US|jS`$NK7G5AkxO%2?eH||x{H}%iKVz{i z{h_0b5rzHxejNF}KeM1pnZ^#Cq?xM{{M5&~^gbZ!K8I$lessudX(hPjW2E6i zx)egku4Gjp^M7q^Z6`&4*S|%Tm3{^+Va4YEB6JKd1|Ti8KGcQk~`Yw8;7`Cr}BBD{mP2M!%M93Bm1hNI&}ugka7xKr9x0Gfa{-}2zo zio3$P^lCRC>}}GsPHsl*C;l$zV7mK>O3F23fA2bDisJuGW1z<+kvn$YIFh@+cr4MI z9OA)@`8*|rD>FZYqm6NJgITx+gfwDcW@anl`KGopxUe@&&u1j9 
[GIT binary patch data omitted — base85-encoded deltas for the PNG/GIF assets listed in the patch header]
z(%xrCyoHDmnW$t@zqyR9xBl`{oSe}!zboJ&xCkC7@~?4OndZ^H{}eNNbcqfu!8Z!QM3>@kczQ?WMpZlWKDs*Lr_{9{ZhQmHRBcqC63TQn}T=-3(iUMc)> z=PS*j4N4gXlL)?yFVFAB#&0*UPEfbL#n0-OkSglCozor>xY@tzI^q*AKgvcVwu=N2 z9H1D708R-43^83v!vab(yU&qczU;O-GLkepfmy4X(aADv%e&nw3|rBV)yV0}G2@JO zFwS?kBOF(Gt<4fV+gs}3kTa%DA-7qs@4j=7Z4kqymV5cwy1{&c$xh4j+p+O4ZoYA4&Gfmim!^yi zCW$NG+$r9|&vvJAdwJ*V?#~_bJ+bl(XK|!y8jy`~r!o8=HZ(r~s)MoMVb3p$p=(@^ zWsFyL%&*m@jPu}_|M>7=5Pb1B2r>AUx*Di-OBjMQCu@o;==4I%l)%dO@86$!%-K84 z(#8VST>*19j?Wp;N4T(3WLo}6(9)*(c&l%%qO?yFcV5<-OpoiS?XzU{0z>PfxiXcz zH4E*cRwgu@*v}3jRZ@2#?32Vt1NqB1*m{oFpOSe5D-wwXi(5|=cE#C$qZM3ue!A{O zgc`=~Vj^Qb@zagRZobWMCE%dbuwJ|hN!AC>kYt^T+VY5AKAFCw47Su(;BWhl=+IxZ zchZ#mdM0cIIV$PoJaUAyR7(rmCtQP?8nDWQlBZhc@{CJ zUvE%gpw2gQP6wmZdtogSRnGISOUB$AzgGLL-6H+AgSkIjUeJH_S(H_;;HI`!-xsUe ziBuY0e;8!XpqUhsROQcFevaG&DeSoYwefcN^|_zP1mwPnZ9X|>+8Yssgsn?egi?FI zM=}jO*${lPLfV48EbzFvECi;T`r=Y)*V~cl#gOiuQm4oS@(MM5b93|CdENXX#@nz? z)BK+ng{6zZS3f%7b(rS+^ToDMD_7_NaPHg!{+=}F1i_pZW7~CX6(X%F#MDGw<=DKK zAXzWE7Qhyf8sDj*J)3HbbNAjAmfqfyDcX;=uI;usYD+w>=^tZ@T{VgZeyR*E8A&Ut zMBF>PHa2rOiLNU>|Mp0=)%d8IcdYZ<^#lSpW)?54of*Qe{_7o|U+l?GLAUEyb1F?6$C^c4Q=@rLHVf0(S=~U(%#B1O7hP1;htR|tDAdVTz4M!=JZ>Fvtf;v zRyWu^cE*b~PLNcW$Hw;S#0!54uL-(P!SfIiz21R^iy31r5#5A)hmbzk|Ct6wc?JgG`|m=s86HUW;y zar|@Ijtfy^AKwkHZ~cZV5M2hYS+Km;`#FcowYn~ul^zBDU0Xba817u&cb_xbrA<-3 zb#ks+sr6cRe=!b_Dh|LFK_Esm*%t}#-@JH9GwWNyx|sE=sthca(&_h$Y0L`Nd ztjncMQ?GC8z4D9IlT$eqFhSv2pP`@kz*muIOoTS)K2=~jSh&i?ia;o71kJ9H=dQvY zocsRaL3+3VkpZWbe zG~*-zX*kqqM{VZ&yWy(bg@*Q0&(+nw=J&QAhmfb*k=3gnE6a65&--z#H}j~1NQ@s8 zm3Ndl4nDT(<6->7!-zPQ{Gum;gs%%Sv{5EN2rE3XF31Pk&vAllu90y;i=96+N}Y_8 zZZ5inUY1k1j)!JePInVP-d0Q*@V4}xHW3mMCP!OV z@KjcQ$-)tBx*ZIRqsPGOvK&o{l_?lpe(hlZ+%2IW6Smt09%u6|H6TJJUd6^UwsYWj z5^zU(b~W4flkCfiHl5_jUkv)9>l`C}b8v*tbsN`7QEOk18=t8P^#m#mTR&QD1K(JK zG2DmFhOcSa^L{hMyX{ z<73Y3J|k=i6AYbf&gfU$n5*bd7#an+X=NLw<%?l4ZqILYndU=?HSzDr(`}~$!2lNU zskGpv#EUA9m9q$eJ6WnZpa2wqiB{7-F(4?Y!o|jBp}qSzrC!)Z)D6AG4q10BAOIg# zgKyh1i>Eq%)|FS{k8uh&R$6yC*1!44p^QdGjH|}l^%MU?B{$dZ7d3;AAE|kVnK0p3 
z>x4{d&B!dILmVIKaosUGBA>h`&>bHgrxVN33w{diDdr zvllpV-yH_J6=^LGJi#ZA(F-+?Ux|BBvL2Q|_hp{;blRdS0o>`N<8~ZLe-(akaBzGj z_&hA_i8#r}Z}8MLwrLkK+*ex6Qd5VcMQ|)HA&I;r=YSOeY0gt7OhONRHIff&PGKab zuJd}>+?o>hr$;`zQhL^InRI>$my_Qv4dV4Lw)9w5USD8r*tpL~1W74Fr7CR2hUR9G z`!*{#Or#->EiUi^j-6%K(qta$@acJkGSV!HG>oRiAdAKkSTu&LCfXaO@PtYS7uMTd zJ7*0H45VK*|9Wn#l6(<{DMw>fb6zcT&BtTSM`Y*t+YkLzVjem28?L`J_1=(l(Ow&B zi*4p}XTnV;?9bgk{z$CpV=;RKD;`Nq9z%@#k2a>z4DsEag4$xIWCpxQ^v(neIZ={< z#lc42=Qqk7E}m%Z26QIlR#^JnQ_r!N>%G`w9<%m4O|#n*V-wOW z#Us-`6z*%??zKez$KZoFj?9;X7Pa#N0s~`v%sJ!Zrkut=sIR_eIgoIbg}ZEQ9PyNG zE85zZRc`D)QmOL`yBt1AOd$jK7H*LchS(ub$x~Py>rBPmy|r&j*dL{HMmF%|6W%VZ z6AWopZku#jHjKXwmEDywC$6zwzphZS{wsRx#0&VzcwVd42&r@TA~+3zu6y{Q(^S?t zbNXlKlpOVl_*Hn%r06CIWkq(CG&7|+-RzDz+!nm;E7OWgqTz1EQYR}w>cN!!~Nlo1m z#d#AGMZc_mpGa&llRd%*uSMqqF{^IJ@Pi1#1JNiq6oYW8&l0J)eJRY*X5+9PlE3)jb*wTp6=rhgknj!&}%My#nNcd4 zE~b1Tz-pT(<70vs-uv#S=|}V;Hhf}&n}6}=;@~q7k)y5)TBW+jD|(BHm)vJ=hZ%R? z(rpxBnTr$Nay`utB8B2r;4Hbn1ySFus|`&_AS4zeW2$k|P@<}^jD{;?etv!tc>i0s zY|huq3{YTw%h9Ve2U3Q0o|5gQaK2{w`<~x|Z0&a2{Fh+vCn9@d{ECVyz;`>J_gIuN zS9(kc6lzgDR|1`cDja5Y{j;BqOG^P;6CkF*y<N8FQVNCpm%zRw6AmTL!ZR$2|Ldz)`$YtG8!Y z8z|*~14&rf`f!%=p8su6VH=pP!OOgO`bAJyuWH^?@OwQfdvUCrpnh}+SsPz$u*{g; zmgDV+d_r}m@%?qbz6@>d8|NSc2Q2RVUBjdfmt(Yt^de;lA$(X#X|N$R9uLHGSY%{n z$C5H;EvfxJzM;Fd^Kkw#O=ab%L9{EOz6EX8bx4wthI-=1pP=D;^z#Z!XjA48F07vw z@=;1Z0*SkFBVH2{tm;6pGRlEttFZvjt-9SY^hh zHgmS6Yrjt5C#p?xGWklOr@3@O25lz~+p0K3nv{NV%r&hZvg9T&O5&g=jA+kVG_oQjO*4k<>CQbXVm?16^vM*h4i(DskSqPJvuyh*L( ziX~)=e||A3pg>gPG$kvP!wf|LeoolVnAn=arIKx`M@B{p{f3lTA`r9orQykhk2zSey0*vE#;mr3 zEw3w`-*B@v4)qh0mwO6fKL~J2_cc*M-7wu|-vxzjKPwb5ZRB@>gSznSpWuWK9DFd>NENb1@|?j(_JWN-APS)e zS*9k?x{KE(zSuogRM=MVq4{F_vyMc^wv{JX5IN31@mBmiCeA6vs;L=DyD^fUyp7Gx zHXGh{|0zq?omjQ7l!WLZwyl{Dp6O2xj1`DIO=_=+#IgvaD)UO-Aq@)U12)9-*B|rl zab^29(0DfrALEC zpH8aTMHV;`5uE~sTOd$<5%UtENc|v4fZ6La2DZ`%qaBSB+`+_>|uaolpjleBx2D^au#49?3YX5~k9kA3sbI!_; zck?4DKS2OoG*)h`vG{=`H73py)190yRuDVHUvd2kho2-GHSvUFq)=KJ6L-mPQo1j1 
z%s5;6ew1yuE~CP-mpY`xAo#nT^-Y*_pt=Ptj%QPXI-z=&JtW-!>{-6}Fwp6#!m1xk z)P;MfO%Ml&ACQfEC{65^OLS!i+61S8?|Nx{_NLqAZ>IT$cD%7UNjW#pgLHcpIA&gQ zA7l+Aj2-eb+9;tpnilp1r@SHNFqD1`;xWnH=I$I?D)=cst}tck*moO^39Jn~kv}DV zry~lpwyrJ$Bn^9n2%=AZi6^A<^o%cT=NOTV=49@DZ`3#l46;|cn2MD}71l$2SQ1(i z+Yx)|F_wvBp2JQIFty+}@op(Bc_|Y0JNMICtlfJxrF>~uJk&oCF4wIr0rOzSFE>o{ zX#}6%TLN_>WueKhI;AM-;oL#PNj4+`mWi#TAV9t0H}iVO6(YiNv+*M{!IB0w5N{ALUqHh==LP(A|_Nd>agtP}R7r5k6Z1C*e! zWjya<6T8rA)p30OIzM4e%C z^#TZ{a0Lbg1T@vuMa_~#g3kp6C5#UY)%eG+PsDFeJOTb}dkkO&8xEXs$8)LMBwnZ3 zaegRFfry#n)|u;~>Sd1eK_Gs=;Zt&<{DNX$k$dyTEYH;F+FIqa)Km|iGk1-|1;X~s z;C+tj(h!KXKIeWK%&h%k&Y=R?>hNJ6CFmP))U_!pT=F%y-T}SY^$Ed5y{F;kG+~K} ziTS{lCVsOG1(>;@)DF-~TmB5^_I|SGZ7k4-tPLTx+y@_#e3nS{uw(eVbK~>;gLT7i zd-+7PTp*IA#BKGr_TAn{07^w|d%DE(e3r-7Zd?HV;Ut*G_AQo@@q|ZE6E_p@cny?G zqt6K=(RCc0+}c>3jc(4*UJnB1n%kpf#qI*5#_(b7{Z&F2ueso#iMw*Pi}@y%lv;Cq zxVMFI?kZRsmA?8HCR8YS+0J=Xax1K!bENvL{gQNYXM?A4;wvtm+s9V+X?nyS>#!>Jc$2w2szhm z1noKu*w#7UVQ$c_EKoIL`F`a+W2zRA&Oj1;%m()7RlMc&y;-8TszFHlV+a`y=D_9U zh)m%=GkcPXT(d9p^sVy4u~o_tW;mT6FO{FW$*A9iQ1K$Z<=h+Ng4$F~XsUF+RY0Az z^a3gU66*mM$3u|eYNjI+*?b%2D$pF3M^U5JBh7p)uW9Ov(7;0#7R?@ zhaP?xU1n7kNqr77iZ3TB=%XxEP`|FaQQ%?M$5VI)Q;O;b$K{Wf%$KzmgbHQMdE-h? 
zaNOE`q0LpuD}fSnEE3Gg6=A64WCuyQ@E%=Xv%;pja`e`IK0<`0rly)f#$E+xtV9T; z)f9*X-r0bh->AY?wcdv z@)S-08rqE=cRp*?Z3tQUEP+n$18Kva1rOIa`Pq6}z(rMYtF*o_YZak*b$4DNTyVk- zOyKYPi;IgB*I1XWg{kRkfJ7^CgjBK^*w(zQvp~eX14qgX@JLuT$9BjmE?nRNK|K7* z@mYffdx>$M1MI@DoLsWAc7sYcDk{qtQxmz7Uu}Ks{4vwij-snjm?*&olHOfH?!5x* zGs+IT%%JoDH4;8TO8!%yxFU`~W}jCNNX8q7kjs^>bwgukSk{DngLja&IQbvAvILL3 zf|897P8XCg7sVZ9xm7AuIXIQaH_Wj26Uv9XuQ~f)UlZ8{ya~OM)ulwApv}fK?(I&3 zy0O%9jpf5NCqUtJsZp@)8NzC(*j}jXTOKfPDe~SUiewJ7Q(97Vb8^fId)&P* zvXq+dFXixsWO-E#kPKTS4q49Il1q)&{9BeHzG}%8(pZN=laPYzK1miRG9fRGm*TNe zAyoc`qm()u6*6jL(YJORCu9J8!xzYSdU$-am?NMJF@?twcl9GHN?U};hde0i2K8#F z1O+-{4+qttZQi6x>8H!!HTOI@S*&ukCRlIryLoUiD)LtEy}K}azl1nJ7g7tm_eZIM zmcYV_usmeraN*W;Z)alre2x#@_BwYE39U+I+0og(j)y7WU?;v^(hCnG-b`h6-Moxg zVJ}eM`;{;+WErXb{#o3QB(mWX@+Wtb{rZ%rwlJX8)e$#5RkCI*fUp& z`E8m=w_wEGP5x%`u*<(cdp0((n})~JSNqpz2qZvgcJmk9=v?H{SDn%TI<(*pE)>R_ z5_XeJ`>5PP4H80w(eL^q`9WIP-aqIuJ12m z&K|Q&;oXngN>+NBWDGXfvm@p*jrRY07&4&bi7DH}+SK&g-q5&I`o&lD3_u~0<)kfd zf4!Po$w62#!~zc5X2wE|i%8majNH@nisIrmCZ8UqJPI982mpO?RIa00=m!4o!~RU< zRWzR9-{Q2?DiMEM;kMMF(OuA`j`ePd^!IZW(>8$%KSHZ?$M4)eiH=-QCIi{2#7k;~ z+I}^}-^-j9>mP!!Y}t3*^~pXi{9k5q2Meh8hDtR`X=!PNVgpU{bVm0B0(re&f5w`y zeh7F6b{Zomin01ZFc8}M@!P)V5&w)jE}&kj6`j;RZR^WD`wcnD&bs`%5SRaF-qoiA zag`}2*NfU~*;tdy$e!rtUnDlYhSjg~J}czH%_7h=r92Pm2d)zc{!e+~&(QxpB0n_F zgX*u&hP3FtyY>0ISnWEui`DP=%+2UpfPdqbfAJJT^BVfUN04i%0syN@n`*Ilm$GXY zu)5M@{|Q)b_KTMxQ6`@EDTM!){(8|vQ!s4~gR|3*QT1q+BeDBh4)Y@YpE zRzsV0d?0_{g9>>I>ESt-z6(KlOb6sYM2z|;;1~Qd3izn;vN4&|MoeS+Eo!g)L9n(2 z@XwyA)A;k;M*aZ+B~N*ajRL3|0H_T|izS!Ab7~Lv&b^u_K#@H#xr>ANa~803GoB!> z9zVKBG}-<#B5+@_K~6uu@u(pPiD(%G znGvZf8?o%#3t-PrfqX-pt6lP?1L98*gReVoeFWc}CYN;x>&|JhjQFyB%5EQShcDoF zaRF3Q3bfHs)qtRirRdEvfBNZ#i{XlB$%|=24Lp!bAe$&ed3o*@b2dJyy5qq+4{>Tl z1`xI7MEYeEQF{)GPP=DBmiw9YpMTmqn1lWr6n}c9nqh~SV<(|CNNg>712Q8TF+q0! 
zvT)ECSOlU#9{XRC4Jca9cG81PFfL&%Bm_K%+l$ zPPh*^w_^_zkSGCk0}@&7!+a0s!+93ClrHa^d`Ws$d7T3$BN`@iwR=I|f8z^O*AaIu zaw=@0pbY`v>wTc4_=f%?AY`y%r zM7~cqB!axt&Z998Z^BW|=qv<@IaZ^Tfm6hj8njPDm2Ui-ZM%Y75!i|XSfY1esJjDC z(E-aqad%9nM*7iBFoZ@cwaBoDl^`e5Iz-qR{#hh_4$w z1xmyZtp)y5eA$b~JvuSP_~)?}n7|AX?VWo8Rmu>0jK=d(7rQe3^1ohZyhgH*Q-bjf zU=3E-xa+YYc%CeukNqPdorr-kF*Y%|moz1)hYVktK(pF=X8AwlL2$Xpxa=#|A;c`H zROPAY@l>)#)+|>e{p8Tv?m$WaKan5kAjG2WGAj6E2+6>dS`LGEKq~JRaI));?;RM} zdkhKcz|(VLa&W()>ZG4EQvR7eI3pT>ka4LQd-X5Elu8gGIkGSKm+zaIQS%}s2Y99Y z$_4xk`OVl>+kP1C62pGEvY^+$yp{cDnt*6QYInq-`3!2x_Ot(jKId8f0imJI%-GkJ z9s7i-AyO}QUWTfId-R%Su>14JVyZ7S9wy2@E+FcpG#sLixW29J2TmTI?k_UcxUmXV zB6R_`SPo>=R|inX1#4h|2~A25!sH}N0;ypFz1&0m*S~)-IhBWXDWBL)lNkGWBBJO*O!7Rm&vPp z?7d9*8ZZ$I@5u<92Nj>REF=d}uyKC~&1`WueNAGjyZvPT2)qQK5?Q|_E#Ka|o1_VV zmtNQY$QtMdiIth&jfpsX-a}leD#PU0YWANE^#NM4MfJ~fB!V=J!nG`U)ZXm`kW&s!m8$K=3|XdGRg?k!zq2PZffj&cgk-oK7ry6+S&yM8_V>@ON1+I3xX zoE$;Vfeffr@5%pt_^R zqJ+w~xNuviL!y@g*bS{F{y27s5`?$QnFVOPK@#2pRB0d#q3Z3c`e_E&s5}nj52!h? zCac>DW*9n&f#9Aw{%5$2`vADP)A^yDfk+=9@0*a!lT0Ca61mQglPk|F>dL*0#>1W@+skGOjmpcO)(LhO zaCXM}JSuvD1{Z2nF{0ce-Op)LQF~idG7KzPInbc51~?w?z>$a;2TF`XEEvBHB)q(K zUIp0LnQ7XRf7)mmFm|xh9Q@JqJ_lJibH|V|P?7BXOmAuM3QwlhJ~2T+j}l1uokB@) z{LmKEd11M~DfLgIVT6|3e;3coVVO3lcbn4XY zXZueT6V91;?mP$iU@oAtEtiDe`|jU6a&bo)j3^dcR)Gp5fv>7(OV*GgHJ{cjSJ>bl zBy03l0j!K`I@-meIUi~kLFpuc%Qn~d=5auK6gIfMJZtOM@r8jmbAx94ZjLWxXc{|$gkC=rg0IP%*dHJ;mN{(OOwF{UyW7RRvt?CS*&Ka_` zS2W@t3^33EZ21irAIM1uVvA(slFH449etR%bR~d03H=yIRE**rp*r}c8WsqnF8}`d zO!oE8GdJqHJF;SXNh|0{wwF-K-Wdr#r+%i{HdZLU9O~972Qo1|jd?Q>$r1+(2LIWdp- zA$|yMfHOislsPb>|99@(T*frvV+wuE#>>Y)pFc_@*nIC%m9B|z!)$i^{ z6>wri2GTRKHJ$t2R?APN3Blsbki&Gif84$}G1|cY=0jH>g?Iux)I|am+_~;e87uh6 zln01l=#zU%+c=Kd6Tyje2eUjpZz~(HxS3NZ+z7*9?qP1Q@#At2stu~DJ|1wN`xylK z>~8bElP25H3Gs&+oj-VXB(I>!cz95VVK0Gja4bCJ6O zVCqrmW%6oU<8Nf8CCM$CX+gvGDkX>`U@T#zFhw-Zc7qC3lM|65hk%cX9ui2d|JNbO zE?ojg9L;Ag>f^pS;1dD&GYFwSi6ivf*C#+v7RNN9)>~+22E9Zy0sb=E4LKq14#l=Sl@@wB$fi8?$L3U_GH5)CLm&Cv}Sn7p}gJz`J 
z{!ygTZ>2tOa0e3&bm3}DP_d^3iis`r^dGL~_r81gE(4elTW+TrjvTZ`PZrPv2qXU~ zi_<9BR)t_~GHg&i!KaF?S*D_tMhkL4K^MqA<^T~_a<)e#b4=_Ge6)uci^de7^=k(( zVtF(V`q{_`AoiW+lNWN0f|Y~B6!tm8LjsJ>N(`=xfrS1$dNF!Lsec~OvuYZU7}577 z49&j^U>uC&8JIo>B`(mJ?TXRs9E1ws%Js0DES23gg}s2%2!m2e0)Ht5p57#`-!c+cTSh@7kTyV0~N8qBmhx|S? zC3gh`ToBmzv&Rz%(DUsE!RPoaW&3Opm5bF#3q=zaRON$=#X}HkmtLQ^KNpMM%NJd* zT{z(D5usDn0PSUV^j%Tlo(SHWrb>QaBzX1y1N6T!aa<#S80)jdMz9>h`H@+I+8+>L z?$eWRML(5F1RRl^OG)4iyFO4inxIJV5CNe4?c_u8@NX`QHXQqOIdXAI9LN%V?R{S_ zMj2QG6&3-YJqh0G@19l1{~YnvRsg5@_uPBHZbKlppECY!f9XTjQ?)9}Xen?T*t8=v zwRE6lVl12S^Z{}ti^0OXPru!(+`w;{F;zRbd4vG)YA7d=nS&1Q!quRtOxNkNV$b7K z7daFU;9&HrrD8w1&lx|Or{@)@-S6}P=)fYyd#tlgo5M- zK_&aMs1FB+P-(YyU3fuVX{9ED@vqf0m~za(+PJ(|Bn9f&-+Nk~AiJ@4tX%{10%i8s zgIVZ`LZ^XAW_<{i0rXjQ+M z2Y4nspQ6%=Rb_Xvj?nPXx{rgY3hL_*qF>B@QoSVdHeIa8OwXxy9uk#(436kkv3EEa z^MmRXi@a`E-W{bbszc)(kO$}AIkrXg!3*PGm+0>tC+@<#X`Zz&jFY6m-V+M?#C|*7 zLi$KIp#$1hF#!Z3U=1{=x4`e>pgUaf2WZG;0j=Do zsp7!RktxgRSdA9$0!DpKt9R2WGs;vOd%=qOR@q1cma6KMUm%TeJoSQi69bFP)?Vn&~=(F7X* zsbbQ^Yx~^3ac#fR(YyC!Di+C#X|i&1_P@@5zWe9#DMMfY9Q34*(d84e@dItgj-6J# zed`}(SPl(Aoi#{<^tFRRL2pH8$k3<(YkR(=oOJGBz#0NjOvxgW3h+`6Z1Oztg-jmZ zr|!!K9`OM$0Lf~Ap{yIgGleQFu7{&tQPVS;4X$_<4+R#fJJ7lUI;FNyR@8U@riJ>u zdo>Y2ORw07*Cm=Cy#5e&2q>+jAghT0byv6+^ULVMIykm69}pIsVFPMkOk-Z{o3-^J z!l1TnY8lB#%%q;X)ha>Zc1@J}&R^oL1d)&v&C||TEaH=Ic~*bhp;FR{M9PN^YwBW$$BpGeU(&eftG!C$8jz z!xLTz{>3IfP~>HD;)U>6OFV7xIg3=#i+`Ag?8x#D8&XG*v~&#wN4Qwk(C&^RYc6yi z3uKWil6x&Y=$C7Z-wI1l9mxsr!|FdT6lp<1$&OlKufP_VmX+v|m#sj}%=-#N>&3{H5HZgw~{&_u*+fdWpUXm8Db*QfpG z%%J~JHUcLhk6$1ky;|g6kDmBfuQV$`BH3kwfQ5u=UjdJS zoJ7$a6j}p8@(8pUTd4*07f~NNIo~>i{htgR0sPYp6-ECi-uQv!TzP$a7axjPuO;9% z#jfr|VmYAijFvP1tCB{ji3F48hjD4pmc*;$d@AK%(0`QuJW$K0Jh&;k;x*)HRc+uU z641Ipm$Ytv3Y;Bu;Ek21QK&N;1s$ild>*!)uZ_gP%R(^-63f zl<)CsWVph$ZCJF~xdl7C;o|Cac#-FKj$|Hs9@Bm93r{ri^s-(LJXZT=^2 z{{JX$26imFf*u?@c8tJ#-60S!P6gvl8QK(j0Sf|$kHh{PrTe>fu9B~^-ZEkSa~<-3 zQIxka>Xt{QrO+nZ-;sKBi~-t#qV2r!zojewHViz;bJ*F&KKcLE4wxgf(%#XK-Y@1p 
kP)bMD{?SQ#3M=aKVXP~&mU(XgVjsILCU+}aME~*s2YQ;jbpQYW literal 0 HcmV?d00001 diff --git a/previews/PR2365/assets/logo.png b/previews/PR2365/assets/logo.png new file mode 100644 index 0000000000000000000000000000000000000000..c54befa0c84686c5e1237315a933a7fab3283b8e GIT binary patch literal 106256 zcmeFZcVAOk_dc8%XJqC^VKN}t0F9#TcJ=&4^b#Vm$>4yDsW@-MJ-JnY8CnCaanQ%d=qsVckG;Z z`$(5p;CARPg-%BZ?4|V$kpqAojm^2-Vm$59|vOod}mhqyVO99 zP@&v*2R{E7FIxqr^1aU6%9+sc~zpVkLSX}-Q{350TH zC!s-A(6ON)er$LyI;Kbwmh=B#{~wV+psayGbA2hIcN>>Bj_$LUI$dD#`^I13v!%1K zvGMl4KFfW56){OgA%?$vzZod|fDmAVPtn2VpY7>GmFZy3&+=DG`gwf6k1Bi_n&`dKAeK>prOGjvFK6=na8v%&S^l{* z*wMn`>KXF<9mC(g-wR}kQ{BVJ^BhK`MnS%5cl~+kxhCJ8wG)p`*s$_xa*e#A_$FWbXWa8&x94^#4ec z$ViTCkDJmgz52tgk^6}Ug|n2kSjsv3sP5dE!av+&9HP3HkmrYIjLe(~^wo~TKfFn0 zUMP7gv!QX3Zq~Q$2Z8tQ?2pqe`ALiAh_}-;rg+WZe~{}p!seitlGMm@pG|^{j<~6a zvX~#HTaF`p*Kaj9>4^^all|%Yjd`phlfbxQxAH!jCNgY*xEIc4VdV7X{OxwFH~;kM~6T9+PG*y zVTg=2BTkZYwOU~=-ql;YAQ;BNQhB|re|Sf^yHqN*Y+v7skT{}K`3FN^9p>E_heBl? zdlni0slO36_cl4%um8u9fpJr8w5u@#dFZM`hbOQ44|fB1oHuALM^8KvrnGcK>ctY& z8g+lTZFQ3x8Q;!;T`aBoKYu^jA1A1XkD4ng6h`-$o0)Ys=x+MH9IH2C;^JKj7enuVe7lcMSZ$===k5*Y>PiM{GV-){tZNHG1wBOCA7nVIQg^MYRX9#!A} z<84smu*gQ+Bo8njeMZ8l=npECbw;BqEXfD%^eqZTMMcd`HvOQmMz)x~LOV;7iuaR? 
zt^pUnKNQ6}WnfScLYAaORnJ5=`UL!FQ>;_CblOwuj2v}(CGnusm?+oxfmuAbst}d~ z;#TrNtyiYM-$EsIE-x>S?(0iqxw@)iy2nI+5UEr@rq7z?YDcv7JZIoxtFG~VTp;!V zLCsM}5(Fl?aQcVUYd=Xm7@$aSqkT474*PfR%Y{ehMI z?-bQ9pQp5Vsd=pU1f+hygVMh!wY_scrd;__7|YT=#zsWl{QKR&_&moFP5e-OPHt|% zgZ>%5mFqv~ny&ylJf<3K#^*)?z<`bGXn1NW9!ni>81D1MnJIxE8x`HUaTA655cJ@| z166(e_{gOvr3XXl>;gA>_;k|`RyBAGr;;B|orxnT@4Vek9vn_#jnDm`FBswNwxiXD zeeF~+?Zvc?k5@kNY!0Hcz0y+g0mI*?LJ1k0n6xWXryGbWjzops`AFbeC2?yU}Cy@ljr!SsBG^vo4P-YV7*qX)UemQL|^!yM56` zp|piMbC?fpz9L_CX5WurGs3B)fpilOdfM3$Wp?IS8+*=F&4@HKH1t>;{_%^a=nEer z8@HKDW@uJPNM?kee43w{N@sVt(JP+*VB#oytAGG@5_%#X00Trf zSuvxV_k-K%^~GbmFx2TbnA5_bdB$YT=ey@wtu7$ymG^@~==CL&1?ND@l>V(ICL*^_ zj=Cs}j8qgH4E4Lp{e9v}r%>{jB_oq?@~+a{0BcNQ+Uh{Y&L6flYgcM&u@27vMR@g0 zYHEvvE9eSMfKee24o<)R2fJw1z)~HFVi{4^MK)S2(7BA%)LPN%1ghfq*-#~WsP3z3 zlokTrzIc5)((4=TKTPK$G{9s!i=Zu8mt4zIQ$xgPyqkk)?5&1h<477|@T9?4u;1NS zYR3cXXaYU9He2NY(_ari^jVQ9imQ^=!;^&2lDEn78db`eJ|f!H6+d)8vav$5DtTYu z6qcI#EV6tXx+jdRrH@yJ$e>wejiri=Emy3L(O2br$mGnV$Z~$PGmREuO;pAEZuSBYW%Ohh9ZC+Uw&b!^s)wiC5(L+PEoceY~y?Ruzj0ki@VJ z=?k54Q+8NtKUqNywwAtdH}T*bjkEmV)}xSDOx@Tk);au8KgbQ=O%LV$yd@+W9Iu7t)vFs(4?G` zphPazk5}ATi3i6h1T|6Sa@`}{UsiA3nDBU13t zRoTjBL}mBra-|%z^IhZO18itw8mYsOLZQsIh`}sv`2cJU={~Du?<=@cDMWROU;Ue} z-8(b}3bsln^J-Le?CUd77VKF)!d%XcZ2Y2+4?l%{6hkPtgl*EscBwK;i^OU_T1k;3 z8-;4KOC6hJrcrr^^Nkmrmw2!9q!)!)}wjh^^5tQh0gETcokGnCg(w7#uRF_G3%Ma2{7`bi+SJ+@8bt;Sz4`j{IWbsh z9KmpPR`&|zIY$dpQ`^C>bK$CXfm0Lf41K#(^>ME@p8#jM$M8d>IoIcg{MF~g5eMT5 z`?DZJJCO&-+-fp3@gSy87t@zUanQK@bC0W#Q{y`Js`Ma}bpVm`>Lx-MkB2M}5Jx}j z8)GMIxzr9VEiL&oSk;4=Zkre9aR-)N{l=S!dOlCpV)zvGQzDva21ISV>*OqEbF&C#E!9Vy=eZQ_qL3!xw0RHs`NM<+Bblu)@W1-ir0m{ zjr9m4cISK}GS zBxbOD%_V4{J>c}3?PuBmBYrps`m@gz0h>(!Gx#BO@v6|bzHbrmuj9PF;MNqo7}KOk zwT_z4<^~3=_1%2^IV*>8i0S_<9~5nu0e*;$o;V$rg~`4%Y2C!Zb~Fseq7w$ef_)(- zxzERx;Y@!*1>QbDbw7$Lt^YZloDM;1N!yh(5tizHT$jsB`o)^#NjzBWTVLKGQ?HW? 
zR?zy}aB=ww`hqjiBu0VT?s>#aX%p7RJ&cV_OC6TOQYB|xT|lIDVl?=78w=u)IRWM~ zrsx%)UE2)%Ap=~?0v{vcLh3qRsqE}D!w;3J@r=IDj*DK0okHH_XV->309TIlfB!0t zTz5+NLiYNItk2X)%927>tzv%_xZcmW@^*OA3TEh)ZqvVN@$0BV79P z`YyD7*Z@UeDuE`5HeFLv!1LvI!kaF_Kmo{OM;{T`E0K-7;zglv)2*epe>$ML{))Ry z0AQ#JwbVl{|KOdX*NjO3E(~66MuJ)T2U;Y%^AIK{rkFNAGQ z$O(atvHz^R)(){JT@%s3dPQ|bMc7oh9E`A3#;ofy zOt1BQN6AH!z1?+#3A^0>b7x8kY5ib(kzK)RV$^A37z}wy2=wCM7`a<7t^Od2bAR$E zyCBHWe0p{lq%>_qURRW0kz4if1z}`TcKRXA{dT}j_TwB-;PBhw!-xHnBD=e4UUphZ zwSHLJKJnV@G=^V|l~iHa8(0tVP-x$`$i^tKv{3RhNbO%E+1}W?a4K)!)$=tI;UZ|9 zb(p6j{=UVdtDnXG>o#PAi}wla(a6Sr0Rs&jTRFiS0bW2Ds(ttP@4#E3D%D z!g6WIf-IIu3||!Lu|1+MI#|b?$A>VJgr0k4?@KuEDD7-#XEz?%s6pKQerxavZkFLY zBt^-StevzPh53s03=HsC^NYs}a9^)ZnT>NO#=267t|DA+8fT5pSH$3OhPso!@w4z^ z=43M9sGj_XhcIrpdzzMMUYtC~1`HsOT$mpGx-g$+A2w_l0q$XOhd<;OL~{tSmCul! zr7ye#kRYJ1fuqURc7xs_*F$4){T+-1I}x5>%BRe|;JRup_b7aP0Qa;9c9*{=3M_UHQ8{iX<4<(z8kJq!`D|0}WG&MCfR--C;Ng*&T6&aNTc${>S^!0!6Oiu=q%qcP} zUU0Ptijc^*{X*}%Z~XvI${tm0bBBi{GYKtEb<_|?S%xDtyEC8z5E575^I ze}yYB8fefCHJ&b5q<{HlBoC;b85fg~7-NRQgK4J?4CKSe1tv1v(8eVQ)!3@cu~e1? 
ze(1UyPbcP8^VzWm{c23cHz6JaB(f4FmBnI@mFRaalGMal2c>sP$ zi@3Ykz`%OQe=9IY>GTws~d7p|!akbq0Ax^|${fm?1)#=5RdvUE zJs(ZbBy-`o?k0q{-=&39d5uXSpR!#@|2_+3dm>y8iniSSLZX;r-{HmchJZaBb^|eC zgn>s-XKpX=D%p56ehvPv4u1*EpxM`c6-owLC7`?~Gj-U81{w21BIG;b_d?0;?4#t0 zDV}woSy_x&0G7(Vq+kXai@-ZSpl{997Fc2kwO^MXH?r<>E${0qL%RmxRMau?@oQ)T zB4{L80eI9o>i|Ju;uG|IPveKSp(k3%^Zx*Vgk6Gy(>Fqd zaJi*&jWR%Y z#36Y_nIhF47Ihd)Jv@EXsn^Rv(?^~o#$jQc&9u@GbB0w@eyjnh(X%6BwBKP2fB?YZhmN5qTFKs?csoB_={lkHgzcDK z2gK*6=5_a1c@C7x144NCraTbvZQawGgO1s%&p}Kx1S6`uNuJ-t{nYJ&<$1O%t`zW5 zJfYkWKO{iA&%Sl?sVPtohcI{ED<<3nogvRpkrmW2-GXS#)`i+%7faZgWs+ApoU^{; zddTNmRQP8XPq4t7uqk^AW`Igy@)b2BH9N{O3>19|*zz+Pr43Aa0n5O4~ovL9oy9i16Y4v*m~ zvyG*!Z@*L?qDkn9FT(Mt<`xgfv;P0nH!z5*D0LAr+zYN&SEs4-FX}WpEOD2FFnvVso57-EB~N0IZ8?V9;gUR8)?+7-64Man4{S z%mnrXuF7*EWahN1_j&#sH@wCe@Xci?zKhDL}9S4E6s1{R)hZ0e`guMj~gs z65j%_kC$Y2cA4ZD+heKfgxfP9&nelBXQ0uf=9-tr#uQ2=!wxYa)pv>_!-tS(G(jo~ zEQ7!a29b6Oy8=}dug0qf=DV7-%s2pne+yTSdI%1HyYO!6#4=GWZV&ocX%0SbDeNR6jHH8mSmPS@wxFeEOV#}A#P z8G*xqz`-YWoyPJ950`qN=>d5@l|@V7*Kiftx*lF0dts>+qhBi1#k9hDr0=GtDzUTy z2&@g}0T=r6&*9?wZJ?d4plBH~(RV6>iaTc*x^DC(`#o!0i7JEaqFZ&v=b~1RbR?*P ze5PkJsDeA0%rM$)P6{FOLPb{Qp|H)HO4(M+<;4GO46B{YS6&tKW(HdW0Gp405^3Gm z{BL!*KE8l3^E2vbg&saU>o=sB1dJwN^0b%J0Bikh=DJol-Y^_ zXVmtjrqK5%)MIsk-4f^t)v(8`VJs^FXmG1hsxErsAI2qT>E71}p9R-ZXYimSLyyxz z_Sth=4q%FhKUCAKpyU8_f1bcGM)RxTDa3oH7HPj6@lLhRE>QfYyD?Z2p%`%fW#K_b zv{*T@h!RKLq+N#50QJskJ!aXQol#a~F4>7UVKD5Md2Jmk^Ms9+3u38;kdL0HP=6iY zfuI`ULba|3fEBwAfn3lN1~C5O>6zCC2KzIt-It}MHzPq8*MFTpUE57XQsZSzCBCf;aBF*jLFTPx8;ddTRq6YL8L zd?r*l^0VrDpIrx4sA-oyEf80(`E5#0O_fldQ!oV1A|2>*$XS{wYRN}PNl(({(Gv$4 zPR>518`0z}!jouJ`7J8^0U?!AK&@BwYA(_;n*in{AHY`Dh$0c{;Ks_v;d=N4m#uT) zG9oryRtO%eS|0v73o<}QEu`)YdTTuXKHI@;28PQa@Qyq$$(^8I;J;YX{>~RlXD&i> z5!qsb|2+ZnKM$N)d(8zS$XoNNoJzBVi%Q1>i0StGsMCMe#_LCk+8toc`=wgCZJmpP za=tM)$7_?5>}EsAVM_|XA%GyiKlwgkER|$w0jy#Nc%{Hy2Qc~1B0Y^}<5ER4)ODqe z8}#s8MHIjK+qoRK8}e3By(s)dR$PvT<3iDv;MWpSig)DU@x3=g$a5x>ajDdqsEqXV zk(B84sqxe^ZmM$>-79<6m51-|?OS+mkwxZQ?sg+F-84V~5J_A0XcF8vBZ0vhuRWkZ 
z64Ey#Y2$J5Lj;{UInC~8oZU}{g41n1{Ay#(-c|_b&%_xIA!YM^M3S&OT-hVETrLy92jrTu?NnLCctj&mKS8+IUHD z*ZUh%-q}LBmiVCVgV*^?S94?=S9Ll1You>&ERhsR8~pq&4ZN3Iz8QSi++$Mc06>nB z`1PKg2#QMb#dp)Y(0_lx+k{jMd|>f0bT{BcOC3^1!&1ZNT;~l89#9?HsNjJOB)x#N zGmHRXdYaQ0%(&zA?}X4FoV0|C2>$T9$`jk zLw49I3wC3C=uxOyyq?mrebMl8U2kT(ECEiLZ)~FMSxUbi)W3Ok?Xz!*mpp>O70I_)zo25TbWhzb|BP zCd=OCSa@@WkL7}Q*rPILErh^hyP^%)sV6ck_^0GJBYQU=uAO`s&r^!zmR7YhXQ0sG zF|!MTls&XPR5k-aUi>0CV=i6@OMRMm&zM38NU|?0SkB<=yi1Tnd+yMA^u&T1PwCCq zPx|ePjv?75JFz#_?Nr( zQrnI4DYCji{EF@T@+I-$|MbCW1b{=f;eG@Kz10q=M(E_--8xNdc|lw09;g@GfLdM9Q6_m3 z;VZso=SoE2Vbz`sDF=Rse5W2TikSN>MM#nPtN7)8A>`nqS-;wn4VTv?fL{Gn^Al)_ zK)Q7z!uqI@k|4X(2Bx^v@{N`_VKxHj>MMJ8>ChMKXsg}vh2j1SWmkrr$q+W67aa&0QvU!@k-MnmJpT?>vu%!6JY%zJ(@~DRjB%w;!h}T4E>{?h7mwsIF$%Duxz79!yO^{wJ zTxmx5CS>IyCcdYEa-UhAOic~=_za@+zSCGCtB{;1{|ca=?Dp(3fJTYfO=YT%D`^~P zR_TGquACR)G68DlQ={oHwAFl@_0ri1tG*B0Wb}2!+I2dlz2>9<+BW9bvi^jPeYQ8? z*SM#*c#JoA2=KbVb1DODi@18%4xr7+zt<^;^8IxBc#ir5z2bYDv+qHLu)ya-$CgFS z4F%tcPzKx3=kyK#8JAXw{Iz4=H9dv3ske(#a7RxZpL$%d zoU&n=x52b}YL19Q!Yrw*@mq<3l5SY=psv0pD&vLetnCpjhwBf zoj0aDB^Ty~2c?6W=+u6C2I(P>OGP2la61I67t_KDUA~qRwL60-B`=6hkm2>M{e44R z4G50`T^`{l(m%UgF*Zp$gcR)9f*P*6e7`|OOaR?bP}=6B_SokeAC~GI{rd_$-e=>5 zA?;1*toaGnoS3U^t+breW~u{9VAG7$cg1hd-nSkNDk=g5(55md2@dr%$BWNnt&Qa% zxu)@EBE`n``!G5*ad6mV{@7W5dHE2?Uo;e%TqfNkDk7H938BAajTyGVR5kJ=sz9^p7g@A!LO%vq4el!TfJ9 zv=1#RlpYFw<(8{W1ORB?cf1LCVQ8?HOpcT&sB9Zb92A5Kk>OIWDBuA3UB0!4H{8i; z{#3($%a5;*`I<8Fs;@*Z0-H&rg$Y5dYdiNd)b^f53fsEoOeJTm-WhlaO(4wZtz|2i zdcwv7pmU|D2*{56&~T*7W#|ZTh+#d97&xO^Do^RpkUj#SfhOac0q&glT%$!C!Jo##&D_6 ztuQp}gS%}Soj2^1fLy>AVU_Eyas^#W&Z_+cr9|i!$gWQU@0PQNxvU)qOIT85T7FQO zFB-2Xw^Ex1Jh+P^#kE#u35>98H3{GPXNk7XaIw?A#5A@#!ySZBW)F3E;2Fi7JQr(w z8(2R4cTB{)Vd(kiJiyufo?up@~8OXj`caBq_*#$3SSD>+lrSFtRvB zPPzULXpzBGZ}L`X1cKL?Tx+q+UWrY34Qpru(+@6T#-0I0ZVL^{ zxrJ?Z^cJPUcH`;oQ%?%2XBZnIRNs%e6LaHr7W8p$-_o9%DrQt$r3Z<4a7ol1Aj@L^ zfhTAxGV75y;0;YIEE;&(956m3CY=aRsrba%$pA>(#HBr@S_Cvc@&vzzx)nDg+Xh*^ zgnB0;O4U%Z`itDQA=Pi{SA@*IsQ5-Jel!(M-dH-oFQ9gv&-1=_qk}5Nzmc#9X6EX` 
zS5;&mS!baRd!zmgA-CmavlfQA#it)u_L*ye!{SIAHr4%x@kiZMXr&h4@ckN9Q@fWn zP5EymXV=Pt9vcX%cL4!@0C@!s=r0kFE2(u+NYCY=b;Czp2)hw54kxdy^av#oSb9aL9GBF>cAa7Mnp7|rFHI#D%DBA`I-YI;Z=cg(l$ts`DTooi0_2`lJrIbY27gx_u&ziULZA31X zIEH?U?5yB#+HnKk{t0ven_RK94I$HtC$H**PqU-wDYX+VlIJ2s?S;&kX1)bJpgkKY z4T2~;pr2f*++*_kXoAVrxyaceZKNrNy6lbgVkLo*H66mlf3cwDm;rKk%;TXEB#19d z2S@!fPHe0~7S5Y=VfxB=`WA_LerNeXow8cHFClMYySg13j05c;yT87aM<4sWC)w2n zzN=|-QavSgn7$E zAK)U{G8AzuA>G4eR1aU_WHB9u)SdoIFba?!DqNiPHR6hRfhNwBzgkzdXeE_l!qCF#3FA5!1&+#x3G zEik?~|F?U)M%1cTQgz)bChf-5OTLv0Q@UBj>~Tm-SLqK{X^98h;|P+qLyQT6cQFDa zazn^dUGg3jX9e@K}dZMnaqAshqxIfOYF08w} z#CwB^{(uBxSxh{n^{y_W**092n$Z$Kn1*GGX{sfA`Kr56T3&jDkp-i7tboN{=2*yM zd4Pl}%;mR{ja<8XLppm)tIi?Q;i^_Aca$N9X4$Az=!B&fJlqlR5>gnr0=GIrPYQYh zBCbZT4~D@)&HF(dtCK-?4_CDS(|JmMyKpD)nbGGZuMbHSmzO)`^h*d((AUZempoQ>18T$W;q4dQ_rUu&99|!4^h&FcAW&6QhDz>q} z2To%#<--IYTc2BRUebU)D3AWw!N*uV#)0P#6%7nlF47mK$*nulmIQ@^yTlCF5_ph* zIr1L~wIzW3y_!Hw=m~$q(*l)^#N?qJ(mOH`Cy41=g_^ZSRrqfH_qUk+G;m6b=lH8P z`Adw8Mae_kWk!&Bgr09k(MRBYc|RUqmz7*3kPN$$Bgf+Pe4!h=9!^3it3>X2KUbSw zhwxyFk-+06)GOC&%J6uC2TX(E@3U1-ZsZjoVv^cKAtmP`AK9naVXzf87SkuB7V8y3 z_I?8I3PI}K$lz!Qi@K2seeRFrbqzcy?Sg{3v<=LFblpZ(I5gV{(>$_jdsdhKjsD1* z6R3F1;RzkQp`@pG7=be zz)zpYPJ%S$KX>XaMd zDZ(vpq4FR17KCRo9gmxBlZxE(#9N^a_EQI(vIbCOeEk*WJElvzIG$S!4-= z{cj>t#@*5Gq>P}hMK<$d&eMW%g*zKulUyl0=|$M4;$oVYoh#8$up`r7ucUf5`n@3> z4#HA{L5&87e*w7oPBCGOK(t+WiL{BBaFs}4$}e~#a8q-`R&ENSO^X#Cn}$(6v__?) 
z!ZoY7_3^gQeQtgm5Od{9bHaRFk8yW~YnGoLUb*g0fF6K(`#tMQyUXG%Kkr~~!(OlK z!T@uvQ4jM1L}&myJ4)&cpf?(iE!%?;7D8(xd~|m+7pgM-G`6gTx$5*My5XZjt4Gkw z$8yTPwetUG7hede7^Kcs&#-cO)u46U>1EgbrRIT2z`f8dA4+b9(j|T29@;fh6uLyR zUXEp|jk6!ef?hQ%v}P>0$F+UzkZd&mp@9+^imR1AqFEyyu6CbHH z6XA@KDSF}^*_#9f3iQMr@Nbcg5s)-S69TND37_FXOz!XRFEcw0siu3ZoMZ8WU+c#} z0%$~*)al<~UrNn>RCfz$*yop4QaNF)k_g~}t1k>{V*0%F@fF+9;dDjyBh+a!^f}NM z;^1%t+7(<)Yh+^=fzAzbLx_DA6hRaN#adeVQLFQr7=BXU9>Hpn!+h zT@0k99tvYUJqv-niuVNA+0$k?W>p5nYHjo*aT-bci&cT`C%?Mzfcu24?-+r>C3^bnpfi*(2*8E9m`0 z&O;4LP1D!LAxAqTskyKJwSmwT-lBx`8Sl?|S^-fzSYW+YjC70!*f$0f6yYmf;@dGCSb#Q2KgfyhssCz!~ z{3g`VImkH*U`y^3B=W!&}>pD zL=2A3AimE7dRta>OcMvL<$i_hTo9p()-5Kzz7GoI}_Q!H^%$~39%kiA2P_vvsZJ_nn zYxJ%lmpk+-A+H*jlk5%M7`LETuJd0Td4ytr^KvYNtQ#v*rs_%z`#}>FVJg!w792^I zfwtq~JmlC0&ddhpQn6rJ&EcxN>cL?^o682$+bW5X@AOM8KFR1d zIik_%d+|*&6;s*uvCb?y=?j@0Z1Q150=h2?02FyZ=z}-`%FRJ1AD)hy=5H~q`4W|W zr$)WIT_%G(9ksM&a|Re9RR2WTCeEQB?2)Gm=O_u2d`z2E*#C-~I(;4^U4lcgh`#?m zt%-C6cx17l$|R-kTg%N^wQp9j&y;tbokf1TjIx*vSzB|=f}caqzP-pg-C0V)+O^*qG%`GfiPe}UMh`jq{CWDpROC`Y&4ulP0X&Q=n( zw7Pff@?tjWdiBg5JxYI-C5vzKrZNFo?2z1QesPw|fQC*^c**K*ag5q!*i7hbJN?CK z4)}fBfN6?8$5$(5d0hA$pY~-!S^NlD+t$LZ_u8Y~AAp^Nb9`6Bc;cN%!!FM|V2K!^ z%XxNcSxUk^73Kh}yVTb?5=)A-vZ)yfz=>1CbCR?=!ocn2I4QnHTKY&4j3B|t%hAU! 
z7f;1);u-uJNN-%!y-$drw_hO_x2$d2JBjoPBJ0&F)vgtS0kT^Kacx6?4u-ZMI-3~- zp6?Flq4pz*No}^U=(-G@Dc9O97}h@BHqxp@{B>J6PK&wNmpM1y(7?(klgV%Y5@@OU z!ksUFTQtcn4{%oNMVn+rQFk!i9(u5)H_08Q3c|*OgNM3E^}J*fgMf?7CqF z%4U_>ITdBt30JhsSmxF3$yOtrH_%xQq#gX97E4Hiwp#rY*WlV>w>DUt)_f6k>L^t# zS+a$beuiRKJfpJ3%ww~5Vft`Lkm6vn-p$@^FS&XGSl-LQFl5}w<`3hZq8^A-$FD)V z^79NWT^N4#>QU!PjXc z8^c*O>w;fXr;mM;iXRx0Rm!h{#<6GRmeuucWRy9p*#WKoSRUjNBak;`%~Kc(uMIsi^LF&|8Q8$~e)0 zbx#?N9MFuLw+AaccF@&I5vg2S=YjKcUhTdWNxH@*t8=kNd8pkx;FPM>aH80e((vX zilEU&vo~f)mO|kT<eDC&y9--S}G_QwoQ&pDZHMdU|BYtPH zN!N(P^!YMT*0~~m8a-jOuE{t=lPUdgM+U_cnvbs82LO~3c$qdi;e>L~iHFD|qY13y zF!f-E4ZG8y>#=4%fN;P?o;pn-d+XqqFX4yaM=Nw|6%&Al zMk9?Zh||3xl{Y@_1!t+$naefMv~!zmNR8~ad!r)(2dC_WQQFanLv=JY@~*ntaf38jWl1t+B%( z3zV;qvS79|a=+b#YN}yW(jhpD+!TeJAwBajJzH#ixZCg0*h`ZBuAd&;GM(GpYMLkJ zqtkSzh}>H-lQ|2)Np@jmiTE=Y5U&*c`ZHMfiLm2{5?+2PmQF(AM0kNyTc55-Qwf|U z;kl_)9}9R-7gx#{8Fc(vY#^H>X`Bc)J@mzTOZzxsYA&U5mb{R<{#~3Yf8S-1%ejhI3)^}U~rOqT= zmf8;JSdAv;VG6FsM%B{PF-El9-)`vq7&}@3n4_l8|0DNu}}a>3CT(#I9|!04#;#Zhs8i0|EaEi+qkZ=MQ~mo!s5{vqHXHVeeHp>#zp6U zKFXW#*j>6h>|N?%-(+?n`WTLOShs^j3S$WYcNS97lDs8&&E|6!ug^3%3WbYIrNYna zl#r4jg5Bc@R$7Uu&;{rQPVW*pJ*1=6zPGBIMAEGBH4j+2Bk(GwA@36C`nYU-N{IdJ z!jzDb?1S;1)fi|e+InLbbWy|49wgLJQr^x%Qk?Ms_2EVHjT=D)cWPE4_TxN7!($W1 z`?&vrAEyBu1}7BUoQK4xx7yYK4_;*0(c8*POEY0X^GDgnJY=Q%AlRucE2-ZIKu4V3 zB)Z66+AYMAjk*GwZRYp9t#L_j{Hd&K{i6TFGdhmlh)fD7uVxV12pY*Ait10XiJ>5v zV7jT1f}4F)Y4s6MY;2lx+LT%0pSyI#eQ0?Xn8%I-^PQqees!7D9- z+blfh^_B+s?(p6Fy}xe4yHEaq)=+`&i6%dsXRVa(<6!_$y0@eLaMW1&-xR<#Ud|n* z3*KI<`d+R6NkM<0qmKqyz4FZdWzNc+CC;?##i=KlrC%zOAgkzdU#Mrc_ryxBz*&{u z&qM4*nWe^=Npgm8j>Y7E4Ey|yhkNx2x|8Lpsms*!sE5zEqbRf8#aZ7THzzGHj4jW3 zzH(zE(FqJ!i!7^HFeUq#ocqN3Lyp6CR!QsP0?xpsBMEsd)35+>V|PITlmiYazf{-gHq1 znl3pCfI)l!=h1FzS8rMJH8AjM!J{7Dg@rnmw#)E>ZLL0> z*nea8-=||A1A1@iizk!aBTpzBB?$wK(W3b@pT;`l^g%*xP7Ddq(bvo-@8)!PNvxjq zV|R_ob<_^)qqZY*nxE1RQM)* zPQJ3{zDO~WW%RsS*vb=pjPqytr=Yt?9@x!KdabfZl!6>RMyP4RaGx7}?gOki(6D>d zN#4@jXub;^S#3^v7Q5I_xXZ$Sz)s*CPKmyU8PX$j&cJ#NWWmn^*b{qQ9y35WrL(fw 
zvpt2*ON~K-wkt!!>J~!bmf6Hw%9E1SnY|c`{-ZeDfxCha5$P5Tn>cg3)Y$c2einIC zsbwj#V{goF_Q}wjf$7tmqB7=2!>{&Lbr%#2wSJmq+rumAXlZZ7YaAbm#dez^*MW?%MUZE$H z!D9sVrE}PJ$oZ9wUjk*biLMid`Lgv1PL46-^^=LOd`$Cmot4-YqrnY|P5*|T3Rhfx z)-U0-Q!Hb=&PodGiuk@y08t2I+l6mI4Oe>nU%8I@#L71;zFPExMK`pTXMGDBV>6P; z^D#l7HujZ%4ocRm|OOl$(K30&OP3lmpUA|xe;BOflpoEnZ3cC?Om&j zNIS*}SoH_D4;PX~E<>Y=tvM^B$C6PQ5}>xQPn9K`(0ts;6z!Vb*61gk84>}0IrC(d zwXa-W#ku8Q#BbR9hg-;sCiMv~HO5W-3u>`7x6JM&K$dYxnQM{NQn^Gr%SQv|>?D2O zwt-qC9n@7EIEciYgP|c6-MZz>k87Txzi7UWD=$-h_MxYTO8n4`8SpHx@{<-u86w1K zovxKk=rIJQ9!@!(f)hcq)}m^!d7JaPCx6-+H#qsWhh>!MlPvJ69r`r89fpW|$1dx{ z%`eSOxjSB+dazdC;tv>GSwIiG{`B?b&fih;mHFbPJSJcw5T2E7V4>E*yBo{Du;+5J zB#uyhzBXPk-+j~++}9c$W_B~Z*Ze}GnfoGk)IMCzZ6BZqw~kk@PDm)sw(crgN=OOn zf4%z_`Ry-m8bBLA%;r($zPs50&8SO6tIjlSGrSgWDO8TB_T|bQYNi*;?|D> z2RwJyUt+Vbh~G%?b?&}wqf>8rV_y08FM%vO)wT;4*urNqsIgKG!&!(+Mi?`a{w!FT@WN~bH(vj8ru8?vhD*VY+*SAa#u3SM`FcN49KQl!_MA~R zzSsKZcHaewPM+cOoPR7LGCc5O_$@Ga5GxMO?j?*!sHJahPnn zo?vyA^It%%Z4gjjX0H+GjYwRLpBm$mD9BV8`EdK&^%;4`uF{3yyjFLez^T+63xPQB zZ9+u5^^DCtYbK+Bny_@W_cA0~wn8l8O_9vCMfONp`aZHxXnZvWKS6V9JV|c7Jx8nU zEfCRm0^0RUQz4+3CB6S%3NQPZf8$0l?0Q1{dd#}xAgQ#%UyH@CT%Qc5Hz^n0d(Pr; zTe1_3#lV+G5k%|zem}2b36VMJoGYpP^{z3gc<~PG$hMA}FZE+=`OcrK7kKc9Q!Ue7 z^HCyA1>HQY@^-yd0;hRCz6b#8EIG%q5#W$zDaHfn2b}r+$?AHiSkh`UN!HQ1X{pCOVYwp1~>PElGUw#R#R%h`vD19 zlY`fgKqkvneXXCd@*Vv2OX{T+o2De4f<~N?vc}oESn)EF9P!=(PqyMrNU?n0!YQO{ z6}#)($~^p_gzm$i^e4)f?;w4>d4Hm4FK4T(I~*X{YlLL!wz0*z*8Hpwj^=%82R#Nl(!Ti%)1Dci_VVWI~(OBk`K6ioE%+Xj~ z1=)xtRHNQ|b}VZwYZkx50U* zC69CgLf}HEgym+pKzAkke$91g<#f-eNIrM9_baYy$|Iez90rS(e6RY-VQ2-;k~UUZ zbT^Rt8EQ`mHG|*Hx(|N8K>>anh;cr4KoVFAO;~8w%HyDR?s203 z(mg^$HY_yAXZo{O!n}?KG;Fnat$Y8&HM^+@u;zOoOjj!-Ho4a1cL)gB_#vzd@Dim+ zyp{vj<^9jqLP8J8Z?DJqdJ)!;HM_W_Be5Y(#wD?<{Ah!3{jUJ64`bq;@~7Ewa_h>) zIRQAs0R@NQ-cJEZs(7uoJ#&Ka!!O}xbNcss8c(4LLqXq~K@IMMflPubcn`e{8|@hL zN|?PL9OrmUfh*cKJ48B04!Qvx52e=&-j+6E_QjU0517cSQ-& zZpa4K8WR*7v>Vp?%cKrCEwJsUaByjLO{|(A*8TZRJ0v|Zm9<=iuDjEGeoJ+Wu}TVg 
[git binary patch data for image assets (PNG/GIF) omitted — not human-readable]
zdxOupl@&dqGnRI`o91u+N8y46`@V#oRy$9&MBc=W#w;;`$|!^s=ChxpR{fj2NA|i~ z3X#^TVy9p`lEj;5nL&?yyt}(i^EctvB0ajIupHbijAvp1B*yu2=B2Lx7GkeDMD4q}~Qv&4*VViUxt2ez*& z9WDQyH^=3@AwoD=k_sKUESpCP)N*=tr-HB*uhTuW%x zJZypEIr3GY?c{j+*UUgh+He}aja&8E0&fXz-#(z|cXC)ommeUXYqytLqg1kb%<3a( zhB2*U9crX@t9&|LVMx4NC=5j=#<_ee={FOSH@KoEAnfcEj?=Vj{yyC7SQk4Hx$n*S ziYntJLzZ+f+o0@cNPk@3-BqWTUh{)S=SO1pb4DL)TCwF$sg08d&*~8;K|I8v#{S5z z$)K>a)bnXntk{ILRBEiviN~gIwHPGz!U9mv&SKT8rn$WF_Wga7h}B>B5w)^OZFjkj zUGbq&1fEdjhsNWOhC2M}624(e4wMa&5;4q(GT|1|@V*JH5Ga+3Jd_FueNw0O-^qz2O$UTZjE!q%3(H z2XsO1boS{{nX^f0VJMG^;;Pw|t~AzeR?rY7hWTqrVEE<(Im4^3^N9pBZ9bddQBCN-KQ#l=(V>w$Q(d~y#7(R9d#S1VNsFs|uZ3FdONj<=6KU zX-ccI)B0ai^K1DRdQIq z(&*5q?NGC8TcL8oMSj`KMQ2V7+Oivh#4*=O7(KINhh46GKNqa{Q-*-qZAY9$vzV_I zShd3I!XdBh(EiYpaj*5fnwriju*JHw@c7-}qoy3MDG_)#W{F|RLc*m&w0>cs6<>_q zS>c>d+euilub8!c3s1Vf6H4Qlq*P83R0P_ic=7f8$FH0AC~dd%{QOj5D2)1XcY79z zxF?5UIydY(g$kvjv3LuVRRNzEa0)FCKLZ*C1x3|<)_MCAaGlUjfm8lo;68ZVHw|{f zavmU==ux@8)CgFeEERm6W{8Ab^wvR3>b;i`)(&Bh#(#$c5Bjclg!p~MFuV818*@&& zLf*_C6|W>9=+tJ0eBy@>(VP0BS&isAYl2cz^7^3W`y1$@n0v^%-bK~VeXdU{IU8(! zU>=bD<3^CQMUthtz5?*#GF=UZ$4|K#!m@>1)>5ZCeGE=P{E5Q6K?1sm4@7;!N@UP9 zWFhfE8NEH;slqObkHs^J@V0Nte2Q-P{LuuQ+7JdmAZ#fDy2r-9brUltyGFUQC;Vs7 zH&6d03qy^(rqz&&r{u?9%=69a*^9xQHi~L$XadGQ`qg2yd4vzaR=l!B|0w653jb z+>rCn^363iu7EqE+l!akt0dB7W?pXQ1=1bs{gs{O^%}H5WgCynF#O&d^eHRE!HS77 zdRvwOD_oI~hi-D;EiY_uDRktF*%^oQzNP*!trqKgTeLs}R^H7DyMpa~G&>TPE(!_? 
zHjLXYi8Uh2E_`~X7&6+Xpjbm`%OcV31qc~DkK<^UWqkG^+4 zk0<9>&X6LTCi#gl_8 zzMw;cI9xH$Z9$C@ATReBE?9Dx>MoKZss;lAL=xMN8Rd|x5i(aaJ^Kf)zU-Q z*O|>44^G+>H~*^4m~l+>vc7c3_f-*jReA1s^fqDrLL+Trd)GELwElYrV7l*6-xCo@ zEnea%aO+AO!Es&Qv`6Hw(az;@Hf=cvr&J}cyQ3!<0^jmry#3=xA0RqC9<)cwpQ!vR z2ejYC^ycCtFjOCQT<(p>nODvZkIKA}sDygP4%j%MZSp!?Vin|=x;U#J9m&Cpyhn@l zeiVc)U4x6c+i;UiZ2zQtqsqyNL1Pos|QrxC~5aU&khvOWJc?Hm3!?V)fX zcti#oLh?jtN^#1clFPn-^HBkSk43gbEHfbY5E=BTCzjj+cRmq%R?~z8!T`Sq+&{iA zVyk9ou`xa{Ow5`6BraU!y*9c?TbYR+xyau7%y%W89E28?Cri+`{RSfwTZtN)lRh6r zpqrRA7!|A~#Jcbw7=F{lg+$OJMjE}x-5a!nAR!IQGT;DMwLxd)*=0*6z{2hFz?Tp| zieu41_Ga`)X;2|OaHjSN?Qt}hXWAhWZd5~~mabOaV>I%N!jJD)U3zB-1!U$`x1pTQ z&Q90hhW4}J%CmglPO*Jk4H5?k&e6*F^aszM4t?~!_2nmFDp@uE|K=wD$0OD1^hQVS zJ$wiz^orH1Srnq+qAE}BE!_Km(vs&>0@8UwN9(Upz)Rvn$AMZ`(CVJD(ckQe{tJu? z60tc=8SGvdts~5cMxB$^<+%mc#KPdEyC0mF6QuTMliY&shx}dNLT=|y5hx<8ZH{)K zWm?|KbA%u--rNgVx3{yS8Mm)(9vII_q-X?1F2Auf*HwpH>OY;BBkrGl=u7UNr!pVe zp0d5#V=SZEmSqyFesm~_U&`GH;r!Yp!w zIs@SFV<b1r@SqocTs6qCC=w7I~ zW!rO%DiqmHqGXbr)|sHLn-cm5npr0Uw~gEDCqjhTB;No;9;D1c4=yw^15*K;>bkEg zr&wJEh+N*x^)gOB+UlGaMw>VQE_6`Gft5h&KboF)ty|pDIE&TIR?zyKpTN}iQfggR z`5s4z0prdwEjXelN~4QL<=STN>}?MXsE2lW1aTu+7sx|apTM3f&gF}(vyexY%2U83 zAB^aRa;~ll{I~#j`B<&oDS*{^`hbT5wl2wDqLFi}8;TJ1wLsMB z$U%X7grdzj>Zr?u_eC)~n>EaeoHL^?aai$R9CJcC+V<3b@_`>v9!*gT)lquPLAC4L z;YqOl1s#gA1SbI{D0V}*qI3ihUuP?kwbm~Nkg zPU>ksBPrp^9GOawGo&jsTW-ghRsvF$IA?g}-uM?zgv%HPo?7jZ3J7P|D#|IR9$rr) zMsFB!((>_yzbq8pCv{%yEUQl5Id+4Hh~z?jam#fFZcnRLGD597`fKcR9HM^0bg*7Y z8H|oo&@F{rA`oyT>pB4AI>3!1$FBZ1j0>##$W?zpHxS7?jpTbMiF!X>2op!BHj zSeH4NB3qB-@#0fE$t#0CV=o2Mde!!VPuH>}0Yp1V^x_WPGu|3!$2tdLxJ6vdmXL5| zw5Yrddh)5*R~e?2PUd&M!OGpNVW^1b8T@zwK|z_A^0RmKR6FjaRouagie^)~fjR$W zEfRL|y1a4^z4P81C+d%a5lCpm1-wY`)N}J%{gWWjT8Lbq;$x7!0n!|IEA7p;WB3KF z4>$h<%#!172&Tt$h|McWnQmndGT?+aT|@0-y&xrd=ijrvNc=D^d=SRcK<7jvU`&+r+<4*#fDJZQZ3bs>z5j3~Nf8qfl3+pu_B@hzL(L<%AWX!N?1cAviF|5nB zbu1E@z=SG<4Ak&5obChQ((ZhgX(e1RSb6dU;Nl3>Ud$>-ZrIYCXGx%EJT}o3Q=?2i 
zyYokrBx68Q&BU{@bRzZusHsh9Yw4=SefNdg-t?#dnN2nKa|sKw@qGZYtbwnn^-_=ZNU)QuRz5!=&r zU(47tcFqX~j7R>`OELy<#o{}Mrj~b3?^0z?vYiViapm36gZ8(ka*=l)a^7Z7!C+GM zxlvob9+UF7os2!(WAWL?-IgOdzlPSy;^UL+i{^1K2L(5*9QV>*g-S_$GbXmaoaMhv z71xGPjvOBnldtUNGE!~urd#DPMmKCZt04PYT$UsBdT<15TeePUDNViDf4(__gk>?K zDN)G$6`m~5Ge$l&3YUXpKz@J$Bib0G-$d6Uj=DU2=Nu@o)nKd0-Akts1OH)q z979^iLCA$w z-aU_-l{0(+kIYH%jh)8TmE-FB_jj3;_~Hk1_@Xp3&);y&I727rrnu%lv@G{#SILNX zuFWLG5x;9XM~h%BuA7fTwl5%Z3Md3gp%6sxa>Pn6-~Fc$B=`2JjnRE4^G6MNfsh_{ za;`Q$MnakY_`45zE-O#@?N#&79n@78z1F%uw`V0kA_r$hA{S+$B>(Ep>%ZVo62Kuv zWv0@niKLHKp*PV*z{9O+E(nrWUIVh~yx9nPy};^2LU4IrnLPB~@d+Tc=@I&Au!iO|noHjFADpz~b zcL7dJ9+NGP$W8l^*?}X|LVh%-y8NF&>e(>S@Ao{6pxaksBI|#(XGcnj`*gHCg(9a+ zPdj`sA%+?1?m(}(E55pMEZ=s+kW!Xj_m#X4RmT;oA!2jPG7EdvULrhB zx0@iBdjGsH$?pXu5>?%=v9|HZda;*XZ&`I0nt{+mZr9>~d^yC^!i_AV8c{1+V3IN) z-xa6X_*GoOJymF*(zm3Kyh_S54&YDm!6YAcDgB_4gU!|IC$0&;m!1cA(@;qi`LX@? zYRTIOgUzyo6&Yy;=B?Yr-58*GF!mb^u{~g|Q>LiiwPt#)jc+PvfU(j;iI;P+zcH2K z3$Y9#0MHiB0C(Lx#TZ#P%0>ZmBZ`$}#dAB!$Vxk2bDhkuyv)pR3H!+hT=l|YIBu%nAg@rnJ9UP=tbhTh3|x#~FM?V{+wIoPUUX@#GKDlEs%EdUeKNpc zHm6Z}DUdyU2-F)HK)M2q%e#orhVFhlyv?}x7?pJ;+!#*}(|QqPZXY03p{^x|R72j? 
ztAQ=cfa5C@kcLL?0qIe%k--BfxUj<*>xyENao1)%L0IRyJogCsj!%Mwukq&x(WtLt zNVx_w(8i{P(Te@rL@XN#yN5cr`-CqdC2MD5C9iw~4@`DVX4z%2(Zbz9Ipwr;@|R8W zRFPBbH8R`;Or&Ry5~m`cm5J2G9I^cba?`x$E7k1Rfz>|^5;m%rd%_Emr3ghJHTz}a z;JR642rZApN$j+>BZ2Usx~WuFTB;Wj9{#lBia+YXz9UcEnhL^lOFUWrHDNzKQ@i>i zMdk2AEyn^NQ08gvSvmpDed3{!lWhF_!G!oJ_uq)f`m1zMJ^LMIxrCigURKEu zEpKWmoJY1^>vnxcsL8*5KaSyYZHe5^r$iNW`2Jb+Vi`4lu;``y|15g>9r1d_QiqkG zf|L>}%uw)VK^HDBRNTH|za2+RT3hBK+n;?7;H50h2{8rhk7IvWt?3R>uy}0bkhb+M zfwPqjQHF(LRwI!XZrE|M%1c`+eEZ6nf~gS@7Y)Q>HTd%0V^gmoBdc_0x?P8v>k6#~ zhTV?)c>|>cV$CW??0|77JfumQXF2|uV7oqodF}2s=w*UZ-hQh|VHNt6h)7wc(l9ik z+x%FaA?-P&x90IfSf^{MEjj(S$e+L$47lZa&YZbdh*AXE=Johz^gF55i%3_kaz7PR zb%ij?K7Z@*?sZ@fRDp*rN~AvW8u_$>QTZp>uZGTAKKg#$65-DSdYP*~TEqE!$!KRK;C2(ZH`=7A?OQ$v5o5Yp>r zYzfE@Hh9`6O7a#?6|JOu230@y4Q_7-L;DO;_6P4LET_yTcD9biYg;37-L&yXFescD zwj{MQ8+14gTpZpjZ%F3v5zPgqG2q2Q0jpIwbO?4>r_B~R0ypphnAzjEwZHH*>83~b zB5pDcMz+1;7m^-_(w!v^h*CfnSfP>Tpu!bUmGA%q7GYi+8~z=Q;en*Pa0299GfI-% z%fmT;@c}DT!KfzZ7b}~V7y6X1x|bPy+*VeJNcVg_X_4KRo`x?yvTP<6uYVU4t!@i~ zs|R+Ue(bs`^2KxFF*F6M%G$&Lz2pMh?{nnR&=I-=QF-hwFzX$(ZPQs-#*Ha-2W~%a z^*`a2j87%}Ei{KIuPx@HvwxoWIC9zwA!^w3J@z_WOh=X~fd9JfVXs=ll6p{Q5-qe{ z-#9s=RN8i>8KyM5uOetq2tY4SeuykyKPQMM8nZ4m;GieYo8$vYO3t#xU4KJMS-Z$< z*&kbU4c~@(;P9MUF0nq(YG%~=qWOf>tw8&cayG{#fbB@k*z;P^FIHF0H+16|kwz-c z%$jYx-D6l--74Qls-RvJ*nGbtExPwQT<8tcewb7G*%5K9JacZ84QYb{Zh4_jRf#uN zACBAx++v~e;DNCaztcIEom8UbW7*oY)oq)Jo$Kr6+bVokoEg3D`XBR7#&7z%`P__@ z5zYQ}L^v74mE;zoxBXau_dba=Z;Cfdq18msM7e9EIFk1N&HyFanmK?}{z|Vkngo2v zR!CLX2`LOxDgX|%6y6f>hPHHirNL$vfBgq#jelp|q3yed|5P$P> zTHslXWricI;G~u;8&oC_ogBW&yg9l>#a^{M+OcjaL_WgVKCOPc;|vFkB8l}|dX^=` zi+M^VeS27UUeNuohB-=d`d7Ihb+o5RrGx&p$V}yP4KkyhHa@ zGV$LhrcatkeM_FeA(Qnp*JC?8n`_R?ez-XTFk6WH_39dh%dgVT;{C8{P+CZAHp4_! 
zxScpZ5?V=5IevkM#=ZEyuwHq$C8g-BJu0h!W1gq~Gk>Ur`GYuSggwr#sZify6`PfD;?T0(0B+O;a&m z<};Cw%|&%Yu%%~pNi_##sHepGQV2hHST|TvZXc?_f5R$sNyWw3T^}R{+H0>fpzp?} zcK$Q*Tz?R#h}yH@k)0h(oR+{?nu!0%V_8pt5Q}JAU@Y-VG<_N}K&!=>99J$EZ6tK{ z0hi^{hRf2sH;lRE@7jyif~4J()PTB$aaHSm_z(1$ahLeQI_Vdvo662&+qBr2zv_mp z5a%JKyNd?x6Y76uDE*c&XoGoev-r^_^!qDje`>nw#_mzz32DgaRUD z8egM8Zd&SjUsHT$(Rl;@AeL{;x^di%Z{8fr`fQYrd_|x;c~NfA@7@oF5k5wJyAUm8 zi72sikFAEGyVUl#@8D?!lru{rwu|ssi0OjZJT1uHUC02&Z7cEVzNX$kYb!AukIcn@ zxeIU7*JWq=B=7ah8lMs~W;KHi@iU#aKmgOi#As%|+oqMoPTrL zmij?XLl#H15eL@kHH?2hr9!ABZ`Ht}ulqm~VsNF;cB^Wf`0qqcJd7&;nG)k)M>c;Q z5UH@|yw zFX!;WX+r3|N3M(zM0oE)Rk+|C>d6a=n4cZ1<~^% z7<#2#@C}8v#t_4SIKTZsWlegJ4(b)90qegiLU|a?y*##GUdWzQ8jQ{7%@<}-_CF{^ zmkhEAWCftg^u8y@-c^FgRYdTc)X34n@HHxaDyNYXIG3EXd+sW`yvD_d5#RRSpSTc{GU{Pxn|sv#a!+9VhhMt)OH(-0 zZNHF`uCkb!E#qB*v%z<~MLbUvnj>U-nB@UbxT%R37_WnANJ!miJQa|wOD;|wyDkkk zv519%z};l@#y8e0(*hPB>6tGIhNF0bGJx5h=8SmrCh6Jsa*xJ7o(mBN~;-i_4huu%&x^WByOR zKC(nWQ6@D$K5XfbH2#+vW6hnfH>b`fNxm$9F&@3Uc?8j)qex$K)BDeYM?Mw)E-@$gjL?^WBux^k29r$B#Tg4VJa~9RvrrV8 zg<^&K%u~xUa&Hpq%pT$Tl2HxE;B@LCyYXO2M8b{%U!(A>)$=9m+AOD{FpJ${h@;sGtu3tBhEscBRSGo>x( z&c}X7pIZ1;yqz*zDB7#$`xC(FueZ=Vc4fj+)=q#^Jy1b?z)e(jCSvywWDEGS884Td zp``Ma>VsH)^qdf3z8=f2`awTD>s8a%uM04>0Dpv% z0KLR*l0_N<))71PAq3b$wo8asQlg_R?s=ahViHsDFxi%z@jD=$0Gv374_F2!uXZzu zPW61I5Py6>o2fTaG*w}w*um6GO>`g)Xnqw8{5{>sFmt!?L9qzK71&XkTw#_jZo&t* z0H|ZUS@_0gXUq`CDr*GN*iKV{YKL;xQ)#R4Lqf9If^^dFZ!pg%@x=MZDLJ?35Zkt8 z2M-!dulhzB**?zn$J?YsGH2I2i;KeKUn#6h;!P;F+Rp<*9Hgl)=0fNEzsgZ))*{=? 
z$hUQkdT~zy3Ak)*94m&<$!z@pwf83ORCnDU@M)w}h6t5RiIXuZA+y4fq0?k0g-RtM z^H7;e5l)$>I7yi$Gi6AIl89u=JRDPo%)DzK)BW7{`}+gl>w3@iTz6eh-(i3EUVE** z*7~f^5(`bc#RT9eSP^wA*zrC5hoI$%My)UOvTspM`6{Z@C{%+>F!8qbQuIInEyfB7RG?V)1kwG>vUZ+0Y z6Mq7@(i81jk%-;%%?iG!NfezUORC?+-%9)%@kc^34WwspYpk#Muq{7vdv{OOlah%e zf0ZP!)#*Boz1ufIupHs!1aq9WZjNOb>w&jq>&xXEAclllj2rxWX%{gsH7{@e z4k~v)zF-@5T)4 z;j`$-duI&MIxq~1>$uP8zAh7OON#sA;x%%B_DIw!aCI5T-}p5vF7juJJ-PYTk_vM2 z0WBzxRi9S^-<^Kr&3#a01uzoNNls-$<>&ebxH6*;Hzi)c>0J-$c?Pz_6npg@nip(0 z6MG3H9xqKFaPo^O41P5;Y2>&BN@cprdn{bPl%b|@IT12fet0Sev!sBv-cPCtY^Zfm z;>O5qP_DN=IP?VE``BmYB-wL;1E?HIYY2I*sWRqo^ZSm3OLk`34^FW5?>_XN)TtPm zIJYx2e)pzB0}$7+8tmJ9>$K6Sw)0ryfiX=3r#g#=))xxjC9#O>QSY^W0ienI2QDn> zNfyl9lbjhe&bj_^Zvpg0dUm4wrRgMcmE3j@IvH{VxU}9Is|93IF7k$^9p5i%8NtF^ zY+63`D)hLE9wr`EGE`1{Zr64EoimSH-!puK36n>==gHenF*$$%JWR|}26RdRcN;_6 z@x2~3g_^+{xSd^=f{)`YPfbi_r|Igq&{F7aX?%k&QtwbEoWHe+AFV?G~)J|*CQ2- z9twDfnt;JDNot6Um(7I0>rmVy;i;vu{*%}1_NY=h|1IfKE{ORzgC)jp_?UuH6^4NUJ~ z1mhA#@wwY-B;<+2@%AkB%>bNwXOW#0xwOtE0S7E1g?gbVLLkAhODm8S*$u;7mD}gU zo{UxPL$(|e1>oP_Raz40=PjQcJ2+E%D@zN?=-5m0@f8k0&_%_5TLY(uCEjeDtzSGu zJReTD`+&yiG5(Df*7_CUZUN0pRZ5G8gfSa@1vi{%&ld4%RsS)RIyRQg*os24m*kf& zYt=Kl!(sQV%Y0TP1nmQGqq`wHAm?SO6}jj@vWMvQDY*@pd(S*Pe&lkcu}r@2Oro|` zmZgqF7^`-Ku|kX!N1ESDCyVq-MN_Ei?0q1bkn4>|P{3$>*u|)%eZp%jf!{u79(%XykW2o2a5oPTm&wKu3AQ^;qD!q5y}|xQ%zh1Tbi=- z1~{09AY!Tjyyj5nZp}hYYF;6UuW|wn7tKDH6nE;0$pMh;u4#i`vk03@1P2$*P?RaH zTwcirf~hmgbK2cm+G65vDxD-2u=&Mc#sY`%6R)KU*PL;bsfB^*$~Zgfwo@Oiv$xNB zE~CgV-0k;fo<<~gv2^o3WU8^&%Pq6FJ#Jq0fOOl|i~vTbW9?a&xWne9QRACLNGtcA zd02?^TwOv_$A3LKB{A&ZUV5s@;vIh@}<6X-_2KM z|L`6P6o6Q7?9Vk0Q?mck$}GtH)T)HU3$IhC`Acn4aa-BKQ53D7^d`qj0OPhgfj7v2Cp9n( z>9=8q!TG8En#Fj?RLYhHoiY%!qfggpdHSYgM!=E5Hdcq4SO6H5J^SDu5)78|T_Nv~?5BaDJf<9% zUi44+t=2mH7Ewc#+>wyXJnb8nA=jh^r`ef8NHZ}%O!p>t$I&3*b)n@SV}H)W5j5bu z+_u)rwhw#%cAv?Pu-YiY6Cd353%EFPWdiAiS86_1vc2SW9Io?@_`!WtGHS?mWD}SU z_TZswO1<@iWFfQVAr73d9UDv`9fAQJ!&kgTQE~Qu-doFglKF5xPLIF(mNA5}&5?Sw 
zqGK)eK!v(ncSHUuN{iQovE;5AM$1vB+M^8dZuk1z?%d$rT>U0%i`u(Z0)1r}}|1b~p1>@Krr$qgjC#fPrCLxx4+SU|gqx zwh^TNDNV;Ebj0my@Gp?j?M>#c_KFVg3|Khoi;9gL1y40z(RejZ11VWhYg!Xenep4W zoCRG(Nk0RpG#j3&VJPQS69GNj__gcO|DtEnNPsGK1tbeco%fwmHzim&;bdkDaKYdG z>_4CG7q`3j?E-S2kHtkm$-WUqKd%$?WAHOB(RdM9J>GOuct__xM^Sf|N@wx#o>H0v z9BwB(`DQ<{`YTa>I1_KgKR2EnK#pd~ZD6|-d$-a?Rs6OdM}Pq%iV_D;k;)mR5H1P* z~T5-N)vgRCz@!Zyd^M z>IcF-+Co47Yk}jEq`Js$6|AWLDTC>o%caU2(^#Edj}565s)}!56Z&5)>_th*mouwL zRWkGa>szb9A|dN#V39rUnYQIvcvK`=npV450h4ABGEh^D88dUIoCMyvpgmXX^K z6^jxzf_~P$fX;0|{68joPTv7gBwx9AZO5bHzjI&h&wvJ}$Vs^kfiy5GCh7z4z-z@! zzn2-KrBGW)Lx{)Y9R8wYSag>4@zS_d~M>7+67c zu-zk{qs*Pzq#!DTF|FELpDEb;h3gLdXdN_+4xZ2&Tgh1NXyHQ8?lY>>fpLUoY~theGD&R zNbbu{0(J-7U&kX<4WD4emAE$3F5}0ZXMFXgfb)qLp%&}UJmsNqXh|7Bq4CN{Yx@oy z+D?L^2>{UF-KX6!fRc*kH&duckDkn31d!?~x<>{gA7P)_157D&GMA5yD&!hrazcxa zDQnV-bLMpHyW#d7IUyqnzoF0G=O0P73NR-?p9&QKqs1tYf0+IE)Y!Z=h0ClO)C#-0 zZFCikk0*I?h0PBE*$ftDYcYV4^v6{4Rp_!KxV(T19mnhezxE#@5eT(H<2_uwrI_JZ z)p4XH<-8<7)TbT3GoIw|Qxa=Dz2m&{Ob1<)!@S!_Vv6<)uB=}SlTwqSU0+4jxcq1y z1S7=fNP>;Q;pcD%@&NiIAB{3}d=t(XY3}?f+z`jRd+e247e0>hTS7T|fP`l~Z zu8F1}dErx$7lDH?<4^AYyO zC$iv-#6NWO8m@FV05@~1xrABNR#Y^m`&cuUf}H-TDlEzyjis64{Lc8(%!Pqjk zaE;k4C%m5jOMcOXq!S?j&p~j89ox57rcJ9v(d^w(Je@`u+g)EEDCr5K;D&(Aj}ybM zf&Bxu4Z2f&R}v7TtKD(>!mSq1+21KfX}#US^D)PKByiw%asycYqb%?YJmdnFsba`K zHbMFp%F{xFndjpz`~HxYMm~!fmD^vA|9SutQ;sfM^6yDkF(i#bvD5)D{(e<6MZN?) 
zLoA(~t)L4hsV%UfT@+;>VcS1GeHf}4fFmW9#YPpUkkjnkb*C>iwIk-tne_`>}+ z@Jy@%MKPQIrjVR=1)j7}LADyO7yByHHUsoRG^xSw!*4n`k)Q+u$%|0$oi}CcJ(3dD z`jH`tM{JQxYg|>?3ext^zD83{jTN*gMk<3+{JlfsM@#Y7o_)8tmyF2ubG4i7Z;lnsAV3DS?ysPrlN!kHCk*Mp#| z^9;`)7~~3e4_q8I>y5wql=TK?dINijdC;bUi3|GYH(<7KBJ3KVwc%Ly1L7YnmwZoJ zyGOL9d8|@O=aR{tq)A6wT*f-03iX`M6_vL|R0te~vONrqBN$DKs4lJeE?OkE#nV^_ zkwQI>qW=fO-H#M1uznmbN}M{maq)kMQkKxmC}<@8_i+nr-u$( zh0GVk@(^n0D)<5)C=>=d0Vqbz{#_|HnP#k&8=!&td3_M;T^I(0PLRNsMXmtS)t-}8 zTgX8{7@ScS6^W`IDeD^G=o0@8^$?o?I(Z$SpT=l((R^>iOYi%edTItPt3{CSh-tch zDhh~qwI1;9Yx@q}l(5v_j^zusnSET+-tHos3T-@wIj$^fV2^n#Y@BC>P6{QuqeD)* zoZGgzvK&GH)?3^z`9oQ(h%h?Y*$-K-h{pF3t84j-ncr-(ulC{qMDRdW2hTUX?|i=_ zLB}3-qjl9_EikCO>~DH}c|hF&vD?)+!tE5TZ93F-pvFcDYS(bjOrUs&sm<*2Qz(7B zbAk7>_1!}=V_zf(1eFg0**cjcBx2@wrQKv1cy#-)TU>`Yp6i{zEirs+MN`*xmCh1? zd~&%IeC1q{WxGo92RSjxr5jKwlfHuSz#uRh42wdO8zJew#52N7*;4gWQ7`j8SHjpt zryfmRfc?+(PK(p+&a^YtfP`+%_Pc`GSmJz;<4nxvy(Nv5-!?Tg#8*+ai9dJSD3hrK zcxosDHdqCAs6i4eV&Mdeu#r{4Im_yP+Aq|bQA@}ymop~fA^6G4DRS4RrF&VEe4a`u zvl~Z&+xksT>W8Q(IPQgGc z@nEI@n@204-eWw!UX_imhD-7DgtBDs%mCgtTk%JmfA+M_-0X3azJU7y zj?pwV)Ue)+yAnIp#k%c(Ad%!{>kB^$#2{BcAr%f@NY&SSVf(u_sY4T1F|^Y+9jtE6 zn%ILv6U6iW!R?Lhvodr{QfF$P0C*^MD~ov+RP+T<& zn&;URb>0gkObOd|MSps1HctQn+lZJH6|hy@T7i-9+103LP{JQ6$(aYHYgXk;#8j-5 z^K2ST-sP5jLGV;iG|%-my`oPCrd)(|xlW21qM9xYa}}~tpDCM@h~5mI`mA#l|}?;9DT38kNF6u zoTe{xz<6xd?G?pDUGrMtrtY|)qrT}^XI<*Bw4Pw?pB3D*O*sHoT#fL=3dl^yW$jz5iB1J00R%*Wo9-Zc=@-13zFiX+$GSotzNaf4{BUn1+T6%_`rfOj z?9mu@cCqn~4+jBSSY*)#Ma1B-ev1J&zxtXMf)(eG0Mwg|<}dJ^#>Ki#dlmwZtnbVZf{>Whn!yu-7^yUXXK!@dq4 zG%9?BHS%fe%(G&~vJ$@XCv60QjEv*Koz)p=j=DY^T1d#IPdpHLDNQ^^36nSjpIcaw@sQSPzcC z3XTt~HXJ0>gh>KQ$M|~hOJXn+U9$DN_wTjqIlC-dn|j>2rIWOjW})}My++VLPy@J6 zi_uFGU*dQcr^Q@v=MUKF(XBrp^q5IrN@PK7!(V(e96KTI>3@FXV4YOX3$2bO(&S;f z?KEsqe5mz*;`fkrMZb?WetJO6!ClRS+{Q1!g+I9egsJAImPn+aCXPb7THvC$SCoO01^pmHA4FJUk%^TW3)_&APls<2c(DC1j| zp%{cM2)8U&;*oiD1z47U`Z(Bdbl!%Ngv*fmB*TWX!&f?&LXVy7-U+A)%@_hzzPQpx zX}A>0`E~?s3@zrU#&L~+%}WA+6BBGhmjXu~7|Jrag}CNEFNv-_SYs6+kojWWBJ^2a 
zHvc|vy-zgZ4X5JMxtxNrvMktc8*tqAM-|G>8tx;d`B^{sNz82^0`TfvJoNij(+X$7wpKqD|Y}S0A*xJIwIv6$c=RMEKRQzJ2*CIctZe< zhrJ{j9psq$9}3j>Bqr=Gsm?2#3k6isxsG(GC{gqeC_-SFgT_r6tPx7L9Xr!>1UwT3i54qadI0ROCIffJM~^`5Zg3LzM=XU+e0=|Nw&xQ=0%OTiQ1}#3 z;j(^p``NSDsgZ^k5+&l}^^~=ygjCM~2^P0#!RRs6^yTn|?TV;QJh<0DwKf54u)cKH zYXfIP;ie;PVEXnf$K?^d8^mr=h9CW+-^bs2i16OSmkN0Rc{5e3P@jKfnbQgL0f;*; zS;%xjSiTeG3VC{Z`YEo`&G9EsD4q;YO^|9#%HJmao63%Ac6uZIa{I@EAp1E)fa) zf+Y)_NcLfbjr4vH7ht(AF>C`d9-fsL58{)0wS~0S2<{eeW++A$;JHcSsHEYa$Tkc{ zngi?#5c3tYtIt0}Vbtp(&}q;#LTO`kF0hxy5D6pcw}q}cL;r1g@TRdTPD)r-F6k{Z z-fGh9dU}vtKQY!3$-0R0v9KXuS8p^oUYxFPg98l*h{w)lW+vE}A~uf zWN{m|D?F{WvnsN1iTH0WoP1KRy<6SCJCt>YGhzo9L-3i1^t*z{$#!M^jo?(@V%Irq^?h>Y=7RSX z-v9;|f{1GEe!0UNPKJ64Sbol&3MpiZ#z;OHva0{7R>1kwWvf&&g__*F1#qT;mB*3BZ^pS zzPEXbJtaKC7icH~lByLcE%=%LeDUSmVl zOaFkGktN=I0VuVm8JTejiF5lPjWEh^BhjaIKxt)}FEgcB6WTw_q4&c488T$T* z;kvokD@aUH!PBWU9USZ>2NH%OpkL;G<@fgb#X>v2Pc|xH%bwA;Wj!d=-tgNyJL|Ds zSCd0^q@RW>%fZ0f{D4NEcj4tw@C)r^K6U3TK8RbLoW4YfC1pw5*?e}Mgj`<0$K64z zMdK>}rb&!uY>?dL8eFnc#3L@qq!^lcN#<_Tf!T>b7B(~?d8P;sDDBk4=iaZI>A z9rz7LPNgj}LYxycyae-~(lbuyDE~0vxBd1e-3iI=(h(iMa@2Jw7I7ykb4!Dk+<9N% z@CiG~y#7^UC9*SXNu|!nO7fGdgF&nnfX07W!B4tkRjgJNyg}WG za7D656U9!P=cSNseN7ODG2Pq;%LA}LIDG=>H98K#4pcKWapqYBy#6JWqko(EJN@!W zJ5mj6cp}Gz$n4(y5Xb35F@XU}n6{y8Q{hNQ2@%)`1MC!q<%WcyC>Egk0FD9oIP@HM zfJ|RI{!B1U2Uk7gY;!j*e*49n7J#8N1UA{b4Obe5BLm(bnu>tW9_C{+ci@2b^!&zY z?}32xB?v-b4 z4FzEB{^G;@P{^BGZs=>b<Dsj&icJPVIOMaJm#VTRD5y3^Y|IfV9{oG4Rz}9B(RcDBytmqyM<$<*qk(! 
z5XgX7#{~kJAU1LB>n#F)kqu#E8TuDa(#Yj02-tuEISr#6vQB2(jpA946gCn{TIcSX zyf7cpk-msyk;SoF9UNf11aZl)ZUJrk!5}S_dru;IVoQnDAWbIJ+Qk49UldRU_L*zC z{12ju=DR$0dvq&{xGD+&a=K*!y?Xf*ZTZdEXxn!Ps;WxpTFn)yd=c7=}* z=@AZ(CMdmlt9~uVfIG|)lErK7iIC!FtJ0vHRD)9$ne7&5#|lAy`9CN79~zz2*I&HA zULv0$$@icVf~si)^NP#p#QRT%!_d+tVY>~U;|oq5aI}!N%LT7v_pYeeh~3VCGu=AU zX;9A;OTSSSYE_iLIg|bumI#fj^`B&WEK~h(@}7qB7HV{+c164ft<@iWxeF9#F*#0m z7U^JQFKeu0sUIt&VF1s&`RCG&U*0LY(>&C=UeFdAc1GXkaPNo(zw%H8b+ltZ83i=$ z6>7xbyeldyHUaKMq|A}T0Ao@Vh;sl;LXcah)PWUK00lj(;Fje??a75!bgCY%fQ0d8 z@_?4yPXvgtu&3?ejJXS24!k9L|6jog6DI<5AvwaGf>a& z(^~B)X1|KF@eoZAR0A&#*(J;p^8NVXxB&A*p(=uotf-0}Tt(>=G55_6E}_)JYr-t8 z8Ldr^Y9TIOHsO_j-d|QeP|;$o(alQ&RgGtHJ3Eu_O&49_^z*tJvVm|MUojVCTF~-V z@L;sYcf_d28~jC-8<8048;gFGS{WO*@Ei?A}n3*3>wE1j!Ue zDiT4}ft5^)ELGSG;5JqrrlXV!+OUa85KKE@A@Mj@WoL0;^#Jx(EFlBs0VnP;HWn~; zkV+wNROq*<+H|z?)%C-#m}L6lh$sr0_DGYXm|0soC}#Y2t};Yk^xdKw9M%=`BKxjE zhMZ_){xwa3>Rzy>iQhqse00vW|I`xnxywAUJd6G#OkmD^=)MF4+%J?56d@?4bS!Rv zkB6u>U@_<@o!KDXz)#Er;7f4Eqex5hjI%hqP(qqN!|vA(uBi!gL;6(XML|&ggHGBQ z=F#Rr0>164{f5mH#X=r;`Tq_bD|C00D}Q!FbJ1PlP}`0FP3f)xbzut_y$)}DuOC9?KlxdtP zSqix@g2M_j$~n3-(EXfqJ-ywsY4X$LxuT%m>b%^usa}UB`Kb?#+Ik@L$FuX73Oe1 zRJKNUoq>@JiCJ&5EZ!Z-?ezN+)XKWkRHv&IOUAg-SVA5po#TNoVeBeUGj|o}Ty!7R zw@86j(QWy0#K}oSEqi?n%^5N=VW!ZCqnL{z3{3^fa8ts4SPu!$s6&o^rNhUw{w#AS zCNJu$qM$~V9+;GKT+jB2_5JBGj2GN+GkPMqn1Glm}mJxN&hCbUf!;+JYh`c)(kNJ2AV!JQ!tyNXy zEd>kw#X^5Mqkr$aF(7UPIHBP1a`Ts~RQ~tvitRrwm_cOo$cfx*h@lOGkpPG{mO}E> z6f3ILAx}?6M0Wcl>4Q!h+fg1Fsx@%fHq(}^@1KodBG#2c)KbYY3yvueTz09-nQqP) zQx0;j^CNSaz8G(D7=XSpj6rIc_TXH6MQD&y!T8PZ+ady9lU@V;LK}XShh2sNx{J6& z0GK^oCtZT9v?8FF%0`Nlr)-$ZH}y%$c~J-N1JdcQ+)f&}<%~8XjO6N03Ztz~d~R9i zu*XhFP5N#KBL5KR1`|MY_;{$g)PXdT%w`bB(%vf3Ci|0xsctcNjy|7S#)w>2}mcW^REX z5^z&-o@oBlC3KyQEIQ&ZvfCRI-oQ{QfFeDY#^?=AheiFn5vPlk7SRCy>nidwWb)uA z&ZiQ_9zJ7yY`&v-N)PpyfTeg&ZcWiEOG z3aFLeG(Fzl)v&f)Fs7C@>U7H@@duO*0?S z@jPXFEW}cRg)u2l8&3jVWl*6e>79*hdfQ^6B z2gOKb{s5zWmtkC=r_x)c12R+K6_yW<_9z}FQ)szGK?l#ja3pNeF1nJ>XqTjf9buFU z`x5#i8aR`7UBKy{r 
zp&B<}C@ZruK%AU##gu}}DnKD|YO3iVb<`X>BdgY)F$DQzQJH1yQ^IoMD`o&Ln=OqE z0tO4HC%vL;Y}|c6cGtWD`V)00fxd#@g`NcA2eR;m&Ysa%`G|v+`P$W3x7WfuiiK3i zpP{Bkz!kML)ytn+Rq{xy2uumphNNp z2RE8H%uCfVhsbF2OL`RX5sUwjPYum=2 zbOC1!2VL73aq`n;9{EECAAC1;Si{|G8l-9;s5AyZ1H1}(F;Yn$y^9i7I!Z(Tyg3_s zI?)^W!4A^-;o{Gc)hZ(p7wq}qKWpGd!vb7`&+!33Ls^5^E|)(-=1~A*u8|lvr(?FE za$oJUr3UKZW%w7-T@pXKH6Y6TAr7!+#XSCumx$+Ibit8kvoF|&pi{?Ij#y@cAG zrqYc@5K!SW)2&_Tn2d{$gmNt4Xe<^QMCZ+!({LC3Hl9{}b;YpAB5PXbK`I}(aKEo=tVzyGSwENKXN6^~#hjuOkCz5#OKyk{Hi@K9Mm z5Z;7b0}hU@xH7#9omnwpfe{uFqp_Eg{X3!DI}VV~co+$$X<#?>)(`@=Q?Nk@Pji_G zVqzfqA~!-zpu%B0dx5c5eFMyLAO!mNdZe;eUs-uP4}8#Aciq=u8-yn%4y5lo%JS-EB#~cZi@bmRRg@dbSgwb z)*4l?8K_)svHMG4m`f$U{Ww&^@DLc)YmdJD=fqouBEC^@z~3r-YLQ#(GZQmccJ);Z z_ClefI(hQiWw6H{%h}b;e+DSj|J?sbub~m5@PQgvSB`Bm{ap=*STY5k>lE;kSM3h; zg$|dqvmoGG6+70ydzpd3q-8h6c!Ji%Q&}Rw;FhU`8bAa(wxQ)Y;fL2h6`yylbn+q{ zlFxtqG>nw6+pQk1hG~lr>$4l^0@n^sw`m6A*h{gXlJ%t~-%4NtVi+pD z!$LbAB8Kqi_)?xbVe)UtSDkFsk(72!&fj;>GARK@=`fn2Lhrz&C*H1)el6L2X9yNm~=&H|Ju9Af023QM<$|@ z=C?}yke2X`5ju$+KK-bZHFhuh_b9*~DQX-R^Z{$<8SoE5ME~eY2mKh#>EktNW;^Tw z{R6i#<9DKe(Ey;;7ea-1$JX!U4@Upr@=xOdlgE)LrtSwRf%P#puaB#yGjaK}t&WqE zm=}h(Yk!V!Rro<`9#^psAi*py)6ZA?M5yRmt z9t57)QOu;gyy%Z1j%$tU-m(-u%zN{;juC$x`^5b zVoZces^Reej1}(9)9$+~3+$hL-Xx%-1Flo)6=@NGLBeaj#eqPb-N~++m^D_ z8rlpg9YbFs^XG`FC1vT|vri<|xvrC?XZk$b#l;)X{vg>EuD>e}CQCSVH0QDj+&l?+ZOlXVz|7cz)MT)+Kf>mBxtL|KF1N5~yl;tUg91 zB=Er(f<=Ex5aaOkI=Cz`rCj8u#y`(|%!?|Wq);YrkM(br*@lXvJM7yt_}(%Z`|#)8 zr$PT1HLdxAizkq=`QCqDhkUHZXN7oxGR(;$KOrH3>6DLjz=}$WA?#I3%}Y5<^zb{n z{ucB%S2`}pYN%S|p8y~Afy-Bj%qXucI-Ieib@eqwp$)<&rGF3b5wz|-=O%33r$VZ^ zc7C77*(V*dUzT_Ihk*PGRNZ3_hOHwwjQ{s-7_KLTySHfR^82#3iW|sQ{ zJJ|&R?7vswGQ>N0tv-Gb+JTn*BCelfUMXr-iNTUDfS}|g?^mq&*Q;<$>~1=J50-Ui z4a`0Pe{5}lKUCC`cvd|9z#* zBT)S*-?%narpX!TW|EhV4E0{!vv6|pgZJh%WIrPOTR??SaG;cj8KUo=1zGw)+V@1R zY*8a9<>;TkC^slkI_oCd>0a>ez|+%c2Y%9_T$|Mx!kZXf{S!bg``;UB z<*tJEgH8wDXLSoB*0<0W7-0|T;J;r~0#(a`R^{vJk6!%ip*8h)JJwJ?g|%<-f!n{o z>qIHfrFHGcC=S|Bacu|j6R+y0bj)7=@7tiXU8m1~&2W7K!}Vf)j8udws=rtIPHg+P 
zuvo+rT==ZY*LR%}Lqnm=infLGdmv%#3a)31UDK=s@+SYg`pbk7pX3;K@>OnkvS>+5 zFrpWiVq4j@;NxxP8hn_}WV%
<>Jci z@Yb6C3l(0MRW|2;Ux(S~O4E@6^ADbloPjp%A|B!{6c$^PwME&Qys{QLaNare{&g>x z8mbH`<=OX-T_j3du%hQ5s{_Ho6Ms8dM-QuiiRoqxy8?6Rqzpp?E7$s3^TApVS|o9< zzh-plSMJg`y^$QB&hRjg^zZL?y!w%n-xgVI-O}I`ekgO{1b9qNA7ls8+(!gM7k z-LfrmZZ3bgY?GAB{u30t8Wv`IH&xH&0Iw$V=PGaQlC8N15`8;QSnD-#3{Jnk0C`6T z+El@#ZAw{-_9Wz3!OxYMhdBS~mHb2e<2z3K=p0>=&JfUI`qO2b3KSc=5%P67Y%)Vg@)QA|l*YI6(4a?Ec_y`~vWXY6qiCBqTmhM{d|Fy;nWRt1?IFijvvV)bD60?=Vn2@V7FA7cA{G5`gCZM z>~=nQ817rY55}2-mLhp%jtE1U`t0&~W&|SN5*rG{`FdUY@Xfjjt7zXq15^lyib!JvAaq`&u$N2}d{<6CB%{m;(cGXQ< z+d&<;_Gb6($ugL-aC0j?n^)^kRw#)RWnr0c64YfayJ*hk7z|?g;o|VcOvJE?+SDULN_01#Zf|B!Y_E~ptwQP45#)C)mrj_f^b8?YC&s@HlVL-38?y)4d5F_D$bgaLdT&?5dEJ>kK`F$9=X+j|;%e9`Y+Q zN@YSD7sN3~%gVXJpu}mkuO_BC{m{zgLD61W$;(?^XospCwI4XuH-~OdPIZi*+tpzH z?%>*WKWNxZdL`t1A~=QRTg_qz=tkPLdgIl1mAB61+|*+B)Be zBjsFKfAI5Q^Iuku8aUb{-_G-9w`d*mYRT8)D|kzK<0iP83sX5(hn$M9a5&DwR;P{B z$aDOwtp;>i&C5^DI^*h$oBBkR0HQ?xnecJ=h)!qQfC@eSND_x-)JBpX80^9(;yw?EQX^}`d&1?F&S9Y|%7e^t#wJ4|nKo_U=y+^q8xO|hv%GTImS6iOsHCIY0W zC!AbfOs-u0p~_k>^VhxCqtl^3Jue4YXSGf_dH2p-C(GL3sj0~4!M>C=>cdxi80uF~ zWD;x~f2lR!30dojPC7~R2Ucpilk^%gY@Jz=HbibM47s~jsg zP&2E!4SFs-9V8O=RFkyPYrw#KWsHfev*QtQs9X_$R~Elu(F5;4L=%L6XvdWr9oboY z=UQu3L@<9uHIXafY*Hbmmc$ap3*fPW) zA)h#Ta^+P?46)KfmfoeS-|tgiQtK9&;<_oP^vn#n+sVe zCC|9);$#H^mN%~XZNX4@_s1mCX|uJ@J&&_XcQrar`?+Ng$Tw0;!}??VBfP!}CA|Er zn&#elg)q*SL0__4ULrjYvwO9tp;J^YSjgtB|BN<~Hs^n`*x5LVY+_QDx)Bx2u=g zqkRVzdBI~^N7U|iG=~QFYOj9V=q=+OL4^|e2j0b&!sD`)TR07n!F`-wj4LBo_qB?ZSQ zkZAsu>BVT~!c-j>SvqylnXTAQR2#G~GG(bOIS$Q?v<(f~1|0#bA;sf2LS8|E{w2kd zwr!uNzs6@=BtgHAzZ^y`JqG?xCpZ?<>Ssgp^Yd#RJ60YqW0_GuE<9duzIZ0j)zN{y z+(nZURH>qgzGSh)r(Th9brr01CYb7}H3x@64ptMkgP=$u&p~r&29%sG2Zs_LFWV9J z0`=y|V4Kq=sziclAYT70ZpJxs)p*~DQmwU$t+$s|aAXr%Qas<G& z?jhTS=u4xV|={As|~qzA}b zoTWt^B2TVg{Ncr%B6CTkuVJtHy}1A7`~PFS|NmeAKQ{vu ztxKEyHE)Waw=;=^!E;%i(zLpI*~(b_%3Wjl10yUXEW|G)%zs4Wys(fs{7>XKuaJU9&l|NDmV(|x&UdX!J8DJ01mKKOqCK)&fY literal 0 HcmV?d00001 diff --git a/previews/PR2365/assets/quickstart/loss.png b/previews/PR2365/assets/quickstart/loss.png new file mode 
100644 index 0000000000000000000000000000000000000000..8cfa5523d232a0bebd071d85b8a70f30926d14e4 GIT binary patch literal 62443 zcmd43byQX17dD6@pdbPQ(x4aV?goKNcXvp4cPo<8DJ|V0-KBJQgS2#BI=+|R@0 zqKX>txI7E0l}k_MHJ{iRVBzYIT89)5xiF~;ecUqs3#Xr?Wx)-!h&&(s- z{&a;3mj7kDw<+~vZr66P0#(-$AM`=M{nFx-{#~*%J;cn8qM9?F2HJoNvdV@1+8jvS z-q>Ju68`y2`6V>utDtDcMz76ft4E7&rhMstk7>ZyAB_28 zi)pRR5_QWlKH>9saB!2^33OUMcPBj{X!Q(#dU|Y^8f!AO6kf$HJ|)QiS9yS+MyFK^ z6%h$1FW!8nI~VU*CVEpdbud5)EJJ(vgN{Hz>4pCy_qPJ!ftGA+)!VCez@d$u|G$rSzSLzTBhCLa4=t! zl$0cyTWL8{=D0OzK2b~NGRmey~N}|2I7+n zqB}S`;u(JV@`X96Bq2GAa%*#Q2j(FlAb^9DsHHon&$zL<*|&N6^XE@)_HHBQ&CSgd zUJ`Wl_Olw9BPwcY#p0>M!^7p}W#?wM@GwzBAD>pr(1(YIh=_=qIyw%H#Ne;~{+6~2 z-nr(F?`6|@3&+e-)6xtL4TXh-4)-CLw{&$n_Q*dn#X! zq-yT&;r0mW#fxuX3k*a=Kp4spSg~SPYm>6EIdmN1!$C(!$HKzG#@;+UBqkteG94jK zu_$kQ@K9Es>&7Ynv{ zhSSc{ME#>J4SJU2ZO*CKU~< zXr2FM*drQ_zqzx6hmVg>OkAc+P2zRjaecNuGh6HXaF)^9+WPrFo1mbep6|W0PWq(x z7dB%84jW$2i;_7jlQ2{Q_ORe|kKMwB#!fJ$g2m|*Dagn)kcUhhBZ>Kf%{srmnvO>! z;itVH1$kwt^YAY_@~wil_97jL%}kjNbh6F&alzu_rzMStAyj#}YE5f%b8{Vw!^MWG zh@GriHI zq!383IC*SWPmk72Dw8212AE4yyX)@99Y|oD;o;#e%?FR#9%Z3v+|KyCZj2*m-nSQN zoc0F!vS~SynTdKnbBzw4KWCoTThCX!Z$yg)qLJ7n)ic{Ha!S?sBcZwMjCTzUg$H+b zb~Y*}@w!*$4o3<+P$mw9^e!#kM~e~Tu~{hhL=b_EkB^Veng4d51>ynDgZLV^{mctI zF%l}i(N6g7+INP-{cYb%pNDHxNHV{-hcH{a=ENKI=|fumZ~~5qV9^LdbEQ_BC2oj= zlaq4sR6=KiZ6smj9|`)CJ&g!0a}nB0u1#T%C1%wnmS<*6;^1C<&0)O zht=%gzkfeL2ND_Tuo4mz#Y99f#{Pt0mI_cM&}s-lC3BhS=!O(25opu57wT+%cgD#I zE3&e(61v5oeP8D^HQiona@Fehz1g8K$F;e=I7sNMtgJ*6xMQTD8Mr}C;jmRLREAdD zuZ!-Tf|`Ur#zYMnqlVB?raDL=g2iPa-ADWz7ZZ>)+9&2K+DJ|*+?=?p93M=ohohvU zid9Rg>FEK+xLH|Q=@9?IYbu9V1%+Qi;uS`X&C>i9IY7UtsHpSv^Rw*{sd6d*4!i5q zO?;X&UJ^`9$Nkw~V2&@w2ExO!exA5|g|8&c@wIm%+H>biG6h76k7*i!({&$3Mk>EI zHw_I9C+Cmg=`tND=z33Novd2S$$Aec6OA|0CH&qullIF~bOe~-0bP%DTXY#<3Ug8X z^T7pRl0cnyCkq>SyNjxuDGJ3?P^YWo)!yD-e3}<8URcdmWYQOVJ+t^2 znx$Jlv*Yw1?gd!XoRumiR$YlXBd!H43E?2>o2~sN`lgeMi!y#UirBup`}?ZO%A7iT zknt_6Qupd?;}$jSWIjGVbql!WZ(C6H^z;IRJ~JofbfYVHcz6J8W?^C3ba1~sTmnD^ 
z;KMzgfs#_a&Br@@kEED}j*bkXrZ&?Urt?-IoJ3{(j$j~q_pnQE0 z0(Pqc9~pC5DR=h<8Fx3gde$bX`F*&S|8g z+?}+q_!))}2fneZg4wU%g^wUgaQ^9E$Nn?cr!I!qG9OW1InT9?p!4XKE5OrpjeU5 z#9n!uEEK?|F-+NYXFRvb{oQ5etw*ptc`TromS7HfULK~gn#aIo#3G5OENTdQG#FS9 z7V3_`oCe8z3*c8>O;$o8`rXHigN4iGYC<4jLO=}5%WXjg+#XINARGWbE#OZe6Gq=tBCJxkj1w3dJBofD=Im8zRlNH5WUuFuF`(J zJ5xAFHB$0-5ZdFB?_(4Az~x}R>(3ubtkB@#U^)KlZ^3zT8LfctuK3(q10VoLMt=XU zRH%%HhX)EFkLQIIks7H`Zs6kPWSLDv>}}U~oY_}$q$3DPTg+_zOyOE!JLE&*YL58_ka7)QRY}Y9 zq7||aA2Ov9>Dft2028)dX$1kVQI5T+^w=w{+g-*rr?{loX-wsDEd#`TdWMAC$;{sV zm$Pxz9Ixpx;hw}fa=_X_-3q_M#!C!$PtOHyFEEUyT6MLw2nT>zZITP3^WXd<-Qgrc zN#907Bep6eJ=oy)x;ow+N)VMTt@Cb6#GjHqB(xl@iF2i*p^=)m`4$W^Z1ev1GLn$% zG|hE6T2e?@xRgSq#$sw03>yGg-DbC6xug1w)fQ9Kq@;#_+rNOyc6xgHR^(*Cw$1yb z3kS~-FpawXSSe^0rOWQ*4#-eIGFB=+`?DHcfutM$!SvgT{fW-9FX>MmEVuB`)0g=^ zK1^}^gs(tRz(PSr&M7s6+x3^tFJAw6sng_wO32AbPcJK}8NLF_5YlU0^tW$;WMS)` z1*0~S#zPTm0C+_cC}smhv9Ymoshog%?gOe#NVo{(^+2_b1YovcA_fHoB_^s75oC{< zot~Wm_U7;JFQTvr$SD|HfXN_7X+8Dk#HFMb+I-tYvtPY@iOZ<(S65fpq2JrzA2VEC zsH=z7wYaz#8j8H5P+~j2;{*a5S3C0t8F?4P+!>)pQ%6~OV`D?v+gshnrc5^f?BwKE zh}os6m)9Lo5g_N5Df6P)?An`nK;oLv$sG(Zv{jc8VqywIL3nE$dO@kqo%s?QtCUEu zLtCEdy6m2RLrP9TA?3>sBu*NSYi+a9_F$4yB4RcY$}WLN#n09yG(L4uuj+-vi;#-V zM>;}7L-9H7{tnsHl<37f?M)A@tO#&(&*y1^QvyYF4PtLQU%dl}BJarMzp?cxH9haO z5Q-u_CGHo0K}zX@YGhyl>9Hr`aq$8}nlG1O6G`y`=)oaDz+PEQhf8i(``aFF_J(Xi z!@}a&^MEKD9v+@TOQ?K@g;nhY5>YA<@KGLaZb`8cc<@Da)&oCd*SZ?EBx5w-6v4q+ zzElT)L!rNZ{Q}k7)$S+iC+Gk9kwe4c#6;DTc}>ma=;+UkxDZlOZ!a&4@M@P1fE}5#u zEne4^z+;)n3zy6-+C+ADcL&e`w3>o~Le2g=b5?vuusJxyNg6vjRma9o&@HHHYkOUv z8aHS!=gAO=iS;xJ6e3QHn!*IDVnub=GXcMb5+}Yu@uF6>;eM50O`RXBqRe}uMVauW#|-5 z2GAUn1q#y=okK&`ryG4_WMql~-@d(oAR{9)CrKZ*T&5dmTO~=e8~4Y7S=`7wL8X)j zXr2#jN)3$q;apWdOsU>!=dGc1`Apk!CrSd`#_sM7Y({U!tx%;{RU<>FRGtzDXvfWd z>=X;2Lr9p7f?;Qhlvtp=m+78O$O(XZ_S!AX2T4gTlbj<@nR7!=Pfy}~t`IIzE$aO4 z1Ms|n<@3L*o0F51@h|3MV`J;x;VInC764y@#S4{zR-9 zt{EMbuhAsJ!O54WL`21Bbw4%Q+5Q9t#i+Jnn5MiLvNQ<@5k-cV1EA{waiCb}==$ar zIpEB^BcR3_5B^{p;XNB1%@of4vvYKmk(6YvbGGvMz<~h-b_t;0)bYDxVkDsID(+2} 
[GIT binary patch data for image assets omitted]
z8u~CAvLD-OH*elFd1@dO<-4_dKTX-wvy3xrN7wB_d3i~RxCW_y%CFyZ{qfWJjw9+> zIu}#`psK5@Rest@<>luGJ@D30QP`LbxIld$)oaK>>6X+C)P}!DnzmO*p4s-7zt+z} zj}eP^3w;v@2M5&<)?uHCrKKg@fWdT0MdbtV%eb8s6tT2)bhoUmoMem?6|3ONO1jfU z=SoI<6gH8_NchtxrKi`~6frk`>APC&O9lAgT=wpdSs5CY1BEwDX-mfz=_I z=Kcrke$Qk5tj4CMO84I{0AC)~F!NnJtD*7z&8_J5u$=_C9YL&|gJJgi>6DI%u9K2+ zSv{5d4Gj&A2`u!13ky!Rk$9hx=Tq0Sy7rvmM&l1luPg6{$gYPEA6{udL*#Xg?P~=I zi8raKHL2Tp{lA+A5xj=Oz<=|AEEQWY^=zj~h?2V=L3}VSL^!h4)xmD#Ok6um*O@TZ0cpQ?$mg&A^rz zGR43CYK2)j8pK*!TIJ49qN81L*1>f;P)ILVF+5f6A>nxR#HtiG zT!}+ReE)PKX2%>%ss}H|9J^!Wu{_PL!2lSMsqf&BSAYIdgL!>J!wp>?Ztng{-x;q+ zgX(J;e*yl9G-T-r3Su8XgFHqq#&HhC4{DJ-pWU4Dlg3;pPDB`0_-x8F3AyRye#F?X_|;Is$fYJtn>hi(Ga#R;M784pm{9 zj+yysU%FI1u5#wgn3GJHGwP62)QX0FXH>*uGqbW9ijInli=REr#2p+K=33cZOJpP` z-|>`5*GlhYvl3?&36^pE^{8HZS_{jkr^vQ1j#CP^Gpqm%lsyw57O2(tkXM;HZ>-qs zK&M+Z+WcJZn>27~kLP7W!^x~?P-w1~HyemV1M`PHU}a-Nu-PWK{iUb(De>}))KuXs zc^@kLahCMjY=_`8InQqf3bHPfEV3?{xVuL;`}#dB%?o}5)aw9n)vIgFsjB60Fr>;*n&F|7T zFu#QQlou~NGI*lv(Vo3} zj3ra&mbbSzBEd6ZWyLMb8p|{wAV3x$85znyjuCdyBUUA)w2tW|XJuKVn<8g9c$~lh zhHP)V)Z;Eo{EP`fD*t@?bXMm#LZ4$o>RFjn^=<|i zSqWlP_}^)IeRy}p;PJX|;asOz46JciN|?fysRZr=kOWknot@3iJtlJWJvt6Kh9xaM z)lT(RZyeIO4Vkoye=vp6vJH?anjm=ml!EHr5^+nEuV(=+;LvlDz)=gP|DN)I@%@|Y zUM6Q#TauCv<#%0FVP|LWeXQs@H8qv_LzwOqv9X+6$4*f5#$NnLD{5ShjbhX3K&n<9 zEx^P)i_xAv7kIlli=X?VtXtNpy)6kLX>g=_BCZegr6=1@(5v+JY?f1L05;*Hg7N@8 zzjlrH7B)8KRMQoAM823iYHLRp{W9DyxK;G0=LrBYogcAD`dQ`*m)zW>qIKzsjotq~ z2-0M;dxeFPw$8=k>Nsp0PqXeeEb3?a18vk$cJqbN{3dy`dzIT;J@?BTOL8#vUHQtc z*mw+YgxRuD5|yOwjABo;#sIQT|F{@zG{$Ox+0eTM_TklH__Eb60+T3 zD2l?#A2hx-pYePoDG^aoSDzH1YaBXCT6gbdXZId!;pGfOVylk2hyG#Vjr#o}49cjh z5QZaPVsSr6Z3x>))+#zZdI%;upnIHF^$Hc}TNk-<$~cXp*wFCyrMscT1X@1)^x2zeXR_orxCLbr^6 zBcsI|O!ZgvlbRL_J-k4g&5C=_8Ke+Fk>=sNs%{f-j=3g3EO_{H6L(_Lx{dm6plP3#l`I|r~oWrPa2fGy{O~k5%-+wQE`EnGM$#nhFZy;~9e?~_wxE-VpvWDQU!^gsH z_+=Xlts21KbUSy>mG}~)4BR`IR>=LHp#_R`%Xmpki^=u&SEF0EGK_O(CMU0R?xVe@ z!fl`P4T%M)M_0Gje)4bq(o(j5{a-Q7qiAl)S(AfX_QfQZr{Al)S$((tzT 
zzW2tpRA$YbIrGQfU;TuuC`seoB)^Fu2#&0bgc^dNts)4D940C}gAy_C3;$pk%S%fj zSO5NGH5SGp2n`}DA*SJ$wvle3OC(LvVo|Oh%|eMTp(-l*Au9*-FL_6V)V09Ij&` zNBjHtF^ow`@bN!dkCt^r-e%QrsLjc_$Hnz)YHDg?;w;sSiJ7@)YU*OSCv}ENlJAOz zO3-a~Ym174;*6Ov>h){g2DewDqAfKF2c1TT>%W>WPyZbC2H)W`c*9}QTU%d$o0!=9 zr^NPDeU01hd{9u(%j8wOC*tC5^Bs|RRD#Ei9DAQ)8MjI+J&)|91Tb&jJe}rCP}J5Q z&zDbp{P=Oxt2#bDzC;$ih?tm17-ia@UQE`$5^&vW@IL!oUjEY2(NR@(0A8Atlfy5F zR#a4^udlzhwwAXuKR+*@$kOO>Slp%_KSVxSrYDYX`XiqHarxiX0W`7LK9%@#{YHLkkJ^V19}f2RZen4bnbUu6O<>f#ygb|9 zPRHWAYPR`ZnJ+}?Eg{iv@#q!_i3?mRB_##7`>%9(c=+z#-jBY%D|32tGc$L0_cwKy z+o&;}`+Ix$dZ;cFv%I&l&udA^$Z)6xNw2zA_H%P{*PQs-@sX^uvdf;Tzv8(iB?4Z@ z_8c4>f_Zgyrw408=H}*b!-E3@t!SZm)YGLpHMo7EGBV%moc}~qi`bc&Nn(!<51-4? z|2X;f!Js*#bK%REFU~gA@V>TCyw1o~--Pq?bC}`26t16LU2C2zlF@0~ufxM{R4A`F zd?`0*Hh-M#;^HE~PE4H8r}TAvysn@iq{rOXhvf=KLQ;}hzrn3Dn$r91@5QwHNz8rC zNgcD+fNNMJOfA04%gZULsU^k5!k&LO&dk;g{NmyWW$%9P?UhoE&m+LO!@EP#+=YUM z@x3`cjoaSQ@y|}HAFNd$)2zdXmmFMNpES$$`T6-N$DVIYeyPgMd3*o*cj6;fT6+2$ zh@pwe(BL2eDXErgmd!l-Mjbi-vxYx0RLEoF+TAJPh^s@9>uX{(d%2&O3r`AK$(uXwcTt zNzKeGheu)IBa+CtxHyAm9~{K;#fyMQOLB7Zf`WpmsHoq+e~*ukOG!z!=cy8TUz|Ft zsSzap{PpV>C9l09EN+=?hP!vO^YYNe@UgJ4q@|@X0{soV_m>rAWp5x^wY8)MRe#1m zl|)3yS<^dwnS@(7KldV!8mV{PZVkMyPDh;-J7=IsgO84mzPPkhR8kT{&h?|EB`Zpk zc}4wJxCA{t{f{3%NJvNy4-aFgMOYXaIUYYI7jR(`p8D0_9~&2UpCnS1sH3_3?OU~@ zttk|QjEsz#nc4gF;EfFJ-Me?;{$X~i%zFZilarI-YWey3oa8;@;~uO1*%}N9N&bU_ zgO!z!jX!nVmr&*8Om-e=WF#awnE3YXDyFx$EP9;TqEC^;qNhj~@FyxPgh$)``Bw6(R{pUwBA@(5D(eft)2 z{P*qKx5mcCuvT%fvD4Di!+WkFSq%+oZF8g)6t3&T^jutrze+T9Sw?$$6ydSF{}tor zyP&@f)O-$23^#7v;3O}Fm8#3a!^ZaU{QAb8|V&yOWgC`DF{I2M70HZ>Oc*K^OZ_T1t86 zjwNU1a!AhV{z@OLC1N6?5F84OiMALu1A|Q1VY#^`mX=ytTBHO7gJWZsMn>!&2bOEJ z2m%wLoGHxA$_m5hY;RBVtPd(zKmaOTAP*C)K_`A8Mn-va3S&!4nY5qs3JQ5@MNk{> z2@jWO!lb{2nSt5~GXs;zF7h#Zb)+=4veE&D_sNq0iF?mKy?AQfmztd1wd5e1e4-i9 zWkn5T89`u?7-tn17uVH!NJs>gm9cnb5(NH*dDbLXG(1 zxN;9O!1oqILY;XJd4<;b08_evtCF~Q;4xRSg1Mz7U4zHs_ghFyIN>Av6Sxp;LV~3t zjrAW1jNw){f}hOuqXj+5mXMGT^;I 
zH))8F#{nkt(P2q$p0J~{^JwFD9Bs4R6u%p>r0Uy4ETpBse`o1uq8m0hH}@cYNKlX= z>EFdJLZoGH>1R0obhB^f@U!u;F=#aJ=yD4SJB=&e#>K&!c*RZ<>HKG;G{z_Q%CStp z@g|Za=w8`f^3fH_zP||H9bCk4XSUVH$EQ8m%GP%C_}GmOP2^tq<;7WB$;TZiK~9_F z#8)Y#Ax||m<3a!}U|~fL4{I3ve72pyKx*ChA5x*-SCEsdtgdc}agUURo!5yF!|mxt+XksxBNJ$lJTIqdpBX`uef=6c zhc+VT>}dOu)qZbpFO(b=!;`(GL00>f-gM&%E%-vx($^(b1-Z3S@7%ewga1Y*4|-`~ zVQ3`Wo;VJMH1yY@p}3?HDEughQW~$A*ZIML6LuS{4NgwZu&=J$(@&u=4-G+uv|LCB z^uZ+~D~p1_jv_=`bE@ctrm;`Q(Q;1 z>RmRQywAcG>?tTxfBcZk$rGi~sxZE$nktRIxRU9+ynmu1S}>JfQ^RBPD}MqX@e2*T zP6b;C6R~%4Y8S7ps*-K|^6uS8qvuiBmw>t-Q}wQ)I20j~;T`jXgKCIhW22Dj7>vM^ zY%Luf05hLnJ>%A>Mv3+bl+cwcdMewLIdZ8J{y-=&DEC zGs;RzO3KPdfB$X`=i#Y&|DKKxvAvC34)F84jtoyBILP}2lRfopI!vnfSf4@EKOEEFAii!&OtX#Ds zIXSt!5k=haK9$MCgM;{l1RmS*1Hgr0VVIhlnq!(`iWPNr{Is+ZPoE}q7|+ezjzL-f z{q5Vgh0d6x-Cg&?bz+-Uc{#b$znh-t$Gc%+VH+D8@b?4xcP)~|@7|FJij>lCLG>ei zsNlA@*!5?${AhEc(~xb|$wNsAmxY9yy3e?Rh?v+FHYBW`w{IPecZ?(@B_BL^Fi&Iu zyV4Bt^SL+;4h@BA7xFsp@9tJGFrcnhe#OJdsiCGe4(kpYTU%S3@9Ekjq$Nu%Fzl_p zY7+oy9=k~yWo6%!y$4?Y;-R<+GPI;2jqdvxBfE2Lp$n{RJUqACKEh%_Lv~sfU{Hgf zWW!uxe72nsYik*@xa zJyOo_f{B3j$H>6IM&Ci{QU-Gt92{)hfDcWJ07k%uTSrfCs;6fYu)>=+Z~XoJp*VW} zwP_E#mB?xkLir2|?_`+ruzJNaB3JIz()|3xpRCP6Fx}G906EFU@S)5%lo^q_9Bt{L zi}5?JJHco|Ap_J5y;Z$P3NV2f7#Ls4g?ul)a%k^3;E{8FpP3PXjl#+41owpa1!LWQ;(vU697Vx{iU98j z#C9wRMPh6Wy{Ov;^N{Yk-tLJw4OgNdjI~raRmV(p-$YJXOsE#%*;;# zuyAs=i)V3*lJ^s`v$LzJsl6}}^Lt+_4h((gsIzDwk_4~Vgg zRU)Ntx9ZK4;dTr!P7m8X;d8JMxASB6bD&6oqyQtx$=y`7f#)SCd3bqAk^Wq1JY!0) zEc&cEd@X3wg2xLGez3paPu?CuL_|c0ByY3m%5c8?Gu?O=bc0a% zQU;`pgTq6kBaG~!wdm;R_B1z7PkGw6WC<1JCWP(Q*Vf8VFo^+P`S837my&;Rejchv z+=$K0()W|->cU)UeDu#~n^sz;f9U}*?>tJY$arjav&)NUG;pr(Ivy2*JS)|f{16CL z|7Bsa0MgRhYBby8f2<4u0jf6O%Hm=MQdHsuSj~iSSp$~M=X+QoMo{K#JMmVQ!)?R- z{ZVvDI}_4&cI>%Rqv~`0aj-TbUrkOi;MAz ziIE+~=W2?IIFm2F7`6s5YLO4z~t|^ zc~RM`wf^`~yjlEIL!%mC0klEfP<2hs(9oZp1urR9=ca%Bi0CpUXaj;Fo~<(7 z;5}%ZXnHx*1ho;g8Hs(@d0~`$PqP120qv&EndYmX6oh;Z{C6J4(L0U-^>haftGc=x zB%FUu{5%&Y5@0}=o1VEd3o~R!pqNC&W{LA#}@v4ZQ?yD;`ji*mVJV`YT3}R@) 
zy7#4rfBl-1NLR>OzT3!LU`xWR?F7vl6a|ne{`~pVrf2sL3n(gfBpd@@2?-4aoGu_J zD9FNM2eJfoL?}qpAC1s-hYY?rJmCC*2bJc=4S>&NWR(0)mdoAAneoYeZaEtpHUa|m zFb&t#tD4z0BhU?E8F z+UMPjN3dT~6s+Ls(bxcG=9Cz9V+TZF)9-nuV^i5>tTMLCI042YFj zEKK82q4K}wQYmo?#o-X~o1D~TWo2FSoV%G25pk30<)C{Ui~nXfMh4+|kLWca0Bjx= zN_u*F1*w?8_jvjFryJZkSXg>nT4)|Uy3S%Dz{f`^;?n@z7MdjuKE7-Ar)S}YoioG3 z64Xm=VW5SZu{8Zud?IQfrs=D4&HksGb{B_O(S#oBg`!q7gtenF~KO|B?W-~@+a2C z7>Pe0T$)z>hKYiDm08D$y_9`dQ$SW;zTWGkG9dwWFVQY7rlqx#DX1f@t*sycu^KdC zV`DcsEd6-LtOLu}jty`I!2V@mQ9+@hXJ=F&_w)?f=@~+ zz0X`&5?^PoX`|90_oScAan*KQ) zd7u^+6t{W#_#V*G&e~#De{ytixQUGo%qTE!*R3rlD@!zY0KlD&wzdqFvq+KA+CXk6 zj9`Is=GN*eDLy{pdhdcKcB0mo*pNr67`S&~cD6t1V4*TKcH1}`-AkO>paWS8Eb%`n zupJ2xRfRM0Zja}ws;ZK~u7s!NMg~DljzxEJax(8tb6@xt1^P8s+eZ-AeAdFLZwH(@qJ+VWY_m9AEUNZu?inm&x@e-coe=YA5e}nMhWV5dU z4L+}cKuv|!jzK7tLx!Z}WR$OeZ+*}7y_6*g?7a?T>OtuUl<(^76our+!9C_PUmT_D ze%j#FN*QQqNMHl?1*z*nNQ*Piy?(E2qlNj`W*2oSZn}B`FTTTL2UJ#%Vra!)@^vkH zaOVO@q@w0N5s{G-N~)TguHR$tf`|dQr3H;0JCKY%Ljjin$T^1)X8FZgo zAXdJ^LE_{?48RUpMgZ5uL$LiP zM8<4fntd*)guRmD{AU(SE6j*P@YW(8nd?(p?Ea{Gb%Lljs zu&Q=0oUxCw@&15Jfm=`f{!Kz+RuWWU}9_xuwwJipL&qmyuH2WN%#ps zoP+8>d@~p%C(xc^A|pX^g!1kD>Q(E7=$qBi@=Un@n3z7`V<1=%5)wiaY%uA>9rRjx zubMlX)@|^%I6+}zW3z7y!GX=pNaXj&8osnIQ2S@kMu1F%g7k1xqC7`Xgoco+gMt120-+A$*3!t+g97&-X1<4t z$MU?smx8H1`gKl4p`oG8%>X~wq0jcE3jmH>oSQ>2d;R(~0N-E3!|?OL@$Ld30HC)O zCY?TNkA}Iy)`43rEh(YkwCo3B4-4vDO3LZZT${+{aV|&@K!8*fbaW=h#(258N7~vJ z+QSLK9Bc2mUQ3o$SxL~_B%YR*CMPfdfZ#^g9lQ16B3xWt0AK%B#El!W%E}^+%h;1! 
z4xnW!Po0|>8SU;Z$rBRB8Tg!BJ1vO6zY-cOm(S133k(L7XCY9(IXL#e24lepZ@pLN zd(Jch!*dJ$U!jGr5*QGmrmhZu3knIH9sW5QHSiT-v%2@DN}e8 ze7n?CELM6323&GZ%+Gh-+5o)Yp`^^tQGTKItX$vL+#KUq3^+OQ@$v9k|BN!J+&R2y z9}qvke*LP&Gy=nvO_<>;G&7C1)Ev*Pq?ABU4P|#2`*l>*!ootnvJx>SI>7x$wqHT< z`bUxiNlhIN*PjmR^Khp^t#EXEIv#TtcpWdy<#9wrgl3tp|4kmB$4jt=Ws2F-l?>`# zn73;6Jg(P)@&>(mS3`IrMj*rpl(hDR-zW&tO*4?kj_U)IMqXjykfY;7g+2NwzFeNW z&kK*j4dGy7`t76hf>#GFhXk3M1H;-B?RZ@gjvwGfW~uGwGIP&3(xlC7sBSng#|RUU2c>%Hm_dgQ`1h^CaaIAq=6m&{257yc2O%&%@f(h&-dx$ zNBnmrlj?`QNQG8fW@g`#L$j;YqF2zr86fHH>kADfod+(}_Fz)4-eu0XG(Al&qGw`) z0`WU?jRp9&CR_s*K;rr);4)Z)-^%Y{R)V;!%+HrVgln-2Ldgz(xWqr3d9aD~X0Wk^ zMdzHoJv#`^^M0*xon~mHEC>ROA!87P%I9?90#2RubaWqUY92kbFf>FO|ICYuY~`U9 zSKwXLF0^-cT3cDMlVY!-xm-HaBuz3)ox%05UAx9f9_6+r

CTYaDlkxT{mDo3=9nI zk}gy+Fhaw_!!mD7NZ&!NNK0eF$8>UbhKjE2z!e8SzkPd;Bof5I>)50;Z)D)^XUW#` z5{D)yv5!#K*VbV4s^8^^(zpR5va@3eY+Z$~3F!lwvshD~Pay4s+cYTNWraf|3DW}> zqMKj_c%1)UK}m&(-g`qtLQ<*4WVsQf4px$mPDy^gr1r-rs?}iG^{uph#Y5=m>47Xf zeE4wi4;yI+C|CJ~h4(QBp#f7+Q22|s%-I80eRYHdC>rd*oz-Wg_c)%o<$`Ed?Q;%g z2}DMa_n4VWVRN)~P+KK~oDOTl+Ij*IH;@jXC^$G#+}vw7gTKNs!xRltf$KMZdg?(* zNojm7hz|^OP;bOkp$v|KOHIE01|Jh0hL9j!0_anwnV%Kd8_GB$I7rL)?_rDjH&D}m zJATmxZ^o!CW3ogo`(U0ogcXL%$yP(8g6zt5k! zr{8@o4CF^+xlyl*E|!;Y4?#kVWMySl$UyV%j5@Lh#tVzv@Jb2$WhtZoc2In%?|p|3 zz5gaxPVNA50MXRO&d$!lVl-tQY&c&SSWq!(X=!Kc`wYe^yX$HNWse>{{4wc#ozHEDM(}Nc(CX@HOVyvdKnDT3oDDKvmVgfe z_zc;KB3khzzKot1iS;FkJPdq!X&1QtE6kb5;O!N)#w*ZVB~{gELX4<|aQCG#fEk3I z-oD#+b+k02f0-|)8ha(L7_1aAX%bL*u3m=LcWv~vx-th*f_>b|wlPz}dwG5O>ZteY?cE0_S7OAj|DB9N zE;hgHu_Lm-v9(Z{)zt}+Sr8v!k*KiF)H#<_R_=|I>OcqLbX@KR`vb(2GcZyK!cDzhh$O;K0JgEf{E8@p=xN9dZuP z?Euz>NqgakLxIM_8~yd`pORrW+P>t_*w_fc8CavT{127?t<<+MH8g-JyFsSfKY_-Y zP*j^3A3rueet9sY4B9fJGT_32zdW{}j2n{=v+H%3<2S5wb$V4|_BIJoJ8;p61I+tvLzcBR4M(*hAU~x>APV+Qx?0)+8hrB61)-B@EV+ zqvJjZav37Nb|xlOKwY3BgTME_-7e-@-<7-_)RSAB#hkCg7<;vD|5HTk{}{&+^C z%%Iuic;^;%vmyyllzFVZcTAKyg`+&e!ZTCe>mfLv7$|_HmLpETU6WCqTyox6Cyeb$53c%v3rS7O?SL92`=> zZhQMcK1pF2Kt9-YO|IMeNJ~7uI&`d~iI>gZF5K|okpnmz3_Ji4C;jAS>_kTp-e7>1h{Hru^69xRcH8CgLJa7u zO39BU>4>zov>@R3no4NAuP>%^;pFTLG)lXvx=qL~p%EWJ-vs3lyuzy${VZ5#Ki_eo z8a=)3SQ>Dyt&wxSlG<gZA>D0J>i&p!kVpa)A~V5tQ@d+WhvV*}F}+&36=x8y4MK z*fro6Ke;!h@?iDI|o~Q9V)b-Y&GbZzVX~2gAMpi{-cB01W z^fX;Mid>O@veq`}$=xWaK=BpAN4WRnyU%>&XCL^BE>^56p#~6W^?TbA8h#P9jJnREf0TF}dpO|0(W^Tvw-OWW>Vq ze)f%?g3}Gsdg|5&{(URpb2Gc><*HlDyU(}@3sOEHc{$^0SktXTHR&09OZ&sm8gxtq zd5`Ap^5tD8jF9Xt+Ad&CUz-Qc%PXVLG_$YcrmaqX*}pAy#tb$s zCJs*72c}Qe)nw3W9{myn?N8)-h!GW4nSiT+wl;Z-{K?5vDk{gV8N3thwql{?@@ekJ z6Cr=ZaK=yJ5A1EZ`Avy}6M({#m;u;mYs6XbPl6(CcDMx#~$$JM^SY_nbFJ?x@HntDd z)$zO@|9)a-w)IW(6844eQTy1)%!~`X`3fzszZ*9mG9L8`;gPbC$iHJ1D9OnYHf(8O z^~JwuTmfkrc~#ZE2L!TLmo9d8@=cRC)RWAR;XUnA_ZY)tr(0^cT)JIO#zCT(Jij;K 
ziH1gPQ0FBVtVkHCiV)SUxH!J4!7t1!{gUX=}1h_gk)EW8(1Q7-3ph%+DJG2Lr}o$Yz{tGRi!IxlX3Y#Vo3@ zpt3)#l}_iEgX?6K|G}A&TerbS`2Ky9iAu_mNij@8u5uJ2;YB8AgC>*hZHi}hKRn~` zXi1;Gsq?xRT;mYgszxWtD<3Ym=S}Nj&7T0184l+Nd@?of10;72<)L8j}>{EA zB^u}+UGi0Wx?UU=jk+_Xxc(W*#dY=2QB{5QccY@Afr(qkY;Q67zVwgjZ~L;C=&010 ze_C6c${cNbx_|G!a#hOUw?7CQmF{_V4Yd`bN^jrhYK$Een7@(ap|Dul<@V-$RZ0ou zX58u^i3OZk$b_B_zg{wH`b_y(OlNo|&4G?dn+89&QY?31M=A>($hmU^-UnRwP_hOk zo_@+l^~V(3xj~s%{I=fWXnJi(FR!^bnZp7Uw(;R%Y~PVzTmYe!b#y$zDyvoS@wo&j z0f{;ljU3Pfp;=m3tSEbPs$Dkg-~apDko2G43YocGH(iXt0r2-BaSB6a)itu$XUHae zszRAn3xO8ww%My=tq4*~q=juG&PE#_9kl08Vpn2f5H&$j$`HY$c~Qbi?tJB6)Lh0V zY#Do$D1r$|#s-aQU$RTyh4nc*bLrR#u^$-TddpPZ}q`e1a;B- z;1zUxfJpiV1~1`ZYdfWV>5C8S`Jr-#?z3k%d`XCil#};{(gm78YU<1>UAmg6)_rWv zsgR;|AySkwWXTC6Y3#jTlYK%f$6b;eQjr5ERf)GR?s*1Cr&w#gyfgMH?U{AA#6cC? z<$O}lf;l$d&Zw|14L`&3@f&sGrGx@K1@5cQ)02~T5d7P>eGeCfqNFJN{2qR&Z2D4n zFPNv>&QwByw5*M>R?UYOG?+B5vHw6lYH zo3BC#+u^5={#3pl4Nf*127;}^LNBw_yxv0&Ne1;{hiO_By7+}qr6lg>_k+sIaj@s_ z-FqZ(WvXpurAUb&qJIXeQdtsS{TTgVpr{btxgaSavAD8$^l^(FLUTVno20sc#HVq% zoPH!>)Jb86Kn^3{r)3pA#q>YtUqp+seT^^{BRoTgv^y2ERfOxs3)PB1Gby$56_p?HodT-N@jwlgF5z;7Pt z{%zj8>E>$E9{Ismhudy?U54O~i8U!jI^>Fzls&Z(%P{PfR8EN}>jGg>exfuW52yFh&+Bl&hL zjEuU@Pq5&y!U)yqdr&)pHAChCWcvH-h6d%g&Iq6~*{=-}pW7}j5$EOSceJ<5^r?8s zUJ8i58XX#9Wc1{5)_(S^0K%y>nC$o=6B83PB1#YAzPM~^za_g7k@MrnVzQ*O^8FAA#zx4}#DCRpWW3L*!>tAe7olPWjXqx^wJDJ+wW-?Hl?8T2SNYEwS3qF7W|tx8$y_j&1t@|lL<{@Z zFay-+M5R0Xzkg>EX7SyqotBte%o}ufEM0Y~1H~Y+0}|68H$!%@&;uas>;3){8bXI; zAzuCbn88ohnDPrPVl})FBxV#?R8Dz09ufxHxV-!$k1r&cK@}C{Q28ggYsVqdda{38 zPW~q?fo!3Yw)g26_;=J)jxya>F`~}ShXc9NKp=H=&RS(7h>4$4@Ys1lrcyTJNp=?9 zY2zLoX*k+js^7O$Au5G)4?{yHY^0#wfL^b1gN2p#iZ)jP^37nlFzGjtBD0?RcTr`E zk*_P^<0lQeEJyv*CUVXrAGHSXNlFLKyV6&-sGi&ZvTQ5Yq4E^5_?6oKrnjprN<=8L z=G96>mx$tl*2fQ(b)PYYjYYpc(ZIHBTnj@j|Ll@?LEGne*7=%(nI!V1USr9xK|F*8 z#rW|2di!xI zAi-59{&Xb~_r+?ZsB2NDeJ1N_iV;CCA0#E+TsdSHEgXu(l0TqZY|BffNU1}ye!4D@ z5Qa)OB_Z@u@8Eg85sq*PCHb8?=U>kbwH4oKwPv1J*8h$ES;Goe+Thv&?)E+#w$9w= 
zYLS}A)3XX3oNRtI&rA^y^VXbT6-3y%llCD`!kQ(JTIj=Wf>bsfKY#ql>MbrWH_+F| zHG{2yjw*I$T?ft>Xo?_v$j$`^Uu*P?Bc9i9#QFMU+QpDD)M)I@ec$70XOqWk|5!{% zkIuI{{G_0}52@yw_SLf*Y}uJI2|JCcN=n}4`s8-F{>6G^!@-_38mqyB*!^U0t-HIr z_aQ@3QHPKxxsXSa{eslu=NTVp1NYANm3>fZP!O<|LF*_zNwOA!V=b&SXIF1!RC3&zKio>0o^weF-{W9_ zIvtT2n&qq;BZ_!PR({8Hqs3!`*v!~i&(DPbaG63@{=^rmaH zM{g=`*E2Ey?3^67ZHxLP#xrWZy+WsRFJS}mz+mgm&?}ESq_JL0+YeOP>sj>l)8498 zo9#2Zb)A`ae}A+1#WoG((IR!}KNR-8>6wepMuV@DmmA*gsayDHH&sfz9p~Pc&PaH7 z>DVx{Al+45oX(f_Hksp5?H5Hz#E-|*E5%TsJ$mp!uh#x99UYy3Kx*~YLqd!|mi>D1 z%{26MC1r@PxbK@o(ijZWKkMtVs4&Vf9*7@k^9DtS=c>P{09*Q=24p=zjS~c!TBl0 z_kgU703p5ch9DyV&%5xu;}| zoVC8g+p)0w4TmC~JlVA6xTO>y6<_jJ75Fu6>gY+wCFd7c&{TjHp(Ah@s^a5Ew}Vv( zN%yT>zkWT2Qh)?ugK#0lz@arDeqba+4gta!O>C-gnCXWc3;oB+998iJd6S3Wv=3D! zvc#SEm+^Hvx5PcFy!|pT097DWhTD$3w758oMy4EsL=988cgCVwc{hIV*I%`nT*u`T zju~^>>EGZe6z|Qr0kQ4hQK^g{N^$6k+P}-}@7ziptfWoe`U3t42A;b`J1bFyg_4M! zVHOo}6(f!1Pyurs_sa(2-Ft+HpV3$kg>0@=l(d}8yg zzDB3d^%)~25mJ}mbZ%}RrJ%e)w`s_)b=xfaxzZr@`yl-#&q1N^g+(-3VsbK+U60-K zj-;fdpdi|cFu|ywc}1ujfgu5Y^2qk~0YX7=XMHVOh@Stas) zS(%coY*BeRgaJVo3N8enV!1E#3cho3F*$-?p{1tV&F3-90S3t@*FDPJ9W6mXUSJB< zRL36P%T^&GAQ-Rc^7i_60cVo{f`KY3tEANX{rkTxGXNgI$RGn^NA6Xc{X9L|-dSM>H$dSwJF zm_&?_U@Xkf-;s9eAh!Pg;rC~M6dza|a5BZSW@^VpUBeVXk`(IB?!l=+m;Gf%$9+8P zmmzrDKFh;1gS$g(PX40kSFEh%%8JZQdq?%z6oFzB)$W5Szcj_g2lbm-SQsRfGF@J7 zW{-S$e!go7cE~GmgF;-er2HBCYx0kF8|6zDx6Y zr9)@*(n@dr6GrkkGL^@H$sYp4H82scVKe`0bGnyDOh))V)_r*x{r%w*AIIfdW!oL}l61^WX8 z?%V#B9C6wj?rRni!t7{+eI-^*llDaNK!A*B*!(aHcaVU>4^o=%nOVyTMpWM~Jr6*4 zl-hp$stH*bK=k}}d*0}yepvd%M*DsLE@C*KRx`C5T8HU@ibP0$asSFAg*_zfR4*vz zLY~4VW5tW+Uf8PnOw%T*;#!={iwB=biL1;G^4YH39Ucbf>vl%R#87kFXbH^N3Y?hy z>Fqt)c--;?Me4*}ND-A*r>6%PrkzAj{vb51s_3(!Okv}1Wt7{Q)ilx zx*TrTwuK5f$n(c^npAb2y?UxaR6A*&yyx$a=c6~1N&3}v&~suHA~Imz{gZI3nNAjq zp6|RZD;rg~mVq<+?uXuG6vfB8`p_FHQ5@ z4^Icz)MMxw9>=&hB?+#rj7)Ao?p3IsN6_POs8lY3aB>m*&Iqr^q>Gg=Fno(*CkXGa z4f1CxG;moPMn%4ttLj#|yi{}Y+ATd37(O(b`cS;m=S-E`x0+9D?8!L8koztti4=K& 
z_@OU3cH|Ri_!${vhuzw->lV+BfKwh!Lq#SUVgHu}P|nX?*hxzhoWIEzjLMrfcnxck z6gwn3`g3=>v#kih{WoHNE=P!W@Vh9?7or6PVW@e@q z({!UJkN&XF#o?$}AO@V6qz3=6YGqCadNdqly0-8rNr2CFWnZ$!n}Fd5*&}6^%hf1U ze}syEDnpNzGsW0jD8^EP1wX9a_;(8Rf2Pm-ZA1iWq7E7Y<(8XGB|uvs`saZ^960rs zg9WWYCVyRnfw+z6g>@AYcObRMMVE4ZaToy2k!buq9CyX9uIB>Jbeg&sDI%BAB6A>MsoXkD?2(>IfFHg|xSVc^14pLT`n?7!xWi$Gn-{VA>Gr@U*{8rhcFqPq9wte=#iv=Gsg|9Ei|*Urm++&@sNQ zOKf&OiO2V|R2Q&T!_m=TxNVbhdDJAJt$5WfV5#CPpc=(#S*H+T9L-i#{?-1$O`X>_ zhOU*jrEasYzHZE@pCuX{+cnAT@whynmY0fhD5`AvkVMaQj%vk(v{L zdPBLJGM~3hOfB#t z$;Q8b|HeI349HBC%ZKPcNYt7d8V!)*7QfRD;T1Tr;=Ozz#P|$%&B!V?)4+XMV=+%w z&D1x$)1(rlC1Z;>j?kK^f z$(nNdRK>qP@{5bdCYem;hFgus#tr(EGucI%-l1R5cRTxljt~)PPzs&HZ+w6#>W*8E z^yTD*ul@^@xIe>1gq)mo7JU*EzsZg^Eiln6dOsgFo)r6vq_gMZ1b*d>`8orqPwr#p znrpBxv2J2T(i8d7z^S-;<$}^u zA)~MD4GjTD1BS*G5;XY0{BGY)G$f7v+;9zX+8C#d>74jvt?A~*8)5|G47rTU*W%p0 znaw-;*3#1@;#V!^jD5)*^?`e5(J_o$5M1~(67@j}l%ZB)&ajY>Z(oBSV-tYACDK)W zk!xvbIaz<(X@8k3M$$s{b$RjE0yW|cL3dCsOFW+gdH!sB_Nl07cvyjvwMJX9$9{YI z{$rk#d6+}9Uq6^Fdg~bI*-;Qw|JBh?P4e%~8Pvg1A0HhZ+H6bSy4}Zd4VB~{re{)0Kcgx#>~5PEQV%XvUm4Pw}FQ z8Rs~@V{SAy?0@l~+4)b(=tB7opMCPAMyMw3|L4iR@4#JSR086Ae)+EP>QdxA;H7 zMg{6dYwTTd$c|4_S##V@T+7Zjf@l-s7Zj8?TE6giYieO@di_bL0*EL(c6%4!5OiN> zg@^xAqu|Ox{ETu-FPjT=VbiT$SEq9ePMK_#he~O5%qCz5*|r=NfoE5b=T&A<|HZe@ zAb_OeZ2(k%CGaB3c7BbHvbq>#5M+jjX_4yn=ej2DCn_y8bwjR}$yRYN^BrT(_Cwrb zt-#_U%lW);1UbtNJZF_cX}!Mmg^HOyt8qWQSX31%6K4_i23u=<>ReXG{&4FT7CSGo zD$E57%uJVpL+xdH(gZ$z!Vf_f@SKubY$a^1MQf zd!*+|!^E9;gL`$dgNE&^Hbwo6r4Oi({(wDF%nM0J(wl7e3)rvuNy+A7Yz7`OX1{Ie z-_f&x^Cpz!1lmdJW20r;VYlwfOpKyZU*FQ*JwyCI-RdE@_Wp`x=1ll#>-tI82l^y* zG~~cS$3qiCmMuKtEg5PogKDEePo(?)?N6t3&FEjE&I&o4-oGB=+iB&?18H!OaO%Pz^V9932#QV3I?qG{YM7)Tm_FP-omniV+%lJH?%9ktUyK{WE^+wzy zb~x%OtXwC)J^u9Qch$iOmarQHORkNK;12FuLu76uD*f?VlEl3imb0ow@gMuEu~srS zMR@BSEto5`9JXx68teO9Hs2cYU?KRC9nEgtT#re5(}YswBvDZZSq!2Dyi8H!WKF7) zbu&m2(R><$aCF#`IS#yc#2NS|-r0PT;zYYFmr-3WHrAJ*e_8gycwh!64PVq~kROE| z2{T>%o(+dN;*%dB2q>ukL(^GCRk?Oucq52}AP9n_NDGK`BO=nB(xr5F3K(F}AT8Yu 
z(p}QsT_WAx@U8QW@$$z%j&Y9cXW!4d*PQd3n3z}(9(Y4|^b-Zx0O0;Zt=+DG@1p$? z$>%9rYe$DZ3`>wMgn^^;B_iZhkmpn9TIc*=+xa8#^MRu|eRv%%s$pZA>5%^+hA5V` zHPSB(83_sZqcuC%Gxl0HOFAc~KgB2qi>}D*93xNI-b*MdLcUs2T;t!ZIyu+WX9;RJ z)4CEqEQz#m&D1sg^E;iXbWK#w4C(^m#c?oYviq&3PEGM%@GM^8ymqCl*{P`iioVrP zOvJ_UW`pH=-}zxJVU>>FUpt4Bp4{P#0(RQGbCiy^~tj zx!FFr_BNFQtSU&2%i8X6n>6*!&2eI*P+gQ8-Y$0_+If*T~bzFCartpi8~_8(D9 z8k%~yedY}yVr^`|XS95+oQlw=4DEo>XLIvwSUv&uBKG)J2z#8t&Frq-M1@UL^9z~NP$5~ktDH8 z{Ut%FtH)CaRpjA-^4y{4xJTdROq5X1`~IE;7LVPs@#^t90rolbi=)S3QBn60)RNdc z+*gOG#9|tl?q%$$J=fjJlZ}lA)gMQ?LkUs)j#Q_-R5ji|%WrVQS`c!xOY9mXrClHphdP<^~^m&sZ);7AVmb6$AZUvtq)uwpUif4E;3t zh?P{O$21L`zttzUec$m@Ek4~Hi<{IeHEmitbc<$1{G)GR#T9eWeZ`((p)7XDEGB5R zE^EFg^&&{##!C+$$0gE3*|?zClS9eklw>fVBoUTt2#nUZiN%Y z-q;q%|6VD|K;$)HU+^!1{h5`(!Q|v9?|22{oAJvACiA5hyS3J{4-tDO z)`W-5;|o7WR6-RH{RtTxN6NO>Q&GtxB!!AbHoGG!>DRND-Kf|Hfk9G2j(c`z>27jz z1Tb+KH1s_nJTlw^ILH63kHTZJ8e3{!oX>&i_GzsnHk6BF0mf!2r16@9Rn{v+=*hw z(k;?g1fw2A?UbQ3ABoVZz^fL%^N{_4(#l#?r>gzAtHgf^d?1d#eFZ=&|Sy~-udiiqo&pzQHA&T1DZo=mQ z>ZKhl%MDgqw^1+ZPq4Ex*U(W$?<6z(`#U;WCB(%sKID9eLydL-RzsnS+51Vm(cB|H zQ4k1fR#rRMlE~23DZUISCvtK?02Pdpu^Z;Tc-~=&(eDrk0t(uPU0yiO;IN;anE?u4 zni5fBKO{H;zg3zi{ldY;9b_>oudb$j`c#-i14Mkt`IP*Z$twkD&|5h<<{&&oNObgn zx)B(a_`vGQ&+p;p77J0R*J>Ab!aH=Eg?pLjdn$#?yxpv zsGK;DdFaw_nq2FdYKUhtb}1HzTs-E(yHmf%HCK$J-lj1m5+}RgkP{eEGi0OqfDbe!w(r#uIHvVYj%`Z zq}Cj^Lh?a*D8h2T$+K#kd|bfRkZU=))?WJyhbZ%G^_LV=W8;|{_kkdM}(tKESS?;;yb3jZpZ*M=3F>Er?EX5KXwO!QMmeO z_{S4QM%1D!R+Q}yn^7czUDY?d#d+=O&Y4f6V@CvD+IjztvdLslyxeEpGkFzu%Z(>_ zYcx^gBE6O^>FG7wSX{y`=s1mNs~}DvwPzHCU!qQsvM`;S(KgpQdCqU$4vW9Hdo# zB=+AjGNJDE>l3G={)1B+x%h)UVf*|~-lx|8PPR#2%m1>sHrHjqci>So>C?UrrGNo4 zv-w4^I1aU`-0sI~@6DM>S?&{T1RkAJ0^7c!!oZB*s+hP?jjL*~9?n2!>w372!*53B zU;cHD1@c`{KSGpg#%QFmQ>A$vzybzrMRt30T@Xrc3c*Vj`A zB>lNsQSf||a{kTBCQ?rSLv%kpBt#n486sbU9`iW@sqt>0*d;GJ57p9U5w4SF;5}?P zz-1}7aH)hziTsQ}kL6&M!7MNRCm-wD(=+mKZSgU2V2$H?fnQ1qCLt>1{$;4cf=EXr 
z#adEQ>U+`ol+}}rjKlAv3i9z?(0wTpWADXMP*mK9yk28tX)Q~BMNvSLJM$7FX2$UdV|xNk|yWc@41-q8_=k-*Ix-u6q=>_a9dS$rC}^H`GM54B;K zABQzT^CP@vxxS!Wm!2nr&eRB*IHCQkrrdJQ=jv#0B^icP_bo@V?!QnAvnLc|oOS57 z`^ai6mr5cx_*o2F-QC+w+KB{3I0O}woGiQKkt=!=g68dvB!}0Tv3jUmG`LYSpbL3mSX6}MD&9acGO>oVXwv-0}4E zGNgi2{_Q3W9~_ve-`kj)p2_E(KpZKY&Skdp7}H41g2Po?`{eu5i<&+mpCqKOc^M=#fb`^7J%KU`lQ zHgOY}8_Rb;&X@chN^V|5Mnb~>uI+wkmNr8~Z1m&t5fw&!1}h0YhKO44+eI*c02v1C z>8@vcP2lwh6y&J+tToP*_`pjY2)X<=YBDhJu2#rkzH#&8_Zu8BZ%KL(=;vBaqgI$O zSvZd(FTWcd%}p)oyfG5r&R$T^h3a!)-?<{o-PJ`6+pWqhkR|(2@0z76t=lx@*Eqi; zh^SoO-sW*sj~k>KuP{4Y6E{$!8^jL(tHC*$4?b+x5- zd14_iKt~EIj=ajgru*{zxr}UYS9dxW7cD5gSz@0teMd*cF))0kv{NC)_?(&9*~w}2 zxf1Acv7x#$zKf+2(u~Cw4sc2C4DOgmG{VI}&)fCpDrT}_X3&&_nfbY^n5>L!1h^3B ze;|?=B4}xr84=%8zTKeFAaDr$VNr>;!7dqQSUjO`NF!EF#XTQ?!j|jnYt93Ml_Fd#d-I4X(|5kSg&0P7MI;X92kvh+%HS8$I@uR=k9%?XzAfMo6Ltg5Yj zoj72FR;(VDn0Q-r_xxtxv}c==iNLLV&(6syR3sZ${jHfjwJZ7$76~O}w9b|LaA%z@ z+h-~Ktvc%^{?2bThc+nY=GnO*XXELXKu4}wK^>TjiKt|yIOh=(%9u;?*&suqw6I83 z5+m;CIU^d^oapd;ayvDpl*8^|6O*a}B9o4e4oeyJLQ&zF#`4I>OHM{bkN0+a&AJ~2 zooyyBpFb~1EBYeMPLNCUa*J7`>!+tvHC|m`N`4a~h6+*ddBW$f4?wKRXvy@~}xHzy@#Y{F}Sn_2YPA4_C3ZnOSR8eeNX{xr$DC6tS2g3>!6R_uR( z8OTb7z{&2;BO++y4cW{O z@i`7KhI`)LkfHPndZmCf;=`q}UOj%kcGcd$&b=qc|NgeJGpZdA@->)o|qX4 zhE8<-NQ_bB`VfnU*mJF&t$A?A2ckBmr6XmOI4EX39vlZ9eBE#7z2>?jA2Zm&xKLIl z-3^$_K(VF8#pL){gby`Kpx?mpa45AT&OpO$$A`<;UiQN$US7C1db{(#w@+64wUz5j zZeJap9R7-BQo3t*@Wx^!lW09z4r$<@DGn=Eb{rO(Qg&18V`D?JR>&dr8RV!ipnn>` zSWv>j@P)8*Ur!H2=D66{3`>YOh(CututuW|Hv<(6PA%uH30SFYHX!?*$IH-u02D;E zs5;5(bT%9V4#&gp_Xiyv5?!`*Xs@XsYG`R`fzFgpw4nlsT%=c`qOH-a?=Ufb)C-Y` zx{mdVA;-u#pY4T0Jnlb?rnu)-I|L7RWR*W@g~__H@m&k zkWv|`DAao?AIs4;Jj~9^8%fPcEon13Omfn(eGVzoSIW`ta+P7kwo~ z3rIdHn*8rH>HS!P6V=xj^hwE?8|{_^J%W_bE8(}A$YrIz%Dkg#6nEq(Z1{TpaCO)F zZ3|_=3%gW209ZUOrddBABTMQu3I6pIU1ty3>y@sb7neYEJ+R2gq>|TcOjhvj* z4~H9#qTie^9xh&R(wx(7EdfH+p z)1|`oq{l`ca&D?wPS!1lq-iEMPpqixyhj&#`~%|xE|^)>Rm(ekR|VnaEt z{-PoDi`9k3Id}P47If8qpqommM zxtjMQLc3A+!{wBM(?oao<`7VPDZ)EFXQx~uPsI_(#x8v-UO|yUjfmk8V&*P~m71s` 
z_C8Wv9UfcJ&)J%nZpIlVQ!AAm!soA$lRgrbTq}3R*Izvi*mdXO&5=P3{us{Vr>{NI z5xjXTcJ%}C%iliz3;uGFB)a5v&8dr%4SA_)kiXVp9=Dk#Y7KEAYvy{ZRW8TqGfqAB z@#P}imAjp?@{tx=sbh$c><%iNzmlUKc*OF4-Rty%jcFntUYqzD&K?Qv`iIuZDRCwb z5`G!+hup;P{_C?N>@r^bpBLa2Vs6}X-TE&UND&!qc~d7sjo#U8x1P2<8Fz%s@~!GH zp9$(LESSDDcl5X?KFs>=vOkV+prqnm_TKd0)+kxo_yXPUPEI9!}Xl7=e`jtxlje%^%)fMU6@j29cIyzPu z7+wDUxg+LOpT98ccwERQi()2bRjljOR3ZCRQ~cYxte?~s7160X zxmR|lE4W#Tm``m6l7tgq(kR|${s`wVGpuz#?gC8%zQ61Jh3~&&lhJ#x0W<|t<-ejb zrP-1dygz{|_8qU?j`RUKdX@JrlJEkE?OK%z!I($lM+A?4Ou<+v-mtMl(J2fwp1)?c;q+XS`eo7W$Ew5X zno4!~7|)sRZiByR%!rPo>~=aC%X0JH_Xve-jyUewbNm-^92)Y-?R|69OVbaWhGk@B z=W98sf8Z0Sy*E*ndF%o>8)WY&l;nyE7vVly2yejNcz!VC)FAY&t?TWU>i6Hhbv$x$v|XU1{;8stRAD1@Y0;bl z1TiqKj1^BkqoAeDFfcHYOMI!Lpa6*{2rpMTt`C7|fG5~t!x(nZ2)+BTnuTak`bqh zafS#!@Gt@_5*RW}jJXJ(wY6y2=#vS$tMGR#^m%=3^l?hP#~gJCa!E&r1t+HyufMNv ztFto~a2CPer^uj6ifsZc55QX80M;A!0VC$xnIuyjZgs-`OYdc50>*3s43w;yZ)Q4A zvNb3x6HsO@bZfZ9U+K`iMii^$LED9l;8lF8(#CCcl9KnHE1LM-iCb5e_A-e>Y_6X| zbaJ1uDSOCg&d*Ha?;m9^ZLd3NP=5KPrl~1Kv?EZjcsC2{AH`_a`-RdbMz!acRAgFp zUd!zhtgVtnV@iY(nvqGh?XM=NaUAulSv+MMi5ZeFtT-ogrfNy}g_hYhTSTiW+unGt zOzbD3;R6gKmWm7i8WrhJ;QCVAQRsqdc7O}E^;o(Q4HHtLVsA;>WUT(mqzgOa9x*qz zUK;K{dUkWkyAPIYervvrnmiBaQRT0i>kPn%9+q>-D)Ltmq{IiqSg-~CJ-dh0%Od#>K#wVD^B?0n5ts>{@#6;r zogCUTraaS0rT$whYXfQme{cL{E>>1X1~vg2!8&TrvH(8~(wCkayYBx+bGc;`UvfRm zb{$w4L*Pt|^KbRUO;@~}oQzR<%lN~ek>A;o7mrKYVvdY&@9);+01^8q1en?O#e*?u z1B18bTF_fUDr_*O%Gbd&sV8~4V#@d@oEEet@A9c^vKmnUTQ zRiCIO0YzM(6Gy^bpe(D14}xz8=*kU+p94cg*Byb-tqS~>ptQGp3S?Ww|~E%eMS+#&hvb1vLyNXf;@c~pNSwfPkX4`N|~K~ zCsW?-5_|1??tS8|4>#JM{QXVFt)HK4e_%0^it>4xkF;=-f3^-GwCG#V}K>+w1|{Q|J2hn(<<#A7c**SCJ{E%aO$QoZ~?oghKj%a zL+QBZ5WzWr=T3ec((=R3D$*>#ZFU-si#bP zddNdXwRCYzvb;QH^+-vDS{-<^kOX?P;9zIR>%3)ZyJsJoC6lZW9UBYbA7t)^h=~8t zxut6SzJCp$Ms@eXn(lZePrdl{s_qms(N4)`bxO(fjotC&#pK6cC-^5Zmr6@1B zn}CR8pl$W>R#ss(=ib`!QE&qppB`BE7XXh2- zW~K&VjWH_-pZAQz9BCQoErSnBBtyI@KJG1-nxy_RuRRFA`ka~AUYhjXJfhlRybpwt z-No_GzASj}pP-|dQ1)c>7$3;`yH;}C`+uu5y4b4JXC3USv?E3o 
z3?!xh+){U!p1NFXQ8z-A=nLv=qT5y>Y*-TzGS5twSq#^>7|8WyC`mp~Y&R%2O{XmW zoa#Gfo>^q880_If{X|IZ&^6D=g6f6NO~l`JA=ePs>lH)B-0HTI6XS6kChgj_=JalN zsnGP-cBxVWc6Lw7ZS*Yuh?HAPR(dYK-QEui%b6ic`Bv+W@9Wjwf0|qC#DWFniFOby-Zw+fl)WTnWMG=# zo{A=ta5D#!TDLw6v8u|j=sOM@3fsr<9<8)3bMKu6$aZ3|4r$VCdA8> z)k?{=z2$yQiOX%(R_FZg`EjwE!{(PE0U8kQXV`%~czNssqhu%J*CTF3m%> zwyG>DUMc%LQmx!^y&C3YuaCe&mW@-x`SMdv+`@c>-GuqONbUnz!B99`&JSfs*t2J$_Y)8#Eb91mU@K9))y zB)VT6A3qBMA2f!WzTTtbpl8oqx2$B6r9 zIqpUuCl3cuy=u(I;NGY!=ytdx%A{pg8}1xp)BO^^diQF5Z(Kg*mG7PSHS!6g%Y^)& zcXwA-Xy2)Qt`E<;^Tn7te=D##QHT3*Vu%bwmG_P+?hl;DuQwa>J-lUDN_1;)3L6B; zdCjTIO0Ex=^MfVDU+DxzXsEj?Q@QuauGJX4Kn}_pW zU?kd*9VsgbWmdY!K9Z(Rvu11=ozwT$p*`@i1=Z>&uliu*VOqIZzCJb_gapO1Hs{Bo z`wW=`^rB$+#N6P7t1m{he^<@@ifDvqwzAFk zow1FLnr3XDu>%_$3W(pPC7O>g%HXikH|r$)IkIN_ThQ&lAlU3cNN-=?T#3nXYqxTx z_KNSJ;!-n}I>-ZSudA;3xCD(?ZotE>5wsabG>ZO*$PM{filv z|E=!6D2yT;EhD1E#de@=WE7Me=hm@Le206kZ1Y>BuJ@(!8}&8jch335;%1~Zz5Mc5 zZ?ODqx)kxv9aWKRm5ee<^qTEFkJea;h|c$P_#XTI5b%T=Ud_Mf)*d+boC9|=mi<$Shf2zu$J~neSXhEjanRu5jg~P6^&Kpa zR}jX5;{yQ{q`6qF;c^PvGMT@4MWfLdK$y@!BP2^(7%=;M%^{G31;XcYU&*SE^L zx+FMQ-rn9^W2^LL7lT`r4naomMH zvH7bv@{-TAE3@>}B8y5(+eTFjxqd6qJGtb#-OUofZ zj{<%yT{`p012XyI`=jQbEf@Yu*?_=m^IbIQjeYd+VOT;!_NXEo+h1%FE;#?qOWx@< zc;$1A{A8mJ1fNNB5Dm>IFhe|Hb(09g_qRBbf8&VysClu7-|4{nlsRJ&TQ^;aNFwX^ zq@~rO3$)6>bOi*H%=B~+UYvouLjQwUr8whN_A>CQ;igB!&;I~H{CrO=)c~dsAXSYR z*BrWvL6m!RSq=>AP8%cpFj7xxpbLXS2$ZPR7f}?~nW@t9w|`$q;U^C(7KGuKyn5lS zDXbD>7{rrpzIIFTF@!3@l6_`wmg(QW@7+<%Pi8!1RJ(*(dxBQdsna>%pbMMh)Ss>A zo}w1uR%yd?V3u@~?O#IIe`xV5>y+>wCuXrqs6C=iF~D|TS-hVuAF_s>@_esY zDYZA|GccE-j4V>m1Gud-P+f7jL4|S_ll@LbqwWD4EOs^Af;p^ z%+8v;Z;rmqcyJhY-1OLQ=iKQ~pUKKI)R%m`8rt|v|t)Dhb zg;V1FO_MUbIsQ{0E|t#q{5e*;UA79tLn8MkoG++pAEULX3Ovd*9bjC5pvLW*_}R&} z76?NR2pmxmOwW4(Eeo{Ve244myJYQHUmNoCw}IuGeXCdP2-D6wG#IhP0PWQaY9YWk zE2=o6kL+XcxV1U(+qY;T@21lZH!4iNxLz)HcE)G8rq7>fX>C%8A1PtNCwL62gOn8a znkS;kur2xz!9Mt6Z+!0o*9=l)TU&}o$%Xz=k%85NE$>Pu*QAqpW{Kgn)+{@Kt>Y zWIumhY2>SwO#TAJAFSgwkYohaANW~pf7Mh~n;s;SrI@CQ#8Z5GLbL-j(!2H`JhU%? 
zIs}|-e}VQ6uFms~(QlCD)_G27PD%_5aD3G?eFFo7kORYa@3hpjqW*yx*!Ew|OdOY0 zSMR9Q@MEQUMA8Qjq4eL86lmM;*dBxann5&k1PlYM3Tp_Z0%Y%zU-D;EEO7o_em9N#DLe`Wnkc8^p2Q);~0w*dbHIvP)M zVjcp*=m9hb{G4IceloLd-mL6LD`IyPg{k53cUl|xs2SGadCun2_Q^c?4viU)2&>cB zA0tKrZW(fh7rFYc1gZLdto%y8(Dxf1Hn1C@cQsmeBNhLh(9rQ_qSscBh}mzqyPB8d z{NSmv{=$MoR3G5}t!L^NQ>N34mR6GqT5z4xOPgOCJZEA=5?p_VK;X07Paeb-xkR+E z<~-&{)OsHL)>b8G9M1Kz5UpTn@Aiv6S$*e;Pzc|kw+jeU=n8nGJEuEktj+9l(X-Bi zN69R%t>HF?xv`0zBT@D`b&u7dLFuFtd0u)vj+{<1;_q*?L1j|gBK^|MiF%C^<4CSC z3xa|Wv|J8q>+Kop6`iZZ{5NOqdx>!9@V3NLcS0S;Os!jZ|7xLnaf-pO5b914sbTvG z#zCX5pJKTIw}gQF78A3~&yo(O%@`d$c%k>B)JR>0p|@TTuEYhDXt0H5;jgJSGdL}% zS$-7O+>G&KK`yznvTJPYd~b0$Q3yDdty<--7SG4$y{01*$l2Wkq8Ni8kvVNFcSo0P zZg!H4jkV#?ItK+lHyh1SRZ}An#(}${4bRhDl^mTKF7IpN-I&!cNp{Q~A>B|3a7j$G`>N2g9%6*Th{9)D{#BBHK70jtVUb0-l(&L1fd;ypQW zhNwM|SR!60TSg1Dt&)uWVDkxvANdaf#c*7Kp{cdiea=x!%Yxz13LITnX9dUl#gChs zxDMu&TV8q?&G5tPD_p%i?^_#ANe!dn|TX<9Uf&% z;S^|GB%DQ}Jv{+$jA#%F>e17`(;If)(*HQS;Met?9*V`$oV0=9mIW5m%`ZaW@ug zHdEm^;d3lI4s%}!hlSO0&t}*O*IQWz7X5D=^Gs$tXcW{q>8=|gO*;R1bn!99nWxQ_ zR*-!V7-ITne!M0*x)u6I&qh5T;oY&0&=|;ypf7ZuPLQ>lx(FcWnamsMzTyvm={epK zyRnha&rNcaQzoDLwf~_^NOd+U-DqU=yRd^8s^0Lx1L`-TPa|u2UAJSL#=7J!hN#z);woU+E5WCovPGzl1|(sQ_P4cu8IpU zR+X_^8Q%LcKNV951f$ySvu(?W6rG4AjCUm*bC{V(a)Z{%eZ3+^#4p zV!;=MohQ_LwFeL#Ryd#O-H9M>xBK+)EzF7H8O$1_a$)c1>*r`_hO#Q$7~fi_Z|!=V z>S7a*Ffwe!WoL)JQXuoPoBNf-?r|b_v{!`e1NBAA-PV8^359W7dPxxq2t&2^k4Iu8 zN7_M*M!mmxU)Op4C`O~Yw?6$*UxHw{tkyjg;fHuX+5%!;nhkNxqDn{gZ3;CSc9&O| zK$|o)royHe@3TN;;;U3K$0w20owYN+yQJ1@^^jNk`EO~;OwYbK-$&gZ%m4DT|A5`3 zz*JsF{&sAtTj8|1ef(m-1z<<+6IwWEDHUhs&IY^(%TcH%V$|Ew+?=#Wf5|>O@Vl&& z;@yhnt5Ko7P95BVB4&eY?C+I3lqlKDWgDN~AS4t-K5VOg5lc1W@GJN5CkV7?0El>j zcGZ8Do54o@_&y#NX0SlPPg2^OaaD1h#6K))FdGEwRy{Shu$YHe2mJQUM>=xfQb|c~ z+V5Lg=0!5=m>U_UI+^HKoa|?gy7e@+N zU)xeqq(;&-wfrP8pD}-64B9avZWNwGyt)&k66$W--_+W@nK}WiiO)3;iqvBH2J=hs z&j~s7w6%5IkGDs+oXo=Qr4go0cI8IueQ44VU^-g9Ah1Ft%7DJA}Si`#F2SOax~E4%;w+=Af)Ov;dy(y0)3 zbv*+x2KYiy5xHvle6D*7-dLw%O~(7LGM$&)wO>8yH&cFnx)o>p_a9tRiReW1>fB?>KU_AIBl=mrvcz 
[GIT binary patch data (base85-encoded PNG asset) omitted]
zpp+dQrG*IZ$;nlA<5y1zvy@fYL1@up91$yPj7NF7QOqfDgJbFbG*Q-A_Huc#d3<(U z%l6d|7@?RLc}if*{y|-ClRG@c`uXldkIL0SWoo^{&DF!E0=qRHxddDFkCinwuB7~r z*xbYdhU(kH4w!T{^vq)UMokWs$nN=o0O|HFTb*Pm$;k<6zv1gg=Y$9f0)p>Asd2LG zS>7)xDY1@Mfn7CZ%hfsW>&wdaz$vje-_XCjFHNCRBPP`=xb$hU2MV%5d?uOnEgMvhyNg$7h+)b7#~kpX9U6tF$#Fs z+sGpE=TYmuQ9Fi__aSb{+xyN;DIU5X)Vu9o1N^T@fK$${E8B;L<?uubIb*mqpsX){qQ z$t_}u<5t;DJ2L-0sj4EVNZ)eW#oSrDfYwi0$WJWg|+f7^UkZUkKV_~ zgg?@~*wSSogf)r$>({uydV!<@8V8swcVRw+OGs8G3cp$}GLMZSV&YfW8~}0JLZ3qH zCdk{leDD)`cgG8p?cZ`EgyiH_fo*Ye;=G(k4B<=i37q*U3z7x~QBVbCW=e>)P=hh= zFDIPBFagcU-%&U>`}ooM?@+7Y%|t8!WY-T{o0M5FVQw|>7eWk&eiG+q4H)QKh7gVR zaWUWqIrGg4N*1>-TEnVdT}1^_?Z)CdbBl}j0UI-<2?z(MLt!|dBxqH*|L@=+@(FuH z5VcJ7#N;HS&P=rV+|(4E;!6$87q08oA4##62gt>Z#45zjlt0@9eZF(o_fB^l_B|YO zHblxj`WajI)XM4&xHN8-RU?tV)vr8@T;>SO$?+4+@7ysk%@CeSU=CX1LKPp35S{IG z*)$WvKnzgj+|qUs$?jlhxp%v9Xn8OoQQSeqn#vGSPZF|^6A)2gzhll}RPJmbbsg=h z7L9(Rx^WY215>?!w#Od}jImk7(C)8}5@02x6Z2X!ApQ)y{5JoNf*}8LO!-Aw_k~q~ zR5<SO|~Lvj&28~PH*2+!P|=$#6(2UK-3(5cDX#+ z$;>=K7{d}1=&48uz?OuoVL%l1Q}D=dJk_an#vvl2<9{94XvpRorAiwDsn=v=R_5l6 z>~{IwcG@jfiE(im-O*z7bg3{s8CCvzqw-0!LN0-2 zme86hfB&k@i|hX&fzxv?IWWMW_9%@Kny^NYtw9NZT%0}FByk>&(L~eS=|?eE~8$^Lnr~7A4RdGn2wI+ z0Qn`XB%@=TIxayt-qYI~v1B5<)oMxKp5a-V?zuL@bnVbd8*~F&hfIiK+D5!{iMAp z-A9>@!a#AA=0z~xq0-x~R1ucUNzmz#7r|7N+2HI_n?Zon%ulLRBz!HZwZU(RY zy_k3=iJ({i77e}mlb0ooO}h&V_WMh-crPCM?|MpR4As_V>eQV466e3=fq+v;#sO9i z^iEJkPFH!AlpK6{-0vr%2TFdZaDnPuVtw8;^0ybc#6H z|D-&SSw}7EA0OWZu92FGsbq*LUg-ocqeK`f;~ET%hd!O|YQtyK(Mf{0kdOyV4?g#= zGpiK|sQ?xF5M2(L$Yx4UTCnNu$}btK z#-BNrp2kbGwg0rgx!qAy;oH|)d}6cn_<-cB~Ii3)=2&}x2;*VUt-xLCr{(#3k9jN4AD zwtaefVq^Gkmzh~hx9WQT51S>iGrk|JE$d&Z4>WPmkl~@mrYqAke+_%laG9mn|waX9^fsi806$plZdzzD}GbL&>*IA6`YrFjp-M zHY`7&svX6ZQjCPT1U)si%k)|)Bct4=S-iPQK#)bwEem`ZKumyi5D?i?-Cr6iDlMgF zW;TbhLT4u!h(Hy-M<=Wb&DQI#2T@4_2??449#`AD0vF5eSp1=B5Qv#BQ3Wiv%gZyFOWb65rc{reY)jnJ;ZOV0rKM%$8#p0-6brB9^@lST=*O>=lvj}u_SG{5v%_Ie z-49rHx6baLh#*xPv_HE&H$|aBLUa-s0Ws`fgBCk7H=O zV8=gP|LAmwo)SsJJ2DcN-u%jKji-t2V68bN<>lI1^x|RwI>tP_m~!zP-JXq$-|5|F 
zBu)Z6{UAqmQrhavc%iv`>)^n^M5&}iq|;vG`XR*gf-Tg;ikl|Xadc$F<`oHGf9P{> zU@rY#eOQznyL!pwdt0@`7T5%ciQ;)=@{kbF;{`l)fcZghTL{}L{*3>ux)6Q@#V&XQ zKbV^4>C{ydM+7EtSwXtu*H--?pHJ(3+HloFUzdF*0cTXT2Aym?$6wfd=?7Gg7=LSD zwyE05D5eR1hE}56=$5KL6UCnc+M~FN1_D?c=} z501B%+xHNS3OqL4NDmAQj84^CWC$^2CauvEH#Kll@#e_yNN0Br+Zw=quEA_kX@FQB zCZ_%+ak-lH1x=6P}EtJRd`Ee6j&h`#Rb zigZk6aSp0^m0_8ywE$ozwhIn4aGf^k7ppNugY=Kf0$%h-vdIBTdgu&vF$`IXhOpw} za1+zC4tw*i(ojD zlz4GW{!J(Yvpn%jv*7t{U>|kaV7+_shKOpiI3*?iO-!RQ`2+K<#!Vt;9ehek2V}3| z;n!`S>|^WMhv<#_HkjUntNWC6`QLrPX6=ywtn*gyC$`3!xhw)rKj>|25|Ic#N{l_* z-dtTlLD~sThzXj%(3q0m<24LC41d;;-AQxLrx8{frzGe!!T0VlNjsmEqr0S|-O7SZNLUHX8+>sae~^J)5Yy6pB0U4ae=CM1;t#wBepzD%B$;a< z_W~JBru=7SW)c9#A;hsxy5ron5j-vSVjs(77RXqZ8hx0x3INFd(sUqYVq*PM_42Ea zF7uP?wzgf|Bsc5*e)R%4US8%f$NKi|+aH#%HME$`o0~-JZ0N}^e%slOkCl^P$|=2j z=MK+ylTR9~ppp*@sF*%;i*Q!XNl!0z*wS%zm3&^h+7^NVkl}-Ubxwn+*307e?-aij z=(uo=@3l4h{Ni!Nx3#kaWv=adpM|=ToHq13`ZrfkZf@ozo4kL?+DzBE%9sSpvZAA- z!!+yYcuV!ipt^zrHc4$xQIT(}emaecrY7LRrvYq!y0#4(BaFvPKn^R6h=_m#*wMkq^``m*L3xEj@24#}s;*2`A zKg+%cr3roXwtF=rpIk2`Z@>n=BDgJIAU?~Ml=j8AulGF#+gw~elSG_ z)EcMBQILO>FsC0C+~cm*=1IWte)i1&*CoC#U-4+4kU4vVYLtmw_`iV24nzFr!wA|) zf_vN#s3N2;F5HI}PtQ$Ne2U7o)V_(!PZn9_SAL}h6Wl>YH|rp)c1R!i^(-G+l9n5o zU?%SB{V|cu<7C{1LIe%5z+2O% zQ!F|l>hn{ful4cAQH13>M|T4{{)Obt%_+r;7Xfg{muh7x<-miP3-1C(cCb{**t*i! z*M~>I5j+`gTgn_;hJS^*UGt0tJwxNuYx?6P?P4{z@_=xG#mVwds;FvDFCP}!rzegp z>dFK@jk@!p=e3T<>9aMi*msr>)v2kSg~OVfru_Xg^EG!z#>yLd?)R5G!K8w&o0qpH zHytO4N^6K6@wweRVL+NjA9TYpOo=j5QVaWrie(*QONTFj)WUf-_3?)Rubx=LuYD{-&gFe)z;uO% zg)uWRwYitMU-nJ2%5R$aWd%~Pw1>Vt1>|&CZU4A8^t_HR*g= zWp3xnY8dQV(wbqPNR9|#D9g&Uz1hB4+vrW~>2yLZs_k3YcbS?N@y zpiD=;6A@eZ6p>X>u(&yDzQ3HGPxjJg$uAFf)cKK2px|K4m(0H#eh7J zQ~HAj6z=fW%Dq%~galMcTH5700YC)Z^Wh)gv$g*Bh$#jG0Y%Qz**+cyI(k}qL3Fe` zC)XCB=!`(azF(n@B>t9wptT#v>MFywmszs*nIal9?(IY!{mD`GdZw{P*~^(9KiZO5}6?XaDJ9yk(s_GZW}rqL8n9gmUOy`_ep`mex)(ltZr+#q-Lh*wZs+ zVZni4fCJqB){A+=CclUYvw|x4KE;yamK=3NC@!dm@*)evSp;f5CYrOskroAo@bLi! 
zu+#tW5!JN50=w<#=#U1czVYxLBzw&ko}t|F`tn8QO&^zZwlfF~jE&b|Qip_q=qDf; zfD8skZ?}aDN~i!!e9B`lwPxFpnyLqZ53tD8XXtdc7N)Y)rHBAVmWH}IE&EReHMKM7!+~eMFNyA_ud5p#6&00` zPy@-rKysL#I*Gph4QD#E6DcVO1dG)iAFNm7M8zD89w04Pg`>SCJi7Y%Qx)^x@>Qn{ z{QsLR@}p77pnwebFn!$>jN*{(-nnBXo|sijU?IC+ub<)&PsRV#Yd?RWk~k)-h_^b}8X$T$7IrWziyOTL-XDn&u$0c(C+xg3v0N$5fe|GTW`QrY6a&_PlIp!1Nm zweDOR1duWmN^I+1*C;X;{HWhpn~N?x-fgAl4ond;l^Mg=ixo3a{)$7__P9&Xl7;Ba z74?MHAB%Tvd!iZdsJc7raDVJo`4bUs=2`5|aVRUF zudH1C{+(&Ax(CVUFvW!_2;@?~61dUZxIumxv%K<(fzFJ}OK8JmAanTdo??#7>rHCB zWGUa>IRf74{yDpu&jj6t3fWY0av42eH=`eoeftoqb(5<(ILOT{C;2e>FNqjnnQA-* zKUzzsb^emCPsW_@TeT$<`0D;|VQB6y-tm!ArB>tQunlAWMr7+W6rA+f|M>a6w|90} zF{A1iW@cvcr&U5o4NNChuT2))!I1%a_vt|!w2k|n(uL^{d&3U&0u?vTo8rrUJxV3y z9Gblg8L2_o$kflNcGVh0Bcxwg-)A^i38~*SDNOn<8w!rf8XT<9Dl1#iro)=4rY5Z3Q*f*ec4dn|8U_yh<>lqC%*^hYI>4R6@Y~o@8i0OM z9h(p#x1V=$ahxC>_1rJOY2dY$sVV4=72dr=$KZuO--3_FY!29Z@--UvNA}5K7w3m2 z@CPW>cY@Ox9(6Z2i$aN6RHSx6&(A*#_nypco)(j8FlPNz5VN-jwrLC=dP`1lU1n-s_K*zql^Y&RH6VQCzIB)$ z&^lg72GD8_`}M))D5_0dOUld54Y~j_!B1DtF`f5aPd+B@n6T-KB}`0@eDFEl4)wg? znCs3(wTuHx9x4L~!;@MBVs06Aropo=HC6cI=rc5L@(vh7QGfpy67sc8q7Bw{;E1AO zdcNC1Fno_GMpIciBQx_}bsGpe4i7!0@c7Dpvj82n&Zzs5&4VbIErA6MAkq8RyamQ7 z`R{>y((T6Y?d@IfZvH}H-MVsevdZ&nKY*%nc_393jyNj|qkU(shXFomu?+m=2n05| zxJYU^fPS4pY%$Q%A{K$>;7Dp_U|_e@a{qZ$=c4s9w8$rVkl1IyHaVn;kG}$jZ){|k zpdiS}A<3}OG*bZ-lSh=24>dKGRxeum$tv~>W1fD&qIfEd8ziRf@EcYLS14#al_nD{7h8id@Faq_ zu%t~k)}E4v)-bNo_pkbYqe&nKsxjv*q<(bpCdDwqj7bpk&q5&nAbG#@__N(=)lh41 zW5Y_4qkY08A7*tADK>1sUWEBF3iSx5Os@Gg+U(nf6Grx(tJTsa#x3tLgJf)a)6dQ} zdL3*YE~{lBbs@hwHygE4Ex0)if4ZORg{X^*iz%xpsR+^%-@EtazTSo1pY(35ujJYT zgYnA#1n7sJuSws);<(*pukG26%gTjVZ(FGKg+)50!HnXYaBbEm=8bPCi z*vRMq7DYekuc_&4+QyTrJ3Nw@ukmp0DN7*uE2uPqEE=oRW|6h<@^9qy_ymWql7bZtep0PEs?aXa}$HU`^y5ZZ|>0i*+?t< z606pQ>P@bn&2357mGpubrZ~l0)l548vqJ$E78YjMT_{B1e0h>zQ&ZCb#_cD)0G$mI zi&T(<-Ev;u*DDd`!kM9==lnbO-+VakP2h`AV7?f8N%DBo_kPrK*;wRHArX}P#0e#^ z9Ou^k-1x^!;`?_w8=GvBhgM#^(H(uPpp! 
zZ^q**?-V4krKN31hSjR!tVmE%Ff#slI!zjCp8yYRI&8E89sZ}p6>2~JzD~tf&3Bw{ zIF*K4OgClwr@og^(TBsXt%y>50$K#3BI1uz--7xu*ZYPOg{gfiy!Ei1N|v~1v~JWm ze;R{&51X9ANViZtdY>B^8EN{Ib6fj2A?~3eNCy5@30UtgO=C^B@D!2b{!a_=I^(-M zm9E_@{u@WgGKf;!(N|I)g=hj5+?Bt;@dgwCeEkyrwm6+~hJYL_*RfhneU9iuwPgNuv3`qTZrBk(^~8NyE|6 z5y(CNIbRh}WZ|Q$Q#B!bKm2=-gQ&Nu$y)RxOgucY zAHzq-W1zs2;VLg0D9Cl4>b0X`=pkiI6CFK|FH3*ST+RC79hOM_e1k~=S`)82D%Kb? z4b8ylD8aO&Am~{cTQSDJ$Eb?IX4}^nWN{CSkXvZ+gOieUP8()nT*kBY@^?8~>+0ZA zm>9)Am|ei-(%sc{^YU&Pb2=~wjglyMc~L7+#0LS?5JddnupgA}V9j_p?-VyFo~p32 zx=Kt;3}X)+g@1~&T$igdbck3(5nuBEqvs%NJukhr8f*+yyCC~+eiX5Ar z6ckinTg!(c4*D=7V*^Bng@lro^FWMtZo%Krp1UlYNb`sHLGpQc{~FN3`xIq>2?pDQ zUHR`GK%xN-;MTPqO!f9f+}yi1E5F0gU1*lCtTH652{c$=<=GZ;d!6kKMcMP)|J#&2 z%@3YJAzK~#kVQ1)IFo^sz++Bw$k&V4G#av69?5_ zsZUyt{XMJeh~GVBem&?yM>Uqq?B9`j(_PIEtZ^Z zY76;YK@-wBe;=t1qp!;gEJm}RXJn~|Ad*{k?nK$xgf#M9d2rU<;dV?L3td$)0mFf) zse#dOhw#`i%Kwu{FdDq@hKfb(lMjkf-@nm7dw)_tq}fibia_`U9B{sD(kbHcKe0^0 z*EkeUQQT`IWG<8E$#xD&W6Z`=*ZqO^=mZI|srGS(&hUPg`DX`;cj7M;%5#g)CmLbG z5;#5v*zaj#VA*4A`ukzN!$Ofre|(*F`bCkft^WcA3^t=R4`60&Sb5Z9x7+~~SCbxD zh$*0nt3q)gAb2)xg`Ptas;y0IHlL{{&4U1}`HV_}ybd3O!sK+XMA?haHtyNz{hE1A zok77bB(&I&v%HeIzG)JZrkRsT$H}RZ!et-hFEQugr=0k6JkFrm#VL6L7uaqF1`|Ef z?3|p#KQL7uK+^LZ{oX&+f!5$mWNvMn^Mk+D?_PnKxM0=yKi%UP9B? 
zx(*H%At7OzFR5^o6!(&aXdf(xMXRVB(EFR!f1tk`gn_3v$HnS%L!$he>)q3kyMGsG zG$h%Zm!K&!0-lTrKM6)8zqG{sm-5lAP+w`i5d~-Dn2?~LAhj@QgU$R!gJAhzj#PID z8NuN?jnPB(fSo<6-Mq&d3|Nuj;rHk6!kncbFaIesIZ#NY_dbv;@WQ%4n`~?AzCRiM z`}Zj`OPo4Rz!}uPrg;sv8@Z?e6a8M`lDI!i_*CS-JhMdYa~z zv^UWF&6^Q#acbD7eTv;h>f|cffwIx;9B(Sv^FgO^Hzs^05L^K1DG67Xq9bGZaeoR_ z^gKbGzhVW2uA3z&$lQRkMYO@d=IGd};PI#Qab|J3Ry9SZs+>1Vl(cr0`7&IVLGqE@ z#qR`j3p*j^L%wer8IK))e=kWL7R3as#v}FIGA%X%f%>yEH-H)v5P;=eO%30l_Ts{N zTLK<&Ylr7?;&6;TMg}tN>!!`vxeY!shWo3j^Hl%1;xt&Zkp!`f#kdO8R5)2!wqwbE zjg0IaZ;;I0Oyj$rs<8&3-U-F*CpvUg>N~6d#1?az8jqffgz$?_i;hmahw$rPYrv-b zAm&zj&;P-SA{22%VC<*FBJiiFp0i>gi6ddUQeFg!Vcgxzs=<}pF+CKLVM)ZSRhl4e zJNT_|I&5vg1}t9uNCVw)u8)$8QHLkwE9=<4hUuSG1G24TM+x6#nq)6^&0-<^UP)t7 z_1wdK5-KY*+xwhF(IM;fcKXGpbS(qBr$oOdG!#n-T7e8|8M35;8~Fo4eh z^jlN}7ilC&NPWBAtNT#y@Q{6uV}1~}KM{tX73VvBaZ!+$SGvEgv0j}w=~z-UcXa5W zyq_2+!lhXNxn;f8=W1^ctGwJ*IcQN#O&sQDYwL&MnZpF!dCHE2m#Zu+S|AwF;TJRYMCzH&&gXvT&5`u;0#}Uqcs`GRDgx|Ff zpDhrE_t}L<2$q~-9r%2sJW-Z8W}<4PTU$Gni~^Lu!CZ|txVpEL3!9bI^7Neq-Z)M` zw}Iqvu&1o-f0Tx_v@}&_uiKmEz!yDzeOs{F2jL0<958o4JHRJg{^ULe3IxW$8kGGa zA52bsPoJj1xh!gbXC4u3NJwP8z&$!5tE8dvT1g2Bv3KL$4$k$HUD1oTpkQ;~qny@O zjk(+TSrr2`d81$ayOnQC86P$axI4&nMBsqY&T`;il7cxZSezO1vczCfQ|k;)?hi zmV~j1SurzNwfname>JR)zzvmyEd?bvF*W_ zibzs~usnA-@>T*o5$yH!z7FtD_>uzk7NQiCkBz!w9z2JSHMqzhl`}!t0=tvoHPB2} z%<%E?57!6dU&{|B@q`ST?QCs9VBo*Ek4N`#(VsJ|h55i4CyY@qQZ9hV&~LAQ8q18+ zBjrHC%#%<`B_u5>vaWjrBaG00bx#~6bV>3!CV&VDA$qQ@sK#Tnw@PnCACKFf8U2VR zkyC3%??=%?jdys0Y#joUx%_lcd##H6x>+Uk?r%cDI<%%d4<>3a8|-RMo@r)X?ZlyP zHYikgMwx@jlgNDWObf^R^20ysA{FY@+pM;*k0If~np@t8J_d7a5{ooiH%?>m-M~@m zcB9?Do499DQWG^V>5=?WxkL(%ewLS2p&+L3QC%JX^){6Vv1DBtZ(w%Dc(kdu8NJ<# z>{lIHmzLBWQMK<(e}~XXBlHP=a4MT`?*k)mj05yr8qY6Ld1y)t!TNAY6@n`)dDjLPI-obpOdc zt%cbifU@h=!>zTBI=ebQ48^DSxp1L`LLudp+2OUv2*2MZ8Yh2+mj4JZ&bZ&emn=Bi zYQAhVLY4sorDM_8v)NMSJK16oC2wm%#mxB2%6*Qq(gJy(i=D zsoQG`X|-1IvPO&#iM4c3zjbBi<=w8a`lzzAznk1Wb)>q!36S>YV5Kyiu3pn6+I&l= z#xy!QVRJKHA{q1MnSCgw+S}E7$G83c{i02`61K|-jBl3TVZWx`x`2L#otw*eejXGT 
zD`)|{J!)!@1)R|W-WeG=Y>)qUAB2tY=B7Cl6RfYyP>}FY@WaYrGg~`j4(d0Mi9g_m z4PYcGlJkMjE**>`A%fZbU3XjwD^=7~RqLEtM>;xS-;1Y?8snj$nfe;?8VU;W%6aj9 zCim_|!vI4q;E)YvPhPwm4(_9)!=Oc+(rnJ1yDycLmZqhoT(5LvVfe${kTJ8myQ?dd zc+*Qnl!Zn97m$D$MLkd0cz6iJW5Aqv74%81H83(V+Vhl?VA06Z(qms@q?9_Och%i( zV4?c$mD*em0zs_;l|9WB6`9UEyr8bI{ch)0Lmix)w3_-iL&Diqly5q^rD|&G*y>Y| zx%%}mN_Y1seUbA@X}XqT|h#wIpfjZD0$@tQQ})9MA3cKUUbN;hns2zE@-O zH!9ZF%?1p?8wi!3VC(en+*>HK_f zos!qruddgovzwZnyl?vkkUWfdZ?v_^(>T*#gtK> zVZp((Nj!PbM6%MY;6eP(A4%*k$~5ez=I0?^W1ZPiKJNcIki+2$9$vstv}pFdIl!fD;kU=&f)9<~ZL(XIu&}UzK{_CyGg~g+ zXF@F}8l%BpGp92YBldUtkSG#aTz;Wxbxn5Ujm{Obc=h zox{mkb!z6ERoC?LQx(gq^pZBgw6Behj+(60h&Imv*FT5TS69e)T9EG2P104HvLxK| zr>0zVE#l1(*JWExKqnI5bPDslC}_yf4cClN^((08!Lz(8E=V3#V|2K!y0}!n`G!UD zd{XUWKZk<`Sz%Y)>iWU(CFR3CH<8&C>`VNE+@OhS-AS9PzmwJ7{ld1lF6xOnMe^H4 z8K!0ZNfI@b;&k{U9B<;^tdK~fip%(Pk3Id(RKHS6T$$X$T~PO}&9A_!AXmt4K{AQS z;BMS?elW6**+pWjfL+Lq8s#YzUTSO4v<}n!E%ndK+bA>a6!AJ+`d`6f;Fmyp&3LWumJ3V>XgHH2qiU0*TXcIx`xULOOh=k`PQf3n zlCt1)UO=(_6V@;l?Y)Or^}0}`QQUlX_Q=(Wk1n>%{^t>U^1rSgGgFf%35hw8X>T6T zHH>Ug?Y`(g{nvxlYzz@zxP%N1-!gKR9-PUk{*JHKN)f(M8XDZ$XVp~c${e%RCdxQ% zm6AT0&M~&M^t>1liSF)9ND#7R=S}v#RW{R_o=JPXzL4dkpx4sylw^=}AP5n9uH2Gn zO?kCfa3;d){i?*fdg#j>QB!#gZXBALuzGqyraUbIlk5SWAtqTW&)Zi97}Pg6H}&`~ zzy(QBNNDccH}n0b#`1C)NF(49=jAo&nkN--^UWvAg!>(w9dI3Su(xmLR;9$1X{uD- zj)K8LDmF9}!#*g-Cf=127rgyfNB!sVo9ZgHaCxrEYIIXFfV@gWm#EXG>i*7^(8$7q z&)SdruU{4EV)^Y?oK%L1>RzGIl_$>?&P6q^wg+5B5Z^1W{`+x*K0Zw5!%*smm-ljq zsk}S`N1_TNfz)tibMs6Mdxnn>-^D4Peyz>cw0%QXFX>ZWE{W%@%1Oeda|e&3dtnJy zoq4~%-Ox`p6wJ{j=e;R{e{nlwqLgk2btw}$8++DZcD3*lvELZ_*%@-mt4t-}VD>Zj zDPY%?Kff7R-b<76r4eu_sQRK*%kObC01%Vm%LN;&BEF|oZpRkP%0_pCm-qJUz>ZZ> zfeB)*cv(x!0_=$-t-|5<3G}unyu8M(4*W_tH&d@$KO_sg$AyISetz71+|k?9f{B5F z0aeX*{9x(@%ZsCh@IGLWr^SPKHEl{}J#?#_qWb&%x+ZhVJG9(D8sX2io{dSfIs7M0#n~F*r z#b1KAC-o_;|60w>&C21me3c-4!R7kw*9mZ;{`^Mv_L*ABQsA;Csd|D@@Fdg&9!j*> zxzmhxbc5#4O#?BlKDUpVA06&0(&aL?&Q*C-S~YLKaYDIM)w~l*BUU|N;(b3InNY%< z{NwYydv)x8xSuxe%>|hb_E)r;eXnNC#FNmpNv$2wZsH;OZ;H-Z&7`N6lnfSu1!>Ky 
zRY2{0LTs#n^ZrqR6zdKqc@_UQ@=I0vMJ~4=BfY0IQC3eVq@Fw{9MOBFCS@}?t`?8n z>EEmJ=i)%K^su6a%-(2mv?h1*VEw*rl=d{c)O$HSG|Q5PE_E~c(H5dZVsSk3DB_dt zzI6=|rS6~uu41+C=7!ZV50FWg)w`IVgdOd=u;jX94vZv5L(rRA#K#sxuwPvzxk50I zvebuHUQ&lD+P?8$p_w|5|3SY%v6F~y2wkc8r#lEIL`IxR$e6)B!mH;_zmk)afl3AL zo@uL^+S=26^|PYzL|9Ks25thVw3E}I3}7kHkl8SkHyF`R`pl-4*| z%qs>q?+;X#ZwI(31myF;;x+uX$t(7;mO;Tgd3ez`2M2w`|gKK5LIp0tponE{3sPc#ichj-f`7+VQU zrm7{NPu$3W^Of;`^v(MEb6^-f=B5mW9aT?vGm&Rn==Oghb7fHGz`o&a^&MEe0a>fq z`O1Xjy}i1$cqk~pV(11kAl2|Om0xOOaeA5=oF*6H5JfmVj9^f?-^Qh?q49TdQA=DL zY;4|JFPiSbbrrCieea~Lt((oUu(L4yNus!)`~gKkA#3!^t0iw^42~ta14?*&#(w?E z@Iv)3)o04zqM|udu0{Hd#8Y2cI5}@G6D)7?0yH-7Fvqabtc?&wsru&K+;ts?g)r3FBe`tA;VSynSC`qoS zob15!yCER~>ar_p$@#vSR)tP+NCw><*jQbol*7vxR~7OISbJ}bjzGwT$*JM1Bq^@f z$r;a+)RPbqd0*_w!I#*ki0Y)y^zB=kUoPo689}m)%1&RajCXMXk;ZWjCxfj)cmJdB zLIZZ@=H~pZXP=*%3=vjj>aL*oe<4+KcKb$4@^Nb>86_BH894|7{)MSBGWZP)w|6X3 zFE*Dow_M2=g@+Z0iCVz#rp@+Bc5gnHqVGv4;ml_dqn9c2pwjQdeuyNp2z#1JtXz)x(wr*zTv1g>d!krt@1VuVVP8-nmB}f z`t(9yen@`+&iy3pa%~CTWQTKy^r}hJ@XSgx zCG3_)Rqgsi5g9zEjLV5bp9rb-qS)xMvtO}TBrcB8%|2AgpU>o(2|tkW7)}-mO#h)L z#9VCl)YQ{UFSP|DHRrQ9VXre<;XjOG^<0YjHC{iRdXvUvdfxY$0IAxjw=_RL%;&zo zy0)b;-`HrM+LP} zH8uG6P(?~((-&?C;|CJb6j>y0xwQl^^!y$h2d`GX9Yy#-Fi}PZ%DlM=QCIE?DfEj& z6^i#2FZ{+PCMGu}>*Ky`xKy(~tZt8AZPUev{iEiEQfAH4*VE-$g&L+a+`HG8{>o`d zAvv9DgwK5b)HgPwybEl911lhXn9irC>zIP*k>4}pEm8dUrE_+Q{TPu#uV836|AT0X zMr8JDf<~Bjj?GNPWWVPKlqP5^qCh$b4V%*ZQOfZ5fvdXP^e%084yv7+r3z-d+r11} zX&dgZ2%sRp>j-?u`pV5PZ+q7ltnZqy7iv5_UTG&KK%nPWrC-Tx~Vl)bH&H{`A%KO>bS%|WO{ne9&42Mea zXr|KK))s=?9)zecAj`{#+b%M$xPW!>9DGJ+`>XK!5s;m0F%#Olh=bM&BRhNeEb#k( z{`@IG`2o<`1{PfBwi1?>g||UseB)#ARvDRE@lTv02s7hBFLzrxb>}fXW}@l&IUdZAg4C}Q7lkJQ$-R(YW-3> z;B0CkJ(gU@7w+V|Z}Spal8f#h^V<5)STnS-Sv>9BIhE7F!n19aYC$V~I@*PZyT_-8 z(OyLLl0}9)eOXL-wW(F>xKI6$kq`z~Kcz)Usg4B;bmdreb#-GsP34Wkku}j&#r1YB z$qI}U!crmz_QeEt^wsvF(urQPb}#lX+sR63H7!O)9N3dNArJ~}*5>ATsQ&<{rmNdn zRMr*4oF^aWs7iHJwp@v({W^d_R8L%#q6(YbDD2rIE^L`5505uHC4r~!2LCD@etZ~A 
zh=EX$H1rR?Fqe9UXfr_+)c@^exqp%-1l#!6EnadtFTlb&n)qj_6#2T>?ibXvQ4kTK zipS*U<~BB7!bISK`3oY75Uov(9c{KdzBPqXu%_@R44j>&+KKVs!)Faex)eGuUowzC z0N0`%LsWM5mHnz9#m(&8%!cz@F1Zcpg%%+e6LO zbQ51J;8W(e7*6vb!_i9Ysh2#i|EB*gaTQ17+@qpW+=@be9|NJ! zZ&C^f^>TuJ-o|IvqlpF@8pm1sI2Y}zGDrwv>el@H?zT2AetuFW3YLzU%D0f$F4e1t zjE5{kMNoGeEPFy7Eg!>W1_=TG3t-@tm;cdi=7>NTnVY+TLw^QeLMeg;Guwr+2~#7Bf@-T7{#Go` z57a)@_N%xMi&%Y%Lu11{ixix3jZPI1&=^}v7uzs;Kq%Mib)nVdITcXY+88XXMdvU<^PZb-~edoek?(&w0u7l@+6|+OzX>QSY%jr@p^f8>=q=+?u?$ z0@U+-NQ$RfSy+C}zfBWpyfM~Uip{bVezp-YYi%Es@n~v7u8ND8XafG6@c%wO9a1g{ z34=GSt)`WP=s4t{#ML__L=}B*$}A~4kflh0h1Ck~YbXKE3rVXvy2ziLaPe7bd1lDK zO-?GjdvoXBB366{{m{jE3KBwZPpGnO-hAhz?^@{zPSp5R2}P}QW80ZA#l!B$x-q&ok33`irOhZniCCr8#k)64nNt?-)6lfqulA%aV2Q^> zN2@@p5WL^^_G*x=$~A+MX{#3>!ouu-7iSzfR^_C}UF+b8Y*e&V?I zrXSK*r~51yp}v+q`pkrJvS&gFf7*8~Ng=)WH8rDQ&_NM*`TH>j^_iR9RA}r-tZ_=d z4Yf*{*VbKjcB-w&y2?W@*VD%E@HBmrNCX1*f62THHC5xC^FcK(W>Z&-qF$s~nVE4w zs(%xDPh3&43RsUhoH7@E_b~$3OxYitE{<;fYHQ=`dxT#0AVAD-{+IQ4la*p!By8!+ z@PzwbHsn#lE?-AsPV1rwvNyM|046RHRDKMSk`td8-Xi6#o2_#XYpA3yY0rE2R+#p; zc=zK!{VX5Qy}kdC5`4{o)1k!QS2an%6mRRwzJ=w`j8zRRxYX6-wdx`vGS|>>Qf-Bf zu(PvcW@jhZF%`Fu;{?AaQJ!-qv&ZqahOh7K2VCL=c`}}!lY=#d={JvkuIiSS=J*vf zH8hA66#Bt(W>cIjbs+!?a4^uD+x&}s3Ym+}PI^GpL8^27)be~6k=!kPj>nNE<|q5t zdZ2u@H`lb4la)2==N9~O4F74IfrSV2k>zy}5LF0O=FZz#` z6i?NU@UXQV@xo6iO$Jj@-G84Amn_5Uwernp`>gu$JTZsSmqVOi^MCt?33MRL?5AtO z@0TaiiRYJb-4n+v>e2%{et&!r1BnA8;p#!n!S>rV8I5R45%=so+c#s`a!kKJe5tsM zec<)}GXnqY#j9-{9UVqvG;amwN(Kre-N<*d6HU1>7qy+gm6-AJL-p1BE!D=Iz-T&cV}cwSq%vH5-3`S*PYZ%yE)*46FIsRa&X zuqHD*Pc1CqBJfPdO-xPaX7}iR(xITDhEaq<`#{+YxHPVCz&>PJ&frk>g#2$u&?3x- zkP!jzS%} zT4=j~trsviXr$UYYcav9sf-Lk#E7@U-3n}j6C#Dw}+ryCBj82RB4j=io_7sAf1h{KsFKaBdZh3x?g7 zm{{9A4;B}%B#7>P@9VRksc;|sQZ11oMreHp0UGt?0@^dHX)f>|fi78&ieUIMf{y}| zL^I?Z&vZAP8lt0zH-@+LI7oy-GztZA?4CRscXVi36+u>@U;q8PY{u%BX8YO!T*1oH z64|s5Ko*M3YTw`?!(IpA8t8LZBtKSfbnm#*#eRQ-5hx+4H+nVSh%7>5V{YE2)=+*D z1j(S@X!^S-9h;-Z@F|AsF7fs$W2T1fArtQXv%VQkPJ^Svoq>CIgE~U)b&ma91VX0O 
zY!xdrGcE1X$KYV$APFZ2@>j1WLxot;0*v4CQD?0~fNFel$r-J%P`5$9PwDd#Ddooo zEB8c;t#ilHXV0csSx7CusTUO&nm+UF~-%#OioX}!;Y>hqKZ}SH2 zA2Q6lo11OKM_^Nc<{;Q(OVZgdG`RyZ{QJ-l(LWG4(GRFBT*`1^Yy>J3+dekWLJn5Hvv)8ORfGqk2%)^m*LjHl0Qz{@{GQ{XeEPCXi^xDcMGG!hF*v%h4-fC>yE_7=FB9aeQl{FU zpItLsC`v8o8hd7Ul0b+aSIzukVYya6T5!69vZI)DMaVh^Rd`(EdMML_<>mLaGSCNG zkDMDb7cEif5qA-_I8F14E8C?!awuW z_Tz;262LB^yXw}7I3mcH90C{rbw9&dyuV-S?0o$9&mZ$AvpZAel^eK(Bwru%PLdku zoi2Bx0bQuK& zkbj&;-rQZ6lagv|W`B+A0K}wv0pLF>(cN>v4cWcb-JWEZfd^OhlS;~zy~tSYN^}6e zHRa?`^k$}6yu%1Wrc<-vr|Zx|!bE)L*5Tk{0k5s-pBI$(J%;Ts1`aBSMX->?5Pp)< zD&pcjKP+TbcDu!;r5VA|Fi|i84o`q~NlW+iTjH$G0kC~6neWbj!M}i3Oz2PhD&k$R z+p?r}_zb(#=!>nEb!GfWhnZtVWu;W+B_(d}!h#5_2Sd~ZtDJXEU~yY?+ch=ix;~)o z?5yvLuc2XDBmpYD3SF#SYW_-rlQW1WW+yxO8I|?QVSv3@ta0 zlKE<$qw#kn_roA3R_iDO6riSSt*krfulCLlT3R%vsNy9u?50YTsEWcLsVfFb{BC=% zD=j?mQFo1(l|=Z7 z(K@B>gy;$c{!(#t`sH^Ge*0ge0N%9jdP(*8EY-Vt+_Ckjy2zk)ZEyJ*plnk3^IOII zx|%tv%9Fs~iG~mZD#!TGl9!V2-mNCrS)m17KfLY_47<~6szFMjFhG!$+ICLy{UpE^qbt&Onzs9Gh%d$9UurGv3q9^8N)Et z;KJ_58ru;l^-p#>=VPL4k{-~VK-zb9;_@w8RYSpt$N565Z);W66G6c*y{lW>7SHhl zX_=E0UF_|9Bp35v79=HUO9e^VD`{)n1IBB5cud3&fxr#vUvs#<6JB=9{23lw_kI@0 z7rl^l3O|2G`7r~d@X-Bhv+h0oy9}U&8x9gn1kerSmSGg*{J|s(kNc|C%TkymI%@WX zY*dG6vPSGr|I)5W=z!Q}14CF9P7&iD+Q%sDUUcIEcOM`!3daYMgpw5*7k#~4b9!0o z-bp&7MADd0aQ*dn)>XuV;Ns^vJ&WU$ zx-P3w0rC1)%J(nYEUsQ?cA#ACRB>`Mm#2lJ#RuY&^XYNa%js#ofB5??_s*+r`B3~m zJOGz?eOVn|-CeDN4N7D5U_L|)y1E$Pp-oSa2bNp1F>gu&QqoQ6wsIZ0zQy7K;^8MV zhd+gX-r)uyA)={$8tk9>e6F^)ybS49=;(=Z{?+83XTAZRB6i)$NXndd@A9=@9_+8= zWX;u3v$6cBq<*gwuB~ISzV4%-{59`%nIEHO0RjcG=h25j5R?Ga9+LFyupF(Bxc+6VtKVy0}pF zu6hO`(Es2IkYEGP;_h5}L$CN9#JA$&^w9>!uHV^BM@JPu!gKlTRdBG0W~``lRiee2 zCC^?^Tk}U!Kf4(XV5-StC_vh`?XP9qZ2CE#@L$cjjj7`$GT6&G4Sv$m)AaT-Ns4%k zj}HX+RE~L# zSrJ7*dwx;Tj|*aaJL40YrtY;{E%>oUC$1JsBImeOnl`uJX^sP99WIU~O5y?HTg%+w2ha<;bbUYdD@F z>OEK!p_p8Jwj*>;;rCmUkCzuBo_Lc2ff}#Is;h2_5Il@%9srkttGcwL@Sn|1+E`gw z*1`qt2G+E&s)Zyk$nRjAcJuT!R$7Be7Ld)0HP)Wq-fvN2Qc{eKc~<_;or!)@kd>3m 
zFHIb>6I(%6$t0eJ-vZ!sIpyVYO;f~((BTc+=`v6PBF2+2P&QYjM8R+7yfdk-jnn<- zCx>1g0EQ3z{UsRV?;=JfCmkGhxuANHCT&b<12ajnL+Y;;Z=m7+v11&f!A9uT(k&SRf{DGF0nG%5LyR^>Z%WOR>vnrXNz99l#%}BQrm#bvBzRz9#;zVWT7$S zF?nBKRA*;*!8qB`(c!bF)plp*Pb9H;7MXLKSb-Ebk72lf0P}7t(?H6qh>X)+F6`3z zij$MDbVFoTI!P47`2G%$wzlKP6mYkb-E2`6e*1Q|r~SM9j*LN|=ubSBeM3XmoYkcn z<5zL>V~>XogrnTX7A9@@?Ehq%Dy%CdKP!`a$JmF07#VomfgCov=hEi;Q`>^g5ldE|#Ym%M`_H0_}m%9m!xAt@fKv#7ro3qUl9W-lk7YBPII;LI(dabAGi-YM>zyKC#g^eC@rx8xVSg_y}0t zVaS3dDLP_&YU)4b3fz^XcN(*@?kJdC-HOcka`@an+x#zxOxdnNo`mlX9~e4zE3rmjgI!tQMHY3c=jA zr&IaUc|kim=s6CKhNHta9Sz|1oO5)=vEVthxAztrgHTb=Pij0UrgS5|4lE zV$U zrL)GZ3qJnLB^z^URJ+I{L7w0vmm?Jp5<^nhoiBi;7v*U%;dqa_tqlIRn zW~S~Ls}p$50KfqbEa7K~ib-SIF7XKo;M;aFGJ4_Z`S#in#xA&xZRU(k2vLi8g=Q73 zKo1vh&VgGKkg$QDO(u@_EAX{}fv7`v)aFNo7(C^;xq)_baSZ5%xmw%27x_R0^YI~p zE6r;)wJ=)d_XrBoqelBxG~Oau9l7-P_sTo2K-rjY-MeE0HuSpyZY}^Xd05y!p~WOJ zUzp$7(tGn-RL{p-Y0zjdEgU!Eb00G{=-(+Qczr+PyK&-%zWOA^s!?CODOL?-(>a?9 zD8pIV1tz(|v+L(HPS!<5RxT;0$>rr?nVA*Ia}{#&d)&3m;fxFnCT2j=dIv;aIj~?| zT{TL<9kMU}vi=t@f|iVI&HGZh$useY{SqkFJ!wDXajCZ*)Q0*C1?og4LTQ04czu!Y zbESoXlru8&t)FPlNX*@IZBUPB{+H!Jd##;&Jf2|zaf=X{-=26BrpRvcnlQE)@z z!3C6_{>H>)-19pf9){V#=W}4XD2^paJb=_1ym&gAnwrdna2h3il0OrMtI$G&b7gds z%dhj0;9w4&D#`7WUZ36UnZEvhG*r|Cc*lg@3vz-uPU6mP+=yf`B!~!q&&@V$XP}{Z zz@F*)=Gqgi;bEexH3}e0R|~K`OVFx?SE(n<>TpcV?{;+f7)lw zJ`i0z+ugL?d4+o)F)}u0B;?d&7?Dz^BTpu(@QFDORpIxL!dcRfvS1c*5JM#94&7<2 z&F>R(@^-NO_T5scVO#e$rn8>S-M$gm1h5|hp~rE$7LmBf?Vvai34N$`SvYCKqRvN0 zJ9(w3dWMdk1VK{t2Vus)1ryg2n!*()Nl^XiP;=o)g>hi#ZcO%X!RPQ|u%>#BwE8cD z*b9d5(Qy=V@B5y12(D>wYuf5{>pQ3{In+ZGA5=F!ZtnYOv_TTfne{fZ9Nu`D&)>^E z^uJ)ZIeqk7ocJTh2TU3w$-a&5ygXSNVg=X1vXtNLZ4^`JFPVzR$`1=e={xsynjeQt z7W^)x`DCuE_J$7S{dG{O<|YmmLtgBUZ$3ej0)w9goCdxN3=L6e7Eymxdn>R05AR^{ zNuq}RtnT|ZkM@C%VKNj(tn_bUA8Ia-`s@kzQW32v+4&#cw^gVzn2HK%P8A$tF#Nv? 
zI&Xt_4O$E#bJC#~1@Aj3k1()>7NbFi#0gqTD7#N?F)o0(_)qx(JfY9i#Qoj=I-qw>PE0@v0N@Hoz0%QQF@Qe9FSKfU$;SQ^6AKGgD573& zi;NmM5Aoh0BRz9m-=3{jAg;rq78zM;Grmeg+I3|Y^k>+>$I?cV1VDn>TsS7PBFzPTG^y1C?6j)9v`ZJV23UrYii*l;L;ZgSaqp^?254df4Nx{G zwtvn0_Vvm!3=MD}BoYk0DtbUoBpfbl@?%8qd2@P-$&k7aK z)1_L!>Yb@@8pDI}K7)uBo1@v!M2Q&_B?}DZHrY=fVXGZ9|G`4QQ)U#LmDqql zLpP9l0SrvA_$@b=n2hYv#RRZLbUAMBt%%`eS5>{F#S08X@$SFjaw}7Qeuqa?G%vGb zp(E4sE~YqM(CnkP)TD0n1C%stq-2G?y$PH;o6RHFUrq&u2-J&r>nA|EQD?3=S%uMV z^9c9ZGp~>N#l?ij4Elt1a8sPtgs!B(lNGQ0^z^0e2E-*9Yud>NU~6Tnj$Wfjl=0@B z@zqBV3oJI0Z0&f`@#Vb|22tMaQqexiQ%zUbezvN8b~b!LB)(KnUc~Wax+DeNq(`pT2SxwPaC!LUmNm% zH=5|dflJTWY| zxCvj~cn9X^Z$UI0X5dGYGX6K;>#{`MY-JrCBed4A{&K8-5SL82W2D>Q!_L7Wjh79M zAlMpvIe%C_SnCIGx3j%HzAbRNtrZj%LCNPySzx)TPe^zQUdg*SI7qirZiM@rd;bUq z$dT9>9>)-~XO=%Qbz*&yHmFdQskkM!dECM6t|f*4iJ70S;#aw2@F(I=PrVDOGX%U= zS3+8O4S6I{G%6Oie(OlN2MG{t#D4EuuIW@hquzbk7rV!sK_6|5CRlli=VwAIK+d** zB#wCTpx}F3ML3ESQ#V=dl})`zUARyz5~AaICxyk%-R&-1;{|ge%aA18&%e|mB$XU* z>D%z>Zc(?C`;DAWJ5u|D=;xOrHo_5Z+(FFk4i6Ao4fLN7;>`?LCn>_*mcIihSh5mA zx#3gEf?XoFMtdc%d}!qyQFZb|UO8Dr!=GHY$7bf8Ig<#u#p106GZ9d$HRZnuth9*gRu!9aLw@J=7yI<{-r9E9-&UJSC4)wvXqNDS6$ z-$7Yy_mQ;`?spYZ_V(&lRs%5Vu(0e4#yuggFOE9CS|c9q@;LeR^>~L4j`wcXs(ECo z(Y-y-Qe$dIem3u{z8v8@ND(~tH)q!_E@MIljc(`1gSs4qrZQg?XLe3s55MLBJVt8B zB~9Z%Dg+2EEi7_)T~N>$UpF{CN}L<)f~5FuncEtGJL%}@BO>~{Y91Kfx?Vho}YO}G9NxKuW*;nrI2vG)_XZ#4YcCi-kzSIz`(1j zFTmae_B}LJ^{pL14J>31|1+cY-^`Z;1W5$;uR+3E?A0sj><0nwCFzVz;zWbN)C}QN zkQ$8UvMt}@K~;g*l?AoG-!!uy1v10t0bnh9dahwOd@uHAN`;f3e|lg5z`)jvEzdYO zK6Ea>$RCHK9HldCzW3d}(zx6g%U06W(%K6tIim?bt2^tY7~0~J7Pn)vZTfs$|4Q|n z(v#wlaG{t<37=VvA4gWN;}}O(yV91kexZHq+I4`x^0oh6g?wvn!cZN~ZzBvY>*XlS zt*_26`zp~DFy(0se9g232zR6hOTVRj6@0pE?4{yEX=_aIa(a6uD~pY7io;5sRyJmY zw8KTa?(m(4rMilutW(ldp9wT#9Ddb{(JQ%YWRCG}{4ROc51L{44*{pz>(+CEvZ>5a z8IP!VYH@R|(5G}u@j6c6a+68rde~si*-rM!k8qqq{dtnm*Y^*CxESn9E#eaT`HzFg zOQ@d`rYq95COXu-fB(LGqHUx~#TvPWTP`>RBd-9Plr8T%^AS4IrU}u{#_BDK`rt4}MN zPhLt%*=&>~wTsKmxjK1}_*}nOP#HcNduWh`L(io6#5HwQ(_oSD-~m2ntg%e$KuRN} 
zI36$?k9TTAMPO}E-SDNPu+Yl#t;@}?LjlK-h>x@V{mDiK3Bo>Cd&+X>%UbJl zLTH`b5c)SSYJc>(8-YLzsPx>}FDQ6;xN#es&dbW$KSZq}mCWtDH0ak=@2Ib%GhLSm z(~+sEuxq#TdsC}Fj_N-Lo4xSThEMf``-i@`8-4Ze2vi$&gm?t+)BOku)5pLm zI&&orTv4RIJ$9L{0nDEJWwL`1NKZk)syIiO^v}fxEcDV!-)Fgz>I!S$d-BPiylOe^~QB(JwE}5DQieys-a#cedl|MLS}}J ze+M9eE?YdL_Jk6X%#j@Pp3W$Z+5pWm$Z6t(n<2>Vfv5#|_*z;yaoS|jAF#VQHM6jj zgD28U>{s(xo&rD@M5I(&kJRsfj%xlc;LWdCd~4U%+A7fYI+r1!R6v%G#J)|( z@mzgXVk1L|p)g>u5M_CuOQGs*MAVfGtz5IB41r*fg}tTz@x-{@HN;@a7;WM67;~-} zonyGlVmMib3W0nR7Ou8Trje=`h5r?zxQ+?;y`9tc?UJK|j*DL?=MZ8e^}1KpW5piQ zVdGVU_8ZJ^)g2pa-}|=UzV0&ok$jVjIuHVL<^Y4H))T^0s)?tdYFduEk|$(Plfeo3 zrK85y(V>+|&n0HFc7O3_tgV_Ut&@Z)Ip?k_)1y_meQ)$>#Ntm)S=+@;oc_E&d=7$ig? zxtlb)fr0TgdX=H?yWEpg6Lf(?=0lkfH*BEYNMHj+6TKZFZ)r9$%nn@?7|9mghECY|h&1tHwuCAP+@2Qw4}MDN=MO zNFaQ)H%=ziLQLLXa=uqnNKuMA9RFR2&|=WxAT@OpO!O5!zJQfJG^p@&{6)kjdK?(Kj*j~3=7}yY zRKN!g@~Ep*=i9^1+d;uW<#&ZQDHgg59D+4Ak1hWGw53R+~_~mdzfzzjL}m^;A2Qn50@-8u$H*k-XdaVU^=ZEp#F{xVO3+4f?( zmNqpNU?fQaJa(7#o13d)r=Zb$55hdLBtow3wkN%Yh+?3P1cr9*`EEu=mmvQDqEPa7 zJT3VV`dTdji9VIR*7vY;|vYlu2s!yP*XHqVVa7%>qeCU`5bt>Im|x z7$3eEoI}zEjDb+5WHt#CNyJ}@a-+p<8yEqbtU)Ce8`t~T!xQ~EoD32~i78Sbb-4;< zY6UN%M2rWGo~((B&fT*^P+D3J15rU(peTu9t>H(u_z|kQrB5ZGzmhx@>qUf1h{63J#j&G9}Qd>7on6P(M^ES1VLNXfo(sItvDD7Nw-s4VE$QwOW><`yrMyi?wASuRO zDGC;TPs|>Dekl@Y`kr9MwA$Ni{Fg06^KFkWP#3g2PmO+cog!2X<)K6SH03L=^)(NI z(QpPrE|-^=!Guo~5Ha9-SyBN2v6qWXBL44??zwW#Gyjc0N2jZN0wS(N^Y7tz&&H}Y z3ob4mKbZe4G+QZ)ZdZNp*3P){X{Cj;y`c$5uM(#pDVxJCr)=L6CeTV~#*weQjllRG z<&%rGQ)g<$16p9X>z7Z#71VA1Z^Y0=SchMezqTojeo38DZK+Y8FhUNdqqIas?m}WP z9(O6NGe#KG{~a7(?Fr+n^n6N0i=+Vi!1wZ;TK_=ufB=oroeGH`5Lf(K)S7Cbua7}A z?QrT^Y2o(+?2bTNHIH`nuav;WKyn;Kw6yelF|n29&O-r{gEj1r$2?1P88)CwxSu-7 z<$CH~KPKSb-$VuM5st6l`hv9sb}ohx3r`rKk*^E zy|LWS;Sw#;Z-ZF{26VZXt0n#eF6ZkK7NtLM!5SK%kQtX z@7`=dI8(W(pw|=Hh7?zjhBVWMDv%5fP$f_IKu?1i39GAn3^=4l)$s$ySlGim2oa){ zmB#>jdiu=n*$+k{NlCRzD6kOQX+RPLs6tqmg z*3_`{67a|2`G8B3eXXj>5&sCDGWqc|A?6kq&Wleo$l$Tj9v|?sWfq`V8%Cm#>dQC8mE z+ysN*(RF)G`l)4~>AvX2khdftBlBv@5le~?PyO|ld;%osw})`SOq<`aAT_gN8XOSq 
z`Hoio9EZ!kFD;FzTAP~o!exi~Jn}vc0^psPm|hHFZNN}1J(E?dx68nakCWRe()J}^ zU>Bf(%nZK;bSHREVNlyMn!Kfu+*P8b{7;L^>;AW}E$yq?Oi7OL@t*msLq=25N8Bu) z_T0RS^Uzqgg@pnV3zdSKY?DYp!=ErnlPJ*2^B&>sR|NGK6;RV z7J9HKKD)U-kO2amjf_kbBJ7!w%ZdlItSm#dUjG|-_=JIp4`*_-+b_@2TifSUQdklo z23077Q#j9^7m#KiE$44Ax477T`r~O7ekpSR0V$VOSltw&Af){>VUZp{dHuk=VPN&1hzcnh9xqeS&GDE22rCP$OL1l}^j*K$plLmByT?H} z-oc0cMU9w|(NVw~D0ngNz!Zf_HpI;d&~eicOZtABM~s^fMQ6}7wf-KSb5goee8M&F z84n&zljT#n9N*|S9Gv0+CV_+@-#2Dm!56;~Eg~qct^a;PZah*yl+mhW8;~{|gy0h? zCmU`o2HAEw9f>eAGtc?#Pf>`LHWr;Hl?5f3R5PkmFO@#;$2w~6x`b-)M!dTcxs9yamUbrgawK+;y-Ct{RX*CqF zpxzWv5CI0xME&iHFRUo00?)3;%AuFQX)BZOchFw?VQdPPWrv+Sgd-U|%DOr>Iy#rEP8%}q|w2eHG~b6=O~fkzONh!H1D|nY@YYW1RpPYjsG2MZb&qpk=^zE zO$YgPX*Pd>!^$@Jl~Of|pzwetanpIOR>FxxGS=Lri-k?K-QCpvEY%zQWkC91|Gg0?8XQ7%I>`d)0Yt-a_es~BdHJ(5xY83CG_!%SaTom?<=*Nf!84AOCctL zI5ZS|Pl}Ef7;FQa(jm}60W4+l-+;B=zq-V;uYko?{ZpAf{U)ZpQz zegIypRVF<(D*lTzJbYr}jg_vc`PTG`t^I_5JEq?nUCO8MBMfUqv363~Xuv5Ii+R}Q zs6~WHZ)a-S@%Qf=oi?nBswxwsO~8zdiJ7+sQGmsk^M4Kd&)CQ$)LT)vnv3Tc3Ru_I zM6%?U`PKiL8>TkT#rtCfGg!Art`FaCxq!P%Ufu~Ye@lJY0TtxezkdgVNskc*#^6#3 zZe?U-Ue&8Qq>&*s35W?anF{LpVZGca`L^?z^otP(Sk-zoUS7>?h!MRLPfrArlJ$vO zOBaqODdU}W;9k00t$Vt$Huhj-X>N|dca0*l;1>WGH+*g0g6bKiDH~QI+Fn&12Y5*8B*b3}Z2f zxr>IeK!-qj$F^_>Sc;8g?U!9hO{vQ7JO+?|{CVlnIg-;uX#`(22M2bX2Oe%9)`~HM zj^MZzCB(ni@cLw>AFCEm1VFCVd6J@uWAbBkxUrlBcpn^{ukP_`6+~xqj#oO3g8J=o zB9k1sOxW4K6nuCYi+@PHO@&jQ$Md=>7|-1m-Hxs=CpcQ4!NZ24GmqzFJ0loIXm0IX zkUz=|HbehlPscSWTd4}A9q@VOU6;hMg}iLs7xWUC&pfcCz0axN0^y2`e^!^5Z#uSu zdP{MDd}49=3E3KiUV8=y2S>0iTR}@Hy zRN9mnb@)EyHMA6j#UQ8s-2S>GD<6|aRj9gQ*%UWf^}Ap`B&K8f3hLKHT6x(YohlWX z{tq@`P8clMp?U^0yQ25ADT9PoX{OO(q0r8a!7zox)6IXTZq$tbaoe3TzKFi`3ZcxX zf#eN#+pXPdwQtfzhnHJNUfu%o?n+$PGB`$*IS!_AXwOz?dEY#-&;Ox`BTeV(8_r_W z_$op!5@z!)AKZ1meVZ(8*m)C~NihFY@+}+b7^{$dKvotpN?egFGWcGgd2{?;4djXa z-k%=QGKrwGGsR9@pqPPecT{SpuWt-Z&&=fhRy^iR9o5U_PSx$Lo&0l}kl$xZ%WQ1v z*a#|bgf+?Ni|{BE73)90enCXO&smY-GReb0GI#|J&$@%0%=DBjjKjn9&CQ9hJvMR0 z_p=2%zOuHbKT(8IFusgOQLNoV7c>zk^i}}&tH4LIPtj{iXXp_9GzQ)iM+?7iF&_%2 
zv+hSWYn<7Mh=};wMfrGF(O*nZFknT%CS|48NYcTf7GzTZSzCktq(K0CSP>s2kiji^ zYHV!n=m;vB?`V4)2_))H*9 z&|&cj2yXTaGBYzffLa0+$(c0kC>a^$=xn;)!iseHjq0^ITL}BVM!#>y+EUccUk6q= zF#2L;ty9@CVNG4qe4>*nj5lZJO&?Zc_EPDT1-E9Y!ig@dmxq456M&PG&%V*@q_;hf z{{{ya#k5}-`c{C0xn`mMu*V;&$8J?{{=6Tx$KH7BTDP5(NZ=`8puog*Cz{ZN#MLwA z=kK1Nf8St2hhF?HwiNec6aVbKHzA>sJ-J=;_0-ZD&+15kg4nHKfgzUH;fztzkMiQ; z*^c(17oWGYc6hRs8;V$w5#H*f}Ud~)(~ zAVECo357Rv0UOPc$1Iy1lwE+Q$}9;D*g#a2EWw4c#cK8h#P$ADEP{}1Q+>U%`mB`f zGwErlkgv{v5it;GJfa|uRo}AGt}_PB1qqOkO6NKSi|bKI=TqBr(9Z|IM|k+qt(`LZ z3gDsX1m*fnzGjd`U&bMrYJ(5$=?;WL9UK@?#FB@DfarP!c%~lCRB~{9@yZ9N@td2D zETCVEEcp^ArJ`{5QK+9Jw6dME*ypT#XfXd0)c{hTd%^42edIAYYl{@WuLjGQC9hsg zGXG9VeJ})_5RyVvDuk^D4^fHh6uUno{bF_W9kIJiOKUKBMu{z&+5`5FUffO}YxJ^Nak3swd>MZoUn$m7ca z&F9es=X4h+sPCj3`iH=}EFa(Quf|1|zFDVoTER33z=?_L9Wwp=-g=6@@eBnl z?X{{$8CI)mF&hw2gagAL5;qNsx3pm)kXxt|bqieUBtK~Ets)5umHT&~2M_*1L7w5- zx&+)-k!TTXeki+3FFn}kHSgOz<@>rD$)14b&- za}KT$|DZ>`>VCidm1VjEB@_QwYO`Ir05MS8ZL;I5wLVvaR*dK|@bD;HY>K@lT+DJl z-*R|a7t=O=t=4X~c2Us7ot?S9eEOApxtsNtPoUh5gsl_{5mAoYk+O^8c44U%nOc^h zyyz2w;BOqrFH@+$)?)w50yJ0Xd3g@n(&Qc6syAk`B%KcpF(DnoI__ z1X%<@OKWR;>+*?v8zFd=WU?1dFUnHCb3Ms|K*G4Q^UB#qM(rk;8Z1U_+8rJ0}tfUlA4}}r{ zkg^~i6a+Bu&v)a*T@Bv%NH~#=o-S5PF-1i?AdO*OOZB%r6@Z+uPs61T7mW_$Utg494af9CcZ(&;rF;`sH*yU4zy<>#l>a@ z>lWCzd4Xc52M2UGhw6jjnBhUgrb8j;j2p2m0N8jqYX{6^ZEm}+Hpgz-iz@)~@~ZY$ zMoet;?%W=1K<_1G(&|a>-GBwSy z+A-V4=AWlAmUak!vzKRhnp#?xHkIOS@FHVz`I7?$ZXeaG!FBBW=ZGg9K-lGTqUNWe zQSS@^Cv}aV?zIu5=Cieaubm)ho|xR^rp~}H%%Gjv*$CQ-0CWY0nS)1OC@@MNl2cQI zd%7cHd`q$JZy0vLrZyt67W{RBfhIVh1NQ{5Gy*42BFFDtWDsvJmPx$;z!bTzgog(> zz=Z+XOm1FY&?*%f-{NFpNdt~-;G`4&H3g~#ngx6br2_i?UdG0WV3o2?dao7|6LSO3 zs&ZNU^EiHQ+|p=4*AP4UryoR$a$*w~@!VlHTgW3^ssOp4@hqEH!sG)n^RSS9+Utpp z*NLvT>5^e+AOha6FLBi_zXvg}sL1_byqhr{(B2ejI%aTHEIl#rcfK}!P!njgTdgUN z@9oA|Bcp(~S;nSN-Wq;Q6Z&32`2_-_@oxG9kU-4Nn$^!!Q&T%8f0CmJ^{0zYRY?&| z#N>|tqCJWQC$U8CF>z-YwWEOQeTbJdF!T-UrXpkiM$Y-sQ4!k~c2o49L=vtc_yH}9 z(rwBr+Wjm%YJa7E&OIr)MXpXsZ*QN=xalZa2${2N_j{*0;q@L17t6ORj$b{0I%zEW 
zw=;N9mw4w(Js~5i-mhZOC=g~cwy&I6rB8RCSSk$x zIa?F0ukku)_b9%eLyyK%(-PyahxbkE1qv6c4@wUN_QL(h&^Mtxq>4co+)9p^q_VB` z_e;8AAvILwMT#_$mt8@Ao8zzl6{ku{fbJ~3SWHh8o!$KVZMk~a2E+GUw^PCM^S8Z* zwc_OSO*XnBP+kRC-J1@gxm#o3Xe^y4lRoZ>)WC``6IYM-HtI&A2IMSz(F7BEIX`X5 zfJ~#o=3eQ`kCr7>I%-$|d-D}o2ckTT-D+XxPt>KAnocclrz0M6+=ObsRH%q(@j^aD z)eQXmmzYp&fCoquMO9TVk)P`e3tQ8J4}xs1%jW0EI9n(xDvsE?l5bB68%=x4RoPG< z=Ncbs2nc$Sn^rVy9f>mKn1)AhX4H@z#&f&tb(rGRfiE|Zsvuib`IecGKI6R|6)PN_TFxCI8{d2_VCb>&4?8}pxh;0({k(z4TN z5f}kKA^e_S>EiV%;q_1gUSv>33S*wzLF)_$S^_gy$f{T>hS&UJp3VVRGvC`2qTg_CE>=8su3BEz}e9*&~_a;uQ`DxR&pbbg$vf`)iM-LdQ|lEaEn^u;L6 z&qsRMTb(xPo`-+_K=Ux?nnh@BNL9&Z750ogT+Xcz*~mo@C`*NGJJ z8c-2!T2~(7D!9&J@p1NqhWz`Wwc66yF}1(|uu!u8q#_>SJ;R<83N7dxkR2{ty5HQ!?~emtAAPsEa3 z8x*SqYstzqSXnLF{PDFd0&}j4GP(eV6cy6ac;t76lhK^;uik{Be~`1p<_a zs1tOE{m#Du$ymFE`F$=9L`$n6r(H%afP1d3Riu;yehF9C4xvH`G`y0ngqIX8o^H23 zyHow?gUWi-sIRqB1w+3ED#5|AiP~XAe4{db)^t=e$4FmCH~3&ODXN!_%)viS@juN0 zW8cQs))oK$`9$GIx{qZBc${2Z0zNy5^!mCnz*YvHNB|`DmWbDQsl}no)*JNbWvHDV zZ-@Y1H?#*hfx7J11M(U{NycSwY#ps2!ZE;<3L*G6kk-t9^#PA4eSLlKnayTWP*(?` z0;K_ot@FF_@o_MuQB%&Jo-NlX0LkgmEKcw&1AY)iF)@nlQ8Q+CfZziR6(t1) z;H8EKQ>T1I$giWLBLLVN&E#Xk9GH8}jrVernJES{kjntTEQ8L36Jcz!9Me1#)uHLBXR2PCkDq+KphWdg!Z1gAOx33@e zSHKJm9D@MCErb*SR^;)P1OZL(=5Wl0jjbLy?I3_6Lo{s(9_E0@a*|_;wD1rfe~TVK z>X8Uon45ZCz8{@Jn`?iSJ+x9`{)d#;(9ke7Rda9%7d`lVI3dJ!`v@qRSpw4Z!1^#W z#Ky`BS{U#sJ)i^a`}!ib*aV{m;_XUG3p1AG7K}I)=&)dcT|Xb8C4EfIM;x2*rO=so zd2Vj+VEUu3p5XH(v+>+_vYrrkv`f~?ec7PDE)z&tq>hem_^=v7ZwK7C4(y@DVUuz? 
ze;TN?nN2+YiK);=v58K5uzY9Oj#7z^Jtb|qT-KMcK9i+k&$N1<#Wv|5-hXyv>RNgi z%K3FM=|t2Yfj%fW)d*a2m8>Tl(m%XiKg3AHT)PVUw<}vM;mm?)>w>Ni*;4X7%9!Sf zghuI+R-@Og%gbc{Oph~~!L52$Ir9YsEapg_RffCICa&@BJNZ*(R{W08750ceFeMCK zM*J-TvR+Y@_<>4&X^%Eo4&Io5B3jRH^t8(mqGhG8GRVjw>=1~%%Bw(au6|ZJ zBJw-4_q@C%`!Em*cz~m5#+YMYE~tb3+!b@2@>Y#Upzxl~8OXN|8W8pliT8GcgfhFqjpXZF*k@ zr*(B8(s2R8{rVG8zkDt?MhO~d95|82s&%4m{CpOhz?1?U{Dk=0EGBnSQp%^NuhIQS zGTWy*+W$kPUtM4OjpqP;A9$-4*L#?l6#n?pEKIsb8bm2Z^R7+iM?L`84`RJuT&RV- zWo(vkNJ|!@#o9Q(0~6$G-S(VO>ZupRyP?;jYa3=A05YJdY7upfiNGQ}tN zzwhB8veJ)JP5mohB)=IzLe8hBIY5XCU0e?P?T3^jTH;tqbaBbh{jDu5{QLK&$1!)C zsBX0y86d$0WK?EmW(Mc)Ck!P(K-+d6=Rbc-H_YwPQB-5C&_~w%b@j;LNF5MhO{Uj8ua{O)S3g~9&FU$oinAXc#?)K2C=^O#{PZ(} zgj+_9jH&uT{BPtBrnfMV-@jvs1$jmOd$DrbUJ#$Y^QLkg0B;vS1kIZ`GaG&Lcf(tp zd?0OlHUx!tW6OHL3d5ll9Q4krN^rQ?*}Y#f63>{;2e%T4Vh001#}FZBbD4FE3oUjs zE?btpj}7IAf^?Y-4gUItEhzZT*DI+?m=wQinTX|;$%-z6 zvXU+vetRt7NPFjBQBYUsc%8>#@@Cteg(dp6>CO68L4#JR-8~b032L(TF<6w>xlr>h z*Qj;zwesbFcx_?90po2}cy4%QOFE!frzbw;nYfrxG#9Xq~=x#M?GGyUrSI<=L!T2A3shSy;>x&o#I!O+-caH?WD01XPbLXaAZ zNoviIbZC{m1!hPM=)7xd^O>0y6Y({HkyzDvn2Noa<0B|3m8f$h!T`XtpD{v*GOtAm77^ye0BxcM$y*eaJ*ku7rY$ zM;PP(X|mmo>2s4zMR_?Z+xw!4uy0U-J?rVIET1XIYu5%?%ZVVuS?T&Ns+0MTmYlECU@K2w0ro zz8#yKw6V2?A{vWJxd3CM=xEtN-++;7@}wr)hQ_3ZNJbXCG%#QfG0>pIeBP{d4HvI7 zUzzx)&f^}#D=r>*lRv8I6#AMsA>;!ju*?8oO4O9>r3=t-2LRB<|pKLvKy z$5(d6(IURa>4lM9t`ZBSPVWa}vTnWdhnK)rLe)4c>Dh}3wtg`H=Z~QP^PSXT)YJD? 
z+?a7xvPKzIshn{c6dR0s9S2>8je&kfORo;x-o|#HK3%Mlnc~5XWUlTW@H*Oz?NntT z^wg+CWTDFc?j9RnEQdglaZvTz?&^{nU-rh)uCJHQCP&*-4_j{PVIlV1Pl525F~rIy z6hyrLHo(;zEQ2$3ApSJ+;AZJY&Dy7D$^d4POj`CX`xC=XrfM_J)yYo5SH;#_wsjdJ~2H6iu1tmjXwZ<@z57 zu7uTBHRi^T2m&5uwY3r+0fCXyUs`pl49P_d3=CXd!7;-XGzI;rbr^bPs~vZ6dJ!=i zo@3X~Yh+fx;Fu6&ID8x3eY}hEyFPF+`-kQzuTw4$NRYcb0F|=scXuWNygj{2s3}ko zrS)4_$ky(z%b1s>iX<#5|9~Pv{d%1!B7BDM9jM9zDi+D_MwJ- zQkr7N`wI{C>XrNBR*3)0bLh!}i`A^HkfsnU#CrbVaZ~cS0v|)U7>*HvpWlzT&SR6T zHYL=cCPCScKz5p13_WoaSLdgD){u>(^AVSE80NZ|xcFV$wV26&V%E^WX4}hy((6!gY1uKr}vEv;qvYxsrXWC|wfRyr9$BeOVz56aoM3NxC%) z48^7XKtWuOuF|<~m(JFkjQAe`zm0*ju>dbah=ig#ynmC8RdA`jOG_;Ox>UkX9w@iV0%=r%EEH=z)6pIsl$f^v|`XF59=kdI10#0{uzC!p7Np zKEw1xU0+$5*=UxByL$$==kG9+st~IrnNX;sFGSyITip&#o(6hy2S(#1xXG|B-7aUP zHJ7EzA7U#z9D*A8E#fn7;Sf-9iuDgKz(=RJ|XO4>9q;7 z{&iPA<2Yg+jkkmu8d4xyJG&N_sKufzY)+PIg0+%+ws``hg=mT4pQ@Zf09(eOs;+drVg(}*8KR6!2QB;(zkd>(3+V90=fHB8URHaH{kv4 z%vH9V&}h*jWQG<_kaODl`DI3|UnNv(nmRrf%V(S-o(4a~3O#H!ZNB|tk{L2p$&itm z>GxWo3%Ow`|0e{Ng4r~2NaRM)#w5E;_Coma-rHZcqVS!3y%^)9un^a2V%biYv?)H0 zPvrIL>b=mPA65~`16-oh_Fx~4JZ2HRPM|@eRKc|?-dvmxo=DB?A_q*Jg{+IerL54m~U#XN%y-kWT zX&?iHt#2DR6nGY9d!PQCr|Zshs&|@w_y$=&AA3+mW1BH%$TD3T`c#>xg-5)|@4n?N z!H5ga!G!IM6?%NQ8Bj^g*#qs))YjKU44IA3vkI#Us}3VcU*bYS;Qs%eO)Z=D1bQ zTQE6FXy9kcIt5yBY@8?B2y7gj!^1;SbOp{^7KlC!Okp*m#Q1N~)I6N3N;JCL>c(XT z2m}OX;-o0iiZw<0RQvGXI=iI4=KjUH3xrMKCFkUHJbqY@_|4VS^a(|_p{(ox*K7a- zuZ8kayRouTNr;8b2#wf~%W`TC?2fs4cqAp3TU?J7b##6|T&FwLS*hfGP-BdxM%R*& zk$z}J7An%s;`bm-U$%udH7-EKD*-VG?;snO;YH=*0Ps@+vR@b&8YWHCckj8_IBjPN zKKtb5>wQ$KDzgkCC1u<0H7uYI%F&3Teb6gZ+6P#Wr{{jzOkUTKj1Pd4MwGoUKYzbH zWC-z3Be;KvPU`Dh*d3*+M2%2YcLFU9^;*6{rOqYGAK#?@yhG?B=RDvAEI81KB(Mj{ zr-v&G+4b0%7(tH<6-~`xVNxulUSK+6VNnM=!9@$J`}=!va!}km$k`+XT#|r*86fZg z9T}zcaRRw3}Lb79>zkF5Sa%enBSY{FmSXi4*-JD>2>?H^+z{R z&?E#!jGdq{H&?_UTh8^B8CC=g;dz~Q)UR@0DRXEaPPQc(#7or)9FGIumO0}7(`1(> zTJ3E96gb_6VSOaU*m309G3;JxHXts1$XhIY#i4vBhbnnEvft>}S4+kyrq|`2AE}(l 
zJ^d%m2tGQAGPi@JUq>Y7LPRF0j8X~;k-BiaD*>LmWry_94|TjE$y{oDn^cPPxv&@&w`);747cx>sJVIXzerCs2G(FF7co5nvp^ICl7eAnw?%Obd z!IoC5Aa>lfwZ@fSZLp{o&KWG{bW~vy@UiiGEd^72FiH)J^>aU7*eoeYS1$S|U+YJ2mZ@+tGQt)8}{qAh_fH3QL%b{t;jNt#rAT>rz-)2)@s_I9%H$ zejMMG-7iO1ckkJPM86p_(nCmZ4%!v-i69V4%y0e5m@rU~uwJE~j>+COHpd4X$i;WD zSsOy0lRMLZ9zm4dRy~xXetG}}vGqyC?SaFY4KTv}ICJ!I)#vUcuh$C83I`ke`Uu@| z?#JhWr@8RHzpMVh`UqdBBuu%n{kWAtBM&%?sx~$-0s6zkKLO4M062hk&dnSVOalW0 zL9R!+Hbhl8_d9^o3;I6cVT`9uJb*<*Q_6*01s1Qa&rtF(5d|wV%G7jinE21%)-li! zwp1;U>sG4pxW6#@M*sD|@qbwW;R1fiO)TmwHL-v%RMLg-DovBc76C`NS+KcTr~H<_}^|M4kN+E(>)AI57-I0 z@_K)XC9aIE@r*XR7|Q7II#E;SYHUCtq4w5nAvJu3^83$hy<}=J;Y4MWmz(>h=kJ98 z2dqeu98q<-x-Wz8!?YD!Ot5p6(D3l~xfaPJ7^CV|E^bTP8DCVb2+E0t zE|iZ)&sOY*q;sdg8`ObYU(nq8fxl3LBYzPC2-&)j8Xc>726*h8F|>y2vPxw zNdxBO<4s=+JhM^Y#CnmcE0d5lL~Y_rggc}9*4GsCKUtVNcSir+zHoMSc5zOrd(&J* zZ8VkxMTc_(o;SQsyJCZW%&<^`dSJmeE)aCg9Az~w7yX*5>}+~+w90|47OrCU3>Oe2Ol~f!!_aun zky;qg_XT4Y>SM9uLO50}&Nm%8^E7LHX_ex_^mw$O5O($HgM`GRM`Cscte(17R9EN9 zaJZd_&^_*e@_D_#A_qv!C@A+&DF@6MV1M_{2!+pe!|b$Otu`Q1Bm(1i`PJ72cjUpR9W={<`VG;8q~ST0bNYK|xB! 
z5!s#@22}bclC<00)VIs>#?2Xsa0+!0MPKdy3_u`F*7Fs`)mJZwuq}RmZkqgnDmXIy zT($84QpU(=X!~=(W;8jFkUKogVB_SlYnLUXW@s*~s6wCIs-81K?F<%)_YV)Q1iAjC z2xyYbEG$I|C3Tb0Q7FkwdNW;J@iZq3jjl5Gk68TL8BG=uMby@?gy zu(Yl&TY9uJLBRedTldn>$A?C*U{OZcb&ZOU^yVbWG0z)zOavgMo;lxykBoFmZ_W6S zAQmWAG?e*39tWr*Gk>Kf#}qn%oT$ejHKhSLeBIvuUs553B%z(<6dfI-G1rv!;yQ=| z>UBK$QXk1GjNPymGjVZ6_C4NoudiKQP3ZCwlkmBI^?j(+p6z!}yO*_G4w;$ABjM)u z{kO;9cB)&~V2p!1oiNY_|27NI4Fw%3uOTy{kY8&F?~ z$dy#;6{Dl0!5Rxox((obPnO{!{=aR1WYi<3q~L+Q78lp8m+8(dS}C|I3l}P#F1LT2 z)VRL5i2poCyIQ$MlcmyorK!2)v%3Qwk%y#N^J)yhP(&+!?3+$%MD3qU1VZ&O)7r}F zOTe81L5$QUZY|gKGP6*T*Q@&z~d3OBSX#@R%6iEidXhTQz zYxa1I9Z4r~MC*XBN$SrGQG;4FI-{~j%5dH``=JfKS~u>-Xb8lHB6F`s1ONJEY`h1V zsYr8Uk)#%XokiBUeIqohUjM7E%kg!2vuq&>L>Rxbbhk!OYoKggv(5cFAcG$wJo2M@ z{%EK76O41XW&hAu-tttXMM^myB@q@~Zo8?HzjwxX0XPJ=L%Xwn>uO(ER3h@UV80ZG zS&3VcXr)t%k_XkB5U8QlX9@X~Sqld8IwK)Q%zTE$!o$g@B`R*KPzpQ;W9BaG)jQl* zNx(#Qe|`W6O-v`Ja*xYo>4MCt+ROp-7p#yFocUF{qoYMWb#I-8u=T7=H9;>BVx#TxbY}b0)RY!Q{c6W>^Rb1o zL9VuPLSlOQ;X&CJHf!@(bZ4FW{01NJ&izQldXJ~TX-%lB*zKF7WPkZ-q|=ASOirlg z_19chQ;5?XVxs<$%P>=|MqmM=#bG7(^vuy zi|xT{U_l>jEN*v$o6Z=9Pf>}8$7NXIX{*;zLq?7~CI5-gE!A?9yW`q8WmR6rKD-(T z3T0sS4+8tEe{FFPB+}9jz>clO`#%229AH!wO4`6K}0U~#}x5N#`0~>&nmYR|RQ0U!L zDjb!y^^)dxBEH#}pZkAhsg=|yz1Tdlgn-KG%fa-eS~Y&1ajKb9Jt_5rr>Uu!L-^K4 z;Y~$jTiZ^+Q&E-EP!s_ND1h#YSlRx|nv5M#@w#ndrsfr3zbc;M2kDK7sGp3FLZrW> zbvt~z)s-i&p5~j|+8$W4#Cqi5uMWB1L4Fy6p%QwncW=o$#St>H6_?G=Cf`6V^Jtw@ z{ttkMN+{$Df{})`K#GzjI8s+rgN7UlY?feL>j(xT8Jb)e;+8fxxI{kk4&UP9<6Df> zh3+M>dT|(j;(6JGM@Q#YP;xc1GTSd4FUsrVp`qvy;DYC@f`(SP#SGF{&6u|CXhhP_ zGrafb3bwpLs!@nUAF5bi`jr$FS#L|(rUjyJlTkk0+o2%9!NJ00EIgho`+Px?-f0o-SJfw>n>#~hLRE+*GtD=Wd?7=h!Hi;6Qod-x`ze`$SGUW~z1NDZu81tP zyT}j~#r|F^10;i7%St|z8?Oz_s1$J|eGsPdlGZa%#e-A#)hIiq1cL;0PQ5A_J@bt_ z44bq@<{GzCHAb{BIc#YR@};(Yb&8El`e)YVzZ05bG;QRse6dE->Jh;oCB=ay?6p@V z`n44A)-TZHW!GU{bS|CCpzxe>&(T}lasO_)UIEh zKHW~c_pwA)EQSTq8kDWX*^0`t+&-S4i<8I@(gb~OuJ-FeY?skZR>6N`9EAHzOM`iM z=F>Uhzy9Rc?xVPW(d~xK{?2zB2qG_=lK8u$9X}miN;T+v8>!vrb;6r 
z{tXSWh;K2|c&2?G(SEvdVTJa9DK?Ri@0sL0A7Ko$Wr6xqBeTQ+&$k-I|8|`howNKc z2`_hXLct4d#4%y+UK_A&p4ULMc|2A_tGV9M&NY&`;K5@p0Jq+F?Rm6BLOn|wN@}is zR$BNGw~7fs&|O^Gz$=D!(Ae_v&fRXM zBCX|UdmEeBSFZbK2q0P~3}|Vkot!+IR76U~5F(=JDCjY*{N%J#(2}0M z2*Mk@mm33SPCASIBdAwZ@2H*$#+VYPUxmP?sXZr0c$_(t)kJ3wpG4yI0o+Ge*}k{6 z<)zkVTwUSsXAfZ#Z)J01od1v_gM#4W>k$ies_W^^7RQ!%y*4nGg)wNdn)O%tdU^t2 z@LtKRe3tP@3X8C$It4=|_qoN+z}UR0sSL0)QaYSAPZkyDeuFZS9n&x}sysL-y0&^} zWOT}?!EE#>$7Qs6V^&eUTcw+T_s;2%X}JppSPOaW&6xwo+lZw;1JsR9-fw)rBXdK) zTj5qUbI`X{7|W<(djb7@2!vR^!SveY^+Wlm80*y=pPOc`K)g2u{dmtBe84@ZKHwKS`qs(HP;XtBWofgpso!mLr+r}Oxq z2ku2Gqo49)AO(+}FdjD8-7UB`8$YMidJo)?3n0nUcM-_dX;&D*Bi-IjxrU1XI|TnQI%{b@&<(k%E1<3Cp(DFb3kiWsmJJ~- z&d)a)s9k7QD9%Dy6Bc)ZGXHWWqNq4#438!t=m;gE3<=)hK zD=W|6>NU!~A5@A*d(sYYK5gn{W1ctWo?x~y44cqfgQ|9lPd^G5@&*n0&6 z$E|<-kN_=u-8$Cvf&5tFtpL)!u3(fZSRNdH1xnAC3yIPCaxQv?YaLw%Tn6G#6q|YE zYa5QfFwkc5=|_%_C(u1+d3md<12DX$b}v@$(`DTir&C& z#dqI_yg_EF-^n#b@eYK?NIT?fPu`QgobX9Zeg}J(-(8QF9j2DdwXKDtX9TB@sZO6T zVABqSK3pGb62rqo#0zB@Y+XnMn~NHwHFR}BHPN?(1cDMZ$+Li(dQLeRPado3JwS}5 z8}$NlrtomZ?HCA;*X;@L@s+C}bZPnc`~XjrD5FncDc(Dn>+y;ZIGjy#_jK=!Q$`%y zceuw~^a=Is;-b2)PQCU!s4`gu;m-P%M~!M+gg{PPZU1%Q-FML-3yR9f$oTQ&4w9+U zB%YS1Xdgs4DRCBp0wW-xt2o92io)`z_K8kA4)54dZo&VURwKFEQki5!h9!%`&wxutXL)6hT(zvt$_ zJ@T<38wZENP>No=YsRZC=L}4h%l`5; zA(1I^R6Q4I<^hTGZcORFA1Ijw?H!W9U30CPJ649na3t^mlN&ExFmT$9jR zQ|EfSy!@ldc^BCJK~4e+@?GARtHBQ(NC%D8i@;M{Z_CF^Z~~8Zq^gtjdh~Md^8)%2 z2nY#*8}pe>WeH%7<3C8LtJ4S~l$Mu+8*ZuXQoXBJ1PXb)#Z;k!peRlI-4d$9DoA~(nYt`-e z1cc2kEp&c8VE+w8Jax@umYkrXsqIa#$~KwAc*iYoi-o(fjnw`*p&Z#_!KkFTvNroo z3>3sdq)~rAZ2~U0^HDzzfS~T zbf_$px!Y$k_kb$5!}0Eqt5^k39%Qsl1-gl62Thwlz8Y$QYow!cd`VgwzyJ;(_idf` zHlb&I#m7sSHgWgsV0?>wIhv`fi)ywX&OSf6;h(7Bw3kotidIxyjl37bjOvUiiJ1+P5q|ToAPs7xy^1?apjzGr>F>!*m|aR_O4& zoLH-o0Rl>B7IALjDnSaH^+Ul>YG2v(=b8rvhcluyI#dyiwTU*nQ+4g_)~DgIZ3X=f zW%XT=P|#{d5Pg%1(q}9c6nh-ooqsHbgLG%?^VqEuo|Nk09qWHZ{dvKl=ieyme3Nra zYO1@Tp`#o!NhsNe4?%U-K^o=S3+J@Jhqu4~nKne697r(AjyA7p8(#n$J}c`}j?x48 
zS!NU30{;8O&b=>CBHuyF{q^B_$86hw+~Y<^y@ds*u*_k0DahU(gBTs$&Lvpi*3LbqnB>2Z;2&V}6h8GmDQ9XoiIdragCmWStUE`aX4PRWm`Haw#QdpmP z+-)p{$usgpuua!)MK-Cx8u({_@Zp00o0GGJ0F->@IA54V0Z}Tl=i!V@ofLzc@AmlN zBpB_iE0KkD`|60Tf8`M~11J4j7{#%g}=!=c;c6&wK+}s3J ziRwt7e?0&SW7E8e{4h2)aOI3I57W@lghxdDx4#G82(XO&FTNJcMvKR65CB3_vXo_D zjlA5*IH3W+ihqOPUz=mVAm0^+2pqh2Q|fRKVN}#;D#g$kJeJPqU4l5kR09AL5S;>? zp_i5DS~q*(BuyrdIE({ei#qGMmFDZzprjWBijN_}iD=IkI*`JP`B{JK#-_W@Fzsc&Yv@sF(z%TiM@*v(T`fNCcYHcfd zb!=+eG&ZGLh_(dCpW3ZfqSEaF97fA{$)X5>;6YbzL@{AeE4HgRMs!$)950Pg0r$4W zlHkOImm!V!@P2pt=1cCBW(MEh+Y%k|_qrum6;_|rm&~`S8Tqc^v-u|ahgIhgAivdg znM`i8Z$NaAO3+JZI* zB9qTAV;ie&Wi9Sa#aOI(-ISm55)<_`vBdnLh+|X1y9giuZ|N9`P&-w8FrUL2d8V1) zc7KWf{CJBS5wPOVthCI@%K6zW{44jMHYm+_-DXbPKm%~EY8oasB<)P`yotW(jAl#Y z!#|inyX9%>3%n=8pb$~+pQ!TJS@wO8mI|=q$;mkdJx5^AGk06YWiwx6GjOo}E~`2K zANFszk+|dzJ$x!%%A@1p(2y;!cqYn$SUB%}eWd0QlATURxdoWd41}MPZX~JzT|~c! zAtYDH%BnOzUfJtm7XTbuy{;FRTQk6Qj6X^W4hJV?e?N^jiyIV1GxCCq0jfXWpOb|e zDZuF%N-4L^eA~&~SX|6bN=gbq@wgfMpcNh9A&`+4$LqfFBE(V|;6;!hSGdE9I6GhJ zwR)UqI}APHhkAQG@a0HzNHB3VOm=s-PCe*KfzFCh)FCe(i;N7mH>fP0_pDp}6P9|b zpX%ytkMk3Dm#zsj%S1)V#`o_Qk>Uxxl2w(b;`(dhDkW;< z@zfD$fc2BflZU0^f1q~uc&o_Yx2QxNydGD^t;NKBSDTwVI3`Si38oDHz440icgoCB z=l2Mqh?EL$$!kKPhFlfqSwbCRei8I)auf+ZuX?CmO0mRm#L0<8e!mjv6M}X54ql;b z$?D{!3Mj!nQ=S3TZs~$=QHTJ{3O@eLY2ve*e2b2D)z)t2zl4eH!Wv6pPo8FzZ)}N6j}?cQlEnxD(W>j1c0Bhr&q76LPPf$AQ4x)y2QSG z@t(+)_#hSYq)5E&PpJ2?ZAT_U4dqrczry|2u0v3nJf1$QW_zcN>*0@hpiD%;8bd%d z!)Cqdy<3=ITU7KH=jmR>D?|ae2$i3_d#Z-%z^ebY6g`WZdDQy{z3=AW2pt4crS+cM z_vL0`36Ft*I_gAojmz^T>8FASEyH_0SNdL_hs~16Rx`uAq`YwZ>}2F|6$MHdY!E4^ z`Z!`FpDQ*_A?!gT+)Uj%KL4k1L#>aCK(bq@Wbn5_Pdxh!JLj?5;(d-viJdWMSUkD_l?zL2&ck_?WF952)q9p#nRs48EW8?)-Z zqpT4xJq(s5Ll>Ucrt=oSl!!2lwHoT+;KF(BZzI6xah?73I?fjW6_E2vhi4!vCI&8W zLN)+`(iQsg<3|Z`@gwlwtr9gc$ycL|j*NU{LM!)Y1TnBsf*)CW`zIAiAuk8&_gDCt zEoRLzcZXoziAuuV|2y`bAOou0tBTY5x_F07%)nv7ywUR`}A zS0a&*IC(z<8lu-R=8v?{zp0jpqew>^HF-5Pd3q%%7-nH+4Dn{eucj^Lt3qzNDR9_hJY%N< zyZmF<=jd)fmNU~Hn>!qJlaqDc$CUhgYV<3thP_wD)IzFXtdA^t 
za49s!R3$_u|I>Zd3MI}CID()4_u=Q0w9?{kYGPtRdEn~04?=h^szs}BV4ki>heoZZ1%E`l>oy0axDGWE*Hq?(QbepsLT;JMy=`j*d-#lCxZw?|UMJ zi_0?f6Gc>{^s)uc_3iB{OXH`4)nP}m&7DoSl56_Ik~(% zrRo%3E^x{N#NkYJ*IhxOJVGF`kYB68-%hZDlrCW9Cl+$qixWC^{0WB>Ia47XHOcj} zw)P4HDdZ`q02QB(8L(};!{31%S8h%sj3IORRSyEJkAcB@le4)pR&?x~m6I@(@aJEl z)6=++fTSdIKfiX6_kWgSI(elE_94SB)Eq=Y5T64fBT|TrxQN=YV<`a&BJE&m!Pr5a z7-Q<|dXpZj|K}YsH~rR9zEZZoTXoE7NK^6)^+mCLHmmJ2ZDv$-a(aTPp_b}ogHnwn z4j+1lcq0jHpVcK=&$sJ^I3DR2Fhw$PEJ><^6YrwReAr*APrM!te&B=3TV*oZSsb z6X(BGU}s|HQlt-Bu5GDIYi-v~x(ea*#&mP(95A*1)^f#0wl6sYZ^bJ#pbu5Fr5M;0se`gLid(J_VI4nrb215SlW9$!(A(JQjVeg zB_t=wwD08(d%uJyaKmC^1)V8RHoAm9AHBFU8(>EEWDEIgD6W7BuKe@6TybsfMZg-q zJz7Wv%Rpe-2tmem`<-~8kC>xmVO^eaQ$5|+*W1%0+{Db+LdnzQGo}{{ibQxbU&;i< zWm~l=<9p7o@I5`<$dN*U012W*jj?l< zbYE|BnzgIBb>e#F)0sM^|GU3Q4mTNXsdm2&XSic;ef?;EAtx!YrslFIl2$+#Rs^K$ zwKY>Y3!aYi5S>*RMdNO?BuWb-C)mPab%x`PO=?ETsJl` z5j=!lwE!KDfP|hswJ;DM-`h4;a&<{q=W!mPzi8`pF*WgZ)M^$D!p!^=9@axp?)gZl ze`GY~v$FEJ@g^E*e>F9e?jM(uv7Rn0A`wz+^;G3{d{VYP9JQW(f8-%zuCv^4johJ}mEKBEsF0t=_GzvM1#bCd>3(?C-3 zWWo89wUN24ZE9+^`(upkUm}T#kC9AX-u?{*59kKKbhcD)3wEWpzRRD|tIm2NakKf{ z+l!0E+k!>O;NQG?QDFf`V~fKEKjbG68Y)@U<^n;ZS4&rUU7dok@Shg<1$qsbH2K*% z^ONN!=gwfn^h2AqcA}ua!0)R}foaT)M?}I%|KFQ zSRForoJ^Vdz8Bib5axSbGfco<^##P=)3S<)n)8BIkm`3g1MFwHS5S z>XDYY9-QXnl_ffIwQz}GN7*Nwvr*Tf#(552f2tIY|&@xS5P>ELqvEvNBub(xR$a!wPDmz1E+D7IHO6QqTc z)`ka|&TGe4;X(fFS2M{S^M3KYj3lbaGw<9vT5M#w7#pd0MeA;BZzq;ec0J6<%F?s7 zEuZBD&mlm;qtp0>fP}ONCK(W;sMKk?@b@e9$Hl|LQ&S6%JwB~*)s$q@v@;x_X4=9) z{G+sHu}rP#?)_fz8;*v-tS#?+i(;6z+|0Ldo>t4qO3Lu}{N4A<3z`Uloy@_CL9eCs};o-pp2}uD{ptE(fcX}F&jBVH2!ymTx!Z29fZKyic2Avv*4m4 z*}k+MBhbFGIIgw(?*N#;*FgK=zkg=l-rgg%t(eRx+5Da~A9wacF1ebP4~PZctTwTU z(WS?9VVa{9xH+qRyh^|C|E-Q1TmO>W)xGFiCR)UleEx+%z!2JgGD%~F6d}ok2@Kc?i)m2s_jq>) zfdCoV+$3NBbIzxnJP7FYdGXyyw{jEjUIeDC7tD=&bBx=bNX<$e!Be&2X&&e2($aY~ zTF0)3FVun2ZTvyPN-HgWu?6pWv99Ho+iGk3b82qH$0sI$fdvhsq&;|!e|D*&Mwuv6 z#MT?bv%b4)WYXO~Om9{LH$Za;R5RC;O1sjXG3NhP9djJU}C+GZn3M`P?%XK&`3J8I@Hws8; 
zm|X65^GumH=)yTlC!prg^CBrdoj_1uxM&I-vzwc0K^g!-tEDv$BqDUipmqxE-;4lOQS74F_0m<|MD3mu5v9Y+3A7wZDb}kxqRuewTfuuG&#}*kv zL{t=*O@Z+Nyg=&haD|H5qh>Xq=~IB?#Y~5XlF|$~WkF}z;2;S?U@bv^lg=t2R=#p{ z`;I;>RIz9dTQ47FV0v3@rN-d(>$TmA(j#G4eW z{aojnU$kcIGpa(-+GV-T!XHh5ygM+hgNU}3__@`)alijW7Y`&LmJ2uTldz*(>t4n= z+=G6l1R#8l`MJydLjsyeSnjC-4zBNGpUI|X z^3SC2n7nLzPdlRssx)^V)#T>ERRwJ7*FMpZy%*GYw9!wajEoH`S?rvg8tUp23kx7A z17 zZ<%87t?g~){4TEcTkjJeZd}<}a(thZPLC$Nw?c_*6Qg0^A5h=V`(^$|IHRJJ8Ub82 zUNw&?7q6!oWaqAwx|RWwKXEx?oVL2{?qecvNhC3n#3lL^5u&~+q0R!;L9gCLLQL#k zyzoz@V0HqS$R6gjW)LoZv4qs>uAkq%>e$KRtJ^TVg}L)t9Zt((O*?pA8~V zF8t|Jy_scpPL83`<_EMeRSgX@5Hd_!*5O&zUZ&z*M|uQ7xQYR7JlkQfdq~3R*W(vSzb_*}^SAZ%?h}~DxUQEo(mBV}VR|%IL1!$LIVL3-b$Rcmn7#@%n*DHHKb!uQ>V0HCsl@=<X*=A*!j_Q2x6NG}XDe-2hl9*pV@$4t4m@tY6l(6~8D{$OFrgo66JjzM*NsQTjt^ zR=45r-djsDM?v>ZkkR+Lx#}s3CBepoBNod)NZ>lrVG1kIZb`^v#iwP+@!@5yOfF3Y zl|GDObE>m#BBKepn~X^nYTrSoy3VdLDodJTkY#a)n#scwrld_CyEg!Q^*7=fsmoku zI(eZ~>cpIO8vinxVTT)O-kFVLm-}^a&2XtOFRo~E^GMQAj>p^gPBu?%qSD<#w>LF; z0_6tNZdI_}VhFGGon9Z>UI4W63bKyE@J!@u&bOqToQ>JN4}>@jH5MrZMgyoSU(^5F zok0{RGk(mwuaBoqK_CJG0n6I@;L_b z`YrN!6C&yVTtE93>Qy@c#b7VW`gXNB!0l?h9wuxQ3X-p~e{G#A_%HwP0N?O@`f3+Z zD+(5h8(lFQn~tkwYs!OPhBQuvy8N56YF9i>?aJ+gtyYN6pMD% zKg!r;@UyE!0lfKiS{`PHujY6QQBY6-6?|=yV`e1ujpTogoJXmanoQlLCJe1@$Gr*; zOU*kk?J$5hcYEslxxw%LKt03lL`Pra1`c9iXV>6%eB-8iC<}V$nzROM@iH?|Ij=o$ zZaME^`p3r1SzTj^g?=XGfffYy4xf{?4k7Q^&VQ5;C_D(B#J-1skCRhEy4~}NaNfk` z)|Qy4sFn30;O-<_YNpAz+@k!~tjh>er{3};dY{J1w`RdUb90%k;6VtZb zDOVp~cs_1!AA59JP*I=Rw}nnr-8VMo`M2}0Y+zP;y1n@>8l-=C_~_KOK&KcMCo+v| z>~!>ac){66LBTaBa0R9hC6j&N*=d;qA+S+r;=y&v*%<~D@Lslk7+i8Lwf=XuJDk?J z!Nd7!Wi9ugoQ{+0X4$-;B}?WTGP-l+>HJ+&1AL1MrVgD zGH!^djI=bU%eg#1r(s~2`>i-B>FB7u6hZ_a^!M)hb`1(_U4{l_AE^RJ$8SDf#D)U2 zg3$pCP9!!nGlu>IZOo`tx?JSYXnA8!5W!wveb3|=b+Go#LYShS80%kWf^<9wKzvDcZ2>%Rv9&juUi{ zN5iHd+Ip0us!M-j!n zuGtV~6@~5JD2B0a@ZsWoqMEP|xUKCmwB$p}iCd2NMI^{hra~(#=idqs0*d92xq1{I z0k5;{TnBmEDR4CVN;~YuxqD@N{P$Lb-rv7}0Xhfhy;gR1{}BmYT-g6B1P5iuPJqjU zg=BiMJplH=HgWF 
zrK>U=q2+73E=)+&3>N7kfB#+v1{L|9{>leCbK2;(GhPpsTah3&8r5)dzx&C@?^A?e zXi-y3GB=*`-0q(c%3?r<4a}3e-Swx*#&Y2|$TwGVP{a#`K)l_k6p3mCPv@Q z)?n|L?x;$lgWEfJV*f69Ya8jni@=#d#4nh?>74j0E#PDBZ({q$tmxWR27OWS?l+&_ zSu z#pH}pmy(0W)c5XUW2*3)&yU<=o4V`za7i)l{QfB}?wrf+Buk)Zs4)7RPE*J4hW>OT z=(a}87~5&+a#yka)nQwGY(nKRBMZXJnqjAy(#4SG| zU)GOCBjt3d5uiK)RLhl=RMpyqk0Zb)1i=$$31&JSFP4R&gE)|Aj>jKJO$k|B@-M)4 zBjSDUa(iMLFTJFtj7msYWIlGmo(iro(O-L7ZXa}FGT7bQSHSuMfKKl7*0vtGo&OD2 z8TZlAmUbJ3tI?K~mNt2?aI&(hv~>c(XKC>@LOiz~JNK0b7^Q#%4^#t8T;v&Cx)A3>h(;Y8Q z&nA6)|I)aW?xx$%PgS2Z63KjCRIa9VH}r@!oq3OH?cLX&e3FyMQiraOcgplj*i_8- zl!%0M>9-`Oj@6Ek->6?-4Jh~Uy@Wj;Q4Rb^g$^+&w)5wY0x%T_w;1Pz&dZj#y8UKx zxbZ5I_PukJ$w)Kx!VQt3vBTX>bo~0`cLB?0YbZbNqiu<*gP?}YKo4RaY#A);ck1v7 z=ViP8vM1$0B<^Vq`3mzS2Hp2j&|};!gljAd-BA1PAkEoi(QPeSr79=haY><8Nl%Y) z+aU(aSJ@|{Q0}dX_j#&m2G?sE+v}qNaEkz>3h?NFvQ}bNlYa1S|F3V^yC(vBP*@vq zq(L?nnDC~7r4FQ~fg<&y?$z?@0^p%aC29oGy&IMt-XTV?ilk5w4TcOiCnCcqgU|xR zze8yRZ9O>2-}GCJpdfEgz?5RS>v5P=DmmPD``zjk#v1Y{XZSO^uA)>$uV~Z@9{pyy+H~g@ z4aEJ@6Y?;=eesQMFv|$XZp*YDq2aNeLIH9Fn9_SdHHSj!+r z%Fp&0Fpp(@)Qz#8VVEe8D>|^Mzp#Jzj!01d*bTY0&9}5{_hc1YhP1zmJ4&2Bwu42A zZS!#V)Q6Q>pgQ_7e`>;&oUVu9OL(`4AFD^ZNA z-e|5^;%$DvW{uBf#@5y@FPl*;dFUvFIOFlWHK0KN)K20=O5`X3@8^HC#*NxZ@azMn zq?$9ydCbe7@L6D?Obzw*^j4oAZx?KCzw?-=Q4Lz*#yq^zFbR^qNW$*R!qK6-H~_I|w8~R2NPF9> zbl<@#PN(||Z7D=kR|O6>plqG0TD4hug}-FfN$o{1&b@|=MrjQFas;g;vGeYw+&d@~ z1sWp#F#*%@qvELhhb3^d#W!9={&oq>L5CB&f-HZID^&c-#|3XQa=mKTOVq%i_D@YEHd{sz{`?OvWslx^539zVf=x$FU=B2->_8tWfv!bxco2?3-zBPIc|#T(^{jC7;Oqc8a~69mS5GszoOJldx}CjIkboc z9ghBNP|r*(5n?4rDq$(e%E~JEg8nCVtLgq%rSsvJXLyi6)6}0sfx!OhyQxP9)VeWUSWITC0HFw~Hs5a6go#QB~7 zU5#o}!adn3zfOx_ejpY|$8zRX*G#RZ_wjgr$*id!oUZ0f*IcP6F2&Su_i6UF4~dMt??GnTtjNd3 zogw0TTl(1$y=xpo@?S55uRXfedtMid`QIC{BviZGKJs$j*f{l|jNN)^rh7Bx?6p50 z%A?gSTPRu*(0M24xE$PXwVIr+V?Zn~7VQWOr?GP}vya)`QTS|fvV+r^h0tHmR$7i$ zTKp?o5k-`&{oxDvfMACt@@IJ&$iE+ja01o6)F~+_d;r-NezphLiPnCVB;BA6E{YU` zprLoh)70B`t28^s#)KpgvNclELdC^&SZvZImrwZ#kO<{xcs$kl4d zy5;6jgPf^j>IwnbP!OqO*-n1&9rks;t0Nho$;FrDCKiZ>PtBa?4YzCV*KD>P+UShu 
z@0lWfn1vZAIRaig&B+J!rsP=rpjSi z$QA!XZSeTAZ+qwk>1ivQ&t|qiyVK3^$Lg7b4kTUDl%y}%LtNTwa7+Zyp|2G;Oh$jYYckZizKg~fF*BU^;ZH%s zw94tNoMvOqJuVIqn^7)yYb(YM@6?N*P+$(^h3A?n*h2!+YY=E`eNYFg9{|32L%{PkN^w>Ja6yF#zs1!l&;;&dm!)n05q}+ z;e&Voz&!|;!y2AxeIOr&jpZQSl$W9S7u#Tk`#SxN?@ds6Ogmtcy?3S+p~iAKUA>(x zr#AhK6ZYZU!-$~Vv{mZA*fWiJa7c*jn>YUsb{gz-n|<9YysG;REbNgX7B=g2BM4BC zoSZ}j`Hy_==%&olO-<@@IQj_4mn3^-w4qZIWXQ1m#IZ$XWh*|CIT-mW$2v8bDkZ_v z)9nn!?14(uYS#H6(zd5Gyf@+Zhz4uFntNc;2dUTO)N0bJ3H!=_XCBHi$ilJOuZP1lHnGark9Lu3U}FgO+Owi1YHp{i0t_68XkpG^S1>$G&&Rg}BpA9DMUeHA zib5#<^-HDN?me2lnHjI+CM@uB1_T6D^f017Fa5$G6(Lro#z}XDo#yLXE(m8zgLpbQBZ!iM~z~Nr_dEU6lg&p z>D$niSt~}!H83onQ0VVPrXFqV_MPN}U|LHTMzO!IHea2k#oj`)(M*dkE-%a9lGnQ; z`U$nZWXfqiKRss`6jH7ae4{( z$}> zrzHeyC0IYCiv|86!Kz@&T;G9SqAFtm|EhWQ?I41zD;qhU1%Lluql(X)Z*MNaM8^8HbROJxS_$60-#&owJT8v^#KICnpECaEXNdIg1tTA8&9IA$i`}uTi$(>#l?MR2BBlO`k&_mrM)i6l z#;0EWos)RX_N_mIURK#}DRo^>tnZV!`y@;;@JeM+x)DgcpX+L^-t+mH9L*!5*0CaV zLO~3o&qR#GUKRH_SATxE$-?IaP#e(w9Tqyu?R6z?`~eq&0QnVzwz9I)1mpH&Adrq|fJX*xp`uj0{BK`E*5yc3GgRGwux!#O-3ne$w z<(R4!b{{Q7&Pg6UJ}brXp7`zEN_B;LLLk*OL!7_79Uf{2TMOIUR~IGq*?QO>G+-T40J8OG;Fa-Wse=9C3Wlg`V% zb0lpOCOuB-==dzUa_7Xv&Pof&(Um7V)eMG6grI6a0aG`#NqR$NWsR52S-v2Xz8?*p z_1?71+T`TL-)AF2!soB;pCFJXohtt|VJ@Eos2ySfkduDIT!!hkx+Yw6mOq zU)fOz1zqMHX37(?(v>r{%419Ss-ctpF*{s8y1mN#7KE8@Ho-yy;8y=7llEYl=h#Ij z1I1RGzx_-(V*+XXM6!3i z2N{CbzL~Fb^8UkDTC_7px;#bD)6R^8yg&Ifmc)8xS^K2zgVZWKTnH3o+v)z>Ti+Ox z{au5~bm15R%e9hs# z>ePpf(|MkzA2UjdR!|cXpLr=Ml?H&GY{Du?Yc(H*e+~n`$a>um1!bHXy0+3%6jp+W z3Q^Le(rS9RPGKmT&NyAEBjdW1G|*3D!S{36m_EJ6;eRVZv6FZfCmglv%}`+{$zzlx zV&f?(&KiJG`%z$V^W?ha9}@D22p)7C&1x`JN#0)uO)^uBVf|^9Z<7Eua&=`cml}+V zo!pOI{YFDoOhAz=3=%<$UrL}8V|18eE8uZpuc;&XTAFE?Hw$w23s@kH2C1;6>|FK@)yCIa(d=-`sRJr9yr zf>)c(RYtHYA|TK*mLxe@?Me{-W0sG5dGC~_<8flcn>IXOZCcqp`c;_pn?yO*PAJMX z5d4Vv%wC`FgNoTVYogr7Tg3TC9gUs*4x?B#;5>qvIHjU6A^EmfE zH7nl+!QYZ&ZY(v_K*O}qYKmdFbUn1>uyNITcfO+vmCxcIT$D@|{(~4Y8UD^&UHu%? 
z2h=8%f;835`Ap0Z8s2)+bA=j3g4=;fi|Nown|DKIlpuOnu?`xjLW?0POE*S<4YXa- z(2vY6KsmnM-(og2igoPB9E=U@#st&dSc`i@8VRR`t+>uy#q6Lp2(~x!+R)_MH@KA@nW?(gB8npnqIKlRrX7FK|~ zF$i{mZ~a3_b)^ntB{+G3uY>pdXtt0LpZ`ah>E;Pwg zW^h=$S(fnTpaajhxWq9s3QBEN6$?9iRCL?j7Mo=xh=OUVTlQJf(v^Wk^S{&6%DVTV z8_2kf^6BjNU&sW(Lk=V%i~wTeOh0JPxes^jOM2QvI)#w~#gUaV4Ne%tgeQrs%h;%^ zHv|^~k!D`VT2B~mv6tJE53wBiSrwy~^I@{5L1Vb?!+engi7HENH7Q5WXdtIN^i_ z>>tm~%_s>S&Z`PV!Go=E0MWox&F^iNM+P`bA8hr-L^VZ?l19eFZ2O{#IEuspW?Ktt zv|fkp&fCrlekaxON&GCmEJ(nHv6QS((rulCqM`qUcJ|kgZ9W&wH^cWdY;~?|1Ox;? zu?ERCS0VM04;iA&6jJeW)yqRO*;Hxu-Ov}^Hcf$+n*9XAs+pxMUyn#PP4i8cpZvIh zqmDsGx9mTI?|T&Zc)Z=1Y^?@{$+xYfE3G&a;z( zo=9E<=i5%VA2^U11qSyP@y$5I(wbDlb^K?mb*i*Z*7sbmOd>51>t9~npub+-97l|1!lMy#VnQ4arr*BNtpIbiM5b6>Crm3<(H^U3ewzn_s6z8k}) zl=Kid*d{+Kr{dVQ6>L9iy35s++wbFZWl7a10F1oN$&T+ndC(^!)i z9?JF!r{e=V(@rZh&3;Y^zHV8>8%V%k17d-vxaW!|JyE?>#?{-?Y{gH}q>1O&A1Wx9 zM~)`CVur9k3O)dAiZF1C)cw-?@7C;af6;_%v`cNhiH&6PW&5~V<*3PdQH2w1Q|WOm z7_cHuApx#$5j$S};>F2rmmAwwn6?KJ`?jb&EO544y*0^`9i%B1M~~f)h){~g@O~Ca zwYm8Y02Q#HfZo@aAeQRp=H!XOb7Xrx+5NQe=_mUX$p2_+N|ogXiISZz)=l;Fe59d~ zp-jAgeDqg5D+X{GK0YbNS7ju0iTK6E2k;BT=Ya~UH_*Q%N{tm7!XPap6aQa<1W=eA zK^Y0Cuqy^;IiNy=`^ zazqQO@jrbbG(WwP3S(xLhl*{m{c9&K)hnXr6!*`APV9$eJVl2_B?z%e#GVQaD9-+? 
zV-B`dVRxnE;f}K*;Sgf)B~&%ADyS7j2<9^@b6In@U7pC8pGhdCg&x4gMJV6yl0@-N zdkPG*TFZ43*AxHMdb|B~(PpS4ELKSJ+C9GU0F?_D6lVxOa7%vu&6=4g_ZQ=^z{Wbk zk-5uzBFfXevX55N&B>`WuueT2nWB!AE=!)Qt1EfP zkUw+odne`QPwYn;`|R(-15$_D1pU|xr$E8IEOgv7FW zNu8bDiou@5LE^co>2eP;BR6;Y;v#-3Rt+Y}PG8T*x-V$wn>{tHW8n>U3O`lOApxOp zHm>#?QW=((G|-!MQQ1PEM@B1XsD+baBz8yZDi7RtHYc8)zZ8Gd`Px4CFX>-$dK#Wh z8^!k-_^9hZLj2I16g@2LU)~ykJpNW9?OWvGY8gW z4r?Ywz_4eM-?ndJ%Hq-g^EBl0(+}=}(qy^;78FB~j5*2(S|zNZQn3nsZ~ElX_W5Q) zy?3B`kj-qkw4_3lT8W=Hi`nsgn*4*dQh`$O0>bh*scsD=ITO>|%F51t&0K_iT?{h{ zuY3LF&5gz>1MdSxc97MiC4vC~K_fOcwmfCcir`juZq?W5ueBq*O8B8&8^woZGjXy7 zAP$OvnC1kOsECU}(8n~E#dhdz@~`jC9UN2;u%~f!+;HFq0Q?RdAvvDM{b`xk@r`h2 zGksbb{7M^Jy#~9-BpEulBZ8f66UtONAag1-egSQ>$4N<4Vd$sz^)hmwcr?!n{CcB8 z`1NZ3kEFAVs;cd}@Iesi?gk~KJEgn3yQP)x4y6SF>6Dc2?(UREy1Vt1utYo@19mPD3Ay>Del{h6Rmg2yduIKI{My%Zf?7KIo-e;NdJ`Ms!T|2jE2 z4%Im{*E(rVlw94U5{_lYB$Lj`t_+5EcBFE~5Odw+P046F$s4X^?~XZ~EY;6m`rSM7 ze*cVy@3Gh4)d9E=Hz*AClK?XG{X41ss)59B3@AvZfO@DLzw7>mh>y?Wq$!&i#r~BQ zwf(BfLa7I8g$KxUpo+;MBO!$%6S{88qzPpehxVJWC@7=@Lnm_g87j;XHo{vFuYS0u z%Pf}`CWFMz%>qTi#oQbloC4E6H6~cc5(ESU3=S3@_)dq1AGEXxnGYCZ(ra{s?1^Tn zj`(jC@bn$pMKr?WGB&L) z99lN{oKp0V45BO)1lW6(um%_*tre598jOc50eJw^GFNN0K=gT2p4du{bheUrKZ=j* zwYgslKN1;4D5Pe+B!KMWNd`{8I$RGkPM0ViZRnY}(ykORbHIhK7LJoxZdrUQAX#z} zl3t#ko=1oof9^y|V&KxWAwJ|CBUg3t?DU)` zPoV!{fp;{}cRxw-aeH8#gW2Bc;F^qZVAJHji zSZBNSi^VU2aO0KY%oW4bPk4h5x$+x9l zB?T6eRjvr)lxk*M+Z{r7xi*%TPbzOYH1$GBS04>}v0!d1Ld(L8H&(N{c&9#Ss)yd6U1|!hdhDf5%>R;| zAxI&SK!x~2K9F8|*bV>;Cm@q_1VF#@bpY;Puv1wzHy5Q5F5N{Dg@H++@o2$72oCCI zg@PdPoR}}t{g9er^f0f~>JE!3F2fLA8|bLa#ida@i>j@!cq3pqVp55LaS|HFq$npB zC$sJTsm|}!7tpHaki57m2x_-Xl+@-%HXd*BLxSM7Sb5#nNA3}I64aisun3Zow<>Uc z^wHG(`D19_60TLHG4ml55^QQzf3(0lF)m4l_GGO6?8op`lLFMf80E!lj8V`O%9?CE zdtBnYfdKqAJ}|kzz7T`b7%lqMH9s-E<<6QK+;c7z!*&)`tC2h9-|O3MOZ2e#fp%#0t1qECI!CUj7YLwa?4L$x4Z|YE7v< ze=B{P?{jfNn4^xEBIoSvtoLzKhpA-zTp1!Ld6RON*Cu)YUIvI6F}I||K3zb4>G}RSKuCxx9u^6& z0>m;&KVx%oaEzvK@X^u9GA2R$TU#r6mm~_$=%_#-95G2Uzmt=bk1=jRogP?76>9?= 
z_BAq1cnxEF02E|sG2`EVml3F&B12n@g9rH;VqyuO)}7k-5FtIr%;9ZNiZe9S?mg%S zP>=oU$zjNZ7A7Wo8gnv=K?(&!W~?|7=0-+JhKB7vk2e4!8X%PC%W}McsjjEz4MJWh zB-?9YH8t$Kl1m&)F*ZA-bGjY4dB5|sX$nhA&#g}^DA?aYz6FQ{Vp}4zHODrC)dU3Z>VkgAn;+ia_u z{00y1y+}_vGwIxk!71B>q!Z}05Bs7F=a2f-l}yz=qxb{AGmEtQnS-fZusc2A_Zb$2 z4^a@@8L0MKNv^T7+!7DFv_#Dm3t^>)+MVw|9t&VQSW;n3I-PIW-Y(|b%v;61g=clL7>?y1z5|hSoDP~wm(R4{K1}UvWW8DzNH$cy7^|*6Z^LY+OKCJcq z1{zRju*@Gi~k(+yU?bm!`7GcnN>GBOkdF1+t(Wl77ho$S*St{=ca zfH`X|^c|PUt!tu{1^qtE09?2I=g&$_;bm(LhrO4U=w7#ocl5BLeyf3)|GDBD%_JPs z!_e1@OSRIv0RIS2xA_sJyUZZI#j>91hyPp1H?fO_OAIatfwgK&2Z#{RG@@KeX#NDX zmbIqa?=APM;Qub&ti@oa*Qj=-L_+>3lI^MzjL}YW?oGm&?M5QoK43CGzWG6}%$pq(jxUP#>p*niql$NtI8hWicn%6kF*)Gl zR2J4T)ZhS{UC_)U9d7y1*jSt_wmU7Ot+>|2Ru-x4#l!4jukmoBJcfT4ati{?1j||j zCA{T_*u-Iop?X%JOa&h76EKg{3WMb0V$_Ma_;GpJhA2U~X%Wo(1P(GdBq+E*BbWqH zG@L*HPJB~-X!T|`~}8)vG-4jf4~CfJcB=sU6b3_`}KSL=8S7G9#V^)m(f`d zq9ae0=V&4El0$?TtG>RB^jH3vcl>@CzgN+Og_ahniqw2R-mC;bAIGTwcYsT@R{!@+ ztjcw^=k|yPP&uD@s!g^WS(4>Y5trsmb;FihynUYgA{ThV~t`Jo{(eq}1)61ow$sJ5&-F7#QzYN8JF0f=phQ|NO+;+X;@1M4{w(p#+2rfHcp+&VF`t0@i&I zAe)P2=R#Jg1(2dDhK8?(Ht}Pd7i9RbHDfyKPzD{CwyTx!kpFsC@L#IsDvF0}C392} zPAjoHAJBJ6-sTS}C&PVfx^j`cz}AT~#}K3FiwJv*1Pwux67xvyDSM+q1DJT5W4(q^ zf^Hjsw8%^dSWzW2_BM)2#RSa9YL;}FxG8o*xrrH6tRz{7H+-Owl$cDtRO}!SZ>=1T zC&hFYyMxFZJ7V0BvEO-`W0XyXU9vJ_QB*JwMa4B^X3w?;35?N3(s`(CUpjmusI#?Z zNjEG=@N-k}vS?^Ir@;o?fKDlHTb*1CvgRRWc-NzqJmx7-x)g1Ww@%EuS?6& zp;_7z5_Dtfh`M!0OkX8mQui8OQC}_5B7(8fZqh~jbG-!&r9r~rx%J)Et$mTe?z#J{ zp<$%KL{$P5v;j+6`T%R2S3CRXmix$Lcu!9=EOEpLZfNhmkN=&)MY7NC7l|}K#6iKG z-HX8nb?#m+Wf(WVWj^G2A z-;Mi5Yuu^_uZ`88;R!wb?Xa~bg$C(x2zkFU|5`97kNldC4v+8ytr-^$Ji%1&%>SZ6 zPZp_|?J~Q>fCbI}xB#XV`4C8QV2MxzFN`i+$LLf!vs`y~Q;H}d4x*2uJ}ny?;*{qm zv>LhOM{XyiKO3?b9=T$Za#U;_30m#--Pm+p4n1r3g_1wXz17GoIT>b*Ksl zA)5-cIsIjqrH5m9#Z%Wa8oP5Qzw^e1hBlAfQkF0QYmV}C9F?DCS(_fn#6 z2cuE%eBI~zR)-n9ozV%s*ceYbli9u!n}J>9n}7-j!>g;ECXDXnOEJg>4xu{ zC%>2dE~xGoTkq;XY7ULnNhfLWS+$asve1KDtI<&eO>$Oi~G(I7+JHs=7siR^k!fPScHXL27 
z7UST5cOxq;?QyvirMD?CggwJ_Fhl%0dKVc$IZ}PKVT{IbJ=OO75Mry_E~<19+VA+= z3N-)+PXufOmN3B=Kl zw#H$ZU6ZqH%@SVL&j~?nTwJfKR3PHWxU-0As>x?-YPUbN-JTh>{VO?zd5`z{`ffto z#_i-&p+VT7Srq{pNqK^I?{>pe6)Ui{?41%vOR$xd+0RnkpPRNOs6R$Yi&1<9WbeOa zlm^X!-FdqnJ^be8P%tZXZ7K^nhYkKoF+_zYhhJ$38v+dl5z2+=`MrJmqyFc7!pmK5 zpj%@^;Dc>05IVK7*C;C6W5(0zI)kI2FhZIEU!+jS zficTjvAQ4=F9ZUR6b7xXgL4^uGhv0!Fkz}n#-gPzM)1GiPo1XByBXaE{VZAM^UK|z zEn~hV?Y_Fma=qwVf*salBW*wWJ`au6CtA8up#?{`@y&0aM&~X`a#>$b&$_tta@Vk> zRb7M#Y!?I+XwwB9CQ$2CZn-`+WD=9UO3)|b6HM_C{Rjx?EK;pBscdLz(bN9(r*=K? zt!#n9b7@($pnVH&fWNH5*hpRKJA!cc)_v9GP|g&6$?to_=29;N?74}MivRjm`Fn7-2Syd#sOi6s*F z8Ta$o@UHCX^4O@a=JmNK2ht%ipX*5h-(Q$OG64aBg@xCc)>9-Kll`@fl`9FdY`3C7 zvTtv2DgtR-dqUs-nXf!Vr^BJ27q8Il2q$iyN5euFu9)fmJh=g4K_958>E5)10%w>v0lZuxN?3$!7HTR} z{FPeoZDB*m@VdcQl$<;WS_$Oj<*Tcz@)go`^!0%s3pA^>wyuD5C&(x+*y6+#5HQAc zGB5WnLKVsMJ0nrWgPTbn>=ZNqy9B`j?S~ZG&=c36f9wPdARKqO1D7l}nv;u`{zO8b zcLPED4ntc@jGy+Z_-}8o@Gwh1$)M<6t&|8yZj_^H_F60sixD_D{tSKtqc${j*<-jR zq!1`I_hG5+@5-vt>u z#dB#2bIq4+4JGp(J$N@>_gZ%6UcGTIy@I%xiKXd08w&LCxw1C*`!VM0Ua0tMb8qv4 z6B9M!A@3k)vXy&;IV0tcNZo%%zOd_qCUk@P(-v?j`FHMa_l^}pMzPWHr?EPbsq*V# zJ`>tNbmqg=PU+KZ$sd$Bc22u+1Z3iR*V2g8RC36_h3cvIk{6pBC(hpbdW^T7&v_SB z-nw46g6-t(Cg;c*0uIhXN}irgdZ(AA)O_ChGgFm}pORcA`yG4-ytSp?eArV$2@MnS z`d%byT)W+)HSqV0+S~?tZB#r| zr*u|OitE&C+T_6~)uwZvR4$B_zcDY&7Z?g_lYF0Z6C%jJiGc1f6q1B9c$&Z;_yc>D zFU_u!3YqAjIY_r{hzbsx_0ykyk^$Yi1lUx3Pm;q?CGb zgImK%rvSWSyb$Mw@x4JZ#y~MM&#gs97|r?RX||!EdOE0_{9EKG=X~94L0G=I88k8;?(Zig7AM%TGEb0l*)L}v$ir7Q z1o78Xef>K(=j!Bim^E_n`iapX#<14R-RVHDh=ccTVI>tZ~Wecd`j-Rg`N(j!- z#&k=VpePWOg~R_*223crut&iYO?OvI&de2-XSNgI;gi7eO_aDXsh1Y;PkNC?d%L^e zl9Ow8nAu3$FPG{Zr3`N?knp2pg(FigvikRkr{AF9VO6SD8vx?PYF$vXEB)#oblN^Q z0_0zO#T_*sQ7~`TIfj%tu|F z4V@jY$WR+?hXH2aB9ArHmT=s(E(iCFehAarZSd;P#Pk-y-ls1f4+{^=5IaJT)Fh?IfQqbs}>@#Y48bV}fpKJG{Li;1srxflS8-GbW@%vYr3_7cwl3aslkjlS*m#QP`Mu88OKMRY5J4XhZ5y~UuY;`cZIyQ=e< z=?+fPE67#eph#^S?Q)<)i&oJsv`UV8czRUJQ9WL4N9Zvcp6uNbF2t@6&+%L~ljB{) zrfx}nt<2>i7kc?8UPrDDb~yKLS|j+G@KBf1!8{&d~6?yXJ}gNWeu^ 
zNM=nIBGXfBsq}hek(;kt$CVZ8+7F{o93!>SM7=(J-CIXg$VCb?f(%-*f11nBqW=;6 zT8(QC76n#TJRXZZ$EOO+J&8k`G_jb;g`V11hT_^*MTvN*Gt@3Vff@l7a!CW9WB*p3 zmr{Hhl~OGjOZVN>{ptV0R|EkqOvA&|-_+xVw?C1bKHbLy>2u=?w}V6GzkkW?TEYb4 zlLrUZK0ZECrHueT01C%0@4@~@-8Wr{hKSQviIge#8$$pszmcf~vAmr3#o@UkLG180 zs7@Q#No7W&V_*Q?kjgtB-@T1mZSwK|ZGPxf;Xc_L`|p}qY-|Sf75TkHb{DyQ-9Opi zhlhZ4d?r8NH<@@gJTuTa-)k&7VAeHCnEdvhmrkQb^#@)z0OYWOHtHzn{kmUV4hFVI z=co5s)^sJQsj?Ch5`h0!@3wYeUHc2h$tfPJ4;9BVfXYI!Mnu3%&B-Zo{{RTbh&c39 zu`x5#rhq`-12h*9Nd)K-5GYLNb&EPaRNM0!2iPZL+T~z=mcPtkLOBt zKYxxmyxERO>d#fwNYvC>aMnR@Ygsw2>=W8?Y+P7a;3gUaeULNXrzQUqA1Z2=4BUji z@bX+*^)IG}Tk9QLB+z8J-__RC;EWp-CeU(G)Z`E58)sS^BNB4toAXGbZ2>cMroh%O zUELisbNzXzxJYzzIIXzr)?|g+i~J#yyK2o1tSFC92BM-tp+!vZ-){;{ggeYqPDgsO zoRuO@MdW`%5{300|GK96;GAIYH;Wpsd@tXfpP#bG9JuI%NdXE}2Zk1TpIr`gp6wB~ zCEq#*BR?7Qa~_A=-WDrkrGIj)v9d}h)sI2iya-^&xEzk~DN4tqD_lB3($XB~e=}Y}3<4iXnF}|k&7TpOsKq)yoI{uQ& z2Y}cq#uvut(W9@vlCAhRf97iW+NE9J9OFB^44oR_{JXqoCpaFrumStJ@Nfi>%6a<( z?3B~lN)5d~Al~Yu1=0xxa=|&QjPG?s#ipk}-&{Kp1TFM8w~Hv9>|770YK&FP{XipQ zUcf%J1l8{rE0fxOPuk7K^%&ptYHDTx{IpN2dK+ zTJPD1GkILpHj3QRV6jm6(w2^S_}n>0eyNTbEuw%)_rp#Pb-T~ytI*+sBC($HbzlK)UmQL~!R2*tsv4T*8Ksfp@Uy!Qu-{t#0}#)A0J=s~N{OmD{P%y}T$>4zgOkjs$}6 zQ+~yq>3ucY>)R@y7ipYZP(8qtEgXU4F`CSWAUvVVB^XJ}xAk=2Ls~inpgE3>dMtIu zc=pXdnj9(DyF#>>9Oz(!oG;8k+vxDlD4!8tzDf~zPiL^Im+~Wt8mwx7#owL~-uda< z@SMlnQ_q)AgKgEun34nQWQ1HNtSl_QC%1~h@vFi+WIP+Xz*lS;Y!NQ5tV}wzENuu6 zg2FyN-kh75j4jU3S1gPSYZjE$5pgsv-F2`*3envvwOZ_!LKQQo80fWU)%9rA{g&!l ziyO%Z$bLwL!zI_@W7XEyg2F5i$z!vS6a2c7r&7^7KmT8r7(6(@39r=@|mA}}8md};%CbZ&vrm4A|G~0CTxf zhd(kQCk`&|;o)JaLF?l1@ZoLK8we5sdqrNJnm&O_<&tpwE6BeIqXp;e4aOIQq7Xm0 z?e9I~np`$9xQ%^c-gV;)Cbo7;rKVI&t#`fo)5wN>nnR;wU@&GQUFT@sNY6e&ajx&3Sz0)9S_4f|9F27 z6Jg`!`0(!nW&t6Gip%zuv(#1K?bpiAFD>DaWb}Y*UTWy>ikZJfgk2h&_l`ucU*W6n zOA_T$=0q^zVFHmvjo>uQk0Jg7a)%7tx3+~%7G}g5dT)NfE2bjCvg*buGDFHDE_cf3 zo80zwqJkM|qlKuI6!cqvWhQYssTui?bYu*Pu+ZzPC5a^ItrdPm&xJwihIg~q|E|bQ zoJsbNgU{VXUW;^9n1K^6kX~O;z<1a!zg}UWKLWw#!^V>OcI!N^H&*Vt^u3|LD?<-4 zZB=?}uVMG;@9%Z%eqCA^ 
z;rdfbgUEL0^&@eTq`xb(R^2;VQH2ABVE_z4TP#GAsM7u@qKxD-I81a5myz(Z*~abV z!$9!IsUK9lCJ+PJ^K#8swS{}uw$ zR_Y1rSXq;B#GzbIqo75gHH7l^VYD$ki7490^Xy5zgd!@E!*12W;^@G7)H4kMJZg5f z1R`aBWxMkj86_!Z(k3#bx`DI7r?zzS^5ahN2#cspqe_oq+uq@pmV;zAYAPy$JM`~F zppvK2374BlJ^&a@1GB;luRs^YTNs$5Rbq7jIc=#d zhgskaLm^h7IiWMQw?`2c>Q)vjm|h}w1m+De+JS2MkF>Pl843acX{_clK%bAOsIPB# zE)qApWhyQv<^GbDlf%luP+F!?8DMqz7zhQygC!u~9vmE}Qq1t)9y(nLdr4IWU4FYy z{9f73CgqZ0$td3Ad-|0DnEkAHk@D-Q@nx&IgZL_1B3wD*nY?BW4x~;rRQL0;Tw7j5 z5lAhHezGFpU|~4k6VZn)VsE~R+paZSKPO}PHQ+*o0%b@|h@xP{zL9MqLP2Q4$B0Nb z`dzZn2M`XPDjOlFRu9ndG$;C=p6+KkmE5=(k?+ui(2s_8@L>Tbkb2<22FhYlQBmZJPI*IEQ)W7U|D^GyS4iA~As}j>aijGR>@O|1W_wcEy z33j}{9txKz>Pb|<975-V^+8Kuh4;Vab925tI|3lke*I^J7vzNdT@iYm(*?&Y#ClX1 zz+_?-tHf~jNw}l((r^Wnow_asT-8VK%axMbLdQ^~MApc{aD1WM?%&yncpuf{zbS;P zpwe`p{E6H%Gu_@g0+yHhe%Y^C{IyB1NWj=X@=-L`f_>>1?VSAH@vdg}w4~d~#=*`X z9$5M7AEXR>gc_r%CYQ@gByYji0&icV{jqbQht8h2Ng`BD-qGaG+gbJE-*KQ*J2aYv zcx-NxWk>wT;}ssfrIn7IMd$Ea#hXad$OX{TI_ALOc*k zplvuY>|>W^nSml)GG~NpKQ25PEwN(-Te$25eZ3gj#$sqF#b1^{}WA3UfjSK?*N337kPQPG{M04Eod zf!xF$eTYb-Q$rjDLW{g%~rTL&IiRFY4ia=p}3NipX6z2Y~JYy(Rmpr!{F zBKFnC0PcN7Ma7>#wvBrB#>T)CYK*K*j3s`1yySK?|MiEI6d78xn_EOw6mYoUg{Jd3 z?aUNSg8&i;G}2n%#^eQHkzoLv;mXM1U^2Uvio#gI>8*@MKl7}J8cn#*)35hva4$(B z=06*pHlsQFrA#?ZzFhs+De!Nu`zh7DOe^ACdig^NH~r`RCQ99xdEPjJ!o}5uiTBdZ z)pY9eMy-4&8V=`=XqEa5AKWzjtwNH;WgF zrIsQ?fWQ*e<}cG-L<9N-$!W8@tj#pTC2#E)c!(T7_%LxGK!1jtT<>XnkFWv!(%o~V zI)a95w)HwgwXW4m9u2dpGxdgR`7ZL{m;-w12M48%{f}m5vzbde!$!&D`YNPZ21{jLZl0Mnf*=N;a~1 zi>l-TRN`fPpDj(_km!I~bi)VSSFF5(0-JLe@@vD`UXDfrKlz{rri|vT!)q1l1bm~e ziY#ge652P@sX%uNDIJm-wlAYMG@Waj>V#Z`*`Y{C5Jw0C_vxgIy9#5_d{Yy=8I71K zrm46xD7gixNy}=5biifuZIXZf$c)tNc)q;Zob(%mMSf{w{P$#E&8+Odb)lAY4Ve$O z!%Xom%7!!-!{^=a>e{n*ck|rGRhAs+Nodks)}^*O8wiOAAl8^O1v*B@g`A`Sp#GUP zBv00wqY2sc@Q+X8Q!cF(#DAc5>6tC`yL*a^^P8UzwsUc^(83du+T?5&A)jl$y1IJa z3S5IYGeYH!>#$ffA!&u5s!%*0ktpq-Z~0%^FB#_BF_p&B<>7_DdX{6@U&@n-@8Wb0tk?|dkV z*e^XOcQZaxQ!!KE*>voRQh#!a>1h^PiZS#9fL(MXBdrIN_sSkIU`JUJvUMpEJ=$%& 
zg*9f^>oghh&>J(B#;X2Fwx8Hil8iTJKX@#vt(3;cWfC59XdAtDjW0$RgbZqJLU(_# z-T*nVjf(u+NgAgYJnD=3>G-Q)tNRPI4ff@m$?fqVldGi1k&sEc;bjKPy4q8HIfs|Y zV9Yj}wT^V1`!XZ7wS7AF0yhHT1744fdv6+R|L`Mv35brEEgr99erzard!WOoy9}wx z)|D%NvJHhPqHM)P6_ubh$@eO>SfiPIQ<{baqS?~OmyGX@I*ZszI()bHwzjbA_wK5u zo`Ws5kJVX%R1BY9E;K1PhZE0x$iqhRi+Z#8uR#va9QH`nfMR2DVd3HK@t7KugW0)g zGhy~~Sx`5T11=@{H*54W)XeZKE~{jTC9;>E@brGgsIK)H`$l^6vEwNvez35{s(+NB zob+cmFMLPrUfPj_9t;aXBijo|=;z}9Vc~OIED9OsR?+pLVl}a$MDD&Vrw+cEY*(G} z{3qw)7hH~$sx*e3%yy4tkYsDm!KO}~#Oi@?X$4ct(Ka%@klP1Czq>oCjbD2~gHqKI z^?!?q8!+(15b|E=RVlw91!G5P|E{Z>$r?lhsC7`w4I&ht*9S>Jqyla}!1sHdJ~JCh zOUg3n+;-(#CqtAD>e^BUf=?!GAH8id8=Qs8Ii`?UX%FVB?y?45o^Ut&JU}-4@!p+E zPjCGnsKOOd=GZywK3*DXuvvnENXy8`H??WiyIOsvJr@2)gjvH)UEq~&t>@i4?@->` zd;LdrSEDTTt8KZinp%si3v|FgZ~~btUFGtz{_7v;l<|-@J6%gGeDkAwhaI=Haq2}% z3LoNEsQA`w^a?(Hpl9mundUv*Rqv{o7&Tt*3B2M1MjmqdSv zBS)*@FkYC~=tB7okCahRGz@Tbq{@dh^UX}F@$bumh|vD-W>!hDN)U8>OQ@=qRS8&E z!`Aaa1NL0GOwbqit-nHn@PPbh#OYZ(%_Ns+q1VVEJnpOO$i~Hg(19>UT(I=d%X=Mb zLRt#q*FJcIy7aRyg3p?+%>nH*+yO4f$w&g7AW=CuP)=9ak?<4f>Ms4PZoj&^!jWPU zZa8)K{aM8&S|>Vx9xrcDV-*+yicdY7QmPZ8zdJu0hTof8vEvXlt2jtt`5i1CIKYJ1 z3AVc3g{?#UuCAO3xFiD?_ig6LX|>tjFdMYu9v?K@dyH?TPH~m#Slx!~mfONZzM?x` zrbgT0MHf?K4A<2@3l>6fi;cLV$Stiff6=Um1v49(yPm(IibGaDz50q-DYv3rsa5&72s!7u>j5ySR%hfMVv?$P_Qy&>n zAAz$Gz=h`5L$(-{fcRk$uUaa+Yj*DQJ)i8LW_UEte$`oiRm{%L{^ynBP3b5KDvy7b0Rx@wrOn1-xJdV;N?kKu1@%<;%#<Tsw%$hk{X}P^9*ScR*x?Np3dL1zei1#(~^^A^(Hx?){)Z5z% zu~Qd1x(g^MwV3@`Y^n0NAui6paU;`%gHIal2HM+I(Su@hc@)R?{rxXZy(w==-R~O% zm}krtg-gn9Il1Tjzi6$AU_$;K@JZ22F*6=tPW<=?mK-6ckfW&PBzMzbhGpMjEnQgA zjg1&7i6@GrpFVOMFGOr7Oigp&EePQuf7bu`!{`Cx;z}9v5D4>Yl<7zoPK8Ts09R0V zceiPW8cyW)DaqNoI8NZ7T&2Gt<^xKo@z}O@sZO@h2?+%}F1A4O?c1}?bVG%zy?tfw zK0PC&f`0$mut86Ku46mSSZ@DY@e6_=HD zuiJwH#r~LQvIUtA&uh!qmpwojh=_26-DiVG33axFxSRX-4s}xi@R=!)5`7z&E9%Q{kD3H;)3S z1h(GSgW|mjSU5O~{9lK(K&f3wPF$t(Tm?<(=ce+dv1dsju6iEvJeHPL|0WuEdFgIz zMNmuR{zYjns&t&e`@%!0Pm1=1O}1peuH^=~p`j28B4m;jO?(VSB-M5XQGi-fRyGw{ 
zdckUZtfuxCy+nZO`T5xh^u>do9tgz7W`#_gQ&UQM?IQKLzxgi{?~B?8Wl6Y~-qutw z4N_9WF3go2m$w?j{e4s#p^2A$lsNlSxw}G~OidcwTzEL?w*hE7u+y$7YK}tL%vL_J zEt#!ok9THtRg7v^dKcPt9TcOi+J7?NP@}sh{jbZz_QZIsKa-K}#q5}cg?BHpx!F@y zH347tdzCB1k1dAd)6@&&W_o(a@n-UvtBH3D&di8Y3P~&RT7_@N#uywtEtk}+h&)PG z%lvq!{i1*{O5`>9^#oMp#P^0;#6;z^f;>NqyXTMSJI^ac%@99xwLd!%Zr#Yk(Y<$O z&^n)_UZh(|Cp3ZM;8j0}=F!`}hn$i8jT38u+ICN_5v23pxc_OUdHzhE7OS@XrLewj zwpmB7O`l!}D`r~NZAR1XeVFYy+&5OSE*Dmke@f><-f_1(C-qPGo2!c)RNo`(t0U_Z zBkk_Aa)SEHoY(}entZB@ii7HgSR~-?@6L@I*qmNB489>08#aG`B_K-}Am9e@Xerkg zvJP;gpUC@2wGaoi^6stN8X=!29=p}2KH!@O1s>-~@J0j$4G9KExgimvfUURb&)zcJI)NHE9%7)P*YB3Nuxb z4<^yjR|hFqy4cFwbIAdww&An{I6m?A2jb%Q>CURxk=SDHD49vKC7yZgvC#xkTe@; zDQzC|tI6qZAE$o@PI-_9^m2E%Z<^15t-`f9{o3qm3>7RUfp+D7wCVGFGo0RF0=kvJ z#G5iGSd7)*ucnS2d@0~!U~&-Px4VUFn{7}i`(goC~yP{(H2ZXLuj!uPyiU)v}R$A z1{xM3O^XGjpohoDEMZ^bnqweBQ^rew6Vk|~6nYVekfrYcpl>tNHjqSYbwA(uDfmd) z2O|k1{9jW+VzNe1Xebgaxi&t11TiqnhZ+?2yG=7|$n~NplKVO_-c=a0<)U~2ViVdYIC4#VY z_HL+jRWqiSBfz?^wcdX{S(X%XqF!7e7*M0OTXnA}l`qgd2i|5V$m(iMUTy$@4n

    javb@)giI@2_k`6gt62|v zTrqTUN)@tyA^XL)buhY)A42zj<9k;qKIOLOeWxJ*KnDdu^TRMay54|#7< z`)*h8B~Uy22&ZgXK_c7OWD6cvl++Iavf4~_074mH{c&+2N=SHCClygv6X~)t+#$e4 zn@3j4=zo*ekt-~KjergeFMb?Bwe=NOjm9+G*4n}@J6A_X*6@Mo>s$GMYqqN#uQjsp z$`5hG12fg3AXhr8Y2NR2bD0Eu9J5Bu4EdbqH*E7HEUosb3$z4Di1aq12{gOAUp3^3 zkOSu`A_xfhK4ob$SU5Oyr2boQlRR6ONubW5mDKmdfY?7T^83E#V7&#OME7_^-1)nA zM3RzhShl`X=`k(T81Bn>t~CgRQc)6$7GWYsGY*H&l1#-fBAqu&v}K;BDyKz5u~Uo+Y$o^3z;tu*H$vvhr&pJztgM7kKFCx;80ZLrlfmOD$X1^JwN)M8srw z=qZWNQ+8e;1xC0*J8jy(cHi9-?F;kDI?e+_d$pN3(A892RMbn{f=lQOZh!0lOwTv_ zj(CY5kbA2ADC$2A*HC=hh)Ur;VGZ~ zw4L>#5OH8&em5WRcEGuNNa6g{f35`@9}^c1?|IZ#2>6nL^NyBiB-cB zR+*($@gc6^o|%xcZyXQfcW;PE-q?gKBU4CCl*SYzPNAn*UBZ{(9&RL2;>Y}O%GE1$ zq%W2-*6vNDQ_aZCWW^NQTjY$4C5g$qxOsyg(}nDx%vVG2<~#BgPw)#g#1CA5M|uZw zIIFea$B>-vds30omWV)-z*(o$#5nVIgosAg@`qRzuB321i>5b2Ol1PouDFZ(0MA!j zv-{S$U%J)%FU)c%K313d=<6g^LQnK55rJ8AUnKR`s_X%INSL_K050f>{q-ev;w94d z3>dA+#6zmJ2qE^4^gN}wTvKHxA3^!Jj!s%s9SkHbT^GRNW)>GgE1K60L#95(8j0w5 zZf@?NY$+uo1S|ny3u^U!{WFCeXY1sa-n=?_{bl32=ba1l~lOY3PS2C|Uou^L^R^;Id4kR|5q9Z08+kPc8Qp z&n=F$eQ>`aK^sVwdFw)Jt`*Vsk^&7H1Le*)R?8OWjGmh%tAvNKmVx`{<*#FNTYZX z(g-44E%`r`)n~LdfNXKfty8QciNpZO` z-e*TXpmo3roEaW(vwSo9461pm3PRqu9vOYfan}37()ecxBbo8BaPscu(&6Zg1pim! 
z3l;*oL=f40qKN=FBa~FygjxC~|InlVO1ds%);0`Exh9!N#q@A;^i6b#I^dkdr2T+Z zJ84L~tX5u3>fs~}gKO32u6bO21>di%v=slZfKSRY+gbiG60$#D^XNrvP-wx(agY@d zpG$76qcOL3*{lfQN(s$I){CQL_+}Sji(}h2yLZ0(6o>Fh43_mV9ct@16$QTKvP3yp zpk;Tw`c0rnvgZA|GdmPDHCJ(LTS8~z27yS>ruBXGDf*;c&qfHTB<0{J$W?pcpklyz z6VlzH&a%%Iw%yw$!a!~L|;y-S6TfQp~j|vP>Q^d7fFnBloQ;Yi%t(o0fJVjl0otB}PgUuy!V73Fu`# z7a6SwsFDqhWkn4kA`(iHIYXJoW0jxYrbqfed|^=NCjLRI zVm(89)HhEt1w86{qDgN~#WEeAJKXu37xHB?6=zHE^?&yPc|nfKt)bm$ zB+?SnadJ8U735U1-(!FoO{Lun_)-){0n(KXUKq3-oNZKqtsEg&F<6M*z>p_MXX3NO z*x3EMf+`pon3pfZ!1Fyk4+A+Eh*sZeegT;km5}Rjy(@yx^$?7g4=e<`w;VMYa&?U$ zhMi>imFJithFL8MY3V)ReiDAm1OUcRXJx(Lz1%TjGu@w@%Os!h+V?)3tqa4Hl)Z91 zURi0ZuLmI@KBFlyQBfD7W4n?HV6)MwJ?`t5NqDI)b6QANLd>ur+!{f_zL z$B*wrifs!l;2|}2l@B`i%h;^%-&wkT{cGpIX>G%=`v&uq?#u{mj1UU+{qV@t#1ez8 zug6bx{4HH*DoX6v>K`#v!&Bb*o7G{84xwLL3O47YB>-x;l9Rq=N}c7A16E&JEdRDP~V6wn`jII9RYEs;2@p48{|x>J2_by;we z!IK6DgYuJab9CQqj&seCe~e&rIw@%en&II6F-^*NF4J8y=x8Cjzj=`{(&GF4MJ0Zn z01=Mt>1jzOnU0uv=<8Qo=T|HN-+<%a$X^JsM>#lDb-Mtu?qNR#6oi0aLU(uWNklG) zz8r7%O~AvF^YYf`wSm?oI{?-pCO$G@^VsZ-9UB`1=i9m(7N~yPk)(IV1sU&I9ZrKf|w(FeMU& zBYdKJ8Zs(X1UgLkdE!emG(Q-4Cq)Da?c^}7E2o;1J=kZ4>tmj{pHC(mckB!}q8c;y z7rF6qgG}Pz0X^_UaM)+Dc}^hkHOge=a`Kn^X}k+bLodFa+e%wx;V0hh!Iz+tx8U#J z<@x%EAa7ye$c9ktNYD(cS*=YMhTe42mL+MhJ$yi+e$UNl*Bi6!&To6OY~i%ysiip@ zB(hkjrsG*S)wOf+hIIKuSJBL%mI_GE+&VE!L@(NWlJCHjLbx--WzKzY64(CnW*tz) zL6vav`FVJ5u2~(Ws)tmQI0P{=MgpE-ceg9%3b2^|#*+ePcFHv?aCV)a*T&0~*3&yO z9SAUC9!cYg&v+J(ec0K-g-1Z>?&&ec!UHg!Ru7qMDn>N6mMKyuE6pi6x5lN^d|fS7J`Hb z2!8;ZPu?Cg80%VTX}f;Y_yq=#F(dj%GjpPBXD_jW7hFqG2A$3srbKlUgLCOemxkMj6B zFrkRzGMh+9gh)z0XZ~{li=;twBe##bjOpRww~UrWyj|a?R{w~lequ5t$Fuc1V0|I~ zDrJy?LqalV18SB~vBck(E-R{O3U_8vs3!FNGnA`vR@{dJY~E~GmFoPY=&qDdBO}WF zcGx&P^xx%+6spwLF#nh$j7rM2hD3@uidr?i;N_!sZ*o#IWM$@>?Wn`V?&2~?H;GqY z%za)3LWj-Op-Dde^`GIwLQWR0r6&8zny6tULi5A^H&E+G1*xntXHJgCu=IJABZ}pU z!a>^lru!oNM(YyqIe86gH1>tUQ#9Ew^dXvkUsW^a_YS$Mr;wjI`3Y9u`ek#%{QUe3 zv%$-M{K;*BUfQSsMM_io-jK$%*d-7DY{DmvKEJg&vY|e-euX#4m1!6r>mMA40SU== 
z4vXU{(qIj=b^O_V!L&dAx(-j=xLG}9o&Xl|X1wJ3QaF3~`L}E<4Aksu|3p*F!!;?A z=$D1w$u)PgqcGCo_{<6!o@`%{r=plMvO#2vO1O}a`O7FcV|=R?pNsA(CkF~99XUzg zxp}$AyPFeNeqUr7sJF#2riHo`H1UBd7!c_~2%K69VET9vM2>_1o%#f#1L508s$^NgmL-a&q2LQGo#lh?%~4h12_I44g&4&_Er4hAEtO z`*6(c>{^TNWQ842O30!`ey&&$GaqVcx+fzg?8o38@`$==^6NuTESuhRK2;f2}Q*&|xKS-IYmf>KXr^}M*y9}AlAYb(N!L^4nR zoxLqH5Da#p317u!L{4WjC5TA3vnT){a#rJQvM|q$o;O>$xk}e#H7&iO^t?@!akG$Yr}V( z{t|u*3y1q?TG6}W_cx+z{Us*22MO6!v&~zM;1E1hqJ73aDhh)SwZbg$AHRL^ix$Af+li3 zcDDLLG5k-G!yw3=Ka3P{f4GzBqQ6Z>Mimb312*OE{ zirCLngp$RWLZ+ku#or*H#oO9yYu zz_JrbV#97^l2K-0HB;6=heHppET`qhci)lA&ViQW4Yntt9j>m&E1;z>cmXz@+kusW zqIdM`>gq2IrLmv!OL)XSjyuElvI-;$`6VSypE^q4MTq=J-Q!3Fm!soiUogg)Iv}@` z#i_a?rSUnRwGV(g)NtJgP*=P>Q7#b`nmhj0 z{XbA%NFus_+W=Q2pyn$q4z0l4|Gvi28~9PeRHF1=s_WL`{=kuBDNS}G3ziiPo$ODP z|8bD?_-a>qWRbiW3Bijr%8*-ir5>*wqkG{;cE@wCvnwRDQjD2Fwik~?_Zp3~iao|N z;_pblw57EMZ5HXMOAH4cB|3~(%LV@8_;`3)SiX7J!S`pl?SO(!;R9Z-HI>zEw^a=> z?d$pJv9UFpH6DI)Z0ele3bhn>B+n%Y-(_vB9&_HJ{ad{G{55GC5q%>YVd9J-k)Wao zHUMpkl|42!_xiV(Q*^dddAvyKhjs>eLZmtKIz9UQWq+SF#C9KUQ!@nZCMJUlCb3Yd z*#r%+qp92NiIbD*O7FYZ@)s72`(=|75=@$WBB(L$cyHoAFOl&2a2buYdj)$$!B_np z@HJSen(S{?4u*y7Cp2a#)_0oe#;GvBvScfe6A>Gg%CuI9BjF>ynaSK%RW%2=E9cqr z--pF45%LPEA0#!$+n7rj0h?Hrj+1+|r~Ix1#)%&tk<&5NjWL~v$GK8r^&#V#?MsMn zyo#@+_uVBe2S;(Hu$fgdew>N46$GW3NRLlWdIQh2LUrrX;^N`a(Q?Z(na;Zc)&B3S zwsCCTU;qd1ARgykz|43Jq51p0XJ?m_kx7WoEGg5-2EYK|je&*R`6=LotO6;Kp`~#< zh;##D5MTx$%&dbI3}A^Ofj|4<-&wf1ikcd5UIk`lsR*}ec<}iZ7M_CxfM>lDNXOjW zc}`8u_LrggL6l+I*l>_A8Vp1PQGmStce=U^q;_m_{?h!|uPPeKxOn(KSa>x_Ep0oX z@fp@#;~DzBP2!Xn!GN8RQE@ z=+wP^iEkuaysAgxJUpDPx>^mW(J)Zo`<0WG`(Jh|{#eMS!)bm$AEqEb;6*#DprY@ zcuF{sVPoW@TopKI#9=Ve(QVFdG+zK_f6`x24N6kg7{S{d5`qx{7K%6{3yZzgQSth$ z{QS?Ro$dMQKYh}7e$jXeURAcVxG9gPxBSvT{qk!4X@9C{f8R3LEalKD+1<7RZ;uwW z0YI67i57rw5mCi@I*E>UcL4wq;D^9AE2hMTi;F8ma7VAsaqH?p7VrZ>S(#xj3f2)T zgDfRRi_d-TpFbE*Qfg}NQGbPm*bd_9!}LqHFa*L&LAX=Sa5;2c-PEVGkk z9$-TW7wf$`I69UY{xu^twM27iBHk}!$K6o@jJnTS_B)N&Pd)`<(&l5~Yi|b~9Ub}k`B^l2wjA>vUGl^!&hC2Y!N-{; 
zF}Jm;&+TqM0WdP#gR#pUK|!zLseatve(Z4fyc%yvD>uB%XEEPN{tk=Z!gs9K!cO2A zYlmOJMN@r6n{IGWY_gGW@*4V#X8v(mM5)RxW4ZL=gG$*7Rro0jiTP|q5|LL=Z}09z zPjo@wjl1sg@3V`=4~w;>3qgT~FccK+dS*N%6rUe-l~hzZK3l7FyEqgUuJ#^X>8%@d zOqXh>#k=t}%%YFAi`tV4ne4Sb;(@pN!X$a%bx+2L zV5>MKMlf^Q$k2!b`7a|ZYjo&`17^)}G*x_G4>L8OR{1+ppAasFhwzAxltq<4e-39$W5U70)*p?SRf6U& z&#OPh`80NR=HX&A5Dm#2ptQy+GGO%4kca>FMxua)HVjjUUc+~&!mq8;ek;&7u3Yvh zrk6M~N~IFR%;f)>&&2Ir5F3jg(G^G3GT37tkE|9*^TC@fTp}ct$Atgc;YF=MTOi_1 zD9S|2{yR_|Y63eXw$*jv=BCndc{!=AqouT<5)ioV;~h%1g8D8N$s$_7!^?ozh3>yL z`2^oByWVi@S1wnbZzaC7raZ*NY;=TpFd0N~Rzwo+uv#W8fVsDhwlu#SC@dqYNm(_j@#)WMje9dqx+6|ZA+%b3S+&87rxTgjPSyk)I5dx7`RNApz==s zX6K3g3*5TCqGGEd3-;rbj0_f2awwVJemp2D@Q~D$l_4Ra+(8rGvLaq1z{sD~{IGr_`spI=DPfvSCm}7R$@(T+=_w(%BTomE^(!xSI zF0RssaOb5r`@OMmfeAWUIhR&UMEEnm2Zk94uIAM^9! zFph>n26ASyxh{c>^H45rGcDy7MD%;JBpV76%s`^rVs0;Qq2XqCy{nl7lRI7N0j zF}fQ~w$GESpoeAVp*@qvWJ#}IyvlRlUIp8y z`=f<8Jw2Nb)G|Z00|Anu!TOx66?k0M+|VlMknY*2pB$lxe;E~Zx6%Dt_m7FGBAUAH z?#+w-p3-!9q!lKkSw+Uf{3cLfEOcffqHlZrMC@gd*K#PwF&#Mdn#R(G-~lsPeqmPT z#JuI{am$!Rq9NMDbM4wn!oSGVd}tqsJRWM}@VB(JwS8kTtEc@5k7S$o+hVW%2_a`m0;I0WZx&d`tdCN<=m!ZC6 zlWcv8Bf{bwBR-8Dq=Y_Buad*PHHXU*a@zbW9g0N3ZxG#O)Q&fIIfs!>iDgeIH zz*fI080$f?QVBCg!$U(K+)n|~PD@P2VfmN3BE-| zxr&SD7|}GspL=p*TvkQ=^N-a;02!yeJRJ><&<}CX{e8P8L`3(7Z&ydmk@BJU2G5ORa?4zYI|STj0SHHeI-L`SQkQ53M(`*Homx2 zhY@CY*XI89Tg?xNAj$3RkL)v*m6b<|Vxq&5fTLxHd~2pWJ{}X3p+6aYcIV=`>a8<3 z0AMTKl>tGNd+Rv`1>W3LhHDQh6S;3#I3^)z@QiqJCWXtizZZsyYijDqc#$eNs5ox% z!sVbHfaCOZ87O;(ukpb=Jrb8WO)`=o@>Y=|cGkf5Ka+%3v-jg+%6$^T>xyGzMP*$# zw&SDqt|AkdUGrf|Mjf18o{E{#Pf3i7gFypj_m`|htiLY^m1G@~T1Q7!%~mnPIzN%) z4tFyr(#qFVSHL~utHIR?UHs(_EDn%e6CDcP=Z>dN~1>+>ymFgwi8Clz-C z9=uY}H~r^NT$d3Ty8!^&@Q?R)0GJP4$n-QcSP@+-8?uUX#tLL;*nvrRmlaRvIcO6! 
zXIqbWfL@-GazZ&q3;A8UN?2InL#Lr{GMEgS(^6CEjGuwA)Z_m8Lx5Chn_)S495ys~ ziHUXo5)S&;Js*x|eMl$ibqNEjP(auz1U*Vb>C9p`uMB-Qw?uxlH7Zv7>Ou6KJ zd9%2AAH4hb|GfY;r?`VigS}tgEl^cx7W*u?wRp#kEl-zbW(DuY($l9Zr>>wV?#cOw ziKnI3ek&6W*=YF%%8k@Qb62htON4j76d#Z5_SkcA%j}Y)%E5Z z8u=1KZ4DZfG{kQ(2~JOqSz=~E+B!Ni)eoEom06B2S= zje9Jf_!#NoQ5iLr?$?M5|4)FID`Y4{{z-<)WB1^gfg#l-KQEx?MosOUI?Pqjha*QJ znWslV!R9g8{iQ?lC+W|BLuaZIvOA0W9UGqDRJ?s?vHE+*%E~CIZ1qT4rK77xWfqjx zJfA1<{zj5uLeQ51kq4ZA1hGQ&27i4N%@+w!{EvqB`2;y+D|XM3tYRJLT;55d9=_8z5)gT0YO?i44^VTTOSVQQ=y##D@Rgr2vBcsFZbAAzm_`>#>GdW z3T)GD_O5GcdW@z|2mAu?wFxjfLJHvK<@NON0A!~FaL)kiP+3t?{41iPNhg);4KR5E z^JzIbz42yna=qO95Kb=zAkmWF@BmUpYojqI2OFZHKbQb>IuP*1VT|h@M1l_hYgb~T znoM`#Z(th9DPDB=cM5n)*a)IZB^VM)pWaWrfVZ%;TyD8RV?H*j4}k_v$r%|L<1BzB zO@g#7cN{-f%B}?*UtS^wP*Um3D)@p@Z$3N}hzp11EvK1HSi0cj69y z*Y^`J`)w88?R8i5$l7+mH>y&@ja!(3nA|x&T zMA?9a_uKKezkhRd^hlG}&E1vL`1Ex9njyHay}TBHj-kcS2~6hnbh}8P{x*+?=orAJ zzMN(PWy{6Msln%7Uw(MUZl#}{O}|*L!L}7j2K6R@AjzQL*t@uh64K_ThGS8HWMfkT zHl@b$F2J1r>J__3a(GyngRU-7u8zL`&*bD8P!~cidpg+L3k-(h^75tsnLiN)FiUVEJc4r6hxVwoN^LvCgQi3Gp6_9OwLybpk1 zc|V#x9`O}1>*?R^sqn;R3&I$&&=62ZfGkpzkIwH}kx!qR($n|$IkVQ+QJ?nI9`shn z^1K3-^ZXeU)Pod9Pr~C)`p0*Ii1JvM2wP@mrDZ?#qNe5(LqY*BT}vbsGWgYe!FHK= z%&nY-(KjH!N4GIUG7zSrDoHXSS?Zw7iA_aw)W9xYGj* zyM-=}`n84q+vDd3a1)eBG=c|5QE=GWrZM(A?El;lI>j zo__P{&)hoW<2UAvZzq=o&!Nw@40`oqdZN>?``g>R_N%RMh}R&NIr9t~N(GwN{tNU09>0Ud%9l|2Bd=~9X)4v_x{Kq@cG z7+I`y(qfn<^?k@Bu72i61~cpmZa%kTupNM)L>W$-`O6G(NbGI7iopozAb~qVNbvN7 z3I*x1^4n)S@q}Zdxa`r8E%3U4+Q*>z{6d*J-o?Qc_L;j;V4XkrYk@MhgP94? 
zLb$Z_Ax}PiF4hXU!(K{iH&>2tIHSdL-Tk^U?{Rya5y2JcPh~;x=Km3z zEP^K&Fp~n1Q&EwX8XbH9nU2nnfB!n1dDG?dd>{U?GBcL}BiYAfCVhSVR1-_ES~y*t zOpa0#GkoNTk;wtTPvZ)1?Ex}qbbmh;#9F~HnMCjlQNSe)C5XSc-qvpTK521ZE0P;m-KRp4Pnk42~V&I=2YVocoH*-1`+tS}~yk)>KY zgC|kAcNGR@E5J4JT~-ArBFjZg2#{3YkQ8f`nUBEK+HFPQF9^`fao;%(;`t!{Z3zJ3 z_>&cbbZN_tTOJ-(^AUWkIx^bhQ)f2(=#EbW`#P$0@L7q9#l@SuTdRUTh2L9#*)T1y z%$#kWMXTfulf~x4`in8U+S%efHseyh-H`LV+Zr592G$O(DtDUb5k7nSoOt7w-QD4Z zbQs9Nff_WG6Y@N#iI8)0aA-GMw{U5Fe9e^nBmUyBJ%?)<6-Bh9gzc}WJymS4!{3@s z2NRRY1X}W>Bxh=BDoCh6^=RIMo-&&n0nTmDDDBJRgkI`(EvsjnF$9(=6Tjn78ZdXh5%-! zn=&<-pL!MkW?;)2Q-7Ni>WzipsFPX4tcv+$%x|lRRw(vek0CtYUc!bIYAwh(SPC2g z6Jx4K0xMNIyujLiT$vZ{DiDM`n!Nbuq#Xh4mkCN#5thbBooyXY-q-~MC`#8k`wk90 zAe3OCAo;J7oUK!MOEs58(1PHZmUoU=zVac%BbZt%8Gd9dcXXBAsiYcPh*zEyu`O$R zYPr+Np?c@R^+ysPAAni|Lg+a;H;Rk<`C@D@{&|~gFHBbG()^*&!elL1)u_=5{wT<5xUhsjhxSY-a6M2LFEP9e%)O^^Wyf~7z(Q;$^jcCZxQtijiwYb->cR;rt2R8(j z^LciSk+`$r$kT4iB;3WL`4d^F8#(eQ+o@1M(2G@N){$^94(y;coVP6ml#gwj{z2j7wAz1v$Kp00|$pN?Rl7pE})eZrsGeHkAvYS z4OX}m9onN-nK`b=k>KYm<1(Q*hXZzOak&c)8I($M7TC+Sd%O4xhF*U)uIhYw%5l2&{ zvWnhl@|p*ukB*6n_=$W2%T>o#;dS>T**u`Vltxw?n67W&#GojncXpD)GNKw&5 zZB22ZCEI&DV`N6nawCt)u)t8Ne6{zW(7F&C*JvXb2CAP;QB_rSP9=T7enRl)gAD(j zt{Jp^X>#6E26-2N0u~bs$|WZVAOkMfs2K(-%0^p%P8dQ9#OizaZk5W-t z`B6>|#NRtR!!SX*e|B=x+G^pyvnN;_hx+?dS;%!exM*o;yl;+u)TbA~CM`#~K#_*G zMx?f?ib#N{Y^iKwvU6(K{<#>%GW8f8GQ3i$b=b)5?v^L?A>&O)g=YVl4uCOfwz7?~ z04!u;ZvGy8{@{PN+L|GUgn&j78Oez=phkzC+UKWAXHa1zZe~V!1!74vfv&)YQ;M2S((#d?eVpIS=u&x@Pke zZkPS0f&yUukDo`)Jy}WS;zH(#%!c(363K*t0K>k;{HEZ8*FsoyO3FKR_2HKFK7d=u z`M8s+LxaO~T}IlltRhqs ziZluG8OM1ziOfuv_RfC^Et`;uuyT1|;1z_1H{0#JUV9qG z;vY_ak$Bg$W5X`EL|$Fc&@bsugd-)ceCnMpnZc~m1b8c9SVfXsm8Pu9ch2Zko7BgPkU0^ z!2&6h-=@&CBj)ydYhdlkmBdSKq&oy!nk79<)Lm+K(2EoR)&t4CJ>SuV?m|J`dI6IP zF7IbBO(HJK*Zw>s{sG%WIwxVf)Mkr}DC;(hF)t)K`b|)mFfB?qZsgRe4eO!>zgqMkWV?89k=r zzpQIfKv1ynsfpg+QQ7p{mDKia25pDC@_6#7ul_;EChoZXcLG#H%GmznVb%A7LC;-B zzABWIM0)kk(%RYt86%{;&b#?aSsd@NgGJ76PE6JdxkMksL22K>cOyJDPIq6 
z6P!lD6>)k&=ze`T7fmWi0{$5|*U@3K*{}M6>ILwzzx$Y^+vqNx#a;KQBNv>rmg<~# zK<~?c^ovYqhA7ZtW@_#`$xETCqoY7+8Ls3x9Zmy%)nx*gJx16j*b8~`I`04E;Qu*V zX|4dZMiDQV%Vva;osP6X0kGKcX5kFkAG7Yn|vb zM_DSY38OjF+Vqu^BTke(w;I9-IbghWf7sPARMat6m|lo0()g3iXy-MRMN{u|FK1(( zsP3O2l4)3>$<7X%=lZKeAN2T0hrgGuq!kLQsZGVbW6v8ojfx+cuBT>9dQ5)V|98Z>VPheX|;zk5&i-%H3V zCqJLYCW`w%927PXT){GVKeYz@ICIuMn4q{@EHg%&LK<=HBN7A&3#G&EIP#Uz;SsWP z*~}wBV4yM!vQ6MBtyX0KJx^F9B&A>}#rk|i1OvqlBm&zrOK}s?FPghYueOI@#`Go$r3HQs^uN1D~Q;h};cVhZ_kqHTvkKXz0W;Jr1d9uW- z6DuAYAw|;WxMGZijb7%3RQ8}WwE4^9AOA0UY3Y$a1sYl$P4!or;Q|ADzTu6@7cc|0 zg@K`}yd3oB0d1QuQ6_pFL3lJOeeEL-wU@^nD*n9OEM1-Jk?8Ww-FL>LV+Y{H0^jn{ z%F0_-3;${tLIow71d?zc5ILbfT?zIhhiSUM%pK!>x$Sv-*=&C%`aW(cr@KdA)tuo{ zg$NRt>5KFi7KR|MAB>pNLZKxVc;gHaehMdaAIuTOKof>hTin>!O3us@#y+%kD5!qA25c7BR5H{}ImJJz#y=6UmO|a_MrBy5)(BEEf~uj3LVR zego$RTb1lD9u(}AnLidR<2$cP++XF@S)l!SVY!fVv9rFS`4Z$Xrn$H#e+j8z#yd1W z@-?^pX9s4|XRxyFgcNl(ZieSAV+v>?2-YyW?pDg_LX;?KZ8Wp!rvPUFd8~-i)TDJt zPkK=?BW5^1O2L5MAq0`8-bNy6J~Vlz(aHq}%&?7q_&k}QW%Fiud5Exwp3qxwVRw%` zP4)v`cGip66ZLrS4j1&4{xt~-8wINATB29`b>h~R4ZFtNYsT2KfW9P{*@tG2u{+UKgN|pmV_ySs zuzEV{A)VLGX$dVWV@~m2$31(`!pS*ZWw<9y6_1Vkyn9H91o@HuDlJ{RyK)W_b3B+X z+5PYJd58cQlR(gTdO8d@tW!(t1dQnKfJ)fM1Ow_3nmnsFJd9ZI@YE@!Bt@N(pvhLh zX`reA7veBO4jF_-NJ!5a`K3RBHWanX=Pse>tkfSGUTinn>Q4lu_OgO)nCdYMI!PrS z1kuEBR^^w;$m|800~rm0(lVh5*;V5Yzf=DJMv0HcefH%KAVmwFU)=L?7vpoEmWgQr zqswf)&FcUp2kvh|zcg8VxAK)T(Sr8TwJJ2f#m5hYZR}BE6oA7Sn7kY>5r&z^;h|t; zFuOP&R&Y|euK)X6%0va_|EY|?5kn38zQ~0Pra9CKjsr3?h86$N--^g*cssLy7#SIn zr%3?*ijxz;`H_?4u0EH8-}31x`K}VRRVIh?ANHgkvKDFuB3~bTpeR28fnN@i^qmoE z03M=c>FsLuEBA6U($X5(9ovwIBq-X$OB&b$Cb|EsI0XZo|3rhFAy^g;%2)7lSA-(e z!95Boysbk+`#sSiuV_=Kx~Jz$`2rDM6l-Gx%5_*(ky0xB`n}sbW!hs>Z0F|%_+#G} zx=gh>IbFlU@=L!3Y;|Y%Cc=Ox$Q*;Hu zi}%|CL*5W_31y;bYO^A_Vo=PxLD_mgz6$r9TKNVij=&!FDQenxU z!<(|b29D`|0_6hdp;U2P<`EFbhJ(7ly@wuO#Pv0p+i&CuNZhUVSS zNAd2gQS%qI;cBHKv34Mu;=UfS&HE-bc3C|T9|uEzi;|o>_B)Y_y+KcYQ%#NQ*lfkY zw{LQ?Ll7i`Igyp~&M1-YU*1QdW}z50ecK)ziWsR9iSBz90iRddkS!_zCbGKY!X98kSp573vco 
z=75?gAo$VI)6-(fC@PKu-pF&6;FGVA5-76>7o#Y!$N{UIt8>-!bqNHWs^H0`cLad? zyJW@)eDjiOO?lrNOeY-@xN|&e}C^p94Grz`1a&(-p9|PXLxA+!m;UxaCo*SiALG0b?zt~x( zO$6$o5kLv&jGj}B4smgZXwh^?F=ss9vzA4qU zIzah}wtO=+5CB%Xwxn)CU;4qo8PkN3t84@V4)1(H!zwf3xFn0h=ja&lV<`&unT zguK^8Mu%@=KqKkH-;DdyHC#kQc2a}dW0E_@0e17@;eq?j_l7;2w9RH_Rn*Zf*toU) zf1e2+0{=0;Ty%1B+8IpgSmU5gl%a?PXVU~=N$TwGX2bgiPSh!~QEi4mqmqb10Rb%{ zE^Eymw;}{kKv-C$a$!hhYlFO8=+D>u&^(6rD3Lq;_%Xz^;WE8PzL`0K7 zb_5SnmyIkg0JWZ~3kOUOklC>G5)np2LPj31$}e)*9nAz1Ory`edX6$^Ynz{)wdYO) zjc$8pi({yBKE#^ZV^g2r1&eUo+qZYVGzS~VteN1s+1XyR%JlT~t4&AH+EjnTa(QEy ztVw`A`#4p^#+!b&=40&Y3Lxh;HbQ0k*glW*mk)E}yVY4si;VR2+|trz5>Zup&ECgD zKNL8BOne4FS@5dWZ@T-*jjo}U0x;Ut^83wXesea@#9vc*nkA+E{nA~`E?wJE+sdQQ#k3#e8B4Om< zv*9s~3g}yM3j0E=FVu8REorsNy+6(Vbx3XdHe|Lz_D{FtB4t|?i3JrRlQ6M}v@_GX z)4}|PhE+HBz`8b?;Vk;4Z>WTkA>DZ2Gzvg;nqO(T@w}i!LCxa(;h6JXf6bfFbY9KN z1833E`G^!yHdJS7zNaEv>^ci~1^kGGPag{0vUG z-v7NC#VCS5&5n+){^#8ac@3n?+cX}h^Ye48C|wI6sggm`V#2D(??|HE+}tkC&mEne zi2y%Qd$8&)r~~ecr}pJCvRq-@9r0Z_wyD$QblI=Eyd+E&I~0Tge5jqB5i_Stcd$+e zi&R+641i1pa>j9K*8#;=?JRP4e?ycm$T3&GgXCQ zs5!D3eVm>i3YzX+9vl!6|B~`uUs|7MgS!qWfLr4^#xu5qD(o(KvcOdCqNL=6iOC#E zN@~)bT~NS>5w_@O#^SNQ_f8204#Cysro1^OE|LN#Gc#Q(p}U)^T(cNhT)y;Me1GpY z;|BxBTY8cA&+aP(3e-hKdnTuAD#zcDM&Tj{pPiinOAT-iI_c>V2e$pA@A?Y54FF?H zraFKCJsBT<0DJ1)EF5#nHivp(>)hN&DjT0$&#YAX?R`rg zigL-$?6-jk4FVqL?m(y~e#Y{;?u3D;cH;jBW> z!Czt-g2kO>R)b>S$`F0SH=kcfHH3%!$lV*IS1!zzJzpPcT*TRkelFAKMSo{4rFvp- zJh_kE{n-1R>)7~xArERd<&1gq$G@oA*CyaMX!r;(R-&;rqPI;pKG? 
ze4)noZ<4UE#v>yIT0ET=mq_nbj>h7ov@A}1a+wNVR8whM+(YPJkTx?*7(* z`88>Iupun;4%wsk+n$s#_Hw%31%TBr1#b){gc$k4`0~u_V2$Rve%U9ad!P(h4 zMH=x^*=G~EiOY3&p^2!zG_&8m*Spnt1HagBXiFx>u5iSLT=`caN95C&#t{Or@z}g4GpI{ewjSMP`Q}3vM41GD@?0bg2 zSy=^$cvjX0j{4B6t740ZTtRpH7NAvvEutC?76VSv#>Zc8fe{!M7PfzYUP*PMrqusV zea2(Ov91wpF@VNzmGtl5KM;qVn%Z+uCLkoVqRhK0x1RemKJK`bB?KZS@YPjSgWp*P z+5`FwN2?<>U?41Hu#4pqJe-_NC@%;4B|#1b3O_Mt~~!nGlKpSkoQq^r=uak_w~mBqwb7GOdb37ky&eqV|G4i{^$m|GWUdMvRK zApUAIVe58Nj*7Bh!Ne<-0K0oeye5@YgjGa8tA;^= zXcr-mKReGDXSoPM*+6;L4&oY`D1EtpwLwP`ospdU-bBi)5gejn+#dk>1h~IEj^j$S!s6t4a5JB)i@FP`Q;r>)e?=!4=-8Yi41;)gb?vG@czO<{a~bmj11@qx}}rx(UTED zAatjqe0S-2=k2m*xRDzpTJc6!_A^*L5dnw{orhb zef8=fb)TXMGr!bU{a$kQEG8>UfVjP@YyJ3$A1Q!J;U|~JImb~M0_r6p_h{J$_M>kU zcn|g$$Axt7#bU$By{feJczww&)%fDoE|AJnNjYE4B!`p3LPU&1=vX}Z&ouqKiYHH8 z5`7m*jN0Eh>7}CPS0f?FJBLzFt=!mv2$bZBc9_ECUy|TGWS?z@& z>~LM&M5YBaVty*hv)!`=kCmge#ag+Q+x@Y4zhxissn@Rt^+ID%X5|=f@nF?8vX-K( zW0+Uz6vaxGo{|Rot!$8MZHu1HDAxxT+zvTv+a4ROCYW-kK8gW`|4#_LYnJ=!XhG2+ zV=O6me!z=xY(!%cXS;?L6&MBVwZ%`kzapj^`B^m-N#lJVunE&6cIoW$^RsBs7v7imb@lCB$XXP4_kYEyW!*5MU5Rk&q`3-mv6xe;q1o@|@W#yFKaf z6e(J!KJ>KU%m3sr$WtFSbDAzQ zcm~+R|Mn*blP_{M{7CTY%KvKWd?T7>)7yJk9&;suhcvadqwv`!8h8>#$nWp&9>S?r zz$tvs(Zy7UNkbLPDEff+`qy+R44{1utka{!0!th3Jj9C8k!gyja4}d zP$ChbUyy|oRzqy@OBQ{%E!V}rt_5UHzIXEBcU@T`qt!je`5YbS{#jr2(SykDoLvi4 z{X5n?FHaj?ul(zmrbkEZ$9`+EF1RSqb8&T>u>BNa1@jhSUzuQ$*yQA!vkgf-A0YgZ z#lQ6Y0MV;6rSXmA7_?;WM`MI}_iqMGzKU3Bp_k^qD=!p$iiOy>a88dDTX@%`*LAUIzNVDB(ZiNF!ikGD}PZeC+1^Z<$=+oXF{O$8dVm zv%+a_XKM=#>cM2a3k!Vg?3KnSbT|<1YJutA8G$pJT@uWttU;0o&0 zLAGYzhNgb>Q=gSzm(dkBIrxu(MyOryYyo_VI1yKumjHb`!+tU)6fwVyZbO1;R53YJ zLLVz-ZM)shV*Tlp!~Uf7n>TNc)H>SRt9&*luLr8D2?BX-&Uv-9wAihtt*Zh(tCp`U zC&22G%GRY3Egg53Yjjt|CEhzZSy}XiG%5!0wV56KOnMK4ph|j1z@+yyy}|Yc!k#M0 z@gmdLNWHSB7SLmMQNxoaK}hHWTO0f?aH`d*d`7|L!o3ljp5pc1xqweG*rlNoI?;IQ z_5KpJQ4_DzrAF@FSA zxw)NJ{@nfz4GndT^O`MRi>K-eQ8>&>(Uzh-x0(^qZTcUY&N8aXuIs{w5K(Cn=`Ja0 
zq?K-v5Tv`Mq(h`jN<^efT3Q-Jx*MdVQyS@#Z$0lAAH#nt9uMchs<4p$He5;r>(#Hy)NGTr9V?FZSlv6u_?pcfIA4vJju}I19=vQnKZ?t z2F|>ue;h}OVJMO4MvI)!iFsT`dN2Va@$b{@WN=b;ZF&eEQ_SO&;xWAsgZMmFSWh*m zYJGWKj!PFssa8sOe+(a|fF1+%-cdzs>+3L1zK@FR>w62%{p*Fam4*Kh7toRr5D@&A zB~4s6(_;K*7P;`Hfq^KL=#VIsqY6un1Q^0tj!>UyN=jZ55$Gj`D@px~;u(UEH74mZ zcHUDJ8yn^zu{0=&Msu2-i~qTdNqQ+4D$rwNVO<}tK)xas4mt`N#xu|3P}h_9Vws?^ z1>J+IJ%l^11J)~`YsKoLXA+VgBR&YtfaAK-T^fZ6Zxrg%C#DzKs;PnjWr!2n2CGs! z$*<1lS7ly@zn3CNdXgORP{aiv$82=&O`z`GTwaL!s1g#6&sz_zg{yq}+LIxlqw1p7 zvEq0_DvpFu$TgC^7p*jzx8ZYsV7D=Nf`T@qtuwaj)N^drh25bzmM>~yj_>7Fqnuc) z%l2I7yNkIDt@rUv%=o7dn^jbN_U{@!*L->}YG{zX{$7kWmY!rw58cNkD=Ox=j|dI@ z^2qDp`q!JxB)W7FjW1t@*$HdEe;UXjnf$et?yHDtf6VA(k$G!!&5CjcvnZGp)fI$2 zK(O=jTHEfd#uD5QN@a3YPUJipOl=^h08MzIzkh2=iY*3OcBaA~v9XxKr-~?$$%{Eb zt!NEo#MRZ+-@l)V2V*2c%ut$^mo(~~JMc{bx9`&O^8T2`)>&0;ZPKUDpF#Eq6mkoT$%#7O<0O)8K2#1ud@SiQlPht9 zg;)jpqpgX)$4}FEIjCfF|Fqv8MnVGC-e)xQvr^?D&FMd5+qr+!& zA{`{gW3`McEuC9c6)Qn)Q1y4FHHf)tGQ?0L{n46qHFNXw#6-X3?QXEgl|=5% zl!wE`qQaQWyPJ&vjDd;Om2s!nWtThr%s(yk`7sSeRTg00uj)X-~B8Hc|<90vge%5^kJr|xhz0JIS4&BqhoN$ z=9KXzHHTg!-s^Sss6pVM#W(7`ij>kj=WwieKg>q6AjD~h}?Hv@kW2t1Kt|>_3q?3hv zCqp(fiQQy9tS!XBo_w!{5AhR_MAK;=%hGcJ_4L>RC;m8`RkW_V!;@o}U@Nx3SS4W>A#F%Jt$n za?5+CPw%ar@1*e~o=n`#shK&Le)6A&8mNt^6K zo}e`J3n>#$cW3a)&n0#^Aa-X?|1P$0{V#-JfG)>CQGt;(s(YXcsPl?ePiCiIGmNq_ zn~49^+Lr|*$4nLTWp9|1wV3fc5>tg_3G*|5GwlfgkA?O+A}@_mw@Zh(>vBN zt9PlC6}cit3^&yH6g|IOeDSh-Q*_h0Uca&A1;oRh?bp3E&k2m&HaC>QL$#WbaslI@imq<|#u1=dC+vZQIn&>-@Sn!&InHrc^hXeO z2L=VfopCUd%^*isWg>h?j~Yz1YLOPw!#!8eBNgZ93mcy04*9}DYZd%E2w~S9Ffa(3 z_n)WMduVA*f#nZ)2F?wJJeJ$3d8$^3_ev|>QSev9O=vdGwn5qkcMzC1prExpH2S&N z5&-is2qgrudo_#$rly{WG!JAJ{D=~DJ57uvnCvzcN@v)co8AAr=31;KFb8Ht4Z#&G zM1Fq$!nN^0B+1!?(B$T(A~a7rk9`yvSBZ#;C8avuz0MTm1YYZ$t$SYTy7bW}4@9T< zYSE*_tMR+KzTrl&X}WI?w>|3POG@4ry0+!jp||PnXN0#qquG8kO?cPgS%3XWrW&=9 zz%#7~2}||ynU(+-t$jI329qXLAhMi$b6|*fcH$!=m1+(T2Si8jSJ^~^zwV>wzs||h zbaAmJr;4KG75#>WE8rclaXYuBynZ0oRqKAXGpYj6h70a@*S3VgkJ(aMHuy3b9UL6= 
z8hE92=;TP?KjM3JiirU^Mm|uiW@VYPh8|Cs+d#6*5BH_t@yt))M|x!+|5c0E{vH?h zcG!u8)uzWON{(IcbpYi;3r*GO2p&e*sRxk zP<$C@NQ%gK$<{c0;8-3RjT_~h&!nc*X2V>be+VR>?~?2-tso*+|1kJapkvhKDtc$_ zG>t&wK4Ot8Jx(RT-p=YJm0cCXSasd8fPs!vRu)b$B?@VpG>uNu7fQy$uJD4AQs4MQ zN62*dsj)9wb-EMF4r`&PpnDiEezu0v#C3Ty&yUP?d;tVAt>$>`<+-P)#~bZu_5`>o zPi=m-IT$=dktVkCt6V#;k!-pVRvo06ZT3J3>8QxvM8za6XblVvpnBf9QdNn9)E0bp z^{LtHW@dbLQqQXP1y+BcK)!w`nYmQ=noOYFsYD;x$l-l$+(NDaUG0_5E zgp0X zs;Ma{E60!5sZoZ#j|$ zgXQw+NW^mpk%E2eLbf!v@6ie}VtEDI%j<0X6nj)Lr3FE$#WFZIM}|1i%mM^241$3( zysx-Ng?=A{^xe%RCVo&xbp}=tM=y;uRs(>kYHPK$v@mukezI9oWx%A7JU3jUvnGjL zje{E<@#NJ<&p+PSL$2bF9{KM+d&*XtFXTk)I$be=07i@Z%4J(+Pae&Nuo#ZnWXT*X z6uJ!A=@svXY0nom+b6r~b|!-;WnZTTKUF?4tgWg^BTe*HUwhR1X#G*s4UCiy4$B4J z?GzL|c_U$nO+Pl=6J}*8bC1IJel&whfq7V%=J(3-&dyGj&p(zd#l<5Eo?s$=u7pWa zKS@<`a&ejXY`Y&iQN%YsuHbvPJpD(lfSm=Ce42G>bKC88Jo77CQUPbwkTG@-)zM?3 zrZ4Nkb?G*Hb2pCX4$EyJXKJcS$Qcqi&$ykz^2sG|QV%i~IH|ssuyJvJ1$i=ypy2jl zoILX@EGH*&ZQb!QI`fg*vO0+0_*1)-zP$gqmVut$I?I(u1%GrcBO8$cFBDv|LgWUS zDk=rzF)*Ep-RGapE1U$sYkEeeGQT@_=Sgn6D-nX^=21hj9 zp>S<(Z_TU+!14@!#;UJh7f*lB&9#GUWEh&lPzSaFspbntgsu(_-}3XvJ!L58r13e~ zGrnn!)QP>xgSa%+0*P(25m|j8FeoK*2KAV-={MG9W+J@+rn;Axpqm>u#2Zx|Za|J* zgY%yXIFEs`>#~)VBMF$3&ks0wCJM}hgCt4A7CgC)>dj?sioTbWn60eG92_jlC+y|R z_lStx_;_F1{Ce?ULV6#c{nO#!#|pVigL71}7QMW0+^ww9>gp5m3C4Xd)k%EM8*?AL z!i<-z@;ou-qHmdNBM#0yCYjDt78Xrn@9XGjotpzPOu?8%mAyf;U`=yi)Zr+L$5C$g z!~`B9Cg~&8$0ox>Y;3-JZzO2(fUgC=a#XD#s9K#VRirrCGsVHx6{x0v;9)Qj8LBKh zc=k@HzOc~gWb3u#;luG_UJB7y3TZ+DbZfb3Z5;#2lV`iL##XX=IQ)Tw8K#Z=HJ&c7 zdV}telm9{>&;lwP$GXSz=5_0wWTP2x7yN$xdgbbrUA87F@U0VX~!#5-~V&5SW*cg{A;@-~e*_#(Fjdb!js6>Bu zd4}p!C+*~ZK6S00RaKek`tnboa!=HJXN^X>!`RVGvP0)#?{hBkg8OyqE!4(~qC&Q* zq@;uT%6AbG{dzH^JYE4NUcZM~)XHf8s``D^$TZd3d^p-|`nTmgd(`Ldc#Ws}*&ejM ziHV66`px-8MQvr|bSEvu4E-QEv;_LFyYJ%LDUIrIasUz5rAV1bx3?9!<{ggo>cVzptjM z3OQ<$^l^Q?y`-e1l;~nOw8FyR4B5L~AG@8+fTsySyeNGrzaqa8M!|UU>lH3m z2oGpyn|%%7R0LPOT+Wa?ZbXimnHeOIK*J1cT>6k~qQjRE9TXo0%34UTlcB_cyqRi{ zP*^2HTjZ)e^N3tIL*#CO`YRO`d|cea(_Nh^HPyUhO;97KsXf#y#Qw^64$~r~lxMit 
zX4hZmZk3+*w6B7xC%Yn<46}i-tz;m%ZHEq9QCKU0q3ols`OQ=c^Jodc+kmEJ1u> z>_<=QLDGe^rXBuB$ca)72oMG-_0geYIXeZ9^{02`WQbu9~21(pt_ zHW>pB>I(_#7t~f3F>x~Ou$%-8Ac(JbWfWgGzbme<1?LYRzSN^+k0dG#^ZRirS7d)!xZt}+~ho`fp+hFx$up*+*R^RCJl+f=C~x!6$8mEiW! zov@c-s=15f73Q4YgwL1^p5z+ISX1|sNKUtYN(CpRN_#k^u1;@C=_g&AYZfU}XJSRE z4op&h@4hs$tC~I9x$|Q`gO#DOO8Tv@+k3o^?{r9oglse3pa4jIPlaztQh`spVreOS z2qlu`im+O7WIHFT5~pjViut(M{aM(Hj>R9csL@pSVx{!3)r=4YsjtVji~geOmdMFg zRhVkNIwwwysw$&4WT&Vd7~;p&nVaKEN-j=dOC=+k9v&`xLVJ##`%mZH9qHN&6FV-URBfiX z(42;;plCm1SXM1SD(aeMCSC9QC0V;^0p}^%DR3-%#Qe~-J#XvAjYL9SiW}r zy#4ryp9BR_OjmR%lgf2B;+8BMwx`nNrCneXR`^WUGY81VK z4YYN%B4aqjhu(gQ-!^^TwNefv)q|6pyM)n2uZp80BF2`hwsoEX4_d_YSS4Gx#&TpC zf}ps0{{pX>P;2&!-jCmdJK=(azFh0^MAc1ppu|_Xm}$o%RAqF(Z`Kp+A|8wlacr2F z$GW6;LcIyW8JlwoFMnaj#IUDGw0G|Gr}Hyps%18+PHIGY@ zTj_i?X%yvQ{{#D{f(w3ZBz(HDa`-nsdT+kn?Cz$@%iaH0ybJ2Golhb+*nS`9ien+M%43i&plk^jH4 zzj^0ZX20(RDy=I&BYl-=StQ%Su0a1VNGx>|d{XnxzP^(sdXVDf>DN$D5N3{MYi0%l z7n687IzS zpXK`i*9%b|J2NvlB%Th42nIz&$U|WaJK=22%r--ID6`d>0I- z8993z@=^iQI}^`%KL~*UULMuybL+f3Ia)kPY9DFZAhA1bgoyX4sd`=W#+aO3=>|Q((U>^gxhZ}ppq&ZSj#rlpI@`riAnv2yXKqi%lBW2D=dn3x%# zW9CB^Ce|`9vmk2gt{J3Dn{@zzof|$ZUu_y5})%?J(WPCR^e9adOdn}xfmI)TJ?3s5vWb^SbO^!VVhH_TA zomn0shVvJ<%PoHtS<`5;B|hUfZut)WFw+O0@E(%J=i#py?P$f#Gol(2t~=G?(@7~J zX=D^~og`KkUlyK#LZ;5GnZ@eMV_uihrtMOp{QhWvfu^5L;aRHNH-B2XySoc&G(88; zc(9rhrabiupVoJ=@iC|r;%Alo@L`DhqUXqDhLK%-*MxGgU6F0Jt>B&X$19Q~N<`$7 zUa}VMh1rgW2+CIlSWZdJf9nvG_fZeoxkHbR1F0;(Bu*b*omogreUzbfylcbF@6Z0> z!}nA*InfP?>8%)Pnw@@gvm0~+{>Oj`JgWJot)G~zh%nT&icrtQ3vX8^J&u_O?=oSS zCVBS#U6BB>@H@Bu;Mzbx(K&y?J0Wf5ojv9CtQANE=YRN0R3%avDN-?lxXAKUGF9p1 z+Ql%a`-?_Oz5hKt$L`?=A?hk9TJiB=)o-Z<9s^!)5rctcMOj&+a;>fWx!&F_1CR9U z1iv2Bk2398euD6E$x*HIM9&_{N6*3wa}M3NbyZVP=!ax*{U6?}_*>k0M1+L-*{+D^ zD*t`OXt4wvxkJDm9B`AYb~Se}$dtcvEr<_~kGGriS&5YB)R4c30cC;B=jS*Q**Aj^ zT&K#6ZdG847aFDi_RlCCkIBY{6$(QA8EI6HDKQZd9WCiZ;0Oc?PnIq$^FEj;zwiHk ztFZHRG zy`@&{T8)N+R?i?=GdK03Ez+RLbKGiCi8>Yx3OcyjIyt7C@NRrGYZ*z@ZXthai<{e2zIe8teC(=7?I6PA& 
zlFwAJ2Fc<34*B4rG!Iol2R`21@mJf-uWlb*-yolX4msiTXB*`o_fW6>?;L?P4;GK? zgi~Ez41$6iM;lK)NqSA43e@WzSAt>MqBxsV&S3(=AkP|nv;YoJLvxvbkf9w(<)+bM zvD;K3oB0mgXErG*3_YJ$`2Li;p22cK`1trf0KC0&0Q+ULBTW+!U27f?AP^vJ0(HQZ z=6zw$T4*wA9ar?5Jd4fkQQkv_r-ZaLtXg%nx1QzuT%9ViJ}kh*%6kG|R8%i+IN0`v zp{*b$o{`Xhv-_b`-yoOBODb}&WhJasw@!i*M_yh_&sH&tG_|0n%>Yv!mL|0R?m`@? z^YfdoTsLaZpR43;`vOgEt*=Y9;Hz>vQfl?RDu)geu`ZWYF0iuLPn8`gjB>juV!jxj zYarLsaha=PCV(_Ar=p@FGE!35BZ-(YJ~5QhX*E0#-OE$@p(M*M_;y{li424gY;1N6 zs69yUEx9XMbxyCtuR^md^AbvHFoHFN(x&5|trpM3iGTg0p z26ZQoQXM8(Ambo7B@)@5U?&Q$X`b#p9xV;QBr7l2oRi2sVvl2!q@LmWDd@6xTYi@6 zsOUZS%YwcMDKlaCCH|AF_gX6WkR%eWr&9jj_)dqv@jjV~I%5KX3>?2m}*JuX1I7Hek z%fm3e#2J0hH{_m7ST#8fH_OAZFr(!QalkMjlvDj!e7l4?U03pM-{3-w`=nSQR6n7-+L`C#=5GJGdHopio=vZw zkZ_Y(rwxG+7r!|;AOlmXwl>hC-vFAS!Bx@_!wvBq5*Ogm8%!5Xk3E#4qOdYHc3y6G zRFG$ipZWcpl*~uTg;yF8LqbRh&<3rWR$JS8h#uCMqfh8jujdYF6Lwn7uUG;jKYWw$ zN6g7dQ=;giyH8A^{(~{-_1e)U0_-FI8Ao`moyEli6oL_ec-95#gNgAVY>>!S#G;}^ zCdWgm+QoO)KZCR*i!Q|zwn<|}UZ*QUCYh`livm`O$!vBhFJ@OI{T&}9<~Ee*?k{gr z#v#db6*fgov^z4Ta0roKzW%izL)9k6XF$gF>$S&9>a4-(zE0`97Fz& zWM&qG-Q%V5>ljs1>WzqKC>8OOl1?xA=IOXmLc#QHV)*BYj`PLl2so3uP+vf}1Rw!m z+aEm?I`zz8yltxH=|SR-fX|kQg~ekGd&u8Q74};0iSt55j*;_Ap3V6A(?zthVbv+| zQVcsJU`>uK9vbLbLBb~AkhO(}VeS}hYd2TDEdiZ=qSAR7XTN{{K0kLS3X`|H3tLH>8`Gz^STB$>B zX(`|sq)tvtN=iV@ps2`HbjO82)x>1w>TEwA(~XQQ61^Mgc6gtj6hoXUGn?UqV1uC_gqP~*kkFa=+*~?#DLo?b88MEY=33zh z_A*{>vzd%ei%bf?9FdpXg6_`abY}u9ogJgin1cl;^vS1FKgy@2A3IcuFwHmv&3aj$ zSSj~N9pp&BHGPiP! zbF(LB6?NT(jv!@Omj-ZLIXOjWbr73nfeE;eF%K(nQ^hj z5i%m`dHsYutFFDwu&#v7DN}0?Zej(5Qb%@==O#0jX4Y$1OQ|p(TNrULC8mXUnD(xBsv`? 
zm0=PyjUHSHed0J0l#4bt+bRY4KsFW+!7>W1NDZv4e5}xS*Ql{Xukc+CL&AqB-1KOqMm!`rR$C8wYjiUVo8W zQGc-nVVm%p$I)ha8ks&rM2FMfpDOE)iAX9MOiZ4lqUnZKinINV;vc3EN~c@zJou$t zk6xj|+S)oQ%9Am{q_55Q>J^X2Ut6%jW&{TYGS@sLVD=0uVWTk~I?smpZo!w-pOSn2 zV`3tI_n2y3CqLHRtBW&5nc6=ac@@9zW9Tmb3URtw_debTxJrketH+3fS0 zG7K1a!~7J*pR&4o4#05Nl}^Km_YgSK3@UVJp!XgzuYXAfET%B=Tcdl0>_<0V1NDo6`T5(T4G6+uB>&k* z?=*o4Lg!SD{+P()wVuoNa>rX&;h+9Nal`3KO6CyB=q)Wa^~=>KIF3QJD>C`l(rYFr zIomfH$e(af6TeuNO*5%hpd!0=w=Ef>w|;!6lq%r9HbWZp?Q2lb{QS+z**>RiQi;@wTt z)+O{lIUWB1xBLgm*EBJGYrtoNI|PJTfampi`Emp*iBBZJo-}XqzY|BVI65+N@4evOs$Ti_ntEEe{o0PUV{@-{Ivt~ENym`iSJZW>rBOKu>p zz~7e&@7}WX|1_(~QGo%~TBjJ33Tv;GI0w#VHAnG~L9`CX!c*b12P8`mOs71=0^`R1 zIKxprR?dO$&&|X0XLIwt(&EY$b^~9~-{fibSa$dRO)D-7CV{fVcr!^QqKI!!iu{kv z(2{HJLA@EDPeS_)J)DUx%E5uMg&nksIcwwyH(wbwo9(S-!VeGW%H7b?Z0^^gAFTcN$%h%&hCLB$cztVvH#4@!M5 zn2n7YLc`?)XyM9AufydIkT{EapT8&8m;cOi4;}qE<_(rruP-LBaC&+#ui<#pEY+W# z(uyYG!xDCf1w9lBeSOTJjDs|GwA~2VoS!%vMAC4;P>_qXm#B@8$l_(EE&wCs>S zAUG3uxO}F!Ei8WcmWq-k?$!*Gr^fFI=QFZSZq1(M*;3G}U2RWVH%q=WE>~y#RIX6Z zrd1}<^xEN|!i0gI@qV_a`;_PLLTswW-s05y)#Heqt5YFiR~9871FR4e)Apa=)@c^VDG(P2tgRtn>D@;3n5GGNsJ@pWF^JF zer*6q9K44QeLp`iv$ULo1F^nd!VDL3O4NuWtKZ$A@iwsj8k?HJC#crY(eZ*#z@9UL zwI8*{1j^$ZE+6rVUdA8VC2 zns8VVcFN{(Dj*g7dc~3J{#ok-*|a3}J=B6{ykAwHq+*%0FE+Svl@!V$iN3h~N3N~i z<*#Xv8{vU?j)rEwF|^Q5$d=6eCSA}iydZJCw){%^gTwe}mdLx5fef9wAE^LpiUu6suo*FQr^dsmZ9znAU7-9M=OZ5C${*>kK8=TfZv`43x zrbf)04zH_djdz?o=zAPAwCW3BgerKjKJR?0D=<0>L9mnK*73#^xXM^Wi8`M?8*Zl& z+@m55TL0#E!zgr&-jz<59emYZ!1Y0EeY@H&rmDh-w57FWMQ;3^qT)dL0~GtzdC}c> zCk_vo0uT%d57t;hOx%|ZT#$F}e3k!vo-XWF{axZk=v{ZA z)%ErI_NR+CeN3c=$Ou&LpY|Uw6jEX@g?iq^bX9)cAW)B&hoq|(Ha2iKZ%;JXOPtRf zJ+3raIdXAk6w6GJ_|@#W>ll)^eslo`kx%S9*1g`WgWVb%8-B$5-1RI7Q*7@jIG5!o z#foQoskO^EXmD_p`e=8)1}0LDE-u?g-4U+0TDn_w4y=#eMCWr14^Z`P`(vnhz4ycsq zPzk%851w?;VPfz39urPQc0HPFx|()7t)K~cd~|UQ{3@EZthVu(n(7i&eEPy0vq@yq%Jb>(Z z_@Q?u%lY2YSy>f?v%+o-%-;7KHfE}d zKro~vCr5xDh<_hYT%5e8qrj_q^hz*%-F$Fs;)UCuL|nIF*lnG7h(-4A=#8Ofy{LhD 
zM-yzsb7|?koS?UV$4MnQxcr_A*>vh#Y|InR0{+Z~gQKfqW~=suBZ+${BQEaZgrsOV zXCrlXKUb0B_MnApLNLyx-wyFUC}<>><`7C`r@G%8TBQa;TFOI9HTFI#pEIm%SHn6` zKP=7H?8h)4+UE{g8c^>?xeJ*D_Z}nK9DIZ=+}iarRYvooPr6?!ZJOQA1OuSGt0SWM zqkrbuA3i%FsN0exQqDP>MrLLl+}h0~vq*?nE?3<)a9oI9sH{Bd$Rs77rx1O!Ed|g}j`fGt24Cmp7(Hc*TG3#>)#1ja- zQqkns(s4`km=)qsA98Rx^`ha-X^vt;WJKVEvmil+W2j@BEC8OD!+ z-u4(%^)?-gE%!-`H}gj_N4n4o&`Qz|4)Q*;CZBB%YQ=t#wZ7`rZm*Em9UVkMDVcxf z_($jMJp{`KR8%avC&p)o8z6(o9A&r#@)TV_aaBPf=hv$(=Ewl0_b*zksn5f$xX{6LP zG@Wo;+)iwR*$OyqnsaDegvUTShFQ5nGVegg+K?0N`cQiMmoFc}vbYkpEfJNOnMS|9 zP4n~f7Z-ao#T%QO3jjzI6p?=~kTEeMV`3D+BMgu)c?E^*YyJn3Vk?KwxVXkZ8`SKQ z-W;ev|KBRP)%q?1Jt%7^y%Jhp&}9foCPM<4xVX48G05LM&ktHY{Nd_jW}?F*N>zfG zdI?L*?ifE4M>*npoWPaw<&a|vW0~=ylG5S$wg8t82!yJ{yHvGZUOR3k6-$$`_@}l$ zA9wl;zx(Xhc|;by8jRK~^hZVDgh}XBe`997GB%RUQ;EqGceR^a=X&S=0P&m>=Wa=O zWJJTWx8`0ilA$vd_Sq?f*5zk!FO$QB9|$w-s-=ov3H-C=rjL~|=0M6u_PF|WS6TUy z$7=?Lyt1++$e1N48em^M>WH=AVl%+futRHe@Vuw}gun|!EsoCru)Ms?W8>e~*9R#IfR*~ps`H7p4y0L-`m6D( z`ko97BjJ2)f|C%m`S3Y`Tk0kH-a%$^b zc#BX(cZGErgXln+;!Z4Wv!@R(f|7^MZT-WFVx%{XiP~(55(mfRnp2$$BM8|)_9e5^ zg-({73ApVYC-LltAt|TPHpxUcqz-r9dx2wmRjtEUVxg3{g@yRSJ^};U3V5kts$g_& zS?SR&!O0$IwTXvHO7i2sr{p^yh~5&gCUuxUfOA@${vjvleHvGKNr{y99XuSI?{8*1 z8F~`Xu*-47zV^so++0Nyi+F#^B4Q%9pA@$r-%P{5PsRVlYwJ_?%aMr6O6Gc(x7Szp zuKiyQO*+3P__UCM6nuCb`c% zSG*#}Y2`Wm45}^?rT0Wy4f#+mNE$G~3#`th}%Y1w^qCr@4?* zo^{B#1$UQyBw|#}z8v*8xz)P!k3wt+A{4q!4JuE)u?hW8XU??$J~?`)81$Te<7242 zIByS)GBXS_5oFQn5>%ZMQuC2bwR-7=c?++mITG4tKWez=*X(wZf36%=-I=@kb=Y^@ zvpe@ZM1oSK0LElNMp}*ruy4RJp|@}HeN*tH=e=0ImLZSk^Z09eya7=CoL-z+@n4qL=L&uFXhtVNMPXBGy&pHk=4E@2&ACnJS@_-QBU-5ak(K zz3QM{SY)%%=#hQU0jbu4zW*A*HwZJ%j?PX1->6Y8Vv{}_URW^D)I3{A9{w*4(r)%1 zYAf{53h5%L1Lk*8Q5(I^^r>U<(9c@}gyBy=&mMu{&)o3T)M#s5-x@dZYY+n6{?p6n zO>cYeZ-aw9yvLd+*!=Crxp)gR{>3Pl$-q25_q$QrSp;k4mKz3d5qJz6_PnyrF&yuC3MfFg5%p*pz?9uPK#M zS>4b~{_=&0h$vG;WIXG9I&S*A)key6A8!zrn_CUReLBYkF10k^-R%C{aXMH`{27mQ ziluP(Zoo%8C2i}MGTnCIl(KxKs)|SE<2>UWf<&&DoTRNxs?5nbI)l$iwWM^~SZCyW 
zH1M%!HAh~9>A(<+5&yms;VVANFo|V*8lp;^|BW4!HRvlQ=qk-t&4w*CD;gS8e!zj058D8LIKIf$mrnY z^!MOEw6NEy_rDaN%Tzn9-49BpVq?Ri z;R~aT5!2J4#@$%;4*T!{0|Ubu^wi$`1aAV^galqAPM&5!zNwF+zyEVeKx(YPwkW>_ zfq>*I9tdR(3=D*`L*x4+gZEQApOw=qz}xkEscn4N@~GV%tNMu4){X5lDC)G#ab*S* z2;XwEc4)dKWf>V#1_op8f%k^-+DDW>?O%@ORBk*~c*$M|E`>QE$E{B^`LF{d+}%B( zr%`*oWVh7%ejrmCn=GwxbM3LQz6S>3d{2lhB@Pyu04$+|1gxK;wm>1B@N{8+*rvf| zvSn*)3y8=#VP7Fm)P#v9MbNF&B;(hI#5c@WC3=yU*kyXl6ay zd&6=YT2D$It$Dh6`z@Oqy~6v#3HGJu^)Fj^j`!iJfh76%_I9PTPg2<&i3;N$uVAl)PK_ayxJXkPqxI$-rW^ngrWf)4{?wuG*_0AW=>tS}-L<)K)>+fI#2M0! z(vG0F=zelRT|vT}X`dQ))xj>g1kK0bJUa_5(8L7$_wOuJGU1eF z#omPe3I+J2sM9Jv46rp7NAr|(txKh!ixFd5zV~etUz#=etbh7$jopmZK>N)mhVqsU znUK)L!1CgN4t893U)E4)5HcA}oL(MVW?Pb8H2J+DF1X9ob=A2?daaA%;$&!NY8|9n zEgw>2fWP-o;J?oCY1#Pkk|uu!g-e?ztQRG2l@dM-i?Gm1J0r(C)#@VjGbN=A<~Q7)%? z?gxp10bgMztNHdL6o5~>x8A*z&sFsDTyDQ7832tCh-K4+NnQh#D=&H${OTjyx4<}- zm3^{~j3bfR6k0aDQ(JouTpEbw1`{@mRvF?=+sYxnc~7$+ESgXLsv;1;$!l9`Y-$pH z*RT4G&DdlGMjg2diNp^d_N`@wNjWGg9sz*}p6XasBdyPz5?Nsu2!tWJYwn2?T9Yax zk!CtAQ#4^t&P4#4rRL-OworZ>;$HQqS(dh<*a*9=nsk4)+Yln&Jzb_w zi;V^bBGs8-qybgMukC9Wwh;$dhPWT<@f_JgW!z8HD0wC$l z71B;5&EFUr#yEMw);S@e`M)*2perM9g%*nokxQNK;Kr>03ifBY9Jf7MV5Le?$9i9% zmqVH=L`Op^Oz-4Vf(W=+U>O3-Bj8xVZPaGUc^8ZH{4w4T^z1-8RD1Obn&r}e2I|+0 zA|m=mRoRTpMOyXF1==OT`6VTXe~JIzcx7@Pl0}C!Hma|x1Fw5;;fC9BnYM!fx`(^i z^AH+4H!|`qH&{>NnuQ=DNb$RLY*G@Tn`91){7~9@BxcstnHu}IWP|LlUuo&r6&xM?>gv{h*ZoT! 
z;_Fy)@NGZe68aDs2@{{;<~ztQ009Q#cp{W8ySmVmyoD%u=JWEYgo%~j zMIF~mBtPa;a{D6NDw{rD6ZB*`JE>oG@nI+mSN2bp1SlTO4VgR#y#>tu9G;OT^xgcs z5k!3<85C43q;Ib|NiHl9gChkfqG}hHgR}6g#5s-NcD@vCi(=kOdeoolCVh$?_BrTc zfkx=hHN^Zl>U^jii|V}(qQ4XAkfwF5nDJJ|F*EJ*{6R-{(ylf6fDy@P%8zRy-$JbA zN_ZBvbrg;^J;QbJJ#_YzxN% z&wbvjAwgo_zEf$-j*bpVBg35BTr89DPzSHBV8oXNy$C*6r;sRB-efY59mByZ1r(aN zu*%2ByQ0sVF#RcS^(p>6efIzq440Rl9Tkm*g>w951wZ2$lws!pv%OC?y^M?N>%nbH z5FjUDRjXE@P7qOLdXHE3@UgHFXjCnRq<)Q$Ke3m;O9312(#tdnDvYr8B0aySKMC23~gSj#SWE!wm!3KKG)0jYzkX4w;6GnHLRF+dO z(Be7v6YfK<+Wx`=4&yO?egkWp=-aZHSxOw4(J_P3s!*{g4Anv(ziwhu9(6|5Bu*(w z$)Hy1yKpAiX)L1X=;UT1?Y8$wvC^AufA-_4GCq5U7XDaMTYF{>iboGWW2A2LL^3}rt-4`xh)gW@fPgMjZ}8K@xvT z^5~HJv`@sD;^`R}m`V1?kb{jhLb$WO%I6Nh2}(*%hEwn{t2SuvhL*m=vYm#et+w{r zfd>yP8_|H}6{%D7a!hON2wlIN_j&bSe0T*(|;Ssn*6>nY!b!>HGb=Ff;R`qSwLV zVDT(H2J!e!m*aV*MJk9_wW^&HoVY&B3`IxomIO82m^2LUj#p*>4rh`v z`u#&@|L~Ljlxu8q@*>;8*ZaC71luHF(!*s}AU7j}S4hZkm@?hts4VB|UkHcm&J+kV z-##OS$pM(qr)Fldn5;n6-LY~A0$arUND{sukl6>efnE-+TwyM&;=hMmplp5e~L#O z`vmD#HSPs}z%j-pyNkPOPxcHGA@Yy)H$4SHh@|bU??yOF#?h{v{QjiZubdOwNQB;K zA3gX#n$9vR%e3vnk4i{LcMB-pAsqsul%#ZnAdQ4{r%H){fV9#fEl48`($XN^-Tj^O zuJy6z$NXRz<-VUQ_PO`5ZIrZPwQOWFx^R}{ZLhx08qG4br?!tqvz&*c?R2WvR?3}^+Z7`o)Vi_iH~YP3Db7>3zvAaHwQl;l{Hn-F z0qyu|`pdwGu&aTml5~Fc73w~Riunz82{Q}?j4dtC{x#nlK>;JP zL(KYbA?&tH80L(=D=%KWz(I1!%{+VAJUQtBH^opaN@%1;K?E3wb&V_?2BVjTCM8`! 
zjtuN|H_*3;@B#}0+q9> zcO*z9FZO+Fb5)f-5r;vY^%y&pc85QQi6mY)rCXs4?R*4z!fU0pDXj;RZ_6lfd;WQN zmesIJn$J86q3>3)#0ou}_ntiO#MRNU-M!xam6Sf()kpDc{O`$zJSdM555^1QEDPu)82sh+-!OKDo%*e<>5D%sV9CouV3s$AKf zSq;3|8Vn>PuQnPA@|FsXJfrKj1hi+W*x#b8oE%B|n`RsKgq{S}L~HX0C0MS*Bj2(c zJJUqmf5*_R4v-$sweC{C0!x~ap-1*SOQCLOG`^{@6l3MES%eH z_wp#X2C#J**kPsg2|k8c^i7l;)&NKs^b+{{+g&*YZhZe1R4Ml$*UjJne-A`qo= zObJfID98~)#3=>QC$fX(IhV0Z98#f`m)vt^M45ssc;T>+SO|>6J48feo+>u{*s!Ri z#BuBFXo9C-F>U8Erwj!wXIPOTxueBRg%fte*@d+}^{3g2^Y1AJ<^m+r8OOL}9^!Ng zH(J_(ZOesarMohCja}5^M`nM2qUhmtXN)P787#j&KR=8Ck2>T!xJgBfTRtZ8tX}nx zXgj;OP(s%ZP+-CD6B`}vu%nl(a|P43FA6L07lcAGSE#*OSjZgsDCBeX6KdnZwE#G2 z=vV}Z6I@M&`eV=8P%w|&j#eguI}c+zl=`yr!e+|CCZz?j)Wiuxj1u#4CcmI9T2Mz; zNMS~4e2X0178`B4gP@6)XV<9r9adfWZX4HDuwL+HBZ#4nt)zFwM|8lJ4~Dez^4%PigE?g`_`AMEaY#yS5!q^A5{dkz9vscAU*JM2+OQ&3RuvxW z_ifF(ESBTO#1s5v&`)Let^DA++RrW|7s>Mp+lHX*LB9NILWv%l(9^dqMc(bq^$9Op zoZdHxd43X+w#Bu!9bT>q`{-D`ROwfY^+)1?aRE`96S3N!G$f<%HBO*~pM9y$L9UXzXoYl4jyDzY9!S;U#Atok9 zkH;kP-xmdf)4ZENT)Yu9DQRixqm9uIt@9$DCtEOQ@@@3ED#4|o*jik4TIo%MQn!|uu4B3?PFk2;S2x_%u zQIU&wzfniDnGh248}YBKywcG%l_$KuyT4y*=S_k^R3$?!`_b89$DHaX_x<`3xOngF z4Rq(zB5?3x8eF$uDI``kew-uUfvsj@V&Z=>L8!h^$S}ECXv5&GD0SV|F*kl(5r`N5 z8;m~Y78V#t0YHzr%ADL54}g0zoDcWUz3lCG;oAQcXKAtJKkc zVN9~YX_p%b)h=aa^V-@Ic;O^!-(l*F2b=^*icG&HZ7&ti(Z~K&)`H33CV|kb2 zW)KfHjf=ChA6l$r5kxpR>B^lqq(j5Q%Wsi$kVT1mjVyn6bTDPk!ma>Q5b>W|w&{bp z5MZ5ayhVhtnOf#XxpJ6almW;$&j86>#QkYY1hiExdxnw~~@8kiPVz zu%5nFc+7wPrY|W1gQ)lH;QQ-oH(3!XpQF8=qGUmZ$4)Cp@&Xi&lNIjG){mdbq$Kio zQI#+L+!sw1e|fl{SXhTE8C_P!{W$i#Jp3DIaZ-U^yE~x;nua~g_hZ5g%baslQo;kX z=0g;VSv3F5mNVeR28Wk@fzb+F70MWl1~Unh`Kg4w%#vDmeKyRxe;E5JpR+z%o}j1p z{KzKeqSL>$)X~szc=s;Rkz;Gon zwq9ISteD8ZkH?2@<{KXIVS^KAQFJWvuJp^iGHgzS%cbOB-iN<5w&tVn!ZHG^Il!iwHodTYOhM*un z5|n2+vaZgQ@{N6G+$C^R)HDdPoicJx8b6&-RM zrgv-lE}w2Kxe)hMC^s!;a`O;i+`%*8FQKfaW@=@Xzt;LTT$(19x;LL={yhQ_Pxkwf z>jf)bnD(k>y@=W002$yH!><+60PvE-@N3q8ji7o&*Q!F_D^Mc7Df`jKHrKPvOj_Oi z=m7iO(ff#r<>T^7ypEsI2!sjg>OiT6yEgS$klkgU?k?|50rTM)r^|-Ag~cj72tlV6 
z#`lc>F`mFSvNjhU9^NhD_W6w9K`Idcka!rd3>UKiWzfOCN3IT6PR3yajXVRs-;QY{J?8eyI%_?769 zI9TQHhYlY|d0;IX_!CfE!)wcslWNAD*k(d0;wDia^a$m_gJU^;@|1yx7#JUbG)`dK z9z_kb=UM8Y0Dia_UtV5@hVHKHPsY-`pcm0mjFS7#u3Kd`QO4h*`fOwKw8Y8gQ9QvH zJ+`9Dd=;+~Jpu$2*+R=iRaqu051p+PXFU2cS5p9F^4f&oQKj=p8%3{^~&%+F6CS~3NK=?{&Vn4Ypvkwe}gq#s*X56Ih-$BwV$j^sw@oj%+Z@Y{e zUNM9FPTZFrIHSLVYV5AHq@?;>Hc@+)TS$4@;rxqmQs7V4S5`JMGyB$i`%Re<1%?0R zC#O)X@=Hgnu9G6{)7_}`y*kQ;{ly@j80N){h6a>ydPx?izLON3(y6ItM+73hG={i> z*>bQY0-DXcEc*G}|KAIs<9jCs{UsdyU{f;UXSUIEayrEFjt76_xh?heCA1C$kH+J8 z!v(k!h=I|9v0v}173JhMX6wl@rQVJj{TJHL6)UQWlC`$)&A6aI=U$8@zantZe6&a*4K-MwrbbBjY$<7%E2yZXvnU@lJ6`i zu^;@~u6hFdq2Ba1pb?}~U&QyA1I0*QK_pl(OSv;==*Q-3vWQOOYT0FIMV)W;93G;Y zbmcL7a+sWKZfxwiyvWPgOk>-dvRVXwX< zjr8GvZv{i1X|2D?<}WG>NQsO4o-rY0vE$%Ohbbi=?@_GfQwa@phRn4=caS9C_r66Y zcqk6n3M{^V;MkLmAUp0iKuhRJSrHA_ez+U3*5y}MS`$*hVDwM@j2VyYNd%i;C_7Z zamvM=?xsb!itw_+Vf9pDmXh!izX(`F zJ&uULw$+|XVIFLHI3=jOi@W$oufF`RJylKNcnB>U{=C(5kDOvXb;D$!y<5&}aEHgf zM-J@kGA*1EGp5DTnZxh(#ThGm+b5ojab>8H*obnRm)K>-+^~aHZ-aC<1MI5eYE?I= z`w%?^BVPRH&*9Q0h-k=Vz_fB1iaaO}Lb}9wYbs%Oyod5HIIaiEp8`1I1vPmhFc6uS2a$ahKs{uMi&c3~f z9Z9NuNKgN*w)WY2XD%|fF{=tFyS~1SpfQgA%@i~J-Pvh$xH140;*hff@_Kqzl_a|? 
z%;`|DC~$Byo@&TA-pKV{|2nnrm}9nOe7Cvk>ZMo_67pEc=?9eS#x=~N@p6pOFZf!Q(G*5jctpaR%W~?fRM1WTMjt4Sm#~vmuhHl@9ma5 zDbgLwQFycX4vq}q*m&PxQuU6rIM*m4VV{mRlG*Lb^MnEhuE*a$5bNnFZf?4#iTwqH z9`KZ(UOQf6;)UOj{eg-Y82AQ=Rnv|W`zf)O=0VJv+$^c6k0&XDYzw~70Q|BY7OwxE z0>S$M`;eYx*g>125b}5UBFr9`IW#j&p$Q<I$QtnhbymL7Hj#OIT zuFH3LIR|`QeiQ`W!vJO;=|{?PYPw0clinV|hGb!p$y}A$tY_d+ssH?`lR>(J0DP6#zkgfv zt*%ArWXOntu`wL)((8YFyNG))FM5k+Y-W74Xg}&2El_Bt*h&skWLd2Lv2kJ{`(!B@#BN;hilM># z0iv_3Yj81~D7kw1ttI}8O&gC8FYgaHm2ih6bvkTs{dHQbU-#SgjU$-=qxp6{+(Hui z)9Buw|MkBYR}C!4 z`iN^bS{wP^Krp=^MIC*2id@iC9pcIC+(#i4Ha)^RYj2}Rn(30yq*d)^6Kq&2XD#7U zFcpBBo)W)T^-Q3L>X*cJ4&4R7GFC^Q`hM_zp|R3TAmRv6h&ZI9 zX77V3>#@QYNN~)4`eXtoN;vpCCkc38jD$RR1WFQw<@WwHcXwrzxMA%dQEe7|)%#`I zI$h8+VRjb@kx@{3WVv)HB|BMDa@YT8<}%(^Sct}Rz>?SZ)-NsPJ_2+uRV^)R+Zn3& zpSzZ}UdOax(8S3>Zp&wHPk=nRil3A;SU$ctBBJfDI1}rY>)wD3@mLiBhOlt#7YzA! zw_QH|*LM-mK5jW08RoY*7*^T%{DU)(N98_(oZQ{Xn^^DH2}uf5n6!zO*6)x9lJD@v z5|h@I^Y$vOOGP~doEr$T^YYAnd|bwgnIFr#YnK-n+xgri%(64964B%NXZYiz2m6Y% zUpJ2bR{;(}bW~Uv4mxsoTBu$mL`9%>{S+q3kXb1zE`DQc3yhQa_!S6%1<4oARJ53ySOhz|9jD4FH;5siY&n43QN*C#)Rjx zXJRv4WmX33|Nw=^CdEsle>)6_F)}fM zUORtyeovzhi$D2+BwOxiq=_u-MPi`|Y}B(kQ7#%9PD>MG`ZgD?x$=9um@pSQ+pk|= zKfi}k5h45{e8QztpiR|)mB_-p9&2VO)$O<0kk{%F)_tvkoR{Ln2@M7YQiIz}OyN8h z#%Z&K+Ej*byZib|2mO+tn3l|ZIbB3Km&Fp#&&{3v`DVG={^sU}oRCkfkLmuY zFO^BD!16vm8TO}pzrXxG8hQJJesfHIJhXuN#KM)K=O4P9HD9R=T~zLEO`A;qX@Dl8 z)EM#5`YIumdNveo0prYzhx?GOdPu)qlsOEsS5%30SAR{QP_aZYy`r1mz32h{gPG6O zz&rh}(X#xO6!Y0a3ernF{wI?f@-O?}32lgyiP~+gIc`$}Wt-@6Y)NM3f{YGXLSNN8 zSKwmJC5)AX7lW4&((#Mnr@sfzXi@Z5(#1e(t$>b^exfK z@1Blkweg0V0Q=e6$;9QRH(=H zc&QX40Tf!C_a#?D_>&WpoRm~Y>kEVs*VdfCZ8lf&W|3ohs{5=Nt3|09uK8fHKtc4T zib7cRCB14?>AU+&sMmy3^X?)Bpz?=yeSmjzbd=93`5`z10C|yD{m$80Jxr11zfE2x z;{_$iTOP^hvjgKzGxbqr0^=#k`#$%QJA0(R48@2^ME!GD!&K+Z8F9-h(W`!=cPcehnPa;k1l`J+xiTyN!2^l==Db6=?QHi0u~SO|0YyxYM=` zI~{SP4vEvvU7cd@-y(}p^|!Vz!ci|n842;xI5nM#2@;8df)l`-Vr6EA!r`|t+^1or zqceq6uz06~vJ=m*ni}m=Q-j(6qFr!;*r{XcT^F1+QVLiHw~cdy0~{ 
z+g?*k@CTn9N;C%nl8}et(sw1(Ta{<}(Pto};G%~89}_-@QU2%&7igMrX@Zd((jBX~F%-|)8OXE{VaNJtZriYLd6k<-S}=P8~l9llIeDnrj?GLN-93+Q6?Gm3o{IDh>8^i$Ge8uQ#ke#labYx+$2^;7`$-I%Fgcczn4>Gk9~f04nY=o zZSB^tHTMW9E@Yz`Y^Q%*m54OorT{Ns5jE<-uZZ$vuQ7rU-Rh5qcw)AAQ^a7UwB{Oq zX?|Wt`0&stpQVlGs$|wri8kN$qTry9PJ$ELB0!swMMwk3!Air_>)&)lIUJ1WoVg&eoSH6UXvg><|z=Jf#k0 zHSA_%e{{zkL;A}KHT~sPv&yF^{_)G2_Y83nDDnym3-j||6_P>X z=yOa=DVTod3?yKSgnQCQAb$cNHAvBuo^S%9JR(6kxEMW&k&V5V=;zQj2i7H(NyU`|ADwfmB^vDr{{p|b@rLI z;au5o@yXV#secdK`}$l^v5NW41%(BV8(df#>guGg&jdt1(%eUq#HKd$bgMWY0|yQT z#pJ8IDlts3aD?oB&CkhcUPi~J3X^;B0t#)*^syg2dBTD~B=_B3RLbcl|1GG7LDL4D z420bclb=xQ2>52DQmLqYIN&RJJ(AmXjdEgGW(JP>1 zz^Z9zC;@pt7;PBVcSp-TV#EbI3oOQH_>Tny1^3UNicsZz|L%TsRRaJ6A_9WF^SiyM zXp;Bi$0jT++kBE&Q1DVt4htRqX@L$fwA3|B&MPA#>cqUKPj}ij&aaRM zEGxpoq!4}p*gZD>&}`%|zRd!B>sjy^xCXtH^c51;HM5B0FyP(puJl`=O|f{{$QVDv zsH*x_>M4WQtpR@B#Yu+LHh;DHq-*t5^^DNFS!x&Te-3-&j-r)cKG%D;9v{EKL3%Xp zPE46%Wp96R;~gW-nl;#o{}_`>)Z<{Gt;gqADp+$wTv=6FiLyw8kdf!0JLi}AW+$8O zsyuz!*-y>kY>My*jHda2S+c3?8{;>)%R9c%SB$LvaU~`CbXI8|8&BZ#POH#i}b`zmJ&ebMl@ZQz1Hu(HQoJ{ zm9$n{va2A3_ULF<23Mvjk`fmO`4Gj~a*{9q|~INcEVtRLq& z&HW%R+pG=JFdP*(rJht3vg^EU?eqOwbQFT=gPaEhBE5vPZkR#-l2W>h$34VEXY4y( zWLm7QQw?;TsZs!@(0z3lG`|1!frjz@x)MTGxi^Bm;U3=!Rt0>~f}hUsFSRY+a7W+9 z#+}os>@#;pYSC4i}Sbb zGp`tB#t@pdos_Rf6-mE;yB;$$$0en-f?TjZeRNcU7H;iJ@qwhu+C%wYL`4OB?2lw< zUyT1*+OSH>&VIWRiiga7vz^SA^RmExqRjJb&(IL&+cQPFd6+1HZ6?sn0D#b|h41yh zPHQbK1%@rZihA1B`{-8~XIXoTno9kPP)zp8;GdPfiVg0$O#_8YCMp|(H;*254}BM} zv@9yQ&1)?uxa+?MQzUP}P>^Vf^lEyirxgqh8^J3CcHbl(3nhi#iSF*Cm6iC5i}rwk zg{G!-FlNJs1Mx~A(#gk4;~9#Dggoj+NG9$E-#5hYq zO(YQpJBK|Hy72GRD5@8S2Rf|29~)8MjIa}Z5Z0s`8671jKmSNwpZ{*htVnC(`>N@W zO8*Sp(E(-eg>U}a+UDzTwpC3S-=|(Ko+tFXJk+Inmd1Rx81?N>lnU63dwaPNe#(sT z&$hWgl8-f}|Kmwual5P%;NyF8;{j@$#!{C|zk$t5_aEyk+usr0Q#eyI$>ru+0}V1R}KcjGb(?-k~to^k>P-%$kWK@CC}q$n++!)*1PihlX0Ks(Miq z%KNG${=@7cvkxu17C9ch{!shMyfMhfLC?)pei7sgyp?xJAg2gO|1Ep^ zGm?mn%uU9&lIXMji&w(0h1^k^9jYU*pHgm$YSyfXAzo%% z^$X*?11G)Yz;Gm4*Un`4aAjKG=pXS$o{jO0ot!(Z4hz9N$%^DTq^(*cO*lWs7`{4Z 
zUVPu*?9Ap+TM%WyG7z%2+DYj5ZmclyyAr;=!M;r!gg0(NcpY_es_E#+8L0Ih(>Rhi zZ*U8VK?M;METl4qA{fBMpy9VCejx$#la4NSgUjYCs8`mRET5dbh$qsqwY`pISO?n7 z=%_|sUKA?k51QDi!$4`;<^kp1=D7nAKI`;O=jNJ=1J2gKCy4e$~w$e%t+=BhV`#M`H3O;klPmA%j+QK$}FIU){Xxn zDIBNa+1EC!l~jSrRP$smi{=U9kt!u>hQJT6KQq3os=K~yGU6P{O@R~{Rga!tc86h4 z{0$pUm^5uTt5pK6*`~9Pb3?ziIT9)*)iw4g_?~`7qU+)?w8h5dI%6-5j zcD(1julKLctEL~aWWF`4RQsN0)7q?Hwi*CsM}{^70*v6o#r+nF%8JAy2!8sIDoR#mWV+6fff^No0E5#sObo!k2*?<8 z3=GKKXsf6gTU*0_$v8|0Xh!qu7tcH7#SAMvj+4MPcjSfzeuh_SYE$JFQ6r6Aot>QQ z>`KML<;3931*z{aRFd-dVkJ-l1VG}1OCyXDG8r;Flr=QiMg^HzSneTMb@6h(N&l-v zdlt2!ss7v9aevJ-=QVXq^5t@q|F%{kPRK)k-MZI(M6$9Bh^E>4d5n5ZWLpyP$p%B+ zY>eP%;6!)j|M|lM5rru_4BOQ|_ik=d7a7z~v_d+J zjPBFQ%8G`IZTr%#w|;trkBVu@!TuDd!g<1N7kTwN@0yrF(sTNj`5<+x1do7UmjmB| z7J`B<5eRW{?6Ef<;ZKO54Gk$m@Y-bauYD+%%n|4l7^&aCQ%|{s^)JCW48i8^L5O|7 z-nE#_hKC*ws?*b73ky3fz^R?qTY3j!#(h4%9*Y3c=@U;?+E)dx=z2a??Gii|nqIO5 z9lgxL%oQGAKHF6TzA*JarYCC3N_MO!AA)d@9x^Tzn;-C&{Fh_W&>#$%50b@pwHtTd zTx6gPs4`9uFQ3k;4s{hj8Wi;`l#_dGWhG}Ai7-OEx9RGsR#8E(u56LO0lhx3a#44G zc9$>-2R$qIzD$?ZhUx1*mtFEFn(s$Ku`^GCV`Oac5H#j4-c$R(-WnXz-@ik5%Q&3# zPfC2`4k8Hev*Ii9L&jFC0UCtg%nxJ#{$bq@p4<&G&DqAx7w@+Nkx}lFqb_>maTKFZ zPR~$g(p-A){a|lJ$GXRfcSCe-f33lVKnOLu_jHr8oOCEUIBZ=apCBDIdg5Igegwz3 zpI_ziu?rlHn*8p7KVx&h;P<@4^+h6Zz}g1}aG*D3AKuOH{B%RqRG6*O3moC7-rotp zmIfpxJvsB{#r^(N(aVQBv;GGM@8Pzlm?}~O_H?+~!AvXqz4GPxdDl*`-E1Ufsqp)! 
z;4OjNLHs&v&pcQRU|WVXSg8aJV8n`x)jVzFFG-C9bvIYcaPyGY~t# zwQkERa;If5c+Qz~^6@uUm-NRkG&Wj!nY!pVTgkDAQ+}mI_&*Y$;I;8Hd0jMH`sKH8 z6$diH;r!qnB0{@Vq{wfn%I5r?w^=)a`aR$fqB9I zR{(dNyo#qLyH-Ob=6wfC2$F0h$)FTI4NBld>0QBSoPk-Sq|O5(A^7HHlYP z0q1J^wj!cRZS8Eai4}}YF-kI?#oCqPVhyVU1+sNs=PlkLbB8hv^dknHY4yddb&ItxJwCHo^Jnr?VWo8 zILBhQ6mWE4v+SlUE-lT2M=SeSs=?cURx&Uk0F0*PKG&JU79z(-xd=pta-I3%1Juxo z+n@I=fx0wX@6TmHoSP51sJs4_huMt&e&V)Czv4#W+$CFDn!{n}qpS|7g6^1Nx_c4r z@w1`ycs`er+HrAggE{XbDK`~~^i@>2nAUFsQJMC~X2YfND2nOW^YSxQ@{*FD>>la; z%goH&J9k>H+U8m2r{cO!MM-D7fGBY$!2?pzMPPA$a@?zW2#KKS`N!E z@@4IkBvoHZ!nU8+4omAjI80+!AYS$63F<;12-4g1bn5-G!WqcJ_ODq{gLWpF>L0w^ znuhqEF`wWY5-e5q%{sw|hd!|}^D8@82xV1OA>2z;gkQ&E*QdKP`(XdPr;m)TSK(5u zu95O%+kv`VMdjehP^)ot;)M>v@3lW?VbV^yi43*4D57a*=^JWjRoa$pdn*zkRXfKo z=9fR|%T}=;nxjAJ)+gPjh*=5aoYLN#X0z~S{y@nS#~J}CS#?@3tHLJV(|9zj3Gc@CTL-YcL7yLx zdxV8gjphL53)ybKt^-(-K{08h2Ybc$4oE#@!cu&#o>>&ZcOkKVDWQT_K1Y|h_&dvx33F81ffR3(LY!KPkciC&!$wOyl zy}q%rP0EFjAMx|BCyH7=nbSvOmp4@<3rwNZPyqvcvbQ^R?288l36D(MS>c(pvC1JM zBygh3U#Y)(B@u1N%h>JTZ$GRW(|AqTc~->_b>riu-m;N>YjcgFCHdLU1c_J6J1uSv z?@#;Q%U5u7J3HNdl%lo;k{>Ub;>!|w18sf%V>q-8<%&YpNY|(DZEfEpBKnGoj`H)9 z;8SZmS%HQyvbOg8*W^P-cMmTd(HafCu&e(tqdFc?Sg!Bx&MhWF-?@2D`~z}+FQ{e! 
zGyk$fZKoszkd+*rIwpt+mO!v;NWeIPn6UDiVn;Sd_#@hwGve3n|Je%g@k9)C!FRyH zMzFk07!wnEO6?r`47{ZxROD3iiJuPC>dzZL?(JPODE*0*SwgzI<91dzK7O)675o$p zbGpoCUC6>BnR3eB*=|%(!AsoiIkBhz6VgY4sJFEnsFDLm>pOtd!qiI}Kgn_!ANh^0 z<7+ycO$n3Q_Gc?5r9sbszraVNG|%6^pnWm&##`ydi+a6KM-`R&)Ko6Szw1j)wXaK; zlnla27O!v0%#NlIlF{-it9p)q1@2=Yz9>{G*?6?ZRgi~;DesF85S_*G;duCQzUU|q4_aKCmO{k z?sdq5kU(P**C5ZzNa|mbyZuY^FVN+4h7l&q!+ zsmNr7k1gl3BrZsj>9=sx5~<>p3#r83q7U4X1Pv{%$%+DyeAzHj+86hwX{kat&q`JDy2Zq1)tL#8g0{A7 z7#J8}V9aIOZqUGLzxORK@7z>qaAFH1;=>2i@`-<|oZO#Y?sQA8kdI>AOl}o-RB2}X z7j2M8PrnvQSWscIo^Wcyh!-^*;{1GA1t^6~P?b{-?=YvJRAM}Ct@F_0Swt<>V>!;ZMk;HBLF*J@UY;tAkpHN!U5YZIU9D623kH4mjS!H!d%0gPvtYki5Zc9oW*I$7jG& zSEqk%Mi3XrcYf}|8v^uJCsDB{Vhtr*zR;aStI|BS`Xk_O5Dg-L0E_BMD*t?Kf%~xK zd`w9xFcxnP4oyu)dkTttTqIhwk}!gsZRp!b5Kf>KyBVtY)NGGGc zC+5W4C(IjU2t)|i47ngZ9zoRQgOw?7X?gh%L_;bJgY9biqakfi%)=aWfuQs4`H!@P zX|L-*yehh9BdrBr!ZpTHr}|Aw+v5|&efNT9w6L3_Z;WD>!emv)cEGf2IHv7HbM1&E zijWv=wI3dO{I$-*rzg*;s8S|b@fowp)FCZ9S@yief)^3{=RfMz)YDHon!6|hyNvly z8QziZ9&%;2`F`4AYMx(88BqBg)hXU5egFOT=V~6>SgRg6&pD?zZ`P?hqSWK#)<{Xh zwJ=#nz7NXi-jQV~TF=QhijX00%>6NziNiUNj1n1+CpNBxkJyW2e~zC&n<6N0S6$^Q zRecll^O2iq@0(x*4ohyO3Yr=L+0JK&*GX#m6DId$SkW-{+t#Xy{I>=fS`RBmHik0! 
z+&?<0l#r9!huGP4yg57I@d4B&l$k-%1}OI+)a(5G@j-6}a04)T z{5B>Gm(IOC2d0h#UmV8wLzYD|L{bO|2Bjbfo_^_Bh6)Uf4?gf-4f&DF48mL89bV4I z%ey^Ya|lT;Emr=A4>RiOuE0hImvK^(hYnu`N;>GUvHX8}%><8j=4Gmtr zJW)>QtY$x{lxfWH)~pe9KhQH<8muWdB_!OpwZ(HfY))XG@uPVT6K_`9(Z54{yr0oA zHXQ^ogP)EJsiGjfE_I3utPgrbd|LX6@xE);C4t>tBxo2%eI+gGZn+90@qC6WKh zlbmng?g#7Jx!k`Cgq-C=U3xsEz;!&ygY(QCbCds+OI9|v(ZRv>wy}RF+e_q6{sh^* zqvC9;%FgDa^|JT4`0<~~K!7;zxiWzagweok4XJjZOxx2l?949m+t%9J*%1QlkjqKS zL+9qu;^?#I>c}Xe07XV^?O-s*3(-^q3vd>iSy%wP6=E}}pM6n6B~WIWj)0^4Ip7Qi2ou6COZPk5*KyS(!8VA5H0fX}k)U8*ST zDT@4iU8k6q2m6W<5l$ZVVl1{NXM)#_wh+;+JWBrumqfb~!oXrcd&*DW|h!oTEA~&bZDO zWKy3K7#i-*oE%0{pa&yxe-?y3A17L`=DMv9#gmo25wL_D$?J3%1w`}3| zvh}qM|K(YgpYETQ7NYJ8ULmY|%{Yvf$bqsiyYu@gvWFd6i*X2XH7?z9jsk9jeo#tA z!%hh65Zs)aji9E{<@UzArm5cP@>g;3PXChkt?a;J5X>vZ|Mw-92ACD05*V_0Uj2rI zshKapdpf0!0{{2rDIN-8!z|e-*VnyeT3cf!JdpW*=oIyzyarm^#;CPh9s2|ek|dBL zYNk`mZ4e0X&~I(M)U6Kjx%jN1rY4ocgA{NgEr8hyYu4e($<0KS?;UJvF@@*Pjkpsd zBO`r$e85RQz7>i8N}V2BFpvV9r+<3)nMS_Ucg*7|mbVHo8d(#wn3 zY`C36J%zg7yoE)jM`dHGzgtH!S9dphT3I*oMeXi{_fvWK$IZu#u}eW@ax-^UNt;4L&J{J>NGs!clx!Fi4%Tbx6r4Q?!IGvMgIQk zO$g5!+ue6hs@K80oIqEq!brf>M;X}cOpWj`OND~ThIbEKW!>GiPlMkoHiL=^Ejao= z9b%@fXk)<_N^$~%&%nlmo96a598nXLWPWIUOiWyd z3<7LoXKThg&ef4hwvS^w7MtI4zw{D=ezKzP-wg{<)<;IU6BX7V*ECcA9=iO#%jbNu zXG?fBfcapQR;gFF{K5m0_Hl8x7UYi+61HsmjK6!6_{^6WMR|BOu2CztUMhMu!0)`l z*Ec1GuI~4-5@CobAC+zURSI3|R*nhSfHDR3}!V+1lf8 zF)@l-T=+7LF3}MMG=5e`j@L z6_==@A&Qu=)<=K1K(id`*_q`S8WM`_tlkd&ji>1>k0;8wJJ@Zf<=|+Z8(xf#6`F2o zqS*T6anD>K-A~`KZmRh;U>kzXJL&wa0U1py!H#pnw%VBN_5O3-Je*m>o6J5 z)Uq%GXEjIO278!^1O%A6AM@!ogf`$A2B0()k>)6Bo8+M&ND36SQ(Ff~%sTAeER-hh zavLZ*+1n3EsPNhh*OpbkY-c|F%}CKfANjSxsP#l2;TMk2u!HjDPz{vzN&BE(=a}2Y zS89H&hhL4wjG_;F-68V^@bdoD?7jixd7~{O#p&;)>Rfl}89l1i16-0lIkRbjcr)OC z2f@es!Xdk*D%V$ni|8Et{tQT~!A~<%4Suy`r=}W!|D*pEI}6JPD3Z^zn(%!@b03hC zFtc2*#lfVUWdAy(&ls)n_*9_dQn=_Yge7NOY}TA#o0xoJbwEY1>Qpeo&dkChYp3Wh z0mI7N^Q`qc;av5NL>`cZ31m!7O`&FD!X}dU(EGq_i9_Gy;y$3pc6M%dW>AGv&=8-K 
zlE8zqw9I!q-`b(tXMOjMJS{HA*T&DOH@vh9Zmv6_E(Q0}9@X>VKWKaEMnDts4X+?P z+qC^78tO)270P!d$KgPct?{mdw)YVCvYooCF=}LM+g4(@d8s=d989_OCtvYJ%~Xka zg`M~1w9km;q4n4?v4?HLyIpPVH!Nu!8`BeUpTB(1%`J1=|H!07M1bxy*HDfZ-o5kV zRCDpVvf^`PHYN%}#P?Nt&X_p&L7_6^$8f>KxZ{)K%^XY=)0x@`@x^Q?Hagx+uznMm zA?4yEc8Eni3%$F4nbh=Z(~6)YJ3l`pqGLRyr>7Q>P^w z|4Ipo;zUNMYYYbbW!RnG*ZlU_C$LEr>R!9!3eL}ix#tE}pXP)E?a& z>geCUFzANmrW{ zy{TU$a3bARMCaJeh@zUA`pGrYWBp8?$D(!$2NT8TaOKk9`wd<=J(GK0;r`cWi>yZt zUJkdy?=w4l6)H<(bah&U9BGv#B_Y5PEny?%V2Kwe$IfPpIfcvVdjqfLs`ITwv0)>v z6*Cj;eLMUJ1oS6%c<`T3e){cN-Qj`n2kC2x-}wSyhwu*pwQ6d zRUaxdKZ%{u=MW}#ZWVOd`Oi$%`QMQ@hljALbIH!!-*?f($Z@!~>GQBj}K zKt-h#boG4$9PHNnh@Z$9S2Huq!Y01f5;z1uy$=(&JM{iZngOOPnj^nxWWM9<@`dIX zLHnw?M3#G!vU)7fVs-^q)&|Ad#O@-}Q@_#E5Ui1ty>2~_`T9EC$S=Hf`YB#`#i<*% zU2x;?P0f-oB?d*elJ-+}6XFS7n0*{tmft+hq7%`vG@DFz(}y+p&O;`3khsl9u8^<3 z1$hy@OUKg=$n4jhz%~vhf2dnsgw6TKlW?ZwTFLms?__V{uP?phA;D%=_I&5yHA$WU z+jra2zvo%GU3$pmLKP`p8)Gk*C6v?#I*908;xrY#9%(Ib{8_hu`Y$YrJ!SmsCj#FU zD?A8&$J`S_J`LRZfxx+gm_-PlZ>a{vl&(H)gxu#7CuDg(y=0Q*4=T*Ke+r37SXyt+ zupcvjY1yUb|09;6q_NZ_r?c{^`~}{`mw@0qL=v_1-#)(h6?1drpw+pKy=&~pXtgFo zpR!@|>3GcVCRu0)!lkO{F7KYlvZ|)w@3lIr5ow!neo1pYg*H>bWS71f&i3u5+U)=;E_MlohRb> zvGS}X)H4I;4wxu(oSou30XZShE~VDrc+7`9D9Fs}?@7^;lRIb#TomE}KxX^nZjjq` zt9K3$FlXL1+5VFd6*F|+Z<)Genq#|QnfWD53w@ z$|_qn@~asq2?p_>6GLw2S^aA|+4ES=N1MgK zg68Q!ONa$YG@;mjVa5sj(#x-?2*FEVq1P|=;*X-Jc}}+`NJw~_J?FoGo*Et&-7iQO zOKA&6T1OQpqksH)O_Z?g#8EqTng!@}R3;_qm@EX+$(~44G@`fqW#+olE*uyhzBzwz z_Z>`%Zmxeps!7!6D>DH)urBCmY0V-rMO((kP64w3S_YW>5RgY01k%UGC{?fjB^K0X z45^-H(+c{!AewSjvMMTgm+rm;Z|+pse-gsYeg`!yPxJdfVPp>!rgLr8a~QY-3Fa-v z<|g@n%N@XhKzpFxC`Z7cvU1-|?@fVUWaQtj*qHLyeR$}>Lul;ay;AZ4IIXt$oLRUu zaCjZ(TWpVuU;H0UXBk!H*0teH2_hif-5t_h0#ef5-67o}T_P=AA`Q|V(g@Ps-QCT% z&O64(IDb#c-p{kvocDcQV<6fLn=79&^=Ef-N>#h9@8#i|VJOA@yus^|x_e5_Zw0(% zfml-{SJ8f11}rYmZ(Bn0!jqDYyl3Q_P__~ z=6q8Mq2k@*H1U2~s+C^*fv~sv)>QH2_OE3hq1Ua!?#d6%MR1SI)!D2k14AjXY`CiA zffl1TH&<72Gmd~iI@&*tjWbJtTQofjGXpoIpZhGxlo6pg9k7}DGI-<#ye6o 
znrv=eAR-qrWxs!;?qe?X^yC%&?i(l?qES~I6R^s&+NDB;1(RDZ&wFSHCHkPhS)6ZH zO4vOl>RG3IFNQ-Pg6EH!)5kq?E!Y2maNgaK3~2t**w&?+ST-Ti?|b@umHNGIv%BdM z?itytREksRb{O!AfK$c(eKJ23T7q9Il~^27I3CaOth$ub9f)4#jwx#E2_}m;W(tow z+&fwIUe0XGoXNTGAR-aRTq@=JtbN}??+dzA8Hp?M3J@cfhl2g|63;mBdTo3zBD z32oGsk?KBEi4)Zu;$Tqz;czaN9HmAQ!xQR%bQG0>c?S~=-37&aLzxuS zu^lpedj)!4j}PgCcip1$vQY0JIcsa2JC0YkM>Mg-eDbbs-kc{pcQ@}phUZwcvpS{S zTk+4(81UrJt}YVHI>L5uWxR0LQnC%biXy42MYh92?KuE|{rkr;kB$Hh^X95h zk=E$|^A!^pN%GUcOj#Kt0`$$z4T!3|;wA6{W9YX&;C)qBmn=}eynJV4gG-LK)6Z)H z>$7p2=j0pjCt;skExK5f5-Uv^fAqPws*d0P(c~_$9P9M*w@25<-FfM_!!1N2aR1V5 z8PUhCgMm=UyO~3kdBTP(AByJNHqCuAt{~o zCib$Z4`m6LISDid-@_vdsN@T%Xe*H~ZnLFGQNb*HrD^EPo}$UVwMxdD)F0MID?}6X z%6gGtkB4w~U0p;3F8!TD>T@hQJAlOp3yJJkskfbHLZ)caWe)kOv*sBo(g`vyi7DJo zP4i58T+)o#M=Nc-WL6Php@|_P1@WU9*T^eQzI}3~XcWW2V&^4EfWAksoInj5xNW+p zhQ_2WKGwOC7m?f}5S*xOe$d3FFq z9H3Uts5d7}VuE~bhjejtI06u2eBRB{5EMKf7{G8U1%i&PEz|e!NB{mUsjue(@HWF7 z{~bA`B$?-QdPG5J2_-;ezddCy7|j{Yz6AKnuI_FeM$IwbZSWBSW4u`gV0QqFH~<}Q z?z@^mQ!0@~n4aYZd;>-S0qC!!8neY9M*)^9xJc&n)zd@C*M@BA85y`FBtz!x85!FW z2GJ(q@kk1h{wS)dWP2Q){cEV9)zrNGXB~QLYYSx_$WskJYOn2ZtxUU~{7tgasIYAi z1a_F;^>KyLSErSO=Fat@Bj4NQ8nf`w(Lt4CV|~8289Fpq8U>&ls<&GAxE%CLD@4EN zA604+xA^qpu1ahIpwQQ>2+t~4n|f98wz4vxe+!=QC|;lb!}aNJyoM4 zr~lKM(amG5Dn_O;<5E$04e?2OeiH2ZC-Z1|t|4UxdKQaO;UX?2*gCCS3) z3P$)c5*l?OU96nCG)UlO%6!Z4hUx)cn)l0Ur(dx)snu5K6#r<17Q6r1sYydFF;M* zfLHNEnl<1qf(#)P_#Lt0BGD%yilQbAx_tSP%4*aV2m}98r}C0%2#D@r5(C(0Lpxo& zquB<4idtMeKRF3tC6@pg0Z=6XfxFkTWgiRjWv=e-Ek5@pjBfpgYfvzZJHv4-ec$jo z=_x7idLjmzo4q1FiWMj`F~L((zg#|8DTC(_9}f>OQx0;lf?WO6!*%IN%o-$sSb`M| z)(QxirJGpL>sCV%=piD|inyVQZ|OT>)U-Wox(D7iYcHs}P8Zl{@bJk)K|qC?m=A5% zHw=7bI|DO&d$UVztnF9z7;Y6vBWu-a&|6H#=Z}gB#vz` zlkMbaVD=+LMvgu+c~4H>Rxs`(`;Omn`C4(zN!;_^&(mLIbN*p^b591WK zQHW8?OJTD4Zg@{Aa;;AH*Wqa99s)OW=ez_#-}G94zyianuFqL#d8UzlHD~^3LEvuE zza70K`E!caI3!Ao2?%S4>)ldfxr(6e0aE5*-p7H@1eX4XGh^Ub2FamOm;fXKBHYH8 z?1Lj9+PVsYzlDMR?}`W&0O;Jt#-DG0EP{dO$k34Q&5@RyTRj;3=I3u_$0IzcfN#JE 
z7(($|pa5(Kc-{*6`@PqT2aX;URnwy)68uko0oMbbsLX`#TaRadcB7Fz;k@u;^7F_~ zh2(SBw94vtN89C{EKk$v=u7;s_Abj8MKu|BVrweF^_&2`TIC_RzTVGHK4giQo<7aa zF2;;S5GUFr-b;&-iz;C|wxB3HgUL9t55!jAnm{p-N2zS3xhks&R$YhF=3e)-L;1tniLuQFMGh1 zKlbR+3~1JI?Svl`yY2-Ef)*|15FipnF+URCKEngB)ZE;H(4g1t5isetRwlgBK1mnA z3;Od!C&~;f+T|WfG&2)M2JSfG-%BIYephX)D^KVuD993|e@*o6E++c-3l;V)nyLSb zI~C!KzIcKB*N-WQlfhj@XwTBPauV`>0i`M(=O`jQvlElqd=@K`!r9w@Hx8liU6561 zmDH7T8#rbhVYQf+j1G?HtDz+qchyIl#;2J4h!G@T;=bh^r&_(k>au0uPW*5~czL6R z2nXNrrO6`pw9s%-W!5TbiOma^X7lC?CcL;#SbSufk+6cIeWN zLPK~+Zo9_Etru@n2~qres{s)Xqk2S8AZuXr?e>NVrf6;Lf_rIBK&-(2{;7LyKk;Wd z?%1Zp#1{P(@pp*Lj-_3(!U9givdKZfz9N9r;Fx>cUS*P1uJaU)a zd5Y5rnxfv`kFBjfc-y{R!4a53p3P|aeOC(jdKlA&u)YIGA3pb zurZN0iyfz*U0l+do5{;V%70bhy8H2BM?pat4cPovGMvSX!fPe|3=YmOK5muGi=W@! z7&z#ppY8k`UUY9py^#F&!MaGp`?h{+xJ+5$TN5gtB`8^0*FJH5;StZzkAzE?fFqr#b}Y(Z1XvQG`!{#8s>zhE~W&DK%+uo z1IXcnjmRW;xP$MAyxZ@@D)U#1xW{|hfsKIF?*;S~!jPN?if1!lGI%Mg&RV3ry)Ea% zMeVy2j9)*^%!_UpQt3mM64>AW^8!?z4{)1&0mxqt{ou)#m?NdwTpZtN+FZB+-a|!%32#oh2-u z5sK!uE|tL2Xt({@vR6WoAT1>y5oO;)LSR0loL2W>T$UZE%qgYXi^UpZZ06HX@gRfc zd-EcwVe*ibE`k1Gxz4a3MGC%WI3z4uK@UM2owpN*?2;zmaU-(KA= zr<8RZGv*8PcC>7r$`nOgS%bkP>&RqGxIRtkA!)lR`rniyB$v zyK)!wBfl1zE7g;?YN{|Cz%2h4dTMc3kQq(r zV*`sqprTBAX>(i`lj=wL5TL8G_{szRtN_Fw&@T#DI{f}Ad~Pq!%L7guO-AgmwG6C) z5jZhX4Fb&ynS8Dw<>)W$6-sJ(4R+}ow~A~hD1cl%B4Jyl`CqspEe)Tazh%Tp!5^76 za;3vx7>t1BsCRd9TnyHMEl1EE)G}Yd7SRp6S3{MOMgIsMf-CH&JJW9*5iI{FT}1Jb z9X2-iLon8X=7#by-UM3~!e&3rki}?wmcW;z-k_RN!qmy*ssdLlsCS#K$){PIvyX!b zQ|A~Q$NfM2h8d+49bcYGyQKt7!#2-@wX|GY++F1g6fxD5l+ez4*J*sd&KOXmf1yR( zb2&+*3leMi2eciuJ>dg@-bei1s(d)}(D}oK`oK(ab*8B=_mUhjvD~}sL+;a6i!7cm8z7|8Hh?A*NIG!5*kI7=ky~8M z3%1cvFk8#Buh8I1B`Ao4iH-exUKb879_(6to;ZQo0nq1seV+}ol-dQR>@Lo7?EL=l)g~y?i7r=u- z89yREexlX$1*7j&r(tApSohHT%X8|FLYrBB_bg%v1jHa}E~jAN>*p8n{+`Tj5okq5 zN;<_8DwiUCRyobQT${#;99NW_&>!)3-oWXjb0K(f60 zYo@^4sHy4C42uMF^8o~c1Uqk2^jjjHOf`~tibp%Mnz!_+Q!&3YwjgS__IQfcC6FgK zZ(pr6$Mp3xTV{P$FDvP$%V49L`oRHZnHhsLym0GP$F z2JVS35(wu~UrXYdn&;%2>U`A1miE<565ZQ#+sbY7 
Kv&(JVMPLY7i;`!I0?(Fq;P!x7RWV;4=e`Col*l zQm^?T?dgpz_e&xIZr|Nqsm%;5YbqnS?~$WvlxY&9qeDXwP{4=FzpV-->3`W#QbKi! zw+yRI!XjlD-Wwk3NZ2qKpr360qbdkCtKa=w`rgz2BtMM?2Bb4d_LM^5p{KVU39+@_ zdfJr9Zad>B@_5mnNZ5_|YYvzTFSm&AMSI~&hCmlpRnfV$$ohzh=%kN%i(V~z6Y!IK zR#Axe)hes2czcx?qo>44Vwp!F1-HCU6guEUmAA2huA)){s{g0ph@(xZeFI|qcA4|vL7Ys$-$+Z;?>If0Tt(%dFZ_^hUqC-YD*_uKF=ctG|DHewm*X#U%i<0l7O9;E-&aF#95dlFe}iyX7(B2?nZI41vb$ zc|X=ZIw-w&ufO~X8x7p)~wOadZo*`!Gr3WZ4(1VZ+ekd~empG~#-7wGQY z>bKe3+1XiJe_&_tHe@{s+CH@Pz&$C8$O{Nuq(p~6P(n{o|B~6ZgvK-nD!0! z={=&OggCR+aO}wc{7{ClV{BH!HLTH+EinC zodRewJt9M4iUOx&Wk}Pm=TGSP1|g7Iy5Bbbt@+ZAfjif`gg$uQ{yBB|Q8w4 zB60q4+b{liufgCuuD_ZBSA7;sN^D%iJ6q9<0|F}r1=r|NVDbP9AEv4?n?u`axYivO z+tZijeCJJs11^A3*!R=Zby9(h*feBh`(QU=`ayygGwxFl9{W;vclS0Sx>sOF$7_FV zwj$;e?wHw?>b!3Z7w4d?%mAy^#LuTE2LQGOql2j_RW3JL1_p2%_fPc?UwU8-#v#2x zyn;or$f56YvEB|NTQeJ#_Dv*DCM{HGTXTdnEBKd|CZi+1?9`!Z5bHlXHc*RSWMgAv zWNqDN$oiBr92=f7Uzv>omSOS1$F2O+bjG_C`{H5OjKoy^ay`azg5JAr9$BNCZ^3{) zZ?b87TYhGSh-xn{&r4n&1Q^1~4g3uUH&{R5bs3WK={4Hx?fsO#FxWd=&%Zf&=W*sZ z2?i$K5L~1;n@30Ow)0j{21U8K1E%#>R&)+?d)gYZ zVWb>%be^{-<-0H8M7HHR8bcnp%&QjbnpK+0dh#u)83{g{lUZ%{C<%+%$HZCrZl^My zcW3RThc5f*R>F&;?JXTat1OPE6`ga))pKpKJhFxCyQ8bUylK-$(dC_e+&T(wHf{dXeV9=urRm$fgo6&94!Qpm=3J#O!t2R z{6}&!7w9voV$Q(VA2v}_`;1L<($mrX+0fn{IV>XLmm}U==S>@86C_d_-lW%WzBBh- z!<=ui>VrCXHG12#%=)HQyQh4_|xX|Y3 zkLJ_otyZKxI*+rqwQYCuJUZf~5b~k@@}=tLf*1yZpy|v8#(|MZ2}Zc+h`r#> z&%|gu*X|qi?OSeY>OwIs$^OyT!7Wdg@VxOmz?!N5{ol_IUgyssbP|~df0J%%?$_w? zB57kY^qs7`tn7`$>U=?$jfu&?rl138`8-@n+a5_vNHAuO0BjdBF6&&k$$G8TYf2Fe zBtRRAC^Y9^=<)l#KbS<`xmBalO~nod&=6o>IB`v%L1(N2q3;|i#cH^PxW(J(Ope@? 
zR{j`C6JM4TiDO&@LgG20a}4tMO(@X=X$^QC9ru+jE3XGwBw`p*NWk4e{-hNaBefZ!ZHWCxoclA5Y&F-jY_{`9`J@_)2oN*p& zYs^EkH#6c3m(lG)pnRi7L5>n;>-wGQ!jWe zZ8gJqf&pY*u`}(5<&Ph7-e60mr_i8B<~$-cTKy6vAYEF8T{Pq!Jj%m@)GbU)2v_X);>OntN*n z2X|d(lkx#~n71ZuG;>y;ZRlTlhR|_%bK1=QWE?E?dS{?xuHxPEBbdAsfksJiu#LA{ zo1!8qKhN;=c=31ej}Ghg|z2Kf+4%brD3qDO`x`7 z(m8>2k2RjdoGd@~th5wOP;KS3*GlJ zWiw;Eo6dlOLHmlk%XocYO-#aK*4H<)U2@p|>ss9nve=-i*#^+Ef33$}p8HLVnE@Xq`Lo0Hc$|gg{ysy0 zUzYFngVP3bhpYzna!bd{&2nAs%xP+>#`BZF3o=#St0rnuBG*q@Df9gA1<_P(7Rmp{ zkI=>AwK?B+)pZcEV?8m)w~Hs%TIZL04!k@ zg*ri;C$D;vtaU>iwt3|F{KCWcE$ZjV;NuIV(kV@aI{w+c#re>-N;a^tSl8DPBw;nnh+jyuYLS;nK!jSmGdC(9Fvi+u&G-)r(4$StCW4+oA_>TaET#I zY-cX^zJL3&2lAru^k@Np&+-u3>?M4>_5GX<_YHs!2`dlC%gUBkN~uc$ch`SLEO*I! zA^>d>l$^HX+I8~8!&3l)d;bw?OvJ~_(89_6XWNQ|Ua(72gz$`ZFCYI#|MGL!X<~HxUi!<-sOhMi~}|RiEulC&0Nm-{hw0K%WPv>wU}ZGY%6QCc@#*G9n18Potkm=apJ1*N;?Jx3?RPJOFWU=VS>Zn3!NZhFPnhDBfC zF`w;~{^c?JaiTz0ULM}4s``t{7lm&t42HNM+JADL4_0FO41<6K#cRKGx7^}Bq&|I1 z0@Fi&laG>y9UpFe*VR^V~WIUjY0Ub{^s1Y`qF{}%x!CS(r$kX~aKms-Hq#|FPT z5`VipM|M3uJMfq6kkbA>H&Bf`-z(3Sex$_=v6WF#RZbJMNHl#%9J^dIyF5v80C;aK z`j`gIcN2CiA-{h=g9>FVjU$g<$M9q!ahQWVn0}q@FqtrvPUz_shJ>Bs@pcIc9Mx4S zENZJ${eKoRUX~ZdXWq*n6`9d!QqM1oUiIlYP?E(*uSt#=YX)2NND1 zxhmFdB5!7krMrp+T!gRKjJ0hZR&Wu~79X0VzIi1KdI0fd1#?)p>$}C$B1N*IOD9&O z(}gi1uJYg-YSO<{Lw)I`a<9E zvQ-O9Lxr?pd*?QAI@5rM!$Keeq!S}*i*+DO0#C})PUKf1kaboB^Q*MBHB~|3UUxT) z)dz*We88`6s6SSsIR0$1`0H;n%g3fi=;4_XuGrRiL=+m!aApW;OTn^|6cl95>ZkIb zH9KF8I5h?T6_NKW)@|DJJHj8;kb;O#Z4&a!Rwi($U!b`GuV%F18uS&_K21 zyLV7^W(}i$IX*Fcc1Q?*dxK}hsMU3J! 
zT&({tS|~)ho%%R#owa~LISeqqGG?!OGKO{EY7su)o$uN$TFigL(QBaeWQn;ipI6vTZy3PGvF|8cKn3V{50~=aTv{<>YxRHvunp zAu@`fKjze0B{7BO7_`FNv+7y}q$xv3vo7HvwH)p|d?-?cLPi@oU0t7BfL=(_{Z=bZ zYIb_2fM#`$-(n;&I=UZ_$^k@K%}$SfLj)ykbo;HV3z?Oc0K_nU8P(Z2Xq&;7Sn#v8 zXTS;hczmzFE3VPI8YxQxA#D6mRJF8H6B2^D*MVY{62{pP zv`N-s(*I#2z{VeVg4rAlyj#;~Y#n}izwz6Zj#8~QFBKUX0piu(K*?`Y3i9u?cifAM zm7Irvqz(?o#rK0ix!ZCxZG^SAzP^@~Rd>PAOT4&b%m+s&;wqszVYc&;S}S2&djkHM zo99PvS!_MGTioKDPhQU@PwEf-<1-KZZq1)m3nmO z)`GQhwY^@zze$b0aiuh@qSn%3n!ba~e4}S;q`kK8kh)UGJ4A*G&f&NJH9Af`)%{Zl zGfrD9Nhuqy1QQcd7pL2qVR;(nL&e@M(^EtDQrFW)&{p{&SCap|NMRjA zes~ssKEWMP5}`Ilc5cvVa+&2i`(ddy=u`jp@PWZ^`c%3zfoy%}Y*{)W^uirtuVbsNFF85&7u*mTPzPY>JE)(ig3;E60K=pN5*EUxOK?Y_Mp`oTH z4dUqn;Fz$^-(gZlThA+wx0)K6!#PxoYLH*`5Zo-0=~b%e2p&Bv>u%fx})HgcZQlzfEK1PJ7SLwGD63T++g|yqLFA6 zlBfv5S;vlG1#f1S)YRlTGVr0UG87 zpqgp?vPrqPyu1vie0|2!6juPkGz8qYoWJD7`NXf`y}Z1dJS&0c@Ns|BLYyK9u(M?7 zox~f$LhRq$SXwG-XrxRH_3Ad&Se0{2s|tOJdM!tj#W;{O`1VeJ_1}i|edIYaGexUF zrMEYWyIYzC7=6s!TL9mac6->aC8YBt(@NHtCMJ7KI;B5`#N~eC5J12H$`7{lJ(Z&S zF(1%~-JR!e4Bo(a>iWt2Eqw2KN%dNY^q3CG7&1a-Hvw!@y zSA0+OqZq}wQ_KZkj#vw1+fN79`(dKx`eVeI`}$%?GO`VLSM7QdXku}guTFOLt^!|$ zHc%iS0Us9^H$ifFdC8*Jtg5UWpP4CND7!=a78xBT0HOEK1PwhO-vdBY+j0RTEn-M3 zSStgH91S(~<$X0}j`RNL?mNVHQQ$e_cdgXF?g*&*AoFEqW>)u!GL#e{1p0&d5Gv$* zXlQd-SlE#gs2bVZpG-Wo9vv-(ha&)04j3|n1b-0x`=h0%(zdqDq3!i^o;Se5yD9r_ zlN137ACOF&T{}2FYwaEW_j9}St$(xN!!9bU-3s9!a|b|;0cU5l^*&3E=IZJVEA`yV zvWh((>?m`Wa1b70ZlP5MMUARdV$2`4b}cF!*S1!0t*{FVkuowx0C#P5XjG|LL)}6tg0jM|JZB=mq%gG&H{>gkA(qLPw@M!vI! 
zs3o+IOze^7GMiy_UKj;Y>3qa(siTp??qs_%_-Hu>Q#d+@#U&eju1Nc0CxxYz2$?X9h!6?wJH9O2^m274#+eA*$7 z>v|17%fs^(V6sTE(t<-!#WpQU^r|P>Cx+FaKFrhZuhG-qE!h9do~N;v$vek*w(HS^ z-M&r`dLcxykPzh8w|b`eWfLg5A<|R-5={;oNuUE;>pB(NLHUP`>bbqiABw;w%*k&dtVMLyMr%_-?`wV}h$5jJ2Nl0jDQi=vR`*3q` z_&#g5o684)$4mO64?x8rhA>jp%)PC`^?STIc0h92=!pO*2_DajyNQA*@VWupH%QUx zD3~FNw4E&XNL>Z-K^qhK!Czw zMp#D8I6u-VuIHMdke1bHFD`!jgtV}*0JaTfIt@yqqDE!qXd9FJ?4j*GspE%%!q9Jm zwCb!AX1~7c?Y#ydzRWi7SwR#X3Cga&eIoUzKN^N_btU_7;yy8=hTH#YTkW{r2f|rM zz}#E{AK!rHyTCEmJM`>`!u*tzg)jUU0{|Zp-*6l5`v}dzkiNLk?E0|Uk}m94Chh52 z)BNySv{eIw3>(K&q3S-`0(Lt(r6bR#aRuvu+=n z@~O++Z#wuYz&lL%{&uknXQtFj)_VS&_R1Q@M1{?gnZbW{$+Co&#Dq7Jdd-gI4paZE zII;IfUcR2b`pjX-0}SXq1I!QOtwAar=1QF_iY8JJ1LrWAn+gbq6@JdfOTGCGBeACy z8-z`UfdZm8pP89qR0r-(PAF#%8Woz_aE9!Ko#@Lxoqve|SYkNiyBqGCtc-eOlcGR% z)$GOEE_b!uj2YH|{M?!PDG|CQS_h8O}Rp;nDK z6a+vq5`O#u>7d!AC2bv@*N1BYi(M@`Iy!7@3IJez3o8H%1$*jXSMZ;dx(@+Fdoq%e zn(TD$HhT3z80bR{FCqlU0Xc>8#ahBCgSyC*|SD2=nBZ!z(Br!t$$P2F70sYL&VAD0$bif zP;EuCL5uVz;fF{15ZAR<;>1d5w0^!KYVg4!zZbg*!U zYLq8O!R2+LO#fF=#+Z!N5ZCDv?xv*$7W=XNeHoW3N^$R&Gm^8iPLFKOKYVhzK1luf z^EO;uS>=*?9(nLzIS}1!u`GARo*CeMc|`*$18~qAT6g3j{1zXt3|1)ln*eI|`PPs; z&eYr-9~=7(Bs@CW*5P!Vr{))zkpz&thq#Ru%=fKqaWPmv7LLyoqzvV!toqWt3W67ZB0>ALo< zB~J5e8?p^+=D}$dOw;luW70URl>MWvAdrvbY|P96c*=x~tOXnt2#W8p@0yx#P9FHZ zYOnXkXDP_VBBKt&P>z3P0MTR1@abu@wuquXZ20#dY~7xFV`xfBVWlP2TU(e*%PDh< z{J?|ZHao&^q+^jcUew%Nug9;#&*y=*-f-3kzrfdwLBsR&*GWqFGy-f!7T=sP8{^|Y z>DDZwde>h+l@TCoO@?vA`~(~)O-9ddZG3v8XtKaV1a^xYFPN~qfz z;)BD+K)_z=3qIhSIoRoy*t1GjTt5B^=sm`>t*XkUw4J1Whj3r#t~jmQY0RF6-)C)Y zNIIaWv8dPJYz8Cx$>mFj;cp-#? za3@A+RH3kzXO?NE0n(n$)Ipx)_+P(Wj<-$S{b60IQT`qKnUs^WE`9VOFP236QUK#C z=pU%1myca|s4{3EL3wNkaec$Z7jQ|O2@p*)sJsv?~bG)gx zKf`wr*Dp_-LYTUYzx_>X8}M80czY)-QT=$`{D|{v1^d4r4sPzD!9jq>2FF_P2?g3! zEC4@~k&qZ18nOYzR1lj0Ne0OuBpT}KKttRCoHgJy0w{;k;d3LRH{ZlNmYD=C$8tab z?LT!N8CexSgNRZw&~r@!=Q*&%(vYl4a63AaG*cY?s4;V&@?8SXJi;@Fv33=T!+@rrJ!Nz{{+Bv z2Lx#Fc&r&$R#<~?!071e=&1iU24#tqehkMZF)fXY?7Z2`M_(6iLSf>|z44&$F<*#? 
zj^DiLivyz?7Z(8m^ENLvYa4!4$Qef|7!szq)?L97?_y$NUR@&tGY6Yd7bU=4*d0Sm z>YDi|@k+~ci%W;3!yb0z_n*T8@$oaw659PXkEu@#2pk2C7v-?9=wtep8n~BT!cp2X zlhP*}t-2dmn_`IX^21wT!5Zp=0)VspEL>e`^LE$L!i)Imks5=}3=xtu zxd;UTS8AXwRn^cqlwLA>O!J4~<>ifu84z*O(Rl>rmW#{RlHy{lh>xoDY6=Rw7MVM= z;ORq?A>=Dr$)`xGYJi~t*eZZsL?TuL9>cnAUZ9N2ls!rV7?Tgppr7~5;iCr7oNi^e zH>>(prh}1ViX9AJgeGQ&Gma{m{*}MH5QEN(yR*(%b^;Os*HLGn)7ZZ)D1pY;-Wj*Q z@4xN619w#T_gUYUH#Tbo=py(=uNhHj01~s=(WDcR6yi; zd&pP|>h9fc_QIlj8-qa`y(q5o(LDF8-AWRm&53c7I26?F?H3o97FX!TfrBiW$)?{m z`RhE`U@cVJEQ;$KnmnRJG>V1RXUNpmXCV}inuw_d8fhlibwkET!RVfDltc*Sr`ynd z4%w%q!EIsBZxj@QAKzMHB8pghK*5ba@lh2YlrnD)lPFqWSI+OZH*l-ex^=X zBqKK%?ivia@rC<>o5>|iTKlzMU?vXIl!O0#O$ zy6xZGQH?e5?$ck)6EA=Gc&wj|u(r0L&3wf3{ScO*MwUpip}IvEkk`!o^!@lQqN6JL z1??q)^W+u%_Er4*j2ytm;N#+gUv%i8X-R3R-F$WL%nUX}QCBw_p&JSUGUA^{(zl0F zK%^4fY)JUs%IfQN3`he58ft1D`C)*N!05J18v-%>0AA@$TRS^4J6_1hu7K8ud(yV~ zc_yf!stV*^?b?3*-QMymQXAd<@4%0m#ETFN8)E)bh+;yOlqV*Iai{$aQHyF~kLFsF9|*5yQXzvEbO<_uo+h%pTAd!8iIXSC{}3^ULt=c_<2d zQK?NCBTY|V->v`k1ow-`2J;_K0m5d{Uz(fi*>)NU_`CgUXhZc5&WhWm6Gy5(p=2W3jRd7u-IbzRz7NOhf>L^ z)%|vJQ)*gTx??hTkzr{`fQaxHFRy&-tzl4xE{$E9sYPV35qM0XqE!_-5B;<<{X6?( zkA@`+M?RBFmlTzXX1moxKOtob>Cx<}JgDpcZBAtK1x%kkoIpp7&kH8C`@ zv8C168+Qi>NN4D_U)EA{Zgb|=3R=Jb!50Pi_g}J}-`U}^TQp5+s5KJL@v20Lh=}vb zGg9#CK2UC}5oq8Q2#-%5CQs`%*clkk9V~Z~QbFTG&ZKvSvDT*6ad1+A`e%mL_>)FZ=U{Na*Ulh^R(>i0rY3u}l;(?~pvi%#?BZ-W zpRa!mU)q^`SONlY9~#QG{o#Ob0MMgkjJE*3Z=BrWEx1R5i4H0PEDHR17-DIu`P$-N@kblWg*pB8k)v&<}* z-iudmYb`$n#L$?0dmRycMS_AlbU#Hg*m9lsipB;D9rZs~cogr6v!)jQ>Ta}gFfru| z`*AWeJMWBwZ_W#k$&MoglEN_8wE3;AZDugOCEa5>cz58KUx2 zr)9vxp~f}N2n_^0oWL5d)5>f_o!1!Q``227kU(gEhR>dlXjEsygBWGoxS~aC= z250o&za?C2F6#Z0-jEOenK0fGBPqHrQ+K2oLpQTo|k>VJ$MD9%iK_2UT%Tc z8Efbu46gsulfPn?8wZ74qN2WiwzzZNW#0WBqh!**psuc-oRV^QaDW1Yl2SBUb@Z$h z-NP2=Y?v~%*!g8;ef|A~&q1MJpW(AoJ7^%D0MxALa|(22m_=mr%T@Kv9-A!f9PaIHZEqjzHd_{EkxOU+HtyGT_Wqko zj|@23WG+l!JnK2T)Hb;&r!`%d%APh*`j?O>n3cA#@9!!bDhw;VfFqxq|D=Z&`v!7n z{np(9S+9!O130}V_Wv0zxNyeQ9vdBc9{)qdHv!`J^Ccr`_Em*H+fK6{qstZ{(KvGI 
[GIT binary patch data omitted]
literal 0
HcmV?d00001

diff --git a/previews/PR2365/assets/rnn-basic.png b/previews/PR2365/assets/rnn-basic.png
new file mode 100644
index 0000000000000000000000000000000000000000..ad7313cfa4989a0f471edafe38007ec7a6522ae2
GIT binary patch
literal 10304
[GIT binary patch data omitted]
zTQD%pe*NLr0a0^D& zw)y$U*HH&Zf_BgJ_XvaU$Zzlz3Yj%`gG_{y|1zpr=fe}Ijcv;t>f*0A{oCyp#5>s74I7T@LC z;+A97uC@ee1|JtAAg$vL5n1Amshmt+`Pt}NCa@|G7fC)By=*^K=L5FLg5>@wuHU5J z#D?}iSSX8T;yjSACV!wx&_RAxk$O+5DG|ivQO9)l_ZhEE5j}R4!pAKcBWibqVRRU{p!c7X5*% z(lNQ!wYA)`BTg3^JGbU)M8l_0Bv>VjSR#vSv6#xv<=q)~H?Xl&?Lwmccly!t6H?wR zdRjL{q=RH#InrG5$6w#{O{M^r&uryRBAbfRy~?WK4~v-*l~*DSF}(;L3v-6zx`nIl zudw}0c#&D8Nalx0NkhRm9qFkZ?1&M7rE%$a-_oQrI64TFcSm9|g;eCV%MrKl(}&*DVp!m%{* z;h*k87Fe+svsl_*w|2dj=l-9%P8kYb7zGub?G3=G(1_Z+T)hpy|ey7C^QI_N=?d^maZVC({!?|J8^)G5ZVa1A_ zq#sd_jDSVW1Yq=sUxMTlY$ZKA;Vi$#v`^M@vXu*6uoHp=eGUhc3Y>Z5YVvrg4dt zBE6E$K|6Vpm;(Bx?=Bvj9liL5DwlDQP{V>3v@ZdpX;5wZ+|4ih6`3MPcPpbe^CGoF8qG|)W1B(H;0kRs1-vm7FMuf z2{|6}t2yXT@axt>>GX!@Xa+M~fvbdy!HZ~lc)GH;ZPpg2wkI_CggHVm9lo5v3-!`^ zH3yi4d!jrP51mf0hGHpyp$PweO@nEcH?QO#P3d9=vuuU+A8?Rcz?zFo*5J z2WUgJa5D0#y!r$;B;TF-c$Y&!jVb|iuU8HejEB&`#39E`NFfuw1AM9WGi84`yhvi_ zB(zJlvv>tzS3UPa(%;N)E3HOljrtbRX+rOg%g0jgd$D*b5`&_+5pAL>^3~f&^d4J8 zxw3+a~wyS?8Q>6yX`ic&`? zv-fs*K{7N9YBkI+qMWbPj4^6j;O4V!twH{^)R({zZPtq))HDo^{*w3-^&uU6UHZhL zJV#=+X|gC$3;xRR_gLTE90^|=71X@<;p+hPd@kv~;xoZILK<)4No(+Vxs5jb1s#d&fyIHCn?i|Ukuf(uSN8>2Iy9BoSnDj*W@ox*Tbe% zP(XUdiMB!c|Jw-)hh(3uU?~)D9Cswf<^BpZ96OHT@4Lhc{u^+})EAFYb(;Ho{xkCy zwx1XYgt=!S#}{CY*RMX6!iga>m`Xp#C(i-;aos`6cSK+CztO?cx5I6#sg7Yc0lask z zi>C~1c7p_~Fl@tpf?3(+OSsma=M1~L(32(i4A~=QneS z;p3Y*{ulEoGK4>aeA`e@$PuD3Td*z_U{?Rb{Yc7Xp-LCEcFXMGYi?UgPWPoK9o!N! z)f}?WnGkK9f_)j5K5T3ma(CQe!V9(@5;n5eOHXW&@fJ-QR;+vcSyZZ_l!Y#z>ph9! 
zlQe8H#h0$Lo4)I#1(CKtzUY?FGNk`Ipz8RFn(Y!8Dsn-UseCN+XV9DDW-C7)@sF0) zf1|72cW~M1yEFfEvQ(dat;Lw0Z#_+F{Q7a!@W&e~rbhJy>(EgY?Fi-SEuu@jHJ3ev zVtGS9;&_wu*a|N9htOAOFQEAn440D!frtQ{o*XD$-?4ghJXjEEVl~CTp?9E4A$rgd_oQuWd>TkY?S!Pz*H} zV3oaoF(`@`$i%K?o*Ohk1r3k63d|-IW%~vtuXN>ngaH9X*!>(Fz^5o%nrslc7S6dL z0+D@4LJ-dXs{PMSC^Jl!@-dA(>^>?@{lIVt-j2;(7hX53qZ@KzDM-6 zcqQ{&t8q+n8K7B8gC5(|$_hXYzI_WOHt^y$t{3rQr5ky%B(F!)VF4M9wig;MP6Oh< zTNL&1u|G$D!k}pHkj-{c7-)BYpS={5s^J!FlY++MWj(%e4ah^7OIy5!av455=`IuS z@8BXbg85X5n5UKo@~Y9qCS=)^1fAP(zE%V49C5-@zf$6DU6<5 ztd6W@9{NwFNcCE5q;qJGseoyl4WweO4eCkRvQ(3u>+#?t`tE#S!+*d zQ&N7#GIK-=YV~<-Bu|O|Y({6PiQf>NIDcG`^6OU@(;mRq50it9Vft0@LHWFAwDI_u zw%j}@9q_PRi4^+~b)P0x z`^uW~W8c=1nlcMe-+bp!e)?bqDC(OK>l-!RBjgKD&K0XCKw4qbbx3UKL2=bhN780p z0!v{Jdu3(%&>0z-%|t!2^V+^t^|Pk+V4SYG(VdqnN2`%cBHuPhv17$e+KC3QL2Zk$ z`SsiZnH}fbtSbqCrx~_vRLE|9efi_vRxnezLphzoGnt_MsS`_?^ua(0nicHKl)j6P zToOCW;8b_td!cwHRM1P_wmwq-_HyI-HA`AcS);*-A}qx!Gp%8fsYxayK5oVG9bW$W z>KpkLM=?6)Mm}`DMC}t~YNoMpnbFZI83bFCl$P^XWveAs)cfJxa+ym7%}7(FJtX&+ z$@l5TQLWFzMTUaI-@5J6yFFnjhZosJ&-f1{>lSDeU?)nh!TG7pAmd_qOFaLmp%&T) z&3$}jLBjH=w6ji5!+MwLLrd+R$etouj(#~NyBmdMC7lkH5Ck@HBt$C-(0}uY9C!r!ou*Aa+ZMhUn|n_nSEf~qG=-u~ z_Fc-XAS*)%-qx)A&iO4ZMvDRhXt)KFI&>>3j$7`km-Bt^^IIUYatw1R=vSSJXK7H8 zb>sX*H7AGO7NdVFEM5Y|lIYKD3PG2wE~2|Q|6!10^E&!hk*+r$?a?#TD)e=@SIj;> zj>%(FfZCQnm_Zf&O}|+InA59S{*7QbZ(dB>g)Plm(NDL+!l~>i%+;9UFYFG>y;n9% zev5aAr6o<4O-0fM5%#9Ml@J8!MhiEl_|lyIS#Da{4}1*<$ChDi>}FX z?uM~OCxw-<@y_4o(_1G04Eputz`4M-S|vx0@m!uDCiKDd)M;xn4XJ<0MfwYlqyE$J1mx0hT#e<)}RdSes7auhqJM%kE z)MFlNDk*UaCBs|oQiC8OhizSC&aiUv+UDY5H*mE>BI_OLiIm+PQthfiSe`wES`4cX#w z1cbs&Eo@fy*{fP;qulU&w7SmYka-XEC^_oeA~hcgaeS$&6;O{xsI!X@fCSV3$yui3tnIM*2{AU(kD%f+5UnV z(;jg7Tm%!AEm0Ph?NY|?jPL6Zb7NMKd@rH(I`-)@xyn6Wa&xFzC&;246NtJ^ScYLv zkTAR7lG2<6YrXtl|A)V)^H}u%%`hYA60^$WBs$jIS}~5bS=h#>?ruIbUqg5IJ{#ia zUokSJe=z#kd^E)WWVxb^Vw$${vk@_dO*{XCH+ekT{;x<5T@qhuT(e)y_z!9-2!<*V zgqvf%SPRhRHQ?aEh2MIJ(eVp|8VF(ZBwMq~N>F$u2BOsn^NgobKi;I%sw7Y#y3-^F 
z;4XFe`O1sH85y`2rFkx0H9I(xgQ`4fokx@0-(aG9z8EOvPW)O1DjQl`n2=OuDVMGT zCXa*ySr?dlY_RMZ-aOIss}Z_YNCGScV!8V(B=c^WjL~7PGPXRc`&cuf1tq}BbGE6T z=TF#Vl8ImT*l3X143j|O_QcPk@2NTpuQpPI>WF;h!Ck-##m@7$3L}bWR(>LJ=5fd4 zLz$X0#y0M`v>i(;89iB$G58_WlcAVjE30Af>&t0}m+O%*D>!=YQE zYRsA=at?08X(A?&Rz2?|H8B7fV=JZ5qB{q=@{tM%$lfIc$3K@VpozdDS2aWMz?C^S z&H*w^@NO|EH}V3<0~H7WM-M#(B7;zS%v4yy?ejXgfS$Q#O8??wezLIRguD{(@8P}m z$&4bX()tC~w&PrOfxNo?Hm3X{E%;v){y(0I+=ZeDJff0}>_R$;rC+oa)xAjdT8@gG zUzqe9k~kPVOMmdt|A*bv5VwzyCIa^<`bmJDAs1=H@)RZ}>WtEf8T4=v*f;)ztXYX~ zT%^%Gn`=bhNBnDcsey7`ol>rNcIdF(`f9fz7{tFpeZc9<7JixU?kbW20=>!u9z8>t zmVNRA`&HcV%gTu*kO~WUA0U_Ns;kXT4>%eHNKZy-wpZ@pT8%FISvfU5bMc%a;TQ95 zPoozf{;!;RJk;UofVU%7X#7iqX@=6lMV`vZP$iHH`_a;g{e#XGDx&8&TM^x!Hs?+q z-lJFl57ZM4))^zQF}K!zHjp};ysLsvtyW8iPH0Mv>>aU=>R;3ja)+e{u(}jCFA{JL zwPIDRA;#&vvzqSJr5A#F0}Qw`88v3d6$5)j0ewUTm$^DesD4ua&oYlrerZqM zs=kED%Ap%Y`zETCtGIit2!@Q)Q{rMerIYiI3-h!_+W{i-e1C()PK$O$IC@QasE~9d zuhLK9W@|8p;9_Oos^_8v_*}moFV+sylenHyz7zT*bd%j~kQCt4`oDnnX`>bN&C1vBSZ3dz2Hc|TQx>-Zu z!DbhPyZ5sTw}uHkeYt!~TH&B#w>|hPQ-h7ajB@q4O#q{@l(_W)f z4)f-Z(?5q6wg-%q!qT&kDFsbHaPcW$}*eDoq46jO^mzzY|FApJ&G( zjZ?VM=*p=_L)zNbb*tu<FWfuAo zKp@}>y7T(ls*WQ!7$1t4WsDDH{Cj?>3=pAZVW3XnF$09~R!UTL{vJ5;2{#h)VOchqe6 zwB*`(!CwFkA1ox2{2G?YJDQd$&gFZ~mI$4^KS9$0Y*H%#vh}~me}iaNLEHSSh_E5E zc@EeVAPesBDILd&Biyfzuo z4|!^3HfNo5Pltq9XC0~)I;$USKBljFi?0e1yJ4e-3l!@$1n@1K=pz7PY^gvTTZ*L` za}ojg8Ye&Zh4soaq1DBHVdg7 z9{5P2s$#yD-c%(BtN^LuSqEBPEeiP}QLm1X5Ax9@wcXw~xxX!zAAtx!GaW8rhLSCm zy=2sT@hkD5KTC7zALl;n+3yf)55e|-4Q-jtr)Bbsskd4wp{0@tFM|r=aDPiJ{WA!# zazB+B9~pj%;RI3EJiMkLA8WCko2VJW4dYu79}S$%%jR;+85D|pyJXh)Q#(^GHoMTc zGKm(h5*#I@(@zC3S~7NMtNy6wEb$#Z!ala0FlLV8$D|@JPJOL?9!&pM0s@Z#P|qi4 zj(at2&8;vUtCi#mS6R^G-t$uFOYn@5w zWSY?701#qc;(0lT!5n|+aNZ{#`cHKrMb=Fom7tAb^$dw`EXY5r#kN+`3;gT|U%n6U zj|Z5f&b=@zZ&3<@-UL+;GY?rmV~PvplWdck#iU~h!2QicF?!Pr9`Ld5FW)izDYrEP zw_A5?|IF8}81;>Okq0HVT-L)}piUg!^kLm~qky-Dg1gjVb>HsBE8l$+IBHf(OLzP0 zQ@+NBFt#vbMSwjlm=QZ8aAyIiK3`iE#g9q%$ct5lXsSq*S0E24{neETOlT^sy;eb! 
zoTiy9wF8SCmwNn}j}AlKSykG3GVCZW@e74xHC&RGU=b2($$ipIo zSep897AHk`lH}(P>JYuZs9BlUm#E+ZCScf)Di^;l%2zHPU!Vs0!^b+2Q3NPZ!57|( zjlJ=*_xZ?xrw%m*K=5v4@g}=hM8B~QEX$kz7)6yxawM4yYWEHrVy-1B!;`7~%GjbsLTQ9wP2UR9s&N*2r+0{| zlBM=J!FSsV$@k(xJGy3yg!U>kzFK!^VEn8vtsO_&kcSc!$?M=}V|}6!Aj=*@_bqILR3)RZ7k6yKv6l*tXU9&u3k0_=wt(A{xtvE2d@^tTbUQ!P8nPP z1&4g`%h`B$I-eG%0Psp_#vsc7CjZ@FM^!F_#}%WlrLG@twGMXEBObR0UPjhqJ0SEM zk9yQ|@dnLP+OAx6cHLGqh}p{g$HUM^&Ns*?4M=W1hy`oC%IxU!$aG`+N*_@Y8? z@|;R~_i~T;qFrPMXczhN+3lLyNt}~9of*(clH#cnCw5*v&*wFI$eT;Nz@7B=Y}XJh zH&sKQ^-IbQFJSC%%_arS$S+HMyxSYYSoK1tduQ|8^IgMg02rZMW!@Vce=!?0rsQ@B zMxxtQiQe2u=aH^;a054uodxH0HAFT6yc6!95I@&_T3||Ak{94=AWI8&HI!h2hjg;{ zbc`Z=Qlge!?}L$Eiy?_2RzCnB+{x;lO5(+9-Q7>W7Ki5Q|&LMf=K}1_$h_ zoIz9yfcWA8HYsDGzPrezysO@A*?uc(>nr+sGK&-kf`N?G_b{>8kM-Auv&yD1-AEjV zm!I|;5CE4%@W`lnG3Sjzp*iiy6Kw7~hx48w7KuyC-{_~r@th-Gl|y_t{o@PBJN^;1 z%UFo`u%G;uB-DU?O)*qGxx#!wKVWi`CgB-<(VyPNV%yKch*SqtlaJ~wz89JjqV-eb zoGxIsy5sl3e<6{6+xOP5 zU!wfcQ)IJ50K(K@gbfW%x!fVOw~9bDbYr6Z4_p6j{vf#D=~I5^fWGt@_0ZQ^Ptpk$ zCE*L>++%H2!^vco4*u^=ak$WP0YoNH1P3(DjF-te_@eB5V5U F{{Y^;#3=v( literal 0 HcmV?d00001 diff --git a/previews/PR2365/assets/search.js b/previews/PR2365/assets/search.js new file mode 100644 index 0000000000..c133f74101 --- /dev/null +++ b/previews/PR2365/assets/search.js @@ -0,0 +1,267 @@ +// Generated by Documenter.jl +requirejs.config({ + paths: { + 'lunr': 'https://cdnjs.cloudflare.com/ajax/libs/lunr.js/2.3.9/lunr.min', + 'lodash': 'https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.21/lodash.min', + 'jquery': 'https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min', + } +}); +//////////////////////////////////////////////////////////////////////////////// +require(['jquery', 'lunr', 'lodash'], function($, lunr, _) { + +$(document).ready(function() { + // parseUri 1.2.2 + // (c) Steven Levithan + // MIT License + function parseUri (str) { + var o = parseUri.options, + m = o.parser[o.strictMode ? 
"strict" : "loose"].exec(str), + uri = {}, + i = 14; + + while (i--) uri[o.key[i]] = m[i] || ""; + + uri[o.q.name] = {}; + uri[o.key[12]].replace(o.q.parser, function ($0, $1, $2) { + if ($1) uri[o.q.name][$1] = $2; + }); + + return uri; + }; + parseUri.options = { + strictMode: false, + key: ["source","protocol","authority","userInfo","user","password","host","port","relative","path","directory","file","query","anchor"], + q: { + name: "queryKey", + parser: /(?:^|&)([^&=]*)=?([^&]*)/g + }, + parser: { + strict: /^(?:([^:\/?#]+):)?(?:\/\/((?:(([^:@]*)(?::([^:@]*))?)?@)?([^:\/?#]*)(?::(\d*))?))?((((?:[^?#\/]*\/)*)([^?#]*))(?:\?([^#]*))?(?:#(.*))?)/, + loose: /^(?:(?![^:@]+:[^:@\/]*@)([^:\/?#.]+):)?(?:\/\/)?((?:(([^:@]*)(?::([^:@]*))?)?@)?([^:\/?#]*)(?::(\d*))?)(((\/(?:[^?#](?![^?#\/]*\.[^?#\/.]+(?:[?#]|$)))*\/?)?([^?#\/]*))(?:\?([^#]*))?(?:#(.*))?)/ + } + }; + + $("#search-form").submit(function(e) { + e.preventDefault() + }) + + // list below is the lunr 2.1.3 list minus the intersect with names(Base) + // (all, any, get, in, is, only, which) and (do, else, for, let, where, while, with) + // ideally we'd just filter the original list but it's not available as a variable + lunr.stopWordFilter = lunr.generateStopWordFilter([ + 'a', + 'able', + 'about', + 'across', + 'after', + 'almost', + 'also', + 'am', + 'among', + 'an', + 'and', + 'are', + 'as', + 'at', + 'be', + 'because', + 'been', + 'but', + 'by', + 'can', + 'cannot', + 'could', + 'dear', + 'did', + 'does', + 'either', + 'ever', + 'every', + 'from', + 'got', + 'had', + 'has', + 'have', + 'he', + 'her', + 'hers', + 'him', + 'his', + 'how', + 'however', + 'i', + 'if', + 'into', + 'it', + 'its', + 'just', + 'least', + 'like', + 'likely', + 'may', + 'me', + 'might', + 'most', + 'must', + 'my', + 'neither', + 'no', + 'nor', + 'not', + 'of', + 'off', + 'often', + 'on', + 'or', + 'other', + 'our', + 'own', + 'rather', + 'said', + 'say', + 'says', + 'she', + 'should', + 'since', + 'so', + 'some', + 'than', + 'that', + 
'the', + 'their', + 'them', + 'then', + 'there', + 'these', + 'they', + 'this', + 'tis', + 'to', + 'too', + 'twas', + 'us', + 'wants', + 'was', + 'we', + 'were', + 'what', + 'when', + 'who', + 'whom', + 'why', + 'will', + 'would', + 'yet', + 'you', + 'your' + ]) + + // add . as a separator, because otherwise "title": "Documenter.Anchors.add!" + // would not find anything if searching for "add!", only for the entire qualification + lunr.tokenizer.separator = /[\s\-\.]+/ + + // custom trimmer that doesn't strip @ and !, which are used in julia macro and function names + lunr.trimmer = function (token) { + return token.update(function (s) { + return s.replace(/^[^a-zA-Z0-9@!]+/, '').replace(/[^a-zA-Z0-9@!]+$/, '') + }) + } + + lunr.Pipeline.registerFunction(lunr.stopWordFilter, 'juliaStopWordFilter') + lunr.Pipeline.registerFunction(lunr.trimmer, 'juliaTrimmer') + + var index = lunr(function () { + this.ref('location') + this.field('title',{boost: 100}) + this.field('text') + documenterSearchIndex['docs'].forEach(function(e) { + this.add(e) + }, this) + }) + var store = {} + + documenterSearchIndex['docs'].forEach(function(e) { + store[e.location] = {title: e.title, category: e.category, page: e.page} + }) + + $(function(){ + searchresults = $('#documenter-search-results'); + searchinfo = $('#documenter-search-info'); + searchbox = $('#documenter-search-query'); + searchform = $('.docs-search'); + sidebar = $('.docs-sidebar'); + function update_search(querystring) { + tokens = lunr.tokenizer(querystring) + results = index.query(function (q) { + tokens.forEach(function (t) { + q.term(t.toString(), { + fields: ["title"], + boost: 100, + usePipeline: true, + editDistance: 0, + wildcard: lunr.Query.wildcard.NONE + }) + q.term(t.toString(), { + fields: ["title"], + boost: 10, + usePipeline: true, + editDistance: 2, + wildcard: lunr.Query.wildcard.NONE + }) + q.term(t.toString(), { + fields: ["text"], + boost: 1, + usePipeline: true, + editDistance: 0, + wildcard: 
lunr.Query.wildcard.NONE + }) + }) + }) + searchinfo.text("Number of results: " + results.length) + searchresults.empty() + results.forEach(function(result) { + data = store[result.ref] + link = $(''+data.title+'') + link.attr('href', documenterBaseURL+'/'+result.ref) + if (data.category != "page"){ + cat = $('('+data.category+', '+data.page+')') + } else { + cat = $('('+data.category+')') + } + li = $('
  • ').append(link).append(" ").append(cat) + searchresults.append(li) + }) + } + + function update_search_box() { + querystring = searchbox.val() + update_search(querystring) + } + + searchbox.keyup(_.debounce(update_search_box, 250)) + searchbox.change(update_search_box) + + // Disable enter-key form submission for the searchbox on the search page + // and just re-run search rather than refresh the whole page. + searchform.keypress( + function(event){ + if (event.which == '13') { + if (sidebar.hasClass('visible')) { + sidebar.removeClass('visible'); + } + update_search_box(); + event.preventDefault(); + } + } + ); + + search_query_uri = parseUri(window.location).queryKey["q"] + if(search_query_uri !== undefined) { + search_query = decodeURIComponent(search_query_uri.replace(/\+/g, '%20')) + searchbox.val(search_query) + } + update_search_box(); + }) +}) + +}) diff --git a/previews/PR2365/assets/themes/documenter-dark.css b/previews/PR2365/assets/themes/documenter-dark.css new file mode 100644 index 0000000000..c94a294dcf --- /dev/null +++ b/previews/PR2365/assets/themes/documenter-dark.css @@ -0,0 +1,7 @@ +@keyframes spinAround{from{transform:rotate(0deg)}to{transform:rotate(359deg)}}html.theme--documenter-dark .tabs,html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .pagination-next,html.theme--documenter-dark .pagination-link,html.theme--documenter-dark .pagination-ellipsis,html.theme--documenter-dark .breadcrumb,html.theme--documenter-dark .file,html.theme--documenter-dark .button,.is-unselectable,html.theme--documenter-dark .modal-close,html.theme--documenter-dark .delete{-webkit-touch-callout:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}html.theme--documenter-dark .navbar-link:not(.is-arrowless)::after,html.theme--documenter-dark .select:not(.is-multiple):not(.is-loading)::after{border:3px solid rgba(0,0,0,0);border-radius:2px;border-right:0;border-top:0;content:" 
";display:block;height:0.625em;margin-top:-0.4375em;pointer-events:none;position:absolute;top:50%;transform:rotate(-45deg);transform-origin:center;width:0.625em}html.theme--documenter-dark .admonition:not(:last-child),html.theme--documenter-dark .tabs:not(:last-child),html.theme--documenter-dark .message:not(:last-child),html.theme--documenter-dark .list:not(:last-child),html.theme--documenter-dark .level:not(:last-child),html.theme--documenter-dark .breadcrumb:not(:last-child),html.theme--documenter-dark .highlight:not(:last-child),html.theme--documenter-dark .block:not(:last-child),html.theme--documenter-dark .title:not(:last-child),html.theme--documenter-dark .subtitle:not(:last-child),html.theme--documenter-dark .table-container:not(:last-child),html.theme--documenter-dark .table:not(:last-child),html.theme--documenter-dark .progress:not(:last-child),html.theme--documenter-dark .notification:not(:last-child),html.theme--documenter-dark .content:not(:last-child),html.theme--documenter-dark .box:not(:last-child){margin-bottom:1.5rem}html.theme--documenter-dark .modal-close,html.theme--documenter-dark .delete{-moz-appearance:none;-webkit-appearance:none;background-color:rgba(10,10,10,0.2);border:none;border-radius:290486px;cursor:pointer;pointer-events:auto;display:inline-block;flex-grow:0;flex-shrink:0;font-size:0;height:20px;max-height:20px;max-width:20px;min-height:20px;min-width:20px;outline:none;position:relative;vertical-align:top;width:20px}html.theme--documenter-dark .modal-close::before,html.theme--documenter-dark .delete::before,html.theme--documenter-dark .modal-close::after,html.theme--documenter-dark .delete::after{background-color:#fff;content:"";display:block;left:50%;position:absolute;top:50%;transform:translateX(-50%) translateY(-50%) rotate(45deg);transform-origin:center center}html.theme--documenter-dark .modal-close::before,html.theme--documenter-dark .delete::before{height:2px;width:50%}html.theme--documenter-dark 
.modal-close::after,html.theme--documenter-dark .delete::after{height:50%;width:2px}html.theme--documenter-dark .modal-close:hover,html.theme--documenter-dark .delete:hover,html.theme--documenter-dark .modal-close:focus,html.theme--documenter-dark .delete:focus{background-color:rgba(10,10,10,0.3)}html.theme--documenter-dark .modal-close:active,html.theme--documenter-dark .delete:active{background-color:rgba(10,10,10,0.4)}html.theme--documenter-dark .is-small.modal-close,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.modal-close,html.theme--documenter-dark .is-small.delete,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.delete{height:16px;max-height:16px;max-width:16px;min-height:16px;min-width:16px;width:16px}html.theme--documenter-dark .is-medium.modal-close,html.theme--documenter-dark .is-medium.delete{height:24px;max-height:24px;max-width:24px;min-height:24px;min-width:24px;width:24px}html.theme--documenter-dark .is-large.modal-close,html.theme--documenter-dark .is-large.delete{height:32px;max-height:32px;max-width:32px;min-height:32px;min-width:32px;width:32px}html.theme--documenter-dark .control.is-loading::after,html.theme--documenter-dark .select.is-loading::after,html.theme--documenter-dark .loader,html.theme--documenter-dark .button.is-loading::after{animation:spinAround 500ms infinite linear;border:2px solid #dbdee0;border-radius:290486px;border-right-color:transparent;border-top-color:transparent;content:"";display:block;height:1em;position:relative;width:1em}html.theme--documenter-dark .hero-video,html.theme--documenter-dark .modal-background,html.theme--documenter-dark .modal,html.theme--documenter-dark .image.is-square img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-square img,html.theme--documenter-dark .image.is-square .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-square .has-ratio,html.theme--documenter-dark .image.is-1by1 
img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by1 img,html.theme--documenter-dark .image.is-1by1 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by1 .has-ratio,html.theme--documenter-dark .image.is-5by4 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by4 img,html.theme--documenter-dark .image.is-5by4 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by4 .has-ratio,html.theme--documenter-dark .image.is-4by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by3 img,html.theme--documenter-dark .image.is-4by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by3 .has-ratio,html.theme--documenter-dark .image.is-3by2 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by2 img,html.theme--documenter-dark .image.is-3by2 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by2 .has-ratio,html.theme--documenter-dark .image.is-5by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by3 img,html.theme--documenter-dark .image.is-5by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by3 .has-ratio,html.theme--documenter-dark .image.is-16by9 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-16by9 img,html.theme--documenter-dark .image.is-16by9 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-16by9 .has-ratio,html.theme--documenter-dark .image.is-2by1 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by1 img,html.theme--documenter-dark .image.is-2by1 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by1 .has-ratio,html.theme--documenter-dark .image.is-3by1 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by1 
img,html.theme--documenter-dark .image.is-3by1 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by1 .has-ratio,html.theme--documenter-dark .image.is-4by5 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by5 img,html.theme--documenter-dark .image.is-4by5 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by5 .has-ratio,html.theme--documenter-dark .image.is-3by4 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by4 img,html.theme--documenter-dark .image.is-3by4 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by4 .has-ratio,html.theme--documenter-dark .image.is-2by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by3 img,html.theme--documenter-dark .image.is-2by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by3 .has-ratio,html.theme--documenter-dark .image.is-3by5 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by5 img,html.theme--documenter-dark .image.is-3by5 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by5 .has-ratio,html.theme--documenter-dark .image.is-9by16 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-9by16 img,html.theme--documenter-dark .image.is-9by16 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-9by16 .has-ratio,html.theme--documenter-dark .image.is-1by2 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by2 img,html.theme--documenter-dark .image.is-1by2 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by2 .has-ratio,html.theme--documenter-dark .image.is-1by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by3 img,html.theme--documenter-dark .image.is-1by3 .has-ratio,html.theme--documenter-dark #documenter 
.docs-sidebar .docs-logo>img.is-1by3 .has-ratio,.is-overlay{bottom:0;left:0;position:absolute;right:0;top:0}html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .pagination-next,html.theme--documenter-dark .pagination-link,html.theme--documenter-dark .pagination-ellipsis,html.theme--documenter-dark .file-cta,html.theme--documenter-dark .file-name,html.theme--documenter-dark .select select,html.theme--documenter-dark .textarea,html.theme--documenter-dark .input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark .button{-moz-appearance:none;-webkit-appearance:none;align-items:center;border:1px solid transparent;border-radius:.4em;box-shadow:none;display:inline-flex;font-size:15px;height:2.25em;justify-content:flex-start;line-height:1.5;padding-bottom:calc(0.375em - 1px);padding-left:calc(0.625em - 1px);padding-right:calc(0.625em - 1px);padding-top:calc(0.375em - 1px);position:relative;vertical-align:top}html.theme--documenter-dark .pagination-previous:focus,html.theme--documenter-dark .pagination-next:focus,html.theme--documenter-dark .pagination-link:focus,html.theme--documenter-dark .pagination-ellipsis:focus,html.theme--documenter-dark .file-cta:focus,html.theme--documenter-dark .file-name:focus,html.theme--documenter-dark .select select:focus,html.theme--documenter-dark .textarea:focus,html.theme--documenter-dark .input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:focus,html.theme--documenter-dark .button:focus,html.theme--documenter-dark .is-focused.pagination-previous,html.theme--documenter-dark .is-focused.pagination-next,html.theme--documenter-dark .is-focused.pagination-link,html.theme--documenter-dark .is-focused.pagination-ellipsis,html.theme--documenter-dark .is-focused.file-cta,html.theme--documenter-dark .is-focused.file-name,html.theme--documenter-dark .select select.is-focused,html.theme--documenter-dark 
.is-focused.textarea,html.theme--documenter-dark .is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-focused.button,html.theme--documenter-dark .pagination-previous:active,html.theme--documenter-dark .pagination-next:active,html.theme--documenter-dark .pagination-link:active,html.theme--documenter-dark .pagination-ellipsis:active,html.theme--documenter-dark .file-cta:active,html.theme--documenter-dark .file-name:active,html.theme--documenter-dark .select select:active,html.theme--documenter-dark .textarea:active,html.theme--documenter-dark .input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:active,html.theme--documenter-dark .button:active,html.theme--documenter-dark .is-active.pagination-previous,html.theme--documenter-dark .is-active.pagination-next,html.theme--documenter-dark .is-active.pagination-link,html.theme--documenter-dark .is-active.pagination-ellipsis,html.theme--documenter-dark .is-active.file-cta,html.theme--documenter-dark .is-active.file-name,html.theme--documenter-dark .select select.is-active,html.theme--documenter-dark .is-active.textarea,html.theme--documenter-dark .is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active,html.theme--documenter-dark .is-active.button{outline:none}html.theme--documenter-dark .pagination-previous[disabled],html.theme--documenter-dark .pagination-next[disabled],html.theme--documenter-dark .pagination-link[disabled],html.theme--documenter-dark .pagination-ellipsis[disabled],html.theme--documenter-dark .file-cta[disabled],html.theme--documenter-dark .file-name[disabled],html.theme--documenter-dark .select select[disabled],html.theme--documenter-dark .textarea[disabled],html.theme--documenter-dark .input[disabled],html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[disabled],html.theme--documenter-dark 
.button[disabled],fieldset[disabled] html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark fieldset[disabled] .pagination-previous,fieldset[disabled] html.theme--documenter-dark .pagination-next,html.theme--documenter-dark fieldset[disabled] .pagination-next,fieldset[disabled] html.theme--documenter-dark .pagination-link,html.theme--documenter-dark fieldset[disabled] .pagination-link,fieldset[disabled] html.theme--documenter-dark .pagination-ellipsis,html.theme--documenter-dark fieldset[disabled] .pagination-ellipsis,fieldset[disabled] html.theme--documenter-dark .file-cta,html.theme--documenter-dark fieldset[disabled] .file-cta,fieldset[disabled] html.theme--documenter-dark .file-name,html.theme--documenter-dark fieldset[disabled] .file-name,fieldset[disabled] html.theme--documenter-dark .select select,fieldset[disabled] html.theme--documenter-dark .textarea,fieldset[disabled] html.theme--documenter-dark .input,fieldset[disabled] html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark fieldset[disabled] .select select,html.theme--documenter-dark .select fieldset[disabled] select,html.theme--documenter-dark fieldset[disabled] .textarea,html.theme--documenter-dark fieldset[disabled] .input,html.theme--documenter-dark fieldset[disabled] #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar fieldset[disabled] form.docs-search>input,fieldset[disabled] html.theme--documenter-dark .button,html.theme--documenter-dark fieldset[disabled] .button{cursor:not-allowed}/*! 
minireset.css v0.0.4 | MIT License | github.com/jgthms/minireset.css */html,body,p,ol,ul,li,dl,dt,dd,blockquote,figure,fieldset,legend,textarea,pre,iframe,hr,h1,h2,h3,h4,h5,h6{margin:0;padding:0}h1,h2,h3,h4,h5,h6{font-size:100%;font-weight:normal}ul{list-style:none}button,input,select,textarea{margin:0}html{box-sizing:border-box}*,*::before,*::after{box-sizing:inherit}img,embed,iframe,object,video{height:auto;max-width:100%}audio{max-width:100%}iframe{border:0}table{border-collapse:collapse;border-spacing:0}td,th{padding:0}td:not([align]),th:not([align]){text-align:left}.is-clearfix::after{clear:both;content:" ";display:table}.is-pulled-left{float:left !important}.is-pulled-right{float:right !important}.is-clipped{overflow:hidden !important}.is-size-1{font-size:3rem !important}.is-size-2{font-size:2.5rem !important}.is-size-3{font-size:2rem !important}.is-size-4{font-size:1.5rem !important}.is-size-5{font-size:1.25rem !important}.is-size-6{font-size:15px !important}.is-size-7,html.theme--documenter-dark .docstring>section>a.docs-sourcelink{font-size:.85em !important}@media screen and (max-width: 768px){.is-size-1-mobile{font-size:3rem !important}.is-size-2-mobile{font-size:2.5rem !important}.is-size-3-mobile{font-size:2rem !important}.is-size-4-mobile{font-size:1.5rem !important}.is-size-5-mobile{font-size:1.25rem !important}.is-size-6-mobile{font-size:15px !important}.is-size-7-mobile{font-size:.85em !important}}@media screen and (min-width: 769px),print{.is-size-1-tablet{font-size:3rem !important}.is-size-2-tablet{font-size:2.5rem !important}.is-size-3-tablet{font-size:2rem !important}.is-size-4-tablet{font-size:1.5rem !important}.is-size-5-tablet{font-size:1.25rem !important}.is-size-6-tablet{font-size:15px !important}.is-size-7-tablet{font-size:.85em !important}}@media screen and (max-width: 1055px){.is-size-1-touch{font-size:3rem !important}.is-size-2-touch{font-size:2.5rem !important}.is-size-3-touch{font-size:2rem !important}.is-size-4-touch{font-size:1.5rem 
!important}.is-size-5-touch{font-size:1.25rem !important}.is-size-6-touch{font-size:15px !important}.is-size-7-touch{font-size:.85em !important}}@media screen and (min-width: 1056px){.is-size-1-desktop{font-size:3rem !important}.is-size-2-desktop{font-size:2.5rem !important}.is-size-3-desktop{font-size:2rem !important}.is-size-4-desktop{font-size:1.5rem !important}.is-size-5-desktop{font-size:1.25rem !important}.is-size-6-desktop{font-size:15px !important}.is-size-7-desktop{font-size:.85em !important}}@media screen and (min-width: 1216px){.is-size-1-widescreen{font-size:3rem !important}.is-size-2-widescreen{font-size:2.5rem !important}.is-size-3-widescreen{font-size:2rem !important}.is-size-4-widescreen{font-size:1.5rem !important}.is-size-5-widescreen{font-size:1.25rem !important}.is-size-6-widescreen{font-size:15px !important}.is-size-7-widescreen{font-size:.85em !important}}@media screen and (min-width: 1408px){.is-size-1-fullhd{font-size:3rem !important}.is-size-2-fullhd{font-size:2.5rem !important}.is-size-3-fullhd{font-size:2rem !important}.is-size-4-fullhd{font-size:1.5rem !important}.is-size-5-fullhd{font-size:1.25rem !important}.is-size-6-fullhd{font-size:15px !important}.is-size-7-fullhd{font-size:.85em !important}}.has-text-centered{text-align:center !important}.has-text-justified{text-align:justify !important}.has-text-left{text-align:left !important}.has-text-right{text-align:right !important}@media screen and (max-width: 768px){.has-text-centered-mobile{text-align:center !important}}@media screen and (min-width: 769px),print{.has-text-centered-tablet{text-align:center !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-centered-tablet-only{text-align:center !important}}@media screen and (max-width: 1055px){.has-text-centered-touch{text-align:center !important}}@media screen and (min-width: 1056px){.has-text-centered-desktop{text-align:center !important}}@media screen and (min-width: 1056px) and (max-width: 
1215px){.has-text-centered-desktop-only{text-align:center !important}}@media screen and (min-width: 1216px){.has-text-centered-widescreen{text-align:center !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-centered-widescreen-only{text-align:center !important}}@media screen and (min-width: 1408px){.has-text-centered-fullhd{text-align:center !important}}@media screen and (max-width: 768px){.has-text-justified-mobile{text-align:justify !important}}@media screen and (min-width: 769px),print{.has-text-justified-tablet{text-align:justify !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-justified-tablet-only{text-align:justify !important}}@media screen and (max-width: 1055px){.has-text-justified-touch{text-align:justify !important}}@media screen and (min-width: 1056px){.has-text-justified-desktop{text-align:justify !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-justified-desktop-only{text-align:justify !important}}@media screen and (min-width: 1216px){.has-text-justified-widescreen{text-align:justify !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-justified-widescreen-only{text-align:justify !important}}@media screen and (min-width: 1408px){.has-text-justified-fullhd{text-align:justify !important}}@media screen and (max-width: 768px){.has-text-left-mobile{text-align:left !important}}@media screen and (min-width: 769px),print{.has-text-left-tablet{text-align:left !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-left-tablet-only{text-align:left !important}}@media screen and (max-width: 1055px){.has-text-left-touch{text-align:left !important}}@media screen and (min-width: 1056px){.has-text-left-desktop{text-align:left !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-left-desktop-only{text-align:left !important}}@media screen and (min-width: 
1216px){.has-text-left-widescreen{text-align:left !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-left-widescreen-only{text-align:left !important}}@media screen and (min-width: 1408px){.has-text-left-fullhd{text-align:left !important}}@media screen and (max-width: 768px){.has-text-right-mobile{text-align:right !important}}@media screen and (min-width: 769px),print{.has-text-right-tablet{text-align:right !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-right-tablet-only{text-align:right !important}}@media screen and (max-width: 1055px){.has-text-right-touch{text-align:right !important}}@media screen and (min-width: 1056px){.has-text-right-desktop{text-align:right !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-right-desktop-only{text-align:right !important}}@media screen and (min-width: 1216px){.has-text-right-widescreen{text-align:right !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-right-widescreen-only{text-align:right !important}}@media screen and (min-width: 1408px){.has-text-right-fullhd{text-align:right !important}}.is-capitalized{text-transform:capitalize !important}.is-lowercase{text-transform:lowercase !important}.is-uppercase{text-transform:uppercase !important}.is-italic{font-style:italic !important}.has-text-white{color:#fff !important}a.has-text-white:hover,a.has-text-white:focus{color:#e6e6e6 !important}.has-background-white{background-color:#fff !important}.has-text-black{color:#0a0a0a !important}a.has-text-black:hover,a.has-text-black:focus{color:#000 !important}.has-background-black{background-color:#0a0a0a !important}.has-text-light{color:#ecf0f1 !important}a.has-text-light:hover,a.has-text-light:focus{color:#cfd9db !important}.has-background-light{background-color:#ecf0f1 !important}.has-text-dark{color:#282f2f !important}a.has-text-dark:hover,a.has-text-dark:focus{color:#111414 
!important}.has-background-dark{background-color:#282f2f !important}.has-text-primary{color:#375a7f !important}a.has-text-primary:hover,a.has-text-primary:focus{color:#28415b !important}.has-background-primary{background-color:#375a7f !important}.has-text-link{color:#1abc9c !important}a.has-text-link:hover,a.has-text-link:focus{color:#148f77 !important}.has-background-link{background-color:#1abc9c !important}.has-text-info{color:#024c7d !important}a.has-text-info:hover,a.has-text-info:focus{color:#012d4b !important}.has-background-info{background-color:#024c7d !important}.has-text-success{color:#008438 !important}a.has-text-success:hover,a.has-text-success:focus{color:#005122 !important}.has-background-success{background-color:#008438 !important}.has-text-warning{color:#ad8100 !important}a.has-text-warning:hover,a.has-text-warning:focus{color:#7a5b00 !important}.has-background-warning{background-color:#ad8100 !important}.has-text-danger{color:#9e1b0d !important}a.has-text-danger:hover,a.has-text-danger:focus{color:#6f1309 !important}.has-background-danger{background-color:#9e1b0d !important}.has-text-black-bis{color:#121212 !important}.has-background-black-bis{background-color:#121212 !important}.has-text-black-ter{color:#242424 !important}.has-background-black-ter{background-color:#242424 !important}.has-text-grey-darker{color:#282f2f !important}.has-background-grey-darker{background-color:#282f2f !important}.has-text-grey-dark{color:#343c3d !important}.has-background-grey-dark{background-color:#343c3d !important}.has-text-grey{color:#5e6d6f !important}.has-background-grey{background-color:#5e6d6f !important}.has-text-grey-light{color:#8c9b9d !important}.has-background-grey-light{background-color:#8c9b9d !important}.has-text-grey-lighter{color:#dbdee0 !important}.has-background-grey-lighter{background-color:#dbdee0 !important}.has-text-white-ter{color:#ecf0f1 !important}.has-background-white-ter{background-color:#ecf0f1 !important}.has-text-white-bis{color:#fafafa 
!important}.has-background-white-bis{background-color:#fafafa !important}.has-text-weight-light{font-weight:300 !important}.has-text-weight-normal{font-weight:400 !important}.has-text-weight-medium{font-weight:500 !important}.has-text-weight-semibold{font-weight:600 !important}.has-text-weight-bold{font-weight:700 !important}.is-family-primary{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif !important}.is-family-secondary{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif !important}.is-family-sans-serif{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif !important}.is-family-monospace{font-family:"JuliaMono","SFMono-Regular","Menlo","Consolas","Liberation Mono","DejaVu Sans Mono",monospace !important}.is-family-code{font-family:"JuliaMono","SFMono-Regular","Menlo","Consolas","Liberation Mono","DejaVu Sans Mono",monospace !important}.is-block{display:block !important}@media screen and (max-width: 768px){.is-block-mobile{display:block !important}}@media screen and (min-width: 769px),print{.is-block-tablet{display:block !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-block-tablet-only{display:block !important}}@media screen and (max-width: 1055px){.is-block-touch{display:block !important}}@media screen and (min-width: 1056px){.is-block-desktop{display:block !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-block-desktop-only{display:block !important}}@media screen and (min-width: 1216px){.is-block-widescreen{display:block !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-block-widescreen-only{display:block !important}}@media screen and (min-width: 1408px){.is-block-fullhd{display:block !important}}.is-flex{display:flex !important}@media screen and (max-width: 
768px){.is-flex-mobile{display:flex !important}}@media screen and (min-width: 769px),print{.is-flex-tablet{display:flex !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-flex-tablet-only{display:flex !important}}@media screen and (max-width: 1055px){.is-flex-touch{display:flex !important}}@media screen and (min-width: 1056px){.is-flex-desktop{display:flex !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-flex-desktop-only{display:flex !important}}@media screen and (min-width: 1216px){.is-flex-widescreen{display:flex !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-flex-widescreen-only{display:flex !important}}@media screen and (min-width: 1408px){.is-flex-fullhd{display:flex !important}}.is-inline{display:inline !important}@media screen and (max-width: 768px){.is-inline-mobile{display:inline !important}}@media screen and (min-width: 769px),print{.is-inline-tablet{display:inline !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-inline-tablet-only{display:inline !important}}@media screen and (max-width: 1055px){.is-inline-touch{display:inline !important}}@media screen and (min-width: 1056px){.is-inline-desktop{display:inline !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-inline-desktop-only{display:inline !important}}@media screen and (min-width: 1216px){.is-inline-widescreen{display:inline !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-inline-widescreen-only{display:inline !important}}@media screen and (min-width: 1408px){.is-inline-fullhd{display:inline !important}}.is-inline-block{display:inline-block !important}@media screen and (max-width: 768px){.is-inline-block-mobile{display:inline-block !important}}@media screen and (min-width: 769px),print{.is-inline-block-tablet{display:inline-block !important}}@media screen and (min-width: 769px) and (max-width: 
1055px){.is-inline-block-tablet-only{display:inline-block !important}}@media screen and (max-width: 1055px){.is-inline-block-touch{display:inline-block !important}}@media screen and (min-width: 1056px){.is-inline-block-desktop{display:inline-block !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-inline-block-desktop-only{display:inline-block !important}}@media screen and (min-width: 1216px){.is-inline-block-widescreen{display:inline-block !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-inline-block-widescreen-only{display:inline-block !important}}@media screen and (min-width: 1408px){.is-inline-block-fullhd{display:inline-block !important}}.is-inline-flex{display:inline-flex !important}@media screen and (max-width: 768px){.is-inline-flex-mobile{display:inline-flex !important}}@media screen and (min-width: 769px),print{.is-inline-flex-tablet{display:inline-flex !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-inline-flex-tablet-only{display:inline-flex !important}}@media screen and (max-width: 1055px){.is-inline-flex-touch{display:inline-flex !important}}@media screen and (min-width: 1056px){.is-inline-flex-desktop{display:inline-flex !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-inline-flex-desktop-only{display:inline-flex !important}}@media screen and (min-width: 1216px){.is-inline-flex-widescreen{display:inline-flex !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-inline-flex-widescreen-only{display:inline-flex !important}}@media screen and (min-width: 1408px){.is-inline-flex-fullhd{display:inline-flex !important}}.is-hidden{display:none !important}.is-sr-only{border:none !important;clip:rect(0, 0, 0, 0) !important;height:0.01em !important;overflow:hidden !important;padding:0 !important;position:absolute !important;white-space:nowrap !important;width:0.01em !important}@media screen and (max-width: 
768px){.is-hidden-mobile{display:none !important}}@media screen and (min-width: 769px),print{.is-hidden-tablet{display:none !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-hidden-tablet-only{display:none !important}}@media screen and (max-width: 1055px){.is-hidden-touch{display:none !important}}@media screen and (min-width: 1056px){.is-hidden-desktop{display:none !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-hidden-desktop-only{display:none !important}}@media screen and (min-width: 1216px){.is-hidden-widescreen{display:none !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-hidden-widescreen-only{display:none !important}}@media screen and (min-width: 1408px){.is-hidden-fullhd{display:none !important}}.is-invisible{visibility:hidden !important}@media screen and (max-width: 768px){.is-invisible-mobile{visibility:hidden !important}}@media screen and (min-width: 769px),print{.is-invisible-tablet{visibility:hidden !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-invisible-tablet-only{visibility:hidden !important}}@media screen and (max-width: 1055px){.is-invisible-touch{visibility:hidden !important}}@media screen and (min-width: 1056px){.is-invisible-desktop{visibility:hidden !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-invisible-desktop-only{visibility:hidden !important}}@media screen and (min-width: 1216px){.is-invisible-widescreen{visibility:hidden !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-invisible-widescreen-only{visibility:hidden !important}}@media screen and (min-width: 1408px){.is-invisible-fullhd{visibility:hidden !important}}.is-marginless{margin:0 !important}.is-paddingless{padding:0 !important}.is-radiusless{border-radius:0 !important}.is-shadowless{box-shadow:none !important}.is-relative{position:relative !important}html.theme--documenter-dark{/*! 
+ Theme: a11y-dark + Author: @ericwbailey + Maintainer: @ericwbailey + + Based on the Tomorrow Night Eighties theme: https://github.com/isagalaev/highlight.js/blob/master/src/styles/tomorrow-night-eighties.css +*/}html.theme--documenter-dark html{background-color:#1f2424;font-size:16px;-moz-osx-font-smoothing:grayscale;-webkit-font-smoothing:antialiased;min-width:300px;overflow-x:auto;overflow-y:scroll;text-rendering:optimizeLegibility;text-size-adjust:100%}html.theme--documenter-dark article,html.theme--documenter-dark aside,html.theme--documenter-dark figure,html.theme--documenter-dark footer,html.theme--documenter-dark header,html.theme--documenter-dark hgroup,html.theme--documenter-dark section{display:block}html.theme--documenter-dark body,html.theme--documenter-dark button,html.theme--documenter-dark input,html.theme--documenter-dark select,html.theme--documenter-dark textarea{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif}html.theme--documenter-dark code,html.theme--documenter-dark pre{-moz-osx-font-smoothing:auto;-webkit-font-smoothing:auto;font-family:"JuliaMono","SFMono-Regular","Menlo","Consolas","Liberation Mono","DejaVu Sans Mono",monospace}html.theme--documenter-dark body{color:#fff;font-size:1em;font-weight:400;line-height:1.5}html.theme--documenter-dark a{color:#1abc9c;cursor:pointer;text-decoration:none}html.theme--documenter-dark a strong{color:currentColor}html.theme--documenter-dark a:hover{color:#1dd2af}html.theme--documenter-dark code{background-color:rgba(255,255,255,0.05);color:#ececec;font-size:.875em;font-weight:normal;padding:.1em}html.theme--documenter-dark hr{background-color:#282f2f;border:none;display:block;height:2px;margin:1.5rem 0}html.theme--documenter-dark img{height:auto;max-width:100%}html.theme--documenter-dark input[type="checkbox"],html.theme--documenter-dark input[type="radio"]{vertical-align:baseline}html.theme--documenter-dark 
small{font-size:.875em}html.theme--documenter-dark span{font-style:inherit;font-weight:inherit}html.theme--documenter-dark strong{color:#f2f2f2;font-weight:700}html.theme--documenter-dark fieldset{border:none}html.theme--documenter-dark pre{-webkit-overflow-scrolling:touch;background-color:#282f2f;color:#fff;font-size:.875em;overflow-x:auto;padding:1.25rem 1.5rem;white-space:pre;word-wrap:normal}html.theme--documenter-dark pre code{background-color:transparent;color:currentColor;font-size:1em;padding:0}html.theme--documenter-dark table td,html.theme--documenter-dark table th{vertical-align:top}html.theme--documenter-dark table td:not([align]),html.theme--documenter-dark table th:not([align]){text-align:left}html.theme--documenter-dark table th{color:#f2f2f2}html.theme--documenter-dark .box{background-color:#343c3d;border-radius:8px;box-shadow:none;color:#fff;display:block;padding:1.25rem}html.theme--documenter-dark a.box:hover,html.theme--documenter-dark a.box:focus{box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px #1abc9c}html.theme--documenter-dark a.box:active{box-shadow:inset 0 1px 2px rgba(10,10,10,0.2),0 0 0 1px #1abc9c}html.theme--documenter-dark .button{background-color:#282f2f;border-color:#4c5759;border-width:1px;color:#375a7f;cursor:pointer;justify-content:center;padding-bottom:calc(0.375em - 1px);padding-left:.75em;padding-right:.75em;padding-top:calc(0.375em - 1px);text-align:center;white-space:nowrap}html.theme--documenter-dark .button strong{color:inherit}html.theme--documenter-dark .button .icon,html.theme--documenter-dark .button .icon.is-small,html.theme--documenter-dark .button #documenter .docs-sidebar form.docs-search>input.icon,html.theme--documenter-dark #documenter .docs-sidebar .button form.docs-search>input.icon,html.theme--documenter-dark .button .icon.is-medium,html.theme--documenter-dark .button .icon.is-large{height:1.5em;width:1.5em}html.theme--documenter-dark .button .icon:first-child:not(:last-child){margin-left:calc(-0.375em - 
1px);margin-right:0.1875em}html.theme--documenter-dark .button .icon:last-child:not(:first-child){margin-left:0.1875em;margin-right:calc(-0.375em - 1px)}html.theme--documenter-dark .button .icon:first-child:last-child{margin-left:calc(-0.375em - 1px);margin-right:calc(-0.375em - 1px)}html.theme--documenter-dark .button:hover,html.theme--documenter-dark .button.is-hovered{border-color:#8c9b9d;color:#f2f2f2}html.theme--documenter-dark .button:focus,html.theme--documenter-dark .button.is-focused{border-color:#8c9b9d;color:#17a689}html.theme--documenter-dark .button:focus:not(:active),html.theme--documenter-dark .button.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(26,188,156,0.25)}html.theme--documenter-dark .button:active,html.theme--documenter-dark .button.is-active{border-color:#343c3d;color:#f2f2f2}html.theme--documenter-dark .button.is-text{background-color:transparent;border-color:transparent;color:#fff;text-decoration:underline}html.theme--documenter-dark .button.is-text:hover,html.theme--documenter-dark .button.is-text.is-hovered,html.theme--documenter-dark .button.is-text:focus,html.theme--documenter-dark .button.is-text.is-focused{background-color:#282f2f;color:#f2f2f2}html.theme--documenter-dark .button.is-text:active,html.theme--documenter-dark .button.is-text.is-active{background-color:#1d2122;color:#f2f2f2}html.theme--documenter-dark .button.is-text[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-text{background-color:transparent;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-white{background-color:#fff;border-color:transparent;color:#0a0a0a}html.theme--documenter-dark .button.is-white:hover,html.theme--documenter-dark .button.is-white.is-hovered{background-color:#f9f9f9;border-color:transparent;color:#0a0a0a}html.theme--documenter-dark .button.is-white:focus,html.theme--documenter-dark .button.is-white.is-focused{border-color:transparent;color:#0a0a0a}html.theme--documenter-dark 
.button.is-white:focus:not(:active),html.theme--documenter-dark .button.is-white.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(255,255,255,0.25)}html.theme--documenter-dark .button.is-white:active,html.theme--documenter-dark .button.is-white.is-active{background-color:#f2f2f2;border-color:transparent;color:#0a0a0a}html.theme--documenter-dark .button.is-white[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-white{background-color:#fff;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-white.is-inverted{background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .button.is-white.is-inverted:hover,html.theme--documenter-dark .button.is-white.is-inverted.is-hovered{background-color:#000}html.theme--documenter-dark .button.is-white.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-white.is-inverted{background-color:#0a0a0a;border-color:transparent;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-white.is-loading::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}html.theme--documenter-dark .button.is-white.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-white.is-outlined:hover,html.theme--documenter-dark .button.is-white.is-outlined.is-hovered,html.theme--documenter-dark .button.is-white.is-outlined:focus,html.theme--documenter-dark .button.is-white.is-outlined.is-focused{background-color:#fff;border-color:#fff;color:#0a0a0a}html.theme--documenter-dark .button.is-white.is-outlined.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-white.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-white.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-white.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-white.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}html.theme--documenter-dark .button.is-white.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-white.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-white.is-inverted.is-outlined{background-color:transparent;border-color:#0a0a0a;color:#0a0a0a}html.theme--documenter-dark .button.is-white.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-white.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-white.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-white.is-inverted.is-outlined.is-focused{background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .button.is-white.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-white.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-white.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-white.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-white.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-white.is-inverted.is-outlined{background-color:transparent;border-color:#0a0a0a;box-shadow:none;color:#0a0a0a}html.theme--documenter-dark .button.is-black{background-color:#0a0a0a;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-black:hover,html.theme--documenter-dark .button.is-black.is-hovered{background-color:#040404;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-black:focus,html.theme--documenter-dark .button.is-black.is-focused{border-color:transparent;color:#fff}html.theme--documenter-dark 
.button.is-black:focus:not(:active),html.theme--documenter-dark .button.is-black.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(10,10,10,0.25)}html.theme--documenter-dark .button.is-black:active,html.theme--documenter-dark .button.is-black.is-active{background-color:#000;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-black[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-black{background-color:#0a0a0a;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-black.is-inverted{background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .button.is-black.is-inverted:hover,html.theme--documenter-dark .button.is-black.is-inverted.is-hovered{background-color:#f2f2f2}html.theme--documenter-dark .button.is-black.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-black.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#0a0a0a}html.theme--documenter-dark .button.is-black.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-black.is-outlined{background-color:transparent;border-color:#0a0a0a;color:#0a0a0a}html.theme--documenter-dark .button.is-black.is-outlined:hover,html.theme--documenter-dark .button.is-black.is-outlined.is-hovered,html.theme--documenter-dark .button.is-black.is-outlined:focus,html.theme--documenter-dark .button.is-black.is-outlined.is-focused{background-color:#0a0a0a;border-color:#0a0a0a;color:#fff}html.theme--documenter-dark .button.is-black.is-outlined.is-loading::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}html.theme--documenter-dark .button.is-black.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-black.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-black.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-black.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-black.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-black.is-outlined{background-color:transparent;border-color:#0a0a0a;box-shadow:none;color:#0a0a0a}html.theme--documenter-dark .button.is-black.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-black.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-black.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-black.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-black.is-inverted.is-outlined.is-focused{background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .button.is-black.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-black.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-black.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-black.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}html.theme--documenter-dark .button.is-black.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-black.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-light{background-color:#ecf0f1;border-color:transparent;color:#282f2f}html.theme--documenter-dark .button.is-light:hover,html.theme--documenter-dark .button.is-light.is-hovered{background-color:#e5eaec;border-color:transparent;color:#282f2f}html.theme--documenter-dark .button.is-light:focus,html.theme--documenter-dark .button.is-light.is-focused{border-color:transparent;color:#282f2f}html.theme--documenter-dark 
.button.is-light:focus:not(:active),html.theme--documenter-dark .button.is-light.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(236,240,241,0.25)}html.theme--documenter-dark .button.is-light:active,html.theme--documenter-dark .button.is-light.is-active{background-color:#dde4e6;border-color:transparent;color:#282f2f}html.theme--documenter-dark .button.is-light[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-light{background-color:#ecf0f1;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-light.is-inverted{background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .button.is-light.is-inverted:hover,html.theme--documenter-dark .button.is-light.is-inverted.is-hovered{background-color:#1d2122}html.theme--documenter-dark .button.is-light.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-light.is-inverted{background-color:#282f2f;border-color:transparent;box-shadow:none;color:#ecf0f1}html.theme--documenter-dark .button.is-light.is-loading::after{border-color:transparent transparent #282f2f #282f2f !important}html.theme--documenter-dark .button.is-light.is-outlined{background-color:transparent;border-color:#ecf0f1;color:#ecf0f1}html.theme--documenter-dark .button.is-light.is-outlined:hover,html.theme--documenter-dark .button.is-light.is-outlined.is-hovered,html.theme--documenter-dark .button.is-light.is-outlined:focus,html.theme--documenter-dark .button.is-light.is-outlined.is-focused{background-color:#ecf0f1;border-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .button.is-light.is-outlined.is-loading::after{border-color:transparent transparent #ecf0f1 #ecf0f1 !important}html.theme--documenter-dark .button.is-light.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-light.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-light.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-light.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #282f2f #282f2f !important}html.theme--documenter-dark .button.is-light.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-light.is-outlined{background-color:transparent;border-color:#ecf0f1;box-shadow:none;color:#ecf0f1}html.theme--documenter-dark .button.is-light.is-inverted.is-outlined{background-color:transparent;border-color:#282f2f;color:#282f2f}html.theme--documenter-dark .button.is-light.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-light.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-light.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-light.is-inverted.is-outlined.is-focused{background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .button.is-light.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-light.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-light.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-light.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #ecf0f1 #ecf0f1 !important}html.theme--documenter-dark .button.is-light.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-light.is-inverted.is-outlined{background-color:transparent;border-color:#282f2f;box-shadow:none;color:#282f2f}html.theme--documenter-dark .button.is-dark,html.theme--documenter-dark .content kbd.button{background-color:#282f2f;border-color:transparent;color:#ecf0f1}html.theme--documenter-dark .button.is-dark:hover,html.theme--documenter-dark .content kbd.button:hover,html.theme--documenter-dark .button.is-dark.is-hovered,html.theme--documenter-dark .content kbd.button.is-hovered{background-color:#232829;border-color:transparent;color:#ecf0f1}html.theme--documenter-dark 
.button.is-dark:focus,html.theme--documenter-dark .content kbd.button:focus,html.theme--documenter-dark .button.is-dark.is-focused,html.theme--documenter-dark .content kbd.button.is-focused{border-color:transparent;color:#ecf0f1}html.theme--documenter-dark .button.is-dark:focus:not(:active),html.theme--documenter-dark .content kbd.button:focus:not(:active),html.theme--documenter-dark .button.is-dark.is-focused:not(:active),html.theme--documenter-dark .content kbd.button.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(40,47,47,0.25)}html.theme--documenter-dark .button.is-dark:active,html.theme--documenter-dark .content kbd.button:active,html.theme--documenter-dark .button.is-dark.is-active,html.theme--documenter-dark .content kbd.button.is-active{background-color:#1d2122;border-color:transparent;color:#ecf0f1}html.theme--documenter-dark .button.is-dark[disabled],html.theme--documenter-dark .content kbd.button[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-dark,fieldset[disabled] html.theme--documenter-dark .content kbd.button{background-color:#282f2f;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-dark.is-inverted,html.theme--documenter-dark .content kbd.button.is-inverted{background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .button.is-dark.is-inverted:hover,html.theme--documenter-dark .content kbd.button.is-inverted:hover,html.theme--documenter-dark .button.is-dark.is-inverted.is-hovered,html.theme--documenter-dark .content kbd.button.is-inverted.is-hovered{background-color:#dde4e6}html.theme--documenter-dark .button.is-dark.is-inverted[disabled],html.theme--documenter-dark .content kbd.button.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-dark.is-inverted,fieldset[disabled] html.theme--documenter-dark .content kbd.button.is-inverted{background-color:#ecf0f1;border-color:transparent;box-shadow:none;color:#282f2f}html.theme--documenter-dark 
.button.is-dark.is-loading::after,html.theme--documenter-dark .content kbd.button.is-loading::after{border-color:transparent transparent #ecf0f1 #ecf0f1 !important}html.theme--documenter-dark .button.is-dark.is-outlined,html.theme--documenter-dark .content kbd.button.is-outlined{background-color:transparent;border-color:#282f2f;color:#282f2f}html.theme--documenter-dark .button.is-dark.is-outlined:hover,html.theme--documenter-dark .content kbd.button.is-outlined:hover,html.theme--documenter-dark .button.is-dark.is-outlined.is-hovered,html.theme--documenter-dark .content kbd.button.is-outlined.is-hovered,html.theme--documenter-dark .button.is-dark.is-outlined:focus,html.theme--documenter-dark .content kbd.button.is-outlined:focus,html.theme--documenter-dark .button.is-dark.is-outlined.is-focused,html.theme--documenter-dark .content kbd.button.is-outlined.is-focused{background-color:#282f2f;border-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .button.is-dark.is-outlined.is-loading::after,html.theme--documenter-dark .content kbd.button.is-outlined.is-loading::after{border-color:transparent transparent #282f2f #282f2f !important}html.theme--documenter-dark .button.is-dark.is-outlined.is-loading:hover::after,html.theme--documenter-dark .content kbd.button.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-dark.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .content kbd.button.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-dark.is-outlined.is-loading:focus::after,html.theme--documenter-dark .content kbd.button.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-dark.is-outlined.is-loading.is-focused::after,html.theme--documenter-dark .content kbd.button.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #ecf0f1 #ecf0f1 !important}html.theme--documenter-dark .button.is-dark.is-outlined[disabled],html.theme--documenter-dark .content 
kbd.button.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-dark.is-outlined,fieldset[disabled] html.theme--documenter-dark .content kbd.button.is-outlined{background-color:transparent;border-color:#282f2f;box-shadow:none;color:#282f2f}html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined{background-color:transparent;border-color:#ecf0f1;color:#ecf0f1}html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined:hover,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined:focus,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined.is-focused,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined.is-focused{background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined.is-loading.is-focused::after,html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #282f2f #282f2f 
!important}html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined[disabled],html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-dark.is-inverted.is-outlined,fieldset[disabled] html.theme--documenter-dark .content kbd.button.is-inverted.is-outlined{background-color:transparent;border-color:#ecf0f1;box-shadow:none;color:#ecf0f1}html.theme--documenter-dark .button.is-primary,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink{background-color:#375a7f;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-primary:hover,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:hover,html.theme--documenter-dark .button.is-primary.is-hovered,html.theme--documenter-dark .docstring>section>a.button.is-hovered.docs-sourcelink{background-color:#335476;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-primary:focus,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:focus,html.theme--documenter-dark .button.is-primary.is-focused,html.theme--documenter-dark .docstring>section>a.button.is-focused.docs-sourcelink{border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-primary:focus:not(:active),html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:focus:not(:active),html.theme--documenter-dark .button.is-primary.is-focused:not(:active),html.theme--documenter-dark .docstring>section>a.button.is-focused.docs-sourcelink:not(:active){box-shadow:0 0 0 0.125em rgba(55,90,127,0.25)}html.theme--documenter-dark .button.is-primary:active,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:active,html.theme--documenter-dark .button.is-primary.is-active,html.theme--documenter-dark .docstring>section>a.button.is-active.docs-sourcelink{background-color:#2f4d6d;border-color:transparent;color:#fff}html.theme--documenter-dark 
.button.is-primary[disabled],html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-primary,fieldset[disabled] html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink{background-color:#375a7f;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-primary.is-inverted,html.theme--documenter-dark .docstring>section>a.button.is-inverted.docs-sourcelink{background-color:#fff;color:#375a7f}html.theme--documenter-dark .button.is-primary.is-inverted:hover,html.theme--documenter-dark .docstring>section>a.button.is-inverted.docs-sourcelink:hover,html.theme--documenter-dark .button.is-primary.is-inverted.is-hovered,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-hovered.docs-sourcelink{background-color:#f2f2f2}html.theme--documenter-dark .button.is-primary.is-inverted[disabled],html.theme--documenter-dark .docstring>section>a.button.is-inverted.docs-sourcelink[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-primary.is-inverted,fieldset[disabled] html.theme--documenter-dark .docstring>section>a.button.is-inverted.docs-sourcelink{background-color:#fff;border-color:transparent;box-shadow:none;color:#375a7f}html.theme--documenter-dark .button.is-primary.is-loading::after,html.theme--documenter-dark .docstring>section>a.button.is-loading.docs-sourcelink::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-primary.is-outlined,html.theme--documenter-dark .docstring>section>a.button.is-outlined.docs-sourcelink{background-color:transparent;border-color:#375a7f;color:#375a7f}html.theme--documenter-dark .button.is-primary.is-outlined:hover,html.theme--documenter-dark .docstring>section>a.button.is-outlined.docs-sourcelink:hover,html.theme--documenter-dark .button.is-primary.is-outlined.is-hovered,html.theme--documenter-dark 
.docstring>section>a.button.is-outlined.is-hovered.docs-sourcelink,html.theme--documenter-dark .button.is-primary.is-outlined:focus,html.theme--documenter-dark .docstring>section>a.button.is-outlined.docs-sourcelink:focus,html.theme--documenter-dark .button.is-primary.is-outlined.is-focused,html.theme--documenter-dark .docstring>section>a.button.is-outlined.is-focused.docs-sourcelink{background-color:#375a7f;border-color:#375a7f;color:#fff}html.theme--documenter-dark .button.is-primary.is-outlined.is-loading::after,html.theme--documenter-dark .docstring>section>a.button.is-outlined.is-loading.docs-sourcelink::after{border-color:transparent transparent #375a7f #375a7f !important}html.theme--documenter-dark .button.is-primary.is-outlined.is-loading:hover::after,html.theme--documenter-dark .docstring>section>a.button.is-outlined.is-loading.docs-sourcelink:hover::after,html.theme--documenter-dark .button.is-primary.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .docstring>section>a.button.is-outlined.is-loading.is-hovered.docs-sourcelink::after,html.theme--documenter-dark .button.is-primary.is-outlined.is-loading:focus::after,html.theme--documenter-dark .docstring>section>a.button.is-outlined.is-loading.docs-sourcelink:focus::after,html.theme--documenter-dark .button.is-primary.is-outlined.is-loading.is-focused::after,html.theme--documenter-dark .docstring>section>a.button.is-outlined.is-loading.is-focused.docs-sourcelink::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-primary.is-outlined[disabled],html.theme--documenter-dark .docstring>section>a.button.is-outlined.docs-sourcelink[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-primary.is-outlined,fieldset[disabled] html.theme--documenter-dark .docstring>section>a.button.is-outlined.docs-sourcelink{background-color:transparent;border-color:#375a7f;box-shadow:none;color:#375a7f}html.theme--documenter-dark 
.button.is-primary.is-inverted.is-outlined,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined:hover,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink:hover,html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.is-hovered.docs-sourcelink,html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined:focus,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink:focus,html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined.is-focused,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.is-focused.docs-sourcelink{background-color:#fff;color:#375a7f}html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.is-loading.docs-sourcelink:hover::after,html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.is-loading.is-hovered.docs-sourcelink::after,html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.is-loading.docs-sourcelink:focus::after,html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined.is-loading.is-focused::after,html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.is-loading.is-focused.docs-sourcelink::after{border-color:transparent transparent #375a7f #375a7f !important}html.theme--documenter-dark 
.button.is-primary.is-inverted.is-outlined[disabled],html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-primary.is-inverted.is-outlined,fieldset[disabled] html.theme--documenter-dark .docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-link{background-color:#1abc9c;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-link:hover,html.theme--documenter-dark .button.is-link.is-hovered{background-color:#18b193;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-link:focus,html.theme--documenter-dark .button.is-link.is-focused{border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-link:focus:not(:active),html.theme--documenter-dark .button.is-link.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(26,188,156,0.25)}html.theme--documenter-dark .button.is-link:active,html.theme--documenter-dark .button.is-link.is-active{background-color:#17a689;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-link[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-link{background-color:#1abc9c;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-link.is-inverted{background-color:#fff;color:#1abc9c}html.theme--documenter-dark .button.is-link.is-inverted:hover,html.theme--documenter-dark .button.is-link.is-inverted.is-hovered{background-color:#f2f2f2}html.theme--documenter-dark .button.is-link.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-link.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#1abc9c}html.theme--documenter-dark .button.is-link.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark 
.button.is-link.is-outlined{background-color:transparent;border-color:#1abc9c;color:#1abc9c}html.theme--documenter-dark .button.is-link.is-outlined:hover,html.theme--documenter-dark .button.is-link.is-outlined.is-hovered,html.theme--documenter-dark .button.is-link.is-outlined:focus,html.theme--documenter-dark .button.is-link.is-outlined.is-focused{background-color:#1abc9c;border-color:#1abc9c;color:#fff}html.theme--documenter-dark .button.is-link.is-outlined.is-loading::after{border-color:transparent transparent #1abc9c #1abc9c !important}html.theme--documenter-dark .button.is-link.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-link.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-link.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-link.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-link.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-link.is-outlined{background-color:transparent;border-color:#1abc9c;box-shadow:none;color:#1abc9c}html.theme--documenter-dark .button.is-link.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-link.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-link.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-link.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-link.is-inverted.is-outlined.is-focused{background-color:#fff;color:#1abc9c}html.theme--documenter-dark .button.is-link.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-link.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-link.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-link.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #1abc9c #1abc9c !important}html.theme--documenter-dark .button.is-link.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-link.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-info{background-color:#024c7d;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-info:hover,html.theme--documenter-dark .button.is-info.is-hovered{background-color:#024470;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-info:focus,html.theme--documenter-dark .button.is-info.is-focused{border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-info:focus:not(:active),html.theme--documenter-dark .button.is-info.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(2,76,125,0.25)}html.theme--documenter-dark .button.is-info:active,html.theme--documenter-dark .button.is-info.is-active{background-color:#023d64;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-info[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-info{background-color:#024c7d;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-info.is-inverted{background-color:#fff;color:#024c7d}html.theme--documenter-dark .button.is-info.is-inverted:hover,html.theme--documenter-dark .button.is-info.is-inverted.is-hovered{background-color:#f2f2f2}html.theme--documenter-dark .button.is-info.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-info.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#024c7d}html.theme--documenter-dark .button.is-info.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark 
.button.is-info.is-outlined{background-color:transparent;border-color:#024c7d;color:#024c7d}html.theme--documenter-dark .button.is-info.is-outlined:hover,html.theme--documenter-dark .button.is-info.is-outlined.is-hovered,html.theme--documenter-dark .button.is-info.is-outlined:focus,html.theme--documenter-dark .button.is-info.is-outlined.is-focused{background-color:#024c7d;border-color:#024c7d;color:#fff}html.theme--documenter-dark .button.is-info.is-outlined.is-loading::after{border-color:transparent transparent #024c7d #024c7d !important}html.theme--documenter-dark .button.is-info.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-info.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-info.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-info.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-info.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-info.is-outlined{background-color:transparent;border-color:#024c7d;box-shadow:none;color:#024c7d}html.theme--documenter-dark .button.is-info.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-info.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-info.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-info.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-info.is-inverted.is-outlined.is-focused{background-color:#fff;color:#024c7d}html.theme--documenter-dark .button.is-info.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-info.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-info.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-info.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #024c7d #024c7d !important}html.theme--documenter-dark .button.is-info.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-info.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-success{background-color:#008438;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-success:hover,html.theme--documenter-dark .button.is-success.is-hovered{background-color:#073;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-success:focus,html.theme--documenter-dark .button.is-success.is-focused{border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-success:focus:not(:active),html.theme--documenter-dark .button.is-success.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(0,132,56,0.25)}html.theme--documenter-dark .button.is-success:active,html.theme--documenter-dark .button.is-success.is-active{background-color:#006b2d;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-success[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-success{background-color:#008438;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-success.is-inverted{background-color:#fff;color:#008438}html.theme--documenter-dark .button.is-success.is-inverted:hover,html.theme--documenter-dark .button.is-success.is-inverted.is-hovered{background-color:#f2f2f2}html.theme--documenter-dark .button.is-success.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-success.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#008438}html.theme--documenter-dark .button.is-success.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark 
.button.is-success.is-outlined{background-color:transparent;border-color:#008438;color:#008438}html.theme--documenter-dark .button.is-success.is-outlined:hover,html.theme--documenter-dark .button.is-success.is-outlined.is-hovered,html.theme--documenter-dark .button.is-success.is-outlined:focus,html.theme--documenter-dark .button.is-success.is-outlined.is-focused{background-color:#008438;border-color:#008438;color:#fff}html.theme--documenter-dark .button.is-success.is-outlined.is-loading::after{border-color:transparent transparent #008438 #008438 !important}html.theme--documenter-dark .button.is-success.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-success.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-success.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-success.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-success.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-success.is-outlined{background-color:transparent;border-color:#008438;box-shadow:none;color:#008438}html.theme--documenter-dark .button.is-success.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-success.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-success.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-success.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-success.is-inverted.is-outlined.is-focused{background-color:#fff;color:#008438}html.theme--documenter-dark .button.is-success.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-success.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-success.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-success.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #008438 #008438 !important}html.theme--documenter-dark .button.is-success.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-success.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-warning{background-color:#ad8100;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-warning:hover,html.theme--documenter-dark .button.is-warning.is-hovered{background-color:#a07700;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-warning:focus,html.theme--documenter-dark .button.is-warning.is-focused{border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-warning:focus:not(:active),html.theme--documenter-dark .button.is-warning.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(173,129,0,0.25)}html.theme--documenter-dark .button.is-warning:active,html.theme--documenter-dark .button.is-warning.is-active{background-color:#946e00;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-warning[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-warning{background-color:#ad8100;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-warning.is-inverted{background-color:#fff;color:#ad8100}html.theme--documenter-dark .button.is-warning.is-inverted:hover,html.theme--documenter-dark .button.is-warning.is-inverted.is-hovered{background-color:#f2f2f2}html.theme--documenter-dark .button.is-warning.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-warning.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#ad8100}html.theme--documenter-dark .button.is-warning.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark 
.button.is-warning.is-outlined{background-color:transparent;border-color:#ad8100;color:#ad8100}html.theme--documenter-dark .button.is-warning.is-outlined:hover,html.theme--documenter-dark .button.is-warning.is-outlined.is-hovered,html.theme--documenter-dark .button.is-warning.is-outlined:focus,html.theme--documenter-dark .button.is-warning.is-outlined.is-focused{background-color:#ad8100;border-color:#ad8100;color:#fff}html.theme--documenter-dark .button.is-warning.is-outlined.is-loading::after{border-color:transparent transparent #ad8100 #ad8100 !important}html.theme--documenter-dark .button.is-warning.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-warning.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-warning.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-warning.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-warning.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-warning.is-outlined{background-color:transparent;border-color:#ad8100;box-shadow:none;color:#ad8100}html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined.is-focused{background-color:#fff;color:#ad8100}html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-warning.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #ad8100 #ad8100 !important}html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-warning.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-danger{background-color:#9e1b0d;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-danger:hover,html.theme--documenter-dark .button.is-danger.is-hovered{background-color:#92190c;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-danger:focus,html.theme--documenter-dark .button.is-danger.is-focused{border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-danger:focus:not(:active),html.theme--documenter-dark .button.is-danger.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(158,27,13,0.25)}html.theme--documenter-dark .button.is-danger:active,html.theme--documenter-dark .button.is-danger.is-active{background-color:#86170b;border-color:transparent;color:#fff}html.theme--documenter-dark .button.is-danger[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-danger{background-color:#9e1b0d;border-color:transparent;box-shadow:none}html.theme--documenter-dark .button.is-danger.is-inverted{background-color:#fff;color:#9e1b0d}html.theme--documenter-dark .button.is-danger.is-inverted:hover,html.theme--documenter-dark .button.is-danger.is-inverted.is-hovered{background-color:#f2f2f2}html.theme--documenter-dark .button.is-danger.is-inverted[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-danger.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#9e1b0d}html.theme--documenter-dark .button.is-danger.is-loading::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark 
.button.is-danger.is-outlined{background-color:transparent;border-color:#9e1b0d;color:#9e1b0d}html.theme--documenter-dark .button.is-danger.is-outlined:hover,html.theme--documenter-dark .button.is-danger.is-outlined.is-hovered,html.theme--documenter-dark .button.is-danger.is-outlined:focus,html.theme--documenter-dark .button.is-danger.is-outlined.is-focused{background-color:#9e1b0d;border-color:#9e1b0d;color:#fff}html.theme--documenter-dark .button.is-danger.is-outlined.is-loading::after{border-color:transparent transparent #9e1b0d #9e1b0d !important}html.theme--documenter-dark .button.is-danger.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-danger.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-danger.is-outlined.is-loading:focus::after,html.theme--documenter-dark .button.is-danger.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}html.theme--documenter-dark .button.is-danger.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-danger.is-outlined{background-color:transparent;border-color:#9e1b0d;box-shadow:none;color:#9e1b0d}html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined:hover,html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined.is-hovered,html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined:focus,html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined.is-focused{background-color:#fff;color:#9e1b0d}html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined.is-loading:hover::after,html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined.is-loading.is-hovered::after,html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined.is-loading:focus::after,html.theme--documenter-dark 
.button.is-danger.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #9e1b0d #9e1b0d !important}html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined[disabled],fieldset[disabled] html.theme--documenter-dark .button.is-danger.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}html.theme--documenter-dark .button.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.button{border-radius:3px;font-size:.85em}html.theme--documenter-dark .button.is-normal{font-size:15px}html.theme--documenter-dark .button.is-medium{font-size:1.25rem}html.theme--documenter-dark .button.is-large{font-size:1.5rem}html.theme--documenter-dark .button[disabled],fieldset[disabled] html.theme--documenter-dark .button{background-color:#8c9b9d;border-color:#dbdee0;box-shadow:none;opacity:.5}html.theme--documenter-dark .button.is-fullwidth{display:flex;width:100%}html.theme--documenter-dark .button.is-loading{color:transparent !important;pointer-events:none}html.theme--documenter-dark .button.is-loading::after{position:absolute;left:calc(50% - (1em / 2));top:calc(50% - (1em / 2));position:absolute !important}html.theme--documenter-dark .button.is-static{background-color:#282f2f;border-color:#5e6d6f;color:#dbdee0;box-shadow:none;pointer-events:none}html.theme--documenter-dark .button.is-rounded,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.button{border-radius:290486px;padding-left:1em;padding-right:1em}html.theme--documenter-dark .buttons{align-items:center;display:flex;flex-wrap:wrap;justify-content:flex-start}html.theme--documenter-dark .buttons .button{margin-bottom:0.5rem}html.theme--documenter-dark .buttons .button:not(:last-child):not(.is-fullwidth){margin-right:0.5rem}html.theme--documenter-dark .buttons:last-child{margin-bottom:-0.5rem}html.theme--documenter-dark 
.buttons:not(:last-child){margin-bottom:1rem}html.theme--documenter-dark .buttons.are-small .button:not(.is-normal):not(.is-medium):not(.is-large){border-radius:3px;font-size:.85em}html.theme--documenter-dark .buttons.are-medium .button:not(.is-small):not(.is-normal):not(.is-large){font-size:1.25rem}html.theme--documenter-dark .buttons.are-large .button:not(.is-small):not(.is-normal):not(.is-medium){font-size:1.5rem}html.theme--documenter-dark .buttons.has-addons .button:not(:first-child){border-bottom-left-radius:0;border-top-left-radius:0}html.theme--documenter-dark .buttons.has-addons .button:not(:last-child){border-bottom-right-radius:0;border-top-right-radius:0;margin-right:-1px}html.theme--documenter-dark .buttons.has-addons .button:last-child{margin-right:0}html.theme--documenter-dark .buttons.has-addons .button:hover,html.theme--documenter-dark .buttons.has-addons .button.is-hovered{z-index:2}html.theme--documenter-dark .buttons.has-addons .button:focus,html.theme--documenter-dark .buttons.has-addons .button.is-focused,html.theme--documenter-dark .buttons.has-addons .button:active,html.theme--documenter-dark .buttons.has-addons .button.is-active,html.theme--documenter-dark .buttons.has-addons .button.is-selected{z-index:3}html.theme--documenter-dark .buttons.has-addons .button:focus:hover,html.theme--documenter-dark .buttons.has-addons .button.is-focused:hover,html.theme--documenter-dark .buttons.has-addons .button:active:hover,html.theme--documenter-dark .buttons.has-addons .button.is-active:hover,html.theme--documenter-dark .buttons.has-addons .button.is-selected:hover{z-index:4}html.theme--documenter-dark .buttons.has-addons .button.is-expanded{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .buttons.is-centered{justify-content:center}html.theme--documenter-dark .buttons.is-centered:not(.has-addons) .button:not(.is-fullwidth){margin-left:0.25rem;margin-right:0.25rem}html.theme--documenter-dark 
.buttons.is-right{justify-content:flex-end}html.theme--documenter-dark .buttons.is-right:not(.has-addons) .button:not(.is-fullwidth){margin-left:0.25rem;margin-right:0.25rem}html.theme--documenter-dark .container{flex-grow:1;margin:0 auto;position:relative;width:auto}@media screen and (min-width: 1056px){html.theme--documenter-dark .container{max-width:992px}html.theme--documenter-dark .container.is-fluid{margin-left:32px;margin-right:32px;max-width:none}}@media screen and (max-width: 1215px){html.theme--documenter-dark .container.is-widescreen{max-width:1152px}}@media screen and (max-width: 1407px){html.theme--documenter-dark .container.is-fullhd{max-width:1344px}}@media screen and (min-width: 1216px){html.theme--documenter-dark .container{max-width:1152px}}@media screen and (min-width: 1408px){html.theme--documenter-dark .container{max-width:1344px}}html.theme--documenter-dark .content li+li{margin-top:0.25em}html.theme--documenter-dark .content p:not(:last-child),html.theme--documenter-dark .content dl:not(:last-child),html.theme--documenter-dark .content ol:not(:last-child),html.theme--documenter-dark .content ul:not(:last-child),html.theme--documenter-dark .content blockquote:not(:last-child),html.theme--documenter-dark .content pre:not(:last-child),html.theme--documenter-dark .content table:not(:last-child){margin-bottom:1em}html.theme--documenter-dark .content h1,html.theme--documenter-dark .content h2,html.theme--documenter-dark .content h3,html.theme--documenter-dark .content h4,html.theme--documenter-dark .content h5,html.theme--documenter-dark .content h6{color:#f2f2f2;font-weight:600;line-height:1.125}html.theme--documenter-dark .content h1{font-size:2em;margin-bottom:0.5em}html.theme--documenter-dark .content h1:not(:first-child){margin-top:1em}html.theme--documenter-dark .content h2{font-size:1.75em;margin-bottom:0.5714em}html.theme--documenter-dark .content h2:not(:first-child){margin-top:1.1428em}html.theme--documenter-dark .content 
h3{font-size:1.5em;margin-bottom:0.6666em}html.theme--documenter-dark .content h3:not(:first-child){margin-top:1.3333em}html.theme--documenter-dark .content h4{font-size:1.25em;margin-bottom:0.8em}html.theme--documenter-dark .content h5{font-size:1.125em;margin-bottom:0.8888em}html.theme--documenter-dark .content h6{font-size:1em;margin-bottom:1em}html.theme--documenter-dark .content blockquote{background-color:#282f2f;border-left:5px solid #5e6d6f;padding:1.25em 1.5em}html.theme--documenter-dark .content ol{list-style-position:outside;margin-left:2em;margin-top:1em}html.theme--documenter-dark .content ol:not([type]){list-style-type:decimal}html.theme--documenter-dark .content ol.is-lower-alpha:not([type]){list-style-type:lower-alpha}html.theme--documenter-dark .content ol.is-lower-roman:not([type]){list-style-type:lower-roman}html.theme--documenter-dark .content ol.is-upper-alpha:not([type]){list-style-type:upper-alpha}html.theme--documenter-dark .content ol.is-upper-roman:not([type]){list-style-type:upper-roman}html.theme--documenter-dark .content ul{list-style:disc outside;margin-left:2em;margin-top:1em}html.theme--documenter-dark .content ul ul{list-style-type:circle;margin-top:0.5em}html.theme--documenter-dark .content ul ul ul{list-style-type:square}html.theme--documenter-dark .content dd{margin-left:2em}html.theme--documenter-dark .content figure{margin-left:2em;margin-right:2em;text-align:center}html.theme--documenter-dark .content figure:not(:first-child){margin-top:2em}html.theme--documenter-dark .content figure:not(:last-child){margin-bottom:2em}html.theme--documenter-dark .content figure img{display:inline-block}html.theme--documenter-dark .content figure figcaption{font-style:italic}html.theme--documenter-dark .content pre{-webkit-overflow-scrolling:touch;overflow-x:auto;padding:0;white-space:pre;word-wrap:normal}html.theme--documenter-dark .content sup,html.theme--documenter-dark .content sub{font-size:75%}html.theme--documenter-dark .content 
table{width:100%}html.theme--documenter-dark .content table td,html.theme--documenter-dark .content table th{border:1px solid #5e6d6f;border-width:0 0 1px;padding:0.5em 0.75em;vertical-align:top}html.theme--documenter-dark .content table th{color:#f2f2f2}html.theme--documenter-dark .content table th:not([align]){text-align:left}html.theme--documenter-dark .content table thead td,html.theme--documenter-dark .content table thead th{border-width:0 0 2px;color:#f2f2f2}html.theme--documenter-dark .content table tfoot td,html.theme--documenter-dark .content table tfoot th{border-width:2px 0 0;color:#f2f2f2}html.theme--documenter-dark .content table tbody tr:last-child td,html.theme--documenter-dark .content table tbody tr:last-child th{border-bottom-width:0}html.theme--documenter-dark .content .tabs li+li{margin-top:0}html.theme--documenter-dark .content.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.content{font-size:.85em}html.theme--documenter-dark .content.is-medium{font-size:1.25rem}html.theme--documenter-dark .content.is-large{font-size:1.5rem}html.theme--documenter-dark .icon{align-items:center;display:inline-flex;justify-content:center;height:1.5rem;width:1.5rem}html.theme--documenter-dark .icon.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.icon{height:1rem;width:1rem}html.theme--documenter-dark .icon.is-medium{height:2rem;width:2rem}html.theme--documenter-dark .icon.is-large{height:3rem;width:3rem}html.theme--documenter-dark .image,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img{display:block;position:relative}html.theme--documenter-dark .image img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img img{display:block;height:auto;width:100%}html.theme--documenter-dark .image img.is-rounded,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img img.is-rounded{border-radius:290486px}html.theme--documenter-dark .image.is-square 
img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-square img,html.theme--documenter-dark .image.is-square .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-square .has-ratio,html.theme--documenter-dark .image.is-1by1 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by1 img,html.theme--documenter-dark .image.is-1by1 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by1 .has-ratio,html.theme--documenter-dark .image.is-5by4 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by4 img,html.theme--documenter-dark .image.is-5by4 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by4 .has-ratio,html.theme--documenter-dark .image.is-4by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by3 img,html.theme--documenter-dark .image.is-4by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by3 .has-ratio,html.theme--documenter-dark .image.is-3by2 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by2 img,html.theme--documenter-dark .image.is-3by2 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by2 .has-ratio,html.theme--documenter-dark .image.is-5by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by3 img,html.theme--documenter-dark .image.is-5by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by3 .has-ratio,html.theme--documenter-dark .image.is-16by9 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-16by9 img,html.theme--documenter-dark .image.is-16by9 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-16by9 .has-ratio,html.theme--documenter-dark .image.is-2by1 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by1 
img,html.theme--documenter-dark .image.is-2by1 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by1 .has-ratio,html.theme--documenter-dark .image.is-3by1 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by1 img,html.theme--documenter-dark .image.is-3by1 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by1 .has-ratio,html.theme--documenter-dark .image.is-4by5 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by5 img,html.theme--documenter-dark .image.is-4by5 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by5 .has-ratio,html.theme--documenter-dark .image.is-3by4 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by4 img,html.theme--documenter-dark .image.is-3by4 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by4 .has-ratio,html.theme--documenter-dark .image.is-2by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by3 img,html.theme--documenter-dark .image.is-2by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by3 .has-ratio,html.theme--documenter-dark .image.is-3by5 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by5 img,html.theme--documenter-dark .image.is-3by5 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by5 .has-ratio,html.theme--documenter-dark .image.is-9by16 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-9by16 img,html.theme--documenter-dark .image.is-9by16 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-9by16 .has-ratio,html.theme--documenter-dark .image.is-1by2 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by2 img,html.theme--documenter-dark .image.is-1by2 .has-ratio,html.theme--documenter-dark #documenter 
.docs-sidebar .docs-logo>img.is-1by2 .has-ratio,html.theme--documenter-dark .image.is-1by3 img,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by3 img,html.theme--documenter-dark .image.is-1by3 .has-ratio,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by3 .has-ratio{height:100%;width:100%}html.theme--documenter-dark .image.is-square,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-square,html.theme--documenter-dark .image.is-1by1,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by1{padding-top:100%}html.theme--documenter-dark .image.is-5by4,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by4{padding-top:80%}html.theme--documenter-dark .image.is-4by3,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by3{padding-top:75%}html.theme--documenter-dark .image.is-3by2,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by2{padding-top:66.6666%}html.theme--documenter-dark .image.is-5by3,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-5by3{padding-top:60%}html.theme--documenter-dark .image.is-16by9,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-16by9{padding-top:56.25%}html.theme--documenter-dark .image.is-2by1,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-2by1{padding-top:50%}html.theme--documenter-dark .image.is-3by1,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by1{padding-top:33.3333%}html.theme--documenter-dark .image.is-4by5,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-4by5{padding-top:125%}html.theme--documenter-dark .image.is-3by4,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by4{padding-top:133.3333%}html.theme--documenter-dark .image.is-2by3,html.theme--documenter-dark #documenter .docs-sidebar 
.docs-logo>img.is-2by3{padding-top:150%}html.theme--documenter-dark .image.is-3by5,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-3by5{padding-top:166.6666%}html.theme--documenter-dark .image.is-9by16,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-9by16{padding-top:177.7777%}html.theme--documenter-dark .image.is-1by2,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by2{padding-top:200%}html.theme--documenter-dark .image.is-1by3,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-1by3{padding-top:300%}html.theme--documenter-dark .image.is-16x16,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-16x16{height:16px;width:16px}html.theme--documenter-dark .image.is-24x24,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-24x24{height:24px;width:24px}html.theme--documenter-dark .image.is-32x32,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-32x32{height:32px;width:32px}html.theme--documenter-dark .image.is-48x48,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-48x48{height:48px;width:48px}html.theme--documenter-dark .image.is-64x64,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-64x64{height:64px;width:64px}html.theme--documenter-dark .image.is-96x96,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-96x96{height:96px;width:96px}html.theme--documenter-dark .image.is-128x128,html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img.is-128x128{height:128px;width:128px}html.theme--documenter-dark .notification{background-color:#282f2f;border-radius:.4em;padding:1.25rem 2.5rem 1.25rem 1.5rem;position:relative}html.theme--documenter-dark .notification a:not(.button):not(.dropdown-item){color:currentColor;text-decoration:underline}html.theme--documenter-dark .notification strong{color:currentColor}html.theme--documenter-dark .notification 
code,html.theme--documenter-dark .notification pre{background:#fff}html.theme--documenter-dark .notification pre code{background:transparent}html.theme--documenter-dark .notification>.delete{position:absolute;right:0.5rem;top:0.5rem}html.theme--documenter-dark .notification .title,html.theme--documenter-dark .notification .subtitle,html.theme--documenter-dark .notification .content{color:currentColor}html.theme--documenter-dark .notification.is-white{background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .notification.is-black{background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .notification.is-light{background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .notification.is-dark,html.theme--documenter-dark .content kbd.notification{background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .notification.is-primary,html.theme--documenter-dark .docstring>section>a.notification.docs-sourcelink{background-color:#375a7f;color:#fff}html.theme--documenter-dark .notification.is-link{background-color:#1abc9c;color:#fff}html.theme--documenter-dark .notification.is-info{background-color:#024c7d;color:#fff}html.theme--documenter-dark .notification.is-success{background-color:#008438;color:#fff}html.theme--documenter-dark .notification.is-warning{background-color:#ad8100;color:#fff}html.theme--documenter-dark .notification.is-danger{background-color:#9e1b0d;color:#fff}html.theme--documenter-dark .progress{-moz-appearance:none;-webkit-appearance:none;border:none;border-radius:290486px;display:block;height:15px;overflow:hidden;padding:0;width:100%}html.theme--documenter-dark .progress::-webkit-progress-bar{background-color:#5e6d6f}html.theme--documenter-dark .progress::-webkit-progress-value{background-color:#dbdee0}html.theme--documenter-dark .progress::-moz-progress-bar{background-color:#dbdee0}html.theme--documenter-dark .progress::-ms-fill{background-color:#dbdee0;border:none}html.theme--documenter-dark 
.progress.is-white::-webkit-progress-value{background-color:#fff}html.theme--documenter-dark .progress.is-white::-moz-progress-bar{background-color:#fff}html.theme--documenter-dark .progress.is-white::-ms-fill{background-color:#fff}html.theme--documenter-dark .progress.is-white:indeterminate{background-image:linear-gradient(to right, #fff 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-black::-webkit-progress-value{background-color:#0a0a0a}html.theme--documenter-dark .progress.is-black::-moz-progress-bar{background-color:#0a0a0a}html.theme--documenter-dark .progress.is-black::-ms-fill{background-color:#0a0a0a}html.theme--documenter-dark .progress.is-black:indeterminate{background-image:linear-gradient(to right, #0a0a0a 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-light::-webkit-progress-value{background-color:#ecf0f1}html.theme--documenter-dark .progress.is-light::-moz-progress-bar{background-color:#ecf0f1}html.theme--documenter-dark .progress.is-light::-ms-fill{background-color:#ecf0f1}html.theme--documenter-dark .progress.is-light:indeterminate{background-image:linear-gradient(to right, #ecf0f1 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-dark::-webkit-progress-value,html.theme--documenter-dark .content kbd.progress::-webkit-progress-value{background-color:#282f2f}html.theme--documenter-dark .progress.is-dark::-moz-progress-bar,html.theme--documenter-dark .content kbd.progress::-moz-progress-bar{background-color:#282f2f}html.theme--documenter-dark .progress.is-dark::-ms-fill,html.theme--documenter-dark .content kbd.progress::-ms-fill{background-color:#282f2f}html.theme--documenter-dark .progress.is-dark:indeterminate,html.theme--documenter-dark .content kbd.progress:indeterminate{background-image:linear-gradient(to right, #282f2f 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-primary::-webkit-progress-value,html.theme--documenter-dark 
.docstring>section>a.progress.docs-sourcelink::-webkit-progress-value{background-color:#375a7f}html.theme--documenter-dark .progress.is-primary::-moz-progress-bar,html.theme--documenter-dark .docstring>section>a.progress.docs-sourcelink::-moz-progress-bar{background-color:#375a7f}html.theme--documenter-dark .progress.is-primary::-ms-fill,html.theme--documenter-dark .docstring>section>a.progress.docs-sourcelink::-ms-fill{background-color:#375a7f}html.theme--documenter-dark .progress.is-primary:indeterminate,html.theme--documenter-dark .docstring>section>a.progress.docs-sourcelink:indeterminate{background-image:linear-gradient(to right, #375a7f 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-link::-webkit-progress-value{background-color:#1abc9c}html.theme--documenter-dark .progress.is-link::-moz-progress-bar{background-color:#1abc9c}html.theme--documenter-dark .progress.is-link::-ms-fill{background-color:#1abc9c}html.theme--documenter-dark .progress.is-link:indeterminate{background-image:linear-gradient(to right, #1abc9c 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-info::-webkit-progress-value{background-color:#024c7d}html.theme--documenter-dark .progress.is-info::-moz-progress-bar{background-color:#024c7d}html.theme--documenter-dark .progress.is-info::-ms-fill{background-color:#024c7d}html.theme--documenter-dark .progress.is-info:indeterminate{background-image:linear-gradient(to right, #024c7d 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-success::-webkit-progress-value{background-color:#008438}html.theme--documenter-dark .progress.is-success::-moz-progress-bar{background-color:#008438}html.theme--documenter-dark .progress.is-success::-ms-fill{background-color:#008438}html.theme--documenter-dark .progress.is-success:indeterminate{background-image:linear-gradient(to right, #008438 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-warning::-webkit-progress-value{background-color:#ad8100}html.theme--documenter-dark 
.progress.is-warning::-moz-progress-bar{background-color:#ad8100}html.theme--documenter-dark .progress.is-warning::-ms-fill{background-color:#ad8100}html.theme--documenter-dark .progress.is-warning:indeterminate{background-image:linear-gradient(to right, #ad8100 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress.is-danger::-webkit-progress-value{background-color:#9e1b0d}html.theme--documenter-dark .progress.is-danger::-moz-progress-bar{background-color:#9e1b0d}html.theme--documenter-dark .progress.is-danger::-ms-fill{background-color:#9e1b0d}html.theme--documenter-dark .progress.is-danger:indeterminate{background-image:linear-gradient(to right, #9e1b0d 30%, #5e6d6f 30%)}html.theme--documenter-dark .progress:indeterminate{animation-duration:1.5s;animation-iteration-count:infinite;animation-name:moveIndeterminate;animation-timing-function:linear;background-color:#5e6d6f;background-image:linear-gradient(to right, #fff 30%, #5e6d6f 30%);background-position:top left;background-repeat:no-repeat;background-size:150% 150%}html.theme--documenter-dark .progress:indeterminate::-webkit-progress-bar{background-color:transparent}html.theme--documenter-dark .progress:indeterminate::-moz-progress-bar{background-color:transparent}html.theme--documenter-dark .progress.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.progress{height:.85em}html.theme--documenter-dark .progress.is-medium{height:1.25rem}html.theme--documenter-dark .progress.is-large{height:1.5rem}@keyframes moveIndeterminate{from{background-position:200% 0}to{background-position:-200% 0}}html.theme--documenter-dark .table{background-color:#343c3d;color:#fff}html.theme--documenter-dark .table td,html.theme--documenter-dark .table th{border:1px solid #5e6d6f;border-width:0 0 1px;padding:0.5em 0.75em;vertical-align:top}html.theme--documenter-dark .table td.is-white,html.theme--documenter-dark .table 
th.is-white{background-color:#fff;border-color:#fff;color:#0a0a0a}html.theme--documenter-dark .table td.is-black,html.theme--documenter-dark .table th.is-black{background-color:#0a0a0a;border-color:#0a0a0a;color:#fff}html.theme--documenter-dark .table td.is-light,html.theme--documenter-dark .table th.is-light{background-color:#ecf0f1;border-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .table td.is-dark,html.theme--documenter-dark .table th.is-dark{background-color:#282f2f;border-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .table td.is-primary,html.theme--documenter-dark .table th.is-primary{background-color:#375a7f;border-color:#375a7f;color:#fff}html.theme--documenter-dark .table td.is-link,html.theme--documenter-dark .table th.is-link{background-color:#1abc9c;border-color:#1abc9c;color:#fff}html.theme--documenter-dark .table td.is-info,html.theme--documenter-dark .table th.is-info{background-color:#024c7d;border-color:#024c7d;color:#fff}html.theme--documenter-dark .table td.is-success,html.theme--documenter-dark .table th.is-success{background-color:#008438;border-color:#008438;color:#fff}html.theme--documenter-dark .table td.is-warning,html.theme--documenter-dark .table th.is-warning{background-color:#ad8100;border-color:#ad8100;color:#fff}html.theme--documenter-dark .table td.is-danger,html.theme--documenter-dark .table th.is-danger{background-color:#9e1b0d;border-color:#9e1b0d;color:#fff}html.theme--documenter-dark .table td.is-narrow,html.theme--documenter-dark .table th.is-narrow{white-space:nowrap;width:1%}html.theme--documenter-dark .table td.is-selected,html.theme--documenter-dark .table th.is-selected{background-color:#375a7f;color:#fff}html.theme--documenter-dark .table td.is-selected a,html.theme--documenter-dark .table td.is-selected strong,html.theme--documenter-dark .table th.is-selected a,html.theme--documenter-dark .table th.is-selected strong{color:currentColor}html.theme--documenter-dark .table 
th{color:#f2f2f2}html.theme--documenter-dark .table th:not([align]){text-align:left}html.theme--documenter-dark .table tr.is-selected{background-color:#375a7f;color:#fff}html.theme--documenter-dark .table tr.is-selected a,html.theme--documenter-dark .table tr.is-selected strong{color:currentColor}html.theme--documenter-dark .table tr.is-selected td,html.theme--documenter-dark .table tr.is-selected th{border-color:#fff;color:currentColor}html.theme--documenter-dark .table thead{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .table thead td,html.theme--documenter-dark .table thead th{border-width:0 0 2px;color:#f2f2f2}html.theme--documenter-dark .table tfoot{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .table tfoot td,html.theme--documenter-dark .table tfoot th{border-width:2px 0 0;color:#f2f2f2}html.theme--documenter-dark .table tbody{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .table tbody tr:last-child td,html.theme--documenter-dark .table tbody tr:last-child th{border-bottom-width:0}html.theme--documenter-dark .table.is-bordered td,html.theme--documenter-dark .table.is-bordered th{border-width:1px}html.theme--documenter-dark .table.is-bordered tr:last-child td,html.theme--documenter-dark .table.is-bordered tr:last-child th{border-bottom-width:1px}html.theme--documenter-dark .table.is-fullwidth{width:100%}html.theme--documenter-dark .table.is-hoverable tbody tr:not(.is-selected):hover{background-color:#282f2f}html.theme--documenter-dark .table.is-hoverable.is-striped tbody tr:not(.is-selected):hover{background-color:#282f2f}html.theme--documenter-dark .table.is-hoverable.is-striped tbody tr:not(.is-selected):hover:nth-child(even){background-color:#2d3435}html.theme--documenter-dark .table.is-narrow td,html.theme--documenter-dark .table.is-narrow th{padding:0.25em 0.5em}html.theme--documenter-dark .table.is-striped tbody tr:not(.is-selected):nth-child(even){background-color:#282f2f}html.theme--documenter-dark 
.table-container{-webkit-overflow-scrolling:touch;overflow:auto;overflow-y:hidden;max-width:100%}html.theme--documenter-dark .tags{align-items:center;display:flex;flex-wrap:wrap;justify-content:flex-start}html.theme--documenter-dark .tags .tag,html.theme--documenter-dark .tags .content kbd,html.theme--documenter-dark .content .tags kbd,html.theme--documenter-dark .tags .docstring>section>a.docs-sourcelink{margin-bottom:0.5rem}html.theme--documenter-dark .tags .tag:not(:last-child),html.theme--documenter-dark .tags .content kbd:not(:last-child),html.theme--documenter-dark .content .tags kbd:not(:last-child),html.theme--documenter-dark .tags .docstring>section>a.docs-sourcelink:not(:last-child){margin-right:0.5rem}html.theme--documenter-dark .tags:last-child{margin-bottom:-0.5rem}html.theme--documenter-dark .tags:not(:last-child){margin-bottom:1rem}html.theme--documenter-dark .tags.are-medium .tag:not(.is-normal):not(.is-large),html.theme--documenter-dark .tags.are-medium .content kbd:not(.is-normal):not(.is-large),html.theme--documenter-dark .content .tags.are-medium kbd:not(.is-normal):not(.is-large),html.theme--documenter-dark .tags.are-medium .docstring>section>a.docs-sourcelink:not(.is-normal):not(.is-large){font-size:15px}html.theme--documenter-dark .tags.are-large .tag:not(.is-normal):not(.is-medium),html.theme--documenter-dark .tags.are-large .content kbd:not(.is-normal):not(.is-medium),html.theme--documenter-dark .content .tags.are-large kbd:not(.is-normal):not(.is-medium),html.theme--documenter-dark .tags.are-large .docstring>section>a.docs-sourcelink:not(.is-normal):not(.is-medium){font-size:1.25rem}html.theme--documenter-dark .tags.is-centered{justify-content:center}html.theme--documenter-dark .tags.is-centered .tag,html.theme--documenter-dark .tags.is-centered .content kbd,html.theme--documenter-dark .content .tags.is-centered kbd,html.theme--documenter-dark .tags.is-centered 
.docstring>section>a.docs-sourcelink{margin-right:0.25rem;margin-left:0.25rem}html.theme--documenter-dark .tags.is-right{justify-content:flex-end}html.theme--documenter-dark .tags.is-right .tag:not(:first-child),html.theme--documenter-dark .tags.is-right .content kbd:not(:first-child),html.theme--documenter-dark .content .tags.is-right kbd:not(:first-child),html.theme--documenter-dark .tags.is-right .docstring>section>a.docs-sourcelink:not(:first-child){margin-left:0.5rem}html.theme--documenter-dark .tags.is-right .tag:not(:last-child),html.theme--documenter-dark .tags.is-right .content kbd:not(:last-child),html.theme--documenter-dark .content .tags.is-right kbd:not(:last-child),html.theme--documenter-dark .tags.is-right .docstring>section>a.docs-sourcelink:not(:last-child){margin-right:0}html.theme--documenter-dark .tags.has-addons .tag,html.theme--documenter-dark .tags.has-addons .content kbd,html.theme--documenter-dark .content .tags.has-addons kbd,html.theme--documenter-dark .tags.has-addons .docstring>section>a.docs-sourcelink{margin-right:0}html.theme--documenter-dark .tags.has-addons .tag:not(:first-child),html.theme--documenter-dark .tags.has-addons .content kbd:not(:first-child),html.theme--documenter-dark .content .tags.has-addons kbd:not(:first-child),html.theme--documenter-dark .tags.has-addons .docstring>section>a.docs-sourcelink:not(:first-child){margin-left:0;border-bottom-left-radius:0;border-top-left-radius:0}html.theme--documenter-dark .tags.has-addons .tag:not(:last-child),html.theme--documenter-dark .tags.has-addons .content kbd:not(:last-child),html.theme--documenter-dark .content .tags.has-addons kbd:not(:last-child),html.theme--documenter-dark .tags.has-addons .docstring>section>a.docs-sourcelink:not(:last-child){border-bottom-right-radius:0;border-top-right-radius:0}html.theme--documenter-dark .tag:not(body),html.theme--documenter-dark .content kbd:not(body),html.theme--documenter-dark 
.docstring>section>a.docs-sourcelink:not(body){align-items:center;background-color:#282f2f;border-radius:.4em;color:#fff;display:inline-flex;font-size:.85em;height:2em;justify-content:center;line-height:1.5;padding-left:0.75em;padding-right:0.75em;white-space:nowrap}html.theme--documenter-dark .tag:not(body) .delete,html.theme--documenter-dark .content kbd:not(body) .delete,html.theme--documenter-dark .docstring>section>a.docs-sourcelink:not(body) .delete{margin-left:0.25rem;margin-right:-0.375rem}html.theme--documenter-dark .tag.is-white:not(body),html.theme--documenter-dark .content kbd.is-white:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-white:not(body){background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .tag.is-black:not(body),html.theme--documenter-dark .content kbd.is-black:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-black:not(body){background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .tag.is-light:not(body),html.theme--documenter-dark .content kbd.is-light:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-light:not(body){background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .tag.is-dark:not(body),html.theme--documenter-dark .content kbd:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-dark:not(body),html.theme--documenter-dark .content .docstring>section>kbd:not(body){background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .tag.is-primary:not(body),html.theme--documenter-dark .content kbd.is-primary:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink:not(body){background-color:#375a7f;color:#fff}html.theme--documenter-dark .tag.is-link:not(body),html.theme--documenter-dark .content kbd.is-link:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-link:not(body){background-color:#1abc9c;color:#fff}html.theme--documenter-dark 
.tag.is-info:not(body),html.theme--documenter-dark .content kbd.is-info:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-info:not(body){background-color:#024c7d;color:#fff}html.theme--documenter-dark .tag.is-success:not(body),html.theme--documenter-dark .content kbd.is-success:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-success:not(body){background-color:#008438;color:#fff}html.theme--documenter-dark .tag.is-warning:not(body),html.theme--documenter-dark .content kbd.is-warning:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-warning:not(body){background-color:#ad8100;color:#fff}html.theme--documenter-dark .tag.is-danger:not(body),html.theme--documenter-dark .content kbd.is-danger:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-danger:not(body){background-color:#9e1b0d;color:#fff}html.theme--documenter-dark .tag.is-normal:not(body),html.theme--documenter-dark .content kbd.is-normal:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-normal:not(body){font-size:.85em}html.theme--documenter-dark .tag.is-medium:not(body),html.theme--documenter-dark .content kbd.is-medium:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-medium:not(body){font-size:15px}html.theme--documenter-dark .tag.is-large:not(body),html.theme--documenter-dark .content kbd.is-large:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-large:not(body){font-size:1.25rem}html.theme--documenter-dark .tag:not(body) .icon:first-child:not(:last-child),html.theme--documenter-dark .content kbd:not(body) .icon:first-child:not(:last-child),html.theme--documenter-dark .docstring>section>a.docs-sourcelink:not(body) .icon:first-child:not(:last-child){margin-left:-0.375em;margin-right:0.1875em}html.theme--documenter-dark .tag:not(body) .icon:last-child:not(:first-child),html.theme--documenter-dark .content 
kbd:not(body) .icon:last-child:not(:first-child),html.theme--documenter-dark .docstring>section>a.docs-sourcelink:not(body) .icon:last-child:not(:first-child){margin-left:0.1875em;margin-right:-0.375em}html.theme--documenter-dark .tag:not(body) .icon:first-child:last-child,html.theme--documenter-dark .content kbd:not(body) .icon:first-child:last-child,html.theme--documenter-dark .docstring>section>a.docs-sourcelink:not(body) .icon:first-child:last-child{margin-left:-0.375em;margin-right:-0.375em}html.theme--documenter-dark .tag.is-delete:not(body),html.theme--documenter-dark .content kbd.is-delete:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body){margin-left:1px;padding:0;position:relative;width:2em}html.theme--documenter-dark .tag.is-delete:not(body)::before,html.theme--documenter-dark .content kbd.is-delete:not(body)::before,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body)::before,html.theme--documenter-dark .tag.is-delete:not(body)::after,html.theme--documenter-dark .content kbd.is-delete:not(body)::after,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body)::after{background-color:currentColor;content:"";display:block;left:50%;position:absolute;top:50%;transform:translateX(-50%) translateY(-50%) rotate(45deg);transform-origin:center center}html.theme--documenter-dark .tag.is-delete:not(body)::before,html.theme--documenter-dark .content kbd.is-delete:not(body)::before,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body)::before{height:1px;width:50%}html.theme--documenter-dark .tag.is-delete:not(body)::after,html.theme--documenter-dark .content kbd.is-delete:not(body)::after,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body)::after{height:50%;width:1px}html.theme--documenter-dark .tag.is-delete:not(body):hover,html.theme--documenter-dark .content 
kbd.is-delete:not(body):hover,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body):hover,html.theme--documenter-dark .tag.is-delete:not(body):focus,html.theme--documenter-dark .content kbd.is-delete:not(body):focus,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body):focus{background-color:#1d2122}html.theme--documenter-dark .tag.is-delete:not(body):active,html.theme--documenter-dark .content kbd.is-delete:not(body):active,html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-delete:not(body):active{background-color:#111414}html.theme--documenter-dark .tag.is-rounded:not(body),html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:not(body),html.theme--documenter-dark .content kbd.is-rounded:not(body),html.theme--documenter-dark #documenter .docs-sidebar .content form.docs-search>input:not(body),html.theme--documenter-dark .docstring>section>a.docs-sourcelink.is-rounded:not(body){border-radius:290486px}html.theme--documenter-dark a.tag:hover,html.theme--documenter-dark .docstring>section>a.docs-sourcelink:hover{text-decoration:underline}html.theme--documenter-dark .title,html.theme--documenter-dark .subtitle{word-break:break-word}html.theme--documenter-dark .title em,html.theme--documenter-dark .title span,html.theme--documenter-dark .subtitle em,html.theme--documenter-dark .subtitle span{font-weight:inherit}html.theme--documenter-dark .title sub,html.theme--documenter-dark .subtitle sub{font-size:.75em}html.theme--documenter-dark .title sup,html.theme--documenter-dark .subtitle sup{font-size:.75em}html.theme--documenter-dark .title .tag,html.theme--documenter-dark .title .content kbd,html.theme--documenter-dark .content .title kbd,html.theme--documenter-dark .title .docstring>section>a.docs-sourcelink,html.theme--documenter-dark .subtitle .tag,html.theme--documenter-dark .subtitle .content kbd,html.theme--documenter-dark .content .subtitle 
kbd,html.theme--documenter-dark .subtitle .docstring>section>a.docs-sourcelink{vertical-align:middle}html.theme--documenter-dark .title{color:#fff;font-size:2rem;font-weight:500;line-height:1.125}html.theme--documenter-dark .title strong{color:inherit;font-weight:inherit}html.theme--documenter-dark .title+.highlight{margin-top:-0.75rem}html.theme--documenter-dark .title:not(.is-spaced)+.subtitle{margin-top:-1.25rem}html.theme--documenter-dark .title.is-1{font-size:3rem}html.theme--documenter-dark .title.is-2{font-size:2.5rem}html.theme--documenter-dark .title.is-3{font-size:2rem}html.theme--documenter-dark .title.is-4{font-size:1.5rem}html.theme--documenter-dark .title.is-5{font-size:1.25rem}html.theme--documenter-dark .title.is-6{font-size:15px}html.theme--documenter-dark .title.is-7{font-size:.85em}html.theme--documenter-dark .subtitle{color:#8c9b9d;font-size:1.25rem;font-weight:400;line-height:1.25}html.theme--documenter-dark .subtitle strong{color:#8c9b9d;font-weight:600}html.theme--documenter-dark .subtitle:not(.is-spaced)+.title{margin-top:-1.25rem}html.theme--documenter-dark .subtitle.is-1{font-size:3rem}html.theme--documenter-dark .subtitle.is-2{font-size:2.5rem}html.theme--documenter-dark .subtitle.is-3{font-size:2rem}html.theme--documenter-dark .subtitle.is-4{font-size:1.5rem}html.theme--documenter-dark .subtitle.is-5{font-size:1.25rem}html.theme--documenter-dark .subtitle.is-6{font-size:15px}html.theme--documenter-dark .subtitle.is-7{font-size:.85em}html.theme--documenter-dark .heading{display:block;font-size:11px;letter-spacing:1px;margin-bottom:5px;text-transform:uppercase}html.theme--documenter-dark .highlight{font-weight:400;max-width:100%;overflow:hidden;padding:0}html.theme--documenter-dark .highlight pre{overflow:auto;max-width:100%}html.theme--documenter-dark 
.number{align-items:center;background-color:#282f2f;border-radius:290486px;display:inline-flex;font-size:1.25rem;height:2em;justify-content:center;margin-right:1.5rem;min-width:2.5em;padding:0.25rem 0.5rem;text-align:center;vertical-align:top}html.theme--documenter-dark .select select,html.theme--documenter-dark .textarea,html.theme--documenter-dark .input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input{background-color:#1f2424;border-color:#5e6d6f;border-radius:.4em;color:#dbdee0}html.theme--documenter-dark .select select::-moz-placeholder,html.theme--documenter-dark .textarea::-moz-placeholder,html.theme--documenter-dark .input::-moz-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input::-moz-placeholder{color:rgba(219,222,224,0.3)}html.theme--documenter-dark .select select::-webkit-input-placeholder,html.theme--documenter-dark .textarea::-webkit-input-placeholder,html.theme--documenter-dark .input::-webkit-input-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input::-webkit-input-placeholder{color:rgba(219,222,224,0.3)}html.theme--documenter-dark .select select:-moz-placeholder,html.theme--documenter-dark .textarea:-moz-placeholder,html.theme--documenter-dark .input:-moz-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:-moz-placeholder{color:rgba(219,222,224,0.3)}html.theme--documenter-dark .select select:-ms-input-placeholder,html.theme--documenter-dark .textarea:-ms-input-placeholder,html.theme--documenter-dark .input:-ms-input-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:-ms-input-placeholder{color:rgba(219,222,224,0.3)}html.theme--documenter-dark .select select:hover,html.theme--documenter-dark .textarea:hover,html.theme--documenter-dark .input:hover,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:hover,html.theme--documenter-dark .select 
select.is-hovered,html.theme--documenter-dark .is-hovered.textarea,html.theme--documenter-dark .is-hovered.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-hovered{border-color:#8c9b9d}html.theme--documenter-dark .select select:focus,html.theme--documenter-dark .textarea:focus,html.theme--documenter-dark .input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:focus,html.theme--documenter-dark .select select.is-focused,html.theme--documenter-dark .is-focused.textarea,html.theme--documenter-dark .is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .select select:active,html.theme--documenter-dark .textarea:active,html.theme--documenter-dark .input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:active,html.theme--documenter-dark .select select.is-active,html.theme--documenter-dark .is-active.textarea,html.theme--documenter-dark .is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{border-color:#1abc9c;box-shadow:0 0 0 0.125em rgba(26,188,156,0.25)}html.theme--documenter-dark .select select[disabled],html.theme--documenter-dark .textarea[disabled],html.theme--documenter-dark .input[disabled],html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[disabled],fieldset[disabled] html.theme--documenter-dark .select select,fieldset[disabled] html.theme--documenter-dark .textarea,fieldset[disabled] html.theme--documenter-dark .input,fieldset[disabled] html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input{background-color:#8c9b9d;border-color:#282f2f;box-shadow:none;color:#fff}html.theme--documenter-dark .select select[disabled]::-moz-placeholder,html.theme--documenter-dark .textarea[disabled]::-moz-placeholder,html.theme--documenter-dark 
.input[disabled]::-moz-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[disabled]::-moz-placeholder,fieldset[disabled] html.theme--documenter-dark .select select::-moz-placeholder,fieldset[disabled] html.theme--documenter-dark .textarea::-moz-placeholder,fieldset[disabled] html.theme--documenter-dark .input::-moz-placeholder,fieldset[disabled] html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input::-moz-placeholder{color:rgba(255,255,255,0.3)}html.theme--documenter-dark .select select[disabled]::-webkit-input-placeholder,html.theme--documenter-dark .textarea[disabled]::-webkit-input-placeholder,html.theme--documenter-dark .input[disabled]::-webkit-input-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[disabled]::-webkit-input-placeholder,fieldset[disabled] html.theme--documenter-dark .select select::-webkit-input-placeholder,fieldset[disabled] html.theme--documenter-dark .textarea::-webkit-input-placeholder,fieldset[disabled] html.theme--documenter-dark .input::-webkit-input-placeholder,fieldset[disabled] html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input::-webkit-input-placeholder{color:rgba(255,255,255,0.3)}html.theme--documenter-dark .select select[disabled]:-moz-placeholder,html.theme--documenter-dark .textarea[disabled]:-moz-placeholder,html.theme--documenter-dark .input[disabled]:-moz-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[disabled]:-moz-placeholder,fieldset[disabled] html.theme--documenter-dark .select select:-moz-placeholder,fieldset[disabled] html.theme--documenter-dark .textarea:-moz-placeholder,fieldset[disabled] html.theme--documenter-dark .input:-moz-placeholder,fieldset[disabled] html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:-moz-placeholder{color:rgba(255,255,255,0.3)}html.theme--documenter-dark .select 
select[disabled]:-ms-input-placeholder,html.theme--documenter-dark .textarea[disabled]:-ms-input-placeholder,html.theme--documenter-dark .input[disabled]:-ms-input-placeholder,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[disabled]:-ms-input-placeholder,fieldset[disabled] html.theme--documenter-dark .select select:-ms-input-placeholder,fieldset[disabled] html.theme--documenter-dark .textarea:-ms-input-placeholder,fieldset[disabled] html.theme--documenter-dark .input:-ms-input-placeholder,fieldset[disabled] html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input:-ms-input-placeholder{color:rgba(255,255,255,0.3)}html.theme--documenter-dark .textarea,html.theme--documenter-dark .input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input{box-shadow:inset 0 1px 2px rgba(10,10,10,0.1);max-width:100%;width:100%}html.theme--documenter-dark .textarea[readonly],html.theme--documenter-dark .input[readonly],html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input[readonly]{box-shadow:none}html.theme--documenter-dark .is-white.textarea,html.theme--documenter-dark .is-white.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-white{border-color:#fff}html.theme--documenter-dark .is-white.textarea:focus,html.theme--documenter-dark .is-white.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-white:focus,html.theme--documenter-dark .is-white.is-focused.textarea,html.theme--documenter-dark .is-white.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-white.textarea:active,html.theme--documenter-dark .is-white.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-white:active,html.theme--documenter-dark .is-white.is-active.textarea,html.theme--documenter-dark 
.is-white.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(255,255,255,0.25)}html.theme--documenter-dark .is-black.textarea,html.theme--documenter-dark .is-black.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-black{border-color:#0a0a0a}html.theme--documenter-dark .is-black.textarea:focus,html.theme--documenter-dark .is-black.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-black:focus,html.theme--documenter-dark .is-black.is-focused.textarea,html.theme--documenter-dark .is-black.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-black.textarea:active,html.theme--documenter-dark .is-black.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-black:active,html.theme--documenter-dark .is-black.is-active.textarea,html.theme--documenter-dark .is-black.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(10,10,10,0.25)}html.theme--documenter-dark .is-light.textarea,html.theme--documenter-dark .is-light.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-light{border-color:#ecf0f1}html.theme--documenter-dark .is-light.textarea:focus,html.theme--documenter-dark .is-light.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-light:focus,html.theme--documenter-dark .is-light.is-focused.textarea,html.theme--documenter-dark .is-light.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-light.textarea:active,html.theme--documenter-dark .is-light.input:active,html.theme--documenter-dark #documenter .docs-sidebar 
form.docs-search>input.is-light:active,html.theme--documenter-dark .is-light.is-active.textarea,html.theme--documenter-dark .is-light.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(236,240,241,0.25)}html.theme--documenter-dark .is-dark.textarea,html.theme--documenter-dark .content kbd.textarea,html.theme--documenter-dark .is-dark.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-dark,html.theme--documenter-dark .content kbd.input{border-color:#282f2f}html.theme--documenter-dark .is-dark.textarea:focus,html.theme--documenter-dark .content kbd.textarea:focus,html.theme--documenter-dark .is-dark.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-dark:focus,html.theme--documenter-dark .content kbd.input:focus,html.theme--documenter-dark .is-dark.is-focused.textarea,html.theme--documenter-dark .content kbd.is-focused.textarea,html.theme--documenter-dark .is-dark.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .content kbd.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar .content form.docs-search>input.is-focused,html.theme--documenter-dark .is-dark.textarea:active,html.theme--documenter-dark .content kbd.textarea:active,html.theme--documenter-dark .is-dark.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-dark:active,html.theme--documenter-dark .content kbd.input:active,html.theme--documenter-dark .is-dark.is-active.textarea,html.theme--documenter-dark .content kbd.is-active.textarea,html.theme--documenter-dark .is-dark.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active,html.theme--documenter-dark .content kbd.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar .content 
form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(40,47,47,0.25)}html.theme--documenter-dark .is-primary.textarea,html.theme--documenter-dark .docstring>section>a.textarea.docs-sourcelink,html.theme--documenter-dark .is-primary.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-primary,html.theme--documenter-dark .docstring>section>a.input.docs-sourcelink{border-color:#375a7f}html.theme--documenter-dark .is-primary.textarea:focus,html.theme--documenter-dark .docstring>section>a.textarea.docs-sourcelink:focus,html.theme--documenter-dark .is-primary.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-primary:focus,html.theme--documenter-dark .docstring>section>a.input.docs-sourcelink:focus,html.theme--documenter-dark .is-primary.is-focused.textarea,html.theme--documenter-dark .docstring>section>a.is-focused.textarea.docs-sourcelink,html.theme--documenter-dark .is-primary.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .docstring>section>a.is-focused.input.docs-sourcelink,html.theme--documenter-dark .is-primary.textarea:active,html.theme--documenter-dark .docstring>section>a.textarea.docs-sourcelink:active,html.theme--documenter-dark .is-primary.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-primary:active,html.theme--documenter-dark .docstring>section>a.input.docs-sourcelink:active,html.theme--documenter-dark .is-primary.is-active.textarea,html.theme--documenter-dark .docstring>section>a.is-active.textarea.docs-sourcelink,html.theme--documenter-dark .is-primary.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active,html.theme--documenter-dark .docstring>section>a.is-active.input.docs-sourcelink{box-shadow:0 0 0 0.125em rgba(55,90,127,0.25)}html.theme--documenter-dark 
.is-link.textarea,html.theme--documenter-dark .is-link.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-link{border-color:#1abc9c}html.theme--documenter-dark .is-link.textarea:focus,html.theme--documenter-dark .is-link.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-link:focus,html.theme--documenter-dark .is-link.is-focused.textarea,html.theme--documenter-dark .is-link.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-link.textarea:active,html.theme--documenter-dark .is-link.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-link:active,html.theme--documenter-dark .is-link.is-active.textarea,html.theme--documenter-dark .is-link.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(26,188,156,0.25)}html.theme--documenter-dark .is-info.textarea,html.theme--documenter-dark .is-info.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-info{border-color:#024c7d}html.theme--documenter-dark .is-info.textarea:focus,html.theme--documenter-dark .is-info.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-info:focus,html.theme--documenter-dark .is-info.is-focused.textarea,html.theme--documenter-dark .is-info.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-info.textarea:active,html.theme--documenter-dark .is-info.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-info:active,html.theme--documenter-dark .is-info.is-active.textarea,html.theme--documenter-dark .is-info.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em 
rgba(2,76,125,0.25)}html.theme--documenter-dark .is-success.textarea,html.theme--documenter-dark .is-success.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-success{border-color:#008438}html.theme--documenter-dark .is-success.textarea:focus,html.theme--documenter-dark .is-success.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-success:focus,html.theme--documenter-dark .is-success.is-focused.textarea,html.theme--documenter-dark .is-success.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-success.textarea:active,html.theme--documenter-dark .is-success.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-success:active,html.theme--documenter-dark .is-success.is-active.textarea,html.theme--documenter-dark .is-success.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(0,132,56,0.25)}html.theme--documenter-dark .is-warning.textarea,html.theme--documenter-dark .is-warning.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-warning{border-color:#ad8100}html.theme--documenter-dark .is-warning.textarea:focus,html.theme--documenter-dark .is-warning.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-warning:focus,html.theme--documenter-dark .is-warning.is-focused.textarea,html.theme--documenter-dark .is-warning.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-warning.textarea:active,html.theme--documenter-dark .is-warning.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-warning:active,html.theme--documenter-dark .is-warning.is-active.textarea,html.theme--documenter-dark 
.is-warning.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(173,129,0,0.25)}html.theme--documenter-dark .is-danger.textarea,html.theme--documenter-dark .is-danger.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-danger{border-color:#9e1b0d}html.theme--documenter-dark .is-danger.textarea:focus,html.theme--documenter-dark .is-danger.input:focus,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-danger:focus,html.theme--documenter-dark .is-danger.is-focused.textarea,html.theme--documenter-dark .is-danger.is-focused.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-focused,html.theme--documenter-dark .is-danger.textarea:active,html.theme--documenter-dark .is-danger.input:active,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-danger:active,html.theme--documenter-dark .is-danger.is-active.textarea,html.theme--documenter-dark .is-danger.is-active.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(158,27,13,0.25)}html.theme--documenter-dark .is-small.textarea,html.theme--documenter-dark .is-small.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input{border-radius:3px;font-size:.85em}html.theme--documenter-dark .is-medium.textarea,html.theme--documenter-dark .is-medium.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-medium{font-size:1.25rem}html.theme--documenter-dark .is-large.textarea,html.theme--documenter-dark .is-large.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-large{font-size:1.5rem}html.theme--documenter-dark .is-fullwidth.textarea,html.theme--documenter-dark .is-fullwidth.input,html.theme--documenter-dark #documenter .docs-sidebar 
form.docs-search>input.is-fullwidth{display:block;width:100%}html.theme--documenter-dark .is-inline.textarea,html.theme--documenter-dark .is-inline.input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-inline{display:inline;width:auto}html.theme--documenter-dark .input.is-rounded,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input{border-radius:290486px;padding-left:1em;padding-right:1em}html.theme--documenter-dark .input.is-static,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-static{background-color:transparent;border-color:transparent;box-shadow:none;padding-left:0;padding-right:0}html.theme--documenter-dark .textarea{display:block;max-width:100%;min-width:100%;padding:0.625em;resize:vertical}html.theme--documenter-dark .textarea:not([rows]){max-height:600px;min-height:120px}html.theme--documenter-dark .textarea[rows]{height:initial}html.theme--documenter-dark .textarea.has-fixed-size{resize:none}html.theme--documenter-dark .radio,html.theme--documenter-dark .checkbox{cursor:pointer;display:inline-block;line-height:1.25;position:relative}html.theme--documenter-dark .radio input,html.theme--documenter-dark .checkbox input{cursor:pointer}html.theme--documenter-dark .radio:hover,html.theme--documenter-dark .checkbox:hover{color:#8c9b9d}html.theme--documenter-dark .radio[disabled],html.theme--documenter-dark .checkbox[disabled],fieldset[disabled] html.theme--documenter-dark .radio,fieldset[disabled] html.theme--documenter-dark .checkbox{color:#fff;cursor:not-allowed}html.theme--documenter-dark .radio+.radio{margin-left:0.5em}html.theme--documenter-dark .select{display:inline-block;max-width:100%;position:relative;vertical-align:top}html.theme--documenter-dark .select:not(.is-multiple){height:2.25em}html.theme--documenter-dark .select:not(.is-multiple):not(.is-loading)::after{border-color:#1abc9c;right:1.125em;z-index:4}html.theme--documenter-dark .select.is-rounded 
select,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.select select{border-radius:290486px;padding-left:1em}html.theme--documenter-dark .select select{cursor:pointer;display:block;font-size:1em;max-width:100%;outline:none}html.theme--documenter-dark .select select::-ms-expand{display:none}html.theme--documenter-dark .select select[disabled]:hover,fieldset[disabled] html.theme--documenter-dark .select select:hover{border-color:#282f2f}html.theme--documenter-dark .select select:not([multiple]){padding-right:2.5em}html.theme--documenter-dark .select select[multiple]{height:auto;padding:0}html.theme--documenter-dark .select select[multiple] option{padding:0.5em 1em}html.theme--documenter-dark .select:not(.is-multiple):not(.is-loading):hover::after{border-color:#8c9b9d}html.theme--documenter-dark .select.is-white:not(:hover)::after{border-color:#fff}html.theme--documenter-dark .select.is-white select{border-color:#fff}html.theme--documenter-dark .select.is-white select:hover,html.theme--documenter-dark .select.is-white select.is-hovered{border-color:#f2f2f2}html.theme--documenter-dark .select.is-white select:focus,html.theme--documenter-dark .select.is-white select.is-focused,html.theme--documenter-dark .select.is-white select:active,html.theme--documenter-dark .select.is-white select.is-active{box-shadow:0 0 0 0.125em rgba(255,255,255,0.25)}html.theme--documenter-dark .select.is-black:not(:hover)::after{border-color:#0a0a0a}html.theme--documenter-dark .select.is-black select{border-color:#0a0a0a}html.theme--documenter-dark .select.is-black select:hover,html.theme--documenter-dark .select.is-black select.is-hovered{border-color:#000}html.theme--documenter-dark .select.is-black select:focus,html.theme--documenter-dark .select.is-black select.is-focused,html.theme--documenter-dark .select.is-black select:active,html.theme--documenter-dark .select.is-black select.is-active{box-shadow:0 0 0 0.125em 
rgba(10,10,10,0.25)}html.theme--documenter-dark .select.is-light:not(:hover)::after{border-color:#ecf0f1}html.theme--documenter-dark .select.is-light select{border-color:#ecf0f1}html.theme--documenter-dark .select.is-light select:hover,html.theme--documenter-dark .select.is-light select.is-hovered{border-color:#dde4e6}html.theme--documenter-dark .select.is-light select:focus,html.theme--documenter-dark .select.is-light select.is-focused,html.theme--documenter-dark .select.is-light select:active,html.theme--documenter-dark .select.is-light select.is-active{box-shadow:0 0 0 0.125em rgba(236,240,241,0.25)}html.theme--documenter-dark .select.is-dark:not(:hover)::after,html.theme--documenter-dark .content kbd.select:not(:hover)::after{border-color:#282f2f}html.theme--documenter-dark .select.is-dark select,html.theme--documenter-dark .content kbd.select select{border-color:#282f2f}html.theme--documenter-dark .select.is-dark select:hover,html.theme--documenter-dark .content kbd.select select:hover,html.theme--documenter-dark .select.is-dark select.is-hovered,html.theme--documenter-dark .content kbd.select select.is-hovered{border-color:#1d2122}html.theme--documenter-dark .select.is-dark select:focus,html.theme--documenter-dark .content kbd.select select:focus,html.theme--documenter-dark .select.is-dark select.is-focused,html.theme--documenter-dark .content kbd.select select.is-focused,html.theme--documenter-dark .select.is-dark select:active,html.theme--documenter-dark .content kbd.select select:active,html.theme--documenter-dark .select.is-dark select.is-active,html.theme--documenter-dark .content kbd.select select.is-active{box-shadow:0 0 0 0.125em rgba(40,47,47,0.25)}html.theme--documenter-dark .select.is-primary:not(:hover)::after,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink:not(:hover)::after{border-color:#375a7f}html.theme--documenter-dark .select.is-primary select,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink 
select{border-color:#375a7f}html.theme--documenter-dark .select.is-primary select:hover,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink select:hover,html.theme--documenter-dark .select.is-primary select.is-hovered,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink select.is-hovered{border-color:#2f4d6d}html.theme--documenter-dark .select.is-primary select:focus,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink select:focus,html.theme--documenter-dark .select.is-primary select.is-focused,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink select.is-focused,html.theme--documenter-dark .select.is-primary select:active,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink select:active,html.theme--documenter-dark .select.is-primary select.is-active,html.theme--documenter-dark .docstring>section>a.select.docs-sourcelink select.is-active{box-shadow:0 0 0 0.125em rgba(55,90,127,0.25)}html.theme--documenter-dark .select.is-link:not(:hover)::after{border-color:#1abc9c}html.theme--documenter-dark .select.is-link select{border-color:#1abc9c}html.theme--documenter-dark .select.is-link select:hover,html.theme--documenter-dark .select.is-link select.is-hovered{border-color:#17a689}html.theme--documenter-dark .select.is-link select:focus,html.theme--documenter-dark .select.is-link select.is-focused,html.theme--documenter-dark .select.is-link select:active,html.theme--documenter-dark .select.is-link select.is-active{box-shadow:0 0 0 0.125em rgba(26,188,156,0.25)}html.theme--documenter-dark .select.is-info:not(:hover)::after{border-color:#024c7d}html.theme--documenter-dark .select.is-info select{border-color:#024c7d}html.theme--documenter-dark .select.is-info select:hover,html.theme--documenter-dark .select.is-info select.is-hovered{border-color:#023d64}html.theme--documenter-dark .select.is-info select:focus,html.theme--documenter-dark .select.is-info 
select.is-focused,html.theme--documenter-dark .select.is-info select:active,html.theme--documenter-dark .select.is-info select.is-active{box-shadow:0 0 0 0.125em rgba(2,76,125,0.25)}html.theme--documenter-dark .select.is-success:not(:hover)::after{border-color:#008438}html.theme--documenter-dark .select.is-success select{border-color:#008438}html.theme--documenter-dark .select.is-success select:hover,html.theme--documenter-dark .select.is-success select.is-hovered{border-color:#006b2d}html.theme--documenter-dark .select.is-success select:focus,html.theme--documenter-dark .select.is-success select.is-focused,html.theme--documenter-dark .select.is-success select:active,html.theme--documenter-dark .select.is-success select.is-active{box-shadow:0 0 0 0.125em rgba(0,132,56,0.25)}html.theme--documenter-dark .select.is-warning:not(:hover)::after{border-color:#ad8100}html.theme--documenter-dark .select.is-warning select{border-color:#ad8100}html.theme--documenter-dark .select.is-warning select:hover,html.theme--documenter-dark .select.is-warning select.is-hovered{border-color:#946e00}html.theme--documenter-dark .select.is-warning select:focus,html.theme--documenter-dark .select.is-warning select.is-focused,html.theme--documenter-dark .select.is-warning select:active,html.theme--documenter-dark .select.is-warning select.is-active{box-shadow:0 0 0 0.125em rgba(173,129,0,0.25)}html.theme--documenter-dark .select.is-danger:not(:hover)::after{border-color:#9e1b0d}html.theme--documenter-dark .select.is-danger select{border-color:#9e1b0d}html.theme--documenter-dark .select.is-danger select:hover,html.theme--documenter-dark .select.is-danger select.is-hovered{border-color:#86170b}html.theme--documenter-dark .select.is-danger select:focus,html.theme--documenter-dark .select.is-danger select.is-focused,html.theme--documenter-dark .select.is-danger select:active,html.theme--documenter-dark .select.is-danger select.is-active{box-shadow:0 0 0 0.125em 
rgba(158,27,13,0.25)}html.theme--documenter-dark .select.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.select{border-radius:3px;font-size:.85em}html.theme--documenter-dark .select.is-medium{font-size:1.25rem}html.theme--documenter-dark .select.is-large{font-size:1.5rem}html.theme--documenter-dark .select.is-disabled::after{border-color:#fff}html.theme--documenter-dark .select.is-fullwidth{width:100%}html.theme--documenter-dark .select.is-fullwidth select{width:100%}html.theme--documenter-dark .select.is-loading::after{margin-top:0;position:absolute;right:0.625em;top:0.625em;transform:none}html.theme--documenter-dark .select.is-loading.is-small:after,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-loading:after{font-size:.85em}html.theme--documenter-dark .select.is-loading.is-medium:after{font-size:1.25rem}html.theme--documenter-dark .select.is-loading.is-large:after{font-size:1.5rem}html.theme--documenter-dark .file{align-items:stretch;display:flex;justify-content:flex-start;position:relative}html.theme--documenter-dark .file.is-white .file-cta{background-color:#fff;border-color:transparent;color:#0a0a0a}html.theme--documenter-dark .file.is-white:hover .file-cta,html.theme--documenter-dark .file.is-white.is-hovered .file-cta{background-color:#f9f9f9;border-color:transparent;color:#0a0a0a}html.theme--documenter-dark .file.is-white:focus .file-cta,html.theme--documenter-dark .file.is-white.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(255,255,255,0.25);color:#0a0a0a}html.theme--documenter-dark .file.is-white:active .file-cta,html.theme--documenter-dark .file.is-white.is-active .file-cta{background-color:#f2f2f2;border-color:transparent;color:#0a0a0a}html.theme--documenter-dark .file.is-black .file-cta{background-color:#0a0a0a;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-black:hover .file-cta,html.theme--documenter-dark 
.file.is-black.is-hovered .file-cta{background-color:#040404;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-black:focus .file-cta,html.theme--documenter-dark .file.is-black.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(10,10,10,0.25);color:#fff}html.theme--documenter-dark .file.is-black:active .file-cta,html.theme--documenter-dark .file.is-black.is-active .file-cta{background-color:#000;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-light .file-cta{background-color:#ecf0f1;border-color:transparent;color:#282f2f}html.theme--documenter-dark .file.is-light:hover .file-cta,html.theme--documenter-dark .file.is-light.is-hovered .file-cta{background-color:#e5eaec;border-color:transparent;color:#282f2f}html.theme--documenter-dark .file.is-light:focus .file-cta,html.theme--documenter-dark .file.is-light.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(236,240,241,0.25);color:#282f2f}html.theme--documenter-dark .file.is-light:active .file-cta,html.theme--documenter-dark .file.is-light.is-active .file-cta{background-color:#dde4e6;border-color:transparent;color:#282f2f}html.theme--documenter-dark .file.is-dark .file-cta,html.theme--documenter-dark .content kbd.file .file-cta{background-color:#282f2f;border-color:transparent;color:#ecf0f1}html.theme--documenter-dark .file.is-dark:hover .file-cta,html.theme--documenter-dark .content kbd.file:hover .file-cta,html.theme--documenter-dark .file.is-dark.is-hovered .file-cta,html.theme--documenter-dark .content kbd.file.is-hovered .file-cta{background-color:#232829;border-color:transparent;color:#ecf0f1}html.theme--documenter-dark .file.is-dark:focus .file-cta,html.theme--documenter-dark .content kbd.file:focus .file-cta,html.theme--documenter-dark .file.is-dark.is-focused .file-cta,html.theme--documenter-dark .content kbd.file.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em 
rgba(40,47,47,0.25);color:#ecf0f1}html.theme--documenter-dark .file.is-dark:active .file-cta,html.theme--documenter-dark .content kbd.file:active .file-cta,html.theme--documenter-dark .file.is-dark.is-active .file-cta,html.theme--documenter-dark .content kbd.file.is-active .file-cta{background-color:#1d2122;border-color:transparent;color:#ecf0f1}html.theme--documenter-dark .file.is-primary .file-cta,html.theme--documenter-dark .docstring>section>a.file.docs-sourcelink .file-cta{background-color:#375a7f;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-primary:hover .file-cta,html.theme--documenter-dark .docstring>section>a.file.docs-sourcelink:hover .file-cta,html.theme--documenter-dark .file.is-primary.is-hovered .file-cta,html.theme--documenter-dark .docstring>section>a.file.is-hovered.docs-sourcelink .file-cta{background-color:#335476;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-primary:focus .file-cta,html.theme--documenter-dark .docstring>section>a.file.docs-sourcelink:focus .file-cta,html.theme--documenter-dark .file.is-primary.is-focused .file-cta,html.theme--documenter-dark .docstring>section>a.file.is-focused.docs-sourcelink .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(55,90,127,0.25);color:#fff}html.theme--documenter-dark .file.is-primary:active .file-cta,html.theme--documenter-dark .docstring>section>a.file.docs-sourcelink:active .file-cta,html.theme--documenter-dark .file.is-primary.is-active .file-cta,html.theme--documenter-dark .docstring>section>a.file.is-active.docs-sourcelink .file-cta{background-color:#2f4d6d;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-link .file-cta{background-color:#1abc9c;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-link:hover .file-cta,html.theme--documenter-dark .file.is-link.is-hovered .file-cta{background-color:#18b193;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-link:focus 
.file-cta,html.theme--documenter-dark .file.is-link.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(26,188,156,0.25);color:#fff}html.theme--documenter-dark .file.is-link:active .file-cta,html.theme--documenter-dark .file.is-link.is-active .file-cta{background-color:#17a689;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-info .file-cta{background-color:#024c7d;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-info:hover .file-cta,html.theme--documenter-dark .file.is-info.is-hovered .file-cta{background-color:#024470;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-info:focus .file-cta,html.theme--documenter-dark .file.is-info.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(2,76,125,0.25);color:#fff}html.theme--documenter-dark .file.is-info:active .file-cta,html.theme--documenter-dark .file.is-info.is-active .file-cta{background-color:#023d64;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-success .file-cta{background-color:#008438;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-success:hover .file-cta,html.theme--documenter-dark .file.is-success.is-hovered .file-cta{background-color:#073;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-success:focus .file-cta,html.theme--documenter-dark .file.is-success.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(0,132,56,0.25);color:#fff}html.theme--documenter-dark .file.is-success:active .file-cta,html.theme--documenter-dark .file.is-success.is-active .file-cta{background-color:#006b2d;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-warning .file-cta{background-color:#ad8100;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-warning:hover .file-cta,html.theme--documenter-dark .file.is-warning.is-hovered 
.file-cta{background-color:#a07700;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-warning:focus .file-cta,html.theme--documenter-dark .file.is-warning.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(173,129,0,0.25);color:#fff}html.theme--documenter-dark .file.is-warning:active .file-cta,html.theme--documenter-dark .file.is-warning.is-active .file-cta{background-color:#946e00;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-danger .file-cta{background-color:#9e1b0d;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-danger:hover .file-cta,html.theme--documenter-dark .file.is-danger.is-hovered .file-cta{background-color:#92190c;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-danger:focus .file-cta,html.theme--documenter-dark .file.is-danger.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(158,27,13,0.25);color:#fff}html.theme--documenter-dark .file.is-danger:active .file-cta,html.theme--documenter-dark .file.is-danger.is-active .file-cta{background-color:#86170b;border-color:transparent;color:#fff}html.theme--documenter-dark .file.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.file{font-size:.85em}html.theme--documenter-dark .file.is-medium{font-size:1.25rem}html.theme--documenter-dark .file.is-medium .file-icon .fa{font-size:21px}html.theme--documenter-dark .file.is-large{font-size:1.5rem}html.theme--documenter-dark .file.is-large .file-icon .fa{font-size:28px}html.theme--documenter-dark .file.has-name .file-cta{border-bottom-right-radius:0;border-top-right-radius:0}html.theme--documenter-dark .file.has-name .file-name{border-bottom-left-radius:0;border-top-left-radius:0}html.theme--documenter-dark .file.has-name.is-empty .file-cta{border-radius:.4em}html.theme--documenter-dark .file.has-name.is-empty .file-name{display:none}html.theme--documenter-dark .file.is-boxed 
.file-label{flex-direction:column}html.theme--documenter-dark .file.is-boxed .file-cta{flex-direction:column;height:auto;padding:1em 3em}html.theme--documenter-dark .file.is-boxed .file-name{border-width:0 1px 1px}html.theme--documenter-dark .file.is-boxed .file-icon{height:1.5em;width:1.5em}html.theme--documenter-dark .file.is-boxed .file-icon .fa{font-size:21px}html.theme--documenter-dark .file.is-boxed.is-small .file-icon .fa,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-boxed .file-icon .fa{font-size:14px}html.theme--documenter-dark .file.is-boxed.is-medium .file-icon .fa{font-size:28px}html.theme--documenter-dark .file.is-boxed.is-large .file-icon .fa{font-size:35px}html.theme--documenter-dark .file.is-boxed.has-name .file-cta{border-radius:.4em .4em 0 0}html.theme--documenter-dark .file.is-boxed.has-name .file-name{border-radius:0 0 .4em .4em;border-width:0 1px 1px}html.theme--documenter-dark .file.is-centered{justify-content:center}html.theme--documenter-dark .file.is-fullwidth .file-label{width:100%}html.theme--documenter-dark .file.is-fullwidth .file-name{flex-grow:1;max-width:none}html.theme--documenter-dark .file.is-right{justify-content:flex-end}html.theme--documenter-dark .file.is-right .file-cta{border-radius:0 .4em .4em 0}html.theme--documenter-dark .file.is-right .file-name{border-radius:.4em 0 0 .4em;border-width:1px 0 1px 1px;order:-1}html.theme--documenter-dark .file-label{align-items:stretch;display:flex;cursor:pointer;justify-content:flex-start;overflow:hidden;position:relative}html.theme--documenter-dark .file-label:hover .file-cta{background-color:#e5eaec;color:#282f2f}html.theme--documenter-dark .file-label:hover .file-name{border-color:#596668}html.theme--documenter-dark .file-label:active .file-cta{background-color:#dde4e6;color:#282f2f}html.theme--documenter-dark .file-label:active .file-name{border-color:#535f61}html.theme--documenter-dark 
.file-input{height:100%;left:0;opacity:0;outline:none;position:absolute;top:0;width:100%}html.theme--documenter-dark .file-cta,html.theme--documenter-dark .file-name{border-color:#5e6d6f;border-radius:.4em;font-size:1em;padding-left:1em;padding-right:1em;white-space:nowrap}html.theme--documenter-dark .file-cta{background-color:#ecf0f1;color:#343c3d}html.theme--documenter-dark .file-name{border-color:#5e6d6f;border-style:solid;border-width:1px 1px 1px 0;display:block;max-width:16em;overflow:hidden;text-align:left;text-overflow:ellipsis}html.theme--documenter-dark .file-icon{align-items:center;display:flex;height:1em;justify-content:center;margin-right:0.5em;width:1em}html.theme--documenter-dark .file-icon .fa{font-size:14px}html.theme--documenter-dark .label{color:#282f2f;display:block;font-size:15px;font-weight:700}html.theme--documenter-dark .label:not(:last-child){margin-bottom:0.5em}html.theme--documenter-dark .label.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.label{font-size:.85em}html.theme--documenter-dark .label.is-medium{font-size:1.25rem}html.theme--documenter-dark .label.is-large{font-size:1.5rem}html.theme--documenter-dark .help{display:block;font-size:.85em;margin-top:0.25rem}html.theme--documenter-dark .help.is-white{color:#fff}html.theme--documenter-dark .help.is-black{color:#0a0a0a}html.theme--documenter-dark .help.is-light{color:#ecf0f1}html.theme--documenter-dark .help.is-dark,html.theme--documenter-dark .content kbd.help{color:#282f2f}html.theme--documenter-dark .help.is-primary,html.theme--documenter-dark .docstring>section>a.help.docs-sourcelink{color:#375a7f}html.theme--documenter-dark .help.is-link{color:#1abc9c}html.theme--documenter-dark .help.is-info{color:#024c7d}html.theme--documenter-dark .help.is-success{color:#008438}html.theme--documenter-dark .help.is-warning{color:#ad8100}html.theme--documenter-dark .help.is-danger{color:#9e1b0d}html.theme--documenter-dark 
.field:not(:last-child){margin-bottom:0.75rem}html.theme--documenter-dark .field.has-addons{display:flex;justify-content:flex-start}html.theme--documenter-dark .field.has-addons .control:not(:last-child){margin-right:-1px}html.theme--documenter-dark .field.has-addons .control:not(:first-child):not(:last-child) .button,html.theme--documenter-dark .field.has-addons .control:not(:first-child):not(:last-child) .input,html.theme--documenter-dark .field.has-addons .control:not(:first-child):not(:last-child) #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control:not(:first-child):not(:last-child) form.docs-search>input,html.theme--documenter-dark .field.has-addons .control:not(:first-child):not(:last-child) .select select{border-radius:0}html.theme--documenter-dark .field.has-addons .control:first-child:not(:only-child) .button,html.theme--documenter-dark .field.has-addons .control:first-child:not(:only-child) .input,html.theme--documenter-dark .field.has-addons .control:first-child:not(:only-child) #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control:first-child:not(:only-child) form.docs-search>input,html.theme--documenter-dark .field.has-addons .control:first-child:not(:only-child) .select select{border-bottom-right-radius:0;border-top-right-radius:0}html.theme--documenter-dark .field.has-addons .control:last-child:not(:only-child) .button,html.theme--documenter-dark .field.has-addons .control:last-child:not(:only-child) .input,html.theme--documenter-dark .field.has-addons .control:last-child:not(:only-child) #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control:last-child:not(:only-child) form.docs-search>input,html.theme--documenter-dark .field.has-addons .control:last-child:not(:only-child) .select 
select{border-bottom-left-radius:0;border-top-left-radius:0}html.theme--documenter-dark .field.has-addons .control .button:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .button.is-hovered:not([disabled]),html.theme--documenter-dark .field.has-addons .control .input:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):hover,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .input.is-hovered:not([disabled]),html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-hovered:not([disabled]),html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-hovered:not([disabled]),html.theme--documenter-dark .field.has-addons .control .select select:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .select select.is-hovered:not([disabled]){z-index:2}html.theme--documenter-dark .field.has-addons .control .button:not([disabled]):focus,html.theme--documenter-dark .field.has-addons .control .button.is-focused:not([disabled]),html.theme--documenter-dark .field.has-addons .control .button:not([disabled]):active,html.theme--documenter-dark .field.has-addons .control .button.is-active:not([disabled]),html.theme--documenter-dark .field.has-addons .control .input:not([disabled]):focus,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):focus,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):focus,html.theme--documenter-dark .field.has-addons .control .input.is-focused:not([disabled]),html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar 
form.docs-search>input.is-focused:not([disabled]),html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-focused:not([disabled]),html.theme--documenter-dark .field.has-addons .control .input:not([disabled]):active,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):active,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):active,html.theme--documenter-dark .field.has-addons .control .input.is-active:not([disabled]),html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-active:not([disabled]),html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-active:not([disabled]),html.theme--documenter-dark .field.has-addons .control .select select:not([disabled]):focus,html.theme--documenter-dark .field.has-addons .control .select select.is-focused:not([disabled]),html.theme--documenter-dark .field.has-addons .control .select select:not([disabled]):active,html.theme--documenter-dark .field.has-addons .control .select select.is-active:not([disabled]){z-index:3}html.theme--documenter-dark .field.has-addons .control .button:not([disabled]):focus:hover,html.theme--documenter-dark .field.has-addons .control .button.is-focused:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .button:not([disabled]):active:hover,html.theme--documenter-dark .field.has-addons .control .button.is-active:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .input:not([disabled]):focus:hover,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):focus:hover,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control 
form.docs-search>input:not([disabled]):focus:hover,html.theme--documenter-dark .field.has-addons .control .input.is-focused:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-focused:not([disabled]):hover,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-focused:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .input:not([disabled]):active:hover,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):active:hover,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):active:hover,html.theme--documenter-dark .field.has-addons .control .input.is-active:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-active:not([disabled]):hover,html.theme--documenter-dark #documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-active:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .select select:not([disabled]):focus:hover,html.theme--documenter-dark .field.has-addons .control .select select.is-focused:not([disabled]):hover,html.theme--documenter-dark .field.has-addons .control .select select:not([disabled]):active:hover,html.theme--documenter-dark .field.has-addons .control .select select.is-active:not([disabled]):hover{z-index:4}html.theme--documenter-dark .field.has-addons .control.is-expanded{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .field.has-addons.has-addons-centered{justify-content:center}html.theme--documenter-dark .field.has-addons.has-addons-right{justify-content:flex-end}html.theme--documenter-dark .field.has-addons.has-addons-fullwidth .control{flex-grow:1;flex-shrink:0}html.theme--documenter-dark 
.field.is-grouped{display:flex;justify-content:flex-start}html.theme--documenter-dark .field.is-grouped>.control{flex-shrink:0}html.theme--documenter-dark .field.is-grouped>.control:not(:last-child){margin-bottom:0;margin-right:0.75rem}html.theme--documenter-dark .field.is-grouped>.control.is-expanded{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .field.is-grouped.is-grouped-centered{justify-content:center}html.theme--documenter-dark .field.is-grouped.is-grouped-right{justify-content:flex-end}html.theme--documenter-dark .field.is-grouped.is-grouped-multiline{flex-wrap:wrap}html.theme--documenter-dark .field.is-grouped.is-grouped-multiline>.control:last-child,html.theme--documenter-dark .field.is-grouped.is-grouped-multiline>.control:not(:last-child){margin-bottom:0.75rem}html.theme--documenter-dark .field.is-grouped.is-grouped-multiline:last-child{margin-bottom:-0.75rem}html.theme--documenter-dark .field.is-grouped.is-grouped-multiline:not(:last-child){margin-bottom:0}@media screen and (min-width: 769px),print{html.theme--documenter-dark .field.is-horizontal{display:flex}}html.theme--documenter-dark .field-label .label{font-size:inherit}@media screen and (max-width: 768px){html.theme--documenter-dark .field-label{margin-bottom:0.5rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .field-label{flex-basis:0;flex-grow:1;flex-shrink:0;margin-right:1.5rem;text-align:right}html.theme--documenter-dark .field-label.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.field-label{font-size:.85em;padding-top:0.375em}html.theme--documenter-dark .field-label.is-normal{padding-top:0.375em}html.theme--documenter-dark .field-label.is-medium{font-size:1.25rem;padding-top:0.375em}html.theme--documenter-dark .field-label.is-large{font-size:1.5rem;padding-top:0.375em}}html.theme--documenter-dark .field-body .field .field{margin-bottom:0}@media screen and (min-width: 769px),print{html.theme--documenter-dark 
.field-body{display:flex;flex-basis:0;flex-grow:5;flex-shrink:1}html.theme--documenter-dark .field-body .field{margin-bottom:0}html.theme--documenter-dark .field-body>.field{flex-shrink:1}html.theme--documenter-dark .field-body>.field:not(.is-narrow){flex-grow:1}html.theme--documenter-dark .field-body>.field:not(:last-child){margin-right:0.75rem}}html.theme--documenter-dark .control{box-sizing:border-box;clear:both;font-size:15px;position:relative;text-align:left}html.theme--documenter-dark .control.has-icons-left .input:focus~.icon,html.theme--documenter-dark .control.has-icons-left #documenter .docs-sidebar form.docs-search>input:focus~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-left form.docs-search>input:focus~.icon,html.theme--documenter-dark .control.has-icons-left .select:focus~.icon,html.theme--documenter-dark .control.has-icons-right .input:focus~.icon,html.theme--documenter-dark .control.has-icons-right #documenter .docs-sidebar form.docs-search>input:focus~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-right form.docs-search>input:focus~.icon,html.theme--documenter-dark .control.has-icons-right .select:focus~.icon{color:#5e6d6f}html.theme--documenter-dark .control.has-icons-left .input.is-small~.icon,html.theme--documenter-dark .control.has-icons-left #documenter .docs-sidebar form.docs-search>input~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-left form.docs-search>input~.icon,html.theme--documenter-dark .control.has-icons-left .select.is-small~.icon,html.theme--documenter-dark .control.has-icons-right .input.is-small~.icon,html.theme--documenter-dark .control.has-icons-right #documenter .docs-sidebar form.docs-search>input~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-right form.docs-search>input~.icon,html.theme--documenter-dark .control.has-icons-right .select.is-small~.icon{font-size:.85em}html.theme--documenter-dark 
.control.has-icons-left .input.is-medium~.icon,html.theme--documenter-dark .control.has-icons-left #documenter .docs-sidebar form.docs-search>input.is-medium~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-left form.docs-search>input.is-medium~.icon,html.theme--documenter-dark .control.has-icons-left .select.is-medium~.icon,html.theme--documenter-dark .control.has-icons-right .input.is-medium~.icon,html.theme--documenter-dark .control.has-icons-right #documenter .docs-sidebar form.docs-search>input.is-medium~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-right form.docs-search>input.is-medium~.icon,html.theme--documenter-dark .control.has-icons-right .select.is-medium~.icon{font-size:1.25rem}html.theme--documenter-dark .control.has-icons-left .input.is-large~.icon,html.theme--documenter-dark .control.has-icons-left #documenter .docs-sidebar form.docs-search>input.is-large~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-left form.docs-search>input.is-large~.icon,html.theme--documenter-dark .control.has-icons-left .select.is-large~.icon,html.theme--documenter-dark .control.has-icons-right .input.is-large~.icon,html.theme--documenter-dark .control.has-icons-right #documenter .docs-sidebar form.docs-search>input.is-large~.icon,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-right form.docs-search>input.is-large~.icon,html.theme--documenter-dark .control.has-icons-right .select.is-large~.icon{font-size:1.5rem}html.theme--documenter-dark .control.has-icons-left .icon,html.theme--documenter-dark .control.has-icons-right .icon{color:#dbdee0;height:2.25em;pointer-events:none;position:absolute;top:0;width:2.25em;z-index:4}html.theme--documenter-dark .control.has-icons-left .input,html.theme--documenter-dark .control.has-icons-left #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-left 
form.docs-search>input,html.theme--documenter-dark .control.has-icons-left .select select{padding-left:2.25em}html.theme--documenter-dark .control.has-icons-left .icon.is-left{left:0}html.theme--documenter-dark .control.has-icons-right .input,html.theme--documenter-dark .control.has-icons-right #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar .control.has-icons-right form.docs-search>input,html.theme--documenter-dark .control.has-icons-right .select select{padding-right:2.25em}html.theme--documenter-dark .control.has-icons-right .icon.is-right{right:0}html.theme--documenter-dark .control.is-loading::after{position:absolute !important;right:0.625em;top:0.625em;z-index:4}html.theme--documenter-dark .control.is-loading.is-small:after,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.is-loading:after{font-size:.85em}html.theme--documenter-dark .control.is-loading.is-medium:after{font-size:1.25rem}html.theme--documenter-dark .control.is-loading.is-large:after{font-size:1.5rem}html.theme--documenter-dark .breadcrumb{font-size:15px;white-space:nowrap}html.theme--documenter-dark .breadcrumb a{align-items:center;color:#1abc9c;display:flex;justify-content:center;padding:0 .75em}html.theme--documenter-dark .breadcrumb a:hover{color:#1dd2af}html.theme--documenter-dark .breadcrumb li{align-items:center;display:flex}html.theme--documenter-dark .breadcrumb li:first-child a{padding-left:0}html.theme--documenter-dark .breadcrumb li.is-active a{color:#f2f2f2;cursor:default;pointer-events:none}html.theme--documenter-dark .breadcrumb li+li::before{color:#8c9b9d;content:"\0002f"}html.theme--documenter-dark .breadcrumb ul,html.theme--documenter-dark .breadcrumb ol{align-items:flex-start;display:flex;flex-wrap:wrap;justify-content:flex-start}html.theme--documenter-dark .breadcrumb .icon:first-child{margin-right:0.5em}html.theme--documenter-dark .breadcrumb 
.icon:last-child{margin-left:0.5em}html.theme--documenter-dark .breadcrumb.is-centered ol,html.theme--documenter-dark .breadcrumb.is-centered ul{justify-content:center}html.theme--documenter-dark .breadcrumb.is-right ol,html.theme--documenter-dark .breadcrumb.is-right ul{justify-content:flex-end}html.theme--documenter-dark .breadcrumb.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.breadcrumb{font-size:.85em}html.theme--documenter-dark .breadcrumb.is-medium{font-size:1.25rem}html.theme--documenter-dark .breadcrumb.is-large{font-size:1.5rem}html.theme--documenter-dark .breadcrumb.has-arrow-separator li+li::before{content:"\02192"}html.theme--documenter-dark .breadcrumb.has-bullet-separator li+li::before{content:"\02022"}html.theme--documenter-dark .breadcrumb.has-dot-separator li+li::before{content:"\000b7"}html.theme--documenter-dark .breadcrumb.has-succeeds-separator li+li::before{content:"\0227B"}html.theme--documenter-dark .card{background-color:#fff;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px rgba(10,10,10,0.1);color:#fff;max-width:100%;position:relative}html.theme--documenter-dark .card-header{background-color:rgba(0,0,0,0);align-items:stretch;box-shadow:0 1px 2px rgba(10,10,10,0.1);display:flex}html.theme--documenter-dark .card-header-title{align-items:center;color:#f2f2f2;display:flex;flex-grow:1;font-weight:700;padding:.75rem}html.theme--documenter-dark .card-header-title.is-centered{justify-content:center}html.theme--documenter-dark .card-header-icon{align-items:center;cursor:pointer;display:flex;justify-content:center;padding:.75rem}html.theme--documenter-dark .card-image{display:block;position:relative}html.theme--documenter-dark .card-content{background-color:rgba(0,0,0,0);padding:1.5rem}html.theme--documenter-dark .card-footer{background-color:rgba(0,0,0,0);border-top:1px solid #5e6d6f;align-items:stretch;display:flex}html.theme--documenter-dark 
.card-footer-item{align-items:center;display:flex;flex-basis:0;flex-grow:1;flex-shrink:0;justify-content:center;padding:.75rem}html.theme--documenter-dark .card-footer-item:not(:last-child){border-right:1px solid #5e6d6f}html.theme--documenter-dark .card .media:not(:last-child){margin-bottom:1.5rem}html.theme--documenter-dark .dropdown{display:inline-flex;position:relative;vertical-align:top}html.theme--documenter-dark .dropdown.is-active .dropdown-menu,html.theme--documenter-dark .dropdown.is-hoverable:hover .dropdown-menu{display:block}html.theme--documenter-dark .dropdown.is-right .dropdown-menu{left:auto;right:0}html.theme--documenter-dark .dropdown.is-up .dropdown-menu{bottom:100%;padding-bottom:4px;padding-top:initial;top:auto}html.theme--documenter-dark .dropdown-menu{display:none;left:0;min-width:12rem;padding-top:4px;position:absolute;top:100%;z-index:20}html.theme--documenter-dark .dropdown-content{background-color:#282f2f;border-radius:.4em;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px rgba(10,10,10,0.1);padding-bottom:.5rem;padding-top:.5rem}html.theme--documenter-dark .dropdown-item{color:#fff;display:block;font-size:0.875rem;line-height:1.5;padding:0.375rem 1rem;position:relative}html.theme--documenter-dark a.dropdown-item,html.theme--documenter-dark button.dropdown-item{padding-right:3rem;text-align:left;white-space:nowrap;width:100%}html.theme--documenter-dark a.dropdown-item:hover,html.theme--documenter-dark button.dropdown-item:hover{background-color:#282f2f;color:#0a0a0a}html.theme--documenter-dark a.dropdown-item.is-active,html.theme--documenter-dark button.dropdown-item.is-active{background-color:#1abc9c;color:#fff}html.theme--documenter-dark .dropdown-divider{background-color:#5e6d6f;border:none;display:block;height:1px;margin:0.5rem 0}html.theme--documenter-dark .level{align-items:center;justify-content:space-between}html.theme--documenter-dark .level code{border-radius:.4em}html.theme--documenter-dark .level 
img{display:inline-block;vertical-align:top}html.theme--documenter-dark .level.is-mobile{display:flex}html.theme--documenter-dark .level.is-mobile .level-left,html.theme--documenter-dark .level.is-mobile .level-right{display:flex}html.theme--documenter-dark .level.is-mobile .level-left+.level-right{margin-top:0}html.theme--documenter-dark .level.is-mobile .level-item:not(:last-child){margin-bottom:0;margin-right:.75rem}html.theme--documenter-dark .level.is-mobile .level-item:not(.is-narrow){flex-grow:1}@media screen and (min-width: 769px),print{html.theme--documenter-dark .level{display:flex}html.theme--documenter-dark .level>.level-item:not(.is-narrow){flex-grow:1}}html.theme--documenter-dark .level-item{align-items:center;display:flex;flex-basis:auto;flex-grow:0;flex-shrink:0;justify-content:center}html.theme--documenter-dark .level-item .title,html.theme--documenter-dark .level-item .subtitle{margin-bottom:0}@media screen and (max-width: 768px){html.theme--documenter-dark .level-item:not(:last-child){margin-bottom:.75rem}}html.theme--documenter-dark .level-left,html.theme--documenter-dark .level-right{flex-basis:auto;flex-grow:0;flex-shrink:0}html.theme--documenter-dark .level-left .level-item.is-flexible,html.theme--documenter-dark .level-right .level-item.is-flexible{flex-grow:1}@media screen and (min-width: 769px),print{html.theme--documenter-dark .level-left .level-item:not(:last-child),html.theme--documenter-dark .level-right .level-item:not(:last-child){margin-right:.75rem}}html.theme--documenter-dark .level-left{align-items:center;justify-content:flex-start}@media screen and (max-width: 768px){html.theme--documenter-dark .level-left+.level-right{margin-top:1.5rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .level-left{display:flex}}html.theme--documenter-dark .level-right{align-items:center;justify-content:flex-end}@media screen and (min-width: 769px),print{html.theme--documenter-dark 
.level-right{display:flex}}html.theme--documenter-dark .list{background-color:#fff;border-radius:.4em;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px rgba(10,10,10,0.1)}html.theme--documenter-dark .list-item{display:block;padding:0.5em 1em}html.theme--documenter-dark .list-item:not(a){color:#fff}html.theme--documenter-dark .list-item:first-child{border-top-left-radius:.4em;border-top-right-radius:.4em}html.theme--documenter-dark .list-item:last-child{border-bottom-left-radius:.4em;border-bottom-right-radius:.4em}html.theme--documenter-dark .list-item:not(:last-child){border-bottom:1px solid #5e6d6f}html.theme--documenter-dark .list-item.is-active{background-color:#1abc9c;color:#fff}html.theme--documenter-dark a.list-item{background-color:#282f2f;cursor:pointer}html.theme--documenter-dark .media{align-items:flex-start;display:flex;text-align:left}html.theme--documenter-dark .media .content:not(:last-child){margin-bottom:0.75rem}html.theme--documenter-dark .media .media{border-top:1px solid rgba(94,109,111,0.5);display:flex;padding-top:0.75rem}html.theme--documenter-dark .media .media .content:not(:last-child),html.theme--documenter-dark .media .media .control:not(:last-child){margin-bottom:0.5rem}html.theme--documenter-dark .media .media .media{padding-top:0.5rem}html.theme--documenter-dark .media .media .media+.media{margin-top:0.5rem}html.theme--documenter-dark .media+.media{border-top:1px solid rgba(94,109,111,0.5);margin-top:1rem;padding-top:1rem}html.theme--documenter-dark .media.is-large+.media{margin-top:1.5rem;padding-top:1.5rem}html.theme--documenter-dark .media-left,html.theme--documenter-dark .media-right{flex-basis:auto;flex-grow:0;flex-shrink:0}html.theme--documenter-dark .media-left{margin-right:1rem}html.theme--documenter-dark .media-right{margin-left:1rem}html.theme--documenter-dark .media-content{flex-basis:auto;flex-grow:1;flex-shrink:1;text-align:left}@media screen and (max-width: 768px){html.theme--documenter-dark 
.media-content{overflow-x:auto}}html.theme--documenter-dark .menu{font-size:15px}html.theme--documenter-dark .menu.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.menu{font-size:.85em}html.theme--documenter-dark .menu.is-medium{font-size:1.25rem}html.theme--documenter-dark .menu.is-large{font-size:1.5rem}html.theme--documenter-dark .menu-list{line-height:1.25}html.theme--documenter-dark .menu-list a{border-radius:3px;color:#fff;display:block;padding:0.5em 0.75em}html.theme--documenter-dark .menu-list a:hover{background-color:#282f2f;color:#f2f2f2}html.theme--documenter-dark .menu-list a.is-active{background-color:#1abc9c;color:#fff}html.theme--documenter-dark .menu-list li ul{border-left:1px solid #5e6d6f;margin:.75em;padding-left:.75em}html.theme--documenter-dark .menu-label{color:#fff;font-size:.75em;letter-spacing:.1em;text-transform:uppercase}html.theme--documenter-dark .menu-label:not(:first-child){margin-top:1em}html.theme--documenter-dark .menu-label:not(:last-child){margin-bottom:1em}html.theme--documenter-dark .message{background-color:#282f2f;border-radius:.4em;font-size:15px}html.theme--documenter-dark .message strong{color:currentColor}html.theme--documenter-dark .message a:not(.button):not(.tag):not(.dropdown-item){color:currentColor;text-decoration:underline}html.theme--documenter-dark .message.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.message{font-size:.85em}html.theme--documenter-dark .message.is-medium{font-size:1.25rem}html.theme--documenter-dark .message.is-large{font-size:1.5rem}html.theme--documenter-dark .message.is-white{background-color:#fff}html.theme--documenter-dark .message.is-white .message-header{background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .message.is-white .message-body{border-color:#fff;color:#4d4d4d}html.theme--documenter-dark .message.is-black{background-color:#fafafa}html.theme--documenter-dark .message.is-black 
.message-header{background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .message.is-black .message-body{border-color:#0a0a0a;color:#090909}html.theme--documenter-dark .message.is-light{background-color:#f9fafb}html.theme--documenter-dark .message.is-light .message-header{background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .message.is-light .message-body{border-color:#ecf0f1;color:#505050}html.theme--documenter-dark .message.is-dark,html.theme--documenter-dark .content kbd.message{background-color:#f9fafa}html.theme--documenter-dark .message.is-dark .message-header,html.theme--documenter-dark .content kbd.message .message-header{background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .message.is-dark .message-body,html.theme--documenter-dark .content kbd.message .message-body{border-color:#282f2f;color:#212526}html.theme--documenter-dark .message.is-primary,html.theme--documenter-dark .docstring>section>a.message.docs-sourcelink{background-color:#f8fafc}html.theme--documenter-dark .message.is-primary .message-header,html.theme--documenter-dark .docstring>section>a.message.docs-sourcelink .message-header{background-color:#375a7f;color:#fff}html.theme--documenter-dark .message.is-primary .message-body,html.theme--documenter-dark .docstring>section>a.message.docs-sourcelink .message-body{border-color:#375a7f;color:#2b4159}html.theme--documenter-dark .message.is-link{background-color:#f6fefc}html.theme--documenter-dark .message.is-link .message-header{background-color:#1abc9c;color:#fff}html.theme--documenter-dark .message.is-link .message-body{border-color:#1abc9c;color:#0b2f28}html.theme--documenter-dark .message.is-info{background-color:#f5fbff}html.theme--documenter-dark .message.is-info .message-header{background-color:#024c7d;color:#fff}html.theme--documenter-dark .message.is-info .message-body{border-color:#024c7d;color:#033659}html.theme--documenter-dark .message.is-success{background-color:#f5fff9}html.theme--documenter-dark 
.message.is-success .message-header{background-color:#008438;color:#fff}html.theme--documenter-dark .message.is-success .message-body{border-color:#008438;color:#023518}html.theme--documenter-dark .message.is-warning{background-color:#fffcf5}html.theme--documenter-dark .message.is-warning .message-header{background-color:#ad8100;color:#fff}html.theme--documenter-dark .message.is-warning .message-body{border-color:#ad8100;color:#3d2e03}html.theme--documenter-dark .message.is-danger{background-color:#fef6f6}html.theme--documenter-dark .message.is-danger .message-header{background-color:#9e1b0d;color:#fff}html.theme--documenter-dark .message.is-danger .message-body{border-color:#9e1b0d;color:#7a170c}html.theme--documenter-dark .message-header{align-items:center;background-color:#fff;border-radius:.4em .4em 0 0;color:rgba(0,0,0,0.7);display:flex;font-weight:700;justify-content:space-between;line-height:1.25;padding:0.75em 1em;position:relative}html.theme--documenter-dark .message-header .delete{flex-grow:0;flex-shrink:0;margin-left:0.75em}html.theme--documenter-dark .message-header+.message-body{border-width:0;border-top-left-radius:0;border-top-right-radius:0}html.theme--documenter-dark .message-body{border-color:#5e6d6f;border-radius:.4em;border-style:solid;border-width:0 0 0 4px;color:#fff;padding:1.25em 1.5em}html.theme--documenter-dark .message-body code,html.theme--documenter-dark .message-body pre{background-color:#fff}html.theme--documenter-dark .message-body pre code{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .modal{align-items:center;display:none;flex-direction:column;justify-content:center;overflow:hidden;position:fixed;z-index:40}html.theme--documenter-dark .modal.is-active{display:flex}html.theme--documenter-dark .modal-background{background-color:rgba(10,10,10,0.86)}html.theme--documenter-dark .modal-content,html.theme--documenter-dark .modal-card{margin:0 20px;max-height:calc(100vh - 
160px);overflow:auto;position:relative;width:100%}@media screen and (min-width: 769px),print{html.theme--documenter-dark .modal-content,html.theme--documenter-dark .modal-card{margin:0 auto;max-height:calc(100vh - 40px);width:640px}}html.theme--documenter-dark .modal-close{background:none;height:40px;position:fixed;right:20px;top:20px;width:40px}html.theme--documenter-dark .modal-card{display:flex;flex-direction:column;max-height:calc(100vh - 40px);overflow:hidden;-ms-overflow-y:visible}html.theme--documenter-dark .modal-card-head,html.theme--documenter-dark .modal-card-foot{align-items:center;background-color:#282f2f;display:flex;flex-shrink:0;justify-content:flex-start;padding:20px;position:relative}html.theme--documenter-dark .modal-card-head{border-bottom:1px solid #5e6d6f;border-top-left-radius:8px;border-top-right-radius:8px}html.theme--documenter-dark .modal-card-title{color:#f2f2f2;flex-grow:1;flex-shrink:0;font-size:1.5rem;line-height:1}html.theme--documenter-dark .modal-card-foot{border-bottom-left-radius:8px;border-bottom-right-radius:8px;border-top:1px solid #5e6d6f}html.theme--documenter-dark .modal-card-foot .button:not(:last-child){margin-right:0.5em}html.theme--documenter-dark .modal-card-body{-webkit-overflow-scrolling:touch;background-color:#fff;flex-grow:1;flex-shrink:1;overflow:auto;padding:20px}html.theme--documenter-dark .navbar{background-color:#375a7f;min-height:4rem;position:relative;z-index:30}html.theme--documenter-dark .navbar.is-white{background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-white .navbar-brand .navbar-link{color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-white .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-white .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-white .navbar-brand 
.navbar-link:focus,html.theme--documenter-dark .navbar.is-white .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-white .navbar-brand .navbar-link.is-active{background-color:#f2f2f2;color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-brand .navbar-link::after{border-color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-burger{color:#0a0a0a}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-white .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-white .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-white .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-white .navbar-end .navbar-link{color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-white .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-white .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-white .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-white .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-white .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-white .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-white .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-white .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-white .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-white .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-white .navbar-end .navbar-link.is-active{background-color:#f2f2f2;color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-white .navbar-end .navbar-link::after{border-color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-item.has-dropdown:focus 
.navbar-link,html.theme--documenter-dark .navbar.is-white .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-white .navbar-item.has-dropdown.is-active .navbar-link{background-color:#f2f2f2;color:#0a0a0a}html.theme--documenter-dark .navbar.is-white .navbar-dropdown a.navbar-item.is-active{background-color:#fff;color:#0a0a0a}}html.theme--documenter-dark .navbar.is-black{background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-black .navbar-brand .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-black .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-black .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-black .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-black .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-black .navbar-brand .navbar-link.is-active{background-color:#000;color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-black .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-black .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-black .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-black .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-black .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-black .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-black .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-black .navbar-start 
.navbar-link:hover,html.theme--documenter-dark .navbar.is-black .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-black .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-black .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-black .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-black .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-black .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-black .navbar-end .navbar-link.is-active{background-color:#000;color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-black .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-black .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-black .navbar-item.has-dropdown.is-active .navbar-link{background-color:#000;color:#fff}html.theme--documenter-dark .navbar.is-black .navbar-dropdown a.navbar-item.is-active{background-color:#0a0a0a;color:#fff}}html.theme--documenter-dark .navbar.is-light{background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .navbar.is-light .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-light .navbar-brand .navbar-link{color:#282f2f}html.theme--documenter-dark .navbar.is-light .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-light .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-light .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-light .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-light .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-light .navbar-brand .navbar-link.is-active{background-color:#dde4e6;color:#282f2f}html.theme--documenter-dark 
.navbar.is-light .navbar-brand .navbar-link::after{border-color:#282f2f}html.theme--documenter-dark .navbar.is-light .navbar-burger{color:#282f2f}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-light .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-light .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-light .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-light .navbar-end .navbar-link{color:#282f2f}html.theme--documenter-dark .navbar.is-light .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-light .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-light .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-light .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-light .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-light .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-light .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-light .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-light .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-light .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-light .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-light .navbar-end .navbar-link.is-active{background-color:#dde4e6;color:#282f2f}html.theme--documenter-dark .navbar.is-light .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-light .navbar-end .navbar-link::after{border-color:#282f2f}html.theme--documenter-dark .navbar.is-light .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-light .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-light .navbar-item.has-dropdown.is-active .navbar-link{background-color:#dde4e6;color:#282f2f}html.theme--documenter-dark 
.navbar.is-light .navbar-dropdown a.navbar-item.is-active{background-color:#ecf0f1;color:#282f2f}}html.theme--documenter-dark .navbar.is-dark,html.theme--documenter-dark .content kbd.navbar{background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-brand>.navbar-item,html.theme--documenter-dark .content kbd.navbar .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-dark .navbar-brand .navbar-link,html.theme--documenter-dark .content kbd.navbar .navbar-brand .navbar-link{color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .content kbd.navbar .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-dark .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .content kbd.navbar .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-dark .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-dark .navbar-brand .navbar-link:focus,html.theme--documenter-dark .content kbd.navbar .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-dark .navbar-brand .navbar-link:hover,html.theme--documenter-dark .content kbd.navbar .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-dark .navbar-brand .navbar-link.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-brand .navbar-link.is-active{background-color:#1d2122;color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-brand .navbar-link::after,html.theme--documenter-dark .content kbd.navbar .navbar-brand .navbar-link::after{border-color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-burger,html.theme--documenter-dark .content kbd.navbar .navbar-burger{color:#ecf0f1}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-dark .navbar-start>.navbar-item,html.theme--documenter-dark 
.content kbd.navbar .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-dark .navbar-start .navbar-link,html.theme--documenter-dark .content kbd.navbar .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-dark .navbar-end>.navbar-item,html.theme--documenter-dark .content kbd.navbar .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-dark .navbar-end .navbar-link,html.theme--documenter-dark .content kbd.navbar .navbar-end .navbar-link{color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .content kbd.navbar .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-dark .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .content kbd.navbar .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-dark .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-dark .navbar-start .navbar-link:focus,html.theme--documenter-dark .content kbd.navbar .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-dark .navbar-start .navbar-link:hover,html.theme--documenter-dark .content kbd.navbar .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-dark .navbar-start .navbar-link.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-dark .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .content kbd.navbar .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-dark .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .content kbd.navbar .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-dark .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-dark .navbar-end 
.navbar-link:focus,html.theme--documenter-dark .content kbd.navbar .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-dark .navbar-end .navbar-link:hover,html.theme--documenter-dark .content kbd.navbar .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-dark .navbar-end .navbar-link.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-end .navbar-link.is-active{background-color:#1d2122;color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-start .navbar-link::after,html.theme--documenter-dark .content kbd.navbar .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-dark .navbar-end .navbar-link::after,html.theme--documenter-dark .content kbd.navbar .navbar-end .navbar-link::after{border-color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .content kbd.navbar .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-dark .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .content kbd.navbar .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-dark .navbar-item.has-dropdown.is-active .navbar-link,html.theme--documenter-dark .content kbd.navbar .navbar-item.has-dropdown.is-active .navbar-link{background-color:#1d2122;color:#ecf0f1}html.theme--documenter-dark .navbar.is-dark .navbar-dropdown a.navbar-item.is-active,html.theme--documenter-dark .content kbd.navbar .navbar-dropdown a.navbar-item.is-active{background-color:#282f2f;color:#ecf0f1}}html.theme--documenter-dark .navbar.is-primary,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink{background-color:#375a7f;color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-brand>.navbar-item,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-primary .navbar-brand 
.navbar-link,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-primary .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-primary .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-primary .navbar-brand .navbar-link:focus,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-primary .navbar-brand .navbar-link:hover,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-primary .navbar-brand .navbar-link.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link.is-active{background-color:#2f4d6d;color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-brand .navbar-link::after,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-burger,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-primary .navbar-start>.navbar-item,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-primary .navbar-start .navbar-link,html.theme--documenter-dark 
.docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-primary .navbar-end>.navbar-item,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-primary .navbar-end .navbar-link,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-primary .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-primary .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-primary .navbar-start .navbar-link:focus,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-primary .navbar-start .navbar-link:hover,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-primary .navbar-start .navbar-link.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-primary .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-primary .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-primary 
.navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-primary .navbar-end .navbar-link:focus,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-primary .navbar-end .navbar-link:hover,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-primary .navbar-end .navbar-link.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link.is-active{background-color:#2f4d6d;color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-start .navbar-link::after,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-primary .navbar-end .navbar-link::after,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-primary .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-primary .navbar-item.has-dropdown.is-active .navbar-link,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-item.has-dropdown.is-active .navbar-link{background-color:#2f4d6d;color:#fff}html.theme--documenter-dark .navbar.is-primary .navbar-dropdown a.navbar-item.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-dropdown 
a.navbar-item.is-active{background-color:#375a7f;color:#fff}}html.theme--documenter-dark .navbar.is-link{background-color:#1abc9c;color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-link .navbar-brand .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-link .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-link .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-link .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-link .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-link .navbar-brand .navbar-link.is-active{background-color:#17a689;color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-link .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-link .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-link .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-link .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-link .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-link .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-link .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-link .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-link .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-link .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-link .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-link 
.navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-link .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-link .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-link .navbar-end .navbar-link.is-active{background-color:#17a689;color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-link .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-link .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-link .navbar-item.has-dropdown.is-active .navbar-link{background-color:#17a689;color:#fff}html.theme--documenter-dark .navbar.is-link .navbar-dropdown a.navbar-item.is-active{background-color:#1abc9c;color:#fff}}html.theme--documenter-dark .navbar.is-info{background-color:#024c7d;color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-info .navbar-brand .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-info .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-info .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-info .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-info .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-info .navbar-brand .navbar-link.is-active{background-color:#023d64;color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-info .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-info .navbar-start 
.navbar-link,html.theme--documenter-dark .navbar.is-info .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-info .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-info .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-info .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-info .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-info .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-info .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-info .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-info .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-info .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-info .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-info .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-info .navbar-end .navbar-link.is-active{background-color:#023d64;color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-info .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-info .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-info .navbar-item.has-dropdown.is-active .navbar-link{background-color:#023d64;color:#fff}html.theme--documenter-dark .navbar.is-info .navbar-dropdown a.navbar-item.is-active{background-color:#024c7d;color:#fff}}html.theme--documenter-dark .navbar.is-success{background-color:#008438;color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-success .navbar-brand 
.navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-success .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-success .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-success .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-success .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-success .navbar-brand .navbar-link.is-active{background-color:#006b2d;color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-success .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-success .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-success .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-success .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-success .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-success .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-success .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-success .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-success .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-success .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-success .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-success .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-success .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-success .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-success 
.navbar-end .navbar-link.is-active{background-color:#006b2d;color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-success .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-success .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-success .navbar-item.has-dropdown.is-active .navbar-link{background-color:#006b2d;color:#fff}html.theme--documenter-dark .navbar.is-success .navbar-dropdown a.navbar-item.is-active{background-color:#008438;color:#fff}}html.theme--documenter-dark .navbar.is-warning{background-color:#ad8100;color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-warning .navbar-brand .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-warning .navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-warning .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-warning .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-warning .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-warning .navbar-brand .navbar-link.is-active{background-color:#946e00;color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-warning .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-warning .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-warning .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-warning .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark 
.navbar.is-warning .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-warning .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-warning .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-warning .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-warning .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-warning .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-warning .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-warning .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-warning .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-warning .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-warning .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-warning .navbar-end .navbar-link.is-active{background-color:#946e00;color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-start .navbar-link::after,html.theme--documenter-dark .navbar.is-warning .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-warning .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-warning .navbar-item.has-dropdown.is-active .navbar-link{background-color:#946e00;color:#fff}html.theme--documenter-dark .navbar.is-warning .navbar-dropdown a.navbar-item.is-active{background-color:#ad8100;color:#fff}}html.theme--documenter-dark .navbar.is-danger{background-color:#9e1b0d;color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-brand>.navbar-item,html.theme--documenter-dark .navbar.is-danger .navbar-brand .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-brand>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-danger 
.navbar-brand>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-danger .navbar-brand>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-danger .navbar-brand .navbar-link:focus,html.theme--documenter-dark .navbar.is-danger .navbar-brand .navbar-link:hover,html.theme--documenter-dark .navbar.is-danger .navbar-brand .navbar-link.is-active{background-color:#86170b;color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-brand .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-burger{color:#fff}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar.is-danger .navbar-start>.navbar-item,html.theme--documenter-dark .navbar.is-danger .navbar-start .navbar-link,html.theme--documenter-dark .navbar.is-danger .navbar-end>.navbar-item,html.theme--documenter-dark .navbar.is-danger .navbar-end .navbar-link{color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-start>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-danger .navbar-start>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-danger .navbar-start>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-danger .navbar-start .navbar-link:focus,html.theme--documenter-dark .navbar.is-danger .navbar-start .navbar-link:hover,html.theme--documenter-dark .navbar.is-danger .navbar-start .navbar-link.is-active,html.theme--documenter-dark .navbar.is-danger .navbar-end>a.navbar-item:focus,html.theme--documenter-dark .navbar.is-danger .navbar-end>a.navbar-item:hover,html.theme--documenter-dark .navbar.is-danger .navbar-end>a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-danger .navbar-end .navbar-link:focus,html.theme--documenter-dark .navbar.is-danger .navbar-end .navbar-link:hover,html.theme--documenter-dark .navbar.is-danger .navbar-end .navbar-link.is-active{background-color:#86170b;color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-start 
.navbar-link::after,html.theme--documenter-dark .navbar.is-danger .navbar-end .navbar-link::after{border-color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar.is-danger .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar.is-danger .navbar-item.has-dropdown.is-active .navbar-link{background-color:#86170b;color:#fff}html.theme--documenter-dark .navbar.is-danger .navbar-dropdown a.navbar-item.is-active{background-color:#9e1b0d;color:#fff}}html.theme--documenter-dark .navbar>.container{align-items:stretch;display:flex;min-height:4rem;width:100%}html.theme--documenter-dark .navbar.has-shadow{box-shadow:0 2px 0 0 #282f2f}html.theme--documenter-dark .navbar.is-fixed-bottom,html.theme--documenter-dark .navbar.is-fixed-top{left:0;position:fixed;right:0;z-index:30}html.theme--documenter-dark .navbar.is-fixed-bottom{bottom:0}html.theme--documenter-dark .navbar.is-fixed-bottom.has-shadow{box-shadow:0 -2px 0 0 #282f2f}html.theme--documenter-dark .navbar.is-fixed-top{top:0}html.theme--documenter-dark html.has-navbar-fixed-top,html.theme--documenter-dark body.has-navbar-fixed-top{padding-top:4rem}html.theme--documenter-dark html.has-navbar-fixed-bottom,html.theme--documenter-dark body.has-navbar-fixed-bottom{padding-bottom:4rem}html.theme--documenter-dark .navbar-brand,html.theme--documenter-dark .navbar-tabs{align-items:stretch;display:flex;flex-shrink:0;min-height:4rem}html.theme--documenter-dark .navbar-brand a.navbar-item:focus,html.theme--documenter-dark .navbar-brand a.navbar-item:hover{background-color:transparent}html.theme--documenter-dark .navbar-tabs{-webkit-overflow-scrolling:touch;max-width:100vw;overflow-x:auto;overflow-y:hidden}html.theme--documenter-dark .navbar-burger{color:#fff;cursor:pointer;display:block;height:4rem;position:relative;width:4rem;margin-left:auto}html.theme--documenter-dark .navbar-burger 
span{background-color:currentColor;display:block;height:1px;left:calc(50% - 8px);position:absolute;transform-origin:center;transition-duration:86ms;transition-property:background-color, opacity, transform;transition-timing-function:ease-out;width:16px}html.theme--documenter-dark .navbar-burger span:nth-child(1){top:calc(50% - 6px)}html.theme--documenter-dark .navbar-burger span:nth-child(2){top:calc(50% - 1px)}html.theme--documenter-dark .navbar-burger span:nth-child(3){top:calc(50% + 4px)}html.theme--documenter-dark .navbar-burger:hover{background-color:rgba(0,0,0,0.05)}html.theme--documenter-dark .navbar-burger.is-active span:nth-child(1){transform:translateY(5px) rotate(45deg)}html.theme--documenter-dark .navbar-burger.is-active span:nth-child(2){opacity:0}html.theme--documenter-dark .navbar-burger.is-active span:nth-child(3){transform:translateY(-5px) rotate(-45deg)}html.theme--documenter-dark .navbar-menu{display:none}html.theme--documenter-dark .navbar-item,html.theme--documenter-dark .navbar-link{color:#fff;display:block;line-height:1.5;padding:0.5rem 0.75rem;position:relative}html.theme--documenter-dark .navbar-item .icon:only-child,html.theme--documenter-dark .navbar-link .icon:only-child{margin-left:-0.25rem;margin-right:-0.25rem}html.theme--documenter-dark a.navbar-item,html.theme--documenter-dark .navbar-link{cursor:pointer}html.theme--documenter-dark a.navbar-item:focus,html.theme--documenter-dark a.navbar-item:focus-within,html.theme--documenter-dark a.navbar-item:hover,html.theme--documenter-dark a.navbar-item.is-active,html.theme--documenter-dark .navbar-link:focus,html.theme--documenter-dark .navbar-link:focus-within,html.theme--documenter-dark .navbar-link:hover,html.theme--documenter-dark .navbar-link.is-active{background-color:rgba(0,0,0,0);color:#1abc9c}html.theme--documenter-dark .navbar-item{display:block;flex-grow:0;flex-shrink:0}html.theme--documenter-dark .navbar-item img{max-height:1.75rem}html.theme--documenter-dark 
.navbar-item.has-dropdown{padding:0}html.theme--documenter-dark .navbar-item.is-expanded{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .navbar-item.is-tab{border-bottom:1px solid transparent;min-height:4rem;padding-bottom:calc(0.5rem - 1px)}html.theme--documenter-dark .navbar-item.is-tab:focus,html.theme--documenter-dark .navbar-item.is-tab:hover{background-color:rgba(0,0,0,0);border-bottom-color:#1abc9c}html.theme--documenter-dark .navbar-item.is-tab.is-active{background-color:rgba(0,0,0,0);border-bottom-color:#1abc9c;border-bottom-style:solid;border-bottom-width:3px;color:#1abc9c;padding-bottom:calc(0.5rem - 3px)}html.theme--documenter-dark .navbar-content{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .navbar-link:not(.is-arrowless){padding-right:2.5em}html.theme--documenter-dark .navbar-link:not(.is-arrowless)::after{border-color:#fff;margin-top:-0.375em;right:1.125em}html.theme--documenter-dark .navbar-dropdown{font-size:0.875rem;padding-bottom:0.5rem;padding-top:0.5rem}html.theme--documenter-dark .navbar-dropdown .navbar-item{padding-left:1.5rem;padding-right:1.5rem}html.theme--documenter-dark .navbar-divider{background-color:rgba(0,0,0,0.2);border:none;display:none;height:2px;margin:0.5rem 0}@media screen and (max-width: 1055px){html.theme--documenter-dark .navbar>.container{display:block}html.theme--documenter-dark .navbar-brand .navbar-item,html.theme--documenter-dark .navbar-tabs .navbar-item{align-items:center;display:flex}html.theme--documenter-dark .navbar-link::after{display:none}html.theme--documenter-dark .navbar-menu{background-color:#375a7f;box-shadow:0 8px 16px rgba(10,10,10,0.1);padding:0.5rem 0}html.theme--documenter-dark .navbar-menu.is-active{display:block}html.theme--documenter-dark .navbar.is-fixed-bottom-touch,html.theme--documenter-dark .navbar.is-fixed-top-touch{left:0;position:fixed;right:0;z-index:30}html.theme--documenter-dark .navbar.is-fixed-bottom-touch{bottom:0}html.theme--documenter-dark 
.navbar.is-fixed-bottom-touch.has-shadow{box-shadow:0 -2px 3px rgba(10,10,10,0.1)}html.theme--documenter-dark .navbar.is-fixed-top-touch{top:0}html.theme--documenter-dark .navbar.is-fixed-top .navbar-menu,html.theme--documenter-dark .navbar.is-fixed-top-touch .navbar-menu{-webkit-overflow-scrolling:touch;max-height:calc(100vh - 4rem);overflow:auto}html.theme--documenter-dark html.has-navbar-fixed-top-touch,html.theme--documenter-dark body.has-navbar-fixed-top-touch{padding-top:4rem}html.theme--documenter-dark html.has-navbar-fixed-bottom-touch,html.theme--documenter-dark body.has-navbar-fixed-bottom-touch{padding-bottom:4rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .navbar,html.theme--documenter-dark .navbar-menu,html.theme--documenter-dark .navbar-start,html.theme--documenter-dark .navbar-end{align-items:stretch;display:flex}html.theme--documenter-dark .navbar{min-height:4rem}html.theme--documenter-dark .navbar.is-spaced{padding:1rem 2rem}html.theme--documenter-dark .navbar.is-spaced .navbar-start,html.theme--documenter-dark .navbar.is-spaced .navbar-end{align-items:center}html.theme--documenter-dark .navbar.is-spaced a.navbar-item,html.theme--documenter-dark .navbar.is-spaced .navbar-link{border-radius:.4em}html.theme--documenter-dark .navbar.is-transparent a.navbar-item:focus,html.theme--documenter-dark .navbar.is-transparent a.navbar-item:hover,html.theme--documenter-dark .navbar.is-transparent a.navbar-item.is-active,html.theme--documenter-dark .navbar.is-transparent .navbar-link:focus,html.theme--documenter-dark .navbar.is-transparent .navbar-link:hover,html.theme--documenter-dark .navbar.is-transparent .navbar-link.is-active{background-color:transparent !important}html.theme--documenter-dark .navbar.is-transparent .navbar-item.has-dropdown.is-active .navbar-link,html.theme--documenter-dark .navbar.is-transparent .navbar-item.has-dropdown.is-hoverable:focus .navbar-link,html.theme--documenter-dark .navbar.is-transparent 
.navbar-item.has-dropdown.is-hoverable:focus-within .navbar-link,html.theme--documenter-dark .navbar.is-transparent .navbar-item.has-dropdown.is-hoverable:hover .navbar-link{background-color:transparent !important}html.theme--documenter-dark .navbar.is-transparent .navbar-dropdown a.navbar-item:focus,html.theme--documenter-dark .navbar.is-transparent .navbar-dropdown a.navbar-item:hover{background-color:rgba(0,0,0,0);color:#dbdee0}html.theme--documenter-dark .navbar.is-transparent .navbar-dropdown a.navbar-item.is-active{background-color:rgba(0,0,0,0);color:#1abc9c}html.theme--documenter-dark .navbar-burger{display:none}html.theme--documenter-dark .navbar-item,html.theme--documenter-dark .navbar-link{align-items:center;display:flex}html.theme--documenter-dark .navbar-item{display:flex}html.theme--documenter-dark .navbar-item.has-dropdown{align-items:stretch}html.theme--documenter-dark .navbar-item.has-dropdown-up .navbar-link::after{transform:rotate(135deg) translate(0.25em, -0.25em)}html.theme--documenter-dark .navbar-item.has-dropdown-up .navbar-dropdown{border-bottom:1px solid rgba(0,0,0,0.2);border-radius:8px 8px 0 0;border-top:none;bottom:100%;box-shadow:0 -8px 8px rgba(10,10,10,0.1);top:auto}html.theme--documenter-dark .navbar-item.is-active .navbar-dropdown,html.theme--documenter-dark .navbar-item.is-hoverable:focus .navbar-dropdown,html.theme--documenter-dark .navbar-item.is-hoverable:focus-within .navbar-dropdown,html.theme--documenter-dark .navbar-item.is-hoverable:hover .navbar-dropdown{display:block}.navbar.is-spaced html.theme--documenter-dark .navbar-item.is-active .navbar-dropdown,html.theme--documenter-dark .navbar-item.is-active .navbar-dropdown.is-boxed,.navbar.is-spaced html.theme--documenter-dark .navbar-item.is-hoverable:focus .navbar-dropdown,html.theme--documenter-dark .navbar-item.is-hoverable:focus .navbar-dropdown.is-boxed,.navbar.is-spaced html.theme--documenter-dark .navbar-item.is-hoverable:focus-within 
.navbar-dropdown,html.theme--documenter-dark .navbar-item.is-hoverable:focus-within .navbar-dropdown.is-boxed,.navbar.is-spaced html.theme--documenter-dark .navbar-item.is-hoverable:hover .navbar-dropdown,html.theme--documenter-dark .navbar-item.is-hoverable:hover .navbar-dropdown.is-boxed{opacity:1;pointer-events:auto;transform:translateY(0)}html.theme--documenter-dark .navbar-menu{flex-grow:1;flex-shrink:0}html.theme--documenter-dark .navbar-start{justify-content:flex-start;margin-right:auto}html.theme--documenter-dark .navbar-end{justify-content:flex-end;margin-left:auto}html.theme--documenter-dark .navbar-dropdown{background-color:#375a7f;border-bottom-left-radius:8px;border-bottom-right-radius:8px;border-top:1px solid rgba(0,0,0,0.2);box-shadow:0 8px 8px rgba(10,10,10,0.1);display:none;font-size:0.875rem;left:0;min-width:100%;position:absolute;top:100%;z-index:20}html.theme--documenter-dark .navbar-dropdown .navbar-item{padding:0.375rem 1rem;white-space:nowrap}html.theme--documenter-dark .navbar-dropdown a.navbar-item{padding-right:3rem}html.theme--documenter-dark .navbar-dropdown a.navbar-item:focus,html.theme--documenter-dark .navbar-dropdown a.navbar-item:hover{background-color:rgba(0,0,0,0);color:#dbdee0}html.theme--documenter-dark .navbar-dropdown a.navbar-item.is-active{background-color:rgba(0,0,0,0);color:#1abc9c}.navbar.is-spaced html.theme--documenter-dark .navbar-dropdown,html.theme--documenter-dark .navbar-dropdown.is-boxed{border-radius:8px;border-top:none;box-shadow:0 8px 8px rgba(10,10,10,0.1), 0 0 0 1px rgba(10,10,10,0.1);display:block;opacity:0;pointer-events:none;top:calc(100% + (-4px));transform:translateY(-5px);transition-duration:86ms;transition-property:opacity, transform}html.theme--documenter-dark .navbar-dropdown.is-right{left:auto;right:0}html.theme--documenter-dark .navbar-divider{display:block}html.theme--documenter-dark .navbar>.container .navbar-brand,html.theme--documenter-dark .container>.navbar 
.navbar-brand{margin-left:-.75rem}html.theme--documenter-dark .navbar>.container .navbar-menu,html.theme--documenter-dark .container>.navbar .navbar-menu{margin-right:-.75rem}html.theme--documenter-dark .navbar.is-fixed-bottom-desktop,html.theme--documenter-dark .navbar.is-fixed-top-desktop{left:0;position:fixed;right:0;z-index:30}html.theme--documenter-dark .navbar.is-fixed-bottom-desktop{bottom:0}html.theme--documenter-dark .navbar.is-fixed-bottom-desktop.has-shadow{box-shadow:0 -2px 3px rgba(10,10,10,0.1)}html.theme--documenter-dark .navbar.is-fixed-top-desktop{top:0}html.theme--documenter-dark html.has-navbar-fixed-top-desktop,html.theme--documenter-dark body.has-navbar-fixed-top-desktop{padding-top:4rem}html.theme--documenter-dark html.has-navbar-fixed-bottom-desktop,html.theme--documenter-dark body.has-navbar-fixed-bottom-desktop{padding-bottom:4rem}html.theme--documenter-dark html.has-spaced-navbar-fixed-top,html.theme--documenter-dark body.has-spaced-navbar-fixed-top{padding-top:6rem}html.theme--documenter-dark html.has-spaced-navbar-fixed-bottom,html.theme--documenter-dark body.has-spaced-navbar-fixed-bottom{padding-bottom:6rem}html.theme--documenter-dark a.navbar-item.is-active,html.theme--documenter-dark .navbar-link.is-active{color:#1abc9c}html.theme--documenter-dark a.navbar-item.is-active:not(:focus):not(:hover),html.theme--documenter-dark .navbar-link.is-active:not(:focus):not(:hover){background-color:rgba(0,0,0,0)}html.theme--documenter-dark .navbar-item.has-dropdown:focus .navbar-link,html.theme--documenter-dark .navbar-item.has-dropdown:hover .navbar-link,html.theme--documenter-dark .navbar-item.has-dropdown.is-active .navbar-link{background-color:rgba(0,0,0,0)}}html.theme--documenter-dark .hero.is-fullheight-with-navbar{min-height:calc(100vh - 4rem)}html.theme--documenter-dark .pagination{font-size:15px;margin:-.25rem}html.theme--documenter-dark .pagination.is-small,html.theme--documenter-dark #documenter .docs-sidebar 
form.docs-search>input.pagination{font-size:.85em}html.theme--documenter-dark .pagination.is-medium{font-size:1.25rem}html.theme--documenter-dark .pagination.is-large{font-size:1.5rem}html.theme--documenter-dark .pagination.is-rounded .pagination-previous,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.pagination .pagination-previous,html.theme--documenter-dark .pagination.is-rounded .pagination-next,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.pagination .pagination-next{padding-left:1em;padding-right:1em;border-radius:290486px}html.theme--documenter-dark .pagination.is-rounded .pagination-link,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.pagination .pagination-link{border-radius:290486px}html.theme--documenter-dark .pagination,html.theme--documenter-dark .pagination-list{align-items:center;display:flex;justify-content:center;text-align:center}html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .pagination-next,html.theme--documenter-dark .pagination-link,html.theme--documenter-dark .pagination-ellipsis{font-size:1em;justify-content:center;margin:.25rem;padding-left:.5em;padding-right:.5em;text-align:center}html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .pagination-next,html.theme--documenter-dark .pagination-link{border-color:#5e6d6f;color:#1abc9c;min-width:2.25em}html.theme--documenter-dark .pagination-previous:hover,html.theme--documenter-dark .pagination-next:hover,html.theme--documenter-dark .pagination-link:hover{border-color:#8c9b9d;color:#1dd2af}html.theme--documenter-dark .pagination-previous:focus,html.theme--documenter-dark .pagination-next:focus,html.theme--documenter-dark .pagination-link:focus{border-color:#8c9b9d}html.theme--documenter-dark .pagination-previous:active,html.theme--documenter-dark .pagination-next:active,html.theme--documenter-dark .pagination-link:active{box-shadow:inset 0 1px 2px 
rgba(10,10,10,0.2)}html.theme--documenter-dark .pagination-previous[disabled],html.theme--documenter-dark .pagination-next[disabled],html.theme--documenter-dark .pagination-link[disabled]{background-color:#dbdee0;border-color:#dbdee0;box-shadow:none;color:#5e6d6f;opacity:0.5}html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .pagination-next{padding-left:0.75em;padding-right:0.75em;white-space:nowrap}html.theme--documenter-dark .pagination-link.is-current{background-color:#1abc9c;border-color:#1abc9c;color:#fff}html.theme--documenter-dark .pagination-ellipsis{color:#8c9b9d;pointer-events:none}html.theme--documenter-dark .pagination-list{flex-wrap:wrap}@media screen and (max-width: 768px){html.theme--documenter-dark .pagination{flex-wrap:wrap}html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .pagination-next{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .pagination-list li{flex-grow:1;flex-shrink:1}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .pagination-list{flex-grow:1;flex-shrink:1;justify-content:flex-start;order:1}html.theme--documenter-dark .pagination-previous{order:2}html.theme--documenter-dark .pagination-next{order:3}html.theme--documenter-dark .pagination{justify-content:space-between}html.theme--documenter-dark .pagination.is-centered .pagination-previous{order:1}html.theme--documenter-dark .pagination.is-centered .pagination-list{justify-content:center;order:2}html.theme--documenter-dark .pagination.is-centered .pagination-next{order:3}html.theme--documenter-dark .pagination.is-right .pagination-previous{order:1}html.theme--documenter-dark .pagination.is-right .pagination-next{order:2}html.theme--documenter-dark .pagination.is-right .pagination-list{justify-content:flex-end;order:3}}html.theme--documenter-dark .panel{font-size:15px}html.theme--documenter-dark .panel:not(:last-child){margin-bottom:1.5rem}html.theme--documenter-dark 
.panel-heading,html.theme--documenter-dark .panel-tabs,html.theme--documenter-dark .panel-block{border-bottom:1px solid #5e6d6f;border-left:1px solid #5e6d6f;border-right:1px solid #5e6d6f}html.theme--documenter-dark .panel-heading:first-child,html.theme--documenter-dark .panel-tabs:first-child,html.theme--documenter-dark .panel-block:first-child{border-top:1px solid #5e6d6f}html.theme--documenter-dark .panel-heading{background-color:#282f2f;border-radius:.4em .4em 0 0;color:#f2f2f2;font-size:1.25em;font-weight:300;line-height:1.25;padding:0.5em 0.75em}html.theme--documenter-dark .panel-tabs{align-items:flex-end;display:flex;font-size:.875em;justify-content:center}html.theme--documenter-dark .panel-tabs a{border-bottom:1px solid #5e6d6f;margin-bottom:-1px;padding:0.5em}html.theme--documenter-dark .panel-tabs a.is-active{border-bottom-color:#343c3d;color:#17a689}html.theme--documenter-dark .panel-list a{color:#fff}html.theme--documenter-dark .panel-list a:hover{color:#1abc9c}html.theme--documenter-dark .panel-block{align-items:center;color:#f2f2f2;display:flex;justify-content:flex-start;padding:0.5em 0.75em}html.theme--documenter-dark .panel-block input[type="checkbox"]{margin-right:0.75em}html.theme--documenter-dark .panel-block>.control{flex-grow:1;flex-shrink:1;width:100%}html.theme--documenter-dark .panel-block.is-wrapped{flex-wrap:wrap}html.theme--documenter-dark .panel-block.is-active{border-left-color:#1abc9c;color:#17a689}html.theme--documenter-dark .panel-block.is-active .panel-icon{color:#1abc9c}html.theme--documenter-dark a.panel-block,html.theme--documenter-dark label.panel-block{cursor:pointer}html.theme--documenter-dark a.panel-block:hover,html.theme--documenter-dark label.panel-block:hover{background-color:#282f2f}html.theme--documenter-dark .panel-icon{display:inline-block;font-size:14px;height:1em;line-height:1em;text-align:center;vertical-align:top;width:1em;color:#fff;margin-right:0.75em}html.theme--documenter-dark .panel-icon 
.fa{font-size:inherit;line-height:inherit}html.theme--documenter-dark .tabs{-webkit-overflow-scrolling:touch;align-items:stretch;display:flex;font-size:15px;justify-content:space-between;overflow:hidden;overflow-x:auto;white-space:nowrap}html.theme--documenter-dark .tabs a{align-items:center;border-bottom-color:#5e6d6f;border-bottom-style:solid;border-bottom-width:1px;color:#fff;display:flex;justify-content:center;margin-bottom:-1px;padding:0.5em 1em;vertical-align:top}html.theme--documenter-dark .tabs a:hover{border-bottom-color:#f2f2f2;color:#f2f2f2}html.theme--documenter-dark .tabs li{display:block}html.theme--documenter-dark .tabs li.is-active a{border-bottom-color:#1abc9c;color:#1abc9c}html.theme--documenter-dark .tabs ul{align-items:center;border-bottom-color:#5e6d6f;border-bottom-style:solid;border-bottom-width:1px;display:flex;flex-grow:1;flex-shrink:0;justify-content:flex-start}html.theme--documenter-dark .tabs ul.is-left{padding-right:0.75em}html.theme--documenter-dark .tabs ul.is-center{flex:none;justify-content:center;padding-left:0.75em;padding-right:0.75em}html.theme--documenter-dark .tabs ul.is-right{justify-content:flex-end;padding-left:0.75em}html.theme--documenter-dark .tabs .icon:first-child{margin-right:0.5em}html.theme--documenter-dark .tabs .icon:last-child{margin-left:0.5em}html.theme--documenter-dark .tabs.is-centered ul{justify-content:center}html.theme--documenter-dark .tabs.is-right ul{justify-content:flex-end}html.theme--documenter-dark .tabs.is-boxed a{border:1px solid transparent;border-radius:.4em .4em 0 0}html.theme--documenter-dark .tabs.is-boxed a:hover{background-color:#282f2f;border-bottom-color:#5e6d6f}html.theme--documenter-dark .tabs.is-boxed li.is-active a{background-color:#fff;border-color:#5e6d6f;border-bottom-color:rgba(0,0,0,0) !important}html.theme--documenter-dark .tabs.is-fullwidth li{flex-grow:1;flex-shrink:0}html.theme--documenter-dark .tabs.is-toggle 
a{border-color:#5e6d6f;border-style:solid;border-width:1px;margin-bottom:0;position:relative}html.theme--documenter-dark .tabs.is-toggle a:hover{background-color:#282f2f;border-color:#8c9b9d;z-index:2}html.theme--documenter-dark .tabs.is-toggle li+li{margin-left:-1px}html.theme--documenter-dark .tabs.is-toggle li:first-child a{border-radius:.4em 0 0 .4em}html.theme--documenter-dark .tabs.is-toggle li:last-child a{border-radius:0 .4em .4em 0}html.theme--documenter-dark .tabs.is-toggle li.is-active a{background-color:#1abc9c;border-color:#1abc9c;color:#fff;z-index:1}html.theme--documenter-dark .tabs.is-toggle ul{border-bottom:none}html.theme--documenter-dark .tabs.is-toggle.is-toggle-rounded li:first-child a{border-bottom-left-radius:290486px;border-top-left-radius:290486px;padding-left:1.25em}html.theme--documenter-dark .tabs.is-toggle.is-toggle-rounded li:last-child a{border-bottom-right-radius:290486px;border-top-right-radius:290486px;padding-right:1.25em}html.theme--documenter-dark .tabs.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.tabs{font-size:.85em}html.theme--documenter-dark .tabs.is-medium{font-size:1.25rem}html.theme--documenter-dark .tabs.is-large{font-size:1.5rem}html.theme--documenter-dark .column{display:block;flex-basis:0;flex-grow:1;flex-shrink:1;padding:.75rem}.columns.is-mobile>html.theme--documenter-dark .column.is-narrow{flex:none}.columns.is-mobile>html.theme--documenter-dark .column.is-full{flex:none;width:100%}.columns.is-mobile>html.theme--documenter-dark .column.is-three-quarters{flex:none;width:75%}.columns.is-mobile>html.theme--documenter-dark .column.is-two-thirds{flex:none;width:66.6666%}.columns.is-mobile>html.theme--documenter-dark .column.is-half{flex:none;width:50%}.columns.is-mobile>html.theme--documenter-dark .column.is-one-third{flex:none;width:33.3333%}.columns.is-mobile>html.theme--documenter-dark .column.is-one-quarter{flex:none;width:25%}.columns.is-mobile>html.theme--documenter-dark 
.column.is-one-fifth{flex:none;width:20%}.columns.is-mobile>html.theme--documenter-dark .column.is-two-fifths{flex:none;width:40%}.columns.is-mobile>html.theme--documenter-dark .column.is-three-fifths{flex:none;width:60%}.columns.is-mobile>html.theme--documenter-dark .column.is-four-fifths{flex:none;width:80%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-three-quarters{margin-left:75%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-two-thirds{margin-left:66.6666%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-half{margin-left:50%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-one-third{margin-left:33.3333%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-one-quarter{margin-left:25%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-one-fifth{margin-left:20%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-two-fifths{margin-left:40%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-three-fifths{margin-left:60%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-four-fifths{margin-left:80%}.columns.is-mobile>html.theme--documenter-dark .column.is-0{flex:none;width:0%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-0{margin-left:0%}.columns.is-mobile>html.theme--documenter-dark .column.is-1{flex:none;width:8.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-1{margin-left:8.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-2{flex:none;width:16.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-2{margin-left:16.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-3{flex:none;width:25%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-3{margin-left:25%}.columns.is-mobile>html.theme--documenter-dark .column.is-4{flex:none;width:33.3333333333%}.columns.is-mobile>html.theme--documenter-dark 
.column.is-offset-4{margin-left:33.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-5{flex:none;width:41.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-5{margin-left:41.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-6{flex:none;width:50%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-6{margin-left:50%}.columns.is-mobile>html.theme--documenter-dark .column.is-7{flex:none;width:58.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-7{margin-left:58.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-8{flex:none;width:66.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-8{margin-left:66.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-9{flex:none;width:75%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-9{margin-left:75%}.columns.is-mobile>html.theme--documenter-dark .column.is-10{flex:none;width:83.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-10{margin-left:83.3333333333%}.columns.is-mobile>html.theme--documenter-dark .column.is-11{flex:none;width:91.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-11{margin-left:91.6666666667%}.columns.is-mobile>html.theme--documenter-dark .column.is-12{flex:none;width:100%}.columns.is-mobile>html.theme--documenter-dark .column.is-offset-12{margin-left:100%}@media screen and (max-width: 768px){html.theme--documenter-dark .column.is-narrow-mobile{flex:none}html.theme--documenter-dark .column.is-full-mobile{flex:none;width:100%}html.theme--documenter-dark .column.is-three-quarters-mobile{flex:none;width:75%}html.theme--documenter-dark .column.is-two-thirds-mobile{flex:none;width:66.6666%}html.theme--documenter-dark .column.is-half-mobile{flex:none;width:50%}html.theme--documenter-dark .column.is-one-third-mobile{flex:none;width:33.3333%}html.theme--documenter-dark 
.column.is-one-quarter-mobile{flex:none;width:25%}html.theme--documenter-dark .column.is-one-fifth-mobile{flex:none;width:20%}html.theme--documenter-dark .column.is-two-fifths-mobile{flex:none;width:40%}html.theme--documenter-dark .column.is-three-fifths-mobile{flex:none;width:60%}html.theme--documenter-dark .column.is-four-fifths-mobile{flex:none;width:80%}html.theme--documenter-dark .column.is-offset-three-quarters-mobile{margin-left:75%}html.theme--documenter-dark .column.is-offset-two-thirds-mobile{margin-left:66.6666%}html.theme--documenter-dark .column.is-offset-half-mobile{margin-left:50%}html.theme--documenter-dark .column.is-offset-one-third-mobile{margin-left:33.3333%}html.theme--documenter-dark .column.is-offset-one-quarter-mobile{margin-left:25%}html.theme--documenter-dark .column.is-offset-one-fifth-mobile{margin-left:20%}html.theme--documenter-dark .column.is-offset-two-fifths-mobile{margin-left:40%}html.theme--documenter-dark .column.is-offset-three-fifths-mobile{margin-left:60%}html.theme--documenter-dark .column.is-offset-four-fifths-mobile{margin-left:80%}html.theme--documenter-dark .column.is-0-mobile{flex:none;width:0%}html.theme--documenter-dark .column.is-offset-0-mobile{margin-left:0%}html.theme--documenter-dark .column.is-1-mobile{flex:none;width:8.3333333333%}html.theme--documenter-dark .column.is-offset-1-mobile{margin-left:8.3333333333%}html.theme--documenter-dark .column.is-2-mobile{flex:none;width:16.6666666667%}html.theme--documenter-dark .column.is-offset-2-mobile{margin-left:16.6666666667%}html.theme--documenter-dark .column.is-3-mobile{flex:none;width:25%}html.theme--documenter-dark .column.is-offset-3-mobile{margin-left:25%}html.theme--documenter-dark .column.is-4-mobile{flex:none;width:33.3333333333%}html.theme--documenter-dark .column.is-offset-4-mobile{margin-left:33.3333333333%}html.theme--documenter-dark .column.is-5-mobile{flex:none;width:41.6666666667%}html.theme--documenter-dark 
.column.is-offset-5-mobile{margin-left:41.6666666667%}html.theme--documenter-dark .column.is-6-mobile{flex:none;width:50%}html.theme--documenter-dark .column.is-offset-6-mobile{margin-left:50%}html.theme--documenter-dark .column.is-7-mobile{flex:none;width:58.3333333333%}html.theme--documenter-dark .column.is-offset-7-mobile{margin-left:58.3333333333%}html.theme--documenter-dark .column.is-8-mobile{flex:none;width:66.6666666667%}html.theme--documenter-dark .column.is-offset-8-mobile{margin-left:66.6666666667%}html.theme--documenter-dark .column.is-9-mobile{flex:none;width:75%}html.theme--documenter-dark .column.is-offset-9-mobile{margin-left:75%}html.theme--documenter-dark .column.is-10-mobile{flex:none;width:83.3333333333%}html.theme--documenter-dark .column.is-offset-10-mobile{margin-left:83.3333333333%}html.theme--documenter-dark .column.is-11-mobile{flex:none;width:91.6666666667%}html.theme--documenter-dark .column.is-offset-11-mobile{margin-left:91.6666666667%}html.theme--documenter-dark .column.is-12-mobile{flex:none;width:100%}html.theme--documenter-dark .column.is-offset-12-mobile{margin-left:100%}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .column.is-narrow,html.theme--documenter-dark .column.is-narrow-tablet{flex:none}html.theme--documenter-dark .column.is-full,html.theme--documenter-dark .column.is-full-tablet{flex:none;width:100%}html.theme--documenter-dark .column.is-three-quarters,html.theme--documenter-dark .column.is-three-quarters-tablet{flex:none;width:75%}html.theme--documenter-dark .column.is-two-thirds,html.theme--documenter-dark .column.is-two-thirds-tablet{flex:none;width:66.6666%}html.theme--documenter-dark .column.is-half,html.theme--documenter-dark .column.is-half-tablet{flex:none;width:50%}html.theme--documenter-dark .column.is-one-third,html.theme--documenter-dark .column.is-one-third-tablet{flex:none;width:33.3333%}html.theme--documenter-dark .column.is-one-quarter,html.theme--documenter-dark 
.column.is-one-quarter-tablet{flex:none;width:25%}html.theme--documenter-dark .column.is-one-fifth,html.theme--documenter-dark .column.is-one-fifth-tablet{flex:none;width:20%}html.theme--documenter-dark .column.is-two-fifths,html.theme--documenter-dark .column.is-two-fifths-tablet{flex:none;width:40%}html.theme--documenter-dark .column.is-three-fifths,html.theme--documenter-dark .column.is-three-fifths-tablet{flex:none;width:60%}html.theme--documenter-dark .column.is-four-fifths,html.theme--documenter-dark .column.is-four-fifths-tablet{flex:none;width:80%}html.theme--documenter-dark .column.is-offset-three-quarters,html.theme--documenter-dark .column.is-offset-three-quarters-tablet{margin-left:75%}html.theme--documenter-dark .column.is-offset-two-thirds,html.theme--documenter-dark .column.is-offset-two-thirds-tablet{margin-left:66.6666%}html.theme--documenter-dark .column.is-offset-half,html.theme--documenter-dark .column.is-offset-half-tablet{margin-left:50%}html.theme--documenter-dark .column.is-offset-one-third,html.theme--documenter-dark .column.is-offset-one-third-tablet{margin-left:33.3333%}html.theme--documenter-dark .column.is-offset-one-quarter,html.theme--documenter-dark .column.is-offset-one-quarter-tablet{margin-left:25%}html.theme--documenter-dark .column.is-offset-one-fifth,html.theme--documenter-dark .column.is-offset-one-fifth-tablet{margin-left:20%}html.theme--documenter-dark .column.is-offset-two-fifths,html.theme--documenter-dark .column.is-offset-two-fifths-tablet{margin-left:40%}html.theme--documenter-dark .column.is-offset-three-fifths,html.theme--documenter-dark .column.is-offset-three-fifths-tablet{margin-left:60%}html.theme--documenter-dark .column.is-offset-four-fifths,html.theme--documenter-dark .column.is-offset-four-fifths-tablet{margin-left:80%}html.theme--documenter-dark .column.is-0,html.theme--documenter-dark .column.is-0-tablet{flex:none;width:0%}html.theme--documenter-dark .column.is-offset-0,html.theme--documenter-dark 
.column.is-offset-0-tablet{margin-left:0%}html.theme--documenter-dark .column.is-1,html.theme--documenter-dark .column.is-1-tablet{flex:none;width:8.3333333333%}html.theme--documenter-dark .column.is-offset-1,html.theme--documenter-dark .column.is-offset-1-tablet{margin-left:8.3333333333%}html.theme--documenter-dark .column.is-2,html.theme--documenter-dark .column.is-2-tablet{flex:none;width:16.6666666667%}html.theme--documenter-dark .column.is-offset-2,html.theme--documenter-dark .column.is-offset-2-tablet{margin-left:16.6666666667%}html.theme--documenter-dark .column.is-3,html.theme--documenter-dark .column.is-3-tablet{flex:none;width:25%}html.theme--documenter-dark .column.is-offset-3,html.theme--documenter-dark .column.is-offset-3-tablet{margin-left:25%}html.theme--documenter-dark .column.is-4,html.theme--documenter-dark .column.is-4-tablet{flex:none;width:33.3333333333%}html.theme--documenter-dark .column.is-offset-4,html.theme--documenter-dark .column.is-offset-4-tablet{margin-left:33.3333333333%}html.theme--documenter-dark .column.is-5,html.theme--documenter-dark .column.is-5-tablet{flex:none;width:41.6666666667%}html.theme--documenter-dark .column.is-offset-5,html.theme--documenter-dark .column.is-offset-5-tablet{margin-left:41.6666666667%}html.theme--documenter-dark .column.is-6,html.theme--documenter-dark .column.is-6-tablet{flex:none;width:50%}html.theme--documenter-dark .column.is-offset-6,html.theme--documenter-dark .column.is-offset-6-tablet{margin-left:50%}html.theme--documenter-dark .column.is-7,html.theme--documenter-dark .column.is-7-tablet{flex:none;width:58.3333333333%}html.theme--documenter-dark .column.is-offset-7,html.theme--documenter-dark .column.is-offset-7-tablet{margin-left:58.3333333333%}html.theme--documenter-dark .column.is-8,html.theme--documenter-dark .column.is-8-tablet{flex:none;width:66.6666666667%}html.theme--documenter-dark .column.is-offset-8,html.theme--documenter-dark 
.column.is-offset-8-tablet{margin-left:66.6666666667%}html.theme--documenter-dark .column.is-9,html.theme--documenter-dark .column.is-9-tablet{flex:none;width:75%}html.theme--documenter-dark .column.is-offset-9,html.theme--documenter-dark .column.is-offset-9-tablet{margin-left:75%}html.theme--documenter-dark .column.is-10,html.theme--documenter-dark .column.is-10-tablet{flex:none;width:83.3333333333%}html.theme--documenter-dark .column.is-offset-10,html.theme--documenter-dark .column.is-offset-10-tablet{margin-left:83.3333333333%}html.theme--documenter-dark .column.is-11,html.theme--documenter-dark .column.is-11-tablet{flex:none;width:91.6666666667%}html.theme--documenter-dark .column.is-offset-11,html.theme--documenter-dark .column.is-offset-11-tablet{margin-left:91.6666666667%}html.theme--documenter-dark .column.is-12,html.theme--documenter-dark .column.is-12-tablet{flex:none;width:100%}html.theme--documenter-dark .column.is-offset-12,html.theme--documenter-dark .column.is-offset-12-tablet{margin-left:100%}}@media screen and (max-width: 1055px){html.theme--documenter-dark .column.is-narrow-touch{flex:none}html.theme--documenter-dark .column.is-full-touch{flex:none;width:100%}html.theme--documenter-dark .column.is-three-quarters-touch{flex:none;width:75%}html.theme--documenter-dark .column.is-two-thirds-touch{flex:none;width:66.6666%}html.theme--documenter-dark .column.is-half-touch{flex:none;width:50%}html.theme--documenter-dark .column.is-one-third-touch{flex:none;width:33.3333%}html.theme--documenter-dark .column.is-one-quarter-touch{flex:none;width:25%}html.theme--documenter-dark .column.is-one-fifth-touch{flex:none;width:20%}html.theme--documenter-dark .column.is-two-fifths-touch{flex:none;width:40%}html.theme--documenter-dark .column.is-three-fifths-touch{flex:none;width:60%}html.theme--documenter-dark .column.is-four-fifths-touch{flex:none;width:80%}html.theme--documenter-dark 
.column.is-offset-three-quarters-touch{margin-left:75%}html.theme--documenter-dark .column.is-offset-two-thirds-touch{margin-left:66.6666%}html.theme--documenter-dark .column.is-offset-half-touch{margin-left:50%}html.theme--documenter-dark .column.is-offset-one-third-touch{margin-left:33.3333%}html.theme--documenter-dark .column.is-offset-one-quarter-touch{margin-left:25%}html.theme--documenter-dark .column.is-offset-one-fifth-touch{margin-left:20%}html.theme--documenter-dark .column.is-offset-two-fifths-touch{margin-left:40%}html.theme--documenter-dark .column.is-offset-three-fifths-touch{margin-left:60%}html.theme--documenter-dark .column.is-offset-four-fifths-touch{margin-left:80%}html.theme--documenter-dark .column.is-0-touch{flex:none;width:0%}html.theme--documenter-dark .column.is-offset-0-touch{margin-left:0%}html.theme--documenter-dark .column.is-1-touch{flex:none;width:8.3333333333%}html.theme--documenter-dark .column.is-offset-1-touch{margin-left:8.3333333333%}html.theme--documenter-dark .column.is-2-touch{flex:none;width:16.6666666667%}html.theme--documenter-dark .column.is-offset-2-touch{margin-left:16.6666666667%}html.theme--documenter-dark .column.is-3-touch{flex:none;width:25%}html.theme--documenter-dark .column.is-offset-3-touch{margin-left:25%}html.theme--documenter-dark .column.is-4-touch{flex:none;width:33.3333333333%}html.theme--documenter-dark .column.is-offset-4-touch{margin-left:33.3333333333%}html.theme--documenter-dark .column.is-5-touch{flex:none;width:41.6666666667%}html.theme--documenter-dark .column.is-offset-5-touch{margin-left:41.6666666667%}html.theme--documenter-dark .column.is-6-touch{flex:none;width:50%}html.theme--documenter-dark .column.is-offset-6-touch{margin-left:50%}html.theme--documenter-dark .column.is-7-touch{flex:none;width:58.3333333333%}html.theme--documenter-dark .column.is-offset-7-touch{margin-left:58.3333333333%}html.theme--documenter-dark 
.column.is-8-touch{flex:none;width:66.6666666667%}html.theme--documenter-dark .column.is-offset-8-touch{margin-left:66.6666666667%}html.theme--documenter-dark .column.is-9-touch{flex:none;width:75%}html.theme--documenter-dark .column.is-offset-9-touch{margin-left:75%}html.theme--documenter-dark .column.is-10-touch{flex:none;width:83.3333333333%}html.theme--documenter-dark .column.is-offset-10-touch{margin-left:83.3333333333%}html.theme--documenter-dark .column.is-11-touch{flex:none;width:91.6666666667%}html.theme--documenter-dark .column.is-offset-11-touch{margin-left:91.6666666667%}html.theme--documenter-dark .column.is-12-touch{flex:none;width:100%}html.theme--documenter-dark .column.is-offset-12-touch{margin-left:100%}}@media screen and (min-width: 1056px){html.theme--documenter-dark .column.is-narrow-desktop{flex:none}html.theme--documenter-dark .column.is-full-desktop{flex:none;width:100%}html.theme--documenter-dark .column.is-three-quarters-desktop{flex:none;width:75%}html.theme--documenter-dark .column.is-two-thirds-desktop{flex:none;width:66.6666%}html.theme--documenter-dark .column.is-half-desktop{flex:none;width:50%}html.theme--documenter-dark .column.is-one-third-desktop{flex:none;width:33.3333%}html.theme--documenter-dark .column.is-one-quarter-desktop{flex:none;width:25%}html.theme--documenter-dark .column.is-one-fifth-desktop{flex:none;width:20%}html.theme--documenter-dark .column.is-two-fifths-desktop{flex:none;width:40%}html.theme--documenter-dark .column.is-three-fifths-desktop{flex:none;width:60%}html.theme--documenter-dark .column.is-four-fifths-desktop{flex:none;width:80%}html.theme--documenter-dark .column.is-offset-three-quarters-desktop{margin-left:75%}html.theme--documenter-dark .column.is-offset-two-thirds-desktop{margin-left:66.6666%}html.theme--documenter-dark .column.is-offset-half-desktop{margin-left:50%}html.theme--documenter-dark .column.is-offset-one-third-desktop{margin-left:33.3333%}html.theme--documenter-dark 
.column.is-offset-one-quarter-desktop{margin-left:25%}html.theme--documenter-dark .column.is-offset-one-fifth-desktop{margin-left:20%}html.theme--documenter-dark .column.is-offset-two-fifths-desktop{margin-left:40%}html.theme--documenter-dark .column.is-offset-three-fifths-desktop{margin-left:60%}html.theme--documenter-dark .column.is-offset-four-fifths-desktop{margin-left:80%}html.theme--documenter-dark .column.is-0-desktop{flex:none;width:0%}html.theme--documenter-dark .column.is-offset-0-desktop{margin-left:0%}html.theme--documenter-dark .column.is-1-desktop{flex:none;width:8.3333333333%}html.theme--documenter-dark .column.is-offset-1-desktop{margin-left:8.3333333333%}html.theme--documenter-dark .column.is-2-desktop{flex:none;width:16.6666666667%}html.theme--documenter-dark .column.is-offset-2-desktop{margin-left:16.6666666667%}html.theme--documenter-dark .column.is-3-desktop{flex:none;width:25%}html.theme--documenter-dark .column.is-offset-3-desktop{margin-left:25%}html.theme--documenter-dark .column.is-4-desktop{flex:none;width:33.3333333333%}html.theme--documenter-dark .column.is-offset-4-desktop{margin-left:33.3333333333%}html.theme--documenter-dark .column.is-5-desktop{flex:none;width:41.6666666667%}html.theme--documenter-dark .column.is-offset-5-desktop{margin-left:41.6666666667%}html.theme--documenter-dark .column.is-6-desktop{flex:none;width:50%}html.theme--documenter-dark .column.is-offset-6-desktop{margin-left:50%}html.theme--documenter-dark .column.is-7-desktop{flex:none;width:58.3333333333%}html.theme--documenter-dark .column.is-offset-7-desktop{margin-left:58.3333333333%}html.theme--documenter-dark .column.is-8-desktop{flex:none;width:66.6666666667%}html.theme--documenter-dark .column.is-offset-8-desktop{margin-left:66.6666666667%}html.theme--documenter-dark .column.is-9-desktop{flex:none;width:75%}html.theme--documenter-dark .column.is-offset-9-desktop{margin-left:75%}html.theme--documenter-dark 
.column.is-10-desktop{flex:none;width:83.3333333333%}html.theme--documenter-dark .column.is-offset-10-desktop{margin-left:83.3333333333%}html.theme--documenter-dark .column.is-11-desktop{flex:none;width:91.6666666667%}html.theme--documenter-dark .column.is-offset-11-desktop{margin-left:91.6666666667%}html.theme--documenter-dark .column.is-12-desktop{flex:none;width:100%}html.theme--documenter-dark .column.is-offset-12-desktop{margin-left:100%}}@media screen and (min-width: 1216px){html.theme--documenter-dark .column.is-narrow-widescreen{flex:none}html.theme--documenter-dark .column.is-full-widescreen{flex:none;width:100%}html.theme--documenter-dark .column.is-three-quarters-widescreen{flex:none;width:75%}html.theme--documenter-dark .column.is-two-thirds-widescreen{flex:none;width:66.6666%}html.theme--documenter-dark .column.is-half-widescreen{flex:none;width:50%}html.theme--documenter-dark .column.is-one-third-widescreen{flex:none;width:33.3333%}html.theme--documenter-dark .column.is-one-quarter-widescreen{flex:none;width:25%}html.theme--documenter-dark .column.is-one-fifth-widescreen{flex:none;width:20%}html.theme--documenter-dark .column.is-two-fifths-widescreen{flex:none;width:40%}html.theme--documenter-dark .column.is-three-fifths-widescreen{flex:none;width:60%}html.theme--documenter-dark .column.is-four-fifths-widescreen{flex:none;width:80%}html.theme--documenter-dark .column.is-offset-three-quarters-widescreen{margin-left:75%}html.theme--documenter-dark .column.is-offset-two-thirds-widescreen{margin-left:66.6666%}html.theme--documenter-dark .column.is-offset-half-widescreen{margin-left:50%}html.theme--documenter-dark .column.is-offset-one-third-widescreen{margin-left:33.3333%}html.theme--documenter-dark .column.is-offset-one-quarter-widescreen{margin-left:25%}html.theme--documenter-dark .column.is-offset-one-fifth-widescreen{margin-left:20%}html.theme--documenter-dark .column.is-offset-two-fifths-widescreen{margin-left:40%}html.theme--documenter-dark 
.column.is-offset-three-fifths-widescreen{margin-left:60%}html.theme--documenter-dark .column.is-offset-four-fifths-widescreen{margin-left:80%}html.theme--documenter-dark .column.is-0-widescreen{flex:none;width:0%}html.theme--documenter-dark .column.is-offset-0-widescreen{margin-left:0%}html.theme--documenter-dark .column.is-1-widescreen{flex:none;width:8.3333333333%}html.theme--documenter-dark .column.is-offset-1-widescreen{margin-left:8.3333333333%}html.theme--documenter-dark .column.is-2-widescreen{flex:none;width:16.6666666667%}html.theme--documenter-dark .column.is-offset-2-widescreen{margin-left:16.6666666667%}html.theme--documenter-dark .column.is-3-widescreen{flex:none;width:25%}html.theme--documenter-dark .column.is-offset-3-widescreen{margin-left:25%}html.theme--documenter-dark .column.is-4-widescreen{flex:none;width:33.3333333333%}html.theme--documenter-dark .column.is-offset-4-widescreen{margin-left:33.3333333333%}html.theme--documenter-dark .column.is-5-widescreen{flex:none;width:41.6666666667%}html.theme--documenter-dark .column.is-offset-5-widescreen{margin-left:41.6666666667%}html.theme--documenter-dark .column.is-6-widescreen{flex:none;width:50%}html.theme--documenter-dark .column.is-offset-6-widescreen{margin-left:50%}html.theme--documenter-dark .column.is-7-widescreen{flex:none;width:58.3333333333%}html.theme--documenter-dark .column.is-offset-7-widescreen{margin-left:58.3333333333%}html.theme--documenter-dark .column.is-8-widescreen{flex:none;width:66.6666666667%}html.theme--documenter-dark .column.is-offset-8-widescreen{margin-left:66.6666666667%}html.theme--documenter-dark .column.is-9-widescreen{flex:none;width:75%}html.theme--documenter-dark .column.is-offset-9-widescreen{margin-left:75%}html.theme--documenter-dark .column.is-10-widescreen{flex:none;width:83.3333333333%}html.theme--documenter-dark .column.is-offset-10-widescreen{margin-left:83.3333333333%}html.theme--documenter-dark 
.column.is-11-widescreen{flex:none;width:91.6666666667%}html.theme--documenter-dark .column.is-offset-11-widescreen{margin-left:91.6666666667%}html.theme--documenter-dark .column.is-12-widescreen{flex:none;width:100%}html.theme--documenter-dark .column.is-offset-12-widescreen{margin-left:100%}}@media screen and (min-width: 1408px){html.theme--documenter-dark .column.is-narrow-fullhd{flex:none}html.theme--documenter-dark .column.is-full-fullhd{flex:none;width:100%}html.theme--documenter-dark .column.is-three-quarters-fullhd{flex:none;width:75%}html.theme--documenter-dark .column.is-two-thirds-fullhd{flex:none;width:66.6666%}html.theme--documenter-dark .column.is-half-fullhd{flex:none;width:50%}html.theme--documenter-dark .column.is-one-third-fullhd{flex:none;width:33.3333%}html.theme--documenter-dark .column.is-one-quarter-fullhd{flex:none;width:25%}html.theme--documenter-dark .column.is-one-fifth-fullhd{flex:none;width:20%}html.theme--documenter-dark .column.is-two-fifths-fullhd{flex:none;width:40%}html.theme--documenter-dark .column.is-three-fifths-fullhd{flex:none;width:60%}html.theme--documenter-dark .column.is-four-fifths-fullhd{flex:none;width:80%}html.theme--documenter-dark .column.is-offset-three-quarters-fullhd{margin-left:75%}html.theme--documenter-dark .column.is-offset-two-thirds-fullhd{margin-left:66.6666%}html.theme--documenter-dark .column.is-offset-half-fullhd{margin-left:50%}html.theme--documenter-dark .column.is-offset-one-third-fullhd{margin-left:33.3333%}html.theme--documenter-dark .column.is-offset-one-quarter-fullhd{margin-left:25%}html.theme--documenter-dark .column.is-offset-one-fifth-fullhd{margin-left:20%}html.theme--documenter-dark .column.is-offset-two-fifths-fullhd{margin-left:40%}html.theme--documenter-dark .column.is-offset-three-fifths-fullhd{margin-left:60%}html.theme--documenter-dark .column.is-offset-four-fifths-fullhd{margin-left:80%}html.theme--documenter-dark .column.is-0-fullhd{flex:none;width:0%}html.theme--documenter-dark 
.column.is-offset-0-fullhd{margin-left:0%}html.theme--documenter-dark .column.is-1-fullhd{flex:none;width:8.3333333333%}html.theme--documenter-dark .column.is-offset-1-fullhd{margin-left:8.3333333333%}html.theme--documenter-dark .column.is-2-fullhd{flex:none;width:16.6666666667%}html.theme--documenter-dark .column.is-offset-2-fullhd{margin-left:16.6666666667%}html.theme--documenter-dark .column.is-3-fullhd{flex:none;width:25%}html.theme--documenter-dark .column.is-offset-3-fullhd{margin-left:25%}html.theme--documenter-dark .column.is-4-fullhd{flex:none;width:33.3333333333%}html.theme--documenter-dark .column.is-offset-4-fullhd{margin-left:33.3333333333%}html.theme--documenter-dark .column.is-5-fullhd{flex:none;width:41.6666666667%}html.theme--documenter-dark .column.is-offset-5-fullhd{margin-left:41.6666666667%}html.theme--documenter-dark .column.is-6-fullhd{flex:none;width:50%}html.theme--documenter-dark .column.is-offset-6-fullhd{margin-left:50%}html.theme--documenter-dark .column.is-7-fullhd{flex:none;width:58.3333333333%}html.theme--documenter-dark .column.is-offset-7-fullhd{margin-left:58.3333333333%}html.theme--documenter-dark .column.is-8-fullhd{flex:none;width:66.6666666667%}html.theme--documenter-dark .column.is-offset-8-fullhd{margin-left:66.6666666667%}html.theme--documenter-dark .column.is-9-fullhd{flex:none;width:75%}html.theme--documenter-dark .column.is-offset-9-fullhd{margin-left:75%}html.theme--documenter-dark .column.is-10-fullhd{flex:none;width:83.3333333333%}html.theme--documenter-dark .column.is-offset-10-fullhd{margin-left:83.3333333333%}html.theme--documenter-dark .column.is-11-fullhd{flex:none;width:91.6666666667%}html.theme--documenter-dark .column.is-offset-11-fullhd{margin-left:91.6666666667%}html.theme--documenter-dark .column.is-12-fullhd{flex:none;width:100%}html.theme--documenter-dark .column.is-offset-12-fullhd{margin-left:100%}}html.theme--documenter-dark 
.columns{margin-left:-.75rem;margin-right:-.75rem;margin-top:-.75rem}html.theme--documenter-dark .columns:last-child{margin-bottom:-.75rem}html.theme--documenter-dark .columns:not(:last-child){margin-bottom:calc(1.5rem - .75rem)}html.theme--documenter-dark .columns.is-centered{justify-content:center}html.theme--documenter-dark .columns.is-gapless{margin-left:0;margin-right:0;margin-top:0}html.theme--documenter-dark .columns.is-gapless>.column{margin:0;padding:0 !important}html.theme--documenter-dark .columns.is-gapless:not(:last-child){margin-bottom:1.5rem}html.theme--documenter-dark .columns.is-gapless:last-child{margin-bottom:0}html.theme--documenter-dark .columns.is-mobile{display:flex}html.theme--documenter-dark .columns.is-multiline{flex-wrap:wrap}html.theme--documenter-dark .columns.is-vcentered{align-items:center}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns:not(.is-desktop){display:flex}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-desktop{display:flex}}html.theme--documenter-dark .columns.is-variable{--columnGap: 0.75rem;margin-left:calc(-1 * var(--columnGap));margin-right:calc(-1 * var(--columnGap))}html.theme--documenter-dark .columns.is-variable .column{padding-left:var(--columnGap);padding-right:var(--columnGap)}html.theme--documenter-dark .columns.is-variable.is-0{--columnGap: 0rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-0-mobile{--columnGap: 0rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-0-tablet{--columnGap: 0rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-0-tablet-only{--columnGap: 0rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-0-touch{--columnGap: 0rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-0-desktop{--columnGap: 
0rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-0-desktop-only{--columnGap: 0rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-0-widescreen{--columnGap: 0rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-0-widescreen-only{--columnGap: 0rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-0-fullhd{--columnGap: 0rem}}html.theme--documenter-dark .columns.is-variable.is-1{--columnGap: .25rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-1-mobile{--columnGap: .25rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-1-tablet{--columnGap: .25rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-1-tablet-only{--columnGap: .25rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-1-touch{--columnGap: .25rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-1-desktop{--columnGap: .25rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-1-desktop-only{--columnGap: .25rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-1-widescreen{--columnGap: .25rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-1-widescreen-only{--columnGap: .25rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-1-fullhd{--columnGap: .25rem}}html.theme--documenter-dark .columns.is-variable.is-2{--columnGap: .5rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-2-mobile{--columnGap: .5rem}}@media 
screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-2-tablet{--columnGap: .5rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-2-tablet-only{--columnGap: .5rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-2-touch{--columnGap: .5rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-2-desktop{--columnGap: .5rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-2-desktop-only{--columnGap: .5rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-2-widescreen{--columnGap: .5rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-2-widescreen-only{--columnGap: .5rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-2-fullhd{--columnGap: .5rem}}html.theme--documenter-dark .columns.is-variable.is-3{--columnGap: .75rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-3-mobile{--columnGap: .75rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-3-tablet{--columnGap: .75rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-3-tablet-only{--columnGap: .75rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-3-touch{--columnGap: .75rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-3-desktop{--columnGap: .75rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-3-desktop-only{--columnGap: .75rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark 
.columns.is-variable.is-3-widescreen{--columnGap: .75rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-3-widescreen-only{--columnGap: .75rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-3-fullhd{--columnGap: .75rem}}html.theme--documenter-dark .columns.is-variable.is-4{--columnGap: 1rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-4-mobile{--columnGap: 1rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-4-tablet{--columnGap: 1rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-4-tablet-only{--columnGap: 1rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-4-touch{--columnGap: 1rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-4-desktop{--columnGap: 1rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-4-desktop-only{--columnGap: 1rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-4-widescreen{--columnGap: 1rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-4-widescreen-only{--columnGap: 1rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-4-fullhd{--columnGap: 1rem}}html.theme--documenter-dark .columns.is-variable.is-5{--columnGap: 1.25rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-5-mobile{--columnGap: 1.25rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-5-tablet{--columnGap: 1.25rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark 
.columns.is-variable.is-5-tablet-only{--columnGap: 1.25rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-5-touch{--columnGap: 1.25rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-5-desktop{--columnGap: 1.25rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-5-desktop-only{--columnGap: 1.25rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-5-widescreen{--columnGap: 1.25rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-5-widescreen-only{--columnGap: 1.25rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-5-fullhd{--columnGap: 1.25rem}}html.theme--documenter-dark .columns.is-variable.is-6{--columnGap: 1.5rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-6-mobile{--columnGap: 1.5rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-6-tablet{--columnGap: 1.5rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-6-tablet-only{--columnGap: 1.5rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-6-touch{--columnGap: 1.5rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-6-desktop{--columnGap: 1.5rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-6-desktop-only{--columnGap: 1.5rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-6-widescreen{--columnGap: 1.5rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-6-widescreen-only{--columnGap: 1.5rem}}@media screen 
and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-6-fullhd{--columnGap: 1.5rem}}html.theme--documenter-dark .columns.is-variable.is-7{--columnGap: 1.75rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-7-mobile{--columnGap: 1.75rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-7-tablet{--columnGap: 1.75rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-7-tablet-only{--columnGap: 1.75rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-7-touch{--columnGap: 1.75rem}}@media screen and (min-width: 1056px){html.theme--documenter-dark .columns.is-variable.is-7-desktop{--columnGap: 1.75rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-7-desktop-only{--columnGap: 1.75rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-7-widescreen{--columnGap: 1.75rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-7-widescreen-only{--columnGap: 1.75rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-7-fullhd{--columnGap: 1.75rem}}html.theme--documenter-dark .columns.is-variable.is-8{--columnGap: 2rem}@media screen and (max-width: 768px){html.theme--documenter-dark .columns.is-variable.is-8-mobile{--columnGap: 2rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .columns.is-variable.is-8-tablet{--columnGap: 2rem}}@media screen and (min-width: 769px) and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-8-tablet-only{--columnGap: 2rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark .columns.is-variable.is-8-touch{--columnGap: 2rem}}@media screen and (min-width: 
1056px){html.theme--documenter-dark .columns.is-variable.is-8-desktop{--columnGap: 2rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){html.theme--documenter-dark .columns.is-variable.is-8-desktop-only{--columnGap: 2rem}}@media screen and (min-width: 1216px){html.theme--documenter-dark .columns.is-variable.is-8-widescreen{--columnGap: 2rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){html.theme--documenter-dark .columns.is-variable.is-8-widescreen-only{--columnGap: 2rem}}@media screen and (min-width: 1408px){html.theme--documenter-dark .columns.is-variable.is-8-fullhd{--columnGap: 2rem}}html.theme--documenter-dark .tile{align-items:stretch;display:block;flex-basis:0;flex-grow:1;flex-shrink:1;min-height:min-content}html.theme--documenter-dark .tile.is-ancestor{margin-left:-.75rem;margin-right:-.75rem;margin-top:-.75rem}html.theme--documenter-dark .tile.is-ancestor:last-child{margin-bottom:-.75rem}html.theme--documenter-dark .tile.is-ancestor:not(:last-child){margin-bottom:.75rem}html.theme--documenter-dark .tile.is-child{margin:0 !important}html.theme--documenter-dark .tile.is-parent{padding:.75rem}html.theme--documenter-dark .tile.is-vertical{flex-direction:column}html.theme--documenter-dark .tile.is-vertical>.tile.is-child:not(:last-child){margin-bottom:1.5rem !important}@media screen and (min-width: 769px),print{html.theme--documenter-dark .tile:not(.is-child){display:flex}html.theme--documenter-dark .tile.is-1{flex:none;width:8.3333333333%}html.theme--documenter-dark .tile.is-2{flex:none;width:16.6666666667%}html.theme--documenter-dark .tile.is-3{flex:none;width:25%}html.theme--documenter-dark .tile.is-4{flex:none;width:33.3333333333%}html.theme--documenter-dark .tile.is-5{flex:none;width:41.6666666667%}html.theme--documenter-dark .tile.is-6{flex:none;width:50%}html.theme--documenter-dark .tile.is-7{flex:none;width:58.3333333333%}html.theme--documenter-dark .tile.is-8{flex:none;width:66.6666666667%}html.theme--documenter-dark 
.tile.is-9{flex:none;width:75%}html.theme--documenter-dark .tile.is-10{flex:none;width:83.3333333333%}html.theme--documenter-dark .tile.is-11{flex:none;width:91.6666666667%}html.theme--documenter-dark .tile.is-12{flex:none;width:100%}}html.theme--documenter-dark .hero{align-items:stretch;display:flex;flex-direction:column;justify-content:space-between}html.theme--documenter-dark .hero .navbar{background:none}html.theme--documenter-dark .hero .tabs ul{border-bottom:none}html.theme--documenter-dark .hero.is-white{background-color:#fff;color:#0a0a0a}html.theme--documenter-dark .hero.is-white a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-white strong{color:inherit}html.theme--documenter-dark .hero.is-white .title{color:#0a0a0a}html.theme--documenter-dark .hero.is-white .subtitle{color:rgba(10,10,10,0.9)}html.theme--documenter-dark .hero.is-white .subtitle a:not(.button),html.theme--documenter-dark .hero.is-white .subtitle strong{color:#0a0a0a}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-white .navbar-menu{background-color:#fff}}html.theme--documenter-dark .hero.is-white .navbar-item,html.theme--documenter-dark .hero.is-white .navbar-link{color:rgba(10,10,10,0.7)}html.theme--documenter-dark .hero.is-white a.navbar-item:hover,html.theme--documenter-dark .hero.is-white a.navbar-item.is-active,html.theme--documenter-dark .hero.is-white .navbar-link:hover,html.theme--documenter-dark .hero.is-white .navbar-link.is-active{background-color:#f2f2f2;color:#0a0a0a}html.theme--documenter-dark .hero.is-white .tabs a{color:#0a0a0a;opacity:0.9}html.theme--documenter-dark .hero.is-white .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-white .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-white .tabs.is-boxed a,html.theme--documenter-dark .hero.is-white .tabs.is-toggle a{color:#0a0a0a}html.theme--documenter-dark .hero.is-white .tabs.is-boxed 
a:hover,html.theme--documenter-dark .hero.is-white .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-white .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-white .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-white .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-white .tabs.is-toggle li.is-active a:hover{background-color:#0a0a0a;border-color:#0a0a0a;color:#fff}html.theme--documenter-dark .hero.is-white.is-bold{background-image:linear-gradient(141deg, #e8e3e4 0%, #fff 71%, #fff 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-white.is-bold .navbar-menu{background-image:linear-gradient(141deg, #e8e3e4 0%, #fff 71%, #fff 100%)}}html.theme--documenter-dark .hero.is-black{background-color:#0a0a0a;color:#fff}html.theme--documenter-dark .hero.is-black a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-black strong{color:inherit}html.theme--documenter-dark .hero.is-black .title{color:#fff}html.theme--documenter-dark .hero.is-black .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-black .subtitle a:not(.button),html.theme--documenter-dark .hero.is-black .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-black .navbar-menu{background-color:#0a0a0a}}html.theme--documenter-dark .hero.is-black .navbar-item,html.theme--documenter-dark .hero.is-black .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-black a.navbar-item:hover,html.theme--documenter-dark .hero.is-black a.navbar-item.is-active,html.theme--documenter-dark .hero.is-black .navbar-link:hover,html.theme--documenter-dark .hero.is-black .navbar-link.is-active{background-color:#000;color:#fff}html.theme--documenter-dark .hero.is-black .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-black .tabs 
a:hover{opacity:1}html.theme--documenter-dark .hero.is-black .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-black .tabs.is-boxed a,html.theme--documenter-dark .hero.is-black .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-black .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-black .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-black .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-black .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-black .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-black .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#0a0a0a}html.theme--documenter-dark .hero.is-black.is-bold{background-image:linear-gradient(141deg, #000 0%, #0a0a0a 71%, #181616 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-black.is-bold .navbar-menu{background-image:linear-gradient(141deg, #000 0%, #0a0a0a 71%, #181616 100%)}}html.theme--documenter-dark .hero.is-light{background-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .hero.is-light a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-light strong{color:inherit}html.theme--documenter-dark .hero.is-light .title{color:#282f2f}html.theme--documenter-dark .hero.is-light .subtitle{color:rgba(40,47,47,0.9)}html.theme--documenter-dark .hero.is-light .subtitle a:not(.button),html.theme--documenter-dark .hero.is-light .subtitle strong{color:#282f2f}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-light .navbar-menu{background-color:#ecf0f1}}html.theme--documenter-dark .hero.is-light .navbar-item,html.theme--documenter-dark .hero.is-light .navbar-link{color:rgba(40,47,47,0.7)}html.theme--documenter-dark .hero.is-light a.navbar-item:hover,html.theme--documenter-dark .hero.is-light 
a.navbar-item.is-active,html.theme--documenter-dark .hero.is-light .navbar-link:hover,html.theme--documenter-dark .hero.is-light .navbar-link.is-active{background-color:#dde4e6;color:#282f2f}html.theme--documenter-dark .hero.is-light .tabs a{color:#282f2f;opacity:0.9}html.theme--documenter-dark .hero.is-light .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-light .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-light .tabs.is-boxed a,html.theme--documenter-dark .hero.is-light .tabs.is-toggle a{color:#282f2f}html.theme--documenter-dark .hero.is-light .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-light .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-light .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-light .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-light .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-light .tabs.is-toggle li.is-active a:hover{background-color:#282f2f;border-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .hero.is-light.is-bold{background-image:linear-gradient(141deg, #cadfe0 0%, #ecf0f1 71%, #fafbfc 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-light.is-bold .navbar-menu{background-image:linear-gradient(141deg, #cadfe0 0%, #ecf0f1 71%, #fafbfc 100%)}}html.theme--documenter-dark .hero.is-dark,html.theme--documenter-dark .content kbd.hero{background-color:#282f2f;color:#ecf0f1}html.theme--documenter-dark .hero.is-dark a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .content kbd.hero a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-dark strong,html.theme--documenter-dark .content kbd.hero strong{color:inherit}html.theme--documenter-dark .hero.is-dark .title,html.theme--documenter-dark .content kbd.hero 
.title{color:#ecf0f1}html.theme--documenter-dark .hero.is-dark .subtitle,html.theme--documenter-dark .content kbd.hero .subtitle{color:rgba(236,240,241,0.9)}html.theme--documenter-dark .hero.is-dark .subtitle a:not(.button),html.theme--documenter-dark .content kbd.hero .subtitle a:not(.button),html.theme--documenter-dark .hero.is-dark .subtitle strong,html.theme--documenter-dark .content kbd.hero .subtitle strong{color:#ecf0f1}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-dark .navbar-menu,html.theme--documenter-dark .content kbd.hero .navbar-menu{background-color:#282f2f}}html.theme--documenter-dark .hero.is-dark .navbar-item,html.theme--documenter-dark .content kbd.hero .navbar-item,html.theme--documenter-dark .hero.is-dark .navbar-link,html.theme--documenter-dark .content kbd.hero .navbar-link{color:rgba(236,240,241,0.7)}html.theme--documenter-dark .hero.is-dark a.navbar-item:hover,html.theme--documenter-dark .content kbd.hero a.navbar-item:hover,html.theme--documenter-dark .hero.is-dark a.navbar-item.is-active,html.theme--documenter-dark .content kbd.hero a.navbar-item.is-active,html.theme--documenter-dark .hero.is-dark .navbar-link:hover,html.theme--documenter-dark .content kbd.hero .navbar-link:hover,html.theme--documenter-dark .hero.is-dark .navbar-link.is-active,html.theme--documenter-dark .content kbd.hero .navbar-link.is-active{background-color:#1d2122;color:#ecf0f1}html.theme--documenter-dark .hero.is-dark .tabs a,html.theme--documenter-dark .content kbd.hero .tabs a{color:#ecf0f1;opacity:0.9}html.theme--documenter-dark .hero.is-dark .tabs a:hover,html.theme--documenter-dark .content kbd.hero .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-dark .tabs li.is-active a,html.theme--documenter-dark .content kbd.hero .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-dark .tabs.is-boxed a,html.theme--documenter-dark .content kbd.hero .tabs.is-boxed a,html.theme--documenter-dark .hero.is-dark 
.tabs.is-toggle a,html.theme--documenter-dark .content kbd.hero .tabs.is-toggle a{color:#ecf0f1}html.theme--documenter-dark .hero.is-dark .tabs.is-boxed a:hover,html.theme--documenter-dark .content kbd.hero .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-dark .tabs.is-toggle a:hover,html.theme--documenter-dark .content kbd.hero .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-dark .tabs.is-boxed li.is-active a,html.theme--documenter-dark .content kbd.hero .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-dark .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-dark .tabs.is-toggle li.is-active a,html.theme--documenter-dark .content kbd.hero .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-dark .tabs.is-toggle li.is-active a:hover{background-color:#ecf0f1;border-color:#ecf0f1;color:#282f2f}html.theme--documenter-dark .hero.is-dark.is-bold,html.theme--documenter-dark .content kbd.hero.is-bold{background-image:linear-gradient(141deg, #0f1615 0%, #282f2f 71%, #313c40 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-dark.is-bold .navbar-menu,html.theme--documenter-dark .content kbd.hero.is-bold .navbar-menu{background-image:linear-gradient(141deg, #0f1615 0%, #282f2f 71%, #313c40 100%)}}html.theme--documenter-dark .hero.is-primary,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink{background-color:#375a7f;color:#fff}html.theme--documenter-dark .hero.is-primary a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-primary strong,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink strong{color:inherit}html.theme--documenter-dark .hero.is-primary .title,html.theme--documenter-dark 
.docstring>section>a.hero.docs-sourcelink .title{color:#fff}html.theme--documenter-dark .hero.is-primary .subtitle,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-primary .subtitle a:not(.button),html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .subtitle a:not(.button),html.theme--documenter-dark .hero.is-primary .subtitle strong,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-primary .navbar-menu,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .navbar-menu{background-color:#375a7f}}html.theme--documenter-dark .hero.is-primary .navbar-item,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .navbar-item,html.theme--documenter-dark .hero.is-primary .navbar-link,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-primary a.navbar-item:hover,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink a.navbar-item:hover,html.theme--documenter-dark .hero.is-primary a.navbar-item.is-active,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink a.navbar-item.is-active,html.theme--documenter-dark .hero.is-primary .navbar-link:hover,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .navbar-link:hover,html.theme--documenter-dark .hero.is-primary .navbar-link.is-active,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .navbar-link.is-active{background-color:#2f4d6d;color:#fff}html.theme--documenter-dark .hero.is-primary .tabs a,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-primary .tabs a:hover,html.theme--documenter-dark 
.docstring>section>a.hero.docs-sourcelink .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-primary .tabs li.is-active a,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-primary .tabs.is-boxed a,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs.is-boxed a,html.theme--documenter-dark .hero.is-primary .tabs.is-toggle a,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-primary .tabs.is-boxed a:hover,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-primary .tabs.is-toggle a:hover,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-primary .tabs.is-boxed li.is-active a,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-primary .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-primary .tabs.is-toggle li.is-active a,html.theme--documenter-dark .docstring>section>a.hero.docs-sourcelink .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-primary .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#375a7f}html.theme--documenter-dark .hero.is-primary.is-bold,html.theme--documenter-dark .docstring>section>a.hero.is-bold.docs-sourcelink{background-image:linear-gradient(141deg, #214b62 0%, #375a7f 71%, #3a5796 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-primary.is-bold .navbar-menu,html.theme--documenter-dark .docstring>section>a.hero.is-bold.docs-sourcelink .navbar-menu{background-image:linear-gradient(141deg, #214b62 0%, #375a7f 71%, #3a5796 100%)}}html.theme--documenter-dark 
.hero.is-link{background-color:#1abc9c;color:#fff}html.theme--documenter-dark .hero.is-link a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-link strong{color:inherit}html.theme--documenter-dark .hero.is-link .title{color:#fff}html.theme--documenter-dark .hero.is-link .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-link .subtitle a:not(.button),html.theme--documenter-dark .hero.is-link .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-link .navbar-menu{background-color:#1abc9c}}html.theme--documenter-dark .hero.is-link .navbar-item,html.theme--documenter-dark .hero.is-link .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-link a.navbar-item:hover,html.theme--documenter-dark .hero.is-link a.navbar-item.is-active,html.theme--documenter-dark .hero.is-link .navbar-link:hover,html.theme--documenter-dark .hero.is-link .navbar-link.is-active{background-color:#17a689;color:#fff}html.theme--documenter-dark .hero.is-link .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-link .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-link .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-link .tabs.is-boxed a,html.theme--documenter-dark .hero.is-link .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-link .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-link .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-link .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-link .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-link .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-link .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#1abc9c}html.theme--documenter-dark 
.hero.is-link.is-bold{background-image:linear-gradient(141deg, #0c9764 0%, #1abc9c 71%, #17d8d2 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-link.is-bold .navbar-menu{background-image:linear-gradient(141deg, #0c9764 0%, #1abc9c 71%, #17d8d2 100%)}}html.theme--documenter-dark .hero.is-info{background-color:#024c7d;color:#fff}html.theme--documenter-dark .hero.is-info a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-info strong{color:inherit}html.theme--documenter-dark .hero.is-info .title{color:#fff}html.theme--documenter-dark .hero.is-info .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-info .subtitle a:not(.button),html.theme--documenter-dark .hero.is-info .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-info .navbar-menu{background-color:#024c7d}}html.theme--documenter-dark .hero.is-info .navbar-item,html.theme--documenter-dark .hero.is-info .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-info a.navbar-item:hover,html.theme--documenter-dark .hero.is-info a.navbar-item.is-active,html.theme--documenter-dark .hero.is-info .navbar-link:hover,html.theme--documenter-dark .hero.is-info .navbar-link.is-active{background-color:#023d64;color:#fff}html.theme--documenter-dark .hero.is-info .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-info .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-info .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-info .tabs.is-boxed a,html.theme--documenter-dark .hero.is-info .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-info .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-info .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-info .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-info 
.tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-info .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-info .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#024c7d}html.theme--documenter-dark .hero.is-info.is-bold{background-image:linear-gradient(141deg, #003a4c 0%, #024c7d 71%, #004299 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-info.is-bold .navbar-menu{background-image:linear-gradient(141deg, #003a4c 0%, #024c7d 71%, #004299 100%)}}html.theme--documenter-dark .hero.is-success{background-color:#008438;color:#fff}html.theme--documenter-dark .hero.is-success a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-success strong{color:inherit}html.theme--documenter-dark .hero.is-success .title{color:#fff}html.theme--documenter-dark .hero.is-success .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-success .subtitle a:not(.button),html.theme--documenter-dark .hero.is-success .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-success .navbar-menu{background-color:#008438}}html.theme--documenter-dark .hero.is-success .navbar-item,html.theme--documenter-dark .hero.is-success .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-success a.navbar-item:hover,html.theme--documenter-dark .hero.is-success a.navbar-item.is-active,html.theme--documenter-dark .hero.is-success .navbar-link:hover,html.theme--documenter-dark .hero.is-success .navbar-link.is-active{background-color:#006b2d;color:#fff}html.theme--documenter-dark .hero.is-success .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-success .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-success .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-success .tabs.is-boxed a,html.theme--documenter-dark 
.hero.is-success .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-success .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-success .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-success .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-success .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-success .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-success .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#008438}html.theme--documenter-dark .hero.is-success.is-bold{background-image:linear-gradient(141deg, #005115 0%, #008438 71%, #009e5d 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-success.is-bold .navbar-menu{background-image:linear-gradient(141deg, #005115 0%, #008438 71%, #009e5d 100%)}}html.theme--documenter-dark .hero.is-warning{background-color:#ad8100;color:#fff}html.theme--documenter-dark .hero.is-warning a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-warning strong{color:inherit}html.theme--documenter-dark .hero.is-warning .title{color:#fff}html.theme--documenter-dark .hero.is-warning .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-warning .subtitle a:not(.button),html.theme--documenter-dark .hero.is-warning .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-warning .navbar-menu{background-color:#ad8100}}html.theme--documenter-dark .hero.is-warning .navbar-item,html.theme--documenter-dark .hero.is-warning .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-warning a.navbar-item:hover,html.theme--documenter-dark .hero.is-warning a.navbar-item.is-active,html.theme--documenter-dark .hero.is-warning .navbar-link:hover,html.theme--documenter-dark .hero.is-warning 
.navbar-link.is-active{background-color:#946e00;color:#fff}html.theme--documenter-dark .hero.is-warning .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-warning .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-warning .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-warning .tabs.is-boxed a,html.theme--documenter-dark .hero.is-warning .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-warning .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-warning .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-warning .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-warning .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-warning .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-warning .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#ad8100}html.theme--documenter-dark .hero.is-warning.is-bold{background-image:linear-gradient(141deg, #7a4700 0%, #ad8100 71%, #c7b500 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-warning.is-bold .navbar-menu{background-image:linear-gradient(141deg, #7a4700 0%, #ad8100 71%, #c7b500 100%)}}html.theme--documenter-dark .hero.is-danger{background-color:#9e1b0d;color:#fff}html.theme--documenter-dark .hero.is-danger a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),html.theme--documenter-dark .hero.is-danger strong{color:inherit}html.theme--documenter-dark .hero.is-danger .title{color:#fff}html.theme--documenter-dark .hero.is-danger .subtitle{color:rgba(255,255,255,0.9)}html.theme--documenter-dark .hero.is-danger .subtitle a:not(.button),html.theme--documenter-dark .hero.is-danger .subtitle strong{color:#fff}@media screen and (max-width: 1055px){html.theme--documenter-dark .hero.is-danger .navbar-menu{background-color:#9e1b0d}}html.theme--documenter-dark .hero.is-danger 
.navbar-item,html.theme--documenter-dark .hero.is-danger .navbar-link{color:rgba(255,255,255,0.7)}html.theme--documenter-dark .hero.is-danger a.navbar-item:hover,html.theme--documenter-dark .hero.is-danger a.navbar-item.is-active,html.theme--documenter-dark .hero.is-danger .navbar-link:hover,html.theme--documenter-dark .hero.is-danger .navbar-link.is-active{background-color:#86170b;color:#fff}html.theme--documenter-dark .hero.is-danger .tabs a{color:#fff;opacity:0.9}html.theme--documenter-dark .hero.is-danger .tabs a:hover{opacity:1}html.theme--documenter-dark .hero.is-danger .tabs li.is-active a{opacity:1}html.theme--documenter-dark .hero.is-danger .tabs.is-boxed a,html.theme--documenter-dark .hero.is-danger .tabs.is-toggle a{color:#fff}html.theme--documenter-dark .hero.is-danger .tabs.is-boxed a:hover,html.theme--documenter-dark .hero.is-danger .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}html.theme--documenter-dark .hero.is-danger .tabs.is-boxed li.is-active a,html.theme--documenter-dark .hero.is-danger .tabs.is-boxed li.is-active a:hover,html.theme--documenter-dark .hero.is-danger .tabs.is-toggle li.is-active a,html.theme--documenter-dark .hero.is-danger .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#9e1b0d}html.theme--documenter-dark .hero.is-danger.is-bold{background-image:linear-gradient(141deg, #75030b 0%, #9e1b0d 71%, #ba380a 100%)}@media screen and (max-width: 768px){html.theme--documenter-dark .hero.is-danger.is-bold .navbar-menu{background-image:linear-gradient(141deg, #75030b 0%, #9e1b0d 71%, #ba380a 100%)}}html.theme--documenter-dark .hero.is-small .hero-body,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.hero .hero-body{padding-bottom:1.5rem;padding-top:1.5rem}@media screen and (min-width: 769px),print{html.theme--documenter-dark .hero.is-medium .hero-body{padding-bottom:9rem;padding-top:9rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark 
.hero.is-large .hero-body{padding-bottom:18rem;padding-top:18rem}}html.theme--documenter-dark .hero.is-halfheight .hero-body,html.theme--documenter-dark .hero.is-fullheight .hero-body,html.theme--documenter-dark .hero.is-fullheight-with-navbar .hero-body{align-items:center;display:flex}html.theme--documenter-dark .hero.is-halfheight .hero-body>.container,html.theme--documenter-dark .hero.is-fullheight .hero-body>.container,html.theme--documenter-dark .hero.is-fullheight-with-navbar .hero-body>.container{flex-grow:1;flex-shrink:1}html.theme--documenter-dark .hero.is-halfheight{min-height:50vh}html.theme--documenter-dark .hero.is-fullheight{min-height:100vh}html.theme--documenter-dark .hero-video{overflow:hidden}html.theme--documenter-dark .hero-video video{left:50%;min-height:100%;min-width:100%;position:absolute;top:50%;transform:translate3d(-50%, -50%, 0)}html.theme--documenter-dark .hero-video.is-transparent{opacity:0.3}@media screen and (max-width: 768px){html.theme--documenter-dark .hero-video{display:none}}html.theme--documenter-dark .hero-buttons{margin-top:1.5rem}@media screen and (max-width: 768px){html.theme--documenter-dark .hero-buttons .button{display:flex}html.theme--documenter-dark .hero-buttons .button:not(:last-child){margin-bottom:0.75rem}}@media screen and (min-width: 769px),print{html.theme--documenter-dark .hero-buttons{display:flex;justify-content:center}html.theme--documenter-dark .hero-buttons .button:not(:last-child){margin-right:1.5rem}}html.theme--documenter-dark .hero-head,html.theme--documenter-dark .hero-foot{flex-grow:0;flex-shrink:0}html.theme--documenter-dark .hero-body{flex-grow:1;flex-shrink:0;padding:3rem 1.5rem}html.theme--documenter-dark .section{padding:3rem 1.5rem}@media screen and (min-width: 1056px){html.theme--documenter-dark .section.is-medium{padding:9rem 1.5rem}html.theme--documenter-dark .section.is-large{padding:18rem 1.5rem}}html.theme--documenter-dark .footer{background-color:#282f2f;padding:3rem 1.5rem 
6rem}html.theme--documenter-dark hr{height:1px}html.theme--documenter-dark h6{text-transform:uppercase;letter-spacing:0.5px}html.theme--documenter-dark .hero{background-color:#343c3d}html.theme--documenter-dark a{transition:all 200ms ease}html.theme--documenter-dark .button{transition:all 200ms ease;border-width:1px;color:#fff}html.theme--documenter-dark .button.is-active,html.theme--documenter-dark .button.is-focused,html.theme--documenter-dark .button:active,html.theme--documenter-dark .button:focus{box-shadow:0 0 0 2px rgba(140,155,157,0.5)}html.theme--documenter-dark .button.is-white.is-hovered,html.theme--documenter-dark .button.is-white:hover{background-color:#fff}html.theme--documenter-dark .button.is-white.is-active,html.theme--documenter-dark .button.is-white.is-focused,html.theme--documenter-dark .button.is-white:active,html.theme--documenter-dark .button.is-white:focus{border-color:#fff;box-shadow:0 0 0 2px rgba(255,255,255,0.5)}html.theme--documenter-dark .button.is-black.is-hovered,html.theme--documenter-dark .button.is-black:hover{background-color:#1d1d1d}html.theme--documenter-dark .button.is-black.is-active,html.theme--documenter-dark .button.is-black.is-focused,html.theme--documenter-dark .button.is-black:active,html.theme--documenter-dark .button.is-black:focus{border-color:#0a0a0a;box-shadow:0 0 0 2px rgba(10,10,10,0.5)}html.theme--documenter-dark .button.is-light.is-hovered,html.theme--documenter-dark .button.is-light:hover{background-color:#fff}html.theme--documenter-dark .button.is-light.is-active,html.theme--documenter-dark .button.is-light.is-focused,html.theme--documenter-dark .button.is-light:active,html.theme--documenter-dark .button.is-light:focus{border-color:#ecf0f1;box-shadow:0 0 0 2px rgba(236,240,241,0.5)}html.theme--documenter-dark .button.is-dark.is-hovered,html.theme--documenter-dark .content kbd.button.is-hovered,html.theme--documenter-dark .button.is-dark:hover,html.theme--documenter-dark .content 
kbd.button:hover{background-color:#3a4344}html.theme--documenter-dark .button.is-dark.is-active,html.theme--documenter-dark .content kbd.button.is-active,html.theme--documenter-dark .button.is-dark.is-focused,html.theme--documenter-dark .content kbd.button.is-focused,html.theme--documenter-dark .button.is-dark:active,html.theme--documenter-dark .content kbd.button:active,html.theme--documenter-dark .button.is-dark:focus,html.theme--documenter-dark .content kbd.button:focus{border-color:#282f2f;box-shadow:0 0 0 2px rgba(40,47,47,0.5)}html.theme--documenter-dark .button.is-primary.is-hovered,html.theme--documenter-dark .docstring>section>a.button.is-hovered.docs-sourcelink,html.theme--documenter-dark .button.is-primary:hover,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:hover{background-color:#436d9a}html.theme--documenter-dark .button.is-primary.is-active,html.theme--documenter-dark .docstring>section>a.button.is-active.docs-sourcelink,html.theme--documenter-dark .button.is-primary.is-focused,html.theme--documenter-dark .docstring>section>a.button.is-focused.docs-sourcelink,html.theme--documenter-dark .button.is-primary:active,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:active,html.theme--documenter-dark .button.is-primary:focus,html.theme--documenter-dark .docstring>section>a.button.docs-sourcelink:focus{border-color:#375a7f;box-shadow:0 0 0 2px rgba(55,90,127,0.5)}html.theme--documenter-dark .button.is-link.is-hovered,html.theme--documenter-dark .button.is-link:hover{background-color:#1fdeb8}html.theme--documenter-dark .button.is-link.is-active,html.theme--documenter-dark .button.is-link.is-focused,html.theme--documenter-dark .button.is-link:active,html.theme--documenter-dark .button.is-link:focus{border-color:#1abc9c;box-shadow:0 0 0 2px rgba(26,188,156,0.5)}html.theme--documenter-dark .button.is-info.is-hovered,html.theme--documenter-dark 
.button.is-info:hover{background-color:#0363a3}html.theme--documenter-dark .button.is-info.is-active,html.theme--documenter-dark .button.is-info.is-focused,html.theme--documenter-dark .button.is-info:active,html.theme--documenter-dark .button.is-info:focus{border-color:#024c7d;box-shadow:0 0 0 2px rgba(2,76,125,0.5)}html.theme--documenter-dark .button.is-success.is-hovered,html.theme--documenter-dark .button.is-success:hover{background-color:#00aa48}html.theme--documenter-dark .button.is-success.is-active,html.theme--documenter-dark .button.is-success.is-focused,html.theme--documenter-dark .button.is-success:active,html.theme--documenter-dark .button.is-success:focus{border-color:#008438;box-shadow:0 0 0 2px rgba(0,132,56,0.5)}html.theme--documenter-dark .button.is-warning.is-hovered,html.theme--documenter-dark .button.is-warning:hover{background-color:#d39e00}html.theme--documenter-dark .button.is-warning.is-active,html.theme--documenter-dark .button.is-warning.is-focused,html.theme--documenter-dark .button.is-warning:active,html.theme--documenter-dark .button.is-warning:focus{border-color:#ad8100;box-shadow:0 0 0 2px rgba(173,129,0,0.5)}html.theme--documenter-dark .button.is-danger.is-hovered,html.theme--documenter-dark .button.is-danger:hover{background-color:#c12110}html.theme--documenter-dark .button.is-danger.is-active,html.theme--documenter-dark .button.is-danger.is-focused,html.theme--documenter-dark .button.is-danger:active,html.theme--documenter-dark .button.is-danger:focus{border-color:#9e1b0d;box-shadow:0 0 0 2px rgba(158,27,13,0.5)}html.theme--documenter-dark .label{color:#dbdee0}html.theme--documenter-dark .button,html.theme--documenter-dark .control.has-icons-left .icon,html.theme--documenter-dark .control.has-icons-right .icon,html.theme--documenter-dark .input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark .pagination-ellipsis,html.theme--documenter-dark 
.pagination-link,html.theme--documenter-dark .pagination-next,html.theme--documenter-dark .pagination-previous,html.theme--documenter-dark .select,html.theme--documenter-dark .select select,html.theme--documenter-dark .textarea{height:2.5em}html.theme--documenter-dark .input,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark .textarea{transition:all 200ms ease;box-shadow:none;border-width:1px;padding-left:1em;padding-right:1em}html.theme--documenter-dark .select:after,html.theme--documenter-dark .select select{border-width:1px}html.theme--documenter-dark .control.has-addons .button,html.theme--documenter-dark .control.has-addons .input,html.theme--documenter-dark .control.has-addons #documenter .docs-sidebar form.docs-search>input,html.theme--documenter-dark #documenter .docs-sidebar .control.has-addons form.docs-search>input,html.theme--documenter-dark .control.has-addons .select{margin-right:-1px}html.theme--documenter-dark .notification{background-color:#343c3d}html.theme--documenter-dark .card{box-shadow:none;border:1px solid #343c3d;background-color:#282f2f;border-radius:.4em}html.theme--documenter-dark .card .card-image img{border-radius:.4em .4em 0 0}html.theme--documenter-dark .card .card-header{box-shadow:none;background-color:rgba(18,18,18,0.2);border-radius:.4em .4em 0 0}html.theme--documenter-dark .card .card-footer{background-color:rgba(18,18,18,0.2)}html.theme--documenter-dark .card .card-footer,html.theme--documenter-dark .card .card-footer-item{border-width:1px;border-color:#343c3d}html.theme--documenter-dark .notification.is-white a:not(.button){color:#0a0a0a;text-decoration:underline}html.theme--documenter-dark .notification.is-black a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .notification.is-light a:not(.button){color:#282f2f;text-decoration:underline}html.theme--documenter-dark .notification.is-dark a:not(.button),html.theme--documenter-dark .content 
kbd.notification a:not(.button){color:#ecf0f1;text-decoration:underline}html.theme--documenter-dark .notification.is-primary a:not(.button),html.theme--documenter-dark .docstring>section>a.notification.docs-sourcelink a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .notification.is-link a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .notification.is-info a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .notification.is-success a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .notification.is-warning a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .notification.is-danger a:not(.button){color:#fff;text-decoration:underline}html.theme--documenter-dark .tag,html.theme--documenter-dark .content kbd,html.theme--documenter-dark .docstring>section>a.docs-sourcelink{border-radius:.4em}html.theme--documenter-dark .menu-list a{transition:all 300ms ease}html.theme--documenter-dark .modal-card-body{background-color:#282f2f}html.theme--documenter-dark .modal-card-foot,html.theme--documenter-dark .modal-card-head{border-color:#343c3d}html.theme--documenter-dark .message-header{font-weight:700;background-color:#343c3d;color:#fff}html.theme--documenter-dark .message-body{border-width:1px;border-color:#343c3d}html.theme--documenter-dark .navbar{border-radius:.4em}html.theme--documenter-dark .navbar.is-transparent{background:none}html.theme--documenter-dark .navbar.is-primary .navbar-dropdown a.navbar-item.is-active,html.theme--documenter-dark .docstring>section>a.navbar.docs-sourcelink .navbar-dropdown a.navbar-item.is-active{background-color:#1abc9c}@media screen and (max-width: 1055px){html.theme--documenter-dark .navbar .navbar-menu{background-color:#375a7f;border-radius:0 0 .4em .4em}}html.theme--documenter-dark .hero .navbar,html.theme--documenter-dark body>.navbar{border-radius:0}html.theme--documenter-dark 
.pagination-link,html.theme--documenter-dark .pagination-next,html.theme--documenter-dark .pagination-previous{border-width:1px}html.theme--documenter-dark .panel-block,html.theme--documenter-dark .panel-heading,html.theme--documenter-dark .panel-tabs{border-width:1px}html.theme--documenter-dark .panel-block:first-child,html.theme--documenter-dark .panel-heading:first-child,html.theme--documenter-dark .panel-tabs:first-child{border-top-width:1px}html.theme--documenter-dark .panel-heading{font-weight:700}html.theme--documenter-dark .panel-tabs a{border-width:1px;margin-bottom:-1px}html.theme--documenter-dark .panel-tabs a.is-active{border-bottom-color:#17a689}html.theme--documenter-dark .panel-block:hover{color:#1dd2af}html.theme--documenter-dark .panel-block:hover .panel-icon{color:#1dd2af}html.theme--documenter-dark .panel-block.is-active .panel-icon{color:#17a689}html.theme--documenter-dark .tabs a{border-bottom-width:1px;margin-bottom:-1px}html.theme--documenter-dark .tabs ul{border-bottom-width:1px}html.theme--documenter-dark .tabs.is-boxed a{border-width:1px}html.theme--documenter-dark .tabs.is-boxed li.is-active a{background-color:#1f2424}html.theme--documenter-dark .tabs.is-toggle li a{border-width:1px;margin-bottom:0}html.theme--documenter-dark .tabs.is-toggle li+li{margin-left:-1px}html.theme--documenter-dark .hero.is-white .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-black .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-light .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-dark .navbar .navbar-dropdown .navbar-item:hover,html.theme--documenter-dark .content kbd.hero .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-primary .navbar .navbar-dropdown .navbar-item:hover,html.theme--documenter-dark 
.docstring>section>a.hero.docs-sourcelink .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-link .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-info .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-success .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-warning .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark .hero.is-danger .navbar .navbar-dropdown .navbar-item:hover{background-color:rgba(0,0,0,0)}html.theme--documenter-dark h1 .docs-heading-anchor,html.theme--documenter-dark h1 .docs-heading-anchor:hover,html.theme--documenter-dark h1 .docs-heading-anchor:visited,html.theme--documenter-dark h2 .docs-heading-anchor,html.theme--documenter-dark h2 .docs-heading-anchor:hover,html.theme--documenter-dark h2 .docs-heading-anchor:visited,html.theme--documenter-dark h3 .docs-heading-anchor,html.theme--documenter-dark h3 .docs-heading-anchor:hover,html.theme--documenter-dark h3 .docs-heading-anchor:visited,html.theme--documenter-dark h4 .docs-heading-anchor,html.theme--documenter-dark h4 .docs-heading-anchor:hover,html.theme--documenter-dark h4 .docs-heading-anchor:visited,html.theme--documenter-dark h5 .docs-heading-anchor,html.theme--documenter-dark h5 .docs-heading-anchor:hover,html.theme--documenter-dark h5 .docs-heading-anchor:visited,html.theme--documenter-dark h6 .docs-heading-anchor,html.theme--documenter-dark h6 .docs-heading-anchor:hover,html.theme--documenter-dark h6 .docs-heading-anchor:visited{color:#f2f2f2}html.theme--documenter-dark h1 .docs-heading-anchor-permalink,html.theme--documenter-dark h2 .docs-heading-anchor-permalink,html.theme--documenter-dark h3 .docs-heading-anchor-permalink,html.theme--documenter-dark h4 
.docs-heading-anchor-permalink,html.theme--documenter-dark h5 .docs-heading-anchor-permalink,html.theme--documenter-dark h6 .docs-heading-anchor-permalink{visibility:hidden;vertical-align:middle;margin-left:0.5em;font-size:0.7rem}html.theme--documenter-dark h1 .docs-heading-anchor-permalink::before,html.theme--documenter-dark h2 .docs-heading-anchor-permalink::before,html.theme--documenter-dark h3 .docs-heading-anchor-permalink::before,html.theme--documenter-dark h4 .docs-heading-anchor-permalink::before,html.theme--documenter-dark h5 .docs-heading-anchor-permalink::before,html.theme--documenter-dark h6 .docs-heading-anchor-permalink::before{font-family:"Font Awesome 5 Free";font-weight:900;content:"\f0c1"}html.theme--documenter-dark h1:hover .docs-heading-anchor-permalink,html.theme--documenter-dark h2:hover .docs-heading-anchor-permalink,html.theme--documenter-dark h3:hover .docs-heading-anchor-permalink,html.theme--documenter-dark h4:hover .docs-heading-anchor-permalink,html.theme--documenter-dark h5:hover .docs-heading-anchor-permalink,html.theme--documenter-dark h6:hover .docs-heading-anchor-permalink{visibility:visible}html.theme--documenter-dark .docs-light-only{display:none !important}html.theme--documenter-dark pre{position:relative;overflow:hidden}html.theme--documenter-dark pre code,html.theme--documenter-dark pre code.hljs{padding:0 .75rem !important;overflow:auto;display:block}html.theme--documenter-dark pre code:first-of-type,html.theme--documenter-dark pre code.hljs:first-of-type{padding-top:0.5rem !important}html.theme--documenter-dark pre code:last-of-type,html.theme--documenter-dark pre code.hljs:last-of-type{padding-bottom:0.5rem !important}html.theme--documenter-dark pre .copy-button{opacity:0.2;transition:opacity 0.2s;position:absolute;right:0em;top:0em;padding:0.5em;width:2.5em;height:2.5em;background:transparent;border:none;font-family:"Font Awesome 5 Free";color:#fff;cursor:pointer;text-align:center}html.theme--documenter-dark pre 
.copy-button:focus,html.theme--documenter-dark pre .copy-button:hover{opacity:1;background:rgba(255,255,255,0.1);color:#1abc9c}html.theme--documenter-dark pre .copy-button.success{color:#259a12;opacity:1}html.theme--documenter-dark pre .copy-button.error{color:#cb3c33;opacity:1}html.theme--documenter-dark pre:hover .copy-button{opacity:1}html.theme--documenter-dark .admonition{background-color:#282f2f;border-style:solid;border-width:1px;border-color:#5e6d6f;border-radius:.4em;font-size:15px}html.theme--documenter-dark .admonition strong{color:currentColor}html.theme--documenter-dark .admonition.is-small,html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input.admonition{font-size:.85em}html.theme--documenter-dark .admonition.is-medium{font-size:1.25rem}html.theme--documenter-dark .admonition.is-large{font-size:1.5rem}html.theme--documenter-dark .admonition.is-default{background-color:#282f2f;border-color:#5e6d6f}html.theme--documenter-dark .admonition.is-default>.admonition-header{background-color:#5e6d6f;color:#fff}html.theme--documenter-dark .admonition.is-default>.admonition-body{color:#fff}html.theme--documenter-dark .admonition.is-info{background-color:#282f2f;border-color:#024c7d}html.theme--documenter-dark .admonition.is-info>.admonition-header{background-color:#024c7d;color:#fff}html.theme--documenter-dark .admonition.is-info>.admonition-body{color:#fff}html.theme--documenter-dark .admonition.is-success{background-color:#282f2f;border-color:#008438}html.theme--documenter-dark .admonition.is-success>.admonition-header{background-color:#008438;color:#fff}html.theme--documenter-dark .admonition.is-success>.admonition-body{color:#fff}html.theme--documenter-dark .admonition.is-warning{background-color:#282f2f;border-color:#ad8100}html.theme--documenter-dark .admonition.is-warning>.admonition-header{background-color:#ad8100;color:#fff}html.theme--documenter-dark .admonition.is-warning>.admonition-body{color:#fff}html.theme--documenter-dark 
.admonition.is-danger{background-color:#282f2f;border-color:#9e1b0d}html.theme--documenter-dark .admonition.is-danger>.admonition-header{background-color:#9e1b0d;color:#fff}html.theme--documenter-dark .admonition.is-danger>.admonition-body{color:#fff}html.theme--documenter-dark .admonition.is-compat{background-color:#282f2f;border-color:#137886}html.theme--documenter-dark .admonition.is-compat>.admonition-header{background-color:#137886;color:#fff}html.theme--documenter-dark .admonition.is-compat>.admonition-body{color:#fff}html.theme--documenter-dark .admonition-header{color:#fff;background-color:#5e6d6f;align-items:center;font-weight:700;justify-content:space-between;line-height:1.25;padding:0.5rem .75rem;position:relative}html.theme--documenter-dark .admonition-header:before{font-family:"Font Awesome 5 Free";font-weight:900;margin-right:.75rem;content:"\f06a"}html.theme--documenter-dark .admonition-body{color:#fff;padding:0.5rem .75rem}html.theme--documenter-dark .admonition-body pre{background-color:#282f2f}html.theme--documenter-dark .admonition-body code{background-color:rgba(255,255,255,0.05)}html.theme--documenter-dark .docstring{margin-bottom:1em;background-color:rgba(0,0,0,0);border:1px solid #5e6d6f;box-shadow:none;max-width:100%}html.theme--documenter-dark .docstring>header{display:flex;flex-grow:1;align-items:stretch;padding:0.5rem .75rem;background-color:#282f2f;box-shadow:0 1px 2px rgba(10,10,10,0.1);box-shadow:none;border-bottom:1px solid #5e6d6f}html.theme--documenter-dark .docstring>header code{background-color:transparent}html.theme--documenter-dark .docstring>header .docstring-binding{margin-right:0.3em}html.theme--documenter-dark .docstring>header .docstring-category{margin-left:0.3em}html.theme--documenter-dark .docstring>section{position:relative;padding:.75rem .75rem;border-bottom:1px solid #5e6d6f}html.theme--documenter-dark .docstring>section:last-child{border-bottom:none}html.theme--documenter-dark 
.docstring>section>a.docs-sourcelink{transition:opacity 0.3s;opacity:0;position:absolute;right:.375rem;bottom:.375rem}html.theme--documenter-dark .docstring>section>a.docs-sourcelink:focus{opacity:1 !important}html.theme--documenter-dark .docstring:hover>section>a.docs-sourcelink{opacity:0.2}html.theme--documenter-dark .docstring:focus-within>section>a.docs-sourcelink{opacity:0.2}html.theme--documenter-dark .docstring>section:hover a.docs-sourcelink{opacity:1}html.theme--documenter-dark .documenter-example-output{background-color:#1f2424}html.theme--documenter-dark .outdated-warning-overlay{position:fixed;top:0;left:0;right:0;box-shadow:0 0 10px rgba(0,0,0,0.3);z-index:999;background-color:#282f2f;color:#fff;border-bottom:3px solid #9e1b0d;padding:10px 35px;text-align:center;font-size:15px}html.theme--documenter-dark .outdated-warning-overlay .outdated-warning-closer{position:absolute;top:calc(50% - 10px);right:18px;cursor:pointer;width:12px}html.theme--documenter-dark .outdated-warning-overlay a{color:#1abc9c}html.theme--documenter-dark .outdated-warning-overlay a:hover{color:#1dd2af}html.theme--documenter-dark .content pre{border:1px solid #5e6d6f}html.theme--documenter-dark .content code{font-weight:inherit}html.theme--documenter-dark .content a code{color:#1abc9c}html.theme--documenter-dark .content h1 code,html.theme--documenter-dark .content h2 code,html.theme--documenter-dark .content h3 code,html.theme--documenter-dark .content h4 code,html.theme--documenter-dark .content h5 code,html.theme--documenter-dark .content h6 code{color:#f2f2f2}html.theme--documenter-dark .content table{display:block;width:initial;max-width:100%;overflow-x:auto}html.theme--documenter-dark .content blockquote>ul:first-child,html.theme--documenter-dark .content blockquote>ol:first-child,html.theme--documenter-dark .content .admonition-body>ul:first-child,html.theme--documenter-dark .content .admonition-body>ol:first-child{margin-top:0}html.theme--documenter-dark 
pre,html.theme--documenter-dark code{font-variant-ligatures:no-contextual}html.theme--documenter-dark .breadcrumb a.is-disabled{cursor:default;pointer-events:none}html.theme--documenter-dark .breadcrumb a.is-disabled,html.theme--documenter-dark .breadcrumb a.is-disabled:hover{color:#f2f2f2}html.theme--documenter-dark .hljs{background:initial !important}html.theme--documenter-dark .katex .katex-mathml{top:0;right:0}html.theme--documenter-dark .katex-display,html.theme--documenter-dark mjx-container,html.theme--documenter-dark .MathJax_Display{margin:0.5em 0 !important}html.theme--documenter-dark html{-moz-osx-font-smoothing:auto;-webkit-font-smoothing:auto}html.theme--documenter-dark li.no-marker{list-style:none}html.theme--documenter-dark #documenter .docs-main>article{overflow-wrap:break-word}html.theme--documenter-dark #documenter .docs-main>article .math-container{overflow-x:auto;overflow-y:hidden}@media screen and (min-width: 1056px){html.theme--documenter-dark #documenter .docs-main{max-width:52rem;margin-left:20rem;padding-right:1rem}}@media screen and (max-width: 1055px){html.theme--documenter-dark #documenter .docs-main{width:100%}html.theme--documenter-dark #documenter .docs-main>article{max-width:52rem;margin-left:auto;margin-right:auto;margin-bottom:1rem;padding:0 1rem}html.theme--documenter-dark #documenter .docs-main>header,html.theme--documenter-dark #documenter .docs-main>nav{max-width:100%;width:100%;margin:0}}html.theme--documenter-dark #documenter .docs-main header.docs-navbar{background-color:#1f2424;border-bottom:1px solid #5e6d6f;z-index:2;min-height:4rem;margin-bottom:1rem;display:flex}html.theme--documenter-dark #documenter .docs-main header.docs-navbar .breadcrumb{flex-grow:1}html.theme--documenter-dark #documenter .docs-main header.docs-navbar .docs-right{display:flex;white-space:nowrap}html.theme--documenter-dark #documenter .docs-main header.docs-navbar .docs-right .docs-icon,html.theme--documenter-dark #documenter .docs-main 
header.docs-navbar .docs-right .docs-label,html.theme--documenter-dark #documenter .docs-main header.docs-navbar .docs-right .docs-sidebar-button{display:inline-block}html.theme--documenter-dark #documenter .docs-main header.docs-navbar .docs-right .docs-label{padding:0;margin-left:0.3em}html.theme--documenter-dark #documenter .docs-main header.docs-navbar .docs-right .docs-settings-button{margin:auto 0 auto 1rem}html.theme--documenter-dark #documenter .docs-main header.docs-navbar .docs-right .docs-sidebar-button{font-size:1.5rem;margin:auto 0 auto 1rem}html.theme--documenter-dark #documenter .docs-main header.docs-navbar>*{margin:auto 0}@media screen and (max-width: 1055px){html.theme--documenter-dark #documenter .docs-main header.docs-navbar{position:sticky;top:0;padding:0 1rem;transition-property:top, box-shadow;-webkit-transition-property:top, box-shadow;transition-duration:0.3s;-webkit-transition-duration:0.3s}html.theme--documenter-dark #documenter .docs-main header.docs-navbar.headroom--not-top{box-shadow:.2rem 0rem .4rem #171717;transition-duration:0.7s;-webkit-transition-duration:0.7s}html.theme--documenter-dark #documenter .docs-main header.docs-navbar.headroom--unpinned.headroom--not-top.headroom--not-bottom{top:-4.5rem;transition-duration:0.7s;-webkit-transition-duration:0.7s}}html.theme--documenter-dark #documenter .docs-main section.footnotes{border-top:1px solid #5e6d6f}html.theme--documenter-dark #documenter .docs-main section.footnotes li .tag:first-child,html.theme--documenter-dark #documenter .docs-main section.footnotes li .docstring>section>a.docs-sourcelink:first-child,html.theme--documenter-dark #documenter .docs-main section.footnotes li .content kbd:first-child,html.theme--documenter-dark .content #documenter .docs-main section.footnotes li kbd:first-child{margin-right:1em;margin-bottom:0.4em}html.theme--documenter-dark #documenter .docs-main .docs-footer{display:flex;flex-wrap:wrap;margin-left:0;margin-right:0;border-top:1px solid 
#5e6d6f;padding-top:1rem;padding-bottom:1rem}@media screen and (max-width: 1055px){html.theme--documenter-dark #documenter .docs-main .docs-footer{padding-left:1rem;padding-right:1rem}}html.theme--documenter-dark #documenter .docs-main .docs-footer .docs-footer-nextpage,html.theme--documenter-dark #documenter .docs-main .docs-footer .docs-footer-prevpage{flex-grow:1}html.theme--documenter-dark #documenter .docs-main .docs-footer .docs-footer-nextpage{text-align:right}html.theme--documenter-dark #documenter .docs-main .docs-footer .flexbox-break{flex-basis:100%;height:0}html.theme--documenter-dark #documenter .docs-main .docs-footer .footer-message{font-size:0.8em;margin:0.5em auto 0 auto;text-align:center}html.theme--documenter-dark #documenter .docs-sidebar{display:flex;flex-direction:column;color:#fff;background-color:#282f2f;border-right:1px solid #5e6d6f;padding:0;flex:0 0 18rem;z-index:5;font-size:15px;position:fixed;left:-18rem;width:18rem;height:100%;transition:left 0.3s}html.theme--documenter-dark #documenter .docs-sidebar.visible{left:0;box-shadow:.4rem 0rem .8rem #171717}@media screen and (min-width: 1056px){html.theme--documenter-dark #documenter .docs-sidebar.visible{box-shadow:none}}@media screen and (min-width: 1056px){html.theme--documenter-dark #documenter .docs-sidebar{left:0;top:0}}html.theme--documenter-dark #documenter .docs-sidebar .docs-logo{margin-top:1rem;padding:0 1rem}html.theme--documenter-dark #documenter .docs-sidebar .docs-logo>img{max-height:6rem;margin:auto}html.theme--documenter-dark #documenter .docs-sidebar .docs-package-name{flex-shrink:0;font-size:1.5rem;font-weight:700;text-align:center;white-space:nowrap;overflow:hidden;padding:0.5rem 0}html.theme--documenter-dark #documenter .docs-sidebar .docs-package-name .docs-autofit{max-width:16.2rem}html.theme--documenter-dark #documenter .docs-sidebar .docs-package-name a,html.theme--documenter-dark #documenter .docs-sidebar .docs-package-name 
a:hover{color:#fff}html.theme--documenter-dark #documenter .docs-sidebar .docs-version-selector{border-top:1px solid #5e6d6f;display:none;padding:0.5rem}html.theme--documenter-dark #documenter .docs-sidebar .docs-version-selector.visible{display:flex}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu{flex-grow:1;user-select:none;border-top:1px solid #5e6d6f;padding-bottom:1.5rem}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu>li>.tocitem{font-weight:bold}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu>li li{font-size:14.25px;margin-left:1em;border-left:1px solid #5e6d6f}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu input.collapse-toggle{display:none}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu ul.collapsed{display:none}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu input:checked~ul.collapsed{display:block}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu label.tocitem{display:flex}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu label.tocitem .docs-label{flex-grow:2}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu label.tocitem .docs-chevron{display:inline-block;font-style:normal;font-variant:normal;text-rendering:auto;line-height:1;font-size:11.25px;margin-left:1rem;margin-top:auto;margin-bottom:auto}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu label.tocitem .docs-chevron::before{font-family:"Font Awesome 5 Free";font-weight:900;content:"\f054"}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu input:checked~label.tocitem .docs-chevron::before{content:"\f078"}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu .tocitem{display:block;padding:0.5rem 0.5rem}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu .tocitem,html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu 
.tocitem:hover{color:#fff;background:#282f2f}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu a.tocitem:hover,html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu label.tocitem:hover{color:#fff;background-color:#32393a}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu li.is-active{border-top:1px solid #5e6d6f;border-bottom:1px solid #5e6d6f;background-color:#1f2424}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu li.is-active .tocitem,html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu li.is-active .tocitem:hover{background-color:#1f2424;color:#fff}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu li.is-active ul.internal .tocitem:hover{background-color:#32393a;color:#fff}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu>li.is-active:first-child{border-top:none}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu ul.internal{margin:0 0.5rem 0.5rem;border-top:1px solid #5e6d6f}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu ul.internal li{font-size:12.75px;border-left:none;margin-left:0;margin-top:0.5rem}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu ul.internal .tocitem{width:100%;padding:0}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu ul.internal .tocitem::before{content:"⚬";margin-right:0.4em}html.theme--documenter-dark #documenter .docs-sidebar form.docs-search{margin:auto;margin-top:0.5rem;margin-bottom:0.5rem}html.theme--documenter-dark #documenter .docs-sidebar form.docs-search>input{width:14.4rem}@media screen and (min-width: 1056px){html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu{overflow-y:auto;-webkit-overflow-scroll:touch}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu::-webkit-scrollbar{width:.3rem;background:none}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu::-webkit-scrollbar-thumb{border-radius:5px 0px 0px 
5px;background:#3b4445}html.theme--documenter-dark #documenter .docs-sidebar ul.docs-menu::-webkit-scrollbar-thumb:hover{background:#4e5a5c}}@media screen and (max-width: 1055px){html.theme--documenter-dark #documenter .docs-sidebar{overflow-y:auto;-webkit-overflow-scroll:touch}html.theme--documenter-dark #documenter .docs-sidebar::-webkit-scrollbar{width:.3rem;background:none}html.theme--documenter-dark #documenter .docs-sidebar::-webkit-scrollbar-thumb{border-radius:5px 0px 0px 5px;background:#3b4445}html.theme--documenter-dark #documenter .docs-sidebar::-webkit-scrollbar-thumb:hover{background:#4e5a5c}}html.theme--documenter-dark #documenter .docs-main #documenter-search-info{margin-bottom:1rem}html.theme--documenter-dark #documenter .docs-main #documenter-search-results{list-style-type:circle;list-style-position:outside}html.theme--documenter-dark #documenter .docs-main #documenter-search-results li{margin-left:2rem}html.theme--documenter-dark #documenter .docs-main #documenter-search-results .docs-highlight{background-color:yellow}html.theme--documenter-dark{background-color:#1f2424;font-size:16px;min-width:300px;overflow-x:auto;overflow-y:scroll;text-rendering:optimizeLegibility;text-size-adjust:100%}html.theme--documenter-dark .ansi span.sgr1{font-weight:bolder}html.theme--documenter-dark .ansi span.sgr2{font-weight:lighter}html.theme--documenter-dark .ansi span.sgr3{font-style:italic}html.theme--documenter-dark .ansi span.sgr4{text-decoration:underline}html.theme--documenter-dark .ansi span.sgr7{color:#1f2424;background-color:#fff}html.theme--documenter-dark .ansi span.sgr8{color:transparent}html.theme--documenter-dark .ansi span.sgr8 span{color:transparent}html.theme--documenter-dark .ansi span.sgr9{text-decoration:line-through}html.theme--documenter-dark .ansi span.sgr30{color:#242424}html.theme--documenter-dark .ansi span.sgr31{color:#f6705f}html.theme--documenter-dark .ansi span.sgr32{color:#4fb43a}html.theme--documenter-dark .ansi 
span.sgr33{color:#f4c72f}html.theme--documenter-dark .ansi span.sgr34{color:#7587f0}html.theme--documenter-dark .ansi span.sgr35{color:#bc89d3}html.theme--documenter-dark .ansi span.sgr36{color:#49b6ca}html.theme--documenter-dark .ansi span.sgr37{color:#b3bdbe}html.theme--documenter-dark .ansi span.sgr40{background-color:#242424}html.theme--documenter-dark .ansi span.sgr41{background-color:#f6705f}html.theme--documenter-dark .ansi span.sgr42{background-color:#4fb43a}html.theme--documenter-dark .ansi span.sgr43{background-color:#f4c72f}html.theme--documenter-dark .ansi span.sgr44{background-color:#7587f0}html.theme--documenter-dark .ansi span.sgr45{background-color:#bc89d3}html.theme--documenter-dark .ansi span.sgr46{background-color:#49b6ca}html.theme--documenter-dark .ansi span.sgr47{background-color:#b3bdbe}html.theme--documenter-dark .ansi span.sgr90{color:#92a0a2}html.theme--documenter-dark .ansi span.sgr91{color:#ff8674}html.theme--documenter-dark .ansi span.sgr92{color:#79d462}html.theme--documenter-dark .ansi span.sgr93{color:#ffe76b}html.theme--documenter-dark .ansi span.sgr94{color:#8a98ff}html.theme--documenter-dark .ansi span.sgr95{color:#d2a4e6}html.theme--documenter-dark .ansi span.sgr96{color:#6bc8db}html.theme--documenter-dark .ansi span.sgr97{color:#ecf0f1}html.theme--documenter-dark .ansi span.sgr100{background-color:#92a0a2}html.theme--documenter-dark .ansi span.sgr101{background-color:#ff8674}html.theme--documenter-dark .ansi span.sgr102{background-color:#79d462}html.theme--documenter-dark .ansi span.sgr103{background-color:#ffe76b}html.theme--documenter-dark .ansi span.sgr104{background-color:#8a98ff}html.theme--documenter-dark .ansi span.sgr105{background-color:#d2a4e6}html.theme--documenter-dark .ansi span.sgr106{background-color:#6bc8db}html.theme--documenter-dark .ansi span.sgr107{background-color:#ecf0f1}html.theme--documenter-dark code.language-julia-repl>span.hljs-meta{color:#4fb43a;font-weight:bolder}html.theme--documenter-dark 
.hljs{background:#2b2b2b;color:#f8f8f2}html.theme--documenter-dark .hljs-comment,html.theme--documenter-dark .hljs-quote{color:#d4d0ab}html.theme--documenter-dark .hljs-variable,html.theme--documenter-dark .hljs-template-variable,html.theme--documenter-dark .hljs-tag,html.theme--documenter-dark .hljs-name,html.theme--documenter-dark .hljs-selector-id,html.theme--documenter-dark .hljs-selector-class,html.theme--documenter-dark .hljs-regexp,html.theme--documenter-dark .hljs-deletion{color:#ffa07a}html.theme--documenter-dark .hljs-number,html.theme--documenter-dark .hljs-built_in,html.theme--documenter-dark .hljs-literal,html.theme--documenter-dark .hljs-type,html.theme--documenter-dark .hljs-params,html.theme--documenter-dark .hljs-meta,html.theme--documenter-dark .hljs-link{color:#f5ab35}html.theme--documenter-dark .hljs-attribute{color:#ffd700}html.theme--documenter-dark .hljs-string,html.theme--documenter-dark .hljs-symbol,html.theme--documenter-dark .hljs-bullet,html.theme--documenter-dark .hljs-addition{color:#abe338}html.theme--documenter-dark .hljs-title,html.theme--documenter-dark .hljs-section{color:#00e0e0}html.theme--documenter-dark .hljs-keyword,html.theme--documenter-dark .hljs-selector-tag{color:#dcc6e0}html.theme--documenter-dark .hljs-emphasis{font-style:italic}html.theme--documenter-dark .hljs-strong{font-weight:bold}@media screen and (-ms-high-contrast: active){html.theme--documenter-dark .hljs-addition,html.theme--documenter-dark .hljs-attribute,html.theme--documenter-dark .hljs-built_in,html.theme--documenter-dark .hljs-bullet,html.theme--documenter-dark .hljs-comment,html.theme--documenter-dark .hljs-link,html.theme--documenter-dark .hljs-literal,html.theme--documenter-dark .hljs-meta,html.theme--documenter-dark .hljs-number,html.theme--documenter-dark .hljs-params,html.theme--documenter-dark .hljs-string,html.theme--documenter-dark .hljs-symbol,html.theme--documenter-dark .hljs-type,html.theme--documenter-dark 
.hljs-quote{color:highlight}html.theme--documenter-dark .hljs-keyword,html.theme--documenter-dark .hljs-selector-tag{font-weight:bold}}html.theme--documenter-dark .hljs-subst{color:#f8f8f2} diff --git a/previews/PR2365/assets/themes/documenter-light.css b/previews/PR2365/assets/themes/documenter-light.css new file mode 100644 index 0000000000..9b9a14b043 --- /dev/null +++ b/previews/PR2365/assets/themes/documenter-light.css @@ -0,0 +1,9 @@ +@keyframes spinAround{from{transform:rotate(0deg)}to{transform:rotate(359deg)}}.tabs,.pagination-previous,.pagination-next,.pagination-link,.pagination-ellipsis,.breadcrumb,.file,.button,.is-unselectable,.modal-close,.delete{-webkit-touch-callout:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.navbar-link:not(.is-arrowless)::after,.select:not(.is-multiple):not(.is-loading)::after{border:3px solid rgba(0,0,0,0);border-radius:2px;border-right:0;border-top:0;content:" ";display:block;height:0.625em;margin-top:-0.4375em;pointer-events:none;position:absolute;top:50%;transform:rotate(-45deg);transform-origin:center;width:0.625em}.admonition:not(:last-child),.tabs:not(:last-child),.message:not(:last-child),.list:not(:last-child),.level:not(:last-child),.breadcrumb:not(:last-child),.highlight:not(:last-child),.block:not(:last-child),.title:not(:last-child),.subtitle:not(:last-child),.table-container:not(:last-child),.table:not(:last-child),.progress:not(:last-child),.notification:not(:last-child),.content:not(:last-child),.box:not(:last-child){margin-bottom:1.5rem}.modal-close,.delete{-moz-appearance:none;-webkit-appearance:none;background-color:rgba(10,10,10,0.2);border:none;border-radius:290486px;cursor:pointer;pointer-events:auto;display:inline-block;flex-grow:0;flex-shrink:0;font-size:0;height:20px;max-height:20px;max-width:20px;min-height:20px;min-width:20px;outline:none;position:relative;vertical-align:top;width:20px}.modal-close::before,.delete::before,.modal-close::after,.delete::after{b
ackground-color:#fff;content:"";display:block;left:50%;position:absolute;top:50%;transform:translateX(-50%) translateY(-50%) rotate(45deg);transform-origin:center center}.modal-close::before,.delete::before{height:2px;width:50%}.modal-close::after,.delete::after{height:50%;width:2px}.modal-close:hover,.delete:hover,.modal-close:focus,.delete:focus{background-color:rgba(10,10,10,0.3)}.modal-close:active,.delete:active{background-color:rgba(10,10,10,0.4)}.is-small.modal-close,#documenter .docs-sidebar form.docs-search>input.modal-close,.is-small.delete,#documenter .docs-sidebar form.docs-search>input.delete{height:16px;max-height:16px;max-width:16px;min-height:16px;min-width:16px;width:16px}.is-medium.modal-close,.is-medium.delete{height:24px;max-height:24px;max-width:24px;min-height:24px;min-width:24px;width:24px}.is-large.modal-close,.is-large.delete{height:32px;max-height:32px;max-width:32px;min-height:32px;min-width:32px;width:32px}.control.is-loading::after,.select.is-loading::after,.loader,.button.is-loading::after{animation:spinAround 500ms infinite linear;border:2px solid #dbdbdb;border-radius:290486px;border-right-color:transparent;border-top-color:transparent;content:"";display:block;height:1em;position:relative;width:1em}.hero-video,.modal-background,.modal,.image.is-square img,#documenter .docs-sidebar .docs-logo>img.is-square img,.image.is-square .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-square .has-ratio,.image.is-1by1 img,#documenter .docs-sidebar .docs-logo>img.is-1by1 img,.image.is-1by1 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-1by1 .has-ratio,.image.is-5by4 img,#documenter .docs-sidebar .docs-logo>img.is-5by4 img,.image.is-5by4 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-5by4 .has-ratio,.image.is-4by3 img,#documenter .docs-sidebar .docs-logo>img.is-4by3 img,.image.is-4by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-4by3 .has-ratio,.image.is-3by2 img,#documenter .docs-sidebar .docs-logo>img.is-3by2 
img,.image.is-3by2 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by2 .has-ratio,.image.is-5by3 img,#documenter .docs-sidebar .docs-logo>img.is-5by3 img,.image.is-5by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-5by3 .has-ratio,.image.is-16by9 img,#documenter .docs-sidebar .docs-logo>img.is-16by9 img,.image.is-16by9 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-16by9 .has-ratio,.image.is-2by1 img,#documenter .docs-sidebar .docs-logo>img.is-2by1 img,.image.is-2by1 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-2by1 .has-ratio,.image.is-3by1 img,#documenter .docs-sidebar .docs-logo>img.is-3by1 img,.image.is-3by1 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by1 .has-ratio,.image.is-4by5 img,#documenter .docs-sidebar .docs-logo>img.is-4by5 img,.image.is-4by5 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-4by5 .has-ratio,.image.is-3by4 img,#documenter .docs-sidebar .docs-logo>img.is-3by4 img,.image.is-3by4 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by4 .has-ratio,.image.is-2by3 img,#documenter .docs-sidebar .docs-logo>img.is-2by3 img,.image.is-2by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-2by3 .has-ratio,.image.is-3by5 img,#documenter .docs-sidebar .docs-logo>img.is-3by5 img,.image.is-3by5 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by5 .has-ratio,.image.is-9by16 img,#documenter .docs-sidebar .docs-logo>img.is-9by16 img,.image.is-9by16 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-9by16 .has-ratio,.image.is-1by2 img,#documenter .docs-sidebar .docs-logo>img.is-1by2 img,.image.is-1by2 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-1by2 .has-ratio,.image.is-1by3 img,#documenter .docs-sidebar .docs-logo>img.is-1by3 img,.image.is-1by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-1by3 
.has-ratio,.is-overlay{bottom:0;left:0;position:absolute;right:0;top:0}.pagination-previous,.pagination-next,.pagination-link,.pagination-ellipsis,.file-cta,.file-name,.select select,.textarea,.input,#documenter .docs-sidebar form.docs-search>input,.button{-moz-appearance:none;-webkit-appearance:none;align-items:center;border:1px solid transparent;border-radius:4px;box-shadow:none;display:inline-flex;font-size:1rem;height:2.25em;justify-content:flex-start;line-height:1.5;padding-bottom:calc(0.375em - 1px);padding-left:calc(0.625em - 1px);padding-right:calc(0.625em - 1px);padding-top:calc(0.375em - 1px);position:relative;vertical-align:top}.pagination-previous:focus,.pagination-next:focus,.pagination-link:focus,.pagination-ellipsis:focus,.file-cta:focus,.file-name:focus,.select select:focus,.textarea:focus,.input:focus,#documenter .docs-sidebar form.docs-search>input:focus,.button:focus,.is-focused.pagination-previous,.is-focused.pagination-next,.is-focused.pagination-link,.is-focused.pagination-ellipsis,.is-focused.file-cta,.is-focused.file-name,.select select.is-focused,.is-focused.textarea,.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-focused.button,.pagination-previous:active,.pagination-next:active,.pagination-link:active,.pagination-ellipsis:active,.file-cta:active,.file-name:active,.select select:active,.textarea:active,.input:active,#documenter .docs-sidebar form.docs-search>input:active,.button:active,.is-active.pagination-previous,.is-active.pagination-next,.is-active.pagination-link,.is-active.pagination-ellipsis,.is-active.file-cta,.is-active.file-name,.select select.is-active,.is-active.textarea,.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active,.is-active.button{outline:none}.pagination-previous[disabled],.pagination-next[disabled],.pagination-link[disabled],.pagination-ellipsis[disabled],.file-cta[disabled],.file-name[disabled],.select 
select[disabled],.textarea[disabled],.input[disabled],#documenter .docs-sidebar form.docs-search>input[disabled],.button[disabled],fieldset[disabled] .pagination-previous,fieldset[disabled] .pagination-next,fieldset[disabled] .pagination-link,fieldset[disabled] .pagination-ellipsis,fieldset[disabled] .file-cta,fieldset[disabled] .file-name,fieldset[disabled] .select select,.select fieldset[disabled] select,fieldset[disabled] .textarea,fieldset[disabled] .input,fieldset[disabled] #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar fieldset[disabled] form.docs-search>input,fieldset[disabled] .button{cursor:not-allowed}/*! minireset.css v0.0.4 | MIT License | github.com/jgthms/minireset.css */html,body,p,ol,ul,li,dl,dt,dd,blockquote,figure,fieldset,legend,textarea,pre,iframe,hr,h1,h2,h3,h4,h5,h6{margin:0;padding:0}h1,h2,h3,h4,h5,h6{font-size:100%;font-weight:normal}ul{list-style:none}button,input,select,textarea{margin:0}html{box-sizing:border-box}*,*::before,*::after{box-sizing:inherit}img,embed,iframe,object,video{height:auto;max-width:100%}audio{max-width:100%}iframe{border:0}table{border-collapse:collapse;border-spacing:0}td,th{padding:0}td:not([align]),th:not([align]){text-align:left}html{background-color:#fff;font-size:16px;-moz-osx-font-smoothing:grayscale;-webkit-font-smoothing:antialiased;min-width:300px;overflow-x:auto;overflow-y:scroll;text-rendering:optimizeLegibility;text-size-adjust:100%}article,aside,figure,footer,header,hgroup,section{display:block}body,button,input,select,textarea{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif}code,pre{-moz-osx-font-smoothing:auto;-webkit-font-smoothing:auto;font-family:"JuliaMono","SFMono-Regular","Menlo","Consolas","Liberation Mono","DejaVu Sans Mono",monospace}body{color:#222;font-size:1em;font-weight:400;line-height:1.5}a{color:#2e63b8;cursor:pointer;text-decoration:none}a 
strong{color:currentColor}a:hover{color:#363636}code{background-color:rgba(0,0,0,0.05);color:#000;font-size:.875em;font-weight:normal;padding:.1em}hr{background-color:#f5f5f5;border:none;display:block;height:2px;margin:1.5rem 0}img{height:auto;max-width:100%}input[type="checkbox"],input[type="radio"]{vertical-align:baseline}small{font-size:.875em}span{font-style:inherit;font-weight:inherit}strong{color:#222;font-weight:700}fieldset{border:none}pre{-webkit-overflow-scrolling:touch;background-color:#f5f5f5;color:#222;font-size:.875em;overflow-x:auto;padding:1.25rem 1.5rem;white-space:pre;word-wrap:normal}pre code{background-color:transparent;color:currentColor;font-size:1em;padding:0}table td,table th{vertical-align:top}table td:not([align]),table th:not([align]){text-align:left}table th{color:#222}.is-clearfix::after{clear:both;content:" ";display:table}.is-pulled-left{float:left !important}.is-pulled-right{float:right !important}.is-clipped{overflow:hidden !important}.is-size-1{font-size:3rem !important}.is-size-2{font-size:2.5rem !important}.is-size-3{font-size:2rem !important}.is-size-4{font-size:1.5rem !important}.is-size-5{font-size:1.25rem !important}.is-size-6{font-size:1rem !important}.is-size-7,.docstring>section>a.docs-sourcelink{font-size:.75rem !important}@media screen and (max-width: 768px){.is-size-1-mobile{font-size:3rem !important}.is-size-2-mobile{font-size:2.5rem !important}.is-size-3-mobile{font-size:2rem !important}.is-size-4-mobile{font-size:1.5rem !important}.is-size-5-mobile{font-size:1.25rem !important}.is-size-6-mobile{font-size:1rem !important}.is-size-7-mobile{font-size:.75rem !important}}@media screen and (min-width: 769px),print{.is-size-1-tablet{font-size:3rem !important}.is-size-2-tablet{font-size:2.5rem !important}.is-size-3-tablet{font-size:2rem !important}.is-size-4-tablet{font-size:1.5rem !important}.is-size-5-tablet{font-size:1.25rem !important}.is-size-6-tablet{font-size:1rem !important}.is-size-7-tablet{font-size:.75rem 
!important}}@media screen and (max-width: 1055px){.is-size-1-touch{font-size:3rem !important}.is-size-2-touch{font-size:2.5rem !important}.is-size-3-touch{font-size:2rem !important}.is-size-4-touch{font-size:1.5rem !important}.is-size-5-touch{font-size:1.25rem !important}.is-size-6-touch{font-size:1rem !important}.is-size-7-touch{font-size:.75rem !important}}@media screen and (min-width: 1056px){.is-size-1-desktop{font-size:3rem !important}.is-size-2-desktop{font-size:2.5rem !important}.is-size-3-desktop{font-size:2rem !important}.is-size-4-desktop{font-size:1.5rem !important}.is-size-5-desktop{font-size:1.25rem !important}.is-size-6-desktop{font-size:1rem !important}.is-size-7-desktop{font-size:.75rem !important}}@media screen and (min-width: 1216px){.is-size-1-widescreen{font-size:3rem !important}.is-size-2-widescreen{font-size:2.5rem !important}.is-size-3-widescreen{font-size:2rem !important}.is-size-4-widescreen{font-size:1.5rem !important}.is-size-5-widescreen{font-size:1.25rem !important}.is-size-6-widescreen{font-size:1rem !important}.is-size-7-widescreen{font-size:.75rem !important}}@media screen and (min-width: 1408px){.is-size-1-fullhd{font-size:3rem !important}.is-size-2-fullhd{font-size:2.5rem !important}.is-size-3-fullhd{font-size:2rem !important}.is-size-4-fullhd{font-size:1.5rem !important}.is-size-5-fullhd{font-size:1.25rem !important}.is-size-6-fullhd{font-size:1rem !important}.is-size-7-fullhd{font-size:.75rem !important}}.has-text-centered{text-align:center !important}.has-text-justified{text-align:justify !important}.has-text-left{text-align:left !important}.has-text-right{text-align:right !important}@media screen and (max-width: 768px){.has-text-centered-mobile{text-align:center !important}}@media screen and (min-width: 769px),print{.has-text-centered-tablet{text-align:center !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-centered-tablet-only{text-align:center !important}}@media screen and (max-width: 
1055px){.has-text-centered-touch{text-align:center !important}}@media screen and (min-width: 1056px){.has-text-centered-desktop{text-align:center !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-centered-desktop-only{text-align:center !important}}@media screen and (min-width: 1216px){.has-text-centered-widescreen{text-align:center !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-centered-widescreen-only{text-align:center !important}}@media screen and (min-width: 1408px){.has-text-centered-fullhd{text-align:center !important}}@media screen and (max-width: 768px){.has-text-justified-mobile{text-align:justify !important}}@media screen and (min-width: 769px),print{.has-text-justified-tablet{text-align:justify !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-justified-tablet-only{text-align:justify !important}}@media screen and (max-width: 1055px){.has-text-justified-touch{text-align:justify !important}}@media screen and (min-width: 1056px){.has-text-justified-desktop{text-align:justify !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-justified-desktop-only{text-align:justify !important}}@media screen and (min-width: 1216px){.has-text-justified-widescreen{text-align:justify !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-justified-widescreen-only{text-align:justify !important}}@media screen and (min-width: 1408px){.has-text-justified-fullhd{text-align:justify !important}}@media screen and (max-width: 768px){.has-text-left-mobile{text-align:left !important}}@media screen and (min-width: 769px),print{.has-text-left-tablet{text-align:left !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-left-tablet-only{text-align:left !important}}@media screen and (max-width: 1055px){.has-text-left-touch{text-align:left !important}}@media screen and (min-width: 
1056px){.has-text-left-desktop{text-align:left !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-left-desktop-only{text-align:left !important}}@media screen and (min-width: 1216px){.has-text-left-widescreen{text-align:left !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-left-widescreen-only{text-align:left !important}}@media screen and (min-width: 1408px){.has-text-left-fullhd{text-align:left !important}}@media screen and (max-width: 768px){.has-text-right-mobile{text-align:right !important}}@media screen and (min-width: 769px),print{.has-text-right-tablet{text-align:right !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.has-text-right-tablet-only{text-align:right !important}}@media screen and (max-width: 1055px){.has-text-right-touch{text-align:right !important}}@media screen and (min-width: 1056px){.has-text-right-desktop{text-align:right !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.has-text-right-desktop-only{text-align:right !important}}@media screen and (min-width: 1216px){.has-text-right-widescreen{text-align:right !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.has-text-right-widescreen-only{text-align:right !important}}@media screen and (min-width: 1408px){.has-text-right-fullhd{text-align:right !important}}.is-capitalized{text-transform:capitalize !important}.is-lowercase{text-transform:lowercase !important}.is-uppercase{text-transform:uppercase !important}.is-italic{font-style:italic !important}.has-text-white{color:#fff !important}a.has-text-white:hover,a.has-text-white:focus{color:#e6e6e6 !important}.has-background-white{background-color:#fff !important}.has-text-black{color:#0a0a0a !important}a.has-text-black:hover,a.has-text-black:focus{color:#000 !important}.has-background-black{background-color:#0a0a0a !important}.has-text-light{color:#f5f5f5 
!important}a.has-text-light:hover,a.has-text-light:focus{color:#dbdbdb !important}.has-background-light{background-color:#f5f5f5 !important}.has-text-dark{color:#363636 !important}a.has-text-dark:hover,a.has-text-dark:focus{color:#1c1c1c !important}.has-background-dark{background-color:#363636 !important}.has-text-primary{color:#4eb5de !important}a.has-text-primary:hover,a.has-text-primary:focus{color:#27a1d2 !important}.has-background-primary{background-color:#4eb5de !important}.has-text-link{color:#2e63b8 !important}a.has-text-link:hover,a.has-text-link:focus{color:#244d8f !important}.has-background-link{background-color:#2e63b8 !important}.has-text-info{color:#209cee !important}a.has-text-info:hover,a.has-text-info:focus{color:#1081cb !important}.has-background-info{background-color:#209cee !important}.has-text-success{color:#22c35b !important}a.has-text-success:hover,a.has-text-success:focus{color:#1a9847 !important}.has-background-success{background-color:#22c35b !important}.has-text-warning{color:#ffdd57 !important}a.has-text-warning:hover,a.has-text-warning:focus{color:#ffd324 !important}.has-background-warning{background-color:#ffdd57 !important}.has-text-danger{color:#da0b00 !important}a.has-text-danger:hover,a.has-text-danger:focus{color:#a70800 !important}.has-background-danger{background-color:#da0b00 !important}.has-text-black-bis{color:#121212 !important}.has-background-black-bis{background-color:#121212 !important}.has-text-black-ter{color:#242424 !important}.has-background-black-ter{background-color:#242424 !important}.has-text-grey-darker{color:#363636 !important}.has-background-grey-darker{background-color:#363636 !important}.has-text-grey-dark{color:#4a4a4a !important}.has-background-grey-dark{background-color:#4a4a4a !important}.has-text-grey{color:#6b6b6b !important}.has-background-grey{background-color:#6b6b6b !important}.has-text-grey-light{color:#b5b5b5 !important}.has-background-grey-light{background-color:#b5b5b5 
!important}.has-text-grey-lighter{color:#dbdbdb !important}.has-background-grey-lighter{background-color:#dbdbdb !important}.has-text-white-ter{color:#f5f5f5 !important}.has-background-white-ter{background-color:#f5f5f5 !important}.has-text-white-bis{color:#fafafa !important}.has-background-white-bis{background-color:#fafafa !important}.has-text-weight-light{font-weight:300 !important}.has-text-weight-normal{font-weight:400 !important}.has-text-weight-medium{font-weight:500 !important}.has-text-weight-semibold{font-weight:600 !important}.has-text-weight-bold{font-weight:700 !important}.is-family-primary{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif !important}.is-family-secondary{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif !important}.is-family-sans-serif{font-family:"Lato Medium",-apple-system,BlinkMacSystemFont,"Segoe UI","Helvetica Neue","Helvetica","Arial",sans-serif !important}.is-family-monospace{font-family:"JuliaMono","SFMono-Regular","Menlo","Consolas","Liberation Mono","DejaVu Sans Mono",monospace !important}.is-family-code{font-family:"JuliaMono","SFMono-Regular","Menlo","Consolas","Liberation Mono","DejaVu Sans Mono",monospace !important}.is-block{display:block !important}@media screen and (max-width: 768px){.is-block-mobile{display:block !important}}@media screen and (min-width: 769px),print{.is-block-tablet{display:block !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-block-tablet-only{display:block !important}}@media screen and (max-width: 1055px){.is-block-touch{display:block !important}}@media screen and (min-width: 1056px){.is-block-desktop{display:block !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-block-desktop-only{display:block !important}}@media screen and (min-width: 1216px){.is-block-widescreen{display:block !important}}@media screen and 
(min-width: 1216px) and (max-width: 1407px){.is-block-widescreen-only{display:block !important}}@media screen and (min-width: 1408px){.is-block-fullhd{display:block !important}}.is-flex{display:flex !important}@media screen and (max-width: 768px){.is-flex-mobile{display:flex !important}}@media screen and (min-width: 769px),print{.is-flex-tablet{display:flex !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-flex-tablet-only{display:flex !important}}@media screen and (max-width: 1055px){.is-flex-touch{display:flex !important}}@media screen and (min-width: 1056px){.is-flex-desktop{display:flex !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-flex-desktop-only{display:flex !important}}@media screen and (min-width: 1216px){.is-flex-widescreen{display:flex !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-flex-widescreen-only{display:flex !important}}@media screen and (min-width: 1408px){.is-flex-fullhd{display:flex !important}}.is-inline{display:inline !important}@media screen and (max-width: 768px){.is-inline-mobile{display:inline !important}}@media screen and (min-width: 769px),print{.is-inline-tablet{display:inline !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-inline-tablet-only{display:inline !important}}@media screen and (max-width: 1055px){.is-inline-touch{display:inline !important}}@media screen and (min-width: 1056px){.is-inline-desktop{display:inline !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-inline-desktop-only{display:inline !important}}@media screen and (min-width: 1216px){.is-inline-widescreen{display:inline !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-inline-widescreen-only{display:inline !important}}@media screen and (min-width: 1408px){.is-inline-fullhd{display:inline !important}}.is-inline-block{display:inline-block !important}@media screen and (max-width: 
768px){.is-inline-block-mobile{display:inline-block !important}}@media screen and (min-width: 769px),print{.is-inline-block-tablet{display:inline-block !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-inline-block-tablet-only{display:inline-block !important}}@media screen and (max-width: 1055px){.is-inline-block-touch{display:inline-block !important}}@media screen and (min-width: 1056px){.is-inline-block-desktop{display:inline-block !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-inline-block-desktop-only{display:inline-block !important}}@media screen and (min-width: 1216px){.is-inline-block-widescreen{display:inline-block !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-inline-block-widescreen-only{display:inline-block !important}}@media screen and (min-width: 1408px){.is-inline-block-fullhd{display:inline-block !important}}.is-inline-flex{display:inline-flex !important}@media screen and (max-width: 768px){.is-inline-flex-mobile{display:inline-flex !important}}@media screen and (min-width: 769px),print{.is-inline-flex-tablet{display:inline-flex !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-inline-flex-tablet-only{display:inline-flex !important}}@media screen and (max-width: 1055px){.is-inline-flex-touch{display:inline-flex !important}}@media screen and (min-width: 1056px){.is-inline-flex-desktop{display:inline-flex !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-inline-flex-desktop-only{display:inline-flex !important}}@media screen and (min-width: 1216px){.is-inline-flex-widescreen{display:inline-flex !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-inline-flex-widescreen-only{display:inline-flex !important}}@media screen and (min-width: 1408px){.is-inline-flex-fullhd{display:inline-flex !important}}.is-hidden{display:none !important}.is-sr-only{border:none !important;clip:rect(0, 0, 0, 0) 
!important;height:0.01em !important;overflow:hidden !important;padding:0 !important;position:absolute !important;white-space:nowrap !important;width:0.01em !important}@media screen and (max-width: 768px){.is-hidden-mobile{display:none !important}}@media screen and (min-width: 769px),print{.is-hidden-tablet{display:none !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-hidden-tablet-only{display:none !important}}@media screen and (max-width: 1055px){.is-hidden-touch{display:none !important}}@media screen and (min-width: 1056px){.is-hidden-desktop{display:none !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-hidden-desktop-only{display:none !important}}@media screen and (min-width: 1216px){.is-hidden-widescreen{display:none !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-hidden-widescreen-only{display:none !important}}@media screen and (min-width: 1408px){.is-hidden-fullhd{display:none !important}}.is-invisible{visibility:hidden !important}@media screen and (max-width: 768px){.is-invisible-mobile{visibility:hidden !important}}@media screen and (min-width: 769px),print{.is-invisible-tablet{visibility:hidden !important}}@media screen and (min-width: 769px) and (max-width: 1055px){.is-invisible-tablet-only{visibility:hidden !important}}@media screen and (max-width: 1055px){.is-invisible-touch{visibility:hidden !important}}@media screen and (min-width: 1056px){.is-invisible-desktop{visibility:hidden !important}}@media screen and (min-width: 1056px) and (max-width: 1215px){.is-invisible-desktop-only{visibility:hidden !important}}@media screen and (min-width: 1216px){.is-invisible-widescreen{visibility:hidden !important}}@media screen and (min-width: 1216px) and (max-width: 1407px){.is-invisible-widescreen-only{visibility:hidden !important}}@media screen and (min-width: 1408px){.is-invisible-fullhd{visibility:hidden !important}}.is-marginless{margin:0 !important}.is-paddingless{padding:0 
!important}.is-radiusless{border-radius:0 !important}.is-shadowless{box-shadow:none !important}.is-relative{position:relative !important}.box{background-color:#fff;border-radius:6px;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px rgba(10,10,10,0.1);color:#222;display:block;padding:1.25rem}a.box:hover,a.box:focus{box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px #2e63b8}a.box:active{box-shadow:inset 0 1px 2px rgba(10,10,10,0.2),0 0 0 1px #2e63b8}.button{background-color:#fff;border-color:#dbdbdb;border-width:1px;color:#363636;cursor:pointer;justify-content:center;padding-bottom:calc(0.375em - 1px);padding-left:.75em;padding-right:.75em;padding-top:calc(0.375em - 1px);text-align:center;white-space:nowrap}.button strong{color:inherit}.button .icon,.button .icon.is-small,.button #documenter .docs-sidebar form.docs-search>input.icon,#documenter .docs-sidebar .button form.docs-search>input.icon,.button .icon.is-medium,.button .icon.is-large{height:1.5em;width:1.5em}.button .icon:first-child:not(:last-child){margin-left:calc(-0.375em - 1px);margin-right:0.1875em}.button .icon:last-child:not(:first-child){margin-left:0.1875em;margin-right:calc(-0.375em - 1px)}.button .icon:first-child:last-child{margin-left:calc(-0.375em - 1px);margin-right:calc(-0.375em - 1px)}.button:hover,.button.is-hovered{border-color:#b5b5b5;color:#363636}.button:focus,.button.is-focused{border-color:#3c5dcd;color:#363636}.button:focus:not(:active),.button.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(46,99,184,0.25)}.button:active,.button.is-active{border-color:#4a4a4a;color:#363636}.button.is-text{background-color:transparent;border-color:transparent;color:#222;text-decoration:underline}.button.is-text:hover,.button.is-text.is-hovered,.button.is-text:focus,.button.is-text.is-focused{background-color:#f5f5f5;color:#222}.button.is-text:active,.button.is-text.is-active{background-color:#e8e8e8;color:#222}.button.is-text[disabled],fieldset[disabled] 
.button.is-text{background-color:transparent;border-color:transparent;box-shadow:none}.button.is-white{background-color:#fff;border-color:transparent;color:#0a0a0a}.button.is-white:hover,.button.is-white.is-hovered{background-color:#f9f9f9;border-color:transparent;color:#0a0a0a}.button.is-white:focus,.button.is-white.is-focused{border-color:transparent;color:#0a0a0a}.button.is-white:focus:not(:active),.button.is-white.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(255,255,255,0.25)}.button.is-white:active,.button.is-white.is-active{background-color:#f2f2f2;border-color:transparent;color:#0a0a0a}.button.is-white[disabled],fieldset[disabled] .button.is-white{background-color:#fff;border-color:transparent;box-shadow:none}.button.is-white.is-inverted{background-color:#0a0a0a;color:#fff}.button.is-white.is-inverted:hover,.button.is-white.is-inverted.is-hovered{background-color:#000}.button.is-white.is-inverted[disabled],fieldset[disabled] .button.is-white.is-inverted{background-color:#0a0a0a;border-color:transparent;box-shadow:none;color:#fff}.button.is-white.is-loading::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}.button.is-white.is-outlined{background-color:transparent;border-color:#fff;color:#fff}.button.is-white.is-outlined:hover,.button.is-white.is-outlined.is-hovered,.button.is-white.is-outlined:focus,.button.is-white.is-outlined.is-focused{background-color:#fff;border-color:#fff;color:#0a0a0a}.button.is-white.is-outlined.is-loading::after{border-color:transparent transparent #fff #fff !important}.button.is-white.is-outlined.is-loading:hover::after,.button.is-white.is-outlined.is-loading.is-hovered::after,.button.is-white.is-outlined.is-loading:focus::after,.button.is-white.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}.button.is-white.is-outlined[disabled],fieldset[disabled] 
.button.is-white.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-white.is-inverted.is-outlined{background-color:transparent;border-color:#0a0a0a;color:#0a0a0a}.button.is-white.is-inverted.is-outlined:hover,.button.is-white.is-inverted.is-outlined.is-hovered,.button.is-white.is-inverted.is-outlined:focus,.button.is-white.is-inverted.is-outlined.is-focused{background-color:#0a0a0a;color:#fff}.button.is-white.is-inverted.is-outlined.is-loading:hover::after,.button.is-white.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-white.is-inverted.is-outlined.is-loading:focus::after,.button.is-white.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}.button.is-white.is-inverted.is-outlined[disabled],fieldset[disabled] .button.is-white.is-inverted.is-outlined{background-color:transparent;border-color:#0a0a0a;box-shadow:none;color:#0a0a0a}.button.is-black{background-color:#0a0a0a;border-color:transparent;color:#fff}.button.is-black:hover,.button.is-black.is-hovered{background-color:#040404;border-color:transparent;color:#fff}.button.is-black:focus,.button.is-black.is-focused{border-color:transparent;color:#fff}.button.is-black:focus:not(:active),.button.is-black.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(10,10,10,0.25)}.button.is-black:active,.button.is-black.is-active{background-color:#000;border-color:transparent;color:#fff}.button.is-black[disabled],fieldset[disabled] .button.is-black{background-color:#0a0a0a;border-color:transparent;box-shadow:none}.button.is-black.is-inverted{background-color:#fff;color:#0a0a0a}.button.is-black.is-inverted:hover,.button.is-black.is-inverted.is-hovered{background-color:#f2f2f2}.button.is-black.is-inverted[disabled],fieldset[disabled] .button.is-black.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#0a0a0a}.button.is-black.is-loading::after{border-color:transparent transparent 
#fff #fff !important}.button.is-black.is-outlined{background-color:transparent;border-color:#0a0a0a;color:#0a0a0a}.button.is-black.is-outlined:hover,.button.is-black.is-outlined.is-hovered,.button.is-black.is-outlined:focus,.button.is-black.is-outlined.is-focused{background-color:#0a0a0a;border-color:#0a0a0a;color:#fff}.button.is-black.is-outlined.is-loading::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}.button.is-black.is-outlined.is-loading:hover::after,.button.is-black.is-outlined.is-loading.is-hovered::after,.button.is-black.is-outlined.is-loading:focus::after,.button.is-black.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}.button.is-black.is-outlined[disabled],fieldset[disabled] .button.is-black.is-outlined{background-color:transparent;border-color:#0a0a0a;box-shadow:none;color:#0a0a0a}.button.is-black.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}.button.is-black.is-inverted.is-outlined:hover,.button.is-black.is-inverted.is-outlined.is-hovered,.button.is-black.is-inverted.is-outlined:focus,.button.is-black.is-inverted.is-outlined.is-focused{background-color:#fff;color:#0a0a0a}.button.is-black.is-inverted.is-outlined.is-loading:hover::after,.button.is-black.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-black.is-inverted.is-outlined.is-loading:focus::after,.button.is-black.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #0a0a0a #0a0a0a !important}.button.is-black.is-inverted.is-outlined[disabled],fieldset[disabled] 
.button.is-black.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-light{background-color:#f5f5f5;border-color:transparent;color:#363636}.button.is-light:hover,.button.is-light.is-hovered{background-color:#eee;border-color:transparent;color:#363636}.button.is-light:focus,.button.is-light.is-focused{border-color:transparent;color:#363636}.button.is-light:focus:not(:active),.button.is-light.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(245,245,245,0.25)}.button.is-light:active,.button.is-light.is-active{background-color:#e8e8e8;border-color:transparent;color:#363636}.button.is-light[disabled],fieldset[disabled] .button.is-light{background-color:#f5f5f5;border-color:transparent;box-shadow:none}.button.is-light.is-inverted{background-color:#363636;color:#f5f5f5}.button.is-light.is-inverted:hover,.button.is-light.is-inverted.is-hovered{background-color:#292929}.button.is-light.is-inverted[disabled],fieldset[disabled] .button.is-light.is-inverted{background-color:#363636;border-color:transparent;box-shadow:none;color:#f5f5f5}.button.is-light.is-loading::after{border-color:transparent transparent #363636 #363636 !important}.button.is-light.is-outlined{background-color:transparent;border-color:#f5f5f5;color:#f5f5f5}.button.is-light.is-outlined:hover,.button.is-light.is-outlined.is-hovered,.button.is-light.is-outlined:focus,.button.is-light.is-outlined.is-focused{background-color:#f5f5f5;border-color:#f5f5f5;color:#363636}.button.is-light.is-outlined.is-loading::after{border-color:transparent transparent #f5f5f5 #f5f5f5 !important}.button.is-light.is-outlined.is-loading:hover::after,.button.is-light.is-outlined.is-loading.is-hovered::after,.button.is-light.is-outlined.is-loading:focus::after,.button.is-light.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #363636 #363636 !important}.button.is-light.is-outlined[disabled],fieldset[disabled] 
.button.is-light.is-outlined{background-color:transparent;border-color:#f5f5f5;box-shadow:none;color:#f5f5f5}.button.is-light.is-inverted.is-outlined{background-color:transparent;border-color:#363636;color:#363636}.button.is-light.is-inverted.is-outlined:hover,.button.is-light.is-inverted.is-outlined.is-hovered,.button.is-light.is-inverted.is-outlined:focus,.button.is-light.is-inverted.is-outlined.is-focused{background-color:#363636;color:#f5f5f5}.button.is-light.is-inverted.is-outlined.is-loading:hover::after,.button.is-light.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-light.is-inverted.is-outlined.is-loading:focus::after,.button.is-light.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #f5f5f5 #f5f5f5 !important}.button.is-light.is-inverted.is-outlined[disabled],fieldset[disabled] .button.is-light.is-inverted.is-outlined{background-color:transparent;border-color:#363636;box-shadow:none;color:#363636}.button.is-dark,.content kbd.button{background-color:#363636;border-color:transparent;color:#f5f5f5}.button.is-dark:hover,.content kbd.button:hover,.button.is-dark.is-hovered,.content kbd.button.is-hovered{background-color:#2f2f2f;border-color:transparent;color:#f5f5f5}.button.is-dark:focus,.content kbd.button:focus,.button.is-dark.is-focused,.content kbd.button.is-focused{border-color:transparent;color:#f5f5f5}.button.is-dark:focus:not(:active),.content kbd.button:focus:not(:active),.button.is-dark.is-focused:not(:active),.content kbd.button.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(54,54,54,0.25)}.button.is-dark:active,.content kbd.button:active,.button.is-dark.is-active,.content kbd.button.is-active{background-color:#292929;border-color:transparent;color:#f5f5f5}.button.is-dark[disabled],.content kbd.button[disabled],fieldset[disabled] .button.is-dark,fieldset[disabled] .content kbd.button,.content fieldset[disabled] 
kbd.button{background-color:#363636;border-color:transparent;box-shadow:none}.button.is-dark.is-inverted,.content kbd.button.is-inverted{background-color:#f5f5f5;color:#363636}.button.is-dark.is-inverted:hover,.content kbd.button.is-inverted:hover,.button.is-dark.is-inverted.is-hovered,.content kbd.button.is-inverted.is-hovered{background-color:#e8e8e8}.button.is-dark.is-inverted[disabled],.content kbd.button.is-inverted[disabled],fieldset[disabled] .button.is-dark.is-inverted,fieldset[disabled] .content kbd.button.is-inverted,.content fieldset[disabled] kbd.button.is-inverted{background-color:#f5f5f5;border-color:transparent;box-shadow:none;color:#363636}.button.is-dark.is-loading::after,.content kbd.button.is-loading::after{border-color:transparent transparent #f5f5f5 #f5f5f5 !important}.button.is-dark.is-outlined,.content kbd.button.is-outlined{background-color:transparent;border-color:#363636;color:#363636}.button.is-dark.is-outlined:hover,.content kbd.button.is-outlined:hover,.button.is-dark.is-outlined.is-hovered,.content kbd.button.is-outlined.is-hovered,.button.is-dark.is-outlined:focus,.content kbd.button.is-outlined:focus,.button.is-dark.is-outlined.is-focused,.content kbd.button.is-outlined.is-focused{background-color:#363636;border-color:#363636;color:#f5f5f5}.button.is-dark.is-outlined.is-loading::after,.content kbd.button.is-outlined.is-loading::after{border-color:transparent transparent #363636 #363636 !important}.button.is-dark.is-outlined.is-loading:hover::after,.content kbd.button.is-outlined.is-loading:hover::after,.button.is-dark.is-outlined.is-loading.is-hovered::after,.content kbd.button.is-outlined.is-loading.is-hovered::after,.button.is-dark.is-outlined.is-loading:focus::after,.content kbd.button.is-outlined.is-loading:focus::after,.button.is-dark.is-outlined.is-loading.is-focused::after,.content kbd.button.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #f5f5f5 #f5f5f5 
!important}.button.is-dark.is-outlined[disabled],.content kbd.button.is-outlined[disabled],fieldset[disabled] .button.is-dark.is-outlined,fieldset[disabled] .content kbd.button.is-outlined,.content fieldset[disabled] kbd.button.is-outlined{background-color:transparent;border-color:#363636;box-shadow:none;color:#363636}.button.is-dark.is-inverted.is-outlined,.content kbd.button.is-inverted.is-outlined{background-color:transparent;border-color:#f5f5f5;color:#f5f5f5}.button.is-dark.is-inverted.is-outlined:hover,.content kbd.button.is-inverted.is-outlined:hover,.button.is-dark.is-inverted.is-outlined.is-hovered,.content kbd.button.is-inverted.is-outlined.is-hovered,.button.is-dark.is-inverted.is-outlined:focus,.content kbd.button.is-inverted.is-outlined:focus,.button.is-dark.is-inverted.is-outlined.is-focused,.content kbd.button.is-inverted.is-outlined.is-focused{background-color:#f5f5f5;color:#363636}.button.is-dark.is-inverted.is-outlined.is-loading:hover::after,.content kbd.button.is-inverted.is-outlined.is-loading:hover::after,.button.is-dark.is-inverted.is-outlined.is-loading.is-hovered::after,.content kbd.button.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-dark.is-inverted.is-outlined.is-loading:focus::after,.content kbd.button.is-inverted.is-outlined.is-loading:focus::after,.button.is-dark.is-inverted.is-outlined.is-loading.is-focused::after,.content kbd.button.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #363636 #363636 !important}.button.is-dark.is-inverted.is-outlined[disabled],.content kbd.button.is-inverted.is-outlined[disabled],fieldset[disabled] .button.is-dark.is-inverted.is-outlined,fieldset[disabled] .content kbd.button.is-inverted.is-outlined,.content fieldset[disabled] 
kbd.button.is-inverted.is-outlined{background-color:transparent;border-color:#f5f5f5;box-shadow:none;color:#f5f5f5}.button.is-primary,.docstring>section>a.button.docs-sourcelink{background-color:#4eb5de;border-color:transparent;color:#fff}.button.is-primary:hover,.docstring>section>a.button.docs-sourcelink:hover,.button.is-primary.is-hovered,.docstring>section>a.button.is-hovered.docs-sourcelink{background-color:#43b1dc;border-color:transparent;color:#fff}.button.is-primary:focus,.docstring>section>a.button.docs-sourcelink:focus,.button.is-primary.is-focused,.docstring>section>a.button.is-focused.docs-sourcelink{border-color:transparent;color:#fff}.button.is-primary:focus:not(:active),.docstring>section>a.button.docs-sourcelink:focus:not(:active),.button.is-primary.is-focused:not(:active),.docstring>section>a.button.is-focused.docs-sourcelink:not(:active){box-shadow:0 0 0 0.125em rgba(78,181,222,0.25)}.button.is-primary:active,.docstring>section>a.button.docs-sourcelink:active,.button.is-primary.is-active,.docstring>section>a.button.is-active.docs-sourcelink{background-color:#39acda;border-color:transparent;color:#fff}.button.is-primary[disabled],.docstring>section>a.button.docs-sourcelink[disabled],fieldset[disabled] .button.is-primary,fieldset[disabled] .docstring>section>a.button.docs-sourcelink{background-color:#4eb5de;border-color:transparent;box-shadow:none}.button.is-primary.is-inverted,.docstring>section>a.button.is-inverted.docs-sourcelink{background-color:#fff;color:#4eb5de}.button.is-primary.is-inverted:hover,.docstring>section>a.button.is-inverted.docs-sourcelink:hover,.button.is-primary.is-inverted.is-hovered,.docstring>section>a.button.is-inverted.is-hovered.docs-sourcelink{background-color:#f2f2f2}.button.is-primary.is-inverted[disabled],.docstring>section>a.button.is-inverted.docs-sourcelink[disabled],fieldset[disabled] .button.is-primary.is-inverted,fieldset[disabled] 
.docstring>section>a.button.is-inverted.docs-sourcelink{background-color:#fff;border-color:transparent;box-shadow:none;color:#4eb5de}.button.is-primary.is-loading::after,.docstring>section>a.button.is-loading.docs-sourcelink::after{border-color:transparent transparent #fff #fff !important}.button.is-primary.is-outlined,.docstring>section>a.button.is-outlined.docs-sourcelink{background-color:transparent;border-color:#4eb5de;color:#4eb5de}.button.is-primary.is-outlined:hover,.docstring>section>a.button.is-outlined.docs-sourcelink:hover,.button.is-primary.is-outlined.is-hovered,.docstring>section>a.button.is-outlined.is-hovered.docs-sourcelink,.button.is-primary.is-outlined:focus,.docstring>section>a.button.is-outlined.docs-sourcelink:focus,.button.is-primary.is-outlined.is-focused,.docstring>section>a.button.is-outlined.is-focused.docs-sourcelink{background-color:#4eb5de;border-color:#4eb5de;color:#fff}.button.is-primary.is-outlined.is-loading::after,.docstring>section>a.button.is-outlined.is-loading.docs-sourcelink::after{border-color:transparent transparent #4eb5de #4eb5de !important}.button.is-primary.is-outlined.is-loading:hover::after,.docstring>section>a.button.is-outlined.is-loading.docs-sourcelink:hover::after,.button.is-primary.is-outlined.is-loading.is-hovered::after,.docstring>section>a.button.is-outlined.is-loading.is-hovered.docs-sourcelink::after,.button.is-primary.is-outlined.is-loading:focus::after,.docstring>section>a.button.is-outlined.is-loading.docs-sourcelink:focus::after,.button.is-primary.is-outlined.is-loading.is-focused::after,.docstring>section>a.button.is-outlined.is-loading.is-focused.docs-sourcelink::after{border-color:transparent transparent #fff #fff !important}.button.is-primary.is-outlined[disabled],.docstring>section>a.button.is-outlined.docs-sourcelink[disabled],fieldset[disabled] .button.is-primary.is-outlined,fieldset[disabled] 
.docstring>section>a.button.is-outlined.docs-sourcelink{background-color:transparent;border-color:#4eb5de;box-shadow:none;color:#4eb5de}.button.is-primary.is-inverted.is-outlined,.docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink{background-color:transparent;border-color:#fff;color:#fff}.button.is-primary.is-inverted.is-outlined:hover,.docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink:hover,.button.is-primary.is-inverted.is-outlined.is-hovered,.docstring>section>a.button.is-inverted.is-outlined.is-hovered.docs-sourcelink,.button.is-primary.is-inverted.is-outlined:focus,.docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink:focus,.button.is-primary.is-inverted.is-outlined.is-focused,.docstring>section>a.button.is-inverted.is-outlined.is-focused.docs-sourcelink{background-color:#fff;color:#4eb5de}.button.is-primary.is-inverted.is-outlined.is-loading:hover::after,.docstring>section>a.button.is-inverted.is-outlined.is-loading.docs-sourcelink:hover::after,.button.is-primary.is-inverted.is-outlined.is-loading.is-hovered::after,.docstring>section>a.button.is-inverted.is-outlined.is-loading.is-hovered.docs-sourcelink::after,.button.is-primary.is-inverted.is-outlined.is-loading:focus::after,.docstring>section>a.button.is-inverted.is-outlined.is-loading.docs-sourcelink:focus::after,.button.is-primary.is-inverted.is-outlined.is-loading.is-focused::after,.docstring>section>a.button.is-inverted.is-outlined.is-loading.is-focused.docs-sourcelink::after{border-color:transparent transparent #4eb5de #4eb5de !important}.button.is-primary.is-inverted.is-outlined[disabled],.docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink[disabled],fieldset[disabled] .button.is-primary.is-inverted.is-outlined,fieldset[disabled] 
.docstring>section>a.button.is-inverted.is-outlined.docs-sourcelink{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-link{background-color:#2e63b8;border-color:transparent;color:#fff}.button.is-link:hover,.button.is-link.is-hovered{background-color:#2b5eae;border-color:transparent;color:#fff}.button.is-link:focus,.button.is-link.is-focused{border-color:transparent;color:#fff}.button.is-link:focus:not(:active),.button.is-link.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(46,99,184,0.25)}.button.is-link:active,.button.is-link.is-active{background-color:#2958a4;border-color:transparent;color:#fff}.button.is-link[disabled],fieldset[disabled] .button.is-link{background-color:#2e63b8;border-color:transparent;box-shadow:none}.button.is-link.is-inverted{background-color:#fff;color:#2e63b8}.button.is-link.is-inverted:hover,.button.is-link.is-inverted.is-hovered{background-color:#f2f2f2}.button.is-link.is-inverted[disabled],fieldset[disabled] .button.is-link.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#2e63b8}.button.is-link.is-loading::after{border-color:transparent transparent #fff #fff !important}.button.is-link.is-outlined{background-color:transparent;border-color:#2e63b8;color:#2e63b8}.button.is-link.is-outlined:hover,.button.is-link.is-outlined.is-hovered,.button.is-link.is-outlined:focus,.button.is-link.is-outlined.is-focused{background-color:#2e63b8;border-color:#2e63b8;color:#fff}.button.is-link.is-outlined.is-loading::after{border-color:transparent transparent #2e63b8 #2e63b8 !important}.button.is-link.is-outlined.is-loading:hover::after,.button.is-link.is-outlined.is-loading.is-hovered::after,.button.is-link.is-outlined.is-loading:focus::after,.button.is-link.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}.button.is-link.is-outlined[disabled],fieldset[disabled] 
.button.is-link.is-outlined{background-color:transparent;border-color:#2e63b8;box-shadow:none;color:#2e63b8}.button.is-link.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}.button.is-link.is-inverted.is-outlined:hover,.button.is-link.is-inverted.is-outlined.is-hovered,.button.is-link.is-inverted.is-outlined:focus,.button.is-link.is-inverted.is-outlined.is-focused{background-color:#fff;color:#2e63b8}.button.is-link.is-inverted.is-outlined.is-loading:hover::after,.button.is-link.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-link.is-inverted.is-outlined.is-loading:focus::after,.button.is-link.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #2e63b8 #2e63b8 !important}.button.is-link.is-inverted.is-outlined[disabled],fieldset[disabled] .button.is-link.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-info{background-color:#209cee;border-color:transparent;color:#fff}.button.is-info:hover,.button.is-info.is-hovered{background-color:#1497ed;border-color:transparent;color:#fff}.button.is-info:focus,.button.is-info.is-focused{border-color:transparent;color:#fff}.button.is-info:focus:not(:active),.button.is-info.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(32,156,238,0.25)}.button.is-info:active,.button.is-info.is-active{background-color:#1190e3;border-color:transparent;color:#fff}.button.is-info[disabled],fieldset[disabled] .button.is-info{background-color:#209cee;border-color:transparent;box-shadow:none}.button.is-info.is-inverted{background-color:#fff;color:#209cee}.button.is-info.is-inverted:hover,.button.is-info.is-inverted.is-hovered{background-color:#f2f2f2}.button.is-info.is-inverted[disabled],fieldset[disabled] .button.is-info.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#209cee}.button.is-info.is-loading::after{border-color:transparent transparent #fff #fff 
!important}.button.is-info.is-outlined{background-color:transparent;border-color:#209cee;color:#209cee}.button.is-info.is-outlined:hover,.button.is-info.is-outlined.is-hovered,.button.is-info.is-outlined:focus,.button.is-info.is-outlined.is-focused{background-color:#209cee;border-color:#209cee;color:#fff}.button.is-info.is-outlined.is-loading::after{border-color:transparent transparent #209cee #209cee !important}.button.is-info.is-outlined.is-loading:hover::after,.button.is-info.is-outlined.is-loading.is-hovered::after,.button.is-info.is-outlined.is-loading:focus::after,.button.is-info.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}.button.is-info.is-outlined[disabled],fieldset[disabled] .button.is-info.is-outlined{background-color:transparent;border-color:#209cee;box-shadow:none;color:#209cee}.button.is-info.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}.button.is-info.is-inverted.is-outlined:hover,.button.is-info.is-inverted.is-outlined.is-hovered,.button.is-info.is-inverted.is-outlined:focus,.button.is-info.is-inverted.is-outlined.is-focused{background-color:#fff;color:#209cee}.button.is-info.is-inverted.is-outlined.is-loading:hover::after,.button.is-info.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-info.is-inverted.is-outlined.is-loading:focus::after,.button.is-info.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #209cee #209cee !important}.button.is-info.is-inverted.is-outlined[disabled],fieldset[disabled] 
.button.is-info.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-success{background-color:#22c35b;border-color:transparent;color:#fff}.button.is-success:hover,.button.is-success.is-hovered{background-color:#20b856;border-color:transparent;color:#fff}.button.is-success:focus,.button.is-success.is-focused{border-color:transparent;color:#fff}.button.is-success:focus:not(:active),.button.is-success.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(34,195,91,0.25)}.button.is-success:active,.button.is-success.is-active{background-color:#1ead51;border-color:transparent;color:#fff}.button.is-success[disabled],fieldset[disabled] .button.is-success{background-color:#22c35b;border-color:transparent;box-shadow:none}.button.is-success.is-inverted{background-color:#fff;color:#22c35b}.button.is-success.is-inverted:hover,.button.is-success.is-inverted.is-hovered{background-color:#f2f2f2}.button.is-success.is-inverted[disabled],fieldset[disabled] .button.is-success.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#22c35b}.button.is-success.is-loading::after{border-color:transparent transparent #fff #fff !important}.button.is-success.is-outlined{background-color:transparent;border-color:#22c35b;color:#22c35b}.button.is-success.is-outlined:hover,.button.is-success.is-outlined.is-hovered,.button.is-success.is-outlined:focus,.button.is-success.is-outlined.is-focused{background-color:#22c35b;border-color:#22c35b;color:#fff}.button.is-success.is-outlined.is-loading::after{border-color:transparent transparent #22c35b #22c35b !important}.button.is-success.is-outlined.is-loading:hover::after,.button.is-success.is-outlined.is-loading.is-hovered::after,.button.is-success.is-outlined.is-loading:focus::after,.button.is-success.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}.button.is-success.is-outlined[disabled],fieldset[disabled] 
.button.is-success.is-outlined{background-color:transparent;border-color:#22c35b;box-shadow:none;color:#22c35b}.button.is-success.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}.button.is-success.is-inverted.is-outlined:hover,.button.is-success.is-inverted.is-outlined.is-hovered,.button.is-success.is-inverted.is-outlined:focus,.button.is-success.is-inverted.is-outlined.is-focused{background-color:#fff;color:#22c35b}.button.is-success.is-inverted.is-outlined.is-loading:hover::after,.button.is-success.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-success.is-inverted.is-outlined.is-loading:focus::after,.button.is-success.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #22c35b #22c35b !important}.button.is-success.is-inverted.is-outlined[disabled],fieldset[disabled] .button.is-success.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-warning{background-color:#ffdd57;border-color:transparent;color:rgba(0,0,0,0.7)}.button.is-warning:hover,.button.is-warning.is-hovered{background-color:#ffda4a;border-color:transparent;color:rgba(0,0,0,0.7)}.button.is-warning:focus,.button.is-warning.is-focused{border-color:transparent;color:rgba(0,0,0,0.7)}.button.is-warning:focus:not(:active),.button.is-warning.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(255,221,87,0.25)}.button.is-warning:active,.button.is-warning.is-active{background-color:#ffd83e;border-color:transparent;color:rgba(0,0,0,0.7)}.button.is-warning[disabled],fieldset[disabled] .button.is-warning{background-color:#ffdd57;border-color:transparent;box-shadow:none}.button.is-warning.is-inverted{background-color:rgba(0,0,0,0.7);color:#ffdd57}.button.is-warning.is-inverted:hover,.button.is-warning.is-inverted.is-hovered{background-color:rgba(0,0,0,0.7)}.button.is-warning.is-inverted[disabled],fieldset[disabled] 
.button.is-warning.is-inverted{background-color:rgba(0,0,0,0.7);border-color:transparent;box-shadow:none;color:#ffdd57}.button.is-warning.is-loading::after{border-color:transparent transparent rgba(0,0,0,0.7) rgba(0,0,0,0.7) !important}.button.is-warning.is-outlined{background-color:transparent;border-color:#ffdd57;color:#ffdd57}.button.is-warning.is-outlined:hover,.button.is-warning.is-outlined.is-hovered,.button.is-warning.is-outlined:focus,.button.is-warning.is-outlined.is-focused{background-color:#ffdd57;border-color:#ffdd57;color:rgba(0,0,0,0.7)}.button.is-warning.is-outlined.is-loading::after{border-color:transparent transparent #ffdd57 #ffdd57 !important}.button.is-warning.is-outlined.is-loading:hover::after,.button.is-warning.is-outlined.is-loading.is-hovered::after,.button.is-warning.is-outlined.is-loading:focus::after,.button.is-warning.is-outlined.is-loading.is-focused::after{border-color:transparent transparent rgba(0,0,0,0.7) rgba(0,0,0,0.7) !important}.button.is-warning.is-outlined[disabled],fieldset[disabled] .button.is-warning.is-outlined{background-color:transparent;border-color:#ffdd57;box-shadow:none;color:#ffdd57}.button.is-warning.is-inverted.is-outlined{background-color:transparent;border-color:rgba(0,0,0,0.7);color:rgba(0,0,0,0.7)}.button.is-warning.is-inverted.is-outlined:hover,.button.is-warning.is-inverted.is-outlined.is-hovered,.button.is-warning.is-inverted.is-outlined:focus,.button.is-warning.is-inverted.is-outlined.is-focused{background-color:rgba(0,0,0,0.7);color:#ffdd57}.button.is-warning.is-inverted.is-outlined.is-loading:hover::after,.button.is-warning.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-warning.is-inverted.is-outlined.is-loading:focus::after,.button.is-warning.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #ffdd57 #ffdd57 !important}.button.is-warning.is-inverted.is-outlined[disabled],fieldset[disabled] 
.button.is-warning.is-inverted.is-outlined{background-color:transparent;border-color:rgba(0,0,0,0.7);box-shadow:none;color:rgba(0,0,0,0.7)}.button.is-danger{background-color:#da0b00;border-color:transparent;color:#fff}.button.is-danger:hover,.button.is-danger.is-hovered{background-color:#cd0a00;border-color:transparent;color:#fff}.button.is-danger:focus,.button.is-danger.is-focused{border-color:transparent;color:#fff}.button.is-danger:focus:not(:active),.button.is-danger.is-focused:not(:active){box-shadow:0 0 0 0.125em rgba(218,11,0,0.25)}.button.is-danger:active,.button.is-danger.is-active{background-color:#c10a00;border-color:transparent;color:#fff}.button.is-danger[disabled],fieldset[disabled] .button.is-danger{background-color:#da0b00;border-color:transparent;box-shadow:none}.button.is-danger.is-inverted{background-color:#fff;color:#da0b00}.button.is-danger.is-inverted:hover,.button.is-danger.is-inverted.is-hovered{background-color:#f2f2f2}.button.is-danger.is-inverted[disabled],fieldset[disabled] .button.is-danger.is-inverted{background-color:#fff;border-color:transparent;box-shadow:none;color:#da0b00}.button.is-danger.is-loading::after{border-color:transparent transparent #fff #fff !important}.button.is-danger.is-outlined{background-color:transparent;border-color:#da0b00;color:#da0b00}.button.is-danger.is-outlined:hover,.button.is-danger.is-outlined.is-hovered,.button.is-danger.is-outlined:focus,.button.is-danger.is-outlined.is-focused{background-color:#da0b00;border-color:#da0b00;color:#fff}.button.is-danger.is-outlined.is-loading::after{border-color:transparent transparent #da0b00 #da0b00 !important}.button.is-danger.is-outlined.is-loading:hover::after,.button.is-danger.is-outlined.is-loading.is-hovered::after,.button.is-danger.is-outlined.is-loading:focus::after,.button.is-danger.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #fff #fff !important}.button.is-danger.is-outlined[disabled],fieldset[disabled] 
.button.is-danger.is-outlined{background-color:transparent;border-color:#da0b00;box-shadow:none;color:#da0b00}.button.is-danger.is-inverted.is-outlined{background-color:transparent;border-color:#fff;color:#fff}.button.is-danger.is-inverted.is-outlined:hover,.button.is-danger.is-inverted.is-outlined.is-hovered,.button.is-danger.is-inverted.is-outlined:focus,.button.is-danger.is-inverted.is-outlined.is-focused{background-color:#fff;color:#da0b00}.button.is-danger.is-inverted.is-outlined.is-loading:hover::after,.button.is-danger.is-inverted.is-outlined.is-loading.is-hovered::after,.button.is-danger.is-inverted.is-outlined.is-loading:focus::after,.button.is-danger.is-inverted.is-outlined.is-loading.is-focused::after{border-color:transparent transparent #da0b00 #da0b00 !important}.button.is-danger.is-inverted.is-outlined[disabled],fieldset[disabled] .button.is-danger.is-inverted.is-outlined{background-color:transparent;border-color:#fff;box-shadow:none;color:#fff}.button.is-small,#documenter .docs-sidebar form.docs-search>input.button{border-radius:2px;font-size:.75rem}.button.is-normal{font-size:1rem}.button.is-medium{font-size:1.25rem}.button.is-large{font-size:1.5rem}.button[disabled],fieldset[disabled] .button{background-color:#fff;border-color:#dbdbdb;box-shadow:none;opacity:.5}.button.is-fullwidth{display:flex;width:100%}.button.is-loading{color:transparent !important;pointer-events:none}.button.is-loading::after{position:absolute;left:calc(50% - (1em / 2));top:calc(50% - (1em / 2));position:absolute !important}.button.is-static{background-color:#f5f5f5;border-color:#dbdbdb;color:#6b6b6b;box-shadow:none;pointer-events:none}.button.is-rounded,#documenter .docs-sidebar form.docs-search>input.button{border-radius:290486px;padding-left:1em;padding-right:1em}.buttons{align-items:center;display:flex;flex-wrap:wrap;justify-content:flex-start}.buttons .button{margin-bottom:0.5rem}.buttons 
.button:not(:last-child):not(.is-fullwidth){margin-right:0.5rem}.buttons:last-child{margin-bottom:-0.5rem}.buttons:not(:last-child){margin-bottom:1rem}.buttons.are-small .button:not(.is-normal):not(.is-medium):not(.is-large){border-radius:2px;font-size:.75rem}.buttons.are-medium .button:not(.is-small):not(.is-normal):not(.is-large){font-size:1.25rem}.buttons.are-large .button:not(.is-small):not(.is-normal):not(.is-medium){font-size:1.5rem}.buttons.has-addons .button:not(:first-child){border-bottom-left-radius:0;border-top-left-radius:0}.buttons.has-addons .button:not(:last-child){border-bottom-right-radius:0;border-top-right-radius:0;margin-right:-1px}.buttons.has-addons .button:last-child{margin-right:0}.buttons.has-addons .button:hover,.buttons.has-addons .button.is-hovered{z-index:2}.buttons.has-addons .button:focus,.buttons.has-addons .button.is-focused,.buttons.has-addons .button:active,.buttons.has-addons .button.is-active,.buttons.has-addons .button.is-selected{z-index:3}.buttons.has-addons .button:focus:hover,.buttons.has-addons .button.is-focused:hover,.buttons.has-addons .button:active:hover,.buttons.has-addons .button.is-active:hover,.buttons.has-addons .button.is-selected:hover{z-index:4}.buttons.has-addons .button.is-expanded{flex-grow:1;flex-shrink:1}.buttons.is-centered{justify-content:center}.buttons.is-centered:not(.has-addons) .button:not(.is-fullwidth){margin-left:0.25rem;margin-right:0.25rem}.buttons.is-right{justify-content:flex-end}.buttons.is-right:not(.has-addons) .button:not(.is-fullwidth){margin-left:0.25rem;margin-right:0.25rem}.container{flex-grow:1;margin:0 auto;position:relative;width:auto}@media screen and (min-width: 1056px){.container{max-width:992px}.container.is-fluid{margin-left:32px;margin-right:32px;max-width:none}}@media screen and (max-width: 1215px){.container.is-widescreen{max-width:1152px}}@media screen and (max-width: 1407px){.container.is-fullhd{max-width:1344px}}@media screen and (min-width: 
1216px){.container{max-width:1152px}}@media screen and (min-width: 1408px){.container{max-width:1344px}}.content li+li{margin-top:0.25em}.content p:not(:last-child),.content dl:not(:last-child),.content ol:not(:last-child),.content ul:not(:last-child),.content blockquote:not(:last-child),.content pre:not(:last-child),.content table:not(:last-child){margin-bottom:1em}.content h1,.content h2,.content h3,.content h4,.content h5,.content h6{color:#222;font-weight:600;line-height:1.125}.content h1{font-size:2em;margin-bottom:0.5em}.content h1:not(:first-child){margin-top:1em}.content h2{font-size:1.75em;margin-bottom:0.5714em}.content h2:not(:first-child){margin-top:1.1428em}.content h3{font-size:1.5em;margin-bottom:0.6666em}.content h3:not(:first-child){margin-top:1.3333em}.content h4{font-size:1.25em;margin-bottom:0.8em}.content h5{font-size:1.125em;margin-bottom:0.8888em}.content h6{font-size:1em;margin-bottom:1em}.content blockquote{background-color:#f5f5f5;border-left:5px solid #dbdbdb;padding:1.25em 1.5em}.content ol{list-style-position:outside;margin-left:2em;margin-top:1em}.content ol:not([type]){list-style-type:decimal}.content ol.is-lower-alpha:not([type]){list-style-type:lower-alpha}.content ol.is-lower-roman:not([type]){list-style-type:lower-roman}.content ol.is-upper-alpha:not([type]){list-style-type:upper-alpha}.content ol.is-upper-roman:not([type]){list-style-type:upper-roman}.content ul{list-style:disc outside;margin-left:2em;margin-top:1em}.content ul ul{list-style-type:circle;margin-top:0.5em}.content ul ul ul{list-style-type:square}.content dd{margin-left:2em}.content figure{margin-left:2em;margin-right:2em;text-align:center}.content figure:not(:first-child){margin-top:2em}.content figure:not(:last-child){margin-bottom:2em}.content figure img{display:inline-block}.content figure figcaption{font-style:italic}.content pre{-webkit-overflow-scrolling:touch;overflow-x:auto;padding:0;white-space:pre;word-wrap:normal}.content sup,.content 
sub{font-size:75%}.content table{width:100%}.content table td,.content table th{border:1px solid #dbdbdb;border-width:0 0 1px;padding:0.5em 0.75em;vertical-align:top}.content table th{color:#222}.content table th:not([align]){text-align:left}.content table thead td,.content table thead th{border-width:0 0 2px;color:#222}.content table tfoot td,.content table tfoot th{border-width:2px 0 0;color:#222}.content table tbody tr:last-child td,.content table tbody tr:last-child th{border-bottom-width:0}.content .tabs li+li{margin-top:0}.content.is-small,#documenter .docs-sidebar form.docs-search>input.content{font-size:.75rem}.content.is-medium{font-size:1.25rem}.content.is-large{font-size:1.5rem}.icon{align-items:center;display:inline-flex;justify-content:center;height:1.5rem;width:1.5rem}.icon.is-small,#documenter .docs-sidebar form.docs-search>input.icon{height:1rem;width:1rem}.icon.is-medium{height:2rem;width:2rem}.icon.is-large{height:3rem;width:3rem}.image,#documenter .docs-sidebar .docs-logo>img{display:block;position:relative}.image img,#documenter .docs-sidebar .docs-logo>img img{display:block;height:auto;width:100%}.image img.is-rounded,#documenter .docs-sidebar .docs-logo>img img.is-rounded{border-radius:290486px}.image.is-square img,#documenter .docs-sidebar .docs-logo>img.is-square img,.image.is-square .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-square .has-ratio,.image.is-1by1 img,#documenter .docs-sidebar .docs-logo>img.is-1by1 img,.image.is-1by1 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-1by1 .has-ratio,.image.is-5by4 img,#documenter .docs-sidebar .docs-logo>img.is-5by4 img,.image.is-5by4 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-5by4 .has-ratio,.image.is-4by3 img,#documenter .docs-sidebar .docs-logo>img.is-4by3 img,.image.is-4by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-4by3 .has-ratio,.image.is-3by2 img,#documenter .docs-sidebar .docs-logo>img.is-3by2 img,.image.is-3by2 .has-ratio,#documenter .docs-sidebar 
.docs-logo>img.is-3by2 .has-ratio,.image.is-5by3 img,#documenter .docs-sidebar .docs-logo>img.is-5by3 img,.image.is-5by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-5by3 .has-ratio,.image.is-16by9 img,#documenter .docs-sidebar .docs-logo>img.is-16by9 img,.image.is-16by9 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-16by9 .has-ratio,.image.is-2by1 img,#documenter .docs-sidebar .docs-logo>img.is-2by1 img,.image.is-2by1 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-2by1 .has-ratio,.image.is-3by1 img,#documenter .docs-sidebar .docs-logo>img.is-3by1 img,.image.is-3by1 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by1 .has-ratio,.image.is-4by5 img,#documenter .docs-sidebar .docs-logo>img.is-4by5 img,.image.is-4by5 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-4by5 .has-ratio,.image.is-3by4 img,#documenter .docs-sidebar .docs-logo>img.is-3by4 img,.image.is-3by4 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by4 .has-ratio,.image.is-2by3 img,#documenter .docs-sidebar .docs-logo>img.is-2by3 img,.image.is-2by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-2by3 .has-ratio,.image.is-3by5 img,#documenter .docs-sidebar .docs-logo>img.is-3by5 img,.image.is-3by5 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-3by5 .has-ratio,.image.is-9by16 img,#documenter .docs-sidebar .docs-logo>img.is-9by16 img,.image.is-9by16 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-9by16 .has-ratio,.image.is-1by2 img,#documenter .docs-sidebar .docs-logo>img.is-1by2 img,.image.is-1by2 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-1by2 .has-ratio,.image.is-1by3 img,#documenter .docs-sidebar .docs-logo>img.is-1by3 img,.image.is-1by3 .has-ratio,#documenter .docs-sidebar .docs-logo>img.is-1by3 .has-ratio{height:100%;width:100%}.image.is-square,#documenter .docs-sidebar .docs-logo>img.is-square,.image.is-1by1,#documenter .docs-sidebar .docs-logo>img.is-1by1{padding-top:100%}.image.is-5by4,#documenter .docs-sidebar 
.docs-logo>img.is-5by4{padding-top:80%}.image.is-4by3,#documenter .docs-sidebar .docs-logo>img.is-4by3{padding-top:75%}.image.is-3by2,#documenter .docs-sidebar .docs-logo>img.is-3by2{padding-top:66.6666%}.image.is-5by3,#documenter .docs-sidebar .docs-logo>img.is-5by3{padding-top:60%}.image.is-16by9,#documenter .docs-sidebar .docs-logo>img.is-16by9{padding-top:56.25%}.image.is-2by1,#documenter .docs-sidebar .docs-logo>img.is-2by1{padding-top:50%}.image.is-3by1,#documenter .docs-sidebar .docs-logo>img.is-3by1{padding-top:33.3333%}.image.is-4by5,#documenter .docs-sidebar .docs-logo>img.is-4by5{padding-top:125%}.image.is-3by4,#documenter .docs-sidebar .docs-logo>img.is-3by4{padding-top:133.3333%}.image.is-2by3,#documenter .docs-sidebar .docs-logo>img.is-2by3{padding-top:150%}.image.is-3by5,#documenter .docs-sidebar .docs-logo>img.is-3by5{padding-top:166.6666%}.image.is-9by16,#documenter .docs-sidebar .docs-logo>img.is-9by16{padding-top:177.7777%}.image.is-1by2,#documenter .docs-sidebar .docs-logo>img.is-1by2{padding-top:200%}.image.is-1by3,#documenter .docs-sidebar .docs-logo>img.is-1by3{padding-top:300%}.image.is-16x16,#documenter .docs-sidebar .docs-logo>img.is-16x16{height:16px;width:16px}.image.is-24x24,#documenter .docs-sidebar .docs-logo>img.is-24x24{height:24px;width:24px}.image.is-32x32,#documenter .docs-sidebar .docs-logo>img.is-32x32{height:32px;width:32px}.image.is-48x48,#documenter .docs-sidebar .docs-logo>img.is-48x48{height:48px;width:48px}.image.is-64x64,#documenter .docs-sidebar .docs-logo>img.is-64x64{height:64px;width:64px}.image.is-96x96,#documenter .docs-sidebar .docs-logo>img.is-96x96{height:96px;width:96px}.image.is-128x128,#documenter .docs-sidebar .docs-logo>img.is-128x128{height:128px;width:128px}.notification{background-color:#f5f5f5;border-radius:4px;padding:1.25rem 2.5rem 1.25rem 1.5rem;position:relative}.notification a:not(.button):not(.dropdown-item){color:currentColor;text-decoration:underline}.notification 
strong{color:currentColor}.notification code,.notification pre{background:#fff}.notification pre code{background:transparent}.notification>.delete{position:absolute;right:0.5rem;top:0.5rem}.notification .title,.notification .subtitle,.notification .content{color:currentColor}.notification.is-white{background-color:#fff;color:#0a0a0a}.notification.is-black{background-color:#0a0a0a;color:#fff}.notification.is-light{background-color:#f5f5f5;color:#363636}.notification.is-dark,.content kbd.notification{background-color:#363636;color:#f5f5f5}.notification.is-primary,.docstring>section>a.notification.docs-sourcelink{background-color:#4eb5de;color:#fff}.notification.is-link{background-color:#2e63b8;color:#fff}.notification.is-info{background-color:#209cee;color:#fff}.notification.is-success{background-color:#22c35b;color:#fff}.notification.is-warning{background-color:#ffdd57;color:rgba(0,0,0,0.7)}.notification.is-danger{background-color:#da0b00;color:#fff}.progress{-moz-appearance:none;-webkit-appearance:none;border:none;border-radius:290486px;display:block;height:1rem;overflow:hidden;padding:0;width:100%}.progress::-webkit-progress-bar{background-color:#dbdbdb}.progress::-webkit-progress-value{background-color:#222}.progress::-moz-progress-bar{background-color:#222}.progress::-ms-fill{background-color:#222;border:none}.progress.is-white::-webkit-progress-value{background-color:#fff}.progress.is-white::-moz-progress-bar{background-color:#fff}.progress.is-white::-ms-fill{background-color:#fff}.progress.is-white:indeterminate{background-image:linear-gradient(to right, #fff 30%, #dbdbdb 30%)}.progress.is-black::-webkit-progress-value{background-color:#0a0a0a}.progress.is-black::-moz-progress-bar{background-color:#0a0a0a}.progress.is-black::-ms-fill{background-color:#0a0a0a}.progress.is-black:indeterminate{background-image:linear-gradient(to right, #0a0a0a 30%, #dbdbdb 
30%)}.progress.is-light::-webkit-progress-value{background-color:#f5f5f5}.progress.is-light::-moz-progress-bar{background-color:#f5f5f5}.progress.is-light::-ms-fill{background-color:#f5f5f5}.progress.is-light:indeterminate{background-image:linear-gradient(to right, #f5f5f5 30%, #dbdbdb 30%)}.progress.is-dark::-webkit-progress-value,.content kbd.progress::-webkit-progress-value{background-color:#363636}.progress.is-dark::-moz-progress-bar,.content kbd.progress::-moz-progress-bar{background-color:#363636}.progress.is-dark::-ms-fill,.content kbd.progress::-ms-fill{background-color:#363636}.progress.is-dark:indeterminate,.content kbd.progress:indeterminate{background-image:linear-gradient(to right, #363636 30%, #dbdbdb 30%)}.progress.is-primary::-webkit-progress-value,.docstring>section>a.progress.docs-sourcelink::-webkit-progress-value{background-color:#4eb5de}.progress.is-primary::-moz-progress-bar,.docstring>section>a.progress.docs-sourcelink::-moz-progress-bar{background-color:#4eb5de}.progress.is-primary::-ms-fill,.docstring>section>a.progress.docs-sourcelink::-ms-fill{background-color:#4eb5de}.progress.is-primary:indeterminate,.docstring>section>a.progress.docs-sourcelink:indeterminate{background-image:linear-gradient(to right, #4eb5de 30%, #dbdbdb 30%)}.progress.is-link::-webkit-progress-value{background-color:#2e63b8}.progress.is-link::-moz-progress-bar{background-color:#2e63b8}.progress.is-link::-ms-fill{background-color:#2e63b8}.progress.is-link:indeterminate{background-image:linear-gradient(to right, #2e63b8 30%, #dbdbdb 30%)}.progress.is-info::-webkit-progress-value{background-color:#209cee}.progress.is-info::-moz-progress-bar{background-color:#209cee}.progress.is-info::-ms-fill{background-color:#209cee}.progress.is-info:indeterminate{background-image:linear-gradient(to right, #209cee 30%, #dbdbdb 
30%)}.progress.is-success::-webkit-progress-value{background-color:#22c35b}.progress.is-success::-moz-progress-bar{background-color:#22c35b}.progress.is-success::-ms-fill{background-color:#22c35b}.progress.is-success:indeterminate{background-image:linear-gradient(to right, #22c35b 30%, #dbdbdb 30%)}.progress.is-warning::-webkit-progress-value{background-color:#ffdd57}.progress.is-warning::-moz-progress-bar{background-color:#ffdd57}.progress.is-warning::-ms-fill{background-color:#ffdd57}.progress.is-warning:indeterminate{background-image:linear-gradient(to right, #ffdd57 30%, #dbdbdb 30%)}.progress.is-danger::-webkit-progress-value{background-color:#da0b00}.progress.is-danger::-moz-progress-bar{background-color:#da0b00}.progress.is-danger::-ms-fill{background-color:#da0b00}.progress.is-danger:indeterminate{background-image:linear-gradient(to right, #da0b00 30%, #dbdbdb 30%)}.progress:indeterminate{animation-duration:1.5s;animation-iteration-count:infinite;animation-name:moveIndeterminate;animation-timing-function:linear;background-color:#dbdbdb;background-image:linear-gradient(to right, #222 30%, #dbdbdb 30%);background-position:top left;background-repeat:no-repeat;background-size:150% 150%}.progress:indeterminate::-webkit-progress-bar{background-color:transparent}.progress:indeterminate::-moz-progress-bar{background-color:transparent}.progress.is-small,#documenter .docs-sidebar form.docs-search>input.progress{height:.75rem}.progress.is-medium{height:1.25rem}.progress.is-large{height:1.5rem}@keyframes moveIndeterminate{from{background-position:200% 0}to{background-position:-200% 0}}.table{background-color:#fff;color:#363636}.table td,.table th{border:1px solid #dbdbdb;border-width:0 0 1px;padding:0.5em 0.75em;vertical-align:top}.table td.is-white,.table th.is-white{background-color:#fff;border-color:#fff;color:#0a0a0a}.table td.is-black,.table th.is-black{background-color:#0a0a0a;border-color:#0a0a0a;color:#fff}.table td.is-light,.table 
th.is-light{background-color:#f5f5f5;border-color:#f5f5f5;color:#363636}.table td.is-dark,.table th.is-dark{background-color:#363636;border-color:#363636;color:#f5f5f5}.table td.is-primary,.table th.is-primary{background-color:#4eb5de;border-color:#4eb5de;color:#fff}.table td.is-link,.table th.is-link{background-color:#2e63b8;border-color:#2e63b8;color:#fff}.table td.is-info,.table th.is-info{background-color:#209cee;border-color:#209cee;color:#fff}.table td.is-success,.table th.is-success{background-color:#22c35b;border-color:#22c35b;color:#fff}.table td.is-warning,.table th.is-warning{background-color:#ffdd57;border-color:#ffdd57;color:rgba(0,0,0,0.7)}.table td.is-danger,.table th.is-danger{background-color:#da0b00;border-color:#da0b00;color:#fff}.table td.is-narrow,.table th.is-narrow{white-space:nowrap;width:1%}.table td.is-selected,.table th.is-selected{background-color:#4eb5de;color:#fff}.table td.is-selected a,.table td.is-selected strong,.table th.is-selected a,.table th.is-selected strong{color:currentColor}.table th{color:#222}.table th:not([align]){text-align:left}.table tr.is-selected{background-color:#4eb5de;color:#fff}.table tr.is-selected a,.table tr.is-selected strong{color:currentColor}.table tr.is-selected td,.table tr.is-selected th{border-color:#fff;color:currentColor}.table thead{background-color:rgba(0,0,0,0)}.table thead td,.table thead th{border-width:0 0 2px;color:#222}.table tfoot{background-color:rgba(0,0,0,0)}.table tfoot td,.table tfoot th{border-width:2px 0 0;color:#222}.table tbody{background-color:rgba(0,0,0,0)}.table tbody tr:last-child td,.table tbody tr:last-child th{border-bottom-width:0}.table.is-bordered td,.table.is-bordered th{border-width:1px}.table.is-bordered tr:last-child td,.table.is-bordered tr:last-child th{border-bottom-width:1px}.table.is-fullwidth{width:100%}.table.is-hoverable tbody tr:not(.is-selected):hover{background-color:#fafafa}.table.is-hoverable.is-striped tbody 
tr:not(.is-selected):hover{background-color:#fafafa}.table.is-hoverable.is-striped tbody tr:not(.is-selected):hover:nth-child(even){background-color:#f5f5f5}.table.is-narrow td,.table.is-narrow th{padding:0.25em 0.5em}.table.is-striped tbody tr:not(.is-selected):nth-child(even){background-color:#fafafa}.table-container{-webkit-overflow-scrolling:touch;overflow:auto;overflow-y:hidden;max-width:100%}.tags{align-items:center;display:flex;flex-wrap:wrap;justify-content:flex-start}.tags .tag,.tags .content kbd,.content .tags kbd,.tags .docstring>section>a.docs-sourcelink{margin-bottom:0.5rem}.tags .tag:not(:last-child),.tags .content kbd:not(:last-child),.content .tags kbd:not(:last-child),.tags .docstring>section>a.docs-sourcelink:not(:last-child){margin-right:0.5rem}.tags:last-child{margin-bottom:-0.5rem}.tags:not(:last-child){margin-bottom:1rem}.tags.are-medium .tag:not(.is-normal):not(.is-large),.tags.are-medium .content kbd:not(.is-normal):not(.is-large),.content .tags.are-medium kbd:not(.is-normal):not(.is-large),.tags.are-medium .docstring>section>a.docs-sourcelink:not(.is-normal):not(.is-large){font-size:1rem}.tags.are-large .tag:not(.is-normal):not(.is-medium),.tags.are-large .content kbd:not(.is-normal):not(.is-medium),.content .tags.are-large kbd:not(.is-normal):not(.is-medium),.tags.are-large .docstring>section>a.docs-sourcelink:not(.is-normal):not(.is-medium){font-size:1.25rem}.tags.is-centered{justify-content:center}.tags.is-centered .tag,.tags.is-centered .content kbd,.content .tags.is-centered kbd,.tags.is-centered .docstring>section>a.docs-sourcelink{margin-right:0.25rem;margin-left:0.25rem}.tags.is-right{justify-content:flex-end}.tags.is-right .tag:not(:first-child),.tags.is-right .content kbd:not(:first-child),.content .tags.is-right kbd:not(:first-child),.tags.is-right .docstring>section>a.docs-sourcelink:not(:first-child){margin-left:0.5rem}.tags.is-right .tag:not(:last-child),.tags.is-right .content kbd:not(:last-child),.content .tags.is-right 
kbd:not(:last-child),.tags.is-right .docstring>section>a.docs-sourcelink:not(:last-child){margin-right:0}.tags.has-addons .tag,.tags.has-addons .content kbd,.content .tags.has-addons kbd,.tags.has-addons .docstring>section>a.docs-sourcelink{margin-right:0}.tags.has-addons .tag:not(:first-child),.tags.has-addons .content kbd:not(:first-child),.content .tags.has-addons kbd:not(:first-child),.tags.has-addons .docstring>section>a.docs-sourcelink:not(:first-child){margin-left:0;border-bottom-left-radius:0;border-top-left-radius:0}.tags.has-addons .tag:not(:last-child),.tags.has-addons .content kbd:not(:last-child),.content .tags.has-addons kbd:not(:last-child),.tags.has-addons .docstring>section>a.docs-sourcelink:not(:last-child){border-bottom-right-radius:0;border-top-right-radius:0}.tag:not(body),.content kbd:not(body),.docstring>section>a.docs-sourcelink:not(body){align-items:center;background-color:#f5f5f5;border-radius:4px;color:#222;display:inline-flex;font-size:.75rem;height:2em;justify-content:center;line-height:1.5;padding-left:0.75em;padding-right:0.75em;white-space:nowrap}.tag:not(body) .delete,.content kbd:not(body) .delete,.docstring>section>a.docs-sourcelink:not(body) .delete{margin-left:0.25rem;margin-right:-0.375rem}.tag.is-white:not(body),.content kbd.is-white:not(body),.docstring>section>a.docs-sourcelink.is-white:not(body){background-color:#fff;color:#0a0a0a}.tag.is-black:not(body),.content kbd.is-black:not(body),.docstring>section>a.docs-sourcelink.is-black:not(body){background-color:#0a0a0a;color:#fff}.tag.is-light:not(body),.content kbd.is-light:not(body),.docstring>section>a.docs-sourcelink.is-light:not(body){background-color:#f5f5f5;color:#363636}.tag.is-dark:not(body),.content kbd:not(body),.docstring>section>a.docs-sourcelink.is-dark:not(body),.content .docstring>section>kbd:not(body){background-color:#363636;color:#f5f5f5}.tag.is-primary:not(body),.content 
kbd.is-primary:not(body),.docstring>section>a.docs-sourcelink:not(body){background-color:#4eb5de;color:#fff}.tag.is-link:not(body),.content kbd.is-link:not(body),.docstring>section>a.docs-sourcelink.is-link:not(body){background-color:#2e63b8;color:#fff}.tag.is-info:not(body),.content kbd.is-info:not(body),.docstring>section>a.docs-sourcelink.is-info:not(body){background-color:#209cee;color:#fff}.tag.is-success:not(body),.content kbd.is-success:not(body),.docstring>section>a.docs-sourcelink.is-success:not(body){background-color:#22c35b;color:#fff}.tag.is-warning:not(body),.content kbd.is-warning:not(body),.docstring>section>a.docs-sourcelink.is-warning:not(body){background-color:#ffdd57;color:rgba(0,0,0,0.7)}.tag.is-danger:not(body),.content kbd.is-danger:not(body),.docstring>section>a.docs-sourcelink.is-danger:not(body){background-color:#da0b00;color:#fff}.tag.is-normal:not(body),.content kbd.is-normal:not(body),.docstring>section>a.docs-sourcelink.is-normal:not(body){font-size:.75rem}.tag.is-medium:not(body),.content kbd.is-medium:not(body),.docstring>section>a.docs-sourcelink.is-medium:not(body){font-size:1rem}.tag.is-large:not(body),.content kbd.is-large:not(body),.docstring>section>a.docs-sourcelink.is-large:not(body){font-size:1.25rem}.tag:not(body) .icon:first-child:not(:last-child),.content kbd:not(body) .icon:first-child:not(:last-child),.docstring>section>a.docs-sourcelink:not(body) .icon:first-child:not(:last-child){margin-left:-0.375em;margin-right:0.1875em}.tag:not(body) .icon:last-child:not(:first-child),.content kbd:not(body) .icon:last-child:not(:first-child),.docstring>section>a.docs-sourcelink:not(body) .icon:last-child:not(:first-child){margin-left:0.1875em;margin-right:-0.375em}.tag:not(body) .icon:first-child:last-child,.content kbd:not(body) .icon:first-child:last-child,.docstring>section>a.docs-sourcelink:not(body) .icon:first-child:last-child{margin-left:-0.375em;margin-right:-0.375em}.tag.is-delete:not(body),.content 
kbd.is-delete:not(body),.docstring>section>a.docs-sourcelink.is-delete:not(body){margin-left:1px;padding:0;position:relative;width:2em}.tag.is-delete:not(body)::before,.content kbd.is-delete:not(body)::before,.docstring>section>a.docs-sourcelink.is-delete:not(body)::before,.tag.is-delete:not(body)::after,.content kbd.is-delete:not(body)::after,.docstring>section>a.docs-sourcelink.is-delete:not(body)::after{background-color:currentColor;content:"";display:block;left:50%;position:absolute;top:50%;transform:translateX(-50%) translateY(-50%) rotate(45deg);transform-origin:center center}.tag.is-delete:not(body)::before,.content kbd.is-delete:not(body)::before,.docstring>section>a.docs-sourcelink.is-delete:not(body)::before{height:1px;width:50%}.tag.is-delete:not(body)::after,.content kbd.is-delete:not(body)::after,.docstring>section>a.docs-sourcelink.is-delete:not(body)::after{height:50%;width:1px}.tag.is-delete:not(body):hover,.content kbd.is-delete:not(body):hover,.docstring>section>a.docs-sourcelink.is-delete:not(body):hover,.tag.is-delete:not(body):focus,.content kbd.is-delete:not(body):focus,.docstring>section>a.docs-sourcelink.is-delete:not(body):focus{background-color:#e8e8e8}.tag.is-delete:not(body):active,.content kbd.is-delete:not(body):active,.docstring>section>a.docs-sourcelink.is-delete:not(body):active{background-color:#dbdbdb}.tag.is-rounded:not(body),#documenter .docs-sidebar form.docs-search>input:not(body),.content kbd.is-rounded:not(body),#documenter .docs-sidebar .content form.docs-search>input:not(body),.docstring>section>a.docs-sourcelink.is-rounded:not(body){border-radius:290486px}a.tag:hover,.docstring>section>a.docs-sourcelink:hover{text-decoration:underline}.title,.subtitle{word-break:break-word}.title em,.title span,.subtitle em,.subtitle span{font-weight:inherit}.title sub,.subtitle sub{font-size:.75em}.title sup,.subtitle sup{font-size:.75em}.title .tag,.title .content kbd,.content .title kbd,.title 
.docstring>section>a.docs-sourcelink,.subtitle .tag,.subtitle .content kbd,.content .subtitle kbd,.subtitle .docstring>section>a.docs-sourcelink{vertical-align:middle}.title{color:#363636;font-size:2rem;font-weight:600;line-height:1.125}.title strong{color:inherit;font-weight:inherit}.title+.highlight{margin-top:-0.75rem}.title:not(.is-spaced)+.subtitle{margin-top:-1.25rem}.title.is-1{font-size:3rem}.title.is-2{font-size:2.5rem}.title.is-3{font-size:2rem}.title.is-4{font-size:1.5rem}.title.is-5{font-size:1.25rem}.title.is-6{font-size:1rem}.title.is-7{font-size:.75rem}.subtitle{color:#4a4a4a;font-size:1.25rem;font-weight:400;line-height:1.25}.subtitle strong{color:#363636;font-weight:600}.subtitle:not(.is-spaced)+.title{margin-top:-1.25rem}.subtitle.is-1{font-size:3rem}.subtitle.is-2{font-size:2.5rem}.subtitle.is-3{font-size:2rem}.subtitle.is-4{font-size:1.5rem}.subtitle.is-5{font-size:1.25rem}.subtitle.is-6{font-size:1rem}.subtitle.is-7{font-size:.75rem}.heading{display:block;font-size:11px;letter-spacing:1px;margin-bottom:5px;text-transform:uppercase}.highlight{font-weight:400;max-width:100%;overflow:hidden;padding:0}.highlight pre{overflow:auto;max-width:100%}.number{align-items:center;background-color:#f5f5f5;border-radius:290486px;display:inline-flex;font-size:1.25rem;height:2em;justify-content:center;margin-right:1.5rem;min-width:2.5em;padding:0.25rem 0.5rem;text-align:center;vertical-align:top}.select select,.textarea,.input,#documenter .docs-sidebar form.docs-search>input{background-color:#fff;border-color:#dbdbdb;border-radius:4px;color:#363636}.select select::-moz-placeholder,.textarea::-moz-placeholder,.input::-moz-placeholder,#documenter .docs-sidebar form.docs-search>input::-moz-placeholder{color:rgba(54,54,54,0.3)}.select select::-webkit-input-placeholder,.textarea::-webkit-input-placeholder,.input::-webkit-input-placeholder,#documenter .docs-sidebar form.docs-search>input::-webkit-input-placeholder{color:rgba(54,54,54,0.3)}.select 
select:-moz-placeholder,.textarea:-moz-placeholder,.input:-moz-placeholder,#documenter .docs-sidebar form.docs-search>input:-moz-placeholder{color:rgba(54,54,54,0.3)}.select select:-ms-input-placeholder,.textarea:-ms-input-placeholder,.input:-ms-input-placeholder,#documenter .docs-sidebar form.docs-search>input:-ms-input-placeholder{color:rgba(54,54,54,0.3)}.select select:hover,.textarea:hover,.input:hover,#documenter .docs-sidebar form.docs-search>input:hover,.select select.is-hovered,.is-hovered.textarea,.is-hovered.input,#documenter .docs-sidebar form.docs-search>input.is-hovered{border-color:#b5b5b5}.select select:focus,.textarea:focus,.input:focus,#documenter .docs-sidebar form.docs-search>input:focus,.select select.is-focused,.is-focused.textarea,.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.select select:active,.textarea:active,.input:active,#documenter .docs-sidebar form.docs-search>input:active,.select select.is-active,.is-active.textarea,.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{border-color:#2e63b8;box-shadow:0 0 0 0.125em rgba(46,99,184,0.25)}.select select[disabled],.textarea[disabled],.input[disabled],#documenter .docs-sidebar form.docs-search>input[disabled],fieldset[disabled] .select select,.select fieldset[disabled] select,fieldset[disabled] .textarea,fieldset[disabled] .input,fieldset[disabled] #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar fieldset[disabled] form.docs-search>input{background-color:#f5f5f5;border-color:#f5f5f5;box-shadow:none;color:#6b6b6b}.select select[disabled]::-moz-placeholder,.textarea[disabled]::-moz-placeholder,.input[disabled]::-moz-placeholder,#documenter .docs-sidebar form.docs-search>input[disabled]::-moz-placeholder,fieldset[disabled] .select select::-moz-placeholder,.select fieldset[disabled] select::-moz-placeholder,fieldset[disabled] .textarea::-moz-placeholder,fieldset[disabled] 
.input::-moz-placeholder,fieldset[disabled] #documenter .docs-sidebar form.docs-search>input::-moz-placeholder,#documenter .docs-sidebar fieldset[disabled] form.docs-search>input::-moz-placeholder{color:rgba(107,107,107,0.3)}.select select[disabled]::-webkit-input-placeholder,.textarea[disabled]::-webkit-input-placeholder,.input[disabled]::-webkit-input-placeholder,#documenter .docs-sidebar form.docs-search>input[disabled]::-webkit-input-placeholder,fieldset[disabled] .select select::-webkit-input-placeholder,.select fieldset[disabled] select::-webkit-input-placeholder,fieldset[disabled] .textarea::-webkit-input-placeholder,fieldset[disabled] .input::-webkit-input-placeholder,fieldset[disabled] #documenter .docs-sidebar form.docs-search>input::-webkit-input-placeholder,#documenter .docs-sidebar fieldset[disabled] form.docs-search>input::-webkit-input-placeholder{color:rgba(107,107,107,0.3)}.select select[disabled]:-moz-placeholder,.textarea[disabled]:-moz-placeholder,.input[disabled]:-moz-placeholder,#documenter .docs-sidebar form.docs-search>input[disabled]:-moz-placeholder,fieldset[disabled] .select select:-moz-placeholder,.select fieldset[disabled] select:-moz-placeholder,fieldset[disabled] .textarea:-moz-placeholder,fieldset[disabled] .input:-moz-placeholder,fieldset[disabled] #documenter .docs-sidebar form.docs-search>input:-moz-placeholder,#documenter .docs-sidebar fieldset[disabled] form.docs-search>input:-moz-placeholder{color:rgba(107,107,107,0.3)}.select select[disabled]:-ms-input-placeholder,.textarea[disabled]:-ms-input-placeholder,.input[disabled]:-ms-input-placeholder,#documenter .docs-sidebar form.docs-search>input[disabled]:-ms-input-placeholder,fieldset[disabled] .select select:-ms-input-placeholder,.select fieldset[disabled] select:-ms-input-placeholder,fieldset[disabled] .textarea:-ms-input-placeholder,fieldset[disabled] .input:-ms-input-placeholder,fieldset[disabled] #documenter .docs-sidebar 
form.docs-search>input:-ms-input-placeholder,#documenter .docs-sidebar fieldset[disabled] form.docs-search>input:-ms-input-placeholder{color:rgba(107,107,107,0.3)}.textarea,.input,#documenter .docs-sidebar form.docs-search>input{box-shadow:inset 0 1px 2px rgba(10,10,10,0.1);max-width:100%;width:100%}.textarea[readonly],.input[readonly],#documenter .docs-sidebar form.docs-search>input[readonly]{box-shadow:none}.is-white.textarea,.is-white.input,#documenter .docs-sidebar form.docs-search>input.is-white{border-color:#fff}.is-white.textarea:focus,.is-white.input:focus,#documenter .docs-sidebar form.docs-search>input.is-white:focus,.is-white.is-focused.textarea,.is-white.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-white.textarea:active,.is-white.input:active,#documenter .docs-sidebar form.docs-search>input.is-white:active,.is-white.is-active.textarea,.is-white.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(255,255,255,0.25)}.is-black.textarea,.is-black.input,#documenter .docs-sidebar form.docs-search>input.is-black{border-color:#0a0a0a}.is-black.textarea:focus,.is-black.input:focus,#documenter .docs-sidebar form.docs-search>input.is-black:focus,.is-black.is-focused.textarea,.is-black.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-black.textarea:active,.is-black.input:active,#documenter .docs-sidebar form.docs-search>input.is-black:active,.is-black.is-active.textarea,.is-black.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(10,10,10,0.25)}.is-light.textarea,.is-light.input,#documenter .docs-sidebar form.docs-search>input.is-light{border-color:#f5f5f5}.is-light.textarea:focus,.is-light.input:focus,#documenter .docs-sidebar form.docs-search>input.is-light:focus,.is-light.is-focused.textarea,.is-light.is-focused.input,#documenter .docs-sidebar 
form.docs-search>input.is-focused,.is-light.textarea:active,.is-light.input:active,#documenter .docs-sidebar form.docs-search>input.is-light:active,.is-light.is-active.textarea,.is-light.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(245,245,245,0.25)}.is-dark.textarea,.content kbd.textarea,.is-dark.input,#documenter .docs-sidebar form.docs-search>input.is-dark,.content kbd.input{border-color:#363636}.is-dark.textarea:focus,.content kbd.textarea:focus,.is-dark.input:focus,#documenter .docs-sidebar form.docs-search>input.is-dark:focus,.content kbd.input:focus,.is-dark.is-focused.textarea,.content kbd.is-focused.textarea,.is-dark.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.content kbd.is-focused.input,#documenter .docs-sidebar .content form.docs-search>input.is-focused,.is-dark.textarea:active,.content kbd.textarea:active,.is-dark.input:active,#documenter .docs-sidebar form.docs-search>input.is-dark:active,.content kbd.input:active,.is-dark.is-active.textarea,.content kbd.is-active.textarea,.is-dark.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active,.content kbd.is-active.input,#documenter .docs-sidebar .content form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(54,54,54,0.25)}.is-primary.textarea,.docstring>section>a.textarea.docs-sourcelink,.is-primary.input,#documenter .docs-sidebar form.docs-search>input.is-primary,.docstring>section>a.input.docs-sourcelink{border-color:#4eb5de}.is-primary.textarea:focus,.docstring>section>a.textarea.docs-sourcelink:focus,.is-primary.input:focus,#documenter .docs-sidebar form.docs-search>input.is-primary:focus,.docstring>section>a.input.docs-sourcelink:focus,.is-primary.is-focused.textarea,.docstring>section>a.is-focused.textarea.docs-sourcelink,.is-primary.is-focused.input,#documenter .docs-sidebar 
form.docs-search>input.is-focused,.docstring>section>a.is-focused.input.docs-sourcelink,.is-primary.textarea:active,.docstring>section>a.textarea.docs-sourcelink:active,.is-primary.input:active,#documenter .docs-sidebar form.docs-search>input.is-primary:active,.docstring>section>a.input.docs-sourcelink:active,.is-primary.is-active.textarea,.docstring>section>a.is-active.textarea.docs-sourcelink,.is-primary.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active,.docstring>section>a.is-active.input.docs-sourcelink{box-shadow:0 0 0 0.125em rgba(78,181,222,0.25)}.is-link.textarea,.is-link.input,#documenter .docs-sidebar form.docs-search>input.is-link{border-color:#2e63b8}.is-link.textarea:focus,.is-link.input:focus,#documenter .docs-sidebar form.docs-search>input.is-link:focus,.is-link.is-focused.textarea,.is-link.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-link.textarea:active,.is-link.input:active,#documenter .docs-sidebar form.docs-search>input.is-link:active,.is-link.is-active.textarea,.is-link.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(46,99,184,0.25)}.is-info.textarea,.is-info.input,#documenter .docs-sidebar form.docs-search>input.is-info{border-color:#209cee}.is-info.textarea:focus,.is-info.input:focus,#documenter .docs-sidebar form.docs-search>input.is-info:focus,.is-info.is-focused.textarea,.is-info.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-info.textarea:active,.is-info.input:active,#documenter .docs-sidebar form.docs-search>input.is-info:active,.is-info.is-active.textarea,.is-info.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(32,156,238,0.25)}.is-success.textarea,.is-success.input,#documenter .docs-sidebar form.docs-search>input.is-success{border-color:#22c35b}.is-success.textarea:focus,.is-success.input:focus,#documenter .docs-sidebar 
form.docs-search>input.is-success:focus,.is-success.is-focused.textarea,.is-success.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-success.textarea:active,.is-success.input:active,#documenter .docs-sidebar form.docs-search>input.is-success:active,.is-success.is-active.textarea,.is-success.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(34,195,91,0.25)}.is-warning.textarea,.is-warning.input,#documenter .docs-sidebar form.docs-search>input.is-warning{border-color:#ffdd57}.is-warning.textarea:focus,.is-warning.input:focus,#documenter .docs-sidebar form.docs-search>input.is-warning:focus,.is-warning.is-focused.textarea,.is-warning.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-warning.textarea:active,.is-warning.input:active,#documenter .docs-sidebar form.docs-search>input.is-warning:active,.is-warning.is-active.textarea,.is-warning.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(255,221,87,0.25)}.is-danger.textarea,.is-danger.input,#documenter .docs-sidebar form.docs-search>input.is-danger{border-color:#da0b00}.is-danger.textarea:focus,.is-danger.input:focus,#documenter .docs-sidebar form.docs-search>input.is-danger:focus,.is-danger.is-focused.textarea,.is-danger.is-focused.input,#documenter .docs-sidebar form.docs-search>input.is-focused,.is-danger.textarea:active,.is-danger.input:active,#documenter .docs-sidebar form.docs-search>input.is-danger:active,.is-danger.is-active.textarea,.is-danger.is-active.input,#documenter .docs-sidebar form.docs-search>input.is-active{box-shadow:0 0 0 0.125em rgba(218,11,0,0.25)}.is-small.textarea,.is-small.input,#documenter .docs-sidebar form.docs-search>input{border-radius:2px;font-size:.75rem}.is-medium.textarea,.is-medium.input,#documenter .docs-sidebar form.docs-search>input.is-medium{font-size:1.25rem}.is-large.textarea,.is-large.input,#documenter 
.docs-sidebar form.docs-search>input.is-large{font-size:1.5rem}.is-fullwidth.textarea,.is-fullwidth.input,#documenter .docs-sidebar form.docs-search>input.is-fullwidth{display:block;width:100%}.is-inline.textarea,.is-inline.input,#documenter .docs-sidebar form.docs-search>input.is-inline{display:inline;width:auto}.input.is-rounded,#documenter .docs-sidebar form.docs-search>input{border-radius:290486px;padding-left:1em;padding-right:1em}.input.is-static,#documenter .docs-sidebar form.docs-search>input.is-static{background-color:transparent;border-color:transparent;box-shadow:none;padding-left:0;padding-right:0}.textarea{display:block;max-width:100%;min-width:100%;padding:0.625em;resize:vertical}.textarea:not([rows]){max-height:600px;min-height:120px}.textarea[rows]{height:initial}.textarea.has-fixed-size{resize:none}.radio,.checkbox{cursor:pointer;display:inline-block;line-height:1.25;position:relative}.radio input,.checkbox input{cursor:pointer}.radio:hover,.checkbox:hover{color:#363636}.radio[disabled],.checkbox[disabled],fieldset[disabled] .radio,fieldset[disabled] .checkbox{color:#6b6b6b;cursor:not-allowed}.radio+.radio{margin-left:0.5em}.select{display:inline-block;max-width:100%;position:relative;vertical-align:top}.select:not(.is-multiple){height:2.25em}.select:not(.is-multiple):not(.is-loading)::after{border-color:#2e63b8;right:1.125em;z-index:4}.select.is-rounded select,#documenter .docs-sidebar form.docs-search>input.select select{border-radius:290486px;padding-left:1em}.select select{cursor:pointer;display:block;font-size:1em;max-width:100%;outline:none}.select select::-ms-expand{display:none}.select select[disabled]:hover,fieldset[disabled] .select select:hover{border-color:#f5f5f5}.select select:not([multiple]){padding-right:2.5em}.select select[multiple]{height:auto;padding:0}.select select[multiple] option{padding:0.5em 
1em}.select:not(.is-multiple):not(.is-loading):hover::after{border-color:#363636}.select.is-white:not(:hover)::after{border-color:#fff}.select.is-white select{border-color:#fff}.select.is-white select:hover,.select.is-white select.is-hovered{border-color:#f2f2f2}.select.is-white select:focus,.select.is-white select.is-focused,.select.is-white select:active,.select.is-white select.is-active{box-shadow:0 0 0 0.125em rgba(255,255,255,0.25)}.select.is-black:not(:hover)::after{border-color:#0a0a0a}.select.is-black select{border-color:#0a0a0a}.select.is-black select:hover,.select.is-black select.is-hovered{border-color:#000}.select.is-black select:focus,.select.is-black select.is-focused,.select.is-black select:active,.select.is-black select.is-active{box-shadow:0 0 0 0.125em rgba(10,10,10,0.25)}.select.is-light:not(:hover)::after{border-color:#f5f5f5}.select.is-light select{border-color:#f5f5f5}.select.is-light select:hover,.select.is-light select.is-hovered{border-color:#e8e8e8}.select.is-light select:focus,.select.is-light select.is-focused,.select.is-light select:active,.select.is-light select.is-active{box-shadow:0 0 0 0.125em rgba(245,245,245,0.25)}.select.is-dark:not(:hover)::after,.content kbd.select:not(:hover)::after{border-color:#363636}.select.is-dark select,.content kbd.select select{border-color:#363636}.select.is-dark select:hover,.content kbd.select select:hover,.select.is-dark select.is-hovered,.content kbd.select select.is-hovered{border-color:#292929}.select.is-dark select:focus,.content kbd.select select:focus,.select.is-dark select.is-focused,.content kbd.select select.is-focused,.select.is-dark select:active,.content kbd.select select:active,.select.is-dark select.is-active,.content kbd.select select.is-active{box-shadow:0 0 0 0.125em rgba(54,54,54,0.25)}.select.is-primary:not(:hover)::after,.docstring>section>a.select.docs-sourcelink:not(:hover)::after{border-color:#4eb5de}.select.is-primary select,.docstring>section>a.select.docs-sourcelink 
select{border-color:#4eb5de}.select.is-primary select:hover,.docstring>section>a.select.docs-sourcelink select:hover,.select.is-primary select.is-hovered,.docstring>section>a.select.docs-sourcelink select.is-hovered{border-color:#39acda}.select.is-primary select:focus,.docstring>section>a.select.docs-sourcelink select:focus,.select.is-primary select.is-focused,.docstring>section>a.select.docs-sourcelink select.is-focused,.select.is-primary select:active,.docstring>section>a.select.docs-sourcelink select:active,.select.is-primary select.is-active,.docstring>section>a.select.docs-sourcelink select.is-active{box-shadow:0 0 0 0.125em rgba(78,181,222,0.25)}.select.is-link:not(:hover)::after{border-color:#2e63b8}.select.is-link select{border-color:#2e63b8}.select.is-link select:hover,.select.is-link select.is-hovered{border-color:#2958a4}.select.is-link select:focus,.select.is-link select.is-focused,.select.is-link select:active,.select.is-link select.is-active{box-shadow:0 0 0 0.125em rgba(46,99,184,0.25)}.select.is-info:not(:hover)::after{border-color:#209cee}.select.is-info select{border-color:#209cee}.select.is-info select:hover,.select.is-info select.is-hovered{border-color:#1190e3}.select.is-info select:focus,.select.is-info select.is-focused,.select.is-info select:active,.select.is-info select.is-active{box-shadow:0 0 0 0.125em rgba(32,156,238,0.25)}.select.is-success:not(:hover)::after{border-color:#22c35b}.select.is-success select{border-color:#22c35b}.select.is-success select:hover,.select.is-success select.is-hovered{border-color:#1ead51}.select.is-success select:focus,.select.is-success select.is-focused,.select.is-success select:active,.select.is-success select.is-active{box-shadow:0 0 0 0.125em rgba(34,195,91,0.25)}.select.is-warning:not(:hover)::after{border-color:#ffdd57}.select.is-warning select{border-color:#ffdd57}.select.is-warning select:hover,.select.is-warning select.is-hovered{border-color:#ffd83e}.select.is-warning select:focus,.select.is-warning 
select.is-focused,.select.is-warning select:active,.select.is-warning select.is-active{box-shadow:0 0 0 0.125em rgba(255,221,87,0.25)}.select.is-danger:not(:hover)::after{border-color:#da0b00}.select.is-danger select{border-color:#da0b00}.select.is-danger select:hover,.select.is-danger select.is-hovered{border-color:#c10a00}.select.is-danger select:focus,.select.is-danger select.is-focused,.select.is-danger select:active,.select.is-danger select.is-active{box-shadow:0 0 0 0.125em rgba(218,11,0,0.25)}.select.is-small,#documenter .docs-sidebar form.docs-search>input.select{border-radius:2px;font-size:.75rem}.select.is-medium{font-size:1.25rem}.select.is-large{font-size:1.5rem}.select.is-disabled::after{border-color:#6b6b6b}.select.is-fullwidth{width:100%}.select.is-fullwidth select{width:100%}.select.is-loading::after{margin-top:0;position:absolute;right:0.625em;top:0.625em;transform:none}.select.is-loading.is-small:after,#documenter .docs-sidebar form.docs-search>input.is-loading:after{font-size:.75rem}.select.is-loading.is-medium:after{font-size:1.25rem}.select.is-loading.is-large:after{font-size:1.5rem}.file{align-items:stretch;display:flex;justify-content:flex-start;position:relative}.file.is-white .file-cta{background-color:#fff;border-color:transparent;color:#0a0a0a}.file.is-white:hover .file-cta,.file.is-white.is-hovered .file-cta{background-color:#f9f9f9;border-color:transparent;color:#0a0a0a}.file.is-white:focus .file-cta,.file.is-white.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(255,255,255,0.25);color:#0a0a0a}.file.is-white:active .file-cta,.file.is-white.is-active .file-cta{background-color:#f2f2f2;border-color:transparent;color:#0a0a0a}.file.is-black .file-cta{background-color:#0a0a0a;border-color:transparent;color:#fff}.file.is-black:hover .file-cta,.file.is-black.is-hovered .file-cta{background-color:#040404;border-color:transparent;color:#fff}.file.is-black:focus .file-cta,.file.is-black.is-focused 
.file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(10,10,10,0.25);color:#fff}.file.is-black:active .file-cta,.file.is-black.is-active .file-cta{background-color:#000;border-color:transparent;color:#fff}.file.is-light .file-cta{background-color:#f5f5f5;border-color:transparent;color:#363636}.file.is-light:hover .file-cta,.file.is-light.is-hovered .file-cta{background-color:#eee;border-color:transparent;color:#363636}.file.is-light:focus .file-cta,.file.is-light.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(245,245,245,0.25);color:#363636}.file.is-light:active .file-cta,.file.is-light.is-active .file-cta{background-color:#e8e8e8;border-color:transparent;color:#363636}.file.is-dark .file-cta,.content kbd.file .file-cta{background-color:#363636;border-color:transparent;color:#f5f5f5}.file.is-dark:hover .file-cta,.content kbd.file:hover .file-cta,.file.is-dark.is-hovered .file-cta,.content kbd.file.is-hovered .file-cta{background-color:#2f2f2f;border-color:transparent;color:#f5f5f5}.file.is-dark:focus .file-cta,.content kbd.file:focus .file-cta,.file.is-dark.is-focused .file-cta,.content kbd.file.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(54,54,54,0.25);color:#f5f5f5}.file.is-dark:active .file-cta,.content kbd.file:active .file-cta,.file.is-dark.is-active .file-cta,.content kbd.file.is-active .file-cta{background-color:#292929;border-color:transparent;color:#f5f5f5}.file.is-primary .file-cta,.docstring>section>a.file.docs-sourcelink .file-cta{background-color:#4eb5de;border-color:transparent;color:#fff}.file.is-primary:hover .file-cta,.docstring>section>a.file.docs-sourcelink:hover .file-cta,.file.is-primary.is-hovered .file-cta,.docstring>section>a.file.is-hovered.docs-sourcelink .file-cta{background-color:#43b1dc;border-color:transparent;color:#fff}.file.is-primary:focus .file-cta,.docstring>section>a.file.docs-sourcelink:focus .file-cta,.file.is-primary.is-focused 
.file-cta,.docstring>section>a.file.is-focused.docs-sourcelink .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(78,181,222,0.25);color:#fff}.file.is-primary:active .file-cta,.docstring>section>a.file.docs-sourcelink:active .file-cta,.file.is-primary.is-active .file-cta,.docstring>section>a.file.is-active.docs-sourcelink .file-cta{background-color:#39acda;border-color:transparent;color:#fff}.file.is-link .file-cta{background-color:#2e63b8;border-color:transparent;color:#fff}.file.is-link:hover .file-cta,.file.is-link.is-hovered .file-cta{background-color:#2b5eae;border-color:transparent;color:#fff}.file.is-link:focus .file-cta,.file.is-link.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(46,99,184,0.25);color:#fff}.file.is-link:active .file-cta,.file.is-link.is-active .file-cta{background-color:#2958a4;border-color:transparent;color:#fff}.file.is-info .file-cta{background-color:#209cee;border-color:transparent;color:#fff}.file.is-info:hover .file-cta,.file.is-info.is-hovered .file-cta{background-color:#1497ed;border-color:transparent;color:#fff}.file.is-info:focus .file-cta,.file.is-info.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(32,156,238,0.25);color:#fff}.file.is-info:active .file-cta,.file.is-info.is-active .file-cta{background-color:#1190e3;border-color:transparent;color:#fff}.file.is-success .file-cta{background-color:#22c35b;border-color:transparent;color:#fff}.file.is-success:hover .file-cta,.file.is-success.is-hovered .file-cta{background-color:#20b856;border-color:transparent;color:#fff}.file.is-success:focus .file-cta,.file.is-success.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(34,195,91,0.25);color:#fff}.file.is-success:active .file-cta,.file.is-success.is-active .file-cta{background-color:#1ead51;border-color:transparent;color:#fff}.file.is-warning .file-cta{background-color:#ffdd57;border-color:transparent;color:rgba(0,0,0,0.7)}.file.is-warning:hover 
.file-cta,.file.is-warning.is-hovered .file-cta{background-color:#ffda4a;border-color:transparent;color:rgba(0,0,0,0.7)}.file.is-warning:focus .file-cta,.file.is-warning.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(255,221,87,0.25);color:rgba(0,0,0,0.7)}.file.is-warning:active .file-cta,.file.is-warning.is-active .file-cta{background-color:#ffd83e;border-color:transparent;color:rgba(0,0,0,0.7)}.file.is-danger .file-cta{background-color:#da0b00;border-color:transparent;color:#fff}.file.is-danger:hover .file-cta,.file.is-danger.is-hovered .file-cta{background-color:#cd0a00;border-color:transparent;color:#fff}.file.is-danger:focus .file-cta,.file.is-danger.is-focused .file-cta{border-color:transparent;box-shadow:0 0 0.5em rgba(218,11,0,0.25);color:#fff}.file.is-danger:active .file-cta,.file.is-danger.is-active .file-cta{background-color:#c10a00;border-color:transparent;color:#fff}.file.is-small,#documenter .docs-sidebar form.docs-search>input.file{font-size:.75rem}.file.is-medium{font-size:1.25rem}.file.is-medium .file-icon .fa{font-size:21px}.file.is-large{font-size:1.5rem}.file.is-large .file-icon .fa{font-size:28px}.file.has-name .file-cta{border-bottom-right-radius:0;border-top-right-radius:0}.file.has-name .file-name{border-bottom-left-radius:0;border-top-left-radius:0}.file.has-name.is-empty .file-cta{border-radius:4px}.file.has-name.is-empty .file-name{display:none}.file.is-boxed .file-label{flex-direction:column}.file.is-boxed .file-cta{flex-direction:column;height:auto;padding:1em 3em}.file.is-boxed .file-name{border-width:0 1px 1px}.file.is-boxed .file-icon{height:1.5em;width:1.5em}.file.is-boxed .file-icon .fa{font-size:21px}.file.is-boxed.is-small .file-icon .fa,#documenter .docs-sidebar form.docs-search>input.is-boxed .file-icon .fa{font-size:14px}.file.is-boxed.is-medium .file-icon .fa{font-size:28px}.file.is-boxed.is-large .file-icon .fa{font-size:35px}.file.is-boxed.has-name .file-cta{border-radius:4px 4px 0 
0}.file.is-boxed.has-name .file-name{border-radius:0 0 4px 4px;border-width:0 1px 1px}.file.is-centered{justify-content:center}.file.is-fullwidth .file-label{width:100%}.file.is-fullwidth .file-name{flex-grow:1;max-width:none}.file.is-right{justify-content:flex-end}.file.is-right .file-cta{border-radius:0 4px 4px 0}.file.is-right .file-name{border-radius:4px 0 0 4px;border-width:1px 0 1px 1px;order:-1}.file-label{align-items:stretch;display:flex;cursor:pointer;justify-content:flex-start;overflow:hidden;position:relative}.file-label:hover .file-cta{background-color:#eee;color:#363636}.file-label:hover .file-name{border-color:#d5d5d5}.file-label:active .file-cta{background-color:#e8e8e8;color:#363636}.file-label:active .file-name{border-color:#cfcfcf}.file-input{height:100%;left:0;opacity:0;outline:none;position:absolute;top:0;width:100%}.file-cta,.file-name{border-color:#dbdbdb;border-radius:4px;font-size:1em;padding-left:1em;padding-right:1em;white-space:nowrap}.file-cta{background-color:#f5f5f5;color:#4a4a4a}.file-name{border-color:#dbdbdb;border-style:solid;border-width:1px 1px 1px 0;display:block;max-width:16em;overflow:hidden;text-align:left;text-overflow:ellipsis}.file-icon{align-items:center;display:flex;height:1em;justify-content:center;margin-right:0.5em;width:1em}.file-icon .fa{font-size:14px}.label{color:#363636;display:block;font-size:1rem;font-weight:700}.label:not(:last-child){margin-bottom:0.5em}.label.is-small,#documenter .docs-sidebar form.docs-search>input.label{font-size:.75rem}.label.is-medium{font-size:1.25rem}.label.is-large{font-size:1.5rem}.help{display:block;font-size:.75rem;margin-top:0.25rem}.help.is-white{color:#fff}.help.is-black{color:#0a0a0a}.help.is-light{color:#f5f5f5}.help.is-dark,.content 
kbd.help{color:#363636}.help.is-primary,.docstring>section>a.help.docs-sourcelink{color:#4eb5de}.help.is-link{color:#2e63b8}.help.is-info{color:#209cee}.help.is-success{color:#22c35b}.help.is-warning{color:#ffdd57}.help.is-danger{color:#da0b00}.field:not(:last-child){margin-bottom:0.75rem}.field.has-addons{display:flex;justify-content:flex-start}.field.has-addons .control:not(:last-child){margin-right:-1px}.field.has-addons .control:not(:first-child):not(:last-child) .button,.field.has-addons .control:not(:first-child):not(:last-child) .input,.field.has-addons .control:not(:first-child):not(:last-child) #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar .field.has-addons .control:not(:first-child):not(:last-child) form.docs-search>input,.field.has-addons .control:not(:first-child):not(:last-child) .select select{border-radius:0}.field.has-addons .control:first-child:not(:only-child) .button,.field.has-addons .control:first-child:not(:only-child) .input,.field.has-addons .control:first-child:not(:only-child) #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar .field.has-addons .control:first-child:not(:only-child) form.docs-search>input,.field.has-addons .control:first-child:not(:only-child) .select select{border-bottom-right-radius:0;border-top-right-radius:0}.field.has-addons .control:last-child:not(:only-child) .button,.field.has-addons .control:last-child:not(:only-child) .input,.field.has-addons .control:last-child:not(:only-child) #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar .field.has-addons .control:last-child:not(:only-child) form.docs-search>input,.field.has-addons .control:last-child:not(:only-child) .select select{border-bottom-left-radius:0;border-top-left-radius:0}.field.has-addons .control .button:not([disabled]):hover,.field.has-addons .control .button.is-hovered:not([disabled]),.field.has-addons .control .input:not([disabled]):hover,.field.has-addons .control #documenter 
.docs-sidebar form.docs-search>input:not([disabled]):hover,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):hover,.field.has-addons .control .input.is-hovered:not([disabled]),.field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-hovered:not([disabled]),#documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-hovered:not([disabled]),.field.has-addons .control .select select:not([disabled]):hover,.field.has-addons .control .select select.is-hovered:not([disabled]){z-index:2}.field.has-addons .control .button:not([disabled]):focus,.field.has-addons .control .button.is-focused:not([disabled]),.field.has-addons .control .button:not([disabled]):active,.field.has-addons .control .button.is-active:not([disabled]),.field.has-addons .control .input:not([disabled]):focus,.field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):focus,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):focus,.field.has-addons .control .input.is-focused:not([disabled]),.field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-focused:not([disabled]),#documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-focused:not([disabled]),.field.has-addons .control .input:not([disabled]):active,.field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):active,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):active,.field.has-addons .control .input.is-active:not([disabled]),.field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-active:not([disabled]),#documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-active:not([disabled]),.field.has-addons .control .select select:not([disabled]):focus,.field.has-addons .control .select select.is-focused:not([disabled]),.field.has-addons 
.control .select select:not([disabled]):active,.field.has-addons .control .select select.is-active:not([disabled]){z-index:3}.field.has-addons .control .button:not([disabled]):focus:hover,.field.has-addons .control .button.is-focused:not([disabled]):hover,.field.has-addons .control .button:not([disabled]):active:hover,.field.has-addons .control .button.is-active:not([disabled]):hover,.field.has-addons .control .input:not([disabled]):focus:hover,.field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):focus:hover,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):focus:hover,.field.has-addons .control .input.is-focused:not([disabled]):hover,.field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-focused:not([disabled]):hover,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-focused:not([disabled]):hover,.field.has-addons .control .input:not([disabled]):active:hover,.field.has-addons .control #documenter .docs-sidebar form.docs-search>input:not([disabled]):active:hover,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input:not([disabled]):active:hover,.field.has-addons .control .input.is-active:not([disabled]):hover,.field.has-addons .control #documenter .docs-sidebar form.docs-search>input.is-active:not([disabled]):hover,#documenter .docs-sidebar .field.has-addons .control form.docs-search>input.is-active:not([disabled]):hover,.field.has-addons .control .select select:not([disabled]):focus:hover,.field.has-addons .control .select select.is-focused:not([disabled]):hover,.field.has-addons .control .select select:not([disabled]):active:hover,.field.has-addons .control .select select.is-active:not([disabled]):hover{z-index:4}.field.has-addons 
.control.is-expanded{flex-grow:1;flex-shrink:1}.field.has-addons.has-addons-centered{justify-content:center}.field.has-addons.has-addons-right{justify-content:flex-end}.field.has-addons.has-addons-fullwidth .control{flex-grow:1;flex-shrink:0}.field.is-grouped{display:flex;justify-content:flex-start}.field.is-grouped>.control{flex-shrink:0}.field.is-grouped>.control:not(:last-child){margin-bottom:0;margin-right:0.75rem}.field.is-grouped>.control.is-expanded{flex-grow:1;flex-shrink:1}.field.is-grouped.is-grouped-centered{justify-content:center}.field.is-grouped.is-grouped-right{justify-content:flex-end}.field.is-grouped.is-grouped-multiline{flex-wrap:wrap}.field.is-grouped.is-grouped-multiline>.control:last-child,.field.is-grouped.is-grouped-multiline>.control:not(:last-child){margin-bottom:0.75rem}.field.is-grouped.is-grouped-multiline:last-child{margin-bottom:-0.75rem}.field.is-grouped.is-grouped-multiline:not(:last-child){margin-bottom:0}@media screen and (min-width: 769px),print{.field.is-horizontal{display:flex}}.field-label .label{font-size:inherit}@media screen and (max-width: 768px){.field-label{margin-bottom:0.5rem}}@media screen and (min-width: 769px),print{.field-label{flex-basis:0;flex-grow:1;flex-shrink:0;margin-right:1.5rem;text-align:right}.field-label.is-small,#documenter .docs-sidebar form.docs-search>input.field-label{font-size:.75rem;padding-top:0.375em}.field-label.is-normal{padding-top:0.375em}.field-label.is-medium{font-size:1.25rem;padding-top:0.375em}.field-label.is-large{font-size:1.5rem;padding-top:0.375em}}.field-body .field .field{margin-bottom:0}@media screen and (min-width: 769px),print{.field-body{display:flex;flex-basis:0;flex-grow:5;flex-shrink:1}.field-body .field{margin-bottom:0}.field-body>.field{flex-shrink:1}.field-body>.field:not(.is-narrow){flex-grow:1}.field-body>.field:not(:last-child){margin-right:0.75rem}}.control{box-sizing:border-box;clear:both;font-size:1rem;position:relative;text-align:left}.control.has-icons-left 
.input:focus~.icon,.control.has-icons-left #documenter .docs-sidebar form.docs-search>input:focus~.icon,#documenter .docs-sidebar .control.has-icons-left form.docs-search>input:focus~.icon,.control.has-icons-left .select:focus~.icon,.control.has-icons-right .input:focus~.icon,.control.has-icons-right #documenter .docs-sidebar form.docs-search>input:focus~.icon,#documenter .docs-sidebar .control.has-icons-right form.docs-search>input:focus~.icon,.control.has-icons-right .select:focus~.icon{color:#6b6b6b}.control.has-icons-left .input.is-small~.icon,.control.has-icons-left #documenter .docs-sidebar form.docs-search>input~.icon,#documenter .docs-sidebar .control.has-icons-left form.docs-search>input~.icon,.control.has-icons-left .select.is-small~.icon,.control.has-icons-right .input.is-small~.icon,.control.has-icons-right #documenter .docs-sidebar form.docs-search>input~.icon,#documenter .docs-sidebar .control.has-icons-right form.docs-search>input~.icon,.control.has-icons-right .select.is-small~.icon{font-size:.75rem}.control.has-icons-left .input.is-medium~.icon,.control.has-icons-left #documenter .docs-sidebar form.docs-search>input.is-medium~.icon,#documenter .docs-sidebar .control.has-icons-left form.docs-search>input.is-medium~.icon,.control.has-icons-left .select.is-medium~.icon,.control.has-icons-right .input.is-medium~.icon,.control.has-icons-right #documenter .docs-sidebar form.docs-search>input.is-medium~.icon,#documenter .docs-sidebar .control.has-icons-right form.docs-search>input.is-medium~.icon,.control.has-icons-right .select.is-medium~.icon{font-size:1.25rem}.control.has-icons-left .input.is-large~.icon,.control.has-icons-left #documenter .docs-sidebar form.docs-search>input.is-large~.icon,#documenter .docs-sidebar .control.has-icons-left form.docs-search>input.is-large~.icon,.control.has-icons-left .select.is-large~.icon,.control.has-icons-right .input.is-large~.icon,.control.has-icons-right #documenter .docs-sidebar 
form.docs-search>input.is-large~.icon,#documenter .docs-sidebar .control.has-icons-right form.docs-search>input.is-large~.icon,.control.has-icons-right .select.is-large~.icon{font-size:1.5rem}.control.has-icons-left .icon,.control.has-icons-right .icon{color:#dbdbdb;height:2.25em;pointer-events:none;position:absolute;top:0;width:2.25em;z-index:4}.control.has-icons-left .input,.control.has-icons-left #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar .control.has-icons-left form.docs-search>input,.control.has-icons-left .select select{padding-left:2.25em}.control.has-icons-left .icon.is-left{left:0}.control.has-icons-right .input,.control.has-icons-right #documenter .docs-sidebar form.docs-search>input,#documenter .docs-sidebar .control.has-icons-right form.docs-search>input,.control.has-icons-right .select select{padding-right:2.25em}.control.has-icons-right .icon.is-right{right:0}.control.is-loading::after{position:absolute !important;right:0.625em;top:0.625em;z-index:4}.control.is-loading.is-small:after,#documenter .docs-sidebar form.docs-search>input.is-loading:after{font-size:.75rem}.control.is-loading.is-medium:after{font-size:1.25rem}.control.is-loading.is-large:after{font-size:1.5rem}.breadcrumb{font-size:1rem;white-space:nowrap}.breadcrumb a{align-items:center;color:#2e63b8;display:flex;justify-content:center;padding:0 .75em}.breadcrumb a:hover{color:#363636}.breadcrumb li{align-items:center;display:flex}.breadcrumb li:first-child a{padding-left:0}.breadcrumb li.is-active a{color:#222;cursor:default;pointer-events:none}.breadcrumb li+li::before{color:#b5b5b5;content:"\0002f"}.breadcrumb ul,.breadcrumb ol{align-items:flex-start;display:flex;flex-wrap:wrap;justify-content:flex-start}.breadcrumb .icon:first-child{margin-right:0.5em}.breadcrumb .icon:last-child{margin-left:0.5em}.breadcrumb.is-centered ol,.breadcrumb.is-centered ul{justify-content:center}.breadcrumb.is-right ol,.breadcrumb.is-right 
ul{justify-content:flex-end}.breadcrumb.is-small,#documenter .docs-sidebar form.docs-search>input.breadcrumb{font-size:.75rem}.breadcrumb.is-medium{font-size:1.25rem}.breadcrumb.is-large{font-size:1.5rem}.breadcrumb.has-arrow-separator li+li::before{content:"\02192"}.breadcrumb.has-bullet-separator li+li::before{content:"\02022"}.breadcrumb.has-dot-separator li+li::before{content:"\000b7"}.breadcrumb.has-succeeds-separator li+li::before{content:"\0227B"}.card{background-color:#fff;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px rgba(10,10,10,0.1);color:#222;max-width:100%;position:relative}.card-header{background-color:rgba(0,0,0,0);align-items:stretch;box-shadow:0 1px 2px rgba(10,10,10,0.1);display:flex}.card-header-title{align-items:center;color:#222;display:flex;flex-grow:1;font-weight:700;padding:.75rem}.card-header-title.is-centered{justify-content:center}.card-header-icon{align-items:center;cursor:pointer;display:flex;justify-content:center;padding:.75rem}.card-image{display:block;position:relative}.card-content{background-color:rgba(0,0,0,0);padding:1.5rem}.card-footer{background-color:rgba(0,0,0,0);border-top:1px solid #dbdbdb;align-items:stretch;display:flex}.card-footer-item{align-items:center;display:flex;flex-basis:0;flex-grow:1;flex-shrink:0;justify-content:center;padding:.75rem}.card-footer-item:not(:last-child){border-right:1px solid #dbdbdb}.card .media:not(:last-child){margin-bottom:1.5rem}.dropdown{display:inline-flex;position:relative;vertical-align:top}.dropdown.is-active .dropdown-menu,.dropdown.is-hoverable:hover .dropdown-menu{display:block}.dropdown.is-right .dropdown-menu{left:auto;right:0}.dropdown.is-up .dropdown-menu{bottom:100%;padding-bottom:4px;padding-top:initial;top:auto}.dropdown-menu{display:none;left:0;min-width:12rem;padding-top:4px;position:absolute;top:100%;z-index:20}.dropdown-content{background-color:#fff;border-radius:4px;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px 
rgba(10,10,10,0.1);padding-bottom:.5rem;padding-top:.5rem}.dropdown-item{color:#4a4a4a;display:block;font-size:0.875rem;line-height:1.5;padding:0.375rem 1rem;position:relative}a.dropdown-item,button.dropdown-item{padding-right:3rem;text-align:left;white-space:nowrap;width:100%}a.dropdown-item:hover,button.dropdown-item:hover{background-color:#f5f5f5;color:#0a0a0a}a.dropdown-item.is-active,button.dropdown-item.is-active{background-color:#2e63b8;color:#fff}.dropdown-divider{background-color:#dbdbdb;border:none;display:block;height:1px;margin:0.5rem 0}.level{align-items:center;justify-content:space-between}.level code{border-radius:4px}.level img{display:inline-block;vertical-align:top}.level.is-mobile{display:flex}.level.is-mobile .level-left,.level.is-mobile .level-right{display:flex}.level.is-mobile .level-left+.level-right{margin-top:0}.level.is-mobile .level-item:not(:last-child){margin-bottom:0;margin-right:.75rem}.level.is-mobile .level-item:not(.is-narrow){flex-grow:1}@media screen and (min-width: 769px),print{.level{display:flex}.level>.level-item:not(.is-narrow){flex-grow:1}}.level-item{align-items:center;display:flex;flex-basis:auto;flex-grow:0;flex-shrink:0;justify-content:center}.level-item .title,.level-item .subtitle{margin-bottom:0}@media screen and (max-width: 768px){.level-item:not(:last-child){margin-bottom:.75rem}}.level-left,.level-right{flex-basis:auto;flex-grow:0;flex-shrink:0}.level-left .level-item.is-flexible,.level-right .level-item.is-flexible{flex-grow:1}@media screen and (min-width: 769px),print{.level-left .level-item:not(:last-child),.level-right .level-item:not(:last-child){margin-right:.75rem}}.level-left{align-items:center;justify-content:flex-start}@media screen and (max-width: 768px){.level-left+.level-right{margin-top:1.5rem}}@media screen and (min-width: 769px),print{.level-left{display:flex}}.level-right{align-items:center;justify-content:flex-end}@media screen and (min-width: 
769px),print{.level-right{display:flex}}.list{background-color:#fff;border-radius:4px;box-shadow:0 2px 3px rgba(10,10,10,0.1),0 0 0 1px rgba(10,10,10,0.1)}.list-item{display:block;padding:0.5em 1em}.list-item:not(a){color:#222}.list-item:first-child{border-top-left-radius:4px;border-top-right-radius:4px}.list-item:last-child{border-bottom-left-radius:4px;border-bottom-right-radius:4px}.list-item:not(:last-child){border-bottom:1px solid #dbdbdb}.list-item.is-active{background-color:#2e63b8;color:#fff}a.list-item{background-color:#f5f5f5;cursor:pointer}.media{align-items:flex-start;display:flex;text-align:left}.media .content:not(:last-child){margin-bottom:0.75rem}.media .media{border-top:1px solid rgba(219,219,219,0.5);display:flex;padding-top:0.75rem}.media .media .content:not(:last-child),.media .media .control:not(:last-child){margin-bottom:0.5rem}.media .media .media{padding-top:0.5rem}.media .media .media+.media{margin-top:0.5rem}.media+.media{border-top:1px solid rgba(219,219,219,0.5);margin-top:1rem;padding-top:1rem}.media.is-large+.media{margin-top:1.5rem;padding-top:1.5rem}.media-left,.media-right{flex-basis:auto;flex-grow:0;flex-shrink:0}.media-left{margin-right:1rem}.media-right{margin-left:1rem}.media-content{flex-basis:auto;flex-grow:1;flex-shrink:1;text-align:left}@media screen and (max-width: 768px){.media-content{overflow-x:auto}}.menu{font-size:1rem}.menu.is-small,#documenter .docs-sidebar form.docs-search>input.menu{font-size:.75rem}.menu.is-medium{font-size:1.25rem}.menu.is-large{font-size:1.5rem}.menu-list{line-height:1.25}.menu-list a{border-radius:2px;color:#222;display:block;padding:0.5em 0.75em}.menu-list a:hover{background-color:#f5f5f5;color:#222}.menu-list a.is-active{background-color:#2e63b8;color:#fff}.menu-list li ul{border-left:1px solid 
#dbdbdb;margin:.75em;padding-left:.75em}.menu-label{color:#6b6b6b;font-size:.75em;letter-spacing:.1em;text-transform:uppercase}.menu-label:not(:first-child){margin-top:1em}.menu-label:not(:last-child){margin-bottom:1em}.message{background-color:#f5f5f5;border-radius:4px;font-size:1rem}.message strong{color:currentColor}.message a:not(.button):not(.tag):not(.dropdown-item){color:currentColor;text-decoration:underline}.message.is-small,#documenter .docs-sidebar form.docs-search>input.message{font-size:.75rem}.message.is-medium{font-size:1.25rem}.message.is-large{font-size:1.5rem}.message.is-white{background-color:#fff}.message.is-white .message-header{background-color:#fff;color:#0a0a0a}.message.is-white .message-body{border-color:#fff;color:#4d4d4d}.message.is-black{background-color:#fafafa}.message.is-black .message-header{background-color:#0a0a0a;color:#fff}.message.is-black .message-body{border-color:#0a0a0a;color:#090909}.message.is-light{background-color:#fafafa}.message.is-light .message-header{background-color:#f5f5f5;color:#363636}.message.is-light .message-body{border-color:#f5f5f5;color:#505050}.message.is-dark,.content kbd.message{background-color:#fafafa}.message.is-dark .message-header,.content kbd.message .message-header{background-color:#363636;color:#f5f5f5}.message.is-dark .message-body,.content kbd.message .message-body{border-color:#363636;color:#2a2a2a}.message.is-primary,.docstring>section>a.message.docs-sourcelink{background-color:#f6fbfd}.message.is-primary .message-header,.docstring>section>a.message.docs-sourcelink .message-header{background-color:#4eb5de;color:#fff}.message.is-primary .message-body,.docstring>section>a.message.docs-sourcelink .message-body{border-color:#4eb5de;color:#1f556a}.message.is-link{background-color:#f7f9fd}.message.is-link .message-header{background-color:#2e63b8;color:#fff}.message.is-link .message-body{border-color:#2e63b8;color:#264981}.message.is-info{background-color:#f6fbfe}.message.is-info 
.message-header{background-color:#209cee;color:#fff}.message.is-info .message-body{border-color:#209cee;color:#12537d}.message.is-success{background-color:#f6fdf9}.message.is-success .message-header{background-color:#22c35b;color:#fff}.message.is-success .message-body{border-color:#22c35b;color:#0f361d}.message.is-warning{background-color:#fffdf5}.message.is-warning .message-header{background-color:#ffdd57;color:rgba(0,0,0,0.7)}.message.is-warning .message-body{border-color:#ffdd57;color:#3c3108}.message.is-danger{background-color:#fff5f5}.message.is-danger .message-header{background-color:#da0b00;color:#fff}.message.is-danger .message-body{border-color:#da0b00;color:#9b0c04}.message-header{align-items:center;background-color:#222;border-radius:4px 4px 0 0;color:#fff;display:flex;font-weight:700;justify-content:space-between;line-height:1.25;padding:0.75em 1em;position:relative}.message-header .delete{flex-grow:0;flex-shrink:0;margin-left:0.75em}.message-header+.message-body{border-width:0;border-top-left-radius:0;border-top-right-radius:0}.message-body{border-color:#dbdbdb;border-radius:4px;border-style:solid;border-width:0 0 0 4px;color:#222;padding:1.25em 1.5em}.message-body code,.message-body pre{background-color:#fff}.message-body pre code{background-color:rgba(0,0,0,0)}.modal{align-items:center;display:none;flex-direction:column;justify-content:center;overflow:hidden;position:fixed;z-index:40}.modal.is-active{display:flex}.modal-background{background-color:rgba(10,10,10,0.86)}.modal-content,.modal-card{margin:0 20px;max-height:calc(100vh - 160px);overflow:auto;position:relative;width:100%}@media screen and (min-width: 769px),print{.modal-content,.modal-card{margin:0 auto;max-height:calc(100vh - 40px);width:640px}}.modal-close{background:none;height:40px;position:fixed;right:20px;top:20px;width:40px}.modal-card{display:flex;flex-direction:column;max-height:calc(100vh - 
40px);overflow:hidden;-ms-overflow-y:visible}.modal-card-head,.modal-card-foot{align-items:center;background-color:#f5f5f5;display:flex;flex-shrink:0;justify-content:flex-start;padding:20px;position:relative}.modal-card-head{border-bottom:1px solid #dbdbdb;border-top-left-radius:6px;border-top-right-radius:6px}.modal-card-title{color:#222;flex-grow:1;flex-shrink:0;font-size:1.5rem;line-height:1}.modal-card-foot{border-bottom-left-radius:6px;border-bottom-right-radius:6px;border-top:1px solid #dbdbdb}.modal-card-foot .button:not(:last-child){margin-right:0.5em}.modal-card-body{-webkit-overflow-scrolling:touch;background-color:#fff;flex-grow:1;flex-shrink:1;overflow:auto;padding:20px}.navbar{background-color:#fff;min-height:3.25rem;position:relative;z-index:30}.navbar.is-white{background-color:#fff;color:#0a0a0a}.navbar.is-white .navbar-brand>.navbar-item,.navbar.is-white .navbar-brand .navbar-link{color:#0a0a0a}.navbar.is-white .navbar-brand>a.navbar-item:focus,.navbar.is-white .navbar-brand>a.navbar-item:hover,.navbar.is-white .navbar-brand>a.navbar-item.is-active,.navbar.is-white .navbar-brand .navbar-link:focus,.navbar.is-white .navbar-brand .navbar-link:hover,.navbar.is-white .navbar-brand .navbar-link.is-active{background-color:#f2f2f2;color:#0a0a0a}.navbar.is-white .navbar-brand .navbar-link::after{border-color:#0a0a0a}.navbar.is-white .navbar-burger{color:#0a0a0a}@media screen and (min-width: 1056px){.navbar.is-white .navbar-start>.navbar-item,.navbar.is-white .navbar-start .navbar-link,.navbar.is-white .navbar-end>.navbar-item,.navbar.is-white .navbar-end .navbar-link{color:#0a0a0a}.navbar.is-white .navbar-start>a.navbar-item:focus,.navbar.is-white .navbar-start>a.navbar-item:hover,.navbar.is-white .navbar-start>a.navbar-item.is-active,.navbar.is-white .navbar-start .navbar-link:focus,.navbar.is-white .navbar-start .navbar-link:hover,.navbar.is-white .navbar-start .navbar-link.is-active,.navbar.is-white .navbar-end>a.navbar-item:focus,.navbar.is-white 
.navbar-end>a.navbar-item:hover,.navbar.is-white .navbar-end>a.navbar-item.is-active,.navbar.is-white .navbar-end .navbar-link:focus,.navbar.is-white .navbar-end .navbar-link:hover,.navbar.is-white .navbar-end .navbar-link.is-active{background-color:#f2f2f2;color:#0a0a0a}.navbar.is-white .navbar-start .navbar-link::after,.navbar.is-white .navbar-end .navbar-link::after{border-color:#0a0a0a}.navbar.is-white .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-white .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-white .navbar-item.has-dropdown.is-active .navbar-link{background-color:#f2f2f2;color:#0a0a0a}.navbar.is-white .navbar-dropdown a.navbar-item.is-active{background-color:#fff;color:#0a0a0a}}.navbar.is-black{background-color:#0a0a0a;color:#fff}.navbar.is-black .navbar-brand>.navbar-item,.navbar.is-black .navbar-brand .navbar-link{color:#fff}.navbar.is-black .navbar-brand>a.navbar-item:focus,.navbar.is-black .navbar-brand>a.navbar-item:hover,.navbar.is-black .navbar-brand>a.navbar-item.is-active,.navbar.is-black .navbar-brand .navbar-link:focus,.navbar.is-black .navbar-brand .navbar-link:hover,.navbar.is-black .navbar-brand .navbar-link.is-active{background-color:#000;color:#fff}.navbar.is-black .navbar-brand .navbar-link::after{border-color:#fff}.navbar.is-black .navbar-burger{color:#fff}@media screen and (min-width: 1056px){.navbar.is-black .navbar-start>.navbar-item,.navbar.is-black .navbar-start .navbar-link,.navbar.is-black .navbar-end>.navbar-item,.navbar.is-black .navbar-end .navbar-link{color:#fff}.navbar.is-black .navbar-start>a.navbar-item:focus,.navbar.is-black .navbar-start>a.navbar-item:hover,.navbar.is-black .navbar-start>a.navbar-item.is-active,.navbar.is-black .navbar-start .navbar-link:focus,.navbar.is-black .navbar-start .navbar-link:hover,.navbar.is-black .navbar-start .navbar-link.is-active,.navbar.is-black .navbar-end>a.navbar-item:focus,.navbar.is-black .navbar-end>a.navbar-item:hover,.navbar.is-black 
.navbar-end>a.navbar-item.is-active,.navbar.is-black .navbar-end .navbar-link:focus,.navbar.is-black .navbar-end .navbar-link:hover,.navbar.is-black .navbar-end .navbar-link.is-active{background-color:#000;color:#fff}.navbar.is-black .navbar-start .navbar-link::after,.navbar.is-black .navbar-end .navbar-link::after{border-color:#fff}.navbar.is-black .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-black .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-black .navbar-item.has-dropdown.is-active .navbar-link{background-color:#000;color:#fff}.navbar.is-black .navbar-dropdown a.navbar-item.is-active{background-color:#0a0a0a;color:#fff}}.navbar.is-light{background-color:#f5f5f5;color:#363636}.navbar.is-light .navbar-brand>.navbar-item,.navbar.is-light .navbar-brand .navbar-link{color:#363636}.navbar.is-light .navbar-brand>a.navbar-item:focus,.navbar.is-light .navbar-brand>a.navbar-item:hover,.navbar.is-light .navbar-brand>a.navbar-item.is-active,.navbar.is-light .navbar-brand .navbar-link:focus,.navbar.is-light .navbar-brand .navbar-link:hover,.navbar.is-light .navbar-brand .navbar-link.is-active{background-color:#e8e8e8;color:#363636}.navbar.is-light .navbar-brand .navbar-link::after{border-color:#363636}.navbar.is-light .navbar-burger{color:#363636}@media screen and (min-width: 1056px){.navbar.is-light .navbar-start>.navbar-item,.navbar.is-light .navbar-start .navbar-link,.navbar.is-light .navbar-end>.navbar-item,.navbar.is-light .navbar-end .navbar-link{color:#363636}.navbar.is-light .navbar-start>a.navbar-item:focus,.navbar.is-light .navbar-start>a.navbar-item:hover,.navbar.is-light .navbar-start>a.navbar-item.is-active,.navbar.is-light .navbar-start .navbar-link:focus,.navbar.is-light .navbar-start .navbar-link:hover,.navbar.is-light .navbar-start .navbar-link.is-active,.navbar.is-light .navbar-end>a.navbar-item:focus,.navbar.is-light .navbar-end>a.navbar-item:hover,.navbar.is-light .navbar-end>a.navbar-item.is-active,.navbar.is-light .navbar-end 
.navbar-link:focus,.navbar.is-light .navbar-end .navbar-link:hover,.navbar.is-light .navbar-end .navbar-link.is-active{background-color:#e8e8e8;color:#363636}.navbar.is-light .navbar-start .navbar-link::after,.navbar.is-light .navbar-end .navbar-link::after{border-color:#363636}.navbar.is-light .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-light .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-light .navbar-item.has-dropdown.is-active .navbar-link{background-color:#e8e8e8;color:#363636}.navbar.is-light .navbar-dropdown a.navbar-item.is-active{background-color:#f5f5f5;color:#363636}}.navbar.is-dark,.content kbd.navbar{background-color:#363636;color:#f5f5f5}.navbar.is-dark .navbar-brand>.navbar-item,.content kbd.navbar .navbar-brand>.navbar-item,.navbar.is-dark .navbar-brand .navbar-link,.content kbd.navbar .navbar-brand .navbar-link{color:#f5f5f5}.navbar.is-dark .navbar-brand>a.navbar-item:focus,.content kbd.navbar .navbar-brand>a.navbar-item:focus,.navbar.is-dark .navbar-brand>a.navbar-item:hover,.content kbd.navbar .navbar-brand>a.navbar-item:hover,.navbar.is-dark .navbar-brand>a.navbar-item.is-active,.content kbd.navbar .navbar-brand>a.navbar-item.is-active,.navbar.is-dark .navbar-brand .navbar-link:focus,.content kbd.navbar .navbar-brand .navbar-link:focus,.navbar.is-dark .navbar-brand .navbar-link:hover,.content kbd.navbar .navbar-brand .navbar-link:hover,.navbar.is-dark .navbar-brand .navbar-link.is-active,.content kbd.navbar .navbar-brand .navbar-link.is-active{background-color:#292929;color:#f5f5f5}.navbar.is-dark .navbar-brand .navbar-link::after,.content kbd.navbar .navbar-brand .navbar-link::after{border-color:#f5f5f5}.navbar.is-dark .navbar-burger,.content kbd.navbar .navbar-burger{color:#f5f5f5}@media screen and (min-width: 1056px){.navbar.is-dark .navbar-start>.navbar-item,.content kbd.navbar .navbar-start>.navbar-item,.navbar.is-dark .navbar-start .navbar-link,.content kbd.navbar .navbar-start .navbar-link,.navbar.is-dark 
.navbar-end>.navbar-item,.content kbd.navbar .navbar-end>.navbar-item,.navbar.is-dark .navbar-end .navbar-link,.content kbd.navbar .navbar-end .navbar-link{color:#f5f5f5}.navbar.is-dark .navbar-start>a.navbar-item:focus,.content kbd.navbar .navbar-start>a.navbar-item:focus,.navbar.is-dark .navbar-start>a.navbar-item:hover,.content kbd.navbar .navbar-start>a.navbar-item:hover,.navbar.is-dark .navbar-start>a.navbar-item.is-active,.content kbd.navbar .navbar-start>a.navbar-item.is-active,.navbar.is-dark .navbar-start .navbar-link:focus,.content kbd.navbar .navbar-start .navbar-link:focus,.navbar.is-dark .navbar-start .navbar-link:hover,.content kbd.navbar .navbar-start .navbar-link:hover,.navbar.is-dark .navbar-start .navbar-link.is-active,.content kbd.navbar .navbar-start .navbar-link.is-active,.navbar.is-dark .navbar-end>a.navbar-item:focus,.content kbd.navbar .navbar-end>a.navbar-item:focus,.navbar.is-dark .navbar-end>a.navbar-item:hover,.content kbd.navbar .navbar-end>a.navbar-item:hover,.navbar.is-dark .navbar-end>a.navbar-item.is-active,.content kbd.navbar .navbar-end>a.navbar-item.is-active,.navbar.is-dark .navbar-end .navbar-link:focus,.content kbd.navbar .navbar-end .navbar-link:focus,.navbar.is-dark .navbar-end .navbar-link:hover,.content kbd.navbar .navbar-end .navbar-link:hover,.navbar.is-dark .navbar-end .navbar-link.is-active,.content kbd.navbar .navbar-end .navbar-link.is-active{background-color:#292929;color:#f5f5f5}.navbar.is-dark .navbar-start .navbar-link::after,.content kbd.navbar .navbar-start .navbar-link::after,.navbar.is-dark .navbar-end .navbar-link::after,.content kbd.navbar .navbar-end .navbar-link::after{border-color:#f5f5f5}.navbar.is-dark .navbar-item.has-dropdown:focus .navbar-link,.content kbd.navbar .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-dark .navbar-item.has-dropdown:hover .navbar-link,.content kbd.navbar .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-dark .navbar-item.has-dropdown.is-active 
.navbar-link,.content kbd.navbar .navbar-item.has-dropdown.is-active .navbar-link{background-color:#292929;color:#f5f5f5}.navbar.is-dark .navbar-dropdown a.navbar-item.is-active,.content kbd.navbar .navbar-dropdown a.navbar-item.is-active{background-color:#363636;color:#f5f5f5}}.navbar.is-primary,.docstring>section>a.navbar.docs-sourcelink{background-color:#4eb5de;color:#fff}.navbar.is-primary .navbar-brand>.navbar-item,.docstring>section>a.navbar.docs-sourcelink .navbar-brand>.navbar-item,.navbar.is-primary .navbar-brand .navbar-link,.docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link{color:#fff}.navbar.is-primary .navbar-brand>a.navbar-item:focus,.docstring>section>a.navbar.docs-sourcelink .navbar-brand>a.navbar-item:focus,.navbar.is-primary .navbar-brand>a.navbar-item:hover,.docstring>section>a.navbar.docs-sourcelink .navbar-brand>a.navbar-item:hover,.navbar.is-primary .navbar-brand>a.navbar-item.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-brand>a.navbar-item.is-active,.navbar.is-primary .navbar-brand .navbar-link:focus,.docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link:focus,.navbar.is-primary .navbar-brand .navbar-link:hover,.docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link:hover,.navbar.is-primary .navbar-brand .navbar-link.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link.is-active{background-color:#39acda;color:#fff}.navbar.is-primary .navbar-brand .navbar-link::after,.docstring>section>a.navbar.docs-sourcelink .navbar-brand .navbar-link::after{border-color:#fff}.navbar.is-primary .navbar-burger,.docstring>section>a.navbar.docs-sourcelink .navbar-burger{color:#fff}@media screen and (min-width: 1056px){.navbar.is-primary .navbar-start>.navbar-item,.docstring>section>a.navbar.docs-sourcelink .navbar-start>.navbar-item,.navbar.is-primary .navbar-start .navbar-link,.docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link,.navbar.is-primary 
.navbar-end>.navbar-item,.docstring>section>a.navbar.docs-sourcelink .navbar-end>.navbar-item,.navbar.is-primary .navbar-end .navbar-link,.docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link{color:#fff}.navbar.is-primary .navbar-start>a.navbar-item:focus,.docstring>section>a.navbar.docs-sourcelink .navbar-start>a.navbar-item:focus,.navbar.is-primary .navbar-start>a.navbar-item:hover,.docstring>section>a.navbar.docs-sourcelink .navbar-start>a.navbar-item:hover,.navbar.is-primary .navbar-start>a.navbar-item.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-start>a.navbar-item.is-active,.navbar.is-primary .navbar-start .navbar-link:focus,.docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link:focus,.navbar.is-primary .navbar-start .navbar-link:hover,.docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link:hover,.navbar.is-primary .navbar-start .navbar-link.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link.is-active,.navbar.is-primary .navbar-end>a.navbar-item:focus,.docstring>section>a.navbar.docs-sourcelink .navbar-end>a.navbar-item:focus,.navbar.is-primary .navbar-end>a.navbar-item:hover,.docstring>section>a.navbar.docs-sourcelink .navbar-end>a.navbar-item:hover,.navbar.is-primary .navbar-end>a.navbar-item.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-end>a.navbar-item.is-active,.navbar.is-primary .navbar-end .navbar-link:focus,.docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link:focus,.navbar.is-primary .navbar-end .navbar-link:hover,.docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link:hover,.navbar.is-primary .navbar-end .navbar-link.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link.is-active{background-color:#39acda;color:#fff}.navbar.is-primary .navbar-start .navbar-link::after,.docstring>section>a.navbar.docs-sourcelink .navbar-start .navbar-link::after,.navbar.is-primary .navbar-end 
.navbar-link::after,.docstring>section>a.navbar.docs-sourcelink .navbar-end .navbar-link::after{border-color:#fff}.navbar.is-primary .navbar-item.has-dropdown:focus .navbar-link,.docstring>section>a.navbar.docs-sourcelink .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-primary .navbar-item.has-dropdown:hover .navbar-link,.docstring>section>a.navbar.docs-sourcelink .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-primary .navbar-item.has-dropdown.is-active .navbar-link,.docstring>section>a.navbar.docs-sourcelink .navbar-item.has-dropdown.is-active .navbar-link{background-color:#39acda;color:#fff}.navbar.is-primary .navbar-dropdown a.navbar-item.is-active,.docstring>section>a.navbar.docs-sourcelink .navbar-dropdown a.navbar-item.is-active{background-color:#4eb5de;color:#fff}}.navbar.is-link{background-color:#2e63b8;color:#fff}.navbar.is-link .navbar-brand>.navbar-item,.navbar.is-link .navbar-brand .navbar-link{color:#fff}.navbar.is-link .navbar-brand>a.navbar-item:focus,.navbar.is-link .navbar-brand>a.navbar-item:hover,.navbar.is-link .navbar-brand>a.navbar-item.is-active,.navbar.is-link .navbar-brand .navbar-link:focus,.navbar.is-link .navbar-brand .navbar-link:hover,.navbar.is-link .navbar-brand .navbar-link.is-active{background-color:#2958a4;color:#fff}.navbar.is-link .navbar-brand .navbar-link::after{border-color:#fff}.navbar.is-link .navbar-burger{color:#fff}@media screen and (min-width: 1056px){.navbar.is-link .navbar-start>.navbar-item,.navbar.is-link .navbar-start .navbar-link,.navbar.is-link .navbar-end>.navbar-item,.navbar.is-link .navbar-end .navbar-link{color:#fff}.navbar.is-link .navbar-start>a.navbar-item:focus,.navbar.is-link .navbar-start>a.navbar-item:hover,.navbar.is-link .navbar-start>a.navbar-item.is-active,.navbar.is-link .navbar-start .navbar-link:focus,.navbar.is-link .navbar-start .navbar-link:hover,.navbar.is-link .navbar-start .navbar-link.is-active,.navbar.is-link .navbar-end>a.navbar-item:focus,.navbar.is-link 
.navbar-end>a.navbar-item:hover,.navbar.is-link .navbar-end>a.navbar-item.is-active,.navbar.is-link .navbar-end .navbar-link:focus,.navbar.is-link .navbar-end .navbar-link:hover,.navbar.is-link .navbar-end .navbar-link.is-active{background-color:#2958a4;color:#fff}.navbar.is-link .navbar-start .navbar-link::after,.navbar.is-link .navbar-end .navbar-link::after{border-color:#fff}.navbar.is-link .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-link .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-link .navbar-item.has-dropdown.is-active .navbar-link{background-color:#2958a4;color:#fff}.navbar.is-link .navbar-dropdown a.navbar-item.is-active{background-color:#2e63b8;color:#fff}}.navbar.is-info{background-color:#209cee;color:#fff}.navbar.is-info .navbar-brand>.navbar-item,.navbar.is-info .navbar-brand .navbar-link{color:#fff}.navbar.is-info .navbar-brand>a.navbar-item:focus,.navbar.is-info .navbar-brand>a.navbar-item:hover,.navbar.is-info .navbar-brand>a.navbar-item.is-active,.navbar.is-info .navbar-brand .navbar-link:focus,.navbar.is-info .navbar-brand .navbar-link:hover,.navbar.is-info .navbar-brand .navbar-link.is-active{background-color:#1190e3;color:#fff}.navbar.is-info .navbar-brand .navbar-link::after{border-color:#fff}.navbar.is-info .navbar-burger{color:#fff}@media screen and (min-width: 1056px){.navbar.is-info .navbar-start>.navbar-item,.navbar.is-info .navbar-start .navbar-link,.navbar.is-info .navbar-end>.navbar-item,.navbar.is-info .navbar-end .navbar-link{color:#fff}.navbar.is-info .navbar-start>a.navbar-item:focus,.navbar.is-info .navbar-start>a.navbar-item:hover,.navbar.is-info .navbar-start>a.navbar-item.is-active,.navbar.is-info .navbar-start .navbar-link:focus,.navbar.is-info .navbar-start .navbar-link:hover,.navbar.is-info .navbar-start .navbar-link.is-active,.navbar.is-info .navbar-end>a.navbar-item:focus,.navbar.is-info .navbar-end>a.navbar-item:hover,.navbar.is-info .navbar-end>a.navbar-item.is-active,.navbar.is-info .navbar-end 
.navbar-link:focus,.navbar.is-info .navbar-end .navbar-link:hover,.navbar.is-info .navbar-end .navbar-link.is-active{background-color:#1190e3;color:#fff}.navbar.is-info .navbar-start .navbar-link::after,.navbar.is-info .navbar-end .navbar-link::after{border-color:#fff}.navbar.is-info .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-info .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-info .navbar-item.has-dropdown.is-active .navbar-link{background-color:#1190e3;color:#fff}.navbar.is-info .navbar-dropdown a.navbar-item.is-active{background-color:#209cee;color:#fff}}.navbar.is-success{background-color:#22c35b;color:#fff}.navbar.is-success .navbar-brand>.navbar-item,.navbar.is-success .navbar-brand .navbar-link{color:#fff}.navbar.is-success .navbar-brand>a.navbar-item:focus,.navbar.is-success .navbar-brand>a.navbar-item:hover,.navbar.is-success .navbar-brand>a.navbar-item.is-active,.navbar.is-success .navbar-brand .navbar-link:focus,.navbar.is-success .navbar-brand .navbar-link:hover,.navbar.is-success .navbar-brand .navbar-link.is-active{background-color:#1ead51;color:#fff}.navbar.is-success .navbar-brand .navbar-link::after{border-color:#fff}.navbar.is-success .navbar-burger{color:#fff}@media screen and (min-width: 1056px){.navbar.is-success .navbar-start>.navbar-item,.navbar.is-success .navbar-start .navbar-link,.navbar.is-success .navbar-end>.navbar-item,.navbar.is-success .navbar-end .navbar-link{color:#fff}.navbar.is-success .navbar-start>a.navbar-item:focus,.navbar.is-success .navbar-start>a.navbar-item:hover,.navbar.is-success .navbar-start>a.navbar-item.is-active,.navbar.is-success .navbar-start .navbar-link:focus,.navbar.is-success .navbar-start .navbar-link:hover,.navbar.is-success .navbar-start .navbar-link.is-active,.navbar.is-success .navbar-end>a.navbar-item:focus,.navbar.is-success .navbar-end>a.navbar-item:hover,.navbar.is-success .navbar-end>a.navbar-item.is-active,.navbar.is-success .navbar-end .navbar-link:focus,.navbar.is-success 
.navbar-end .navbar-link:hover,.navbar.is-success .navbar-end .navbar-link.is-active{background-color:#1ead51;color:#fff}.navbar.is-success .navbar-start .navbar-link::after,.navbar.is-success .navbar-end .navbar-link::after{border-color:#fff}.navbar.is-success .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-success .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-success .navbar-item.has-dropdown.is-active .navbar-link{background-color:#1ead51;color:#fff}.navbar.is-success .navbar-dropdown a.navbar-item.is-active{background-color:#22c35b;color:#fff}}.navbar.is-warning{background-color:#ffdd57;color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-brand>.navbar-item,.navbar.is-warning .navbar-brand .navbar-link{color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-brand>a.navbar-item:focus,.navbar.is-warning .navbar-brand>a.navbar-item:hover,.navbar.is-warning .navbar-brand>a.navbar-item.is-active,.navbar.is-warning .navbar-brand .navbar-link:focus,.navbar.is-warning .navbar-brand .navbar-link:hover,.navbar.is-warning .navbar-brand .navbar-link.is-active{background-color:#ffd83e;color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-brand .navbar-link::after{border-color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-burger{color:rgba(0,0,0,0.7)}@media screen and (min-width: 1056px){.navbar.is-warning .navbar-start>.navbar-item,.navbar.is-warning .navbar-start .navbar-link,.navbar.is-warning .navbar-end>.navbar-item,.navbar.is-warning .navbar-end .navbar-link{color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-start>a.navbar-item:focus,.navbar.is-warning .navbar-start>a.navbar-item:hover,.navbar.is-warning .navbar-start>a.navbar-item.is-active,.navbar.is-warning .navbar-start .navbar-link:focus,.navbar.is-warning .navbar-start .navbar-link:hover,.navbar.is-warning .navbar-start .navbar-link.is-active,.navbar.is-warning .navbar-end>a.navbar-item:focus,.navbar.is-warning .navbar-end>a.navbar-item:hover,.navbar.is-warning .navbar-end>a.navbar-item.is-active,.navbar.is-warning 
.navbar-end .navbar-link:focus,.navbar.is-warning .navbar-end .navbar-link:hover,.navbar.is-warning .navbar-end .navbar-link.is-active{background-color:#ffd83e;color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-start .navbar-link::after,.navbar.is-warning .navbar-end .navbar-link::after{border-color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-warning .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-warning .navbar-item.has-dropdown.is-active .navbar-link{background-color:#ffd83e;color:rgba(0,0,0,0.7)}.navbar.is-warning .navbar-dropdown a.navbar-item.is-active{background-color:#ffdd57;color:rgba(0,0,0,0.7)}}.navbar.is-danger{background-color:#da0b00;color:#fff}.navbar.is-danger .navbar-brand>.navbar-item,.navbar.is-danger .navbar-brand .navbar-link{color:#fff}.navbar.is-danger .navbar-brand>a.navbar-item:focus,.navbar.is-danger .navbar-brand>a.navbar-item:hover,.navbar.is-danger .navbar-brand>a.navbar-item.is-active,.navbar.is-danger .navbar-brand .navbar-link:focus,.navbar.is-danger .navbar-brand .navbar-link:hover,.navbar.is-danger .navbar-brand .navbar-link.is-active{background-color:#c10a00;color:#fff}.navbar.is-danger .navbar-brand .navbar-link::after{border-color:#fff}.navbar.is-danger .navbar-burger{color:#fff}@media screen and (min-width: 1056px){.navbar.is-danger .navbar-start>.navbar-item,.navbar.is-danger .navbar-start .navbar-link,.navbar.is-danger .navbar-end>.navbar-item,.navbar.is-danger .navbar-end .navbar-link{color:#fff}.navbar.is-danger .navbar-start>a.navbar-item:focus,.navbar.is-danger .navbar-start>a.navbar-item:hover,.navbar.is-danger .navbar-start>a.navbar-item.is-active,.navbar.is-danger .navbar-start .navbar-link:focus,.navbar.is-danger .navbar-start .navbar-link:hover,.navbar.is-danger .navbar-start .navbar-link.is-active,.navbar.is-danger .navbar-end>a.navbar-item:focus,.navbar.is-danger .navbar-end>a.navbar-item:hover,.navbar.is-danger 
.navbar-end>a.navbar-item.is-active,.navbar.is-danger .navbar-end .navbar-link:focus,.navbar.is-danger .navbar-end .navbar-link:hover,.navbar.is-danger .navbar-end .navbar-link.is-active{background-color:#c10a00;color:#fff}.navbar.is-danger .navbar-start .navbar-link::after,.navbar.is-danger .navbar-end .navbar-link::after{border-color:#fff}.navbar.is-danger .navbar-item.has-dropdown:focus .navbar-link,.navbar.is-danger .navbar-item.has-dropdown:hover .navbar-link,.navbar.is-danger .navbar-item.has-dropdown.is-active .navbar-link{background-color:#c10a00;color:#fff}.navbar.is-danger .navbar-dropdown a.navbar-item.is-active{background-color:#da0b00;color:#fff}}.navbar>.container{align-items:stretch;display:flex;min-height:3.25rem;width:100%}.navbar.has-shadow{box-shadow:0 2px 0 0 #f5f5f5}.navbar.is-fixed-bottom,.navbar.is-fixed-top{left:0;position:fixed;right:0;z-index:30}.navbar.is-fixed-bottom{bottom:0}.navbar.is-fixed-bottom.has-shadow{box-shadow:0 -2px 0 0 #f5f5f5}.navbar.is-fixed-top{top:0}html.has-navbar-fixed-top,body.has-navbar-fixed-top{padding-top:3.25rem}html.has-navbar-fixed-bottom,body.has-navbar-fixed-bottom{padding-bottom:3.25rem}.navbar-brand,.navbar-tabs{align-items:stretch;display:flex;flex-shrink:0;min-height:3.25rem}.navbar-brand a.navbar-item:focus,.navbar-brand a.navbar-item:hover{background-color:transparent}.navbar-tabs{-webkit-overflow-scrolling:touch;max-width:100vw;overflow-x:auto;overflow-y:hidden}.navbar-burger{color:#4a4a4a;cursor:pointer;display:block;height:3.25rem;position:relative;width:3.25rem;margin-left:auto}.navbar-burger span{background-color:currentColor;display:block;height:1px;left:calc(50% - 8px);position:absolute;transform-origin:center;transition-duration:86ms;transition-property:background-color, opacity, transform;transition-timing-function:ease-out;width:16px}.navbar-burger span:nth-child(1){top:calc(50% - 6px)}.navbar-burger span:nth-child(2){top:calc(50% - 1px)}.navbar-burger span:nth-child(3){top:calc(50% + 
4px)}.navbar-burger:hover{background-color:rgba(0,0,0,0.05)}.navbar-burger.is-active span:nth-child(1){transform:translateY(5px) rotate(45deg)}.navbar-burger.is-active span:nth-child(2){opacity:0}.navbar-burger.is-active span:nth-child(3){transform:translateY(-5px) rotate(-45deg)}.navbar-menu{display:none}.navbar-item,.navbar-link{color:#4a4a4a;display:block;line-height:1.5;padding:0.5rem 0.75rem;position:relative}.navbar-item .icon:only-child,.navbar-link .icon:only-child{margin-left:-0.25rem;margin-right:-0.25rem}a.navbar-item,.navbar-link{cursor:pointer}a.navbar-item:focus,a.navbar-item:focus-within,a.navbar-item:hover,a.navbar-item.is-active,.navbar-link:focus,.navbar-link:focus-within,.navbar-link:hover,.navbar-link.is-active{background-color:#fafafa;color:#2e63b8}.navbar-item{display:block;flex-grow:0;flex-shrink:0}.navbar-item img{max-height:1.75rem}.navbar-item.has-dropdown{padding:0}.navbar-item.is-expanded{flex-grow:1;flex-shrink:1}.navbar-item.is-tab{border-bottom:1px solid transparent;min-height:3.25rem;padding-bottom:calc(0.5rem - 1px)}.navbar-item.is-tab:focus,.navbar-item.is-tab:hover{background-color:rgba(0,0,0,0);border-bottom-color:#2e63b8}.navbar-item.is-tab.is-active{background-color:rgba(0,0,0,0);border-bottom-color:#2e63b8;border-bottom-style:solid;border-bottom-width:3px;color:#2e63b8;padding-bottom:calc(0.5rem - 3px)}.navbar-content{flex-grow:1;flex-shrink:1}.navbar-link:not(.is-arrowless){padding-right:2.5em}.navbar-link:not(.is-arrowless)::after{border-color:#2e63b8;margin-top:-0.375em;right:1.125em}.navbar-dropdown{font-size:0.875rem;padding-bottom:0.5rem;padding-top:0.5rem}.navbar-dropdown .navbar-item{padding-left:1.5rem;padding-right:1.5rem}.navbar-divider{background-color:#f5f5f5;border:none;display:none;height:2px;margin:0.5rem 0}@media screen and (max-width: 1055px){.navbar>.container{display:block}.navbar-brand .navbar-item,.navbar-tabs 
.navbar-item{align-items:center;display:flex}.navbar-link::after{display:none}.navbar-menu{background-color:#fff;box-shadow:0 8px 16px rgba(10,10,10,0.1);padding:0.5rem 0}.navbar-menu.is-active{display:block}.navbar.is-fixed-bottom-touch,.navbar.is-fixed-top-touch{left:0;position:fixed;right:0;z-index:30}.navbar.is-fixed-bottom-touch{bottom:0}.navbar.is-fixed-bottom-touch.has-shadow{box-shadow:0 -2px 3px rgba(10,10,10,0.1)}.navbar.is-fixed-top-touch{top:0}.navbar.is-fixed-top .navbar-menu,.navbar.is-fixed-top-touch .navbar-menu{-webkit-overflow-scrolling:touch;max-height:calc(100vh - 3.25rem);overflow:auto}html.has-navbar-fixed-top-touch,body.has-navbar-fixed-top-touch{padding-top:3.25rem}html.has-navbar-fixed-bottom-touch,body.has-navbar-fixed-bottom-touch{padding-bottom:3.25rem}}@media screen and (min-width: 1056px){.navbar,.navbar-menu,.navbar-start,.navbar-end{align-items:stretch;display:flex}.navbar{min-height:3.25rem}.navbar.is-spaced{padding:1rem 2rem}.navbar.is-spaced .navbar-start,.navbar.is-spaced .navbar-end{align-items:center}.navbar.is-spaced a.navbar-item,.navbar.is-spaced .navbar-link{border-radius:4px}.navbar.is-transparent a.navbar-item:focus,.navbar.is-transparent a.navbar-item:hover,.navbar.is-transparent a.navbar-item.is-active,.navbar.is-transparent .navbar-link:focus,.navbar.is-transparent .navbar-link:hover,.navbar.is-transparent .navbar-link.is-active{background-color:transparent !important}.navbar.is-transparent .navbar-item.has-dropdown.is-active .navbar-link,.navbar.is-transparent .navbar-item.has-dropdown.is-hoverable:focus .navbar-link,.navbar.is-transparent .navbar-item.has-dropdown.is-hoverable:focus-within .navbar-link,.navbar.is-transparent .navbar-item.has-dropdown.is-hoverable:hover .navbar-link{background-color:transparent !important}.navbar.is-transparent .navbar-dropdown a.navbar-item:focus,.navbar.is-transparent .navbar-dropdown a.navbar-item:hover{background-color:#f5f5f5;color:#0a0a0a}.navbar.is-transparent .navbar-dropdown 
a.navbar-item.is-active{background-color:#f5f5f5;color:#2e63b8}.navbar-burger{display:none}.navbar-item,.navbar-link{align-items:center;display:flex}.navbar-item{display:flex}.navbar-item.has-dropdown{align-items:stretch}.navbar-item.has-dropdown-up .navbar-link::after{transform:rotate(135deg) translate(0.25em, -0.25em)}.navbar-item.has-dropdown-up .navbar-dropdown{border-bottom:2px solid #dbdbdb;border-radius:6px 6px 0 0;border-top:none;bottom:100%;box-shadow:0 -8px 8px rgba(10,10,10,0.1);top:auto}.navbar-item.is-active .navbar-dropdown,.navbar-item.is-hoverable:focus .navbar-dropdown,.navbar-item.is-hoverable:focus-within .navbar-dropdown,.navbar-item.is-hoverable:hover .navbar-dropdown{display:block}.navbar.is-spaced .navbar-item.is-active .navbar-dropdown,.navbar-item.is-active .navbar-dropdown.is-boxed,.navbar.is-spaced .navbar-item.is-hoverable:focus .navbar-dropdown,.navbar-item.is-hoverable:focus .navbar-dropdown.is-boxed,.navbar.is-spaced .navbar-item.is-hoverable:focus-within .navbar-dropdown,.navbar-item.is-hoverable:focus-within .navbar-dropdown.is-boxed,.navbar.is-spaced .navbar-item.is-hoverable:hover .navbar-dropdown,.navbar-item.is-hoverable:hover .navbar-dropdown.is-boxed{opacity:1;pointer-events:auto;transform:translateY(0)}.navbar-menu{flex-grow:1;flex-shrink:0}.navbar-start{justify-content:flex-start;margin-right:auto}.navbar-end{justify-content:flex-end;margin-left:auto}.navbar-dropdown{background-color:#fff;border-bottom-left-radius:6px;border-bottom-right-radius:6px;border-top:2px solid #dbdbdb;box-shadow:0 8px 8px rgba(10,10,10,0.1);display:none;font-size:0.875rem;left:0;min-width:100%;position:absolute;top:100%;z-index:20}.navbar-dropdown .navbar-item{padding:0.375rem 1rem;white-space:nowrap}.navbar-dropdown a.navbar-item{padding-right:3rem}.navbar-dropdown a.navbar-item:focus,.navbar-dropdown a.navbar-item:hover{background-color:#f5f5f5;color:#0a0a0a}.navbar-dropdown 
a.navbar-item.is-active{background-color:#f5f5f5;color:#2e63b8}.navbar.is-spaced .navbar-dropdown,.navbar-dropdown.is-boxed{border-radius:6px;border-top:none;box-shadow:0 8px 8px rgba(10,10,10,0.1), 0 0 0 1px rgba(10,10,10,0.1);display:block;opacity:0;pointer-events:none;top:calc(100% + (-4px));transform:translateY(-5px);transition-duration:86ms;transition-property:opacity, transform}.navbar-dropdown.is-right{left:auto;right:0}.navbar-divider{display:block}.navbar>.container .navbar-brand,.container>.navbar .navbar-brand{margin-left:-.75rem}.navbar>.container .navbar-menu,.container>.navbar .navbar-menu{margin-right:-.75rem}.navbar.is-fixed-bottom-desktop,.navbar.is-fixed-top-desktop{left:0;position:fixed;right:0;z-index:30}.navbar.is-fixed-bottom-desktop{bottom:0}.navbar.is-fixed-bottom-desktop.has-shadow{box-shadow:0 -2px 3px rgba(10,10,10,0.1)}.navbar.is-fixed-top-desktop{top:0}html.has-navbar-fixed-top-desktop,body.has-navbar-fixed-top-desktop{padding-top:3.25rem}html.has-navbar-fixed-bottom-desktop,body.has-navbar-fixed-bottom-desktop{padding-bottom:3.25rem}html.has-spaced-navbar-fixed-top,body.has-spaced-navbar-fixed-top{padding-top:5.25rem}html.has-spaced-navbar-fixed-bottom,body.has-spaced-navbar-fixed-bottom{padding-bottom:5.25rem}a.navbar-item.is-active,.navbar-link.is-active{color:#0a0a0a}a.navbar-item.is-active:not(:focus):not(:hover),.navbar-link.is-active:not(:focus):not(:hover){background-color:rgba(0,0,0,0)}.navbar-item.has-dropdown:focus .navbar-link,.navbar-item.has-dropdown:hover .navbar-link,.navbar-item.has-dropdown.is-active .navbar-link{background-color:#fafafa}}.hero.is-fullheight-with-navbar{min-height:calc(100vh - 3.25rem)}.pagination{font-size:1rem;margin:-.25rem}.pagination.is-small,#documenter .docs-sidebar form.docs-search>input.pagination{font-size:.75rem}.pagination.is-medium{font-size:1.25rem}.pagination.is-large{font-size:1.5rem}.pagination.is-rounded .pagination-previous,#documenter .docs-sidebar form.docs-search>input.pagination 
.pagination-previous,.pagination.is-rounded .pagination-next,#documenter .docs-sidebar form.docs-search>input.pagination .pagination-next{padding-left:1em;padding-right:1em;border-radius:290486px}.pagination.is-rounded .pagination-link,#documenter .docs-sidebar form.docs-search>input.pagination .pagination-link{border-radius:290486px}.pagination,.pagination-list{align-items:center;display:flex;justify-content:center;text-align:center}.pagination-previous,.pagination-next,.pagination-link,.pagination-ellipsis{font-size:1em;justify-content:center;margin:.25rem;padding-left:.5em;padding-right:.5em;text-align:center}.pagination-previous,.pagination-next,.pagination-link{border-color:#dbdbdb;color:#363636;min-width:2.25em}.pagination-previous:hover,.pagination-next:hover,.pagination-link:hover{border-color:#b5b5b5;color:#363636}.pagination-previous:focus,.pagination-next:focus,.pagination-link:focus{border-color:#3c5dcd}.pagination-previous:active,.pagination-next:active,.pagination-link:active{box-shadow:inset 0 1px 2px rgba(10,10,10,0.2)}.pagination-previous[disabled],.pagination-next[disabled],.pagination-link[disabled]{background-color:#dbdbdb;border-color:#dbdbdb;box-shadow:none;color:#6b6b6b;opacity:0.5}.pagination-previous,.pagination-next{padding-left:0.75em;padding-right:0.75em;white-space:nowrap}.pagination-link.is-current{background-color:#2e63b8;border-color:#2e63b8;color:#fff}.pagination-ellipsis{color:#b5b5b5;pointer-events:none}.pagination-list{flex-wrap:wrap}@media screen and (max-width: 768px){.pagination{flex-wrap:wrap}.pagination-previous,.pagination-next{flex-grow:1;flex-shrink:1}.pagination-list li{flex-grow:1;flex-shrink:1}}@media screen and (min-width: 769px),print{.pagination-list{flex-grow:1;flex-shrink:1;justify-content:flex-start;order:1}.pagination-previous{order:2}.pagination-next{order:3}.pagination{justify-content:space-between}.pagination.is-centered .pagination-previous{order:1}.pagination.is-centered 
.pagination-list{justify-content:center;order:2}.pagination.is-centered .pagination-next{order:3}.pagination.is-right .pagination-previous{order:1}.pagination.is-right .pagination-next{order:2}.pagination.is-right .pagination-list{justify-content:flex-end;order:3}}.panel{font-size:1rem}.panel:not(:last-child){margin-bottom:1.5rem}.panel-heading,.panel-tabs,.panel-block{border-bottom:1px solid #dbdbdb;border-left:1px solid #dbdbdb;border-right:1px solid #dbdbdb}.panel-heading:first-child,.panel-tabs:first-child,.panel-block:first-child{border-top:1px solid #dbdbdb}.panel-heading{background-color:#f5f5f5;border-radius:4px 4px 0 0;color:#222;font-size:1.25em;font-weight:300;line-height:1.25;padding:0.5em 0.75em}.panel-tabs{align-items:flex-end;display:flex;font-size:.875em;justify-content:center}.panel-tabs a{border-bottom:1px solid #dbdbdb;margin-bottom:-1px;padding:0.5em}.panel-tabs a.is-active{border-bottom-color:#4a4a4a;color:#363636}.panel-list a{color:#222}.panel-list a:hover{color:#2e63b8}.panel-block{align-items:center;color:#222;display:flex;justify-content:flex-start;padding:0.5em 0.75em}.panel-block input[type="checkbox"]{margin-right:0.75em}.panel-block>.control{flex-grow:1;flex-shrink:1;width:100%}.panel-block.is-wrapped{flex-wrap:wrap}.panel-block.is-active{border-left-color:#2e63b8;color:#363636}.panel-block.is-active .panel-icon{color:#2e63b8}a.panel-block,label.panel-block{cursor:pointer}a.panel-block:hover,label.panel-block:hover{background-color:#f5f5f5}.panel-icon{display:inline-block;font-size:14px;height:1em;line-height:1em;text-align:center;vertical-align:top;width:1em;color:#6b6b6b;margin-right:0.75em}.panel-icon .fa{font-size:inherit;line-height:inherit}.tabs{-webkit-overflow-scrolling:touch;align-items:stretch;display:flex;font-size:1rem;justify-content:space-between;overflow:hidden;overflow-x:auto;white-space:nowrap}.tabs 
a{align-items:center;border-bottom-color:#dbdbdb;border-bottom-style:solid;border-bottom-width:1px;color:#222;display:flex;justify-content:center;margin-bottom:-1px;padding:0.5em 1em;vertical-align:top}.tabs a:hover{border-bottom-color:#222;color:#222}.tabs li{display:block}.tabs li.is-active a{border-bottom-color:#2e63b8;color:#2e63b8}.tabs ul{align-items:center;border-bottom-color:#dbdbdb;border-bottom-style:solid;border-bottom-width:1px;display:flex;flex-grow:1;flex-shrink:0;justify-content:flex-start}.tabs ul.is-left{padding-right:0.75em}.tabs ul.is-center{flex:none;justify-content:center;padding-left:0.75em;padding-right:0.75em}.tabs ul.is-right{justify-content:flex-end;padding-left:0.75em}.tabs .icon:first-child{margin-right:0.5em}.tabs .icon:last-child{margin-left:0.5em}.tabs.is-centered ul{justify-content:center}.tabs.is-right ul{justify-content:flex-end}.tabs.is-boxed a{border:1px solid transparent;border-radius:4px 4px 0 0}.tabs.is-boxed a:hover{background-color:#f5f5f5;border-bottom-color:#dbdbdb}.tabs.is-boxed li.is-active a{background-color:#fff;border-color:#dbdbdb;border-bottom-color:rgba(0,0,0,0) !important}.tabs.is-fullwidth li{flex-grow:1;flex-shrink:0}.tabs.is-toggle a{border-color:#dbdbdb;border-style:solid;border-width:1px;margin-bottom:0;position:relative}.tabs.is-toggle a:hover{background-color:#f5f5f5;border-color:#b5b5b5;z-index:2}.tabs.is-toggle li+li{margin-left:-1px}.tabs.is-toggle li:first-child a{border-radius:4px 0 0 4px}.tabs.is-toggle li:last-child a{border-radius:0 4px 4px 0}.tabs.is-toggle li.is-active a{background-color:#2e63b8;border-color:#2e63b8;color:#fff;z-index:1}.tabs.is-toggle ul{border-bottom:none}.tabs.is-toggle.is-toggle-rounded li:first-child a{border-bottom-left-radius:290486px;border-top-left-radius:290486px;padding-left:1.25em}.tabs.is-toggle.is-toggle-rounded li:last-child a{border-bottom-right-radius:290486px;border-top-right-radius:290486px;padding-right:1.25em}.tabs.is-small,#documenter .docs-sidebar 
form.docs-search>input.tabs{font-size:.75rem}.tabs.is-medium{font-size:1.25rem}.tabs.is-large{font-size:1.5rem}.column{display:block;flex-basis:0;flex-grow:1;flex-shrink:1;padding:.75rem}.columns.is-mobile>.column.is-narrow{flex:none}.columns.is-mobile>.column.is-full{flex:none;width:100%}.columns.is-mobile>.column.is-three-quarters{flex:none;width:75%}.columns.is-mobile>.column.is-two-thirds{flex:none;width:66.6666%}.columns.is-mobile>.column.is-half{flex:none;width:50%}.columns.is-mobile>.column.is-one-third{flex:none;width:33.3333%}.columns.is-mobile>.column.is-one-quarter{flex:none;width:25%}.columns.is-mobile>.column.is-one-fifth{flex:none;width:20%}.columns.is-mobile>.column.is-two-fifths{flex:none;width:40%}.columns.is-mobile>.column.is-three-fifths{flex:none;width:60%}.columns.is-mobile>.column.is-four-fifths{flex:none;width:80%}.columns.is-mobile>.column.is-offset-three-quarters{margin-left:75%}.columns.is-mobile>.column.is-offset-two-thirds{margin-left:66.6666%}.columns.is-mobile>.column.is-offset-half{margin-left:50%}.columns.is-mobile>.column.is-offset-one-third{margin-left:33.3333%}.columns.is-mobile>.column.is-offset-one-quarter{margin-left:25%}.columns.is-mobile>.column.is-offset-one-fifth{margin-left:20%}.columns.is-mobile>.column.is-offset-two-fifths{margin-left:40%}.columns.is-mobile>.column.is-offset-three-fifths{margin-left:60%}.columns.is-mobile>.column.is-offset-four-fifths{margin-left:80%}.columns.is-mobile>.column.is-0{flex:none;width:0%}.columns.is-mobile>.column.is-offset-0{margin-left:0%}.columns.is-mobile>.column.is-1{flex:none;width:8.3333333333%}.columns.is-mobile>.column.is-offset-1{margin-left:8.3333333333%}.columns.is-mobile>.column.is-2{flex:none;width:16.6666666667%}.columns.is-mobile>.column.is-offset-2{margin-left:16.6666666667%}.columns.is-mobile>.column.is-3{flex:none;width:25%}.columns.is-mobile>.column.is-offset-3{margin-left:25%}.columns.is-mobile>.column.is-4{flex:none;width:33.3333333333%}.columns.is-mobile>.column.is-offs
et-4{margin-left:33.3333333333%}.columns.is-mobile>.column.is-5{flex:none;width:41.6666666667%}.columns.is-mobile>.column.is-offset-5{margin-left:41.6666666667%}.columns.is-mobile>.column.is-6{flex:none;width:50%}.columns.is-mobile>.column.is-offset-6{margin-left:50%}.columns.is-mobile>.column.is-7{flex:none;width:58.3333333333%}.columns.is-mobile>.column.is-offset-7{margin-left:58.3333333333%}.columns.is-mobile>.column.is-8{flex:none;width:66.6666666667%}.columns.is-mobile>.column.is-offset-8{margin-left:66.6666666667%}.columns.is-mobile>.column.is-9{flex:none;width:75%}.columns.is-mobile>.column.is-offset-9{margin-left:75%}.columns.is-mobile>.column.is-10{flex:none;width:83.3333333333%}.columns.is-mobile>.column.is-offset-10{margin-left:83.3333333333%}.columns.is-mobile>.column.is-11{flex:none;width:91.6666666667%}.columns.is-mobile>.column.is-offset-11{margin-left:91.6666666667%}.columns.is-mobile>.column.is-12{flex:none;width:100%}.columns.is-mobile>.column.is-offset-12{margin-left:100%}@media screen and (max-width: 
768px){.column.is-narrow-mobile{flex:none}.column.is-full-mobile{flex:none;width:100%}.column.is-three-quarters-mobile{flex:none;width:75%}.column.is-two-thirds-mobile{flex:none;width:66.6666%}.column.is-half-mobile{flex:none;width:50%}.column.is-one-third-mobile{flex:none;width:33.3333%}.column.is-one-quarter-mobile{flex:none;width:25%}.column.is-one-fifth-mobile{flex:none;width:20%}.column.is-two-fifths-mobile{flex:none;width:40%}.column.is-three-fifths-mobile{flex:none;width:60%}.column.is-four-fifths-mobile{flex:none;width:80%}.column.is-offset-three-quarters-mobile{margin-left:75%}.column.is-offset-two-thirds-mobile{margin-left:66.6666%}.column.is-offset-half-mobile{margin-left:50%}.column.is-offset-one-third-mobile{margin-left:33.3333%}.column.is-offset-one-quarter-mobile{margin-left:25%}.column.is-offset-one-fifth-mobile{margin-left:20%}.column.is-offset-two-fifths-mobile{margin-left:40%}.column.is-offset-three-fifths-mobile{margin-left:60%}.column.is-offset-four-fifths-mobile{margin-left:80%}.column.is-0-mobile{flex:none;width:0%}.column.is-offset-0-mobile{margin-left:0%}.column.is-1-mobile{flex:none;width:8.3333333333%}.column.is-offset-1-mobile{margin-left:8.3333333333%}.column.is-2-mobile{flex:none;width:16.6666666667%}.column.is-offset-2-mobile{margin-left:16.6666666667%}.column.is-3-mobile{flex:none;width:25%}.column.is-offset-3-mobile{margin-left:25%}.column.is-4-mobile{flex:none;width:33.3333333333%}.column.is-offset-4-mobile{margin-left:33.3333333333%}.column.is-5-mobile{flex:none;width:41.6666666667%}.column.is-offset-5-mobile{margin-left:41.6666666667%}.column.is-6-mobile{flex:none;width:50%}.column.is-offset-6-mobile{margin-left:50%}.column.is-7-mobile{flex:none;width:58.3333333333%}.column.is-offset-7-mobile{margin-left:58.3333333333%}.column.is-8-mobile{flex:none;width:66.6666666667%}.column.is-offset-8-mobile{margin-left:66.6666666667%}.column.is-9-mobile{flex:none;width:75%}.column.is-offset-9-mobile{margin-left:75%}.column.is-10-mobile{flex:n
one;width:83.3333333333%}.column.is-offset-10-mobile{margin-left:83.3333333333%}.column.is-11-mobile{flex:none;width:91.6666666667%}.column.is-offset-11-mobile{margin-left:91.6666666667%}.column.is-12-mobile{flex:none;width:100%}.column.is-offset-12-mobile{margin-left:100%}}@media screen and (min-width: 769px),print{.column.is-narrow,.column.is-narrow-tablet{flex:none}.column.is-full,.column.is-full-tablet{flex:none;width:100%}.column.is-three-quarters,.column.is-three-quarters-tablet{flex:none;width:75%}.column.is-two-thirds,.column.is-two-thirds-tablet{flex:none;width:66.6666%}.column.is-half,.column.is-half-tablet{flex:none;width:50%}.column.is-one-third,.column.is-one-third-tablet{flex:none;width:33.3333%}.column.is-one-quarter,.column.is-one-quarter-tablet{flex:none;width:25%}.column.is-one-fifth,.column.is-one-fifth-tablet{flex:none;width:20%}.column.is-two-fifths,.column.is-two-fifths-tablet{flex:none;width:40%}.column.is-three-fifths,.column.is-three-fifths-tablet{flex:none;width:60%}.column.is-four-fifths,.column.is-four-fifths-tablet{flex:none;width:80%}.column.is-offset-three-quarters,.column.is-offset-three-quarters-tablet{margin-left:75%}.column.is-offset-two-thirds,.column.is-offset-two-thirds-tablet{margin-left:66.6666%}.column.is-offset-half,.column.is-offset-half-tablet{margin-left:50%}.column.is-offset-one-third,.column.is-offset-one-third-tablet{margin-left:33.3333%}.column.is-offset-one-quarter,.column.is-offset-one-quarter-tablet{margin-left:25%}.column.is-offset-one-fifth,.column.is-offset-one-fifth-tablet{margin-left:20%}.column.is-offset-two-fifths,.column.is-offset-two-fifths-tablet{margin-left:40%}.column.is-offset-three-fifths,.column.is-offset-three-fifths-tablet{margin-left:60%}.column.is-offset-four-fifths,.column.is-offset-four-fifths-tablet{margin-left:80%}.column.is-0,.column.is-0-tablet{flex:none;width:0%}.column.is-offset-0,.column.is-offset-0-tablet{margin-left:0%}.column.is-1,.column.is-1-tablet{flex:none;width:8.3333333333%}.col
umn.is-offset-1,.column.is-offset-1-tablet{margin-left:8.3333333333%}.column.is-2,.column.is-2-tablet{flex:none;width:16.6666666667%}.column.is-offset-2,.column.is-offset-2-tablet{margin-left:16.6666666667%}.column.is-3,.column.is-3-tablet{flex:none;width:25%}.column.is-offset-3,.column.is-offset-3-tablet{margin-left:25%}.column.is-4,.column.is-4-tablet{flex:none;width:33.3333333333%}.column.is-offset-4,.column.is-offset-4-tablet{margin-left:33.3333333333%}.column.is-5,.column.is-5-tablet{flex:none;width:41.6666666667%}.column.is-offset-5,.column.is-offset-5-tablet{margin-left:41.6666666667%}.column.is-6,.column.is-6-tablet{flex:none;width:50%}.column.is-offset-6,.column.is-offset-6-tablet{margin-left:50%}.column.is-7,.column.is-7-tablet{flex:none;width:58.3333333333%}.column.is-offset-7,.column.is-offset-7-tablet{margin-left:58.3333333333%}.column.is-8,.column.is-8-tablet{flex:none;width:66.6666666667%}.column.is-offset-8,.column.is-offset-8-tablet{margin-left:66.6666666667%}.column.is-9,.column.is-9-tablet{flex:none;width:75%}.column.is-offset-9,.column.is-offset-9-tablet{margin-left:75%}.column.is-10,.column.is-10-tablet{flex:none;width:83.3333333333%}.column.is-offset-10,.column.is-offset-10-tablet{margin-left:83.3333333333%}.column.is-11,.column.is-11-tablet{flex:none;width:91.6666666667%}.column.is-offset-11,.column.is-offset-11-tablet{margin-left:91.6666666667%}.column.is-12,.column.is-12-tablet{flex:none;width:100%}.column.is-offset-12,.column.is-offset-12-tablet{margin-left:100%}}@media screen and (max-width: 
1055px){.column.is-narrow-touch{flex:none}.column.is-full-touch{flex:none;width:100%}.column.is-three-quarters-touch{flex:none;width:75%}.column.is-two-thirds-touch{flex:none;width:66.6666%}.column.is-half-touch{flex:none;width:50%}.column.is-one-third-touch{flex:none;width:33.3333%}.column.is-one-quarter-touch{flex:none;width:25%}.column.is-one-fifth-touch{flex:none;width:20%}.column.is-two-fifths-touch{flex:none;width:40%}.column.is-three-fifths-touch{flex:none;width:60%}.column.is-four-fifths-touch{flex:none;width:80%}.column.is-offset-three-quarters-touch{margin-left:75%}.column.is-offset-two-thirds-touch{margin-left:66.6666%}.column.is-offset-half-touch{margin-left:50%}.column.is-offset-one-third-touch{margin-left:33.3333%}.column.is-offset-one-quarter-touch{margin-left:25%}.column.is-offset-one-fifth-touch{margin-left:20%}.column.is-offset-two-fifths-touch{margin-left:40%}.column.is-offset-three-fifths-touch{margin-left:60%}.column.is-offset-four-fifths-touch{margin-left:80%}.column.is-0-touch{flex:none;width:0%}.column.is-offset-0-touch{margin-left:0%}.column.is-1-touch{flex:none;width:8.3333333333%}.column.is-offset-1-touch{margin-left:8.3333333333%}.column.is-2-touch{flex:none;width:16.6666666667%}.column.is-offset-2-touch{margin-left:16.6666666667%}.column.is-3-touch{flex:none;width:25%}.column.is-offset-3-touch{margin-left:25%}.column.is-4-touch{flex:none;width:33.3333333333%}.column.is-offset-4-touch{margin-left:33.3333333333%}.column.is-5-touch{flex:none;width:41.6666666667%}.column.is-offset-5-touch{margin-left:41.6666666667%}.column.is-6-touch{flex:none;width:50%}.column.is-offset-6-touch{margin-left:50%}.column.is-7-touch{flex:none;width:58.3333333333%}.column.is-offset-7-touch{margin-left:58.3333333333%}.column.is-8-touch{flex:none;width:66.6666666667%}.column.is-offset-8-touch{margin-left:66.6666666667%}.column.is-9-touch{flex:none;width:75%}.column.is-offset-9-touch{margin-left:75%}.column.is-10-touch{flex:none;width:83.3333333333%}.column.is-offs
et-10-touch{margin-left:83.3333333333%}.column.is-11-touch{flex:none;width:91.6666666667%}.column.is-offset-11-touch{margin-left:91.6666666667%}.column.is-12-touch{flex:none;width:100%}.column.is-offset-12-touch{margin-left:100%}}@media screen and (min-width: 1056px){.column.is-narrow-desktop{flex:none}.column.is-full-desktop{flex:none;width:100%}.column.is-three-quarters-desktop{flex:none;width:75%}.column.is-two-thirds-desktop{flex:none;width:66.6666%}.column.is-half-desktop{flex:none;width:50%}.column.is-one-third-desktop{flex:none;width:33.3333%}.column.is-one-quarter-desktop{flex:none;width:25%}.column.is-one-fifth-desktop{flex:none;width:20%}.column.is-two-fifths-desktop{flex:none;width:40%}.column.is-three-fifths-desktop{flex:none;width:60%}.column.is-four-fifths-desktop{flex:none;width:80%}.column.is-offset-three-quarters-desktop{margin-left:75%}.column.is-offset-two-thirds-desktop{margin-left:66.6666%}.column.is-offset-half-desktop{margin-left:50%}.column.is-offset-one-third-desktop{margin-left:33.3333%}.column.is-offset-one-quarter-desktop{margin-left:25%}.column.is-offset-one-fifth-desktop{margin-left:20%}.column.is-offset-two-fifths-desktop{margin-left:40%}.column.is-offset-three-fifths-desktop{margin-left:60%}.column.is-offset-four-fifths-desktop{margin-left:80%}.column.is-0-desktop{flex:none;width:0%}.column.is-offset-0-desktop{margin-left:0%}.column.is-1-desktop{flex:none;width:8.3333333333%}.column.is-offset-1-desktop{margin-left:8.3333333333%}.column.is-2-desktop{flex:none;width:16.6666666667%}.column.is-offset-2-desktop{margin-left:16.6666666667%}.column.is-3-desktop{flex:none;width:25%}.column.is-offset-3-desktop{margin-left:25%}.column.is-4-desktop{flex:none;width:33.3333333333%}.column.is-offset-4-desktop{margin-left:33.3333333333%}.column.is-5-desktop{flex:none;width:41.6666666667%}.column.is-offset-5-desktop{margin-left:41.6666666667%}.column.is-6-desktop{flex:none;width:50%}.column.is-offset-6-desktop{margin-left:50%}.column.is-7-desktop{flex
:none;width:58.3333333333%}.column.is-offset-7-desktop{margin-left:58.3333333333%}.column.is-8-desktop{flex:none;width:66.6666666667%}.column.is-offset-8-desktop{margin-left:66.6666666667%}.column.is-9-desktop{flex:none;width:75%}.column.is-offset-9-desktop{margin-left:75%}.column.is-10-desktop{flex:none;width:83.3333333333%}.column.is-offset-10-desktop{margin-left:83.3333333333%}.column.is-11-desktop{flex:none;width:91.6666666667%}.column.is-offset-11-desktop{margin-left:91.6666666667%}.column.is-12-desktop{flex:none;width:100%}.column.is-offset-12-desktop{margin-left:100%}}@media screen and (min-width: 1216px){.column.is-narrow-widescreen{flex:none}.column.is-full-widescreen{flex:none;width:100%}.column.is-three-quarters-widescreen{flex:none;width:75%}.column.is-two-thirds-widescreen{flex:none;width:66.6666%}.column.is-half-widescreen{flex:none;width:50%}.column.is-one-third-widescreen{flex:none;width:33.3333%}.column.is-one-quarter-widescreen{flex:none;width:25%}.column.is-one-fifth-widescreen{flex:none;width:20%}.column.is-two-fifths-widescreen{flex:none;width:40%}.column.is-three-fifths-widescreen{flex:none;width:60%}.column.is-four-fifths-widescreen{flex:none;width:80%}.column.is-offset-three-quarters-widescreen{margin-left:75%}.column.is-offset-two-thirds-widescreen{margin-left:66.6666%}.column.is-offset-half-widescreen{margin-left:50%}.column.is-offset-one-third-widescreen{margin-left:33.3333%}.column.is-offset-one-quarter-widescreen{margin-left:25%}.column.is-offset-one-fifth-widescreen{margin-left:20%}.column.is-offset-two-fifths-widescreen{margin-left:40%}.column.is-offset-three-fifths-widescreen{margin-left:60%}.column.is-offset-four-fifths-widescreen{margin-left:80%}.column.is-0-widescreen{flex:none;width:0%}.column.is-offset-0-widescreen{margin-left:0%}.column.is-1-widescreen{flex:none;width:8.3333333333%}.column.is-offset-1-widescreen{margin-left:8.3333333333%}.column.is-2-widescreen{flex:none;width:16.6666666667%}.column.is-offset-2-widescreen{margin
-left:16.6666666667%}.column.is-3-widescreen{flex:none;width:25%}.column.is-offset-3-widescreen{margin-left:25%}.column.is-4-widescreen{flex:none;width:33.3333333333%}.column.is-offset-4-widescreen{margin-left:33.3333333333%}.column.is-5-widescreen{flex:none;width:41.6666666667%}.column.is-offset-5-widescreen{margin-left:41.6666666667%}.column.is-6-widescreen{flex:none;width:50%}.column.is-offset-6-widescreen{margin-left:50%}.column.is-7-widescreen{flex:none;width:58.3333333333%}.column.is-offset-7-widescreen{margin-left:58.3333333333%}.column.is-8-widescreen{flex:none;width:66.6666666667%}.column.is-offset-8-widescreen{margin-left:66.6666666667%}.column.is-9-widescreen{flex:none;width:75%}.column.is-offset-9-widescreen{margin-left:75%}.column.is-10-widescreen{flex:none;width:83.3333333333%}.column.is-offset-10-widescreen{margin-left:83.3333333333%}.column.is-11-widescreen{flex:none;width:91.6666666667%}.column.is-offset-11-widescreen{margin-left:91.6666666667%}.column.is-12-widescreen{flex:none;width:100%}.column.is-offset-12-widescreen{margin-left:100%}}@media screen and (min-width: 
1408px){.column.is-narrow-fullhd{flex:none}.column.is-full-fullhd{flex:none;width:100%}.column.is-three-quarters-fullhd{flex:none;width:75%}.column.is-two-thirds-fullhd{flex:none;width:66.6666%}.column.is-half-fullhd{flex:none;width:50%}.column.is-one-third-fullhd{flex:none;width:33.3333%}.column.is-one-quarter-fullhd{flex:none;width:25%}.column.is-one-fifth-fullhd{flex:none;width:20%}.column.is-two-fifths-fullhd{flex:none;width:40%}.column.is-three-fifths-fullhd{flex:none;width:60%}.column.is-four-fifths-fullhd{flex:none;width:80%}.column.is-offset-three-quarters-fullhd{margin-left:75%}.column.is-offset-two-thirds-fullhd{margin-left:66.6666%}.column.is-offset-half-fullhd{margin-left:50%}.column.is-offset-one-third-fullhd{margin-left:33.3333%}.column.is-offset-one-quarter-fullhd{margin-left:25%}.column.is-offset-one-fifth-fullhd{margin-left:20%}.column.is-offset-two-fifths-fullhd{margin-left:40%}.column.is-offset-three-fifths-fullhd{margin-left:60%}.column.is-offset-four-fifths-fullhd{margin-left:80%}.column.is-0-fullhd{flex:none;width:0%}.column.is-offset-0-fullhd{margin-left:0%}.column.is-1-fullhd{flex:none;width:8.3333333333%}.column.is-offset-1-fullhd{margin-left:8.3333333333%}.column.is-2-fullhd{flex:none;width:16.6666666667%}.column.is-offset-2-fullhd{margin-left:16.6666666667%}.column.is-3-fullhd{flex:none;width:25%}.column.is-offset-3-fullhd{margin-left:25%}.column.is-4-fullhd{flex:none;width:33.3333333333%}.column.is-offset-4-fullhd{margin-left:33.3333333333%}.column.is-5-fullhd{flex:none;width:41.6666666667%}.column.is-offset-5-fullhd{margin-left:41.6666666667%}.column.is-6-fullhd{flex:none;width:50%}.column.is-offset-6-fullhd{margin-left:50%}.column.is-7-fullhd{flex:none;width:58.3333333333%}.column.is-offset-7-fullhd{margin-left:58.3333333333%}.column.is-8-fullhd{flex:none;width:66.6666666667%}.column.is-offset-8-fullhd{margin-left:66.6666666667%}.column.is-9-fullhd{flex:none;width:75%}.column.is-offset-9-fullhd{margin-left:75%}.column.is-10-fullhd{flex:
none;width:83.3333333333%}.column.is-offset-10-fullhd{margin-left:83.3333333333%}.column.is-11-fullhd{flex:none;width:91.6666666667%}.column.is-offset-11-fullhd{margin-left:91.6666666667%}.column.is-12-fullhd{flex:none;width:100%}.column.is-offset-12-fullhd{margin-left:100%}}.columns{margin-left:-.75rem;margin-right:-.75rem;margin-top:-.75rem}.columns:last-child{margin-bottom:-.75rem}.columns:not(:last-child){margin-bottom:calc(1.5rem - .75rem)}.columns.is-centered{justify-content:center}.columns.is-gapless{margin-left:0;margin-right:0;margin-top:0}.columns.is-gapless>.column{margin:0;padding:0 !important}.columns.is-gapless:not(:last-child){margin-bottom:1.5rem}.columns.is-gapless:last-child{margin-bottom:0}.columns.is-mobile{display:flex}.columns.is-multiline{flex-wrap:wrap}.columns.is-vcentered{align-items:center}@media screen and (min-width: 769px),print{.columns:not(.is-desktop){display:flex}}@media screen and (min-width: 1056px){.columns.is-desktop{display:flex}}.columns.is-variable{--columnGap: 0.75rem;margin-left:calc(-1 * var(--columnGap));margin-right:calc(-1 * var(--columnGap))}.columns.is-variable .column{padding-left:var(--columnGap);padding-right:var(--columnGap)}.columns.is-variable.is-0{--columnGap: 0rem}@media screen and (max-width: 768px){.columns.is-variable.is-0-mobile{--columnGap: 0rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-0-tablet{--columnGap: 0rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-0-tablet-only{--columnGap: 0rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-0-touch{--columnGap: 0rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-0-desktop{--columnGap: 0rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-0-desktop-only{--columnGap: 0rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-0-widescreen{--columnGap: 0rem}}@media screen and (min-width: 1216px) and (max-width: 
1407px){.columns.is-variable.is-0-widescreen-only{--columnGap: 0rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-0-fullhd{--columnGap: 0rem}}.columns.is-variable.is-1{--columnGap: .25rem}@media screen and (max-width: 768px){.columns.is-variable.is-1-mobile{--columnGap: .25rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-1-tablet{--columnGap: .25rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-1-tablet-only{--columnGap: .25rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-1-touch{--columnGap: .25rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-1-desktop{--columnGap: .25rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-1-desktop-only{--columnGap: .25rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-1-widescreen{--columnGap: .25rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-1-widescreen-only{--columnGap: .25rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-1-fullhd{--columnGap: .25rem}}.columns.is-variable.is-2{--columnGap: .5rem}@media screen and (max-width: 768px){.columns.is-variable.is-2-mobile{--columnGap: .5rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-2-tablet{--columnGap: .5rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-2-tablet-only{--columnGap: .5rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-2-touch{--columnGap: .5rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-2-desktop{--columnGap: .5rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-2-desktop-only{--columnGap: .5rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-2-widescreen{--columnGap: .5rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-2-widescreen-only{--columnGap: 
.5rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-2-fullhd{--columnGap: .5rem}}.columns.is-variable.is-3{--columnGap: .75rem}@media screen and (max-width: 768px){.columns.is-variable.is-3-mobile{--columnGap: .75rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-3-tablet{--columnGap: .75rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-3-tablet-only{--columnGap: .75rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-3-touch{--columnGap: .75rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-3-desktop{--columnGap: .75rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-3-desktop-only{--columnGap: .75rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-3-widescreen{--columnGap: .75rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-3-widescreen-only{--columnGap: .75rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-3-fullhd{--columnGap: .75rem}}.columns.is-variable.is-4{--columnGap: 1rem}@media screen and (max-width: 768px){.columns.is-variable.is-4-mobile{--columnGap: 1rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-4-tablet{--columnGap: 1rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-4-tablet-only{--columnGap: 1rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-4-touch{--columnGap: 1rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-4-desktop{--columnGap: 1rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-4-desktop-only{--columnGap: 1rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-4-widescreen{--columnGap: 1rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-4-widescreen-only{--columnGap: 1rem}}@media screen and (min-width: 
1408px){.columns.is-variable.is-4-fullhd{--columnGap: 1rem}}.columns.is-variable.is-5{--columnGap: 1.25rem}@media screen and (max-width: 768px){.columns.is-variable.is-5-mobile{--columnGap: 1.25rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-5-tablet{--columnGap: 1.25rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-5-tablet-only{--columnGap: 1.25rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-5-touch{--columnGap: 1.25rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-5-desktop{--columnGap: 1.25rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-5-desktop-only{--columnGap: 1.25rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-5-widescreen{--columnGap: 1.25rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-5-widescreen-only{--columnGap: 1.25rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-5-fullhd{--columnGap: 1.25rem}}.columns.is-variable.is-6{--columnGap: 1.5rem}@media screen and (max-width: 768px){.columns.is-variable.is-6-mobile{--columnGap: 1.5rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-6-tablet{--columnGap: 1.5rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-6-tablet-only{--columnGap: 1.5rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-6-touch{--columnGap: 1.5rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-6-desktop{--columnGap: 1.5rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-6-desktop-only{--columnGap: 1.5rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-6-widescreen{--columnGap: 1.5rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-6-widescreen-only{--columnGap: 1.5rem}}@media screen and (min-width: 
1408px){.columns.is-variable.is-6-fullhd{--columnGap: 1.5rem}}.columns.is-variable.is-7{--columnGap: 1.75rem}@media screen and (max-width: 768px){.columns.is-variable.is-7-mobile{--columnGap: 1.75rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-7-tablet{--columnGap: 1.75rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-7-tablet-only{--columnGap: 1.75rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-7-touch{--columnGap: 1.75rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-7-desktop{--columnGap: 1.75rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-7-desktop-only{--columnGap: 1.75rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-7-widescreen{--columnGap: 1.75rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-7-widescreen-only{--columnGap: 1.75rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-7-fullhd{--columnGap: 1.75rem}}.columns.is-variable.is-8{--columnGap: 2rem}@media screen and (max-width: 768px){.columns.is-variable.is-8-mobile{--columnGap: 2rem}}@media screen and (min-width: 769px),print{.columns.is-variable.is-8-tablet{--columnGap: 2rem}}@media screen and (min-width: 769px) and (max-width: 1055px){.columns.is-variable.is-8-tablet-only{--columnGap: 2rem}}@media screen and (max-width: 1055px){.columns.is-variable.is-8-touch{--columnGap: 2rem}}@media screen and (min-width: 1056px){.columns.is-variable.is-8-desktop{--columnGap: 2rem}}@media screen and (min-width: 1056px) and (max-width: 1215px){.columns.is-variable.is-8-desktop-only{--columnGap: 2rem}}@media screen and (min-width: 1216px){.columns.is-variable.is-8-widescreen{--columnGap: 2rem}}@media screen and (min-width: 1216px) and (max-width: 1407px){.columns.is-variable.is-8-widescreen-only{--columnGap: 2rem}}@media screen and (min-width: 1408px){.columns.is-variable.is-8-fullhd{--columnGap: 
2rem}}.tile{align-items:stretch;display:block;flex-basis:0;flex-grow:1;flex-shrink:1;min-height:min-content}.tile.is-ancestor{margin-left:-.75rem;margin-right:-.75rem;margin-top:-.75rem}.tile.is-ancestor:last-child{margin-bottom:-.75rem}.tile.is-ancestor:not(:last-child){margin-bottom:.75rem}.tile.is-child{margin:0 !important}.tile.is-parent{padding:.75rem}.tile.is-vertical{flex-direction:column}.tile.is-vertical>.tile.is-child:not(:last-child){margin-bottom:1.5rem !important}@media screen and (min-width: 769px),print{.tile:not(.is-child){display:flex}.tile.is-1{flex:none;width:8.3333333333%}.tile.is-2{flex:none;width:16.6666666667%}.tile.is-3{flex:none;width:25%}.tile.is-4{flex:none;width:33.3333333333%}.tile.is-5{flex:none;width:41.6666666667%}.tile.is-6{flex:none;width:50%}.tile.is-7{flex:none;width:58.3333333333%}.tile.is-8{flex:none;width:66.6666666667%}.tile.is-9{flex:none;width:75%}.tile.is-10{flex:none;width:83.3333333333%}.tile.is-11{flex:none;width:91.6666666667%}.tile.is-12{flex:none;width:100%}}.hero{align-items:stretch;display:flex;flex-direction:column;justify-content:space-between}.hero .navbar{background:none}.hero .tabs ul{border-bottom:none}.hero.is-white{background-color:#fff;color:#0a0a0a}.hero.is-white a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-white strong{color:inherit}.hero.is-white .title{color:#0a0a0a}.hero.is-white .subtitle{color:rgba(10,10,10,0.9)}.hero.is-white .subtitle a:not(.button),.hero.is-white .subtitle strong{color:#0a0a0a}@media screen and (max-width: 1055px){.hero.is-white .navbar-menu{background-color:#fff}}.hero.is-white .navbar-item,.hero.is-white .navbar-link{color:rgba(10,10,10,0.7)}.hero.is-white a.navbar-item:hover,.hero.is-white a.navbar-item.is-active,.hero.is-white .navbar-link:hover,.hero.is-white .navbar-link.is-active{background-color:#f2f2f2;color:#0a0a0a}.hero.is-white .tabs a{color:#0a0a0a;opacity:0.9}.hero.is-white .tabs a:hover{opacity:1}.hero.is-white .tabs 
li.is-active a{opacity:1}.hero.is-white .tabs.is-boxed a,.hero.is-white .tabs.is-toggle a{color:#0a0a0a}.hero.is-white .tabs.is-boxed a:hover,.hero.is-white .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-white .tabs.is-boxed li.is-active a,.hero.is-white .tabs.is-boxed li.is-active a:hover,.hero.is-white .tabs.is-toggle li.is-active a,.hero.is-white .tabs.is-toggle li.is-active a:hover{background-color:#0a0a0a;border-color:#0a0a0a;color:#fff}.hero.is-white.is-bold{background-image:linear-gradient(141deg, #e8e3e4 0%, #fff 71%, #fff 100%)}@media screen and (max-width: 768px){.hero.is-white.is-bold .navbar-menu{background-image:linear-gradient(141deg, #e8e3e4 0%, #fff 71%, #fff 100%)}}.hero.is-black{background-color:#0a0a0a;color:#fff}.hero.is-black a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-black strong{color:inherit}.hero.is-black .title{color:#fff}.hero.is-black .subtitle{color:rgba(255,255,255,0.9)}.hero.is-black .subtitle a:not(.button),.hero.is-black .subtitle strong{color:#fff}@media screen and (max-width: 1055px){.hero.is-black .navbar-menu{background-color:#0a0a0a}}.hero.is-black .navbar-item,.hero.is-black .navbar-link{color:rgba(255,255,255,0.7)}.hero.is-black a.navbar-item:hover,.hero.is-black a.navbar-item.is-active,.hero.is-black .navbar-link:hover,.hero.is-black .navbar-link.is-active{background-color:#000;color:#fff}.hero.is-black .tabs a{color:#fff;opacity:0.9}.hero.is-black .tabs a:hover{opacity:1}.hero.is-black .tabs li.is-active a{opacity:1}.hero.is-black .tabs.is-boxed a,.hero.is-black .tabs.is-toggle a{color:#fff}.hero.is-black .tabs.is-boxed a:hover,.hero.is-black .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-black .tabs.is-boxed li.is-active a,.hero.is-black .tabs.is-boxed li.is-active a:hover,.hero.is-black .tabs.is-toggle li.is-active a,.hero.is-black .tabs.is-toggle li.is-active 
a:hover{background-color:#fff;border-color:#fff;color:#0a0a0a}.hero.is-black.is-bold{background-image:linear-gradient(141deg, #000 0%, #0a0a0a 71%, #181616 100%)}@media screen and (max-width: 768px){.hero.is-black.is-bold .navbar-menu{background-image:linear-gradient(141deg, #000 0%, #0a0a0a 71%, #181616 100%)}}.hero.is-light{background-color:#f5f5f5;color:#363636}.hero.is-light a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-light strong{color:inherit}.hero.is-light .title{color:#363636}.hero.is-light .subtitle{color:rgba(54,54,54,0.9)}.hero.is-light .subtitle a:not(.button),.hero.is-light .subtitle strong{color:#363636}@media screen and (max-width: 1055px){.hero.is-light .navbar-menu{background-color:#f5f5f5}}.hero.is-light .navbar-item,.hero.is-light .navbar-link{color:rgba(54,54,54,0.7)}.hero.is-light a.navbar-item:hover,.hero.is-light a.navbar-item.is-active,.hero.is-light .navbar-link:hover,.hero.is-light .navbar-link.is-active{background-color:#e8e8e8;color:#363636}.hero.is-light .tabs a{color:#363636;opacity:0.9}.hero.is-light .tabs a:hover{opacity:1}.hero.is-light .tabs li.is-active a{opacity:1}.hero.is-light .tabs.is-boxed a,.hero.is-light .tabs.is-toggle a{color:#363636}.hero.is-light .tabs.is-boxed a:hover,.hero.is-light .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-light .tabs.is-boxed li.is-active a,.hero.is-light .tabs.is-boxed li.is-active a:hover,.hero.is-light .tabs.is-toggle li.is-active a,.hero.is-light .tabs.is-toggle li.is-active a:hover{background-color:#363636;border-color:#363636;color:#f5f5f5}.hero.is-light.is-bold{background-image:linear-gradient(141deg, #dfd8d9 0%, #f5f5f5 71%, #fff 100%)}@media screen and (max-width: 768px){.hero.is-light.is-bold .navbar-menu{background-image:linear-gradient(141deg, #dfd8d9 0%, #f5f5f5 71%, #fff 100%)}}.hero.is-dark,.content kbd.hero{background-color:#363636;color:#f5f5f5}.hero.is-dark 
a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.content kbd.hero a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-dark strong,.content kbd.hero strong{color:inherit}.hero.is-dark .title,.content kbd.hero .title{color:#f5f5f5}.hero.is-dark .subtitle,.content kbd.hero .subtitle{color:rgba(245,245,245,0.9)}.hero.is-dark .subtitle a:not(.button),.content kbd.hero .subtitle a:not(.button),.hero.is-dark .subtitle strong,.content kbd.hero .subtitle strong{color:#f5f5f5}@media screen and (max-width: 1055px){.hero.is-dark .navbar-menu,.content kbd.hero .navbar-menu{background-color:#363636}}.hero.is-dark .navbar-item,.content kbd.hero .navbar-item,.hero.is-dark .navbar-link,.content kbd.hero .navbar-link{color:rgba(245,245,245,0.7)}.hero.is-dark a.navbar-item:hover,.content kbd.hero a.navbar-item:hover,.hero.is-dark a.navbar-item.is-active,.content kbd.hero a.navbar-item.is-active,.hero.is-dark .navbar-link:hover,.content kbd.hero .navbar-link:hover,.hero.is-dark .navbar-link.is-active,.content kbd.hero .navbar-link.is-active{background-color:#292929;color:#f5f5f5}.hero.is-dark .tabs a,.content kbd.hero .tabs a{color:#f5f5f5;opacity:0.9}.hero.is-dark .tabs a:hover,.content kbd.hero .tabs a:hover{opacity:1}.hero.is-dark .tabs li.is-active a,.content kbd.hero .tabs li.is-active a{opacity:1}.hero.is-dark .tabs.is-boxed a,.content kbd.hero .tabs.is-boxed a,.hero.is-dark .tabs.is-toggle a,.content kbd.hero .tabs.is-toggle a{color:#f5f5f5}.hero.is-dark .tabs.is-boxed a:hover,.content kbd.hero .tabs.is-boxed a:hover,.hero.is-dark .tabs.is-toggle a:hover,.content kbd.hero .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-dark .tabs.is-boxed li.is-active a,.content kbd.hero .tabs.is-boxed li.is-active a,.hero.is-dark .tabs.is-boxed li.is-active a:hover,.hero.is-dark .tabs.is-toggle li.is-active a,.content kbd.hero .tabs.is-toggle li.is-active a,.hero.is-dark .tabs.is-toggle li.is-active 
a:hover{background-color:#f5f5f5;border-color:#f5f5f5;color:#363636}.hero.is-dark.is-bold,.content kbd.hero.is-bold{background-image:linear-gradient(141deg, #1f191a 0%, #363636 71%, #46403f 100%)}@media screen and (max-width: 768px){.hero.is-dark.is-bold .navbar-menu,.content kbd.hero.is-bold .navbar-menu{background-image:linear-gradient(141deg, #1f191a 0%, #363636 71%, #46403f 100%)}}.hero.is-primary,.docstring>section>a.hero.docs-sourcelink{background-color:#4eb5de;color:#fff}.hero.is-primary a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.docstring>section>a.hero.docs-sourcelink a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-primary strong,.docstring>section>a.hero.docs-sourcelink strong{color:inherit}.hero.is-primary .title,.docstring>section>a.hero.docs-sourcelink .title{color:#fff}.hero.is-primary .subtitle,.docstring>section>a.hero.docs-sourcelink .subtitle{color:rgba(255,255,255,0.9)}.hero.is-primary .subtitle a:not(.button),.docstring>section>a.hero.docs-sourcelink .subtitle a:not(.button),.hero.is-primary .subtitle strong,.docstring>section>a.hero.docs-sourcelink .subtitle strong{color:#fff}@media screen and (max-width: 1055px){.hero.is-primary .navbar-menu,.docstring>section>a.hero.docs-sourcelink .navbar-menu{background-color:#4eb5de}}.hero.is-primary .navbar-item,.docstring>section>a.hero.docs-sourcelink .navbar-item,.hero.is-primary .navbar-link,.docstring>section>a.hero.docs-sourcelink .navbar-link{color:rgba(255,255,255,0.7)}.hero.is-primary a.navbar-item:hover,.docstring>section>a.hero.docs-sourcelink a.navbar-item:hover,.hero.is-primary a.navbar-item.is-active,.docstring>section>a.hero.docs-sourcelink a.navbar-item.is-active,.hero.is-primary .navbar-link:hover,.docstring>section>a.hero.docs-sourcelink .navbar-link:hover,.hero.is-primary .navbar-link.is-active,.docstring>section>a.hero.docs-sourcelink .navbar-link.is-active{background-color:#39acda;color:#fff}.hero.is-primary 
.tabs a,.docstring>section>a.hero.docs-sourcelink .tabs a{color:#fff;opacity:0.9}.hero.is-primary .tabs a:hover,.docstring>section>a.hero.docs-sourcelink .tabs a:hover{opacity:1}.hero.is-primary .tabs li.is-active a,.docstring>section>a.hero.docs-sourcelink .tabs li.is-active a{opacity:1}.hero.is-primary .tabs.is-boxed a,.docstring>section>a.hero.docs-sourcelink .tabs.is-boxed a,.hero.is-primary .tabs.is-toggle a,.docstring>section>a.hero.docs-sourcelink .tabs.is-toggle a{color:#fff}.hero.is-primary .tabs.is-boxed a:hover,.docstring>section>a.hero.docs-sourcelink .tabs.is-boxed a:hover,.hero.is-primary .tabs.is-toggle a:hover,.docstring>section>a.hero.docs-sourcelink .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-primary .tabs.is-boxed li.is-active a,.docstring>section>a.hero.docs-sourcelink .tabs.is-boxed li.is-active a,.hero.is-primary .tabs.is-boxed li.is-active a:hover,.hero.is-primary .tabs.is-toggle li.is-active a,.docstring>section>a.hero.docs-sourcelink .tabs.is-toggle li.is-active a,.hero.is-primary .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#4eb5de}.hero.is-primary.is-bold,.docstring>section>a.hero.is-bold.docs-sourcelink{background-image:linear-gradient(141deg, #1bc7de 0%, #4eb5de 71%, #5fa9e7 100%)}@media screen and (max-width: 768px){.hero.is-primary.is-bold .navbar-menu,.docstring>section>a.hero.is-bold.docs-sourcelink .navbar-menu{background-image:linear-gradient(141deg, #1bc7de 0%, #4eb5de 71%, #5fa9e7 100%)}}.hero.is-link{background-color:#2e63b8;color:#fff}.hero.is-link a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-link strong{color:inherit}.hero.is-link .title{color:#fff}.hero.is-link .subtitle{color:rgba(255,255,255,0.9)}.hero.is-link .subtitle a:not(.button),.hero.is-link .subtitle strong{color:#fff}@media screen and (max-width: 1055px){.hero.is-link .navbar-menu{background-color:#2e63b8}}.hero.is-link .navbar-item,.hero.is-link 
.navbar-link{color:rgba(255,255,255,0.7)}.hero.is-link a.navbar-item:hover,.hero.is-link a.navbar-item.is-active,.hero.is-link .navbar-link:hover,.hero.is-link .navbar-link.is-active{background-color:#2958a4;color:#fff}.hero.is-link .tabs a{color:#fff;opacity:0.9}.hero.is-link .tabs a:hover{opacity:1}.hero.is-link .tabs li.is-active a{opacity:1}.hero.is-link .tabs.is-boxed a,.hero.is-link .tabs.is-toggle a{color:#fff}.hero.is-link .tabs.is-boxed a:hover,.hero.is-link .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-link .tabs.is-boxed li.is-active a,.hero.is-link .tabs.is-boxed li.is-active a:hover,.hero.is-link .tabs.is-toggle li.is-active a,.hero.is-link .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#2e63b8}.hero.is-link.is-bold{background-image:linear-gradient(141deg, #1b6098 0%, #2e63b8 71%, #2d51d2 100%)}@media screen and (max-width: 768px){.hero.is-link.is-bold .navbar-menu{background-image:linear-gradient(141deg, #1b6098 0%, #2e63b8 71%, #2d51d2 100%)}}.hero.is-info{background-color:#209cee;color:#fff}.hero.is-info a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-info strong{color:inherit}.hero.is-info .title{color:#fff}.hero.is-info .subtitle{color:rgba(255,255,255,0.9)}.hero.is-info .subtitle a:not(.button),.hero.is-info .subtitle strong{color:#fff}@media screen and (max-width: 1055px){.hero.is-info .navbar-menu{background-color:#209cee}}.hero.is-info .navbar-item,.hero.is-info .navbar-link{color:rgba(255,255,255,0.7)}.hero.is-info a.navbar-item:hover,.hero.is-info a.navbar-item.is-active,.hero.is-info .navbar-link:hover,.hero.is-info .navbar-link.is-active{background-color:#1190e3;color:#fff}.hero.is-info .tabs a{color:#fff;opacity:0.9}.hero.is-info .tabs a:hover{opacity:1}.hero.is-info .tabs li.is-active a{opacity:1}.hero.is-info .tabs.is-boxed a,.hero.is-info .tabs.is-toggle a{color:#fff}.hero.is-info .tabs.is-boxed a:hover,.hero.is-info .tabs.is-toggle 
a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-info .tabs.is-boxed li.is-active a,.hero.is-info .tabs.is-boxed li.is-active a:hover,.hero.is-info .tabs.is-toggle li.is-active a,.hero.is-info .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#209cee}.hero.is-info.is-bold{background-image:linear-gradient(141deg, #05a6d6 0%, #209cee 71%, #3287f5 100%)}@media screen and (max-width: 768px){.hero.is-info.is-bold .navbar-menu{background-image:linear-gradient(141deg, #05a6d6 0%, #209cee 71%, #3287f5 100%)}}.hero.is-success{background-color:#22c35b;color:#fff}.hero.is-success a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-success strong{color:inherit}.hero.is-success .title{color:#fff}.hero.is-success .subtitle{color:rgba(255,255,255,0.9)}.hero.is-success .subtitle a:not(.button),.hero.is-success .subtitle strong{color:#fff}@media screen and (max-width: 1055px){.hero.is-success .navbar-menu{background-color:#22c35b}}.hero.is-success .navbar-item,.hero.is-success .navbar-link{color:rgba(255,255,255,0.7)}.hero.is-success a.navbar-item:hover,.hero.is-success a.navbar-item.is-active,.hero.is-success .navbar-link:hover,.hero.is-success .navbar-link.is-active{background-color:#1ead51;color:#fff}.hero.is-success .tabs a{color:#fff;opacity:0.9}.hero.is-success .tabs a:hover{opacity:1}.hero.is-success .tabs li.is-active a{opacity:1}.hero.is-success .tabs.is-boxed a,.hero.is-success .tabs.is-toggle a{color:#fff}.hero.is-success .tabs.is-boxed a:hover,.hero.is-success .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-success .tabs.is-boxed li.is-active a,.hero.is-success .tabs.is-boxed li.is-active a:hover,.hero.is-success .tabs.is-toggle li.is-active a,.hero.is-success .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#22c35b}.hero.is-success.is-bold{background-image:linear-gradient(141deg, #12a02c 0%, #22c35b 71%, #1fdf83 100%)}@media screen and 
(max-width: 768px){.hero.is-success.is-bold .navbar-menu{background-image:linear-gradient(141deg, #12a02c 0%, #22c35b 71%, #1fdf83 100%)}}.hero.is-warning{background-color:#ffdd57;color:rgba(0,0,0,0.7)}.hero.is-warning a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-warning strong{color:inherit}.hero.is-warning .title{color:rgba(0,0,0,0.7)}.hero.is-warning .subtitle{color:rgba(0,0,0,0.9)}.hero.is-warning .subtitle a:not(.button),.hero.is-warning .subtitle strong{color:rgba(0,0,0,0.7)}@media screen and (max-width: 1055px){.hero.is-warning .navbar-menu{background-color:#ffdd57}}.hero.is-warning .navbar-item,.hero.is-warning .navbar-link{color:rgba(0,0,0,0.7)}.hero.is-warning a.navbar-item:hover,.hero.is-warning a.navbar-item.is-active,.hero.is-warning .navbar-link:hover,.hero.is-warning .navbar-link.is-active{background-color:#ffd83e;color:rgba(0,0,0,0.7)}.hero.is-warning .tabs a{color:rgba(0,0,0,0.7);opacity:0.9}.hero.is-warning .tabs a:hover{opacity:1}.hero.is-warning .tabs li.is-active a{opacity:1}.hero.is-warning .tabs.is-boxed a,.hero.is-warning .tabs.is-toggle a{color:rgba(0,0,0,0.7)}.hero.is-warning .tabs.is-boxed a:hover,.hero.is-warning .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-warning .tabs.is-boxed li.is-active a,.hero.is-warning .tabs.is-boxed li.is-active a:hover,.hero.is-warning .tabs.is-toggle li.is-active a,.hero.is-warning .tabs.is-toggle li.is-active a:hover{background-color:rgba(0,0,0,0.7);border-color:rgba(0,0,0,0.7);color:#ffdd57}.hero.is-warning.is-bold{background-image:linear-gradient(141deg, #ffae24 0%, #ffdd57 71%, #fffa71 100%)}@media screen and (max-width: 768px){.hero.is-warning.is-bold .navbar-menu{background-image:linear-gradient(141deg, #ffae24 0%, #ffdd57 71%, #fffa71 100%)}}.hero.is-danger{background-color:#da0b00;color:#fff}.hero.is-danger a:not(.button):not(.dropdown-item):not(.tag):not(.pagination-link.is-current),.hero.is-danger strong{color:inherit}.hero.is-danger 
.title{color:#fff}.hero.is-danger .subtitle{color:rgba(255,255,255,0.9)}.hero.is-danger .subtitle a:not(.button),.hero.is-danger .subtitle strong{color:#fff}@media screen and (max-width: 1055px){.hero.is-danger .navbar-menu{background-color:#da0b00}}.hero.is-danger .navbar-item,.hero.is-danger .navbar-link{color:rgba(255,255,255,0.7)}.hero.is-danger a.navbar-item:hover,.hero.is-danger a.navbar-item.is-active,.hero.is-danger .navbar-link:hover,.hero.is-danger .navbar-link.is-active{background-color:#c10a00;color:#fff}.hero.is-danger .tabs a{color:#fff;opacity:0.9}.hero.is-danger .tabs a:hover{opacity:1}.hero.is-danger .tabs li.is-active a{opacity:1}.hero.is-danger .tabs.is-boxed a,.hero.is-danger .tabs.is-toggle a{color:#fff}.hero.is-danger .tabs.is-boxed a:hover,.hero.is-danger .tabs.is-toggle a:hover{background-color:rgba(10,10,10,0.1)}.hero.is-danger .tabs.is-boxed li.is-active a,.hero.is-danger .tabs.is-boxed li.is-active a:hover,.hero.is-danger .tabs.is-toggle li.is-active a,.hero.is-danger .tabs.is-toggle li.is-active a:hover{background-color:#fff;border-color:#fff;color:#da0b00}.hero.is-danger.is-bold{background-image:linear-gradient(141deg, #a70013 0%, #da0b00 71%, #f43500 100%)}@media screen and (max-width: 768px){.hero.is-danger.is-bold .navbar-menu{background-image:linear-gradient(141deg, #a70013 0%, #da0b00 71%, #f43500 100%)}}.hero.is-small .hero-body,#documenter .docs-sidebar form.docs-search>input.hero .hero-body{padding-bottom:1.5rem;padding-top:1.5rem}@media screen and (min-width: 769px),print{.hero.is-medium .hero-body{padding-bottom:9rem;padding-top:9rem}}@media screen and (min-width: 769px),print{.hero.is-large .hero-body{padding-bottom:18rem;padding-top:18rem}}.hero.is-halfheight .hero-body,.hero.is-fullheight .hero-body,.hero.is-fullheight-with-navbar .hero-body{align-items:center;display:flex}.hero.is-halfheight .hero-body>.container,.hero.is-fullheight .hero-body>.container,.hero.is-fullheight-with-navbar 
.hero-body>.container{flex-grow:1;flex-shrink:1}.hero.is-halfheight{min-height:50vh}.hero.is-fullheight{min-height:100vh}.hero-video{overflow:hidden}.hero-video video{left:50%;min-height:100%;min-width:100%;position:absolute;top:50%;transform:translate3d(-50%, -50%, 0)}.hero-video.is-transparent{opacity:0.3}@media screen and (max-width: 768px){.hero-video{display:none}}.hero-buttons{margin-top:1.5rem}@media screen and (max-width: 768px){.hero-buttons .button{display:flex}.hero-buttons .button:not(:last-child){margin-bottom:0.75rem}}@media screen and (min-width: 769px),print{.hero-buttons{display:flex;justify-content:center}.hero-buttons .button:not(:last-child){margin-right:1.5rem}}.hero-head,.hero-foot{flex-grow:0;flex-shrink:0}.hero-body{flex-grow:1;flex-shrink:0;padding:3rem 1.5rem}.section{padding:3rem 1.5rem}@media screen and (min-width: 1056px){.section.is-medium{padding:9rem 1.5rem}.section.is-large{padding:18rem 1.5rem}}.footer{background-color:#fafafa;padding:3rem 1.5rem 6rem}h1 .docs-heading-anchor,h1 .docs-heading-anchor:hover,h1 .docs-heading-anchor:visited,h2 .docs-heading-anchor,h2 .docs-heading-anchor:hover,h2 .docs-heading-anchor:visited,h3 .docs-heading-anchor,h3 .docs-heading-anchor:hover,h3 .docs-heading-anchor:visited,h4 .docs-heading-anchor,h4 .docs-heading-anchor:hover,h4 .docs-heading-anchor:visited,h5 .docs-heading-anchor,h5 .docs-heading-anchor:hover,h5 .docs-heading-anchor:visited,h6 .docs-heading-anchor,h6 .docs-heading-anchor:hover,h6 .docs-heading-anchor:visited{color:#222}h1 .docs-heading-anchor-permalink,h2 .docs-heading-anchor-permalink,h3 .docs-heading-anchor-permalink,h4 .docs-heading-anchor-permalink,h5 .docs-heading-anchor-permalink,h6 .docs-heading-anchor-permalink{visibility:hidden;vertical-align:middle;margin-left:0.5em;font-size:0.7rem}h1 .docs-heading-anchor-permalink::before,h2 .docs-heading-anchor-permalink::before,h3 .docs-heading-anchor-permalink::before,h4 .docs-heading-anchor-permalink::before,h5 
.docs-heading-anchor-permalink::before,h6 .docs-heading-anchor-permalink::before{font-family:"Font Awesome 5 Free";font-weight:900;content:"\f0c1"}h1:hover .docs-heading-anchor-permalink,h2:hover .docs-heading-anchor-permalink,h3:hover .docs-heading-anchor-permalink,h4:hover .docs-heading-anchor-permalink,h5:hover .docs-heading-anchor-permalink,h6:hover .docs-heading-anchor-permalink{visibility:visible}.docs-dark-only{display:none !important}pre{position:relative;overflow:hidden}pre code,pre code.hljs{padding:0 .75rem !important;overflow:auto;display:block}pre code:first-of-type,pre code.hljs:first-of-type{padding-top:0.5rem !important}pre code:last-of-type,pre code.hljs:last-of-type{padding-bottom:0.5rem !important}pre .copy-button{opacity:0.2;transition:opacity 0.2s;position:absolute;right:0em;top:0em;padding:0.5em;width:2.5em;height:2.5em;background:transparent;border:none;font-family:"Font Awesome 5 Free";color:#222;cursor:pointer;text-align:center}pre .copy-button:focus,pre .copy-button:hover{opacity:1;background:rgba(34,34,34,0.1);color:#2e63b8}pre .copy-button.success{color:#259a12;opacity:1}pre .copy-button.error{color:#cb3c33;opacity:1}pre:hover .copy-button{opacity:1}.admonition{background-color:#b5b5b5;border-style:solid;border-width:1px;border-color:#363636;border-radius:4px;font-size:1rem}.admonition strong{color:currentColor}.admonition.is-small,#documenter .docs-sidebar 
form.docs-search>input.admonition{font-size:.75rem}.admonition.is-medium{font-size:1.25rem}.admonition.is-large{font-size:1.5rem}.admonition.is-default{background-color:#b5b5b5;border-color:#363636}.admonition.is-default>.admonition-header{background-color:#363636;color:#fff}.admonition.is-default>.admonition-body{color:#fff}.admonition.is-info{background-color:#def0fc;border-color:#209cee}.admonition.is-info>.admonition-header{background-color:#209cee;color:#fff}.admonition.is-info>.admonition-body{color:rgba(0,0,0,0.7)}.admonition.is-success{background-color:#bdf4d1;border-color:#22c35b}.admonition.is-success>.admonition-header{background-color:#22c35b;color:#fff}.admonition.is-success>.admonition-body{color:rgba(0,0,0,0.7)}.admonition.is-warning{background-color:#fff3c5;border-color:#ffdd57}.admonition.is-warning>.admonition-header{background-color:#ffdd57;color:rgba(0,0,0,0.7)}.admonition.is-warning>.admonition-body{color:rgba(0,0,0,0.7)}.admonition.is-danger{background-color:#ffaba7;border-color:#da0b00}.admonition.is-danger>.admonition-header{background-color:#da0b00;color:#fff}.admonition.is-danger>.admonition-body{color:rgba(0,0,0,0.7)}.admonition.is-compat{background-color:#bdeff5;border-color:#1db5c9}.admonition.is-compat>.admonition-header{background-color:#1db5c9;color:#fff}.admonition.is-compat>.admonition-body{color:rgba(0,0,0,0.7)}.admonition-header{color:#fff;background-color:#363636;align-items:center;font-weight:700;justify-content:space-between;line-height:1.25;padding:0.5rem .75rem;position:relative}.admonition-header:before{font-family:"Font Awesome 5 Free";font-weight:900;margin-right:.75rem;content:"\f06a"}.admonition-body{color:#222;padding:0.5rem .75rem}.admonition-body pre{background-color:#f5f5f5}.admonition-body code{background-color:rgba(0,0,0,0.05)}.docstring{margin-bottom:1em;background-color:rgba(0,0,0,0);border:1px solid #dbdbdb;box-shadow:2px 2px 3px 
rgba(10,10,10,0.1);max-width:100%}.docstring>header{display:flex;flex-grow:1;align-items:stretch;padding:0.5rem .75rem;background-color:#f5f5f5;box-shadow:0 1px 2px rgba(10,10,10,0.1);box-shadow:none;border-bottom:1px solid #dbdbdb}.docstring>header code{background-color:transparent}.docstring>header .docstring-binding{margin-right:0.3em}.docstring>header .docstring-category{margin-left:0.3em}.docstring>section{position:relative;padding:.75rem .75rem;border-bottom:1px solid #dbdbdb}.docstring>section:last-child{border-bottom:none}.docstring>section>a.docs-sourcelink{transition:opacity 0.3s;opacity:0;position:absolute;right:.375rem;bottom:.375rem}.docstring>section>a.docs-sourcelink:focus{opacity:1 !important}.docstring:hover>section>a.docs-sourcelink{opacity:0.2}.docstring:focus-within>section>a.docs-sourcelink{opacity:0.2}.docstring>section:hover a.docs-sourcelink{opacity:1}.documenter-example-output{background-color:#fff}.outdated-warning-overlay{position:fixed;top:0;left:0;right:0;box-shadow:0 0 10px rgba(0,0,0,0.3);z-index:999;background-color:#ffaba7;color:rgba(0,0,0,0.7);border-bottom:3px solid #da0b00;padding:10px 35px;text-align:center;font-size:15px}.outdated-warning-overlay .outdated-warning-closer{position:absolute;top:calc(50% - 10px);right:18px;cursor:pointer;width:12px}.outdated-warning-overlay a{color:#2e63b8}.outdated-warning-overlay a:hover{color:#363636}.content pre{border:1px solid #dbdbdb}.content code{font-weight:inherit}.content a code{color:#2e63b8}.content h1 code,.content h2 code,.content h3 code,.content h4 code,.content h5 code,.content h6 code{color:#222}.content table{display:block;width:initial;max-width:100%;overflow-x:auto}.content blockquote>ul:first-child,.content blockquote>ol:first-child,.content .admonition-body>ul:first-child,.content .admonition-body>ol:first-child{margin-top:0}pre,code{font-variant-ligatures:no-contextual}.breadcrumb a.is-disabled{cursor:default;pointer-events:none}.breadcrumb a.is-disabled,.breadcrumb 
a.is-disabled:hover{color:#222}.hljs{background:initial !important}.katex .katex-mathml{top:0;right:0}.katex-display,mjx-container,.MathJax_Display{margin:0.5em 0 !important}html{-moz-osx-font-smoothing:auto;-webkit-font-smoothing:auto}li.no-marker{list-style:none}#documenter .docs-main>article{overflow-wrap:break-word}#documenter .docs-main>article .math-container{overflow-x:auto;overflow-y:hidden}@media screen and (min-width: 1056px){#documenter .docs-main{max-width:52rem;margin-left:20rem;padding-right:1rem}}@media screen and (max-width: 1055px){#documenter .docs-main{width:100%}#documenter .docs-main>article{max-width:52rem;margin-left:auto;margin-right:auto;margin-bottom:1rem;padding:0 1rem}#documenter .docs-main>header,#documenter .docs-main>nav{max-width:100%;width:100%;margin:0}}#documenter .docs-main header.docs-navbar{background-color:#fff;border-bottom:1px solid #dbdbdb;z-index:2;min-height:4rem;margin-bottom:1rem;display:flex}#documenter .docs-main header.docs-navbar .breadcrumb{flex-grow:1}#documenter .docs-main header.docs-navbar .docs-right{display:flex;white-space:nowrap}#documenter .docs-main header.docs-navbar .docs-right .docs-icon,#documenter .docs-main header.docs-navbar .docs-right .docs-label,#documenter .docs-main header.docs-navbar .docs-right .docs-sidebar-button{display:inline-block}#documenter .docs-main header.docs-navbar .docs-right .docs-label{padding:0;margin-left:0.3em}#documenter .docs-main header.docs-navbar .docs-right .docs-settings-button{margin:auto 0 auto 1rem}#documenter .docs-main header.docs-navbar .docs-right .docs-sidebar-button{font-size:1.5rem;margin:auto 0 auto 1rem}#documenter .docs-main header.docs-navbar>*{margin:auto 0}@media screen and (max-width: 1055px){#documenter .docs-main header.docs-navbar{position:sticky;top:0;padding:0 1rem;transition-property:top, box-shadow;-webkit-transition-property:top, box-shadow;transition-duration:0.3s;-webkit-transition-duration:0.3s}#documenter .docs-main 
header.docs-navbar.headroom--not-top{box-shadow:.2rem 0rem .4rem #bbb;transition-duration:0.7s;-webkit-transition-duration:0.7s}#documenter .docs-main header.docs-navbar.headroom--unpinned.headroom--not-top.headroom--not-bottom{top:-4.5rem;transition-duration:0.7s;-webkit-transition-duration:0.7s}}#documenter .docs-main section.footnotes{border-top:1px solid #dbdbdb}#documenter .docs-main section.footnotes li .tag:first-child,#documenter .docs-main section.footnotes li .docstring>section>a.docs-sourcelink:first-child,#documenter .docs-main section.footnotes li .content kbd:first-child,.content #documenter .docs-main section.footnotes li kbd:first-child{margin-right:1em;margin-bottom:0.4em}#documenter .docs-main .docs-footer{display:flex;flex-wrap:wrap;margin-left:0;margin-right:0;border-top:1px solid #dbdbdb;padding-top:1rem;padding-bottom:1rem}@media screen and (max-width: 1055px){#documenter .docs-main .docs-footer{padding-left:1rem;padding-right:1rem}}#documenter .docs-main .docs-footer .docs-footer-nextpage,#documenter .docs-main .docs-footer .docs-footer-prevpage{flex-grow:1}#documenter .docs-main .docs-footer .docs-footer-nextpage{text-align:right}#documenter .docs-main .docs-footer .flexbox-break{flex-basis:100%;height:0}#documenter .docs-main .docs-footer .footer-message{font-size:0.8em;margin:0.5em auto 0 auto;text-align:center}#documenter .docs-sidebar{display:flex;flex-direction:column;color:#0a0a0a;background-color:#f5f5f5;border-right:1px solid #dbdbdb;padding:0;flex:0 0 18rem;z-index:5;font-size:1rem;position:fixed;left:-18rem;width:18rem;height:100%;transition:left 0.3s}#documenter .docs-sidebar.visible{left:0;box-shadow:.4rem 0rem .8rem #bbb}@media screen and (min-width: 1056px){#documenter .docs-sidebar.visible{box-shadow:none}}@media screen and (min-width: 1056px){#documenter .docs-sidebar{left:0;top:0}}#documenter .docs-sidebar .docs-logo{margin-top:1rem;padding:0 1rem}#documenter .docs-sidebar 
.docs-logo>img{max-height:6rem;margin:auto}#documenter .docs-sidebar .docs-package-name{flex-shrink:0;font-size:1.5rem;font-weight:700;text-align:center;white-space:nowrap;overflow:hidden;padding:0.5rem 0}#documenter .docs-sidebar .docs-package-name .docs-autofit{max-width:16.2rem}#documenter .docs-sidebar .docs-package-name a,#documenter .docs-sidebar .docs-package-name a:hover{color:#0a0a0a}#documenter .docs-sidebar .docs-version-selector{border-top:1px solid #dbdbdb;display:none;padding:0.5rem}#documenter .docs-sidebar .docs-version-selector.visible{display:flex}#documenter .docs-sidebar ul.docs-menu{flex-grow:1;user-select:none;border-top:1px solid #dbdbdb;padding-bottom:1.5rem}#documenter .docs-sidebar ul.docs-menu>li>.tocitem{font-weight:bold}#documenter .docs-sidebar ul.docs-menu>li li{font-size:.95rem;margin-left:1em;border-left:1px solid #dbdbdb}#documenter .docs-sidebar ul.docs-menu input.collapse-toggle{display:none}#documenter .docs-sidebar ul.docs-menu ul.collapsed{display:none}#documenter .docs-sidebar ul.docs-menu input:checked~ul.collapsed{display:block}#documenter .docs-sidebar ul.docs-menu label.tocitem{display:flex}#documenter .docs-sidebar ul.docs-menu label.tocitem .docs-label{flex-grow:2}#documenter .docs-sidebar ul.docs-menu label.tocitem .docs-chevron{display:inline-block;font-style:normal;font-variant:normal;text-rendering:auto;line-height:1;font-size:.75rem;margin-left:1rem;margin-top:auto;margin-bottom:auto}#documenter .docs-sidebar ul.docs-menu label.tocitem .docs-chevron::before{font-family:"Font Awesome 5 Free";font-weight:900;content:"\f054"}#documenter .docs-sidebar ul.docs-menu input:checked~label.tocitem .docs-chevron::before{content:"\f078"}#documenter .docs-sidebar ul.docs-menu .tocitem{display:block;padding:0.5rem 0.5rem}#documenter .docs-sidebar ul.docs-menu .tocitem,#documenter .docs-sidebar ul.docs-menu .tocitem:hover{color:#0a0a0a;background:#f5f5f5}#documenter .docs-sidebar ul.docs-menu a.tocitem:hover,#documenter 
.docs-sidebar ul.docs-menu label.tocitem:hover{color:#0a0a0a;background-color:#ebebeb}#documenter .docs-sidebar ul.docs-menu li.is-active{border-top:1px solid #dbdbdb;border-bottom:1px solid #dbdbdb;background-color:#fff}#documenter .docs-sidebar ul.docs-menu li.is-active .tocitem,#documenter .docs-sidebar ul.docs-menu li.is-active .tocitem:hover{background-color:#fff;color:#0a0a0a}#documenter .docs-sidebar ul.docs-menu li.is-active ul.internal .tocitem:hover{background-color:#ebebeb;color:#0a0a0a}#documenter .docs-sidebar ul.docs-menu>li.is-active:first-child{border-top:none}#documenter .docs-sidebar ul.docs-menu ul.internal{margin:0 0.5rem 0.5rem;border-top:1px solid #dbdbdb}#documenter .docs-sidebar ul.docs-menu ul.internal li{font-size:.85rem;border-left:none;margin-left:0;margin-top:0.5rem}#documenter .docs-sidebar ul.docs-menu ul.internal .tocitem{width:100%;padding:0}#documenter .docs-sidebar ul.docs-menu ul.internal .tocitem::before{content:"⚬";margin-right:0.4em}#documenter .docs-sidebar form.docs-search{margin:auto;margin-top:0.5rem;margin-bottom:0.5rem}#documenter .docs-sidebar form.docs-search>input{width:14.4rem}@media screen and (min-width: 1056px){#documenter .docs-sidebar ul.docs-menu{overflow-y:auto;-webkit-overflow-scroll:touch}#documenter .docs-sidebar ul.docs-menu::-webkit-scrollbar{width:.3rem;background:none}#documenter .docs-sidebar ul.docs-menu::-webkit-scrollbar-thumb{border-radius:5px 0px 0px 5px;background:#e0e0e0}#documenter .docs-sidebar ul.docs-menu::-webkit-scrollbar-thumb:hover{background:#ccc}}@media screen and (max-width: 1055px){#documenter .docs-sidebar{overflow-y:auto;-webkit-overflow-scroll:touch}#documenter .docs-sidebar::-webkit-scrollbar{width:.3rem;background:none}#documenter .docs-sidebar::-webkit-scrollbar-thumb{border-radius:5px 0px 0px 5px;background:#e0e0e0}#documenter .docs-sidebar::-webkit-scrollbar-thumb:hover{background:#ccc}}#documenter .docs-main #documenter-search-info{margin-bottom:1rem}#documenter .docs-main 
#documenter-search-results{list-style-type:circle;list-style-position:outside}#documenter .docs-main #documenter-search-results li{margin-left:2rem}#documenter .docs-main #documenter-search-results .docs-highlight{background-color:yellow}.ansi span.sgr1{font-weight:bolder}.ansi span.sgr2{font-weight:lighter}.ansi span.sgr3{font-style:italic}.ansi span.sgr4{text-decoration:underline}.ansi span.sgr7{color:#fff;background-color:#222}.ansi span.sgr8{color:transparent}.ansi span.sgr8 span{color:transparent}.ansi span.sgr9{text-decoration:line-through}.ansi span.sgr30{color:#242424}.ansi span.sgr31{color:#a7201f}.ansi span.sgr32{color:#066f00}.ansi span.sgr33{color:#856b00}.ansi span.sgr34{color:#2149b0}.ansi span.sgr35{color:#7d4498}.ansi span.sgr36{color:#007989}.ansi span.sgr37{color:gray}.ansi span.sgr40{background-color:#242424}.ansi span.sgr41{background-color:#a7201f}.ansi span.sgr42{background-color:#066f00}.ansi span.sgr43{background-color:#856b00}.ansi span.sgr44{background-color:#2149b0}.ansi span.sgr45{background-color:#7d4498}.ansi span.sgr46{background-color:#007989}.ansi span.sgr47{background-color:gray}.ansi span.sgr90{color:#616161}.ansi span.sgr91{color:#cb3c33}.ansi span.sgr92{color:#0e8300}.ansi span.sgr93{color:#a98800}.ansi span.sgr94{color:#3c5dcd}.ansi span.sgr95{color:#9256af}.ansi span.sgr96{color:#008fa3}.ansi span.sgr97{color:#f5f5f5}.ansi span.sgr100{background-color:#616161}.ansi span.sgr101{background-color:#cb3c33}.ansi span.sgr102{background-color:#0e8300}.ansi span.sgr103{background-color:#a98800}.ansi span.sgr104{background-color:#3c5dcd}.ansi span.sgr105{background-color:#9256af}.ansi span.sgr106{background-color:#008fa3}.ansi span.sgr107{background-color:#f5f5f5}code.language-julia-repl>span.hljs-meta{color:#066f00;font-weight:bolder}/*! 
+ Theme: Default + Description: Original highlight.js style + Author: (c) Ivan Sagalaev + Maintainer: @highlightjs/core-team + Website: https://highlightjs.org/ + License: see project LICENSE + Touched: 2021 +*/pre code.hljs{display:block;overflow-x:auto}code.hljs{padding:3px 5px}.hljs{background:#F0F0F0;color:#444}.hljs-comment{color:#888888}.hljs-tag,.hljs-punctuation{color:#444a}.hljs-tag .hljs-name,.hljs-tag .hljs-attr{color:#444}.hljs-keyword,.hljs-attribute,.hljs-selector-tag,.hljs-meta .hljs-keyword,.hljs-doctag,.hljs-name{font-weight:bold}.hljs-type,.hljs-string,.hljs-number,.hljs-selector-id,.hljs-selector-class,.hljs-quote,.hljs-template-tag,.hljs-deletion{color:#880000}.hljs-title,.hljs-section{color:#880000;font-weight:bold}.hljs-regexp,.hljs-symbol,.hljs-variable,.hljs-template-variable,.hljs-link,.hljs-selector-attr,.hljs-operator,.hljs-selector-pseudo{color:#BC6060}.hljs-literal{color:#78A960}.hljs-built_in,.hljs-bullet,.hljs-code,.hljs-addition{color:#397300}.hljs-meta{color:#1f7199}.hljs-meta .hljs-string{color:#4d99bf}.hljs-emphasis{font-style:italic}.hljs-strong{font-weight:bold} diff --git a/previews/PR2365/assets/themeswap.js b/previews/PR2365/assets/themeswap.js new file mode 100644 index 0000000000..c58e993e3e --- /dev/null +++ b/previews/PR2365/assets/themeswap.js @@ -0,0 +1,66 @@ +// Small function to quickly swap out themes. Gets put into the <head> tag. +function set_theme_from_local_storage() { + // Initialize the theme to null, which means default + var theme = null; + // If the browser supports localStorage and it is not disabled, then try to get the + // documenter theme + if(window.localStorage != null) { + // Get the user-picked theme from localStorage. May be `null`, which means the default + // theme. 
+ theme = window.localStorage.getItem("documenter-theme"); + } + // Check if the browser supports user color preference + var darkPreference = false; + // Check if the user's preference is for a dark color scheme + if(window.matchMedia('(prefers-color-scheme: dark)').matches === true) { + darkPreference = true; + } + // Initialize a few variables for the loop: + // + // - active: will contain the index of the theme that should be active. Note that there + // is no guarantee that localStorage contains sane values. If `active` stays `null` + // we either could not find the theme or it is the default (primary) theme anyway. + // Either way, we then need to stick to the primary theme. + // + // - disabled: style sheets that should be disabled (i.e. all the theme style sheets + // that are not the currently active theme) + var active = null; var disabled = []; var darkTheme = null; + for (var i = 0; i < document.styleSheets.length; i++) { + var ss = document.styleSheets[i]; + // The <link> tag of each style sheet is expected to have a data-theme-name attribute + // which must contain the name of the theme. The names in localStorage must match this. + var themename = ss.ownerNode.getAttribute("data-theme-name"); + // attribute not set => non-theme stylesheet => ignore + if(themename === null) continue; + // To distinguish the default (primary) theme, it needs to have the data-theme-primary + // attribute set. 
+ var isprimary = (ss.ownerNode.getAttribute("data-theme-primary") !== null); + // Check if the theme is primary dark theme + var isDarkTheme = (ss.ownerNode.getAttribute("data-theme-primary-dark") !== null); + // If ss is for dark theme then set the value of darkTheme to the name of the theme + if(isDarkTheme) darkTheme = themename; + // If we find a matching theme (and it's not the default), we'll set active to non-null + if(themename === theme) active = i; + // Store the style sheets of inactive themes so that we could disable them + if(themename !== theme) disabled.push(ss); + } + if(active !== null) { + // If we did find an active theme, we'll (1) add the theme--$(theme) class to + document.getElementsByTagName('html')[0].className = "theme--" + theme; + // and (2) disable all the other theme stylesheets + disabled.forEach(function(ss){ + ss.disabled = true; + }); + } + else if(darkTheme !== null && darkPreference === true) { + // If we did find an active theme, we'll (1) add the theme--$(theme) class to + document.getElementsByTagName('html')[0].className = "theme--" + darkTheme; + // and (2) disable all the other theme stylesheets + disabled.forEach(function(ss){ + if (ss.ownerNode.getAttribute("data-theme-name") !== darkTheme) { + ss.disabled = true; + } + }); + } +} +set_theme_from_local_storage(); diff --git a/previews/PR2365/assets/warner.js b/previews/PR2365/assets/warner.js new file mode 100644 index 0000000000..5531c8851b --- /dev/null +++ b/previews/PR2365/assets/warner.js @@ -0,0 +1,49 @@ +function maybeAddWarning () { + // DOCUMENTER_NEWEST is defined in versions.js, DOCUMENTER_CURRENT_VERSION and DOCUMENTER_STABLE + // in siteinfo.js. + // If either of these are undefined something went horribly wrong, so we abort. 
+ if ( + window.DOCUMENTER_NEWEST === undefined || + window.DOCUMENTER_CURRENT_VERSION === undefined || + window.DOCUMENTER_STABLE === undefined + ) { + return + }; + + // Current version is not a version number, so we can't tell if it's the newest version. Abort. + if (!/v(\d+\.)*\d+/.test(window.DOCUMENTER_CURRENT_VERSION)) { + return + }; + + // Current version is newest version, so no need to add a warning. + if (window.DOCUMENTER_NEWEST === window.DOCUMENTER_CURRENT_VERSION) { + return + }; + + // Add a noindex meta tag (unless one exists) so that search engines don't index this version of the docs. + if (document.body.querySelector('meta[name="robots"]') === null) { + const meta = document.createElement('meta'); + meta.name = 'robots'; + meta.content = 'noindex'; + + document.getElementsByTagName('head')[0].appendChild(meta); + }; + + const div = document.createElement('div'); + div.classList.add('outdated-warning-overlay'); + const closer = document.createElement('button'); + closer.classList.add('outdated-warning-closer', 'delete'); + closer.addEventListener('click', function () { + document.body.removeChild(div); + }); + const href = window.documenterBaseURL + '/../' + window.DOCUMENTER_STABLE; + div.innerHTML = 'This documentation is not for the latest stable release, but for either the development version or an older release.
    Click here to go to the documentation for the latest stable release.'; + div.appendChild(closer); + document.body.appendChild(div); +}; + +if (document.readyState === 'loading') { + document.addEventListener('DOMContentLoaded', maybeAddWarning); +} else { + maybeAddWarning(); +}; diff --git a/previews/PR2365/data/mlutils/index.html b/previews/PR2365/data/mlutils/index.html new file mode 100644 index 0000000000..dee50d7e1e --- /dev/null +++ b/previews/PR2365/data/mlutils/index.html @@ -0,0 +1,565 @@ + +Batching Data – MLUtils.jl · Flux

    Working with Data, using MLUtils.jl

    Flux re-exports the DataLoader type and utility functions for working with data from MLUtils.

    DataLoader

    The DataLoader can be used to create mini-batches of data, in the format train! expects.

    Flux's website has a dedicated tutorial on DataLoader for more information.

    MLUtils.DataLoaderType
    DataLoader(data; [batchsize, buffer, collate, parallel, partial, rng, shuffle])

    An object that iterates over mini-batches of data, each mini-batch containing batchsize observations (except possibly the last one).

    Takes as input a single data array, a tuple (or a named tuple) of arrays, or in general any data object that implements the numobs and getobs methods.

    The last dimension in each array is the observation dimension, i.e. the one divided into mini-batches.

    The original data is preserved in the data field of the DataLoader.

    Arguments

    • data: The data to be iterated over. The data type has to be supported by numobs and getobs.
    • batchsize: If less than 0, iterates over individual observations. Otherwise, each iteration (except possibly the last) yields a mini-batch containing batchsize observations. Default 1.
    • buffer: If buffer=true and supported by the type of data, a buffer will be allocated and reused for memory efficiency. You can also pass a preallocated object to buffer. Default false.
    • collate: Batching behavior. If nothing (default), a batch is getobs(data, indices). If false, each batch is [getobs(data, i) for i in indices]. When true, applies batch to the vector of observations in a batch, recursively collating arrays in the last dimensions. See batch for more information and examples.
    • parallel: Whether to load data in parallel using worker threads. Can greatly speed up data loading, roughly by a factor of the number of available threads. Requires starting Julia with multiple threads. Check Threads.nthreads() to see the number of available threads. Passing parallel = true breaks ordering guarantees. Default false.
    • partial: This argument is used only when batchsize > 0. If partial=false and the number of observations is not divisible by the batchsize, then the last mini-batch is dropped. Default true.
    • rng: A random number generator. Default Random.GLOBAL_RNG.
    • shuffle: Whether to shuffle the observations before iterating. Unlike wrapping the data container with shuffleobs(data), shuffle=true ensures that the observations are shuffled anew every time you start iterating over eachobs. Default false.

    Examples

    julia> Xtrain = rand(10, 100);
    +
    +julia> array_loader = DataLoader(Xtrain, batchsize=2);
    +
    +julia> for x in array_loader
    +         @assert size(x) == (10, 2)
    +         # do something with x, 50 times
    +       end
    +
    +julia> array_loader.data === Xtrain
    +true
    +
    +julia> tuple_loader = DataLoader((Xtrain,), batchsize=2);  # similar, but yielding 1-element tuples
    +
    +julia> for x in tuple_loader
    +         @assert x isa Tuple{Matrix}
    +         @assert size(x[1]) == (10, 2)
    +       end
    +
    +julia> Ytrain = rand('a':'z', 100);  # now make a DataLoader yielding 2-element named tuples
    +
    +julia> train_loader = DataLoader((data=Xtrain, label=Ytrain), batchsize=5, shuffle=true);
    +
    +julia> for epoch in 1:100
    +         for (x, y) in train_loader  # access via tuple destructuring
    +           @assert size(x) == (10, 5)
    +           @assert size(y) == (5,)
    +           # loss += f(x, y) # etc, runs 100 * 20 times
    +         end
    +       end
    +
    +julia> first(train_loader).label isa Vector{Char}  # access via property name
    +true
    +
    +julia> first(train_loader).label == Ytrain[1:5]  # because of shuffle=true
    +false
    +
    +julia> foreach(println∘summary, DataLoader(rand(Int8, 10, 64), batchsize=30))  # partial=false would omit last
    +10×30 Matrix{Int8}
    +10×30 Matrix{Int8}
    +10×4 Matrix{Int8}

    Utility Functions

    The utility functions are meant to be used while working with data; these functions help create inputs for your models or batch your dataset.

    MLUtils.batchFunction
    batch(xs)

    Batch the arrays in xs into a single array with an extra dimension.

    If the elements of xs are tuples, named tuples, or dicts, the output will be of the same type.

    See also unbatch.

    Examples

    julia> batch([[1,2,3], 
    +              [4,5,6]])
    +3×2 Matrix{Int64}:
    + 1  4
    + 2  5
    + 3  6
    +
    +julia> batch([(a=[1,2], b=[3,4])
    +               (a=[5,6], b=[7,8])]) 
    +(a = [1 5; 2 6], b = [3 7; 4 8])
    MLUtils.batchsizeFunction
    batchsize(data) -> Int

    Return the fixed size of each batch in data.
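A brief sketch of its use, assuming MLUtils is loaded and using the BatchView type documented below:

```julia
using MLUtils

# A BatchView splits 100 observations into batches of 10;
# batchsize recovers the configured batch size.
A = BatchView(rand(4, 100), batchsize=10)
@assert batchsize(A) == 10
```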

    MLUtils.batchseqFunction
    batchseq(seqs, val = 0)

    Take a list of N sequences, and turn them into a single sequence where each item is a batch of N. Short sequences will be padded by val.

    Examples

    julia> batchseq([[1, 2, 3], [4, 5]], 0)
    +3-element Vector{Vector{Int64}}:
    + [1, 4]
    + [2, 5]
    + [3, 0]
    MLUtils.BatchViewType
    BatchView(data, batchsize; partial=true, collate=nothing)
    +BatchView(data; batchsize=1, partial=true, collate=nothing)

    Create a view of the given data that represents it as a vector of batches. Each batch contains an equal number of observations (except possibly the last one). The batch size can be specified using the parameter batchsize. If the size of the dataset is not divisible by the specified batchsize, the remaining observations are ignored when partial=false; when partial=true, the last batch may instead be slightly smaller.

    Note that any data access is delayed until getindex is called.

    If used as an iterator, the object will iterate over the dataset once, effectively denoting an epoch.

    For BatchView to work on some data structure, the type of the given variable data must implement the data container interface. See ObsView for more info.

    Arguments

    • data : The object describing the dataset. Can be of any type as long as it implements getobs and numobs (see Details for more information).

    • batchsize : The batch-size of each batch. It is the number of observations that each batch must contain (except possibly for the last one).

    • partial : If partial=false and the number of observations is not divisible by the batch-size, then the last mini-batch is dropped.

    • collate: Batching behavior. If nothing (default), a batch is getobs(data, indices). If false, each batch is [getobs(data, i) for i in indices]. When true, applies batch to the vector of observations in a batch, recursively collating arrays in the last dimensions. See batch for more information and examples.

    Examples

    using MLUtils
    +X, Y = MLUtils.load_iris()
    +
    +A = BatchView(X, batchsize=30)
    +@assert typeof(A) <: BatchView <: AbstractVector
    +@assert eltype(A) <: SubArray{Float64,2}
    +@assert length(A) == 5 # Iris has 150 observations
    +@assert size(A[1]) == (4,30) # Iris has 4 features
    +
    +# 5 batches of size 30 observations
    +for x in BatchView(X, batchsize=30)
    +    @assert typeof(x) <: SubArray{Float64,2}
    +    @assert numobs(x) === 30
    +end
    +
    +# 7 batches of size 20 observations
    +# Note that the iris dataset has 150 observations,
    +# which means that with a batchsize of 20, the last
    +# 10 observations will be ignored
    +for (x, y) in BatchView((X, Y), batchsize=20, partial=false)
    +    @assert typeof(x) <: SubArray{Float64,2}
    +    @assert typeof(y) <: SubArray{String,1}
    +    @assert numobs(x) == numobs(y) == 20
    +end
    +
    +# collate tuple observations
    +for (x, y) in BatchView((rand(10, 3), ["a", "b", "c"]), batchsize=2, collate=true, partial=false)
    +    @assert size(x) == (10, 2)
    +    @assert size(y) == (2,)
    +end
    +
    +
    +# randomly assign observations to one and only one batch.
    +for (x, y) in BatchView(shuffleobs((X, Y)), batchsize=20)
    +    @assert typeof(x) <: SubArray{Float64,2}
    +    @assert typeof(y) <: SubArray{String,1}
    +end
    MLUtils.chunkFunction
    chunk(x, n; [dims])
    +chunk(x; [size, dims])

    Split x into n parts, or alternatively, if size is an integer, into equal chunks of size size. The parts contain the same number of elements, except possibly the last one, which may be smaller.

    In case size is a collection of integers instead, the elements of x are split into chunks of the given sizes.

    If x is an array, dims can be used to specify along which dimension to split (defaults to the last dimension).

    Examples

    julia> chunk(1:10, 3)
    +3-element Vector{UnitRange{Int64}}:
    + 1:4
    + 5:8
    + 9:10
    +
    +julia> chunk(1:10; size = 2)
    +5-element Vector{UnitRange{Int64}}:
    + 1:2
    + 3:4
    + 5:6
    + 7:8
    + 9:10
    +
    +julia> x = reshape(collect(1:20), (5, 4))
    +5×4 Matrix{Int64}:
    + 1   6  11  16
    + 2   7  12  17
    + 3   8  13  18
    + 4   9  14  19
    + 5  10  15  20
    +
    +julia> xs = chunk(x, 2, dims=1)
    +2-element Vector{SubArray{Int64, 2, Matrix{Int64}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}}:
    + [1 6 11 16; 2 7 12 17; 3 8 13 18]
    + [4 9 14 19; 5 10 15 20]
    +
    +julia> xs[1]
    +3×4 view(::Matrix{Int64}, 1:3, :) with eltype Int64:
    + 1  6  11  16
    + 2  7  12  17
    + 3  8  13  18
    +
    +julia> xes = chunk(x; size = 2, dims = 2)
    +2-element Vector{SubArray{Int64, 2, Matrix{Int64}, Tuple{Base.Slice{Base.OneTo{Int64}}, UnitRange{Int64}}, true}}:
    + [1 6; 2 7; … ; 4 9; 5 10]
    + [11 16; 12 17; … ; 14 19; 15 20]
    +
    +julia> xes[2]
    +5×2 view(::Matrix{Int64}, :, 3:4) with eltype Int64:
    + 11  16
    + 12  17
    + 13  18
    + 14  19
    + 15  20
    +
    +julia> chunk(1:6; size = [2, 4])
    +2-element Vector{UnitRange{Int64}}:
    + 1:2
    + 3:6
    chunk(x, partition_idxs; [npartitions, dims])

    Partition the array x along the dimension dims according to the indexes in partition_idxs.

    partition_idxs must be sorted and contain only positive integers between 1 and the number of partitions.

    If the number of partitions npartitions is not provided, it is inferred from partition_idxs.

    If dims is not provided, it defaults to the last dimension.

    See also unbatch.

    Examples

    julia> x = reshape([1:10;], 2, 5)
    +2×5 Matrix{Int64}:
    + 1  3  5  7   9
    + 2  4  6  8  10
    +
    +julia> chunk(x, [1, 2, 2, 3, 3])
    +3-element Vector{SubArray{Int64, 2, Matrix{Int64}, Tuple{Base.Slice{Base.OneTo{Int64}}, UnitRange{Int64}}, true}}:
    + [1; 2;;]
    + [3 5; 4 6]
    + [7 9; 8 10]
    MLUtils.eachobsFunction
    eachobs(data; kws...)

    Return an iterator over data.

    Supports the same arguments as DataLoader. The batchsize default is -1 here while it is 1 for DataLoader.

    Examples

    X = rand(4,100)
    +
    +for x in eachobs(X)
    +    # loop entered 100 times
    +    @assert typeof(x) <: Vector{Float64}
    +    @assert size(x) == (4,)
    +end
    +
    +# mini-batch iterations
    +for x in eachobs(X, batchsize=10)
    +    # loop entered 10 times
    +    @assert typeof(x) <: Matrix{Float64}
    +    @assert size(x) == (4,10)
    +end
    +
    +# support for tuples, named tuples, dicts
    +for (x, y) in eachobs((X, Y))
    +    # ...
    +end
    MLUtils.fill_likeFunction
    fill_like(x, val, [element_type=eltype(x)], [dims=size(x)]))

    Create an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to val. The third and fourth arguments are both optional, defaulting to the given array's eltype and size. The dimensions may be specified as an integer or as a tuple argument.

    See also zeros_like and ones_like.

    Examples

    julia> x = rand(Float32, 2)
    +2-element Vector{Float32}:
    + 0.16087806
    + 0.89916044
    +
    +julia> fill_like(x, 1.7, (3, 3))
    +3×3 Matrix{Float32}:
    + 1.7  1.7  1.7
    + 1.7  1.7  1.7
    + 1.7  1.7  1.7
    +
    +julia> using CUDA
    +
    +julia> x = CUDA.rand(2, 2)
    +2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
    + 0.803167  0.476101
    + 0.303041  0.317581
    +
    +julia> fill_like(x, 1.7, Float64)
    +2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:
    + 1.7  1.7
    + 1.7  1.7
    MLUtils.filterobsFunction
    filterobs(f, data)

    Return a subset of data container data including all indices i for which f(getobs(data, i)) === true.

    data = 1:10
    +numobs(data) == 10
    +fdata = filterobs(>(5), data)
    +numobs(fdata) == 5
    MLUtils.flattenFunction
    flatten(x::AbstractArray)

    Reshape arbitrarily shaped input into a matrix-shaped output, preserving the size of the last dimension.

    See also unsqueeze.

    Examples

    julia> rand(3,4,5) |> flatten |> size
    +(12, 5)
    MLUtils.getobsFunction
    getobs(data, [idx])

    Return the observations corresponding to the observation index idx. Note that idx can be any type as long as data has defined getobs for that type. If idx is not provided, all observations in data are materialized.

    If data does not have getobs defined, then in the case of Tables.table(data) == true returns the row(s) in position idx, otherwise returns data[idx].

    Authors of custom data containers should implement Base.getindex for their type instead of getobs. getobs should only be implemented for types where there is a difference between getobs and Base.getindex (such as multi-dimensional arrays).

    The returned observation(s) should be in the form intended to be passed as-is to some learning algorithm. There is no strict requirement for what this "actual data" must look like. Every author of a custom data container can make this decision themselves. The output should be consistent when idx is a scalar vs a vector.

    getobs supports by default nested combinations of array, tuple, named tuples, and dictionaries.

    See also getobs! and numobs.

    Examples

    # named tuples 
    +x = (a = [1, 2, 3], b = rand(6, 3))
    +
    +getobs(x, 2) == (a = 2, b = x.b[:, 2])
    +getobs(x, [1, 3]) == (a = [1, 3], b = x.b[:, [1, 3]])
    +
    +
    +# dictionaries
    +x = Dict(:a => [1, 2, 3], :b => rand(6, 3))
    +
    +getobs(x, 2) == Dict(:a => 2, :b => x[:b][:, 2])
    +getobs(x, [1, 3]) == Dict(:a => [1, 3], :b => x[:b][:, [1, 3]])
    MLUtils.getobs!Function
    getobs!(buffer, data, idx)

    Inplace version of getobs(data, idx). If this method is defined for the type of data, then buffer should be used to store the result, instead of allocating a dedicated object.

    Implementing this method is optional: if none is provided for the type of data, then buffer is ignored and the result of getobs is returned, e.g. because the type of data does not lend itself to the concept of copy!.

    See also getobs and numobs.
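A minimal sketch of supporting getobs! for a custom container; the ColumnData type and its layout are hypothetical, introduced only for illustration:

```julia
using MLUtils

# Hypothetical container wrapping a feature matrix,
# with one observation per column.
struct ColumnData
    x::Matrix{Float64}
end

MLUtils.numobs(d::ColumnData) = size(d.x, 2)
MLUtils.getobs(d::ColumnData, idx::Int) = d.x[:, idx]

# In-place variant: copy the requested column into a
# preallocated buffer instead of allocating a new array.
function MLUtils.getobs!(buffer, d::ColumnData, idx::Int)
    copyto!(buffer, view(d.x, :, idx))
    return buffer
end

d = ColumnData(rand(3, 5))
buf = zeros(3)
@assert getobs!(buf, d, 2) == d.x[:, 2]
```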

    MLUtils.joinobsFunction
    joinobs(datas...)

    Concatenate data containers datas.

    data1, data2 = 1:10, 11:20
    +jdata = joinobs(data1, data2)
    +getobs(jdata, 15) == 15
    MLUtils.group_countsFunction
    group_counts(x)

    Count the number of times that each element of x appears.

    See also group_indices

    Examples

    julia> group_counts(['a', 'b', 'b'])
    +Dict{Char, Int64} with 2 entries:
    +  'a' => 1
    +  'b' => 2
    MLUtils.group_indicesFunction
    group_indices(x) -> Dict

    Compute the indices of elements in the vector x for each distinct value it contains. This information is useful for resampling strategies, such as stratified sampling.

    See also group_counts.

    Examples

    julia> x = [:yes, :no, :maybe, :yes];
    +
    +julia> group_indices(x)
    +Dict{Symbol, Vector{Int64}} with 3 entries:
    +  :yes   => [1, 4]
    +  :maybe => [3]
    +  :no    => [2]
    MLUtils.groupobsFunction
    groupobs(f, data)

    Split the data container data into different data containers, grouping observations by f(obs).

    data = -10:10
    +datas = groupobs(>(0), data)
    +length(datas) == 2
    MLUtils.kfoldsFunction
    kfolds(n::Integer, k = 5) -> Tuple

    Compute the train/validation assignments for k repartitions of n observations, and return them in the form of two vectors. The first vector contains the index-vectors for the training subsets, and the second contains the index-vectors for the validation subsets. A general rule of thumb is to use either k = 5 or k = 10. The following code snippet generates the index assignments for k = 5:

    julia> train_idx, val_idx = kfolds(10, 5);

    Each observation is assigned to the validation subset once (and only once). Thus, a union over all validation index-vectors reproduces the full range 1:n. Note that there is no random assignment of observations to subsets, which means that adjacent observations are likely to be part of the same validation subset.

    julia> train_idx
    +5-element Array{Array{Int64,1},1}:
    + [3,4,5,6,7,8,9,10]
    + [1,2,5,6,7,8,9,10]
    + [1,2,3,4,7,8,9,10]
    + [1,2,3,4,5,6,9,10]
    + [1,2,3,4,5,6,7,8]
    +
    +julia> val_idx
    +5-element Array{UnitRange{Int64},1}:
    + 1:2
    + 3:4
    + 5:6
    + 7:8
    + 9:10
    kfolds(data, [k = 5])

    Repartition a data container k times using a k folds strategy and return the sequence of folds as a lazy iterator. Only data subsets are created, which means that no actual data is copied until getobs is invoked.

    Conceptually, a k-folds repartitioning strategy divides the given data into k roughly equal-sized parts. Each part will serve as validation set once, while the remaining parts are used for training. This results in k different partitions of data.

    In the case that the size of the dataset is not divisible by the specified k, the remaining observations will be evenly distributed among the parts.

    for (x_train, x_val) in kfolds(X, k=10)
    +    # code called 10 times
    +    # nobs(x_val) may differ up to ±1 over iterations
    +end

    Multiple variables are supported (e.g. for labeled data)

    for ((x_train, y_train), val) in kfolds((X, Y), k=10)
    +    # ...
    +end

    By default the folds are created using static splits. Use shuffleobs to randomly assign observations to the folds.

    for (x_train, x_val) in kfolds(shuffleobs(X), k = 10)
    +    # ...
    +end

    See leavepout for a related function.

    MLUtils.leavepoutFunction
    leavepout(n::Integer, [size = 1]) -> Tuple

    Compute the train/validation assignments for k ≈ n/size repartitions of n observations, and return them in the form of two vectors. The first vector contains the index-vectors for the training subsets, and the second contains the index-vectors for the validation subsets. Each validation subset will have either size or size+1 observations assigned to it. The following code snippet generates the index-vectors for size = 2:

    julia> train_idx, val_idx = leavepout(10, 2);

    Each observation is assigned to the validation subset once (and only once). Thus, a union over all validation index-vectors reproduces the full range 1:n. Note that there is no random assignment of observations to subsets, which means that adjacent observations are likely to be part of the same validation subset.

    julia> train_idx
    +5-element Array{Array{Int64,1},1}:
    + [3,4,5,6,7,8,9,10]
    + [1,2,5,6,7,8,9,10]
    + [1,2,3,4,7,8,9,10]
    + [1,2,3,4,5,6,9,10]
    + [1,2,3,4,5,6,7,8]
    +
    +julia> val_idx
    +5-element Array{UnitRange{Int64},1}:
    + 1:2
    + 3:4
    + 5:6
    + 7:8
    + 9:10
    leavepout(data, p = 1)

    Repartition a data container using a k-fold strategy, where k is chosen such that each validation subset of the resulting folds contains roughly p observations. Defaults to p = 1, which is also known as "leave-one-out" partitioning.

    The resulting sequence of folds is returned as a lazy iterator. Only data subsets are created. That means no actual data is copied until getobs is invoked.

    for (train, val) in leavepout(X, p=2)
    +    # if nobs(X) is dividable by 2,
    +    # then numobs(val) will be 2 for each iteration,
    +    # otherwise it may be 3 for the first few iterations.
    +end

    See kfolds for a related function.

    MLUtils.mapobsFunction
    mapobs(f, data; batched=:auto)

    Lazily map f over the observations in a data container data. Returns a new data container mdata that can be indexed and has a length. Indexing triggers the transformation f.

    The batched keyword argument controls the behavior of mdata[idx] and mdata[idxs] where idx is an integer and idxs is a vector of integers:

    • batched=:auto (default). Let f handle the two cases. Calls f(getobs(data, idx)) and f(getobs(data, idxs)).
    • batched=:never. The function f is always called on a single observation. Calls f(getobs(data, idx)) and [f(getobs(data, idx)) for idx in idxs].
    • batched=:always. The function f is always called on a batch of observations. Calls getobs(f(getobs(data, [idx])), 1) and f(getobs(data, idxs)).
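The three modes above can be sketched on a toy container (a minimal example, assuming MLUtils is loaded):

```julia
using MLUtils

data = [1, 2, 3, 4]

# batched=:never — f always sees a single observation
m1 = mapobs(x -> 2x, data; batched=:never)
@assert m1[2] == 4
@assert m1[[1, 3]] == [2, 6]   # f applied element-wise

# batched=:always — f always sees a vector of observations
m2 = mapobs(xs -> 2 .* xs, data; batched=:always)
@assert m2[2] == 4             # single index: batch of one, then unwrapped
```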

    Examples

    julia> data = (a=[1,2,3], b=[1,2,3]);
    +
    +julia> mdata = mapobs(data) do x
    +         (c = x.a .+ x.b,  d = x.a .- x.b)
    +       end
    +mapobs(#25, (a = [1, 2, 3], b = [1, 2, 3]); batched=:auto))
    +
    +julia> mdata[1]
    +(c = 2, d = 0)
    +
    +julia> mdata[1:2]
    +(c = [2, 4], d = [0, 0])
    mapobs(fs, data)

    Lazily map each function in tuple fs over the observations in data container data. Returns a tuple of transformed data containers.

    mapobs(namedfs::NamedTuple, data)

    Map a NamedTuple of functions over data, turning it into a data container of NamedTuples. Field syntax can be used to select a column of the resulting data container.

    data = 1:10
    +nameddata = mapobs((x = sqrt, y = log), data)
    +getobs(nameddata, 10) == (x = sqrt(10), y = log(10))
    +getobs(nameddata.x, 10) == sqrt(10)
    MLUtils.numobsFunction
    numobs(data)

    Return the total number of observations contained in data.

    If data does not have numobs defined, then in the case of Tables.table(data) == true returns the number of rows, otherwise returns length(data).

    Authors of custom data containers should implement Base.length for their type instead of numobs. numobs should only be implemented for types where there is a difference between numobs and Base.length (such as multi-dimensional arrays).

    numobs supports by default nested combinations of array, tuple, named tuples, and dictionaries.

    See also getobs.

    Examples

    
    +# named tuples 
    +x = (a = [1, 2, 3], b = rand(6, 3))
    +numobs(x) == 3
    +
    +# dictionaries
    +x = Dict(:a => [1, 2, 3], :b => rand(6, 3))
    +numobs(x) == 3

    All internal containers must have the same number of observations:

    julia> x = (a = [1, 2, 3, 4], b = rand(6, 3));
    +
    +julia> numobs(x)
    +ERROR: DimensionMismatch: All data containers must have the same number of observations.
    +Stacktrace:
    + [1] _check_numobs_error()
    +   @ MLUtils ~/.julia/dev/MLUtils/src/observation.jl:163
    + [2] _check_numobs
    +   @ ~/.julia/dev/MLUtils/src/observation.jl:130 [inlined]
    + [3] numobs(data::NamedTuple{(:a, :b), Tuple{Vector{Int64}, Matrix{Float64}}})
    +   @ MLUtils ~/.julia/dev/MLUtils/src/observation.jl:177
    + [4] top-level scope
    +   @ REPL[35]:1
    MLUtils.normaliseFunction
    normalise(x; dims=ndims(x), ϵ=1e-5)

    Normalise the array x to mean 0 and standard deviation 1 across the dimension(s) given by dims. By default, dims is the last dimension.

    ϵ is a small additive factor added to the denominator for numerical stability.
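A quick sketch of its effect (assuming MLUtils and the standard-library Statistics are loaded):

```julia
using MLUtils, Statistics

x = rand(3, 4)
y = normalise(x; dims=2)   # normalise each row across the last dimension

# Each row of y now has mean approximately 0
@assert isapprox(mean(y, dims=2), zeros(3, 1); atol=1e-6)
```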

    MLUtils.obsviewFunction
    obsview(data, [indices])

    Returns a lazy view of the observations in data that correspond to the given indices. No data is copied except for the indices. It is similar to constructing an ObsView, but returns a SubArray if the type of data is Array or SubArray. Furthermore, this function may be extended for custom types of data that also want to provide their own subset type.

    In case data is a tuple, the constructor is mapped over its elements. That means the constructor returns a tuple of ObsViews instead of an ObsView of tuples.

    If instead you want to get the subset of observations corresponding to the given indices in their native type, use getobs.

    See ObsView for more information.

    MLUtils.ObsViewType
    ObsView(data, [indices])

    Used to represent a subset of some data of arbitrary type by storing which observation-indices the subset spans. Furthermore, subsequent subsettings are accumulated without needing to access actual data.

    The main purpose for the existence of ObsView is to delay data access and movement until an actual batch of data (or single observation) is needed for some computation. This is particularly useful when the data is not located in memory, but on the hard drive or some remote location. In such a scenario one wants to load the required data only when needed.

    Any data access is delayed until getindex is called, and even getindex returns the result of obsview which in general avoids data movement until getobs is called. If used as an iterator, the view will iterate over the dataset once, effectively denoting an epoch. Each iteration will return a lazy subset to the current observation.

    Arguments

    • data : The object describing the dataset. Can be of any type as long as it implements getobs and numobs (see Details for more information).

    • indices : Optional. The index or indices of the observation(s) in data that the subset should represent. Can be of type Int or some subtype of AbstractVector.

    Methods

    • getindex : Returns the observation(s) of the given index/indices. No data is copied aside from the required indices.

    • numobs : Returns the total number of observations in the subset.

    • getobs : Returns the underlying data that the ObsView represents at the given relative indices. Note that these indices are in "subset space", and in general will not directly correspond to the same indices in the underlying data set.

    Details

    For ObsView to work on some data structure, the desired type MyType must implement the following interface:

    • getobs(data::MyType, idx) : Should return the observation(s) indexed by idx. In what form is up to the user. Note that idx can be of type Int or AbstractVector.

    • numobs(data::MyType) : Should return the total number of observations in data.

    The following methods can also be provided and are optional:

    • getobs(data::MyType) : By default this function is the identity function. If that is not the behaviour that you want for your type, you need to provide this method as well.

    • obsview(data::MyType, idx) : If your custom type has its own kind of subset type, you can return it here. An example for such a case are SubArray for representing a subset of some AbstractArray.

    • getobs!(buffer, data::MyType, [idx]) : Inplace version of getobs(data, idx). If this method is provided for MyType, then eachobs can preallocate a buffer that is then reused every iteration. Note: buffer should be equivalent to the return value of getobs(::MyType, ...), since this is how buffer is preallocated by default.
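The required part of the interface can be sketched on a hypothetical container that computes each observation lazily (the Squares type is invented for illustration):

```julia
using MLUtils

# Hypothetical container: observation i is simply i^2,
# computed on demand rather than stored.
struct Squares
    n::Int
end

MLUtils.numobs(s::Squares) = s.n
MLUtils.getobs(s::Squares, idx) = idx .^ 2   # handles Int and vector indices

v = ObsView(Squares(10), 2:5)
@assert numobs(v) == 4
@assert getobs(v, 1) == 4   # relative index 1 → observation 2 → 2^2
```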

    Examples

X, Y = MLUtils.load_iris()

# The iris set has 150 observations and 4 features
@assert size(X) == (4,150)

# Represents the 80 observations as a ObsView
v = ObsView(X, 21:100)
@assert numobs(v) == 80
@assert typeof(v) <: ObsView
# getobs indexes into v
@assert getobs(v, 1:10) == X[:, 21:30]

# Use `obsview` to avoid boxing into ObsView
# for types that provide a custom "subset", such as arrays.
# Here it instead creates a native SubArray.
v = obsview(X, 1:100)
@assert numobs(v) == 100
@assert typeof(v) <: SubArray

# Also works for tuples of arbitrary length
subset = obsview((X, Y), 1:100)
@assert numobs(subset) == 100
@assert typeof(subset) <: Tuple # tuple of SubArray

# Use as iterator
for x in ObsView(X)
    @assert typeof(x) <: SubArray{Float64,1}
end

# iterate over each individual labeled observation
for (x, y) in ObsView((X, Y))
    @assert typeof(x) <: SubArray{Float64,1}
    @assert typeof(y) <: String
end

# same but in random order
for (x, y) in ObsView(shuffleobs((X, Y)))
    @assert typeof(x) <: SubArray{Float64,1}
    @assert typeof(y) <: String
end

# Indexing: take first 10 observations
x, y = ObsView((X, Y))[1:10]

    See also

    obsview, getobs, numobs, splitobs, shuffleobs, kfolds.

MLUtils.ones_like — Function
ones_like(x, [element_type=eltype(x)], [dims=size(x)])

Create an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to 1. The second and third arguments are both optional, defaulting to the given array's eltype and size. The dimensions may be specified as an integer or as a tuple argument.

    See also zeros_like and fill_like.

    Examples

julia> x = rand(Float32, 2)
2-element Vector{Float32}:
 0.8621633
 0.5158395

julia> ones_like(x, (3, 3))
3×3 Matrix{Float32}:
 1.0  1.0  1.0
 1.0  1.0  1.0
 1.0  1.0  1.0

julia> using CUDA

julia> x = CUDA.rand(2, 2)
2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
 0.82297   0.656143
 0.701828  0.391335

julia> ones_like(x, Float64)
2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:
 1.0  1.0
 1.0  1.0

MLUtils.oversample — Function
oversample(data, classes; fraction=1, shuffle=true)
oversample(data::Tuple; fraction=1, shuffle=true)

Generate a re-balanced version of data by repeatedly sampling existing observations in such a way that every class will have at least fraction times the number of observations of the largest class in classes. This way, all classes will have a minimum number of observations in the resulting data set relative to what the largest class has in the given (original) data.

As an example, by default (i.e. with fraction = 1) the resulting dataset will be near perfectly balanced. On the other hand, with fraction = 0.5 every class in the resulting data will have at least 50% as many observations as the largest class.

    The classes input is an array with the same length as numobs(data).

    The convenience parameter shuffle determines if the resulting data will be shuffled after its creation; if it is not shuffled then all the repeated samples will be together at the end, sorted by class. Defaults to true.

    The output will contain both the resampled data and classes.

# 6 observations with 3 features each
X = rand(3, 6)
# 2 classes, severely imbalanced
Y = ["a", "b", "b", "b", "b", "a"]

# oversample the class "a" to match "b"
X_bal, Y_bal = oversample(X, Y)

# this results in a bigger dataset with repeated data
@assert size(X_bal) == (3,8)
@assert length(Y_bal) == 8

# now both "a", and "b" have 4 observations each
@assert sum(Y_bal .== "a") == 4
@assert sum(Y_bal .== "b") == 4

    For this function to work, the type of data must implement numobs and getobs.

    Note that if data is a tuple and classes is not given, then it will be assumed that the last element of the tuple contains the classes.

julia> data = DataFrame(X1=rand(6), X2=rand(6), Y=[:a,:b,:b,:b,:b,:a])
6×3 DataFrames.DataFrame
│ Row │ X1        │ X2          │ Y │
├─────┼───────────┼─────────────┼───┤
│ 1   │ 0.226582  │ 0.0443222   │ a │
│ 2   │ 0.504629  │ 0.722906    │ b │
│ 3   │ 0.933372  │ 0.812814    │ b │
│ 4   │ 0.522172  │ 0.245457    │ b │
│ 5   │ 0.505208  │ 0.11202     │ b │
│ 6   │ 0.0997825 │ 0.000341996 │ a │

julia> getobs(oversample(data, data.Y))
8×3 DataFrame
 Row │ X1        X2         Y
     │ Float64   Float64    Symbol
─────┼─────────────────────────────
   1 │ 0.376304  0.100022   a
   2 │ 0.467095  0.185437   b
   3 │ 0.481957  0.319906   b
   4 │ 0.336762  0.390811   b
   5 │ 0.376304  0.100022   a
   6 │ 0.427064  0.0648339  a
   7 │ 0.427064  0.0648339  a
   8 │ 0.457043  0.490688   b

    See ObsView for more information on data subsets. See also undersample.

MLUtils.randobs — Function
    randobs(data, [n])

    Pick a random observation or a batch of n random observations from data. For this function to work, the type of data must implement numobs and getobs.
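For arrays with observations stored along the last dimension, the behaviour can be sketched in plain Julia (randobs_sketch is an illustrative stand-in for this idea, not the package function):

```julia
# Pick one random column, or a batch of n random columns, from a matrix
# whose observations are stored along the last dimension.
randobs_sketch(X::AbstractMatrix) = X[:, rand(1:size(X, 2))]
randobs_sketch(X::AbstractMatrix, n::Integer) = X[:, rand(1:size(X, 2), n)]

X = rand(3, 6)
x = randobs_sketch(X)      # a single observation: a 3-element vector
B = randobs_sketch(X, 4)   # a batch: a 3×4 matrix
```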

MLUtils.rand_like — Function
    rand_like([rng=default_rng()], x, [element_type=eltype(x)], [dims=size(x)])

Create an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to a random value. The last two arguments are both optional, defaulting to the given array's eltype and size. The dimensions may be specified as an integer or as a tuple argument.

    The default random number generator is used, unless a custom one is passed in explicitly as the first argument.

    See also Base.rand and randn_like.

    Examples

julia> x = ones(Float32, 2)
2-element Vector{Float32}:
 1.0
 1.0

julia> rand_like(x, (3, 3))
3×3 Matrix{Float32}:
 0.780032  0.920552  0.53689
 0.121451  0.741334  0.5449
 0.55348   0.138136  0.556404

julia> using CUDA

julia> x = CUDA.ones(2, 2)
2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
 1.0  1.0
 1.0  1.0

julia> rand_like(x, Float64)
2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:
 0.429274  0.135379
 0.718895  0.0098756

MLUtils.randn_like — Function
    randn_like([rng=default_rng()], x, [element_type=eltype(x)], [dims=size(x)])

Create an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to a random value drawn from a normal distribution. The last two arguments are both optional, defaulting to the given array's eltype and size. The dimensions may be specified as an integer or as a tuple argument.

    The default random number generator is used, unless a custom one is passed in explicitly as the first argument.

    See also Base.randn and rand_like.

    Examples

julia> x = ones(Float32, 2)
2-element Vector{Float32}:
 1.0
 1.0

julia> randn_like(x, (3, 3))
3×3 Matrix{Float32}:
 -0.385331    0.956231   0.0745102
  1.43756    -0.967328   2.06311
  0.0482372   1.78728   -0.902547

julia> using CUDA

julia> x = CUDA.ones(2, 2)
2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
 1.0  1.0
 1.0  1.0

julia> randn_like(x, Float64)
2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:
 -0.578527   0.823445
 -1.01338   -0.612053

MLUtils.rpad_constant — Function
    rpad_constant(v::AbstractArray, n::Union{Integer, Tuple}, val = 0; dims=:)

    Return the given sequence padded with val along the dimensions dims up to a maximum length in each direction specified by n.

    Examples

julia> rpad_constant([1, 2], 4, -1) # padding with -1 up to size 4
4-element Vector{Int64}:
 1
 2
 -1
 -1

julia> rpad_constant([1, 2, 3], 2) # no padding if length is already greater than n
3-element Vector{Int64}:
 1
 2
 3

julia> rpad_constant([1 2; 3 4], 4; dims=1) # padding along the first dimension
4×2 Matrix{Int64}:
 1  2
 3  4
 0  0
 0  0

julia> rpad_constant([1 2; 3 4], 4) # padding along all dimensions by default
4×2 Matrix{Int64}:
 1  2
 3  4
 0  0
 0  0

MLUtils.shuffleobs — Function
    shuffleobs([rng], data)

    Return a "subset" of data that spans all observations, but has the order of the observations shuffled.

    The values of data itself are not copied. Instead only the indices are shuffled. This function calls obsview to accomplish that, which means that the return value is likely of a different type than data.

# For Arrays the subset will be of type SubArray
@assert typeof(shuffleobs(rand(4,10))) <: SubArray

# Iterate through all observations in random order
for x in eachobs(shuffleobs(X))
    ...
end

    The optional parameter rng allows one to specify the random number generator used for shuffling. This is useful when reproducible results are desired. By default, uses the global RNG. See Random in Julia's standard library for more info.

    For this function to work, the type of data must implement numobs and getobs. See ObsView for more information.

MLUtils.splitobs — Function
    splitobs(n::Int; at) -> Tuple

    Compute the indices for two or more disjoint subsets of the range 1:n with splits given by at.

    Examples

julia> splitobs(100, at=0.7)
(1:70, 71:100)

julia> splitobs(100, at=(0.1, 0.4))
(1:10, 11:50, 51:100)

splitobs(data; at, shuffle=false) -> Tuple

    Split the data into multiple subsets proportional to the value(s) of at.

    If shuffle=true, randomly permute the observations before splitting.

    Supports any datatype implementing the numobs and getobs interfaces.

    Examples

# A 70%-30% split
train, test = splitobs(X, at=0.7)

# A 50%-30%-20% split
train, val, test = splitobs(X, at=(0.5, 0.3))

# A 70%-30% split with multiple arrays and shuffling
train, test = splitobs((X, y), at=0.7, shuffle=true)
Xtrain, Ytrain = train
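The index arithmetic behind splitobs(n; at) can be sketched in plain Julia (split_indices is an illustrative name, not part of MLUtils):

```julia
# Compute disjoint index ranges covering 1:n, with split points at the
# cumulative fractions given in `at` (the last range takes the remainder).
function split_indices(n::Integer, at...)
    bounds = round.(Int, cumsum(collect(at)) .* n)
    starts = [1; bounds .+ 1]
    stops  = [bounds; n]
    return Tuple(s:e for (s, e) in zip(starts, stops))
end

split_indices(100, 0.7)        # two ranges, a 70%-30% split
split_indices(100, 0.1, 0.4)   # three ranges, a 10%-40%-50% split
```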
MLUtils.unbatch — Function
    unbatch(x)

    Reverse of the batch operation, unstacking the last dimension of the array x.

    See also unstack and chunk.

    Examples

julia> unbatch([1 3 5 7;
                2 4 6 8])
4-element Vector{Vector{Int64}}:
 [1, 2]
 [3, 4]
 [5, 6]
 [7, 8]

MLUtils.undersample — Function
    undersample(data, classes; shuffle=true)

    Generate a class-balanced version of data by subsampling its observations in such a way that the resulting number of observations will be the same number for every class. This way, all classes will have as many observations in the resulting data set as the smallest class has in the given (original) data.

The convenience parameter shuffle determines if the resulting data will be shuffled after its creation; if it is not shuffled then all the observations will be in their original order. Defaults to true.

    The output will contain both the resampled data and classes.

# 6 observations with 3 features each
X = rand(3, 6)
# 2 classes, severely imbalanced
Y = ["a", "b", "b", "b", "b", "a"]

# subsample the class "b" to match "a"
X_bal, Y_bal = undersample(X, Y)

# this results in a smaller dataset
@assert size(X_bal) == (3,4)
@assert length(Y_bal) == 4

# now both "a", and "b" have 2 observations each
@assert sum(Y_bal .== "a") == 2
@assert sum(Y_bal .== "b") == 2

    For this function to work, the type of data must implement numobs and getobs.

    Note that if data is a tuple, then it will be assumed that the last element of the tuple contains the targets.

julia> data = DataFrame(X1=rand(6), X2=rand(6), Y=[:a,:b,:b,:b,:b,:a])
6×3 DataFrames.DataFrame
│ Row │ X1        │ X2          │ Y │
├─────┼───────────┼─────────────┼───┤
│ 1   │ 0.226582  │ 0.0443222   │ a │
│ 2   │ 0.504629  │ 0.722906    │ b │
│ 3   │ 0.933372  │ 0.812814    │ b │
│ 4   │ 0.522172  │ 0.245457    │ b │
│ 5   │ 0.505208  │ 0.11202     │ b │
│ 6   │ 0.0997825 │ 0.000341996 │ a │

julia> getobs(undersample(data, data.Y))
4×3 DataFrame
 Row │ X1        X2         Y
     │ Float64   Float64    Symbol
─────┼─────────────────────────────
   1 │ 0.427064  0.0648339  a
   2 │ 0.376304  0.100022   a
   3 │ 0.467095  0.185437   b
   4 │ 0.457043  0.490688   b

    See ObsView for more information on data subsets. See also oversample.

MLUtils.unsqueeze — Function
    unsqueeze(x; dims)

    Return x reshaped into an array one dimensionality higher than x, where dims indicates in which dimension x is extended. dims can be an integer between 1 and ndims(x)+1.

    See also flatten, stack.

    Examples

julia> unsqueeze([1 2; 3 4], dims=2)
2×1×2 Array{Int64, 3}:
[:, :, 1] =
 1
 3

[:, :, 2] =
 2
 4

julia> xs = [[1, 2], [3, 4], [5, 6]]
3-element Vector{Vector{Int64}}:
 [1, 2]
 [3, 4]
 [5, 6]

julia> unsqueeze(xs, dims=1)
1×3 Matrix{Vector{Int64}}:
 [1, 2]  [3, 4]  [5, 6]
    unsqueeze(; dims)

    Returns a function which, acting on an array, inserts a dimension of size 1 at dims.

    Examples

julia> rand(21, 22, 23) |> unsqueeze(dims=2) |> size
(21, 1, 22, 23)

MLUtils.unstack — Function
    unstack(xs; dims)

    Unroll the given xs into an array of arrays along the given dimension dims.

    See also stack, unbatch, and chunk.

    Examples

julia> unstack([1 3 5 7; 2 4 6 8], dims=2)
4-element Vector{Vector{Int64}}:
 [1, 2]
 [3, 4]
 [5, 6]
 [7, 8]

MLUtils.zeros_like — Function
zeros_like(x, [element_type=eltype(x)], [dims=size(x)])

Create an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to 0. The second and third arguments are both optional, defaulting to the given array's eltype and size. The dimensions may be specified as an integer or as a tuple argument.

    See also ones_like and fill_like.

    Examples

julia> x = rand(Float32, 2)
2-element Vector{Float32}:
 0.4005432
 0.36934233

julia> zeros_like(x, (3, 3))
3×3 Matrix{Float32}:
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0

julia> using CUDA

julia> x = CUDA.rand(2, 2)
2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
 0.0695155  0.667979
 0.558468   0.59903

julia> zeros_like(x, Float64)
2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:
 0.0  0.0
 0.0  0.0

    One-Hot Encoding with OneHotArrays.jl

    It's common to encode categorical variables (like true, false or cat, dog) in "one-of-k" or "one-hot" form. OneHotArrays.jl provides the onehot function to make this easy.

julia> using OneHotArrays

julia> onehot(:b, [:a, :b, :c])
3-element OneHotVector(::UInt32) with eltype Bool:
 ⋅
 1
 ⋅

julia> onehot(:c, [:a, :b, :c])
3-element OneHotVector(::UInt32) with eltype Bool:
 ⋅
 ⋅
 1

    There is also a onecold function, which is an inverse of onehot. It can also be given an array of numbers instead of booleans, in which case it performs an argmax-like operation, returning the label with the highest corresponding weight.

julia> onecold(ans, [:a, :b, :c])
:c

julia> onecold([true, false, false], [:a, :b, :c])
:a

julia> onecold([0.3, 0.2, 0.5], [:a, :b, :c])
:c

    For multiple samples at once, onehotbatch creates a batch (matrix) of one-hot vectors, and onecold treats matrices as batches.

julia> using OneHotArrays

julia> onehotbatch([:b, :a, :b], [:a, :b, :c])
3×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
 ⋅  1  ⋅
 1  ⋅  1
 ⋅  ⋅  ⋅

julia> onecold(ans, [:a, :b, :c])
3-element Vector{Symbol}:
 :b
 :a
 :b

Note that these operations returned OneHotVector and OneHotMatrix rather than Arrays. OneHotVectors behave like normal vectors but avoid any unnecessary cost compared to using an integer index directly. For example, multiplying a matrix with a one-hot vector simply slices out the relevant column of the matrix under the hood.
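A plain-array illustration of that last point (the dense Bool vector below stands in for a OneHotVector; no OneHotArrays API is used): multiplying by a one-hot vector amounts to plain indexing into the matrix.

```julia
W = [1 2 3;
     4 5 6]
b_hot = [false, true, false]   # dense stand-in for onehot(:b, [:a, :b, :c])

# Multiplication by a one-hot vector selects a single slice of W,
# so a sparse implementation can skip the multiply entirely.
y = W * b_hot
```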

    Function listing

OneHotArrays.onehot — Function
    onehot(x, labels, [default])

    Returns a OneHotVector which is roughly a sparse representation of x .== labels.

Instead of storing, say, a Vector{Bool}, it stores the index of the first occurrence of x in labels. If x is not found in labels, then it either returns onehot(default, labels), or gives an error if no default is given.
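The storage idea can be illustrated with plain Base functions (nothing below is OneHotArrays API):

```julia
labels = [:a, :b, :c]

# The only datum a one-hot vector really needs: the index of the match.
idx = findfirst(==(:b), labels)

# The dense Vector{Bool} that this single index represents.
dense = labels .== :b
```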

    See also onehotbatch to apply this to many xs, and onecold to reverse either of these, as well as to generalise argmax.

    Examples

julia> β = onehot(:b, (:a, :b, :c))
3-element OneHotVector(::UInt32) with eltype Bool:
 ⋅
 1
 ⋅

julia> αβγ = (onehot(0, 0:2), β, onehot(:z, [:a, :b, :c], :c))  # uses default
(Bool[1, 0, 0], Bool[0, 1, 0], Bool[0, 0, 1])

julia> hcat(αβγ...)  # preserves sparsity
3×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
 1  ⋅  ⋅
 ⋅  1  ⋅
 ⋅  ⋅  1

OneHotArrays.onecold — Function
    onecold(y::AbstractArray, labels = 1:size(y,1))

    Roughly the inverse operation of onehot or onehotbatch: This finds the index of the largest element of y, or each column of y, and looks them up in labels.

If labels are not specified, the default is the integers 1:size(y,1), i.e. the same operation as argmax(y, dims=1) but sometimes with a different return type.

    Examples

julia> onecold([false, true, false])
2

julia> onecold([0.3, 0.2, 0.5], (:a, :b, :c))
:c

julia> onecold([ 1  0  0  1  0  1  0  1  0  0  1
                 0  1  0  0  0  0  0  0  1  0  0
                 0  0  0  0  1  0  0  0  0  0  0
                 0  0  0  0  0  0  1  0  0  0  0
                 0  0  1  0  0  0  0  0  0  1  0 ], 'a':'e') |> String
"abeacadabea"

OneHotArrays.onehotbatch — Function
    onehotbatch(xs, labels, [default])

Returns a OneHotMatrix where the kth column of the matrix is onehot(xs[k], labels). This is a sparse matrix, which stores just a Vector{UInt32} containing the indices of the nonzero elements.

    If one of the inputs in xs is not found in labels, that column is onehot(default, labels) if default is given, else an error.

    If xs has more dimensions, N = ndims(xs) > 1, then the result is an AbstractArray{Bool, N+1} which is one-hot along the first dimension, i.e. result[:, k...] == onehot(xs[k...], labels).

Note that xs can be any iterable, such as a string, and that using a tuple for labels will often speed up construction, certainly for fewer than 32 classes.

    Examples

julia> oh = onehotbatch("abracadabra", 'a':'e', 'e')
5×11 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
 1  ⋅  ⋅  1  ⋅  1  ⋅  1  ⋅  ⋅  1
 ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅
 ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  1  ⋅

julia> reshape(1:15, 3, 5) * oh  # this matrix multiplication is done efficiently
3×11 Matrix{Int64}:
 1  4  13  1  7  1  10  1  4  13  1
 2  5  14  2  8  2  11  2  5  14  2
 3  6  15  3  9  3  12  3  6  15  3

OneHotArrays.OneHotArray — Type
OneHotArray{T, N, M, I} <: AbstractArray{Bool, M}
OneHotArray(indices, L)

    A one-hot M-dimensional array with L labels (i.e. size(A, 1) == L and sum(A, dims=1) == 1) stored as a compact N == M-1-dimensional array of indices.

    Typically constructed by onehot and onehotbatch. Parameter I is the type of the underlying storage, and T its eltype.

OneHotArrays.OneHotVector — Type
OneHotVector{T} = OneHotArray{T, 0, 1, T}
OneHotVector(indices, L)

    A one-hot vector with L labels (i.e. length(A) == L and count(A) == 1) typically constructed by onehot. Stored efficiently as a single index of type T, usually UInt32.

OneHotArrays.OneHotMatrix — Type
OneHotMatrix{T, I} = OneHotArray{T, 1, 2, I}
OneHotMatrix(indices, L)

    A one-hot matrix (with L labels) typically constructed using onehotbatch. Stored efficiently as a vector of indices with type I and eltype T.


    Flat vs. Nested Structures

    A Flux model is a nested structure, with parameters stored within many layers. Sometimes you may want a flat representation of them, to interact with functions expecting just one vector. This is provided by destructure:

julia> model = Chain(Dense(2=>1, tanh), Dense(1=>1))
Chain(
  Dense(2 => 1, tanh),                  # 3 parameters
  Dense(1 => 1),                        # 2 parameters
)                   # Total: 4 arrays, 5 parameters, 276 bytes.

julia> flat, rebuild = Flux.destructure(model)
(Float32[0.863101, 1.2454957, 0.0, -1.6345707, 0.0], Restructure(Chain, ..., 5))

julia> rebuild(zeros(5))  # same structure, new parameters
Chain(
  Dense(2 => 1, tanh),                  # 3 parameters  (all zero)
  Dense(1 => 1),                        # 2 parameters  (all zero)
)                   # Total: 4 arrays, 5 parameters, 276 bytes.

    Both destructure and the Restructure function can be used within gradient computations. For instance, this computes the Hessian ∂²L/∂θᵢ∂θⱼ of some loss function, with respect to all parameters of the Flux model. The resulting matrix has off-diagonal entries, which cannot really be expressed in a nested structure:

julia> x = rand(Float32, 2, 16);

julia> grad = gradient(m -> sum(abs2, m(x)), model)  # nested gradient
((layers = ((weight = Float32[10.339018 11.379145], bias = Float32[22.845667], σ = nothing), (weight = Float32[-29.565302;;], bias = Float32[-37.644184], σ = nothing)),),)

julia> function loss(v::Vector)
         m = rebuild(v)
         y = m(x)
         sum(abs2, y)
       end;

julia> gradient(loss, flat)  # flat gradient, same numbers
(Float32[10.339018, 11.379145, 22.845667, -29.565302, -37.644184],)

julia> Zygote.hessian(loss, flat)  # second derivative
5×5 Matrix{Float32}:
  -7.13131   -5.54714  -11.1393  -12.6504   -8.13492
  -5.54714   -7.11092  -11.0208  -13.9231   -9.36316
 -11.1393   -11.0208   -13.7126  -27.9531  -22.741
 -12.6504   -13.9231   -27.9531   18.0875   23.03
  -8.13492   -9.36316  -22.741    23.03     32.0

julia> Flux.destructure(grad)  # acts on non-models, too
(Float32[10.339018, 11.379145, 22.845667, -29.565302, -37.644184], Restructure(Tuple, ..., 5))
    Flux ≤ 0.12

    Old versions of Flux had an entirely different implementation of destructure, which had many bugs (and almost no tests). Many comments online still refer to that now-deleted function, or to memories of it.

    All Parameters

    The function destructure now lives in Optimisers.jl. (Be warned this package is unrelated to the Flux.Optimisers sub-module! The confusion is temporary.)

Optimisers.destructure — Function
    destructure(model) -> vector, reconstructor

    Copies all trainable, isnumeric parameters in the model to a vector, and returns also a function which reverses this transformation. Differentiable.

    Example

julia> v, re = destructure((x=[1.0, 2.0], y=(sin, [3.0 + 4.0im])))
(ComplexF64[1.0 + 0.0im, 2.0 + 0.0im, 3.0 + 4.0im], Restructure(NamedTuple, ..., 3))

julia> re([3, 5, 7+11im])
(x = [3.0, 5.0], y = (sin, ComplexF64[7.0 + 11.0im]))

If model contains various number types, they are promoted to make the vector, and are usually restored by Restructure. Such restoration follows the rules of ChainRulesCore.ProjectTo, and thus will restore floating point precision, but will permit more exotic numbers like ForwardDiff.Dual.

    If model contains only GPU arrays, then vector will also live on the GPU. At present, a mixture of GPU and ordinary CPU arrays is undefined behaviour.

Optimisers.trainable — Function
    trainable(x::Layer) -> NamedTuple

    This may be overloaded to make optimisers ignore some fields of every Layer, which would otherwise contain trainable parameters.

    Warning

    This is very rarely required. Fields of struct Layer which contain functions, or integers like sizes, are always ignored anyway. Overloading trainable is only necessary when some arrays of numbers are to be optimised, and some arrays of numbers are not.

    The default is Functors.children(x), usually a NamedTuple of all fields, and trainable(x) must contain a subset of these.
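As a sketch, a hypothetical layer where only one of two numeric arrays should be optimised might look like this (AffineLayer is invented for illustration; in real code trainable would be defined as a method of Optimisers.trainable, not as a local function):

```julia
# Hypothetical layer: optimise `scale`, keep `shift` frozen even though
# it is an array of numbers.
struct AffineLayer
    scale::Vector{Float64}
    shift::Vector{Float64}   # numeric array, but deliberately not trained
end

# Return only the subset of fields that optimisers should touch.
trainable(a::AffineLayer) = (; scale = a.scale)

a = AffineLayer([1.0, 2.0], [0.0, 0.0])
```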

Optimisers.isnumeric — Function
    isnumeric(x) -> Bool

    Returns true on any parameter to be adjusted by Optimisers.jl, namely arrays of non-integer numbers. Returns false on all other types.

    Requires also that Functors.isleaf(x) == true, to focus on e.g. the parent of a transposed matrix, not the wrapper.

    All Layers

    Another kind of flat view of a nested model is provided by the modules command. This extracts a list of all layers:

Flux.modules — Function
    modules(m)

    Return an iterator over non-leaf objects that can be reached by recursing m over the children given by functor.

    Useful for applying a function (e.g. a regularizer) over specific modules or subsets of the parameters (e.g. the weights but not the biases).

    Examples

julia> m1 = Chain(Dense(28^2, 64), BatchNorm(64, relu));

julia> m2 = Chain(m1, Dense(64, 10))
Chain(
  Chain(
    Dense(784 => 64),                   # 50_240 parameters
    BatchNorm(64, relu),                # 128 parameters, plus 128
  ),
  Dense(64 => 10),                      # 650 parameters
)         # Total: 6 trainable arrays, 51_018 parameters,
          # plus 2 non-trainable, 128 parameters, summarysize 200.312 KiB.

julia> Flux.modules(m2)
7-element Vector{Any}:
 Chain(Chain(Dense(784 => 64), BatchNorm(64, relu)), Dense(64 => 10))  # 51_018 parameters, plus 128 non-trainable
 (Chain(Dense(784 => 64), BatchNorm(64, relu)), Dense(64 => 10))
 Chain(Dense(784 => 64), BatchNorm(64, relu))  # 50_368 parameters, plus 128 non-trainable
 (Dense(784 => 64), BatchNorm(64, relu))
 Dense(784 => 64)    # 50_240 parameters
 BatchNorm(64, relu)  # 128 parameters, plus 128 non-trainable
 Dense(64 => 10)     # 650 parameters

julia> L2(m) = sum(sum(abs2, l.weight) for l in Flux.modules(m) if l isa Dense)
L2 (generic function with 1 method)

julia> L2(m2) isa Float32
true

    Save and Load

Flux.state — Function
    state(x)

    Return an object with the same nested structure as x according to Functors.children, but made only of basic containers (e.g. named tuples, tuples, arrays, and dictionaries).

    Besides trainable and non-trainable arrays, the state will contain leaf nodes that are not arrays, such as numbers, symbols, strings, and nothing values. The leaf types that end up in the state could increase in the future.

This method is particularly useful for saving and loading models, since the state contains only simple data types that can be easily serialized.

    The state can be passed to loadmodel! to restore the model.

    Examples

    Copy the state into another model

julia> m1 = Chain(Dense(1, 2, tanh; init=ones), Dense(2, 1; init=ones));

julia> s = Flux.state(m1)
(layers = ((weight = [1.0; 1.0;;], bias = [0.0, 0.0], σ = ()), (weight = [1.0 1.0], bias = [0.0], σ = ())),)

julia> m2 = Chain(Dense(1, 2, tanh), Dense(2, 1; bias=false));  # weights are random numbers

julia> Flux.loadmodel!(m2, s);

julia> m2[1].weight   # now the weights of m2 are the same as m1
2×1 Matrix{Float32}:
 1.0
 1.0

julia> Flux.state(trainmode!(Dropout(0.2)))  # contains p & activity, but not RNG state
(p = 0.2, dims = (), active = true, rng = ())

julia> Flux.state(BatchNorm(1))  # contains non-trainable arrays μ, σ²
(λ = (), β = Float32[0.0], γ = Float32[1.0], μ = Float32[0.0], σ² = Float32[1.0], ϵ = 1.0f-5, momentum = 0.1f0, affine = true, track_stats = true, active = nothing, chs = 1)

    Save and load with BSON

julia> using BSON

julia> BSON.@save "checkpoint.bson" model_state = s

julia> Flux.loadmodel!(m2, BSON.load("checkpoint.bson")[:model_state])

    Save and load with JLD2

julia> using JLD2

julia> JLD2.jldsave("checkpoint.jld2", model_state = s)

julia> Flux.loadmodel!(m2, JLD2.load("checkpoint.jld2", "model_state"))
Flux.loadmodel! — Function
    loadmodel!(dst, src)

    Copy all the parameters (trainable and non-trainable) from src into dst.

Recursively walks dst and src together using Functors.children, calling copyto! on parameter arrays, or throwing an error when there is a mismatch. Non-array elements (such as activation functions) are not copied and need not match. Zero bias vectors and bias=false are considered equivalent (see extended help for more details).

    See also Flux.state.

    Examples

julia> dst = Chain(Dense(Flux.ones32(2, 5), Flux.ones32(2), tanh), Dense(2 => 1; bias = [1f0]))
Chain(
  Dense(5 => 2, tanh),                  # 12 parameters
  Dense(2 => 1),                        # 3 parameters
)                   # Total: 4 arrays, 15 parameters, 316 bytes.

julia> dst[1].weight ≈ ones(2, 5)  # by construction
true

julia> src = Chain(Dense(5 => 2, relu), Dense(2 => 1, bias=false));

julia> Flux.loadmodel!(dst, src);

julia> dst[1].weight ≈ ones(2, 5)  # values changed
false

julia> iszero(dst[2].bias)
true

    Extended help

    Throws an error when:

    • dst and src do not share the same fields (at any level)
    • the sizes of leaf nodes are mismatched between dst and src
    • copying non-array values to/from an array parameter (except inactive parameters described below)
    • dst is a "tied" parameter (i.e. refers to another parameter) and loaded into multiple times with mismatched source values

    Inactive parameters can be encoded by using the boolean value false instead of an array. If dst == false and src is an all-zero array, no error will be raised (and no values copied); however, attempting to copy a non-zero array to an inactive parameter will throw an error. Likewise, copying a src value of false to any dst array is valid, but copying a src value of true will error.
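The compatibility rule above can be sketched as a plain predicate (compatible is illustrative only, not a Flux function): copying is legal exactly when no information would be lost.

```julia
# `false` encodes an absent (implicitly all-zero) parameter.
compatible(dst::Bool, src::AbstractArray) = dst === false && iszero(src)  # no data lost
compatible(dst::AbstractArray, src::Bool) = src === false                 # copy zeros in
compatible(dst::AbstractArray, src::AbstractArray) = size(dst) == size(src)
```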


    The Julia Ecosystem around Flux

    One of the main strengths of Julia lies in an ecosystem of packages globally providing a rich and consistent user experience.

    This is a non-exhaustive list of Julia packages, nicely complementing Flux in typical machine learning and deep learning workflows. To add your project please send a PR. See also academic work citing Flux or citing Zygote.

    Flux models

    • Flux's model-zoo contains examples from many domains.

    Computer vision

    • ObjectDetector.jl provides ready-to-go image detection via YOLO.
    • Metalhead.jl includes many state-of-the-art computer vision models which can easily be used for transfer learning.
    • UNet.jl is a generic UNet implementation.

    Natural language processing

    • Transformers.jl provides components for Transformer models for NLP, as well as providing several trained models out of the box.
    • TextAnalysis.jl provides several NLP algorithms that use Flux models under the hood.

    Reinforcement learning

    • AlphaZero.jl provides a generic, simple and fast implementation of DeepMind's AlphaZero algorithm.
    • ReinforcementLearning.jl offers a collection of tools for doing reinforcement learning research in Julia.

    Graph learning

    • GraphNeuralNetworks.jl is a fresh, performant and flexible graph neural network library based on Flux.jl.
    • GeometricFlux.jl is the first graph neural network library for Julia.
    • NeuralOperators.jl enables training neural operators for infinite-dimensional PDEs by learning a continuous function instead of using the finite element method.
    • SeaPearl.jl is a Constraint Programming solver that uses Reinforcement Learning based on graphs as input.

    Time series

    Robust networks

    • RobustNeuralNetworks.jl includes classes of neural networks that are constructed to naturally satisfy robustness constraints.

    Tools closely associated with Flux

    Utility tools you're unlikely to have met if you never used Flux!

    High-level training flows

    • FastAI.jl is a Julia port of Python's fast.ai library.
    • FluxTraining.jl is a package for using and writing powerful, extensible training loops for deep learning models. It supports callbacks for many common use cases like hyperparameter scheduling, metrics tracking and logging, checkpointing, early stopping, and more. It powers training in FastAI.jl.

    Datasets

    Commonly used machine learning datasets are provided by the following packages in the Julia ecosystem:

    Plumbing

    Tools to put data into the right order for creating a model.

    • Augmentor.jl is a real-time image augmentation library for increasing the number of training images.
    • DataAugmentation.jl aims to make it easy to build stochastic, label-preserving augmentation pipelines for vision use cases involving images, keypoints and segmentation masks.
    • MLUtils.jl (replaces MLDataUtils.jl and MLLabelUtils.jl) is a library for processing Machine Learning datasets.

    Parameters


    Differentiable programming

    Packages based on differentiable programming but not necessarily related to Machine Learning.

    • The SciML ecosystem uses Flux and Zygote to mix neural nets with differential equations, to get the best of black box and mechanistic modelling.
    • DiffEqFlux.jl provides tools for creating Neural Differential Equations.
    • Flux3D.jl shows off machine learning on 3D data.
    • RayTracer.jl combines ML with computer vision via a differentiable renderer.
    • Duckietown.jl is a differentiable Duckietown simulator.
    • The Yao.jl project uses Flux and Zygote for Quantum Differentiable Programming.
    • AtomicGraphNets.jl enables learning graph based models on atomic systems used in chemistry.
    • DiffImages.jl provides differentiable computer vision modeling in Julia with the Images.jl ecosystem.

    Probabilistic programming

    • Turing.jl extends Flux's differentiable programming capabilities to probabilistic programming.
    • Omega.jl is a research project aimed at causal, higher-order probabilistic programming.
    • Stheno.jl provides flexible Gaussian processes.

    Statistics


    Useful miscellaneous packages

    Some useful and random packages!

    • AdversarialPrediction.jl provides a way to easily optimise generic performance metrics in supervised learning settings using the Adversarial Prediction framework.
    • Mill.jl helps to prototype flexible multi-instance learning models.
    • MLMetrics.jl is a utility for scoring models in data science and machine learning.
    • Torch.jl exposes torch in Julia.
    • ValueHistories.jl is a utility for efficient tracking of optimization histories, training curves or other information of arbitrary types and at arbitrarily spaced sampling times.
    • InvertibleNetworks.jl provides building blocks for invertible neural networks in the Julia programming language.
    • ProgressMeter.jl provides progress meters for long-running computations.
    • TensorBoardLogger.jl provides easy logging to TensorBoard from Julia.
    • ArgParse.jl is a package for parsing command-line arguments to Julia programs.
    • Parameters.jl provides types with default field values, keyword constructors and (un-)pack macros.
    • BSON.jl is a package for working with the Binary JSON serialisation format.
    • DataFrames.jl provides in-memory tabular data in Julia.
    • DrWatson.jl is a scientific project assistant software.

    This tight integration among Julia packages is shown in some of the examples in the model-zoo repository.


    Alternatives to Flux

    Julia has several other libraries for making neural networks.

    • SimpleChains.jl is focused on making small, simple, CPU-based, neural networks fast. Uses LoopVectorization.jl. (Was FastChain in DiffEqFlux.jl)

    • Knet.jl is a neural network library built around AutoGrad.jl.

    • Lux.jl (earlier ExplicitFluxLayers.jl) shares much of the design, use-case, and NNlib.jl / Optimisers.jl back-end of Flux. But instead of encapsulating all parameters within the model structure, it separates this into 3 components: a model, a tree of parameters, and a tree of model states.

    Explicit or explicit?

    Flux's training docs talk about changes from Zygote's implicit to explicit gradients, dictionary-like to tree-like structures. (See also Zygote's description of these.) Lux also uses Zygote, but uses the word "explicit" to mean something unrelated, namely storing the tree of parameters (and of state) separately from the model.

    diff --git a/previews/PR2365/gpu/index.html b/previews/PR2365/gpu/index.html new file mode 100644 index 0000000000..d6a62ab5ce --- /dev/null +++ b/previews/PR2365/gpu/index.html @@ -0,0 +1,237 @@ + +GPU Support · Flux

    GPU Support

    Starting with v0.14, Flux doesn't force a specific GPU backend and its package dependencies on users. Thanks to the package extension mechanism introduced in Julia v1.9, Flux conditionally loads GPU-specific code once a GPU package is made available (e.g. through using CUDA).

    NVIDIA GPU support requires the packages CUDA.jl and cuDNN.jl to be installed in the environment. In the Julia REPL, type ] add CUDA, cuDNN to install them. For more details see the CUDA.jl readme.

    AMD GPU support is available since Julia 1.9 on systems with ROCm and MIOpen installed. For more details refer to the AMDGPU.jl repository.

    Metal GPU acceleration is available on Apple Silicon hardware. For more details refer to the Metal.jl repository. Metal support in Flux is experimental and many features are not yet available.

    In order to trigger GPU support in Flux, you need to call using CUDA, using AMDGPU or using Metal in your code. Note that for CUDA, explicitly loading cuDNN is not required, but the package has to be installed in the environment.

    Flux ≤ 0.13

    Old versions of Flux automatically installed CUDA.jl to provide GPU support. Starting from Flux v0.14, CUDA.jl is not a dependency anymore and has to be installed manually.

    Checking GPU Availability

    By default, Flux runs checks on your system to see whether it can support GPU functionality. You can check whether Flux identified a valid GPU setup by typing the following:

    julia> using CUDA
    +
    +julia> CUDA.functional()
    +true

    For AMD GPU:

    julia> using AMDGPU
    +
    +julia> AMDGPU.functional()
    +true
    +
    +julia> AMDGPU.functional(:MIOpen)
    +true

    For Metal GPU:

    julia> using Metal
    +
    +julia> Metal.functional()
    +true

    Selecting GPU backend

    Available GPU backends are: CUDA, AMDGPU and Metal.

    Flux relies on Preferences.jl for selecting the default GPU backend.

    There are two ways you can specify it:

    • From the REPL/code in your project, call Flux.gpu_backend!("AMDGPU") and restart your Julia session (if needed) for the change to take effect.
    • In the LocalPreferences.toml file in your project directory, specify:
    [Flux]
    +gpu_backend = "AMDGPU"

    The current GPU backend can be read from the Flux.GPU_BACKEND variable:

    julia> Flux.GPU_BACKEND
    +"CUDA"

    The current backend will affect the behaviour of methods like gpu, described below.

    Basic GPU Usage

    Support for array operations on other hardware backends, like GPUs, is provided by external packages like CUDA.jl, AMDGPU.jl, and Metal.jl. Flux is agnostic to array types, so we simply need to move model weights and data to the GPU and Flux will handle it.

    For example, we can use CUDA.CuArray (with the cu converter) to run our basic example on an NVIDIA GPU.

    (Note that you need to have CUDA available to use CUDA.CuArray – please see the CUDA.jl instructions for more details.)

    using CUDA
    +
    +W = cu(rand(2, 5)) # a 2×5 CuArray
    +b = cu(rand(2))
    +
    +predict(x) = W*x .+ b
    +loss(x, y) = sum((predict(x) .- y).^2)
    +
    +x, y = cu(rand(5)), cu(rand(2)) # Dummy data
    +loss(x, y) # ~ 3

    Note that we convert both the parameters (W, b) and the data set (x, y) to CUDA arrays. Taking derivatives and training works exactly as before.
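    For instance, taking an explicit gradient of this loss works on GPU arrays just as it does on the CPU. This is a minimal sketch continuing the snippet above:

    ```julia
    # Gradients with respect to W and b; since the forward pass runs
    # entirely on the GPU, both gradients come back as CuArrays too.
    gW, gb = gradient((W, b) -> sum((W*x .+ b .- y).^2), W, b)
    ```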

    If you define a structured model, like a Dense layer or Chain, you just need to convert the internal parameters. Flux provides fmap, which allows you to alter all parameters of a model at once.

    d = Dense(10 => 5, σ)
    +d = fmap(cu, d)
    +d.weight # CuArray
    +d(cu(rand(10))) # CuArray output
    +
    +m = Chain(Dense(10 => 5, σ), Dense(5 => 2), softmax)
    +m = fmap(cu, m)
    +m(cu(rand(10)))

    As a convenience, Flux provides the gpu function to convert models and data to the GPU if one is available; if not, it does nothing. So you can safely call gpu on some data or model (as shown below), and the code will not error regardless of whether a GPU is available. If a GPU library (e.g. CUDA) loads successfully, gpu will move data from the CPU to the GPU. As shown below, this changes the type of something like a regular Array to a CuArray.

    julia> using Flux, CUDA
    +
    +julia> m = Dense(10, 5) |> gpu
    +Dense(10 => 5)      # 55 parameters
    +
    +julia> x = rand(10) |> gpu
    +10-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:
    + 0.066846445
    + ⋮
    + 0.76706964
    +
    +julia> m(x)
    +5-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:
    + -0.99992573
    + ⋮
    + -0.547261

    The analogue cpu is also available for moving models and data back off of the GPU.

    julia> x = rand(10) |> gpu
    +10-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:
    + 0.8019236
    + ⋮
    + 0.7766742
    +
    +julia> x |> cpu
    +10-element Vector{Float32}:
    + 0.8019236
    + ⋮
    + 0.7766742

    Transferring Training Data

    In order to train the model using the GPU both model and the training data have to be transferred to GPU memory. Moving the data can be done in two different ways:

    1. Iterating over the batches in a DataLoader object, transferring each training batch to the GPU as it is needed. This is recommended for large datasets. Done by hand, it might look like this:

      train_loader = Flux.DataLoader((X, Y), batchsize=64, shuffle=true)
      +# ... model definition, optimiser setup
      +for epoch in 1:epochs
      +    for (x_cpu, y_cpu) in train_loader
      +        x = gpu(x_cpu)
      +        y = gpu(y_cpu)
      +        grads = gradient(m -> loss(m, x, y), model)
      +        Flux.update!(opt_state, model, grads[1])
      +    end
      +end

      Rather than write this out every time, you can just call gpu(::DataLoader):

      gpu_train_loader = Flux.DataLoader((X, Y), batchsize=64, shuffle=true) |> gpu
      +# ... model definition, optimiser setup
      +for epoch in 1:epochs
      +    for (x, y) in gpu_train_loader
      +        grads = gradient(m -> loss(m, x, y), model)
      +        Flux.update!(opt_state, model, grads[1])
      +    end
      +end

      This is equivalent to DataLoader(MLUtils.mapobs(gpu, (X, Y)); keywords...). Something similar can also be done with CUDA.CuIterator, gpu_train_loader = CUDA.CuIterator(train_loader). However, this only works with a limited number of data types: first(train_loader) should be a tuple (or NamedTuple) of arrays.

    2. Transferring all training data to the GPU at once before creating the DataLoader. This is usually performed for smaller datasets which are sure to fit in the available GPU memory.

      gpu_train_loader = Flux.DataLoader((X, Y) |> gpu, batchsize = 32)
      +# ...
      +for epoch in 1:epochs
      +    for (x, y) in gpu_train_loader
      +        # ...

      Here (X, Y) |> gpu applies gpu to both arrays, as it recurses into structures.

    Saving GPU-Trained Models

    After the training process is done, one must always transfer the trained model back to CPU memory before serializing or saving it to disk. This can be done, as described in the previous section, with:

    model = cpu(model) # or model = model |> cpu

    and then

    using BSON
    +# ...
    +BSON.@save "./path/to/trained_model.bson" model
    +
    +# in this approach the cpu-transferred model (referenced by the variable `model`)
    +# only exists inside the `let` statement
    +let model = cpu(model)
    +   # ...
    +   BSON.@save "./path/to/trained_model.bson" model
    +end
    +
    +# is equivalent to the above, but uses `key=value` storing directive from BSON.jl
    +BSON.@save "./path/to/trained_model.bson" model = cpu(model)

    The reason is that a model trained on the GPU but not transferred back to CPU memory will expect CuArrays as input. In other words, Flux models expect input data from the same kind of device on which they were trained.

    In controlled scenarios, where the data fed to the loaded model is guaranteed to be on the GPU, there is no need to transfer it back to CPU memory. In production environments, however, where artifacts are shared among different processes, machines or configurations, there is no guarantee that the CUDA.jl package will be available to the process performing inference on the model loaded from disk.
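    For example, once the model has been saved after moving it to the CPU, it can be loaded and used for inference on a machine with no GPU packages installed at all. This is a sketch; the path and input size are illustrative:

    ```julia
    using Flux, BSON

    # Load the CPU-resident model saved earlier; CUDA.jl need not be installed.
    BSON.@load "./path/to/trained_model.bson" model

    x = rand(Float32, 10)   # ordinary CPU array, matching the model's input size
    y = model(x)            # inference runs on the CPU
    ```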

    Disabling CUDA or choosing which GPUs are visible to Flux

    On a system with multiple GPUs, it is sometimes necessary to control which GPUs are visible to Julia, or to disable GPUs entirely. This can be achieved with the CUDA_VISIBLE_DEVICES environment variable.

    To disable all devices:

    $ export CUDA_VISIBLE_DEVICES='-1'

    To select specific devices by device id:

    $ export CUDA_VISIBLE_DEVICES='0,1'

    More information for conditional use of GPUs in CUDA.jl can be found in its documentation, and information about the specific use of the variable is described in the Nvidia CUDA blog post.
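    Since CUDA.jl reads this variable when it initialises, it can also be set from within Julia, provided this happens before CUDA.jl is loaded. A sketch:

    ```julia
    # Must run before `using CUDA`; once CUDA.jl has initialised,
    # changing this variable has no effect for the current session.
    ENV["CUDA_VISIBLE_DEVICES"] = "0"

    using CUDA
    ```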

    Using device objects

    As a more convenient syntax, Flux allows the usage of GPU device objects, which can be used to easily transfer models to GPUs (defaulting to the CPU if no GPU backend is available). This syntax has a few advantages, including automatic selection of the GPU backend and type stability of data movement. To do this, use the Flux.get_device function.

    Flux.get_device first checks for a GPU preference and, if possible, returns a device for the preferred backend. For instance, consider the following example, where we load the CUDA.jl package to use an NVIDIA GPU ("CUDA" is the default preference):

    julia> using Flux, CUDA;
    +
    +julia> device = Flux.get_device(; verbose=true)   # returns handle to an NVIDIA GPU
    +[ Info: Using backend set in preferences: CUDA.
    +(::Flux.FluxCUDADevice) (generic function with 1 method)
    +
    +julia> device.deviceID      # check the id of the GPU
    +CuDevice(0): NVIDIA GeForce GTX 1650
    +
    +julia> model = Dense(2 => 3);
    +
    +julia> model.weight     # the model initially lives in CPU memory
    +3×2 Matrix{Float32}:
    + -0.984794  -0.904345
    +  0.720379  -0.486398
    +  0.851011  -0.586942
    +
    +julia> model = model |> device      # transfer model to the GPU
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> model.weight
    +3×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
    + -0.984794  -0.904345
    +  0.720379  -0.486398
    +  0.851011  -0.586942
    +

    The device preference can also be set via the Flux.gpu_backend! function. For instance, below we first set our device preference to "CPU":

    julia> using Flux; Flux.gpu_backend!("CPU")
    +┌ Info: New GPU backend set: CPU.
    +└ Restart your Julia session for this change to take effect!

    Then, after restarting the Julia session, Flux.get_device returns a handle to the "CPU":

    julia> using Flux, CUDA;    # even if CUDA is loaded, we'll still get a CPU device
    +
    +julia> device = Flux.get_device(; verbose=true)   # get a CPU device
    +[ Info: Using backend set in preferences: CPU.
    +(::Flux.FluxCPUDevice) (generic function with 1 method)
    +
    +julia> model = Dense(2 => 3);
    +
    +julia> model = model |> device
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> model.weight     # no change; model still lives on CPU
    +3×2 Matrix{Float32}:
    + -0.942968   0.856258
    +  0.440009   0.714106
    + -0.419192  -0.471838

    Clearly, this means that the same code will work for any GPU backend and the CPU.
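    A device-agnostic script might therefore look like this (a minimal sketch):

    ```julia
    using Flux
    # Load a GPU package here (e.g. `using CUDA`) to get a GPU device;
    # without one, `get_device` falls back to the CPU.
    device = Flux.get_device()

    model = Dense(2 => 3) |> device   # lives on whichever device was selected
    ```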

    If the preference backend isn't available or isn't functional, then Flux.get_device looks for a CUDA, AMDGPU or Metal backend, and returns a corresponding device (if the backend is available and functional). Otherwise, a CPU device is returned. In the below example, the GPU preference is "CUDA":

    julia> using Flux;      # preference is CUDA, but CUDA.jl not loaded
    +
    +julia> device = Flux.get_device(; verbose=true)       # this will resort to automatic device selection
    +[ Info: Using backend set in preferences: CUDA.
    +┌ Warning: Trying to use backend: CUDA but it's trigger package is not loaded.
    +│ Please load the package and call this function again to respect the preferences backend.
    +└ @ Flux ~/fluxml/Flux.jl/src/functor.jl:637
    +[ Info: Using backend: CPU.
    +(::Flux.FluxCPUDevice) (generic function with 1 method)

    For detailed information about how the backend is selected, check the documentation for Flux.get_device.

    Data movement across GPU devices

    Flux also supports getting handles to specific GPU devices, and transferring models from one GPU device to another GPU device from the same backend. Let's try it out for NVIDIA GPUs. First, we list all the available devices:

    julia> using Flux, CUDA;
    +
    +julia> CUDA.devices()
    +CUDA.DeviceIterator() for 3 devices:
    +0. GeForce RTX 2080 Ti
    +1. GeForce RTX 2080 Ti
    +2. TITAN X (Pascal)
    +

    Then, let's select the device with id 0:

    julia> device0 = Flux.get_device("CUDA", 0)        # the currently supported values for backend are "CUDA" and "AMDGPU"
    +(::Flux.FluxCUDADevice) (generic function with 1 method)
    +

    Then, let's move a simple dense layer to the GPU represented by device0:

    julia> dense_model = Dense(2 => 3)
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> dense_model = dense_model |> device0;
    +
    +julia> dense_model.weight
    +3×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
    +  0.695662   0.816299
    + -0.204763  -0.10232
    + -0.955829   0.538412
    +
    +julia> CUDA.device(dense_model.weight)      # check the GPU to which dense_model is attached
    +CuDevice(0): GeForce RTX 2080 Ti
    +

    Next, we'll get a handle to the device with id 1, and move dense_model to that device:

    julia> device1 = Flux.get_device("CUDA", 1)
    +(::Flux.FluxCUDADevice) (generic function with 1 method)
    +
    +julia> dense_model = dense_model |> device1;    # don't directly print the model; see warning below
    +
    +julia> CUDA.device(dense_model.weight)
    +CuDevice(1): GeForce RTX 2080 Ti
    +

    Due to a limitation in Metal.jl, this kind of data movement across devices is currently only supported for the CUDA and AMDGPU backends.

    Printing models after moving to a different device

    Due to a limitation in how GPU packages currently work, printing models on the REPL after moving them to a GPU device which is different from the current device will lead to an error.

    Flux.AbstractDeviceType
    Flux.AbstractDevice <: Function

    An abstract type representing device objects for different GPU backends. The currently supported backends are "CUDA", "AMDGPU", "Metal" and "CPU"; the "CPU" backend is the fallback case when no GPU is available. GPU extensions of Flux define subtypes of this type.

    source
    Flux.FluxCPUDeviceType
    Flux.FluxCPUDevice <: Flux.AbstractDevice

    A type representing device objects for the "CPU" backend for Flux. This is the fallback case when no GPU is available to Flux.

    source
    Flux.FluxCUDADeviceType
    FluxCUDADevice <: AbstractDevice

    A type representing device objects for the "CUDA" backend for Flux.

    source
    Flux.FluxAMDGPUDeviceType
    FluxAMDGPUDevice <: AbstractDevice

    A type representing device objects for the "AMDGPU" backend for Flux.

    source
    Flux.FluxMetalDeviceType
    FluxMetalDevice <: AbstractDevice

    A type representing device objects for the "Metal" backend for Flux.

    source
    Flux.supported_devicesFunction
    Flux.supported_devices()

    Get all supported backends for Flux, in order of preference.

    Example

    julia> using Flux;
    +
    +julia> Flux.supported_devices()
    +("CUDA", "AMDGPU", "Metal", "CPU")
    source
    Flux.get_deviceFunction
    Flux.get_device(; verbose=false)::Flux.AbstractDevice

    Returns a device object for the most appropriate backend for the current Julia session.

    First, the function checks whether a backend preference has been set via the Flux.gpu_backend! function. If so, an attempt is made to load this backend. If the corresponding trigger package has been loaded and the backend is functional, a device corresponding to the given backend is loaded. Otherwise, the backend is chosen automatically. To update the backend preference, use Flux.gpu_backend!.

    If there is no preference, then for each of the "CUDA", "AMDGPU", "Metal" and "CPU" backends in the given order, this function checks whether the given backend has been loaded via the corresponding trigger package, and whether the backend is functional. If so, the device corresponding to the backend is returned. If no GPU backend is available, a Flux.FluxCPUDevice is returned.

    If verbose is set to true, then the function prints informative log messages.

    Examples

    For the example given below, the backend preference was set to "AMDGPU" via the gpu_backend! function.

    julia> using Flux;
    +
    +julia> model = Dense(2 => 3)
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> device = Flux.get_device(; verbose=true)       # this will just load the CPU device
    +[ Info: Using backend set in preferences: AMDGPU.
    +┌ Warning: Trying to use backend: AMDGPU but it's trigger package is not loaded.
    +│ Please load the package and call this function again to respect the preferences backend.
    +└ @ Flux ~/fluxml/Flux.jl/src/functor.jl:638
    +[ Info: Using backend: CPU.
    +(::Flux.FluxCPUDevice) (generic function with 1 method)
    +
    +julia> model = model |> device
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> model.weight
    +3×2 Matrix{Float32}:
    + -0.304362  -0.700477
    + -0.861201   0.67825
    + -0.176017   0.234188

    Here is the same example, but using "CUDA":

    julia> using Flux, CUDA;
    +
    +julia> model = Dense(2 => 3)
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> device = Flux.get_device(; verbose=true)
    +[ Info: Using backend set in preferences: AMDGPU.
    +┌ Warning: Trying to use backend: AMDGPU but it's trigger package is not loaded.
    +│ Please load the package and call this function again to respect the preferences backend.
    +└ @ Flux ~/fluxml/Flux.jl/src/functor.jl:637
    +[ Info: Using backend: CUDA.
    +(::Flux.FluxCUDADevice) (generic function with 1 method)
    +
    +julia> model = model |> device
    +Dense(2 => 3)       # 9 parameters
    +
    +julia> model.weight
    +3×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
    +  0.820013   0.527131
    + -0.915589   0.549048
    +  0.290744  -0.0592499
    source
    Flux.get_device(backend::String, idx::Int = 0)::Flux.AbstractDevice

    Get a device object for a backend specified by the string backend and idx. The currently supported values of backend are "CUDA", "AMDGPU" and "CPU". idx must be an integer value between 0 and the number of available devices.

    Examples

    julia> using Flux, CUDA;
    +
    +julia> CUDA.devices()
    +CUDA.DeviceIterator() for 3 devices:
    +0. GeForce RTX 2080 Ti
    +1. GeForce RTX 2080 Ti
    +2. TITAN X (Pascal)
    +
    +julia> device0 = Flux.get_device("CUDA", 0)
    +(::Flux.FluxCUDADevice) (generic function with 1 method)
    +
    +julia> device0.deviceID
    +CuDevice(0): GeForce RTX 2080 Ti
    +
    +julia> device1 = Flux.get_device("CUDA", 1)
    +(::Flux.FluxCUDADevice) (generic function with 1 method)
    +
    +julia> device1.deviceID
    +CuDevice(1): GeForce RTX 2080 Ti
    +
    +julia> cpu_device = Flux.get_device("CPU")
    +(::Flux.FluxCPUDevice) (generic function with 1 method)
    +
    source
    diff --git a/previews/PR2365/index.html b/previews/PR2365/index.html new file mode 100644 index 0000000000..c58d7f7731 --- /dev/null +++ b/previews/PR2365/index.html @@ -0,0 +1,6 @@ + +Welcome · Flux

    Flux: The Julia Machine Learning Library

    Flux is a library for machine learning. It comes "batteries-included" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. We follow a few key principles:

    • Doing the obvious thing. Flux has relatively few explicit APIs. Instead, writing down the mathematical form will work – and be fast.
    • Extensible by default. Flux is written to be highly flexible while being performant. Extending Flux is as simple as using your own code as part of the model you want - it is all high-level Julia code.
    • Play nicely with others. Flux works well with unrelated Julia libraries from images to differential equation solvers, rather than duplicating them.

    Installation

    Download Julia 1.9 or later, preferably the current stable release. You can add Flux using Julia's package manager, by typing ] add Flux in the Julia prompt. For Nvidia GPU support, you will also need to install the CUDA and the cuDNN packages. For AMD GPU support, install the AMDGPU package. For acceleration on Apple Silicon, install the Metal package.

    Learning Flux

    The quick start page trains a simple neural network.

    The rest of the guide provides a from-scratch introduction to Flux's take on models and how they work, starting with fitting a line. Once you understand these docs, congratulations, you also understand Flux's source code, which is intended to be concise, legible and a good reference for more advanced concepts.

    There are some tutorials about building particular models. The model zoo has starting points for many other common ones. And finally, the ecosystem page lists packages which define Flux models.

    The reference section includes, beside Flux's own functions, those of some companion packages: Zygote.jl (automatic differentiation), Optimisers.jl (training) and others.

    Community

    Everyone is welcome to join our community on the Julia discourse forum, or the slack chat (channel #machine-learning). If you have questions or issues we'll try to help you out.

    If you're interested in hacking on Flux, the source code is open and easy to understand – it's all just the same Julia code you work with normally. You might be interested in our intro issues to get started, or our contributing guide.

    diff --git a/previews/PR2365/models/activation/index.html b/previews/PR2365/models/activation/index.html new file mode 100644 index 0000000000..96466356b4 --- /dev/null +++ b/previews/PR2365/models/activation/index.html @@ -0,0 +1,433 @@ + +Activation Functions · Flux

    Activation Functions from NNlib.jl

    These non-linearities used between layers of your model are exported by the NNlib package.

    Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call σ.(xs), relu.(xs) and so on. Alternatively, they can be passed to a layer like Dense(784 => 1024, relu) which will handle this broadcasting.

    Functions like softmax are sometimes described as activation functions, but not by Flux. They must see all the outputs, and hence cannot be broadcast. See the next page for details.
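    For example, broadcasting applies a scalar activation elementwise, while a layer handles this for you:

    ```julia
    using Flux

    xs = [-1.0f0, 0.5f0, 2.0f0]
    relu.(xs)                      # elementwise: Float32[0.0, 0.5, 2.0]
    layer = Dense(3 => 2, relu)    # the layer broadcasts relu over its output
    ```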

    Alphabetical Listing

    NNlib.celuFunction
    celu(x, α=1) = x ≥ 0 ? x : α * (exp(x/α) - 1)

    Activation function from "Continuously Differentiable Exponential Linear Units".

    julia> lineplot(celu, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐        
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉│ celu(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│        
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡧⠶⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⠤⠔⠒⠋⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        -1 │⠤⠤⠤⠤⠔⠒⠒⠒⠊⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    +
    +julia> celu(-10f0)
    +-0.9999546f0
    NNlib.eluFunction
    elu(x, α=1) = x > 0 ? x : α * (exp(x) - 1)

    Exponential Linear Unit activation function. See "Fast and Accurate Deep Network Learning by Exponential Linear Units". You can also specify the coefficient explicitly, e.g. elu(x, 1).

    julia> lineplot(elu, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐       
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉│ elu(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀│       
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│       
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│       
    +           │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡧⠶⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│       
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⠤⠔⠒⠋⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│       
    +        -1 │⠤⠤⠤⠤⠔⠒⠒⠒⠊⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│       
    +           └────────────────────────────────────────┘       
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀       
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀       
    +
    +julia> elu(-10f0)
    +-0.9999546f0
    +
    +julia> elu(-10f0, 2)
    +-1.9999092f0
    NNlib.geluFunction
    gelu(x) = 0.5x * (1 + tanh(√(2/π) * (x + 0.044715x^3)))

    Activation function from "Gaussian Error Linear Units".

    julia> lineplot(gelu, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐        
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊│ gelu(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠊⠁⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀│        
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⣀⡠⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⣤⣤⣤⣤⣤⣤⣤⣤⡤⠤⠤⠤⠤⠤⠤⠤⣤⣤⣤⡤⡧⠶⠶⠭⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠉⠉⠉⠉⠉⠉⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        -1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    +
    +julia> lineplot(gelu, -5, 0, height=7);
    +
    +julia> lineplot!(ans, swish)
    +             ┌────────────────────────────────────────┐         
    +           0 │⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠒⠒⠤⣄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸│ gelu(x) 
    +             │⠑⠒⠢⠤⣄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇│ swish(x)
    +             │⠀⠀⠀⠀⠀⠈⠉⠒⠤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣸⠁│         
    +   f(x)      │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠒⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⢄⠀⠀⠀⠀⠀⠀⠀⠀⢠⡇⠀│         
    +             │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠓⣄⠀⠀⠀⠀⠀⢠⡞⠀⠀│         
    +             │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠓⢄⣀⣀⡤⢣⠃⠀⠀│         
    +        -0.2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢠⠇⠀⠀⠀│         
    +             └────────────────────────────────────────┘         
    +             ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀0⠀         
    +             ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         
    NNlib.hardsigmoidFunction
    hardσ(x) = max(0, min(1, (x + 3) / 6))

    Piecewise linear approximation of sigmoid.

    julia> lineplot(hardsigmoid, -5, 5, height=7)
    +          ┌────────────────────────────────────────┐         
    +        1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡠⠖⠋⠉⠉⠉⠉⠉⠉⠉⠉│ hardσ(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⣀⡤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⡠⠔⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⡗⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠊⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠋⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +        0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⠤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          └────────────────────────────────────────┘         
    +          ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀         
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         
    +
    +julia> lineplot(sigmoid, -5, 5, height=7)
    +          ┌────────────────────────────────────────┐     
    +        1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡠⠤⠖⠒⠒⠋⠉⠉⠉⠉⠉⠉│ σ(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⣀⠔⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⡏⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡔⠋⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠊⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +        0 │⣀⣀⣀⣀⣀⣀⣀⠤⠤⠤⠒⠊⠉⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          └────────────────────────────────────────┘     
    +          ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀     
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀     
    NNlib.hardswishFunction
    hardswish(x) = x * hardσ(x)

    Hard-Swish activation function. See "Searching for MobileNetV3".

    julia> lineplot(hardswish, -2, 5, height = 7)
    +           ┌────────────────────────────────────────┐             
    +         5 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠔⠒⠉│ hardswish(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠔⠒⠉⠁⠀⠀⠀⠀│             
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠖⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│             
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│             
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⢀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│             
    +           │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣇⣤⣤⣖⣚⣉⣁⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀│             
    +        -1 │⠉⠒⠒⠒⠒⠉⠉⠉⠉⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│             
    +           └────────────────────────────────────────┘             
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀             
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀             
    +
    +julia> lineplot(hardswish, -4, 0, height = 7);
    +
    +julia> lineplot!(ans, swish)
    +             ┌────────────────────────────────────────┐             
    +           0 │⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⢣⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡜│ hardswish(x)
    +             │⠒⠒⠢⠤⢄⣀⡀⠀⠀⠀⠀⠱⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⠎⠀│ swish(x)    
    +             │⠀⠀⠀⠀⠀⠀⠈⠉⠑⠒⠦⢄⣘⢄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡴⠃⠀⠀│             
    +   f(x)      │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠑⡖⠦⢄⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⢔⠏⠁⠀⠀⠀│             
    +             │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠣⣄⠀⠉⠑⠒⠦⠤⢄⣀⣀⣀⣀⡠⠤⠖⣊⠕⠁⠀⠀⠀⠀⠀│             
    +             │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠓⠤⡀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠁⠀⠀⠀⠀⠀⠀⠀│             
    +        -0.4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠒⠢⠤⠤⠔⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│             
    +             └────────────────────────────────────────┘             
    +             ⠀-4⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀0⠀             
    +             ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀             
    +
    +julia> hardswish.(-5:5)'
    +1×11 adjoint(::Vector{Float64}) with eltype Float64:
    + -0.0  -0.0  -0.0  -0.333333  -0.333333  0.0  0.666667  1.66667  3.0  4.0  5.0
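The row of values above follows directly from the definition hardswish(x) = x * hardσ(x), with hardσ(x) = max(0, min(1, (x + 3) / 6)). A small Python sketch (illustrative, not NNlib code):

```python
def hardsigmoid(x):
    # piecewise linear approximation of sigmoid
    return max(0.0, min(1.0, (x + 3.0) / 6.0))

def hardswish(x):
    # hardswish(x) = x * hardσ(x)
    return x * hardsigmoid(x)

print([round(hardswish(x), 4) for x in range(-5, 6)])
# [-0.0, -0.0, -0.0, -0.3333, -0.3333, 0.0, 0.6667, 1.6667, 3.0, 4.0, 5.0]
```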
    NNlib.hardtanhFunction
    hardtanh(x) = max(-1, min(1, x))

    Segment-wise linear approximation of tanh, much cheaper to compute. See "Large Scale Machine Learning".

    See also tanh_fast.

    julia> lineplot(hardtanh, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐            
    +         1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⠔⠋⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│ hardtanh(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⣀⡤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⢀⡤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡷⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠖⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠖⠋⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +        -1 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⠔⠋⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           └────────────────────────────────────────┘            
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀            
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x
    +
    +julia> lineplot(tanh, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐        
    +         1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠤⠤⠒⠒⠒⠊⠉⠉⠉│ tanh(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⢀⡤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡷⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠔⠊⠁⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        -1 │⣀⣀⣀⡠⠤⠤⠤⠖⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    NNlib.leakyreluFunction
    leakyrelu(x, a=0.01) = max(a*x, x)

    Leaky Rectified Linear Unit activation function. You can also specify the coefficient explicitly, e.g. leakyrelu(x, 0.01).

    julia> lineplot(x -> leakyrelu(x, 0.5), -2, 2, height=7)
    +           ┌────────────────────────────────────────┐       
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉│ #42(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀│       
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│       
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│       
    +           │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⣤⣤⡤⡧⠶⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│       
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⠤⠤⠒⠒⠋⠉⠁⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│       
    +        -1 │⣀⣀⠤⠤⠒⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│       
    +           └────────────────────────────────────────┘       
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀       
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀       
    +
    +julia> leakyrelu(-10f0, 0.2)
    +-2.0f0
    +
    +julia> leakyrelu(-10f0, 0.02)
    +-0.5f0
    NNlib.lishtFunction
    lisht(x) = x * tanh(x)

    Activation function from "LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent ..."

    julia> lineplot(lisht, -2, 2, height=7)
    +          ┌────────────────────────────────────────┐         
    +        2 │⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔│ lisht(x)
    +          │⠀⠈⠑⢦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀│         
    +          │⠀⠀⠀⠀⠈⠣⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠁⠀⠀⠀⠀│         
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠑⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠊⠁⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠢⡄⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⠔⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⢄⡀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⢀⡠⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +        0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠦⣄⣀⣀⣇⣀⣀⠤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          └────────────────────────────────────────┘         
    +          ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀         
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         
    +
    +julia> lineplot!(ans, logcosh)
    +          ┌────────────────────────────────────────┐           
    +        2 │⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔│ lisht(x)  
    +          │⠀⠈⠑⢦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀│ logcosh(x)
    +          │⠢⣄⠀⠀⠈⠣⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠁⠀⠀⣀⠔│           
    +   f(x)   │⠀⠈⠑⠢⣀⠀⠀⠑⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠊⠁⠀⣀⠔⠊⠁⠀│           
    +          │⠀⠀⠀⠀⠀⠉⠢⢄⡀⠉⠢⡄⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⠔⠋⠀⡠⠔⠋⠁⠀⠀⠀⠀│           
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⠦⣌⡓⢄⡀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⢀⡠⠖⣁⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀│           
    +        0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⠪⠷⣦⣄⣀⣀⣇⣀⣀⣤⠶⠕⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│           
    +          └────────────────────────────────────────┘           
    +          ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀           
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀           
    NNlib.logcoshFunction
    logcosh(x)

    Return log(cosh(x)), computed in a numerically stable way.

    julia> lineplot(logcosh, -5, 5, height=7)
    +          ┌────────────────────────────────────────┐           
    +        5 │⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ logcosh(x)
    +          │⠉⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠋│           
    +          │⠀⠀⠀⠑⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠊⠁⠀⠀│           
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠑⠦⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠊⠁⠀⠀⠀⠀⠀│           
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⠦⡀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│           
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠦⡀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⢀⡤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│           
    +        0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠑⠢⢄⣀⣀⣇⣀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│           
    +          └────────────────────────────────────────┘           
    +          ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀           
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀           
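What "numerically stable" means here: cosh(x) overflows for even moderately large |x|, so the naive log(cosh(x)) fails although the result itself is small. One standard rewriting (an assumption about the general technique, not NNlib's exact code) uses log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2):

```python
import math

def logcosh_stable(x):
    # cosh(x) = exp(|x|) * (1 + exp(-2|x|)) / 2, so the exp never overflows
    a = abs(x)
    return a + math.log1p(math.exp(-2.0 * a)) - math.log(2.0)

# math.log(math.cosh(1000.0)) raises OverflowError,
# while the rewritten form stays finite:
print(logcosh_stable(1000.0))   # ≈ 1000 - log(2)
print(logcosh_stable(0.0))      # 0.0
```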
    NNlib.logsigmoidFunction
    logσ(x)

    Return log(σ(x)), computed in a numerically stable way.

    julia> lineplot(logsigmoid, -5, 5, height=7)
    +           ┌────────────────────────────────────────┐        
    +         0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡧⠤⠔⠒⠒⠒⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│ logσ(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠉⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +   f(x)    │⠀⠀⠀⠀⠀⠀⢀⡤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⣀⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⡤⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        -6 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
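As with logcosh, the stability matters because σ(x) underflows to 0 for very negative x, making the naive log(σ(x)) return -Inf. One standard stable rewriting (an assumption about the technique, not NNlib's exact code) is logσ(x) = min(x, 0) - log1p(exp(-|x|)):

```python
import math

def logsigmoid_stable(x):
    # log(sigmoid(x)) = min(x, 0) - log1p(exp(-|x|)); the exp never overflows
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

# Naively, sigmoid(-1000) rounds to 0 and its log is -inf;
# the stable form returns the correct value:
print(logsigmoid_stable(-1000.0))  # -1000.0
print(logsigmoid_stable(0.0))      # -log(2) ≈ -0.6931
```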
    NNlib.mishFunction
    mish(x) = x * tanh(softplus(x))

    Activation function from "Mish: A Self Regularized Non-Monotonic Neural Activation Function".

    julia> lineplot(mish, -5, 5, height=7)
    +           ┌────────────────────────────────────────┐        
    +         5 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠖⠋│ mish(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠒⠁⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠔⠋⠁⠀⠀⠀⠀⠀⠀│        
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⢀⡠⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡤⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣧⣔⣊⣁⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀│        
    +        -1 │⠀⠀⠀⠀⠀⠀⠀⠀⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    NNlib.reluFunction
    relu(x) = max(0, x)

    Rectified Linear Unit activation function.

    julia> lineplot(relu, -2, 2, height=7)
    +          ┌────────────────────────────────────────┐        
    +        2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠋│ relu(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠊⠁⠀⠀│        
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀⠀⠀⠀⠀│        
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀│        
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⡠⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⡠⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣇⠔⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +          └────────────────────────────────────────┘        
    +          ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀        
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    NNlib.relu6Function
    relu6(x) = min(max(0, x), 6)

    Rectified Linear Unit activation function capped at 6. See "Convolutional Deep Belief Networks on CIFAR-10".

    julia> lineplot(relu6, -10, 10, height=7)
    +          ┌────────────────────────────────────────┐         
    +        6 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠎⠉⠉⠉⠉⠉⠉⠉⠉│ relu6(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡔⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⡤⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⡠⠎⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⡔⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +        0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⡧⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          └────────────────────────────────────────┘         
    +          ⠀-10⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀10⠀         
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         
    NNlib.rreluFunction
    rrelu(x, lo=1/8, hi=1/3) = max(a*x, x)
    +# where `a` is randomly sampled from uniform distribution `U(lo, hi)`

    Randomized Leaky Rectified Linear Unit activation function. See "Empirical Evaluation of Rectified Activations". You can also specify the bounds explicitly, e.g. rrelu(x, 0.0, 1.0).

    julia> lineplot(rrelu, -20, 10, height=7)
    +            ┌────────────────────────────────────────┐         
    +         10 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋│ rrelu(x)
    +            │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀│         
    +            │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⢀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀│         
    +   f(x)     │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⠤⣤⣤⢤⣤⣤⠤⠤⠤⢼⠮⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│         
    +            │⣰⢀⣆⡄⣄⡄⡠⡰⠦⠷⡜⢢⠷⠳⠢⠊⠉⠉⠀⠀⠁⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +            │⠃⠉⠙⠘⠃⠈⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +        -10 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +            └────────────────────────────────────────┘         
    +            ⠀-20⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀10⠀         
    +            ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         
    +
    +julia> extrema(rrelu.(fill(-10f0, 1000)))
    +(-3.3316886f0, -1.2548422f0)
    NNlib.seluFunction
    selu(x) = λ * (x ≥ 0 ? x : α * (exp(x) - 1))
    +
    +λ ≈ 1.05070...
    +α ≈ 1.67326...

    Scaled exponential linear units. See "Self-Normalizing Neural Networks".

    julia> lineplot(selu, -3, 2, height=7)
    +           ┌────────────────────────────────────────┐        
    +         3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ selu(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠤⠒│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⢀⣀⠤⠖⠊⠉⠀⠀⠀⠀│        
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⣀⡠⠤⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⣉⠭⠛⡏⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⡤⠤⠒⠊⠉⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        -2 │⠤⠤⠖⠒⠒⠒⠒⠒⠒⠒⠉⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    +
    +julia> selu(-10f0)
    +-1.7580194f0
    NNlib.sigmoidFunction
    σ(x) = 1 / (1 + exp(-x))

    Classic sigmoid activation function. Unicode σ can be entered as \sigma then tab, in many editors. The ascii name sigmoid is also exported.

    See also sigmoid_fast.

    julia> using UnicodePlots
    +
    +julia> lineplot(sigmoid, -5, 5, height=7)
    +          ┌────────────────────────────────────────┐     
    +        1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡠⠤⠖⠒⠒⠋⠉⠉⠉⠉⠉⠉│ σ(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⣀⠔⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⡏⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡔⠋⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠊⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +        0 │⣀⣀⣀⣀⣀⣀⣀⠤⠤⠤⠒⠊⠉⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│     
    +          └────────────────────────────────────────┘     
    +          ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀     
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀     
    +
    +julia> sigmoid === σ
    +true
    NNlib.sigmoid_fastFunction
    sigmoid_fast(x)

    This is a faster, and very slightly less accurate, version of sigmoid. For `x::Float32`, it is perhaps 3 times faster, with maximum errors of 2 eps instead of 1.

    See also tanh_fast.

    julia> sigmoid(0.2f0)
    +0.54983395f0
    +
    +julia> sigmoid_fast(0.2f0)
    +0.54983395f0
    +
    +julia> hardσ(0.2f0)
    +0.53333336f0
    NNlib.softplusFunction
    softplus(x) = log(exp(x) + 1)

    See "Deep Sparse Rectifier Neural Networks", JMLR 2011.

    julia> lineplot(softplus, -3, 3, height=7)
    +          ┌────────────────────────────────────────┐            
    +        4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ softplus(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠│            
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠁⠀│            
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠔⠊⠁⠀⠀⠀⠀⠀│            
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⣀⡠⠤⠒⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⡧⠤⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +        0 │⣀⣀⣀⣀⣀⣀⣀⡠⠤⠤⠤⠤⠔⠒⠒⠚⠉⠉⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +          └────────────────────────────────────────┘            
    +          ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀            
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀            
    +
    +julia> lineplot!(ans, relu)
    +          ┌────────────────────────────────────────┐            
    +        4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ softplus(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠│ relu(x)    
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠⡴⠞⠋⠁│            
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣤⡴⠞⠋⠁⠀⠀⠀⠀│            
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⣀⡠⢤⡲⠝⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀│            
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⡧⠤⠒⠊⣉⠥⠚⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +        0 │⣀⣀⣀⣀⣀⣀⣀⣠⣤⣤⣤⣤⣔⣒⣒⣚⣉⣉⣁⣀⣇⠴⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +          └────────────────────────────────────────┘            
    +          ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀            
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀            
    +
    +julia> softplus(16f0)
    +16.0f0
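The result softplus(16f0) == 16.0f0 is not a bug: rewriting log(exp(x) + 1) as max(x, 0) + log1p(exp(-|x|)) shows the correction term at x = 16 is about 1.1e-7, below Float32 precision at that magnitude. A Python sketch of this stable form (an assumption about the technique, not NNlib's exact code):

```python
import math

def softplus_stable(x):
    # log(exp(x) + 1) = max(x, 0) + log1p(exp(-|x|)); safe for large |x|
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

print(softplus_stable(16.0))   # ≈ 16.0000001 (correction ≈ 1.1e-7)
print(softplus_stable(0.0))    # log(2) ≈ 0.6931
```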
    NNlib.softshrinkFunction
    softshrink(x, λ=0.5) =
    +    (x ≥ λ ? x - λ : (-λ ≥ x ? x + λ : 0))

    See "Softshrink Activation Function".

    julia> lineplot(softshrink, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐              
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀│ softshrink(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⡤⠔⠒⠉⠁│              
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⣀⡤⠤⠒⠋⠁⠀⠀⠀⠀⠀⠀│              
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⣤⡤⠤⠤⠤⠤⠤⠤⡧⠤⠤⠤⠤⠶⠮⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│              
    +           │⠀⠀⠀⠀⠀⠀⢀⣀⠤⠖⠒⠉⠁⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +           │⠀⣀⠤⠔⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +        -2 │⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +           └────────────────────────────────────────┘              
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀              
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀              
    +
    +julia> lineplot!(ans, tanhshrink)
    +           ┌────────────────────────────────────────┐              
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀│ softshrink(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⡤⠔⠒⣉⡡│ tanhshrink(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⣀⡤⠤⣒⣋⠥⠤⠒⠊⠉⠁⠀│              
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⣤⣤⣤⣤⡤⠤⠤⠤⠤⠤⠤⡷⠶⠶⠶⠶⠶⠾⠿⠯⠭⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤│              
    +           │⠀⢀⣀⡠⠤⠖⢒⣋⠭⠗⠒⠉⠁⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +           │⠊⣉⠤⠔⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +        -2 │⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +           └────────────────────────────────────────┘              
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀              
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀
    +
    +julia> softshrink.((-10f0, 10f0))
    +(-9.5f0, 9.5f0)
    NNlib.softsignFunction
    softsign(x) = x / (1 + |x|)

    See "Quadratic Polynomials Learn Better Image Features" (2009).

    julia> lineplot(softsign, -5, 5, height=7)
    +           ┌────────────────────────────────────────┐            
    +         1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⣀⣀⠤⠤⠤⠤⠤│ softsign(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⡤⠖⠒⠋⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⡔⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡯⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠁⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⠤⠤⠒⠋⠁⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +        -1 │⠒⠒⠒⠒⠒⠊⠉⠉⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           └────────────────────────────────────────┘            
    +           ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀            
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀            
    +
    +julia> lineplot!(ans, tanh)
    +           ┌────────────────────────────────────────┐            
    +         1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡤⠖⠊⠉⠉⠉⣉⣉⣉⣉⣉⠭⠭⠭⠭⠭│ softsign(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⡔⣃⡤⠖⠒⠋⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ tanh(x)    
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣧⡞⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡯⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡴⠃⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⠤⠤⠒⢋⠕⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +        -1 │⣒⣒⣒⣒⣒⣊⣉⣉⣉⣉⣁⣀⣀⡠⠤⠒⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│            
    +           └────────────────────────────────────────┘            
    +           ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀            
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀            
    +
    +julia> softsign(1f0)
    +0.5f0
    +
    +julia> softsign(100f0)
    +0.990099f0
    NNlib.swishFunction
    swish(x) = x * σ(x)

    Self-gated activation function. See "Swish: a Self-Gated Activation Function".

    julia> lineplot(swish, -2, 2, height=7)
    +           ┌────────────────────────────────────────┐         
    +         2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤│ swish(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋⠁⠀│         
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀│         
    +   f(x)    │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⢀⣀⡤⠔⠊⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +           │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⣤⣤⡤⡧⠴⠶⠯⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│         
    +           │⠉⠑⠒⠒⠒⠒⠒⠒⠒⠒⠒⠒⠉⠉⠉⠉⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +        -1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +           └────────────────────────────────────────┘         
    +           ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀         
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         
    NNlib.tanhshrinkFunction
    tanhshrink(x) = x - tanh(x)

    See "Tanhshrink Activation Function".

    julia> lineplot(tanhshrink, -3, 3, height=7)
    +           ┌────────────────────────────────────────┐              
    +         3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ tanhshrink(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠤⠖⠊│              
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⣀⡠⠤⠒⠊⠉⠁⠀⠀⠀⠀│              
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⣤⡤⠤⠤⠤⠤⠤⠤⡷⠶⠶⠶⠶⠶⠮⠭⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│              
    +           │⠀⠀⠀⠀⠀⣀⡠⠴⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +           │⡠⠴⠒⠊⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +        -3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│              
    +           └────────────────────────────────────────┘              
    +           ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀              
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀              
    +
    +julia> tanhshrink.((-10f0, 10f0))
    +(-9.0f0, 9.0f0)
    NNlib.tanh_fastFunction
    tanh_fast(x)

    This is a faster but slightly less accurate version of tanh.

    Where Julia's tanh function has an error under 2 eps, this may be wrong by 5 eps, a reduction by less than one decimal digit.

    For x::Float32 this is usually about 10 times faster, with a smaller speedup for x::Float64. For any other number types, it just calls tanh.

    See also sigmoid_fast.

    julia> tanh(0.5f0)
    +0.46211717f0
    +
    +julia> tanh_fast(0.5f0)
    +0.46211714f0
    +
    +julia> hardtanh(0.5f0)
    +0.5f0
    NNlib.treluFunction
    trelu(x, theta=1) = x > theta ? x : 0

    Threshold gated rectified linear activation function. See "Zero-bias autoencoders and the benefits of co-adapting features".

    julia> lineplot(trelu, -2, 4, height=7)
    +          ┌────────────────────────────────────────┐         
    +        4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋│ trelu(x)
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀│         
    +   f(x)   │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠴⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⣠⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⡏⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +        0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣇⣀⣀⣀⣀⣀⣀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│         
    +          └────────────────────────────────────────┘         
    +          ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀4⠀         
    +          ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀         

    One More

    Julia's Base.Math also provides tanh, which can be used as an activation function.

    Note that many Flux layers will automatically replace this with NNlib.tanh_fast when called, as Base's tanh is slow enough to sometimes be a bottleneck.

    julia> using UnicodePlots
    +
    +julia> lineplot(tanh, -3, 3, height=7)
    +           ┌────────────────────────────────────────┐        
    +         1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⣀⠤⠔⠒⠒⠉⠉⠉⠉⠉⠉⠉⠉⠉│ tanh(x)
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⡠⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⡰⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +   f(x)    │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⡤⡯⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠎⠁⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠴⠊⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +        -1 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⡤⠤⠔⠒⠉⠁⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│        
    +           └────────────────────────────────────────┘        
    +           ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀        
    +           ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀        
    Custom Layers · Flux

    Defining Customised Layers

    Here we describe the usage of some more advanced features that Flux provides, which give more control over model building.

    Custom Model Example

    Here is a basic example of a custom model. It simply adds the input to the result from the neural network.

    struct CustomModel
    +  chain::Chain
    +end
    +
    +function (m::CustomModel)(x)
    +  # Arbitrary code can go here, but note that everything will be differentiated.
    +  # Zygote does not allow some operations, like mutating arrays.
    +
    +  return m.chain(x) + x
    +end
    +
    +# Call @functor to allow for training. Described below in more detail.
    +Flux.@functor CustomModel

    You can then use the model like:

    chain = Chain(Dense(10, 10))
    +model = CustomModel(chain)
    +model(rand(10))

    For an intro to Flux and automatic differentiation, see this tutorial.

    Customising Parameter Collection for a Model

    Taking reference from our example Affine layer from the basics.

    By default, all of the fields in the Affine type are collected as its parameters. However, in some cases you may want to hold other metadata in a "layer" that is not needed for training, and should therefore be ignored when parameters are collected. With Flux, the way to mark some fields of a layer as trainable is to overload the trainable function:

    julia> Flux.@functor Affine
    +
    +julia> a = Affine(Float32[1 2; 3 4; 5 6], Float32[7, 8, 9])
    +Affine(Float32[1.0 2.0; 3.0 4.0; 5.0 6.0], Float32[7.0, 8.0, 9.0])
    +
    +julia> Flux.params(a) # default behavior
    +Params([Float32[1.0 2.0; 3.0 4.0; 5.0 6.0], Float32[7.0, 8.0, 9.0]])
    +
    +julia> Flux.trainable(a::Affine) = (; a.W)  # returns a NamedTuple using the field's name
    +
    +julia> Flux.params(a)
    +Params([Float32[1.0 2.0; 3.0 4.0; 5.0 6.0]])

    Only the fields returned by trainable will be collected as trainable parameters of the layer when calling Flux.params, and only these fields will be seen by Flux.setup and Flux.update! for training. But all fields will be seen by gpu and similar functions, for example:

    julia> a |> f16
    +Affine(Float16[1.0 2.0; 3.0 4.0; 5.0 6.0], Float16[7.0, 8.0, 9.0])

    Note that there is no need to overload trainable to hide fields which do not contain trainable parameters. (For example, activation functions, or Boolean flags.) These are always ignored by params and by training:

    julia> Flux.params(Affine(true, [10, 11, 12.0]))
    +Params([])

    It is also possible to further restrict what fields are seen by writing @functor Affine (W,). However, this is not recommended. This requires the struct to have a corresponding constructor that accepts only W as an argument, and the ignored fields will not be seen by functions like gpu (which is usually undesired).

    Freezing Layer Parameters

    When we do not want to include all of the model's parameters (e.g. for transfer learning), we can simply omit those layers from our call to params.

    Flux ≤ 0.14

    The mechanism described here is for Flux's old "implicit" training style. When upgrading for Flux 0.15, it should be replaced by freeze! and thaw!.

    Consider a simple multi-layer perceptron model where we want to avoid optimising the first two Dense layers. We can obtain this using the slicing features Chain provides:

    m = Chain(
    +      Dense(784 => 64, relu),
    +      Dense(64 => 64, relu),
    +      Dense(64 => 10)
    +    );
    +
    +ps = Flux.params(m[3:end])

    The Zygote.Params object ps now holds a reference to only the parameters of the layers passed to it.

    During training, the gradients will only be computed for (and applied to) the last Dense layer, therefore only that would have its parameters changed.

    Flux.params also takes multiple inputs, making it easy to collect parameters from heterogeneous models with a single call. As a simple demonstration, suppose we want to omit optimising the second Dense layer in the previous example. It would look something like this:

    Flux.params(m[1], m[3:end])

    Sometimes more fine-grained control is needed. We can freeze a specific parameter of a specific layer which has already entered a Params object ps, by simply deleting it from ps:

    ps = Flux.params(m)
    +delete!(ps, m[2].bias) 

    Custom multiple input or output layer

    Sometimes a model needs to receive several separate inputs at once or produce several separate outputs at once. In other words, there are multiple paths within this high-level layer, each processing a different input or producing a different output. A simple example of this in the machine learning literature is the inception module.

    Naively, we could have a struct that stores the weights along each path and implement the joining/splitting in the forward pass function. But that would mean a new struct any time the operations along each path change. Instead, this guide will show you how to construct a high-level layer (like Chain) that is made of multiple sub-layers for each path.

    Multiple inputs: a custom Join layer

    Our custom Join layer will accept multiple inputs at once, pass each input through a separate path, then combine the results together. Note that this layer can already be constructed using Parallel, but we will first walk through how to do this manually.

    We start by defining a new struct, Join, that stores the different paths and a combine operation as its fields.

    using Flux
    +using CUDA
    +
    +# custom join layer
    +struct Join{T, F}
    +  combine::F
    +  paths::T
    +end
    +
    +# allow Join(op, m1, m2, ...) as a constructor
    +Join(combine, paths...) = Join(combine, paths)

    Notice that we parameterized the type of the paths field. This is necessary for fast Julia code; in general, T might be a Tuple or Vector, but we don't need to pay attention to what it specifically is. The same goes for the combine field.

    The next step is to use Functors.@functor to make our struct behave like a Flux layer. This is important so that calling params on a Join returns the underlying weight arrays on each path.

    Flux.@functor Join

    Finally, we define the forward pass. For Join, this means applying each path in paths to each input array, then using combine to merge the results.

    (m::Join)(xs::Tuple) = m.combine(map((f, x) -> f(x), m.paths, xs)...)
    +(m::Join)(xs...) = m(xs)

    Lastly, we can test our new layer. Thanks to the proper abstractions in Julia, our layer works on GPU arrays out of the box!

    model = Chain(
    +              Join(vcat,
    +                   Chain(Dense(1 => 5, relu), Dense(5 => 1)), # branch 1
    +                   Dense(1 => 2),                             # branch 2
    +                   Dense(1 => 1)                              # branch 3
    +                  ),
    +              Dense(4 => 1)
    +             ) |> gpu
    +
    +xs = map(gpu, (rand(1), rand(1), rand(1)))
    +
    +model(xs)
    +# returns a single float vector with one value
    Note

    This Join layer is available from the Fluxperimental.jl package.

    Using Parallel

    Flux already provides Parallel that can offer the same functionality. In this case, Join is going to just be syntactic sugar for Parallel.

    Join(combine, paths) = Parallel(combine, paths)
    +Join(combine, paths...) = Join(combine, paths)
    +
    +# use vararg/tuple version of Parallel forward pass
    +model = Chain(
    +              Join(vcat,
    +                   Chain(Dense(1 => 5, relu), Dense(5 => 1)),
    +                   Dense(1 => 2),
    +                   Dense(1 => 1)
    +                  ),
    +              Dense(4 => 1)
    +             ) |> gpu
    +
    +xs = map(gpu, (rand(1), rand(1), rand(1)))
    +
    +model(xs)
    +# returns a single float vector with one value

    Multiple outputs: a custom Split layer

    Our custom Split layer will accept a single input, then pass it through separate paths to produce multiple outputs.

    We start by following the same steps as the Join layer: define a struct, use Functors.@functor, and define the forward pass.

    using Flux
    +using CUDA
    +
    +# custom split layer
    +struct Split{T}
    +  paths::T
    +end
    +
    +Split(paths...) = Split(paths)
    +
    +Flux.@functor Split
    +
    +(m::Split)(x::AbstractArray) = map(f -> f(x), m.paths)

    Now we can test to see that our Split does indeed produce multiple outputs.

    model = Chain(
    +              Dense(10 => 5),
    +              Split(Dense(5 => 1, tanh), Dense(5 => 3, tanh), Dense(5 => 2))
    +             ) |> gpu
    +
    +model(gpu(rand(10)))
    +# returns a tuple with three float vectors

    A custom loss function for the multiple outputs may look like this:

    using Statistics
    +
    +# assuming model returns the output of a Split
    +# x is a single input
    +# ys is a tuple of outputs
    +function loss(x, ys, model)
    +  # rms over all the mse
    +  ŷs = model(x)
    +  return sqrt(mean(Flux.mse(y, ŷ) for (y, ŷ) in zip(ys, ŷs)))
    +end
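    To see the pieces fit together, here is a self-contained check of this loss on dummy data (the Split definition is repeated from above so the snippet runs on its own; the sizes are illustrative):

```julia
using Flux, Statistics

# Split as defined above, repeated so this block is standalone
struct Split{T}
  paths::T
end
Split(paths...) = Split(paths)
Flux.@functor Split
(m::Split)(x::AbstractArray) = map(f -> f(x), m.paths)

function loss(x, ys, model)
  ŷs = model(x)
  return sqrt(mean(Flux.mse(y, ŷ) for (y, ŷ) in zip(ys, ŷs)))
end

model = Chain(Dense(10 => 5),
              Split(Dense(5 => 1, tanh), Dense(5 => 3, tanh), Dense(5 => 2)))

x = rand(Float32, 10)
ys = (rand(Float32, 1), rand(Float32, 3), rand(Float32, 2))
loss(x, ys, model)  # a single non-negative scalar
```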
    Note

    This Split layer is available from the Fluxperimental.jl package.

    diff --git a/previews/PR2365/models/basics/index.html b/previews/PR2365/models/basics/index.html new file mode 100644 index 0000000000..103be1410f --- /dev/null +++ b/previews/PR2365/models/basics/index.html @@ -0,0 +1,130 @@ + +Gradients and Layers · Flux

    How Flux Works: Gradients and Layers

    Taking Gradients

    Flux's core feature is taking gradients of Julia code. The gradient function takes another Julia function f and a set of arguments, and returns the gradient with respect to each argument. (It's a good idea to try pasting these examples in the Julia terminal.)

    julia> using Flux
    +
    +julia> f(x) = 3x^2 + 2x + 1;
    +
    +julia> df(x) = gradient(f, x)[1]; # df/dx = 6x + 2
    +
    +julia> df(2)
    +14.0
    +
    +julia> d2f(x) = gradient(df, x)[1]; # d²f/dx² = 6
    +
    +julia> d2f(2)
    +6.0

    When a function has many parameters, we can get gradients of each one at the same time:

    julia> f(x, y) = sum((x .- y).^2);
    +
    +julia> gradient(f, [2, 1], [2, 0])
    +([0.0, 2.0], [-0.0, -2.0])

    These gradients are based on x and y. Flux works by instead taking gradients based on the weights and biases that make up the parameters of a model.

    A machine learning model can have hundreds of parameter arrays. Instead of passing them to gradient individually, we can store them together in a structure. The simplest example is a named tuple, created by the following syntax:

    julia> nt = (a = [2, 1], b = [2, 0], c = tanh);
    +
    +julia> g(x::NamedTuple) = sum(abs2, x.a .- x.b);
    +
    +julia> g(nt)
    +1
    +
    +julia> dg_nt = gradient(g, nt)[1]
    +(a = [0.0, 2.0], b = [-0.0, -2.0], c = nothing)

    Notice that gradient has returned a matching structure. The field dg_nt.a is the gradient for nt.a, and so on. Some fields have no gradient, indicated by nothing.

    Rather than define a function like g every time (and think up a name for it), it is often useful to use anonymous functions: this one is x -> sum(abs2, x.a .- x.b). Anonymous functions can be defined either with -> or with do, and such do blocks are often useful if you have a few steps to perform:

    julia> gradient((x, y) -> sum(abs2, x.a ./ y .- x.b), nt, [1, 2])
    +((a = [0.0, 0.5], b = [-0.0, -1.0], c = nothing), [-0.0, -0.25])
    +
    +julia> gradient(nt, [1, 2]) do x, y
    +         z = x.a ./ y
    +         sum(abs2, z .- x.b)
    +       end
    +((a = [0.0, 0.5], b = [-0.0, -1.0], c = nothing), [-0.0, -0.25])

    Sometimes you may want to know the value of the function, as well as its gradient. Rather than calling the function a second time, you can call withgradient instead:

    julia> Flux.withgradient(g, nt)
    +(val = 1, grad = ((a = [0.0, 2.0], b = [-0.0, -2.0], c = nothing),))
    Implicit gradients

    Flux used to handle many parameters in a different way, using the params function. This uses a method of gradient which takes a zero-argument function, and returns a dictionary through which the resulting gradients can be looked up:

    julia> x = [2, 1];
    +
    +julia> y = [2, 0];
    +
    +julia> gs = gradient(Flux.params(x, y)) do
    +         f(x, y)
    +       end
    +Grads(...)
    +
    +julia> gs[x]
    +2-element Vector{Float64}:
    + 0.0
    + 2.0
    +
    +julia> gs[y]
    +2-element Vector{Float64}:
    + -0.0
    + -2.0

    Building Simple Models

    Consider a simple linear regression, which tries to predict an output array y from an input x.

    W = rand(2, 5)
    +b = rand(2)
    +
    +predict(x) = W*x .+ b
    +
    +function loss(x, y)
    +  ŷ = predict(x)
    +  sum((y .- ŷ).^2)
    +end
    +
    +x, y = rand(5), rand(2) # Dummy data
    +loss(x, y) # ~ 3

    To improve the prediction we can take the gradients of the loss with respect to W and b and perform gradient descent.

    using Flux
    +
    +gs = gradient(() -> loss(x, y), Flux.params(W, b))

    Now that we have gradients, we can pull them out and update W to train the model.

    W̄ = gs[W]
    +
    +W .-= 0.1 .* W̄
    +
    +loss(x, y) # ~ 2.5

    The loss has decreased a little, meaning that our prediction is closer to the target y. If we have some data we can already try training the model.
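    Repeating that update step in a loop drives the loss down further. Here is a self-contained sketch of the same example (the definitions are repeated so it runs on its own; 100 steps and the 0.1 step size are illustrative):

```julia
using Flux

W = rand(2, 5)
b = rand(2)
predict(x) = W*x .+ b
loss(x, y) = sum((y .- predict(x)).^2)

x, y = rand(5), rand(2)  # dummy data
before = loss(x, y)

for step in 1:100
  gs = gradient(() -> loss(x, y), Flux.params(W, b))
  W .-= 0.1 .* gs[W]   # in-place updates keep the same arrays,
  b .-= 0.1 .* gs[b]   # so Flux.params(W, b) still tracks them
end

loss(x, y) < before  # true: the loss has decreased
```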

    All deep learning in Flux, however complex, is a simple generalisation of this example. Of course, models can look very different – they might have millions of parameters or complex control flow. Let's see how Flux handles more complex models.

    Building Layers

    It's common to create more complex models than the linear regression above. For example, we might want to have two linear layers with a nonlinearity like sigmoid (σ) in between them. In the above style we could write this as:

    using Flux
    +
    +W1 = rand(3, 5)
    +b1 = rand(3)
    +layer1(x) = W1 * x .+ b1
    +
    +W2 = rand(2, 3)
    +b2 = rand(2)
    +layer2(x) = W2 * x .+ b2
    +
    +model(x) = layer2(σ.(layer1(x)))
    +
    +model(rand(5)) # => 2-element vector

    This works but is fairly unwieldy, with a lot of repetition – especially as we add more layers. One way to factor this out is to create a function that returns linear layers.

    function linear(in, out)
    +  W = randn(out, in)
    +  b = randn(out)
    +  x -> W * x .+ b
    +end
    +
    +linear1 = linear(5, 3) # we can access linear1.W etc
    +linear2 = linear(3, 2)
    +
    +model(x) = linear2(σ.(linear1(x)))
    +
    +model(rand(5)) # => 2-element vector

    Another (equivalent) way is to create a struct that explicitly represents the affine layer.

    struct Affine
    +  W
    +  b
    +end
    +
    +Affine(in::Integer, out::Integer) =
    +  Affine(randn(out, in), randn(out))
    +
    +# Overload call, so the object can be used as a function
    +(m::Affine)(x) = m.W * x .+ m.b
    +
    +a = Affine(10, 5)
    +
    +a(rand(10)) # => 5-element vector

    Congratulations! You just built the Dense layer that comes with Flux. Flux has many interesting layers available, but they're all things you could have built yourself very easily.

    (There is one small difference with Dense – for convenience it also takes an activation function, like Dense(10 => 5, σ).)

    Stacking It Up

    It's pretty common to write models that look something like:

    layer1 = Dense(10 => 5, σ)
    +# ...
    +model(x) = layer3(layer2(layer1(x)))

    For long chains, it might be a bit more intuitive to have a list of layers, like this:

    using Flux
    +
    +layers = [Dense(10 => 5, σ), Dense(5 => 2), softmax]
    +
    +model(x) = foldl((x, m) -> m(x), layers, init = x)
    +
    +model(rand(10)) # => 2-element vector

    Handily, this is also provided for in Flux:

    model2 = Chain(
    +  Dense(10 => 5, σ),
    +  Dense(5 => 2),
    +  softmax)
    +
    +model2(rand(10)) # => 2-element vector

    This quickly starts to look like a high-level deep learning library; yet you can see how it falls out of simple abstractions, and we lose none of the power of Julia code.

    A nice property of this approach is that because "models" are just functions (possibly with trainable parameters), you can also see this as simple function composition.

    m = Dense(5 => 2) ∘ Dense(10 => 5, σ)
    +
    +m(rand(10))

    Likewise, Chain will happily work with any Julia function.

    m = Chain(x -> x^2, x -> x+1)
    +
    +m(5) # => 26

    Layer Helpers

    There is still one problem with this Affine layer: Flux does not know to look inside it. This means that Flux.train! won't see its parameters, nor will gpu be able to move them to your GPU. These features are enabled by the @functor macro:

    Flux.@functor Affine
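    With that annotation in place, Flux.params can now collect the layer's arrays. A quick check (the Affine definition is repeated from above so this sketch runs on its own):

```julia
using Flux

# Affine as defined above, repeated so this block is standalone
struct Affine
  W
  b
end
Affine(in::Integer, out::Integer) = Affine(randn(out, in), randn(out))
(m::Affine)(x) = m.W * x .+ m.b

Flux.@functor Affine

a = Affine(10, 5)
length(Flux.params(a))  # 2: the weight matrix and the bias vector
```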

    Finally, most Flux layers make bias optional, and allow you to supply the function used for generating random weights. We can easily add these refinements to the Affine layer as follows, using the helper function create_bias:

    function Affine((in, out)::Pair; bias=true, init=Flux.randn32)
    +  W = init(out, in)
    +  b = Flux.create_bias(W, bias, out)
    +  Affine(W, b)
    +end
    +
    +Affine(3 => 1, bias=false, init=ones) |> gpu
    diff --git a/previews/PR2365/models/functors/index.html b/previews/PR2365/models/functors/index.html new file mode 100644 index 0000000000..08b799881a --- /dev/null +++ b/previews/PR2365/models/functors/index.html @@ -0,0 +1,196 @@ + +Nested Structures – Functors.jl · Flux

    Recursive transformations from Functors.jl

    Flux models are deeply nested structures, and Functors.jl provides tools needed to explore such objects, apply functions to the parameters they contain, and re-build them.

    New layers should be annotated using the Functors.@functor macro. This will enable params to see the parameters inside, and gpu to move them to the GPU.

    Functors.jl has its own notes on basic usage for more details. Additionally, the Advanced Model Building and Customisation page covers the use of Functors in greater detail.

    Functors.@functorMacro
    @functor T
    +@functor T (x,)

    Adds methods to functor allowing recursion into objects of type T, and reconstruction. Assumes that T has a constructor accepting all of its fields, which is true unless you have provided an inner constructor which does not.

    By default all fields of T are considered children; this can be restricted by providing a tuple of field names.

    Examples

    julia> struct Foo; x; y; end
    +
    +julia> @functor Foo
    +
    +julia> Functors.children(Foo(1,2))
    +(x = 1, y = 2)
    +
    +julia> _, re = Functors.functor(Foo(1,2));
    +
    +julia> re((10, 20))
    +Foo(10, 20)
    +
    +julia> struct TwoThirds a; b; c; end
    +
    +julia> @functor TwoThirds (a, c)
    +
    +julia> ch2, re3 = Functors.functor(TwoThirds(10,20,30));
    +
    +julia> ch2
    +(a = 10, c = 30)
    +
    +julia> re3(("ten", "thirty"))
    +TwoThirds("ten", 20, "thirty")
    +
    +julia> fmap(x -> 10x, TwoThirds(Foo(1,2), Foo(3,4), 56))
    +TwoThirds(Foo(10, 20), Foo(3, 4), 560)
    Functors.fmapFunction
    fmap(f, x, ys...; exclude = Functors.isleaf, walk = Functors.DefaultWalk()[, prune])

    A structure and type preserving map.

    By default it transforms every leaf node (identified by exclude, default isleaf) by applying f, and otherwise traverses x recursively using functor. Optionally, it may also be associated with objects ys with the same tree structure. In that case, f is applied to the corresponding leaf nodes in x and ys.

    Examples

    julia> fmap(string, (x=1, y=(2, 3)))
    +(x = "1", y = ("2", "3"))
    +
    +julia> nt = (a = [1,2], b = [23, (45,), (x=6//7, y=())], c = [8,9]);
    +
    +julia> fmap(println, nt)
    +[1, 2]
    +23
    +45
    +6//7
    +()
    +[8, 9]
    +(a = nothing, b = Any[nothing, (nothing,), (x = nothing, y = nothing)], c = nothing)
    +
    +julia> fmap(println, nt; exclude = x -> x isa Array)
    +[1, 2]
    +Any[23, (45,), (x = 6//7, y = ())]
    +[8, 9]
    +(a = nothing, b = nothing, c = nothing)
    +
    +julia> twice = [1, 2];  # println only acts once on this
    +
    +julia> fmap(println, (i = twice, ii = 34, iii = [5, 6], iv = (twice, 34), v = 34.0))
    +[1, 2]
    +34
    +[5, 6]
    +34
    +34.0
    +(i = nothing, ii = nothing, iii = nothing, iv = (nothing, nothing), v = nothing)
    +
    +julia> d1 = Dict("x" => [1,2], "y" => 3);
    +
    +julia> d2 = Dict("x" => [4,5], "y" => 6, "z" => "an_extra_value");
    +
    +julia> fmap(+, d1, d2) == Dict("x" => [5, 7], "y" => 9) # Note that "z" is ignored
    +true

    Mutable objects which appear more than once are only handled once (by caching f(x) in an IdDict). Thus the relationship x.i === x.iv[1] will be preserved. An immutable object which appears twice is not stored in the cache, thus f(34) will be called twice, and the results will agree only if f is pure.

    By default, Tuples, NamedTuples, and some other container-like types in Base have children to recurse into. Arrays of numbers do not. To enable recursion into new types, you must provide a method of functor, which can be done using the macro @functor:

    julia> struct Foo; x; y; end
    +
    +julia> @functor Foo
    +
    +julia> struct Bar; x; end
    +
    +julia> @functor Bar
    +
    +julia> m = Foo(Bar([1,2,3]), (4, 5, Bar(Foo(6, 7))));
    +
    +julia> fmap(x -> 10x, m)
    +Foo(Bar([10, 20, 30]), (40, 50, Bar(Foo(60, 70))))
    +
    +julia> fmap(string, m)
    +Foo(Bar("[1, 2, 3]"), ("4", "5", Bar(Foo("6", "7"))))
    +
    +julia> fmap(string, m, exclude = v -> v isa Bar)
    +Foo("Bar([1, 2, 3])", (4, 5, "Bar(Foo(6, 7))"))

    To recurse into custom types without reconstructing them afterwards, use fmapstructure.

    For advanced customization of the traversal behaviour, pass a custom walk function that subtypes Functors.AbstractWalk. The call fmap(f, x, ys...; walk = mywalk) will wrap mywalk in ExcludeWalk then CachedWalk. Here, ExcludeWalk is responsible for applying f at excluded nodes. For a low-level interface for executing a user-constructed walk, see execute.

    julia> struct MyWalk <: Functors.AbstractWalk end
    +
    +julia> (::MyWalk)(recurse, x) = x isa Bar ? "hello" :
    +                                            Functors.DefaultWalk()(recurse, x)
    +
    +julia> fmap(x -> 10x, m; walk = MyWalk())
    +Foo("hello", (40, 50, "hello"))

    The behaviour when the same node appears twice can be altered by giving a value to the prune keyword, which is then used in place of all but the first:

    julia> twice = [1, 2];
    +
    +julia> fmap(float, (x = twice, y = [1,2], z = twice); prune = missing)
    +(x = [1.0, 2.0], y = [1.0, 2.0], z = missing)
    Functors.isleafFunction
    Functors.isleaf(x)

    Return true if x has no children according to functor.

    Examples

    julia> Functors.isleaf(1)
    +true
    +
    +julia> Functors.isleaf([2, 3, 4])
    +true
    +
    +julia> Functors.isleaf(["five", [6, 7]])
    +false
    +
    +julia> Functors.isleaf([])
    +false
    +
    +julia> Functors.isleaf((8, 9))
    +false
    +
    +julia> Functors.isleaf(())
    +true
    Functors.childrenFunction
    Functors.children(x)

    Return the children of x as defined by functor. Equivalent to functor(x)[1].
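    For example, the equivalence with functor(x)[1] can be checked directly (this snippet assumes a small Foo struct like the ones used above):

```julia
using Functors

struct Foo
  x
  y
end
@functor Foo

f = Foo(1, 2)
Functors.children(f)                            # (x = 1, y = 2)
Functors.children(f) == Functors.functor(f)[1]  # true
```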

    Functors.fcollectFunction
    fcollect(x; exclude = v -> false)

    Traverse x by recursing each child of x as defined by functor and collecting the results into a flat array, ordered by a breadth-first traversal of x, respecting the iteration order of children calls.

    Doesn't recurse inside branches rooted at nodes v for which exclude(v) == true. In such cases, the root v is also excluded from the result. By default, exclude always yields false.

    See also children.

    Examples

    julia> struct Foo; x; y; end
    +
    +julia> @functor Foo
    +
    +julia> struct Bar; x; end
    +
    +julia> @functor Bar
    +
    +julia> struct TypeWithNoChildren; x; y; end
    +
    +julia> m = Foo(Bar([1,2,3]), TypeWithNoChildren(:a, :b))
    +Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))
    +
    +julia> fcollect(m)
    +4-element Vector{Any}:
    + Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))
    + Bar([1, 2, 3])
    + [1, 2, 3]
    + TypeWithNoChildren(:a, :b)
    +
    +julia> fcollect(m, exclude = v -> v isa Bar)
    +2-element Vector{Any}:
    + Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))
    + TypeWithNoChildren(:a, :b)
    +
    +julia> fcollect(m, exclude = v -> Functors.isleaf(v))
    +2-element Vector{Any}:
    + Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))
    + Bar([1, 2, 3])
    Functors.functorFunction
    Functors.functor(x) = functor(typeof(x), x)

    Returns a tuple containing, first, a NamedTuple of the children of x (typically its fields), and second, a reconstruction function. This controls the behaviour of fmap.

    Methods should be added to functor(::Type{T}, x) for custom types, usually using the macro @functor.

    Functors.fmapstructureFunction
    fmapstructure(f, x; exclude = isleaf)

    Like fmap, but doesn't preserve the type of custom structs. Instead, it returns a NamedTuple (or a Tuple, or an array), or a nested set of these.

    Useful for when the output must not contain custom structs.

    Examples

    julia> struct Foo; x; y; end
    +
    +julia> @functor Foo
    +
    +julia> m = Foo([1,2,3], [4, (5, 6), Foo(7, 8)]);
    +
    +julia> fmapstructure(x -> 2x, m)
    +(x = [2, 4, 6], y = Any[8, (10, 12), (x = 14, y = 16)])
    +
    +julia> fmapstructure(println, m)
    +[1, 2, 3]
    +4
    +5
    +6
    +7
    +8
    +(x = nothing, y = Any[nothing, (nothing, nothing), (x = nothing, y = nothing)])

    Moving models, or data, to the GPU

    Flux provides some convenience functions based on fmap. Some (f16, f32, f64) change the precision of all arrays in a model. Others are used for moving a model to or from GPU memory:

    Flux.cpuFunction
    cpu(m)

    Copies m onto the CPU, the opposite of gpu. Recurses into structs marked @functor.

    Example

    julia> m_gpu = Dense(CUDA.randn(2, 5))
    +Dense(5 => 2)       # 12 parameters
    +
    +julia> m_gpu.bias  # matches the given weight matrix
    +2-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:
    + 0.0
    + 0.0
    +
    +julia> m = m_gpu |> cpu
    +Dense(5 => 2)       # 12 parameters
    +
    +julia> m.bias
    +2-element Vector{Float32}:
    + 0.0
    + 0.0
    Flux.gpuMethod
    gpu(m)

    Copies m to the current GPU device (using current GPU backend), if one is available. If no GPU is available, it does nothing (but prints a warning the first time).

    On arrays, this calls CUDA's cu, which also changes arrays with Float64 elements to Float32 while copying them to the device (same for AMDGPU). To act on arrays within a struct, the struct type must be marked with @functor.

    Use cpu to copy back to ordinary Arrays. See also f32 and f16 to change element type only.

    See the CUDA.jl docs to help identify the current device.

    Example

    julia> m = Dense(rand(2, 3))  # constructed with Float64 weight matrix
    +Dense(3 => 2)       # 8 parameters
    +
    +julia> typeof(m.weight)
    +Matrix{Float64} (alias for Array{Float64, 2})
    +
    +julia> m_gpu = gpu(m)  # can equivalently be written m_gpu = m |> gpu
    +Dense(3 => 2)       # 8 parameters
    +
    +julia> typeof(m_gpu.weight)
    +CUDA.CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}
    Flux.gpuMethod
    gpu(data::DataLoader)

    Transforms a given DataLoader to apply gpu to each batch of data, when iterated over. (If no GPU is available, this does nothing.)

    Example

    julia> dl = Flux.DataLoader((x = ones(2,10), y='a':'j'), batchsize=3)
    +4-element DataLoader(::NamedTuple{(:x, :y), Tuple{Matrix{Float64}, StepRange{Char, Int64}}}, batchsize=3)
    +  with first element:
    +  (; x = 2×3 Matrix{Float64}, y = 3-element StepRange{Char, Int64})
    +
    +julia> first(dl)
    +(x = [1.0 1.0 1.0; 1.0 1.0 1.0], y = 'a':1:'c')
    +
    +julia> c_dl = gpu(dl)
    +4-element DataLoader(::MLUtils.MappedData{:auto, typeof(gpu), NamedTuple{(:x, :y), Tuple{Matrix{Float64}, StepRange{Char, Int64}}}}, batchsize=3)
    +  with first element:
    +  (; x = 2×3 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, y = 3-element StepRange{Char, Int64})
    +
    +julia> first(c_dl).x
    +2×3 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
    + 1.0  1.0  1.0
    + 1.0  1.0  1.0

    For large datasets, this is preferred over moving all the data to the GPU before creating the DataLoader, like this:

    julia> Flux.DataLoader((x = ones(2,10), y=2:11) |> gpu, batchsize=3)
    +4-element DataLoader(::NamedTuple{(:x, :y), Tuple{CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, UnitRange{Int64}}}, batchsize=3)
    +  with first element:
    +  (; x = 2×3 CUDA.CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, y = 3-element UnitRange{Int64})
    Warning

    This only works if gpu is applied directly to the DataLoader. While gpu acts recursively on Flux models and many basic Julia structs, it will not work on (say) a tuple of DataLoaders.

    diff --git a/previews/PR2365/models/layers/index.html b/previews/PR2365/models/layers/index.html new file mode 100644 index 0000000000..bef24814d0 --- /dev/null +++ b/previews/PR2365/models/layers/index.html @@ -0,0 +1,715 @@ + +Built-in Layers · Flux

    Built-in Layer Types

    If you started at the beginning of the guide, then you have already met the basic Dense layer, and seen Chain for combining layers. These core layers form the foundation of almost all neural networks.

    The Dense layer exemplifies several features:

    • It contains an activation function, which is broadcast over the output. Because this broadcast can be fused with other operations, doing so is more efficient than applying the activation function separately.

    • It takes an init keyword, which accepts a function acting like rand. That is, init(2,3,4) should create an array of this size. Flux has many such functions built-in. All make a CPU array, moved later with gpu if desired.

    • The bias vector is always initialised by Flux.zeros32. The keyword bias=false will turn this off, i.e. keep the bias permanently zero.

    • It is annotated with @functor, which means that params will see the contents, and gpu will move their arrays to the GPU.

    By contrast, Chain itself contains no parameters, but connects other layers together. The section on dataflow layers introduces others like this.
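    For instance, a Chain of plain functions exposes no trainable arrays of its own, while wrapping a Dense layer exposes that layer's parameters (a quick illustrative check):

```julia
using Flux

length(Flux.params(Chain(relu, softmax)))        # 0: no parameters of its own
length(Flux.params(Chain(Dense(3 => 2), relu)))  # 2: from the Dense layer inside
```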

    Fully Connected

    Flux.DenseType
    Dense(in => out, σ=identity; bias=true, init=glorot_uniform)
    +Dense(W::AbstractMatrix, [bias, σ])

    Create a traditional fully connected layer, whose forward pass is given by:

    y = σ.(W * x .+ bias)

    The input x should be a vector of length in, or a batch of vectors represented as an in × N matrix, or any array with size(x,1) == in. The output y will be a vector of length out, or a batch with size(y) == (out, size(x)[2:end]...).

    Keyword bias=false will switch off trainable bias for the layer. The initialisation of the weight matrix is W = init(out, in), calling the function given to keyword init, with default glorot_uniform. The weight matrix and/or the bias vector (of length out) may also be provided explicitly.

    Examples

    julia> d = Dense(5 => 2)
    +Dense(5 => 2)       # 12 parameters
    +
    +julia> d(rand32(5, 64)) |> size
    +(2, 64)
    +
    +julia> d(rand32(5, 6, 4, 64)) |> size  # treated as three batch dimensions
    +(2, 6, 4, 64)
    +
    +julia> d1 = Dense(ones(2, 5), false, tanh)  # using provided weight matrix
    +Dense(5 => 2, tanh; bias=false)  # 10 parameters
    +
    +julia> d1(ones(5))
    +2-element Vector{Float64}:
    + 0.9999092042625951
    + 0.9999092042625951
    +
    +julia> Flux.params(d1)  # no trainable bias
    +Params([[1.0 1.0 … 1.0 1.0; 1.0 1.0 … 1.0 1.0]])
    Flux.BilinearType
    Bilinear((in1, in2) => out, σ=identity; bias=true, init=glorot_uniform)
    +Bilinear(W::AbstractArray, [bias, σ])

    Creates a layer which is fully connected between two inputs and the output, and otherwise similar to Dense. Its output, given vectors x & y, is another vector z with, for all i ∈ 1:out:

    z[i] = σ(x' * W[i,:,:] * y + bias[i])

    If x and y are matrices, then each column of the output z = B(x, y) is of this form, with B the Bilinear layer.

    If the second input y is not given, it is taken to be equal to x, i.e. B(x) == B(x, x)

    The two inputs may also be provided as a tuple, B((x, y)) == B(x, y), which is accepted as the input to a Chain.

    If the two input sizes are the same, in1 == in2, then you may write Bilinear(in => out, σ).

    The initialisation works as for the Dense layer, with W = init(out, in1, in2). By default the bias vector is zeros(Float32, out); the option bias=false will switch off trainable bias. Either of these may be provided explicitly.

    Examples

    julia> x, y = randn(Float32, 5, 32), randn(Float32, 5, 32);
    +
    +julia> B = Flux.Bilinear((5, 5) => 7)
    +Bilinear(5 => 7)    # 182 parameters
    +
    +julia> B(x) |> size  # interactions based on one input
    +(7, 32)
    +
    +julia> B(x,y) == B((x,y))  # two inputs, may be given as a tuple
    +true
    +
    +julia> sc = SkipConnection(
    +                Chain(Dense(5 => 20, tanh), Dense(20 => 9, tanh)),
    +                Flux.Bilinear((9, 5) => 3, bias=false),
    +            );  # used as the recombinator, with skip as the second input
    +
    +julia> sc(x) |> size
    +(3, 32)
    +
    +julia> Flux.Bilinear(rand(4,8,16), false, tanh)  # first dim of weight is the output
    +Bilinear((8, 16) => 4, tanh; bias=false)  # 512 parameters
    Flux.ScaleType
    Scale(size::Integer..., σ=identity; bias=true, init=ones32)
    +Scale(scale::AbstractArray, [bias, σ])

    Create an element-wise layer, whose forward pass is given by:

    y = σ.(scale .* x .+ bias)

    This uses .* instead of matrix multiplication * of Dense.

    The learnable scale & bias are initialised init(size...) and zeros32(size...), with init=ones32 by default. You may specify the function init, turn off trainable bias with bias=false, or provide the array(s) explicitly.

    Used by LayerNorm with affine=true.

    Examples

    julia> a = Flux.Scale(2)
    +Scale(2)            # 4 parameters
    +
    +julia> Flux.params(a)
    +Params([Float32[1.0, 1.0], Float32[0.0, 0.0]])
    +
    +julia> a([1 2 3])
    +2×3 Matrix{Float32}:
    + 1.0  2.0  3.0
    + 1.0  2.0  3.0
    +
    +julia> b = Flux.Scale([1 2 3 4], false, abs2)
    +Scale(1, 4, abs2; bias=false)  # 4 parameters
    +
    +julia> b([1, 10])
    +2×4 Matrix{Int64}:
    +   1    4    9    16
    + 100  400  900  1600
    +
    +julia> Flux.params(b)
    +Params([[1 2 3 4]])

    Perhaps Scale isn't quite fully connected, but it may be thought of as Dense(Diagonal(s.scale), s.bias), and LinearAlgebra's Diagonal is a matrix which just happens to contain many zeros.
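    That identification can be checked directly. This sketch assumes Scale stores its arrays in the scale and bias fields, and uses random values so the match is non-trivial:

```julia
using Flux, LinearAlgebra

s = Flux.Scale(rand(Float32, 3), rand(Float32, 3))  # explicit scale and bias
x = rand(Float32, 3)

d = Dense(Matrix(Diagonal(s.scale)), s.bias)  # the equivalent Dense layer

s(x) ≈ d(x)  # true
```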

    Flux ≤ 0.12

    Old versions of Flux accepted only Dense(in, out, act) and not Dense(in => out, act). This notation makes a Pair object. If you get an error like MethodError: no method matching Dense(::Pair{Int64,Int64}), this means that you should upgrade to newer Flux versions.

    Convolution Models

    These layers are used to build convolutional neural networks (CNNs).

    They all expect images in what is called WHCN order: a batch of 32 colour images, each 50 x 50 pixels, will have size(x) == (50, 50, 3, 32). A single grayscale image might instead have size(x) == (28, 28, 1, 1).

    Besides images (2D data), they also work with 1D data, where for instance a stereo sound recording with 1000 samples might have size(x) == (1000, 2, 1). They will also work with 3D data, ndims(x) == 5, where again the last two dimensions are channel and batch.

    To understand how strides and padding work, the article by Dumoulin & Visin has great illustrations.
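    As a quick illustration of the geometry (the kernel size and channel counts here are arbitrary), padding and stride change the spatial size of the output:

```julia
using Flux

x = rand(Float32, 28, 28, 1, 1)  # one 28×28 grayscale image

size(Conv((5, 5), 1 => 3)(x))                  # (24, 24, 3, 1), no padding
size(Conv((5, 5), 1 => 3, pad=SamePad())(x))   # (28, 28, 3, 1), output matches input
size(Conv((5, 5), 1 => 3, stride=2)(x))        # (12, 12, 3, 1)
```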

    Flux.ConvType
    Conv(filter, in => out, σ = identity;
    +     stride = 1, pad = 0, dilation = 1, groups = 1, [bias, init])

    Standard convolutional layer. filter is a tuple of integers specifying the size of the convolutional kernel; in and out specify the number of input and output channels.

    Image data should be stored in WHCN order (width, height, channels, batch). In other words, a 100×100 RGB image would be a 100×100×3×1 array, and a batch of 50 would be a 100×100×3×50 array. This has N = 2 spatial dimensions, and needs a kernel size like (5,5), a 2-tuple of integers.

    To take convolutions along N feature dimensions, this layer expects as input an array with ndims(x) == N+2, where size(x, N+1) == in is the number of input channels, and size(x, ndims(x)) is (as always) the number of observations in a batch. Then:

    • filter should be a tuple of N integers.
    • Keywords stride and dilation should each be either a single integer, or a tuple with N integers.
    • Keyword pad specifies the number of elements added to the borders of the data array. It can be
      • a single integer for equal padding all around,
      • a tuple of N integers, to apply the same padding at begin/end of each spatial dimension,
      • a tuple of 2*N integers, for asymmetric padding, or
      • the singleton SamePad(), to calculate padding such that size(output,d) == size(x,d) / stride (possibly rounded) for each spatial dimension.
    • Keyword groups is expected to be an Int. It specifies the number of groups to divide a convolution into.

    Keywords to control initialization of the layer:

    • init - Function used to generate initial weights. Defaults to glorot_uniform.
    • bias - The initial bias vector is all zero by default. Trainable bias can be disabled entirely by setting this to false, or another vector can be provided such as bias = randn(Float32, out).

    See also ConvTranspose, DepthwiseConv, CrossCor.
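The parameter counts printed in the examples below follow directly from these sizes: one kernel of prod(filter) weights per (grouped input channel, output channel) pair, plus a bias per output channel. A sketch (`conv_params` is our helper name, not part of Flux):

```julia
# Trainable parameter count of a Conv layer:
conv_params(filter, in, out; groups=1, bias=true) =
    prod(filter) * (in ÷ groups) * out + (bias ? out : 0)

conv_params((5, 5), 3, 7, bias=false)            # 525, as below
conv_params((5, 5), 3, 6, groups=3, bias=false)  # 150, the DepthwiseConv case
```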

    Examples

    julia> xs = rand32(100, 100, 3, 50); # a batch of 50 RGB images

julia> layer = Conv((5,5), 3 => 7, relu; bias = false)
Conv((5, 5), 3 => 7, relu, bias=false)  # 525 parameters

julia> layer(xs) |> size
(96, 96, 7, 50)

julia> Conv((5,5), 3 => 7; stride = 2)(xs) |> size
(48, 48, 7, 50)

julia> Conv((5,5), 3 => 7; stride = 2, pad = SamePad())(xs) |> size
(50, 50, 7, 50)

julia> Conv((1,1), 3 => 7; pad = (20,10,0,0))(xs) |> size
(130, 100, 7, 50)

julia> Conv((5,5), 3 => 7; stride = 2, dilation = 4)(xs) |> size
(42, 42, 7, 50)
    source
    Flux.Conv — Method
    Conv(weight::AbstractArray, [bias, activation; stride, pad, dilation])

    Constructs a convolutional layer with the given weight and bias. Accepts the same keywords and has the same defaults as Conv(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ; ...).

    julia> weight = rand(3, 4, 5);

julia> bias = zeros(5);

julia> layer = Conv(weight, bias, sigmoid)  # expects 1 spatial dimension
Conv((3,), 4 => 5, σ)  # 65 parameters

julia> layer(randn(100, 4, 64)) |> size
(98, 5, 64)

julia> Flux.params(layer) |> length
2
    source
    Flux.ConvTranspose — Type
    ConvTranspose(filter, in => out, σ=identity; stride=1, pad=0, dilation=1, [bias, init])

    Standard convolutional transpose layer. filter is a tuple of integers specifying the size of the convolutional kernel, while in and out specify the number of input and output channels.

    Note that pad=SamePad() here tries to ensure size(output,d) == size(x,d) * stride.

    Parameters are controlled by additional keywords, with defaults init=glorot_uniform and bias=true.

    See also Conv for more detailed description of keywords.
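The output sizes in the examples below follow the usual transposed-convolution formula, which inverts the Conv size formula (with `pad` per side). A sketch; `convt_out` is our name, not a Flux function:

```julia
# Output size of ConvTranspose along one spatial dimension:
convt_out(in, k; stride=1, pad=0, dilation=1) =
    (in - 1)*stride + dilation*(k - 1) + 1 - 2pad

convt_out(100, 5)            # 104
convt_out(100, 5, stride=2)  # 203
```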

    Examples

    julia> xs = rand32(100, 100, 3, 50);  # a batch of 50 RGB images

julia> layer = ConvTranspose((5,5), 3 => 7, relu)
ConvTranspose((5, 5), 3 => 7, relu)  # 532 parameters

julia> layer(xs) |> size
(104, 104, 7, 50)

julia> ConvTranspose((5,5), 3 => 7, stride=2)(xs) |> size
(203, 203, 7, 50)

julia> ConvTranspose((5,5), 3 => 7, stride=3, pad=SamePad())(xs) |> size
(300, 300, 7, 50)
    source
    Flux.ConvTranspose — Method
    ConvTranspose(weight::AbstractArray, [bias, activation; stride, pad, dilation, groups])

    Constructs a ConvTranspose layer with the given weight and bias. Accepts the same keywords and has the same defaults as ConvTranspose(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ; ...).

    Examples

    julia> weight = rand(3, 4, 5);

julia> bias = zeros(4);

julia> layer = ConvTranspose(weight, bias, sigmoid)
ConvTranspose((3,), 5 => 4, σ)  # 64 parameters

julia> layer(randn(100, 5, 64)) |> size  # transposed convolution will increase the dimension size (upsampling)
(102, 4, 64)

julia> Flux.params(layer) |> length
2
    source
    Flux.CrossCor — Type
    CrossCor(filter, in => out, σ=identity; stride=1, pad=0, dilation=1, [bias, init])

    Standard cross correlation layer. filter is a tuple of integers specifying the size of the convolutional kernel; in and out specify the number of input and output channels.

    Parameters are controlled by additional keywords, with defaults init=glorot_uniform and bias=true.

    See also Conv for more detailed description of keywords.

    Examples

    julia> xs = rand(Float32, 100, 100, 3, 50);  # a batch of 50 RGB images

julia> layer = CrossCor((5,5), 3 => 6, relu; bias=false)
CrossCor((5, 5), 3 => 6, relu, bias=false)  # 450 parameters

julia> layer(xs) |> size
(96, 96, 6, 50)

julia> CrossCor((5,5), 3 => 7, stride=3, pad=(2,0))(xs) |> size
(34, 32, 7, 50)
    source
    Flux.CrossCor — Method
    CrossCor(weight::AbstractArray, [bias, activation; stride, pad, dilation])

    Constructs a CrossCor layer with the given weight and bias. Accepts the same keywords and has the same defaults as CrossCor(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ; ...).

    Examples

    julia> weight = rand(3, 4, 5);

julia> bias = zeros(5);

julia> layer = CrossCor(weight, bias, relu)
CrossCor((3,), 4 => 5, relu)  # 65 parameters

julia> layer(randn(100, 4, 64)) |> size
(98, 5, 64)
    source
    Flux.DepthwiseConv — Function
    DepthwiseConv(filter, in => out, σ=identity; stride=1, pad=0, dilation=1, [bias, init])
    DepthwiseConv(weight::AbstractArray, [bias, activation; stride, pad, dilation])

    Return a depthwise convolutional layer, that is a Conv layer with number of groups equal to the number of input channels.

    See Conv for a description of the arguments.

    Examples

    julia> xs = rand(Float32, 100, 100, 3, 50);  # a batch of 50 RGB images

julia> layer = DepthwiseConv((5,5), 3 => 6, relu; bias=false)
Conv((5, 5), 3 => 6, relu, groups=3, bias=false)  # 150 parameters

julia> layer(xs) |> size
(96, 96, 6, 50)

julia> DepthwiseConv((5, 5), 3 => 9, stride=2, pad=2)(xs) |> size
(50, 50, 9, 50)
    source
    Flux.SamePad — Type
    SamePad()

    Passed as an option to convolutional layers (and friends), this causes the padding to be chosen such that the input and output sizes agree (on the first N dimensions, the kernel or window) when stride==1. When stride≠1, the output size equals ceil(input_size/stride).

    See also Conv, MaxPool.
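The padding SamePad() produces can be sketched in plain Julia: the total padding along a dimension is dilation*(k-1), and when that is odd the extra element goes at the start. This is a sketch of the usual rule matching the printed layers below, not Flux's exact code; `samepad` is our name.

```julia
# (begin, end) padding chosen by SamePad() along one dimension:
function samepad(k; dilation=1)
    total = dilation*(k - 1)
    (cld(total, 2), fld(total, 2))
end

samepad(2)   # (1, 0), as printed for Conv((2,2), 3 => 7, pad=SamePad())
samepad(5)   # (2, 2), printed as pad=2
```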

    Examples

    julia> xs = rand32(100, 100, 3, 50);  # a batch of images

julia> layer = Conv((2,2), 3 => 7, pad=SamePad())
Conv((2, 2), 3 => 7, pad=(1, 0, 1, 0))  # 91 parameters

julia> layer(xs) |> size  # notice how the dimensions stay the same with this padding
(100, 100, 7, 50)

julia> layer2 = Conv((2,2), 3 => 7)
Conv((2, 2), 3 => 7)  # 91 parameters

julia> layer2(xs) |> size  # the output dimension changes as the padding was not "same"
(99, 99, 7, 50)

julia> layer3 = Conv((5, 5), 3 => 7, stride=2, pad=SamePad())
Conv((5, 5), 3 => 7, pad=2, stride=2)  # 532 parameters

julia> layer3(xs) |> size  # output size = `ceil(input_size/stride)` = 50
(50, 50, 7, 50)
    source

    MultiHeadAttention

    The basic blocks needed to implement Transformer architectures. See also the functional counterparts documented in NNlib's Attention section.

    Flux.MultiHeadAttention — Type
    MultiHeadAttention(dims; [nheads, bias, init, dropout_prob])

    The multi-head dot-product attention layer used in Transformer architectures [1].

    Returns the transformed input sequence and the attention scores.

    [1] Vaswani et al. "Attention is all you need." Advances in Neural Information Processing Systems. 2017.

    Arguments

    • dims: The embedding dimensions of inputs, intermediate tensors and outputs. In the most general case, it is given as a) (q_in_dim, k_in_dim, v_in_dim) => (qk_dim, v_dim) => out_dim. Can take also simpler forms as b) dims::Int; c) in_dim::Int => (qk_dim, v_dim) => out_dim; d) in_dim::Int => qkv_dim => out_dim.
    • nheads: number of heads. Default 8.
    • init: weight initializer for the Dense layers. Default glorot_uniform.
    • bias : whether pointwise QKVO dense transforms use bias. Default false.
    • dropout_prob: dropout probability for the attention scores. Default 0.0.

    Forward

    (mha::MultiHeadAttention)(q_in, k_in, v_in, [bias]; [mask])

    The arguments of the forward pass are:

    • q_in: Input query array of size (q_in_dim, q_len, batch_size).
    • k_in: Input key array of size (k_in_dim, kv_len, batch_size).
    • v_in: Input value array of size (v_in_dim, kv_len, batch_size).
    • bias: Bias array broadcastable to size (kv_len, q_len, nheads, batch_size). It will be added to the attention scores before the softmax. Default nothing.
    • mask: Input array broadcastable to size (kv_len, q_len, nheads, batch_size). The mask is applied to the attention scores just before the softmax. See NNlib.make_causal_mask for creating causal masks. Default nothing.

    Alternative calling signatures are mha(q_in), equivalent to mha(q_in, q_in, q_in) (self-attention), and mha(q_in, k_in), equivalent to mha(q_in, k_in, k_in) (key and value are the same).

    See also NNlib.dot_product_attention.
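The shapes of the returned output and attention scores can be seen in a minimal single-head dot-product attention written in plain Julia. This is only a sketch of the mechanism (no masking, bias, dropout or multiple heads); the real implementation is NNlib.dot_product_attention.

```julia
# q: (d, q_len); k, v: (d, kv_len). Returns output (d, q_len) and scores α.
function attention(q, k, v)
    logits = (k' * q) ./ sqrt(size(q, 1))   # (kv_len, q_len)
    α = exp.(logits)
    α = α ./ sum(α, dims=1)                 # softmax over the key axis
    v * α, α
end

q, k, v = rand(64, 10), rand(64, 20), rand(64, 20)
y, α = attention(q, k, v)
size(y), size(α)   # ((64, 10), (20, 10))
```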

    Examples

    mha = MultiHeadAttention(64, nheads = 8)
q = rand(Float32, (64, 10, 32))
k = rand(Float32, (64, 20, 32))
v = rand(Float32, (64, 20, 32))
y, α = mha(q, k, v)
# [y] = [64, 10, 32]
# [α] = [20, 10, 8, 32]

mha = MultiHeadAttention(64 => 1024 => 1024, nheads = 8)
y, α = mha(q) # self-attention
# [y] = [1024, 10, 32]
# [α] = [10, 10, 8, 32]
    source

    Pooling

    These layers are commonly used after a convolution layer, and reduce the size of its output. They have no trainable parameters.

    Flux.AdaptiveMaxPool — Type
    AdaptiveMaxPool(out::NTuple)

    Adaptive max pooling layer. Calculates the necessary window size such that its output has size(y)[1:N] == out.

    Expects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(out).

    See also MaxPool, AdaptiveMeanPool.

    Examples

    julia> xs = rand(Float32, 100, 100, 3, 50);  # batch of 50 RGB images

julia> AdaptiveMaxPool((25, 25))(xs) |> size
(25, 25, 3, 50)

julia> MaxPool((4,4))(xs) ≈ AdaptiveMaxPool((25, 25))(xs)
true
    source
    Flux.MaxPool — Type
    MaxPool(window::NTuple; pad=0, stride=window)

    Max pooling layer, which replaces all pixels in a block of size window with one.

    Expects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(window).

    By default the window size is also the stride in each dimension. The keyword pad accepts the same options as for the Conv layer, including SamePad().

    See also Conv, MeanPool, AdaptiveMaxPool, GlobalMaxPool.
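The output sizes in the examples below follow the same arithmetic as for Conv, with the window in place of the kernel and the default stride equal to the window. A sketch (`pool_out` is our helper name; `pad` is per side):

```julia
# Output size of MaxPool/MeanPool along one dimension:
pool_out(in, w; pad=0, stride=w) = fld(in + 2pad - w, stride) + 1

pool_out(100, 4)         # 25, the AdaptiveMaxPool((25, 25)) equivalence above
pool_out(100, 5, pad=2)  # 20, MaxPool((5, 5), pad=SamePad())
```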

    Examples

    julia> xs = rand(Float32, 100, 100, 3, 50);  # batch of 50 RGB images

julia> m = Chain(Conv((5, 5), 3 => 7, pad=SamePad()), MaxPool((5, 5), pad=SamePad()))
Chain(
  Conv((5, 5), 3 => 7, pad=2),          # 532 parameters
  MaxPool((5, 5), pad=2),
)

julia> m[1](xs) |> size
(100, 100, 7, 50)

julia> m(xs) |> size
(20, 20, 7, 50)

julia> layer = MaxPool((5,), pad=2, stride=(3,))  # one-dimensional window
MaxPool((5,), pad=2, stride=3)

julia> layer(rand(Float32, 100, 7, 50)) |> size
(34, 7, 50)
    source
    Flux.GlobalMaxPool — Type
    GlobalMaxPool()

    Global max pooling layer.

    Transforms (w,h,c,b)-shaped input into (1,1,c,b)-shaped output, by performing max pooling on the complete (w,h)-shaped feature maps.

    See also MaxPool, GlobalMeanPool.
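The same reduction can be written in plain Julia as a maximum over the spatial dimensions, which is all GlobalMaxPool does:

```julia
xs = rand(Float32, 100, 100, 3, 50)
y = maximum(xs, dims=(1, 2))   # same result as GlobalMaxPool()(xs)
size(y)                        # (1, 1, 3, 50)
```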

    julia> xs = rand(Float32, 100, 100, 3, 50);

julia> m = Chain(Conv((3,3), 3 => 7), GlobalMaxPool());

julia> m(xs) |> size
(1, 1, 7, 50)

julia> GlobalMaxPool()(rand(3,5,7)) |> size  # preserves 2 dimensions
(1, 5, 7)
    source
    Flux.AdaptiveMeanPool — Type
    AdaptiveMeanPool(out::NTuple)

    Adaptive mean pooling layer. Calculates the necessary window size such that its output has size(y)[1:N] == out.

    Expects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(out).

    See also MaxPool, AdaptiveMaxPool.

    Examples

    julia> xs = rand(Float32, 100, 100, 3, 50);  # batch of 50 RGB images

julia> AdaptiveMeanPool((25, 25))(xs) |> size
(25, 25, 3, 50)

julia> MeanPool((4,4))(xs) ≈ AdaptiveMeanPool((25, 25))(xs)
true
    source
    Flux.MeanPool — Type
    MeanPool(window::NTuple; pad=0, stride=window)

    Mean pooling layer, averaging all pixels in a block of size window.

    Expects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(window).

    By default the window size is also the stride in each dimension. The keyword pad accepts the same options as for the Conv layer, including SamePad().

    See also Conv, MaxPool, AdaptiveMeanPool.

    Examples

    julia> xs = rand(Float32, 100, 100, 3, 50);

julia> m = Chain(Conv((5,5), 3 => 7), MeanPool((5,5), pad=SamePad()))
Chain(
  Conv((5, 5), 3 => 7),                 # 532 parameters
  MeanPool((5, 5), pad=2),
)

julia> m[1](xs) |> size
(96, 96, 7, 50)

julia> m(xs) |> size
(20, 20, 7, 50)
    source
    Flux.GlobalMeanPool — Type
    GlobalMeanPool()

    Global mean pooling layer.

    Transforms (w,h,c,b)-shaped input into (1,1,c,b)-shaped output, by performing mean pooling on the complete (w,h)-shaped feature maps.

    julia> xs = rand(Float32, 100, 100, 3, 50);

julia> m = Chain(Conv((3,3), 3 => 7), GlobalMeanPool());

julia> m(xs) |> size
(1, 1, 7, 50)
    source

    Upsampling

    The opposite of pooling, these layers increase the size of an array. They have no trainable parameters.

    Flux.Upsample — Type
    Upsample(mode = :nearest; [scale, size])
    Upsample(scale, mode = :nearest)

    An upsampling layer. One of two keywords must be given:

    If scale is a number, this applies to all but the last two dimensions (channel and batch) of the input. It may also be a tuple, to control dimensions individually. Alternatively, keyword size accepts a tuple, to directly specify the leading dimensions of the output.

    Currently supported upsampling modes and corresponding NNlib methods are:

    • :nearest -> NNlib.upsample_nearest
    • :bilinear -> NNlib.upsample_bilinear
    • :trilinear -> NNlib.upsample_trilinear

    Examples

    julia> m = Upsample(scale = (2, 3))
Upsample(:nearest, scale = (2, 3))

julia> m(ones(2, 2, 1, 1)) |> size
(4, 6, 1, 1)

julia> m = Upsample(:bilinear, size = (4, 5))
Upsample(:bilinear, size = (4, 5))

julia> m(ones(2, 2, 1, 1)) |> size
(4, 5, 1, 1)
    source
    Flux.PixelShuffle — Type
    PixelShuffle(r::Int)

    Pixel shuffling layer with upscale factor r. Usually used for generating higher resolution images while upscaling them.

    See NNlib.pixel_shuffle.
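Pixel shuffling is just a reshape and a permutation: r² consecutive channels are interleaved into an r×r block of output pixels. A plain-Julia sketch of the operation (our `pixel_shuffle_sketch`, not the NNlib implementation), which reproduces the doctest below:

```julia
# x has size (w, h, c*r^2, n); the result has size (w*r, h*r, c, n).
function pixel_shuffle_sketch(x, r)
    w, h, c, n = size(x)
    @assert c % r^2 == 0
    y = reshape(x, w, h, r, r, c ÷ r^2, n)
    y = permutedims(y, (3, 1, 4, 2, 5, 6))   # interleave the r×r sub-pixels
    reshape(y, w*r, h*r, c ÷ r^2, n)
end

xs = [2row + col + channel/10 for row in 1:2, col in 1:2, channel in 1:4, n in 1:1]
pixel_shuffle_sketch(xs, 2)   # matches p(xs) in the example below
```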

    Examples

    julia> p = PixelShuffle(2);

julia> xs = [2row + col + channel/10 for row in 1:2, col in 1:2, channel in 1:4, n in 1:1]
2×2×4×1 Array{Float64, 4}:
[:, :, 1, 1] =
 3.1  4.1
 5.1  6.1

[:, :, 2, 1] =
 3.2  4.2
 5.2  6.2

[:, :, 3, 1] =
 3.3  4.3
 5.3  6.3

[:, :, 4, 1] =
 3.4  4.4
 5.4  6.4

julia> p(xs)
4×4×1×1 Array{Float64, 4}:
[:, :, 1, 1] =
 3.1  3.3  4.1  4.3
 3.2  3.4  4.2  4.4
 5.1  5.3  6.1  6.3
 5.2  5.4  6.2  6.4

julia> xs = [3row + col + channel/10 for row in 1:2, col in 1:3, channel in 1:4, n in 1:1]
2×3×4×1 Array{Float64, 4}:
[:, :, 1, 1] =
 4.1  5.1  6.1
 7.1  8.1  9.1

[:, :, 2, 1] =
 4.2  5.2  6.2
 7.2  8.2  9.2

[:, :, 3, 1] =
 4.3  5.3  6.3
 7.3  8.3  9.3

[:, :, 4, 1] =
 4.4  5.4  6.4
 7.4  8.4  9.4

julia> p(xs)
4×6×1×1 Array{Float64, 4}:
[:, :, 1, 1] =
 4.1  4.3  5.1  5.3  6.1  6.3
 4.2  4.4  5.2  5.4  6.2  6.4
 7.1  7.3  8.1  8.3  9.1  9.3
 7.2  7.4  8.2  8.4  9.2  9.4
    source

    Embedding Vectors

    These layers accept an index, and return a vector (or several indices, and several vectors). The possible embedding vectors are learned parameters.

    Flux.Embedding — Type
    Embedding(in => out; init=randn32)

    A lookup table that stores embeddings of dimension out for a vocabulary of size in, as a trainable matrix.

    This layer is often used to store word embeddings and retrieve them using indices. The input to the layer can be a vocabulary index in 1:in, an array of indices, or the corresponding onehot encoding.

    For indices x, the result is of size (out, size(x)...), allowing several batch dimensions. For one-hot ohx, the result is of size (out, size(ohx)[2:end]...).
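The lookup itself is nothing more than column indexing into the weight matrix, which a plain-Julia sketch makes explicit (`W` and `embed` are our names, not Flux's):

```julia
# An embedding table with out = 3 rows and vocabulary size in = 4:
W = reshape(1.0:12.0, 3, 4)
embed(W, x::Integer) = W[:, x]                       # one column
embed(W, x::AbstractArray{<:Integer}) =              # keep batch dims
    reshape(W[:, vec(x)], size(W, 1), size(x)...)

embed(W, 2)                           # column 2 of W
size(embed(W, rand(1:4, 10, 1, 12)))  # (3, 10, 1, 12), as described above
```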

    Examples

    julia> emb = Embedding(26 => 4, init=Flux.identity_init(gain=22))
Embedding(26 => 4)  # 104 parameters

julia> emb(2)  # one column of e.weight (here not random!)
4-element Vector{Float32}:
  0.0
 22.0
  0.0
  0.0

julia> emb([3, 1, 20, 14, 4, 15, 7])  # vocabulary indices, in 1:26
4×7 Matrix{Float32}:
  0.0  22.0  0.0  0.0   0.0  0.0  0.0
  0.0   0.0  0.0  0.0   0.0  0.0  0.0
 22.0   0.0  0.0  0.0   0.0  0.0  0.0
  0.0   0.0  0.0  0.0  22.0  0.0  0.0

julia> ans == emb(Flux.onehotbatch("cat&dog", 'a':'z', 'n'))
true

julia> emb(rand(1:26, (10, 1, 12))) |> size  # three batch dimensions
(4, 10, 1, 12)
    source
    Flux.EmbeddingBag — Type
    EmbeddingBag(in => out, reduction=mean; init=Flux.randn32)

    A lookup table that stores embeddings of dimension out for a vocabulary of size in. Differs from Embedding in that, instead of acting on a single vocabulary index, it always acts on a vector of indices, which it calls a "bag". Their individual embedding vectors are reduced to one, using mean or some other function.

    Instead of acting on one "bag", such as x::Vector{Int}, the layer can also act on several:

    • Acting on a vector of "bags", it produces a matrix whose columns are the reduced vectors. More generally on x::Array{Vector{Int}}, its output is of size (out, size(x)...).

    • Any higher-rank array of integers is interpreted as a collection of "bags" each along the first dimension. Thus the output is mapslices(e, x; dims=1) when e::EmbeddingBag and x::Array{Int,N}. This method is more efficient, but requires that all "bags" have the same length.

    • A vector of "bags" may also be produced by splitting a vector of indices at specified points. For this case the layer takes two inputs, both vectors of integers. See details below.

    The "bag" may equivalently be represented as a OneHotMatrix. A collection of these, or one higher-rank OneHotArray, again produce a stack of embeddings. See details below.

    Examples

    julia> vocab_size = 26;  # embed into 3 dimensions, with non-random vectors:

julia> eb = EmbeddingBag(vocab_size => 3, init=Flux.identity_init(gain=100))
EmbeddingBag(26 => 3)  # 78 parameters

julia> eb([2])  # one bag of 1 item
3-element Vector{Float32}:
   0.0
 100.0
   0.0

julia> eb([3,3,1])  # one bag of 3 items, one mean embedding
3-element Vector{Float32}:
 33.333332
  0.0
 66.666664

julia> eb([[3,1,3], [2,1]])  # two bags
3×2 Matrix{Float32}:
 33.3333  50.0
  0.0     50.0
 66.6667   0.0

julia> eb([1 1 1 1; 1 2 3 4])  # 4 bags each of 2 items, eachcol([1 1 1 1; 1 2 3 4])
3×4 Matrix{Float32}:
 100.0  50.0  50.0  50.0
   0.0  50.0   0.0   0.0
   0.0   0.0  50.0   0.0

julia> eb(rand(1:26, 10, 5, 5)) |> size  # 25 bags each of 10 items
(3, 5, 5)

    Another way to specify "many bags of many items" is to provide a vector data (each in 1:in) and a vector at stating where to split that up into "bags". The first bag starts with data[at[1]], the second at data[at[2]], and so on, with no overlaps and nothing left out (thus it requires at[1]==1).

    julia> data = [11, 1, 12, 2, 13, 3, 14];

julia> Flux._splitat(data, [1, 4]) |> println  # internal function, makes data[1:3], data[4:end]
[[11, 1, 12], [2, 13, 3, 14]]

julia> eb(data, [1, 4])  # two bags, of 3 and 4 items
3×2 Matrix{Float32}:
 33.3333   0.0
  0.0     25.0
  0.0     25.0

    Finally, each bag may also be represented as a OneHotMatrix.

    julia> eb(Flux.onehotbatch("bba", 'a':'z'))  # same as [2,2,1], one bag of 3 items
3-element Vector{Float32}:
 33.333332
 66.666664
  0.0

julia> eb([Flux.onehotbatch("bba", 'a':'z'), Flux.onehotbatch("cc", 'a':'z')])  # two bags
3×2 Matrix{Float32}:
 33.3333    0.0
 66.6667    0.0
  0.0     100.0
    source

    Dataflow Layers, or Containers

    The basic Chain(F, G, H) applies the layers it contains in sequence, equivalent to H ∘ G ∘ F. Flux has some other layers which contain layers, but connect them up in a more complicated way: SkipConnection allows ResNet's residual connection.

    Flux.Chain — Type
    Chain(layers...)
    Chain(name = layer, ...)

    Collects multiple layers / functions to be called in sequence on a given input. Supports indexing and slicing, m[2] or m[1:end-1], and if names are given, m[:name] == m[1] etc.
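Calling a Chain is just left-to-right function application, i.e. Chain(F, G, H)(x) == H(G(F(x))). A sketch with plain functions and foldl (`chain` is our name, not the Flux type):

```julia
# Apply the given layers in sequence:
chain(layers...) = x -> foldl((y, l) -> l(y), layers; init=x)

m = chain(x -> x^2, x -> x + 1)
m(5)   # 26, matching the first Chain example below
```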

    Examples

    julia> m = Chain(x -> x^2, x -> x+1);

julia> m(5) == 26
true

julia> m = Chain(Dense(10 => 5, tanh), Dense(5 => 2));

julia> x = rand32(10, 32);

julia> m(x) == m[2](m[1](x))
true

julia> m2 = Chain(enc = Chain(Flux.flatten, Dense(10 => 5, tanh)),
                  dec = Dense(5 => 2));

julia> m2(x) == (m2[:dec] ∘ m2[:enc])(x)
true

    For large models, there is a special type-unstable path which can reduce compilation times. This can be used by supplying a vector of layers Chain([layer1, layer2, ...]). This feature is somewhat experimental, beware!

    source
    Flux.activations — Function
    activations(c::Chain, input)

    Like calling a Chain, but saves the result of each layer as an output.

    Examples

    julia> using Flux: activations

julia> c = Chain(x -> x + 1, x -> x * 2, x -> x ^ 3);

julia> activations(c, 1)
(2, 4, 64)
    source
    Flux.Maxout — Type
    Maxout(layers...)
    Maxout(f, n_alts)

    This contains a number of internal layers, each of which receives the same input. Its output is the elementwise maximum of the internal layers' outputs.

    Instead of defining layers individually, you can provide a zero-argument function which constructs them, and the number to construct.

    Maxout over linear dense layers satisfies the universal approximation theorem. See Goodfellow, Warde-Farley, Mirza, Courville & Bengio "Maxout Networks" https://arxiv.org/abs/1302.4389.

    See also Parallel to reduce with other operators.
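The forward pass is simply "apply every sub-layer to the same input, then take the elementwise maximum". A sketch with plain functions (`maxout` is our name, not the Flux type), reproducing the first example below:

```julia
# Elementwise maximum over the outputs of several layers:
maxout(layers...) = x -> reduce((a, b) -> max.(a, b), map(l -> l(x), layers))

m = maxout(x -> abs2.(x), x -> x .* 3)
m([-2 -1 0 1 2])   # 1×5 Matrix: 4  1  0  3  6
```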

    Examples

    julia> m = Maxout(x -> abs2.(x), x -> x .* 3);

julia> m([-2 -1 0 1 2])
1×5 Matrix{Int64}:
 4  1  0  3  6

julia> m3 = Maxout(() -> Dense(5 => 7, tanh), 3)
Maxout(
  Dense(5 => 7, tanh),                  # 42 parameters
  Dense(5 => 7, tanh),                  # 42 parameters
  Dense(5 => 7, tanh),                  # 42 parameters
)                   # Total: 6 arrays, 126 parameters, 888 bytes.

julia> Flux.outputsize(m3, (5, 11))
(7, 11)
    source
    Flux.SkipConnection — Type
    SkipConnection(layer, connection)

    Create a skip connection which consists of a layer or Chain of consecutive layers and a shortcut connection linking the block's input to the output through a user-supplied 2-argument callable. The first argument to the callable will be propagated through the given layer while the second is the unchanged, "skipped" input.

    The simplest "ResNet"-type connection is just SkipConnection(layer, +). Here is a more complicated example:

    julia> m = Conv((3,3), 4 => 7, pad=(1,1));

julia> x = ones(Float32, 5, 5, 4, 10);

julia> size(m(x)) == (5, 5, 7, 10)
true

julia> sm = SkipConnection(m, (mx, x) -> cat(mx, x, dims=3));

julia> size(sm(x)) == (5, 5, 11, 10)
true

    See also Parallel, Maxout.

    source
    Flux.Parallel — Type
    Parallel(connection, layers...)
    Parallel(connection; name = layer, ...)

    Create a layer which passes an input array to each path in layers, before reducing the output with connection.

    Called with one input x, this is equivalent to connection([l(x) for l in layers]...). If called with multiple inputs, one is passed to each layer, thus Parallel(+, f, g)(x, y) = f(x) + g(y).

    Like Chain, its sub-layers may be given names using the keyword constructor. These can be accessed by indexing: m[1] == m[:name] is the first layer.

    See also SkipConnection which is Parallel with one identity, and Maxout which reduces by broadcasting max.
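Both calling conventions can be sketched in a few lines of plain Julia (`parallel` is our name, not the Flux type): one input is fanned out to every layer, while several inputs are matched up with the layers one-to-one.

```julia
parallel(connection, layers...) =
    (xs...) -> length(xs) == 1 ?
        connection(map(l -> l(xs[1]), layers)...) :
        connection(map((l, x) -> l(x), layers, xs)...)

p = parallel(+, x -> 2x, x -> x + 1)
p(3)      # 2*3 + (3+1) == 10
p(3, 4)   # 2*3 + (4+1) == 11
```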

    Examples

    julia> model = Chain(Dense(3 => 5),
                     Parallel(vcat, Dense(5 => 4), Chain(Dense(5 => 7), Dense(7 => 4))),
                     Dense(8 => 17));

julia> model(rand32(3)) |> size
(17,)

julia> model2 = Parallel(+; α = Dense(10, 2, tanh), β = Dense(5, 2))
Parallel(
  +,
  α = Dense(10 => 2, tanh),             # 22 parameters
  β = Dense(5 => 2),                    # 12 parameters
)                   # Total: 4 arrays, 34 parameters, 392 bytes.

julia> model2(rand32(10), rand32(5)) |> size
(2,)

julia> model2[:α](rand32(10)) |> size
(2,)

julia> model2[:β] == model2[2]
true
    source
    Flux.PairwiseFusion — Type
    PairwiseFusion(connection, layers...)

    Arguments

    • connection: A function taking 2 inputs and combining them into a single output
    • layers: The layers whose outputs are combined

    Inputs

    This layer behaves differently based on input type:

    1. If input x is a tuple of length N (or the input is xs with N x's), matching the number of layers,

    then each layer receives a new input x[i] combined with the previous output y[i-1] using connection. Thus (y1, y2, y3) = PairwiseFusion(connection, layer1, layer2, layer3)((x1, x2, x3)) may be drawn as:

    x1 → layer1 → y1 ↘
                  connection → layer2 → y2 ↘
              x2 ↗                          connection → layer3 → y3
                                        x3 ↗

    ... or written as:

    y1 = layer1(x1)
y2 = layer2(connection(y1, x2))
y3 = layer3(connection(y2, x3))
    2. With just one input, each layer receives the same x combined with the previous output. Thus y = PairwiseFusion(connection, layers...)(x) obeys:
    y[1] == layers[1](x)
for i in 2:length(layers)
    y[i] == connection(layers[i](y[i-1]), x)
end

    Returns

    A tuple of length N with the output of each fusion ((y1, y2, ..., yN) in the example above).
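The tuple case above can be written out as a plain loop, which may make the data flow easier to follow (a sketch; `pairwise_fusion` is our name, not the Flux type):

```julia
function pairwise_fusion(connection, layers, xs::Tuple)
    y = layers[1](xs[1])
    ys = (y,)
    for i in 2:length(layers)
        y = layers[i](connection(y, xs[i]))   # fuse previous output with xs[i]
        ys = (ys..., y)
    end
    ys
end

pairwise_fusion(+, (x -> 2x, x -> x + 1, x -> -x), (1, 2, 3))
# (2, 5, -8): y1 = 2·1, y2 = (2+2)+1, y3 = -(5+3)
```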

    source

    Recurrent Models

    Much like the core layers above, but can be used to process sequence data (as well as other kinds of structured data).

    Flux.RNN — Function
    RNN(in => out, σ = tanh)

    The most basic recurrent layer; essentially acts as a Dense layer, but with the output fed back into the input each time step.

    The arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as an in × B matrix, and outputs a vector of length out or a batch of vectors of size out × B.

    This constructor is syntactic sugar for Recur(RNNCell(a...)), and so RNNs are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. See the examples below.
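The computation performed at each time step is h′ = tanh.(Wi*x .+ Wh*h .+ b), with the new state h′ fed back in at the next step. A plain-Julia sketch with random weights (`rnn_step` is our name, not a Flux function):

```julia
# One step of a vanilla RNN cell; in = 3 features, out = 5 state units.
rnn_step(Wi, Wh, b, h, x) = tanh.(Wi*x .+ Wh*h .+ b)

Wi, Wh, b = randn(5, 3), randn(5, 5), randn(5)
h = zeros(5)
h = rnn_step(Wi, Wh, b, h, randn(3))              # new state, length 5
H = rnn_step(Wi, Wh, b, zeros(5), randn(3, 10))   # batch of 10: size 5×10
```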

    Examples

    julia> r = RNN(3 => 5)
Recur(
  RNNCell(3 => 5, tanh),                # 50 parameters
)         # Total: 4 trainable arrays, 50 parameters,
          # plus 1 non-trainable, 5 parameters, summarysize 432 bytes.

julia> r(rand(Float32, 3)) |> size
(5,)

julia> Flux.reset!(r);

julia> r(rand(Float32, 3, 10)) |> size # batch size of 10
(5, 10)
    Batch size changes

    Failing to call reset! when the input batch size changes can lead to unexpected behavior. See the following example:

    julia> r = RNN(3 => 5)
Recur(
  RNNCell(3 => 5, tanh),                # 50 parameters
)         # Total: 4 trainable arrays, 50 parameters,
          # plus 1 non-trainable, 5 parameters, summarysize 432 bytes.

julia> r.state |> size
(5, 1)

julia> r(rand(Float32, 3)) |> size
(5,)

julia> r.state |> size
(5, 1)

julia> r(rand(Float32, 3, 10)) |> size # batch size of 10
(5, 10)

julia> r.state |> size # state shape has changed
(5, 10)

julia> r(rand(Float32, 3)) |> size # erroneously outputs a length 5*10 = 50 vector.
(50,)

    Note:

    RNNCells can be constructed directly by specifying the non-linear function, the Wi and Wh internal matrices, a bias vector b, and a learnable initial state state0. The Wi and Wh matrices do not need to be the same type, but if Wh is dxd, then Wi should be of shape dxN.

    julia> using LinearAlgebra

julia> r = Flux.Recur(Flux.RNNCell(tanh, rand(5, 4), Tridiagonal(rand(5, 5)), rand(5), rand(5, 1)))

julia> r(rand(4, 10)) |> size # batch size of 10
(5, 10)
    source
    Flux.LSTM — Function
    LSTM(in => out)

    Long Short Term Memory recurrent layer. Behaves like an RNN but generally exhibits a longer memory span over sequences.

    The arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as an in × B matrix, and outputs a vector of length out or a batch of vectors of size out × B.

    This constructor is syntactic sugar for Recur(LSTMCell(a...)), and so LSTMs are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. See the examples below.

    See this article for a good overview of the internals.

    Examples

    julia> l = LSTM(3 => 5)
Recur(
  LSTMCell(3 => 5),                     # 190 parameters
)         # Total: 5 trainable arrays, 190 parameters,
          # plus 2 non-trainable, 10 parameters, summarysize 1.062 KiB.

julia> l(rand(Float32, 3)) |> size
(5,)

julia> Flux.reset!(l);

julia> l(rand(Float32, 3, 10)) |> size # batch size of 10
(5, 10)
    Batch size changes

    Failing to call reset! when the input batch size changes can lead to unexpected behavior. See the example in RNN.

    Note:

    LSTMCells can be constructed directly by specifying the non-linear function, the Wi and Wh internal matrices, a bias vector b, and a learnable initial state state0. The Wi and Wh matrices do not need to be the same type. See the example in RNN.
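    As a hedged sketch of this direct construction (the shapes here are assumptions, not from the docs: the four gate matrices are stacked, so Wi and Wh have 4*out rows, and state0 is a (hidden, cell) tuple of out × 1 matrices):

    ```julia
    julia> Wi, Wh = rand(Float32, 20, 3), rand(Float32, 20, 5);  # 4*out rows stack the four gates

    julia> state0 = (zeros(Float32, 5, 1), zeros(Float32, 5, 1));  # (hidden, cell) initial states

    julia> l = Flux.Recur(Flux.LSTMCell(Wi, Wh, zeros(Float32, 20), state0));

    julia> l(rand(Float32, 3, 10)) |> size  # batch size of 10
    (5, 10)
    ```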

    source
    Flux.GRUFunction
    GRU(in => out)

    Gated Recurrent Unit layer. Behaves like an RNN but generally exhibits a longer memory span over sequences. This implements the variant proposed in v1 of the referenced paper.

    The integer arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as an in × B matrix and outputs a vector of length out or a batch of vectors of size out × B.

    This constructor is syntactic sugar for Recur(GRUCell(a...)), and so GRUs are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. See the examples below.

    See this article for a good overview of the internals.

    Examples

    julia> g = GRU(3 => 5)
    +Recur(
    +  GRUCell(3 => 5),                      # 140 parameters
    +)         # Total: 4 trainable arrays, 140 parameters,
    +          # plus 1 non-trainable, 5 parameters, summarysize 792 bytes.
    +
    +julia> g(rand(Float32, 3)) |> size
    +(5,)
    +
    +julia> Flux.reset!(g);
    +
    +julia> g(rand(Float32, 3, 10)) |> size # batch size of 10
    +(5, 10)
    Batch size changes

    Failing to call reset! when the input batch size changes can lead to unexpected behavior. See the example in RNN.

    Note:

    GRUCells can be constructed directly by specifying the non-linear function, the Wi and Wh internal matrices, a bias vector b, and a learnable initial state state0. The Wi and Wh matrices do not need to be the same type. See the example in RNN.

    source
    Flux.GRUv3Function
    GRUv3(in => out)

    Gated Recurrent Unit layer. Behaves like an RNN but generally exhibits a longer memory span over sequences. This implements the variant proposed in v3 of the referenced paper.

    The arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as an in × B matrix and outputs a vector of length out or a batch of vectors of size out × B.

    This constructor is syntactic sugar for Recur(GRUv3Cell(a...)), and so GRUv3s are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. See the examples below.

    See this article for a good overview of the internals.

    Examples

    julia> g = GRUv3(3 => 5)
    +Recur(
    +  GRUv3Cell(3 => 5),                    # 140 parameters
    +)         # Total: 5 trainable arrays, 140 parameters,
    +          # plus 1 non-trainable, 5 parameters, summarysize 848 bytes.
    +
    +julia> g(rand(Float32, 3)) |> size
    +(5,)
    +
    +julia> Flux.reset!(g);
    +
    +julia> g(rand(Float32, 3, 10)) |> size # batch size of 10
    +(5, 10)
    Batch size changes

    Failing to call reset! when the input batch size changes can lead to unexpected behavior. See the example in RNN.

    Note:

    GRUv3Cells can be constructed directly by specifying the non-linear function, the Wi, Wh, and Wh_h internal matrices, a bias vector b, and a learnable initial state state0. The Wi, Wh, and Wh_h matrices do not need to be the same type. See the example in RNN.

    source
    Flux.RecurType
    Recur(cell)

    Recur takes a recurrent cell and makes it stateful, managing the hidden state in the background. cell should be a model of the form:

    h, y = cell(h, x...)

    For example, here's a recurrent network that keeps a running total of its inputs:

    Examples

    julia> accum(h, x) = (h + x, x)
    +accum (generic function with 1 method)
    +
    +julia> rnn = Flux.Recur(accum, 0)
    +Recur(accum)
    +
    +julia> rnn(2) 
    +2
    +
    +julia> rnn(3)
    +3
    +
    +julia> rnn.state
    +5

    Folding over a 3d Array of dimensions (features, batch, time) is also supported:

    julia> accum(h, x) = (h .+ x, x)
    +accum (generic function with 1 method)
    +
    +julia> rnn = Flux.Recur(accum, zeros(Int, 1, 1))
    +Recur(accum)
    +
    +julia> rnn([2])
    +1-element Vector{Int64}:
    + 2
    +
    +julia> rnn([3])
    +1-element Vector{Int64}:
    + 3
    +
    +julia> rnn.state
    +1×1 Matrix{Int64}:
    + 5
    +
    +julia> out = rnn(reshape(1:10, 1, 1, :));  # apply to a sequence of (features, batch, time)
    +
    +julia> out |> size
    +(1, 1, 10)
    +
    +julia> vec(out)
    +10-element Vector{Int64}:
    +  1
    +  2
    +  3
    +  4
    +  5
    +  6
    +  7
    +  8
    +  9
    + 10
    +
    +julia> rnn.state
    +1×1 Matrix{Int64}:
    + 60
    source
    Flux.reset!Function
    reset!(rnn)

    Reset the hidden state of a recurrent layer back to its original value.

    Assuming you have a Recur layer rnn, this is roughly equivalent to:

    rnn.state = hidden(rnn.cell)

    Examples

    julia> r = Flux.RNNCell(relu, ones(1,1), zeros(1,1), ones(1,1), zeros(1,1));  # users should use the RNN wrapper struct instead
    +
    +julia> y = Flux.Recur(r, ones(1,1));
    +
    +julia> y.state
    +1×1 Matrix{Float64}:
    + 1.0
    +
    +julia> y(ones(1,1))  # relu(1*1 + 1)
    +1×1 Matrix{Float64}:
    + 2.0
    +
    +julia> y.state
    +1×1 Matrix{Float64}:
    + 2.0
    +
    +julia> Flux.reset!(y)
    +1×1 Matrix{Float64}:
    + 0.0
    +
    +julia> y.state
    +1×1 Matrix{Float64}:
    + 0.0
    source

    Normalisation & Regularisation

    These layers don't affect the structure of the network but may improve training times or reduce overfitting. Some of them contain trainable parameters, while others do not.

    Flux.BatchNormType
    BatchNorm(channels::Integer, λ=identity;
    +          initβ=zeros32, initγ=ones32,
    +          affine=true, track_stats=true, active=nothing,
    +          eps=1f-5, momentum= 0.1f0)

    Batch Normalization layer. channels should be the size of the channel dimension in your data (see below).

    Given an array with N dimensions, call the N-1th the channel dimension. For a batch of feature vectors this is just the data dimension, for WHCN images it's the usual channel dimension.

    BatchNorm computes the mean and variance for each D_1×...×D_{N-2}×1×D_N input slice and normalises the input accordingly.

    If affine=true, it also applies a shift and a rescale to the input through learnable per-channel bias β and scale γ parameters.

    After normalisation, elementwise activation λ is applied.

    If track_stats=true, accumulates mean and var statistics in training phase that will be used to renormalize the input in test phase.

    Use testmode! during inference.

    Examples

    julia> using Statistics
    +
    +julia> xs = rand(3, 3, 3, 2);  # a batch of 2 images, each having 3 channels
    +
    +julia> m = BatchNorm(3);
    +
    +julia> Flux.trainmode!(m);
    +
    +julia> isapprox(std(m(xs)), 1, atol=0.1) && std(xs) != std(m(xs))
    +true
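    Conversely, after training, testmode! switches the layer to the running statistics accumulated when track_stats=true. A short self-contained sketch (not part of the original docstring):

    ```julia
    julia> using Statistics

    julia> xs = rand(3, 3, 3, 2);  # a batch of 2 images, each having 3 channels

    julia> m = BatchNorm(3);

    julia> Flux.trainmode!(m);  # forward passes now update the running statistics

    julia> m(xs);

    julia> Flux.testmode!(m);   # inference: running mean/var are used, not batch statistics

    julia> m(xs) |> size
    (3, 3, 3, 2)
    ```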
    source
    Flux.DropoutType
    Dropout(p; [dims, rng, active])

    Layer implementing dropout with the given probability. This is used as a regularisation, i.e. to reduce overfitting.

    While training, it sets each input to 0 (with probability p) or else scales it by 1 / (1 - p), using the NNlib.dropout function. While testing, it has no effect.

    By default the mode will switch automatically, but it can also be controlled manually via Flux.testmode!, or by passing keyword active=true for training mode.

    By default every input is treated independently. With the dims keyword, instead it takes a random choice only along that dimension. For example Dropout(p; dims = 3) will randomly zero out entire channels on WHCN input (also called 2D dropout).

    Keyword rng lets you specify a custom random number generator. (Only supported on the CPU.)

    Examples

    julia> m = Chain(Dense(ones(3,2)), Dropout(0.4))
    +Chain(
    +  Dense(2 => 3),                        # 9 parameters
    +  Dropout(0.4),
    +)
    +
    +julia> m(ones(2, 7))  # test mode, no effect
    +3×7 Matrix{Float64}:
    + 2.0  2.0  2.0  2.0  2.0  2.0  2.0
    + 2.0  2.0  2.0  2.0  2.0  2.0  2.0
    + 2.0  2.0  2.0  2.0  2.0  2.0  2.0
    +
    +julia> Flux.trainmode!(m)  # equivalent to use within gradient
    +Chain(
    +  Dense(2 => 3),                        # 9 parameters
    +  Dropout(0.4, active=true),
    +)
    +
    +julia> m(ones(2, 7))
    +3×7 Matrix{Float64}:
    + 0.0      0.0      3.33333  0.0      0.0      0.0  0.0
    + 3.33333  0.0      3.33333  0.0      3.33333  0.0  3.33333
    + 3.33333  3.33333  0.0      3.33333  0.0      0.0  3.33333
    +
    +julia> y = m(ones(2, 10_000));
    +
    +julia> using Statistics
    +
    +julia> mean(y)  # is about 2.0, same as in test mode
    +1.9989999999999961
    +
    +julia> mean(iszero, y)  # is about 0.4
    +0.4003
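    To illustrate the dims keyword described above, a brief sketch (random values, only the pattern matters): with dims = 3 on WHCN input, each channel is either kept whole or zeroed whole.

    ```julia
    julia> d = Dropout(0.4; dims=3);  # "2D dropout": drops entire channels

    julia> Flux.trainmode!(d);

    julia> y = d(ones(Float32, 2, 2, 3, 1));

    julia> all(c -> all(iszero, y[:, :, c, :]) || all(!iszero, y[:, :, c, :]), 1:3)
    true
    ```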
    source
    Flux.AlphaDropoutType
    AlphaDropout(p; [rng, active])

    A dropout layer. Used in Self-Normalizing Neural Networks. The AlphaDropout layer ensures that mean and variance of activations remain the same as before.

    Does nothing to the input once testmode! is true.

    Examples

    julia> using Statistics
    +
    +julia> x = randn32(1000,1);
    +
    +julia> m = Chain(Dense(1000 => 1000, selu), AlphaDropout(0.2));
    +
    +julia> Flux.trainmode!(m);
    +
    +julia> y = m(x);
    +
    +julia> isapprox(std(x), std(y), atol=0.2)
    +true
    source
    Flux.LayerNormType
    LayerNorm(size..., λ=identity; affine=true, eps=1f-5)

    A normalisation layer designed to be used with recurrent hidden states. The argument size should be an integer or a tuple of integers.

    In the forward pass, the layer normalises the mean and standard deviation of the input, then applies the elementwise activation λ. The input is normalised along the first length(size) dimensions for tuple size, and along the first dimension for integer size. The input is expected to have its leading dimensions equal to size.

    If affine=true, it also applies a learnable shift and rescaling using the Scale layer.

    See also BatchNorm, InstanceNorm, GroupNorm, and normalise.

    Examples

    julia> using Statistics
    +
    +julia> xs = rand(3, 3, 3, 2);  # a batch of 2 images, each having 3 channels
    +
    +julia> m = LayerNorm(3);
    +
    +julia> y = m(xs);
    +
    +julia> isapprox(std(y, dims=1:3), ones(1, 1, 1, 2), atol=0.1) && std(y, dims=1:3) != std(xs, dims=1:3)
    +true
    source
    Flux.InstanceNormType
    InstanceNorm(channels::Integer, λ=identity;
    +             initβ=zeros32, initγ=ones32,
    +             affine=false, track_stats=false,
    +             eps=1f-5, momentum=0.1f0)

    Instance Normalization layer. channels should be the size of the channel dimension in your data (see below).

    Given an array with N > 2 dimensions, call the N-1th the channel dimension. For WHCN images it's the usual channel dimension.

    InstanceNorm computes the mean and variance for each D_1×...×D_{N-2}×1×1 input slice and normalises the input accordingly.

    If affine=true, it also applies a shift and a rescale to the input through learnable per-channel bias β and scale γ parameters.

    If track_stats=true, accumulates mean and var statistics in training phase that will be used to renormalize the input in test phase.

    Warning: the defaults for affine and track_stats used to be true in previous Flux versions (< v0.12).

    Examples

    julia> using Statistics
    +
    +julia> xs = rand(3, 3, 3, 2);  # a batch of 2 images, each having 3 channels
    +
    +julia> m = InstanceNorm(3);
    +
    +julia> y = m(xs);
    +
    +julia> isapprox(std(y, dims=1:2), ones(1, 1, 3, 2), atol=0.2) && std(y, dims=1:2) != std(xs, dims=1:2)
    +true
    source
    Flux.GroupNormType
    GroupNorm(channels::Int, G::Int, λ = identity;
    +          initβ = zeros32,
    +          initγ = ones32,
    +          affine = true,
    +          eps = 1f-5,
    +          momentum = 0.1f0)

    Group Normalization layer.

    channels is the number of channels, the channel dimension of your input. For an array of N dimensions, the N-1th index is the channel dimension.

    G is the number of groups along which the statistics are computed. The number of channels must be an integer multiple of the number of groups.

    channels should be the size of the channel dimension in your data (see below).

    Given an array with N > 2 dimensions, call the N-1th the channel dimension. For WHCN images it's the usual channel dimension.

    If affine=true, it also applies a shift and a rescale to the input through learnable per-channel bias β and scale γ parameters.

    Examples

    julia> using Statistics
    +
    +julia> xs = rand(3, 3, 4, 2);  # a batch of 2 images, each having 4 channels
    +
    +julia> m = GroupNorm(4, 2);
    +
    +julia> y = m(xs);
    +
    +julia> isapprox(std(y[:, :, 1:2, 1]), 1, atol=0.1) && std(xs[:, :, 1:2, 1]) != std(y[:, :, 1:2, 1])
    +true
    +
    +julia> isapprox(std(y[:, :, 3:4, 2]), 1, atol=0.1) && std(xs[:, :, 3:4, 2]) != std(y[:, :, 3:4, 2])
    +true
    source
    Flux.normaliseFunction
    normalise(x; dims=ndims(x), eps=1e-5)

    Normalise x to mean 0 and standard deviation 1 across the dimension(s) given by dims. By default, dims is the last dimension. eps is a small term added to the denominator for numerical stability.

    Examples

    julia> using Statistics
    +
    +julia> x = [90, 100, 110, 130, 70];
    +
    +julia> mean(x), std(x; corrected=false)
    +(100.0, 20.0)
    +
    +julia> y = Flux.normalise(x)
    +5-element Vector{Float64}:
    + -0.49999975000012503
    +  0.0
    +  0.49999975000012503
    +  1.499999250000375
    + -1.499999250000375
    +
    +julia> isapprox(std(y; corrected=false), 1, atol=1e-5)
    +true
    +
    +julia> x = rand(10:100, 10, 10);
    +
    +julia> y = Flux.normalise(x, dims=1);
    +
    +julia> isapprox(std(y; dims=1, corrected=false), ones(1, 10), atol=1e-5)
    +true
    source

    Test vs. Train

    Several normalisation layers behave differently under training and inference (testing). By default, Flux will automatically determine when a layer evaluation is part of training or inference.

    Warning

    This automatic train/test detection works best with Zygote, the default automatic differentiation package. It may not work with other packages such as Tracker, Yota, or ForwardDiff.

    The functions Flux.trainmode! and Flux.testmode! let you manually specify which behaviour you want. When called on a model, they will place all layers within the model into the specified mode.

    Flux.testmode!Method
    testmode!(model, [mode]) -> model

    Set a layer, or all layers in a model, to test mode. This disables the effect of Dropout and some other regularisation layers.

    If you manually set a model into test mode, you need to manually place it back into train mode during training phase, using trainmode!.

    There is an optional second argument, which takes a symbol :auto to reset all layers back to the default automatic mode.

    Example

    julia> d = Dropout(0.3)
    +Dropout(0.3)
    +
    +julia> testmode!(d)   # dropout is now always disabled
    +Dropout(0.3, active=false)
    +
    +julia> trainmode!(d)  # dropout is now always enabled
    +Dropout(0.3, active=true)
    +
    +julia> testmode!(d, :auto)  # back to default
    +Dropout(0.3)
    source
    Flux.testmode!Method
    testmode!(model, inactive)

    This two-argument method is largely internal. It recurses into the model, until a method like testmode!(d::Dropout, inactive) alters the activity of a layer. Custom layers can support manual testmode! / trainmode! switching by defining such a method.

    Possible values of inactive are:

    • true for testing, i.e. active=false
    • false for training, same as trainmode!(m)
    • :auto or nothing for Flux to detect training automatically.
    Compat

    This method may be removed in a future breaking change, to separate the user-facing testmode! from the internal recursion.

    source
    Flux.trainmode!Function
    trainmode!(model) -> model

    Set a layer, or all layers in a model, to training mode. Opposite to testmode!, see further details there.

    source
    trainmode!(m, active)
    Warning

    This two-argument method is deprecated.

    Possible values of active are:

    • true for training, or
    • false for testing, same as testmode!(m)
    • :auto or nothing for Flux to detect training automatically.
    source

    Loss Functions

    Flux provides a large number of common loss functions used for training machine learning models. They are grouped together in the Flux.Losses module.

    Loss functions for supervised learning typically expect as inputs a target y, and a prediction from your model. In Flux's convention, the order of the arguments is the following:

    loss(ŷ, y)

    Most loss functions in Flux have an optional argument agg, denoting the type of aggregation performed over the batch:

    loss(ŷ, y)                         # defaults to `mean`
    +loss(ŷ, y, agg=sum)                # use `sum` for reduction
    +loss(ŷ, y, agg=x->sum(x, dims=2))  # partial reduction
    +loss(ŷ, y, agg=x->mean(w .* x))    # weighted mean
    +loss(ŷ, y, agg=identity)           # no aggregation.

    Function listing

    Flux.Losses.maeFunction
    mae(ŷ, y; agg = mean)

    Return the loss corresponding to mean absolute error:

    agg(abs.(ŷ .- y))

    Example

    julia> y_model = [1.1, 1.9, 3.1];
    +
    +julia> Flux.mae(y_model, 1:3)
    +0.10000000000000009
    source
    Flux.Losses.mseFunction
    mse(ŷ, y; agg = mean)

    Return the loss corresponding to mean square error:

    agg((ŷ .- y) .^ 2)

    See also: mae, msle, crossentropy.

    Example

    julia> y_model = [1.1, 1.9, 3.1];
    +
    +julia> y_true = 1:3;
    +
    +julia> Flux.mse(y_model, y_true)
    +0.010000000000000018
    source
    Flux.Losses.msleFunction
    msle(ŷ, y; agg = mean, eps = eps(eltype(ŷ)))

    The loss corresponding to mean squared logarithmic errors, calculated as

    agg((log.(ŷ .+ ϵ) .- log.(y .+ ϵ)) .^ 2)

    The ϵ == eps term provides numerical stability. Penalizes an under-estimation more than an over-estimation.

    Example

    julia> Flux.msle(Float32[1.1, 2.2, 3.3], 1:3)
    +0.009084041f0
    +
    +julia> Flux.msle(Float32[0.9, 1.8, 2.7], 1:3)
    +0.011100831f0
    source
    Flux.Losses.huber_lossFunction
    huber_loss(ŷ, y; delta = 1, agg = mean)

    Return the mean of the Huber loss given the prediction ŷ and true values y.

                 | 0.5 * |ŷ - y|^2,            for |ŷ - y| <= δ
    +Huber loss = |
    +             |  δ * (|ŷ - y| - 0.5 * δ), otherwise

    Example

    julia> ŷ = [1.1, 2.1, 3.1];
    +
    +julia> Flux.huber_loss(ŷ, 1:3)  # default δ = 1 > |ŷ - y|
    +0.005000000000000009
    +
    +julia> Flux.huber_loss(ŷ, 1:3, delta=0.05)  # changes behaviour as |ŷ - y| > δ
    +0.003750000000000005
    source
    Flux.Losses.label_smoothingFunction
    label_smoothing(y::Union{Number, AbstractArray}, α; dims::Int=1)

    Returns smoothed labels, meaning the confidence on label values is relaxed.

    When y is given as a one-hot vector or a batch of one-hot vectors, it is calculated as

    y .* (1 - α) .+ α / size(y, dims)

    When y is given as a number or a batch of numbers for binary classification, it is calculated as

    y .* (1 - α) .+ α / 2

    in which case the labels are squeezed towards 0.5.

    α is a number in the interval (0, 1) called the smoothing factor. The higher the value of α, the greater the smoothing of y.

    dims denotes the one-hot dimension, unless dims=0 which denotes the application of label smoothing to binary distributions encoded in a single number.

    Example

    julia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)
    +2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
    + ⋅  ⋅  ⋅  1  ⋅  1
    + 1  1  1  ⋅  1  ⋅
    +
    +julia> y_smoothed = Flux.label_smoothing(y, 0.2f0)
    +2×6 Matrix{Float32}:
    + 0.1  0.1  0.1  0.9  0.1  0.9
    + 0.9  0.9  0.9  0.1  0.9  0.1
    +
    +julia> y_sim = softmax(y .* log(2f0))
    +2×6 Matrix{Float32}:
    + 0.333333  0.333333  0.333333  0.666667  0.333333  0.666667
    + 0.666667  0.666667  0.666667  0.333333  0.666667  0.333333
    +
    +julia> y_dis = vcat(y_sim[2,:]', y_sim[1,:]')
    +2×6 Matrix{Float32}:
    + 0.666667  0.666667  0.666667  0.333333  0.666667  0.333333
    + 0.333333  0.333333  0.333333  0.666667  0.333333  0.666667
    +
    +julia> Flux.crossentropy(y_sim, y) < Flux.crossentropy(y_sim, y_smoothed)
    +true
    +
    +julia> Flux.crossentropy(y_dis, y) > Flux.crossentropy(y_dis, y_smoothed)
    +true
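    For the binary case described above, dims=0 applies the second formula, squeezing a scalar label towards 0.5. A one-line sketch (not part of the original docstring):

    ```julia
    julia> Flux.label_smoothing(1f0, 0.2f0; dims=0) ≈ 0.9f0  # 1 * (1 - 0.2) + 0.2 / 2
    true
    ```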
    source
    Flux.Losses.crossentropyFunction
    crossentropy(ŷ, y; dims = 1, eps = eps(eltype(ŷ)), agg = mean)

    Return the cross entropy between the given probability distributions; calculated as

    agg(-sum(y .* log.(ŷ .+ ϵ); dims))

    Cross entropy is typically used as a loss in multi-class classification, in which case the labels y are given in a one-hot format. dims specifies the dimension (or the dimensions) containing the class probabilities. The prediction is supposed to sum to one across dims, as would be the case with the output of a softmax operation.

    For numerical stability, it is recommended to use logitcrossentropy rather than softmax followed by crossentropy.

    Use label_smoothing to smooth the true labels as preprocessing before computing the loss.

    See also: logitcrossentropy, binarycrossentropy, logitbinarycrossentropy.

    Example

    julia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)
    +3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
    + 1  ⋅  ⋅  ⋅  1
    + ⋅  1  ⋅  1  ⋅
    + ⋅  ⋅  1  ⋅  ⋅
    +
    +julia> y_model = softmax(reshape(-7:7, 3, 5) .* 1f0)
    +3×5 Matrix{Float32}:
    + 0.0900306  0.0900306  0.0900306  0.0900306  0.0900306
    + 0.244728   0.244728   0.244728   0.244728   0.244728
    + 0.665241   0.665241   0.665241   0.665241   0.665241
    +
    +julia> sum(y_model; dims=1)
    +1×5 Matrix{Float32}:
    + 1.0  1.0  1.0  1.0  1.0
    +
    +julia> Flux.crossentropy(y_model, y_label)
    +1.6076053f0
    +
    +julia> 5 * ans ≈ Flux.crossentropy(y_model, y_label; agg=sum)
    +true
    +
    +julia> y_smooth = Flux.label_smoothing(y_label, 0.15f0)
    +3×5 Matrix{Float32}:
    + 0.9   0.05  0.05  0.05  0.9
    + 0.05  0.9   0.05  0.9   0.05
    + 0.05  0.05  0.9   0.05  0.05
    +
    +julia> Flux.crossentropy(y_model, y_smooth)
    +1.5776052f0
    source
    Flux.Losses.logitcrossentropyFunction
    logitcrossentropy(ŷ, y; dims = 1, agg = mean)

    Return the cross entropy calculated by

    agg(-sum(y .* logsoftmax(ŷ; dims); dims))

    This is mathematically equivalent to crossentropy(softmax(ŷ), y), but is more numerically stable than using functions crossentropy and softmax separately.

    See also: binarycrossentropy, logitbinarycrossentropy, label_smoothing.

    Example

    julia> y_label = Flux.onehotbatch(collect("abcabaa"), 'a':'c')
    +3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
    + 1  ⋅  ⋅  1  ⋅  1  1
    + ⋅  1  ⋅  ⋅  1  ⋅  ⋅
    + ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅
    +
    +julia> y_model = reshape(vcat(-9:0, 0:9, 7.5f0), 3, 7)
    +3×7 Matrix{Float32}:
    + -9.0  -6.0  -3.0  0.0  2.0  5.0  8.0
    + -8.0  -5.0  -2.0  0.0  3.0  6.0  9.0
    + -7.0  -4.0  -1.0  1.0  4.0  7.0  7.5
    +
    +julia> Flux.logitcrossentropy(y_model, y_label)
    +1.5791205f0
    +
    +julia> Flux.crossentropy(softmax(y_model), y_label)
    +1.5791197f0
    source
    Flux.Losses.binarycrossentropyFunction
    binarycrossentropy(ŷ, y; agg = mean, eps = eps(eltype(ŷ)))

    Return the binary cross-entropy loss, computed as

    agg(@.(-y * log(ŷ + ϵ) - (1 - y) * log(1 - ŷ + ϵ)))

    Typically, the prediction ŷ is given by the output of a sigmoid activation. The ϵ == eps term is included to avoid infinity. Using logitbinarycrossentropy is recommended over binarycrossentropy for numerical stability.

    Use label_smoothing to smooth the y value as preprocessing before computing the loss.

    See also: crossentropy, logitcrossentropy.

    Examples

    julia> y_bin = Bool[1,0,1]
    +3-element Vector{Bool}:
    + 1
    + 0
    + 1
    +
    +julia> y_prob = softmax(reshape(vcat(1:3, 3:5), 2, 3) .* 1f0)
    +2×3 Matrix{Float32}:
    + 0.268941  0.5  0.268941
    + 0.731059  0.5  0.731059
    +
    +julia> Flux.binarycrossentropy(y_prob[2,:], y_bin)
    +0.43989f0
    +
    +julia> all(p -> 0 < p < 1, y_prob[2,:])  # else DomainError
    +true
    +
    +julia> y_hot = Flux.onehotbatch(y_bin, 0:1)
    +2×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
    + ⋅  1  ⋅
    + 1  ⋅  1
    +
    +julia> Flux.crossentropy(y_prob, y_hot)
    +0.43989f0
    source
    Flux.Losses.logitbinarycrossentropyFunction
    logitbinarycrossentropy(ŷ, y; agg = mean)

    Mathematically equivalent to binarycrossentropy(σ(ŷ), y) but is more numerically stable.

    See also: crossentropy, logitcrossentropy.

    Examples

    julia> y_bin = Bool[1,0,1];
    +
    +julia> y_model = Float32[2, -1, pi]
    +3-element Vector{Float32}:
    +  2.0
    + -1.0
    +  3.1415927
    +
    +julia> Flux.logitbinarycrossentropy(y_model, y_bin)
    +0.160832f0
    +
    +julia> Flux.binarycrossentropy(sigmoid.(y_model), y_bin)
    +0.16083185f0
    source
    Flux.Losses.kldivergenceFunction
    kldivergence(ŷ, y; agg = mean, eps = eps(eltype(ŷ)))

    Return the Kullback-Leibler divergence between the given probability distributions.

    The KL divergence is a measure of how much one probability distribution is different from the other. It is always non-negative, and zero only when both the distributions are equal.

    Example

    julia> p1 = [1 0; 0 1]
    +2×2 Matrix{Int64}:
    + 1  0
    + 0  1
    +
    +julia> p2 = fill(0.5, 2, 2)
    +2×2 Matrix{Float64}:
    + 0.5  0.5
    + 0.5  0.5
    +
    +julia> Flux.kldivergence(p2, p1) ≈ log(2)
    +true
    +
    +julia> Flux.kldivergence(p2, p1; agg = sum) ≈ 2log(2)
    +true
    +
    +julia> Flux.kldivergence(p2, p2; eps = 0)  # about -2e-16 with the regulator
    +0.0
    +
    +julia> Flux.kldivergence(p1, p2; eps = 0)  # about 17.3 with the regulator
    +Inf
    source
    Flux.Losses.poisson_lossFunction
    poisson_loss(ŷ, y; agg = mean)

    Return how much the predicted distribution ŷ diverges from the expected Poisson distribution y; calculated as

    sum(ŷ .- y .* log.(ŷ)) / size(y, 2)

    More information.

    Example

    julia> y_model = [1, 3, 3];  # data should only take integral values
    +
    +julia> Flux.poisson_loss(y_model, 1:3)
    +0.5023128522198171
    source
    Flux.Losses.hinge_lossFunction
    hinge_loss(ŷ, y; agg = mean)

    Return the hinge loss given the prediction ŷ and true labels y (containing 1 or -1); calculated as

    sum(max.(0, 1 .- ŷ .* y)) / size(y, 2)

    Usually used with classifiers like Support Vector Machines. See also: squared_hinge_loss

    Example

    julia> y_true = [1, -1, 1, 1];
    +
    +julia> y_pred = [0.1, 0.3, 1, 1.5];
    +
    +julia> Flux.hinge_loss(y_pred, y_true)
    +0.55
    +
    +julia> Flux.hinge_loss(y_pred[1], y_true[1]) != 0  # same sign but |ŷ| < 1
    +true
    +
    +julia> Flux.hinge_loss(y_pred[end], y_true[end]) == 0  # same sign but |ŷ| >= 1
    +true
    +
    +julia> Flux.hinge_loss(y_pred[2], y_true[2]) != 0 # opposite signs
    +true
    source
    Flux.Losses.squared_hinge_lossFunction
    squared_hinge_loss(ŷ, y)

    Return the squared hinge loss given the prediction ŷ and true labels y (containing 1 or -1); calculated as

    sum((max.(0, 1 .- ŷ .* y)).^2) / size(y, 2)

    Usually used with classifiers like Support Vector Machines. See also: hinge_loss

    Example

    julia> y_true = [1, -1, 1, 1];
    +
    +julia> y_pred = [0.1, 0.3, 1, 1.5];
    +
    +julia> Flux.squared_hinge_loss(y_pred, y_true)
    +0.625
    +
    +julia> Flux.squared_hinge_loss(y_pred[1], y_true[1]) != 0
    +true
    +
    +julia> Flux.squared_hinge_loss(y_pred[end], y_true[end]) == 0
    +true
    +
    +julia> Flux.squared_hinge_loss(y_pred[2], y_true[2]) != 0
    +true
    source
    Flux.Losses.dice_coeff_lossFunction
    dice_coeff_loss(ŷ, y; smooth = 1)

    Return a loss based on the dice coefficient. Used in the V-Net image segmentation architecture. The dice coefficient is similar to the F1 score. Loss calculated as:

    1 - 2*sum(|ŷ .* y| + smooth) / (sum(ŷ.^2) + sum(y.^2) + smooth)

    Example

    julia> y_pred = [1.1, 2.1, 3.1];
    +
    +julia> Flux.dice_coeff_loss(y_pred, 1:3)
    +0.000992391663909964
    +
    +julia> 1 - Flux.dice_coeff_loss(y_pred, 1:3)  # ~ F1 score for image segmentation
    +0.99900760833609
    source
    Flux.Losses.tversky_lossFunction
    tversky_loss(ŷ, y; beta = 0.7)

    Return the Tversky loss. Used with imbalanced data to give more weight to false negatives. A larger β == beta weighs recall more than precision (by placing more emphasis on false negatives). Calculated as:

    1 - sum(|y .* ŷ| + 1) / (sum(y .* ŷ + (1 - β)*(1 .- y) .* ŷ + β*y .* (1 .- ŷ)) + 1)
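    A brief sketch of its behaviour (the example values are assumptions, not from the original docs):

    ```julia
    julia> y = [1.0 0.0; 0.0 1.0];

    julia> Flux.tversky_loss(y, y)  # a perfect prediction gives zero loss
    0.0

    julia> Flux.tversky_loss([0.9 0.1; 0.2 0.8], y) < Flux.tversky_loss([0.6 0.4; 0.4 0.6], y)  # worse predictions, larger loss
    true
    ```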
    source
    Flux.Losses.binary_focal_lossFunction
    binary_focal_loss(ŷ, y; agg=mean, gamma=2, eps=eps(eltype(ŷ)))

    Return the binary focal loss. The input ŷ is expected to be normalized (i.e. softmax output).

    For gamma = 0, the loss is mathematically equivalent to Losses.binarycrossentropy.

    See also: Losses.focal_loss for multi-class setting

    Example

    julia> y = [0  1  0
    +            1  0  1]
    +2×3 Matrix{Int64}:
    + 0  1  0
    + 1  0  1
    +
    +julia> ŷ = [0.268941  0.5  0.268941
    +            0.731059  0.5  0.731059]
    +2×3 Matrix{Float64}:
    + 0.268941  0.5  0.268941
    + 0.731059  0.5  0.731059
    +
    +julia> Flux.binary_focal_loss(ŷ, y) ≈ 0.0728675615927385
    +true
    source
    Flux.Losses.focal_lossFunction
    focal_loss(ŷ, y; dims=1, agg=mean, gamma=2, eps=eps(eltype(ŷ)))

    Return the focal_loss which can be used in classification tasks with highly imbalanced classes. It down-weights well-classified examples and focuses on hard examples. The input ŷ is expected to be normalized (i.e. softmax output).

    The modulating factor, γ == gamma, controls the down-weighting strength. For γ == 0, the loss is mathematically equivalent to Losses.crossentropy.

    Example

    julia> y = [1  0  0  0  1
    +            0  1  0  1  0
    +            0  0  1  0  0]
    +3×5 Matrix{Int64}:
    + 1  0  0  0  1
    + 0  1  0  1  0
    + 0  0  1  0  0
    +
    +julia> ŷ = softmax(reshape(-7:7, 3, 5) .* 1f0)
    +3×5 Matrix{Float32}:
    + 0.0900306  0.0900306  0.0900306  0.0900306  0.0900306
    + 0.244728   0.244728   0.244728   0.244728   0.244728
    + 0.665241   0.665241   0.665241   0.665241   0.665241
    +
    +julia> Flux.focal_loss(ŷ, y) ≈ 1.1277571935622628
    +true

    See also: Losses.binary_focal_loss for binary (not one-hot) labels

    source
    Flux.Losses.siamese_contrastive_lossFunction
    siamese_contrastive_loss(ŷ, y; margin = 1, agg = mean)

    Return the contrastive loss which can be useful for training Siamese Networks. It is given by

    agg(@. (1 - y) * ŷ^2 + y * max(0, margin - ŷ)^2)

    Specify margin to set the baseline for distance at which pairs are dissimilar.

    Example

    julia> ŷ = [0.5, 1.5, 2.5];
    +
    +julia> Flux.siamese_contrastive_loss(ŷ, 1:3)
    +-4.833333333333333
    +
    +julia> Flux.siamese_contrastive_loss(ŷ, 1:3, margin = 2)
    +-4.0
    source
    Low-level Operations – NNlib.jl · Flux

    Neural Network primitives from NNlib.jl

    Flux re-exports all of the functions exported by the NNlib package. This includes activation functions, described on their own page. Many of the functions on this page exist primarily as the internal implementation of Flux layers, but can also be used independently.

    Attention

    Primitives for the MultiHeadAttention layer.

    NNlib.dot_product_attentionFunction
    dot_product_attention(query, key, value, [bias]; [fdrop, mask, nheads])

    Multihead dot product attention used in transformer architectures.

    The input arrays must have the first two dimensions given by the number of features and the sequence length, then an arbitrary number of batch dimensions or none.

    Returns the attention output array of size (v_dim, q_len, batch_size...) and the attention scores of size (kv_len, q_len, nheads, batch_size...).

    See also dot_product_attention_scores if you only need the attention scores.

    Arguments

    • query: Query array of size (qk_dim, q_len, batch_size...).
    • key: Key array of size (qk_dim, kv_len, batch_size...).
    • value: Value array of size (v_dim, kv_len, batch_size...).
    • bias: Either nothing or an array broadcastable to size (kv_len, q_len, nheads, batch_size). It will be added to the attention scores before applying the softmax. Default nothing.
    • fdrop: A dropout function or layer to be applied on the attention scores right after the softmax. Default identity (no dropout).
    • mask: Either nothing or a boolean array broadcastable to size (kv_len, q_len, nheads, batch_size). The mask is applied to the attention scores just before the softmax. See make_causal_mask for creating causal masks. Default nothing.
    • nheads: Number of heads to split the input arrays into. Default 1.

    Examples

    q, k, v = rand(10, 20, 2), rand(10, 30, 2), rand(20, 30, 2)
    +y, α = dot_product_attention(q, k, v)
    NNlib.make_causal_maskFunction
    make_causal_mask(x, dims=2)

    Return a boolean square matrix m of the same type as x and of side size(x, dims). Its elements are set such that m[i, j] == i ≤ j.

    Can be used to mask the attention scores in dot_product_attention.
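    A short sketch of how this is typically used together with dot_product_attention (the sizes here are illustrative):

    ```julia
    using NNlib

    x = rand(Float32, 4, 5)       # features × sequence length
    m = make_causal_mask(x)       # 5×5 Bool matrix, m[i, j] == (i <= j)

    # passing it as the mask lets each query position attend only to
    # earlier (and current) key positions:
    q = k = v = rand(Float32, 4, 5)
    y, α = dot_product_attention(q, k, v; mask = m)
    ```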

    Softmax

    Flux's Flux.logitcrossentropy uses NNlib.logsoftmax internally.

    NNlib.softmaxFunction
    softmax(x; dims = 1)

    Softmax turns input array x into probability distributions that sum to 1 along the dimensions specified by dims. It is semantically equivalent to the following:

    softmax(x; dims = 1) = exp.(x) ./ sum(exp.(x), dims = dims)

    with additional manipulations enhancing numerical stability.

    For a matrix input x it will by default (dims = 1) treat it as a batch of vectors, with each column independent. Keyword dims = 2 will instead treat rows independently, and so on.

    See also logsoftmax.

    Examples

    julia> softmax([1, 2, 3])
    +3-element Vector{Float64}:
    + 0.09003057317038046
    + 0.24472847105479764
    + 0.6652409557748218
    +
    +julia> softmax([1 2 3; 2 2 2])  # dims=1
    +2×3 Matrix{Float64}:
    + 0.268941  0.5  0.731059
    + 0.731059  0.5  0.268941
    +
    +julia> softmax([1 2 3; 2 2 2]; dims=2)
    +2×3 Matrix{Float64}:
    + 0.0900306  0.244728  0.665241
    + 0.333333   0.333333  0.333333

    Note that, when used with Flux.jl, softmax must not be passed to layers like Dense which accept an activation function. The activation is broadcasted over the result, thus applies to individual numbers. But softmax always needs to see the whole column.

    julia> using Flux
    +
    +julia> x = randn(Float32, 4, 4, 3, 13);
    +
    +julia> model = Chain(Conv((4, 4), 3 => 8, tanh), Flux.flatten, Dense(8 => 7), softmax);
    +
    +julia> model(x) |> size
    +(7, 13)
    +
    +julia> Dense(4 => 7, softmax)(x)
    +ERROR: `softmax(x)` called with a number, but it expects an array. 
    NNlib.logsoftmaxFunction
    logsoftmax(x; dims = 1)

    Computes the log of softmax in a more numerically stable way than directly taking log.(softmax(xs)). Commonly used in computing cross entropy loss.

    It is semantically equivalent to the following:

    logsoftmax(x; dims = 1) = x .- log.(sum(exp.(x), dims = dims))

    See also softmax.
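    A quick sketch of the equivalence, up to floating-point error:

    ```julia
    using NNlib

    x = [1.0, 2.0, 3.0]
    logsoftmax(x) ≈ log.(softmax(x))       # same result, computed more stably
    all(isfinite, logsoftmax(x .+ 1000))   # stays finite where exp.(x) would overflow
    ```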

    Pooling

    Flux's AdaptiveMaxPool, AdaptiveMeanPool, GlobalMaxPool, GlobalMeanPool, MaxPool, and MeanPool use NNlib.PoolDims, NNlib.maxpool, and NNlib.meanpool as their backend.

    NNlib.PoolDimsType
    PoolDims(x_size::NTuple{M}, k::Union{NTuple{L, Int}, Int};
    +        stride=k, padding=0, dilation=1)  where {M, L}

    Dimensions for a "pooling" operation that can have an arbitrary input size, kernel size, stride, dilation, and channel count. Used to dispatch onto efficient implementations at compile-time.

    NNlib.lpnormpoolFunction
    lpnormpool(x, p::Real, k::NTuple{N, Integer}; pad=0, stride=k)

    Perform Lp pool operation with value of the Lp norm p and window size k on input tensor x, also known as LPPool in PyTorch. This pooling operator comes from the paper Learned-Norm Pooling for Deep Feedforward and Recurrent Neural Networks.

    Arguments:

    • x and k: Expects ndim(x) ∈ 3:5, and always length(k) == ndim(x) - 2
    • p is restricted to 0 < p < Inf.
    • pad: See pad_zeros for details.
    • stride: Either a tuple with the same length as k, or one integer for all directions. Default is k.

    For all elements x in a size k window, lpnormpool computes (∑ᵢ xᵢ^p)^(1 / p) as an element of the output.

    Thus lpnormpool(x, 1, k) ./ prod(k) ≈ meanpool(x, k) and lpnormpool(x, 2, k).^2 ./ prod(k) ≈ meanpool(x.^2, k).
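    The identities above can be checked on a tiny example (sizes chosen arbitrarily):

    ```julia
    using NNlib

    x = reshape(Float32.(1:8), 8, 1, 1)      # (width, channels, batch)
    lpnormpool(x, 1, (2,)) ./ 2 ≈ meanpool(x, (2,))        # true for x .>= 0
    lpnormpool(x, 2, (2,)).^2 ./ 2 ≈ meanpool(x.^2, (2,))  # true
    ```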

    NNlib.maxpoolFunction
    maxpool(x, k::NTuple{N, Integer}; pad=0, stride=k)

    Perform max pool operation with window size k on input tensor x.

    Arguments:

    • x and k: Expects ndim(x) ∈ 3:5, and always length(k) == ndim(x) - 2
    • pad: See pad_zeros for details.
    • stride: Either a tuple with the same length as k, or one integer for all directions. Default is k.
    NNlib.meanpoolFunction
    meanpool(x, k::NTuple{N, Integer}; pad=0, stride=k)

    Perform mean pool operation with window size k on input tensor x.

    Arguments:

    • x and k: Expects ndim(x) ∈ 3:5, and always length(k) == ndim(x) - 2
    • pad: See pad_zeros for details.
    • stride: Either a tuple with the same length as k, or one integer for all directions. Default is k.
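    For instance, on a one-dimensional signal (window 2, default stride 2):

    ```julia
    using NNlib

    x = reshape(Float32[1, 3, 2, 8, 5, 5], 6, 1, 1)  # (width, channels, batch)
    vec(maxpool(x, (2,)))    # windows (1,3) (2,8) (5,5) → [3.0, 8.0, 5.0]
    vec(meanpool(x, (2,)))   #                          → [2.0, 5.0, 5.0]
    ```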

    Padding

    NNlib.pad_circularFunction
    pad_circular(x, pad::Tuple; [dims])
    +pad_circular(x, pad::Int; [dims])

    Pad the array x "circularly" across the border by wrapping around values from the opposite side of x.

    pad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.

    If pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension).

    The pad length on either side in any dimension must not exceed the size of x in that dimension, i.e. pad_circular is not able to create arbitrarily sized tilings of x.

    See also pad_repeat, pad_reflect, pad_symmetric, and pad_constant.

    julia> r = reshape(1:9, 3, 3)
    +3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:
    + 1  4  7
    + 2  5  8
    + 3  6  9
    +
    +julia> pad_circular(r, (1,2,1,2))
    +6×6 Matrix{Int64}:
    + 9  3  6  9  3  6
    + 7  1  4  7  1  4
    + 8  2  5  8  2  5
    + 9  3  6  9  3  6
    + 7  1  4  7  1  4
    + 8  2  5  8  2  5
    NNlib.pad_constantFunction
    pad_constant(x, pad::Tuple, val = 0; [dims = :])
    +pad_constant(x, pad::Int, val = 0; [dims = :])

    Pad the array x with the constant value val.

    pad can be a tuple of integers. If it has length 2 * length(dims), it specifies the left and right padding size for each of the dimensions in dims as (l1, r1, ..., ln, rn). If supplied with a tuple of length length(dims) instead, it applies symmetric padding. If dims is not given, it defaults to all dimensions.

    For integer pad input, it is applied on both sides on every dimension in dims.

    See also pad_zeros, pad_repeat, pad_reflect, pad_symmetric, and pad_circular.

    julia> r = reshape(1:4, 2, 2)
    +2×2 reshape(::UnitRange{Int64}, 2, 2) with eltype Int64:
    + 1  3
    + 2  4
    +
    +julia> pad_constant(r, (1, 2, 3, 4), 8)
    +5×9 Matrix{Int64}:
    + 8  8  8  8  8  8  8  8  8
    + 8  8  8  1  3  8  8  8  8
    + 8  8  8  2  4  8  8  8  8
    + 8  8  8  8  8  8  8  8  8
    + 8  8  8  8  8  8  8  8  8
    +
    +julia> pad_constant(r, 1, 8)
    +4×4 Matrix{Int64}:
    + 8  8  8  8
    + 8  1  3  8
    + 8  2  4  8
    + 8  8  8  8
    +
    +julia> r = reshape(1:27, 3, 3, 3)
    +3×3×3 reshape(::UnitRange{Int64}, 3, 3, 3) with eltype Int64:
    +[:, :, 1] =
    + 1  4  7
    + 2  5  8
    + 3  6  9
    +
    +[:, :, 2] =
    + 10  13  16
    + 11  14  17
    + 12  15  18
    +
    +[:, :, 3] =
    + 19  22  25
    + 20  23  26
    + 21  24  27
    +
    +julia> pad_constant(r, (2,1), dims = 1) # asymmetric padding
    +6×3×3 Array{Int64, 3}:
    +[:, :, 1] =
    + 0  0  0
    + 0  0  0
    + 1  4  7
    + 2  5  8
    + 3  6  9
    + 0  0  0
    +
    +[:, :, 2] =
    +  0   0   0
    +  0   0   0
    + 10  13  16
    + 11  14  17
    + 12  15  18
    +  0   0   0
    +
    +[:, :, 3] =
    +  0   0   0
    +  0   0   0
    + 19  22  25
    + 20  23  26
    + 21  24  27
    +  0   0   0
    +
    +julia> pad_constant(r, (2,1, 3), dims = (1,2)) # padding must always be either the same length as dims, or double it
    +ERROR: ArgumentError: Could not parse padding (2, 1, 3) and dims (1, 2)
    +Stacktrace:
    +[...]
    NNlib.pad_reflectFunction
    pad_reflect(x, pad::Tuple; [dims])
    +pad_reflect(x, pad::Int; [dims])

    Pad the array x reflecting its values across the border.

    pad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.

    If pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension).

    See also pad_repeat, pad_symmetric, pad_circular, and pad_constant.

    julia> r = reshape(1:9, 3, 3)
    +3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:
    + 1  4  7
    + 2  5  8
    + 3  6  9
    +
    +julia> pad_reflect(r, (1,2,1,2))
    +6×6 Matrix{Int64}:
    + 5  2  5  8  5  2
    + 4  1  4  7  4  1
    + 5  2  5  8  5  2
    + 6  3  6  9  6  3
    + 5  2  5  8  5  2
    + 4  1  4  7  4  1
    NNlib.pad_repeatFunction
    pad_repeat(x, pad::Tuple; [dims])
    +pad_repeat(x, pad::Int; [dims])

    Pad the array x repeating the values on the border.

    pad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.

    If pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension).

    See also pad_reflect, pad_symmetric, pad_circular, and pad_constant.

    julia> r = reshape(1:9, 3, 3)
    +3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:
    + 1  4  7
    + 2  5  8
    + 3  6  9
    +
    +julia> pad_repeat(r, (1,2,3,4))
    +6×10 Matrix{Int64}:
    + 1  1  1  1  4  7  7  7  7  7
    + 1  1  1  1  4  7  7  7  7  7
    + 2  2  2  2  5  8  8  8  8  8
    + 3  3  3  3  6  9  9  9  9  9
    + 3  3  3  3  6  9  9  9  9  9
    + 3  3  3  3  6  9  9  9  9  9
    NNlib.pad_symmetricFunction
    pad_symmetric(x, pad::Tuple; [dims])
    +pad_symmetric(x, pad::Int; [dims])

    Pad the array x reflecting its values symmetrically across the border, i.e. the border values of x are present in the padding values, in contrast to pad_reflect.

    pad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.

    If pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension).

    See also pad_repeat, pad_reflect, pad_circular, and pad_constant.

    julia> r = reshape(1:9, 3, 3)
    +3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:
    + 1  4  7
    + 2  5  8
    + 3  6  9
    +
    +julia> pad_symmetric(r, (1,2,1,2))
    +6×6 Matrix{Int64}:
    + 1  1  4  7  7  4
    + 1  1  4  7  7  4
    + 2  2  5  8  8  5
    + 3  3  6  9  9  6
    + 3  3  6  9  9  6
    + 2  2  5  8  8  5
    NNlib.pad_zerosFunction
    pad_zeros(x, pad::Tuple; [dims])
    +pad_zeros(x, pad::Int; [dims])

    Pad the array x with zeros. Equivalent to pad_constant with the constant equal to 0.
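    For example (as with pad_constant, an integer pad is applied to all dimensions by default):

    ```julia
    julia> r = reshape(1:4, 2, 2);

    julia> pad_zeros(r, 1)
    4×4 Matrix{Int64}:
     0  0  0  0
     0  1  3  0
     0  2  4  0
     0  0  0  0
    ```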

    Convolution

    Flux's Conv and CrossCor layers use NNlib.DenseConvDims and NNlib.conv internally.

    NNlib.convFunction
    conv(x, w; stride = 1, pad = 0, dilation = 1, flipped = false, groups = 1)

    Apply convolution filter w to input x. x and w are 3d/4d/5d tensors in 1d/2d/3d convolutions respectively. x and w may have real or complex element types.
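    A minimal 1-d sketch, using a kernel of ones so that the flipped keyword makes no difference:

    ```julia
    using NNlib

    x = reshape(Float32.(1:5), 5, 1, 1)   # (width, in channels, batch)
    w = ones(Float32, 2, 1, 1)            # (kernel width, in channels, out channels)
    vec(conv(x, w))                       # sliding window sums: [3.0, 5.0, 7.0, 9.0]
    ```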

    NNlib.ConvDimsType
    ConvDims

    Type system-level information about convolution dimensions. Critical for things like im2col!() to generate efficient code, and helpful to reduce the number of kwargs getting passed around.

    NNlib.depthwiseconvFunction
    depthwiseconv(x, w; stride=1, pad=0, dilation=1, flipped=false)

    Depthwise convolution operation with filter w on input x. x and w are 3d/4d/5d tensors in 1d/2d/3d convolutions respectively.

    NNlib.DepthwiseConvDimsType
    DepthwiseConvDims

    Concrete subclass of ConvDims for a depthwise convolution. Differs primarily due to characterization by Cin, Cmult, rather than Cin, Cout. Useful to be separate from DenseConvDims primarily for channel calculation differences.

    NNlib.DenseConvDimsType
    DenseConvDims

    Concrete subclass of ConvDims for a normal, dense, conv2d/conv3d.

    Dropout

    NNlib.dropoutFunction
    dropout([rng], A, p; [dims])

    Returns an array in which each element of A is either replaced with zero, with probability p, or else multiplied by 1/(1-p).

    By default every element is treated independently. With keyword dims=1, a choice is made for every value of the 1st index i.e. each row of a matrix is either zero or not.

    Optional first argument is the random number generator used.

    Examples

    julia> dropout(ones(2, 10), 0.2)
    +2×10 Matrix{Float64}:
    + 1.25  1.25  0.0   1.25  1.25  1.25  1.25  1.25  1.25  1.25
    + 1.25  1.25  1.25  0.0   1.25  1.25  0.0   1.25  1.25  1.25
    +
    +julia> mean(dropout(ones(10^4, 5), 0.2), dims=1)
    +1×5 Matrix{Float64}:
    + 0.998  1.00075  0.99125  0.99575  1.00075
    +
    +julia> dropout(ones(5, 5), 0.7, dims=1)  # whole row the same
    +5×5 Matrix{Float64}:
    + 3.33333  3.33333  3.33333  3.33333  3.33333
    + 0.0      0.0      0.0      0.0      0.0
    + 0.0      0.0      0.0      0.0      0.0
    + 3.33333  3.33333  3.33333  3.33333  3.33333
    + 0.0      0.0      0.0      0.0      0.0
    +
    +julia> mean(dropout(ones(10^4, 5), 0.3, dims=1), dims=1)
    +1×5 Matrix{Float64}:
    + 1.00571  1.00571  1.00571  1.00571  1.00571
    NNlib.dropout!Function
    dropout!(B, A, p; [dims])

    This does exactly B .= dropout(A, p; dims), or rather, it's the implementation of out-of-place dropout.

    Upsampling

    Flux's Upsample layer uses NNlib.upsample_nearest, NNlib.upsample_bilinear, and NNlib.upsample_trilinear as its backend. Additionally, Flux's PixelShuffle layer uses NNlib.pixel_shuffle as its backend.

    NNlib.upsample_nearestFunction
    upsample_nearest(x, scale::NTuple{S,Int})
    +upsample_nearest(x; size::NTuple{S,Int})

    Upsamples the array x by integer multiples along the first S dimensions. Subsequent dimensions of x are not altered.

    Either the scale factors or the final output size can be specified.

    See also upsample_bilinear, for two dimensions of an N=4 array.

    Example

    julia> upsample_nearest([1 2 3; 4 5 6], (2, 3))
    +4×9 Matrix{Int64}:
    + 1  1  1  2  2  2  3  3  3
    + 1  1  1  2  2  2  3  3  3
    + 4  4  4  5  5  5  6  6  6
    + 4  4  4  5  5  5  6  6  6
    +
    +julia> ans == upsample_nearest([1 2 3; 4 5 6]; size=(4, 9))  # equivalent
    +true
    +
    +julia> upsample_nearest([1 2 3; 4 5 6], (2,))
    +4×3 Matrix{Int64}:
    + 1  2  3
    + 1  2  3
    + 4  5  6
    + 4  5  6
    +
    +julia> ans == upsample_nearest([1 2 3; 4 5 6], size=(4,))
    +true
    NNlib.upsample_linearFunction
    upsample_linear(x::AbstractArray{T,3}, scale::Real; align_corners::Bool = true)
    +upsample_linear(x::AbstractArray{T,3}; size::Integer, align_corners::Bool = true)

    Upsamples the first dimension of the array x by the provided scale, using linear interpolation. As an alternative to using scale, the resulting array size can be directly specified with a keyword argument.

    The size of the output is equal to (scale*S1, S2, S3), where S1, S2, S3 = size(x).
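    A small sketch (the exact interpolated values depend on align_corners; only the output size is asserted here):

    ```julia
    using NNlib

    x = reshape(Float32[1, 2, 3], 3, 1, 1)   # (width, channels, batch)
    y = upsample_linear(x, 2)                # linear interpolation along dim 1
    size(y)                                  # (6, 1, 1)
    ```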

    NNlib.∇upsample_linearFunction
    ∇upsample_linear(Δ::AbstractArray{T,3}; size::Integer, align_corners::Bool = true) where T

    Arguments

    • Δ: Incoming gradient array, backpropagated from downstream layers
    • size: Size of the image upsampled in the first place

    Outputs

    • dx: Downsampled version of Δ
    NNlib.upsample_bilinearFunction
    upsample_bilinear(x::AbstractArray{T,4}, scale::NTuple{2,Real}; align_corners::Bool = true)
    +upsample_bilinear(x::AbstractArray{T,4}; size::NTuple{2,Integer}, align_corners::Bool = true)

    Upsamples the first 2 dimensions of the array x by the upsample factors stored in scale, using bilinear interpolation. As an alternative to using scale, the resulting image size can be directly specified with a keyword argument.

    The size of the output is equal to (scale[1]*S1, scale[2]*S2, S3, S4), where S1, S2, S3, S4 = size(x).

    Examples

    julia> x = reshape(Float32[1 2 3; 4 5 6], (2,3,1,1))
    +2×3×1×1 Array{Float32, 4}:
    +[:, :, 1, 1] =
    + 1.0  2.0  3.0
    + 4.0  5.0  6.0
    +
    +julia> upsample_bilinear(x, (2, 3))
    +4×9×1×1 Array{Float32, 4}:
    +[:, :, 1, 1] =
    + 1.0  1.25  1.5  1.75  2.0  2.25  2.5  2.75  3.0
    + 2.0  2.25  2.5  2.75  3.0  3.25  3.5  3.75  4.0
    + 3.0  3.25  3.5  3.75  4.0  4.25  4.5  4.75  5.0
    + 4.0  4.25  4.5  4.75  5.0  5.25  5.5  5.75  6.0
    +
    +julia> ans == upsample_bilinear(x; size=(4, 9))  # specify output size instead
    +true
    +
    +julia> upsample_bilinear(x, (2.5, 3.5))  # non-integer scaling factors are allowed
    +5×10×1×1 Array{Float32, 4}:
    +[:, :, 1, 1] =
    + 1.0   1.22222  1.44444  1.66667  1.88889  …  2.33333  2.55556  2.77778  3.0
    + 1.75  1.97222  2.19444  2.41667  2.63889     3.08333  3.30556  3.52778  3.75
    + 2.5   2.72222  2.94444  3.16667  3.38889     3.83333  4.05556  4.27778  4.5
    + 3.25  3.47222  3.69444  3.91667  4.13889     4.58333  4.80556  5.02778  5.25
    + 4.0   4.22222  4.44444  4.66667  4.88889     5.33333  5.55556  5.77778  6.0
    NNlib.∇upsample_bilinearFunction
    ∇upsample_bilinear(Δ::AbstractArray{T,4}; size::NTuple{2,Integer}, align_corners::Bool = true) where T

    Arguments

    • Δ: Incoming gradient array, backpropagated from downstream layers
    • size: Lateral (W,H) size of the image upsampled in the first place

    Outputs

    • dx: Downsampled version of Δ
    NNlib.upsample_trilinearFunction
    upsample_trilinear(x::AbstractArray{T,5}, scale::NTuple{3,Real}; align_corners::Bool = true)
    +upsample_trilinear(x::AbstractArray{T,5}; size::NTuple{3,Integer}, align_corners::Bool = true)

    Upsamples the first 3 dimensions of the array x by the upsample factors stored in scale, using trilinear interpolation. As an alternative to using scale, the resulting image size can be directly specified with a keyword argument.

    The size of the output is equal to (scale[1]*S1, scale[2]*S2, scale[3]*S3, S4, S5), where S1, S2, S3, S4, S5 = size(x).

    Examples

    upsample_trilinear(x, (2, 3, 4))
    +upsample_trilinear(x; size=(4, 9, 11))  # specify output size instead
    +upsample_trilinear(x, (2.5, 3.5, pi))  # non-integer scaling factors are allowed
    NNlib.∇upsample_trilinearFunction
    ∇upsample_trilinear(Δ::AbstractArray{T,5}; size::NTuple{3,Integer}, align_corners::Bool = true) where T

    Arguments

    • Δ: Incoming gradient array, backpropagated from downstream layers
    • size: Lateral size & depth (W,H,D) of the image upsampled in the first place

    Outputs

    • dx: Downsampled version of Δ
    NNlib.pixel_shuffleFunction
    pixel_shuffle(x, r::Integer)

    Pixel shuffling operation, upscaling by a factor r.

    For 4-dimensional arrays representing N images, the operation converts an input of size(x) == (W, H, r^2*C, N) to output of size (r*W, r*H, C, N). For D-dimensional data, it expects ndims(x) == D+2 with channel and batch dimensions, and divides the number of channels by r^D.

    Used in super-resolution networks to upsample towards high resolution features. Reference: Shi et al., "Real-Time Single Image and Video Super-Resolution ...", CVPR 2016, https://arxiv.org/abs/1609.05158

    Examples

    julia> x = [10i + j + channel/10 for i in 1:2, j in 1:3, channel in 1:4, batch in 1:1]
    +2×3×4×1 Array{Float64, 4}:
    +[:, :, 1, 1] =
    + 11.1  12.1  13.1
    + 21.1  22.1  23.1
    +
    +[:, :, 2, 1] =
    + 11.2  12.2  13.2
    + 21.2  22.2  23.2
    +
    +[:, :, 3, 1] =
    + 11.3  12.3  13.3
    + 21.3  22.3  23.3
    +
    +[:, :, 4, 1] =
    + 11.4  12.4  13.4
    + 21.4  22.4  23.4
    +
    +julia> pixel_shuffle(x, 2)  # 4 channels used up as 2x upscaling of image dimensions
    +4×6×1×1 Array{Float64, 4}:
    +[:, :, 1, 1] =
    + 11.1  11.3  12.1  12.3  13.1  13.3
    + 11.2  11.4  12.2  12.4  13.2  13.4
    + 21.1  21.3  22.1  22.3  23.1  23.3
    + 21.2  21.4  22.2  22.4  23.2  23.4
    +
    +julia> y = [i + channel/10 for i in 1:3, channel in 1:6, batch in 1:1]
    +3×6×1 Array{Float64, 3}:
    +[:, :, 1] =
    + 1.1  1.2  1.3  1.4  1.5  1.6
    + 2.1  2.2  2.3  2.4  2.5  2.6
    + 3.1  3.2  3.3  3.4  3.5  3.6
    +
    +julia> pixel_shuffle(y, 2)  # 1D image, with 6 channels reduced to 3
    +6×3×1 Array{Float64, 3}:
    +[:, :, 1] =
    + 1.1  1.3  1.5
    + 1.2  1.4  1.6
    + 2.1  2.3  2.5
    + 2.2  2.4  2.6
    + 3.1  3.3  3.5
    + 3.2  3.4  3.6

    Batched Operations

    Flux's Flux.Bilinear layer uses NNlib.batched_mul internally.

    NNlib.batched_mulFunction
    batched_mul(A, B) -> C
    +A ⊠ B  # \boxtimes

    Batched matrix multiplication. Result has C[:,:,k...] == A[:,:,k...] * B[:,:,k...] where k... represent any indices in the last dimensions.

    If ndims(A) == ndims(B) == 3 and size(B,3) == 1 then instead C[:,:,k] == A[:,:,k] * B[:,:,1], and similarly for A.

    To transpose each matrix, apply batched_transpose to the array, or batched_adjoint for conjugate-transpose:

    julia> A, B = randn(2,5,17), randn(5,9,17);
    +
    +julia> A ⊠ B |> size
    +(2, 9, 17)
    +
    +julia> batched_adjoint(A) |> size
    +(5, 2, 17)
    +
    +julia> batched_mul(A, batched_adjoint(randn(9,5,17))) |> size
    +(2, 9, 17)
    +
    +julia> A ⊠ randn(5,9,1) |> size
    +(2, 9, 17)
    +
    +julia> batched_transpose(A) == PermutedDimsArray(A, (2,1,3))
    +true

    The equivalent PermutedDimsArray may be used in place of batched_transpose. Other permutations are also handled by BLAS, provided that the batch index k is not the first dimension of the underlying array. Thus PermutedDimsArray(::Array, (1,3,2)) and PermutedDimsArray(::Array, (3,1,2)) are fine.

    However, A = PermutedDimsArray(::Array, (3,2,1)) is not acceptable to BLAS, since the batch dimension is the contiguous one: stride(A,3) == 1. This will be copied, as doing so is faster than batched_mul_generic!.

    Both this copy and batched_mul_generic! produce @debug messages, and setting for instance ENV["JULIA_DEBUG"] = NNlib will display them.

    batched_mul(A::Array{T,3}, B::Matrix)
    +batched_mul(A::Matrix, B::Array{T,3})
    +A ⊠ B

    This is always matrix-matrix multiplication, but either A or B may lack a batch index.

    • When B is a matrix, result has C[:,:,k] == A[:,:,k] * B[:,:] for all k.

    • When A is a matrix, then C[:,:,k] == A[:,:] * B[:,:,k]. This can also be done by reshaping and calling *, for instance A ⊡ B using TensorCore.jl, but is implemented here using batched_gemm instead of gemm.

    julia> randn(16,8,32) ⊠ randn(8,4) |> size
    +(16, 4, 32)
    +
    +julia> randn(16,8,32) ⊠ randn(8,4,1) |> size  # equivalent
    +(16, 4, 32)
    +
    +julia> randn(16,8) ⊠ randn(8,4,32) |> size
    +(16, 4, 32)

    See also batched_vec to regard B as a batch of vectors, A[:,:,k] * B[:,k].

    NNlib.batched_mul!Function
    batched_mul!(C, A, B) -> C
    +batched_mul!(C, A, B, α=1, β=0)

    In-place batched matrix multiplication, equivalent to mul!(C[:,:,k], A[:,:,k], B[:,:,k], α, β) for all k. If size(B,3) == 1 then every batch uses B[:,:,1] instead.

    This will call batched_gemm! whenever possible. For real arrays this means that, for X ∈ [A,B,C], either strides(X,1) == 1 or strides(X,2) == 1 must hold; the latter may be caused by batched_transpose or by, for instance, PermutedDimsArray(::Array, (3,1,2)). Unlike batched_mul, this will never make a copy.

    For complex arrays, the wrapper made by batched_adjoint must be outermost to be seen. In this case the strides accepted by BLAS are more restricted: if stride(C,1) == 1, then only stride(AorB::BatchedAdjoint, 2) == 1 is accepted.

    NNlib.batched_adjointFunction
    batched_transpose(A::AbstractArray{T,3})
    +batched_adjoint(A)

    Equivalent to applying transpose or adjoint to each matrix A[:,:,k].

    These exist to control how batched_mul behaves, as it operates on such matrix slices of an array with ndims(A)==3.

    PermutedDimsArray(A, (2,1,3)) is equivalent to batched_transpose(A), and is also understood by batched_mul (and more widely supported elsewhere).

    BatchedTranspose{T, S} <: AbstractBatchedMatrix{T, 3}
    +BatchedAdjoint{T, S}

    Lazy wrappers analogous to Transpose and Adjoint, returned by batched_transpose etc.

    NNlib.batched_transposeFunction
    batched_transpose(A::AbstractArray{T,3})
    +batched_adjoint(A)

    Equivalent to applying transpose or adjoint to each matrix A[:,:,k].

    These exist to control how batched_mul behaves, as it operates on such matrix slices of an array with ndims(A)==3.

    PermutedDimsArray(A, (2,1,3)) is equivalent to batched_transpose(A), and is also understood by batched_mul (and more widely supported elsewhere).

    BatchedTranspose{T, S} <: AbstractBatchedMatrix{T, 3}
    +BatchedAdjoint{T, S}

    Lazy wrappers analogous to Transpose and Adjoint, returned by batched_transpose etc.

    NNlib.batched_vecFunction
    batched_vec(A::Array{T,3}, B::Matrix)
    +batched_vec(A::Array{T,3}, b::Vector)

    Batched matrix-vector multiplication: the result has C[:,:,k] == A[:,:,k] * B[:,k] for all k, or else C[:,:,k] == A[:,:,k] * b for b::Vector.

    With the same argument types, batched_mul(A, B) would regard B as a fixed matrix, not a batch of vectors. Both reshape and then call batched_mul(::Array{T,3}, ::Array{T,3}).

    julia> A, B, b = randn(16,8,32), randn(8,32), randn(8);
    +
    +julia> batched_vec(A,B) |> size
    +(16, 32)
    +
    +julia> batched_vec(A,b) |> size
    +(16, 32)

    Gather and Scatter

    Flux's Embedding layer uses NNlib.gather as its backend.

    NNlib.gatherFunction
    NNlib.gather(src, idx) -> dst

    Reverse operation of scatter. Gathers data from source src and writes it in a destination dst according to the index array idx. For each k in CartesianIndices(idx), assign values to dst according to

    dst[:, ... , k] .= src[:, ... , idx[k]...]

    Notice that if idx is a vector containing integers and src is a matrix, the previous expression simplifies to

    dst[:, k] .= src[:, idx[k]]

    and k will run over 1:length(idx).

    The elements of idx can be integers or integer tuples and may be repeated. A single src column can end up being copied into zero, one, or multiple dst columns.

    See gather! for an in-place version.

    Examples

    julia> NNlib.gather([1,20,300,4000], [2,4,2])
    +3-element Vector{Int64}:
    +   20
    + 4000
    +   20
    +
    +julia> NNlib.gather([1 2 3; 4 5 6], [1,3,1,3,1])
    +2×5 Matrix{Int64}:
    + 1  3  1  3  1
    + 4  6  4  6  4
    gather(src, IJK...)

    Convert the tuple of integer vectors IJK to a tuple of CartesianIndex and call gather on it: gather(src, CartesianIndex.(IJK...)).

    Examples

    julia> src = reshape([1:15;], 3, 5)
    +3×5 Matrix{Int64}:
    + 1  4  7  10  13
    + 2  5  8  11  14
    + 3  6  9  12  15
    +
    +julia> NNlib.gather(src, [1, 2], [2, 4])
    +2-element Vector{Int64}:
    +  4
    + 11
    NNlib.gather!Function
    NNlib.gather!(dst, src, idx)

    Reverse operation of scatter!. Gathers data from source src and writes it in destination dst according to the index array idx. For each k in CartesianIndices(idx), assign values to dst according to

    dst[:, ... , k] .= src[:, ... , idx[k]...]

    Notice that if idx is a vector containing integers, and both dst and src are matrices, the previous expression simplifies to

    dst[:, k] .= src[:, idx[k]]

    and k will run over 1:length(idx).

    The elements of idx can be integers or integer tuples and may be repeated. A single src column can end up being copied into zero, one, or multiple dst columns.

    See gather for an allocating version.
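    For example, writing gathered columns into a preallocated matrix:

    ```julia
    julia> dst = zeros(Int, 2, 3);

    julia> NNlib.gather!(dst, [1 2 3; 4 5 6], [1, 3, 1])
    2×3 Matrix{Int64}:
     1  3  1
     4  6  4
    ```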

    NNlib.scatterFunction
    NNlib.scatter(op, src, idx; [init, dstsize])

    Scatter operation allocating a destination array dst and calling scatter!(op, dst, src, idx) on it.

    • If keyword init is provided, it is used to initialize the content of dst. Otherwise, the init value is inferred from the reduction operator op for some common operators (e.g. init = 0 for op = +).

    • If dstsize is provided, it will be used to define the size of destination array, otherwise it will be inferred by src and idx.

    See scatter! for full details on how idx works.

    Examples

    julia> NNlib.scatter(+, [10,100,1000], [3,1,2])
    +3-element Vector{Int64}:
    +  100
    + 1000
    +   10
    +
    +julia> NNlib.scatter(+, [1 2 3 4; 5 6 7 8], [2,1,1,5])
    +2×5 Matrix{Int64}:
    +  5  1  0  0  4
    + 13  5  0  0  8
    +
    +julia> NNlib.scatter(*, [10,200,3000], [1,4,2]; init = 10, dstsize = 6)
    +6-element Vector{Int64}:
    +   100
    + 30000
    +    10
    +  2000
    +    10
    +    10
    NNlib.scatter!Function
    NNlib.scatter!(op, dst, src, idx)

    Scatter operation, which writes data in src into dst at locations idx. A binary reduction operator op is applied during the scatter. For each index k in idx, accumulates values in dst according to

    dst[:, ..., idx[k]...] = (op).(dst[:, ..., idx[k]...], src[:, ..., k...])

    See also scatter, gather.

    Arguments

    • op: The binary reduction operation applied to dst and src, e.g. +, -, *, /, max, min and mean.
    • dst: The destination for src to aggregate to. This argument will be mutated.
    • src: The source data for aggregating.
    • idx: The mapping for aggregation from source (index) to destination (value). The idx array can contain either integers or tuples.

    Examples

    julia> NNlib.scatter!(+, ones(3), [10,100], [1,3])
    +3-element Vector{Float64}:
    +  11.0
    +   1.0
    + 101.0
    +
    +julia> NNlib.scatter!(*, fill(0.5, 2, 4), [1 10; 100 1000], [3,2])
    +2×4 Matrix{Float64}:
    + 0.5    5.0   0.5  0.5
    + 0.5  500.0  50.0  0.5

    Sampling

    NNlib.grid_sampleFunction
    grid_sample(input::AbstractArray{T, 4}, grid::AbstractArray{T, 4}; padding_mode = :zeros)

    Given input, compute output by sampling input values at pixel locations from grid. Uses bilinear interpolation to calculate output values.

    This implementation assumes the extrema (-1 and 1) are considered as referring to the center points of the input’s corner pixels (i.e. align corners is true).

    Arguments

    • input: Input array in (W_in, H_in, C, N) shape.

    • grid: Input grid in (2, W_out, H_out, N) shape. Where for each (W_out, H_out, N) grid contains (x, y) coordinates that specify sampling locations normalized by the input shape.

      Therefore, x and y should have values in [-1, 1] range. For example, (x = -1, y = -1) is the left-top pixel of input, and (x = 1, y = 1) is the right-bottom pixel of input.

      Out-of-bound values are handled according to the padding_mode.

    • padding_mode: Out-of-bound padding. :zeros to use 0 for out-of-bound grid locations. :border to use border values for out-of-bound grid locations. Default is :zeros.

    Returns

    (W_out, H_out, C, N) sampled grid from input.
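    The align-corners convention above amounts to the following coordinate mapping (a sketch of the convention only, not NNlib's code; unnormalize is a name we made up):

    ```julia
    # Map a normalized coordinate c ∈ [-1, 1] to a 1-based pixel position along an
    # axis of length n, with -1 and 1 landing on the first and last pixel centers:
    unnormalize(c, n) = (c + 1) / 2 * (n - 1) + 1

    unnormalize(-1.0, 4)  # 1.0 — center of the first pixel
    unnormalize(1.0, 4)   # 4.0 — center of the last pixel
    unnormalize(0.0, 4)   # 2.5 — between pixels 2 and 3, so bilinear interpolation applies
    ```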

    Examples

    In the example below, grid contains two out-of-bound sampling locations, which are handled differently, depending on the padding_mode.

    julia> x = reshape(collect(1.0:4.0), (2, 2, 1, 1))
    +2×2×1×1 Array{Float64, 4}:
    +[:, :, 1, 1] =
    + 1.0  3.0
    + 2.0  4.0
    +
    +julia> grid = Array{Float64}(undef, 2, 3, 2, 1);
    +
    +julia> grid[:, 1, 1, 1] .= (-3, -1);
    +
    +julia> grid[:, 2, 1, 1] .= (0, -1);
    +
    +julia> grid[:, 3, 1, 1] .= (1, -1);
    +
    +julia> grid[:, 1, 2, 1] .= (-1, 1);
    +
    +julia> grid[:, 2, 2, 1] .= (0, 1);
    +
    +julia> grid[:, 3, 2, 1] .= (3, 1);
    +
    +julia> grid_sample(x, grid; padding_mode=:zeros)
    +3×2×1×1 Array{Float64, 4}:
    +[:, :, 1, 1] =
    + 0.0  3.0
    + 1.5  3.5
    + 2.0  0.0
    +
    +julia> grid_sample(x, grid; padding_mode=:border)
    +3×2×1×1 Array{Float64, 4}:
    +[:, :, 1, 1] =
    + 1.0  3.0
    + 1.5  3.5
    + 2.0  4.0
    NNlib.∇grid_sampleFunction
    ∇grid_sample(Δ::AbstractArray{T, 4}, input::AbstractArray{T, 4}, grid::AbstractArray{T, 4}; padding_mode = :zeros) where T

    Arguments

    • Δ: Input gradient in (W_out, H_out, C, N) shape (same as output of the primal computation).
    • input: Input from primal computation in (W_in, H_in, C, N) shape.
    • grid: Grid from primal computation in (2, W_out, H_out, N) shape.
    • padding_mode: Out-of-bound padding. :zeros to use 0 for out-of-bound grid locations. :border to use border values for out-of-bound grid locations. Should be the same as in primal computation. Default is :zeros.

    Returns

    dinput (same shape as input) and dgrid (same shape as grid) gradients.

    Losses

    NNlib.ctc_lossFunction
    ctc_loss(ŷ, y)

    Computes the connectionist temporal classification loss between ŷ and y. ŷ must be a classes-by-time matrix, i.e., each row represents a class and each column represents a time step. Additionally, the logsoftmax function will be applied to ŷ, so ŷ must be the raw activation values from the neural network and not, for example, the activations after being passed through a softmax activation function. y must be a 1D array of the labels associated with ŷ. The blank label is assumed to be the last label category in ŷ, so it is equivalent to size(ŷ, 1). Used for sequence-to-sequence classification problems such as speech recognition and handwriting recognition, where the exact time-alignment of the output (e.g., letters) is not needed to solve the problem. See Graves et al. (2006) or Graves (2012) for mathematical details.

    Miscellaneous

    NNlib.logsumexpFunction
    logsumexp(x; dims = :)

    Computes log.(sum(exp.(x); dims)) in a numerically stable way. Without dims keyword this returns a scalar.

    See also logsoftmax.
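    The stability trick is to subtract the maximum before exponentiating — a minimal plain-Julia sketch (logsumexp_ref is illustrative, not NNlib's implementation):

    ```julia
    function logsumexp_ref(x)
        m = maximum(x)                    # shift by the max so exp never overflows
        return m + log(sum(exp.(x .- m)))
    end

    x = [1000.0, 1000.0]
    log(sum(exp.(x)))   # Inf — the naive formula overflows
    logsumexp_ref(x)    # ≈ 1000.6931, i.e. 1000 + log(2)
    ```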

    diff --git a/previews/PR2365/models/overview/index.html b/previews/PR2365/models/overview/index.html new file mode 100644 index 0000000000..65fbadc9fe --- /dev/null +++ b/previews/PR2365/models/overview/index.html @@ -0,0 +1,59 @@ + +Fitting a Line · Flux

    Flux Overview: Fitting a Straight Line

    Flux is a pure Julia ML stack that allows you to build predictive models. Here are the steps for a typical Flux program:

    1. Provide training and test data
    2. Build a model with configurable parameters to make predictions
    3. Iteratively train the model by tweaking the parameters to improve predictions
    4. Verify your model

    Under the hood, Flux uses a technique called automatic differentiation to take gradients that help improve predictions. Flux is also fully written in Julia so you can easily replace any layer of Flux with your own code to improve your understanding or satisfy special requirements.

    Here's how you'd use Flux to build and train the most basic of models, step by step.

    A Trivial Prediction

    This example will predict the output of the function 4x + 2. Making such predictions is called "linear regression", and is really too simple to need a neural network. But it's a nice toy example.

    First, import Flux and define the function we want to simulate:

    julia> using Flux
    +
    +julia> actual(x) = 4x + 2
    +actual (generic function with 1 method)

    This example will build a model to approximate the actual function.

    1. Provide Training and Test Data

    Use the actual function to build sets of data for training and verification:

    julia> x_train, x_test = hcat(0:5...), hcat(6:10...)
    +([0 1 … 4 5], [6 7 … 9 10])
    +
    +julia> y_train, y_test = actual.(x_train), actual.(x_test)
    +([2 6 … 18 22], [26 30 … 38 42])

    Normally, your training and test data come from real world observations, but here we simulate them.

    2. Build a Model to Make Predictions

    Now, build a model to make predictions with 1 input and 1 output:

    julia> model = Dense(1 => 1)
    +Dense(1 => 1)       # 2 parameters
    +
    +julia> model.weight
    +1×1 Matrix{Float32}:
    + 0.95041317
    +
    +julia> model.bias
    +1-element Vector{Float32}:
    + 0.0

    Under the hood, a dense layer is a struct with fields weight and bias. weight holds a weight matrix and bias holds a bias vector. There's another way to think about a model. In Flux, models are conceptually predictive functions:

    julia> predict = Dense(1 => 1)
    +Dense(1 => 1)       # 2 parameters

    Dense(1 => 1) also implements the function σ(Wx+b) where W and b are the weights and biases. σ is an activation function (more on activations later). Our model has one weight and one bias, but typical models will have many more. Think of weights and biases as knobs and levers Flux can use to tune predictions. Activation functions are transformations that tailor models to your needs.
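    With made-up numbers, the computation a Dense layer performs can be sketched in plain Julia:

    ```julia
    W = [0.5;;]          # 1×1 weight matrix (hypothetical value)
    b = [0.0]            # bias vector
    σ = identity         # Dense's default activation does nothing
    x = [2.0;;]          # one sample with one feature

    y = σ.(W * x .+ b)   # the σ(Wx + b) formula; here a 1×1 matrix containing 1.0
    ```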

    This model will already make predictions, though not accurate ones yet:

    julia> predict(x_train)
    +1×6 Matrix{Float32}:
    + 0.0  0.906654  1.81331  2.71996  3.62662  4.53327

    In order to make better predictions, you'll need to provide a loss function to tell Flux how to objectively evaluate the quality of a prediction. Loss functions compute the cumulative distance between actual values and predictions.

    julia> using Statistics
    +
    +julia> loss(model, x, y) = mean(abs2.(model(x) .- y));
    +
    +julia> loss(predict, x_train, y_train)
    +122.64734f0

    More accurate predictions will yield a lower loss. You can write your own loss functions or rely on those already provided by Flux. This loss function is called mean squared error (and built-in as mse). Flux works by iteratively reducing the loss through training.

    3. Improve the Prediction

    Under the hood, the Flux.train! function uses a loss function and training data to improve the parameters of your model based on a pluggable optimiser:

    julia> using Flux: train!
    +
    +julia> opt = Descent()
    +Descent(0.1)
    +
    +julia> data = [(x_train, y_train)]
    +1-element Vector{Tuple{Matrix{Int64}, Matrix{Int64}}}:
    + ([0 1 … 4 5], [2 6 … 18 22])

    Now, we have the optimiser and data we'll pass to train!. All that remains are the parameters of the model. Remember, each model is a Julia struct with a function and configurable parameters, and the dense layer has weights and biases whose dimensions depend on the inputs and outputs:

    julia> predict.weight
    +1×1 Matrix{Float32}:
    + 0.9066542
    +
    +julia> predict.bias
    +1-element Vector{Float32}:
    + 0.0

    The dimensions of these model parameters depend on the number of inputs and outputs.

    Flux will adjust predictions by iteratively changing these parameters according to the optimiser.

    This optimiser implements the classic gradient descent strategy. Now improve the parameters of the model with a call to Flux.train! like this:

    julia> train!(loss, predict, data, opt)

    And check the loss:

    julia> loss(predict, x_train, y_train)
    +116.38745f0

    It went down. Why?

    julia> predict.weight, predict.bias
    +(Float32[7.246838;;], Float32[1.748103])

    The parameters have changed. This single step is the essence of machine learning.
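    What that single step does can be sketched by hand for this one-weight model (the 0.9 starting value and the written-out gradients are our own illustration, not Flux internals):

    ```julia
    using Statistics

    x = Float32.(0:5);  y = 4 .* x .+ 2      # the same training data as above
    w, b, lr = 0.9f0, 0.0f0, 0.1f0           # hypothetical start; Descent's default rate

    # For L = mean((w*x + b - y)^2): dL/dw = mean(2*err*x) and dL/db = mean(2*err)
    err = w .* x .+ b .- y
    w -= lr * 2 * mean(err .* x)
    b -= lr * 2 * mean(err)

    # One gradient-descent step lowers the loss:
    mean(abs2, w .* x .+ b .- y) < mean(abs2, 0.9f0 .* x .- y)   # true
    ```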

    3+. Iteratively Train the Model

    In the previous section, we made a single call to train! which iterates over the data we passed in just once. An epoch refers to one pass over the dataset. Typically, we will run the training for multiple epochs to drive the loss down even further. Let's run it a few more times:

    julia> for epoch in 1:200
    +         train!(loss, predict, data, opt)
    +       end
    +
    +julia> loss(predict, x_train, y_train)
    +0.00339581f0
    +
    +julia> predict.weight, predict.bias
    +(Float32[4.0159144;;], Float32[2.004479])

    After 200 training steps, the loss went down, and the parameters are getting close to those in the function the model is built to predict.

    4. Verify the Results

    Now, let's verify the predictions:

    julia> predict(x_test)
    +1×5 Matrix{Float32}:
    + 26.1121  30.13  34.1479  38.1657  42.1836
    +
    +julia> y_test
    +1×5 Matrix{Int64}:
    + 26  30  34  38  42

    The predictions are good. Here's how we got there.

    First, we gathered real-world data into the variables x_train, y_train, x_test, and y_test. The x_* data defines inputs, and the y_* data defines outputs. The *_train data is for training the model, and the *_test data is for verifying the model. Our data was based on the function 4x + 2.

    Then, we built a single input, single output predictive model, predict = Dense(1 => 1). The initial predictions weren't accurate, because we had not trained the model yet.

    After building the model, we trained it with train!(loss, predict, data, opt). The loss function comes first, followed by the model itself, the training data, and the Descent optimiser provided by Flux. We ran the training step once, and observed that the parameters changed and the loss went down. Then, we ran train! many more times to finish the training process.

    Finally, we checked the trained model against the test data to verify the results.

    This overall flow represents how Flux works. Let's drill down a bit to understand what's going on inside the individual layers of Flux.

    diff --git a/previews/PR2365/models/quickstart/index.html b/previews/PR2365/models/quickstart/index.html new file mode 100644 index 0000000000..7da5e80e4a --- /dev/null +++ b/previews/PR2365/models/quickstart/index.html @@ -0,0 +1,62 @@ + +Quick Start · Flux

    A Neural Network in One Minute

    If you have used neural networks before, then this simple example might be helpful for seeing how the major parts of Flux work together. Try pasting the code into the REPL prompt.

    If you haven't, then you might prefer the Fitting a Straight Line page.

    # This will prompt if necessary to install everything, including CUDA:
    +using Flux, CUDA, Statistics, ProgressMeter
    +
    +# Generate some data for the XOR problem: vectors of length 2, as columns of a matrix:
    +noisy = rand(Float32, 2, 1000)                                    # 2×1000 Matrix{Float32}
    +truth = [xor(col[1]>0.5, col[2]>0.5) for col in eachcol(noisy)]   # 1000-element Vector{Bool}
    +
    +# Define our model, a multi-layer perceptron with one hidden layer of size 3:
    +model = Chain(
    +    Dense(2 => 3, tanh),   # activation function inside layer
    +    BatchNorm(3),
    +    Dense(3 => 2),
    +    softmax) |> gpu        # move model to GPU, if available
    +
    +# The model encapsulates parameters, randomly initialised. Its initial output is:
    +out1 = model(noisy |> gpu) |> cpu                                 # 2×1000 Matrix{Float32}
    +
    +# To train the model, we use batches of 64 samples, and one-hot encoding:
    +target = Flux.onehotbatch(truth, [true, false])                   # 2×1000 OneHotMatrix
    +loader = Flux.DataLoader((noisy, target) |> gpu, batchsize=64, shuffle=true);
    +# 16-element DataLoader with first element: (2×64 Matrix{Float32}, 2×64 OneHotMatrix)
    +
    +optim = Flux.setup(Flux.Adam(0.01), model)  # will store optimiser momentum, etc.
    +
    +# Training loop, using the whole data set 1000 times:
    +losses = []
    +@showprogress for epoch in 1:1_000
    +    for (x, y) in loader
    +        loss, grads = Flux.withgradient(model) do m
    +            # Evaluate model and loss inside gradient context:
    +            y_hat = m(x)
    +            Flux.crossentropy(y_hat, y)
    +        end
    +        Flux.update!(optim, model, grads[1])
    +        push!(losses, loss)  # logging, outside gradient context
    +    end
    +end
    +
    +optim # parameters, momenta and output have all changed
    +out2 = model(noisy |> gpu) |> cpu  # first row is prob. of true, second row p(false)
    +
    +mean((out2[1,:] .> 0.5) .== truth)  # accuracy 94% so far!

    using Plots  # to draw the above figure
    +
    +p_true = scatter(noisy[1,:], noisy[2,:], zcolor=truth, title="True classification", legend=false)
    +p_raw =  scatter(noisy[1,:], noisy[2,:], zcolor=out1[1,:], title="Untrained network", label="", clims=(0,1))
    +p_done = scatter(noisy[1,:], noisy[2,:], zcolor=out2[1,:], title="Trained network", legend=false)
    +
    +plot(p_true, p_raw, p_done, layout=(1,3), size=(1000,330))

    Here's the loss during training:

    plot(losses; xaxis=(:log10, "iteration"),
    +    yaxis="loss", label="per batch")
    +n = length(loader)
    +plot!(n:n:length(losses), mean.(Iterators.partition(losses, n)),
    +    label="epoch mean", dpi=200)

    This XOR ("exclusive or") problem is a variant of the famous one which drove Minsky and Papert to invent deep neural networks in 1969. For small values of "deep" – this has one hidden layer, while earlier perceptrons had none. (What they call a hidden layer, Flux calls the output of the first layer, model[1](noisy).)

    Since then things have developed a little.

    Features to Note

    Some things to notice in this example are:

    • The batch dimension of data is always the last one. Thus a 2×1000 Matrix is a thousand observations, each a column of length 2. Flux defaults to Float32, but most of Julia to Float64.

    • The model can be called like a function, y = model(x). Each layer like Dense is an ordinary struct, which encapsulates some arrays of parameters (and possibly other state, as for BatchNorm).

    • But the model does not contain the loss function, nor the optimisation rule. The momenta needed by Adam are stored in the object returned by setup. And Flux.crossentropy is an ordinary function.

    • The do block creates an anonymous function, as the first argument of gradient. Anything executed within this is differentiated.
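    The do syntax is plain Julia rather than anything Flux-specific — sketched here with map:

    ```julia
    # These two calls are identical: `do` passes the block as the first argument.
    squares = map([1, 2, 3]) do x
        x^2
    end

    squares == map(x -> x^2, [1, 2, 3]) == [1, 4, 9]   # true
    ```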

    Instead of calling gradient and update! separately, there is a convenience function train!. If we didn't want anything extra (like logging the loss), we could replace the training loop with the following:

    for epoch in 1:1_000
    +    Flux.train!(model, loader, optim) do m, x, y
    +        y_hat = m(x)
    +        Flux.crossentropy(y_hat, y)
    +    end
    +end
    Implicit-style training, Flux ≤ 0.14

    Until recently Flux's training worked a bit differently. Any code which looks like

    gradient(() -> loss(model, x, y), Flux.params(model))

    (gradient of a zero-argument function) or

    train!((x,y) -> loss(model, x, y), Flux.params(model), loader, opt)

    (with Flux.params) is in the old "implicit" style. This still works on Flux 0.14, but will be removed from Flux 0.15. See the training section for more details.

    diff --git a/previews/PR2365/models/recurrence/index.html b/previews/PR2365/models/recurrence/index.html new file mode 100644 index 0000000000..61a931ed9c --- /dev/null +++ b/previews/PR2365/models/recurrence/index.html @@ -0,0 +1,103 @@ + +Recurrence · Flux

    Recurrent Models

    Recurrent cells

    To introduce Flux's recurrence functionalities, we will consider the following vanilla recurrent neural network structure:

    In the above, we have a sequence of length 3, where x1 to x3 represent the input at each step (could be a timestamp or a word in a sentence), and y1 to y3 are their respective outputs.

    An aspect to recognize is that in such a model, the recurrent cells A all refer to the same structure. What distinguishes it from a simple dense layer is that the cell A is fed, in addition to an input x, with information from the previous state of the model (hidden state denoted as h1 & h2 in the diagram).

    In the most basic RNN case, cell A could be defined by the following:

    output_size = 5
    +input_size = 2
    +Wxh = randn(Float32, output_size, input_size)
    +Whh = randn(Float32, output_size, output_size)
    +b   = randn(Float32, output_size)
    +
    +function rnn_cell(h, x)
    +    h = tanh.(Wxh * x .+ Whh * h .+ b)
    +    return h, h
    +end
    +
    +x = rand(Float32, input_size) # dummy input data
    +h = rand(Float32, output_size) # random initial hidden state
    +
    +h, y = rnn_cell(h, x)

    Notice how the above is essentially a Dense layer that acts on two inputs, h and x.

    If you run the last line a few times, you'll notice the output y changing slightly even though the input x is the same.

    There are various recurrent cells available in Flux, notably RNNCell, LSTMCell and GRUCell, which are documented in the layer reference. The hand-written example above can be replaced with:

    using Flux
    +
    +rnn = Flux.RNNCell(2, 5)
    +
    +x = rand(Float32, 2) # dummy data
    +h = rand(Float32, 5)  # initial hidden state
    +
    +h, y = rnn(h, x)

    Stateful Models

    For the most part, we don't want to manage hidden states ourselves, but to treat our models as being stateful. Flux provides the Recur wrapper to do this.

    x = rand(Float32, 2)
    +h = rand(Float32, 5)
    +
    +m = Flux.Recur(rnn, h)
    +
    +y = m(x)

    The Recur wrapper stores the state between runs in the m.state field.

    If we use the RNN(2, 5) constructor – as opposed to RNNCell – you'll see that it's simply a wrapped cell.

    julia> using Flux
    +
    +julia> RNN(2, 5)  # or equivalently RNN(2 => 5)
    +Recur(
    +  RNNCell(2 => 5, tanh),                # 45 parameters
    +)         # Total: 4 trainable arrays, 45 parameters,
    +          # plus 1 non-trainable, 5 parameters, summarysize 412 bytes.

    Like the stateful RNN constructor, LSTM and GRU are also available.

    Using these tools, we can now build the model shown in the above diagram with:

    julia> m = Chain(RNN(2 => 5), Dense(5 => 1))
    +Chain(
    +  Recur(
    +    RNNCell(2 => 5, tanh),              # 45 parameters
    +  ),
    +  Dense(5 => 1),                        # 6 parameters
    +)         # Total: 6 trainable arrays, 51 parameters,
    +          # plus 1 non-trainable, 5 parameters, summarysize 580 bytes.   

    In this example, each output has only one component.

    Working with sequences

    Using the previously defined m recurrent model, we can now apply it to a single step from our sequence:

    julia> x = rand(Float32, 2);
    +
    +julia> m(x)
    +1-element Vector{Float32}:
    + 0.45860028

    The m(x) operation would be represented by x1 -> A -> y1 in our diagram. If we perform this operation a second time, it will be equivalent to x2 -> A -> y2 since the model m has stored the state resulting from the x1 step.

    Now, instead of computing a single step at a time, we can get the full y1 to y3 sequence in a single pass by iterating the model on a sequence of data.

    To do so, we'll need to structure the input data as a Vector of observations at each time step. This Vector will therefore be of length = seq_length and each of its elements will represent the input features for a given step. In our example, this translates into a Vector of length 3, where each element is a Matrix of size (features, batch_size), or just a Vector of length features if dealing with a single observation.

    julia> x = [rand(Float32, 2) for i = 1:3];
    +
    +julia> [m(xi) for xi in x]
    +3-element Vector{Vector{Float32}}:
    + [0.36080405]
    + [-0.13914406]
    + [0.9310162]
    Use of map and broadcast

    Mapping and broadcasting operations with stateful layers such as Recur are discouraged, since the Julia language doesn't guarantee a specific execution order. Therefore, avoid

    y = m.(x)
    +# or 
    +y = map(m, x)

    and use explicit loops

    y = [m(x) for x in x]

    If for some reason one wants to exclude the first step of the RNN chain for the computation of the loss, that can be handled with:

    using Flux.Losses: mse
    +
    +function loss(x, y)
    +  m(x[1]) # ignores the output but updates the hidden states
    +  sum(mse(m(xi), yi) for (xi, yi) in zip(x[2:end], y))
    +end
    +
    +y = [rand(Float32, 1) for i=1:2]
    +loss(x, y)

    In such a model, only the last two outputs are used to compute the loss, hence the target y being of length 2. This is a strategy that can be used to easily handle a seq-to-one kind of structure, compared to the seq-to-seq assumed so far.

    Alternatively, if one wants to perform some warmup of the sequence, it could be performed once, followed with a regular training where all the steps of the sequence would be considered for the gradient update:

    function loss(x, y)
    +  sum(mse(m(xi), yi) for (xi, yi) in zip(x, y))
    +end
    +
    +seq_init = [rand(Float32, 2)]
    +seq_1 = [rand(Float32, 2) for i = 1:3]
    +seq_2 = [rand(Float32, 2) for i = 1:3]
    +
    +y1 = [rand(Float32, 1) for i = 1:3]
    +y2 = [rand(Float32, 1) for i = 1:3]
    +
    +X = [seq_1, seq_2]
    +Y = [y1, y2]
    +data = zip(X,Y)
    +
    +Flux.reset!(m)
    +[m(x) for x in seq_init]
    +
    +ps = Flux.params(m)
    +opt= Adam(1e-3)
    +Flux.train!(loss, ps, data, opt)

    In this previous example, the model's state is first reset with Flux.reset!. Then, a warmup is performed over a sequence of length 1 by feeding it with seq_init, resulting in a warmup state. The model can then be trained for 1 epoch, where 2 batches are provided (seq_1 and seq_2) and all the timestep outputs are considered for the loss.

    In this scenario, it is important to note that a single continuous sequence is considered. Since the model state is not reset between the 2 batches, the state of the model flows through the batches, which only makes sense in the context where seq_1 is the continuation of seq_init and so on.

    Batch size would be 1 here as there's only a single sequence within each batch. If the model was to be trained on multiple independent sequences, then these sequences could be added to the input data as a second dimension. For example, in a language model, each batch would contain multiple independent sentences. In such scenario, if we set the batch size to 4, a single batch would be of the shape:

    x = [rand(Float32, 2, 4) for i = 1:3]
    +y = [rand(Float32, 1, 4) for i = 1:3]

    That would mean that we have 4 sentences (or samples), each with 2 features (let's say a very small embedding!) and each with a length of 3 (3 words per sentence). Computing m(x[1]) would still represent x1 -> y1 in our diagram and return the first word's output, but now for each of the 4 independent sentences (the second dimension of the input matrix). We do not need to use Flux.reset!(m) here; each sentence in the batch will output in its own "column", and the outputs of the different sentences won't mix.

    To illustrate, we go through an example of batching with our implementation of rnn_cell. The implementation doesn't need to change; the batching comes for "free" from the way Julia does broadcasting and the rules of matrix multiplication.

    output_size = 5
    +input_size = 2
    +Wxh = randn(Float32, output_size, input_size)
    +Whh = randn(Float32, output_size, output_size)
    +b   = randn(Float32, output_size)
    +
    +function rnn_cell(h, x)
    +    h = tanh.(Wxh * x .+ Whh * h .+ b)
    +    return h, h
    +end

    Here, we use the last dimension of the input and the hidden state as the batch dimension. I.e., h[:, n] would be the hidden state of the nth sentence in the batch.

    batch_size = 4
    +x = rand(Float32, input_size, batch_size) # dummy input data
    +h = rand(Float32, output_size, batch_size) # random initial hidden state
    +
    +h, y = rnn_cell(h, x)
    julia> size(h) == size(y) == (output_size, batch_size)
    +true

    In many situations, such as when dealing with a language model, the sentences in each batch are independent (i.e. the last item of the first sentence of the first batch is independent from the first item of the first sentence of the second batch), so we cannot handle the model as if each batch was the direct continuation of the previous one. To handle such situations, we need to reset the state of the model between each batch, which can be conveniently performed within the loss function:

    function loss(x, y)
    +  Flux.reset!(m)
    +  sum(mse(m(xi), yi) for (xi, yi) in zip(x, y))
    +end

    A potential source of ambiguity with RNN in Flux can come from the different data layout compared to some common frameworks, where data is typically a 3-dimensional array: (features, seq_length, samples). In Flux, those 3 dimensions are instead provided as a vector of length seq_length whose elements are matrices of size (features, samples).
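    If your data arrives as such a 3-dimensional array, converting it to Flux's layout is a one-line slicing operation (the shapes below are made up for illustration):

    ```julia
    A = rand(Float32, 2, 3, 4)               # features × seq_length × samples
    xs = [A[:, t, :] for t in 1:size(A, 2)]  # Vector of (features, samples) matrices

    length(xs) == 3 && size(xs[1]) == (2, 4)   # true
    ```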

    diff --git a/previews/PR2365/outputsize/index.html b/previews/PR2365/outputsize/index.html new file mode 100644 index 0000000000..8562db909c --- /dev/null +++ b/previews/PR2365/outputsize/index.html @@ -0,0 +1,84 @@ + +Shape Inference · Flux

    Shape Inference

    Flux has some tools to help generate models in an automated fashion, by inferring the size of arrays that layers will receive, without doing any computation. This is especially useful for convolutional models, where the same Conv layer accepts any size of image, but the next layer may not.

    The higher-level tool is a macro @autosize which acts on the code defining the layers, and replaces each appearance of _ with the relevant size. This simple example returns a model with Dense(845 => 10) as the last layer:

    @autosize (28, 28, 1, 32) Chain(Conv((3, 3), _ => 5, relu, stride=2), Flux.flatten, Dense(_ => 10))

    The input size may be provided at runtime, like @autosize (sz..., 1, 32) Chain(Conv(..., but all the layer constructors containing _ must be explicitly written out – the macro sees the code as written.

    This macro relies on a lower-level function outputsize, which you can also use directly:

    c = Conv((3, 3), 1 => 5, relu, stride=2)
    +Flux.outputsize(c, (28, 28, 1, 32))  # returns (13, 13, 5, 32)

    The function outputsize works by passing a "dummy" array into the model, which propagates through very cheaply. It should work for all layers, including custom layers, out of the box.
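    Conceptually, the dummy array holds a number-like placeholder whose arithmetic discards values, so only shapes flow through the model. A toy sketch of that idea (Flux's real implementation differs):

    ```julia
    # A value-free number type: every operation just returns another placeholder.
    struct Nil <: Real end
    Base.:+(::Nil, ::Nil) = Nil()
    Base.:*(::Nil, ::Nil) = Nil()
    Base.promote_rule(::Type{Nil}, ::Type{<:Number}) = Nil
    Base.convert(::Type{Nil}, ::Number) = Nil()
    Base.zero(::Type{Nil}) = Nil()

    W = rand(3, 4)
    x = fill(Nil(), 4)       # a "dummy array" standing in for a length-4 input
    size(W * x)              # (3,) — the output shape, with no real arithmetic done
    ```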

    An example of how to automate model building is this:

    """
    +    make_model(width, height, [inchannels, nclasses; layer_config])
    +
    +Create a CNN for a given set of configuration parameters. Arguments:
    +- `width`, `height`: the input image size in pixels
    +- `inchannels`: the number of channels in the input image, default `1`
    +- `nclasses`: the number of output classes, default `10`
    +- Keyword `layer_config`: a vector of the number of channels per layer, default `[16, 16, 32, 64]`
    +"""
    +function make_model(width, height, inchannels = 1, nclasses = 10;
    +                    layer_config = [16, 16, 32, 64])
    +  # construct a vector of layers:
    +  conv_layers = []
    +  push!(conv_layers, Conv((5, 5), inchannels => layer_config[1], relu, pad=SamePad()))
    +  for (inch, outch) in zip(layer_config, layer_config[2:end])
    +    push!(conv_layers, Conv((3, 3), inch => outch, sigmoid, stride=2))
    +  end
    +
    +  # compute the output dimensions after these conv layers:
    +  conv_outsize = Flux.outputsize(conv_layers, (width, height, inchannels); padbatch=true)
    +
    +  # use this to define appropriate Dense layer:
    +  last_layer = Dense(prod(conv_outsize) => nclasses)
    +  return Chain(conv_layers..., Flux.flatten, last_layer)
    +end
    +
    +m = make_model(28, 28, 3, layer_config = [9, 17, 33, 65])
    +
    +Flux.outputsize(m, (28, 28, 3, 42)) == (10, 42) == size(m(randn(Float32, 28, 28, 3, 42)))

    Alternatively, using the macro, the definition of make_model could end with:

      # compute the output dimensions & construct appropriate Dense layer:
    +  return @autosize (width, height, inchannels, 1) Chain(conv_layers..., Flux.flatten, Dense(_ => nclasses))
    +end

    Listing

    Flux.@autosizeMacro
    @autosize (size...,) Chain(Layer(_ => 2), Layer(_), ...)

    Returns the specified model, with each _ replaced by an inferred number, for input of the given size.

    The unknown sizes are usually the second-last dimension of that layer's input, which Flux regards as the channel dimension. (A few layers, Dense & LayerNorm, instead always use the first dimension.) The underscore may appear as an argument of a layer, or inside a =>. It may be used in further calculations, such as Dense(_ => _÷4).

    Examples

    julia> @autosize (3, 1) Chain(Dense(_ => 2, sigmoid), BatchNorm(_, affine=false))
    +Chain(
    +  Dense(3 => 2, σ),                     # 8 parameters
    +  BatchNorm(2, affine=false),
    +) 
    +
    +julia> img = [28, 28];
    +
    +julia> @autosize (img..., 1, 32) Chain(              # size is only needed at runtime
    +          Chain(c = Conv((3,3), _ => 5; stride=2, pad=SamePad()),
    +                p = MeanPool((3,3)),
    +                b = BatchNorm(_),
    +                f = Flux.flatten),
    +          Dense(_ => _÷4, relu, init=Flux.rand32),   # can calculate output size _÷4
    +          SkipConnection(Dense(_ => _, relu), +),
    +          Dense(_ => 10),
    +       )
    +Chain(
    +  Chain(
    +    c = Conv((3, 3), 1 => 5, pad=1, stride=2),  # 50 parameters
    +    p = MeanPool((3, 3)),
    +    b = BatchNorm(5),                   # 10 parameters, plus 10
    +    f = Flux.flatten,
    +  ),
    +  Dense(80 => 20, relu),                # 1_620 parameters
    +  SkipConnection(
    +    Dense(20 => 20, relu),              # 420 parameters
    +    +,
    +  ),
    +  Dense(20 => 10),                      # 210 parameters
    +)         # Total: 10 trainable arrays, 2_310 parameters,
    +          # plus 2 non-trainable, 10 parameters, summarysize 10.469 KiB.
    +
    +julia> outputsize(ans, (28, 28, 1, 32))
    +(10, 32)

    Limitations:

    • While @autosize (5, 32) Flux.Bilinear(_ => 7) is OK, something like Bilinear((_, _) => 7) will fail.
    • While Scale(_) and LayerNorm(_) are fine (and use the first dimension), Scale(_,_) and LayerNorm(_,_) will fail if size(x,1) != size(x,2).
    source
    Flux.outputsize — Function
    outputsize(m, x_size, y_size, ...; padbatch=false)

    For model or layer m accepting multiple arrays as input, this returns size(m((x, y, ...))) given x_size = size(x), etc.

    Examples

    julia> x, y = rand(Float32, 5, 64), rand(Float32, 7, 64);
    +
    +julia> par = Parallel(vcat, Dense(5 => 9), Dense(7 => 11));
    +
    +julia> Flux.outputsize(par, (5, 64), (7, 64))
    +(20, 64)
    +
    +julia> m = Chain(par, Dense(20 => 13), softmax);
    +
    +julia> Flux.outputsize(m, (5,), (7,); padbatch=true)
    +(13, 1)
    +
    +julia> par(x, y) == par((x, y)) == Chain(par, identity)((x, y))
    +true

    Notice that Chain only accepts multiple arrays as a tuple, while Parallel also accepts them as multiple arguments; outputsize always supplies the tuple.

    source
    diff --git a/previews/PR2365/performance/index.html b/previews/PR2365/performance/index.html new file mode 100644 index 0000000000..f387dd41b3 --- /dev/null +++ b/previews/PR2365/performance/index.html @@ -0,0 +1,17 @@ + +Performance Tips · Flux

    Performance Tips

    All the usual Julia performance tips apply. As always, profiling your code is a useful way of finding bottlenecks. Below are some Flux-specific tips and reminders.

    Don't use more precision than you need

    Flux works with all kinds of number types, but you often do not need to work with, say, Float64 (let alone BigFloat). Switching to Float32 can give you a significant speed-up, not because the operations themselves are faster, but because memory usage is halved, so allocations are cheaper and you use less memory overall.
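As a rough illustration (plain Base Julia, no Flux required), halving the precision halves the memory footprint of the same array:

```julia
# Same number of elements, half the memory when stored as Float32.
x64 = rand(Float64, 10_000)
x32 = Float32.(x64)

sizeof(x64)  # 80000 bytes
sizeof(x32)  # 40000 bytes
```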

    Preserve inputs' types

    Not only should your activation and loss functions be type-stable, they should also preserve the type of their inputs.

    A very artificial example using an activation function like

    my_tanh(x) = Float64(tanh(x))

    will make performance on Float32 input orders of magnitude slower than plain tanh, because it forces slow mixed-type multiplication in the dense layers. Similar situations can occur in the loss function during backpropagation.

    This means that if you change your data, say, from Float64 to Float32 (which should give a speed-up: see above), you will instead see a large slow-down.
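A quick way to check whether such promotion is happening is to inspect the element type of the result. This is a sketch in plain Base Julia, standing in for what happens inside a Dense layer:

```julia
W = rand(Float32, 3, 3)          # Float32 weights, as in a Dense layer
x = rand(Float32, 3)             # Float32 input

my_tanh(x) = Float64(tanh(x))    # activation that widens the type

y = my_tanh.(W * x)
eltype(W * x)   # Float32 — the matrix multiply stays in Float32
eltype(y)       # Float64 — the activation has promoted the result
```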

    This can occur sneakily, because type promotion can be triggered simply by interacting with a numeric literal. E.g. the following runs into the same problem as above:

    leaky_tanh(x) = 0.01*x + tanh(x)

    While one could change the activation function (e.g. to use 0.01f0*x), the idiomatic (and safe) way to avoid type casts whenever the input type changes is to use oftype:

    leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)
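To check that this version preserves the input's type, a small self-contained sketch:

```julia
leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)

# oftype(x/1, 0.01) converts the literal to the (floating-point) type of x,
# so the whole expression keeps the input's precision:
leaky_tanh(0.5f0) isa Float32   # Float32 in, Float32 out
leaky_tanh(0.5)   isa Float64   # Float64 in, Float64 out
```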

    Evaluate batches as Matrices of features

    While it can sometimes be tempting to process your observations (feature vectors) one at a time, e.g.

    function loss_total(xs::AbstractVector{<:Vector}, ys::AbstractVector{<:Vector})
    +    sum(zip(xs, ys)) do (x, y_target)
    +        y_pred = model(x)  # evaluate the model
    +        return loss(y_pred, y_target)
    +    end
    +end

    It is much faster to concatenate them into a matrix, as this will hit BLAS matrix-matrix multiplication, which is considerably faster than the equivalent sequence of matrix-vector multiplications. The improvement is large enough that it is worthwhile allocating new memory to store the observations contiguously.

    x_batch = reduce(hcat, xs)
    +y_batch = reduce(hcat, ys)
    +...
    +function loss_total(x_batch::Matrix, y_batch::Matrix)
    +    y_preds = model(x_batch)
    +    sum(loss.(y_preds, y_batch))
    +end

    When doing this kind of concatenation use reduce(hcat, xs) rather than hcat(xs...). This will avoid the splatting penalty, and will hit the optimised reduce method.
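For instance (Base Julia only), both forms produce the same matrix, but reduce(hcat, xs) dispatches to an optimised method instead of splatting:

```julia
xs = [rand(Float32, 5) for _ in 1:64]   # 64 feature vectors of length 5

x_batch = reduce(hcat, xs)
size(x_batch)              # (5, 64): one column per observation
x_batch == hcat(xs...)     # same result, without the splatting penalty
```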

    diff --git a/previews/PR2365/saving/index.html b/previews/PR2365/saving/index.html new file mode 100644 index 0000000000..45eddb8568 --- /dev/null +++ b/previews/PR2365/saving/index.html @@ -0,0 +1,62 @@ + +Saving & Loading · Flux

    Saving and Loading Models

    You may wish to save models so that they can be loaded and run in a later session. Flux provides a number of ways to do this. The recommended way, and the most robust for long-term storage, is to use Flux.state in combination with a serialization format such as JLD2.jl or BSON.jl.

    Save a model:

    julia> using Flux
    +
    +julia> struct MyModel
    +           net
    +       end
    +
    +julia> Flux.@functor MyModel
    +
    +julia> MyModel() = MyModel(Chain(Dense(10, 5, relu), Dense(5, 2)));
    +
    +julia> model = MyModel()
    +MyModel(Chain(Dense(10 => 5, relu), Dense(5 => 2)))
    +
    +julia> model_state = Flux.state(model);
    +
    +julia> using JLD2
    +
    +julia> jldsave("mymodel.jld2"; model_state)

    Load it again in a new session using Flux.loadmodel!:

    julia> using Flux, JLD2
    +
    +julia> model_state = JLD2.load("mymodel.jld2", "model_state");
    +
    +julia> model = MyModel(); # MyModel definition must be available
    +
    +julia> Flux.loadmodel!(model, model_state);
    Note

    If a saved model's parameters are stored on the GPU, the model will not load later on if there is no GPU support available. It's best to move your model to the CPU with cpu(model) before saving it.

    Checkpointing

    In longer training runs it's a good idea to periodically save your model, so that you can resume if training is interrupted (for example, if there's a power cut).

    julia> using Flux: throttle
    +
    +julia> using JLD2
    +
    +julia> m = Chain(Dense(10 => 5, relu), Dense(5 => 2))
    +Chain(
    +  Dense(10 => 5, relu),                 # 55 parameters
    +  Dense(5 => 2),                        # 12 parameters
    +)                   # Total: 4 arrays, 67 parameters, 524 bytes.
    +
    +julia> for epoch in 1:10
    +          # ... train model ...
    +          jldsave("model-checkpoint.jld2", model_state = Flux.state(m))
    +       end;

    This will overwrite "model-checkpoint.jld2" every epoch.

    You can get more advanced by saving a series of models throughout training, for example (using now() from the Dates standard library)

    jldsave("model-$(now()).jld2", model_state = Flux.state(m))

    will produce a series of models like "model-2018-03-06T02:57:10.41.jld2". You could also store the current test set loss, so that it's easy to (for example) revert to an older copy of the model if it starts to overfit.

    jldsave("model-$(now()).jld2", model_state = Flux.state(m), loss = testloss())

    Note that to resume a model's training, you might need to restore other stateful parts of your training loop. Possible examples are the optimiser state and the randomness used to partition the original data into the training and validation sets.

    You can store the optimiser state alongside the model, to resume training exactly where you left off:

    model = MyModel()
    +opt_state = Flux.setup(AdamW(), model)
    +
    +# ... train model ...
    +
    +model_state = Flux.state(model)
    +jldsave("checkpoint_epoch=42.jld2"; model_state, opt_state)

    Saving Models as Julia Structs

    Models are just normal Julia structs, so it's fine to use any Julia storage format to save the struct as-is, instead of saving the state returned by Flux.state. BSON.jl is particularly convenient for this, since it can also save anonymous functions, which are sometimes part of a model definition.

    Save a model:

    julia> using Flux
    +
    +julia> model = Chain(Dense(10, 5, NNlib.relu), Dense(5, 2));
    +
    +julia> using BSON: @save
    +
    +julia> @save "mymodel.bson" model

    Load it again in a new session:

    julia> using Flux, BSON
    +
    +julia> BSON.@load "mymodel.bson" model
    +
    +julia> model
    +Chain(
    +  Dense(10 => 5, relu),                 # 55 parameters
    +  Dense(5 => 2),                        # 12 parameters
    +)                   # Total: 4 arrays, 67 parameters, 524 bytes.
    Warning

    Saving models this way could lead to compatibility issues across Julia versions and across Flux versions if the internals of Flux layers change. It is therefore not recommended for long-term storage; use Flux.state instead.

    Warning

    Previous versions of Flux suggested saving only the model weights using @save "mymodel.bson" params(model). This is now strongly discouraged: saving models this way stores only the trainable parameters, which will result in incorrect behavior for layers like BatchNorm.

    diff --git a/previews/PR2365/search/index.html b/previews/PR2365/search/index.html new file mode 100644 index 0000000000..0235e16a49 --- /dev/null +++ b/previews/PR2365/search/index.html @@ -0,0 +1,6 @@ + +Search · Flux diff --git a/previews/PR2365/search_index.js b/previews/PR2365/search_index.js new file mode 100644 index 0000000000..be846ffe69 --- /dev/null +++ b/previews/PR2365/search_index.js @@ -0,0 +1,3 @@ +var documenterSearchIndex = {"docs": +[{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"CurrentModule = Flux","category":"page"},{"location":"training/optimisers/#man-optimisers","page":"Optimisation Rules","title":"Optimisation Rules","text":"","category":"section"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Flux builds in many optimisation rules for use with train! and other training functions.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"The mechanism by which these work is gradually being replaced as part of the change from \"implicit\" dictionary-based to \"explicit\" tree-like structures. 
At present, the same struct (such as Adam) can be used with either form, and will be automatically translated.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"For full details of how the new interface works, see the Optimisers.jl documentation.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"For full details on how the old \"implicit\" interface worked, see the Flux 0.13.6 manual.","category":"page"},{"location":"training/optimisers/#Optimiser-Reference","page":"Optimisation Rules","title":"Optimiser Reference","text":"","category":"section"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"All optimisers return an object that, when passed to train!, will update the parameters passed to it.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Descent\nMomentum\nNesterov\nRMSProp\nAdam\nRAdam\nAdaMax\nAdaGrad\nAdaDelta\nAMSGrad\nNAdam\nAdamW\nOAdam\nAdaBelief","category":"page"},{"location":"training/optimisers/#Flux.Optimise.Descent","page":"Optimisation Rules","title":"Flux.Optimise.Descent","text":"Descent(η = 0.1)\n\nClassic gradient descent optimiser with learning rate η. 
For each parameter p and its gradient δp, this runs p -= η*δp\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\n\nExamples\n\nopt = Descent()\n\nopt = Descent(0.3)\n\nps = Flux.params(model)\n\ngs = gradient(ps) do\n loss(x, y)\nend\n\nFlux.Optimise.update!(opt, ps, gs)\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.Momentum","page":"Optimisation Rules","title":"Flux.Optimise.Momentum","text":"Momentum(η = 0.01, ρ = 0.9)\n\nGradient descent optimiser with learning rate η and momentum ρ.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nMomentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.\n\nExamples\n\nopt = Momentum()\n\nopt = Momentum(0.01, 0.99)\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.Nesterov","page":"Optimisation Rules","title":"Flux.Optimise.Nesterov","text":"Nesterov(η = 0.001, ρ = 0.9)\n\nGradient descent optimiser with learning rate η and Nesterov momentum ρ.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nNesterov momentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.\n\nExamples\n\nopt = Nesterov()\n\nopt = Nesterov(0.003, 0.95)\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.RMSProp","page":"Optimisation Rules","title":"Flux.Optimise.RMSProp","text":"RMSProp(η = 0.001, ρ = 0.9, ϵ = 1.0e-8)\n\nOptimizer using the RMSProp algorithm. Often a good choice for recurrent networks. 
Parameters other than learning rate generally don't need tuning.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nMomentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.\n\nExamples\n\nopt = RMSProp()\n\nopt = RMSProp(0.002, 0.95)\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.Adam","page":"Optimisation Rules","title":"Flux.Optimise.Adam","text":"Adam(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)\n\nAdam optimiser.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = Adam()\n\nopt = Adam(0.001, (0.9, 0.8))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.RAdam","page":"Optimisation Rules","title":"Flux.Optimise.RAdam","text":"RAdam(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)\n\nRectified Adam optimiser.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = RAdam()\n\nopt = RAdam(0.001, (0.9, 0.8))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.AdaMax","page":"Optimisation Rules","title":"Flux.Optimise.AdaMax","text":"AdaMax(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)\n\nAdaMax is a variant of Adam based on the ∞-norm.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = AdaMax()\n\nopt = AdaMax(0.001, (0.9, 
0.995))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.AdaGrad","page":"Optimisation Rules","title":"Flux.Optimise.AdaGrad","text":"AdaGrad(η = 0.1, ϵ = 1.0e-8)\n\nAdaGrad optimiser. It has parameter specific learning rates based on how frequently it is updated. Parameters don't need tuning.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\n\nExamples\n\nopt = AdaGrad()\n\nopt = AdaGrad(0.001)\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.AdaDelta","page":"Optimisation Rules","title":"Flux.Optimise.AdaDelta","text":"AdaDelta(ρ = 0.9, ϵ = 1.0e-8)\n\nAdaDelta is a version of AdaGrad adapting its learning rate based on a window of past gradient updates. Parameters don't need tuning.\n\nParameters\n\nRho (ρ): Factor by which the gradient is decayed at each time step.\n\nExamples\n\nopt = AdaDelta()\n\nopt = AdaDelta(0.89)\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.AMSGrad","page":"Optimisation Rules","title":"Flux.Optimise.AMSGrad","text":"AMSGrad(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)\n\nThe AMSGrad version of the Adam optimiser. Parameters don't need tuning.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = AMSGrad()\n\nopt = AMSGrad(0.001, (0.89, 0.995))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.NAdam","page":"Optimisation Rules","title":"Flux.Optimise.NAdam","text":"NAdam(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)\n\nNAdam is a Nesterov variant of Adam. 
Parameters don't need tuning.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = NAdam()\n\nopt = NAdam(0.002, (0.89, 0.995))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.AdamW","page":"Optimisation Rules","title":"Flux.Optimise.AdamW","text":"AdamW(η = 0.001, β::Tuple = (0.9, 0.999), decay = 0)\n\nAdamW is a variant of Adam fixing (as in repairing) its weight decay regularization.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\ndecay: Decay applied to weights during optimisation.\n\nExamples\n\nopt = AdamW()\n\nopt = AdamW(0.001, (0.89, 0.995), 0.1)\n\n\n\n\n\n","category":"function"},{"location":"training/optimisers/#Flux.Optimise.OAdam","page":"Optimisation Rules","title":"Flux.Optimise.OAdam","text":"OAdam(η = 0.0001, β::Tuple = (0.5, 0.9), ϵ = 1.0e-8)\n\nOAdam (Optimistic Adam) is a variant of Adam adding an \"optimistic\" term suitable for adversarial training.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = OAdam()\n\nopt = OAdam(0.001, (0.9, 0.995))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.AdaBelief","page":"Optimisation Rules","title":"Flux.Optimise.AdaBelief","text":"AdaBelief(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)\n\nThe AdaBelief optimiser is a variant of the well-known Adam optimiser.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\nDecay of momentums (β::Tuple): Exponential decay for 
the first (β1) and the second (β2) momentum estimate.\n\nExamples\n\nopt = AdaBelief()\n\nopt = AdaBelief(0.001, (0.9, 0.8))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Composing-Optimisers","page":"Optimisation Rules","title":"Composing Optimisers","text":"","category":"section"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Flux defines a special kind of optimiser simply called Optimiser which takes in arbitrary optimisers as input. Its behaviour is similar to the usual optimisers, but differs in that it acts by calling the optimisers listed in it sequentially. Each optimiser produces a modified gradient that will be fed into the next, and the resultant update will be applied to the parameter as usual. A classic use case is where adding decays is desirable. Flux defines some basic decays including ExpDecay, InvDecay etc.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"opt = Optimiser(ExpDecay(1, 0.1, 1000, 1e-4), Descent())","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Here we apply exponential decay to the Descent optimiser. The defaults of ExpDecay say that its learning rate will be decayed every 1000 steps. 
It is then applied like any optimiser.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"w = randn(10, 10)\nw1 = randn(10,10)\nps = Params([w, w1])\n\nloss(x) = Flux.Losses.mse(w * x, w1 * x)\n\nloss(rand(10)) # around 9\n\nfor t = 1:10^5\n θ = Params([w, w1])\n θ̄ = gradient(() -> loss(rand(10)), θ)\n Flux.Optimise.update!(opt, θ, θ̄)\nend\n\nloss(rand(10)) # around 0.9","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"It is possible to compose optimisers for some added flexibility.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Flux.Optimise.Optimiser","category":"page"},{"location":"training/optimisers/#Flux.Optimise.Optimiser","page":"Optimisation Rules","title":"Flux.Optimise.Optimiser","text":"Optimiser(a, b, c...)\n\nCombine several optimisers into one; each optimiser produces a modified gradient that will be fed into the next, and this is finally applied to the parameter as usual.\n\nnote: Note\nThis will be replaced by Optimisers.OptimiserChain in Flux 0.15.\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Scheduling-Optimisers","page":"Optimisation Rules","title":"Scheduling Optimisers","text":"","category":"section"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"In practice, it is fairly common to schedule the learning rate of an optimiser to obtain faster convergence. There are a variety of popular scheduling policies, and you can find implementations of them in ParameterSchedulers.jl. The documentation for ParameterSchedulers.jl provides a more detailed overview of the different scheduling policies, and how to use them with Flux optimisers. 
Below, we provide a brief snippet illustrating a cosine annealing schedule with a momentum optimiser.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"First, we import ParameterSchedulers.jl and initialize a cosine annealing schedule to vary the learning rate between 1e-4 and 1e-2 every 10 steps. We also create a new Momentum optimiser.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"using ParameterSchedulers\n\nopt = Momentum()\nschedule = Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10)\nfor (eta, epoch) in zip(schedule, 1:100)\n opt.eta = eta\n # your training code here\nend","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"schedule can also be indexed (e.g. schedule(100)) or iterated like any iterator in Julia.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"ParameterSchedulers.jl schedules are stateless (they don't store their iteration state). If you want a stateful schedule, you can use ParameterSchedulers.Stateful:","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"using ParameterSchedulers: Stateful, next!\n\nschedule = Stateful(Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10))\nfor epoch in 1:100\n opt.eta = next!(schedule)\n # your training code here\nend","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"ParameterSchedulers.jl allows for many more scheduling policies including arbitrary functions, looping any function with a given period, or sequences of many schedules. 
See the ParameterSchedulers.jl documentation for more info.","category":"page"},{"location":"training/optimisers/#Decays","page":"Optimisation Rules","title":"Decays","text":"","category":"section"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Similar to optimisers, Flux also defines some simple decays that can be used in conjunction with other optimisers, or standalone.","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"ExpDecay\nInvDecay\nWeightDecay","category":"page"},{"location":"training/optimisers/#Flux.Optimise.ExpDecay","page":"Optimisation Rules","title":"Flux.Optimise.ExpDecay","text":"ExpDecay(η = 0.001, decay = 0.1, decay_step = 1000, clip = 1e-4, start = 1)\n\nDiscount the learning rate η by the factor decay every decay_step steps till a minimum of clip.\n\nParameters\n\nLearning rate (η): Amount by which gradients are discounted before updating the weights.\ndecay: Factor by which the learning rate is discounted.\ndecay_step: Schedule decay operations by setting the number of steps between two decay operations.\nclip: Minimum value of learning rate.\n'start': Step at which the decay starts.\n\nSee also the Scheduling Optimisers section of the docs for more general scheduling techniques.\n\nExamples\n\nExpDecay is typically composed with other optimisers as the last transformation of the gradient:\n\nopt = Optimiser(Adam(), ExpDecay(1.0))\n\nNote: you may want to start with η=1 in ExpDecay when combined with other optimisers (Adam in this case) that have their own learning rate.\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.InvDecay","page":"Optimisation Rules","title":"Flux.Optimise.InvDecay","text":"InvDecay(γ = 0.001)\n\nApply inverse time decay to an optimiser, so that the effective step size at iteration n is eta / (1 + γ * n) where eta is the initial step size. 
The wrapped optimiser's step size is not modified.\n\nSee also the Scheduling Optimisers section of the docs for more general scheduling techniques.\n\nExamples\n\nInvDecay is typically composed with other optimisers as the last transformation of the gradient:\n\n# Inverse decay of the learning rate\n# with starting value 0.001 and decay coefficient 0.01.\nopt = Optimiser(Adam(1f-3), InvDecay(1f-2))\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.WeightDecay","page":"Optimisation Rules","title":"Flux.Optimise.WeightDecay","text":"WeightDecay(λ = 0)\n\nDecay weights by λ. Typically composed with other optimisers as the first transformation to the gradient, making it equivalent to adding L_2 regularization with coefficient λ to the loss.\n\nExamples\n\nopt = Optimiser(WeightDecay(1f-4), Adam())\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Gradient-Clipping","page":"Optimisation Rules","title":"Gradient Clipping","text":"","category":"section"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"Gradient clipping is useful for training recurrent neural networks, which have a tendency to suffer from the exploding gradient problem. 
An example usage is","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"opt = Optimiser(ClipValue(1e-3), Adam(1e-3))","category":"page"},{"location":"training/optimisers/","page":"Optimisation Rules","title":"Optimisation Rules","text":"ClipValue\nClipNorm","category":"page"},{"location":"training/optimisers/#Flux.Optimise.ClipValue","page":"Optimisation Rules","title":"Flux.Optimise.ClipValue","text":"ClipValue(thresh)\n\nClip gradients when their absolute value exceeds thresh.\n\nnote: Note\nThis will be replaced by Optimisers.ClipGrad in Flux 0.15.\n\n\n\n\n\n","category":"type"},{"location":"training/optimisers/#Flux.Optimise.ClipNorm","page":"Optimisation Rules","title":"Flux.Optimise.ClipNorm","text":"ClipNorm(thresh)\n\nClip gradients when their L2 norm exceeds thresh.\n\n\n\n\n\n","category":"type"},{"location":"tutorials/logistic_regression/#Logistic-Regression","page":"Logistic Regression","title":"Logistic Regression","text":"","category":"section"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The following page contains a step-by-step walkthrough of the logistic regression algorithm in Julia using Flux. We will then create a simple logistic regression model without any usage of Flux and compare the different working parts with Flux's implementation. 
","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's start by importing the required Julia packages.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> using Flux, Statistics, MLDatasets, DataFrames, OneHotArrays","category":"page"},{"location":"tutorials/logistic_regression/#Dataset","page":"Logistic Regression","title":"Dataset","text":"","category":"section"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's start by importing a dataset from MLDatasets.jl. We will use the Iris dataset that contains the data of three different Iris species. The data consists of 150 data points (xs), each having four features. Each of these x is mapped to y, the name of a particular Iris specie. The following code will download the Iris dataset when run for the first time.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> Iris()\ndataset Iris:\n metadata => Dict{String, Any} with 4 entries\n features => 150×4 DataFrame\n targets => 150×1 DataFrame\n dataframe => 150×5 DataFrame\n\njulia> x, y = Iris(as_df=false)[:];","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's have a look at our dataset -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> y\n1×150 Matrix{InlineStrings.String15}:\n \"Iris-setosa\" \"Iris-setosa\" … \"Iris-virginica\" \"Iris-virginica\"\n\njulia> x |> summary\n\"4×150 Matrix{Float64}\"","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The y values here corresponds to a type of iris 
plant, with a total of 150 data points. The x values depict the sepal length, sepal width, petal length, and petal width (all in cm) of 150 iris plant (hence the matrix size 4×150). Different type of iris plants have different lengths and widths of sepals and petals associated with them, and there is a definitive pattern for this in nature. We can leverage this to train a simple classifier that outputs the type of iris plant using the length and width of sepals and petals as inputs.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Our next step would be to convert this data into a form that can be fed to a machine learning model. The x values are arranged in a matrix and should ideally be converted to Float32 type (see Performance tips), but the labels must be one hot encoded. Here is a great discourse thread on different techniques that can be used to one hot encode data with or without using any external Julia package.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> x = Float32.(x);\n\njulia> y = vec(y);\n\njulia> custom_y_onehot = unique(y) .== permutedims(y)\n3×150 BitMatrix:\n 1 1 1 1 1 1 1 1 1 1 1 1 1 … 0 0 0 0 0 0 0 0 0 0 0 0\n 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"This same operation can also be performed using OneHotArrays' onehotbatch function. 
We will use both of these outputs in parallel to show how intuitive FluxML is!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> const classes = [\"Iris-setosa\", \"Iris-versicolor\", \"Iris-virginica\"];\n\njulia> flux_y_onehot = onehotbatch(y, classes)\n3×150 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n 1 1 1 1 1 1 1 1 1 1 1 1 1 … ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅\n ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅\n ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ 1 1 1 1 1 1 1 1 1 1 1 1","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Our data is ready. The next step is to build a classifier for it.","category":"page"},{"location":"tutorials/logistic_regression/#Building-a-model","page":"Logistic Regression","title":"Building a model","text":"","category":"section"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"A logistic regression model is defined mathematically as -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"model(x) = σ(Wx + b)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"where W is the weight matrix, b is the bias vector, and σ is any activation function. 
For our case, let's use the softmax activation function as we will be performing a multiclass classification task.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> m(W, b, x) = W*x .+ b\nm (generic function with 1 method)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Note that this model lacks an activation function, but we will come back to that.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We can now move ahead to initialize the parameters of our model. Given that our model has four inputs (4 features in every data point), and three outputs (3 different classes), the parameters can be initialized in the following way -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> W = rand(Float32, 3, 4);\n\njulia> b = [0.0f0, 0.0f0, 0.0f0];","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Now our model can take in the complete dataset and predict the class of each x in one go. But, we need to ensure that our model outputs the probabilities of an input belonging to the respective classes. As our model has three outputs, each would denote the probability of the input belonging to a particular class.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We will use an activation function to map our outputs to a probability value. 
It would make sense to use a softmax activation function here, which is defined mathematically as -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"σ(z)ᵢ = e^(zᵢ) / ∑_{j=1}^{k} e^(zⱼ)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The softmax function maps the outputs to probability values such that each column of outputs sums to 1. Let's implement this in Julia.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_softmax(x) = exp.(x) ./ sum(exp.(x), dims=1)\ncustom_softmax (generic function with 1 method)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The implementation looks straightforward enough! Note that we specify dims=1 in the sum function to calculate the sum of probabilities in each column. 
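As an aside, exp can overflow for very large inputs. A common safeguard (NNlib's softmax guards against overflow in a similar way; the function below is only an illustrative sketch, not part of the tutorial) is to subtract the column-wise maximum before exponentiating, which leaves the result unchanged:

```julia
# Sketch: a numerically safer softmax.
# Subtracting the per-column maximum does not change the result,
# but keeps exp from overflowing for large inputs.
function stable_softmax(x)
    m = maximum(x; dims=1)        # per-column maximum
    e = exp.(x .- m)              # shifted exponentials, all ≤ 1
    return e ./ sum(e; dims=1)    # each column sums to 1
end
```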
Remember, we will have a 3×150 matrix (predicted ys) as the output of our model, where each column would be an output of a corresponding input.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's combine this softmax function with our model to construct the complete custom_model.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_model(W, b, x) = m(W, b, x) |> custom_softmax\ncustom_model (generic function with 1 method)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's check if our model works.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_model(W, b, x) |> size\n(3, 150)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"It works! Let's check if the softmax function is working.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> all(0 .<= custom_model(W, b, x) .<= 1)\ntrue\n\njulia> sum(custom_model(W, b, x), dims=1)\n1×150 Matrix{Float32}:\n 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 … 1.0 1.0 1.0 1.0 1.0 1.0 1.0","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Every output value is between 0 and 1, and every column adds to 1!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's convert our custom_model to a Flux model. 
Flux provides the users with a very elegant API that almost feels like writing your own code!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Note that all the flux_* variables in this tutorial are general, that is, they can be used as they are with some other similar-looking dataset, while the custom_* variables will remain specific to this tutorial.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> flux_model = Chain(Dense(4 => 3), softmax)\nChain(\n Dense(4 => 3), # 15 parameters\n NNlib.softmax,\n)","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"A Dense(4 => 3) layer denotes a layer with four inputs (four features in every data point) and three outputs (three classes or labels). This layer is the same as the mathematical model defined by us above. Under the hood, Flux calculates the output using the same expression, but this time we don't have to initialize the parameters ourselves; Flux does it for us.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The softmax function used here is provided by NNlib.jl and re-exported by Flux. 
Lastly, Flux provides users with a Chain struct which makes stacking layers seamless.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"A model's weights and biases can be accessed as follows -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> flux_model[1].weight, flux_model[1].bias\n(Float32[0.78588694 -0.45968163 -0.77409476 0.2358028; -0.9049773 -0.58643705 0.466441 -0.79523873; 0.82426906 0.4143493 0.7630932 0.020588955], Float32[0.0, 0.0, 0.0])","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We can now pass the complete data in one go, with each data point having four features (four inputs)!","category":"page"},{"location":"tutorials/logistic_regression/#Loss-and-accuracy","page":"Logistic Regression","title":"Loss and accuracy","text":"","category":"section"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Our next step should be to define some quantitative values for our model, which we will maximize or minimize during the complete training procedure. 
These values will be the loss function and the accuracy metric.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's start by defining a loss function, a logitcrossentropy function.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_logitcrossentropy(ŷ, y) = mean(.-sum(y .* logsoftmax(ŷ; dims = 1); dims = 1));","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Now we can wrap the custom_logitcrossentropy inside a function that takes in the model parameters, xs, and ys, and returns the loss value.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> function custom_loss(W, b, x, y)\n ŷ = custom_model(W, b, x)\n custom_logitcrossentropy(ŷ, y)\n end;\n\njulia> custom_loss(W, b, x, custom_y_onehot)\n1.1714406827505623","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The loss function works!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Flux provides us with many minimal yet elegant loss functions. In fact, the custom_logitcrossentropy defined above has been taken directly from Flux. 
The loss functions provided by Flux include sanity checks, ensure efficient performance, and behave well with the overall FluxML ecosystem.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> function flux_loss(flux_model, x, y)\n ŷ = flux_model(x)\n Flux.logitcrossentropy(ŷ, y)\n end;\n\njulia> flux_loss(flux_model, x, flux_y_onehot)\n1.2156688659673647","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Next, let's define an accuracy function, which we will try to maximize during our training procedure. Before jumping to accuracy, let's define a onecold function. The onecold function converts our output, which, remember, consists of probability values, to the actual class names.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We can divide this task into two parts -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Identify the index of the maximum element of each column in the output matrix\nConvert this index to a class name","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The maximum index should be calculated along the columns (remember, each column is the output of a single x data point). 
We can use Julia's argmax function to achieve this.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> argmax(custom_y_onehot, dims=1) # calculate the cartesian index of max element column-wise\n1×150 Matrix{CartesianIndex{2}}:\n CartesianIndex(1, 1) CartesianIndex(1, 2) … CartesianIndex(3, 150)\n\njulia> max_idx = [x[1] for x in argmax(custom_y_onehot; dims=1)]\n1×150 Matrix{Int64}:\n 1 1 1 1 1 1 1 1 1 1 1 1 1 … 3 3 3 3 3 3 3 3 3 3 3 3","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Now we can write a function that calculates the indices of the maximum element in each column, and maps them to a class name.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> function custom_onecold(custom_y_onehot)\n max_idx = [x[1] for x in argmax(custom_y_onehot; dims=1)]\n vec(classes[max_idx])\n end;\n\njulia> custom_onecold(custom_y_onehot)\n150-element Vector{String}:\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n \"Iris-setosa\"\n ⋮\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"\n \"Iris-virginica\"","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"It works!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Flux provides users with the onecold function so that we don't have to write it on our own. 
Let's see how our custom_onecold function compares to Flux.onecold.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> istrue = Flux.onecold(flux_y_onehot, classes) .== custom_onecold(custom_y_onehot);\n\njulia> all(istrue)\ntrue","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Both the functions act identically!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We now move to the accuracy metric and run it with the untrained custom_model.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_accuracy(W, b, x, y) = mean(custom_onecold(custom_model(W, b, x)) .== y);\n\njulia> custom_accuracy(W, b, x, y)\n0.3333333333333333","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We could also have used Flux's built-in functionality to define this accuracy function.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> flux_accuracy(x, y) = mean(Flux.onecold(flux_model(x), classes) .== y);\n\njulia> flux_accuracy(x, y)\n0.24","category":"page"},{"location":"tutorials/logistic_regression/#Training-the-model","page":"Logistic Regression","title":"Training the model","text":"","category":"section"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Let's train our model using the classic Gradient Descent algorithm. 
According to the gradient descent algorithm, the weights and biases should be iteratively updated using the following mathematical equations -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"W = W - η * ∂L/∂W\nb = b - η * ∂L/∂b","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Here, W is the weight matrix, b is the bias vector, η is the learning rate, ∂L/∂W is the derivative of the loss function with respect to the weights, and ∂L/∂b is the derivative of the loss function with respect to the bias.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The derivatives are calculated using an Automatic Differentiation tool; Flux uses Zygote.jl for this. Since Zygote.jl is an independent Julia package, it can be used outside of Flux as well! Refer to the documentation of Zygote.jl for more information.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Our first step would be to obtain the gradient of the loss function with respect to the weights and the biases. Flux re-exports Zygote's gradient function; hence, we don't need to import Zygote explicitly to use the functionality. gradient takes in a function and its arguments, and returns a tuple containing ∂f/∂x for each argument x. Let's pass in custom_loss and the arguments required by custom_loss to gradient. 
We will require the derivatives of the loss function (custom_loss) with respect to the weights (∂f/∂w) and the bias (∂f/∂b) to carry out gradient descent, but we can ignore the partial derivatives of the loss function (custom_loss) with respect to x (∂f/∂x) and one hot encoded y (∂f/∂y).","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, custom_y_onehot);","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We can now update the parameters, following the gradient descent algorithm -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> W .= W .- 0.1 .* dLdW;\n\njulia> b .= b .- 0.1 .* dLdb;","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The parameters have been updated! We can now check the value of our custom loss function -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_loss(W, b, x, custom_y_onehot)\n1.164742997664842","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"The loss went down! 
Let's plug our super training logic inside a function.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> function train_custom_model()\n dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, custom_y_onehot)\n W .= W .- 0.1 .* dLdW\n b .= b .- 0.1 .* dLdb\n end;","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We can plug the training function inside a loop and train the model for more epochs. The loop can be tailored to suit the user's needs, and the conditions can be specified in plain Julia. Here we will train the model for a maximum of 500 epochs, but to ensure that the model does not overfit, we will break as soon as our accuracy value crosses or becomes equal to 0.98.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> for i = 1:500\n train_custom_model();\n custom_accuracy(W, b, x, y) >= 0.98 && break\n end\n \njulia> @show custom_accuracy(W, b, x, y);\ncustom_accuracy(W, b, x, y) = 0.98","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Everything works! Our model achieved an accuracy of 0.98! Let's have a look at the loss.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> custom_loss(W, b, x, custom_y_onehot)\n0.6520349798243569","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"As expected, the loss went down too! 
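Before moving on, note that in practice this kind of manual update loop is usually written with Flux's optimiser API instead. A sketch (assuming a Flux version providing the explicit-style Flux.setup and Flux.update!; Descent(0.1) matches our hand-written 0.1 learning rate):

```julia
# Sketch: the same gradient-descent training, via Flux's optimiser API.
opt_state = Flux.setup(Descent(0.1), flux_model)   # optimiser state for each parameter
for epoch in 1:500
    grads = Flux.gradient(m -> flux_loss(m, x, flux_y_onehot), flux_model)
    Flux.update!(opt_state, flux_model, grads[1])  # in-place descent step
    flux_accuracy(x, y) >= 0.98 && break
end
```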
Now, let's repeat the same steps with our flux_model.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We can write a similar-looking training loop for our flux_model and train it similarly.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> flux_loss(flux_model, x, flux_y_onehot)\n1.215731131385928\n\njulia> function train_flux_model()\n dLdm, _, _ = gradient(flux_loss, flux_model, x, flux_y_onehot)\n @. flux_model[1].weight = flux_model[1].weight - 0.1 * dLdm[:layers][1][:weight]\n @. flux_model[1].bias = flux_model[1].bias - 0.1 * dLdm[:layers][1][:bias]\n end;\n\njulia> for i = 1:500\n train_flux_model();\n flux_accuracy(x, y) >= 0.98 && break\n end","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Looking at the accuracy and loss value -","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"julia> @show flux_accuracy(x, y);\nflux_accuracy(x, y) = 0.98\n\njulia> flux_loss(flux_model, x, flux_y_onehot)\n0.6952386604624324","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"We see a very similar final loss and accuracy.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Summarising this tutorial, we saw how we can run a logistic regression algorithm in Julia with and without using Flux. We started by importing the classic Iris dataset, and one hot encoded the labels. 
Next, we defined our model, the loss function, and the accuracy, all by ourselves.","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"Finally, we trained the model by manually writing down the Gradient Descent algorithm and optimising the loss. Interestingly, we implemented most of the functions on our own, and then compared them, side by side, with the functionality provided by Flux!","category":"page"},{"location":"tutorials/logistic_regression/","page":"Logistic Regression","title":"Logistic Regression","text":"info: Info\nOriginally published on 1st April 2023, by Saransh Chopra.","category":"page"},{"location":"training/zygote/#autodiff-zygote","page":"Gradients – Zygote.jl","title":"Automatic Differentiation using Zygote.jl","text":"","category":"section"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"Flux re-exports the gradient from Zygote, and uses this function within train! to differentiate the model. 
Zygote has its own documentation, in particular listing some important limitations.","category":"page"},{"location":"training/zygote/#Explicit-style","page":"Gradients – Zygote.jl","title":"Explicit style","text":"","category":"section"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"The preferred way of using Zygote, and the only way of using most other AD packages, is to explicitly provide a function and its arguments.","category":"page"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"Zygote.gradient(f, args...)\nZygote.withgradient(f, args...)\nZygote.jacobian(f, args...)\nZygote.withjacobian(f, args...)\nZygote.hessian\nZygote.hessian_reverse\nZygote.diaghessian","category":"page"},{"location":"training/zygote/#Zygote.gradient-Tuple{Any, Vararg{Any}}","page":"Gradients – Zygote.jl","title":"Zygote.gradient","text":"gradient(f, args...)\n\nReturns a tuple containing ∂f/∂x for each argument x, the derivative (for scalar x) or the gradient.\n\nf(args...) must be a real number, see jacobian for array output.\n\nSee also withgradient to keep the value f(args...), and pullback for value and back-propagator.\n\njulia> gradient(*, 2.0, 3.0, 5.0)\n(15.0, 10.0, 6.0)\n\njulia> gradient(x -> sum(abs2,x), [7.0, 11.0, 13.0])\n([14.0, 22.0, 26.0],)\n\njulia> gradient([7, 11], 0, 1) do x, y, d\n p = size(x, d)\n sum(x.^p .+ y)\n end\n([14.0, 22.0], 2.0, nothing)\n\n\n\n\n\n","category":"method"},{"location":"training/zygote/#Zygote.withgradient-Tuple{Any, Vararg{Any}}","page":"Gradients – Zygote.jl","title":"Zygote.withgradient","text":"withgradient(f, args...)\nwithgradient(f, ::Params)\n\nReturns both the value of the function and the gradient, as a named tuple. \n\njulia> y, ∇ = withgradient(/, 1, 2)\n(val = 0.5, grad = (0.5, -0.25))\n\njulia> ∇ == gradient(/, 1, 2)\ntrue\n\nAllows you to capture auxiliary outputs, in addition to the scalar used by gradient. 
To do this, f must return a Tuple or NamedTuple. Then it calculates grad = gradient(first∘f, args...) but returns the whole val = f(args...):\n\njulia> withgradient([1,2,4]) do x\n z = 1 ./ x\n sum(z), z # here z is an auxiliary output\n end\n(val = (1.75, [1.0, 0.5, 0.25]), grad = ([-1.0, -0.25, -0.0625],))\n\njulia> withgradient(3.0, 4.0) do x, y\n (div = x/y, mul = x*y)\n end\n(val = (div = 0.75, mul = 12.0), grad = (0.25, -0.1875))\n\nAlso supports implicit mode:\n\njulia> w = [3.0];\n\njulia> res = withgradient(() -> sum(abs2, w), Params([w]))\n(val = 9.0, grad = Grads(...))\n\njulia> res.grad[w]\n1-element Vector{Float64}:\n 6.0\n\n\n\n\n\n","category":"method"},{"location":"training/zygote/#Zygote.jacobian-Tuple{Any, Vararg{Any}}","page":"Gradients – Zygote.jl","title":"Zygote.jacobian","text":"jacobian(f, args...) -> Tuple\n\nFor each array a ∈ args this returns a matrix with Ja[k,i] = ∂y[k]/∂a[i] where y = f(args...) is usually a vector. Arrays of higher dimension are treated like vec(a), or vec(y) for output.\n\nFor scalar x::Number ∈ args, the result is a vector Jx[k] = ∂y[k]/∂x, while for scalar y all results have just one row.\n\nWith any other argument type, no result is produced, even if gradient would work.\n\nThis reverse-mode Jacobian needs to evaluate the pullback once for each element of y. 
Doing so is usually only efficient when length(y) is small compared to length(a), otherwise forward mode is likely to be better.\n\nSee also withjacobian, hessian, hessian_reverse.\n\nExamples\n\njulia> jacobian(a -> 100*a[1:3].^2, 1:7)[1] # first index (rows) is output\n3×7 Matrix{Int64}:\n 200 0 0 0 0 0 0\n 0 400 0 0 0 0 0\n 0 0 600 0 0 0 0\n\njulia> jacobian((a,x) -> a.^2 .* x, [1,2,3], 1) # scalar argument has vector jacobian\n([2 0 0; 0 4 0; 0 0 6], [1, 4, 9])\n\njulia> jacobian((a,d) -> prod(a, dims=d), [1 2; 3 4; 5 6], 2)\n([2 0 … 0 0; 0 4 … 3 0; 0 0 … 0 5], [0, 0, 0])\n\nwarning: Warning\nFor arguments of any type except Number & AbstractArray, the result is nothing.\n\njulia> jacobian((a,s) -> a.^length(s), [1,2,3], \"str\")\n([3 0 0; 0 12 0; 0 0 27], nothing)\n\njulia> jacobian((a,t) -> sum(a .* t[1]) + t[2], [1,2,3], (4,5))\n([4 4 4], nothing)\n\njulia> gradient((a,t) -> sum(a .* t[1]) + t[2], [1,2,3], (4,5)) # gradient understands the tuple\n([4 4 4], (6, 1))\n\n\n\n\n\n","category":"method"},{"location":"training/zygote/#Zygote.withjacobian-Tuple{Any, Vararg{Any}}","page":"Gradients – Zygote.jl","title":"Zygote.withjacobian","text":"withjacobian(f, args...)\n\nReturns both the value f(args...) and the jacobian as a named tuple.\n\njulia> withjacobian(cumsum, [1,2,3])\n(val = [1, 3, 6], grad = ([1 0 0; 1 1 0; 1 1 1],))\n\n\n\n\n\n","category":"method"},{"location":"training/zygote/#Zygote.hessian","page":"Gradients – Zygote.jl","title":"Zygote.hessian","text":"hessian(f, x)\n\nConstruct the Hessian ∂²f/∂x², where x is a real number or an array, and f(x) is a real number. When x is an array, the result is a matrix H[i,j] = ∂²f/∂x[i]∂x[j], using linear indexing x[i] even if the argument is higher-dimensional.\n\nThis uses forward over reverse, ForwardDiff over Zygote, calling hessian_dual(f, x). 
See hessian_reverse for an all-Zygote alternative.\n\nSee also diaghessian to compute only the diagonal part.\n\nExamples\n\njulia> hessian(x -> x[1]*x[2], randn(2))\n2×2 Matrix{Float64}:\n 0.0 1.0\n 1.0 0.0\n\njulia> hessian(x -> sum(x.^3), [1 2; 3 4]) # uses linear indexing of x\n4×4 Matrix{Int64}:\n 6 0 0 0\n 0 18 0 0\n 0 0 12 0\n 0 0 0 24\n\njulia> hessian(sin, pi/2)\n-1.0\n\n\n\n\n\n","category":"function"},{"location":"training/zygote/#Zygote.hessian_reverse","page":"Gradients – Zygote.jl","title":"Zygote.hessian_reverse","text":"hessian_reverse(f, x)\n\nThis should be equivalent to hessian(f, x), but implemented using reverse over reverse mode, all Zygote. (This is usually much slower, and more likely to find errors.)\n\n\n\n\n\n","category":"function"},{"location":"training/zygote/#Zygote.diaghessian","page":"Gradients – Zygote.jl","title":"Zygote.diaghessian","text":"diaghessian(f, args...) -> Tuple\n\nDiagonal part of the Hessian. Returns a tuple containing, for each argument x, h of the same shape with h[i] = Hᵢᵢ = ∂²y/∂x[i]∂x[i]. The original evaluation y = f(args...) must give a real number y.\n\nFor one vector argument x, this is equivalent to (diag(hessian(f,x)),). Like hessian it uses ForwardDiff over Zygote. 
\n\nwarning: Warning\nFor arguments of any type except Number & AbstractArray, the result is nothing.\n\nExamples\n\njulia> diaghessian(x -> sum(x.^3), [1 2; 3 4])[1]\n2×2 Matrix{Int64}:\n 6 12\n 18 24\n\njulia> Diagonal(vec(ans)) == hessian(x -> sum(x.^3), [1 2; 3 4]) # full Hessian is diagonal\ntrue\n\njulia> diaghessian((x,y) -> sum(x .* y .* y'), [1 22; 333 4], [0.5, 0.666]) # two array arguments\n([0.0 0.0; 0.0 0.0], [2.0, 8.0])\n\njulia> diaghessian(atan, 1, 2) # two scalar arguments\n(-0.16, 0.16)\n\njulia> hessian(xy -> atan(xy[1], xy[2]), [1, 2]) # full Hessian is not diagonal\n2×2 Matrix{Float64}:\n -0.16 -0.12\n -0.12 0.16\n\n\n\n\n\n","category":"function"},{"location":"training/zygote/#Implicit-style-(Flux-0.14)","page":"Gradients – Zygote.jl","title":"Implicit style (Flux ≤ 0.14)","text":"","category":"section"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"Flux used to use what Zygote calls \"implicit\" gradients, described here in its documentation. However, support for this will be removed from Flux 0.15.","category":"page"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"compat: Training\nThe blue-green boxes in the training section describe the changes needed to upgrade old code from implicit to explicit style.","category":"page"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"Zygote.gradient(loss, ::Params)\nZygote.Params\nZygote.Grads\nZygote.jacobian(loss, ::Params)","category":"page"},{"location":"training/zygote/#Zygote.gradient-Tuple{Any, Params}","page":"Gradients – Zygote.jl","title":"Zygote.gradient","text":"gradient(f, args...)\n\nReturns a tuple containing ∂f/∂x for each argument x, the derivative (for scalar x) or the gradient.\n\nf(args...) 
must be a real number, see jacobian for array output.\n\nSee also withgradient to keep the value f(args...), and pullback for value and back-propagator.\n\njulia> gradient(*, 2.0, 3.0, 5.0)\n(15.0, 10.0, 6.0)\n\njulia> gradient(x -> sum(abs2,x), [7.0, 11.0, 13.0])\n([14.0, 22.0, 26.0],)\n\njulia> gradient([7, 11], 0, 1) do x, y, d\n p = size(x, d)\n sum(x.^p .+ y)\n end\n([14.0, 22.0], 2.0, nothing)\n\n\n\n\n\n","category":"method"},{"location":"training/zygote/#Zygote.Params","page":"Gradients – Zygote.jl","title":"Zygote.Params","text":"Params([A, B])\n\nContainer for implicit parameters, used when differentiating a zero-argument function () -> loss(A, B) with respect to A, B.\n\n\n\n\n\n","category":"type"},{"location":"training/zygote/#Zygote.Grads","page":"Gradients – Zygote.jl","title":"Zygote.Grads","text":"Grads(...)\n\nDictionary-like container returned when taking gradients with respect to implicit parameters. For an array W, appearing within Params([W, A, B...]), the gradient is g[W].\n\n\n\n\n\n","category":"type"},{"location":"training/zygote/#Zygote.jacobian-Tuple{Any, Params}","page":"Gradients – Zygote.jl","title":"Zygote.jacobian","text":"jacobian(loss, ::Params)\n\nLike gradient with implicit parameters, this method takes a zero-argument function and returns an IdDict-like object, now containing the Jacobian for each parameter.\n\nExamples\n\njulia> xs = [1 2; 3 4]; ys = [5,7,9];\n\njulia> Jxy = jacobian(() -> ys[1:2] .+ sum(xs.^2), Params([xs, ys]))\nGrads(...)\n\njulia> Jxy[ys]\n2×3 Matrix{Int64}:\n 1 0 0\n 0 1 0\n\njulia> Jxy[xs]\n2×4 Matrix{Int64}:\n 2 6 4 8\n 2 6 4 8\n\n\n\n\n\n","category":"method"},{"location":"training/zygote/#ChainRules","page":"Gradients – Zygote.jl","title":"ChainRules","text":"","category":"section"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"Sometimes it is necessary to exclude some code, or a whole function, from automatic differentiation. 
This can be done using ChainRules:","category":"page"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"ChainRulesCore.ignore_derivatives\nChainRulesCore.@non_differentiable","category":"page"},{"location":"training/zygote/#ChainRulesCore.ignore_derivatives","page":"Gradients – Zygote.jl","title":"ChainRulesCore.ignore_derivatives","text":"ignore_derivatives(f::Function)\n\nTells the AD system to ignore the gradients of the wrapped closure. The primal computation (forward pass) is executed normally.\n\nignore_derivatives() do\n value = rand()\n push!(collection, value)\nend\n\nUsing this incorrectly could lead to incorrect gradients. For example, the following function will have zero gradients with respect to its argument:\n\nfunction wrong_grads(x)\n y = ones(3)\n ignore_derivatives() do\n push!(y, x)\n end\n return sum(y)\nend\n\n\n\n\n\nignore_derivatives(x)\n\nTells the AD system to ignore the gradients of the argument. Can be used to avoid unnecessary computation of gradients.\n\nignore_derivatives(x) * w\n\n\n\n\n\n","category":"function"},{"location":"training/zygote/#ChainRulesCore.@non_differentiable","page":"Gradients – Zygote.jl","title":"ChainRulesCore.@non_differentiable","text":"@non_differentiable(signature_expression)\n\nA helper to make it easier to declare that a method is not differentiable. This is a short-hand for defining an frule and rrule that return NoTangent() for all partials (even for the function s̄elf-partial itself)\n\nKeyword arguments should not be included.\n\njulia> @non_differentiable Base.:(==)(a, b)\n\njulia> _, pullback = rrule(==, 2.0, 3.0);\n\njulia> pullback(1.0)\n(NoTangent(), NoTangent(), NoTangent())\n\nYou can place type-constraints in the signature:\n\njulia> @non_differentiable Base.length(xs::Union{Number, Array})\n\njulia> frule((ZeroTangent(), 1), length, [2.0, 3.0])\n(2, NoTangent())\n\nwarning: Warning\nThis helper macro covers only the simple common cases. 
It does not support where-clauses. For these you can declare the rrule and frule directly.\n\n\n\n\n\n","category":"macro"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"To manually supply the gradient for one function, you should define a method of rrule. ChainRules has detailed documentation on how this works.","category":"page"},{"location":"training/zygote/","page":"Gradients – Zygote.jl","title":"Gradients – Zygote.jl","text":"ChainRulesCore.rrule\nChainRulesCore.frule\nChainRulesCore.@scalar_rule\nChainRulesCore.NoTangent\nChainRulesCore.ZeroTangent","category":"page"},{"location":"training/zygote/#ChainRulesCore.rrule","page":"Gradients – Zygote.jl","title":"ChainRulesCore.rrule","text":"rrule([::RuleConfig,] f, x...)\n\nExpressing x as the tuple (x₁, x₂, ...) and the output tuple of f(x...) as Ω, return the tuple:\n\n(Ω, (Ω̄₁, Ω̄₂, ...) -> (s̄elf, x̄₁, x̄₂, ...))\n\nWhere the second return value is the propagation rule or pullback. It takes in cotangents corresponding to the outputs (Ω̄₁, Ω̄₂, ...), and s̄elf, the internal values of the function itself (for closures).\n\nIf no method matching rrule(f, xs...) has been defined, then return nothing.\n\nExamples:\n\nunary input, unary output scalar function:\n\njulia> x = rand();\n\njulia> sinx, sin_pullback = rrule(sin, x);\n\njulia> sinx == sin(x)\ntrue\n\njulia> sin_pullback(1) == (NoTangent(), cos(x))\ntrue\n\nbinary input, unary output scalar function:\n\njulia> x, y = rand(2);\n\njulia> hypotxy, hypot_pullback = rrule(hypot, x, y);\n\njulia> hypotxy == hypot(x, y)\ntrue\n\njulia> hypot_pullback(1) == (NoTangent(), (x / hypot(x, y)), (y / hypot(x, y)))\ntrue\n\nThe optional RuleConfig option allows specifying rrules only for AD systems that support given features. If not needed, then it can be omitted and the rrule without it will be hit as a fallback. 
This is the case for most rules.\n\nSee also: frule, @scalar_rule, RuleConfig\n\n\n\n\n\n","category":"function"},{"location":"training/zygote/#ChainRulesCore.frule","page":"Gradients – Zygote.jl","title":"ChainRulesCore.frule","text":"frule([::RuleConfig,] (Δf, Δx...), f, x...)\n\nExpressing the output of f(x...) as Ω, return the tuple:\n\n(Ω, ΔΩ)\n\nThe second return value is the tangent w.r.t. the output.\n\nIf no method matching frule((Δf, Δx...), f, x...) has been defined, then return nothing.\n\nExamples:\n\nunary input, unary output scalar function:\n\njulia> dself = NoTangent();\n\njulia> x = rand()\n0.8236475079774124\n\njulia> sinx, Δsinx = frule((dself, 1), sin, x)\n(0.7336293678134624, 0.6795498147167869)\n\njulia> sinx == sin(x)\ntrue\n\njulia> Δsinx == cos(x)\ntrue\n\nUnary input, binary output scalar function:\n\njulia> sincosx, Δsincosx = frule((dself, 1), sincos, x);\n\njulia> sincosx == sincos(x)\ntrue\n\njulia> Δsincosx[1] == cos(x)\ntrue\n\njulia> Δsincosx[2] == -sin(x)\ntrue\n\nNote that technically speaking, Julia does not have multiple output functions, just functions that return a single output that is iterable, like a Tuple. So this is actually a Tangent:\n\njulia> Δsincosx\nTangent{Tuple{Float64, Float64}}(0.6795498147167869, -0.7336293678134624)\n\nThe optional RuleConfig option allows specifying frules only for AD systems that support given features. If not needed, then it can be omitted and the frule without it will be hit as a fallback. This is the case for most rules.\n\nSee also: rrule, @scalar_rule, RuleConfig\n\n\n\n\n\n","category":"function"},{"location":"training/zygote/#ChainRulesCore.@scalar_rule","page":"Gradients – Zygote.jl","title":"ChainRulesCore.@scalar_rule","text":"@scalar_rule(f(x₁, x₂, ...),\n @setup(statement₁, statement₂, ...),\n (∂f₁_∂x₁, ∂f₁_∂x₂, ...),\n (∂f₂_∂x₁, ∂f₂_∂x₂, ...),\n ...)\n\nA convenience macro that generates simple scalar forward or reverse rules using the provided partial derivatives. 
Specifically, generates the corresponding methods for frule and rrule:\n\nfunction ChainRulesCore.frule((NoTangent(), Δx₁, Δx₂, ...), ::typeof(f), x₁::Number, x₂::Number, ...)\n Ω = f(x₁, x₂, ...)\n $(statement₁, statement₂, ...)\n return Ω, (\n (∂f₁_∂x₁ * Δx₁ + ∂f₁_∂x₂ * Δx₂ + ...),\n (∂f₂_∂x₁ * Δx₁ + ∂f₂_∂x₂ * Δx₂ + ...),\n ...\n )\nend\n\nfunction ChainRulesCore.rrule(::typeof(f), x₁::Number, x₂::Number, ...)\n Ω = f(x₁, x₂, ...)\n $(statement₁, statement₂, ...)\n return Ω, ((ΔΩ₁, ΔΩ₂, ...)) -> (\n NoTangent(),\n ∂f₁_∂x₁ * ΔΩ₁ + ∂f₂_∂x₁ * ΔΩ₂ + ...,\n ∂f₁_∂x₂ * ΔΩ₁ + ∂f₂_∂x₂ * ΔΩ₂ + ...,\n ...\n )\nend\n\nIf no type constraints in f(x₁, x₂, ...) within the call to @scalar_rule are provided, each parameter in the resulting frule/rrule definition is given a type constraint of Number. Constraints may also be provided explicitly to override the Number constraint, e.g. f(x₁::Complex, x₂), which will constrain x₁ to Complex and x₂ to Number.\n\nAt present this does not support defining for closures/functors. Thus in reverse-mode, the first returned partial, representing the derivative with respect to the function itself, is always NoTangent(). And in forward-mode, the first input to the returned propagator is always ignored.\n\nThe result of f(x₁, x₂, ...) is automatically bound to Ω. This allows the primal result to be conveniently referenced (as Ω) within the derivative/setup expressions.\n\nThis macro assumes complex functions are holomorphic. In general, for non-holomorphic functions, the frule and rrule must be defined manually.\n\nIf the derivative is one (e.g. for identity functions), true can be used as the most general multiplicative identity.\n\nThe @setup argument can be elided if no setup code is needed. 
In other words:\n\n@scalar_rule(f(x₁, x₂, ...),\n (∂f₁_∂x₁, ∂f₁_∂x₂, ...),\n (∂f₂_∂x₁, ∂f₂_∂x₂, ...),\n ...)\n\nis equivalent to:\n\n@scalar_rule(f(x₁, x₂, ...),\n @setup(nothing),\n (∂f₁_∂x₁, ∂f₁_∂x₂, ...),\n (∂f₂_∂x₁, ∂f₂_∂x₂, ...),\n ...)\n\nFor examples, see ChainRules' rulesets directory.\n\nSee also: frule, rrule.\n\n\n\n\n\n","category":"macro"},{"location":"training/zygote/#ChainRulesCore.NoTangent","page":"Gradients – Zygote.jl","title":"ChainRulesCore.NoTangent","text":"NoTangent() <: AbstractZero\n\nThis tangent indicates that the derivative does not exist. It is the tangent type for primal types that are not differentiable, such as integers or booleans (when they are not being used to represent floating-point values). The only valid way to perturb such values is to not change them at all. As a consequence, NoTangent is functionally identical to ZeroTangent(), but it provides additional semantic information.\n\nAdding NoTangent() to a primal is generally wrong: gradient-based methods cannot be used to optimize over discrete variables. An optimization package making use of this might want to check for such a case.\n\nnote: Note\nThis does not indicate that the derivative is not implemented, but rather that mathematically it is not defined.\n\nThis mostly shows up as the derivative with respect to dimension, index, or size arguments.\n\n function rrule(fill, x, len::Int)\n y = fill(x, len)\n fill_pullback(Ȳ) = (NoTangent(), @thunk(sum(Ȳ)), NoTangent())\n return y, fill_pullback\n end\n\n\n\n\n\n","category":"type"},{"location":"training/zygote/#ChainRulesCore.ZeroTangent","page":"Gradients – Zygote.jl","title":"ChainRulesCore.ZeroTangent","text":"ZeroTangent() <: AbstractZero\n\nThe additive identity for tangents. This is basically the same as 0. A derivative of ZeroTangent() does not propagate through the primal function.\n\n\n\n\n\n","category":"type"},{"location":"destructure/#man-destructure","page":"Flat vs. Nested","title":"Flat vs. 
Nested Structures","text":"","category":"section"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"A Flux model is a nested structure, with parameters stored within many layers. Sometimes you may want a flat representation of them, to interact with functions expecting just one vector. This is provided by destructure:","category":"page"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"julia> model = Chain(Dense(2=>1, tanh), Dense(1=>1))\nChain(\n Dense(2 => 1, tanh), # 3 parameters\n Dense(1 => 1), # 2 parameters\n) # Total: 4 arrays, 5 parameters, 276 bytes.\n\njulia> flat, rebuild = Flux.destructure(model)\n(Float32[0.863101, 1.2454957, 0.0, -1.6345707, 0.0], Restructure(Chain, ..., 5))\n\njulia> rebuild(zeros(5)) # same structure, new parameters\nChain(\n Dense(2 => 1, tanh), # 3 parameters (all zero)\n Dense(1 => 1), # 2 parameters (all zero)\n) # Total: 4 arrays, 5 parameters, 276 bytes.","category":"page"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"Both destructure and the Restructure function can be used within gradient computations. For instance, this computes the Hessian ∂²L/∂θᵢ∂θⱼ of some loss function, with respect to all parameters of the Flux model. The resulting matrix has off-diagonal entries, which cannot really be expressed in a nested structure:","category":"page"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. 
Nested","text":"julia> x = rand(Float32, 2, 16);\n\njulia> grad = gradient(m -> sum(abs2, m(x)), model) # nested gradient\n((layers = ((weight = Float32[10.339018 11.379145], bias = Float32[22.845667], σ = nothing), (weight = Float32[-29.565302;;], bias = Float32[-37.644184], σ = nothing)),),)\n\njulia> function loss(v::Vector)\n m = rebuild(v)\n y = m(x)\n sum(abs2, y)\n end;\n\njulia> gradient(loss, flat) # flat gradient, same numbers\n(Float32[10.339018, 11.379145, 22.845667, -29.565302, -37.644184],)\n\njulia> Zygote.hessian(loss, flat) # second derivative\n5×5 Matrix{Float32}:\n -7.13131 -5.54714 -11.1393 -12.6504 -8.13492\n -5.54714 -7.11092 -11.0208 -13.9231 -9.36316\n -11.1393 -11.0208 -13.7126 -27.9531 -22.741\n -12.6504 -13.9231 -27.9531 18.0875 23.03\n -8.13492 -9.36316 -22.741 23.03 32.0\n\njulia> Flux.destructure(grad) # acts on non-models, too\n(Float32[10.339018, 11.379145, 22.845667, -29.565302, -37.644184], Restructure(Tuple, ..., 5))","category":"page"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"compat: Flux ≤ 0.12\nOld versions of Flux had an entirely different implementation of destructure, which had many bugs (and almost no tests). Many comments online still refer to that now-deleted function, or to memories of it.","category":"page"},{"location":"destructure/#All-Parameters","page":"Flat vs. Nested","title":"All Parameters","text":"","category":"section"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"The function destructure now lives in Optimisers.jl. (Be warned this package is unrelated to the Flux.Optimise sub-module! The confusion is temporary.)","category":"page"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"Optimisers.destructure\nOptimisers.trainable\nOptimisers.isnumeric","category":"page"},{"location":"destructure/#Optimisers.destructure","page":"Flat vs. 
Nested","title":"Optimisers.destructure","text":"destructure(model) -> vector, reconstructor\n\nCopies all trainable, isnumeric parameters in the model to a vector, and returns also a function which reverses this transformation. Differentiable.\n\nExample\n\njulia> v, re = destructure((x=[1.0, 2.0], y=(sin, [3.0 + 4.0im])))\n(ComplexF64[1.0 + 0.0im, 2.0 + 0.0im, 3.0 + 4.0im], Restructure(NamedTuple, ..., 3))\n\njulia> re([3, 5, 7+11im])\n(x = [3.0, 5.0], y = (sin, ComplexF64[7.0 + 11.0im]))\n\nIf model contains various number types, they are promoted to make vector, and are usually restored by Restructure. Such restoration follows the rules of ChainRulesCore.ProjectTo, and thus will restore floating point precision, but will permit more exotic numbers like ForwardDiff.Dual.\n\nIf model contains only GPU arrays, then vector will also live on the GPU. At present, a mixture of GPU and ordinary CPU arrays is undefined behaviour.\n\n\n\n\n\n","category":"function"},{"location":"destructure/#Optimisers.trainable","page":"Flat vs. Nested","title":"Optimisers.trainable","text":"trainable(x::Layer) -> NamedTuple\n\nThis may be overloaded to make optimisers ignore some fields of every Layer, which would otherwise contain trainable parameters.\n\nwarning: Warning\nThis is very rarely required. Fields of struct Layer which contain functions, or integers like sizes, are always ignored anyway. Overloading trainable is only necessary when some arrays of numbers are to be optimised, and some arrays of numbers are not.\n\nThe default is Functors.children(x), usually a NamedTuple of all fields, and trainable(x) must contain a subset of these.\n\n\n\n\n\n","category":"function"},{"location":"destructure/#Optimisers.isnumeric","page":"Flat vs. Nested","title":"Optimisers.isnumeric","text":"isnumeric(x) -> Bool\n\nReturns true on any parameter to be adjusted by Optimisers.jl, namely arrays of non-integer numbers. 
Returns false on all other types.\n\nRequires also that Functors.isleaf(x) == true, to focus on e.g. the parent of a transposed matrix, not the wrapper.\n\n\n\n\n\n","category":"function"},{"location":"destructure/#All-Layers","page":"Flat vs. Nested","title":"All Layers","text":"","category":"section"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"Another kind of flat view of a nested model is provided by the modules command. This extracts a list of all layers:","category":"page"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"Flux.modules","category":"page"},{"location":"destructure/#Flux.modules","page":"Flat vs. Nested","title":"Flux.modules","text":"modules(m)\n\nReturn an iterator over non-leaf objects that can be reached by recursing m over the children given by functor.\n\nUseful for applying a function (e.g. a regularizer) over specific modules or subsets of the parameters (e.g. the weights but not the biases).\n\nExamples\n\njulia> m1 = Chain(Dense(28^2, 64), BatchNorm(64, relu));\n\njulia> m2 = Chain(m1, Dense(64, 10))\nChain(\n Chain(\n Dense(784 => 64), # 50_240 parameters\n BatchNorm(64, relu), # 128 parameters, plus 128\n ),\n Dense(64 => 10), # 650 parameters\n) # Total: 6 trainable arrays, 51_018 parameters,\n # plus 2 non-trainable, 128 parameters, summarysize 200.312 KiB.\n\njulia> Flux.modules(m2)\n7-element Vector{Any}:\n Chain(Chain(Dense(784 => 64), BatchNorm(64, relu)), Dense(64 => 10)) # 51_018 parameters, plus 128 non-trainable\n (Chain(Dense(784 => 64), BatchNorm(64, relu)), Dense(64 => 10))\n Chain(Dense(784 => 64), BatchNorm(64, relu)) # 50_368 parameters, plus 128 non-trainable\n (Dense(784 => 64), BatchNorm(64, relu))\n Dense(784 => 64) # 50_240 parameters\n BatchNorm(64, relu) # 128 parameters, plus 128 non-trainable\n Dense(64 => 10) # 650 parameters\n\njulia> L2(m) = sum(sum(abs2, l.weight) for l in Flux.modules(m) if l isa Dense)\nL2 (generic function 
with 1 method)\n\njulia> L2(m2) isa Float32\ntrue\n\n\n\n\n\n","category":"function"},{"location":"destructure/#Save-and-Load","page":"Flat vs. Nested","title":"Save and Load","text":"","category":"section"},{"location":"destructure/","page":"Flat vs. Nested","title":"Flat vs. Nested","text":"Flux.state\nFlux.loadmodel!","category":"page"},{"location":"destructure/#Flux.state","page":"Flat vs. Nested","title":"Flux.state","text":"state(x)\n\nReturn an object with the same nested structure as x according to Functors.children, but made only of basic containers (e.g. named tuples, tuples, arrays, and dictionaries).\n\nBesides trainable and non-trainable arrays, the state will contain leaf nodes that are not arrays, such as numbers, symbols, strings, and nothing values. The leaf types that end up in the state could increase in the future.\n\nThis method is particularly useful for saving and loading models, since the state contains only simple data types that can be easily serialized.\n\nThe state can be passed to loadmodel! 
to restore the model.\n\nExamples\n\nCopy the state into another model\n\njulia> m1 = Chain(Dense(1, 2, tanh; init=ones), Dense(2, 1; init=ones));\n\njulia> s = Flux.state(m1)\n(layers = ((weight = [1.0; 1.0;;], bias = [0.0, 0.0], σ = ()), (weight = [1.0 1.0], bias = [0.0], σ = ())),)\n\njulia> m2 = Chain(Dense(1, 2, tanh), Dense(2, 1; bias=false)); # weights are random numbers\n\njulia> Flux.loadmodel!(m2, s);\n\njulia> m2[1].weight # now the weights of m2 are the same as m1\n2×1 Matrix{Float32}:\n 1.0\n 1.0\n\njulia> Flux.state(trainmode!(Dropout(0.2))) # contains p & activity, but not RNG state\n(p = 0.2, dims = (), active = true, rng = ())\n\njulia> Flux.state(BatchNorm(1)) # contains non-trainable arrays μ, σ²\n(λ = (), β = Float32[0.0], γ = Float32[1.0], μ = Float32[0.0], σ² = Float32[1.0], ϵ = 1.0f-5, momentum = 0.1f0, affine = true, track_stats = true, active = nothing, chs = 1)\n\nSave and load with BSON\n\njulia> using BSON\n\njulia> BSON.@save \"checkpoint.bson\" model_state = s\n\njulia> Flux.loadmodel!(m2, BSON.load(\"checkpoint.bson\")[:model_state])\n\nSave and load with JLD2\n\njulia> using JLD2\n\njulia> JLD2.jldsave(\"checkpoint.jld2\", model_state = s)\n\njulia> Flux.loadmodel!(m2, JLD2.load(\"checkpoint.jld2\", \"model_state\"))\n\n\n\n\n\n","category":"function"},{"location":"destructure/#Flux.loadmodel!","page":"Flat vs. Nested","title":"Flux.loadmodel!","text":"loadmodel!(dst, src)\n\nCopy all the parameters (trainable and non-trainable) from src into dst.\n\nRecursively walks dst and src together using Functors.children, and calling copyto! on parameter arrays or throwing an error when there is a mismatch. Non-array elements (such as activation functions) are not copied and need not match. 
Zero bias vectors and bias=false are considered equivalent (see extended help for more details).\n\nSee also Flux.state.\n\nExamples\n\njulia> dst = Chain(Dense(Flux.ones32(2, 5), Flux.ones32(2), tanh), Dense(2 => 1; bias = [1f0]))\nChain(\n Dense(5 => 2, tanh), # 12 parameters\n Dense(2 => 1), # 3 parameters\n) # Total: 4 arrays, 15 parameters, 316 bytes.\n\njulia> dst[1].weight ≈ ones(2, 5) # by construction\ntrue\n\njulia> src = Chain(Dense(5 => 2, relu), Dense(2 => 1, bias=false));\n\njulia> Flux.loadmodel!(dst, src);\n\njulia> dst[1].weight ≈ ones(2, 5) # values changed\nfalse\n\njulia> iszero(dst[2].bias)\ntrue\n\nExtended help\n\nThrows an error when:\n\ndst and src do not share the same fields (at any level)\nthe sizes of leaf nodes are mismatched between dst and src\ncopying non-array values to/from an array parameter (except inactive parameters described below)\ndst is a \"tied\" parameter (i.e. refers to another parameter) and loaded into multiple times with mismatched source values\n\nInactive parameters can be encoded by using the boolean value false instead of an array. If dst == false and src is an all-zero array, no error will be raised (and no values copied); however, attempting to copy a non-zero array to an inactive parameter will throw an error. Likewise, copying a src value of false to any dst array is valid, but copying a src value of true will error.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#man-init-funcs","page":"Weight Initialisation","title":"Random Weight Initialisation","text":"","category":"section"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"Flux initialises convolutional layers and recurrent cells with glorot_uniform by default. Most layers accept a function as an init keyword, which replaces this default. 
For example:","category":"page"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"julia> conv = Conv((3, 3), 3 => 2, relu; init=Flux.glorot_normal)\nConv((3, 3), 3 => 2, relu) # 56 parameters\n\njulia> conv.bias\n2-element Vector{Float32}:\n 0.0\n 0.0","category":"page"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"Note that init creates the weight array, but not the bias vector.","category":"page"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"Many of the initialisation functions accept keywords such as gain, and a random number generator. To make it easy to pass these to layers, there are methods which return a function:","category":"page"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"julia> Dense(4 => 5, tanh; init=Flux.glorot_uniform(gain=2))\nDense(4 => 5, tanh) # 25 parameters\n\njulia> Dense(4 => 5, tanh; init=Flux.randn32(MersenneTwister(1)))\nDense(4 => 5, tanh) # 25 parameters","category":"page"},{"location":"utilities/#Initialisation-functions","page":"Weight Initialisation","title":"Initialisation functions","text":"","category":"section"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"Flux.glorot_uniform\nFlux.glorot_normal\nFlux.kaiming_uniform\nFlux.kaiming_normal\nFlux.truncated_normal\nFlux.orthogonal\nFlux.sparse_init\nFlux.identity_init\nFlux.ones32\nFlux.zeros32\nFlux.rand32\nFlux.randn32\nFlux.create_bias","category":"page"},{"location":"utilities/#Flux.glorot_uniform","page":"Weight Initialisation","title":"Flux.glorot_uniform","text":"glorot_uniform([rng], size...; gain = 1) -> Array\nglorot_uniform([rng]; kw...) 
-> Function\n\nReturn an Array{Float32} of the given size containing random numbers drawn from a uniform distribution on the interval [-x, x], where x = gain * sqrt(6 / (fan_in + fan_out)).\n\nThis method is described in [1] and also known as Xavier initialization.\n\nExamples\n\njulia> Flux.glorot_uniform(3, 4) |> summary\n\"3×4 Matrix{Float32}\"\n\njulia> round.(extrema(Flux.glorot_uniform(10, 100)), digits=3)\n(-0.232f0, 0.234f0)\n\njulia> round.(extrema(Flux.glorot_uniform(100, 10)), digits=3)\n(-0.233f0, 0.233f0)\n\njulia> round.(extrema(Flux.glorot_uniform(100, 100)), digits=3)\n(-0.173f0, 0.173f0)\n\njulia> Dense(3 => 2, tanh; init = Flux.glorot_uniform(MersenneTwister(1)))\nDense(3 => 2, tanh) # 8 parameters\n\njulia> ans.bias\n2-element Vector{Float32}:\n 0.0\n 0.0\n\nReferences\n\n[1] Glorot, Xavier, and Yoshua Bengio. \"Understanding the difficulty of training deep feedforward neural networks.\" Proceedings of the thirteenth international conference on artificial intelligence and statistics. 2010.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.glorot_normal","page":"Weight Initialisation","title":"Flux.glorot_normal","text":"glorot_normal([rng], size...; gain = 1) -> Array\nglorot_normal([rng]; kw...) -> Function\n\nReturn an Array{Float32} of the given size containing random numbers drawn from a normal distribution with standard deviation gain * sqrt(2 / (fan_in + fan_out)), using nfan.\n\nThis method is described in [1] and also known as Xavier initialization.\n\nExamples\n\njulia> using Statistics\n\njulia> round(std(Flux.glorot_normal(10, 1000)), digits=3)\n0.044f0\n\njulia> round(std(Flux.glorot_normal(1000, 10)), digits=3)\n0.044f0\n\njulia> round(std(Flux.glorot_normal(1000, 1000)), digits=3)\n0.032f0\n\njulia> Dense(10 => 1000, tanh; init = Flux.glorot_normal(gain=100))\nDense(10 => 1000, tanh) # 11_000 parameters\n\njulia> round(std(ans.weight), sigdigits=3)\n4.45f0\n\nReferences\n\n[1] Glorot, Xavier, and Yoshua Bengio. 
\"Understanding the difficulty of training deep feedforward neural networks.\" Proceedings of the thirteenth international conference on artificial intelligence and statistics. 2010.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.kaiming_uniform","page":"Weight Initialisation","title":"Flux.kaiming_uniform","text":"kaiming_uniform([rng], size...; gain = √2) -> Array\nkaiming_uniform([rng]; kw...) -> Function\n\nReturn an Array{Float32} of the given size containing random numbers drawn from a uniform distribution on the interval [-x, x], where x = gain * sqrt(3/fan_in) using nfan.\n\nThis method is described in [1] and also known as He initialization.\n\nExamples\n\njulia> round.(extrema(Flux.kaiming_uniform(100, 10)), digits=3)\n(-0.774f0, 0.774f0)\n\njulia> round.(extrema(Flux.kaiming_uniform(10, 100)), digits=3)\n(-0.245f0, 0.244f0)\n\njulia> round.(extrema(Flux.kaiming_uniform(100, 100)), digits=3)\n(-0.245f0, 0.245f0)\n\nReferences\n\n[1] He, Kaiming, et al. \"Delving deep into rectifiers: Surpassing human-level performance on imagenet classification.\" Proceedings of the IEEE international conference on computer vision. 2015.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.kaiming_normal","page":"Weight Initialisation","title":"Flux.kaiming_normal","text":"kaiming_normal([rng], size...; gain = √2) -> Array\nkaiming_normal([rng]; kw...) -> Function\n\nReturn an Array{Float32} of the given size containing random numbers taken from a normal distribution with standard deviation gain / sqrt(fan_in), using nfan.\n\nThis method is described in [1] and also known as He initialization.\n\nExamples\n\njulia> using Statistics\n\njulia> round(std(Flux.kaiming_normal(10, 1000)), digits=3)\n0.045f0\n\njulia> round(std(Flux.kaiming_normal(1000, 10)), digits=3)\n0.447f0\n\njulia> round(std(Flux.kaiming_normal(1000, 1000)), digits=3)\n0.045f0\n\nReferences\n\n[1] He, Kaiming, et al. 
\"Delving deep into rectifiers: Surpassing human-level performance on imagenet classification.\" Proceedings of the IEEE international conference on computer vision. 2015.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.truncated_normal","page":"Weight Initialisation","title":"Flux.truncated_normal","text":"truncated_normal([rng], size...; mean = 0, std = 1, lo = -2, hi = 2) -> Array\ntruncated_normal([rng]; kw...) -> Function\n\nReturn an Array{Float32} of the given size where each element is drawn from a truncated normal distribution. The numbers are distributed like filter(x -> lo<=x<=hi, mean .+ std .* randn(100)).\n\nThe values are generated by sampling a Uniform(0, 1) (rand()) and then applying the inverse CDF of the truncated normal distribution. This method works best when lo ≤ mean ≤ hi.\n\nExamples\n\njulia> using Statistics\n\njulia> Flux.truncated_normal(3, 4) |> summary\n\"3×4 Matrix{Float32}\"\n\njulia> round.(extrema(Flux.truncated_normal(10^6)); digits=3)\n(-2.0f0, 2.0f0)\n\njulia> round(std(Flux.truncated_normal(10^6; lo = -100, hi = 100)))\n1.0f0\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.orthogonal","page":"Weight Initialisation","title":"Flux.orthogonal","text":"orthogonal([rng], size...; gain = 1) -> Array\northogonal([rng]; kw...) -> Function\n\nReturn an Array{Float32} of the given size which is a (semi) orthogonal matrix, as described in [1].\n\nCannot construct a vector, i.e. length(size) == 1 is forbidden. 
For length(size) > 2, a prod(size[1:(end - 1)]) by size[end] orthogonal matrix is computed before reshaping it to the original dimensions.\n\nExamples\n\njulia> W = Flux.orthogonal(5, 7);\n\njulia> summary(W)\n\"5×7 Matrix{Float32}\"\n\njulia> W * W' ≈ I(5)\ntrue\n\njulia> W2 = Flux.orthogonal(7, 5);\n\njulia> W2 * W2' ≈ I(7)\nfalse\n\njulia> W2' * W2 ≈ I(5)\ntrue\n\njulia> W3 = Flux.orthogonal(3, 3, 2, 4);\n\njulia> transpose(reshape(W3, :, 4)) * reshape(W3, :, 4) ≈ I(4)\ntrue\n\nReferences\n\n[1] Saxe, McClelland, Ganguli. \"Exact solutions to the nonlinear dynamics of learning in deep linear neural networks\", ICLR 2014, https://arxiv.org/abs/1312.6120\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.sparse_init","page":"Weight Initialisation","title":"Flux.sparse_init","text":"sparse_init([rng], rows, cols; sparsity, std = 0.01) -> Array\nsparse_init([rng]; kw...) -> Function\n\nReturn a Matrix{Float32} of size rows, cols where each column contains a fixed fraction of zero elements given by sparsity. Non-zero elements are normally distributed with a mean of zero and standard deviation std.\n\nThis method is described in [1].\n\nExamples\n\njulia> count(iszero, Flux.sparse_init(10, 10, sparsity=1/5))\n20\n\njulia> sum(0 .== Flux.sparse_init(10, 11, sparsity=0.9), dims=1)\n1×11 Matrix{Int64}:\n 9 9 9 9 9 9 9 9 9 9 9\n\njulia> Dense(3 => 10, tanh; init=Flux.sparse_init(sparsity=0.5))\nDense(3 => 10, tanh) # 40 parameters\n\njulia> count(iszero, ans.weight, dims=1)\n1×3 Matrix{Int64}:\n 5 5 5\n\nReferences\n\n[1] Martens, J, \"Deep learning via Hessian-free optimization\" Proceedings of the 27th International Conference on International Conference on Machine Learning. 2010.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.identity_init","page":"Weight Initialisation","title":"Flux.identity_init","text":"identity_init(size...; gain=1, shift=0) -> Array\nidentity_init(; kw...) 
-> Function\n\nReturn an Array{Float32} of the given size which yields an identity mapping when used as parameters in most Flux layers. Use gain to scale the identity by a constant.\n\nOften useful in the context of transfer learning, i.e. when one wants to add more capacity to a model but start from the same mapping.\n\nHas the following behaviour:\n\n1D: A Vector of zeros (useful for an identity bias)\n2D: An identity matrix (useful for an identity matrix multiplication)\nMore than 2D: A dense block array of center tap spatial filters (useful for an identity convolution)\n\nSome caveats: \n\nNot all layers will be identity mapping when used with this init. Exceptions include recurrent layers and normalization layers.\nLayers must have input_size == output_size for identity mapping to be possible. When this is not the case, extra dimensions of the array are padded with zeros.\nFor convolutional layers, in addition to the above, the kernel sizes must also be odd and padding must be applied so that output feature maps have the same size as input feature maps, e.g. by using SamePad.\n\nUse keyword shift (integer or tuple) to apply circular shift to the output, equivalent to Base.circshift(identity_init(size...), shift).\n\nFor consistency with other initialisers, it accepts rng::AbstractRNG as an optional first argument. 
But this is ignored, since the result is not random.\n\nExamples\n\njulia> Flux.identity_init(3,5)\n3×5 Matrix{Float32}:\n 1.0 0.0 0.0 0.0 0.0\n 0.0 1.0 0.0 0.0 0.0\n 0.0 0.0 1.0 0.0 0.0\n\njulia> Dense(5 => 3, relu, init=Flux.identity_init)([1,-2,3,-4,5])\n3-element Vector{Float32}:\n 1.0\n 0.0\n 3.0\n\njulia> Flux.identity_init(3,3,2; gain=100)\n3×3×2 Array{Float32, 3}:\n[:, :, 1] =\n 0.0 0.0 0.0\n 100.0 0.0 0.0\n 0.0 0.0 0.0\n\n[:, :, 2] =\n 0.0 0.0 0.0\n 0.0 100.0 0.0\n 0.0 0.0 0.0\n\njulia> x4 = cat([1 2 3; 4 5 6; 7 8 9]; dims=4);\n\njulia> Conv((2,2), 1 => 1, init=Flux.identity_init(gain=10), pad=SamePad())(x4)\n3×3×1×1 Array{Float32, 4}:\n[:, :, 1, 1] =\n 10.0 20.0 30.0\n 40.0 50.0 60.0\n 70.0 80.0 90.0\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.ones32","page":"Weight Initialisation","title":"Flux.ones32","text":"ones32(size...) = ones(Float32, size...)\n\nReturn an Array{Float32} of the given size filled with 1s.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.zeros32","page":"Weight Initialisation","title":"Flux.zeros32","text":"zeros32(size...) = zeros(Float32, size...)\n\nReturn an Array{Float32} of the given size filled with 0s.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.rand32","page":"Weight Initialisation","title":"Flux.rand32","text":"rand32([rng], size...)\n\nReturn an Array{Float32} of the given size, filled like rand. When the size is not provided, rand32(rng::AbstractRNG) returns a function.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.randn32","page":"Weight Initialisation","title":"Flux.randn32","text":"randn32([rng], size...)\n\nReturn an Array{Float32} of the given size, filled like randn. 
When the size is not provided, randn32(rng::AbstractRNG) returns a function.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.create_bias","page":"Weight Initialisation","title":"Flux.create_bias","text":"create_bias(weights, bias, size...)\n\nReturn a bias parameter for a layer, based on the value given to the constructor's keyword bias=bias.\n\nbias == true creates a trainable array of the given size, of the same type as weights, initialised to zero.\nbias == false returns false, which is understood by AD to be non-differentiable.\nbias::AbstractArray uses the array provided, provided it has the correct size. It will also correct the eltype to match that of weights.\n\n\n\n\n\n","category":"function"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"These functions call:","category":"page"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"Flux.rng_from_array\nFlux.nfan","category":"page"},{"location":"utilities/#Flux.rng_from_array","page":"Weight Initialisation","title":"Flux.rng_from_array","text":"rng_from_array(x)\n\nCreate an instance of the RNG most appropriate for x. 
The current defaults are:\n\nx isa CuArray: CUDA.default_rng()\nx isa AbstractArray: Random.default_rng()\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.nfan","page":"Weight Initialisation","title":"Flux.nfan","text":"nfan(n_out, n_in=1) -> Tuple\nnfan(dims...)\nnfan(dims::Tuple)\n\nFor a layer characterized by dimensions dims, return a tuple (fan_in, fan_out), where fan_in is the number of input neurons connected to an output one, and fan_out is the number of output neurons connected to an input one.\n\nThis function is mainly used by weight initializers, e.g., kaiming_normal.\n\nExamples\n\njulia> layer = Dense(10, 20);\n\njulia> Flux.nfan(size(layer.weight))\n(10, 20)\n\njulia> layer = Conv((3, 3), 2=>10);\n\njulia> Flux.nfan(size(layer.weight))\n(18, 90)\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Changing-the-type-of-all-parameters","page":"Weight Initialisation","title":"Changing the type of all parameters","text":"","category":"section"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"The default eltype for models is Float32 since models are often trained/run on GPUs. The eltype of model m can be changed to Float64 by f64(m):","category":"page"},{"location":"utilities/","page":"Weight Initialisation","title":"Weight Initialisation","text":"Flux.f64\nFlux.f32\nFlux.f16","category":"page"},{"location":"utilities/#Flux.f64","page":"Weight Initialisation","title":"Flux.f64","text":"f64(m)\n\nConverts the eltype of model's floating point parameters to Float64. Recurses into structs marked with @functor.\n\nSee also f32 and f16.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.f32","page":"Weight Initialisation","title":"Flux.f32","text":"f32(m)\n\nConverts the eltype of model's floating point parameters to Float32 (which is Flux's default). 
Recurses into structs marked with @functor.\n\nSee also f64 and f16.\n\n\n\n\n\n","category":"function"},{"location":"utilities/#Flux.f16","page":"Weight Initialisation","title":"Flux.f16","text":"f16(m)\n\nConverts the eltype of model's floating point parameters to Float16. Recurses into structs marked with @functor.\n\nSupport for Float16 is limited on many CPUs. Julia may convert to Float32 for each operation, which is slow.\n\nSee also f32 and f64.\n\nExample\n\njulia> m = Chain(Dense(784, 2048, relu), Dense(2048, 10)) # all Float32\nChain(\n Dense(784 => 2048, relu), # 1_607_680 parameters\n Dense(2048 => 10), # 20_490 parameters\n) # Total: 4 arrays, 1_628_170 parameters, 6.211 MiB.\n\njulia> m |> f16 # takes half the memory\nChain(\n Dense(784 => 2048, relu), # 1_607_680 parameters\n Dense(2048 => 10), # 20_490 parameters\n) # Total: 4 arrays, 1_628_170 parameters, 3.106 MiB.\n\n\n\n\n\n","category":"function"},{"location":"outputsize/#Shape-Inference","page":"Shape Inference","title":"Shape Inference","text":"","category":"section"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"Flux has some tools to help generate models in an automated fashion, by inferring the size of arrays that layers will receive, without doing any computation. This is especially useful for convolutional models, where the same Conv layer accepts any size of image, but the next layer may not. ","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"The higher-level tool is a macro @autosize which acts on the code defining the layers, and replaces each appearance of _ with the relevant size. 
This simple example returns a model with Dense(845 => 10) as the last layer:","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"@autosize (28, 28, 1, 32) Chain(Conv((3, 3), _ => 5, relu, stride=2), Flux.flatten, Dense(_ => 10))","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"The input size may be provided at runtime, like @autosize (sz..., 1, 32) Chain(Conv(..., but all the layer constructors containing _ must be explicitly written out – the macro sees the code as written.","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"This macro relies on a lower-level function outputsize, which you can also use directly:","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"c = Conv((3, 3), 1 => 5, relu, stride=2)\nFlux.outputsize(c, (28, 28, 1, 32)) # returns (13, 13, 5, 32)","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"The function outputsize works by passing a \"dummy\" array into the model, which propagates through very cheaply. It should work for all layers, including custom layers, out of the box.","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"An example of how to automate model building is this:","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"\"\"\"\n make_model(width, height, [inchannels, nclasses; layer_config])\n\nCreate a CNN for a given set of configuration parameters. 
Arguments:\n- `width`, `height`: the input image size in pixels\n- `inchannels`: the number of channels in the input image, default `1`\n- `nclasses`: the number of output classes, default `10`\n- Keyword `layer_config`: a vector of the number of channels per layer, default `[16, 16, 32, 64]`\n\"\"\"\nfunction make_model(width, height, inchannels = 1, nclasses = 10;\n layer_config = [16, 16, 32, 64])\n # construct a vector of layers:\n conv_layers = []\n push!(conv_layers, Conv((5, 5), inchannels => layer_config[1], relu, pad=SamePad()))\n for (inch, outch) in zip(layer_config, layer_config[2:end])\n push!(conv_layers, Conv((3, 3), inch => outch, sigmoid, stride=2))\n end\n\n # compute the output dimensions after these conv layers:\n conv_outsize = Flux.outputsize(conv_layers, (width, height, inchannels); padbatch=true)\n\n # use this to define appropriate Dense layer:\n last_layer = Dense(prod(conv_outsize) => nclasses)\n return Chain(conv_layers..., Flux.flatten, last_layer)\nend\n\nm = make_model(28, 28, 3, layer_config = [9, 17, 33, 65])\n\nFlux.outputsize(m, (28, 28, 3, 42)) == (10, 42) == size(m(randn(Float32, 28, 28, 3, 42)))","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"Alternatively, using the macro, the definition of make_model could end with:","category":"page"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":" # compute the output dimensions & construct appropriate Dense layer:\n return @autosize (width, height, inchannels, 1) Chain(conv_layers..., Flux.flatten, Dense(_ => nclasses))\nend","category":"page"},{"location":"outputsize/#Listing","page":"Shape Inference","title":"Listing","text":"","category":"section"},{"location":"outputsize/","page":"Shape Inference","title":"Shape Inference","text":"Flux.@autosize\nFlux.outputsize","category":"page"},{"location":"outputsize/#Flux.@autosize","page":"Shape Inference","title":"Flux.@autosize","text":"@autosize 
(size...,) Chain(Layer(_ => 2), Layer(_), ...)\n\nReturns the specified model, with each _ replaced by an inferred number, for input of the given size.\n\nThe unknown sizes are usually the second-last dimension of that layer's input, which Flux regards as the channel dimension. (A few layers, Dense & LayerNorm, instead always use the first dimension.) The underscore may appear as an argument of a layer, or inside a =>. It may be used in further calculations, such as Dense(_ => _÷4).\n\nExamples\n\njulia> @autosize (3, 1) Chain(Dense(_ => 2, sigmoid), BatchNorm(_, affine=false))\nChain(\n Dense(3 => 2, σ), # 8 parameters\n BatchNorm(2, affine=false),\n) \n\njulia> img = [28, 28];\n\njulia> @autosize (img..., 1, 32) Chain( # size is only needed at runtime\n Chain(c = Conv((3,3), _ => 5; stride=2, pad=SamePad()),\n p = MeanPool((3,3)),\n b = BatchNorm(_),\n f = Flux.flatten),\n Dense(_ => _÷4, relu, init=Flux.rand32), # can calculate output size _÷4\n SkipConnection(Dense(_ => _, relu), +),\n Dense(_ => 10),\n )\nChain(\n Chain(\n c = Conv((3, 3), 1 => 5, pad=1, stride=2), # 50 parameters\n p = MeanPool((3, 3)),\n b = BatchNorm(5), # 10 parameters, plus 10\n f = Flux.flatten,\n ),\n Dense(80 => 20, relu), # 1_620 parameters\n SkipConnection(\n Dense(20 => 20, relu), # 420 parameters\n +,\n ),\n Dense(20 => 10), # 210 parameters\n) # Total: 10 trainable arrays, 2_310 parameters,\n # plus 2 non-trainable, 10 parameters, summarysize 10.469 KiB.\n\njulia> outputsize(ans, (28, 28, 1, 32))\n(10, 32)\n\nLimitations:\n\nWhile @autosize (5, 32) Flux.Bilinear(_ => 7) is OK, something like Bilinear((_, _) => 7) will fail.\nWhile Scale(_) and LayerNorm(_) are fine (and use the first dimension), Scale(_,_) and LayerNorm(_,_) will fail if size(x,1) != size(x,2).\n\n\n\n\n\n","category":"macro"},{"location":"outputsize/#Flux.outputsize","page":"Shape Inference","title":"Flux.outputsize","text":"outputsize(m, x_size, y_size, ...; padbatch=false)\n\nFor model or layer m accepting 
multiple arrays as input, this returns size(m((x, y, ...))) given x_size = size(x), etc.\n\nExamples\n\njulia> x, y = rand(Float32, 5, 64), rand(Float32, 7, 64);\n\njulia> par = Parallel(vcat, Dense(5 => 9), Dense(7 => 11));\n\njulia> Flux.outputsize(par, (5, 64), (7, 64))\n(20, 64)\n\njulia> m = Chain(par, Dense(20 => 13), softmax);\n\njulia> Flux.outputsize(m, (5,), (7,); padbatch=true)\n(13, 1)\n\njulia> par(x, y) == par((x, y)) == Chain(par, identity)((x, y))\ntrue\n\nNotice that Chain only accepts multiple arrays as a tuple, while Parallel also accepts them as multiple arguments; outputsize always supplies the tuple.\n\n\n\n\n\n","category":"function"},{"location":"tutorials/2021-02-07-convnet/#man-convnet-tutorial","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"","category":"section"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"In this tutorial, we build a simple Convolutional Neural Network (ConvNet) to classify the MNIST dataset. This model has a simple architecture with three feature detection layers (Conv -> ReLU -> MaxPool) followed by a final dense layer that classifies MNIST handwritten digits. Note that this model, while simple, should hit around 99% test accuracy after training for approximately 20 epochs.","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"This example writes out the saved model to the file mnist_conv.bson. 
Also, it demonstrates basic model construction, training, saving, conditional early-exit, and learning rate scheduling.","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"To run this example, we need the following packages:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"using Flux, MLDatasets, Statistics\nusing Flux: onehotbatch, onecold, logitcrossentropy, params\nusing MLDatasets: MNIST\nusing Base.Iterators: partition\nusing Printf, BSON\nusing CUDA\nCUDA.allowscalar(false)","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"We set default values for learning rate, batch size, number of epochs, and path for saving the file mnist_conv.bson:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Base.@kwdef mutable struct TrainArgs\n lr::Float64 = 3e-3\n epochs::Int = 20\n batch_size = 128\n savepath::String = \"./\"\nend","category":"page"},{"location":"tutorials/2021-02-07-convnet/#Data","page":"Tutorial: A Simple ConvNet","title":"Data","text":"","category":"section"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"To train our model, we need to bundle images together with their labels and group them into mini-batches (makes the training process faster). 
We define the function make_minibatch that takes as inputs the images (X) and their labels (Y) as well as the indices for the mini-batches (idxs):","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"function make_minibatch(X, Y, idxs)\n X_batch = Array{Float32}(undef, size(X)[1:end-1]..., 1, length(idxs))\n for i in 1:length(idxs)\n X_batch[:, :, :, i] = Float32.(X[:,:,idxs[i]])\n end\n Y_batch = onehotbatch(Y[idxs], 0:9)\n return (X_batch, Y_batch)\nend","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"make_minibatch takes the following steps:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Creates the X_batch array of size 28x28x1x128 to store the mini-batches. \nStores the mini-batches in X_batch.\nOne-hot encodes the labels of the images.\nStores the labels in Y_batch.","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"get_processed_data loads the train and test data using MLDatasets' MNIST. First, it loads the images and labels of the train data set, and creates an array that contains the indices of the train images that correspond to each mini-batch (of size args.batch_size). Then, it calls the make_minibatch function to create all of the train mini-batches. 
Finally, it loads the test images and creates one mini-batch that contains them all.","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"function get_processed_data(args)\n # Load labels and images\n train_imgs, train_labels = MNIST.traindata()\n mb_idxs = partition(1:length(train_labels), args.batch_size)\n train_set = [make_minibatch(train_imgs, train_labels, i) for i in mb_idxs]\n \n # Prepare test set as one giant minibatch:\n test_imgs, test_labels = MNIST.testdata()\n test_set = make_minibatch(test_imgs, test_labels, 1:length(test_labels))\n \n return train_set, test_set\n \nend","category":"page"},{"location":"tutorials/2021-02-07-convnet/#Model","page":"Tutorial: A Simple ConvNet","title":"Model","text":"","category":"section"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Now, we define the build_model function that creates a ConvNet model which is composed of three convolution layers (feature detection) and one classification layer. The input layer size is 28x28. The images are grayscale, which means there is only one channel (compared to 3 for RGB) in every data point. Combined together, the convolutional layer structure would look like Conv(kernel, input_channels => output_channels, ...). Each convolution layer applies a Rectified Linear Unit (ReLU) activation, and each MaxPool operation then halves the spatial size of the image. 
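As a quick sanity check (not part of the original tutorial), Flux.outputsize from the shape-inference tools can confirm these sizes without running any real computation; the convolutional part below is a sketch of the same three Conv -> ReLU -> MaxPool blocks used in build_model:

```julia
using Flux

# Each (2,2) MaxPool roughly halves the spatial size: 28 -> 14 -> 7 -> 3.
conv_part = Chain(
    Conv((3, 3), 1 => 16, relu, pad=(1, 1)), MaxPool((2, 2)),
    Conv((3, 3), 16 => 32, relu, pad=(1, 1)), MaxPool((2, 2)),
    Conv((3, 3), 32 => 32, relu, pad=(1, 1)), MaxPool((2, 2)),
)

Flux.outputsize(conv_part, (28, 28, 1); padbatch=true)  # (3, 3, 32, 1)
```

prod((3, 3, 32)) == 288 is the input width of the final Dense layer, matching the cnn_output_size computed in build_model.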
On the other hand, the classification layer outputs a vector of 10 dimensions (a dense layer), that is, the number of classes that the model will be able to predict.","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"function build_model(args; imgsize = (28,28,1), nclasses = 10)\n cnn_output_size = Int.(floor.([imgsize[1]/8,imgsize[2]/8,32])) \n \n return Chain(\n # First convolution, operating upon a 28x28 image\n Conv((3, 3), imgsize[3]=>16, pad=(1,1), relu),\n MaxPool((2,2)),\n \n # Second convolution, operating upon a 14x14 image\n Conv((3, 3), 16=>32, pad=(1,1), relu),\n MaxPool((2,2)),\n \n # Third convolution, operating upon a 7x7 image\n Conv((3, 3), 32=>32, pad=(1,1), relu),\n MaxPool((2,2)),\n \n # Reshape 3d array into a 2d one using `Flux.flatten`, at this point it should be (3, 3, 32, N)\n flatten,\n Dense(prod(cnn_output_size), 10))\nend","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"To chain the layers of a model we use the Flux function Chain. It enables us to call the layers in sequence on a given input. Also, we use the function flatten to reshape the output image from the last convolution layer. 
Finally, we call the Dense function to create the classification layer.","category":"page"},{"location":"tutorials/2021-02-07-convnet/#Training","page":"Tutorial: A Simple ConvNet","title":"Training","text":"","category":"section"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Before training our model, we need to define a few functions that will be helpful for the process:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"augment adds Gaussian random noise to our images, to make the model more robust:\nanynan checks whether any element of the parameters is NaN:\naccuracy computes the proportion of inputs x correctly classified by our ConvNet:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"augment(x) = x .+ gpu(0.1f0*randn(eltype(x), size(x)))\nanynan(x) = any(y -> any(isnan, y), x)\naccuracy(x, y, model) = mean(onecold(cpu(model(x))) .== onecold(cpu(y)))","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Finally, we define the train function:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"function train(; kws...) \n args = TrainArgs(; kws...)\n \n @info(\"Loading data set\")\n train_set, test_set = get_processed_data(args)\n \n # Define our model. 
We will use a simple convolutional architecture with\n # three iterations of Conv -> ReLU -> MaxPool, followed by a final Dense layer.\n @info(\"Building model...\")\n model = build_model(args)\n \n # Load model and datasets onto GPU, if enabled\n train_set = gpu.(train_set)\n test_set = gpu.(test_set)\n model = gpu(model)\n \n # Make sure our model is nicely precompiled before starting our training loop\n model(train_set[1][1])\n \n # `loss()` calculates the crossentropy loss between our prediction `y_hat`\n # (calculated from `model(x)`) and the ground truth `y`. We augment the data\n # a bit, adding gaussian random noise to our image to make it more robust.\n function loss(x, y) \n x̂ = augment(x)\n ŷ = model(x̂)\n return logitcrossentropy(ŷ, y)\n end\n \n # Train our model with the given training set using the Adam optimiser and\n # printing out performance against the test set as we go.\n opt = Adam(args.lr)\n \n @info(\"Beginning training loop...\")\n best_acc = 0.0\n last_improvement = 0\n for epoch_idx in 1:args.epochs\n # Train for a single epoch\n Flux.train!(loss, params(model), train_set, opt)\n \n # Terminate on NaN\n if anynan(Flux.params(model))\n @error \"NaN params\"\n break\n end\n \n # Calculate accuracy:\n acc = accuracy(test_set..., model)\n \n @info(@sprintf(\"[%d]: Test accuracy: %.4f\", epoch_idx, acc))\n # If our accuracy is good enough, quit out.\n if acc >= 0.999\n @info(\" -> Early-exiting: We reached our target accuracy of 99.9%\")\n break\n end\n \n # If this is the best accuracy we've seen so far, save the model out\n if acc >= best_acc\n @info(\" -> New best accuracy! 
Saving model out to mnist_conv.bson\")\n BSON.@save joinpath(args.savepath, \"mnist_conv.bson\") params=cpu.(params(model)) epoch_idx acc\n best_acc = acc\n last_improvement = epoch_idx\n end\n \n # If we haven't seen improvement in 5 epochs, drop our learning rate:\n if epoch_idx - last_improvement >= 5 && opt.eta > 1e-6\n opt.eta /= 10.0\n @warn(\" -> Haven't improved in a while, dropping learning rate to $(opt.eta)!\")\n \n # After dropping learning rate, give it a few epochs to improve\n last_improvement = epoch_idx\n end\n \n if epoch_idx - last_improvement >= 10\n @warn(\" -> We're calling this converged.\")\n break\n end\n end\nend","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"train calls the functions we defined above and trains our model. It stops when the model achieves 99.9% accuracy (early-exiting) or after performing 20 epochs. More specifically, it performs the following steps:","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Loads the MNIST dataset.\nBuilds our ConvNet model (as described above).\nLoads the train and test data sets as well as our model onto a GPU (if available).\nDefines a loss function that calculates the crossentropy between our prediction and the ground truth.\nSets the Adam optimiser to train the model with learning rate args.lr.\nRuns the training loop. For each epoch, it executes the following:\nCalls the Flux.train! function to execute one training step.\nIf any of the parameters of our model is NaN, then the training process is terminated.\nCalculates the model accuracy.\nIf the model accuracy is >= 0.999, then early-exiting is executed.\nIf the current accuracy is the best so far, then the model is saved to mnist_conv.bson. 
Also, the new best accuracy and the current epoch are saved.\nIf there has not been any improvement for the last 5 epochs, then the learning rate is dropped and the process waits a little longer for the accuracy to improve.\nIf the last improvement was more than 10 epochs ago, then the process is terminated.","category":"page"},{"location":"tutorials/2021-02-07-convnet/#Testing","page":"Tutorial: A Simple ConvNet","title":"Testing","text":"","category":"section"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Finally, to test our model we define the test function: ","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"function test(; kws...)\n args = TrainArgs(; kws...)\n \n # Loading the test data\n _,test_set = get_processed_data(args)\n \n # Re-constructing the model with random initial weights\n model = build_model(args)\n \n # Loading the saved parameters\n BSON.@load joinpath(args.savepath, \"mnist_conv.bson\") params\n \n # Loading parameters onto the model\n Flux.loadparams!(model, params)\n \n test_set = gpu.(test_set)\n model = gpu(model)\n @show accuracy(test_set...,model)\nend","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"test loads the MNIST test data set, reconstructs the model, and loads the saved parameters (in mnist_conv.bson) onto it. 
Finally, it computes our model's predictions for the test set and shows the test accuracy (around 99%).","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"To see the full version of this example, see Simple ConvNets - model-zoo.","category":"page"},{"location":"tutorials/2021-02-07-convnet/#Resources","page":"Tutorial: A Simple ConvNet","title":"Resources","text":"","category":"section"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"Neural Networks in Flux.jl with Huda Nassar (working with the MNIST dataset)\nConvolutional Neural Networks (CNNs / ConvNets).\nConvolutional Neural Networks Tutorial in PyTorch.","category":"page"},{"location":"tutorials/2021-02-07-convnet/","page":"Tutorial: A Simple ConvNet","title":"Tutorial: A Simple ConvNet","text":"info: Info\nOriginally published at fluxml.ai on 7 February 2021. Written by Elliot Saba, Adarsh Kumar, Mike J Innes, Dhairya Gandhi, Sudhanshu Agrawal, Sambit Kumar Dash, fps.io, Carlo Lucibello, Andrew Dinhobl, Liliana Badillo","category":"page"},{"location":"performance/#[Performance-Tips]((@id-man-performance-tips))","page":"Performance Tips","title":"Performance Tips","text":"","category":"section"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"All the usual Julia performance tips apply. As always profiling your code is generally a useful way of finding bottlenecks. Below follow some Flux specific tips/reminders.","category":"page"},{"location":"performance/#Don't-use-more-precision-than-you-need","page":"Performance Tips","title":"Don't use more precision than you need","text":"","category":"section"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"Flux works great with all kinds of number types. 
But often you do not need to be working with, say, Float64 (let alone BigFloat). Switching to Float32 can give you a significant speed up, not because the operations are faster, but because the memory usage is halved, which means allocations occur much faster and you use less memory.","category":"page"},{"location":"performance/#Preserve-inputs'-types","page":"Performance Tips","title":"Preserve inputs' types","text":"","category":"section"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"Not only should your activation and loss functions be type-stable, they should also preserve the type of their inputs.","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"A very artificial example using an activation function like","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"my_tanh(x) = Float64(tanh(x))","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"will result in performance on Float32 input orders of magnitude slower than the normal tanh would, because it results in having to use slow mixed type multiplication in the dense layers. Similar situations can occur in the loss function during backpropagation.","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"This means that if you change your data, say, from Float64 to Float32 (which should give a speedup: see above), you will instead see a large slow-down.","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"This can occur sneakily, because you can cause type-promotion by interacting with numeric literals. E.g. 
the following will run into the same problem as above:","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"leaky_tanh(x) = 0.01*x + tanh(x)","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"While one could change the activation function (e.g. to use 0.01f0*x), the idiomatic (and safe) way to avoid type casts whenever the input type changes is to use oftype:","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)","category":"page"},{"location":"performance/#Evaluate-batches-as-Matrices-of-features","page":"Performance Tips","title":"Evaluate batches as Matrices of features","text":"","category":"section"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"While it can sometimes be tempting to process your observations (feature vectors) one at a time e.g.","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"function loss_total(xs::AbstractVector{<:Vector}, ys::AbstractVector{<:Vector})\n sum(zip(xs, ys)) do (x, y_target)\n y_pred = model(x) # evaluate the model\n return loss(y_pred, y_target)\n end\nend","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"it is much faster to concatenate them into a matrix, as this will hit BLAS matrix-matrix multiplication, which greatly outperforms the equivalent sequence of matrix-vector multiplications. 
The improvement is enough that it is worthwhile allocating new memory to store them contiguously.","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"x_batch = reduce(hcat, xs)\ny_batch = reduce(hcat, ys)\n...\nfunction loss_total(x_batch::Matrix, y_batch::Matrix)\n y_preds = model(x_batch)\n sum(loss.(y_preds, y_batch))\nend","category":"page"},{"location":"performance/","page":"Performance Tips","title":"Performance Tips","text":"When doing this kind of concatenation use reduce(hcat, xs) rather than hcat(xs...). This will avoid the splatting penalty, and will hit the optimised reduce method.","category":"page"},{"location":"data/mlutils/#Working-with-Data,-using-MLUtils.jl","page":"Batching Data – MLUtils.jl","title":"Working with Data, using MLUtils.jl","text":"","category":"section"},{"location":"data/mlutils/","page":"Batching Data – MLUtils.jl","title":"Batching Data – MLUtils.jl","text":"Flux re-exports the DataLoader type and utility functions for working with data from MLUtils.","category":"page"},{"location":"data/mlutils/#DataLoader","page":"Batching Data – MLUtils.jl","title":"DataLoader","text":"","category":"section"},{"location":"data/mlutils/","page":"Batching Data – MLUtils.jl","title":"Batching Data – MLUtils.jl","text":"The DataLoader can be used to create mini-batches of data, in the format train! expects.","category":"page"},{"location":"data/mlutils/","page":"Batching Data – MLUtils.jl","title":"Batching Data – MLUtils.jl","text":"Flux's website has a dedicated tutorial on DataLoader for more information. 
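To make the batching advice above concrete, here is a minimal sketch (array sizes are illustrative):

```julia
# Collect 64 feature vectors of length 5, then batch them into one matrix.
xs = [rand(Float32, 5) for _ in 1:64]

x_batch = reduce(hcat, xs)   # hits the optimised reduce method
size(x_batch)                # (5, 64): features × observations

# hcat(xs...) produces the same matrix, but splatting a long vector of
# arguments carries a compile-time and runtime penalty.
```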
","category":"page"},{"location":"data/mlutils/","page":"Batching Data – MLUtils.jl","title":"Batching Data – MLUtils.jl","text":"MLUtils.DataLoader","category":"page"},{"location":"data/mlutils/#MLUtils.DataLoader","page":"Batching Data – MLUtils.jl","title":"MLUtils.DataLoader","text":"DataLoader(data; [batchsize, buffer, collate, parallel, partial, rng, shuffle])\n\nAn object that iterates over mini-batches of data, each mini-batch containing batchsize observations (except possibly the last one).\n\nTakes as input a single data array, a tuple (or a named tuple) of arrays, or in general any data object that implements the numobs and getobs methods.\n\nThe last dimension in each array is the observation dimension, i.e. the one divided into mini-batches.\n\nThe original data is preserved in the data field of the DataLoader.\n\nArguments\n\ndata: The data to be iterated over. The data type has to be supported by numobs and getobs.\nbatchsize: If less than 0, iterates over individual observations. Otherwise, each iteration (except possibly the last) yields a mini-batch containing batchsize observations. Default 1.\nbuffer: If buffer=true and supported by the type of data, a buffer will be allocated and reused for memory efficiency. You can also pass a preallocated object to buffer. Default false.\ncollate: Batching behavior. If nothing (default), a batch is getobs(data, indices). If false, each batch is [getobs(data, i) for i in indices]. When true, applies batch to the vector of observations in a batch, recursively collating arrays in the last dimensions. See batch for more information and examples.\nparallel: Whether to load data in parallel using worker threads. Greatly speeds up data loading by a factor of the number of available threads. Requires starting Julia with multiple threads. Check Threads.nthreads() to see the number of available threads. Passing parallel = true breaks ordering guarantees. Default false.\npartial: This argument is used only when batchsize > 0. 
If partial=false and the number of observations is not divisible by the batchsize, then the last mini-batch is dropped. Default true.\nrng: A random number generator. Default Random.GLOBAL_RNG.\nshuffle: Whether to shuffle the observations before iterating. Unlike wrapping the data container with shuffleobs(data), shuffle=true ensures that the observations are shuffled anew every time you start iterating over eachobs. Default false.\n\nExamples\n\njulia> Xtrain = rand(10, 100);\n\njulia> array_loader = DataLoader(Xtrain, batchsize=2);\n\njulia> for x in array_loader\n @assert size(x) == (10, 2)\n # do something with x, 50 times\n end\n\njulia> array_loader.data === Xtrain\ntrue\n\njulia> tuple_loader = DataLoader((Xtrain,), batchsize=2); # similar, but yielding 1-element tuples\n\njulia> for x in tuple_loader\n @assert x isa Tuple{Matrix}\n @assert size(x[1]) == (10, 2)\n end\n\njulia> Ytrain = rand('a':'z', 100); # now make a DataLoader yielding 2-element named tuples\n\njulia> train_loader = DataLoader((data=Xtrain, label=Ytrain), batchsize=5, shuffle=true);\n\njulia> for epoch in 1:100\n for (x, y) in train_loader # access via tuple destructuring\n @assert size(x) == (10, 5)\n @assert size(y) == (5,)\n # loss += f(x, y) # etc, runs 100 * 20 times\n end\n end\n\njulia> first(train_loader).label isa Vector{Char} # access via property name\ntrue\n\njulia> first(train_loader).label == Ytrain[1:5] # because of shuffle=true\nfalse\n\njulia> foreach(println∘summary, DataLoader(rand(Int8, 10, 64), batchsize=30)) # partial=false would omit last\n10×30 Matrix{Int8}\n10×30 Matrix{Int8}\n10×4 Matrix{Int8}\n\n\n\n\n\n","category":"type"},{"location":"data/mlutils/#Utility-Functions","page":"Batching Data – MLUtils.jl","title":"Utility Functions","text":"","category":"section"},{"location":"data/mlutils/","page":"Batching Data – MLUtils.jl","title":"Batching Data – MLUtils.jl","text":"The utility functions are meant to be used while working with data; these functions help 
create inputs for your models or batch your dataset.","category":"page"},{"location":"data/mlutils/","page":"Batching Data – MLUtils.jl","title":"Batching Data – MLUtils.jl","text":"MLUtils.batch\nMLUtils.batchsize\nMLUtils.batchseq\nMLUtils.BatchView\nMLUtils.chunk\nMLUtils.eachobs\nMLUtils.fill_like\nMLUtils.filterobs\nMLUtils.flatten\nMLUtils.getobs\nMLUtils.getobs!\nMLUtils.joinobs\nMLUtils.group_counts\nMLUtils.group_indices\nMLUtils.groupobs\nMLUtils.kfolds\nMLUtils.leavepout\nMLUtils.mapobs\nMLUtils.numobs\nMLUtils.normalise\nMLUtils.obsview\nMLUtils.ObsView\nMLUtils.ones_like\nMLUtils.oversample\nMLUtils.randobs\nMLUtils.rand_like\nMLUtils.randn_like\nMLUtils.rpad_constant\nMLUtils.shuffleobs\nMLUtils.splitobs\nMLUtils.unbatch\nMLUtils.undersample\nMLUtils.unsqueeze\nMLUtils.unstack\nMLUtils.zeros_like","category":"page"},{"location":"data/mlutils/#MLUtils.batch","page":"Batching Data – MLUtils.jl","title":"MLUtils.batch","text":"batch(xs)\n\nBatch the arrays in xs into a single array with an extra dimension.\n\nIf the elements of xs are tuples, named tuples, or dicts, the output will be of the same type. \n\nSee also unbatch.\n\nExamples\n\njulia> batch([[1,2,3], \n [4,5,6]])\n3×2 Matrix{Int64}:\n 1 4\n 2 5\n 3 6\n\njulia> batch([(a=[1,2], b=[3,4])\n (a=[5,6], b=[7,8])]) \n(a = [1 5; 2 6], b = [3 7; 4 8])\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.batchsize","page":"Batching Data – MLUtils.jl","title":"MLUtils.batchsize","text":"batchsize(data) -> Int\n\nReturn the fixed size of each batch in data.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.batchseq","page":"Batching Data – MLUtils.jl","title":"MLUtils.batchseq","text":"batchseq(seqs, val = 0)\n\nTake a list of N sequences, and turn them into a single sequence where each item is a batch of N. 
Short sequences will be padded by val.\n\nExamples\n\njulia> batchseq([[1, 2, 3], [4, 5]], 0)\n3-element Vector{Vector{Int64}}:\n [1, 4]\n [2, 5]\n [3, 0]\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.BatchView","page":"Batching Data – MLUtils.jl","title":"MLUtils.BatchView","text":"BatchView(data, batchsize; partial=true, collate=nothing)\nBatchView(data; batchsize=1, partial=true, collate=nothing)\n\nCreate a view of the given data that represents it as a vector of batches. Each batch will contain the same number of observations. The batch-size can be specified using the parameter batchsize. In the case that the size of the dataset is not divisible by the specified batchsize, the remaining observations will be ignored if partial=false. If partial=true, the last batch may instead be slightly smaller.\n\nNote that any data access is delayed until getindex is called.\n\nIf used as an iterator, the object will iterate over the dataset once, effectively denoting an epoch.\n\nFor BatchView to work on some data structure, the type of the given variable data must implement the data container interface. See ObsView for more info.\n\nArguments\n\ndata : The object describing the dataset. Can be of any type as long as it implements getobs and numobs (see Details for more information).\nbatchsize : The batch-size of each batch. It is the number of observations that each batch must contain (except possibly for the last one).\npartial : If partial=false and the number of observations is not divisible by the batch-size, then the last mini-batch is dropped.\ncollate: Batching behavior. If nothing (default), a batch is getobs(data, indices). If false, each batch is [getobs(data, i) for i in indices]. When true, applies batch to the vector of observations in a batch, recursively collating arrays in the last dimensions. 
See batch for more information and examples.\n\nExamples\n\nusing MLUtils\nX, Y = MLUtils.load_iris()\n\nA = BatchView(X, batchsize=30)\n@assert typeof(A) <: BatchView <: AbstractVector\n@assert eltype(A) <: SubArray{Float64,2}\n@assert length(A) == 5 # Iris has 150 observations\n@assert size(A[1]) == (4,30) # Iris has 4 features\n\n# 5 batches of size 30 observations\nfor x in BatchView(X, batchsize=30)\n @assert typeof(x) <: SubArray{Float64,2}\n @assert numobs(x) === 30\nend\n\n# 7 batches of size 20 observations\n# Note that the iris dataset has 150 observations,\n# which means that with a batchsize of 20, the last\n# 10 observations will be ignored\nfor (x, y) in BatchView((X, Y), batchsize=20, partial=false)\n @assert typeof(x) <: SubArray{Float64,2}\n @assert typeof(y) <: SubArray{String,1}\n @assert numobs(x) == numobs(y) == 20\nend\n\n# collate tuple observations\nfor (x, y) in BatchView((rand(10, 3), [\"a\", \"b\", \"c\"]), batchsize=2, collate=true, partial=false)\n @assert size(x) == (10, 2)\n @assert size(y) == (2,)\nend\n\n\n# randomly assign observations to one and only one batch.\nfor (x, y) in BatchView(shuffleobs((X, Y)), batchsize=20)\n @assert typeof(x) <: SubArray{Float64,2}\n @assert typeof(y) <: SubArray{String,1}\nend\n\n\n\n\n\n","category":"type"},{"location":"data/mlutils/#MLUtils.chunk","page":"Batching Data – MLUtils.jl","title":"MLUtils.chunk","text":"chunk(x, n; [dims])\nchunk(x; [size, dims])\n\nSplit x into n parts or alternatively, if size is an integer, into equal chunks of size size. 
The parts contain the same number of elements except possibly for the last one that can be smaller.\n\nIn case size is a collection of integers instead, the elements of x are split into chunks of the given sizes.\n\nIf x is an array, dims can be used to specify along which dimension to split (defaults to the last dimension).\n\nExamples\n\njulia> chunk(1:10, 3)\n3-element Vector{UnitRange{Int64}}:\n 1:4\n 5:8\n 9:10\n\njulia> chunk(1:10; size = 2)\n5-element Vector{UnitRange{Int64}}:\n 1:2\n 3:4\n 5:6\n 7:8\n 9:10\n\njulia> x = reshape(collect(1:20), (5, 4))\n5×4 Matrix{Int64}:\n 1 6 11 16\n 2 7 12 17\n 3 8 13 18\n 4 9 14 19\n 5 10 15 20\n\njulia> xs = chunk(x, 2, dims=1)\n2-element Vector{SubArray{Int64, 2, Matrix{Int64}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}}:\n [1 6 11 16; 2 7 12 17; 3 8 13 18]\n [4 9 14 19; 5 10 15 20]\n\njulia> xs[1]\n3×4 view(::Matrix{Int64}, 1:3, :) with eltype Int64:\n 1 6 11 16\n 2 7 12 17\n 3 8 13 18\n\njulia> xes = chunk(x; size = 2, dims = 2)\n2-element Vector{SubArray{Int64, 2, Matrix{Int64}, Tuple{Base.Slice{Base.OneTo{Int64}}, UnitRange{Int64}}, true}}:\n [1 6; 2 7; … ; 4 9; 5 10]\n [11 16; 12 17; … ; 14 19; 15 20]\n\njulia> xes[2]\n5×2 view(::Matrix{Int64}, :, 3:4) with eltype Int64:\n 11 16\n 12 17\n 13 18\n 14 19\n 15 20\n\njulia> chunk(1:6; size = [2, 4])\n2-element Vector{UnitRange{Int64}}:\n 1:2\n 3:6\n\n\n\n\n\nchunk(x, partition_idxs; [npartitions, dims])\n\nPartition the array x along the dimension dims according to the indexes in partition_idxs.\n\npartition_idxs must be sorted and contain only positive integers between 1 and the number of partitions. 
\n\nIf the number of partitions npartitions is not provided, it is inferred from partition_idxs.\n\nIf dims is not provided, it defaults to the last dimension.\n\nSee also unbatch.\n\nExamples\n\njulia> x = reshape([1:10;], 2, 5)\n2×5 Matrix{Int64}:\n 1 3 5 7 9\n 2 4 6 8 10\n\njulia> chunk(x, [1, 2, 2, 3, 3])\n3-element Vector{SubArray{Int64, 2, Matrix{Int64}, Tuple{Base.Slice{Base.OneTo{Int64}}, UnitRange{Int64}}, true}}:\n [1; 2;;]\n [3 5; 4 6]\n [7 9; 8 10]\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.eachobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.eachobs","text":"eachobs(data; kws...)\n\nReturn an iterator over data.\n\nSupports the same arguments as DataLoader. The batchsize default is -1 here while it is 1 for DataLoader.\n\nExamples\n\nX = rand(4,100)\n\nfor x in eachobs(X)\n # loop entered 100 times\n @assert typeof(x) <: Vector{Float64}\n @assert size(x) == (4,)\nend\n\n# mini-batch iterations\nfor x in eachobs(X, batchsize=10)\n # loop entered 10 times\n @assert typeof(x) <: Matrix{Float64}\n @assert size(x) == (4,10)\nend\n\n# support for tuples, named tuples, dicts\nfor (x, y) in eachobs((X, Y))\n # ...\nend\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.fill_like","page":"Batching Data – MLUtils.jl","title":"MLUtils.fill_like","text":"fill_like(x, val, [element_type=eltype(x)], [dims=size(x)])\n\nCreate an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to val. The third and fourth arguments are both optional, defaulting to the given array's eltype and size. 
The dimensions may be specified as an integer or as a tuple argument.\n\nSee also zeros_like and ones_like.\n\nExamples\n\njulia> x = rand(Float32, 2)\n2-element Vector{Float32}:\n 0.16087806\n 0.89916044\n\njulia> fill_like(x, 1.7, (3, 3))\n3×3 Matrix{Float32}:\n 1.7 1.7 1.7\n 1.7 1.7 1.7\n 1.7 1.7 1.7\n\njulia> using CUDA\n\njulia> x = CUDA.rand(2, 2)\n2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 0.803167 0.476101\n 0.303041 0.317581\n\njulia> fill_like(x, 1.7, Float64)\n2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:\n 1.7 1.7\n 1.7 1.7\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.filterobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.filterobs","text":"filterobs(f, data)\n\nReturn a subset of data container data including all indices i for which f(getobs(data, i)) === true.\n\ndata = 1:10\nnumobs(data) == 10\nfdata = filterobs(>(5), data)\nnumobs(fdata) == 5\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.flatten","page":"Batching Data – MLUtils.jl","title":"MLUtils.flatten","text":"flatten(x::AbstractArray)\n\nReshape arbitrarily-shaped input into a matrix-shaped output, preserving the size of the last dimension.\n\nSee also unsqueeze.\n\nExamples\n\njulia> rand(3,4,5) |> flatten |> size\n(12, 5)\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.getobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.getobs","text":"getobs(data, [idx])\n\nReturn the observations corresponding to the observation index idx. Note that idx can be any type as long as data has defined getobs for that type. If idx is not provided, then materialize all observations in data.\n\nIf data does not have getobs defined, then in the case of Tables.table(data) == true returns the row(s) in position idx, otherwise returns data[idx].\n\nAuthors of custom data containers should implement Base.getindex for their type instead of getobs. 
getobs should only be implemented for types where there is a difference between getobs and Base.getindex (such as multi-dimensional arrays).\n\nThe returned observation(s) should be in the form intended to be passed as-is to some learning algorithm. There is no strict interface requirement on how this \"actual data\" must look like. Every author behind some custom data container can make this decision themselves. The output should be consistent when idx is a scalar vs vector.\n\ngetobs supports by default nested combinations of array, tuple, named tuples, and dictionaries. \n\nSee also getobs! and numobs.\n\nExamples\n\n# named tuples \nx = (a = [1, 2, 3], b = rand(6, 3))\n\ngetobs(x, 2) == (a = 2, b = x.b[:, 2])\ngetobs(x, [1, 3]) == (a = [1, 3], b = x.b[:, [1, 3]])\n\n\n# dictionaries\nx = Dict(:a => [1, 2, 3], :b => rand(6, 3))\n\ngetobs(x, 2) == Dict(:a => 2, :b => x[:b][:, 2])\ngetobs(x, [1, 3]) == Dict(:a => [1, 3], :b => x[:b][:, [1, 3]])\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.getobs!","page":"Batching Data – MLUtils.jl","title":"MLUtils.getobs!","text":"getobs!(buffer, data, idx)\n\nInplace version of getobs(data, idx). If this method is defined for the type of data, then buffer should be used to store the result, instead of allocating a dedicated object.\n\nImplementing this function is optional. In the case no such method is provided for the type of data, then buffer will be ignored and the result of getobs returned. This could be because the type of data may not lend itself to the concept of copy!. Thus, supporting a custom getobs! is optional and not required.\n\nSee also getobs and numobs. 
\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.joinobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.joinobs","text":"joinobs(datas...)\n\nConcatenate data containers datas.\n\ndata1, data2 = 1:10, 11:20\njdata = joinumobs(data1, data2)\ngetobs(jdata, 15) == 15\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.group_counts","page":"Batching Data – MLUtils.jl","title":"MLUtils.group_counts","text":"group_counts(x)\n\nCount the number of times that each element of x appears.\n\nSee also group_indices\n\nExamples\n\njulia> group_counts(['a', 'b', 'b'])\nDict{Char, Int64} with 2 entries:\n 'a' => 1\n 'b' => 2\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.group_indices","page":"Batching Data – MLUtils.jl","title":"MLUtils.group_indices","text":"group_indices(x) -> Dict\n\nComputes the indices of elements in the vector x for each distinct value contained. This information is useful for resampling strategies, such as stratified sampling.\n\nSee also group_counts.\n\nExamples\n\njulia> x = [:yes, :no, :maybe, :yes];\n\njulia> group_indices(x)\nDict{Symbol, Vector{Int64}} with 3 entries:\n :yes => [1, 4]\n :maybe => [3]\n :no => [2]\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.groupobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.groupobs","text":"groupobs(f, data)\n\nSplit data container data data into different data containers, grouping observations by f(obs).\n\ndata = -10:10\ndatas = groupobs(>(0), data)\nlength(datas) == 2\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.kfolds","page":"Batching Data – MLUtils.jl","title":"MLUtils.kfolds","text":"kfolds(n::Integer, k = 5) -> Tuple\n\nCompute the train/validation assignments for k repartitions of n observations, and return them in the form of two vectors. 
The first vector contains the index-vectors for the training subsets, and the second vector the index-vectors for the validation subsets respectively. A general rule of thumb is to use either k = 5 or k = 10. The following code snippet generates the index assignments for k = 5\n\njulia> train_idx, val_idx = kfolds(10, 5);\n\nEach observation is assigned to the validation subset once (and only once). Thus, a union over all validation index-vectors reproduces the full range 1:n. Note that there is no random assignment of observations to subsets, which means that adjacent observations are likely to be part of the same validation subset.\n\njulia> train_idx\n5-element Array{Array{Int64,1},1}:\n [3,4,5,6,7,8,9,10]\n [1,2,5,6,7,8,9,10]\n [1,2,3,4,7,8,9,10]\n [1,2,3,4,5,6,9,10]\n [1,2,3,4,5,6,7,8]\n\njulia> val_idx\n5-element Array{UnitRange{Int64},1}:\n 1:2\n 3:4\n 5:6\n 7:8\n 9:10\n\n\n\n\n\nkfolds(data, [k = 5])\n\nRepartition a data container k times using a k-folds strategy and return the sequence of folds as a lazy iterator. Only data subsets are created, which means that no actual data is copied until getobs is invoked.\n\nConceptually, a k-folds repartitioning strategy divides the given data into k roughly equal-sized parts. Each part will serve as validation set once, while the remaining parts are used for training. This results in k different partitions of data.\n\nIn the case that the size of the dataset is not divisible by the specified k, the remaining observations will be evenly distributed among the parts.\n\nfor (x_train, x_val) in kfolds(X, k=10)\n # code called 10 times\n # nobs(x_val) may differ up to ±1 over iterations\nend\n\nMultiple variables are supported (e.g. for labeled data)\n\nfor ((x_train, y_train), val) in kfolds((X, Y), k=10)\n # ...\nend\n\nBy default the folds are created using static splits. 
Use shuffleobs to randomly assign observations to the folds.\n\nfor (x_train, x_val) in kfolds(shuffleobs(X), k = 10)\n # ...\nend\n\nSee leavepout for a related function.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.leavepout","page":"Batching Data – MLUtils.jl","title":"MLUtils.leavepout","text":"leavepout(n::Integer, [size = 1]) -> Tuple\n\nCompute the train/validation assignments for k ≈ n/size repartitions of n observations, and return them in the form of two vectors. The first vector contains the index-vectors for the training subsets, and the second vector the index-vectors for the validation subsets respectively. Each validation subset will have either size or size+1 observations assigned to it. The following code snippet generates the index-vectors for size = 2.\n\njulia> train_idx, val_idx = leavepout(10, 2);\n\nEach observation is assigned to the validation subset once (and only once). Thus, a union over all validation index-vectors reproduces the full range 1:n. Note that there is no random assignment of observations to subsets, which means that adjacent observations are likely to be part of the same validation subset.\n\njulia> train_idx\n5-element Array{Array{Int64,1},1}:\n [3,4,5,6,7,8,9,10]\n [1,2,5,6,7,8,9,10]\n [1,2,3,4,7,8,9,10]\n [1,2,3,4,5,6,9,10]\n [1,2,3,4,5,6,7,8]\n\njulia> val_idx\n5-element Array{UnitRange{Int64},1}:\n 1:2\n 3:4\n 5:6\n 7:8\n 9:10\n\n\n\n\n\nleavepout(data, p = 1)\n\nRepartition a data container using a k-fold strategy, where k is chosen in such a way, that each validation subset of the resulting folds contains roughly p observations. Defaults to p = 1, which is also known as \"leave-one-out\" partitioning.\n\nThe resulting sequence of folds is returned as a lazy iterator. Only data subsets are created. 
That means no actual data is copied until getobs is invoked.\n\nfor (train, val) in leavepout(X, p=2)\n # if nobs(X) is divisible by 2,\n # then numobs(val) will be 2 for each iteration,\n # otherwise it may be 3 for the first few iterations.\nend\n\nSee kfolds for a related function.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.mapobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.mapobs","text":"mapobs(f, data; batched=:auto)\n\nLazily map f over the observations in a data container data. Returns a new data container mdata that can be indexed and has a length. Indexing triggers the transformation f.\n\nThe batched keyword argument controls the behavior of mdata[idx] and mdata[idxs] where idx is an integer and idxs is a vector of integers:\n\nbatched=:auto (default). Let f handle the two cases. Calls f(getobs(data, idx)) and f(getobs(data, idxs)).\nbatched=:never. The function f is always called on a single observation. Calls f(getobs(data, idx)) and [f(getobs(data, idx)) for idx in idxs].\nbatched=:always. The function f is always called on a batch of observations. Calls getobs(f(getobs(data, [idx])), 1) and f(getobs(data, idxs)).\n\nExamples\n\njulia> data = (a=[1,2,3], b=[1,2,3]);\n\njulia> mdata = mapobs(data) do x\n (c = x.a .+ x.b, d = x.a .- x.b)\n end\nmapobs(#25, (a = [1, 2, 3], b = [1, 2, 3]); batched=:auto))\n\njulia> mdata[1]\n(c = 2, d = 0)\n\njulia> mdata[1:2]\n(c = [2, 4], d = [0, 0])\n\n\n\n\n\nmapobs(fs, data)\n\nLazily map each function in tuple fs over the observations in data container data. Returns a tuple of transformed data containers.\n\n\n\n\n\nmapobs(namedfs::NamedTuple, data)\n\nMap a NamedTuple of functions over data, turning it into a data container of NamedTuples. 
Field syntax can be used to select a column of the resulting data container.\n\ndata = 1:10\nnameddata = mapobs((x = sqrt, y = log), data)\ngetobs(nameddata, 10) == (x = sqrt(10), y = log(10))\ngetobs(nameddata.x, 10) == sqrt(10)\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.numobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.numobs","text":"numobs(data)\n\nReturn the total number of observations contained in data.\n\nIf data does not have numobs defined, then in the case of Tables.table(data) == true returns the number of rows, otherwise returns length(data).\n\nAuthors of custom data containers should implement Base.length for their type instead of numobs. numobs should only be implemented for types where there is a difference between numobs and Base.length (such as multi-dimensional arrays).\n\ngetobs supports by default nested combinations of array, tuple, named tuples, and dictionaries. \n\nSee also getobs.\n\nExamples\n\n\n# named tuples \nx = (a = [1, 2, 3], b = rand(6, 3))\nnumobs(x) == 3\n\n# dictionaries\nx = Dict(:a => [1, 2, 3], :b => rand(6, 3))\nnumobs(x) == 3\n\nAll internal containers must have the same number of observations:\n\njulia> x = (a = [1, 2, 3, 4], b = rand(6, 3));\n\njulia> numobs(x)\nERROR: DimensionMismatch: All data containers must have the same number of observations.\nStacktrace:\n [1] _check_numobs_error()\n @ MLUtils ~/.julia/dev/MLUtils/src/observation.jl:163\n [2] _check_numobs\n @ ~/.julia/dev/MLUtils/src/observation.jl:130 [inlined]\n [3] numobs(data::NamedTuple{(:a, :b), Tuple{Vector{Int64}, Matrix{Float64}}})\n @ MLUtils ~/.julia/dev/MLUtils/src/observation.jl:177\n [4] top-level scope\n @ REPL[35]:1\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.normalise","page":"Batching Data – MLUtils.jl","title":"MLUtils.normalise","text":"normalise(x; dims=ndims(x), ϵ=1e-5)\n\nNormalise the array x to mean 0 and standard deviation 1 across the dimension(s) given by dims. 
By default, dims is the last dimension. \n\nϵ is a small constant added to the denominator for numerical stability.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.obsview","page":"Batching Data – MLUtils.jl","title":"MLUtils.obsview","text":"obsview(data, [indices])\n\nReturns a lazy view of the observations in data that correspond to the given indices. No data will be copied except for the indices. It is similar to constructing an ObsView, but returns a SubArray if the type of data is Array or SubArray. Furthermore, this function may be extended for custom types of data that also want to provide their own subset-type.\n\nIn case data is a tuple, the constructor will be mapped over its elements. That means that the constructor returns a tuple of ObsView instead of an ObsView of tuples.\n\nIf instead you want to get the subset of observations corresponding to the given indices in their native type, use getobs.\n\nSee ObsView for more information.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.ObsView","page":"Batching Data – MLUtils.jl","title":"MLUtils.ObsView","text":"ObsView(data, [indices])\n\nUsed to represent a subset of some data of arbitrary type by storing which observation-indices the subset spans. Furthermore, subsequent subsettings are accumulated without needing to access actual data.\n\nThe main purpose for the existence of ObsView is to delay data access and movement until an actual batch of data (or single observation) is needed for some computation. This is particularly useful when the data is not located in memory, but on the hard drive or some remote location. In such a scenario one wants to load the required data only when needed.\n\nAny data access is delayed until getindex is called, and even getindex returns the result of obsview which in general avoids data movement until getobs is called. If used as an iterator, the view will iterate over the dataset once, effectively denoting an epoch. 
Each iteration will return a lazy subset to the current observation.\n\nArguments\n\ndata : The object describing the dataset. Can be of any type as long as it implements getobs and numobs (see Details for more information).\nindices : Optional. The index or indices of the observation(s) in data that the subset should represent. Can be of type Int or some subtype of AbstractVector.\n\nMethods\n\ngetindex : Returns the observation(s) of the given index/indices. No data is copied aside from the required indices.\nnumobs : Returns the total number observations in the subset.\ngetobs : Returns the underlying data that the ObsView represents at the given relative indices. Note that these indices are in \"subset space\", and in general will not directly correspond to the same indices in the underlying data set.\n\nDetails\n\nFor ObsView to work on some data structure, the desired type MyType must implement the following interface:\n\ngetobs(data::MyType, idx) : Should return the observation(s) indexed by idx. In what form is up to the user. Note that idx can be of type Int or AbstractVector.\nnumobs(data::MyType) : Should return the total number of observations in data\n\nThe following methods can also be provided and are optional:\n\ngetobs(data::MyType) : By default this function is the identity function. If that is not the behaviour that you want for your type, you need to provide this method as well.\nobsview(data::MyType, idx) : If your custom type has its own kind of subset type, you can return it here. An example for such a case are SubArray for representing a subset of some AbstractArray.\ngetobs!(buffer, data::MyType, [idx]) : Inplace version of getobs(data, idx). If this method is provided for MyType, then eachobs can preallocate a buffer that is then reused every iteration. 
Note: buffer should be equivalent to the return value of getobs(::MyType, ...), since this is how buffer is preallocated by default.\n\nExamples\n\nX, Y = MLUtils.load_iris()\n\n# The iris set has 150 observations and 4 features\n@assert size(X) == (4,150)\n\n# Represents the 80 observations as an ObsView\nv = ObsView(X, 21:100)\n@assert numobs(v) == 80\n@assert typeof(v) <: ObsView\n# getobs indexes into v\n@assert getobs(v, 1:10) == X[:, 21:30]\n\n# Use `obsview` to avoid boxing into ObsView\n# for types that provide a custom \"subset\", such as arrays.\n# Here it instead creates a native SubArray.\nv = obsview(X, 1:100)\n@assert numobs(v) == 100\n@assert typeof(v) <: SubArray\n\n# Also works for tuples of arbitrary length\nsubset = obsview((X, Y), 1:100)\n@assert numobs(subset) == 100\n@assert typeof(subset) <: Tuple # tuple of SubArray\n\n# Use as iterator\nfor x in ObsView(X)\n @assert typeof(x) <: SubArray{Float64,1}\nend\n\n# iterate over each individual labeled observation\nfor (x, y) in ObsView((X, Y))\n @assert typeof(x) <: SubArray{Float64,1}\n @assert typeof(y) <: String\nend\n\n# same but in random order\nfor (x, y) in ObsView(shuffleobs((X, Y)))\n @assert typeof(x) <: SubArray{Float64,1}\n @assert typeof(y) <: String\nend\n\n# Indexing: take first 10 observations\nx, y = ObsView((X, Y))[1:10]\n\nSee also\n\nobsview, getobs, numobs, splitobs, shuffleobs, kfolds.\n\n\n\n\n\n","category":"type"},{"location":"data/mlutils/#MLUtils.ones_like","page":"Batching Data – MLUtils.jl","title":"MLUtils.ones_like","text":"ones_like(x, [element_type=eltype(x)], [dims=size(x)])\n\nCreate an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to 1. The second and third arguments are both optional, defaulting to the given array's eltype and size. 
The dimensions may be specified as an integer or as a tuple argument.\n\nSee also zeros_like and fill_like.\n\nExamples\n\njulia> x = rand(Float32, 2)\n2-element Vector{Float32}:\n 0.8621633\n 0.5158395\n\njulia> ones_like(x, (3, 3))\n3×3 Matrix{Float32}:\n 1.0 1.0 1.0\n 1.0 1.0 1.0\n 1.0 1.0 1.0\n\njulia> using CUDA\n\njulia> x = CUDA.rand(2, 2)\n2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 0.82297 0.656143\n 0.701828 0.391335\n\njulia> ones_like(x, Float64)\n2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:\n 1.0 1.0\n 1.0 1.0\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.oversample","page":"Batching Data – MLUtils.jl","title":"MLUtils.oversample","text":"oversample(data, classes; fraction=1, shuffle=true)\noversample(data::Tuple; fraction=1, shuffle=true)\n\nGenerate a re-balanced version of data by repeatedly sampling existing observations in such a way that every class will have at least fraction times the number of observations of the largest class in classes. This way, all classes will have a minimum number of observations in the resulting data set relative to what the largest class has in the given (original) data.\n\nAs an example, by default (i.e. with fraction = 1) the resulting dataset will be nearly perfectly balanced. On the other hand, with fraction = 0.5 every class in the resulting data will have at least 50% as many observations as the largest class.\n\nThe classes input is an array with the same length as numobs(data). \n\nThe convenience parameter shuffle determines if the resulting data will be shuffled after its creation; if it is not shuffled then all the repeated samples will be together at the end, sorted by class. 
Defaults to true.\n\nThe output will contain both the resampled data and classes.\n\n# 6 observations with 3 features each\nX = rand(3, 6)\n# 2 classes, severely imbalanced\nY = [\"a\", \"b\", \"b\", \"b\", \"b\", \"a\"]\n\n# oversample the class \"a\" to match \"b\"\nX_bal, Y_bal = oversample(X, Y)\n\n# this results in a bigger dataset with repeated data\n@assert size(X_bal) == (3,8)\n@assert length(Y_bal) == 8\n\n# now both \"a\", and \"b\" have 4 observations each\n@assert sum(Y_bal .== \"a\") == 4\n@assert sum(Y_bal .== \"b\") == 4\n\nFor this function to work, the type of data must implement numobs and getobs. \n\nNote that if data is a tuple and classes is not given, then it will be assumed that the last element of the tuple contains the classes.\n\njulia> data = DataFrame(X1=rand(6), X2=rand(6), Y=[:a,:b,:b,:b,:b,:a])\n6×3 DataFrames.DataFrame\n│ Row │ X1 │ X2 │ Y │\n├─────┼───────────┼─────────────┼───┤\n│ 1 │ 0.226582 │ 0.0443222 │ a │\n│ 2 │ 0.504629 │ 0.722906 │ b │\n│ 3 │ 0.933372 │ 0.812814 │ b │\n│ 4 │ 0.522172 │ 0.245457 │ b │\n│ 5 │ 0.505208 │ 0.11202 │ b │\n│ 6 │ 0.0997825 │ 0.000341996 │ a │\n\njulia> getobs(oversample(data, data.Y))\n8×3 DataFrame\n Row │ X1 X2 Y \n │ Float64 Float64 Symbol \n─────┼─────────────────────────────\n 1 │ 0.376304 0.100022 a\n 2 │ 0.467095 0.185437 b\n 3 │ 0.481957 0.319906 b\n 4 │ 0.336762 0.390811 b\n 5 │ 0.376304 0.100022 a\n 6 │ 0.427064 0.0648339 a\n 7 │ 0.427064 0.0648339 a\n 8 │ 0.457043 0.490688 b\n\nSee ObsView for more information on data subsets. See also undersample.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.randobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.randobs","text":"randobs(data, [n])\n\nPick a random observation or a batch of n random observations from data. 
For this function to work, the type of data must implement numobs and getobs.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.rand_like","page":"Batching Data – MLUtils.jl","title":"MLUtils.rand_like","text":"rand_like([rng=default_rng()], x, [element_type=eltype(x)], [dims=size(x)])\n\nCreate an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to a random value. The last two arguments are both optional, defaulting to the given array's eltype and size. The dimensions may be specified as an integer or as a tuple argument.\n\nThe default random number generator is used, unless a custom one is passed in explicitly as the first argument.\n\nSee also Base.rand and randn_like.\n\nExamples\n\njulia> x = ones(Float32, 2)\n2-element Vector{Float32}:\n 1.0\n 1.0\n\njulia> rand_like(x, (3, 3))\n3×3 Matrix{Float32}:\n 0.780032 0.920552 0.53689\n 0.121451 0.741334 0.5449\n 0.55348 0.138136 0.556404\n\njulia> using CUDA\n\njulia> x = CUDA.ones(2, 2)\n2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 1.0 1.0\n 1.0 1.0\n\njulia> rand_like(x, Float64)\n2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:\n 0.429274 0.135379\n 0.718895 0.0098756\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.randn_like","page":"Batching Data – MLUtils.jl","title":"MLUtils.randn_like","text":"randn_like([rng=default_rng()], x, [element_type=eltype(x)], [dims=size(x)])\n\nCreate an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to a random value drawn from a normal distribution. The last two arguments are both optional, defaulting to the given array's eltype and size. 
The dimensions may be specified as an integer or as a tuple argument.\n\nThe default random number generator is used, unless a custom one is passed in explicitly as the first argument.\n\nSee also Base.randn and rand_like.\n\nExamples\n\njulia> x = ones(Float32, 2)\n2-element Vector{Float32}:\n 1.0\n 1.0\n\njulia> randn_like(x, (3, 3))\n3×3 Matrix{Float32}:\n -0.385331 0.956231 0.0745102\n 1.43756 -0.967328 2.06311\n 0.0482372 1.78728 -0.902547\n\njulia> using CUDA\n\njulia> x = CUDA.ones(2, 2)\n2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 1.0 1.0\n 1.0 1.0\n\njulia> randn_like(x, Float64)\n2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:\n -0.578527 0.823445\n -1.01338 -0.612053\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.rpad_constant","page":"Batching Data – MLUtils.jl","title":"MLUtils.rpad_constant","text":"rpad_constant(v::AbstractArray, n::Union{Integer, Tuple}, val = 0; dims=:)\n\nReturn the given sequence padded with val along the dimensions dims up to a maximum length in each direction specified by n.\n\nExamples\n\njulia> rpad_constant([1, 2], 4, -1) # padding with -1 up to size 4\n4-element Vector{Int64}:\n 1\n 2\n -1\n -1\n\njulia> rpad_constant([1, 2, 3], 2) # no padding if length is already greater than n\n3-element Vector{Int64}:\n 1\n 2\n 3\n\njulia> rpad_constant([1 2; 3 4], 4; dims=1) # padding along the first dimension\n4×2 Matrix{Int64}:\n 1 2\n 3 4\n 0 0\n 0 0 \n\njulia> rpad_constant([1 2; 3 4], 4) # padding along all dimensions by default\n4×2 Matrix{Int64}:\n 1 2\n 3 4\n 0 0\n 0 0 \n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.shuffleobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.shuffleobs","text":"shuffleobs([rng], data)\n\nReturn a \"subset\" of data that spans all observations, but has the order of the observations shuffled.\n\nThe values of data itself are not copied. Instead only the indices are shuffled. 
This function calls obsview to accomplish that, which means that the return value is likely of a different type than data.\n\n# For Arrays the subset will be of type SubArray\n@assert typeof(shuffleobs(rand(4,10))) <: SubArray\n\n# Iterate through all observations in random order\nfor x in eachobs(shuffleobs(X))\n ...\nend\n\nThe optional parameter rng allows one to specify the random number generator used for shuffling. This is useful when reproducible results are desired. By default, uses the global RNG. See Random in Julia's standard library for more info.\n\nFor this function to work, the type of data must implement numobs and getobs. See ObsView for more information.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.splitobs","page":"Batching Data – MLUtils.jl","title":"MLUtils.splitobs","text":"splitobs(n::Int; at) -> Tuple\n\nCompute the indices for two or more disjoint subsets of the range 1:n with splits given by at.\n\nExamples\n\njulia> splitobs(100, at=0.7)\n(1:70, 71:100)\n\njulia> splitobs(100, at=(0.1, 0.4))\n(1:10, 11:50, 51:100)\n\n\n\n\n\nsplitobs(data; at, shuffle=false) -> Tuple\n\nSplit the data into multiple subsets proportional to the value(s) of at. 
\n\nIf shuffle=true, randomly permute the observations before splitting.\n\nSupports any datatype implementing the numobs and getobs interfaces.\n\nExamples\n\n# A 70%-30% split\ntrain, test = splitobs(X, at=0.7)\n\n# A 50%-30%-20% split\ntrain, val, test = splitobs(X, at=(0.5, 0.3))\n\n# A 70%-30% split with multiple arrays and shuffling\ntrain, test = splitobs((X, y), at=0.7, shuffle=true)\nXtrain, Ytrain = train\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.unbatch","page":"Batching Data – MLUtils.jl","title":"MLUtils.unbatch","text":"unbatch(x)\n\nReverse of the batch operation, unstacking the last dimension of the array x.\n\nSee also unstack and chunk.\n\nExamples\n\njulia> unbatch([1 3 5 7;\n 2 4 6 8])\n4-element Vector{Vector{Int64}}:\n [1, 2]\n [3, 4]\n [5, 6]\n [7, 8]\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.undersample","page":"Batching Data – MLUtils.jl","title":"MLUtils.undersample","text":"undersample(data, classes; shuffle=true)\n\nGenerate a class-balanced version of data by subsampling its observations in such a way that the resulting number of observations will be the same number for every class. This way, all classes will have as many observations in the resulting data set as the smallest class has in the given (original) data.\n\nThe convenience parameter shuffle determines if the resulting data will be shuffled after its creation; if it is not shuffled then all the observations will be in their original order. 
Defaults to true.\n\nThe output will contain both the resampled data and classes.\n\n# 6 observations with 3 features each\nX = rand(3, 6)\n# 2 classes, severely imbalanced\nY = [\"a\", \"b\", \"b\", \"b\", \"b\", \"a\"]\n\n# subsample the class \"b\" to match \"a\"\nX_bal, Y_bal = undersample(X, Y)\n\n# this results in a smaller dataset\n@assert size(X_bal) == (3,4)\n@assert length(Y_bal) == 4\n\n# now both \"a\" and \"b\" have 2 observations each\n@assert sum(Y_bal .== \"a\") == 2\n@assert sum(Y_bal .== \"b\") == 2\n\nFor this function to work, the type of data must implement numobs and getobs. \n\nNote that if data is a tuple, then it will be assumed that the last element of the tuple contains the targets.\n\njulia> data = DataFrame(X1=rand(6), X2=rand(6), Y=[:a,:b,:b,:b,:b,:a])\n6×3 DataFrames.DataFrame\n│ Row │ X1 │ X2 │ Y │\n├─────┼───────────┼─────────────┼───┤\n│ 1 │ 0.226582 │ 0.0443222 │ a │\n│ 2 │ 0.504629 │ 0.722906 │ b │\n│ 3 │ 0.933372 │ 0.812814 │ b │\n│ 4 │ 0.522172 │ 0.245457 │ b │\n│ 5 │ 0.505208 │ 0.11202 │ b │\n│ 6 │ 0.0997825 │ 0.000341996 │ a │\n\njulia> getobs(undersample(data, data.Y))\n4×3 DataFrame\n Row │ X1 X2 Y \n │ Float64 Float64 Symbol \n─────┼─────────────────────────────\n 1 │ 0.427064 0.0648339 a\n 2 │ 0.376304 0.100022 a\n 3 │ 0.467095 0.185437 b\n 4 │ 0.457043 0.490688 b\n\nSee ObsView for more information on data subsets. See also oversample.\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.unsqueeze","page":"Batching Data – MLUtils.jl","title":"MLUtils.unsqueeze","text":"unsqueeze(x; dims)\n\nReturn x reshaped into an array one dimensionality higher than x, where dims indicates in which dimension x is extended. 
dims can be an integer between 1 and ndims(x)+1.\n\nSee also flatten, stack.\n\nExamples\n\njulia> unsqueeze([1 2; 3 4], dims=2)\n2×1×2 Array{Int64, 3}:\n[:, :, 1] =\n 1\n 3\n\n[:, :, 2] =\n 2\n 4\n\n\njulia> xs = [[1, 2], [3, 4], [5, 6]]\n3-element Vector{Vector{Int64}}:\n [1, 2]\n [3, 4]\n [5, 6]\n\njulia> unsqueeze(xs, dims=1)\n1×3 Matrix{Vector{Int64}}:\n [1, 2] [3, 4] [5, 6]\n\n\n\n\n\nunsqueeze(; dims)\n\nReturns a function which, acting on an array, inserts a dimension of size 1 at dims.\n\nExamples\n\njulia> rand(21, 22, 23) |> unsqueeze(dims=2) |> size\n(21, 1, 22, 23)\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.unstack","page":"Batching Data – MLUtils.jl","title":"MLUtils.unstack","text":"unstack(xs; dims)\n\nUnroll the given xs into an array of arrays along the given dimension dims.\n\nSee also stack, unbatch, and chunk.\n\nExamples\n\njulia> unstack([1 3 5 7; 2 4 6 8], dims=2)\n4-element Vector{Vector{Int64}}:\n [1, 2]\n [3, 4]\n [5, 6]\n [7, 8]\n\n\n\n\n\n","category":"function"},{"location":"data/mlutils/#MLUtils.zeros_like","page":"Batching Data – MLUtils.jl","title":"MLUtils.zeros_like","text":"zeros_like(x, [element_type=eltype(x)], [dims=size(x)])\n\nCreate an array with the given element type and size, based upon the given source array x. All elements of the new array will be set to 0. The second and third arguments are both optional, defaulting to the given array's eltype and size. 
The dimensions may be specified as an integer or as a tuple argument.\n\nSee also ones_like and fill_like.\n\nExamples\n\njulia> x = rand(Float32, 2)\n2-element Vector{Float32}:\n 0.4005432\n 0.36934233\n\njulia> zeros_like(x, (3, 3))\n3×3 Matrix{Float32}:\n 0.0 0.0 0.0\n 0.0 0.0 0.0\n 0.0 0.0 0.0\n\njulia> using CUDA\n\njulia> x = CUDA.rand(2, 2)\n2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 0.0695155 0.667979\n 0.558468 0.59903\n\njulia> zeros_like(x, Float64)\n2×2 CuArray{Float64, 2, CUDA.Mem.DeviceBuffer}:\n 0.0 0.0\n 0.0 0.0\n\n\n\n\n\n","category":"function"},{"location":"models/advanced/#man-advanced","page":"Custom Layers","title":"Defining Customised Layers","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Here we will try and describe usage of some more advanced features that Flux provides to give more control over model building.","category":"page"},{"location":"models/advanced/#Custom-Model-Example","page":"Custom Layers","title":"Custom Model Example","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Here is a basic example of a custom model. It simply adds the input to the result from the neural network.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"struct CustomModel\n chain::Chain\nend\n\nfunction (m::CustomModel)(x)\n # Arbitrary code can go here, but note that everything will be differentiated.\n # Zygote does not allow some operations, like mutating arrays.\n\n return m.chain(x) + x\nend\n\n# Call @functor to allow for training. 
Described below in more detail.\nFlux.@functor CustomModel","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"You can then use the model like:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"chain = Chain(Dense(10, 10))\nmodel = CustomModel(chain)\nmodel(rand(10))","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"For an intro to Flux and automatic differentiation, see this tutorial.","category":"page"},{"location":"models/advanced/#Customising-Parameter-Collection-for-a-Model","page":"Custom Layers","title":"Customising Parameter Collection for a Model","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Taking reference from our example Affine layer from the basics.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"By default all the fields in the Affine type are collected as its parameters, however, in some cases it may be desired to hold other metadata in our \"layers\" that may not be needed for training, and are hence supposed to be ignored while the parameters are collected. 
With Flux, the way to mark some fields of our layer as trainable is through overloading the trainable function:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"julia> Flux.@functor Affine\n\njulia> a = Affine(Float32[1 2; 3 4; 5 6], Float32[7, 8, 9])\nAffine(Float32[1.0 2.0; 3.0 4.0; 5.0 6.0], Float32[7.0, 8.0, 9.0])\n\njulia> Flux.params(a) # default behavior\nParams([Float32[1.0 2.0; 3.0 4.0; 5.0 6.0], Float32[7.0, 8.0, 9.0]])\n\njulia> Flux.trainable(a::Affine) = (; a.W) # returns a NamedTuple using the field's name\n\njulia> Flux.params(a)\nParams([Float32[1.0 2.0; 3.0 4.0; 5.0 6.0]])","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Only the fields returned by trainable will be collected as trainable parameters of the layer when calling Flux.params, and only these fields will be seen by Flux.setup and Flux.update! for training. But all fields will be seen by gpu and similar functions, for example:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"julia> a |> f16\nAffine(Float16[1.0 2.0; 3.0 4.0; 5.0 6.0], Float16[7.0, 8.0, 9.0])","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Note that there is no need to overload trainable to hide fields which do not contain trainable parameters. (For example, activation functions, or Boolean flags.) These are always ignored by params and by training:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"julia> Flux.params(Affine(true, [10, 11, 12.0]))\nParams([])","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"It is also possible to further restrict what fields are seen by writing @functor Affine (W,). However, this is not recommended. 
This requires the struct to have a corresponding constructor that accepts only W as an argument, and the ignored fields will not be seen by functions like gpu (which is usually undesired).","category":"page"},{"location":"models/advanced/#Freezing-Layer-Parameters","page":"Custom Layers","title":"Freezing Layer Parameters","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"When it is desired not to include all the model parameters (e.g. for transfer learning), we can simply avoid passing those layers to our call to params.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"compat: Flux ≤ 0.14\nThe mechanism described here is for Flux's old \"implicit\" training style. When upgrading for Flux 0.15, it should be replaced by freeze! and thaw!.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Consider a simple multi-layer perceptron model where we want to avoid optimising the first two Dense layers. 
We can obtain this using the slicing features Chain provides:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"m = Chain(\n Dense(784 => 64, relu),\n Dense(64 => 32, relu),\n Dense(32 => 10)\n );\n\nps = Flux.params(m[3:end])","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"The Zygote.Params object ps now holds a reference to only the parameters of the layers passed to it.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"During training, the gradients will only be computed for (and applied to) the last Dense layer, therefore only that would have its parameters changed.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Flux.params also takes multiple inputs to make it easy to collect parameters from heterogeneous models with a single call. A simple demonstration would be if we wanted to omit optimising the second Dense layer in the previous example. It would look something like this:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Flux.params(m[1], m[3:end])","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Sometimes, a more fine-tuned control is needed. 
We can freeze a specific parameter of a specific layer which already entered a Params object ps, by simply deleting it from ps:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"ps = Flux.params(m)\ndelete!(ps, m[2].bias) ","category":"page"},{"location":"models/advanced/#Custom-multiple-input-or-output-layer","page":"Custom Layers","title":"Custom multiple input or output layer","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Sometimes a model needs to receive several separate inputs at once or produce several separate outputs at once. In other words, there are multiple paths within this high-level layer, each processing a different input or producing a different output. A simple example of this in machine learning literature is the inception module.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Naively, we could have a struct that stores the weights along each path and implement the joining/splitting in the forward pass function. But that would mean a new struct any time the operations along each path change. Instead, this guide will show you how to construct a high-level layer (like Chain) that is made of multiple sub-layers for each path.","category":"page"},{"location":"models/advanced/#Multiple-inputs:-a-custom-Join-layer","page":"Custom Layers","title":"Multiple inputs: a custom Join layer","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Our custom Join layer will accept multiple inputs at once, pass each input through a separate path, then combine the results together. 
Note that this layer can already be constructed using Parallel, but we will first walk through how to do this manually.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"We start by defining a new struct, Join, that stores the different paths and a combine operation as its fields.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"using Flux\nusing CUDA\n\n# custom join layer\nstruct Join{T, F}\n combine::F\n paths::T\nend\n\n# allow Join(op, m1, m2, ...) as a constructor\nJoin(combine, paths...) = Join(combine, paths)","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Notice that we parameterized the type of the paths field. This is necessary for fast Julia code; in general, T might be a Tuple or Vector, but we don't need to pay attention to what it specifically is. The same goes for the combine field.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"The next step is to use Functors.@functor to make our struct behave like a Flux layer. This is important so that calling params on a Join returns the underlying weight arrays on each path.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Flux.@functor Join","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Finally, we define the forward pass. For Join, this means applying each path in paths to each input array, then using combine to merge the results.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"(m::Join)(xs::Tuple) = m.combine(map((f, x) -> f(x), m.paths, xs)...)\n(m::Join)(xs...) = m(xs)","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Lastly, we can test our new layer. 
Thanks to the proper abstractions in Julia, our layer works on GPU arrays out of the box!","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"model = Chain(\n Join(vcat,\n Chain(Dense(1 => 5, relu), Dense(5 => 1)), # branch 1\n Dense(1 => 2), # branch 2\n Dense(1 => 1) # branch 3\n ),\n Dense(4 => 1)\n ) |> gpu\n\nxs = map(gpu, (rand(1), rand(1), rand(1)))\n\nmodel(xs)\n# returns a single float vector with one value","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"note: Note\nThis Join layer is available from the Fluxperimental.jl package.","category":"page"},{"location":"models/advanced/#Using-Parallel","page":"Custom Layers","title":"Using Parallel","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Flux already provides Parallel that can offer the same functionality. In this case, Join is going to just be syntactic sugar for Parallel.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Join(combine, paths) = Parallel(combine, paths)\nJoin(combine, paths...) 
= Join(combine, paths)\n\n# use vararg/tuple version of Parallel forward pass\nmodel = Chain(\n Join(vcat,\n Chain(Dense(1 => 5, relu), Dense(5 => 1)),\n Dense(1 => 2),\n Dense(1 => 1)\n ),\n Dense(4 => 1)\n ) |> gpu\n\nxs = map(gpu, (rand(1), rand(1), rand(1)))\n\nmodel(xs)\n# returns a single float vector with one value","category":"page"},{"location":"models/advanced/#Multiple-outputs:-a-custom-Split-layer","page":"Custom Layers","title":"Multiple outputs: a custom Split layer","text":"","category":"section"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Our custom Split layer will accept a single input, then pass the input through a separate path to produce multiple outputs.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"We start by following the same steps as the Join layer: define a struct, use Functors.@functor, and define the forward pass.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"using Flux\nusing CUDA\n\n# custom split layer\nstruct Split{T}\n paths::T\nend\n\nSplit(paths...) 
= Split(paths)\n\nFlux.@functor Split\n\n(m::Split)(x::AbstractArray) = map(f -> f(x), m.paths)","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"Now we can test to see that our Split does indeed produce multiple outputs.","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"model = Chain(\n Dense(10 => 5),\n Split(Dense(5 => 1, tanh), Dense(5 => 3, tanh), Dense(5 => 2))\n ) |> gpu\n\nmodel(gpu(rand(10)))\n# returns a tuple with three float vectors","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"A custom loss function for the multiple outputs may look like this:","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"using Statistics\n\n# assuming model returns the output of a Split\n# x is a single input\n# ys is a tuple of outputs\nfunction loss(x, ys, model)\n # rms over all the mse\n ŷs = model(x)\n return sqrt(mean(Flux.mse(y, ŷ) for (y, ŷ) in zip(ys, ŷs)))\nend","category":"page"},{"location":"models/advanced/","page":"Custom Layers","title":"Custom Layers","text":"note: Note\nThis Split layer is available from the Fluxperimental.jl package.","category":"page"},{"location":"ecosystem/#The-Julia-Ecosystem-around-Flux","page":"Ecosystem","title":"The Julia Ecosystem around Flux","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"One of the main strengths of Julia lies in an ecosystem of packages globally providing a rich and consistent user experience.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"This is a non-exhaustive list of Julia packages, nicely complementing Flux in typical machine learning and deep learning workflows. To add your project please send a PR. 
See also academic work citing Flux or citing Zygote.","category":"page"},{"location":"ecosystem/#Flux-models","page":"Ecosystem","title":"Flux models","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Flux's model-zoo contains examples from many domains.","category":"page"},{"location":"ecosystem/#Computer-vision","page":"Ecosystem","title":"Computer vision","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"ObjectDetector.jl provides ready-to-go image detection via YOLO.\nMetalhead.jl includes many state-of-the-art computer vision models which can easily be used for transfer learning.\nUNet.jl is a generic UNet implementation.","category":"page"},{"location":"ecosystem/#Natural-language-processing","page":"Ecosystem","title":"Natural language processing","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Transformers.jl provides components for Transformer models for NLP, as well as providing several trained models out of the box.\nTextAnalysis.jl provides several NLP algorithms that use Flux models under the hood.","category":"page"},{"location":"ecosystem/#Reinforcement-learning","page":"Ecosystem","title":"Reinforcement learning","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"AlphaZero.jl provides a generic, simple and fast implementation of Deepmind's AlphaZero algorithm.\nReinforcementLearning.jl offers a collection of tools for doing reinforcement learning research in Julia.","category":"page"},{"location":"ecosystem/#Graph-learning","page":"Ecosystem","title":"Graph learning","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"GraphNeuralNetworks.jl is a fresh, performant and flexible graph neural network library based on Flux.jl.\nGeometricFlux.jl is the first graph neural network library for 
Julia. \nNeuralOperators.jl enables training infinite-dimensional PDEs by learning a continuous function instead of using the finite element method.\nSeaPearl.jl is a Constraint Programming solver that uses Reinforcement Learning based on graphs as input.","category":"page"},{"location":"ecosystem/#Time-series","page":"Ecosystem","title":"Time series","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"FluxArchitectures.jl is a collection of advanced network architectures for time series forecasting.","category":"page"},{"location":"ecosystem/#Robust-networks","page":"Ecosystem","title":"Robust networks","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"RobustNeuralNetworks.jl includes classes of neural networks that are constructed to naturally satisfy robustness constraints.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"","category":"page"},{"location":"ecosystem/#Tools-closely-associated-with-Flux","page":"Ecosystem","title":"Tools closely associated with Flux","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Utility tools you're unlikely to have met if you never used Flux!","category":"page"},{"location":"ecosystem/#High-level-training-flows","page":"Ecosystem","title":"High-level training flows","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"FastAI.jl is a Julia port of Python's fast.ai library.\nFluxTraining.jl is a package for using and writing powerful, extensible training loops for deep learning models. It supports callbacks for many common use cases like hyperparameter scheduling, metrics tracking and logging, checkpointing, early stopping, and more. 
It powers training in FastAI.jl.","category":"page"},{"location":"ecosystem/#Datasets","page":"Ecosystem","title":"Datasets","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Commonly used machine learning datasets are provided by the following packages in the Julia ecosystem:","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"MLDatasets.jl focuses on downloading, unpacking, and accessing benchmark datasets.\nGraphMLDatasets.jl: a library for machine learning datasets on graphs.","category":"page"},{"location":"ecosystem/#Plumbing","page":"Ecosystem","title":"Plumbing","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Tools to put data into the right order for creating a model.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Augmentor.jl is a real-time image augmentation library for increasing the number of training images.\nDataAugmentation.jl aims to make it easy to build stochastic, label-preserving augmentation pipelines for vision use cases involving images, keypoints and segmentation masks.\nMLUtils.jl (replaces MLDataUtils.jl and MLLabelUtils.jl) is a library for processing Machine Learning datasets.","category":"page"},{"location":"ecosystem/#Parameters","page":"Ecosystem","title":"Parameters","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"ParameterSchedulers.jl provides standard scheduling policies for machine learning.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"","category":"page"},{"location":"ecosystem/#Differentiable-programming","page":"Ecosystem","title":"Differentiable programming","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Packages based on differentiable programming but not necessarily related to 
Machine Learning. ","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"The SciML ecosystem uses Flux and Zygote to mix neural nets with differential equations, to get the best of black box and mechanistic modelling.\nDiffEqFlux.jl provides tools for creating Neural Differential Equations.\nFlux3D.jl shows off machine learning on 3D data.\nRayTracer.jl combines ML with computer vision via a differentiable renderer.\nDuckietown.jl Differentiable Duckietown simulator.\nThe Yao.jl project uses Flux and Zygote for Quantum Differentiable Programming.\nAtomicGraphNets.jl enables learning graph based models on atomic systems used in chemistry.\nDiffImages.jl differentiable computer vision modeling in Julia with the Images.jl ecosystem.","category":"page"},{"location":"ecosystem/#Probabilistic-programming","page":"Ecosystem","title":"Probabilistic programming","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Turing.jl extends Flux's differentiable programming capabilities to probabilistic programming.\nOmega.jl is a research project aimed at causal, higher-order probabilistic programming.\nStheno.jl provides flexible Gaussian processes.","category":"page"},{"location":"ecosystem/#Statistics","page":"Ecosystem","title":"Statistics","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"OnlineStats.jl provides single-pass algorithms for statistics.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"","category":"page"},{"location":"ecosystem/#Useful-miscellaneous-packages","page":"Ecosystem","title":"Useful miscellaneous packages","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Some useful and random packages!","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"AdversarialPrediction.jl provides a way 
to easily optimise generic performance metrics in supervised learning settings using the Adversarial Prediction framework.\nMill.jl helps to prototype flexible multi-instance learning models.\nMLMetrics.jl is a utility for scoring models in data science and machine learning.\nTorch.jl exposes torch in Julia.\nValueHistories.jl is a utility for efficient tracking of optimization histories, training curves or other information of arbitrary types and at arbitrarily spaced sampling times.\nInvertibleNetworks.jl Building blocks for invertible neural networks in the Julia programming language.\nProgressMeter.jl progress meters for long-running computations.\nTensorBoardLogger.jl easy peasy logging to TensorBoard in Julia.\nArgParse.jl is a package for parsing command-line arguments to Julia programs.\nParameters.jl types with default field values, keyword constructors and (un-)pack macros.\nBSON.jl is a package for working with the Binary JSON serialisation format.\nDataFrames.jl in-memory tabular data in Julia.\nDrWatson.jl is scientific project assistant software.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"This tight integration among Julia packages is shown in some of the examples in the model-zoo repository.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"","category":"page"},{"location":"ecosystem/#Alternatives-to-Flux","page":"Ecosystem","title":"Alternatives to Flux","text":"","category":"section"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"Julia has several other libraries for making neural networks. ","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"SimpleChains.jl is focused on making small, simple, CPU-based neural networks fast. Uses LoopVectorization.jl. 
(Was FastChain in DiffEqFlux.jl) \nKnet.jl is a neural network library built around AutoGrad.jl.\nLux.jl (earlier ExplicitFluxLayers.jl) shares much of the design, use-case, and NNlib.jl / Optimisers.jl back-end of Flux. But instead of encapsulating all parameters within the model structure, it separates this into 3 components: a model, a tree of parameters, and a tree of model states.","category":"page"},{"location":"ecosystem/","page":"Ecosystem","title":"Ecosystem","text":"compat: Explicit or explicit?\nFlux's training docs talk about changes from Zygote's implicit to explicit gradients, dictionary-like to tree-like structures. (See also Zygote's description of these.) Lux also uses Zygote, but uses the word \"explicit\" to mean something unrelated, namely storing the tree of parameters (and of state) separately from the model.","category":"page"},{"location":"models/functors/#Recursive-transformations-from-Functors.jl","page":"Nested Structures – Functors.jl","title":"Recursive transformations from Functors.jl","text":"","category":"section"},{"location":"models/functors/","page":"Nested Structures – Functors.jl","title":"Nested Structures – Functors.jl","text":"Flux models are deeply nested structures, and Functors.jl provides tools needed to explore such objects, apply functions to the parameters they contain, and re-build them.","category":"page"},{"location":"models/functors/","page":"Nested Structures – Functors.jl","title":"Nested Structures – Functors.jl","text":"New layers should be annotated using the Functors.@functor macro. This will enable params to see the parameters inside, and gpu to move them to the GPU.","category":"page"},{"location":"models/functors/","page":"Nested Structures – Functors.jl","title":"Nested Structures – Functors.jl","text":"Functors.jl has its own notes on basic usage for more details. 
Additionally, the Advanced Model Building and Customisation page covers the use cases of Functors in greater detail.","category":"page"},{"location":"models/functors/","page":"Nested Structures – Functors.jl","title":"Nested Structures – Functors.jl","text":"Functors.@functor\nFunctors.fmap\nFunctors.isleaf\nFunctors.children\nFunctors.fcollect\nFunctors.functor\nFunctors.fmapstructure","category":"page"},{"location":"models/functors/#Functors.@functor","page":"Nested Structures – Functors.jl","title":"Functors.@functor","text":"@functor T\n@functor T (x,)\n\nAdds methods to functor allowing recursion into objects of type T, and reconstruction. Assumes that T has a constructor accepting all of its fields, which is true unless you have provided an inner constructor which does not.\n\nBy default all fields of T are considered children; this can be restricted by providing a tuple of field names.\n\nExamples\n\njulia> struct Foo; x; y; end\n\njulia> @functor Foo\n\njulia> Functors.children(Foo(1,2))\n(x = 1, y = 2)\n\njulia> _, re = Functors.functor(Foo(1,2));\n\njulia> re((10, 20))\nFoo(10, 20)\n\njulia> struct TwoThirds a; b; c; end\n\njulia> @functor TwoThirds (a, c)\n\njulia> ch2, re3 = Functors.functor(TwoThirds(10,20,30));\n\njulia> ch2\n(a = 10, c = 30)\n\njulia> re3((\"ten\", \"thirty\"))\nTwoThirds(\"ten\", 20, \"thirty\")\n\njulia> fmap(x -> 10x, TwoThirds(Foo(1,2), Foo(3,4), 56))\nTwoThirds(Foo(10, 20), Foo(3, 4), 560)\n\n\n\n\n\n","category":"macro"},{"location":"models/functors/#Functors.fmap","page":"Nested Structures – Functors.jl","title":"Functors.fmap","text":"fmap(f, x, ys...; exclude = Functors.isleaf, walk = Functors.DefaultWalk()[, prune])\n\nA structure and type preserving map.\n\nBy default it transforms every leaf node (identified by exclude, default isleaf) by applying f, and otherwise traverses x recursively using functor. Optionally, it may also be associated with objects ys with the same tree structure. 
In that case, f is applied to the corresponding leaf nodes in x and ys.\n\nExamples\n\njulia> fmap(string, (x=1, y=(2, 3)))\n(x = \"1\", y = (\"2\", \"3\"))\n\njulia> nt = (a = [1,2], b = [23, (45,), (x=6//7, y=())], c = [8,9]);\n\njulia> fmap(println, nt)\n[1, 2]\n23\n45\n6//7\n()\n[8, 9]\n(a = nothing, b = Any[nothing, (nothing,), (x = nothing, y = nothing)], c = nothing)\n\njulia> fmap(println, nt; exclude = x -> x isa Array)\n[1, 2]\nAny[23, (45,), (x = 6//7, y = ())]\n[8, 9]\n(a = nothing, b = nothing, c = nothing)\n\njulia> twice = [1, 2]; # println only acts once on this\n\njulia> fmap(println, (i = twice, ii = 34, iii = [5, 6], iv = (twice, 34), v = 34.0))\n[1, 2]\n34\n[5, 6]\n34\n34.0\n(i = nothing, ii = nothing, iii = nothing, iv = (nothing, nothing), v = nothing)\n\njulia> d1 = Dict(\"x\" => [1,2], \"y\" => 3);\n\njulia> d2 = Dict(\"x\" => [4,5], \"y\" => 6, \"z\" => \"an_extra_value\");\n\njulia> fmap(+, d1, d2) == Dict(\"x\" => [5, 7], \"y\" => 9) # Note that \"z\" is ignored\ntrue\n\nMutable objects which appear more than once are only handled once (by caching f(x) in an IdDict). Thus the relationship x.i === x.iv[1] will be preserved. An immutable object which appears twice is not stored in the cache, thus f(34) will be called twice, and the results will agree only if f is pure.\n\nBy default, Tuples, NamedTuples, and some other container-like types in Base have children to recurse into. Arrays of numbers do not. 
To enable recursion into new types, you must provide a method of functor, which can be done using the macro @functor:\n\njulia> struct Foo; x; y; end\n\njulia> @functor Foo\n\njulia> struct Bar; x; end\n\njulia> @functor Bar\n\njulia> m = Foo(Bar([1,2,3]), (4, 5, Bar(Foo(6, 7))));\n\njulia> fmap(x -> 10x, m)\nFoo(Bar([10, 20, 30]), (40, 50, Bar(Foo(60, 70))))\n\njulia> fmap(string, m)\nFoo(Bar(\"[1, 2, 3]\"), (\"4\", \"5\", Bar(Foo(\"6\", \"7\"))))\n\njulia> fmap(string, m, exclude = v -> v isa Bar)\nFoo(\"Bar([1, 2, 3])\", (4, 5, \"Bar(Foo(6, 7))\"))\n\nTo recurse into custom types without reconstructing them afterwards, use fmapstructure.\n\nFor advanced customization of the traversal behaviour, pass a custom walk function that subtypes Functors.AbstractWalk. The call fmap(f, x, ys...; walk = mywalk) will wrap mywalk in ExcludeWalk then CachedWalk. Here, ExcludeWalk is responsible for applying f at excluded nodes. For a low-level interface for executing a user-constructed walk, see execute.\n\njulia> struct MyWalk <: Functors.AbstractWalk end\n\njulia> (::MyWalk)(recurse, x) = x isa Bar ? 
\"hello\" :\n Functors.DefaultWalk()(recurse, x)\n\njulia> fmap(x -> 10x, m; walk = MyWalk())\nFoo(\"hello\", (40, 50, \"hello\"))\n\nThe behaviour when the same node appears twice can be altered by giving a value to the prune keyword, which is then used in place of all but the first:\n\njulia> twice = [1, 2];\n\njulia> fmap(float, (x = twice, y = [1,2], z = twice); prune = missing)\n(x = [1.0, 2.0], y = [1.0, 2.0], z = missing)\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Functors.isleaf","page":"Nested Structures – Functors.jl","title":"Functors.isleaf","text":"Functors.isleaf(x)\n\nReturn true if x has no children according to functor.\n\nExamples\n\njulia> Functors.isleaf(1)\ntrue\n\njulia> Functors.isleaf([2, 3, 4])\ntrue\n\njulia> Functors.isleaf([\"five\", [6, 7]])\nfalse\n\njulia> Functors.isleaf([])\nfalse\n\njulia> Functors.isleaf((8, 9))\nfalse\n\njulia> Functors.isleaf(())\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Functors.children","page":"Nested Structures – Functors.jl","title":"Functors.children","text":"Functors.children(x)\n\nReturn the children of x as defined by functor. Equivalent to functor(x)[1].\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Functors.fcollect","page":"Nested Structures – Functors.jl","title":"Functors.fcollect","text":"fcollect(x; exclude = v -> false)\n\nTraverse x by recursing each child of x as defined by functor and collecting the results into a flat array, ordered by a breadth-first traversal of x, respecting the iteration order of children calls.\n\nDoesn't recurse inside branches rooted at nodes v for which exclude(v) == true. In such cases, the root v is also excluded from the result. 
By default, exclude always yields false.\n\nSee also children.\n\nExamples\n\njulia> struct Foo; x; y; end\n\njulia> @functor Foo\n\njulia> struct Bar; x; end\n\njulia> @functor Bar\n\njulia> struct TypeWithNoChildren; x; y; end\n\njulia> m = Foo(Bar([1,2,3]), TypeWithNoChildren(:a, :b))\nFoo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))\n\njulia> fcollect(m)\n4-element Vector{Any}:\n Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))\n Bar([1, 2, 3])\n [1, 2, 3]\n TypeWithNoChildren(:a, :b)\n\njulia> fcollect(m, exclude = v -> v isa Bar)\n2-element Vector{Any}:\n Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))\n TypeWithNoChildren(:a, :b)\n\njulia> fcollect(m, exclude = v -> Functors.isleaf(v))\n2-element Vector{Any}:\n Foo(Bar([1, 2, 3]), TypeWithNoChildren(:a, :b))\n Bar([1, 2, 3])\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Functors.functor","page":"Nested Structures – Functors.jl","title":"Functors.functor","text":"Functors.functor(x) = functor(typeof(x), x)\n\nReturns a tuple containing, first, a NamedTuple of the children of x (typically its fields), and second, a reconstruction function. This controls the behaviour of fmap.\n\nMethods should be added to functor(::Type{T}, x) for custom types, usually using the macro @functor.\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Functors.fmapstructure","page":"Nested Structures – Functors.jl","title":"Functors.fmapstructure","text":"fmapstructure(f, x; exclude = isleaf)\n\nLike fmap, but doesn't preserve the type of custom structs. 
Instead, it returns a NamedTuple (or a Tuple, or an array), or a nested set of these.\n\nUseful when the output must not contain custom structs.\n\nExamples\n\njulia> struct Foo; x; y; end\n\njulia> @functor Foo\n\njulia> m = Foo([1,2,3], [4, (5, 6), Foo(7, 8)]);\n\njulia> fmapstructure(x -> 2x, m)\n(x = [2, 4, 6], y = Any[8, (10, 12), (x = 14, y = 16)])\n\njulia> fmapstructure(println, m)\n[1, 2, 3]\n4\n5\n6\n7\n8\n(x = nothing, y = Any[nothing, (nothing, nothing), (x = nothing, y = nothing)])\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Moving-models,-or-data,-to-the-GPU","page":"Nested Structures – Functors.jl","title":"Moving models, or data, to the GPU","text":"","category":"section"},{"location":"models/functors/","page":"Nested Structures – Functors.jl","title":"Nested Structures – Functors.jl","text":"Flux provides some convenience functions based on fmap. Some (f16, f32, f64) change the precision of all arrays in a model. Others are used for moving a model to or from GPU memory:","category":"page"},{"location":"models/functors/","page":"Nested Structures – Functors.jl","title":"Nested Structures – Functors.jl","text":"cpu\ngpu(::Any)\ngpu(::Flux.DataLoader)","category":"page"},{"location":"models/functors/#Flux.cpu","page":"Nested Structures – Functors.jl","title":"Flux.cpu","text":"cpu(m)\n\nCopies m onto the CPU, the opposite of gpu. 
Recurses into structs marked @functor.\n\nExample\n\njulia> m_gpu = Dense(CUDA.randn(2, 5))\nDense(5 => 2) # 12 parameters\n\njulia> m_gpu.bias # matches the given weight matrix\n2-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:\n 0.0\n 0.0\n\njulia> m = m_gpu |> cpu\nDense(5 => 2) # 12 parameters\n\njulia> m.bias\n2-element Vector{Float32}:\n 0.0\n 0.0\n\n\n\n\n\n","category":"function"},{"location":"models/functors/#Flux.gpu-Tuple{Any}","page":"Nested Structures – Functors.jl","title":"Flux.gpu","text":"gpu(m)\n\nCopies m to the current GPU device (using current GPU backend), if one is available. If no GPU is available, it does nothing (but prints a warning the first time).\n\nOn arrays, this calls CUDA's cu, which also changes arrays with Float64 elements to Float32 while copying them to the device (same for AMDGPU). To act on arrays within a struct, the struct type must be marked with @functor.\n\nUse cpu to copy back to ordinary Arrays. See also f32 and f16 to change element type only.\n\nSee the CUDA.jl docs to help identify the current device.\n\nExample\n\njulia> m = Dense(rand(2, 3)) # constructed with Float64 weight matrix\nDense(3 => 2) # 8 parameters\n\njulia> typeof(m.weight)\nMatrix{Float64} (alias for Array{Float64, 2})\n\njulia> m_gpu = gpu(m) # can equivalently be written m_gpu = m |> gpu\nDense(3 => 2) # 8 parameters\n\njulia> typeof(m_gpu.weight)\nCUDA.CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}\n\n\n\n\n\n","category":"method"},{"location":"models/functors/#Flux.gpu-Tuple{DataLoader}","page":"Nested Structures – Functors.jl","title":"Flux.gpu","text":"gpu(data::DataLoader)\n\nTransforms a given DataLoader to apply gpu to each batch of data, when iterated over. 
(If no GPU is available, this does nothing.)\n\nExample\n\njulia> dl = Flux.DataLoader((x = ones(2,10), y='a':'j'), batchsize=3)\n4-element DataLoader(::NamedTuple{(:x, :y), Tuple{Matrix{Float64}, StepRange{Char, Int64}}}, batchsize=3)\n with first element:\n (; x = 2×3 Matrix{Float64}, y = 3-element StepRange{Char, Int64})\n\njulia> first(dl)\n(x = [1.0 1.0 1.0; 1.0 1.0 1.0], y = 'a':1:'c')\n\njulia> c_dl = gpu(dl)\n4-element DataLoader(::MLUtils.MappedData{:auto, typeof(gpu), NamedTuple{(:x, :y), Tuple{Matrix{Float64}, StepRange{Char, Int64}}}}, batchsize=3)\n with first element:\n (; x = 2×3 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, y = 3-element StepRange{Char, Int64})\n\njulia> first(c_dl).x\n2×3 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 1.0 1.0 1.0\n 1.0 1.0 1.0\n\nFor large datasets, this is preferred over moving all the data to the GPU before creating the DataLoader, like this:\n\njulia> Flux.DataLoader((x = ones(2,10), y=2:11) |> gpu, batchsize=3)\n4-element DataLoader(::NamedTuple{(:x, :y), Tuple{CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, UnitRange{Int64}}}, batchsize=3)\n with first element:\n (; x = 2×3 CUDA.CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, y = 3-element UnitRange{Int64})\n\nwarning: Warning\nThis only works if gpu is applied directly to the DataLoader. While gpu acts recursively on Flux models and many basic Julia structs, it will not work on (say) a tuple of DataLoaders.\n\n\n\n\n\n","category":"method"},{"location":"models/overview/#man-overview","page":"Fitting a Line","title":"Flux Overview: Fitting a Straight Line","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Flux is a pure Julia ML stack that allows you to build predictive models. 
Here are the steps for a typical Flux program:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Provide training and test data\nBuild a model with configurable parameters to make predictions\nIteratively train the model by tweaking the parameters to improve predictions\nVerify your model","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Under the hood, Flux uses a technique called automatic differentiation to take gradients that help improve predictions. Flux is also fully written in Julia so you can easily replace any layer of Flux with your own code to improve your understanding or satisfy special requirements.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Here's how you'd use Flux to build and train the most basic of models, step by step.","category":"page"},{"location":"models/overview/#A-Trivial-Prediction","page":"Fitting a Line","title":"A Trivial Prediction","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"This example will predict the output of the function 4x + 2. Making such predictions is called \"linear regression\", and is really too simple to need a neural network. 
But it's a nice toy example.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"First, import Flux and define the function we want to simulate:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> using Flux\n\njulia> actual(x) = 4x + 2\nactual (generic function with 1 method)","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"This example will build a model to approximate the actual function.","category":"page"},{"location":"models/overview/#.-Provide-Training-and-Test-Data","page":"Fitting a Line","title":"1. Provide Training and Test Data","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Use the actual function to build sets of data for training and verification:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> x_train, x_test = hcat(0:5...), hcat(6:10...)\n([0 1 … 4 5], [6 7 … 9 10])\n\njulia> y_train, y_test = actual.(x_train), actual.(x_test)\n([2 6 … 18 22], [26 30 … 38 42])","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Normally, your training and test data come from real world observations, but here we simulate them.","category":"page"},{"location":"models/overview/#.-Build-a-Model-to-Make-Predictions","page":"Fitting a Line","title":"2. 
Build a Model to Make Predictions","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Now, build a model to make predictions with 1 input and 1 output:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> model = Dense(1 => 1)\nDense(1 => 1) # 2 parameters\n\njulia> model.weight\n1×1 Matrix{Float32}:\n 0.95041317\n\njulia> model.bias\n1-element Vector{Float32}:\n 0.0","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Under the hood, a dense layer is a struct with fields weight and bias. weight represents a weight matrix and bias represents a bias vector. There's another way to think about a model. In Flux, models are conceptually predictive functions: ","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> predict = Dense(1 => 1)\nDense(1 => 1) # 2 parameters","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Dense(1 => 1) also implements the function σ(Wx+b) where W and b are the weights and biases. σ is an activation function (more on activations later). Our model has one weight and one bias, but typical models will have many more. Think of weights and biases as knobs and levers Flux can use to tune predictions. Activation functions are transformations that tailor models to your needs. 
","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"This model will already make predictions, though not accurate ones yet:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> predict(x_train)\n1×6 Matrix{Float32}:\n 0.0 0.906654 1.81331 2.71996 3.62662 4.53327","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"In order to make better predictions, you'll need to provide a loss function to tell Flux how to objectively evaluate the quality of a prediction. Loss functions compute the cumulative distance between actual values and predictions. ","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> using Statistics\n\njulia> loss(model, x, y) = mean(abs2.(model(x) .- y));\n\njulia> loss(predict, x_train, y_train)\n122.64734f0","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"More accurate predictions will yield a lower loss. You can write your own loss functions or rely on those already provided by Flux. This loss function is called mean squared error (and built-in as mse). Flux works by iteratively reducing the loss through training.","category":"page"},{"location":"models/overview/#.-Improve-the-Prediction","page":"Fitting a Line","title":"3. Improve the Prediction","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Under the hood, the Flux.train! 
function uses a loss function and training data to improve the parameters of your model based on a pluggable optimiser:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> using Flux: train!\n\njulia> opt = Descent()\nDescent(0.1)\n\njulia> data = [(x_train, y_train)]\n1-element Vector{Tuple{Matrix{Int64}, Matrix{Int64}}}:\n ([0 1 … 4 5], [2 6 … 18 22])","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Now, we have the optimiser and data we'll pass to train!. All that remains are the parameters of the model. Remember, each model is a Julia struct with a function and configurable parameters. In particular, the dense layer has weights and biases that depend on the dimensions of the inputs and outputs: ","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> predict.weight\n1×1 Matrix{Float32}:\n 0.9066542\n\njulia> predict.bias\n1-element Vector{Float32}:\n 0.0","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"The dimensions of these model parameters depend on the number of inputs and outputs.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Flux will adjust predictions by iteratively changing these parameters according to the optimiser.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"This optimiser implements the classic gradient descent strategy. Now improve the parameters of the model with a call to Flux.train! 
like this:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> train!(loss, predict, data, opt)","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"And check the loss:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> loss(predict, x_train, y_train)\n116.38745f0","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"It went down. Why? ","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> predict.weight, predict.bias\n(Float32[7.246838;;], Float32[1.748103])","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"The parameters have changed. This single step is the essence of machine learning.","category":"page"},{"location":"models/overview/#.-Iteratively-Train-the-Model","page":"Fitting a Line","title":"3+. Iteratively Train the Model","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"In the previous section, we made a single call to train! which iterates over the data we passed in just once. An epoch refers to one pass over the dataset. Typically, we will run the training for multiple epochs to drive the loss down even further. 
Let's run it a few more times:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> for epoch in 1:200\n train!(loss, predict, data, opt)\n end\n\njulia> loss(predict, x_train, y_train)\n0.00339581f0\n\njulia> predict.weight, predict.bias\n(Float32[4.0159144;;], Float32[2.004479])","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"After 200 training steps, the loss went down, and the parameters are getting close to those in the function the model is built to predict.","category":"page"},{"location":"models/overview/#.-Verify-the-Results","page":"Fitting a Line","title":"4. Verify the Results","text":"","category":"section"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Now, let's verify the predictions:","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"julia> predict(x_test)\n1×5 Matrix{Float32}:\n 26.1121 30.13 34.1479 38.1657 42.1836\n\njulia> y_test\n1×5 Matrix{Int64}:\n 26 30 34 38 42","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"The predictions are good. Here's how we got there. ","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"First, we gathered real-world data into the variables x_train, y_train, x_test, and y_test. The x_* data defines inputs, and the y_* data defines outputs. The *_train data is for training the model, and the *_test data is for verifying the model. Our data was based on the function 4x + 2.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"Then, we built a single input, single output predictive model, predict = Dense(1 => 1). 
The initial predictions weren't accurate, because we had not trained the model yet.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"After building the model, we trained it with train!(loss, predict, data, opt). The loss function is first, followed by the model itself, the training data, and the Descent optimiser provided by Flux. We ran the training step once, and observed that the parameters changed and the loss went down. Then, we ran train! many times to finish the training process.","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"After we trained the model, we verified it with the test data. ","category":"page"},{"location":"models/overview/","page":"Fitting a Line","title":"Fitting a Line","text":"This overall flow represents how Flux works. Let's drill down a bit to understand what's going on inside the individual layers of Flux.","category":"page"},{"location":"models/nnlib/#Neural-Network-primitives-from-NNlib.jl","page":"Low-level Operations – NNlib.jl","title":"Neural Network primitives from NNlib.jl","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux re-exports all of the functions exported by the NNlib package. This includes activation functions, described on their own page. 
Many of the functions on this page exist primarily as the internal implementation of Flux layers, but can also be used independently.","category":"page"},{"location":"models/nnlib/#Attention","page":"Low-level Operations – NNlib.jl","title":"Attention","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Primitives for the MultiHeadAttention layer.","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"NNlib.dot_product_attention\nNNlib.dot_product_attention_scores\nNNlib.make_causal_mask","category":"page"},{"location":"models/nnlib/#NNlib.dot_product_attention","page":"Low-level Operations – NNlib.jl","title":"NNlib.dot_product_attention","text":"dot_product_attention(query, key, value, [bias]; [fdrop, mask, nheads])\n\nMultihead dot product attention used in transformer architectures.\n\nThe input arrays must have the first two dimensions given by the number of features and the sequence length, then an arbitrary number of batch dimensions or none.\n\nReturns the attention output array of size (v_dim, q_len, batch_size...) and the attention scores of size (kv_len, q_len, nheads, batch_size...).\n\nSee also dot_product_attention_scores if you only need the attention scores.\n\nArguments\n\nquery: Query array of size (qk_dim, q_len, batch_size...).\nkey: Key array of size (qk_dim, kv_len, batch_size...).\nvalue: Value array of size (v_dim, kv_len, batch_size...).\nbias: Either nothing or an array broadcastable to size (kv_len, q_len, nheads, batch_size). It will be added to the attention scores before applying the softmax. Default nothing.\nfdrop: A dropout function or layer to be applied on the attention scores right after the softmax. Default identity (no dropout).\nmask: Either nothing or a boolean array broadcastable to size (kv_len, q_len, nheads, batch_size). 
The mask is applied to the attention scores just before the softmax. See make_causal_mask for creating causal masks. Default nothing.\nnheads: Number of heads to split the input arrays into. Default 1.\n\nExamples\n\nq, k, v = rand(10, 20, 2), rand(10, 30, 2), rand(20, 30, 2)\ny, α = dot_product_attention(q, k, v)\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.dot_product_attention_scores","page":"Low-level Operations – NNlib.jl","title":"NNlib.dot_product_attention_scores","text":"dot_product_attention_scores(query, key, [bias]; [fdrop, mask])\n\nReturn the attention scores for the dot_product_attention. Input arrays must have dimensions (num_features ÷ nheads, nheads, sequence_length, batch_size).\n\nSee dot_product_attention for more details.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.make_causal_mask","page":"Low-level Operations – NNlib.jl","title":"NNlib.make_causal_mask","text":"make_causal_mask(x, dims=2)\n\nReturn a boolean square matrix m of the same type as x and of side size(x, dims). Its elements are set such that m[i, j] == i ≤ j.\n\nCan be used to mask the attention scores in dot_product_attention.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Softmax","page":"Low-level Operations – NNlib.jl","title":"Softmax","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux's Flux.logitcrossentropy uses NNlib.logsoftmax internally.","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"softmax\nlogsoftmax","category":"page"},{"location":"models/nnlib/#NNlib.softmax","page":"Low-level Operations – NNlib.jl","title":"NNlib.softmax","text":"softmax(x; dims = 1)\n\nSoftmax turns input array x into probability distributions that sum to 1 along the dimensions specified by dims. 
It is semantically equivalent to the following:\n\nsoftmax(x; dims = 1) = exp.(x) ./ sum(exp.(x), dims = dims)\n\nwith additional manipulations enhancing numerical stability.\n\nFor a matrix input x it will by default (dims = 1) treat it as a batch of vectors, with each column independent. Keyword dims = 2 will instead treat rows independently, and so on.\n\nSee also logsoftmax.\n\nExamples\n\njulia> softmax([1, 2, 3])\n3-element Vector{Float64}:\n 0.09003057317038046\n 0.24472847105479764\n 0.6652409557748218\n\njulia> softmax([1 2 3; 2 2 2]) # dims=1\n2×3 Matrix{Float64}:\n 0.268941 0.5 0.731059\n 0.731059 0.5 0.268941\n\njulia> softmax([1 2 3; 2 2 2]; dims=2)\n2×3 Matrix{Float64}:\n 0.0900306 0.244728 0.665241\n 0.333333 0.333333 0.333333\n\nNote that, when used with Flux.jl, softmax must not be passed to layers like Dense which accept an activation function. The activation is broadcasted over the result, thus applies to individual numbers. But softmax always needs to see the whole column.\n\njulia> using Flux\n\njulia> x = randn(Float32, 4, 4, 3, 13);\n\njulia> model = Chain(Conv((4, 4), 3 => 8, tanh), Flux.flatten, Dense(8 => 7), softmax);\n\njulia> model(x) |> size\n(7, 13)\n\njulia> Dense(4 => 7, softmax)(x)\nERROR: `softmax(x)` called with a number, but it expects an array. \n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.logsoftmax","page":"Low-level Operations – NNlib.jl","title":"NNlib.logsoftmax","text":"logsoftmax(x; dims = 1)\n\nComputes the log of softmax in a more numerically stable way than directly taking log.(softmax(xs)). 
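A quick numerical check of this identity, and of the stability claim, might look as follows (a sketch; assumes the NNlib package is installed):

```julia
# Sketch: logsoftmax agrees with the naive composition on well-scaled
# inputs, but stays finite where the naive form overflows.
using NNlib

x = [1.0, 2.0, 3.0]
logsoftmax(x) ≈ log.(softmax(x))      # true

big = [1000.0, 1000.0]
logsoftmax(big)                        # finite values; log.(softmax(big)) is not
```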
Commonly used in computing cross entropy loss.\n\nIt is semantically equivalent to the following:\n\nlogsoftmax(x; dims = 1) = x .- log.(sum(exp.(x), dims = dims))\n\nSee also softmax.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Pooling","page":"Low-level Operations – NNlib.jl","title":"Pooling","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux's AdaptiveMaxPool, AdaptiveMeanPool, GlobalMaxPool, GlobalMeanPool, MaxPool, and MeanPool use NNlib.PoolDims, NNlib.maxpool, and NNlib.meanpool as their backend.","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"NNlib.PoolDims\nNNlib.lpnormpool\nNNlib.maxpool\nNNlib.meanpool","category":"page"},{"location":"models/nnlib/#NNlib.PoolDims","page":"Low-level Operations – NNlib.jl","title":"NNlib.PoolDims","text":"PoolDims(x_size::NTuple{M}, k::Union{NTuple{L, Int}, Int};\n stride=k, padding=0, dilation=1) where {M, L}\n\nDimensions for a \"pooling\" operation that can have an arbitrary input size, kernel size, stride, dilation, and channel count. Used to dispatch onto efficient implementations at compile-time.\n\n\n\n\n\n","category":"type"},{"location":"models/nnlib/#NNlib.lpnormpool","page":"Low-level Operations – NNlib.jl","title":"NNlib.lpnormpool","text":"lpnormpool(x, p::Real, k::NTuple{N, Integer}; pad=0, stride=k)\n\nPerform Lp pool operation with value of the Lp norm p and window size k on input tensor x, also known as LPPool in PyTorch. This pooling operator comes from the paper Learned-Norm Pooling for Deep Feedforward and Recurrent Neural Networks.\n\nArguments:\n\nx and k: Expects ndim(x) ∈ 3:5, and always length(k) == ndim(x) - 2\np is restricted to 0 < p < Inf.\npad: See pad_zeros for details.\nstride: Either a tuple with the same length as k, or one integer for all directions. 
Default is k.\n\nFor all elements x in a size k window, lpnormpool computes (∑ᵢ xᵢ^p)^(1 / p) as an element of the output.\n\nThus lpnormpool(x, 1, k) ./ prod(k) ≈ meanpool(x, k) and lpnormpool(x, 2, k).^2 ./ prod(k) ≈ meanpool(x.^2, k).\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.maxpool","page":"Low-level Operations – NNlib.jl","title":"NNlib.maxpool","text":"maxpool(x, k::NTuple{N, Integer}; pad=0, stride=k)\n\nPerform max pool operation with window size k on input tensor x.\n\nArguments:\n\nx and k: Expects ndim(x) ∈ 3:5, and always length(k) == ndim(x) - 2\npad: See pad_zeros for details.\nstride: Either a tuple with the same length as k, or one integer for all directions. Default is k.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.meanpool","page":"Low-level Operations – NNlib.jl","title":"NNlib.meanpool","text":"meanpool(x, k::NTuple{N, Integer}; pad=0, stride=k)\n\nPerform mean pool operation with window size k on input tensor x.\n\nArguments:\n\nx and k: Expects ndim(x) ∈ 3:5, and always length(k) == ndim(x) - 2\npad: See pad_zeros for details.\nstride: Either a tuple with the same length as k, or one integer for all directions. Default is k.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Padding","page":"Low-level Operations – NNlib.jl","title":"Padding","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"NNlib.pad_circular\nNNlib.pad_constant\nNNlib.pad_reflect\nNNlib.pad_repeat\nNNlib.pad_symmetric\nNNlib.pad_zeros","category":"page"},{"location":"models/nnlib/#NNlib.pad_circular","page":"Low-level Operations – NNlib.jl","title":"NNlib.pad_circular","text":"pad_circular(x, pad::Tuple; [dims])\npad_circular(x, pad::Int; [dims])\n\nPad the array x \"circularly\" across the border by wrapping around values from the opposite side of x. 
\n\npad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.\n\nIf pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension). \n\nThe pad length on either side in any dimension must not exceed the size of x in that dimension, i.e. pad_circular is not able to create arbitrary sized tilings of x.\n\nSee also pad_repeat, pad_reflect, pad_symmetric, and pad_constant.\n\njulia> r = reshape(1:9, 3, 3)\n3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:\n 1 4 7\n 2 5 8\n 3 6 9\n\njulia> pad_circular(r, (1,2,1,2))\n6×6 Matrix{Int64}:\n 9 3 6 9 3 6\n 7 1 4 7 1 4\n 8 2 5 8 2 5\n 9 3 6 9 3 6\n 7 1 4 7 1 4\n 8 2 5 8 2 5\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.pad_constant","page":"Low-level Operations – NNlib.jl","title":"NNlib.pad_constant","text":"pad_constant(x, pad::Tuple, val = 0; [dims = :])\npad_constant(x, pad::Int, val = 0; [dims = :])\n\nPad the array x with the constant value val.\n\npad can be a tuple of integers. If it is of length 2 * length(dims), it specifies the left and right padding size for each of the dimensions in dims as (l1, r1, ..., ln, rn). If supplied with a tuple of length length(dims) instead, it applies symmetric padding. 
If dims is not given, it defaults to all dimensions.\n\nFor integer pad input, it is applied on both sides on every dimension in dims.\n\nSee also pad_zeros, pad_repeat, pad_reflect, pad_symmetric, and pad_circular.\n\njulia> r = reshape(1:4, 2, 2)\n2×2 reshape(::UnitRange{Int64}, 2, 2) with eltype Int64:\n 1 3\n 2 4\n\njulia> pad_constant(r, (1, 2, 3, 4), 8)\n5×9 Matrix{Int64}:\n 8 8 8 8 8 8 8 8 8\n 8 8 8 1 3 8 8 8 8\n 8 8 8 2 4 8 8 8 8\n 8 8 8 8 8 8 8 8 8\n 8 8 8 8 8 8 8 8 8\n\njulia> pad_constant(r, 1, 8)\n4×4 Matrix{Int64}:\n 8 8 8 8\n 8 1 3 8\n 8 2 4 8\n 8 8 8 8\n\njulia> r = reshape(1:27, 3, 3, 3)\n3×3×3 reshape(::UnitRange{Int64}, 3, 3, 3) with eltype Int64:\n[:, :, 1] =\n 1 4 7\n 2 5 8\n 3 6 9\n\n[:, :, 2] =\n 10 13 16\n 11 14 17\n 12 15 18\n\n[:, :, 3] =\n 19 22 25\n 20 23 26\n 21 24 27\n\njulia> pad_constant(r, (2,1), dims = 1) # asymmetric padding\n6×3×3 Array{Int64, 3}:\n[:, :, 1] =\n 0 0 0\n 0 0 0\n 1 4 7\n 2 5 8\n 3 6 9\n 0 0 0\n\n[:, :, 2] =\n 0 0 0\n 0 0 0\n 10 13 16\n 11 14 17\n 12 15 18\n 0 0 0\n\n[:, :, 3] =\n 0 0 0\n 0 0 0\n 19 22 25\n 20 23 26\n 21 24 27\n 0 0 0\n\njulia> pad_constant(r, (2,1, 3), dims = (1,2)) # padding must always be either the same length as dims, or double it\nERROR: ArgumentError: Could not parse padding (2, 1, 3) and dims (1, 2)\nStacktrace:\n[...]\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.pad_reflect","page":"Low-level Operations – NNlib.jl","title":"NNlib.pad_reflect","text":"pad_reflect(x, pad::Tuple; [dims])\npad_reflect(x, pad::Int; [dims])\n\nPad the array x reflecting its values across the border.\n\npad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.\n\nIf pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. 
excludes the channel and batch dimension). \n\nSee also pad_repeat, pad_symmetric, pad_circular, and pad_constant.\n\njulia> r = reshape(1:9, 3, 3)\n3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:\n 1 4 7\n 2 5 8\n 3 6 9\n\njulia> pad_reflect(r, (1,2,1,2))\n6×6 Matrix{Int64}:\n 5 2 5 8 5 2\n 4 1 4 7 4 1\n 5 2 5 8 5 2\n 6 3 6 9 6 3\n 5 2 5 8 5 2\n 4 1 4 7 4 1\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.pad_repeat","page":"Low-level Operations – NNlib.jl","title":"NNlib.pad_repeat","text":"pad_repeat(x, pad::Tuple; [dims])\npad_repeat(x, pad::Int; [dims])\n\nPad the array x repeating the values on the border.\n\npad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. If dims is not given, it defaults to the first n dimensions.\n\nIf pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension). \n\nSee also pad_reflect, pad_symmetric, pad_circular, and pad_constant.\n\njulia> r = reshape(1:9, 3, 3)\n3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:\n 1 4 7\n 2 5 8\n 3 6 9\n\njulia> pad_repeat(r, (1,2,3,4))\n6×10 Matrix{Int64}:\n 1 1 1 1 4 7 7 7 7 7\n 1 1 1 1 4 7 7 7 7 7\n 2 2 2 2 5 8 8 8 8 8\n 3 3 3 3 6 9 9 9 9 9\n 3 3 3 3 6 9 9 9 9 9\n 3 3 3 3 6 9 9 9 9 9\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.pad_symmetric","page":"Low-level Operations – NNlib.jl","title":"NNlib.pad_symmetric","text":"pad_symmetric(x, pad::Tuple; [dims])\npad_symmetric(x, pad::Int; [dims])\n\nPad the array x reflecting its values symmetrically across the border, i.e. the border values of x are present in the padding values, in contrast to pad_reflect.\n\npad can be a tuple of integers (l1, r1, ..., ln, rn) of some length 2n that specifies the left and right padding size for each of the dimensions in dims. 
If dims is not given, it defaults to the first n dimensions.\n\nIf pad is an integer, it is applied on both sides on every dimension in dims. In this case, dims defaults to the first ndims(x)-2 dimensions (i.e. excludes the channel and batch dimension). \n\nSee also pad_repeat, pad_reflect, pad_circular, and pad_constant.\n\njulia> r = reshape(1:9, 3, 3)\n3×3 reshape(::UnitRange{Int64}, 3, 3) with eltype Int64:\n 1 4 7\n 2 5 8\n 3 6 9\n\njulia> pad_symmetric(r, (1,2,1,2))\n6×6 Matrix{Int64}:\n 1 1 4 7 7 4\n 1 1 4 7 7 4\n 2 2 5 8 8 5\n 3 3 6 9 9 6\n 3 3 6 9 9 6\n 2 2 5 8 8 5\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.pad_zeros","page":"Low-level Operations – NNlib.jl","title":"NNlib.pad_zeros","text":"pad_zeros(x, pad::Tuple; [dims])\npad_zeros(x, pad::Int; [dims])\n\nPad the array x with zeros. Equivalent to pad_constant with the constant equal to 0. \n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Convolution","page":"Low-level Operations – NNlib.jl","title":"Convolution","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux's Conv and CrossCor layers use NNlib.DenseConvDims and NNlib.conv internally. ","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"conv\nConvDims\ndepthwiseconv\nDepthwiseConvDims\nDenseConvDims","category":"page"},{"location":"models/nnlib/#NNlib.conv","page":"Low-level Operations – NNlib.jl","title":"NNlib.conv","text":"conv(x, w; stride = 1, pad = 0, dilation = 1, flipped = false, groups = 1)\n\nApply convolution filter w to input x. x and w are 3d/4d/5d tensors in 1d/2d/3d convolutions respectively. 
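A small size sketch of a 2d call (an illustration only; assumes the NNlib package is installed):

```julia
# Sketch: output spatial size of conv with no padding, then with pad=2
# for a 5×5 kernel ("same"-size output). Assumes NNlib is installed.
using NNlib

x = rand(Float32, 28, 28, 3, 1)   # width, height, in-channels, batch
w = rand(Float32, 5, 5, 3, 7)     # kernel width, kernel height, in-channels, out-channels

size(conv(x, w))                   # (24, 24, 7, 1), since 28 - 5 + 1 = 24
size(conv(x, w; pad = 2))          # (28, 28, 7, 1)
```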
x and w may have real or complex element types.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.ConvDims","page":"Low-level Operations – NNlib.jl","title":"NNlib.ConvDims","text":"ConvDims\n\nType system-level information about convolution dimensions. Critical for things like im2col!() to generate efficient code, and helpful to reduce the number of kwargs getting passed around.\n\n\n\n\n\n","category":"type"},{"location":"models/nnlib/#NNlib.depthwiseconv","page":"Low-level Operations – NNlib.jl","title":"NNlib.depthwiseconv","text":"depthwiseconv(x, w; stride=1, pad=0, dilation=1, flipped=false)\n\nDepthwise convolution operation with filter w on input x. x and w are 3d/4d/5d tensors in 1d/2d/3d convolutions respectively.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.DepthwiseConvDims","page":"Low-level Operations – NNlib.jl","title":"NNlib.DepthwiseConvDims","text":"DepthwiseConvDims\n\nConcrete subclass of ConvDims for a depthwise convolution. Differs primarily due to characterization by Cin, Cmult, rather than Cin, Cout. 
Useful to be separate from DenseConvDims primarily for channel calculation differences.\n\n\n\n\n\n","category":"type"},{"location":"models/nnlib/#NNlib.DenseConvDims","page":"Low-level Operations – NNlib.jl","title":"NNlib.DenseConvDims","text":"DenseConvDims\n\nConcrete subclass of ConvDims for a normal, dense, conv2d/conv3d.\n\n\n\n\n\n","category":"type"},{"location":"models/nnlib/#Dropout","page":"Low-level Operations – NNlib.jl","title":"Dropout","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"NNlib.dropout\nNNlib.dropout!","category":"page"},{"location":"models/nnlib/#NNlib.dropout","page":"Low-level Operations – NNlib.jl","title":"NNlib.dropout","text":"dropout([rng], A, p; [dims])\n\nReturns an array in which each element of A is either replaced with zero, with probability p, or else multiplied by 1/(1-p).\n\nBy default every element is treated independently. With keyword dims=1, a choice is made for every value of the 1st index i.e. 
each row of a matrix is either zero or not.\n\nOptional first argument is the random number generator used.\n\nExamples\n\njulia> dropout(ones(2, 10), 0.2)\n2×10 Matrix{Float64}:\n 1.25 1.25 0.0 1.25 1.25 1.25 1.25 1.25 1.25 1.25\n 1.25 1.25 1.25 0.0 1.25 1.25 0.0 1.25 1.25 1.25\n\njulia> mean(dropout(ones(10^4, 5), 0.2), dims=1)\n1×5 Matrix{Float64}:\n 0.998 1.00075 0.99125 0.99575 1.00075\n\njulia> dropout(ones(5, 5), 0.7, dims=1) # whole row the same\n5×5 Matrix{Float64}:\n 3.33333 3.33333 3.33333 3.33333 3.33333\n 0.0 0.0 0.0 0.0 0.0\n 0.0 0.0 0.0 0.0 0.0\n 3.33333 3.33333 3.33333 3.33333 3.33333\n 0.0 0.0 0.0 0.0 0.0\n\njulia> mean(dropout(ones(10^4, 5), 0.3, dims=1), dims=1)\n1×5 Matrix{Float64}:\n 1.00571 1.00571 1.00571 1.00571 1.00571\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.dropout!","page":"Low-level Operations – NNlib.jl","title":"NNlib.dropout!","text":"dropout!(B, A, p; [dims])\n\nThis does exactly B .= dropout(A, p; dims), or rather, it's the implementation of out-of-place dropout.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Upsampling","page":"Low-level Operations – NNlib.jl","title":"Upsampling","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux's Upsample layer uses NNlib.upsample_nearest, NNlib.upsample_bilinear, and NNlib.upsample_trilinear as its backend. 
Additionally, Flux's PixelShuffle layer uses NNlib.pixel_shuffle as its backend.","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"upsample_nearest\nupsample_linear\n∇upsample_linear\nupsample_bilinear\n∇upsample_bilinear\nupsample_trilinear\n∇upsample_trilinear\npixel_shuffle","category":"page"},{"location":"models/nnlib/#NNlib.upsample_nearest","page":"Low-level Operations – NNlib.jl","title":"NNlib.upsample_nearest","text":"upsample_nearest(x, scale::NTuple{S,Int})\nupsample_nearest(x; size::NTuple{S,Int})\n\nUpsamples the array x by integer multiples along the first S dimensions. Subsequent dimensions of x are not altered.\n\nEither the scale factors or the final output size can be specified.\n\nSee also upsample_bilinear, for two dimensions of an N=4 array.\n\nExample\n\njulia> upsample_nearest([1 2 3; 4 5 6], (2, 3))\n4×9 Matrix{Int64}:\n 1 1 1 2 2 2 3 3 3\n 1 1 1 2 2 2 3 3 3\n 4 4 4 5 5 5 6 6 6\n 4 4 4 5 5 5 6 6 6\n\njulia> ans == upsample_nearest([1 2 3; 4 5 6]; size=(4, 9)) # equivalent\ntrue\n\njulia> upsample_nearest([1 2 3; 4 5 6], (2,))\n4×3 Matrix{Int64}:\n 1 2 3\n 1 2 3\n 4 5 6\n 4 5 6\n\njulia> ans == upsample_nearest([1 2 3; 4 5 6], size=(4,))\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.upsample_linear","page":"Low-level Operations – NNlib.jl","title":"NNlib.upsample_linear","text":"upsample_linear(x::AbstractArray{T,3}, scale::Real; align_corners::Bool = true)\nupsample_linear(x::AbstractArray{T,3}; size::Integer, align_corners::Bool = true)\n\nUpsamples the first dimension of the array x by the provided factor scale, using linear interpolation. 
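A minimal sketch of both call forms on a (width, channel, batch) array (assumes the NNlib package is installed):

```julia
# Sketch: upsample the first dimension of a 3-d array by a factor of 2,
# first via a scale, then via an explicit output length.
using NNlib

x = reshape(Float32[1, 2, 3], 3, 1, 1)   # (width, channels, batch)

y = upsample_linear(x, 2)                # size (6, 1, 1)
z = upsample_linear(x; size = 6)         # equivalent: give the output length directly
size(y) == size(z)                       # true
```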
As an alternative to using scale, the resulting array size can be directly specified with a keyword argument.\n\nThe size of the output is equal to (scale*S1, S2, S3), where S1, S2, S3 = size(x).\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.∇upsample_linear","page":"Low-level Operations – NNlib.jl","title":"NNlib.∇upsample_linear","text":"∇upsample_linear(Δ::AbstractArray{T,3}; size::Integer, align_corners::Bool = true) where T\n\nArguments\n\nΔ: Incoming gradient array, backpropagated from downstream layers\nsize: Size of the image upsampled in the first place\n\nOutputs\n\ndx: Downsampled version of Δ\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.upsample_bilinear","page":"Low-level Operations – NNlib.jl","title":"NNlib.upsample_bilinear","text":"upsample_bilinear(x::AbstractArray{T,4}, scale::NTuple{2,Real}; align_corners::Bool = true)\nupsample_bilinear(x::AbstractArray{T,4}; size::NTuple{2,Integer}, align_corners::Bool = true)\n\nUpsamples the first 2 dimensions of the array x by the upsample factors stored in scale, using bilinear interpolation. 
As an alternative to using scale, the resulting image size can be directly specified with a keyword argument.\n\nThe size of the output is equal to (scale[1]*S1, scale[2]*S2, S3, S4), where S1, S2, S3, S4 = size(x).\n\nExamples\n\njulia> x = reshape(Float32[1 2 3; 4 5 6], (2,3,1,1))\n2×3×1×1 Array{Float32, 4}:\n[:, :, 1, 1] =\n 1.0 2.0 3.0\n 4.0 5.0 6.0\n\njulia> upsample_bilinear(x, (2, 3))\n4×9×1×1 Array{Float32, 4}:\n[:, :, 1, 1] =\n 1.0 1.25 1.5 1.75 2.0 2.25 2.5 2.75 3.0\n 2.0 2.25 2.5 2.75 3.0 3.25 3.5 3.75 4.0\n 3.0 3.25 3.5 3.75 4.0 4.25 4.5 4.75 5.0\n 4.0 4.25 4.5 4.75 5.0 5.25 5.5 5.75 6.0\n\njulia> ans == upsample_bilinear(x; size=(4, 9)) # specify output size instead\ntrue\n\njulia> upsample_bilinear(x, (2.5, 3.5)) # non-integer scaling factors are allowed\n5×10×1×1 Array{Float32, 4}:\n[:, :, 1, 1] =\n 1.0 1.22222 1.44444 1.66667 1.88889 … 2.33333 2.55556 2.77778 3.0\n 1.75 1.97222 2.19444 2.41667 2.63889 3.08333 3.30556 3.52778 3.75\n 2.5 2.72222 2.94444 3.16667 3.38889 3.83333 4.05556 4.27778 4.5\n 3.25 3.47222 3.69444 3.91667 4.13889 4.58333 4.80556 5.02778 5.25\n 4.0 4.22222 4.44444 4.66667 4.88889 5.33333 5.55556 5.77778 6.0\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.∇upsample_bilinear","page":"Low-level Operations – NNlib.jl","title":"NNlib.∇upsample_bilinear","text":"∇upsample_bilinear(Δ::AbstractArray{T,4}; size::NTuple{2,Integer}, align_corners::Bool = true) where T\n\nArguments\n\nΔ: Incoming gradient array, backpropagated from downstream layers\nsize: Lateral (W,H) size of the image upsampled in the first place\n\nOutputs\n\ndx: Downsampled version of Δ\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.upsample_trilinear","page":"Low-level Operations – NNlib.jl","title":"NNlib.upsample_trilinear","text":"upsample_trilinear(x::AbstractArray{T,5}, scale::NTuple{3,Real}; align_corners::Bool = true)\nupsample_trilinear(x::AbstractArray{T,5}; size::NTuple{3,Integer}, align_corners::Bool = 
true)\n\nUpsamples the first 3 dimensions of the array x by the upsample factors stored in scale, using trilinear interpolation. As an alternative to using scale, the resulting image size can be directly specified with a keyword argument.\n\nThe size of the output is equal to (scale[1]*S1, scale[2]*S2, scale[3]*S3, S4, S5), where S1, S2, S3, S4, S5 = size(x).\n\nExamples\n\nupsample_trilinear(x, (2, 3, 4))\nupsample_trilinear(x; size=(4, 9, 11)) # specify output size instead\nupsample_trilinear(x, (2.5, 3.5, pi)) # non-integer scaling factors are allowed\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.∇upsample_trilinear","page":"Low-level Operations – NNlib.jl","title":"NNlib.∇upsample_trilinear","text":"∇upsample_trilinear(Δ::AbstractArray{T,5}; size::NTuple{3,Integer}, align_corners::Bool = true) where T\n\nArguments\n\nΔ: Incoming gradient array, backpropagated from downstream layers\nsize: Lateral size & depth (W,H,D) of the image upsampled in the first place\n\nOutputs\n\ndx: Downsampled version of Δ\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.pixel_shuffle","page":"Low-level Operations – NNlib.jl","title":"NNlib.pixel_shuffle","text":"pixel_shuffle(x, r::Integer)\n\nPixel shuffling operation, upscaling by a factor r.\n\nFor 4-dimensional arrays representing N images, the operation converts input size(x) == (W, H, r^2*C, N) to output of size (r*W, r*H, C, N). For D-dimensional data, it expects ndims(x) == D+2 with channel and batch dimensions, and divides the number of channels by r^D.\n\nUsed in super-resolution networks to upsample towards high resolution features. Reference: Shi et 
al., \"Real-Time Single Image and Video Super-Resolution ...\", CVPR 2016, https://arxiv.org/abs/1609.05158\n\nExamples\n\njulia> x = [10i + j + channel/10 for i in 1:2, j in 1:3, channel in 1:4, batch in 1:1]\n2×3×4×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 11.1 12.1 13.1\n 21.1 22.1 23.1\n\n[:, :, 2, 1] =\n 11.2 12.2 13.2\n 21.2 22.2 23.2\n\n[:, :, 3, 1] =\n 11.3 12.3 13.3\n 21.3 22.3 23.3\n\n[:, :, 4, 1] =\n 11.4 12.4 13.4\n 21.4 22.4 23.4\n\njulia> pixel_shuffle(x, 2) # 4 channels used up as 2x upscaling of image dimensions\n4×6×1×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 11.1 11.3 12.1 12.3 13.1 13.3\n 11.2 11.4 12.2 12.4 13.2 13.4\n 21.1 21.3 22.1 22.3 23.1 23.3\n 21.2 21.4 22.2 22.4 23.2 23.4\n\njulia> y = [i + channel/10 for i in 1:3, channel in 1:6, batch in 1:1]\n3×6×1 Array{Float64, 3}:\n[:, :, 1] =\n 1.1 1.2 1.3 1.4 1.5 1.6\n 2.1 2.2 2.3 2.4 2.5 2.6\n 3.1 3.2 3.3 3.4 3.5 3.6\n\njulia> pixel_shuffle(y, 2) # 1D image, with 6 channels reduced to 3\n6×3×1 Array{Float64, 3}:\n[:, :, 1] =\n 1.1 1.3 1.5\n 1.2 1.4 1.6\n 2.1 2.3 2.5\n 2.2 2.4 2.6\n 3.1 3.3 3.5\n 3.2 3.4 3.6\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Batched-Operations","page":"Low-level Operations – NNlib.jl","title":"Batched Operations","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux's Flux.Bilinear layer uses NNlib.batched_mul internally.","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"batched_mul\nbatched_mul!\nbatched_adjoint\nbatched_transpose\nbatched_vec","category":"page"},{"location":"models/nnlib/#NNlib.batched_mul","page":"Low-level Operations – NNlib.jl","title":"NNlib.batched_mul","text":"batched_mul(A, B) -> C\nA ⊠ B # \\boxtimes\n\nBatched matrix multiplication. Result has C[:,:,k...] == A[:,:,k...] * B[:,:,k...] where k... 
represent any indices in the last dimensions.\n\nIf ndims(A) == ndims(B) == 3 and size(B,3) == 1 then instead C[:,:,k] == A[:,:,k] * B[:,:,1], and similarly for A.\n\nTo transpose each matrix, apply batched_transpose to the array, or batched_adjoint for conjugate-transpose:\n\njulia> A, B = randn(2,5,17), randn(5,9,17);\n\njulia> A ⊠ B |> size\n(2, 9, 17)\n\njulia> batched_adjoint(A) |> size\n(5, 2, 17)\n\njulia> batched_mul(A, batched_adjoint(randn(9,5,17))) |> size\n(2, 9, 17)\n\njulia> A ⊠ randn(5,9,1) |> size\n(2, 9, 17)\n\njulia> batched_transpose(A) == PermutedDimsArray(A, (2,1,3))\ntrue\n\nThe equivalent PermutedDimsArray may be used in place of batched_transpose. Other permutations are also handled by BLAS, provided that the batch index k is not the first dimension of the underlying array. Thus PermutedDimsArray(::Array, (1,3,2)) and PermutedDimsArray(::Array, (3,1,2)) are fine.\n\nHowever, A = PermutedDimsArray(::Array, (3,2,1)) is not acceptable to BLAS, since the batch dimension is the contiguous one: stride(A,3) == 1. This will be copied, as doing so is faster than batched_mul_generic!.\n\nBoth this copy and batched_mul_generic! produce @debug messages, and setting for instance ENV[\"JULIA_DEBUG\"] = NNlib will display them.\n\n\n\n\n\nbatched_mul(A::Array{T,3}, B::Matrix)\nbatched_mul(A::Matrix, B::Array{T,3})\nA ⊠ B\n\nThis is always matrix-matrix multiplication, but either A or B may lack a batch index.\n\nWhen B is a matrix, result has C[:,:,k] == A[:,:,k] * B[:,:] for all k.\nWhen A is a matrix, then C[:,:,k] == A[:,:] * B[:,:,k]. 
This can also be done by reshaping and calling *, for instance A ⊡ B using TensorCore.jl, but is implemented here using batched_gemm instead of gemm.\n\njulia> randn(16,8,32) ⊠ randn(8,4) |> size\n(16, 4, 32)\n\njulia> randn(16,8,32) ⊠ randn(8,4,1) |> size # equivalent\n(16, 4, 32)\n\njulia> randn(16,8) ⊠ randn(8,4,32) |> size\n(16, 4, 32)\n\nSee also batched_vec to regard B as a batch of vectors, A[:,:,k] * B[:,k].\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.batched_mul!","page":"Low-level Operations – NNlib.jl","title":"NNlib.batched_mul!","text":"batched_mul!(C, A, B) -> C\nbatched_mul!(C, A, B, α=1, β=0)\n\nIn-place batched matrix multiplication, equivalent to mul!(C[:,:,k], A[:,:,k], B[:,:,k], α, β) for all k. If size(B,3) == 1 then every batch uses B[:,:,1] instead.\n\nThis will call batched_gemm! whenever possible. For real arrays this means that, for X ∈ [A,B,C], either strides(X,1)==1 or strides(X,2)==1, the latter may be caused by batched_transpose or by for instance PermutedDimsArray(::Array, (3,1,2)). Unlike batched_mul this will never make a copy.\n\nFor complex arrays, the wrapper made by batched_adjoint must be outermost to be seen. 
In this case the strides accepted by BLAS are more restricted: if stride(C,1)==1 then only stride(AorB::BatchedAdjoint,2) == 1 is accepted.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.batched_adjoint","page":"Low-level Operations – NNlib.jl","title":"NNlib.batched_adjoint","text":"batched_transpose(A::AbstractArray{T,3})\nbatched_adjoint(A)\n\nEquivalent to applying transpose or adjoint to each matrix A[:,:,k].\n\nThese exist to control how batched_mul behaves, as it operates on such matrix slices of an array with ndims(A)==3.\n\nPermutedDimsArray(A, (2,1,3)) is equivalent to batched_transpose(A), and is also understood by batched_mul (and more widely supported elsewhere).\n\nBatchedTranspose{T, S} <: AbstractBatchedMatrix{T, 3}\nBatchedAdjoint{T, S}\n\nLazy wrappers analogous to Transpose and Adjoint, returned by batched_transpose etc.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.batched_transpose","page":"Low-level Operations – NNlib.jl","title":"NNlib.batched_transpose","text":"batched_transpose(A::AbstractArray{T,3})\nbatched_adjoint(A)\n\nEquivalent to applying transpose or adjoint to each matrix A[:,:,k].\n\nThese exist to control how batched_mul behaves, as it operates on such matrix slices of an array with ndims(A)==3.\n\nPermutedDimsArray(A, (2,1,3)) is equivalent to batched_transpose(A), and is also understood by batched_mul (and more widely supported elsewhere).\n\nBatchedTranspose{T, S} <: AbstractBatchedMatrix{T, 3}\nBatchedAdjoint{T, S}\n\nLazy wrappers analogous to Transpose and Adjoint, returned by batched_transpose etc.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.batched_vec","page":"Low-level Operations – NNlib.jl","title":"NNlib.batched_vec","text":"batched_vec(A::Array{T,3}, B::Matrix)\nbatched_vec(A::Array{T,3}, b::Vector)\n\nBatched matrix-vector multiplication: the result has C[:,k] == A[:,:,k] * B[:,k] for all k, or else C[:,k] == A[:,:,k] * b for 
b::Vector.\n\nWith the same argument types, batched_mul(A, B) would regard B as a fixed matrix, not a batch of vectors. Both forms reshape and then call batched_mul(::Array{T,3}, ::Array{T,3}).\n\njulia> A, B, b = randn(16,8,32), randn(8,32), randn(8);\n\njulia> batched_vec(A,B) |> size\n(16, 32)\n\njulia> batched_vec(A,b) |> size\n(16, 32)\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Gather-and-Scatter","page":"Low-level Operations – NNlib.jl","title":"Gather and Scatter","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"Flux's Embedding layer uses NNlib.gather as its backend.","category":"page"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"NNlib.gather\nNNlib.gather!\nNNlib.scatter\nNNlib.scatter!","category":"page"},{"location":"models/nnlib/#NNlib.gather","page":"Low-level Operations – NNlib.jl","title":"NNlib.gather","text":"NNlib.gather(src, idx) -> dst\n\nReverse operation of scatter. Gathers data from source src and writes it in a destination dst according to the index array idx. For each k in CartesianIndices(idx), assign values to dst according to\n\ndst[:, ... , k] .= src[:, ... , idx[k]...]\n\nNotice that if idx is a vector containing integers and src is a matrix, the previous expression simplifies to\n\ndst[:, k] .= src[:, idx[k]]\n\nand k will run over 1:length(idx).\n\nThe elements of idx can be integers or integer tuples and may be repeated. A single src column can end up being copied into zero, one, or multiple dst columns.\n\nSee gather! 
for an in-place version.\n\nExamples\n\njulia> NNlib.gather([1,20,300,4000], [2,4,2])\n3-element Vector{Int64}:\n 20\n 4000\n 20\n\njulia> NNlib.gather([1 2 3; 4 5 6], [1,3,1,3,1])\n2×5 Matrix{Int64}:\n 1 3 1 3 1\n 4 6 4 6 4\n\n\n\n\n\ngather(src, IJK...)\n\nConvert the tuple of integer vectors IJK to a tuple of CartesianIndex and call gather on it: gather(src, CartesianIndex.(IJK...)).\n\nExamples\n\njulia> src = reshape([1:15;], 3, 5)\n3×5 Matrix{Int64}:\n 1 4 7 10 13\n 2 5 8 11 14\n 3 6 9 12 15\n\njulia> NNlib.gather(src, [1, 2], [2, 4])\n2-element Vector{Int64}:\n 4\n 11\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.gather!","page":"Low-level Operations – NNlib.jl","title":"NNlib.gather!","text":"NNlib.gather!(dst, src, idx)\n\nReverse operation of scatter!. Gathers data from source src and writes it in destination dst according to the index array idx. For each k in CartesianIndices(idx), assign values to dst according to\n\ndst[:, ... , k] .= src[:, ... , idx[k]...]\n\nNotice that if idx is a vector containing integers, and both dst and src are matrices, the previous expression simplifies to\n\ndst[:, k] .= src[:, idx[k]]\n\nand k will run over 1:length(idx).\n\nThe elements of idx can be integers or integer tuples and may be repeated. A single src column can end up being copied into zero, one, or multiple dst columns.\n\nSee gather for an allocating version.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.scatter","page":"Low-level Operations – NNlib.jl","title":"NNlib.scatter","text":"NNlib.scatter(op, src, idx; [init, dstsize])\n\nScatter operation allocating a destination array dst and calling scatter!(op, dst, src, idx) on it.\n\nIf keyword init is provided, it is used to initialize the content of dst. Otherwise, the init value is inferred from the reduction operator op for some common operators (e.g. 
init = 0 for op = +).\nIf dstsize is provided, it will be used to define the size of the destination array; otherwise it is inferred from src and idx.\n\nSee scatter! for full details on how idx works.\n\nExamples\n\njulia> NNlib.scatter(+, [10,100,1000], [3,1,2])\n3-element Vector{Int64}:\n 100\n 1000\n 10\n\njulia> NNlib.scatter(+, [1 2 3 4; 5 6 7 8], [2,1,1,5])\n2×5 Matrix{Int64}:\n 5 1 0 0 4\n 13 5 0 0 8\n\njulia> NNlib.scatter(*, [10,200,3000], [1,4,2]; init = 10, dstsize = 6)\n6-element Vector{Int64}:\n 100\n 30000\n 10\n 2000\n 10\n 10\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.scatter!","page":"Low-level Operations – NNlib.jl","title":"NNlib.scatter!","text":"NNlib.scatter!(op, dst, src, idx)\n\nScatter operation, which writes data in src into dst at locations idx. A binary reduction operator op is applied during the scatter. For each index k in idx, accumulates values in dst according to\n\ndst[:, ..., idx[k]...] = (op).(dst[:, ..., idx[k]...], src[:, ..., k...])\n\nSee also scatter, gather.\n\nArguments\n\nop: The operation to be applied to dst and src, e.g. +, -, *, /, max, min and mean.\ndst: The destination for src to aggregate to. This argument will be mutated.\nsrc: The source data for aggregating.\nidx: The mapping for aggregation from source (index) to destination (value). 
The idx array can contain either integers or tuples.\n\nExamples\n\njulia> NNlib.scatter!(+, ones(3), [10,100], [1,3])\n3-element Vector{Float64}:\n 11.0\n 1.0\n 101.0\n\njulia> NNlib.scatter!(*, fill(0.5, 2, 4), [1 10; 100 1000], [3,2])\n2×4 Matrix{Float64}:\n 0.5 5.0 0.5 0.5\n 0.5 500.0 50.0 0.5\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Sampling","page":"Low-level Operations – NNlib.jl","title":"Sampling","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"grid_sample\n∇grid_sample","category":"page"},{"location":"models/nnlib/#NNlib.grid_sample","page":"Low-level Operations – NNlib.jl","title":"NNlib.grid_sample","text":"grid_sample(input::AbstractArray{T, 4}, grid::AbstractArray{T, 4}; padding_mode = :zeros)\n\nGiven input, compute output by sampling input values at pixel locations from grid. Uses bilinear interpolation to calculate output values.\n\nThis implementation assumes the extrema (-1 and 1) are considered as referring to the center points of the input’s corner pixels (i.e. align corners is true).\n\nArguments\n\ninput: Input array in (W_in, H_in, C, N) shape.\ngrid: Input grid in (2, W_out, H_out, N) shape, where for each (W_out, H_out, N) location grid contains (x, y) coordinates that specify sampling locations normalized by the input shape.\nTherefore, x and y should have values in the [-1, 1] range. For example, (x = -1, y = -1) is the top-left pixel of input, and (x = 1, y = 1) is the bottom-right pixel of input.\nOut-of-bound values are handled according to the padding_mode.\npadding_mode: Out-of-bound padding. :zeros to use 0 for out-of-bound grid locations. :border to use border values for out-of-bound grid locations. 
Default is :zeros.\n\nReturns\n\n(W_out, H_out, C, N) sampled grid from input.\n\nExamples\n\nIn the example below, grid contains two out-of-bound sampling locations, which are handled differently, depending on the padding_mode.\n\njulia> x = reshape(collect(1.0:4.0), (2, 2, 1, 1))\n2×2×1×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 1.0 3.0\n 2.0 4.0\n\njulia> grid = Array{Float64}(undef, 2, 3, 2, 1);\n\njulia> grid[:, 1, 1, 1] .= (-3, -1);\n\njulia> grid[:, 2, 1, 1] .= (0, -1);\n\njulia> grid[:, 3, 1, 1] .= (1, -1);\n\njulia> grid[:, 1, 2, 1] .= (-1, 1);\n\njulia> grid[:, 2, 2, 1] .= (0, 1);\n\njulia> grid[:, 3, 2, 1] .= (3, 1);\n\njulia> grid_sample(x, grid; padding_mode=:zeros)\n3×2×1×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 0.0 3.0\n 1.5 3.5\n 2.0 0.0\n\njulia> grid_sample(x, grid; padding_mode=:border)\n3×2×1×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 1.0 3.0\n 1.5 3.5\n 2.0 4.0\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.∇grid_sample","page":"Low-level Operations – NNlib.jl","title":"NNlib.∇grid_sample","text":"∇grid_sample(Δ::AbstractArray{T, 4}, input::AbstractArray{T, 4}, grid::AbstractArray{T, 4}; padding_mode = :zeros) where T\n\nArguments\n\nΔ: Input gradient in (W_out, H_out, C, N) shape (same as output of the primal computation).\ninput: Input from primal computation in (W_in, H_in, C, N) shape.\ngrid: Grid from primal computation in (2, W_out, H_out, N) shape.\npadding_mode: Out-of-bound padding. :zeros to use 0 for out-of-bound grid locations. :border to use border values for out-of-bound grid locations. Should be the same as in primal computation. 
Default is :zeros.\n\nReturns\n\ndinput (same shape as input) and dgrid (same shape as grid) gradients.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Losses","page":"Low-level Operations – NNlib.jl","title":"Losses","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"ctc_loss","category":"page"},{"location":"models/nnlib/#NNlib.ctc_loss","page":"Low-level Operations – NNlib.jl","title":"NNlib.ctc_loss","text":"ctc_loss(ŷ, y)\n\nComputes the connectionist temporal classification loss between ŷ and y. ŷ must be a classes-by-time matrix, i.e., each row represents a class and each column represents a time step. Additionally, the logsoftmax function will be applied to ŷ, so ŷ must be the raw activation values from the neural network and not, for example, the activations after being passed through a softmax activation function. y must be a 1D array of the labels associated with ŷ. The blank label is assumed to be the last label category in ŷ, so it is equivalent to size(ŷ, 1). Used for sequence-to-sequence classification problems such as speech recognition and handwriting recognition where the exact time-alignment of the output (e.g., letters) is not needed to solve the problem. See Graves et al. (2006) or Graves (2012) for mathematical details.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#Miscellaneous","page":"Low-level Operations – NNlib.jl","title":"Miscellaneous","text":"","category":"section"},{"location":"models/nnlib/","page":"Low-level Operations – NNlib.jl","title":"Low-level Operations – NNlib.jl","text":"logsumexp\nNNlib.glu","category":"page"},{"location":"models/nnlib/#NNlib.logsumexp","page":"Low-level Operations – NNlib.jl","title":"NNlib.logsumexp","text":"logsumexp(x; dims = :)\n\nComputes log.(sum(exp.(x); dims)) in a numerically stable way. 
Without dims keyword this returns a scalar.\n\nSee also logsoftmax.\n\n\n\n\n\n","category":"function"},{"location":"models/nnlib/#NNlib.glu","page":"Low-level Operations – NNlib.jl","title":"NNlib.glu","text":"glu(x, dim = 1)\n\nThe gated linear unit from the \"Language Modeling with Gated Convolutional Networks\" paper.\n\nCalculates a .* sigmoid(b), where x is split in half along given dimension dim to form a and b.\n\n\n\n\n\n","category":"function"},{"location":"saving/#Saving-and-Loading-Models","page":"Saving & Loading","title":"Saving and Loading Models","text":"","category":"section"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"You may wish to save models so that they can be loaded and run in a later session. Flux provides a number of ways to do this. The recommended way, which is the most robust one for long term storage, is to use Flux.state in combination with a serialization format like JLD2.jl or BSON.jl.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"Save a model:","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"julia> using Flux\n\njulia> struct MyModel\n net\n end\n\njulia> Flux.@functor MyModel\n\njulia> MyModel() = MyModel(Chain(Dense(10, 5, relu), Dense(5, 2)));\n\njulia> model = MyModel()\nMyModel(Chain(Dense(10 => 5, relu), Dense(5 => 2)))\n\njulia> model_state = Flux.state(model);\n\njulia> using JLD2\n\njulia> jldsave(\"mymodel.jld2\"; model_state)","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"Load it again in a new session using Flux.loadmodel!:","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"julia> using Flux, JLD2\n\njulia> model_state = JLD2.load(\"mymodel.jld2\", \"model_state\");\n\njulia> model = MyModel(); # MyModel definition must be available\n\njulia> Flux.loadmodel!(model, 
model_state);","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"note: Note\nIf a saved model's parameters are stored on the GPU, the model will not load later on if there is no GPU support available. It's best to move your model to the CPU with cpu(model) before saving it.","category":"page"},{"location":"saving/#Checkpointing","page":"Saving & Loading","title":"Checkpointing","text":"","category":"section"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"In longer training runs it's a good idea to periodically save your model, so that you can resume if training is interrupted (for example, if there's a power cut). ","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"julia> using Flux: throttle\n\njulia> using JLD2\n\njulia> m = Chain(Dense(10 => 5, relu), Dense(5 => 2))\nChain(\n Dense(10 => 5, relu), # 55 parameters\n Dense(5 => 2), # 12 parameters\n) # Total: 4 arrays, 67 parameters, 524 bytes.\n\njulia> for epoch in 1:10\n # ... train model ...\n jldsave(\"model-checkpoint.jld2\", model_state = Flux.state(m))\n end;","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"This will overwrite \"model-checkpoint.jld2\" every epoch.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"You can get more advanced by saving a series of models throughout training, for example","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"jldsave(\"model-$(now()).jld2\", model_state = Flux.state(m))","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"will produce a series of models like \"model-2018-03-06T02:57:10.41.jld2\" (the now() function is provided by the Dates standard library, loaded with using Dates). 
You could also store the current test set loss, so that it's easy to (for example) revert to an older copy of the model if it starts to overfit.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"jldsave(\"model-$(now()).jld2\", model_state = Flux.state(m), loss = testloss())","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"Note that to resume a model's training, you might need to restore other stateful parts of your training loop. Possible examples are the optimiser state and the randomness used to partition the original data into the training and validation sets.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"You can store the optimiser state alongside the model, to resume training exactly where you left off: ","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"model = MyModel()\nopt_state = Flux.setup(AdamW(), model)\n\n# ... train model ...\n\nmodel_state = Flux.state(model)\njldsave(\"checkpoint_epoch=42.jld2\"; model_state, opt_state)","category":"page"},{"location":"saving/#Saving-Models-as-Julia-Structs","page":"Saving & Loading","title":"Saving Models as Julia Structs","text":"","category":"section"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"Models are just normal Julia structs, so it's fine to use any Julia storage format to save the struct as it is instead of saving the state returned by Flux.state. 
BSON.jl is particularly convenient for this, since it can also save anonymous functions, which are sometimes part of a model definition.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"Save a model:","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"julia> using Flux\n\njulia> model = Chain(Dense(10, 5, NNlib.relu), Dense(5, 2));\n\njulia> using BSON: @save\n\njulia> @save \"mymodel.bson\" model","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"Load it again in a new session:","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"julia> using Flux, BSON\n\njulia> BSON.@load \"mymodel.bson\" model\n\njulia> model\nChain(\n Dense(10 => 5, relu), # 55 parameters\n Dense(5 => 2), # 12 parameters\n) # Total: 4 arrays, 67 parameters, 524 bytes.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"warning: Warning\nSaving models this way could lead to compatibility issues across Julia versions and across Flux versions if some of the Flux layers' internals are changed. It is therefore not recommended for long-term storage; use Flux.state instead.","category":"page"},{"location":"saving/","page":"Saving & Loading","title":"Saving & Loading","text":"warning: Warning\nPrevious versions of Flux suggested saving only the model weights using @save \"mymodel.bson\" params(model). This is no longer recommended and even strongly discouraged. 
Saving models this way will only store the trainable parameters, which will result in incorrect behavior for layers like BatchNorm.","category":"page"},{"location":"models/layers/#Built-in-Layer-Types","page":"Built-in Layers","title":"Built-in Layer Types","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"If you started at the beginning of the guide, then you have already met the basic Dense layer, and seen Chain for combining layers. These core layers form the foundation of almost all neural networks.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"The Dense layer exemplifies several features:","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"It contains an activation function, which is broadcasted over the output. Because this broadcast can be fused with other operations, doing so is more efficient than applying the activation function separately.\nIt takes an init keyword, which accepts a function acting like rand. That is, init(2,3,4) should create an array of this size. Flux has many such functions built-in. All make a CPU array, moved later with gpu if desired.\nThe bias vector is always initialised with Flux.zeros32. The keyword bias=false will turn this off, i.e. keeping the bias permanently zero.\nIt is annotated with @functor, which means that params will see the contents, and gpu will move their arrays to the GPU.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"By contrast, Chain itself contains no parameters, but connects other layers together. 
The section on dataflow layers introduces others like this.","category":"page"},{"location":"models/layers/#Fully-Connected","page":"Built-in Layers","title":"Fully Connected","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Dense\nFlux.Bilinear\nFlux.Scale","category":"page"},{"location":"models/layers/#Flux.Dense","page":"Built-in Layers","title":"Flux.Dense","text":"Dense(in => out, σ=identity; bias=true, init=glorot_uniform)\nDense(W::AbstractMatrix, [bias, σ])\n\nCreate a traditional fully connected layer, whose forward pass is given by:\n\ny = σ.(W * x .+ bias)\n\nThe input x should be a vector of length in, or a batch of vectors represented as an in × N matrix, or any array with size(x,1) == in. The output y will be a vector of length out, or a batch with size(y) == (out, size(x)[2:end]...).\n\nKeyword bias=false will switch off trainable bias for the layer. The initialisation of the weight matrix is W = init(out, in), calling the function given to keyword init, with default glorot_uniform. 
The weight matrix and/or the bias vector (of length out) may also be provided explicitly.\n\nExamples\n\njulia> d = Dense(5 => 2)\nDense(5 => 2) # 12 parameters\n\njulia> d(rand32(5, 64)) |> size\n(2, 64)\n\njulia> d(rand32(5, 6, 4, 64)) |> size # treated as three batch dimensions\n(2, 6, 4, 64)\n\njulia> d1 = Dense(ones(2, 5), false, tanh) # using provided weight matrix\nDense(5 => 2, tanh; bias=false) # 10 parameters\n\njulia> d1(ones(5))\n2-element Vector{Float64}:\n 0.9999092042625951\n 0.9999092042625951\n\njulia> Flux.params(d1) # no trainable bias\nParams([[1.0 1.0 … 1.0 1.0; 1.0 1.0 … 1.0 1.0]])\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.Bilinear","page":"Built-in Layers","title":"Flux.Bilinear","text":"Bilinear((in1, in2) => out, σ=identity; bias=true, init=glorot_uniform)\nBilinear(W::AbstractArray, [bias, σ])\n\nCreates a layer which is fully connected between two inputs and the output, and otherwise similar to Dense. Its output, given vectors x & y, is another vector z with, for all i ∈ 1:out:\n\nz[i] = σ(x' * W[i,:,:] * y + bias[i])\n\nIf x and y are matrices, then each column of the output z = B(x, y) is of this form, with B the Bilinear layer.\n\nIf the second input y is not given, it is taken to be equal to x, i.e. B(x) == B(x, x)\n\nThe two inputs may also be provided as a tuple, B((x, y)) == B(x, y), which is accepted as the input to a Chain.\n\nIf the two input sizes are the same, in1 == in2, then you may write Bilinear(in => out, σ).\n\nThe initialisation works as for Dense layer, with W = init(out, in1, in2). By default the bias vector is zeros(Float32, out), option bias=false will switch off trainable bias. 
Either of these may be provided explicitly.\n\nExamples\n\njulia> x, y = randn(Float32, 5, 32), randn(Float32, 5, 32);\n\njulia> B = Flux.Bilinear((5, 5) => 7)\nBilinear(5 => 7) # 182 parameters\n\njulia> B(x) |> size # interactions based on one input\n(7, 32)\n\njulia> B(x,y) == B((x,y)) # two inputs, may be given as a tuple\ntrue\n\njulia> sc = SkipConnection(\n Chain(Dense(5 => 20, tanh), Dense(20 => 9, tanh)),\n Flux.Bilinear((9, 5) => 3, bias=false),\n ); # used as the recombinator, with skip as the second input\n\njulia> sc(x) |> size\n(3, 32)\n\njulia> Flux.Bilinear(rand(4,8,16), false, tanh) # first dim of weight is the output\nBilinear((8, 16) => 4, tanh; bias=false) # 512 parameters\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.Scale","page":"Built-in Layers","title":"Flux.Scale","text":"Scale(size::Integer..., σ=identity; bias=true, init=ones32)\nScale(scale::AbstractArray, [bias, σ])\n\nCreate an element-wise layer, whose forward pass is given by:\n\ny = σ.(scale .* x .+ bias)\n\nThis uses .* instead of matrix multiplication * of Dense.\n\nThe learnable scale & bias are initialised init(size...) and zeros32(size...), with init=ones32 by default. 
You may specify the function init, turn off trainable bias with bias=false, or provide the array(s) explicitly.\n\nUsed by LayerNorm with affine=true.\n\nExamples\n\njulia> a = Flux.Scale(2)\nScale(2) # 4 parameters\n\njulia> Flux.params(a)\nParams([Float32[1.0, 1.0], Float32[0.0, 0.0]])\n\njulia> a([1 2 3])\n2×3 Matrix{Float32}:\n 1.0 2.0 3.0\n 1.0 2.0 3.0\n\njulia> b = Flux.Scale([1 2 3 4], false, abs2)\nScale(1, 4, abs2; bias=false) # 4 parameters\n\njulia> b([1, 10])\n2×4 Matrix{Int64}:\n 1 4 9 16\n 100 400 900 1600\n\njulia> Flux.params(b)\nParams([[1 2 3 4]])\n\n\n\n\n\n","category":"type"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Perhaps Scale isn't quite fully connected, but it may be thought of as Dense(Diagonal(s.weights), s.bias), and LinearAlgebra's Diagonal is a matrix which just happens to contain many zeros.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"compat: Flux ≤ 0.12\nOld versions of Flux accepted only Dense(in, out, act) and not Dense(in => out, act). This notation makes a Pair object. If you get an error like MethodError: no method matching Dense(::Pair{Int64,Int64}), this means that you should upgrade to newer Flux versions.","category":"page"},{"location":"models/layers/#Convolution-Models","page":"Built-in Layers","title":"Convolution Models","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"These layers are used to build convolutional neural networks (CNNs).","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"They all expect images in what is called WHCN order: a batch of 32 colour images, each 50 x 50 pixels, will have size(x) == (50, 50, 3, 32). 
A single grayscale image might instead have size(x) == (28, 28, 1, 1).","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Besides 2D data such as images, they also work with 1D data, where for instance a stereo sound recording with 1000 samples might have size(x) == (1000, 2, 1). They will also work with 3D data, ndims(x) == 5, where again the last two dimensions are channel and batch.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"To understand how strides and padding work, the article by Dumoulin & Visin has great illustrations.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Conv\nConv(weight::AbstractArray)\nConvTranspose\nConvTranspose(weight::AbstractArray)\nCrossCor\nCrossCor(weight::AbstractArray)\nDepthwiseConv\nSamePad\nFlux.flatten","category":"page"},{"location":"models/layers/#Flux.Conv","page":"Built-in Layers","title":"Flux.Conv","text":"Conv(filter, in => out, σ = identity;\n stride = 1, pad = 0, dilation = 1, groups = 1, [bias, init])\n\nStandard convolutional layer. filter is a tuple of integers specifying the size of the convolutional kernel; in and out specify the number of input and output channels.\n\nImage data should be stored in WHCN order (width, height, channels, batch). In other words, a 100×100 RGB image would be a 100×100×3×1 array, and a batch of 50 would be a 100×100×3×50 array. This has N = 2 spatial dimensions, and needs a kernel size like (5,5), a 2-tuple of integers.\n\nTo take convolutions along N feature dimensions, this layer expects as input an array with ndims(x) == N+2, where size(x, N+1) == in is the number of input channels, and size(x, ndims(x)) is (as always) the number of observations in a batch. 
Then:\n\nfilter should be a tuple of N integers.\nKeywords stride and dilation should each be either a single integer, or a tuple with N integers.\nKeyword pad specifies the number of elements added to the borders of the data array. It can be\na single integer for equal padding all around,\na tuple of N integers, to apply the same padding at begin/end of each spatial dimension,\na tuple of 2*N integers, for asymmetric padding, or\nthe singleton SamePad(), to calculate padding such that size(output,d) == size(x,d) / stride (possibly rounded) for each spatial dimension.\nKeyword groups is expected to be an Int. It specifies the number of groups to divide a convolution into.\n\nKeywords to control initialization of the layer:\n\ninit - Function used to generate initial weights. Defaults to glorot_uniform.\nbias - The initial bias vector is all zero by default. Trainable bias can be disabled entirely by setting this to false, or another vector can be provided such as bias = randn(Float32, out).\n\nSee also ConvTranspose, DepthwiseConv, CrossCor.\n\nExamples\n\njulia> xs = rand32(100, 100, 3, 50); # a batch of 50 RGB images\n\njulia> layer = Conv((5,5), 3 => 7, relu; bias = false)\nConv((5, 5), 3 => 7, relu, bias=false) # 525 parameters\n\njulia> layer(xs) |> size\n(96, 96, 7, 50)\n\njulia> Conv((5,5), 3 => 7; stride = 2)(xs) |> size\n(48, 48, 7, 50)\n\njulia> Conv((5,5), 3 => 7; stride = 2, pad = SamePad())(xs) |> size\n(50, 50, 7, 50)\n\njulia> Conv((1,1), 3 => 7; pad = (20,10,0,0))(xs) |> size\n(130, 100, 7, 50)\n\njulia> Conv((5,5), 3 => 7; stride = 2, dilation = 4)(xs) |> size\n(42, 42, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.Conv-Tuple{AbstractArray}","page":"Built-in Layers","title":"Flux.Conv","text":"Conv(weight::AbstractArray, [bias, activation; stride, pad, dilation])\n\nConstructs a convolutional layer with the given weight and bias. 
Accepts the same keywords and has the same defaults as Conv(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ; ...).\n\njulia> weight = rand(3, 4, 5);\n\njulia> bias = zeros(5);\n\njulia> layer = Conv(weight, bias, sigmoid) # expects 1 spatial dimension\nConv((3,), 4 => 5, σ) # 65 parameters\n\njulia> layer(randn(100, 4, 64)) |> size\n(98, 5, 64)\n\njulia> Flux.params(layer) |> length\n2\n\n\n\n\n\n","category":"method"},{"location":"models/layers/#Flux.ConvTranspose","page":"Built-in Layers","title":"Flux.ConvTranspose","text":"ConvTranspose(filter, in => out, σ=identity; stride=1, pad=0, dilation=1, [bias, init])\n\nStandard convolutional transpose layer. filter is a tuple of integers specifying the size of the convolutional kernel, while in and out specify the number of input and output channels.\n\nNote that pad=SamePad() here tries to ensure size(output,d) == size(x,d) * stride.\n\nParameters are controlled by additional keywords, with defaults init=glorot_uniform and bias=true.\n\nSee also Conv for more detailed description of keywords.\n\nExamples\n\njulia> xs = rand32(100, 100, 3, 50); # a batch of 50 RGB images\n\njulia> layer = ConvTranspose((5,5), 3 => 7, relu)\nConvTranspose((5, 5), 3 => 7, relu) # 532 parameters\n\njulia> layer(xs) |> size\n(104, 104, 7, 50)\n\njulia> ConvTranspose((5,5), 3 => 7, stride=2)(xs) |> size\n(203, 203, 7, 50)\n\njulia> ConvTranspose((5,5), 3 => 7, stride=3, pad=SamePad())(xs) |> size\n(300, 300, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.ConvTranspose-Tuple{AbstractArray}","page":"Built-in Layers","title":"Flux.ConvTranspose","text":"ConvTranspose(weight::AbstractArray, [bias, activation; stride, pad, dilation, groups])\n\nConstructs a ConvTranspose layer with the given weight and bias. 
Accepts the same keywords and has the same defaults as ConvTranspose(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ; ...).\n\nExamples\n\njulia> weight = rand(3, 4, 5);\n\njulia> bias = zeros(4);\n\njulia> layer = ConvTranspose(weight, bias, sigmoid)\nConvTranspose((3,), 5 => 4, σ) # 64 parameters\n\njulia> layer(randn(100, 5, 64)) |> size # transposed convolution will increase the dimension size (upsampling)\n(102, 4, 64)\n\njulia> Flux.params(layer) |> length\n2\n\n\n\n\n\n","category":"method"},{"location":"models/layers/#Flux.CrossCor","page":"Built-in Layers","title":"Flux.CrossCor","text":"CrossCor(filter, in => out, σ=identity; stride=1, pad=0, dilation=1, [bias, init])\n\nStandard cross correlation layer. filter is a tuple of integers specifying the size of the convolutional kernel; in and out specify the number of input and output channels.\n\nParameters are controlled by additional keywords, with defaults init=glorot_uniform and bias=true.\n\nSee also Conv for more detailed description of keywords.\n\nExamples\n\njulia> xs = rand(Float32, 100, 100, 3, 50); # a batch of 50 RGB images\n\njulia> layer = CrossCor((5,5), 3 => 6, relu; bias=false)\nCrossCor((5, 5), 3 => 6, relu, bias=false) # 450 parameters\n\njulia> layer(xs) |> size\n(96, 96, 6, 50)\n\njulia> CrossCor((5,5), 3 => 7, stride=3, pad=(2,0))(xs) |> size\n(34, 32, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.CrossCor-Tuple{AbstractArray}","page":"Built-in Layers","title":"Flux.CrossCor","text":"CrossCor(weight::AbstractArray, [bias, activation; stride, pad, dilation])\n\nConstructs a CrossCor layer with the given weight and bias. 
Accepts the same keywords and has the same defaults as CrossCor(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ; ...).\n\nExamples\n\njulia> weight = rand(3, 4, 5);\n\njulia> bias = zeros(5);\n\njulia> layer = CrossCor(weight, bias, relu)\nCrossCor((3,), 4 => 5, relu) # 65 parameters\n\njulia> layer(randn(100, 4, 64)) |> size\n(98, 5, 64)\n\n\n\n\n\n","category":"method"},{"location":"models/layers/#Flux.DepthwiseConv","page":"Built-in Layers","title":"Flux.DepthwiseConv","text":"DepthwiseConv(filter, in => out, σ=identity; stride=1, pad=0, dilation=1, [bias, init])\nDepthwiseConv(weight::AbstractArray, [bias, activation; stride, pad, dilation])\n\nReturn a depthwise convolutional layer, that is a Conv layer with number of groups equal to the number of input channels.\n\nSee Conv for a description of the arguments.\n\nExamples\n\njulia> xs = rand(Float32, 100, 100, 3, 50); # a batch of 50 RGB images\n\njulia> layer = DepthwiseConv((5,5), 3 => 6, relu; bias=false)\nConv((5, 5), 3 => 6, relu, groups=3, bias=false) # 150 parameters \n\njulia> layer(xs) |> size\n(96, 96, 6, 50)\n\njulia> DepthwiseConv((5, 5), 3 => 9, stride=2, pad=2)(xs) |> size\n(50, 50, 9, 50)\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Flux.SamePad","page":"Built-in Layers","title":"Flux.SamePad","text":"SamePad()\n\nPassed as an option to convolutional layers (and friends), this causes the padding to be chosen such that the input and output sizes agree (on the first N dimensions, the kernel or window) when stride==1. 
When stride≠1, the output size equals ceil(input_size/stride).\n\nSee also Conv, MaxPool.\n\nExamples\n\njulia> xs = rand32(100, 100, 3, 50); # a batch of images\n\njulia> layer = Conv((2,2), 3 => 7, pad=SamePad())\nConv((2, 2), 3 => 7, pad=(1, 0, 1, 0)) # 91 parameters\n\njulia> layer(xs) |> size # notice how the dimensions stay the same with this padding\n(100, 100, 7, 50)\n\njulia> layer2 = Conv((2,2), 3 => 7)\nConv((2, 2), 3 => 7) # 91 parameters\n\njulia> layer2(xs) |> size # the output dimension changes as the padding was not \"same\"\n(99, 99, 7, 50)\n\njulia> layer3 = Conv((5, 5), 3 => 7, stride=2, pad=SamePad())\nConv((5, 5), 3 => 7, pad=2, stride=2) # 532 parameters\n\njulia> layer3(xs) |> size # output size = `ceil(input_size/stride)` = 50\n(50, 50, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.flatten","page":"Built-in Layers","title":"Flux.flatten","text":"flatten(x)\n\nSame as MLUtils.flatten, which should be preferred; this method exists only for backward compatibility.\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#MultiHeadAttention","page":"Built-in Layers","title":"MultiHeadAttention","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"The basic blocks needed to implement Transformer architectures. See also the functional counterparts documented in NNlib's Attention section.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"MultiHeadAttention","category":"page"},{"location":"models/layers/#Flux.MultiHeadAttention","page":"Built-in Layers","title":"Flux.MultiHeadAttention","text":"MultiHeadAttention(dims; [nheads, bias, init, dropout_prob])\n\nThe multi-head dot-product attention layer used in Transformer architectures [1].\n\nReturns the transformed input sequence and the attention scores.\n\n[1] Vaswani et al. 
\"Attention is all you need.\" Advances in Neural Information Processing Systems. 2017.\n\nArguments\n\ndims: The embedding dimensions of inputs, intermediate tensors and outputs. In the most general case, it is given as a) (q_in_dim, k_in_dim, v_in_dim) => (qk_dim, v_dim) => out_dim. Can take also simpler forms as b) dims::Int; c) in_dim::Int => (qk_dim, v_dim) => out_dim; d) in_dim::Int => qkv_dim => out_dim.\nnheads: number of heads. Default 8.\ninit: weight initializer for the Dense layers. Default glorot_uniform.\nbias : whether pointwise QKVO dense transforms use bias. Default false.\ndropout_prob: dropout probability for the attention scores. Default 0.0.\n\nForward\n\n(mha::MultiHeadAttention)(q_in, k_in, v_in, [bias]; [mask])\n\nThe arguments of the forward pass are:\n\nq_in: Input query array of size (q_in_dim, q_len, batch_size).\nk_in: Input key array of size (k_in_dim, kv_len, batch_size).\nv_in: Input value array of size (v_in_dim, kv_len, batch_size).\nbias: Bias array broadcastable to size (kv_len, q_len, nheads, batch_size). It will be added to the attention scores before the softmax. Default nothing.\nmask: Input array broadcastable to size (kv_len, q_len, nheads, batch_size). The mask is applied to the attention scores just before the softmax. See NNlib.make_causal_mask for creating causal masks. 
Default nothing.\n\nAlternative calling signatures are mha(q_in), equivalent to mha(q_in, q_in, q_in) (self-attention), and mha(q_in, k_in), equivalent to mha(q_in, k_in, k_in) (key and value are the same).\n\nSee also NNlib.dot_product_attention.\n\nExamples\n\nmha = MultiHeadAttention(64, nheads = 8)\nq = rand(Float32, (64, 10, 32))\nk = rand(Float32, (64, 20, 32))\nv = rand(Float32, (64, 20, 32))\ny, α = mha(q, k, v) \n# [y] = [64, 10, 32]\n# [α] = [20, 10, 8, 32]\n\nmha = MultiHeadAttention(64 => 1024 => 1024, nheads = 8)\ny, α = mha(q) # self-attention\n# [y] = [1024, 10, 32]\n# [α] = [10, 10, 8, 32]\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Pooling","page":"Built-in Layers","title":"Pooling","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"These layers are commonly used after a convolution layer, and reduce the size of its output. They have no trainable parameters.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"AdaptiveMaxPool\nMaxPool\nGlobalMaxPool\nAdaptiveMeanPool\nMeanPool\nGlobalMeanPool","category":"page"},{"location":"models/layers/#Flux.AdaptiveMaxPool","page":"Built-in Layers","title":"Flux.AdaptiveMaxPool","text":"AdaptiveMaxPool(out::NTuple)\n\nAdaptive max pooling layer. Calculates the necessary window size such that its output has size(y)[1:N] == out.\n\nExpects as input an array with ndims(x) == N+2, i.e. 
channel and batch dimensions, after the N feature dimensions, where N = length(out).\n\nSee also MaxPool, AdaptiveMeanPool.\n\nExamples\n\njulia> xs = rand(Float32, 100, 100, 3, 50); # batch of 50 RGB images\n\njulia> AdaptiveMaxPool((25, 25))(xs) |> size\n(25, 25, 3, 50)\n\njulia> MaxPool((4,4))(xs) ≈ AdaptiveMaxPool((25, 25))(xs)\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.MaxPool","page":"Built-in Layers","title":"Flux.MaxPool","text":"MaxPool(window::NTuple; pad=0, stride=window)\n\nMax pooling layer, which replaces all pixels in a block of size window with one.\n\nExpects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(window).\n\nBy default the window size is also the stride in each dimension. The keyword pad accepts the same options as for the Conv layer, including SamePad().\n\nSee also Conv, MeanPool, AdaptiveMaxPool, GlobalMaxPool.\n\nExamples\n\njulia> xs = rand(Float32, 100, 100, 3, 50); # batch of 50 RGB images\n\njulia> m = Chain(Conv((5, 5), 3 => 7, pad=SamePad()), MaxPool((5, 5), pad=SamePad()))\nChain(\n Conv((5, 5), 3 => 7, pad=2), # 532 parameters\n MaxPool((5, 5), pad=2),\n)\n\njulia> m[1](xs) |> size\n(100, 100, 7, 50)\n\njulia> m(xs) |> size\n(20, 20, 7, 50)\n\njulia> layer = MaxPool((5,), pad=2, stride=(3,)) # one-dimensional window\nMaxPool((5,), pad=2, stride=3)\n\njulia> layer(rand(Float32, 100, 7, 50)) |> size\n(34, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.GlobalMaxPool","page":"Built-in Layers","title":"Flux.GlobalMaxPool","text":"GlobalMaxPool()\n\nGlobal max pooling layer.\n\nTransforms (w,h,c,b)-shaped input into (1,1,c,b)-shaped output, by performing max pooling on the complete (w,h)-shaped feature maps.\n\nSee also MaxPool, GlobalMeanPool.\n\njulia> xs = rand(Float32, 100, 100, 3, 50);\n\njulia> m = Chain(Conv((3,3), 3 => 7), GlobalMaxPool());\n\njulia> m(xs) |> size\n(1, 1, 7, 50)\n\njulia> 
GlobalMaxPool()(rand(3,5,7)) |> size # preserves 2 dimensions\n(1, 5, 7)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.AdaptiveMeanPool","page":"Built-in Layers","title":"Flux.AdaptiveMeanPool","text":"AdaptiveMeanPool(out::NTuple)\n\nAdaptive mean pooling layer. Calculates the necessary window size such that its output has size(y)[1:N] == out.\n\nExpects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(out).\n\nSee also MaxPool, AdaptiveMaxPool.\n\nExamples\n\njulia> xs = rand(Float32, 100, 100, 3, 50); # batch of 50 RGB images\n\njulia> AdaptiveMeanPool((25, 25))(xs) |> size\n(25, 25, 3, 50)\n\njulia> MeanPool((4,4))(xs) ≈ AdaptiveMeanPool((25, 25))(xs)\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.MeanPool","page":"Built-in Layers","title":"Flux.MeanPool","text":"MeanPool(window::NTuple; pad=0, stride=window)\n\nMean pooling layer, averaging all pixels in a block of size window.\n\nExpects as input an array with ndims(x) == N+2, i.e. channel and batch dimensions, after the N feature dimensions, where N = length(window).\n\nBy default the window size is also the stride in each dimension. 
The keyword pad accepts the same options as for the Conv layer, including SamePad().\n\nSee also Conv, MaxPool, AdaptiveMeanPool.\n\nExamples\n\njulia> xs = rand(Float32, 100, 100, 3, 50);\n\njulia> m = Chain(Conv((5,5), 3 => 7), MeanPool((5,5), pad=SamePad()))\nChain(\n Conv((5, 5), 3 => 7), # 532 parameters\n MeanPool((5, 5), pad=2),\n)\n\njulia> m[1](xs) |> size\n(96, 96, 7, 50)\n\njulia> m(xs) |> size\n(20, 20, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.GlobalMeanPool","page":"Built-in Layers","title":"Flux.GlobalMeanPool","text":"GlobalMeanPool()\n\nGlobal mean pooling layer.\n\nTransforms (w,h,c,b)-shaped input into (1,1,c,b)-shaped output, by performing mean pooling on the complete (w,h)-shaped feature maps.\n\njulia> xs = rand(Float32, 100, 100, 3, 50);\n\njulia> m = Chain(Conv((3,3), 3 => 7), GlobalMeanPool());\n\njulia> m(xs) |> size\n(1, 1, 7, 50)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Upsampling","page":"Built-in Layers","title":"Upsampling","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"The opposite of pooling, these layers increase the size of an array. They have no trainable parameters. ","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Upsample\nPixelShuffle","category":"page"},{"location":"models/layers/#Flux.Upsample","page":"Built-in Layers","title":"Flux.Upsample","text":"Upsample(mode = :nearest; [scale, size]) \nUpsample(scale, mode = :nearest)\n\nAn upsampling layer. One of two keywords must be given:\n\nIf scale is a number, this applies to all but the last two dimensions (channel and batch) of the input. It may also be a tuple, to control dimensions individually. 
Alternatively, keyword size accepts a tuple, to directly specify the leading dimensions of the output.\n\nCurrently supported upsampling modes and corresponding NNlib's methods are:\n\n:nearest -> NNlib.upsample_nearest \n:bilinear -> NNlib.upsample_bilinear\n:trilinear -> NNlib.upsample_trilinear\n\nExamples\n\njulia> m = Upsample(scale = (2, 3))\nUpsample(:nearest, scale = (2, 3))\n\njulia> m(ones(2, 2, 1, 1)) |> size\n(4, 6, 1, 1)\n\njulia> m = Upsample(:bilinear, size = (4, 5))\nUpsample(:bilinear, size = (4, 5))\n\njulia> m(ones(2, 2, 1, 1)) |> size\n(4, 5, 1, 1)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.PixelShuffle","page":"Built-in Layers","title":"Flux.PixelShuffle","text":"PixelShuffle(r::Int)\n\nPixel shuffling layer with upscale factor r. Usually used for generating higher resolution images while upscaling them.\n\nSee NNlib.pixel_shuffle.\n\nExamples\n\njulia> p = PixelShuffle(2);\n\njulia> xs = [2row + col + channel/10 for row in 1:2, col in 1:2, channel in 1:4, n in 1:1]\n2×2×4×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 3.1 4.1\n 5.1 6.1\n\n[:, :, 2, 1] =\n 3.2 4.2\n 5.2 6.2\n\n[:, :, 3, 1] =\n 3.3 4.3\n 5.3 6.3\n\n[:, :, 4, 1] =\n 3.4 4.4\n 5.4 6.4\n\njulia> p(xs)\n4×4×1×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 3.1 3.3 4.1 4.3\n 3.2 3.4 4.2 4.4\n 5.1 5.3 6.1 6.3\n 5.2 5.4 6.2 6.4\n\njulia> xs = [3row + col + channel/10 for row in 1:2, col in 1:3, channel in 1:4, n in 1:1]\n2×3×4×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 4.1 5.1 6.1\n 7.1 8.1 9.1\n\n[:, :, 2, 1] =\n 4.2 5.2 6.2\n 7.2 8.2 9.2\n\n[:, :, 3, 1] =\n 4.3 5.3 6.3\n 7.3 8.3 9.3\n\n[:, :, 4, 1] =\n 4.4 5.4 6.4\n 7.4 8.4 9.4\n\njulia> p(xs)\n4×6×1×1 Array{Float64, 4}:\n[:, :, 1, 1] =\n 4.1 4.3 5.1 5.3 6.1 6.3\n 4.2 4.4 5.2 5.4 6.2 6.4\n 7.1 7.3 8.1 8.3 9.1 9.3\n 7.2 7.4 8.2 8.4 9.2 9.4\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Embedding-Vectors","page":"Built-in Layers","title":"Embedding 
Vectors","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"These layers accept an index, and return a vector (or several indices, and several vectors). The possible embedding vectors are learned parameters.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Flux.Embedding\nFlux.EmbeddingBag","category":"page"},{"location":"models/layers/#Flux.Embedding","page":"Built-in Layers","title":"Flux.Embedding","text":"Embedding(in => out; init=randn32)\n\nA lookup table that stores embeddings of dimension out for a vocabulary of size in, as a trainable matrix.\n\nThis layer is often used to store word embeddings and retrieve them using indices. The input to the layer can be a vocabulary index in 1:in, an array of indices, or the corresponding onehot encoding.\n\nFor indices x, the result is of size (out, size(x)...), allowing several batch dimensions. For one-hot ohx, the result is of size (out, size(ohx)[2:end]...).\n\nExamples\n\njulia> emb = Embedding(26 => 4, init=Flux.identity_init(gain=22))\nEmbedding(26 => 4) # 104 parameters\n\njulia> emb(2) # one column of emb.weight (here not random!)\n4-element Vector{Float32}:\n 0.0\n 22.0\n 0.0\n 0.0\n\njulia> emb([3, 1, 20, 14, 4, 15, 7]) # vocabulary indices, in 1:26\n4×7 Matrix{Float32}:\n 0.0 22.0 0.0 0.0 0.0 0.0 0.0\n 0.0 0.0 0.0 0.0 0.0 0.0 0.0\n 22.0 0.0 0.0 0.0 0.0 0.0 0.0\n 0.0 0.0 0.0 0.0 22.0 0.0 0.0\n\njulia> ans == emb(Flux.onehotbatch(\"cat&dog\", 'a':'z', 'n'))\ntrue\n\njulia> emb(rand(1:26, (10, 1, 12))) |> size # three batch dimensions\n(4, 10, 1, 12)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.EmbeddingBag","page":"Built-in Layers","title":"Flux.EmbeddingBag","text":"EmbeddingBag(in => out, reduction=mean; init=Flux.randn32)\n\nA lookup table that stores embeddings of dimension out for a vocabulary of size in. 
Differs from Embedding in that, instead of acting on a single vocabulary index, it always acts on a vector of indices, which it calls a \"bag\". Their individual embedding vectors are reduced to one, using mean or some other function.\n\nInstead of acting on one \"bag\", such as x::Vector{Int}, the layer can also act on several:\n\nActing on a vector of \"bags\", it produces a matrix whose columns are the reduced vectors. More generally on x::Array{Vector{Int}}, its output is of size (out, size(x)...).\nAny higher-rank array of integers is interpreted as a collection of \"bags\" each along the first dimension. Thus the output is mapslices(e, x; dims=1) when e::EmbeddingBag and x::Array{Int,N}. This method is more efficient, but requires that all \"bags\" have the same length.\nA vector of \"bags\" may also be produced by splitting a vector of indices at specified points. For this case the layer takes two inputs, both vectors of integers. See details below.\n\nThe \"bag\" may equivalently be represented as a OneHotMatrix. A collection of these, or one higher-rank OneHotArray, again produce a stack of embeddings. 
See details below.\n\nExamples\n\njulia> vocab_size = 26; # embed into 3 dimensions, with non-random vectors:\n\njulia> eb = EmbeddingBag(vocab_size => 3, init=Flux.identity_init(gain=100))\nEmbeddingBag(26 => 3) # 78 parameters\n\njulia> eb([2]) # one bag of 1 item\n3-element Vector{Float32}:\n 0.0\n 100.0\n 0.0\n\njulia> eb([3,3,1]) # one bag of 3 items, one mean embedding\n3-element Vector{Float32}:\n 33.333332\n 0.0\n 66.666664\n\njulia> eb([[3,1,3], [2,1]]) # two bags\n3×2 Matrix{Float32}:\n 33.3333 50.0\n 0.0 50.0\n 66.6667 0.0\n\njulia> eb([1 1 1 1; 1 2 3 4]) # 4 bags each of 2 items, eachcol([1 1 1 1; 1 2 3 4])\n3×4 Matrix{Float32}:\n 100.0 50.0 50.0 50.0\n 0.0 50.0 0.0 0.0\n 0.0 0.0 50.0 0.0\n\njulia> eb(rand(1:26, 10, 5, 5)) |> size # 25 bags each of 10 items\n(3, 5, 5)\n\nAnother way to specify \"many bags of many items\" is to provide a vector data (each in 1:in) and a vector at stating where to split that up into \"bags\". The first bag starts with data[at[1]], the second at data[at[2]], and so on, with no overlaps and nothing left out (thus it requires at[1]==1).\n\njulia> data = [11, 1, 12, 2, 13, 3, 14];\n\njulia> Flux._splitat(data, [1, 4]) |> println # internal function, makes data[1:3], data[4:end]\n[[11, 1, 12], [2, 13, 3, 14]]\n\njulia> eb(data, [1, 4]) # two bags, of 3 and 4 items\n3×2 Matrix{Float32}:\n 33.3333 0.0\n 0.0 25.0\n 0.0 25.0\n\nFinally, each bag may also be represented as a OneHotMatrix.\n\njulia> eb(Flux.onehotbatch(\"bba\", 'a':'z')) # same as [2,2,1], one bag of 3 items\n3-element Vector{Float32}:\n 33.333332\n 66.666664\n 0.0\n\njulia> eb([Flux.onehotbatch(\"bba\", 'a':'z'), Flux.onehotbatch(\"cc\", 'a':'z')]) # two bags\n3×2 Matrix{Float32}:\n 33.3333 0.0\n 66.6667 0.0\n 0.0 100.0\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#man-dataflow-layers","page":"Built-in Layers","title":"Dataflow Layers, or Containers","text":"","category":"section"},{"location":"models/layers/","page":"Built-in 
Layers","title":"Built-in Layers","text":"The basic Chain(F, G, H) applies the layers it contains in sequence, equivalent to H ∘ G ∘ F. Flux has some other layers which contain layers, but connect them up in a more complicated way: SkipConnection allows ResNet's residual connection.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Chain\nFlux.activations\nMaxout\nSkipConnection\nParallel\nPairwiseFusion","category":"page"},{"location":"models/layers/#Flux.Chain","page":"Built-in Layers","title":"Flux.Chain","text":"Chain(layers...)\nChain(name = layer, ...)\n\nCollects multiple layers / functions to be called in sequence on a given input. Supports indexing and slicing, m[2] or m[1:end-1], and if names are given, m[:name] == m[1] etc.\n\nExamples\n\njulia> m = Chain(x -> x^2, x -> x+1);\n\njulia> m(5) == 26\ntrue\n\njulia> m = Chain(Dense(10 => 5, tanh), Dense(5 => 2));\n\njulia> x = rand32(10, 32);\n\njulia> m(x) == m[2](m[1](x))\ntrue\n\njulia> m2 = Chain(enc = Chain(Flux.flatten, Dense(10 => 5, tanh)), \n dec = Dense(5 => 2));\n\njulia> m2(x) == (m2[:dec] ∘ m2[:enc])(x)\ntrue\n\nFor large models, there is a special type-unstable path which can reduce compilation times. This can be used by supplying a vector of layers Chain([layer1, layer2, ...]). 
This feature is somewhat experimental, beware!\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.activations","page":"Built-in Layers","title":"Flux.activations","text":"activations(c::Chain, input)\n\nLike calling a Chain, but saves the result of each layer as an output.\n\nExamples\n\njulia> using Flux: activations\n\njulia> c = Chain(x -> x + 1, x -> x * 2, x -> x ^ 3);\n\njulia> activations(c, 1)\n(2, 4, 64)\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Flux.Maxout","page":"Built-in Layers","title":"Flux.Maxout","text":"Maxout(layers...)\nMaxout(f, n_alts)\n\nThis contains a number of internal layers, each of which receives the same input. Its output is the elementwise maximum of the internal layers' outputs.\n\nInstead of defining layers individually, you can provide a zero-argument function which constructs them, and the number to construct.\n\nMaxout over linear dense layers satisfies the universal approximation theorem. See Goodfellow, Warde-Farley, Mirza, Courville & Bengio \"Maxout Networks\" https://arxiv.org/abs/1302.4389.\n\nSee also Parallel to reduce with other operators.\n\nExamples\n\njulia> m = Maxout(x -> abs2.(x), x -> x .* 3);\n\njulia> m([-2 -1 0 1 2])\n1×5 Matrix{Int64}:\n 4 1 0 3 6\n\njulia> m3 = Maxout(() -> Dense(5 => 7, tanh), 3)\nMaxout(\n Dense(5 => 7, tanh), # 42 parameters\n Dense(5 => 7, tanh), # 42 parameters\n Dense(5 => 7, tanh), # 42 parameters\n) # Total: 6 arrays, 126 parameters, 888 bytes.\n\njulia> Flux.outputsize(m3, (5, 11))\n(7, 11)\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.SkipConnection","page":"Built-in Layers","title":"Flux.SkipConnection","text":"SkipConnection(layer, connection)\n\nCreate a skip connection which consists of a layer or Chain of consecutive layers and a shortcut connection linking the block's input to the output through a user-supplied 2-argument callable. 
The first argument to the callable will be propagated through the given layer while the second is the unchanged, \"skipped\" input.\n\nThe simplest \"ResNet\"-type connection is just SkipConnection(layer, +). Here is a more complicated example:\n\njulia> m = Conv((3,3), 4 => 7, pad=(1,1));\n\njulia> x = ones(Float32, 5, 5, 4, 10);\n\njulia> size(m(x)) == (5, 5, 7, 10)\ntrue\n\njulia> sm = SkipConnection(m, (mx, x) -> cat(mx, x, dims=3));\n\njulia> size(sm(x)) == (5, 5, 11, 10)\ntrue\n\nSee also Parallel, Maxout.\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.Parallel","page":"Built-in Layers","title":"Flux.Parallel","text":"Parallel(connection, layers...)\nParallel(connection; name = layer, ...)\n\nCreate a layer which passes an input array to each path in layers, before reducing the output with connection.\n\nCalled with one input x, this is equivalent to connection([l(x) for l in layers]...). If called with multiple inputs, one is passed to each layer, thus Parallel(+, f, g)(x, y) = f(x) + g(y).\n\nLike Chain, its sub-layers may be given names using the keyword constructor. 
These can be accessed by indexing: m[1] == m[:name] is the first layer.\n\nSee also SkipConnection which is Parallel with one identity, and Maxout which reduces by broadcasting max.\n\nExamples\n\njulia> model = Chain(Dense(3 => 5),\n Parallel(vcat, Dense(5 => 4), Chain(Dense(5 => 7), Dense(7 => 4))),\n Dense(8 => 17));\n\njulia> model(rand32(3)) |> size\n(17,)\n\njulia> model2 = Parallel(+; α = Dense(10, 2, tanh), β = Dense(5, 2))\nParallel(\n +,\n α = Dense(10 => 2, tanh), # 22 parameters\n β = Dense(5 => 2), # 12 parameters\n) # Total: 4 arrays, 34 parameters, 392 bytes.\n\njulia> model2(rand32(10), rand32(5)) |> size\n(2,)\n\njulia> model2[:α](rand32(10)) |> size\n(2,)\n\njulia> model2[:β] == model2[2]\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.PairwiseFusion","page":"Built-in Layers","title":"Flux.PairwiseFusion","text":"PairwiseFusion(connection, layers...)\n\nArguments\n\nconnection: A function taking 2 inputs and combining them into a single output \nlayers: The layers whose outputs are combined\n\nInputs\n\nThis layer behaves differently based on input type:\n\nIf input x is a tuple of length N (or the input is xs with N x's), matching the number of layers, \n\nthen each layer receives a new input x[i] combined with the previous output y[i-1] using connection. Thus (y1, y2, y3) = PairwiseFusion(connection, layer1, layer2, layer3)((x1, x2, x3)) may be drawn as:\n\nx1 → layer1 → y1 ↘\n connection → layer2 → y2 ↘\n x2 ↗ connection → layer3 → y3\n x3 ↗\n\n... or written as:\n\ny1 = layer1(x1)\ny2 = layer2(connection(y1, x2))\ny3 = layer3(connection(y2, x3))\n\nWith just one input, each layer receives the same x combined with the previous output. 
Thus y = PairwiseFusion(connection, layers...)(x) obeys:\n\ny[1] == layers[1](x)\nfor i in 2:length(layers)\n y[i] == connection(layers[i](y[i-1]), x)\nend\n\nReturns\n\nA tuple of length N with the output of each fusion ((y1, y2, ..., yN) in the example above).\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Recurrent-Models","page":"Built-in Layers","title":"Recurrent Models","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Much like the core layers above, but can be used to process sequence data (as well as other kinds of structured data).","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"RNN\nLSTM\nGRU\nGRUv3\nFlux.Recur\nFlux.reset!","category":"page"},{"location":"models/layers/#Flux.RNN","page":"Built-in Layers","title":"Flux.RNN","text":"RNN(in => out, σ = tanh)\n\nThe most basic recurrent layer; essentially acts as a Dense layer, but with the output fed back into the input each time step.\n\nThe arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as a in x B matrix and outputs a vector of length out or a batch of vectors of size out x B.\n\nThis constructor is syntactic sugar for Recur(RNNCell(a...)), and so RNNs are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. See the examples below.\n\nExamples\n\njulia> r = RNN(3 => 5)\nRecur(\n RNNCell(3 => 5, tanh), # 50 parameters\n) # Total: 4 trainable arrays, 50 parameters,\n # plus 1 non-trainable, 5 parameters, summarysize 432 bytes.\n\njulia> r(rand(Float32, 3)) |> size\n(5,)\n\njulia> Flux.reset!(r);\n\njulia> r(rand(Float32, 3, 10)) |> size # batch size of 10\n(5, 10)\n\nwarning: Batch size changes\nFailing to call reset! 
when the input batch size changes can lead to unexpected behavior. See the following example:julia> r = RNN(3 => 5)\nRecur(\n RNNCell(3 => 5, tanh), # 50 parameters\n) # Total: 4 trainable arrays, 50 parameters,\n # plus 1 non-trainable, 5 parameters, summarysize 432 bytes.\n\njulia> r.state |> size\n(5, 1)\n\njulia> r(rand(Float32, 3)) |> size\n(5,)\n\njulia> r.state |> size\n(5, 1)\n\njulia> r(rand(Float32, 3, 10)) |> size # batch size of 10\n(5, 10)\n\njulia> r.state |> size # state shape has changed\n(5, 10)\n\njulia> r(rand(Float32, 3)) |> size # erroneously outputs a length 5*10 = 50 vector.\n(50,)\n\nNote:\n\nRNNCells can be constructed directly by specifying the non-linear function, the Wi and Wh internal matrices, a bias vector b, and a learnable initial state state0. The Wi and Wh matrices do not need to be the same type, but if Wh is dxd, then Wi should be of shape dxN.\n\njulia> using LinearAlgebra\n\njulia> r = Flux.Recur(Flux.RNNCell(tanh, rand(5, 4), Tridiagonal(rand(5, 5)), rand(5), rand(5, 1)))\n\njulia> r(rand(4, 10)) |> size # batch size of 10\n(5, 10)\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Flux.LSTM","page":"Built-in Layers","title":"Flux.LSTM","text":"LSTM(in => out)\n\nLong Short Term Memory recurrent layer. Behaves like an RNN but generally exhibits a longer memory span over sequences.\n\nThe arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as a in x B matrix and outputs a vector of length out or a batch of vectors of size out x B.\n\nThis constructor is syntactic sugar for Recur(LSTMCell(a...)), and so LSTMs are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. 
See the examples below.\n\nSee this article for a good overview of the internals.\n\nExamples\n\njulia> l = LSTM(3 => 5)\nRecur(\n LSTMCell(3 => 5), # 190 parameters\n) # Total: 5 trainable arrays, 190 parameters,\n # plus 2 non-trainable, 10 parameters, summarysize 1.062 KiB.\n\njulia> l(rand(Float32, 3)) |> size\n(5,)\n\njulia> Flux.reset!(l);\n\njulia> l(rand(Float32, 3, 10)) |> size # batch size of 10\n(5, 10)\n\nwarning: Batch size changes\nFailing to call reset! when the input batch size changes can lead to unexpected behavior. See the example in RNN.\n\nNote:\n\nLSTMCells can be constructed directly by specifying the non-linear function, the Wi and Wh internal matrices, a bias vector b, and a learnable initial state state0. The Wi and Wh matrices do not need to be the same type. See the example in RNN.\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Flux.GRU","page":"Built-in Layers","title":"Flux.GRU","text":"GRU(in => out)\n\nGated Recurrent Unit layer. Behaves like an RNN but generally exhibits a longer memory span over sequences. This implements the variant proposed in v1 of the referenced paper.\n\nThe integer arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as a in x B matrix and outputs a vector of length out or a batch of vectors of size out x B.\n\nThis constructor is syntactic sugar for Recur(GRUCell(a...)), and so GRUs are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. 
See the examples below.\n\nSee this article for a good overview of the internals.\n\nExamples\n\njulia> g = GRU(3 => 5)\nRecur(\n GRUCell(3 => 5), # 140 parameters\n) # Total: 4 trainable arrays, 140 parameters,\n # plus 1 non-trainable, 5 parameters, summarysize 792 bytes.\n\njulia> g(rand(Float32, 3)) |> size\n(5,)\n\njulia> Flux.reset!(g);\n\njulia> g(rand(Float32, 3, 10)) |> size # batch size of 10\n(5, 10)\n\nwarning: Batch size changes\nFailing to call reset! when the input batch size changes can lead to unexpected behavior. See the example in RNN.\n\nNote:\n\nGRUCells can be constructed directly by specifying the non-linear function, the Wi and Wh internal matrices, a bias vector b, and a learnable initial state state0. The Wi and Wh matrices do not need to be the same type. See the example in RNN.\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Flux.GRUv3","page":"Built-in Layers","title":"Flux.GRUv3","text":"GRUv3(in => out)\n\nGated Recurrent Unit layer. Behaves like an RNN but generally exhibits a longer memory span over sequences. This implements the variant proposed in v3 of the referenced paper.\n\nThe arguments in and out describe the size of the feature vectors passed as input and as output. That is, it accepts a vector of length in or a batch of vectors represented as a in x B matrix and outputs a vector of length out or a batch of vectors of size out x B.\n\nThis constructor is syntactic sugar for Recur(GRUv3Cell(a...)), and so GRUv3s are stateful. Note that the state shape can change depending on the inputs, and so it is good to reset! the model between inference calls if the batch size changes. 
See the examples below.\n\nSee this article for a good overview of the internals.\n\nExamples\n\njulia> g = GRUv3(3 => 5)\nRecur(\n GRUv3Cell(3 => 5), # 140 parameters\n) # Total: 5 trainable arrays, 140 parameters,\n # plus 1 non-trainable, 5 parameters, summarysize 848 bytes.\n\njulia> g(rand(Float32, 3)) |> size\n(5,)\n\njulia> Flux.reset!(g);\n\njulia> g(rand(Float32, 3, 10)) |> size # batch size of 10\n(5, 10)\n\nwarning: Batch size changes\nFailing to call reset! when the input batch size changes can lead to unexpected behavior. See the example in RNN.\n\nNote:\n\nGRUv3Cells can be constructed directly by specifying the non-linear function, the Wi, Wh, and Wh_h internal matrices, a bias vector b, and a learnable initial state state0. The Wi, Wh, and Wh_h matrices do not need to be the same type. See the example in RNN.\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Flux.Recur","page":"Built-in Layers","title":"Flux.Recur","text":"Recur(cell)\n\nRecur takes a recurrent cell and makes it stateful, managing the hidden state in the background. 
cell should be a model of the form:\n\nh, y = cell(h, x...)\n\nFor example, here's a recurrent network that keeps a running total of its inputs:\n\nExamples\n\njulia> accum(h, x) = (h + x, x)\naccum (generic function with 1 method)\n\njulia> rnn = Flux.Recur(accum, 0)\nRecur(accum)\n\njulia> rnn(2) \n2\n\njulia> rnn(3)\n3\n\njulia> rnn.state\n5\n\nFolding over a 3d Array of dimensions (features, batch, time) is also supported:\n\njulia> accum(h, x) = (h .+ x, x)\naccum (generic function with 1 method)\n\njulia> rnn = Flux.Recur(accum, zeros(Int, 1, 1))\nRecur(accum)\n\njulia> rnn([2])\n1-element Vector{Int64}:\n 2\n\njulia> rnn([3])\n1-element Vector{Int64}:\n 3\n\njulia> rnn.state\n1×1 Matrix{Int64}:\n 5\n\njulia> out = rnn(reshape(1:10, 1, 1, :)); # apply to a sequence of (features, batch, time)\n\njulia> out |> size\n(1, 1, 10)\n\njulia> vec(out)\n10-element Vector{Int64}:\n 1\n 2\n 3\n 4\n 5\n 6\n 7\n 8\n 9\n 10\n\njulia> rnn.state\n1×1 Matrix{Int64}:\n 60\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.reset!","page":"Built-in Layers","title":"Flux.reset!","text":"reset!(rnn)\n\nReset the hidden state of a recurrent layer back to its original value.\n\nAssuming you have a Recur layer rnn, this is roughly equivalent to:\n\nrnn.state = hidden(rnn.cell)\n\nExamples\n\njulia> r = Flux.RNNCell(relu, ones(1,1), zeros(1,1), ones(1,1), zeros(1,1)); # users should use the RNN wrapper struct instead\n\njulia> y = Flux.Recur(r, ones(1,1));\n\njulia> y.state\n1×1 Matrix{Float64}:\n 1.0\n\njulia> y(ones(1,1)) # relu(1*1 + 1)\n1×1 Matrix{Float64}:\n 2.0\n\njulia> y.state\n1×1 Matrix{Float64}:\n 2.0\n\njulia> Flux.reset!(y)\n1×1 Matrix{Float64}:\n 0.0\n\njulia> y.state\n1×1 Matrix{Float64}:\n 0.0\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Normalisation-and-Regularisation","page":"Built-in Layers","title":"Normalisation & Regularisation","text":"","category":"section"},{"location":"models/layers/","page":"Built-in 
Layers","title":"Built-in Layers","text":"These layers don't affect the structure of the network but may improve training times or reduce overfitting. Some of them contain trainable parameters, while others do not.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"BatchNorm\nDropout\nAlphaDropout\nLayerNorm\nInstanceNorm\nGroupNorm\nFlux.normalise","category":"page"},{"location":"models/layers/#Flux.BatchNorm","page":"Built-in Layers","title":"Flux.BatchNorm","text":"BatchNorm(channels::Integer, λ=identity;\n initβ=zeros32, initγ=ones32,\n affine=true, track_stats=true, active=nothing,\n eps=1f-5, momentum= 0.1f0)\n\nBatch Normalization layer. channels should be the size of the channel dimension in your data (see below).\n\nGiven an array with N dimensions, call the N-1th the channel dimension. For a batch of feature vectors this is just the data dimension, for WHCN images it's the usual channel dimension.\n\nBatchNorm computes the mean and variance for each D_1×...×D_{N-2}×1×D_N input slice and normalises the input accordingly.\n\nIf affine=true, it also applies a shift and a rescale to the input through learnable per-channel bias β and scale γ parameters.\n\nAfter normalisation, elementwise activation λ is applied.\n\nIf track_stats=true, accumulates mean and var statistics in training phase that will be used to renormalize the input in test phase.\n\nUse testmode! during inference.\n\nExamples\n\njulia> using Statistics\n\njulia> xs = rand(3, 3, 3, 2); # a batch of 2 images, each having 3 channels\n\njulia> m = BatchNorm(3);\n\njulia> Flux.trainmode!(m);\n\njulia> isapprox(std(m(xs)), 1, atol=0.1) && std(xs) != std(m(xs))\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.Dropout","page":"Built-in Layers","title":"Flux.Dropout","text":"Dropout(p; [dims, rng, active])\n\nLayer implementing dropout with the given probability. This is used as a regularisation, i.e. 
to reduce overfitting.\n\nWhile training, it sets each input to 0 (with probability p) or else scales it by 1 / (1 - p), using the NNlib.dropout function. While testing, it has no effect.\n\nBy default the mode will switch automatically, but it can also be controlled manually via Flux.testmode!, or by passing keyword active=true for training mode.\n\nBy default every input is treated independently. With the dims keyword, instead it takes a random choice only along that dimension. For example Dropout(p; dims = 3) will randomly zero out entire channels on WHCN input (also called 2D dropout).\n\nKeyword rng lets you specify a custom random number generator. (Only supported on the CPU.)\n\nExamples\n\njulia> m = Chain(Dense(ones(3,2)), Dropout(0.4))\nChain(\n Dense(2 => 3), # 9 parameters\n Dropout(0.4),\n)\n\njulia> m(ones(2, 7)) # test mode, no effect\n3×7 Matrix{Float64}:\n 2.0 2.0 2.0 2.0 2.0 2.0 2.0\n 2.0 2.0 2.0 2.0 2.0 2.0 2.0\n 2.0 2.0 2.0 2.0 2.0 2.0 2.0\n\njulia> Flux.trainmode!(m) # equivalent to use within gradient\nChain(\n Dense(2 => 3), # 9 parameters\n Dropout(0.4, active=true),\n)\n\njulia> m(ones(2, 7))\n3×7 Matrix{Float64}:\n 0.0 0.0 3.33333 0.0 0.0 0.0 0.0\n 3.33333 0.0 3.33333 0.0 3.33333 0.0 3.33333\n 3.33333 3.33333 0.0 3.33333 0.0 0.0 3.33333\n\njulia> y = m(ones(2, 10_000));\n\njulia> using Statistics\n\njulia> mean(y) # is about 2.0, same as in test mode\n1.9989999999999961\n\njulia> mean(iszero, y) # is about 0.4\n0.4003\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.AlphaDropout","page":"Built-in Layers","title":"Flux.AlphaDropout","text":"AlphaDropout(p; [rng, active])\n\nA dropout layer. Used in Self-Normalizing Neural Networks. The AlphaDropout layer ensures that mean and variance of activations remain the same as before.\n\nDoes nothing to the input once testmode! 
is true.\n\nExamples\n\njulia> using Statistics\n\njulia> x = randn32(1000,1);\n\njulia> m = Chain(Dense(1000 => 1000, selu), AlphaDropout(0.2));\n\njulia> Flux.trainmode!(m);\n\njulia> y = m(x);\n\njulia> isapprox(std(x), std(y), atol=0.2)\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.LayerNorm","page":"Built-in Layers","title":"Flux.LayerNorm","text":"LayerNorm(size..., λ=identity; affine=true, eps=1f-5)\n\nA normalisation layer designed to be used with recurrent hidden states. The argument size should be an integer or a tuple of integers.\n\nIn the forward pass, the layer normalises the mean and standard deviation of the input, then applies the elementwise activation λ. The input is normalised along the first length(size) dimensions for tuple size, and along the first dimension for integer size. The input is expected to have first dimensions' size equal to size.\n\nIf affine=true, it also applies a learnable shift and rescaling using the Scale layer.\n\nSee also BatchNorm, InstanceNorm, GroupNorm, and normalise.\n\nExamples\n\njulia> using Statistics\n\njulia> xs = rand(3, 3, 3, 2); # a batch of 2 images, each having 3 channels\n\njulia> m = LayerNorm(3);\n\njulia> y = m(xs);\n\njulia> isapprox(std(y, dims=1:3), ones(1, 1, 1, 2), atol=0.1) && std(y, dims=1:3) != std(xs, dims=1:3)\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.InstanceNorm","page":"Built-in Layers","title":"Flux.InstanceNorm","text":"InstanceNorm(channels::Integer, λ=identity;\n initβ=zeros32, initγ=ones32,\n affine=false, track_stats=false,\n eps=1f-5, momentum=0.1f0)\n\nInstance Normalization layer. channels should be the size of the channel dimension in your data (see below).\n\nGiven an array with N > 2 dimensions, call the N-1th the channel dimension. 
For WHCN images it's the usual channel dimension.\n\nInstanceNorm computes the mean and variance for each D_1×...×D_{N-2}×1×1 input slice and normalises the input accordingly.\n\nIf affine=true, it also applies a shift and a rescale to the input through learnable per-channel bias β and scale γ parameters.\n\nIf track_stats=true, accumulates mean and var statistics during the training phase that will be used to renormalise the input in the test phase.\n\nWarning: the defaults for affine and track_stats used to be true in previous Flux versions (< v0.12).\n\nExamples\n\njulia> using Statistics\n\njulia> xs = rand(3, 3, 3, 2); # a batch of 2 images, each having 3 channels\n\njulia> m = InstanceNorm(3);\n\njulia> y = m(xs);\n\njulia> isapprox(std(y, dims=1:2), ones(1, 1, 3, 2), atol=0.2) && std(y, dims=1:2) != std(xs, dims=1:2)\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.GroupNorm","page":"Built-in Layers","title":"Flux.GroupNorm","text":"GroupNorm(channels::Int, G::Int, λ = identity;\n initβ = zeros32,\n initγ = ones32,\n affine = true,\n eps = 1f-5,\n momentum = 0.1f0)\n\nGroup Normalization layer.\n\nchannels is the number of channels, the channel dimension of your input. For an array of N dimensions, the N-1th index is the channel dimension.\n\nG is the number of groups along which the statistics are computed. The number of channels must be an integer multiple of the number of groups.\n\nGiven an array with N > 2 dimensions, call the N-1th the channel dimension. 
For WHCN images it's the usual channel dimension.\n\nIf affine=true, it also applies a shift and a rescale to the input through learnable per-channel bias β and scale γ parameters.\n\nExamples\n\njulia> using Statistics\n\njulia> xs = rand(3, 3, 4, 2); # a batch of 2 images, each having 4 channels\n\njulia> m = GroupNorm(4, 2);\n\njulia> y = m(xs);\n\njulia> isapprox(std(y[:, :, 1:2, 1]), 1, atol=0.1) && std(xs[:, :, 1:2, 1]) != std(y[:, :, 1:2, 1])\ntrue\n\njulia> isapprox(std(y[:, :, 3:4, 2]), 1, atol=0.1) && std(xs[:, :, 3:4, 2]) != std(y[:, :, 3:4, 2])\ntrue\n\n\n\n\n\n","category":"type"},{"location":"models/layers/#Flux.normalise","page":"Built-in Layers","title":"Flux.normalise","text":"normalise(x; dims=ndims(x), eps=1e-5)\n\nNormalise x to mean 0 and standard deviation 1 across the dimension(s) given by dims. By default, dims is the last dimension. eps is a small term added to the denominator for numerical stability.\n\nExamples\n\njulia> using Statistics\n\njulia> x = [90, 100, 110, 130, 70];\n\njulia> mean(x), std(x; corrected=false)\n(100.0, 20.0)\n\njulia> y = Flux.normalise(x)\n5-element Vector{Float64}:\n -0.49999975000012503\n 0.0\n 0.49999975000012503\n 1.499999250000375\n -1.499999250000375\n\njulia> isapprox(std(y; corrected=false), 1, atol=1e-5)\ntrue\n\njulia> x = rand(10:100, 10, 10);\n\njulia> y = Flux.normalise(x, dims=1);\n\njulia> isapprox(std(y; dims=1, corrected=false), ones(1, 10), atol=1e-5)\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/layers/#Test-vs.-Train","page":"Built-in Layers","title":"Test vs. Train","text":"","category":"section"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"Several normalisation layers behave differently under training and inference (testing). By default, Flux will automatically determine when a layer evaluation is part of training or inference. 
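For example (an illustrative sketch, not part of the original page), a Dropout layer left in its default automatic mode passes data through unchanged in a plain forward call, but applies dropout when the same call happens inside gradient:\n\nusing Flux\n\nd = Dropout(0.5);\n\nd(ones(Float32, 100))                          # plain call: test mode, input returned unchanged\n\ngradient(x -> sum(d(x)), ones(Float32, 100))   # inside gradient: train mode, dropout is applied\n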
","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"warning: Warning\nThis automatic train/test detection works best with Zygote, the default automatic differentiation package. It may not work with other packages such as Tracker, Yota, or ForwardDiff.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"The functions Flux.trainmode! and Flux.testmode! let you manually specify which behaviour you want. When called on a model, they will place all layers within the model into the specified mode.","category":"page"},{"location":"models/layers/","page":"Built-in Layers","title":"Built-in Layers","text":"testmode!(::Any)\ntestmode!(::Any, ::Any)\ntrainmode!","category":"page"},{"location":"models/layers/#Flux.testmode!-Tuple{Any}","page":"Built-in Layers","title":"Flux.testmode!","text":"testmode!(model, [mode]) -> model\n\nSet a layer, or all layers in a model, to test mode. This disables the effect of Dropout and some other regularisation layers.\n\nIf you manually set a model into test mode, you need to manually place it back into train mode during the training phase, using trainmode!.\n\nThere is an optional second argument, which takes a symbol :auto to reset all layers back to the default automatic mode.\n\nExample\n\njulia> d = Dropout(0.3)\nDropout(0.3)\n\njulia> testmode!(d) # dropout is now always disabled\nDropout(0.3, active=false)\n\njulia> trainmode!(d) # dropout is now always enabled\nDropout(0.3, active=true)\n\njulia> testmode!(d, :auto) # back to default\nDropout(0.3)\n\n\n\n\n\n","category":"method"},{"location":"models/layers/#Flux.testmode!-Tuple{Any, Any}","page":"Built-in Layers","title":"Flux.testmode!","text":"testmode!(model, inactive)\n\nThis two-argument method is largely internal. It recurses into the model, until a method like testmode!(d::Dropout, inactive) alters the activity of a layer. Custom layers can support manual testmode! 
/ trainmode! switching by defining such a method.\n\nPossible values of inactive are:\n\ntrue for testing, i.e. active=false\nfalse for training, same as trainmode!(m)\n:auto or nothing for Flux to detect training automatically.\n\ncompat: Compat\nThis method may be removed in a future breaking change, to separate the user-facing testmode! from the internal recursion.\n\n\n\n\n\n","category":"method"},{"location":"models/layers/#Flux.trainmode!","page":"Built-in Layers","title":"Flux.trainmode!","text":"trainmode!(model) -> model\n\nSet a layer, or all layers in a model, to training mode. Opposite to testmode!, see further details there.\n\n\n\n\n\ntrainmode!(m, active)\n\nwarning: Warning\nThis two-argument method is deprecated.\n\nPossible values of active are:\n\ntrue for training, or \nfalse for testing, same as testmode!(m)\n:auto or nothing for Flux to detect training automatically.\n\n\n\n\n\n","category":"function"},{"location":"models/quickstart/#man-quickstart","page":"Quick Start","title":"A Neural Network in One Minute","text":"","category":"section"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"If you have used neural networks before, then this simple example might be helpful for seeing how the major parts of Flux work together. 
Try pasting the code into the REPL prompt.","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"If you haven't, then you might prefer the Fitting a Straight Line page.","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"# This will prompt if necessary to install everything, including CUDA:\nusing Flux, CUDA, Statistics, ProgressMeter\n\n# Generate some data for the XOR problem: vectors of length 2, as columns of a matrix:\nnoisy = rand(Float32, 2, 1000) # 2×1000 Matrix{Float32}\ntruth = [xor(col[1]>0.5, col[2]>0.5) for col in eachcol(noisy)] # 1000-element Vector{Bool}\n\n# Define our model, a multi-layer perceptron with one hidden layer of size 3:\nmodel = Chain(\n Dense(2 => 3, tanh), # activation function inside layer\n BatchNorm(3),\n Dense(3 => 2),\n softmax) |> gpu # move model to GPU, if available\n\n# The model encapsulates parameters, randomly initialised. Its initial output is:\nout1 = model(noisy |> gpu) |> cpu # 2×1000 Matrix{Float32}\n\n# To train the model, we use batches of 64 samples, and one-hot encoding:\ntarget = Flux.onehotbatch(truth, [true, false]) # 2×1000 OneHotMatrix\nloader = Flux.DataLoader((noisy, target) |> gpu, batchsize=64, shuffle=true);\n# 16-element DataLoader with first element: (2×64 Matrix{Float32}, 2×64 OneHotMatrix)\n\noptim = Flux.setup(Flux.Adam(0.01), model) # will store optimiser momentum, etc.\n\n# Training loop, using the whole data set 1000 times:\nlosses = []\n@showprogress for epoch in 1:1_000\n for (x, y) in loader\n loss, grads = Flux.withgradient(model) do m\n # Evaluate model and loss inside gradient context:\n y_hat = m(x)\n Flux.crossentropy(y_hat, y)\n end\n Flux.update!(optim, model, grads[1])\n push!(losses, loss) # logging, outside gradient context\n end\nend\n\noptim # parameters, momenta and output have all changed\nout2 = model(noisy |> gpu) |> cpu # first row is prob. 
of true, second row p(false)\n\nmean((out2[1,:] .> 0.5) .== truth) # accuracy 94% so far!","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"(Image: )","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"using Plots # to draw the above figure\n\np_true = scatter(noisy[1,:], noisy[2,:], zcolor=truth, title=\"True classification\", legend=false)\np_raw = scatter(noisy[1,:], noisy[2,:], zcolor=out1[1,:], title=\"Untrained network\", label=\"\", clims=(0,1))\np_done = scatter(noisy[1,:], noisy[2,:], zcolor=out2[1,:], title=\"Trained network\", legend=false)\n\nplot(p_true, p_raw, p_done, layout=(1,3), size=(1000,330))","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"Here's the loss during training:","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"plot(losses; xaxis=(:log10, \"iteration\"),\n yaxis=\"loss\", label=\"per batch\")\nn = length(loader)\nplot!(n:n:length(losses), mean.(Iterators.partition(losses, n)),\n label=\"epoch mean\", dpi=200)","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"This XOR (\"exclusive or\") problem is a variant of the famous one which drove Minsky and Papert to invent deep neural networks in 1969. For small values of \"deep\" – this has one hidden layer, while earlier perceptrons had none. (What they call a hidden layer, Flux calls the output of the first layer, model[1](noisy).)","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"Since then things have developed a little. 
","category":"page"},{"location":"models/quickstart/#Features-to-Note","page":"Quick Start","title":"Features to Note","text":"","category":"section"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"Some things to notice in this example are:","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"The batch dimension of data is always the last one. Thus a 2×1000 Matrix is a thousand observations, each a column of length 2. Flux defaults to Float32, but most of Julia to Float64.\nThe model can be called like a function, y = model(x). Each layer like Dense is an ordinary struct, which encapsulates some arrays of parameters (and possibly other state, as for BatchNorm).\nBut the model does not contain the loss function, nor the optimisation rule. The momenta needed by Adam are stored in the object returned by setup. And Flux.crossentropy is an ordinary function.\nThe do block creates an anonymous function, as the first argument of gradient. Anything executed within this is differentiated.","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"Instead of calling gradient and update! separately, there is a convenience function train!. If we didn't want anything extra (like logging the loss), we could replace the training loop with the following:","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"for epoch in 1:1_000\n Flux.train!(model, loader, optim) do m, x, y\n y_hat = m(x)\n Flux.crossentropy(y_hat, y)\n end\nend","category":"page"},{"location":"models/quickstart/","page":"Quick Start","title":"Quick Start","text":"compat: Implicit-style training, Flux ≤ 0.14\nUntil recently Flux's training worked a bit differently. 
Any code which looks like gradient(() -> loss(model, x, y), Flux.params(model)) (gradient of a zero-argument function) or train!((x,y) -> loss(model, x, y), Flux.params(model), loader, opt) (with Flux.params) is in the old \"implicit\" style. This still works on Flux 0.14, but will be removed from Flux 0.15. See the training section for more details.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Deep-Convolutional-Generative-Adversarial-Network-(DCGAN)","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"This is a beginner-level tutorial for generating images of handwritten digits using a Deep Convolutional Generative Adversarial Network inspired by the TensorFlow tutorial on DCGAN.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#What-are-GANs?","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"What are GANs?","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Generative Adversarial Networks, or simply GANs, introduced by Goodfellow et al., are one of the most innovative ideas in modern-day machine learning. 
GANs are used extensively in the field of image and audio processing to generate high-quality synthetic data that can easily be passed off as real data.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"A GAN is composed of two sub-models - the generator and the discriminator acting against one another. The generator can be considered as an artist who draws (generates) new images that look real, whereas the discriminator is a critic who learns to tell real images apart from fakes.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The GAN starts with a generator and discriminator which have very little or no idea about the underlying data. During training, the generator progressively becomes better at creating images that look real, while the discriminator becomes better at telling them apart. 
The process reaches equilibrium when the discriminator can no longer distinguish real images from fakes.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"[source]","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"This tutorial demonstrates the process of training a DC-GAN on the MNIST dataset for handwritten digits. The following animation shows a series of images produced by the generator as it was trained for 25 epochs. The images begin as random noise, but over time, the images become increasingly similar to handwritten numbers.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Setup","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Setup","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"We need to install some Julia packages before we start with our implementation of DCGAN.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"using Pkg\n\n# 
Activate a new project environment in the current directory\nPkg.activate(\".\")\n# Add the required packages to the environment\nPkg.add([\"Images\", \"Flux\", \"MLDatasets\", \"CUDA\", \"Parameters\"])","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Note: Depending on your internet speed, it may take a few minutes for the packages to install.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"After installing the libraries, load the required packages and functions:","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"using Base.Iterators: partition\nusing Printf\nusing Statistics\nusing Random\nusing Images\nusing Flux: params, DataLoader\nusing Flux.Optimise: update!\nusing Flux.Losses: logitbinarycrossentropy\nusing MLDatasets: MNIST\nusing CUDA","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Now we set default values for the learning rates, batch size, epochs, the usage of a GPU (if available) and other hyperparameters for our model.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Base.@kwdef struct HyperParams\n batch_size::Int = 128\n latent_dim::Int = 100\n epochs::Int = 25\n verbose_freq::Int = 1000\n output_dim::Int = 5\n disc_lr::Float64 = 0.0002\n gen_lr::Float64 = 
0.0002\n device::Function = gpu\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Loading-the-data","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Loading the data","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"As mentioned before, we will be using the MNIST dataset for handwritten digits. So we begin with a simple function for loading and pre-processing the MNIST images:","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function load_MNIST_images(hparams)\n images = MNIST.traintensor(Float32)\n\n # Normalize the images to (-1, 1)\n normalized_images = @. 2f0 * images - 1f0\n image_tensor = reshape(normalized_images, 28, 28, 1, :)\n\n # Create a dataloader that iterates over mini-batches of the image tensor\n dataloader = DataLoader(image_tensor, batchsize=hparams.batch_size, shuffle=true)\n\n return dataloader\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"To learn more about loading images in Flux, you can check out this tutorial.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Note: The data returned from the dataloader is loaded on the CPU. 
To train on the GPU, we need to transfer the data to the GPU beforehand.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Create-the-models","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Create the models","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Generator","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Generator","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Our generator, a.k.a. the artist, is a neural network that maps low dimensional data to a high dimensional form.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"This low dimensional data (seed) is generally a vector of random values sampled from a normal distribution.\nThe high dimensional data is the generated image.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The Dense layer is used for taking the seed as an input which is upsampled several times using the ConvTranspose layer until we reach the desired output size (in our case, 28x28x1). 
Furthermore, after each ConvTranspose layer, we apply Batch Normalization to stabilize the learning process.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"We will be using the relu activation function for each layer except the output layer, where we use tanh activation.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"We will also apply the weight initialization method mentioned in the original DCGAN paper.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"# Function for initializing the model weights with values \n# sampled from a Gaussian distribution with μ=0 and σ=0.02\ndcgan_init(shape...) 
= randn(Float32, shape) * 0.02f0","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function Generator(latent_dim)\n Chain(\n Dense(latent_dim, 7*7*256, bias=false),\n BatchNorm(7*7*256, relu),\n\n x -> reshape(x, 7, 7, 256, :),\n\n ConvTranspose((5, 5), 256 => 128; stride = 1, pad = 2, init = dcgan_init, bias=false),\n BatchNorm(128, relu),\n\n ConvTranspose((4, 4), 128 => 64; stride = 2, pad = 1, init = dcgan_init, bias=false),\n BatchNorm(64, relu),\n\n # The tanh activation ensures that output is in range of (-1, 1)\n ConvTranspose((4, 4), 64 => 1, tanh; stride = 2, pad = 1, init = dcgan_init, bias=false),\n )\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Time for a small test!! We create a dummy generator and feed a random vector as a seed to the generator. If our generator is initialized correctly it will return an array of size (28, 28, 1, batch_size). 
The @assert macro in Julia will raise an exception if the output size is wrong.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"# Create a dummy generator of latent dim 100\ngenerator = Generator(100)\nnoise = randn(Float32, 100, 3) # The last axis is the batch size\n\n# Feed the random noise to the generator\ngen_image = generator(noise)\n@assert size(gen_image) == (28, 28, 1, 3)","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Our generator model is yet to learn the correct weights, so it does not produce a recognizable image for now. To train our poor generator we need its equal rival, the discriminator.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Discriminator","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Discriminator","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The Discriminator is a simple CNN-based image classifier. The Conv layer is used with a leakyrelu activation function. 
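For example (a quick sketch, not part of the original tutorial), with slope 0.2 leakyrelu passes positive values through unchanged and scales negative ones by the slope:\n\nusing Flux\n\nleakyrelu.([-1f0, 0f0, 2f0], 0.2f0)   # ≈ [-0.2, 0.0, 2.0]\n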
","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function Discriminator()\n Chain(\n Conv((4, 4), 1 => 64; stride = 2, pad = 1, init = dcgan_init),\n x->leakyrelu.(x, 0.2f0),\n Dropout(0.3),\n\n Conv((4, 4), 64 => 128; stride = 2, pad = 1, init = dcgan_init),\n x->leakyrelu.(x, 0.2f0),\n Dropout(0.3),\n\n # The output is now of the shape (7, 7, 128, batch_size)\n flatten,\n Dense(7 * 7 * 128, 1) \n )\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"For a more detailed implementation of a CNN-based image classifier, you can refer to this tutorial.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Now let us check if our discriminator is working:","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"# Dummy Discriminator\ndiscriminator = Discriminator()\n# We pass the generated image to the discriminator\nlogits = discriminator(gen_image)\n@assert size(logits) == (1, 3)","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Just like our dummy generator, the untrained discriminator has no idea about what is a real or fake image. 
It needs to be trained alongside the generator to output positive values for real images, and negative values for fake images.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Loss-functions-for-GAN","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Loss functions for GAN","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"In a GAN problem, there are only two labels involved: fake and real. So Binary CrossEntropy is an easy choice for a preliminary loss function. ","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Although Flux's binarycrossentropy would do the job, it is numerically more stable to compute the cross-entropy from logits. Flux provides logitbinarycrossentropy specifically for this purpose. Mathematically it is equivalent to binarycrossentropy(σ(ŷ), y, kwargs...).","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Discriminator-Loss","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Discriminator Loss","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The discriminator loss quantifies how well the discriminator can distinguish real images from fakes. 
It compares ","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"discriminator's predictions on real images to an array of 1s, and\ndiscriminator's predictions on fake (generated) images to an array of 0s.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"These two losses are summed together to give a scalar loss. So we can write the loss function of the discriminator as:","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function discriminator_loss(real_output, fake_output)\n real_loss = logitbinarycrossentropy(real_output, 1)\n fake_loss = logitbinarycrossentropy(fake_output, 0)\n return real_loss + fake_loss\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Generator-Loss","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Generator Loss","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The generator's loss quantifies how well it was able to trick the discriminator. 
Intuitively, if the generator is performing well, the discriminator will classify the fake images as real (or 1).","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"generator_loss(fake_output) = logitbinarycrossentropy(fake_output, 1)","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"We also need optimisers for our network. Why you may ask? Read more here. For both the generator and discriminator, we will use the ADAM optimiser.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Utility-functions","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Utility functions","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The output of the generator ranges from (-1, 1), so it needs to be de-normalized before we can display it as an image. To make things a bit easier, we define a function to visualize the output of the generator as a grid of images. ","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function create_output_image(gen, fixed_noise, hparams)\n fake_images = cpu(gen.(fixed_noise))\n image_array = reduce(vcat, reduce.(hcat, partition(fake_images, hparams.output_dim)))\n image_array = permutedims(dropdims(image_array; dims=(3, 4)), (2, 1))\n image_array = @. 
Gray(image_array + 1f0) / 2f0\n return image_array\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Training","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Training","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"For the sake of simplifying our training problem, we will divide the generator and discriminator training into two separate functions. ","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function train_discriminator!(gen, disc, real_img, fake_img, opt, ps, hparams)\n\n disc_loss, grads = Flux.withgradient(ps) do\n discriminator_loss(disc(real_img), disc(fake_img))\n end\n\n # Update the discriminator parameters\n update!(opt, ps, grads)\n return disc_loss\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"We define a similar function for the generator.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function train_generator!(gen, disc, fake_img, opt, ps, hparams)\n\n gen_loss, grads = Flux.withgradient(ps) do\n generator_loss(disc(fake_img))\n end\n\n update!(opt, ps, grads)\n return gen_loss\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Now that we have defined every function we need, 
we integrate everything into a single train function where we first set up all the models and optimisers and then train the GAN for a specified number of epochs.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"function train(hparams)\n\n dev = hparams.device\n # Check if CUDA is actually present\n if hparams.device == gpu\n if !CUDA.has_cuda()\n dev = cpu\n @warn \"No gpu found, falling back to CPU\"\n end\n end\n\n # Load the normalized MNIST images\n dataloader = load_MNIST_images(hparams)\n\n # Initialize the models and pass them to the correct device\n disc = Discriminator() |> dev\n gen = Generator(hparams.latent_dim) |> dev\n\n # Collect the generator and discriminator parameters\n disc_ps = params(disc)\n gen_ps = params(gen)\n\n # Initialize the ADAM optimisers for both the sub-models\n # with respective learning rates\n disc_opt = ADAM(hparams.disc_lr)\n gen_opt = ADAM(hparams.gen_lr)\n\n # Create a batch of fixed noise for visualizing the training of the generator over time\n fixed_noise = [randn(Float32, hparams.latent_dim, 1) |> dev for _=1:hparams.output_dim^2]\n\n # Training loop\n train_steps = 0\n for ep in 1:hparams.epochs\n @info \"Epoch $ep\"\n for real_img in dataloader\n\n # Transfer the data to the GPU\n real_img = real_img |> dev\n\n # Create random noise\n noise = randn!(similar(real_img, (hparams.latent_dim, hparams.batch_size)))\n # Pass the noise to the generator to create a fake image\n fake_img = gen(noise)\n\n # Update discriminator and generator\n loss_disc = train_discriminator!(gen, disc, real_img, fake_img, disc_opt, disc_ps, hparams)\n loss_gen = train_generator!(gen, disc, fake_img, gen_opt, gen_ps, hparams)\n\n if train_steps % hparams.verbose_freq == 0\n @info(\"Train step $(train_steps), Discriminator loss = $(loss_disc), Generator loss = $(loss_gen)\")\n # Save generated 
fake image\n output_image = create_output_image(gen, fixed_noise, hparams)\n save(@sprintf(\"output/dcgan_steps_%06d.png\", train_steps), output_image)\n end\n train_steps += 1\n end\n end\n\n output_image = create_output_image(gen, fixed_noise, hparams)\n save(@sprintf(\"output/dcgan_steps_%06d.png\", train_steps), output_image)\n\n return nothing\nend","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"Now we finally get to train the GAN:","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"# Define the hyper-parameters (here, we go with the default ones)\nhparams = HyperParams()\ntrain(hparams)","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Output","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Output","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The generated images are stored inside the output folder. 
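Note that the save calls expect the output folder to already exist; if needed, create it up front before training:","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"mkpath(\"output\")","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"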
To visualize the output of the generator over time, we create a gif of the generated images.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"folder = \"output\"\n# Get the image filenames from the folder\nimg_paths = readdir(folder, join=true)\n# Load all the images as an array\nimages = load.(img_paths)\n# Join all the images in the array to create a matrix of images\ngif_mat = cat(images..., dims=3)\nsave(\"./output.gif\", gif_mat)","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/#Resources-and-References","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Resources & References","text":"","category":"section"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"The DCGAN implementation in the Model Zoo.","category":"page"},{"location":"tutorials/2021-10-08-dcgan-mnist/","page":"Deep Convolutional Generative Adversarial Network (DCGAN)","title":"Deep Convolutional Generative Adversarial Network (DCGAN)","text":"info: Info\nOriginally published at fluxml.ai on 8 October 2021, by Deeptendu Santra","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#[Tutorial:-Generative-Adversarial-Networks](](@id-man-gan-tutorial))","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial 
Networks","text":"This tutorial describes how to implement a vanilla Generative Adversarial Network using Flux and how to train it on the MNIST dataset. It is based on this PyTorch tutorial. The original GAN paper by Goodfellow et al. is a great resource that describes the motivation and theory behind GANs:","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"In the proposed adversarial nets framework, the generative model is pitted against an adversary: a discriminative model that learns to determine whether a sample is from the model distribution or the data distribution. The generative model can be thought of as analogous to a team of counterfeiters, trying to produce fake currency and use it without detection, while the discriminative model is analogous to the police, trying to detect the counterfeit currency. Competition in this game drives both teams to improve their methods until the counterfeits are indistinguishable from the genuine articles.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Let's implement a GAN in Flux. 
To get started we first import a few useful packages:","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"using MLDatasets: MNIST\nusing Flux.Data: DataLoader\nusing Flux\nusing CUDA\nusing Zygote\nusing UnicodePlots","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"To download a package in the Julia REPL, type ] to enter package mode and then type add MLDatasets, or perform this operation with the Pkg module like this:","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"> import Pkg\n> Pkg.add(\"MLDatasets\")","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"While UnicodePlots is not necessary, it can be used to plot generated samples into the terminal during training. Having direct feedback, instead of looking at plots in a separate window, is fantastic for debugging.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Next, let us define values for the learning rate, batch size, epochs, and other hyper-parameters. While we are at it, we also define optimisers for the generator and discriminator networks. 
More on what these are later.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":" lr_g = 2e-4 # Learning rate of the generator network\n lr_d = 2e-4 # Learning rate of the discriminator network\n batch_size = 128 # Batch size\n num_epochs = 1000 # Number of epochs to train for\n output_period = 100 # Period length for plots of generator samples\n n_features = 28 * 28 # Number of pixels in each sample of the MNIST dataset\n latent_dim = 100 # Dimension of latent space\n opt_dscr = ADAM(lr_d) # Optimiser for the discriminator\n opt_gen = ADAM(lr_g) # Optimiser for the generator","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"In this tutorial I'm assuming that a CUDA-enabled GPU is available on the system where the script is running. If this is not the case, simply remove the |>gpu piping operators.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#Data-loading","page":"Tutorial: Generative Adversarial Networks","title":"Data loading","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"The MNIST data set is available from MLDatasets. The first time you instantiate it you will be prompted if you want to download it. You should agree to this. ","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"GANs can be trained unsupervised. 
Therefore, we only keep the images from the training set and discard the labels.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"After we load the training data we re-scale the data from values in [0:1] to values in [-1:1]. GANs are notoriously tricky to train and this re-scaling is a recommended GAN hack. The re-scaled data is used to define a data loader which handles batching and shuffling the data.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":" # Load the dataset\n train_x, _ = MNIST.traindata(Float32);\n # This dataset has pixel values ∈ [0:1]. Map these to [-1:1]\n train_x = 2f0 * reshape(train_x, 28, 28, 1, :) .- 1f0 |>gpu;\n # DataLoader allows us to access data batch-wise and handles shuffling.\n train_loader = DataLoader(train_x, batchsize=batch_size, shuffle=true);","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#Defining-the-Networks","page":"Tutorial: Generative Adversarial Networks","title":"Defining the Networks","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"In a vanilla GAN, the discriminator and the generator are both plain, feed-forward multilayer perceptrons. We use leaky rectified linear units leakyrelu to ensure our model is non-linear. ","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Here, the coefficient α (in the leakyrelu below) is set to 0.2. Empirically, this value allows for good training of the network (based on prior experiments). 
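For example, leakyrelu(-1f0, 0.2f0) returns -0.2f0, scaling negative inputs by α, while leakyrelu(1f0, 0.2f0) passes the positive input through unchanged and returns 1f0. 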
It has also been found that Dropout ensures a good generalization of the learned network, so we will use that below. Dropout is usually active when training a model and inactive in inference. Flux automatically sets the training mode when calling the model in a gradient context. As a final non-linearity, we use the sigmoid activation function.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"discriminator = Chain(Dense(n_features, 1024, x -> leakyrelu(x, 0.2f0)),\n Dropout(0.3),\n Dense(1024, 512, x -> leakyrelu(x, 0.2f0)),\n Dropout(0.3),\n Dense(512, 256, x -> leakyrelu(x, 0.2f0)),\n Dropout(0.3),\n Dense(256, 1, sigmoid)) |> gpu","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Let's define the generator in a similar fashion. This network maps a latent variable (a variable that is not directly observed but instead inferred) to the image space and we set the input and output dimension accordingly. 
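Concretely, the generator's first Dense layer takes latent_dim = 100 inputs and its final layer produces n_features = 28 * 28 outputs, one per pixel. 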
A tanh squashes the output of the final layer to values in [-1:1], the same range that we squashed the training data onto.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"generator = Chain(Dense(latent_dim, 256, x -> leakyrelu(x, 0.2f0)),\n Dense(256, 512, x -> leakyrelu(x, 0.2f0)),\n Dense(512, 1024, x -> leakyrelu(x, 0.2f0)),\n Dense(1024, n_features, tanh)) |> gpu","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#Training-functions-for-the-networks","page":"Tutorial: Generative Adversarial Networks","title":"Training functions for the networks","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"To train the discriminator, we present it with real data from the MNIST data set and with fake data, and reward it for predicting the correct labels for each sample. The correct labels are of course 1 for in-distribution data and 0 for out-of-distribution data coming from the generator. Binary cross entropy is the loss function of choice. While the Flux documentation suggests using logit binary cross entropy, the GAN seems to be difficult to train with this loss function. Our training function returns the discriminator loss for logging purposes. Instead of calling Flux.train! on the model, we get the pullback directly from Zygote, which lets us calculate the loss in the same call that evaluates the pullback. To calculate the gradients of the loss function with respect to the parameters of the discriminator we then only have to evaluate the pullback with a seed gradient of 1.0. 
These gradients are used to update the model parameters","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"function train_dscr!(discriminator, real_data, fake_data)\n this_batch = size(real_data)[end] # Number of samples in the batch\n # Concatenate real and fake data into one big vector\n all_data = hcat(real_data, fake_data)\n\n # Target vector for predictions: 1 for real data, 0 for fake data.\n all_target = [ones(eltype(real_data), 1, this_batch) zeros(eltype(fake_data), 1, this_batch)] |> gpu;\n\n ps = Flux.params(discriminator)\n loss, pullback = Zygote.pullback(ps) do\n preds = discriminator(all_data)\n loss = Flux.Losses.binarycrossentropy(preds, all_target)\n end\n # To get the gradients we evaluate the pullback with 1.0 as a seed gradient.\n grads = pullback(1f0)\n\n # Update the parameters of the discriminator with the gradients we calculated above\n Flux.update!(opt_dscr, Flux.params(discriminator), grads)\n \n return loss \nend","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Now we need to define a function to train the generator network. The job of the generator is to fool the discriminator so we reward the generator when the discriminator predicts a high probability for its samples to be real data. In the training function we first need to sample some noise, i.e. normally distributed data. This has to be done outside the pullback since we don't want to get the gradients with respect to the noise, but to the generator parameters. Inside the pullback we need to first apply the generator to the noise since we will take the gradient with respect to the parameters of the generator. We also need to call the discriminator in order to evaluate the loss function inside the pullback. 
Here we need to remember to deactivate the dropout layers of the discriminator. We do this by setting the discriminator into test mode before the pullback. Immediately after the pullback we set it back into training mode. Then we evaluate the pullback, call it with a seed gradient of 1.0 as above, update the parameters of the generator network and return the loss.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"function train_gen!(discriminator, generator)\n # Sample noise\n noise = randn(latent_dim, batch_size) |> gpu;\n\n # Define parameters and get the pullback\n ps = Flux.params(generator)\n # Set discriminator into test mode to disable dropout layers\n testmode!(discriminator)\n # Evaluate the loss function while calculating the pullback. We get the loss for free\n loss, back = Zygote.pullback(ps) do\n preds = discriminator(generator(noise));\n loss = Flux.Losses.binarycrossentropy(preds, 1.) \n end\n # Evaluate the pullback with a seed-gradient of 1.0 to get the gradients for\n # the parameters of the generator\n grads = back(1.0f0)\n Flux.update!(opt_gen, Flux.params(generator), grads)\n # Set discriminator back into automatic mode\n trainmode!(discriminator, mode=:auto)\n return loss\nend","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#Training","page":"Tutorial: Generative Adversarial Networks","title":"Training","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Now we are ready to train the GAN. In the training loop we keep track of the per-sample loss of the generator and the discriminator, where we use the batch loss returned by the two training functions defined above. In each epoch we iterate over the mini-batches given by the data loader. 
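Each batch x from the data loader has size (28, 28, 1, batch_size), so flatten(x) has size (784, batch_size), matching the input dimension of the discriminator. 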
Only minimal data processing needs to be done before the training functions can be called. ","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"lossvec_gen = zeros(num_epochs)\nlossvec_dscr = zeros(num_epochs)\n\nfor n in 1:num_epochs\n loss_sum_gen = 0.0f0\n loss_sum_dscr = 0.0f0\n\n for x in train_loader\n # Flatten the images from 28x28xbatchsize to 784xbatchsize\n real_data = flatten(x);\n\n # Train the discriminator\n noise = randn(latent_dim, size(x)[end]) |> gpu\n fake_data = generator(noise)\n loss_dscr = train_dscr!(discriminator, real_data, fake_data)\n loss_sum_dscr += loss_dscr\n\n # Train the generator\n loss_gen = train_gen!(discriminator, generator)\n loss_sum_gen += loss_gen\n end\n\n # Add the per-sample loss of the generator and discriminator\n lossvec_gen[n] = loss_sum_gen / size(train_x)[end]\n lossvec_dscr[n] = loss_sum_dscr / size(train_x)[end]\n\n if n % output_period == 0\n @show n\n noise = randn(latent_dim, 4) |> gpu;\n fake_data = reshape(generator(noise), 28, 4*28);\n p = heatmap(fake_data, colormap=:inferno)\n print(p)\n end\nend ","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"For the hyper-parameters shown in this example, the generator produces useful images after about 1000 epochs. After about 5000 epochs the results look indistinguishable from real MNIST data. Using an Nvidia V100 GPU on a 2.7 GHz Power9 CPU with 32 hardware threads, training 100 epochs takes about 80 seconds. The GPU utilization is between 30 and 40%. To observe the network more frequently during training you can for example set output_period=20. 
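The recorded loss curves can also be plotted directly in the terminal after training, for example:","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"lineplot(1:num_epochs, lossvec_gen)","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"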
Training the GAN using the CPU takes about 10 minutes per epoch and is not recommended.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#Results","page":"Tutorial: Generative Adversarial Networks","title":"Results","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"Below you can see what some of the generated images may look like after different numbers of epochs.","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"(Image: )","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/#Resources","page":"Tutorial: Generative Adversarial Networks","title":"Resources","text":"","category":"section"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"A collection of GANs in Flux\nWikipedia\nGAN hacks","category":"page"},{"location":"tutorials/2021-10-14-vanilla-gan/","page":"Tutorial: Generative Adversarial Networks","title":"Tutorial: Generative Adversarial Networks","text":"info: Info\nOriginally published at fluxml.ai on 14 October 2021, by Ralph 
Kube.","category":"page"},{"location":"models/losses/#man-losses","page":"Loss Functions","title":"Loss Functions","text":"","category":"section"},{"location":"models/losses/","page":"Loss Functions","title":"Loss Functions","text":"Flux provides a large number of common loss functions used for training machine learning models. They are grouped together in the Flux.Losses module.","category":"page"},{"location":"models/losses/","page":"Loss Functions","title":"Loss Functions","text":"Loss functions for supervised learning typically expect as inputs a target y, and a prediction ŷ from your model. In Flux's convention, the order of the arguments is the following","category":"page"},{"location":"models/losses/","page":"Loss Functions","title":"Loss Functions","text":"loss(ŷ, y)","category":"page"},{"location":"models/losses/","page":"Loss Functions","title":"Loss Functions","text":"Most loss functions in Flux have an optional argument agg, denoting the type of aggregation performed over the batch:","category":"page"},{"location":"models/losses/","page":"Loss Functions","title":"Loss Functions","text":"loss(ŷ, y) # defaults to `mean`\nloss(ŷ, y, agg=sum) # use `sum` for reduction\nloss(ŷ, y, agg=x->sum(x, dims=2)) # partial reduction\nloss(ŷ, y, agg=x->mean(w .* x)) # weighted mean\nloss(ŷ, y, agg=identity) # no aggregation.","category":"page"},{"location":"models/losses/#Function-listing","page":"Loss Functions","title":"Function listing","text":"","category":"section"},{"location":"models/losses/","page":"Loss Functions","title":"Loss 
Functions","text":"Flux.Losses.mae\nFlux.Losses.mse\nFlux.Losses.msle\nFlux.Losses.huber_loss\nFlux.Losses.label_smoothing\nFlux.Losses.crossentropy\nFlux.Losses.logitcrossentropy\nFlux.Losses.binarycrossentropy\nFlux.Losses.logitbinarycrossentropy\nFlux.Losses.kldivergence\nFlux.Losses.poisson_loss\nFlux.Losses.hinge_loss\nFlux.Losses.squared_hinge_loss\nFlux.Losses.dice_coeff_loss\nFlux.Losses.tversky_loss\nFlux.Losses.binary_focal_loss\nFlux.Losses.focal_loss\nFlux.Losses.siamese_contrastive_loss","category":"page"},{"location":"models/losses/#Flux.Losses.mae","page":"Loss Functions","title":"Flux.Losses.mae","text":"mae(ŷ, y; agg = mean)\n\nReturn the loss corresponding to mean absolute error:\n\nagg(abs.(ŷ .- y))\n\nExample\n\njulia> y_model = [1.1, 1.9, 3.1];\n\njulia> Flux.mae(y_model, 1:3)\n0.10000000000000009\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.mse","page":"Loss Functions","title":"Flux.Losses.mse","text":"mse(ŷ, y; agg = mean)\n\nReturn the loss corresponding to mean square error:\n\nagg((ŷ .- y) .^ 2)\n\nSee also: mae, msle, crossentropy.\n\nExample\n\njulia> y_model = [1.1, 1.9, 3.1];\n\njulia> y_true = 1:3;\n\njulia> Flux.mse(y_model, y_true)\n0.010000000000000018\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.msle","page":"Loss Functions","title":"Flux.Losses.msle","text":"msle(ŷ, y; agg = mean, eps = eps(eltype(ŷ)))\n\nThe loss corresponding to mean squared logarithmic errors, calculated as\n\nagg((log.(ŷ .+ ϵ) .- log.(y .+ ϵ)) .^ 2)\n\nThe ϵ == eps term provides numerical stability. 
Penalizes an under-estimation more than an over-estimation.\n\nExample\n\njulia> Flux.msle(Float32[1.1, 2.2, 3.3], 1:3)\n0.009084041f0\n\njulia> Flux.msle(Float32[0.9, 1.8, 2.7], 1:3)\n0.011100831f0\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.huber_loss","page":"Loss Functions","title":"Flux.Losses.huber_loss","text":"huber_loss(ŷ, y; delta = 1, agg = mean)\n\nReturn the mean of the Huber loss given the prediction ŷ and true values y.\n\n | 0.5 * |ŷ - y|^2, for |ŷ - y| <= δ\nHuber loss = |\n | δ * (|ŷ - y| - 0.5 * δ), otherwise\n\nExample\n\njulia> ŷ = [1.1, 2.1, 3.1];\n\njulia> Flux.huber_loss(ŷ, 1:3) # default δ = 1 > |ŷ - y|\n0.005000000000000009\n\njulia> Flux.huber_loss(ŷ, 1:3, delta=0.05) # changes behaviour as |ŷ - y| > δ\n0.003750000000000005\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.label_smoothing","page":"Loss Functions","title":"Flux.Losses.label_smoothing","text":"label_smoothing(y::Union{Number, AbstractArray}, α; dims::Int=1)\n\nReturns smoothed labels, meaning the confidence on label values is relaxed.\n\nWhen y is given as a one-hot vector or a batch of one-hot vectors, it is calculated as\n\ny .* (1 - α) .+ α / size(y, dims)\n\nwhen y is given as a number or a batch of numbers for binary classification, it is calculated as\n\ny .* (1 - α) .+ α / 2\n\nin which case the labels are squeezed towards 0.5.\n\nα is a number in the interval (0, 1) called the smoothing factor. 
The higher the value of α, the larger the smoothing of y.\n\ndims denotes the one-hot dimension, unless dims=0 which denotes the application of label smoothing to binary distributions encoded in a single number.\n\nExample\n\njulia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)\n2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n ⋅ ⋅ ⋅ 1 ⋅ 1\n 1 1 1 ⋅ 1 ⋅\n\njulia> y_smoothed = Flux.label_smoothing(y, 0.2f0)\n2×6 Matrix{Float32}:\n 0.1 0.1 0.1 0.9 0.1 0.9\n 0.9 0.9 0.9 0.1 0.9 0.1\n\njulia> y_sim = softmax(y .* log(2f0))\n2×6 Matrix{Float32}:\n 0.333333 0.333333 0.333333 0.666667 0.333333 0.666667\n 0.666667 0.666667 0.666667 0.333333 0.666667 0.333333\n\njulia> y_dis = vcat(y_sim[2,:]', y_sim[1,:]')\n2×6 Matrix{Float32}:\n 0.666667 0.666667 0.666667 0.333333 0.666667 0.333333\n 0.333333 0.333333 0.333333 0.666667 0.333333 0.666667\n\njulia> Flux.crossentropy(y_sim, y) < Flux.crossentropy(y_sim, y_smoothed)\ntrue\n\njulia> Flux.crossentropy(y_dis, y) > Flux.crossentropy(y_dis, y_smoothed)\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.crossentropy","page":"Loss Functions","title":"Flux.Losses.crossentropy","text":"crossentropy(ŷ, y; dims = 1, eps = eps(eltype(ŷ)), agg = mean)\n\nReturn the cross entropy between the given probability distributions; calculated as\n\nagg(-sum(y .* log.(ŷ .+ ϵ); dims))\n\nCross entropy is typically used as a loss in multi-class classification, in which case the labels y are given in a one-hot format. dims specifies the dimension (or the dimensions) containing the class probabilities. 
The prediction ŷ is supposed to sum to one across dims, as would be the case with the output of a softmax operation.\n\nFor numerical stability, it is recommended to use logitcrossentropy rather than softmax followed by crossentropy .\n\nUse label_smoothing to smooth the true labels as preprocessing before computing the loss.\n\nSee also: logitcrossentropy, binarycrossentropy, logitbinarycrossentropy.\n\nExample\n\njulia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)\n3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n 1 ⋅ ⋅ ⋅ 1\n ⋅ 1 ⋅ 1 ⋅\n ⋅ ⋅ 1 ⋅ ⋅\n\njulia> y_model = softmax(reshape(-7:7, 3, 5) .* 1f0)\n3×5 Matrix{Float32}:\n 0.0900306 0.0900306 0.0900306 0.0900306 0.0900306\n 0.244728 0.244728 0.244728 0.244728 0.244728\n 0.665241 0.665241 0.665241 0.665241 0.665241\n\njulia> sum(y_model; dims=1)\n1×5 Matrix{Float32}:\n 1.0 1.0 1.0 1.0 1.0\n\njulia> Flux.crossentropy(y_model, y_label)\n1.6076053f0\n\njulia> 5 * ans ≈ Flux.crossentropy(y_model, y_label; agg=sum)\ntrue\n\njulia> y_smooth = Flux.label_smoothing(y_label, 0.15f0)\n3×5 Matrix{Float32}:\n 0.9 0.05 0.05 0.05 0.9\n 0.05 0.9 0.05 0.9 0.05\n 0.05 0.05 0.9 0.05 0.05\n\njulia> Flux.crossentropy(y_model, y_smooth)\n1.5776052f0\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.logitcrossentropy","page":"Loss Functions","title":"Flux.Losses.logitcrossentropy","text":"logitcrossentropy(ŷ, y; dims = 1, agg = mean)\n\nReturn the cross entropy calculated by\n\nagg(-sum(y .* logsoftmax(ŷ; dims); dims))\n\nThis is mathematically equivalent to crossentropy(softmax(ŷ), y), but is more numerically stable than using functions crossentropy and softmax separately.\n\nSee also: binarycrossentropy, logitbinarycrossentropy, label_smoothing.\n\nExample\n\njulia> y_label = Flux.onehotbatch(collect(\"abcabaa\"), 'a':'c')\n3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n 1 ⋅ ⋅ 1 ⋅ 1 1\n ⋅ 1 ⋅ ⋅ 1 ⋅ ⋅\n ⋅ ⋅ 1 ⋅ ⋅ ⋅ ⋅\n\njulia> y_model = reshape(vcat(-9:0, 0:9, 7.5f0), 3, 7)\n3×7 
Matrix{Float32}:\n -9.0 -6.0 -3.0 0.0 2.0 5.0 8.0\n -8.0 -5.0 -2.0 0.0 3.0 6.0 9.0\n -7.0 -4.0 -1.0 1.0 4.0 7.0 7.5\n\njulia> Flux.logitcrossentropy(y_model, y_label)\n1.5791205f0\n\njulia> Flux.crossentropy(softmax(y_model), y_label)\n1.5791197f0\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.binarycrossentropy","page":"Loss Functions","title":"Flux.Losses.binarycrossentropy","text":"binarycrossentropy(ŷ, y; agg = mean, eps = eps(eltype(ŷ)))\n\nReturn the binary cross-entropy loss, computed as\n\nagg(@.(-y * log(ŷ + ϵ) - (1 - y) * log(1 - ŷ + ϵ)))\n\nWhere typically, the prediction ŷ is given by the output of a sigmoid activation. The ϵ == eps term is included to avoid infinity. Using logitbinarycrossentropy is recommended over binarycrossentropy for numerical stability.\n\nUse label_smoothing to smooth the y value as preprocessing before computing the loss.\n\nSee also: crossentropy, logitcrossentropy.\n\nExamples\n\njulia> y_bin = Bool[1,0,1]\n3-element Vector{Bool}:\n 1\n 0\n 1\n\njulia> y_prob = softmax(reshape(vcat(1:3, 3:5), 2, 3) .* 1f0)\n2×3 Matrix{Float32}:\n 0.268941 0.5 0.268941\n 0.731059 0.5 0.731059\n\njulia> Flux.binarycrossentropy(y_prob[2,:], y_bin)\n0.43989f0\n\njulia> all(p -> 0 < p < 1, y_prob[2,:]) # else DomainError\ntrue\n\njulia> y_hot = Flux.onehotbatch(y_bin, 0:1)\n2×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n ⋅ 1 ⋅\n 1 ⋅ 1\n\njulia> Flux.crossentropy(y_prob, y_hot)\n0.43989f0\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.logitbinarycrossentropy","page":"Loss Functions","title":"Flux.Losses.logitbinarycrossentropy","text":"logitbinarycrossentropy(ŷ, y; agg = mean)\n\nMathematically equivalent to binarycrossentropy(σ(ŷ), y) but is more numerically stable.\n\nSee also: crossentropy, logitcrossentropy.\n\nExamples\n\njulia> y_bin = Bool[1,0,1];\n\njulia> y_model = Float32[2, -1, pi]\n3-element Vector{Float32}:\n 2.0\n -1.0\n 3.1415927\n\njulia> 
Flux.logitbinarycrossentropy(y_model, y_bin)\n0.160832f0\n\njulia> Flux.binarycrossentropy(sigmoid.(y_model), y_bin)\n0.16083185f0\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.kldivergence","page":"Loss Functions","title":"Flux.Losses.kldivergence","text":"kldivergence(ŷ, y; agg = mean, eps = eps(eltype(ŷ)))\n\nReturn the Kullback-Leibler divergence between the given probability distributions.\n\nThe KL divergence is a measure of how much one probability distribution differs from the other. It is always non-negative, and zero only when both the distributions are equal.\n\nExample\n\njulia> p1 = [1 0; 0 1]\n2×2 Matrix{Int64}:\n 1 0\n 0 1\n\njulia> p2 = fill(0.5, 2, 2)\n2×2 Matrix{Float64}:\n 0.5 0.5\n 0.5 0.5\n\njulia> Flux.kldivergence(p2, p1) ≈ log(2)\ntrue\n\njulia> Flux.kldivergence(p2, p1; agg = sum) ≈ 2log(2)\ntrue\n\njulia> Flux.kldivergence(p2, p2; eps = 0) # about -2e-16 with the regulator\n0.0\n\njulia> Flux.kldivergence(p1, p2; eps = 0) # about 17.3 with the regulator\nInf\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.poisson_loss","page":"Loss Functions","title":"Flux.Losses.poisson_loss","text":"poisson_loss(ŷ, y; agg = mean)\n\nReturn how much the predicted distribution ŷ diverges from the expected Poisson distribution y; calculated as\n\nsum(ŷ .- y .* log.(ŷ)) / size(y, 2)\n\nMore information.\n\nExample\n\njulia> y_model = [1, 3, 3]; # data should only take integral values\n\njulia> Flux.poisson_loss(y_model, 1:3)\n0.5023128522198171\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.hinge_loss","page":"Loss Functions","title":"Flux.Losses.hinge_loss","text":"hinge_loss(ŷ, y; agg = mean)\n\nReturn the hinge_loss given the prediction ŷ and true labels y (containing 1 or -1); calculated as\n\nsum(max.(0, 1 .- ŷ .* y)) / size(y, 2)\n\nUsually used with classifiers like Support Vector Machines. 
See also: squared_hinge_loss\n\nExample\n\njulia> y_true = [1, -1, 1, 1];\n\njulia> y_pred = [0.1, 0.3, 1, 1.5];\n\njulia> Flux.hinge_loss(y_pred, y_true)\n0.55\n\njulia> Flux.hinge_loss(y_pred[1], y_true[1]) != 0 # same sign but |ŷ| < 1\ntrue\n\njulia> Flux.hinge_loss(y_pred[end], y_true[end]) == 0 # same sign but |ŷ| >= 1\ntrue\n\njulia> Flux.hinge_loss(y_pred[2], y_true[2]) != 0 # opposite signs\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.squared_hinge_loss","page":"Loss Functions","title":"Flux.Losses.squared_hinge_loss","text":"squared_hinge_loss(ŷ, y)\n\nReturn the squared hinge loss given the prediction ŷ and true labels y (containing 1 or -1); calculated as\n\nsum((max.(0, 1 .- ŷ .* y)).^2) / size(y, 2)\n\nUsually used with classifiers like Support Vector Machines. See also: hinge_loss\n\nExample\n\njulia> y_true = [1, -1, 1, 1];\n\njulia> y_pred = [0.1, 0.3, 1, 1.5];\n\njulia> Flux.squared_hinge_loss(y_pred, y_true)\n0.625\n\njulia> Flux.squared_hinge_loss(y_pred[1], y_true[1]) != 0\ntrue\n\njulia> Flux.squared_hinge_loss(y_pred[end], y_true[end]) == 0\ntrue\n\njulia> Flux.squared_hinge_loss(y_pred[2], y_true[2]) != 0\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.dice_coeff_loss","page":"Loss Functions","title":"Flux.Losses.dice_coeff_loss","text":"dice_coeff_loss(ŷ, y; smooth = 1)\n\nReturn a loss based on the dice coefficient. Used in the V-Net image segmentation architecture. The dice coefficient is similar to the F1_score. 
Loss calculated as:\n\n1 - 2*sum(|ŷ .* y| + smooth) / (sum(ŷ.^2) + sum(y.^2) + smooth)\n\nExample\n\njulia> y_pred = [1.1, 2.1, 3.1];\n\njulia> Flux.dice_coeff_loss(y_pred, 1:3)\n0.000992391663909964\n\njulia> 1 - Flux.dice_coeff_loss(y_pred, 1:3) # ~ F1 score for image segmentation\n0.99900760833609\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.tversky_loss","page":"Loss Functions","title":"Flux.Losses.tversky_loss","text":"tversky_loss(ŷ, y; beta = 0.7)\n\nReturn the Tversky loss. Used with imbalanced data to give more weight to false negatives. A larger β == beta weighs recall more than precision (by placing more emphasis on false negatives). Calculated as:\n\n1 - sum(|y .* ŷ| + 1) / (sum(y .* ŷ + (1 - β)*(1 .- y) .* ŷ + β*y .* (1 .- ŷ)) + 1)\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.binary_focal_loss","page":"Loss Functions","title":"Flux.Losses.binary_focal_loss","text":"binary_focal_loss(ŷ, y; agg=mean, gamma=2, eps=eps(eltype(ŷ)))\n\nReturn the binary focal loss. The input ŷ is expected to be normalized (i.e. softmax output).\n\nFor gamma = 0, the loss is mathematically equivalent to Losses.binarycrossentropy.\n\nSee also: Losses.focal_loss for the multi-class setting\n\nExample\n\njulia> y = [0 1 0\n 1 0 1]\n2×3 Matrix{Int64}:\n 0 1 0\n 1 0 1\n\njulia> ŷ = [0.268941 0.5 0.268941\n 0.731059 0.5 0.731059]\n2×3 Matrix{Float64}:\n 0.268941 0.5 0.268941\n 0.731059 0.5 0.731059\n\njulia> Flux.binary_focal_loss(ŷ, y) ≈ 0.0728675615927385\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.focal_loss","page":"Loss Functions","title":"Flux.Losses.focal_loss","text":"focal_loss(ŷ, y; dims=1, agg=mean, gamma=2, eps=eps(eltype(ŷ)))\n\nReturn the focal_loss which can be used in classification tasks with highly imbalanced classes. It down-weights well-classified examples and focuses on hard examples. The input ŷ is expected to be normalized (i.e. 
softmax output).\n\nThe modulating factor, γ == gamma, controls the down-weighting strength. For γ == 0, the loss is mathematically equivalent to Losses.crossentropy.\n\nExample\n\njulia> y = [1 0 0 0 1\n 0 1 0 1 0\n 0 0 1 0 0]\n3×5 Matrix{Int64}:\n 1 0 0 0 1\n 0 1 0 1 0\n 0 0 1 0 0\n\njulia> ŷ = softmax(reshape(-7:7, 3, 5) .* 1f0)\n3×5 Matrix{Float32}:\n 0.0900306 0.0900306 0.0900306 0.0900306 0.0900306\n 0.244728 0.244728 0.244728 0.244728 0.244728\n 0.665241 0.665241 0.665241 0.665241 0.665241\n\njulia> Flux.focal_loss(ŷ, y) ≈ 1.1277571935622628\ntrue\n\nSee also: Losses.binary_focal_loss for binary (not one-hot) labels\n\n\n\n\n\n","category":"function"},{"location":"models/losses/#Flux.Losses.siamese_contrastive_loss","page":"Loss Functions","title":"Flux.Losses.siamese_contrastive_loss","text":"siamese_contrastive_loss(ŷ, y; margin = 1, agg = mean)\n\nReturn the contrastive loss which can be useful for training Siamese Networks. It is given by\n\nagg(@. (1 - y) * ŷ^2 + y * max(0, margin - ŷ)^2)\n\nSpecify margin to set the baseline for distance at which pairs are dissimilar.\n\nExample\n\njulia> ŷ = [0.5, 1.5, 2.5];\n\njulia> Flux.siamese_contrastive_loss(ŷ, 1:3)\n-4.833333333333333\n\njulia> Flux.siamese_contrastive_loss(ŷ, 1:3, margin = 2)\n-4.0\n\n\n\n\n\n","category":"function"},{"location":"training/reference/#Training-API-Reference","page":"Training API","title":"Training API Reference","text":"","category":"section"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"The new version of Flux's training code was written as an independent package, Optimisers.jl. Only the function train! belongs to Flux itself.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"The Optimisers package is designed to allow for immutable objects. But at present all Flux models contain parameter arrays (such as Arrays and CuArrays) which can be updated in-place. 
Because of this:","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"The objects returned by Optimisers.update! can be ignored.\nFlux defines its own version of setup which checks this assumption. (Using Optimisers.setup instead will also work; they return the same thing.)","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"The new implementation of rules such as Adam in Optimisers.jl is quite different from the old one in Flux.Optimise. In Flux 0.14, Flux.Adam() returns the old one, with supertype Flux.Optimise.AbstractOptimiser, but setup will silently translate it to its new counterpart. The available rules are listed on the optimisation rules page here; see the Optimisers documentation for details on how the new rules work.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"Flux.Train.setup\nFlux.Train.train!(loss, model, data, state; cb)\nOptimisers.update!","category":"page"},{"location":"training/reference/#Flux.Train.setup","page":"Training API","title":"Flux.Train.setup","text":"opt_state = setup(rule, model)\n\nThis is a version of Optimisers.setup, and is the first step before using train!. It differs from Optimisers.setup in that it:\n\nhas one extra check for mutability (since Flux expects to mutate the model in-place, while Optimisers.jl is designed to return an updated model)\nhas methods which accept Flux's old optimisers, and convert them. (The old Flux.Optimise.Adam and new Optimisers.Adam are distinct types.)\n\ncompat: New\nThis function was added in Flux 0.13.9. 
It was not used by the old \"implicit\" interface, using Flux.Optimise module and Flux.params.\n\nExample\n\njulia> model = Dense(2=>1, leakyrelu; init=ones);\n\njulia> opt_state = Flux.setup(Momentum(0.1), model) # this encodes the optimiser and its state\n(weight = Leaf(Momentum{Float64}(0.1, 0.9), [0.0 0.0]), bias = Leaf(Momentum{Float64}(0.1, 0.9), [0.0]), σ = ())\n\njulia> x1, y1 = [0.2, -0.3], [0.4]; # use the same data for two steps:\n\njulia> Flux.train!(model, [(x1, y1), (x1, y1)], opt_state) do m, x, y\n sum(abs.(m(x) .- y)) * 100\n end\n\njulia> model.bias # was zero, mutated by Flux.train!\n1-element Vector{Float64}:\n 10.19\n\njulia> opt_state # mutated by Flux.train!\n(weight = Leaf(Momentum{Float64}(0.1, 0.9), [-2.018 3.027]), bias = Leaf(Momentum{Float64}(0.1, 0.9), [-10.09]), σ = ())\n\n\n\n\n\n","category":"function"},{"location":"training/reference/#Flux.Optimise.train!-NTuple{4, Any}","page":"Training API","title":"Flux.Optimise.train!","text":"train!(loss, model, data, opt_state)\n\nUses a loss function and training data to improve the model's parameters according to a particular optimisation rule encoded in opt_state. Iterates through data once, evaluating for each d in data either loss(model, d...) if d isa Tuple, or else loss(model, d) for other d.\n\nFor example, with these definitions...\n\ndata = [(x1, y1), (x2, y2), (x3, y3)]\n\nloss3(m, x, y) = norm(m(x) .- y) # the model is the first argument\n\nopt_state = Flux.setup(Adam(), model) # explicit setup of optimiser momenta\n\n...calling Flux.train!(loss3, model, data, opt_state) runs a loop much like this:\n\nfor d in data\n ∂L∂m = gradient(loss3, model, d...)[1]\n update!(opt_state, model, ∂L∂m)\nend\n\nYou can also write this loop yourself, if you need more flexibility. For this reason train! is not highly extensible. 
It adds only a few features to the loop above:\n\nStop with a DomainError if the loss is infinite or NaN at any point.\nShow a progress bar using @withprogress.\n\ncompat: New\nThis method was added in Flux 0.13.9. It has significant changes from the one used by Flux ≤ 0.13:\n\nIt now takes the model itself, not the result of Flux.params. (This is to move away from Zygote's \"implicit\" parameter handling, with Grads.)\nInstead of loss being a function which accepts only the data, now it must also accept the model itself, as the first argument.\nopt_state should be the result of Flux.setup. Using an optimiser such as Adam() without this step should give you a warning.\nCallback functions are not supported. (But any code can be included in the above for loop.)\n\n\n\n\n\n","category":"method"},{"location":"training/reference/#Optimisers.update!","page":"Training API","title":"Optimisers.update!","text":"Optimisers.update!(tree, model, gradient) -> (tree, model)\n\nUses the optimiser and the gradient to change the trainable parameters in the model. Returns the improved model, and the optimiser states needed for the next update. The initial tree of states comes from setup.\n\nThis is used in exactly the same manner as update, but because it may mutate arrays within the old model (and the old state), it will be faster for models of ordinary Arrays or CuArrays. However, you should not rely on the old model being fully updated but rather use the returned model. 
(The original state tree is always mutated, as each Leaf is mutable.)\n\nExample\n\njulia> using StaticArrays, Zygote, Optimisers\n\njulia> m = (x = [1f0, 2f0], y = SA[4f0, 5f0]); # partly mutable model\n\njulia> t = Optimisers.setup(Momentum(1/30, 0.9), m) # tree of states\n(x = Leaf(Momentum(0.0333333, 0.9), Float32[0.0, 0.0]), y = Leaf(Momentum(0.0333333, 0.9), Float32[0.0, 0.0]))\n\njulia> g = gradient(m -> sum(abs2.(m.x .+ m.y)), m)[1] # structural gradient\n(x = Float32[10.0, 14.0], y = Float32[10.0, 14.0])\n\njulia> t2, m2 = Optimisers.update!(t, m, g);\n\njulia> m2 # after update or update!, this is the new model\n(x = Float32[0.6666666, 1.5333333], y = Float32[3.6666667, 4.5333333])\n\njulia> m2.x === m.x # update! has re-used this array, for efficiency\ntrue\n\njulia> m # original should be discarded, may be mutated but no guarantee\n(x = Float32[0.6666666, 1.5333333], y = Float32[4.0, 5.0])\n\njulia> t == t2 # original state tree is guaranteed to be mutated\ntrue\n\n\n\n\n\n","category":"function"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"train! uses @progress which should show a progress bar in VSCode automatically. To see one in a terminal, you will need to install TerminalLoggers.jl and follow its setup instructions.","category":"page"},{"location":"training/reference/#Optimisation-Modifiers","page":"Training API","title":"Optimisation Modifiers","text":"","category":"section"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"The state returned by setup can be modified to temporarily prevent training of some parts of the model, or to change the learning rate or other hyperparameter. The functions for doing so may be accessed as Flux.freeze!, Flux.thaw!, and Flux.adjust!. 
All mutate the state (or part of it) and return nothing.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"Optimisers.adjust!\nOptimisers.freeze!\nOptimisers.thaw!","category":"page"},{"location":"training/reference/#Optimisers.adjust!","page":"Training API","title":"Optimisers.adjust!","text":"Optimisers.adjust!(tree, η)\n\nAlters the state tree = setup(rule, model) to change the parameters of the optimisation rule, without destroying its stored state. Typically used mid-way through training.\n\nCan be applied to part of a model, by acting only on the corresponding part of the state tree.\n\nTo change just the learning rate, provide a number η::Real.\n\nExample\n\njulia> m = (vec = rand(Float32, 2), fun = sin);\n\njulia> st = Optimisers.setup(Nesterov(), m) # stored momentum is initialised to zero\n(vec = Leaf(Nesterov(0.001, 0.9), Float32[0.0, 0.0]), fun = ())\n\njulia> st, m = Optimisers.update(st, m, (vec = [16, 88], fun = nothing)); # with fake gradient\n\njulia> st\n(vec = Leaf(Nesterov(0.001, 0.9), Float32[-0.016, -0.088]), fun = ())\n\njulia> Optimisers.adjust!(st, 0.123) # change learning rate, stored momentum untouched\n\njulia> st\n(vec = Leaf(Nesterov(0.123, 0.9), Float32[-0.016, -0.088]), fun = ())\n\nTo change other parameters, adjust! 
also accepts keyword arguments matching the field names of the optimisation rule's type.\n\njulia> fieldnames(Adam)\n(:eta, :beta, :epsilon)\n\njulia> st2 = Optimisers.setup(OptimiserChain(ClipGrad(), Adam()), m)\n(vec = Leaf(OptimiserChain(ClipGrad(10.0), Adam(0.001, (0.9, 0.999), 1.0e-8)), (nothing, (Float32[0.0, 0.0], Float32[0.0, 0.0], (0.9, 0.999)))), fun = ())\n\njulia> Optimisers.adjust(st2; beta = (0.777, 0.909), delta = 11.1) # delta acts on ClipGrad\n(vec = Leaf(OptimiserChain(ClipGrad(11.1), Adam(0.001, (0.777, 0.909), 1.0e-8)), (nothing, (Float32[0.0, 0.0], Float32[0.0, 0.0], (0.9, 0.999)))), fun = ())\n\njulia> Optimisers.adjust(st; beta = \"no such field\") # silently ignored!\n(vec = Leaf(Nesterov(0.123, 0.9), Float32[-0.016, -0.088]), fun = ())\n\n\n\n\n\n","category":"function"},{"location":"training/reference/#Optimisers.freeze!","page":"Training API","title":"Optimisers.freeze!","text":"Optimisers.freeze!(tree)\n\nTemporarily alters the state tree = setup(rule, model) so that parameters will not be updated. Un-done by thaw!.\n\nCan be applied to the state corresponding to only part of a model, for instance with model::Chain, to freeze model.layers[1] you should call freeze!(tree.layers[1]).\n\nExample\n\njulia> m = (x = ([1.0], 2.0), y = [3.0]);\n\njulia> s = Optimisers.setup(Momentum(), m);\n\njulia> Optimisers.freeze!(s.x)\n\njulia> Optimisers.update!(s, m, (x = ([pi], 10pi), y = [100pi])); # with fake gradient\n\njulia> m\n(x = ([1.0], 2.0), y = [-0.14159265358979312])\n\njulia> s\n(x = (Leaf(Momentum(0.01, 0.9), [0.0], frozen = true), ()), y = Leaf(Momentum(0.01, 0.9), [3.14159]))\n\njulia> Optimisers.thaw!(s)\n\njulia> s.x\n(Leaf(Momentum(0.01, 0.9), [0.0]), ())\n\n\n\n\n\n","category":"function"},{"location":"training/reference/#Optimisers.thaw!","page":"Training API","title":"Optimisers.thaw!","text":"Optimisers.thaw!(tree)\n\nThe reverse of freeze!. 
Applies to all parameters, mutating every Leaf(rule, state, frozen = true) to Leaf(rule, state, frozen = false).\n\n\n\n\n\n","category":"function"},{"location":"training/reference/#Implicit-style-(Flux-0.14)","page":"Training API","title":"Implicit style (Flux ≤ 0.14)","text":"","category":"section"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"Flux used to handle gradients, training, and optimisation rules quite differently. The new style described above is called \"explicit\" by Zygote, and the old style \"implicit\". Flux 0.13 and 0.14 are the transitional versions which support both; Flux 0.15 will remove the old.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"compat: How to upgrade\nThe blue-green boxes in the training section describe the changes needed to upgrade old code.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"For full details on the interface for implicit-style optimisers, see the Flux 0.13.6 manual.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"compat: Flux ≤ 0.12\nEarlier versions of Flux exported params, thus allowing unqualified params(model) after using Flux. This conflicted with too many other packages, and was removed in Flux 0.13. 
If you get an error UndefVarError: params not defined, this probably means that you are following code for Flux 0.12 or earlier on a more recent version.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"Flux.params\nFlux.Optimise.update!(opt::Flux.Optimise.AbstractOptimiser, xs::AbstractArray, gs)\nFlux.Optimise.train!(loss, ps::Flux.Params, data, opt::Flux.Optimise.AbstractOptimiser; cb)","category":"page"},{"location":"training/reference/#Flux.params","page":"Training API","title":"Flux.params","text":"params(model)\nparams(layers...)\n\nGiven a model or specific layers from a model, create a Params object pointing to its trainable parameters.\n\nThis can be used with the gradient function, see the training section of the manual, or as input to the Flux.train! function.\n\nThe behaviour of params on custom types can be customized using Functors.@functor or Flux.trainable.\n\nExamples\n\njulia> using Flux: params\n\njulia> params(Chain(Dense(ones(2,3)), softmax)) # unpacks Flux models\nParams([[1.0 1.0 1.0; 1.0 1.0 1.0], [0.0, 0.0]])\n\njulia> bn = BatchNorm(2, relu)\nBatchNorm(2, relu) # 4 parameters, plus 4 non-trainable\n\njulia> params(bn) # only the trainable parameters\nParams([Float32[0.0, 0.0], Float32[1.0, 1.0]])\n\njulia> params([1, 2, 3], [4]) # one or more arrays of numbers\nParams([[1, 2, 3], [4]])\n\njulia> params([[1, 2, 3], [4]]) # unpacks array of arrays\nParams([[1, 2, 3], [4]])\n\njulia> params(1, [2 2], (alpha=[3,3,3], beta=Ref(4), gamma=sin)) # ignores scalars, unpacks NamedTuples\nParams([[2 2], [3, 3, 3]])\n\n\n\n\n\n","category":"function"},{"location":"training/reference/#Optimisers.update!-Tuple{Flux.Optimise.AbstractOptimiser, AbstractArray, Any}","page":"Training API","title":"Optimisers.update!","text":"update!(opt, p, g)\nupdate!(opt, ps::Params, gs)\n\nPerform an update step of the parameters ps (or the single parameter p) according to optimiser opt::AbstractOptimiser and the 
gradients gs (the gradient g).\n\nAs a result, the parameters are mutated and the optimiser's internal state may change. The gradient could be mutated as well.\n\ncompat: Deprecated\nThis method for implicit Params (and AbstractOptimiser) will be removed from Flux 0.15. The explicit method update!(opt, model, grad) from Optimisers.jl will remain.\n\n\n\n\n\n","category":"method"},{"location":"training/reference/#Flux.Optimise.train!-Tuple{Any, Params, Any, Flux.Optimise.AbstractOptimiser}","page":"Training API","title":"Flux.Optimise.train!","text":"train!(loss, pars::Params, data, opt::AbstractOptimiser; [cb])\n\nUses a loss function and training data to improve the model's parameters according to a particular optimisation rule opt.\n\ncompat: Deprecated\nThis method with implicit Params will be removed from Flux 0.15. It should be replaced with the explicit method train!(loss, model, data, opt).\n\nFor each d in data, first the gradient of the loss is computed like this:\n\n gradient(() -> loss(d...), pars) # if d isa Tuple\n gradient(() -> loss(d), pars) # otherwise\n\nHere pars is produced by calling Flux.params on your model. (Or just on the layers you want to train, like train!(loss, params(model[1:end-2]), data, opt).) This is the \"implicit\" style of parameter handling.\n\nThis gradient is then used by optimiser opt to update the parameters:\n\n update!(opt, pars, grads)\n\nThe optimiser should be from the Flux.Optimise module (see Optimisers). Different optimisers can be combined using Flux.Optimise.Optimiser.\n\nThis training loop iterates through data once. It will stop with a DomainError if the loss is NaN or infinite.\n\nYou can use train! inside a for loop to do this several times, or use for instance IterTools.ncycle to make a longer data iterator.\n\nCallbacks\n\nCallbacks are given with the keyword argument cb. 
For example, this will print \"training\" every 10 seconds (using Flux.throttle):\n\n train!(loss, params, data, opt, cb = throttle(() -> println(\"training\"), 10))\n\nMultiple callbacks can be passed to cb as array.\n\n\n\n\n\n","category":"method"},{"location":"training/reference/#Callbacks","page":"Training API","title":"Callbacks","text":"","category":"section"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"Implicit train! takes an additional argument, cb, that's used for callbacks so that you can observe the training process. For example:","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"train!(objective, ps, data, opt, cb = () -> println(\"training\"))","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"Callbacks are called for every batch of training data. You can slow this down using Flux.throttle(f, timeout) which prevents f from being called more than once every timeout seconds.","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"A more typical callback might look like this:","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"test_x, test_y = # ... 
create single batch of test data ...\nevalcb() = @show(loss(test_x, test_y))\nthrottled_cb = throttle(evalcb, 5)\nfor epoch in 1:20\n @info \"Epoch $epoch\"\n Flux.train!(objective, ps, data, opt, cb = throttled_cb)\nend","category":"page"},{"location":"training/reference/","page":"Training API","title":"Training API","text":"See the page about callback helpers for more.","category":"page"},{"location":"models/recurrence/#Recurrent-Models","page":"Recurrence","title":"Recurrent Models","text":"","category":"section"},{"location":"models/recurrence/#Recurrent-cells","page":"Recurrence","title":"Recurrent cells","text":"","category":"section"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"To introduce Flux's recurrence functionalities, we will consider the following vanilla recurrent neural network structure:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"(Image: )","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In the above, we have a sequence of length 3, where x1 to x3 represent the input at each step (could be a timestamp or a word in a sentence), and y1 to y3 are their respective outputs.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"An aspect to recognize is that in such a model, the recurrent cells A all refer to the same structure. 
What distinguishes it from a simple dense layer is that the cell A is fed, in addition to an input x, with information from the previous state of the model (hidden state denoted as h1 & h2 in the diagram).","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In the most basic RNN case, cell A could be defined by the following: ","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"output_size = 5\ninput_size = 2\nWxh = randn(Float32, output_size, input_size)\nWhh = randn(Float32, output_size, output_size)\nb = randn(Float32, output_size)\n\nfunction rnn_cell(h, x)\n h = tanh.(Wxh * x .+ Whh * h .+ b)\n return h, h\nend\n\nx = rand(Float32, input_size) # dummy input data\nh = rand(Float32, output_size) # random initial hidden state\n\nh, y = rnn_cell(h, x)","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Notice how the above is essentially a Dense layer that acts on two inputs, h and x.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"If you run the last line a few times, you'll notice the output y changing slightly even though the input x is the same.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"There are various recurrent cells available in Flux, notably RNNCell, LSTMCell and GRUCell, which are documented in the layer reference. 
The hand-written example above can be replaced with:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"using Flux\n\nrnn = Flux.RNNCell(2, 5)\n\nx = rand(Float32, 2) # dummy data\nh = rand(Float32, 5) # initial hidden state\n\nh, y = rnn(h, x)","category":"page"},{"location":"models/recurrence/#Stateful-Models","page":"Recurrence","title":"Stateful Models","text":"","category":"section"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"For the most part, we don't want to manage hidden states ourselves, but to treat our models as being stateful. Flux provides the Recur wrapper to do this.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"x = rand(Float32, 2)\nh = rand(Float32, 5)\n\nm = Flux.Recur(rnn, h)\n\ny = m(x)","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"The Recur wrapper stores the state between runs in the m.state field.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"If we use the RNN(2, 5) constructor – as opposed to RNNCell – we'll see that it's simply a wrapped cell.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"julia> using Flux\n\njulia> RNN(2, 5) # or equivalently RNN(2 => 5)\nRecur(\n RNNCell(2 => 5, tanh), # 45 parameters\n) # Total: 4 trainable arrays, 45 parameters,\n # plus 1 non-trainable, 5 parameters, summarysize 412 bytes.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Analogous to the stateful RNN constructor, stateful LSTM and GRU constructors are also available. 
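These stateful wrappers are constructed and called just like RNN. A minimal sketch (the 2 => 5 sizes are illustrative, matching the earlier examples):

```julia
using Flux

lstm = LSTM(2 => 5)  # a Recur-wrapped LSTMCell: carries hidden and cell state
gru  = GRU(2 => 5)   # a Recur-wrapped GRUCell: carries hidden state only

x = rand(Float32, 2)

lstm(x)  # 5-element output; the internal state is updated in place
gru(x)   # likewise
```

As with RNN, calling Flux.reset!(lstm) returns the stored state to its initial value.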
","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Using these tools, we can now build the model shown in the above diagram with: ","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"julia> m = Chain(RNN(2 => 5), Dense(5 => 1))\nChain(\n Recur(\n RNNCell(2 => 5, tanh), # 45 parameters\n ),\n Dense(5 => 1), # 6 parameters\n) # Total: 6 trainable arrays, 51 parameters,\n # plus 1 non-trainable, 5 parameters, summarysize 580 bytes. ","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In this example, each output has only one component.","category":"page"},{"location":"models/recurrence/#Working-with-sequences","page":"Recurrence","title":"Working with sequences","text":"","category":"section"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Using the previously defined m recurrent model, we can now apply it to a single step from our sequence:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"julia> x = rand(Float32, 2);\n\njulia> m(x)\n1-element Vector{Float32}:\n 0.45860028","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"The m(x) operation would be represented by x1 -> A -> y1 in our diagram. If we perform this operation a second time, it will be equivalent to x2 -> A -> y2 since the model m has stored the state resulting from the x1 step.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Now, instead of computing a single step at a time, we can get the full y1 to y3 sequence in a single pass by iterating the model on a sequence of data. 
","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"To do so, we'll need to structure the input data as a Vector of observations at each time step. This Vector will therefore be of length = seq_length and each of its elements will represent the input features for a given step. In our example, this translates into a Vector of length 3, where each element is a Matrix of size (features, batch_size), or just a Vector of length features if dealing with a single observation. ","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"julia> x = [rand(Float32, 2) for i = 1:3];\n\njulia> [m(xi) for xi in x]\n3-element Vector{Vector{Float32}}:\n [0.36080405]\n [-0.13914406]\n [0.9310162]","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"warning: Use of map and broadcast\nMapping and broadcasting operations with stateful layers such as Recur are discouraged, since the Julia language doesn't guarantee a specific execution order. Therefore, avoid y = m.(x)\n# or \ny = map(m, x)and use explicit loops y = [m(x) for x in x]","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"If for some reason one wants to exclude the first step of the RNN chain from the computation of the loss, that can be handled with:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"using Flux.Losses: mse\n\nfunction loss(x, y)\n m(x[1]) # ignores the output but updates the hidden states\n sum(mse(m(xi), yi) for (xi, yi) in zip(x[2:end], y))\nend\n\ny = [rand(Float32, 1) for i=1:2]\nloss(x, y)","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In such a model, only the last two outputs are used to compute the loss, hence the target y has length 2. 
This is a strategy that can be used to easily handle a seq-to-one kind of structure, compared to the seq-to-seq assumed so far. ","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Alternatively, if one wants to perform some warmup of the sequence, it could be performed once, followed by regular training where all the steps of the sequence would be considered for the gradient update:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"function loss(x, y)\n sum(mse(m(xi), yi) for (xi, yi) in zip(x, y))\nend\n\nseq_init = [rand(Float32, 2)]\nseq_1 = [rand(Float32, 2) for i = 1:3]\nseq_2 = [rand(Float32, 2) for i = 1:3]\n\ny1 = [rand(Float32, 1) for i = 1:3]\ny2 = [rand(Float32, 1) for i = 1:3]\n\nX = [seq_1, seq_2]\nY = [y1, y2]\ndata = zip(X,Y)\n\nFlux.reset!(m)\n[m(x) for x in seq_init]\n\nps = Flux.params(m)\nopt = Adam(1e-3)\nFlux.train!(loss, ps, data, opt)","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In this example, the model's state is first reset with Flux.reset!. Then, a warmup is performed over a sequence of length 1 by feeding it seq_init, resulting in a warmup state. The model can then be trained for 1 epoch, where 2 batches are provided (seq_1 and seq_2) and the outputs at all timesteps are considered for the loss.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In this scenario, it is important to note that a single continuous sequence is considered. 
Since the model state is not reset between the 2 batches, the state of the model flows through the batches, which only makes sense in the context where seq_1 is the continuation of seq_init and so on.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Batch size would be 1 here as there's only a single sequence within each batch. If the model were to be trained on multiple independent sequences, then these sequences could be added to the input data as a second dimension. For example, in a language model, each batch would contain multiple independent sentences. In such a scenario, if we set the batch size to 4, a single batch would be of the shape:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"x = [rand(Float32, 2, 4) for i = 1:3]\ny = [rand(Float32, 1, 4) for i = 1:3]","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"That would mean that we have 4 sentences (or samples), each with 2 features (let's say a very small embedding!) and each with a length of 3 (3 words per sentence). Computing m(x[1]) would still represent x1 -> y1 in our diagram and return the first word's output, but now for each of the 4 independent sentences (second dimension of the input matrix). We do not need to use Flux.reset!(m) here; each sentence in the batch will output in its own \"column\", and the outputs of the different sentences won't mix. ","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"To illustrate, we go through an example of batching with our implementation of rnn_cell. 
The implementation doesn't need to change; the batching comes for \"free\" from the way Julia does broadcasting and the rules of matrix multiplication.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"output_size = 5\ninput_size = 2\nWxh = randn(Float32, output_size, input_size)\nWhh = randn(Float32, output_size, output_size)\nb = randn(Float32, output_size)\n\nfunction rnn_cell(h, x)\n h = tanh.(Wxh * x .+ Whh * h .+ b)\n return h, h\nend","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"Here, we use the last dimension of the input and the hidden state as the batch dimension. I.e., h[:, n] would be the hidden state of the nth sentence in the batch.","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"batch_size = 4\nx = rand(Float32, input_size, batch_size) # dummy input data\nh = rand(Float32, output_size, batch_size) # random initial hidden state\n\nh, y = rnn_cell(h, x)","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"julia> size(h) == size(y) == (output_size, batch_size)\ntrue","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"In many situations, such as when dealing with a language model, the sentences in each batch are independent (i.e. the last item of the first sentence of the first batch is independent from the first item of the first sentence of the second batch), so we cannot handle the model as if each batch was the direct continuation of the previous one. 
To handle such situations, we need to reset the state of the model between each batch, which can be conveniently performed within the loss function:","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"function loss(x, y)\n Flux.reset!(m)\n sum(mse(m(xi), yi) for (xi, yi) in zip(x, y))\nend","category":"page"},{"location":"models/recurrence/","page":"Recurrence","title":"Recurrence","text":"A potential source of ambiguity with RNN in Flux can come from the different data layout compared to some common frameworks, where data is typically a 3-dimensional array: (features, seq length, samples). In Flux, those 3 dimensions are provided as a vector of length seq_length whose elements are matrices of size (features, samples).","category":"page"},{"location":"training/callbacks/#man-callback-helpers","page":"Callback Helpers","title":"Callback Helpers","text":"","category":"section"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"Flux.throttle","category":"page"},{"location":"training/callbacks/#Flux.throttle","page":"Callback Helpers","title":"Flux.throttle","text":"throttle(f, timeout; leading=true, trailing=false)\n\nReturn a function that, when invoked, will only be triggered at most once during timeout seconds.\n\nNormally, the throttled function will run as much as it can, without ever going more than once per wait duration; but if you'd like to disable the execution on the leading edge, pass leading=false. 
To enable execution on the trailing edge, pass trailing=true.\n\nExamples\n\njulia> a = Flux.throttle(() -> println(\"Flux\"), 2);\n\njulia> for i = 1:4 # a called in alternate iterations\n a()\n sleep(1)\n end\nFlux\nFlux\n\n\n\n\n\n","category":"function"},{"location":"training/callbacks/#Patience-Helpers","page":"Callback Helpers","title":"Patience Helpers","text":"","category":"section"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"Flux provides utilities for controlling your training procedure according to some monitored condition and a maximum patience. For example, you can use early_stopping to stop training when the model is converging or deteriorating, or you can use plateau to check if the model is stagnating.","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"For example, below we create a pseudo-loss function that decreases, bottoms out, and then increases. The early stopping trigger will break the loop before the loss increases too much.","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"# create a pseudo-loss that decreases for 4 calls, then starts increasing\n# we call this like loss()\nloss = let t = 0\n () -> begin\n t += 1\n (t - 4) ^ 2\n end\nend\n\n# create an early stopping trigger\n# returns true when the loss increases for two consecutive steps\nes = early_stopping(loss, 2; init_score = 9)\n\n# this will stop at the 6th (4 decreasing + 2 increasing calls) epoch\nfor epoch in 1:10\n es() && break\nend","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"The keyword argument distance of early_stopping is a function of the form distance(best_score, score). By default distance is -, which implies that the monitored metric f is expected to be decreasing and minimized. If you use some increasing metric (e.g. 
accuracy), you can customize the distance function: (best_score, score) -> score - best_score.","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"# create a pseudo-accuracy that increases by 0.01 each time from 0 to 1\n# we call this like acc()\nacc = let v = 0\n () -> v = min(1, v + 0.01)\nend\n\n# create an early stopping trigger for accuracy\nes = early_stopping(acc, 3; distance = (best_score, score) -> score - best_score)\n\n# this will iterate until the 10th epoch\nfor epoch in 1:10\n es() && break\nend","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"early_stopping and plateau are both built on top of patience. You can use patience to build your own triggers that use a patience counter. For example, if you want to trigger when the loss is below a threshold for several consecutive iterations:","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"threshold(f, thresh, delay) = patience(delay) do\n f() < thresh\nend","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"Both predicate in patience and f in early_stopping / plateau can accept extra arguments. 
You can pass such extra arguments to predicate or f through the returned function:","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"trigger = patience((a; b) -> a > b, 3)\n\n# this will iterate until the 10th epoch\nfor epoch in 1:10\n trigger(1; b = 2) && break\nend\n\n# this will stop at the 3rd epoch\nfor epoch in 1:10\n trigger(3; b = 2) && break\nend","category":"page"},{"location":"training/callbacks/","page":"Callback Helpers","title":"Callback Helpers","text":"Flux.patience\nFlux.early_stopping\nFlux.plateau","category":"page"},{"location":"training/callbacks/#Flux.patience","page":"Callback Helpers","title":"Flux.patience","text":"patience(predicate, wait)\n\nReturn a function that internally counts by one when predicate(...) == true, otherwise the count is reset to zero. If the count is greater than or equal to wait, the function returns true, otherwise it returns false.\n\nExamples\n\njulia> loss() = rand();\n\njulia> trigger = Flux.patience(() -> loss() < 1, 3);\n\n\njulia> for i in 1:10\n @info \"Epoch $i\"\n trigger() && break\n end\n[ Info: Epoch 1\n[ Info: Epoch 2\n[ Info: Epoch 3\n\n\n\n\n\n","category":"function"},{"location":"training/callbacks/#Flux.early_stopping","page":"Callback Helpers","title":"Flux.early_stopping","text":"early_stopping(f, delay; distance = -, init_score = 0, min_dist = 0)\n\nReturn a function that internally counts by one when distance(best_score, f(...)) <= min_dist, where best_score is the last seen best value of f(...). If the count is greater than or equal to delay, the function returns true, otherwise it returns false. 
The count is reset when distance(best_score, f(...)) > min_dist.\n\nExamples\n\njulia> loss = let l = 0\n () -> l += 1\n end; # pseudo loss function that returns increasing values\n\njulia> es = Flux.early_stopping(loss, 3);\n\n\njulia> for i in 1:10\n @info \"Epoch $i\"\n es() && break\n end\n[ Info: Epoch 1\n[ Info: Epoch 2\n[ Info: Epoch 3\n\n\n\n\n\n","category":"function"},{"location":"training/callbacks/#Flux.plateau","page":"Callback Helpers","title":"Flux.plateau","text":"plateau(f, width; distance = -, init_score = 0, min_dist = 1f-6)\n\nReturn a function that internally counts by one when abs(distance(last_score, f(...))) <= min_dist, where last_score holds the last value of f(...). If the count is greater than or equal to width, the function returns true, otherwise it returns false. The count is reset when abs(distance(last_score, f(...))) > min_dist.\n\nExamples\n\njulia> f = let v = 10\n () -> v = v / abs(v) - v\n end; # -9, 8, -7, 6, ...\n\njulia> trigger = Flux.plateau(f, 3; init_score=10, min_dist=18);\n\n\njulia> for i in 1:10\n @info \"Epoch $i\"\n trigger() && break\n end\n[ Info: Epoch 1\n[ Info: Epoch 2\n[ Info: Epoch 3\n[ Info: Epoch 4\n\n\n\n\n\n","category":"function"},{"location":"tutorials/2020-09-15-deep-learning-flux/#man-blitz","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This is a quick intro to Flux loosely based on PyTorch's tutorial. It introduces basic Julia programming, as well as Zygote, a source-to-source automatic differentiation (AD) framework in Julia. 
We'll use these tools to build a very simple neural network.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Arrays","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Arrays","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The starting point for all of our models is the Array (sometimes referred to as a Tensor in other frameworks). This is really just a list of numbers, which might be arranged into a shape like a square. Let's write down an array with three elements.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x = [1, 2, 3]","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Here's a matrix – a square array with four elements.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x = [1 2; 3 4]","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We often work with arrays of thousands of elements, and don't usually write them down by hand. 
Here's how we can create an array of 5×3 = 15 elements, each a random number from zero to one.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x = rand(5, 3)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"There are a few functions like this; try replacing rand with ones, zeros, or randn to see what they do.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"By default, Julia stores numbers in a high-precision format called Float64. In ML we often don't need all those digits, and can ask Julia to work with Float32 instead. We can even ask for more digits using BigFloat.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x = rand(BigFloat, 5, 3)\n\nx = rand(Float32, 5, 3)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can ask the array how many elements it has.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"length(x)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Or, more specifically, what 
size it has.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"size(x)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We sometimes want to see some elements of the array on their own.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x\n\nx[2, 3]","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This means get the second row and the third column. We can also get every row of the third column.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x[:, 3]","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can add arrays, and subtract them, which adds or subtracts each element of the array.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x + x\n\nx - x","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Julia supports a feature called broadcasting, using the 
. syntax. This tiles small arrays (or single numbers) to fill bigger ones.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x .+ 1","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can see Julia tile the column vector 1:5 across all rows of the larger array.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"zeros(5,5) .+ (1:5)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The x' syntax is used to transpose a column 1:5 into an equivalent row, and Julia will tile that across columns.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"zeros(5,5) .+ (1:5)'","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can use this to make a times table.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"(1:5) .* (1:5)'","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute 
Blitz","text":"Finally, and importantly for machine learning, we can conveniently do things like matrix multiply.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"W = randn(5, 10)\nx = rand(10)\nW * x","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Julia's arrays are very powerful, and you can learn more about what they can do here.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#CUDA-Arrays","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"CUDA Arrays","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"CUDA functionality is provided separately by the CUDA package. 
If you have a GPU and CUDA available, you can run ] add CUDA in a REPL or IJulia to get it.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Once CUDA is loaded you can move any array to the GPU with the cu function, and it supports all of the above operations with the same syntax.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using CUDA\nx = cu(rand(5, 3))","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Automatic-Differentiation","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Automatic Differentiation","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"You probably learned to take derivatives in school. We start with a simple mathematical function like","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"f(x) = 3x^2 + 2x + 1\n\nf(5)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"In simple cases it's pretty easy to work out the gradient by hand – here it's 6x+2. 
But it's much easier to make Flux do the work for us!","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Flux: gradient\n\ndf(x) = gradient(f, x)[1]\n\ndf(5)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"You can try this with a few different inputs to make sure it's really the same as 6x+2. We can even do this multiple times (but the second derivative is a fairly boring 6).","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"ddf(x) = gradient(df, x)[1]\n\nddf(5)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Flux's AD can handle any Julia code you throw at it, including loops, recursion and custom layers, so long as the mathematical functions you call are differentiable. 
For example, we can differentiate a Taylor approximation to the sin function.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"mysin(x) = sum((-1)^k*x^(1+2k)/factorial(1+2k) for k in 0:5)\n\nx = 0.5\n\nmysin(x), gradient(mysin, x)\n\nsin(x), cos(x)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"You can see that the derivative we calculated is very close to cos(x), as we expect.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This gets more interesting when we consider functions that take arrays as inputs, rather than just a single number. 
For example, here's a function that takes a matrix and two vectors (the definition itself is arbitrary)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"myloss(W, b, x) = sum(W * x .+ b)\n\nW = randn(3, 5)\nb = zeros(3)\nx = rand(5)\n\ngradient(myloss, W, b, x)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Now we get gradients for each of the inputs W, b and x, which will come in handy when we want to train models.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Because ML models can contain hundreds of parameters, Flux provides a slightly different way of writing gradient. We instead pass our arrays to params to indicate that we want their derivatives. 
W and b represent the weight and bias respectively.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Flux: params\n\nW = randn(3, 5)\nb = zeros(3)\nx = rand(5)\n\ny(x) = sum(W * x .+ b)\n\ngrads = gradient(()->y(x), params([W, b]))\n\ngrads[W], grads[b]","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can now grab the gradients of W and b directly from those parameters.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This comes in handy when working with layers. A layer is just a convenient container for some parameters. For example, Dense does a linear transform for you.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Flux\n\nm = Dense(10, 5)\n\nx = rand(Float32, 10)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can easily get the parameters of any layer or model with params.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"params(m)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute
Blitz","text":"This makes it very easy to calculate the gradient for all parameters in a network, even if it has many parameters.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"x = rand(Float32, 10)\nm = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)\nl(x) = sum(Flux.crossentropy(m(x), [0.5, 0.5]))\ngrads = gradient(params(m)) do\n l(x)\nend\nfor p in params(m)\n println(grads[p])\nend","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"You don't have to use layers, but they can be convenient for many simple kinds of models and fast iteration.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The next step is to update our weights and perform optimisation. As you may be familiar, gradient descent is a simple algorithm that updates the weights using the gradients scaled by a learning rate:
weights = weights - learning_rate * gradient.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Flux.Optimise: update!, Descent\nη = 0.1\nfor p in params(m)\n update!(p, -η * grads[p])\nend","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"While this is a valid way of updating our weights, it can get more complicated as the algorithms we use get more involved.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Flux comes with a bunch of pre-defined optimisers and makes writing our own really simple. We just give it the learning rate η:","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"opt = Descent(0.01)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Training a network reduces to iterating over a dataset multiple times, performing these steps in order. Just for a quick implementation, let’s train a network that learns to predict 0.5 for every input of 10 floats. Flux defines the train!
function to do it for us.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"data, labels = rand(10, 100), fill(0.5, 2, 100)\nloss(x, y) = sum(Flux.crossentropy(m(x), y))\nFlux.train!(loss, params(m), [(data,labels)], opt)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"You don't have to use train!. In cases where arbitrary logic might be better suited, you could open up this training loop like so:","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":" for d in training_set # assuming d looks like (data, labels)\n # our super logic\n gs = gradient(params(m)) do #m is our model\n l = loss(d...)\n end\n update!(opt, params(m), gs)\n end","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Training-a-Classifier","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Training a Classifier","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Getting a real classifier to work might help cement the workflow a bit more. 
CIFAR10 is a dataset of 50k tiny training images split into 10 classes.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We will do the following steps in order:","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Load CIFAR10 training and test datasets\nDefine a Convolutional Neural Network\nDefine a loss function\nTrain the network on the training data\nTest the network on the test data","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Loading-the-Dataset","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Loading the Dataset","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Statistics\nusing Flux, Flux.Optimise\nusing MLDatasets: CIFAR10\nusing Images.ImageCore\nusing Flux: onehotbatch, onecold\nusing Base.Iterators: partition\nusing CUDA","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This image will give us an idea of what we are dealing with. 
","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"(Image: title)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"train_x, train_y = CIFAR10.traindata(Float32)\nlabels = onehotbatch(train_y, 0:9)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The train_x contains 50000 images converted to 32x32x3 arrays with the third dimension being the 3 channels (R,G,B). Let's take a look at a random image from the train_x. For this, we need to permute the dimensions to 3x32x32 and use colorview to convert it back to an image. ","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Plots\nimage(x) = colorview(RGB, permutedims(x, (3, 2, 1)))\nplot(image(train_x[:,:,:,rand(1:end)]))","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can now arrange the training data in batches of, say, 1000 and keep a validation set to track our progress. This process is called minibatch learning, which is a popular method of training large neural networks. Rather than sending the entire dataset at once, we break it down into smaller chunks (called minibatches) that are typically chosen at random, and train only on them. 
It has been shown to help escape saddle points.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The first 49k images (in batches of 1000) will be our training set, and the rest is for validation. partition handily breaks down the set we give it into consecutive parts (1000 in this case).","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"train = ([(train_x[:,:,:,i], labels[:,i]) for i in partition(1:49000, 1000)]) |> gpu\nvalset = 49001:50000\nvalX = train_x[:,:,:,valset] |> gpu\nvalY = labels[:, valset] |> gpu","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Defining-the-Classifier","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Defining the Classifier","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Now we can define our Convolutional Neural Network (CNN).","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"A convolutional neural network is one which defines a kernel and slides it across a matrix to create an intermediate representation to extract features from. 
It creates higher order features as it goes into deeper layers, making it suitable for images, where the structure of the subject is what will help us determine which class it belongs to.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"m = Chain(\n Conv((5,5), 3=>16, relu),\n MaxPool((2,2)),\n Conv((5,5), 16=>8, relu),\n MaxPool((2,2)),\n x -> reshape(x, :, size(x, 4)),\n Dense(200, 120),\n Dense(120, 84),\n Dense(84, 10),\n softmax) |> gpu","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We will use a crossentropy loss and a Momentum optimiser here. Crossentropy will be a good option when it comes to working with multiple independent classes. Momentum accumulates past gradients to smooth out our steps as we proceed with the training. It helps maintain a bit of adaptivity in our optimisation, preventing us from overshooting our desired destination.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"using Flux: crossentropy, Momentum\n\nloss(x, y) = sum(crossentropy(m(x), y))\nopt = Momentum(0.01)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We can start writing our train loop where we will keep track of some basic accuracy numbers about our model. 
We can define an accuracy function for it like so.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"accuracy(x, y) = mean(onecold(m(x), 0:9) .== onecold(y, 0:9))","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Training-the-Classifier","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Training the Classifier","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Training is where we do a bunch of the interesting operations we defined earlier, and see what our net is capable of. We will loop over the dataset 10 times and feed the inputs to the neural network and optimise.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"epochs = 10\n\nfor epoch = 1:epochs\n for d in train\n gs = gradient(params(m)) do\n l = loss(d...)\n end\n update!(opt, params(m), gs)\n end\n @show accuracy(valX, valY)\nend","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Seeing our training routine unfold gives us an idea of how the network learnt the function. 
This is not bad for a small hand-written network, trained for a limited time.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Training-on-a-GPU","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Training on a GPU","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The gpu functions you see sprinkled through this bit of the code tell Flux to move these entities to an available GPU, and subsequently train on it. No extra faffing about required! The same bit of code would work on any hardware with some small annotations like you saw here.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/#Testing-the-Network","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Testing the Network","text":"","category":"section"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We have trained the network for 10 passes over the training dataset. But we need to check if the network has learnt anything at all.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"We will check this by predicting the class label that the neural network outputs, and comparing it against the ground-truth. If the prediction is correct, we add the sample to the list of correct predictions. This will be done on a yet unseen section of data.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Okay, first step. 
Let us perform the exact same preprocessing on this set as we did on our training set.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"test_x, test_y = CIFAR10.testdata(Float32)\ntest_labels = onehotbatch(test_y, 0:9)\n\ntest = gpu.([(test_x[:,:,:,i], test_labels[:,i]) for i in partition(1:10000, 1000)])","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Next, display an image from the test set.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"plot(image(test_x[:,:,:,rand(1:end)]))","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The outputs are energies for the 10 classes. The higher the energy for a class, the more the network thinks that the image is of the particular class. 
Every column corresponds to the output of one image, with the 10 floats in the column being the energies.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Let's see how the model fared.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"ids = rand(1:10000, 5)\nrand_test = test_x[:,:,:,ids] |> gpu\nrand_truth = test_y[ids]\nm(rand_test)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This looks similar to how we would expect the results to be. At this point, it's a good idea to see how our net actually performs on the new data we have prepared.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"accuracy(test[1]...)","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"This is much better than random chance, which is 10% (since we only have 10 classes), and not bad at all for a small hand-written network like ours.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"Let's take a look at how the net performed on each class individually.","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A
60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"class_correct = zeros(10)\nclass_total = zeros(10)\nfor i in 1:10\n preds = m(test[i][1])\n lab = test[i][2]\n for j = 1:1000\n pred_class = findmax(preds[:, j])[2]\n actual_class = findmax(lab[:, j])[2]\n if pred_class == actual_class\n class_correct[pred_class] += 1\n end\n class_total[actual_class] += 1\n end\nend\n\nclass_correct ./ class_total","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"The spread seems pretty good, with certain classes performing significantly better than the others. Why should that be?","category":"page"},{"location":"tutorials/2020-09-15-deep-learning-flux/","page":"Deep Learning with Julia & Flux: A 60 Minute Blitz","title":"Deep Learning with Julia & Flux: A 60 Minute Blitz","text":"info: Info\nOriginally published at fluxml.ai on 15 November 2020. Written by Saswat Das, Mike Innes, Andrew Dinhobl, Ygor Canalli, Sudhanshu Agrawal, João Felipe Santos.","category":"page"},{"location":"tutorials/2021-01-26-mlp/#man-mlp-tutorial","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"","category":"section"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"In this example, we create a simple multi-layer perceptron (MLP) that classifies handwritten digits using the MNIST dataset. An MLP consists of at least three layers of stacked perceptrons: Input, hidden, and output. Each neuron of an MLP has parameters (weights and bias) and uses an activation function to compute its output. 
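The stacked layers described above amount to affine maps with a non-linearity in between. As a rough sketch (illustrative only, with randomly initialised weights — not the tutorial's code), the forward pass of such an MLP can be written out by hand:

```julia
# Illustrative sketch of an MLP forward pass, written out by hand.
# The sizes (784 -> 32 -> 10) match the model built in this tutorial.
using Flux  # for relu

W1, b1 = randn(Float32, 32, 784), zeros(Float32, 32)  # hidden layer parameters
W2, b2 = randn(Float32, 10, 32), zeros(Float32, 10)   # output layer parameters

mlp(x) = W2 * relu.(W1 * x .+ b1) .+ b2               # affine -> relu -> affine

x = rand(Float32, 784)   # one flattened 28x28 image
length(mlp(x))           # 10 raw scores, one per digit
```

Dense(784, 32, relu) packages exactly the W1, b1, relu part for us.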
","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"To run this example, we need the following packages:","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"using Flux, Statistics\nusing Flux.Data: DataLoader\nusing Flux: onehotbatch, onecold, logitcrossentropy, throttle, params\nusing Base.Iterators: repeated\nusing CUDA\nusing MLDatasets\nif has_cuda()\t\t# Check if CUDA is available\n @info \"CUDA is on\"\n CUDA.allowscalar(false)\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"We set default values for learning rate, batch size, epochs, and the usage of a GPU (if available) for our model:","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Base.@kwdef mutable struct Args\n rate::Float64 = 3e-4 # learning rate\n batchsize::Int = 1024 # batch size\n epochs::Int = 10 # number of epochs\n device::Function = gpu # set as gpu, if gpu available\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"If a GPU is available on our local system, then Flux uses it for computing the loss and updating the weights and biases when training our model.","category":"page"},{"location":"tutorials/2021-01-26-mlp/#Data","page":"Tutorial: Simple Multi-layer Perceptron","title":"Data","text":"","category":"section"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"We create the function getdata to load the MNIST 
train and test data sets from MLDatasets and prepare them for the training process. In addition, we create mini-batches of the data sets by loading them into DataLoader objects. ","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"function getdata(args)\n ENV[\"DATADEPS_ALWAYS_ACCEPT\"] = \"true\"\n\n # Loading Dataset\t\n xtrain, ytrain = MLDatasets.MNIST.traindata(Float32)\n xtest, ytest = MLDatasets.MNIST.testdata(Float32)\n\t\n # Reshape Data in order to flatten each image into a linear array\n xtrain = Flux.flatten(xtrain)\n xtest = Flux.flatten(xtest)\n\n # One-hot-encode the labels\n ytrain, ytest = onehotbatch(ytrain, 0:9), onehotbatch(ytest, 0:9)\n\n # Batching\n train_data = DataLoader((xtrain, ytrain), batchsize=args.batchsize, shuffle=true)\n test_data = DataLoader((xtest, ytest), batchsize=args.batchsize)\n\n return train_data, test_data\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"getdata performs the following steps:","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Loads MNIST data set: Loads the train and test set tensors. The shape of train data is 28x28x60000 and test data is 28x28x10000. \nReshapes the train and test data: Uses the flatten function to reshape the train data set into a 784x60000 array and test data set into a 784x10000 array. Notice that we reshape the data so that we can pass these as arguments for the input layer of our model (a simple MLP expects a vector as an input).\nOne-hot encodes the train and test labels: Creates a batch of one-hot vectors so we can pass the labels of the data as arguments for the loss function. 
For this example, we use the logitcrossentropy function, which expects the labels to be one-hot encoded. \nCreates batches of data: Creates two DataLoader objects (train and test) that handle data mini-batches of size 1024 (as defined above). We create these two objects so that we can pass the entire data set through the loss function at once when training our model. Also, the train DataLoader shuffles the data points during each iteration (shuffle=true).","category":"page"},{"location":"tutorials/2021-01-26-mlp/#Model","page":"Tutorial: Simple Multi-layer Perceptron","title":"Model","text":"","category":"section"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"As we mentioned above, an MLP consists of three layers that are fully connected. For this example, we define our model with the following layers and dimensions: ","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Input: It has 784 perceptrons (the MNIST image size is 28x28). We flatten the train and test data so that we can pass them as arguments to this layer.\nHidden: It has 32 perceptrons that use the relu activation function.\nOutput: It has 10 perceptrons that output the model's prediction or probability that a digit is 0 to 9. 
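Since we will train with logitcrossentropy, the output layer produces raw scores rather than probabilities. As a small illustrative sketch (not part of the tutorial's code, using stand-in random scores), turning those 10 outputs into a digit looks like this:

```julia
using Flux
using Flux: softmax, onecold

scores = randn(Float32, 10)   # stand-in for the model's 10 raw output scores
probs  = softmax(scores)      # normalise the raw scores into probabilities
digit  = onecold(probs, 0:9)  # pick the most likely digit label
sum(probs)                    # the probabilities sum to 1
```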
","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"We define our model with the build_model function: ","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"function build_model(; imgsize=(28,28,1), nclasses=10)\n return Chain(\n \t Dense(prod(imgsize), 32, relu),\n Dense(32, nclasses))\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Note that we use the functions Dense so that our model is densely (or fully) connected and Chain to chain the computation of the three layers.","category":"page"},{"location":"tutorials/2021-01-26-mlp/#Loss-functions","page":"Tutorial: Simple Multi-layer Perceptron","title":"Loss functions","text":"","category":"section"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Now, we define the loss function loss_all. It expects a DataLoader object and the model function we defined above as arguments. Notice that this function iterates through the dataloader object in mini-batches and uses the function logitcrossentropy to compute the difference between the predicted and actual values. 
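A note on why logitcrossentropy rather than crossentropy: it takes the raw model outputs (logits) and folds the softmax into the loss, which is numerically more stable. A quick illustrative check (not part of the tutorial's code, on random logits and labels):

```julia
using Flux
using Flux: logitcrossentropy, crossentropy, softmax, onehotbatch

ŷ = randn(Float32, 10, 4)           # raw logits for a mini-batch of 4 images
y = onehotbatch([0, 3, 7, 9], 0:9)  # one-hot labels for those images

# Fusing the softmax into the loss gives the same value, computed more stably:
logitcrossentropy(ŷ, y) ≈ crossentropy(softmax(ŷ), y)
```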
","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"function loss_all(dataloader, model)\n l = 0f0\n for (x,y) in dataloader\n l += logitcrossentropy(model(x), y)\n end\n l/length(dataloader)\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"In addition, we define the function accuracy to report the accuracy of our model during the training process. To compute the accuracy, we need to decode the output of our model using the onecold function. ","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"function accuracy(data_loader, model)\n acc = 0\n for (x,y) in data_loader\n acc += sum(onecold(cpu(model(x))) .== onecold(cpu(y)))*1 / size(x,2)\n end\n acc/length(data_loader)\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/#Train-our-model","page":"Tutorial: Simple Multi-layer Perceptron","title":"Train our model","text":"","category":"section"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Finally, we create the train function that calls the functions we defined and trains the model.","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"function train(; kws...)\n # Initializing Model parameters \n args = Args(; kws...)\n\n # Load Data\n train_data,test_data = getdata(args)\n\n # Construct model\n m = build_model()\n train_data = args.device.(train_data)\n test_data = args.device.(test_data)\n m = args.device(m)\n loss(x,y) = logitcrossentropy(m(x), y)\n \n ## Training\n evalcb = () -> 
@show(loss_all(train_data, m))\n opt = Adam(args.rate)\n\t\n for epoch in 1:args.epochs\n @info \"Epoch $epoch\"\n Flux.train!(loss, params(m), train_data, opt, cb = evalcb)\n end\n\n @show accuracy(train_data, m)\n\n @show accuracy(test_data, m)\nend","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"train performs the following steps:","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"Initializes the model parameters: Creates the args object that contains the default values for training our model.\nLoads the train and test data: Calls the function getdata we defined above.\nConstructs the model: Builds the model and loads it, along with the train and test data sets, onto the GPU (if available).\nTrains the model: Defines the callback function evalcb to show the value of the loss_all function during the training process. Then, it sets Adam as the optimiser for training our model. 
Finally, it runs the training process for 10 epochs (as defined in the args object) and shows the accuracy value for the train and test data.","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"To see the full version of this example, see Simple multi-layer perceptron - model-zoo.","category":"page"},{"location":"tutorials/2021-01-26-mlp/#Resources","page":"Tutorial: Simple Multi-layer Perceptron","title":"Resources","text":"","category":"section"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"3Blue1Brown Neural networks videos.\nNeural Networks and Deep Learning.","category":"page"},{"location":"tutorials/2021-01-26-mlp/","page":"Tutorial: Simple Multi-layer Perceptron","title":"Tutorial: Simple Multi-layer Perceptron","text":"info: Info\nOriginally published at fluxml.ai on 26 January 2021. Written by Adarsh Kumar, Mike J Innes, Andrew Dinhobl, Jerry Ling, natema, Zhang Shitian, Liliana Badillo, Dhairya Gandhi","category":"page"},{"location":"models/activation/#man-activations","page":"Activation Functions","title":"Activation Functions from NNlib.jl","text":"","category":"section"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"These non-linearities used between layers of your model are exported by the NNlib package.","category":"page"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call σ.(xs), relu.(xs) and so on. 
Alternatively, they can be passed to a layer like Dense(784 => 1024, relu) which will handle this broadcasting.","category":"page"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"Functions like softmax are sometimes described as activation functions, but not by Flux. They must see all the outputs, and hence cannot be broadcasted. See the next page for details.","category":"page"},{"location":"models/activation/#Alphabetical-Listing","page":"Activation Functions","title":"Alphabetical Listing","text":"","category":"section"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"celu\nelu\ngelu\nhardsigmoid\nhardswish\nhardtanh\nleakyrelu\nlisht\nlogcosh\nlogsigmoid\nmish\nrelu\nrelu6\nrrelu\nselu\nsigmoid\nsigmoid_fast\nsoftplus\nsoftshrink\nsoftsign\nswish\ntanhshrink\ntanh_fast\ntrelu","category":"page"},{"location":"models/activation/#NNlib.celu","page":"Activation Functions","title":"NNlib.celu","text":"celu(x, α=1) = x ≥ 0 ? x : α * (exp(x/α) - 1)\n\nActivation function from \"Continuously Differentiable Exponential Linear Units\".\n\njulia> lineplot(celu, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉│ celu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡧⠶⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⠤⠔⠒⠋⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⠤⠤⠤⠤⠔⠒⠒⠒⠊⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> celu(-10f0)\n-0.9999546f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.elu","page":"Activation Functions","title":"NNlib.elu","text":"elu(x, α=1) = x > 0 ? x : α * (exp(x) - 1)\n\nExponential Linear Unit activation function. 
See \"Fast and Accurate Deep Network Learning by Exponential Linear Units\". You can also specify the coefficient explicitly, e.g. elu(x, 1).\n\njulia> lineplot(elu, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉│ elu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡧⠶⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⠤⠔⠒⠋⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⠤⠤⠤⠤⠔⠒⠒⠒⠊⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> elu(-10f0)\n-0.9999546f0\n\njulia> elu(-10f0, 2)\n-1.9999092f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.gelu","page":"Activation Functions","title":"NNlib.gelu","text":"gelu(x) = 0.5x * (1 + tanh(√(2/π) * (x + 0.044715x^3)))\n\nActivation function from \"Gaussian Error Linear Units\".\n\njulia> lineplot(gelu, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊│ gelu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠊⠁⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⣀⡠⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⣤⣤⣤⣤⣤⣤⣤⣤⡤⠤⠤⠤⠤⠤⠤⠤⣤⣤⣤⡤⡧⠶⠶⠭⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠉⠉⠉⠉⠉⠉⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot(gelu, -5, 0, height=7);\n\njulia> lineplot!(ans, swish)\n ┌────────────────────────────────────────┐ \n 0 │⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠒⠒⠤⣄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸│ gelu(x) \n │⠑⠒⠢⠤⣄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇│ swish(x)\n │⠀⠀⠀⠀⠀⠈⠉⠒⠤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣸⠁│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠒⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⢄⠀⠀⠀⠀⠀⠀⠀⠀⢠⡇⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠓⣄⠀⠀⠀⠀⠀⢠⡞⠀⠀│ \n 
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠓⢄⣀⣀⡤⢣⠃⠀⠀│ \n -0.2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢠⠇⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀0⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.hardsigmoid","page":"Activation Functions","title":"NNlib.hardsigmoid","text":"hardσ(x) = max(0, min(1, (x + 3) / 6))\n\nPiecewise linear approximation of sigmoid.\n\njulia> lineplot(hardsigmoid, -5, 5, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡠⠖⠋⠉⠉⠉⠉⠉⠉⠉⠉│ hardσ(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⣀⡤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⡠⠔⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⡗⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠊⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠋⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⠤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot(sigmoid, -5, 5, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡠⠤⠖⠒⠒⠋⠉⠉⠉⠉⠉⠉│ σ(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⣀⠔⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⡏⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡔⠋⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠊⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⠤⠤⠤⠒⠊⠉⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.hardswish","page":"Activation Functions","title":"NNlib.hardswish","text":"hardswish(x) = x * hardσ(x)\n\nHard-Swish activation function. 
See \"Searching for MobileNetV3\".\n\njulia> lineplot(hardswish, -2, 5, height = 7)\n ┌────────────────────────────────────────┐ \n 5 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠔⠒⠉│ hardswish(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠔⠒⠉⠁⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠖⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⢀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣇⣤⣤⣖⣚⣉⣁⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀│ \n -1 │⠉⠒⠒⠒⠒⠉⠉⠉⠉⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot(hardswish, -4, 0, height = 7);\n\njulia> lineplot!(ans, swish)\n ┌────────────────────────────────────────┐ \n 0 │⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⢣⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡜│ hardswish(x)\n │⠒⠒⠢⠤⢄⣀⡀⠀⠀⠀⠀⠱⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⠎⠀│ swish(x) \n │⠀⠀⠀⠀⠀⠀⠈⠉⠑⠒⠦⢄⣘⢄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡴⠃⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠑⡖⠦⢄⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⢔⠏⠁⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠣⣄⠀⠉⠑⠒⠦⠤⢄⣀⣀⣀⣀⡠⠤⠖⣊⠕⠁⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠓⠤⡀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠁⠀⠀⠀⠀⠀⠀⠀│ \n -0.4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠒⠢⠤⠤⠔⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-4⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀0⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> hardswish.(-5:5)'\n1×11 adjoint(::Vector{Float64}) with eltype Float64:\n -0.0 -0.0 -0.0 -0.333333 -0.333333 0.0 0.666667 1.66667 3.0 4.0 5.0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.hardtanh","page":"Activation Functions","title":"NNlib.hardtanh","text":"hardtanh(x) = max(-1, min(1, x))\n\nSegment-wise linear approximation of tanh, much cheaper to compute. 
See \"Large Scale Machine Learning\".\n\nSee also tanh_fast.\n\njulia> lineplot(hardtanh, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⠔⠋⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│ hardtanh(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⣀⡤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⢀⡤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡷⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠖⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠖⠋⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⠔⠋⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x\n\njulia> lineplot(tanh, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠤⠤⠒⠒⠒⠊⠉⠉⠉│ tanh(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⢀⡤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⡷⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠔⠊⠁⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⣀⣀⣀⡠⠤⠤⠤⠖⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.leakyrelu","page":"Activation Functions","title":"NNlib.leakyrelu","text":"leakyrelu(x, a=0.01) = max(a*x, x)\n\nLeaky Rectified Linear Unit activation function. You can also specify the coefficient explicitly, e.g. 
leakyrelu(x, 0.01).\n\njulia> lineplot(x -> leakyrelu(x, 0.5), -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉│ #42(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠉⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⣤⣤⡤⡧⠶⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⠤⠤⠒⠒⠋⠉⠁⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⣀⣀⠤⠤⠒⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> leakyrelu(-10f0, 0.2)\n-2.0f0\n\njulia> leakyrelu(-10f0, 0.02)\n-0.5f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.lisht","page":"Activation Functions","title":"NNlib.lisht","text":"lisht(x) = x * tanh(x)\n\nActivation function from \"LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent ...\"\n\njulia> lineplot(lisht, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔│ lisht(x)\n │⠀⠈⠑⢦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀│ \n │⠀⠀⠀⠀⠈⠣⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠁⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠑⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠊⠁⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠢⡄⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⠔⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⢄⡀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⢀⡠⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠦⣄⣀⣀⣇⣀⣀⠤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot!(ans, logcosh)\n ┌────────────────────────────────────────┐ \n 2 │⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔│ lisht(x) \n │⠀⠈⠑⢦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀│ logcosh(x)\n │⠢⣄⠀⠀⠈⠣⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠁⠀⠀⣀⠔│ \n f(x) │⠀⠈⠑⠢⣀⠀⠀⠑⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠊⠁⠀⣀⠔⠊⠁⠀│ \n │⠀⠀⠀⠀⠀⠉⠢⢄⡀⠉⠢⡄⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⠔⠋⠀⡠⠔⠋⠁⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⠦⣌⡓⢄⡀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⢀⡠⠖⣁⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠓⠪⠷⣦⣄⣀⣀⣇⣀⣀⣤⠶⠕⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n 
⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.logcosh","page":"Activation Functions","title":"NNlib.logcosh","text":"logcosh(x)\n\nReturn log(cosh(x)) which is computed in a numerically stable way.\n\njulia> lineplot(logcosh, -5, 5, height=7)\n ┌────────────────────────────────────────┐ \n 5 │⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ logcosh(x)\n │⠉⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠋│ \n │⠀⠀⠀⠑⠢⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠊⠁⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠑⠦⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠊⠁⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⠦⡀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠦⡀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⢀⡤⠒⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠑⠢⢄⣀⣀⣇⣀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.logsigmoid","page":"Activation Functions","title":"NNlib.logsigmoid","text":"logσ(x)\n\nReturn log(σ(x)) which is computed in a numerically stable way.\n\njulia> lineplot(logsigmoid, -5, 5, height=7)\n ┌────────────────────────────────────────┐ \n 0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡧⠤⠔⠒⠒⠒⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│ logσ(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠊⠉⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠒⠉⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⢀⡤⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⣀⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⡤⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -6 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.mish","page":"Activation Functions","title":"NNlib.mish","text":"mish(x) = x * tanh(softplus(x))\n\nActivation function from \"Mish: A Self Regularized Non-Monotonic Neural Activation Function\".\n\njulia> lineplot(mish, -5, 5, height=7)\n 
┌────────────────────────────────────────┐ \n 5 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠖⠋│ mish(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠒⠁⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠔⠋⠁⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⢀⡠⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡤⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣧⣔⣊⣁⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀│ \n -1 │⠀⠀⠀⠀⠀⠀⠀⠀⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.relu","page":"Activation Functions","title":"NNlib.relu","text":"relu(x) = max(0, x)\n\nRectified Linear Unit activation function.\n\njulia> lineplot(relu, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠋│ relu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠊⠁⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡤⠊⠁⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡤⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⡠⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⡠⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣇⠔⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.relu6","page":"Activation Functions","title":"NNlib.relu6","text":"relu6(x) = min(max(0, x), 6)\n\nRectified Linear Unit activation function capped at 6. 
See \"Convolutional Deep Belief Networks\" from CIFAR-10.\n\njulia> lineplot(relu6, -10, 10, height=7)\n ┌────────────────────────────────────────┐ \n 6 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠎⠉⠉⠉⠉⠉⠉⠉⠉│ relu6(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⡔⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⡤⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⡠⠎⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⠖⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⡔⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⡧⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-10⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀10⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.rrelu","page":"Activation Functions","title":"NNlib.rrelu","text":"rrelu(x, lo=1/8, hi=1/3) = max(a*x, x)\n# where `a` is randomly sampled from uniform distribution `U(lo, hi)`\n\nRandomized Leaky Rectified Linear Unit activation function. See \"Empirical Evaluation of Rectified Activations\" You can also specify the bound explicitly, e.g. rrelu(x, 0.0, 1.0).\n\njulia> lineplot(rrelu, -20, 10, height=7)\n ┌────────────────────────────────────────┐ \n 10 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋│ rrelu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⢀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡤⠤⣤⣤⢤⣤⣤⠤⠤⠤⢼⠮⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⣰⢀⣆⡄⣄⡄⡠⡰⠦⠷⡜⢢⠷⠳⠢⠊⠉⠉⠀⠀⠁⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠃⠉⠙⠘⠃⠈⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -10 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-20⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀10⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> extrema(rrelu.(fill(-10f0, 1000)))\n(-3.3316886f0, -1.2548422f0)\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.selu","page":"Activation Functions","title":"NNlib.selu","text":"selu(x) = λ * (x ≥ 0 ? x : α * (exp(x) - 1))\n\nλ ≈ 1.05070...\nα ≈ 1.67326...\n\nScaled exponential linear units. 
See \"Self-Normalizing Neural Networks\".\n\njulia> lineplot(selu, -3, 2, height=7)\n ┌────────────────────────────────────────┐ \n 3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ selu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠤⠒│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⢀⣀⠤⠖⠊⠉⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⣀⡠⠤⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⣉⠭⠛⡏⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⣀⡤⠤⠒⠊⠉⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -2 │⠤⠤⠖⠒⠒⠒⠒⠒⠒⠒⠉⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> selu(-10f0)\n-1.7580194f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.sigmoid","page":"Activation Functions","title":"NNlib.sigmoid","text":"σ(x) = 1 / (1 + exp(-x))\n\nClassic sigmoid activation function. Unicode σ can be entered as \\sigma then tab, in many editors. The ascii name sigmoid is also exported.\n\nSee also sigmoid_fast.\n\njulia> using UnicodePlots\n\njulia> lineplot(sigmoid, -5, 5, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⣀⡠⠤⠖⠒⠒⠋⠉⠉⠉⠉⠉⠉│ σ(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⣀⠔⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⡏⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡔⠋⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠊⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⠤⠤⠤⠒⠊⠉⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> sigmoid === σ\ntrue\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.sigmoid_fast","page":"Activation Functions","title":"NNlib.sigmoid_fast","text":"sigmoid_fast(x)\n\nThis is a faster, and very slightly less accurate, version of sigmoid. 
For `x::Float32`, perhaps 3 times faster, and maximum errors 2 eps instead of 1.\n\nSee also tanh_fast.\n\njulia> sigmoid(0.2f0)\n0.54983395f0\n\njulia> sigmoid_fast(0.2f0)\n0.54983395f0\n\njulia> hardσ(0.2f0)\n0.53333336f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.softplus","page":"Activation Functions","title":"NNlib.softplus","text":"softplus(x) = log(exp(x) + 1)\n\nSee \"Deep Sparse Rectifier Neural Networks\", JMLR 2011.\n\njulia> lineplot(softplus, -3, 3, height=7)\n ┌────────────────────────────────────────┐ \n 4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ softplus(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠁⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠔⠊⠁⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⣀⡠⠤⠒⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⡧⠤⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⡠⠤⠤⠤⠤⠔⠒⠒⠚⠉⠉⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot!(ans, relu)\n ┌────────────────────────────────────────┐ \n 4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ softplus(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠│ relu(x) \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠⡴⠞⠋⠁│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣤⡴⠞⠋⠁⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⣀⡠⢤⡲⠝⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⡧⠤⠒⠊⣉⠥⠚⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⣠⣤⣤⣤⣤⣔⣒⣒⣚⣉⣉⣁⣀⣇⠴⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> softplus(16f0)\n16.0f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.softshrink","page":"Activation Functions","title":"NNlib.softshrink","text":"softshrink(x, λ=0.5) =\n (x ≥ λ ? x - λ : (-λ ≥ x ? 
x + λ : 0))\n\nSee \"Softshrink Activation Function\".\n\njulia> lineplot(softshrink, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀│ softshrink(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⡤⠔⠒⠉⠁│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⣀⡤⠤⠒⠋⠁⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⣤⡤⠤⠤⠤⠤⠤⠤⡧⠤⠤⠤⠤⠶⠮⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⢀⣀⠤⠖⠒⠉⠁⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⣀⠤⠔⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -2 │⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot!(ans, tanhshrink)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀│ softshrink(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣀⡤⠔⠒⣉⡡│ tanhshrink(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⣀⡤⠤⣒⣋⠥⠤⠒⠊⠉⠁⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⣤⣤⣤⣤⡤⠤⠤⠤⠤⠤⠤⡷⠶⠶⠶⠶⠶⠾⠿⠯⠭⠭⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⢀⣀⡠⠤⠖⢒⣋⠭⠗⠒⠉⠁⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠊⣉⠤⠔⠒⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -2 │⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀\n\njulia> softshrink.((-10f0, 10f0))\n(-9.5f0, 9.5f0)\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.softsign","page":"Activation Functions","title":"NNlib.softsign","text":"softsign(x) = x / (1 + |x|)\n\nSee \"Quadratic Polynomials Learn Better Image Features\" (2009).\n\njulia> lineplot(softsign, -5, 5, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⣀⣀⠤⠤⠤⠤⠤│ softsign(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⣀⡤⠖⠒⠋⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⡔⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡯⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠔⠁⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⠤⠤⠒⠋⠁⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⠒⠒⠒⠒⠒⠊⠉⠉⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n 
⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> lineplot!(ans, tanh)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⢀⡤⠖⠊⠉⠉⠉⣉⣉⣉⣉⣉⠭⠭⠭⠭⠭│ softsign(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⡔⣃⡤⠖⠒⠋⠉⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀│ tanh(x) \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣧⡞⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⡯⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡴⠃⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⠤⠤⠒⢋⠕⠁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⣒⣒⣒⣒⣒⣊⣉⣉⣉⣉⣁⣀⣀⡠⠤⠒⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-5⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀5⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> softsign(1f0)\n0.5f0\n\njulia> softsign(100f0)\n0.990099f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.swish","page":"Activation Functions","title":"NNlib.swish","text":"swish(x) = x * σ(x)\n\nSelf-gated activation function. See \"Swish: a Self-Gated Activation Function\".\n\njulia> lineplot(swish, -2, 2, height=7)\n ┌────────────────────────────────────────┐ \n 2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤│ swish(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋⠁⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠤⠖⠋⠁⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⢀⣀⡤⠔⠊⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⣤⣤⡤⡧⠴⠶⠯⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠉⠑⠒⠒⠒⠒⠒⠒⠒⠒⠒⠒⠉⠉⠉⠉⠁⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀2⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.tanhshrink","page":"Activation Functions","title":"NNlib.tanhshrink","text":"tanhshrink(x) = x - tanh(x)\n\nSee \"Tanhshrink Activation Function\".\n\njulia> lineplot(tanhshrink, -3, 3, height=7)\n ┌────────────────────────────────────────┐ \n 3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ tanhshrink(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡠⠤⠖⠊│ \n 
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⢀⣀⡠⠤⠒⠊⠉⠁⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⢤⣤⡤⠤⠤⠤⠤⠤⠤⡷⠶⠶⠶⠶⠶⠮⠭⠥⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⣀⡠⠴⠒⠊⠉⠁⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⡠⠴⠒⠊⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\njulia> tanhshrink.((-10f0, 10f0))\n(-9.0f0, 9.0f0)\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.tanh_fast","page":"Activation Functions","title":"NNlib.tanh_fast","text":"tanh_fast(x)\n\nThis is a faster but slightly less accurate version of tanh.\n\nWhere Julia's tanh function has an error under 2 eps, this may be wrong by 5 eps, a reduction by less than one decimal digit. \n\nFor x::Float32 this is usually about 10 times faster, with a smaller speedup for x::Float64. For any other number types, it just calls tanh.\n\nSee also sigmoid_fast.\n\njulia> tanh(0.5f0)\n0.46211717f0\n\njulia> tanh_fast(0.5f0)\n0.46211714f0\n\njulia> hardtanh(0.5f0)\n0.5f0\n\n\n\n\n\n","category":"function"},{"location":"models/activation/#NNlib.trelu","page":"Activation Functions","title":"NNlib.trelu","text":"trelu(x, theta=1) = x > theta ? x : 0\n\nThreshold gated rectified linear activation function. 
See \"Zero-bias autoencoders and the benefits of co-adapting features\"\n\njulia> lineplot(trelu, -2, 4, height=7)\n ┌────────────────────────────────────────┐ \n 4 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋│ trelu(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠖⠋⠁⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡠⠔⠊⠁⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠴⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⣠⠤⠒⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⡏⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n 0 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣀⣇⣀⣀⣀⣀⣀⣀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-2⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀4⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ \n\n\n\n\n\n","category":"function"},{"location":"models/activation/#One-More","page":"Activation Functions","title":"One More","text":"","category":"section"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"Julia's Base.Math also provides tanh, which can be used as an activation function.","category":"page"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"Note that many Flux layers will automatically replace this with NNlib.tanh_fast when called, as Base's tanh is slow enough to sometimes be a bottleneck.","category":"page"},{"location":"models/activation/","page":"Activation Functions","title":"Activation Functions","text":"julia> using UnicodePlots\n\njulia> lineplot(tanh, -3, 3, height=7)\n ┌────────────────────────────────────────┐ \n 1 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⣀⠤⠔⠒⠒⠉⠉⠉⠉⠉⠉⠉⠉⠉│ tanh(x)\n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⡠⠖⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⡰⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n f(x) │⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⡤⡯⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡠⠎⠁⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠴⠊⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n -1 │⣀⣀⣀⣀⣀⣀⣀⣀⣀⡤⠤⠔⠒⠉⠁⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ \n └────────────────────────────────────────┘ \n ⠀-3⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀3⠀ \n ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀x⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ 
","category":"page"},{"location":"training/training/#man-training","page":"Training","title":"Training a Flux Model","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"Training refers to the process of slowly adjusting the parameters of a model to make it work better. Besides the model itself, we will need three things:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"An objective function that evaluates how well a model is doing on some input.\nAn optimisation rule which describes how the model's parameters should be adjusted.\nSome training data to use as the input during this process.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Usually the training data is some collection of examples (or batches of examples) which are handled one-by-one. One epoch of training means that each example is used once, something like this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"# Initialise the optimiser for this model:\nopt_state = Flux.setup(rule, model)\n\nfor data in train_set\n # Unpack this element (for supervised training):\n input, label = data\n\n # Calculate the gradient of the objective\n # with respect to the parameters within the model:\n grads = Flux.gradient(model) do m\n result = m(input)\n loss(result, label)\n end\n\n # Update the parameters so as to reduce the objective,\n # according the chosen optimisation rule:\n Flux.update!(opt_state, model, grads[1])\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"This loop can also be written using the function train!, but it's helpful to undersand the pieces first:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"train!(model, train_set, opt_state) do m, x, y\n loss(m(x), 
y)\nend","category":"page"},{"location":"training/training/#Model-Gradients","page":"Training","title":"Model Gradients","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"Fist recall from the section on taking gradients that Flux.gradient(f, a, b) always calls f(a, b), and returns a tuple (∂f_∂a, ∂f_∂b). In the code above, the function f passed to gradient is an anonymous function with one argument, created by the do block, hence grads is a tuple with one element. Instead of a do block, we could have written:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"grads = Flux.gradient(m -> loss(m(input), label), model)","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Since the model is some nested set of layers, grads[1] is a similarly nested set of NamedTuples, ultimately containing gradient components. If (for example) θ = model.layers[1].weight[2,3] is one scalar parameter, an entry in a matrix of weights, then the derivative of the loss with respect to it is ∂f_∂θ = grads[1].layers[1].weight[2,3].","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"It is important that the execution of the model takes place inside the call to gradient, in order for the influence of the model's parameters to be observed by Zygote.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"It is also important that every update! step receives a newly computed gradient, as it will change whenever the model's parameters are changed, and for each new data point.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"compat: Implicit gradients\nFlux ≤ 0.14 used Zygote's \"implicit\" mode, in which gradient takes a zero-argument function. 
It looks like this:pars = Flux.params(model)\ngrad = gradient(() -> loss(model(input), label), pars)Here pars::Params and grad::Grads are two dictionary-like structures. Support for this will be removed from Flux 0.15, and these blue (teal?) boxes explain what needs to change.","category":"page"},{"location":"training/training/#Loss-Functions","page":"Training","title":"Loss Functions","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"The objective function must return a number representing how far the model is from the desired result. This is termed the loss of the model.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"This number can be produced by any ordinary Julia code, but this must be executed within the call to gradient. For instance, we could define a function","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"loss(y_hat, y) = sum((y_hat .- y).^2)","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"or write this directly inside the do block above. Many commonly used functions, like mse for mean-squared error or crossentropy for cross-entropy loss, are available from the Flux.Losses module.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"compat: Implicit-style loss functions\nFlux ≤ 0.14 needed a loss function which closed over a reference to the model, instead of being a pure function. 
Thus in old code you may see something likeloss(x, y) = sum((model(x) .- y).^2)which defines a function making reference to a particular global variable model.","category":"page"},{"location":"training/training/#Optimisation-Rules","page":"Training","title":"Optimisation Rules","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"The simplest kind of optimisation using the gradient is termed gradient descent (or sometimes stochastic gradient descent when, as here, it is not applied to the entire dataset at once).","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Gradient descent needs a learning rate which is a small number describing how fast to walk downhill, usually written as the Greek letter \"eta\", η. This is often described as a hyperparameter, to distinguish it from the parameters which are being updated θ = θ - η * ∂loss_∂θ. We want to update all the parameters in the model, like this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"η = 0.01 # learning rate\n\n# For each parameter array, update\n# according to the corresponding gradient:\nfmap(model, grads[1]) do p, g\n p .= p .- η .* g\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"A slightly more refined version of this loop to update all the parameters is wrapped up as a function update!(opt_state, model, grads[1]). And the learning rate is the only thing stored in the Descent struct.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"However, there are many other optimisation rules, which adjust the step size and direction in various clever ways. Most require some memory of the gradients from earlier steps, rather than always walking straight downhill – Momentum is the simplest. 
The function setup creates the necessary storage for this, for a particular model. It should be called once, before training, and returns a tree-like object which is the first argument of update!. Like this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"# Initialise momentum \nopt_state = Flux.setup(Momentum(0.01, 0.9), model)\n\nfor data in train_set\n grads = [...]\n\n # Update both model parameters and optimiser state:\n Flux.update!(opt_state, model, grads[1])\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Many commonly-used optimisation rules, such as Adam, are built-in. These are listed on the optimisers page.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"compat: Implicit-style optimiser state\nThis setup makes another tree-like structure. Old versions of Flux did not do this, and instead stored a dictionary-like structure within the optimiser Adam(0.001). This was initialised on first use of the version of update! for \"implicit\" parameters.","category":"page"},{"location":"training/training/#Datasets-and-Batches","page":"Training","title":"Datasets & Batches","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"The loop above iterates through train_set, expecting at each step a tuple (input, label). 
The very simplest such object is a vector of tuples, such as this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"x = randn(28, 28)\ny = rand(10)\ndata = [(x, y)]","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"or data = [(x, y), (x, y), (x, y)] for the same values three times.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Very often, the initial data is large arrays which you need to slice into examples. To produce one iterator of pairs (x, y), you might want zip:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"X = rand(28, 28, 60_000); # many images, each 28 × 28\nY = rand(10, 60_000)\ndata = zip(eachslice(X; dims=3), eachcol(Y))\n\nfirst(data) isa Tuple{AbstractMatrix, AbstractVector} # true","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Here each iteration will use one matrix x (an image, perhaps) and one vector y. It is very common to instead train on batches of such inputs (or mini-batches, the two words mean the same thing) both for efficiency and for better results. This can be easily done using the DataLoader:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"data = Flux.DataLoader((X, Y), batchsize=32)\n\nx1, y1 = first(data)\nsize(x1) == (28, 28, 32)\nlength(data) == 1875 === 60_000 ÷ 32","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Flux's layers are set up to accept such a batch of input data, and the convolutional layers such as Conv require it. 
The batch index is always the last dimension.","category":"page"},{"location":"training/training/#Training-Loops","page":"Training","title":"Training Loops","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"Simple training loops like the one above can be written compactly using the train! function. Including setup, this reads:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"opt_state = Flux.setup(Adam(), model)\n\nfor epoch in 1:100\n Flux.train!(model, train_set, opt_state) do m, x, y\n loss(m(x), y)\n end\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Or explicitly writing the anonymous function which this do block creates, train!((m,x,y) -> loss(m(x),y), model, train_set, opt_state) is exactly equivalent.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"compat: Implicit-style `train!`\nThis is a new method of train!, which takes the result of setup as its 4th argument. The 1st argument is a function which accepts the model itself. Flux versions ≤ 0.14 provided a method of train! for \"implicit\" parameters, which works like this:train!((x,y) -> loss(model(x), y), Flux.params(model), train_set, Adam())","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Real training loops often need more flexibility, and the best way to do this is just to write the loop. This is ordinary Julia code, without any need to work through some callback API. 
Here is an example, in which it may be helpful to note:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"The function withgradient is like gradient but also returns the value of the function, for logging or diagnostic use.\nLogging or printing is best done outside of the gradient call, as there is no need to differentiate these commands.\nJulia's break and continue keywords let you exit from parts of the loop.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"opt_state = Flux.setup(Adam(), model)\n\nmy_log = []\nfor epoch in 1:100\n losses = Float32[]\n for (i, data) in enumerate(train_set)\n input, label = data\n\n val, grads = Flux.withgradient(model) do m\n # Any code inside here is differentiated.\n # Evaluation of the model and loss must be inside!\n result = m(input)\n my_loss(result, label)\n end\n\n # Save the loss from the forward pass. (Done outside of gradient.)\n push!(losses, val)\n\n # Detect loss of Inf or NaN. Print a warning, and then skip update!\n if !isfinite(val)\n @warn \"loss is $val on item $i\" epoch\n continue\n end\n\n Flux.update!(opt_state, model, grads[1])\n end\n\n # Compute some accuracy, and save details as a NamedTuple\n acc = my_accuracy(model, train_set)\n push!(my_log, (; acc, losses))\n\n # Stop training when some criterion is reached\n if acc > 0.95\n println(\"stopping after $epoch epochs\")\n break\n end\nend","category":"page"},{"location":"training/training/#Regularisation","page":"Training","title":"Regularisation","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"The term regularisation covers a wide variety of techniques aiming to improve the result of training. 
This is often done to avoid overfitting.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Some of these can be implemented by simply modifying the loss function. L₂ regularisation (sometimes called ridge regression) adds to the loss a penalty proportional to θ^2 for every scalar parameter. For a very simple model, this could be implemented as follows:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"grads = Flux.gradient(densemodel) do m\n result = m(input)\n penalty = sum(abs2, m.weight)/2 + sum(abs2, m.bias)/2\n my_loss(result, label) + 0.42 * penalty\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Accessing each individual parameter array by hand won't work well for large models. Instead, we can use Flux.params to collect all of them, and then apply a function to each one, and sum the result:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"pen_l2(x::AbstractArray) = sum(abs2, x)/2\n\ngrads = Flux.gradient(model) do m\n result = m(input)\n penalty = sum(pen_l2, Flux.params(m))\n my_loss(result, label) + 0.42 * penalty\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"However, the gradient of this penalty term is very simple: It is proportional to the original weights. So there is a simpler way to implement exactly the same thing, by modifying the optimiser instead of the loss function. 
This is done by replacing this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"opt_state = Flux.setup(Adam(0.1), model)","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"with this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"decay_opt_state = Flux.setup(OptimiserChain(WeightDecay(0.42), Adam(0.1)), model)","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Flux's optimisers are really modifications applied to the gradient before using it to update the parameters, and OptimiserChain applies two such modifications. The first, WeightDecay, adds 0.42 times the original parameter to the gradient, matching the gradient of the penalty above (with the same, unrealistically large, constant). After that, in either case, Adam computes the final update.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"The same OptimiserChain mechanism can be used for other purposes, such as gradient clipping with ClipGrad or ClipNorm.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Besides L2 / weight decay, another common and quite different kind of regularisation is provided by the Dropout layer. This turns off some outputs of the previous layer during training. It should switch automatically, but see trainmode! / testmode! to manually enable or disable this layer.","category":"page"},{"location":"training/training/#Freezing-and-Schedules","page":"Training","title":"Freezing & Schedules","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"For finer control of training, you may wish to alter the learning rate mid-way through training. 
This can be done with adjust!, like this:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"opt_state = Flux.setup(Adam(0.1), model) # initialise once\n\nfor epoch in 1:1000\n train!([...], state) # Train with η = 0.1 for first 100,\n if epoch == 100 # then change to use η = 0.01 for the rest.\n Flux.adjust!(opt_state, 0.01)\n end\nend","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"compat: Flux ≤ 0.14\nWith the old \"implicit\" optimiser, opt = Adam(0.1), the equivalent was to directly mutate the Adam struct, opt.eta = 0.001. ","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Other hyper-parameters can also be adjusted, such as Flux.adjust!(opt_state, beta = (0.8, 0.99)). And such modifications can be applied to just one part of the model. For instance, this sets a different learning rate for the encoder and the decoder:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"# Consider some model with two parts:\nbimodel = Chain(enc = [...], dec = [...])\n\n# This returns a tree whose structure matches the model:\nopt_state = Flux.setup(Adam(0.02), bimodel)\n\n# Adjust the learning rate to be used for bimodel.layers.enc\nFlux.adjust!(opt_state.layers.enc, 0.03)","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"To completely disable training of some part of the model, use freeze!. 
This is a temporary modification, reversed by thaw!:","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"Flux.freeze!(opt_state.layers.enc)\n\n# Now training won't update parameters in bimodel.layers.enc\ntrain!(loss, bimodel, data, opt_state)\n\n# Un-freeze the entire model:\nFlux.thaw!(opt_state)","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"compat: Flux ≤ 0.14\nThe earlier \"implicit\" equivalent was to pass to gradient an object referencing only part of the model, such as Flux.params(bimodel.layers.enc).","category":"page"},{"location":"training/training/#Implicit-or-Explicit?","page":"Training","title":"Implicit or Explicit?","text":"","category":"section"},{"location":"training/training/","page":"Training","title":"Training","text":"Flux used to handle gradients, training, and optimisation rules quite differently. The new style described above is called \"explicit\" by Zygote, and the old style \"implicit\". Flux 0.13 and 0.14 are the transitional versions which support both.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"The blue-green boxes above describe the changes. For more details on training in the implicit style, see Flux 0.13.6 documentation.","category":"page"},{"location":"training/training/","page":"Training","title":"Training","text":"For details about the two gradient modes, see Zygote's documentation.","category":"page"},{"location":"gpu/#GPU-Support","page":"GPU Support","title":"GPU Support","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Starting with v0.14, Flux doesn't force a specific GPU backend and the corresponding package dependencies on its users. Thanks to the package extension mechanism introduced in Julia v1.9, Flux conditionally loads GPU-specific code once a GPU package is made available (e.g. 
through using CUDA).","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"NVIDIA GPU support requires the packages CUDA.jl and cuDNN.jl to be installed in the environment. In the Julia REPL, type ] add CUDA, cuDNN to install them. For more details see the CUDA.jl readme.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"AMD GPU support is available since Julia 1.9 on systems with ROCm and MIOpen installed. For more details refer to the AMDGPU.jl repository.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Metal GPU acceleration is available on Apple Silicon hardware. For more details refer to the Metal.jl repository. Metal support in Flux is experimental and many features are not yet available.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"In order to trigger GPU support in Flux, you need to call using CUDA, using AMDGPU or using Metal in your code. Note that for CUDA, explicitly loading cuDNN is not required, but the package has to be installed in the environment. ","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"compat: Flux ≤ 0.13\nOld versions of Flux automatically installed CUDA.jl to provide GPU support. Starting from Flux v0.14, CUDA.jl is not a dependency anymore and has to be installed manually.","category":"page"},{"location":"gpu/#Checking-GPU-Availability","page":"GPU Support","title":"Checking GPU Availability","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"By default, Flux will run the checks on your system to see if it can support GPU functionality. 
You can check if Flux identified a valid GPU setup by typing the following:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using CUDA\n\njulia> CUDA.functional()\ntrue","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"For AMD GPU:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using AMDGPU\n\njulia> AMDGPU.functional()\ntrue\n\njulia> AMDGPU.functional(:MIOpen)\ntrue","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"For Metal GPU:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Metal\n\njulia> Metal.functional()\ntrue","category":"page"},{"location":"gpu/#Selecting-GPU-backend","page":"GPU Support","title":"Selecting GPU backend","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Available GPU backends are: CUDA, AMDGPU and Metal.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Flux relies on Preferences.jl for selecting the default GPU backend to use.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"There are two ways you can specify it:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"From the REPL/code in your project, call Flux.gpu_backend!(\"AMDGPU\") and restart the Julia session (if needed) for the changes to take effect.\nIn the LocalPreferences.toml file in your project directory, specify:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"[Flux]\ngpu_backend = \"AMDGPU\"","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"The current GPU backend can be fetched from the Flux.GPU_BACKEND variable:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> 
Flux.GPU_BACKEND\n\"CUDA\"","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"The current backend will affect the behaviour of methods like the method gpu described below.","category":"page"},{"location":"gpu/#Basic-GPU-Usage","page":"GPU Support","title":"Basic GPU Usage","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Support for array operations on other hardware backends, like GPUs, is provided by external packages like CUDA.jl, AMDGPU.jl, and Metal.jl. Flux is agnostic to array types, so we simply need to move model weights and data to the GPU and Flux will handle it.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"For example, we can use CUDA.CuArray (with the cu converter) to run our basic example on an NVIDIA GPU.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"(Note that you need to have CUDA available to use CUDA.CuArray – please see the CUDA.jl instructions for more details.)","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"using CUDA\n\nW = cu(rand(2, 5)) # a 2×5 CuArray\nb = cu(rand(2))\n\npredict(x) = W*x .+ b\nloss(x, y) = sum((predict(x) .- y).^2)\n\nx, y = cu(rand(5)), cu(rand(2)) # Dummy data\nloss(x, y) # ~ 3","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Note that we convert both the parameters (W, b) and the data set (x, y) to cuda arrays. Taking derivatives and training works exactly as before.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"If you define a structured model, like a Dense layer or Chain, you just need to convert the internal parameters. 
Flux provides fmap, which allows you to alter all parameters of a model at once.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"d = Dense(10 => 5, σ)\nd = fmap(cu, d)\nd.weight # CuArray\nd(cu(rand(10))) # CuArray output\n\nm = Chain(Dense(10 => 5, σ), Dense(5 => 2), softmax)\nm = fmap(cu, m)\nm(cu(rand(10)))","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"As a convenience, Flux provides the gpu function to convert models and data to the GPU if one is available. By default, it'll do nothing. So, you can safely call gpu on some data or model (as shown below), and the code will not error, regardless of whether the GPU is available or not. If a GPU library (e.g. CUDA) loads successfully, gpu will move data from the CPU to the GPU. As is shown below, this will change the type of something like a regular array to a CuArray.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Flux, CUDA\n\njulia> m = Dense(10, 5) |> gpu\nDense(10 => 5) # 55 parameters\n\njulia> x = rand(10) |> gpu\n10-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:\n 0.066846445\n ⋮\n 0.76706964\n\njulia> m(x)\n5-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:\n -0.99992573\n ⋮\n -0.547261","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"The analogue cpu is also available for moving models and data back off of the GPU.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> x = rand(10) |> gpu\n10-element CuArray{Float32, 1, CUDA.Mem.DeviceBuffer}:\n 0.8019236\n ⋮\n 0.7766742\n\njulia> x |> cpu\n10-element Vector{Float32}:\n 0.8019236\n ⋮\n 0.7766742","category":"page"},{"location":"gpu/#Transferring-Training-Data","page":"GPU Support","title":"Transferring Training Data","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"In order 
to train the model using the GPU, both the model and the training data have to be transferred to GPU memory. Moving the data can be done in two different ways:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Iterating over the batches in a DataLoader object, transferring one training batch at a time to the GPU. This is recommended for large datasets. Done by hand, it might look like this:\ntrain_loader = Flux.DataLoader((X, Y), batchsize=64, shuffle=true)\n# ... model definition, optimiser setup\nfor epoch in 1:epochs\n for (x_cpu, y_cpu) in train_loader\n x = gpu(x_cpu)\n y = gpu(y_cpu)\n grads = gradient(m -> loss(m, x, y), model)\n Flux.update!(opt_state, model, grads[1])\n end\nend\nRather than write this out every time, you can just call gpu(::DataLoader):\ngpu_train_loader = Flux.DataLoader((X, Y), batchsize=64, shuffle=true) |> gpu\n# ... model definition, optimiser setup\nfor epoch in 1:epochs\n for (x, y) in gpu_train_loader\n grads = gradient(m -> loss(m, x, y), model)\n Flux.update!(opt_state, model, grads[1])\n end\nend\nThis is equivalent to DataLoader(MLUtils.mapobs(gpu, (X, Y)); keywords...). Something similar can also be done with CUDA.CuIterator, gpu_train_loader = CUDA.CuIterator(train_loader). However, this only works with a limited number of data types: first(train_loader) should be a tuple (or NamedTuple) of arrays.\nTransferring all training data to the GPU at once before creating the DataLoader. 
This is usually performed for smaller datasets which are sure to fit in the available GPU memory.\ngpu_train_loader = Flux.DataLoader((X, Y) |> gpu, batchsize = 32)\n# ...\nfor epoch in 1:epochs\n for (x, y) in gpu_train_loader\n # ...\nHere (X, Y) |> gpu applies gpu to both arrays, as it recurses into structures.","category":"page"},{"location":"gpu/#Saving-GPU-Trained-Models","page":"GPU Support","title":"Saving GPU-Trained Models","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"After the training process is done, one must always transfer the trained model back to CPU memory before serializing or saving to disk. This can be done, as described in the previous section, with:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"model = cpu(model) # or model = model |> cpu","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"and then","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"using BSON\n# ...\nBSON.@save \"./path/to/trained_model.bson\" model\n\n# in this approach the cpu-transferred model (referenced by the variable `model`)\n# only exists inside the `let` statement\nlet model = cpu(model)\n # ...\n BSON.@save \"./path/to/trained_model.bson\" model\nend\n\n# is equivalent to the above, but uses `key=value` storing directive from BSON.jl\nBSON.@save \"./path/to/trained_model.bson\" model = cpu(model)","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"The reason behind this is that models trained on the GPU but not transferred back to CPU memory will expect CuArrays as input. 
In other words, Flux models expect input data coming from the same kind of device on which they were trained.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"In controlled scenarios in which the data fed to the loaded models is guaranteed to be on the GPU, there is no need to transfer them back to CPU memory. However, in production environments, where artifacts are shared among different processes, machines, or configurations, there is no guarantee that the CUDA.jl package will be available for the process performing inference on the model loaded from disk.","category":"page"},{"location":"gpu/#Disabling-CUDA-or-choosing-which-GPUs-are-visible-to-Flux","page":"GPU Support","title":"Disabling CUDA or choosing which GPUs are visible to Flux","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Sometimes it is required to control which GPUs are visible to Julia on a system with multiple GPUs or disable GPUs entirely. 
This can be achieved with an environment variable CUDA_VISIBLE_DEVICES.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"To disable all devices:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"$ export CUDA_VISIBLE_DEVICES='-1'","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"To select specific devices by device id:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"$ export CUDA_VISIBLE_DEVICES='0,1'","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"More information for conditional use of GPUs in CUDA.jl can be found in its documentation, and information about the specific use of the variable is described in the Nvidia CUDA blog post.","category":"page"},{"location":"gpu/#Using-device-objects","page":"GPU Support","title":"Using device objects","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"As a more convenient syntax, Flux allows the usage of GPU device objects which can be used to easily transfer models to GPUs (and defaulting to using the CPU if no GPU backend is available). This syntax has a few advantages including automatic selection of the GPU backend and type stability of data movement. To do this, the Flux.get_device function can be used.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Flux.get_device first checks for a GPU preference, and if possible returns a device for the preference backend. 
For instance, consider the following example, where we load the CUDA.jl package to use an NVIDIA GPU (\"CUDA\" is the default preference):","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Flux, CUDA;\n\njulia> device = Flux.get_device(; verbose=true) # returns handle to an NVIDIA GPU\n[ Info: Using backend set in preferences: CUDA.\n(::Flux.FluxCUDADevice) (generic function with 1 method)\n\njulia> device.deviceID # check the id of the GPU\nCuDevice(0): NVIDIA GeForce GTX 1650\n\njulia> model = Dense(2 => 3);\n\njulia> model.weight # the model initially lives in CPU memory\n3×2 Matrix{Float32}:\n -0.984794 -0.904345\n 0.720379 -0.486398\n 0.851011 -0.586942\n\njulia> model = model |> device # transfer model to the GPU\nDense(2 => 3) # 9 parameters\n\njulia> model.weight\n3×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n -0.984794 -0.904345\n 0.720379 -0.486398\n 0.851011 -0.586942\n","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"The device preference can also be set via the Flux.gpu_backend! function. 
For instance, below we first set our device preference to \"CPU\":","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Flux; Flux.gpu_backend!(\"CPU\")\n┌ Info: New GPU backend set: CPU.\n└ Restart your Julia session for this change to take effect!","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Then, after restarting the Julia session, Flux.get_device returns a handle to the \"CPU\":","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Flux, CUDA; # even if CUDA is loaded, we'll still get a CPU device\n\njulia> device = Flux.get_device(; verbose=true) # get a CPU device\n[ Info: Using backend set in preferences: CPU.\n(::Flux.FluxCPUDevice) (generic function with 1 method)\n\njulia> model = Dense(2 => 3);\n\njulia> model = model |> device\nDense(2 => 3) # 9 parameters\n\njulia> model.weight # no change; model still lives on CPU\n3×2 Matrix{Float32}:\n -0.942968 0.856258\n 0.440009 0.714106\n -0.419192 -0.471838","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Clearly, this means that the same code will work for any GPU backend and the CPU. ","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"If the preference backend isn't available or isn't functional, then Flux.get_device looks for a CUDA, AMDGPU or Metal backend, and returns a corresponding device (if the backend is available and functional). Otherwise, a CPU device is returned. 
In the below example, the GPU preference is \"CUDA\":","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Flux; # preference is CUDA, but CUDA.jl not loaded\n\njulia> device = Flux.get_device(; verbose=true) # this will resort to automatic device selection\n[ Info: Using backend set in preferences: CUDA.\n┌ Warning: Trying to use backend: CUDA but it's trigger package is not loaded.\n│ Please load the package and call this function again to respect the preferences backend.\n└ @ Flux ~/fluxml/Flux.jl/src/functor.jl:637\n[ Info: Using backend: CPU.\n(::Flux.FluxCPUDevice) (generic function with 1 method)","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"For detailed information about how the backend is selected, check the documentation for Flux.get_device.","category":"page"},{"location":"gpu/#Data-movement-across-GPU-devices","page":"GPU Support","title":"Data movement across GPU devices","text":"","category":"section"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Flux also supports getting handles to specific GPU devices, and transferring models from one GPU device to another GPU device from the same backend. Let's try it out for NVIDIA GPUs. First, we list all the available devices:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> using Flux, CUDA;\n\njulia> CUDA.devices()\nCUDA.DeviceIterator() for 3 devices:\n0. GeForce RTX 2080 Ti\n1. GeForce RTX 2080 Ti\n2. 
TITAN X (Pascal)\n","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Then, let's select the device with id 0:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> device0 = Flux.get_device(\"CUDA\", 0) # the currently supported values for backend are \"CUDA\" and \"AMDGPU\"\n(::Flux.FluxCUDADevice) (generic function with 1 method)\n","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Then, let's move a simple dense layer to the GPU represented by device0:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> dense_model = Dense(2 => 3)\nDense(2 => 3) # 9 parameters\n\njulia> dense_model = dense_model |> device0;\n\njulia> dense_model.weight\n3×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 0.695662 0.816299\n -0.204763 -0.10232\n -0.955829 0.538412\n\njulia> CUDA.device(dense_model.weight) # check the GPU to which dense_model is attached\nCuDevice(0): GeForce RTX 2080 Ti\n","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Next, we'll get a handle to the device with id 1, and move dense_model to that device:","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"julia> device1 = Flux.get_device(\"CUDA\", 1)\n(::Flux.FluxCUDADevice) (generic function with 1 method)\n\njulia> dense_model = dense_model |> device1; # don't directly print the model; see warning below\n\njulia> CUDA.device(dense_model.weight)\nCuDevice(1): GeForce RTX 2080 Ti\n","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Due to a limitation in Metal.jl, currently this kind of data movement across devices is only supported for CUDA and AMDGPU backends.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"warning: Printing models after moving to a different device\nDue to a limitation in 
how GPU packages currently work, printing models on the REPL after moving them to a GPU device which is different from the current device will lead to an error.","category":"page"},{"location":"gpu/","page":"GPU Support","title":"GPU Support","text":"Flux.AbstractDevice\nFlux.FluxCPUDevice\nFlux.FluxCUDADevice\nFlux.FluxAMDGPUDevice\nFlux.FluxMetalDevice\nFlux.supported_devices\nFlux.get_device","category":"page"},{"location":"gpu/#Flux.AbstractDevice","page":"GPU Support","title":"Flux.AbstractDevice","text":"Flux.AbstractDevice <: Function\n\nAn abstract type representing device objects for different GPU backends. The currently supported backends are \"CUDA\", \"AMDGPU\", \"Metal\" and \"CPU\"; the \"CPU\" backend is the fallback case when no GPU is available. GPU extensions of Flux define subtypes of this type.\n\n\n\n\n\n","category":"type"},{"location":"gpu/#Flux.FluxCPUDevice","page":"GPU Support","title":"Flux.FluxCPUDevice","text":"Flux.FluxCPUDevice <: Flux.AbstractDevice\n\nA type representing device objects for the \"CPU\" backend for Flux. 
This is the fallback case when no GPU is available to Flux.\n\n\n\n\n\n","category":"type"},{"location":"gpu/#Flux.FluxCUDADevice","page":"GPU Support","title":"Flux.FluxCUDADevice","text":"FluxCUDADevice <: AbstractDevice\n\nA type representing device objects for the \"CUDA\" backend for Flux.\n\n\n\n\n\n","category":"type"},{"location":"gpu/#Flux.FluxAMDGPUDevice","page":"GPU Support","title":"Flux.FluxAMDGPUDevice","text":"FluxAMDGPUDevice <: AbstractDevice\n\nA type representing device objects for the \"AMDGPU\" backend for Flux.\n\n\n\n\n\n","category":"type"},{"location":"gpu/#Flux.FluxMetalDevice","page":"GPU Support","title":"Flux.FluxMetalDevice","text":"FluxMetalDevice <: AbstractDevice\n\nA type representing device objects for the \"Metal\" backend for Flux.\n\n\n\n\n\n","category":"type"},{"location":"gpu/#Flux.supported_devices","page":"GPU Support","title":"Flux.supported_devices","text":"Flux.supported_devices()\n\nGet all supported backends for Flux, in order of preference.\n\nExample\n\njulia> using Flux;\n\njulia> Flux.supported_devices()\n(\"CUDA\", \"AMDGPU\", \"Metal\", \"CPU\")\n\n\n\n\n\n","category":"function"},{"location":"gpu/#Flux.get_device","page":"GPU Support","title":"Flux.get_device","text":"Flux.get_device(; verbose=false)::Flux.AbstractDevice\n\nReturns a device object for the most appropriate backend for the current Julia session. \n\nFirst, the function checks whether a backend preference has been set via the Flux.gpu_backend! function. If so, an attempt is made to load this backend. If the corresponding trigger package has been loaded and the backend is functional, a device corresponding to the given backend is loaded. Otherwise, the backend is chosen automatically. 
To update the backend preference, use Flux.gpu_backend!.\n\nIf there is no preference, then for each of the \"CUDA\", \"AMDGPU\", \"Metal\" and \"CPU\" backends in the given order, this function checks whether the given backend has been loaded via the corresponding trigger package, and whether the backend is functional. If so, the device corresponding to the backend is returned. If no GPU backend is available, a Flux.FluxCPUDevice is returned.\n\nIf verbose is set to true, then the function prints informative log messages.\n\nExamples\n\nFor the example given below, the backend preference was set to \"AMDGPU\" via the gpu_backend! function.\n\njulia> using Flux;\n\njulia> model = Dense(2 => 3)\nDense(2 => 3) # 9 parameters\n\njulia> device = Flux.get_device(; verbose=true) # this will just load the CPU device\n[ Info: Using backend set in preferences: AMDGPU.\n┌ Warning: Trying to use backend: AMDGPU but it's trigger package is not loaded.\n│ Please load the package and call this function again to respect the preferences backend.\n└ @ Flux ~/fluxml/Flux.jl/src/functor.jl:638\n[ Info: Using backend: CPU.\n(::Flux.FluxCPUDevice) (generic function with 1 method)\n\njulia> model = model |> device\nDense(2 => 3) # 9 parameters\n\njulia> model.weight\n3×2 Matrix{Float32}:\n -0.304362 -0.700477\n -0.861201 0.67825\n -0.176017 0.234188\n\nHere is the same example, but using \"CUDA\":\n\njulia> using Flux, CUDA;\n\njulia> model = Dense(2 => 3)\nDense(2 => 3) # 9 parameters\n\njulia> device = Flux.get_device(; verbose=true)\n[ Info: Using backend set in preferences: AMDGPU.\n┌ Warning: Trying to use backend: AMDGPU but it's trigger package is not loaded.\n│ Please load the package and call this function again to respect the preferences backend.\n└ @ Flux ~/fluxml/Flux.jl/src/functor.jl:637\n[ Info: Using backend: CUDA.\n(::Flux.FluxCUDADevice) (generic function with 1 method)\n\njulia> model = model |> device\nDense(2 => 3) # 9 parameters\n\njulia> model.weight\n3×2 
CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:\n 0.820013 0.527131\n -0.915589 0.549048\n 0.290744 -0.0592499\n\n\n\n\n\nFlux.get_device(backend::String, idx::Int = 0)::Flux.AbstractDevice\n\nGet a device object for a backend specified by the string backend and idx. The currently supported values of backend are \"CUDA\", \"AMDGPU\" and \"CPU\". idx must be an integer value between 0 and the number of available devices.\n\nExamples\n\njulia> using Flux, CUDA;\n\njulia> CUDA.devices()\nCUDA.DeviceIterator() for 3 devices:\n0. GeForce RTX 2080 Ti\n1. GeForce RTX 2080 Ti\n2. TITAN X (Pascal)\n\njulia> device0 = Flux.get_device(\"CUDA\", 0)\n(::Flux.FluxCUDADevice) (generic function with 1 method)\n\njulia> device0.deviceID\nCuDevice(0): GeForce RTX 2080 Ti\n\njulia> device1 = Flux.get_device(\"CUDA\", 1)\n(::Flux.FluxCUDADevice) (generic function with 1 method)\n\njulia> device1.deviceID\nCuDevice(1): GeForce RTX 2080 Ti\n\njulia> cpu_device = Flux.get_device(\"CPU\")\n(::Flux.FluxCPUDevice) (generic function with 1 method)\n\n\n\n\n\n\n","category":"function"},{"location":"tutorials/linear_regression/#man-linear-regression","page":"Linear Regression","title":"Tutorial: Linear Regression","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Flux is a pure Julia ML stack that allows you to build predictive models. 
Here are the steps for a typical Flux program:","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Provide training and test data\nBuild a model with configurable parameters to make predictions\nIteratively train the model by tweaking the parameters to improve predictions\nVerify your model","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Under the hood, Flux uses a technique called automatic differentiation to take gradients that help improve predictions. Flux is also fully written in Julia so you can easily replace any layer of Flux with your own code to improve your understanding or satisfy special requirements.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The following page contains a step-by-step walkthrough of the linear regression algorithm in Julia using Flux! We will start by creating a simple linear regression model for dummy data and then move on to a real dataset. In the first part, we will write some parts of the model ourselves, and later replace them with Flux.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Let us start by building a simple linear regression model. This model would be trained on the data points of the form (x₁, y₁), (x₂, y₂), ... , (xₙ, yₙ). In the real world, these xs can have multiple features, and the ys denote a label. 
In our example, each x has a single feature; hence, our data would have n data points, each point mapping a single feature to a single label.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Importing the required Julia packages -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> using Flux, Plots","category":"page"},{"location":"tutorials/linear_regression/#Generating-a-dataset","page":"Linear Regression","title":"Generating a dataset","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The data usually comes from the real world, which we will be exploring in the last part of this tutorial, but we don't want to jump straight to the relatively harder part. Here we will generate the xs of our data points and map them to the respective ys using a simple function. Remember, here each x is equivalent to a feature, and each y is the corresponding label. Combining all the xs and ys would create the complete dataset.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> x = hcat(collect(Float32, -3:0.1:3)...)\n1×61 Matrix{Float32}:\n -3.0 -2.9 -2.8 -2.7 -2.6 -2.5 … 2.4 2.5 2.6 2.7 2.8 2.9 3.0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The hcat call generates a Matrix with numbers ranging from -3.0 to 3.0 with a gap of 0.1 between them. Each column of this matrix holds a single x, a total of 61 xs. The next step would be to generate the corresponding labels or the ys.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> f(x) = @. 
3x + 2;\n\njulia> y = f(x)\n1×61 Matrix{Float32}:\n -7.0 -6.7 -6.4 -6.1 -5.8 -5.5 … 9.5 9.8 10.1 10.4 10.7 11.0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The function f maps each x to a y; since x is a Matrix, the @. macro broadcasts the expression over its elements. Our data points are ready, but they are too perfect. In a real-world scenario, we would not have an f function to generate y values; instead, the labels would be added manually.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> x = x .* reshape(rand(Float32, 61), (1, 61));","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Visualizing the final data -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> plot(vec(x), vec(y), lw = 3, seriestype = :scatter, label = \"\", title = \"Generated data\", xlabel = \"x\", ylabel= \"y\");","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"(Image: linear-regression-data)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The data looks random enough now! 
The x and y values are still somewhat correlated; hence, the linear regression algorithm should work fine on our dataset.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can now proceed to build a model for our dataset!","category":"page"},{"location":"tutorials/linear_regression/#Building-a-model","page":"Linear Regression","title":"Building a model","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"A linear regression model is defined mathematically as -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"model(W, b, x) = W*x + b","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"where W is the weight matrix and b is the bias. In our case, the weight matrix (W) contains only a single element, as we have only a single feature. We can define our model in Julia using the exact same notation!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> custom_model(W, b, x) = @. W*x + b\ncustom_model (generic function with 1 method)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The @. macro allows you to perform the calculations by broadcasting the scalar quantities (for example - the bias).","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The next step would be to initialize the model parameters, which are the weight and the bias. 
There are a lot of initialization techniques available for different machine learning models, but for the sake of this example, let's draw the weight from a uniform distribution and initialize the bias as 0.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> W = rand(Float32, 1, 1)\n1×1 Matrix{Float32}:\n 0.99285793\n\njulia> b = [0.0f0]\n1-element Vector{Float32}:\n 0.0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Time to test if our model works!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> custom_model(W, b, x) |> size\n(1, 61)\n\njulia> custom_model(W, b, x)[1], y[1]\n(-1.6116865f0, -7.0f0)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"It does! But the predictions are way off. We need to train the model to improve the predictions, but before training the model we need to define the loss function. The loss function would ideally output a quantity that we will try to minimize during the entire training process. Here we will use the mean squared error (MSE) loss function.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> function custom_loss(W, b, x, y)\n ŷ = custom_model(W, b, x)\n sum((y .- ŷ).^2) / length(x)\n end;\n\njulia> custom_loss(W, b, x, y)\n23.772217f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Calling the loss function on our xs and ys shows how far our predictions (ŷ) are from the real labels. 
More precisely, it calculates the sum of the squared residuals and divides it by the total number of data points.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We have successfully defined our model and the loss function, but surprisingly, we haven't used Flux anywhere so far. Let's see how we can write the same code using Flux. ","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> flux_model = Dense(1 => 1)\nDense(1 => 1) # 2 parameters","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"A Dense(1 => 1) layer denotes a layer of one neuron with one input (one feature) and one output. This layer is exactly the same as the mathematical model we defined above! Under the hood, Flux also computes the output using the same expression. This time, though, we don't have to initialize the parameters ourselves; Flux does it for us.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> flux_model.weight, flux_model.bias\n(Float32[-1.2678515;;], Float32[0.0])","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Now we can check whether our model is behaving correctly. We can pass the complete data in one go, with each x having exactly one feature (one input) -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> flux_model(x) |> size\n(1, 61)\n\njulia> flux_model(x)[1], y[1]\n(-1.8525281f0, -7.0f0)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"It is! 
The next step would be defining the loss function using Flux's functions -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> function flux_loss(flux_model, x, y)\n ŷ = flux_model(x)\n Flux.mse(ŷ, y)\n end;\n\njulia> flux_loss(flux_model, x, y)\n22.74856f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Everything works as before! It almost feels like Flux provides us with smart wrappers for the functions we could have written on our own. Now, as the last step of this section, let's see how different the flux_model is from our custom_model. A good way to go about this would be to fix the parameters of both models to be the same. Let's change the parameters of our custom_model to match those of the flux_model -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> W = Float32[1.1412252]\n1-element Vector{Float32}:\n 1.1412252","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"To check how both the models are performing on the data, let's find out the losses using the custom_loss and flux_loss functions -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> custom_loss(W, b, x, y), flux_loss(flux_model, x, y)\n(22.74856f0, 22.74856f0)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The losses are identical! This means that our custom_model and the flux_model are equivalent, and so are the two loss functions! The difference is that Flux's Dense layer supports many other arguments that can be used to customize the layer further. 
But, for this tutorial, let us stick to our simple custom_model.","category":"page"},{"location":"tutorials/linear_regression/#Training-the-model","page":"Linear Regression","title":"Training the model","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Let's train our model using the classic Gradient Descent algorithm. According to the gradient descent algorithm, the weights and biases should be iteratively updated using the following mathematical equations -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"W = W - η * dL/dW\nb = b - η * dL/db","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Here, W is the weight matrix, b is the bias vector, η is the learning rate, dL/dW is the derivative of the loss function with respect to the weight, and dL/db is the derivative of the loss function with respect to the bias.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The derivatives are calculated using an Automatic Differentiation tool; Flux uses Zygote.jl for this purpose. Since Zygote.jl is an independent Julia package, it can be used outside of Flux as well! Refer to the documentation of Zygote.jl for more information.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Our first step would be to obtain the gradient of the loss function with respect to the weights and the biases. 
Flux re-exports Zygote's gradient function; hence, we don't need to import Zygote explicitly to use the functionality.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, y);","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can now update the parameters, following the gradient descent algorithm -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> W .= W .- 0.1 .* dLdW\n1-element Vector{Float32}:\n 1.8144473\n\njulia> b .= b .- 0.1 .* dLdb\n1-element Vector{Float32}:\n 0.41325632","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The parameters have been updated! We can now check the value of the loss function -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> custom_loss(W, b, x, y)\n17.157953f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The loss went down! This means that we successfully trained our model for one epoch. We can plug the training code written above into a loop and train the model for a higher number of epochs. It can be customized either to have a fixed number of epochs or to stop when certain conditions are met, for example, change in loss < 0.1. 
The loop can be tailored to suit the user's needs, and the conditions can be specified in plain Julia!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Let's wrap our training logic in a function and test it again -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> function train_custom_model()\n dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, y)\n @. W = W - 0.1 * dLdW\n @. b = b - 0.1 * dLdb\n end;\n\njulia> train_custom_model();\n\njulia> W, b, custom_loss(W, b, x, y)\n(Float32[2.340657], Float32[0.7516814], 13.64972f0)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"It works, and the loss went down again! This was the second epoch of our training procedure. Let's put this in a for loop and train the model for 40 more epochs.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> for i = 1:40\n train_custom_model()\n end\n\njulia> W, b, custom_loss(W, b, x, y)\n(Float32[4.2422233], Float32[2.2460847], 7.6680417f0)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"There was a significant reduction in loss, and the parameters were updated!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can train the model even more or tweak the hyperparameters to achieve the desired result faster, but let's stop here. We trained our model for 42 epochs, and the loss went down from 22.74856 to 7.6680417. 
Time for some visualization!","category":"page"},{"location":"tutorials/linear_regression/#Results","page":"Linear Regression","title":"Results","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The main objective of this tutorial was to fit a line to our dataset using the linear regression algorithm. The training procedure went well, and the loss went down significantly! Let's see what the fitted line looks like. Remember, Wx + b is nothing more than a line's equation, with slope = W[1] and y-intercept = b[1] (indexing at 1 as W and b are iterable).","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Plotting the line and the data points using Plots.jl -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> plot(reshape(x, (61, 1)), reshape(y, (61, 1)), lw = 3, seriestype = :scatter, label = \"\", title = \"Simple Linear Regression\", xlabel = \"x\", ylabel= \"y\");\n\njulia> plot!((x) -> b[1] + W[1] * x, -3, 3, label=\"Custom model\", lw=2);","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"(Image: linear-regression-line)","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The line fits well! There is room for improvement, but we leave that up to you! You can play with the optimisers, the number of epochs, learning rate, etc. 
to improve the fitting and reduce the loss!","category":"page"},{"location":"tutorials/linear_regression/#Linear-regression-model-on-a-real-dataset","page":"Linear Regression","title":"Linear regression model on a real dataset","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We now move on to a relatively complex linear regression model. Here we will use a real dataset from MLDatasets.jl, where the data points are no longer confined to a single feature. Let's start by importing the required packages -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> using Flux, Statistics, MLDatasets, DataFrames","category":"page"},{"location":"tutorials/linear_regression/#Gathering-real-data","page":"Linear Regression","title":"Gathering real data","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Let's start by initializing our dataset. We will be using the BostonHousing dataset consisting of 506 data points. Each of these data points has 13 features and a corresponding label, the house's price. The xs are still mapped to a single y, but now, a single x data point has 13 features. 
","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> dataset = BostonHousing();\n\njulia> x, y = BostonHousing(as_df=false)[:];\n\njulia> x, y = Float32.(x), Float32.(y);","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can now split the obtained data into training and testing data -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> x_train, x_test, y_train, y_test = x[:, 1:400], x[:, 401:end], y[:, 1:400], y[:, 401:end];\n\njulia> x_train |> size, x_test |> size, y_train |> size, y_test |> size\n((13, 400), (13, 106), (1, 400), (1, 106))","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"This dataset contains a diverse set of features, which have very different scales. A wise option here would be to normalise the data, making the training process faster and more efficient. Let's check the standard deviation of the training data before normalising it.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> std(x_train)\n134.06786f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The data is indeed not normalised. We can use the Flux.normalise function to normalise the training data.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> x_train_n = Flux.normalise(x_train);\n\njulia> std(x_train_n)\n1.0000844f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The standard deviation is now close to one! 
Our data is ready!","category":"page"},{"location":"tutorials/linear_regression/#Building-a-Flux-model","page":"Linear Regression","title":"Building a Flux model","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can now directly use Flux and let it do all the work internally! Let's define a model that takes in 13 inputs (13 features) and gives us a single output (the label). We will then pass our entire data through this model in one go, and Flux will handle everything for us! Remember, we could have declared a model in plain Julia as well. The model will have 14 parameters: 13 weights and 1 bias.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> model = Dense(13 => 1)\nDense(13 => 1) # 14 parameters","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Same as before, our next step would be to define a loss function to quantify our accuracy somehow. 
The lower the loss, the better the model!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> function loss(model, x, y)\n ŷ = model(x)\n Flux.mse(ŷ, y)\n end;\n\njulia> loss(model, x_train_n, y_train)\n676.1656f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can now proceed to the training phase!","category":"page"},{"location":"tutorials/linear_regression/#Training-the-Flux-model","page":"Linear Regression","title":"Training the Flux model","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The training procedure would make use of the same mathematics, but now we can pass in the model inside the gradient call and let Flux and Zygote handle the derivatives!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> function train_model()\n dLdm, _, _ = gradient(loss, model, x_train_n, y_train)\n @. model.weight = model.weight - 0.000001 * dLdm.weight\n @. model.bias = model.bias - 0.000001 * dLdm.bias\n end;","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Contrary to our last training procedure, let's say that this time we don't want to hardcode the number of epochs. We want the training procedure to stop when the loss converges, that is, when change in loss < δ. 
The quantity δ can be altered according to a user's need, but let's fix it to 10⁻⁴ for this tutorial.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"We can write such custom training loops effortlessly using Flux and plain Julia!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> loss_init = Inf;\n\njulia> while true\n train_model()\n if loss_init == Inf\n loss_init = loss(model, x_train_n, y_train)\n continue\n end\n if abs(loss_init - loss(model, x_train_n, y_train)) < 1e-4\n break\n else\n loss_init = loss(model, x_train_n, y_train)\n end\n end;","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The code starts by initialising the loss to infinity. Next, it runs an infinite loop that breaks once change in loss < 10⁻⁴; otherwise, it updates loss_init to the current loss and moves on to the next iteration.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"This custom loop works! This shows how easily a user can write down any custom training routine using Flux and Julia!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Let's have a look at the loss -","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> loss(model, x_train_n, y_train)\n27.1272f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The loss went down significantly! 
It can be minimized further by choosing an even smaller δ.","category":"page"},{"location":"tutorials/linear_regression/#Testing-the-Flux-model","page":"Linear Regression","title":"Testing the Flux model","text":"","category":"section"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The last step of this tutorial would be to test our model using the testing data. We will first normalise the testing data and then calculate the corresponding loss.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"julia> x_test_n = Flux.normalise(x_test);\n\njulia> loss(model, x_test_n, y_test)\n66.91015f0","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"The loss is not as small as the loss of the training data, but it looks good! This also shows that our model is not overfitting!","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Summarising this tutorial, we started by generating a random yet correlated dataset for our custom model. We then saw how a simple linear regression model could be built with and without Flux, and how they were almost identical. ","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"Next, we trained the model by manually writing down the Gradient Descent algorithm and optimising the loss. We also saw how Flux provides various wrapper functionalities and keeps the API extremely intuitive and simple for the users. 
","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"After getting familiar with the basics of Flux and Julia, we moved ahead to build a machine learning model for a real dataset. We repeated the exact same steps, but this time with a lot more features and data points, and by harnessing Flux's full capabilities. In the end, we developed a training loop that was smarter than the hardcoded one and ran the model on our normalised dataset to conclude the tutorial.","category":"page"},{"location":"tutorials/linear_regression/","page":"Linear Regression","title":"Linear Regression","text":"info: Info\nOriginally published on 21 November 2022, by Saransh Chopra.","category":"page"},{"location":"#Flux:-The-Julia-Machine-Learning-Library","page":"Welcome","title":"Flux: The Julia Machine Learning Library","text":"","category":"section"},{"location":"","page":"Welcome","title":"Welcome","text":"Flux is a library for machine learning. It comes \"batteries-included\" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. We follow a few key principles:","category":"page"},{"location":"","page":"Welcome","title":"Welcome","text":"Doing the obvious thing. Flux has relatively few explicit APIs. Instead, writing down the mathematical form will work – and be fast.\nExtensible by default. Flux is written to be highly flexible while being performant. Extending Flux is as simple as using your own code as part of the model you want - it is all high-level Julia code.\nPlay nicely with others. Flux works well with unrelated Julia libraries from images to differential equation solvers, rather than duplicating them.","category":"page"},{"location":"#Installation","page":"Welcome","title":"Installation","text":"","category":"section"},{"location":"","page":"Welcome","title":"Welcome","text":"Download Julia 1.9 or later, preferably the current stable release. 
You can add Flux using Julia's package manager, by typing ] add Flux in the Julia prompt. For Nvidia GPU support, you will also need to install the CUDA and the cuDNN packages. For AMD GPU support, install the AMDGPU package. For acceleration on Apple Silicon, install the Metal package.","category":"page"},{"location":"#Learning-Flux","page":"Welcome","title":"Learning Flux","text":"","category":"section"},{"location":"","page":"Welcome","title":"Welcome","text":"The quick start page trains a simple neural network.","category":"page"},{"location":"","page":"Welcome","title":"Welcome","text":"This rest of the guide provides a from-scratch introduction to Flux's take on models and how they work, starting with fitting a line. Once you understand these docs, congratulations, you also understand Flux's source code, which is intended to be concise, legible and a good reference for more advanced concepts.","category":"page"},{"location":"","page":"Welcome","title":"Welcome","text":"There are some tutorials about building particular models. The model zoo has starting points for many other common ones. And finally, the ecosystem page lists packages which define Flux models.","category":"page"},{"location":"","page":"Welcome","title":"Welcome","text":"The reference section includes, beside Flux's own functions, those of some companion packages: Zygote.jl (automatic differentiation), Optimisers.jl (training) and others.","category":"page"},{"location":"#Community","page":"Welcome","title":"Community","text":"","category":"section"},{"location":"","page":"Welcome","title":"Welcome","text":"Everyone is welcome to join our community on the Julia discourse forum, or the slack chat (channel #machine-learning). 
If you have questions or issues we'll try to help you out.","category":"page"},{"location":"","page":"Welcome","title":"Welcome","text":"If you're interested in hacking on Flux, the source code is open and easy to understand – it's all just the same Julia code you work with normally. You might be interested in our intro issues to get started, or our contributing guide.","category":"page"},{"location":"models/basics/#man-basics","page":"Gradients and Layers","title":"How Flux Works: Gradients and Layers","text":"","category":"section"},{"location":"models/basics/#man-taking-gradients","page":"Gradients and Layers","title":"Taking Gradients","text":"","category":"section"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Flux's core feature is taking gradients of Julia code. The gradient function takes another Julia function f and a set of arguments, and returns the gradient with respect to each argument. (It's a good idea to try pasting these examples in the Julia terminal.)","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"julia> using Flux\n\njulia> f(x) = 3x^2 + 2x + 1;\n\njulia> df(x) = gradient(f, x)[1]; # df/dx = 6x + 2\n\njulia> df(2)\n14.0\n\njulia> d2f(x) = gradient(df, x)[1]; # d²f/dx² = 6\n\njulia> d2f(2)\n6.0","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"When a function has many parameters, we can get gradients of each one at the same time:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"julia> f(x, y) = sum((x .- y).^2);\n\njulia> gradient(f, [2, 1], [2, 0])\n([0.0, 2.0], [-0.0, -2.0])","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"These gradients are based on x and y. 
Flux works by instead taking gradients based on the weights and biases that make up the parameters of a model.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Machine learning models often have hundreds of parameter arrays. Instead of passing them to gradient individually, we can store them together in a structure. The simplest example is a named tuple, created by the following syntax:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"julia> nt = (a = [2, 1], b = [2, 0], c = tanh);\n\njulia> g(x::NamedTuple) = sum(abs2, x.a .- x.b);\n\njulia> g(nt)\n1\n\njulia> dg_nt = gradient(g, nt)[1]\n(a = [0.0, 2.0], b = [-0.0, -2.0], c = nothing)","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Notice that gradient has returned a matching structure. The field dg_nt.a is the gradient for nt.a, and so on. Some fields have no gradient, indicated by nothing. ","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Rather than define a function like g every time (and think up a name for it), it is often useful to use anonymous functions: this one is x -> sum(abs2, x.a .- x.b). 
Anonymous functions can be defined either with -> or with do, and such do blocks are often useful if you have a few steps to perform:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"julia> gradient((x, y) -> sum(abs2, x.a ./ y .- x.b), nt, [1, 2])\n((a = [0.0, 0.5], b = [-0.0, -1.0], c = nothing), [-0.0, -0.25])\n\njulia> gradient(nt, [1, 2]) do x, y\n z = x.a ./ y\n sum(abs2, z .- x.b)\n end\n((a = [0.0, 0.5], b = [-0.0, -1.0], c = nothing), [-0.0, -0.25])","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Sometimes you may want to know the value of the function, as well as its gradient. Rather than calling the function a second time, you can call withgradient instead:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"julia> Flux.withgradient(g, nt)\n(val = 1, grad = ((a = [0.0, 2.0], b = [-0.0, -2.0], c = nothing),))","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"note: Implicit gradients\nFlux used to handle many parameters in a different way, using the params function. 
This uses a method of gradient which takes a zero-argument function, and returns a dictionary through which the resulting gradients can be looked up:julia> x = [2, 1];\n\njulia> y = [2, 0];\n\njulia> gs = gradient(Flux.params(x, y)) do\n f(x, y)\n end\nGrads(...)\n\njulia> gs[x]\n2-element Vector{Float64}:\n 0.0\n 2.0\n\njulia> gs[y]\n2-element Vector{Float64}:\n -0.0\n -2.0","category":"page"},{"location":"models/basics/#Building-Simple-Models","page":"Gradients and Layers","title":"Building Simple Models","text":"","category":"section"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Consider a simple linear regression, which tries to predict an output array y from an input x.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"W = rand(2, 5)\nb = rand(2)\n\npredict(x) = W*x .+ b\n\nfunction loss(x, y)\n ŷ = predict(x)\n sum((y .- ŷ).^2)\nend\n\nx, y = rand(5), rand(2) # Dummy data\nloss(x, y) # ~ 3","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"To improve the prediction we can take the gradients of the loss with respect to W and b and perform gradient descent.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"using Flux\n\ngs = gradient(() -> loss(x, y), Flux.params(W, b))","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Now that we have gradients, we can pull them out and update W to train the model.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"W̄ = gs[W]\n\nW .-= 0.1 .* W̄\n\nloss(x, y) # ~ 2.5","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"The loss has decreased a little, meaning that our prediction 
ŷ is closer to the target y. If we have some data we can already try training the model.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"All deep learning in Flux, however complex, is a simple generalisation of this example. Of course, models can look very different – they might have millions of parameters or complex control flow. Let's see how Flux handles more complex models.","category":"page"},{"location":"models/basics/#Building-Layers","page":"Gradients and Layers","title":"Building Layers","text":"","category":"section"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"It's common to create more complex models than the linear regression above. For example, we might want to have two linear layers with a nonlinearity like sigmoid (σ) in between them. In the above style we could write this as:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"using Flux\n\nW1 = rand(3, 5)\nb1 = rand(3)\nlayer1(x) = W1 * x .+ b1\n\nW2 = rand(2, 3)\nb2 = rand(2)\nlayer2(x) = W2 * x .+ b2\n\nmodel(x) = layer2(σ.(layer1(x)))\n\nmodel(rand(5)) # => 2-element vector","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"This works but is fairly unwieldy, with a lot of repetition – especially as we add more layers. 
One way to factor this out is to create a function that returns linear layers.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"function linear(in, out)\n W = randn(out, in)\n b = randn(out)\n x -> W * x .+ b\nend\n\nlinear1 = linear(5, 3) # we can access linear1.W etc\nlinear2 = linear(3, 2)\n\nmodel(x) = linear2(σ.(linear1(x)))\n\nmodel(rand(5)) # => 2-element vector","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Another (equivalent) way is to create a struct that explicitly represents the affine layer.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"struct Affine\n W\n b\nend\n\nAffine(in::Integer, out::Integer) =\n Affine(randn(out, in), randn(out))\n\n# Overload call, so the object can be used as a function\n(m::Affine)(x) = m.W * x .+ m.b\n\na = Affine(10, 5)\n\na(rand(10)) # => 5-element vector","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Congratulations! You just built the Dense layer that comes with Flux. 
Flux has many interesting layers available, but they're all things you could have built yourself very easily.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"(There is one small difference with Dense – for convenience it also takes an activation function, like Dense(10 => 5, σ).)","category":"page"},{"location":"models/basics/#Stacking-It-Up","page":"Gradients and Layers","title":"Stacking It Up","text":"","category":"section"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"It's pretty common to write models that look something like:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"layer1 = Dense(10 => 5, σ)\n# ...\nmodel(x) = layer3(layer2(layer1(x)))","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"For long chains, it might be a bit more intuitive to have a list of layers, like this:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"using Flux\n\nlayers = [Dense(10 => 5, σ), Dense(5 => 2), softmax]\n\nmodel(x) = foldl((x, m) -> m(x), layers, init = x)\n\nmodel(rand(10)) # => 2-element vector","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Handily, this is also provided for in Flux:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"model2 = Chain(\n Dense(10 => 5, σ),\n Dense(5 => 2),\n softmax)\n\nmodel2(rand(10)) # => 2-element vector","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"This quickly starts to look like a high-level deep learning library; yet you can see how it falls out of simple abstractions, and we lose none of the power of 
Julia code.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"A nice property of this approach is that because \"models\" are just functions (possibly with trainable parameters), you can also see this as simple function composition.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"m = Dense(5 => 2) ∘ Dense(10 => 5, σ)\n\nm(rand(10))","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Likewise, Chain will happily work with any Julia function.","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"m = Chain(x -> x^2, x -> x+1)\n\nm(5) # => 26","category":"page"},{"location":"models/basics/#Layer-Helpers","page":"Gradients and Layers","title":"Layer Helpers","text":"","category":"section"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"There is still one problem with this Affine layer, that Flux does not know to look inside it. This means that Flux.train! won't see its parameters, nor will gpu be able to move them to your GPU. These features are enabled by the @functor macro:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Flux.@functor Affine","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"Finally, most Flux layers make bias optional, and allow you to supply the function used for generating random weights. 
We can easily add these refinements to the Affine layer as follows, using the helper function create_bias:","category":"page"},{"location":"models/basics/","page":"Gradients and Layers","title":"Gradients and Layers","text":"function Affine((in, out)::Pair; bias=true, init=Flux.randn32)\n W = init(out, in)\n b = Flux.create_bias(W, bias, out)\n Affine(W, b)\nend\n\nAffine(3 => 1, bias=false, init=ones) |> gpu","category":"page"},{"location":"data/onehot/#One-Hot-Encoding-with-OneHotArrays.jl","page":"OneHotArrays.jl","title":"One-Hot Encoding with OneHotArrays.jl","text":"","category":"section"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"It's common to encode categorical variables (like true, false or cat, dog) in \"one-of-k\" or \"one-hot\" form. OneHotArrays.jl provides the onehot function to make this easy.","category":"page"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"julia> using OneHotArrays\n\njulia> onehot(:b, [:a, :b, :c])\n3-element OneHotVector(::UInt32) with eltype Bool:\n ⋅\n 1\n ⋅\n\njulia> onehot(:c, [:a, :b, :c])\n3-element OneHotVector(::UInt32) with eltype Bool:\n ⋅\n ⋅\n 1","category":"page"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"There is also a onecold function, which is an inverse of onehot. 
It can also be given an array of numbers instead of booleans, in which case it performs an argmax-like operation, returning the label with the highest corresponding weight.","category":"page"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"julia> onecold(ans, [:a, :b, :c])\n:c\n\njulia> onecold([true, false, false], [:a, :b, :c])\n:a\n\njulia> onecold([0.3, 0.2, 0.5], [:a, :b, :c])\n:c","category":"page"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"For multiple samples at once, onehotbatch creates a batch (matrix) of one-hot vectors, and onecold treats matrices as batches.","category":"page"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"julia> using OneHotArrays\n\njulia> onehotbatch([:b, :a, :b], [:a, :b, :c])\n3×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n ⋅ 1 ⋅\n 1 ⋅ 1\n ⋅ ⋅ ⋅\n\njulia> onecold(ans, [:a, :b, :c])\n3-element Vector{Symbol}:\n :b\n :a\n :b","category":"page"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"Note that these operations returned OneHotVector and OneHotMatrix rather than Arrays. OneHotVectors behave like normal vectors but avoid any unnecessary cost compared to using an integer index directly. 
For example, multiplying a matrix with a one-hot vector simply slices out the relevant row of the matrix under the hood.","category":"page"},{"location":"data/onehot/#Function-listing","page":"OneHotArrays.jl","title":"Function listing","text":"","category":"section"},{"location":"data/onehot/","page":"OneHotArrays.jl","title":"OneHotArrays.jl","text":"OneHotArrays.onehot\nOneHotArrays.onecold\nOneHotArrays.onehotbatch\nOneHotArrays.OneHotArray\nOneHotArrays.OneHotVector\nOneHotArrays.OneHotMatrix","category":"page"},{"location":"data/onehot/#OneHotArrays.onehot","page":"OneHotArrays.jl","title":"OneHotArrays.onehot","text":"onehot(x, labels, [default])\n\nReturns a OneHotVector which is roughly a sparse representation of x .== labels.\n\nInstead of storing say Vector{Bool}, it stores the index of the first occurrence of x in labels. If x is not found in labels, then it either returns onehot(default, labels), or gives an error if no default is given.\n\nSee also onehotbatch to apply this to many xs, and onecold to reverse either of these, as well as to generalise argmax.\n\nExamples\n\njulia> β = onehot(:b, (:a, :b, :c))\n3-element OneHotVector(::UInt32) with eltype Bool:\n ⋅\n 1\n ⋅\n\njulia> αβγ = (onehot(0, 0:2), β, onehot(:z, [:a, :b, :c], :c)) # uses default\n(Bool[1, 0, 0], Bool[0, 1, 0], Bool[0, 0, 1])\n\njulia> hcat(αβγ...) 
# preserves sparsity\n3×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n 1 ⋅ ⋅\n ⋅ 1 ⋅\n ⋅ ⋅ 1\n\n\n\n\n\n","category":"function"},{"location":"data/onehot/#OneHotArrays.onecold","page":"OneHotArrays.jl","title":"OneHotArrays.onecold","text":"onecold(y::AbstractArray, labels = 1:size(y,1))\n\nRoughly the inverse operation of onehot or onehotbatch: This finds the index of the largest element of y, or each column of y, and looks them up in labels.\n\nIf labels are not specified, the default is integers 1:size(y,1) – the same operation as argmax(y, dims=1) but sometimes a different return type.\n\nExamples\n\njulia> onecold([false, true, false])\n2\n\njulia> onecold([0.3, 0.2, 0.5], (:a, :b, :c))\n:c\n\njulia> onecold([ 1 0 0 1 0 1 0 1 0 0 1\n 0 1 0 0 0 0 0 0 1 0 0\n 0 0 0 0 1 0 0 0 0 0 0\n 0 0 0 0 0 0 1 0 0 0 0\n 0 0 1 0 0 0 0 0 0 1 0 ], 'a':'e') |> String\n\"abeacadabea\"\n\n\n\n\n\n","category":"function"},{"location":"data/onehot/#OneHotArrays.onehotbatch","page":"OneHotArrays.jl","title":"OneHotArrays.onehotbatch","text":"onehotbatch(xs, labels, [default])\n\nReturns a OneHotMatrix where kth column of the matrix is onehot(xs[k], labels). This is a sparse matrix, which stores just a Vector{UInt32} containing the indices of the nonzero elements.\n\nIf one of the inputs in xs is not found in labels, that column is onehot(default, labels) if default is given, else an error.\n\nIf xs has more dimensions, N = ndims(xs) > 1, then the result is an AbstractArray{Bool, N+1} which is one-hot along the first dimension, i.e. result[:, k...] == onehot(xs[k...], labels).\n\nNote that xs can be any iterable, such as a string. 
And that using a tuple for labels will often speed up construction, certainly for less than 32 classes.\n\nExamples\n\njulia> oh = onehotbatch(\"abracadabra\", 'a':'e', 'e')\n5×11 OneHotMatrix(::Vector{UInt32}) with eltype Bool:\n 1 ⋅ ⋅ 1 ⋅ 1 ⋅ 1 ⋅ ⋅ 1\n ⋅ 1 ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ 1 ⋅ ⋅\n ⋅ ⋅ ⋅ ⋅ 1 ⋅ ⋅ ⋅ ⋅ ⋅ ⋅\n ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ 1 ⋅ ⋅ ⋅ ⋅\n ⋅ ⋅ 1 ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ 1 ⋅\n\njulia> reshape(1:15, 3, 5) * oh # this matrix multiplication is done efficiently\n3×11 Matrix{Int64}:\n 1 4 13 1 7 1 10 1 4 13 1\n 2 5 14 2 8 2 11 2 5 14 2\n 3 6 15 3 9 3 12 3 6 15 3\n\n\n\n\n\n","category":"function"},{"location":"data/onehot/#OneHotArrays.OneHotArray","page":"OneHotArrays.jl","title":"OneHotArrays.OneHotArray","text":"OneHotArray{T, N, M, I} <: AbstractArray{Bool, M}\nOneHotArray(indices, L)\n\nA one-hot M-dimensional array with L labels (i.e. size(A, 1) == L and sum(A, dims=1) == 1) stored as a compact N == M-1-dimensional array of indices.\n\nTypically constructed by onehot and onehotbatch. Parameter I is the type of the underlying storage, and T its eltype.\n\n\n\n\n\n","category":"type"},{"location":"data/onehot/#OneHotArrays.OneHotVector","page":"OneHotArrays.jl","title":"OneHotArrays.OneHotVector","text":"OneHotVector{T} = OneHotArray{T, 0, 1, T}\nOneHotVector(indices, L)\n\nA one-hot vector with L labels (i.e. length(A) == L and count(A) == 1) typically constructed by onehot. Stored efficiently as a single index of type T, usually UInt32.\n\n\n\n\n\n","category":"type"},{"location":"data/onehot/#OneHotArrays.OneHotMatrix","page":"OneHotArrays.jl","title":"OneHotArrays.OneHotMatrix","text":"OneHotMatrix{T, I} = OneHotArray{T, 1, 2, I}\nOneHotMatrix(indices, L)\n\nA one-hot matrix (with L labels) typically constructed using onehotbatch. 
Stored efficiently as a vector of indices with type I and eltype T.\n\n\n\n\n\n","category":"type"}] +} diff --git a/previews/PR2365/siteinfo.js b/previews/PR2365/siteinfo.js new file mode 100644 index 0000000000..95dc6fd383 --- /dev/null +++ b/previews/PR2365/siteinfo.js @@ -0,0 +1 @@ +var DOCUMENTER_CURRENT_VERSION = "previews/PR2365"; diff --git a/previews/PR2365/training/callbacks/index.html b/previews/PR2365/training/callbacks/index.html new file mode 100644 index 0000000000..b992455cec --- /dev/null +++ b/previews/PR2365/training/callbacks/index.html @@ -0,0 +1,91 @@ + +Callback Helpers · Flux

    Callback Helpers

    Flux.throttleFunction
    throttle(f, timeout; leading=true, trailing=false)

    Return a function that, when invoked, will be triggered at most once during timeout seconds.

    Normally, the throttled function will run as much as it can, without ever going more than once per wait duration; but if you'd like to disable the execution on the leading edge, pass leading=false. To enable execution on the trailing edge, pass trailing=true.

    Examples

    julia> a = Flux.throttle(() -> println("Flux"), 2);
    +
    +julia> for i = 1:4  # a called in alternate iterations
    +           a()
    +           sleep(1)
    +       end
    +Flux
    +Flux
    source
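    A common use of throttle in practice is to rate-limit a logging callback inside a training loop, so an expensive metric is reported only every few seconds no matter how many steps run. A minimal sketch (the loss_value helper below is a hypothetical stand-in, not part of Flux):

```julia
using Flux

# hypothetical cheap stand-in for an expensive validation metric
loss_value() = rand()

# wrap the logger so it fires at most once every 5 seconds
log_cb = Flux.throttle(() -> println("loss = ", loss_value()), 5)

for step in 1:10_000
    # ... parameter update would go here ...
    log_cb()   # called every step, but prints at most every ~5 seconds
end
```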

    Patience Helpers

    Flux provides utilities for controlling your training procedure according to some monitored condition and a maximum patience. For example, you can use early_stopping to stop training when the model is converging or deteriorating, or you can use plateau to check if the model is stagnating.

    For example, below we create a pseudo-loss function that decreases, bottoms out, and then increases. The early stopping trigger will break the loop before the loss increases too much.

    # create a pseudo-loss that decreases for 4 calls, then starts increasing
    +# we call this like loss()
    +loss = let t = 0
    +  () -> begin
    +    t += 1
    +    (t - 4) ^ 2
    +  end
    +end
    +
    +# create an early stopping trigger
    +# returns true when the loss increases for two consecutive steps
    +es = early_stopping(loss, 2; init_score = 9)
    +
    +# this will stop at the 6th (4 decreasing + 2 increasing calls) epoch
    +for epoch in 1:10
    +  es() && break
    +end

    The keyword argument distance of early_stopping is a function of the form distance(best_score, score). By default distance is -, which implies that the monitored metric f is expected to be decreasing and minimized. If you use some increasing metric (e.g. accuracy), you can customize the distance function: (best_score, score) -> score - best_score.

    # create a pseudo-accuracy that increases by 0.01 each time from 0 to 1
    +# we call this like acc()
    +acc = let v = 0
    +  () -> v = min(1, v + 0.01)
    +end
    +
    +# create an early stopping trigger for accuracy
    +es = early_stopping(acc, 3; distance = (best_score, score) -> score - best_score)
    +
    +# this will iterate until the 10th epoch
    +for epoch in 1:10
    +  es() && break
    +end

    early_stopping and plateau are both built on top of patience. You can use patience to build your own triggers that use a patient counter. For example, if you want to trigger when the loss is below a threshold for several consecutive iterations:

    threshold(f, thresh, delay) = patience(delay) do
    +  f() < thresh
    +end
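    To see the threshold trigger in action, here is a self-contained sketch. It re-implements a minimal patience counter so the snippet runs without Flux; this stand-in mirrors what Flux.patience does for the zero-argument case, but is not its actual source:

```julia
# Minimal patience counter (illustrative stand-in for Flux.patience):
function patience(predicate, wait)
    count = 0
    (args...) -> begin
        count = predicate(args...) ? count + 1 : 0
        count >= wait
    end
end

threshold(f, thresh, delay) = patience(delay) do
    f() < thresh
end

# Pseudo-loss: 10.0, 5.0, 3.33, ...; first drops below 1.0 on call 11
loss = let t = 0
    () -> (t += 1; 10.0 / t)
end

stop = threshold(loss, 1.0, 3)

stopped_at = Ref(0)
for epoch in 1:20
    stopped_at[] = epoch
    stop() && break
end
stopped_at[]  # 13: the loss has been below 1.0 for 3 consecutive calls (t = 11, 12, 13)
```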

    Both predicate in patience and f in early_stopping / plateau can accept extra arguments. You can pass such extra arguments to predicate or f through the returned function:

    trigger = patience((a; b) -> a > b, 3)
    +
    +# this will iterate until the 10th epoch
    +for epoch in 1:10
    +  trigger(1; b = 2) && break
    +end
    +
    +# this will stop at the 3rd epoch
    +for epoch in 1:10
    +  trigger(3; b = 2) && break
    +end
    Flux.patienceFunction
    patience(predicate, wait)

    Return a function that internally counts by one when predicate(...) == true; otherwise the count is reset to zero. If the count is greater than or equal to wait, the function returns true; otherwise it returns false.

    Examples

    julia> loss() = rand();
    +
    +julia> trigger = Flux.patience(() -> loss() < 1, 3);
    +
    +
    +julia> for i in 1:10
    +         @info "Epoch $i"
    +         trigger() && break
    +       end
    +[ Info: Epoch 1
    +[ Info: Epoch 2
    +[ Info: Epoch 3
    source
    Flux.early_stoppingFunction
    early_stopping(f, delay; distance = -, init_score = 0, min_dist = 0)

    Return a function that internally counts by one when distance(best_score, f(...)) <= min_dist, where best_score is the last seen best value of f(...). If the count is greater than or equal to delay, the function returns true; otherwise it returns false. The count is reset when distance(best_score, f(...)) > min_dist.

    Examples

    julia> loss = let l = 0
    +         () -> l += 1
    +       end; # pseudo loss function that returns increasing values
    +
    +julia> es = Flux.early_stopping(loss, 3);
    +
    +
    +julia> for i in 1:10
    +         @info "Epoch $i"
    +         es() && break
    +       end
    +[ Info: Epoch 1
    +[ Info: Epoch 2
    +[ Info: Epoch 3
    source
    Flux.plateauFunction
    plateau(f, width; distance = -, init_score = 0, min_dist = 1f-6)

    Return a function that internally counts by one when abs(distance(last_score, f(...))) <= min_dist, where last_score holds the last value of f(...). If the count is greater than or equal to width, the function returns true; otherwise it returns false. The count is reset when abs(distance(last_score, f(...))) > min_dist.

    Examples

    julia> f = let v = 10
    +         () -> v = v / abs(v) - v
    +       end; # -9, 8, -7, 6, ...
    +
    +julia> trigger = Flux.plateau(f, 3; init_score=10, min_dist=18);
    +
    +
    +julia> for i in 1:10
    +         @info "Epoch $i"
    +         trigger() && break
    +       end
    +[ Info: Epoch 1
    +[ Info: Epoch 2
    +[ Info: Epoch 3
    +[ Info: Epoch 4
    source
    diff --git a/previews/PR2365/training/optimisers/index.html b/previews/PR2365/training/optimisers/index.html new file mode 100644 index 0000000000..5544bd58ce --- /dev/null +++ b/previews/PR2365/training/optimisers/index.html @@ -0,0 +1,71 @@ + +Optimisation Rules · Flux

    Optimisation Rules

    Flux builds in many optimisation rules for use with train! and other training functions.

    The mechanism by which these work is gradually being replaced as part of the change from "implicit" dictionary-based to "explicit" tree-like structures. At present, the same struct (such as Adam) can be used with either form, and will be automatically translated.

    For full details of how the new interface works, see the Optimisers.jl documentation.

    For full details on how the old "implicit" interface worked, see the Flux 0.13.6 manual.

    Optimiser Reference

    All optimisers return an object that, when passed to train!, will update the parameters passed to it.

    Flux.Optimise.DescentType
    Descent(η = 0.1)

    Classic gradient descent optimiser with learning rate η. For each parameter p and its gradient δp, this runs p -= η*δp

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.

    Examples

    opt = Descent()
    +
    +opt = Descent(0.3)
    +
    +ps = Flux.params(model)
    +
    +gs = gradient(ps) do
    +    loss(x, y)
    +end
    +
    +Flux.Optimise.update!(opt, ps, gs)
    source
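    The update rule above is plain elementwise arithmetic, so a single Descent step can be checked by hand (the parameter and gradient values here are made up for illustration):

```julia
η = 0.1
p  = [1.0, -2.0]          # a parameter array
δp = [0.5,  0.5]          # its (pretend) gradient
p .-= η .* δp             # one step of Descent: p -= η*δp
p                         # ≈ [0.95, -2.05]
```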
    Flux.Optimise.MomentumType
    Momentum(η = 0.01, ρ = 0.9)

    Gradient descent optimiser with learning rate η and momentum ρ.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Momentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.

    Examples

    opt = Momentum()
    +
    +opt = Momentum(0.01, 0.99)
    source
    Flux.Optimise.NesterovType
    Nesterov(η = 0.001, ρ = 0.9)

    Gradient descent optimiser with learning rate η and Nesterov momentum ρ.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Nesterov momentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.

    Examples

    opt = Nesterov()
    +
    +opt = Nesterov(0.003, 0.95)
    source
    Flux.Optimise.RMSPropType
    RMSProp(η = 0.001, ρ = 0.9, ϵ = 1.0e-8)

    Optimiser using the RMSProp algorithm. Often a good choice for recurrent networks. Parameters other than the learning rate generally don't need tuning.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Momentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.

    Examples

    opt = RMSProp()
    +
    +opt = RMSProp(0.002, 0.95)
    source
    Flux.Optimise.AdamType
    Adam(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)

    Adam optimiser.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = Adam()
    +
    +opt = Adam(0.001, (0.9, 0.8))
    source
    Flux.Optimise.RAdamType
    RAdam(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)

    Rectified Adam optimiser.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = RAdam()
    +
    +opt = RAdam(0.001, (0.9, 0.8))
    source
    Flux.Optimise.AdaMaxType
    AdaMax(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)

    AdaMax is a variant of Adam based on the ∞-norm.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = AdaMax()
    +
    +opt = AdaMax(0.001, (0.9, 0.995))
    source
    Flux.Optimise.AdaGradType
    AdaGrad(η = 0.1, ϵ = 1.0e-8)

    AdaGrad optimiser. It has parameter-specific learning rates based on how frequently each parameter is updated. Parameters don't need tuning.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.

    Examples

    opt = AdaGrad()
    +
    +opt = AdaGrad(0.001)
    source
    Flux.Optimise.AdaDeltaType
    AdaDelta(ρ = 0.9, ϵ = 1.0e-8)

    AdaDelta is a version of AdaGrad adapting its learning rate based on a window of past gradient updates. Parameters don't need tuning.

    Parameters

    • Rho (ρ): Factor by which the gradient is decayed at each time step.

    Examples

    opt = AdaDelta()
    +
    +opt = AdaDelta(0.89)
    source
    Flux.Optimise.AMSGradType
    AMSGrad(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)

    The AMSGrad version of the Adam optimiser. Parameters don't need tuning.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = AMSGrad()
    +
    +opt = AMSGrad(0.001, (0.89, 0.995))
    source
    Flux.Optimise.NAdamType
    NAdam(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)

    NAdam is a Nesterov variant of Adam. Parameters don't need tuning.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = NAdam()
    +
    +opt = NAdam(0.002, (0.89, 0.995))
    source
    Flux.Optimise.AdamWFunction
    AdamW(η = 0.001, β::Tuple = (0.9, 0.999), decay = 0)

    AdamW is a variant of Adam fixing (as in repairing) its weight decay regularization.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.
    • decay: Decay applied to weights during optimisation.

    Examples

    opt = AdamW()
    +
    +opt = AdamW(0.001, (0.89, 0.995), 0.1)
    source
    Flux.Optimise.OAdamType
    OAdam(η = 0.0001, β::Tuple = (0.5, 0.9), ϵ = 1.0e-8)

    OAdam (Optimistic Adam) is a variant of Adam adding an "optimistic" term suitable for adversarial training.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = OAdam()
    +
    +opt = OAdam(0.001, (0.9, 0.995))
    source
    Flux.Optimise.AdaBeliefType
    AdaBelief(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)

    The AdaBelief optimiser is a variant of the well-known Adam optimiser.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • Decay of momentums (β::Tuple): Exponential decay for the first (β1) and the second (β2) momentum estimate.

    Examples

    opt = AdaBelief()
    +
    +opt = AdaBelief(0.001, (0.9, 0.8))
    source

    Composing Optimisers

    Flux defines a special kind of optimiser simply called Optimiser which takes in arbitrary optimisers as input. Its behaviour is similar to the usual optimisers, but it acts by calling the optimisers listed in it sequentially: each optimiser produces a modified gradient that is fed into the next, and the resulting update is applied to the parameter as usual. A classic use case is adding learning-rate decays; Flux defines some basic decays, including ExpDecay, InvDecay, etc.

    opt = Optimiser(ExpDecay(1, 0.1, 1000, 1e-4), Descent())

    Here we apply exponential decay to the Descent optimiser: with these arguments, the learning rate is scaled by a factor of 0.1 every 1000 steps, down to a minimum of 1e-4. The composed optimiser is then used like any other optimiser.

    w = randn(10, 10)
    +w1 = randn(10,10)
    +ps = Params([w, w1])
    +
    +loss(x) = Flux.Losses.mse(w * x, w1 * x)
    +
    +loss(rand(10)) # around 9
    +
    +for t = 1:10^5
    +  θ = Params([w, w1])
    +  θ̄ = gradient(() -> loss(rand(10)), θ)
    +  Flux.Optimise.update!(opt, θ, θ̄)
    +end
    +
    +loss(rand(10)) # around 0.9
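    The "each optimiser transforms the gradient in turn" behaviour can be illustrated with plain arrays. The clip and scale stages below are hypothetical stand-ins for chained rules, not the actual Optimiser internals:

```julia
# Two gradient transformations applied sequentially, as Optimiser would chain them:
clip(g, t)  = clamp.(g, -t, t)   # first stage: clip each entry to ±t
scale(g, η) = η .* g             # second stage: scale by a learning rate

g = [5.0, -0.5]
g = clip(g, 1.0)    # [1.0, -0.5]
g = scale(g, 0.1)   # ≈ [0.1, -0.05]
```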

    It is possible to compose optimisers for some added flexibility.

    Flux.Optimise.OptimiserType
    Optimiser(a, b, c...)

    Combine several optimisers into one; each optimiser produces a modified gradient that will be fed into the next, and this is finally applied to the parameter as usual.

    Note

    This will be replaced by Optimisers.OptimiserChain in Flux 0.15.

    source

    Scheduling Optimisers

    In practice, it is fairly common to schedule the learning rate of an optimiser to obtain faster convergence. There are a variety of popular scheduling policies, and you can find implementations of them in ParameterSchedulers.jl. The documentation for ParameterSchedulers.jl provides a more detailed overview of the different scheduling policies, and how to use them with Flux optimisers. Below, we provide a brief snippet illustrating a cosine annealing schedule with a momentum optimiser.

    First, we import ParameterSchedulers.jl and initialize a cosine annealing schedule to vary the learning rate between 1e-4 and 1e-2 every 10 steps. We also create a new Momentum optimiser.

    using ParameterSchedulers
    +
    +opt = Momentum()
    +schedule = Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10)
    +for (eta, epoch) in zip(schedule, 1:100)
    +  opt.eta = eta
    +  # your training code here
    +end

    schedule can also be indexed (e.g. schedule(100)) or iterated like any iterator in Julia.

    ParameterSchedulers.jl schedules are stateless (they don't store their iteration state). If you want a stateful schedule, you can use ParameterSchedulers.Stateful:

    using ParameterSchedulers: Stateful, next!
    +
    +schedule = Stateful(Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10))
    +for epoch in 1:100
    +  opt.eta = next!(schedule)
    +  # your training code here
    +end

    ParameterSchedulers.jl allows for many more scheduling policies including arbitrary functions, looping any function with a given period, or sequences of many schedules. See the ParameterSchedulers.jl documentation for more info.

    Decays

    Similar to optimisers, Flux also defines some simple decays that can be used in conjunction with other optimisers, or standalone.

    Flux.Optimise.ExpDecayType
    ExpDecay(η = 0.001, decay = 0.1, decay_step = 1000, clip = 1e-4, start = 1)

    Discount the learning rate η by the factor decay every decay_step steps, until it reaches a minimum of clip.

    Parameters

    • Learning rate (η): Amount by which gradients are discounted before updating the weights.
    • decay: Factor by which the learning rate is discounted.
    • decay_step: Schedule decay operations by setting the number of steps between two decay operations.
    • clip: Minimum value of learning rate.
    • start: Step at which the decay starts.

    See also the Scheduling Optimisers section of the docs for more general scheduling techniques.

    Examples

    ExpDecay is typically composed with other optimisers as the last transformation of the gradient:

    opt = Optimiser(Adam(), ExpDecay(1.0))

    Note: you may want to start with η=1 in ExpDecay when combined with other optimisers (Adam in this case) that have their own learning rate.

    source
    Flux.Optimise.InvDecayType
    InvDecay(γ = 0.001)

    Apply inverse time decay to an optimiser, so that the effective step size at iteration n is eta / (1 + γ * n) where eta is the initial step size. The wrapped optimiser's step size is not modified.
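    For intuition, the effective step size given by this formula is easy to tabulate directly. This is plain arithmetic using the default γ and an assumed initial step size eta = 0.1:

```julia
eta, γ = 0.1, 0.001
effective(n) = eta / (1 + γ * n)   # effective step size at iteration n

effective(0)      # 0.1  (no decay yet)
effective(1000)   # ≈ 0.05  (halved after 1000 iterations)
effective(9000)   # ≈ 0.01
```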

    See also the Scheduling Optimisers section of the docs for more general scheduling techniques.

    Examples

    InvDecay is typically composed with other optimisers as the last transformation of the gradient:

    # Inverse decay of the learning rate
    +# with starting value 0.001 and decay coefficient 0.01.
    +opt = Optimiser(Adam(1f-3), InvDecay(1f-2))
    source
    Flux.Optimise.WeightDecayType
    WeightDecay(λ = 0)

    Decay weights by $λ$. Typically composed with other optimisers as the first transformation to the gradient, making it equivalent to adding $L_2$ regularization with coefficient $λ$ to the loss.

    Examples

    opt = Optimiser(WeightDecay(1f-4), Adam())
    source

    Gradient Clipping

    Gradient clipping is useful for training recurrent neural networks, which have a tendency to suffer from the exploding gradient problem. An example usage is

    opt = Optimiser(ClipValue(1e-3), Adam(1e-3))
    Flux.Optimise.ClipValueType
    ClipValue(thresh)

    Clip gradients when their absolute value exceeds thresh.

    Note

    This will be replaced by Optimisers.ClipGrad in Flux 0.15.

    source
    diff --git a/previews/PR2365/training/reference/index.html b/previews/PR2365/training/reference/index.html new file mode 100644 index 0000000000..b8dc36df91 --- /dev/null +++ b/previews/PR2365/training/reference/index.html @@ -0,0 +1,118 @@ + +Training API · Flux

    Training API Reference

    The new version of Flux's training code was written as an independent package, Optimisers.jl. Only the function train! belongs to Flux itself.

    The Optimisers package is designed to allow for immutable objects. But at present all Flux models contain parameter arrays (such as Arrays and CuArrays) which can be updated in-place. Because of this:

    • The objects returned by Optimisers.update! can be ignored.
    • Flux defines its own version of setup which checks this assumption. (Using Optimisers.setup instead will also work; they return the same thing.)

    The new implementation of rules such as Adam in the Optimisers package is quite different from the old one in Flux.Optimise. In Flux 0.14, Flux.Adam() returns the old one, with supertype Flux.Optimise.AbstractOptimiser, but setup will silently translate it to its new counterpart. The available rules are listed on the optimisation rules page; see the Optimisers documentation for details on how the new rules work.

    Flux.Train.setupFunction
    opt_state = setup(rule, model)

    This is a version of Optimisers.setup, and is the first step before using train!. It differs from Optimisers.setup in that it:

    • has one extra check for mutability (since Flux expects to mutate the model in-place, while Optimisers.jl is designed to return an updated model)
    • has methods which accept Flux's old optimisers, and convert them. (The old Flux.Optimise.Adam and new Optimisers.Adam are distinct types.)
    New

    This function was added in Flux 0.13.9. It was not used by the old "implicit" interface, using Flux.Optimise module and Flux.params.

    Example

    julia> model = Dense(2=>1, leakyrelu; init=ones);
    +
    +julia> opt_state = Flux.setup(Momentum(0.1), model)  # this encodes the optimiser and its state
    +(weight = Leaf(Momentum{Float64}(0.1, 0.9), [0.0 0.0]), bias = Leaf(Momentum{Float64}(0.1, 0.9), [0.0]), σ = ())
    +
    +julia> x1, y1 = [0.2, -0.3], [0.4];  # use the same data for two steps:
    +
    +julia> Flux.train!(model, [(x1, y1), (x1, y1)], opt_state) do m, x, y
    +         sum(abs.(m(x) .- y)) * 100
    +       end
    +
    +julia> model.bias  # was zero, mutated by Flux.train!
    +1-element Vector{Float64}:
    + 10.19
    +
    +julia> opt_state  # mutated by Flux.train!
    +(weight = Leaf(Momentum{Float64}(0.1, 0.9), [-2.018 3.027]), bias = Leaf(Momentum{Float64}(0.1, 0.9), [-10.09]), σ = ())
    source
    Flux.Optimise.train!Method
    train!(loss, model, data, opt_state)

    Uses a loss function and training data to improve the model's parameters according to a particular optimisation rule encoded in opt_state. Iterates through data once, evaluating for each d in data either loss(model, d...) if d isa Tuple, or else loss(model, d) for other d.

    For example, with these definitions...

    data = [(x1, y1), (x2, y2), (x3, y3)]
    +
    +loss3(m, x, y) = norm(m(x) .- y)        # the model is the first argument
    +
    +opt_state = Flux.setup(Adam(), model)   # explicit setup of optimiser momenta

    ...calling Flux.train!(loss3, model, data, opt_state) runs a loop much like this:

    for d in data
    +    ∂L∂m = gradient(loss3, model, d...)[1]
    +    update!(opt_state, model, ∂L∂m)
    +end

    You can also write this loop yourself, if you need more flexibility. For this reason train! is not highly extensible. It adds only a few features to the loop above:

    • Stop with a DomainError if the loss is infinite or NaN at any point.

    • Show a progress bar using @withprogress.

    New

    This method was added in Flux 0.13.9. It has significant changes from the one used by Flux ≤ 0.13:

    • It now takes the model itself, not the result of Flux.params. (This is to move away from Zygote's "implicit" parameter handling, with Grads.)
    • Instead of loss being a function which accepts only the data, now it must also accept the model itself, as the first argument.
    • opt_state should be the result of Flux.setup. Using an optimiser such as Adam() without this step should give you a warning.
    • Callback functions are not supported. (But any code can be included in the above for loop.)
    source
    Optimisers.update!Function
    Optimisers.update!(tree, model, gradient) -> (tree, model)

    Uses the optimiser and the gradient to change the trainable parameters in the model. Returns the improved model, and the optimiser states needed for the next update. The initial tree of states comes from setup.

    This is used in exactly the same manner as update, but because it may mutate arrays within the old model (and the old state), it will be faster for models of ordinary Arrays or CuArrays. However, you should not rely on the old model being fully updated but rather use the returned model. (The original state tree is always mutated, as each Leaf is mutable.)

    Example

    julia> using StaticArrays, Zygote, Optimisers
    +
    +julia> m = (x = [1f0, 2f0], y = SA[4f0, 5f0]);  # partly mutable model
    +
    +julia> t = Optimisers.setup(Momentum(1/30, 0.9), m)  # tree of states
    +(x = Leaf(Momentum(0.0333333, 0.9), Float32[0.0, 0.0]), y = Leaf(Momentum(0.0333333, 0.9), Float32[0.0, 0.0]))
    +
    +julia> g = gradient(m -> sum(abs2.(m.x .+ m.y)), m)[1]  # structural gradient
    +(x = Float32[10.0, 14.0], y = Float32[10.0, 14.0])
    +
    +julia> t2, m2 = Optimisers.update!(t, m, g);
    +
    +julia> m2  # after update or update!, this is the new model
    +(x = Float32[0.6666666, 1.5333333], y = Float32[3.6666667, 4.5333333])
    +
    +julia> m2.x === m.x  # update! has re-used this array, for efficiency
    +true
    +
    +julia> m  # original should be discarded, may be mutated but no guarantee
    +(x = Float32[0.6666666, 1.5333333], y = Float32[4.0, 5.0])
    +
    +julia> t == t2  # original state tree is guaranteed to be mutated
    +true

    train! uses @progress, which should show a progress bar in VSCode automatically. To see one in a terminal, you will need to install TerminalLoggers.jl and follow its setup instructions.

    Optimisation Modifiers

    The state returned by setup can be modified to temporarily prevent training of some parts of the model, or to change the learning rate or other hyperparameter. The functions for doing so may be accessed as Flux.freeze!, Flux.thaw!, and Flux.adjust!. All mutate the state (or part of it) and return nothing.

    Optimisers.adjust!Function
    Optimisers.adjust!(tree, η)

    Alters the state tree = setup(rule, model) to change the parameters of the optimisation rule, without destroying its stored state. Typically used mid-way through training.

    Can be applied to part of a model, by acting only on the corresponding part of the state tree.

    To change just the learning rate, provide a number η::Real.

    Example

    julia> m = (vec = rand(Float32, 2), fun = sin);
    +
    +julia> st = Optimisers.setup(Nesterov(), m)  # stored momentum is initialised to zero
    +(vec = Leaf(Nesterov(0.001, 0.9), Float32[0.0, 0.0]), fun = ())
    +
    +julia> st, m = Optimisers.update(st, m, (vec = [16, 88], fun = nothing));  # with fake gradient
    +
    +julia> st
    +(vec = Leaf(Nesterov(0.001, 0.9), Float32[-0.016, -0.088]), fun = ())
    +
    +julia> Optimisers.adjust!(st, 0.123)  # change learning rate, stored momentum untouched
    +
    +julia> st
    +(vec = Leaf(Nesterov(0.123, 0.9), Float32[-0.016, -0.088]), fun = ())

    To change other parameters, adjust! also accepts keyword arguments matching the field names of the optimisation rule's type.

    julia> fieldnames(Adam)
    +(:eta, :beta, :epsilon)
    +
    +julia> st2 = Optimisers.setup(OptimiserChain(ClipGrad(), Adam()), m)
    +(vec = Leaf(OptimiserChain(ClipGrad(10.0), Adam(0.001, (0.9, 0.999), 1.0e-8)), (nothing, (Float32[0.0, 0.0], Float32[0.0, 0.0], (0.9, 0.999)))), fun = ())
    +
    +julia> Optimisers.adjust(st2; beta = (0.777, 0.909), delta = 11.1)  # delta acts on ClipGrad
    +(vec = Leaf(OptimiserChain(ClipGrad(11.1), Adam(0.001, (0.777, 0.909), 1.0e-8)), (nothing, (Float32[0.0, 0.0], Float32[0.0, 0.0], (0.9, 0.999)))), fun = ())
    +
    +julia> Optimisers.adjust(st; beta = "no such field")  # silently ignored!
    +(vec = Leaf(Nesterov(0.123, 0.9), Float32[-0.016, -0.088]), fun = ())
    Optimisers.freeze!Function
    Optimisers.freeze!(tree)

    Temporarily alters the state tree = setup(rule, model) so that parameters will not be updated. Un-done by thaw!.

    Can be applied to the state corresponding to only part of a model, for instance with model::Chain, to freeze model.layers[1] you should call freeze!(tree.layers[1]).

    Example

    julia> m = (x = ([1.0], 2.0), y = [3.0]);
    +
    +julia> s = Optimisers.setup(Momentum(), m);
    +
    +julia> Optimisers.freeze!(s.x)
    +
    +julia> Optimisers.update!(s, m, (x = ([pi], 10pi), y = [100pi]));  # with fake gradient
    +
    +julia> m
    +(x = ([1.0], 2.0), y = [-0.14159265358979312])
    +
    +julia> s
    +(x = (Leaf(Momentum(0.01, 0.9), [0.0], frozen = true), ()), y = Leaf(Momentum(0.01, 0.9), [3.14159]))
    +
    +julia> Optimisers.thaw!(s)
    +
    +julia> s.x
    +(Leaf(Momentum(0.01, 0.9), [0.0]), ())
    Optimisers.thaw!Function
    Optimisers.thaw!(tree)

    The reverse of freeze!. Applies to all parameters, mutating every Leaf(rule, state, frozen = true) to Leaf(rule, state, frozen = false).

    Implicit style (Flux ≤ 0.14)

    Flux used to handle gradients, training, and optimisation rules quite differently. The new style described above is called "explicit" by Zygote, and the old style "implicit". Flux 0.13 and 0.14 are the transitional versions which support both; Flux 0.15 will remove the old.

    How to upgrade

    The blue-green boxes in the training section describe the changes needed to upgrade old code.

    For full details on the interface for implicit-style optimisers, see the Flux 0.13.6 manual.

    Flux ≤ 0.12

    Earlier versions of Flux exported params, thus allowing unqualified params(model) after using Flux. This conflicted with too many other packages, and was removed in Flux 0.13. If you get an error UndefVarError: params not defined, this probably means that you are following code for Flux 0.12 or earlier on a more recent version.

    Flux.paramsFunction
    params(model)
    +params(layers...)

    Given a model or specific layers from a model, create a Params object pointing to its trainable parameters.

    This can be used with the gradient function, see the training section of the manual, or as input to the Flux.train! function.

    The behaviour of params on custom types can be customized using Functors.@functor or Flux.trainable.

    Examples

    julia> using Flux: params
    +
    +julia> params(Chain(Dense(ones(2,3)), softmax))  # unpacks Flux models
    +Params([[1.0 1.0 1.0; 1.0 1.0 1.0], [0.0, 0.0]])
    +
    +julia> bn = BatchNorm(2, relu)
    +BatchNorm(2, relu)  # 4 parameters, plus 4 non-trainable
    +
    +julia> params(bn)  # only the trainable parameters
    +Params([Float32[0.0, 0.0], Float32[1.0, 1.0]])
    +
    +julia> params([1, 2, 3], [4])  # one or more arrays of numbers
    +Params([[1, 2, 3], [4]])
    +
    +julia> params([[1, 2, 3], [4]])  # unpacks array of arrays
    +Params([[1, 2, 3], [4]])
    +
    +julia> params(1, [2 2], (alpha=[3,3,3], beta=Ref(4), gamma=sin))  # ignores scalars, unpacks NamedTuples
    +Params([[2 2], [3, 3, 3]])
    source
    Optimisers.update!Method
    update!(opt, p, g)
    +update!(opt, ps::Params, gs)

    Perform an update step of the parameters ps (or the single parameter p) according to optimiser opt::AbstractOptimiser and the gradients gs (the gradient g).

    As a result, the parameters are mutated and the optimiser's internal state may change. The gradient could be mutated as well.

    Deprecated

    This method for implicit Params (and AbstractOptimiser) will be removed from Flux 0.15. The explicit method update!(opt, model, grad) from Optimisers.jl will remain.

    source
    Flux.Optimise.train!Method
    train!(loss, pars::Params, data, opt::AbstractOptimiser; [cb])

    Uses a loss function and training data to improve the model's parameters according to a particular optimisation rule opt.

    Deprecated

    This method with implicit Params will be removed from Flux 0.15. It should be replaced with the explicit method train!(loss, model, data, opt).

    For each d in data, first the gradient of the loss is computed like this:

        gradient(() -> loss(d...), pars)  # if d isa Tuple
    +    gradient(() -> loss(d), pars)     # otherwise

    Here pars is produced by calling Flux.params on your model. (Or just on the layers you want to train, like train!(loss, params(model[1:end-2]), data, opt).) This is the "implicit" style of parameter handling.

    This gradient is then used by optimiser opt to update the parameters:

        update!(opt, pars, grads)

    The optimiser should be from the Flux.Optimise module (see Optimisers). Different optimisers can be combined using Flux.Optimise.Optimiser.

    This training loop iterates through data once. It will stop with a DomainError if the loss is NaN or infinite.

    You can use train! inside a for loop to run it several times, or use for instance IterTools.ncycle to make a longer data iterator.

    Callbacks

    Callbacks are given with the keyword argument cb. For example, this will print "training" every 10 seconds (using Flux.throttle):

        train!(loss, params, data, opt, cb = throttle(() -> println("training"), 10))

    Multiple callbacks can be passed to cb as array.

    source

    Callbacks

    Implicit train! takes an additional argument, cb, that's used for callbacks so that you can observe the training process. For example:

    train!(objective, ps, data, opt, cb = () -> println("training"))

    Callbacks are called for every batch of training data. You can slow this down using Flux.throttle(f, timeout) which prevents f from being called more than once every timeout seconds.

    A more typical callback might look like this:

    test_x, test_y = # ... create single batch of test data ...
    +evalcb() = @show(loss(test_x, test_y))
    +throttled_cb = throttle(evalcb, 5)
    +for epoch in 1:20
    +  @info "Epoch $epoch"
    +  Flux.train!(objective, ps, data, opt, cb = throttled_cb)
    +end

    See the page about callback helpers for more.

    diff --git a/previews/PR2365/training/training/index.html b/previews/PR2365/training/training/index.html new file mode 100644 index 0000000000..e0df9d9846 --- /dev/null +++ b/previews/PR2365/training/training/index.html @@ -0,0 +1,122 @@ + +Training · Flux

    Training a Flux Model

    Training refers to the process of slowly adjusting the parameters of a model to make it work better. Besides the model itself, we will need three things:

    • An objective function that evaluates how well a model is doing on some input.
    • An optimisation rule which describes how the model's parameters should be adjusted.
    • Some training data to use as the input during this process.

    Usually the training data is some collection of examples (or batches of examples) which are handled one-by-one. One epoch of training means that each example is used once, something like this:

    # Initialise the optimiser for this model:
    +opt_state = Flux.setup(rule, model)
    +
    +for data in train_set
    +  # Unpack this element (for supervised training):
    +  input, label = data
    +
    +  # Calculate the gradient of the objective
    +  # with respect to the parameters within the model:
    +  grads = Flux.gradient(model) do m
    +      result = m(input)
    +      loss(result, label)
    +  end
    +
    +  # Update the parameters so as to reduce the objective,
    +  # according the chosen optimisation rule:
    +  Flux.update!(opt_state, model, grads[1])
    +end

    This loop can also be written using the function train!, but it's helpful to understand the pieces first:

    train!(model, train_set, opt_state) do m, x, y
    +  loss(m(x), y)
    +end

    Model Gradients

    First recall from the section on taking gradients that Flux.gradient(f, a, b) always calls f(a, b), and returns a tuple (∂f_∂a, ∂f_∂b). In the code above, the function f passed to gradient is an anonymous function with one argument, created by the do block, hence grads is a tuple with one element. Instead of a do block, we could have written:

    grads = Flux.gradient(m -> loss(m(input), label), model)

    Since the model is some nested set of layers, grads[1] is a similarly nested set of NamedTuples, ultimately containing gradient components. If (for example) θ = model.layers[1].weight[2,3] is one scalar parameter, an entry in a matrix of weights, then the derivative of the loss with respect to it is ∂f_∂θ = grads[1].layers[1].weight[2,3].

    It is important that the execution of the model takes place inside the call to gradient, in order for the influence of the model's parameters to be observed by Zygote.

    It is also important that every update! step receives a newly computed gradient, as it will change whenever the model's parameters are changed, and for each new data point.

    Implicit gradients

    Flux ≤ 0.14 used Zygote's "implicit" mode, in which gradient takes a zero-argument function. It looks like this:

    pars = Flux.params(model)
    +grad = gradient(() -> loss(model(input), label), pars)

    Here pars::Params and grad::Grads are two dictionary-like structures. Support for this will be removed from Flux 0.15, and the blue-green boxes like this one explain what needs to change.

    Loss Functions

    The objective function must return a number representing how far the model is from the desired result. This is termed the loss of the model.

    This number can be produced by any ordinary Julia code, but this must be executed within the call to gradient. For instance, we could define a function

    loss(y_hat, y) = sum((y_hat .- y).^2)

    or write this directly inside the do block above. Many commonly used functions, like mse for mean-squared error or crossentropy for cross-entropy loss, are available from the Flux.Losses module.
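To make the relationship concrete, here is the hand-written loss evaluated on a tiny example, alongside its per-element average — which is what a mean-squared-error loss such as Flux's mse computes instead (values worked by hand, pure Julia):

```julia
loss(y_hat, y) = sum((y_hat .- y).^2)   # the sum-of-squares loss defined above

y_hat = [1.0, 2.0, 3.0]
y     = [1.0, 2.0, 5.0]

total   = loss(y_hat, y)     # 4.0 = 0 + 0 + (3 - 5)^2
mean_sq = total / length(y)  # ≈ 1.333, the mean-squared-error scaling
```

The two differ only by a constant factor, which the learning rate can absorb, but mixing the conventions mid-project silently rescales the gradients.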

    Implicit-style loss functions

    Flux ≤ 0.14 needed a loss function which closed over a reference to the model, instead of being a pure function. Thus in old code you may see something like

    loss(x, y) = sum((model(x) .- y).^2)

    which defines a function making reference to a particular global variable model.

    Optimisation Rules

    The simplest kind of optimisation using the gradient is termed gradient descent (or sometimes stochastic gradient descent when, as here, it is not applied to the entire dataset at once).

    Gradient descent needs a learning rate which is a small number describing how fast to walk downhill, usually written as the Greek letter "eta", η. This is often described as a hyperparameter, to distinguish it from the parameters which are being updated θ = θ - η * ∂loss_∂θ. We want to update all the parameters in the model, like this:

    η = 0.01   # learning rate
    +
    +# For each parameter array, update
    +# according to the corresponding gradient:
    +fmap(model, grads[1]) do p, g
    +  p .= p .- η .* g
    +end

    A slightly more refined version of this loop to update all the parameters is wrapped up as the function update!(opt_state, model, grads[1]). The learning rate η is the only thing stored in the Descent struct, which implements this plain gradient-descent rule.
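The rule θ = θ - η * ∂loss_∂θ can be seen working on a toy one-parameter problem, with the gradient written out analytically so no AD is needed (a pure-Julia sketch; the function name descend is made up for illustration):

```julia
# Gradient descent by hand on f(θ) = (θ - 3)^2, whose gradient is 2(θ - 3).
# This is the bare update rule that Descent and update! package up.
function descend(θ; η = 0.1, steps = 100)
    for _ in 1:steps
        g = 2 * (θ - 3)   # ∂f/∂θ, computed analytically for this toy f
        θ -= η * g        # θ = θ - η * ∂f_∂θ
    end
    return θ
end

descend(0.0)   # ≈ 3.0, the minimiser of f
```

Each step shrinks the distance to the minimum by a factor (1 - 2η), so after 100 steps the parameter has essentially converged.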

    However, there are many other optimisation rules, which adjust the step size and direction in various clever ways. Most require some memory of the gradients from earlier steps, rather than always walking straight downhill – Momentum is the simplest. The function setup creates the necessary storage for this, for a particular model. It should be called once, before training, and returns a tree-like object which is the first argument of update!. Like this:

    # Initialise momentum 
    +opt_state = Flux.setup(Momentum(0.01, 0.9), model)
    +
    +for data in train_set
    +  grads = [...]
    +
    +  # Update both model parameters and optimiser state:
    +  Flux.update!(opt_state, model, grads[1])
    +end

    Many commonly-used optimisation rules, such as Adam, are built-in. These are listed on the optimisers page.

    Implicit-style optimiser state

    This setup makes another tree-like structure. Old versions of Flux did not do this, and instead stored a dictionary-like structure within the optimiser Adam(0.001). This was initialised on first use of the version of update! for "implicit" parameters.

    Datasets & Batches

    The loop above iterates through train_set, expecting at each step a tuple (input, label). The very simplest such object is a vector of tuples, such as this:

    x = randn(28, 28)
    +y = rand(10)
    +data = [(x, y)]

    or data = [(x, y), (x, y), (x, y)] for the same values three times.

    Very often, the initial data is large arrays which you need to slice into examples. To produce one iterator of pairs (x, y), you might want zip:

    X = rand(28, 28, 60_000);  # many images, each 28 × 28
    +Y = rand(10, 60_000)
    +data = zip(eachslice(X; dims=3), eachcol(Y))
    +
    +first(data) isa Tuple{AbstractMatrix, AbstractVector}  # true

    Here each iteration will use one matrix x (an image, perhaps) and one vector y. It is very common to instead train on batches of such inputs (or mini-batches, the two words mean the same thing) both for efficiency and for better results. This can be easily done using the DataLoader:

    data = Flux.DataLoader((X, Y), batchsize=32)
    +
    +x1, y1 = first(data)
    +size(x1) == (28, 28, 32)
    +length(data) == 1875 === 60_000 ÷ 32

    Flux's layers are set up to accept such a batch of input data, and the convolutional layers such as Conv require it. The batch index is always the last dimension.
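The batching that DataLoader performs can be mimicked in Base Julia, which makes the shapes easy to inspect (a sketch of the slicing only — DataLoader also offers shuffling and other options):

```julia
X = rand(28, 28, 100)    # 100 images, each 28 × 28
Y = rand(10, 100)

# Group the last (batch) dimension into chunks of 32:
batches = [(X[:, :, i], Y[:, i]) for i in Iterators.partition(1:100, 32)]

length(batches)          # 4 — three full batches plus one of size 4
size(batches[1][1])      # (28, 28, 32), the batch index last
size(batches[end][1])    # (28, 28, 4), the final partial batch
```

Note the trailing partial batch: DataLoader's partial=false keyword drops it, which matters for layers that assume a fixed batch size.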

    Training Loops

    Simple training loops like the one above can be written compactly using the train! function. Including setup, this reads:

    opt_state = Flux.setup(Adam(), model)
    +
    +for epoch in 1:100
    +  Flux.train!(model, train_set, opt_state) do m, x, y
    +    loss(m(x), y)
    +  end
    +end

    Or explicitly writing the anonymous function which this do block creates, train!((m,x,y) -> loss(m(x),y), model, train_set, opt_state) is exactly equivalent.

    Implicit-style `train!`

    This is a new method of train!, which takes the result of setup as its 4th argument. The 1st argument is a function which accepts the model itself. Flux versions ≤ 0.14 provided a method of train! for "implicit" parameters, which works like this:

    train!((x,y) -> loss(model(x), y), Flux.params(model), train_set, Adam())

    Real training loops often need more flexibility, and the best way to do this is just to write the loop. This is ordinary Julia code, without any need to work through some callback API. Here is an example, in which it may be helpful to note:

    • The function withgradient is like gradient but also returns the value of the function, for logging or diagnostic use.
    • Logging or printing is best done outside of the gradient call, as there is no need to differentiate these commands.
    • Julia's break and continue keywords let you exit from parts of the loop.
    opt_state = Flux.setup(Adam(), model)
    +
    +my_log = []
    +for epoch in 1:100
    +  losses = Float32[]
    +  for (i, data) in enumerate(train_set)
    +    input, label = data
    +
    +    val, grads = Flux.withgradient(model) do m
    +      # Any code inside here is differentiated.
    +      # Evaluation of the model and loss must be inside!
    +      result = m(input)
    +      my_loss(result, label)
    +    end
    +
    +    # Save the loss from the forward pass. (Done outside of gradient.)
    +    push!(losses, val)
    +
    +    # Detect loss of Inf or NaN. Print a warning, and then skip update!
    +    if !isfinite(val)
    +      @warn "loss is $val on item $i" epoch
    +      continue
    +    end
    +
    +    Flux.update!(opt_state, model, grads[1])
    +  end
    +
    +  # Compute some accuracy, and save details as a NamedTuple
    +  acc = my_accuracy(model, train_set)
    +  push!(my_log, (; acc, losses))
    +
    +  # Stop training when some criterion is reached
    +  if  acc > 0.95
    +    println("stopping after $epoch epochs")
    +    break
    +  end
    +end

    Regularisation

    The term regularisation covers a wide variety of techniques aiming to improve the result of training. This is often done to avoid overfitting.

    Some of these can be implemented by simply modifying the loss function. L₂ regularisation (sometimes called ridge regression) adds to the loss a penalty proportional to θ^2 for every scalar parameter θ. For a very simple model, it could be implemented as follows:

    grads = Flux.gradient(densemodel) do m
    +  result = m(input)
    +  penalty = sum(abs2, m.weight)/2 + sum(abs2, m.bias)/2
    +  my_loss(result, label) + 0.42 * penalty
    +end

    Accessing each individual parameter array by hand won't work well for large models. Instead, we can use Flux.params to collect all of them, and then apply a function to each one, and sum the result:

    pen_l2(x::AbstractArray) = sum(abs2, x)/2
    +
    +grads = Flux.gradient(model) do m
    +  result = m(input)
    +  penalty = sum(pen_l2, Flux.params(m))
    +  my_loss(result, label) + 0.42 * penalty
    +end

    However, the gradient of this penalty term is very simple: It is proportional to the original weights. So there is a simpler way to implement exactly the same thing, by modifying the optimiser instead of the loss function. This is done by replacing this:

    opt_state = Flux.setup(Adam(0.1), model)

    with this:

    decay_opt_state = Flux.setup(OptimiserChain(WeightDecay(0.42), Adam(0.1)), model)

    Flux's optimisers are really modifications applied to the gradient before using it to update the parameters, and OptimiserChain applies two such modifications in sequence. The first, WeightDecay, adds 0.42 times the original parameter to the gradient, matching the gradient of the penalty above (with the same, unrealistically large, constant). After that, in either case, Adam computes the final update.
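That the decay term λ .* w really is the gradient of the penalty can be checked numerically with central differences, in pure Julia (the names pen, analytic, numeric are made up for this check):

```julia
λ = 0.42
pen(w) = λ * sum(abs2, w) / 2      # the scaled L₂ penalty from above

w = [1.0, -2.0, 3.0]
analytic = λ .* w                  # what WeightDecay(λ) adds to the gradient

# Central-difference estimate of ∂pen/∂w[i]:
ϵ = 1e-6
numeric = map(eachindex(w)) do i
    e = zeros(length(w)); e[i] = ϵ
    (pen(w .+ e) - pen(w .- e)) / 2ϵ
end

maximum(abs.(numeric .- analytic))   # ≈ 0: the two agree
```

Because the penalty is quadratic, the finite-difference estimate here is exact up to floating-point rounding.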

    The same OptimiserChain mechanism can be used for other purposes, such as gradient clipping with ClipGrad or ClipNorm.

    Besides L2 / weight decay, another common and quite different kind of regularisation is provided by the Dropout layer. This turns off some outputs of the previous layer during training. It should switch automatically, but see trainmode! / testmode! to manually enable or disable this layer.

    Freezing & Schedules

    For finer control of training, you may wish to alter the learning rate mid-way through training. This can be done with adjust!, like this:

    opt_state = Flux.setup(Adam(0.1), model)  # initialise once
    +
    +for epoch in 1:1000
    +  train!([...], state)  # Train with η = 0.1 for first 100,
    +  if epoch == 100       # then change to use η = 0.01 for the rest.
    +    Flux.adjust!(opt_state, 0.01)
    +  end
    +end
    Flux ≤ 0.14

    With the old "implicit" optimiser, opt = Adam(0.1), the equivalent was to directly mutate the Adam struct, opt.eta = 0.001.

    Other hyper-parameters can also be adjusted, such as Flux.adjust!(opt_state, beta = (0.8, 0.99)). And such modifications can be applied to just one part of the model. For instance, this sets a different learning rate for the encoder and the decoder:

    # Consider some model with two parts:
    +bimodel = Chain(enc = [...], dec = [...])
    +
    +# This returns a tree whose structure matches the model:
    +opt_state = Flux.setup(Adam(0.02), bimodel)
    +
    +# Adjust the learning rate to be used for bimodel.layers.enc
    +Flux.adjust!(opt_state.layers.enc, 0.03)

    To completely disable training of some part of the model, use freeze!. This is a temporary modification, reversed by thaw!:

    Flux.freeze!(opt_state.layers.enc)
    +
    +# Now training won't update parameters in bimodel.layers.enc
    +train!(loss, bimodel, data, opt_state)
    +
    +# Un-freeze the entire model:
    +Flux.thaw!(opt_state)
    Flux ≤ 0.14

    The earlier "implicit" equivalent was to pass to gradient an object referencing only part of the model, such as Flux.params(bimodel.layers.enc).

    Implicit or Explicit?

    Flux used to handle gradients, training, and optimisation rules quite differently. The new style described above is called "explicit" by Zygote, and the old style "implicit". Flux 0.13 and 0.14 are the transitional versions which support both.

    The blue-green boxes above describe the changes. For more details on training in the implicit style, see Flux 0.13.6 documentation.

    For details about the two gradient modes, see Zygote's documentation.

    diff --git a/previews/PR2365/training/zygote/index.html b/previews/PR2365/training/zygote/index.html new file mode 100644 index 0000000000..05028f7a8f --- /dev/null +++ b/previews/PR2365/training/zygote/index.html @@ -0,0 +1,197 @@ + +Gradients – Zygote.jl · Flux

    Automatic Differentiation using Zygote.jl

    Flux re-exports the gradient from Zygote, and uses this function within train! to differentiate the model. Zygote has its own documentation, in particular listing some important limitations.

    Explicit style

    The preferred way of using Zygote, and the only way of using most other AD packages, is to explicitly provide a function and its arguments.

    Zygote.gradientMethod
    gradient(f, args...)

    Returns a tuple containing ∂f/∂x for each argument x, the derivative (for scalar x) or the gradient.

    f(args...) must be a real number, see jacobian for array output.

    See also withgradient to keep the value f(args...), and pullback for value and back-propagator.

    julia> gradient(*, 2.0, 3.0, 5.0)
    +(15.0, 10.0, 6.0)
    +
    +julia> gradient(x -> sum(abs2,x), [7.0, 11.0, 13.0])
    +([14.0, 22.0, 26.0],)
    +
    +julia> gradient([7, 11], 0, 1) do x, y, d
    +         p = size(x, d)
    +         sum(x.^p .+ y)
    +       end
    +([14.0, 22.0], 2.0, nothing)
    Zygote.withgradientMethod
    withgradient(f, args...)
    +withgradient(f, ::Params)

    Returns both the value of the function and the gradient, as a named tuple.

    julia> y, ∇ = withgradient(/, 1, 2)
    +(val = 0.5, grad = (0.5, -0.25))
    +
    +julia> ∇ == gradient(/, 1, 2)
    +true

    Allows you to capture auxiliary outputs, in addition to the scalar used by gradient. To do this, f must return a Tuple or NamedTuple. Then it calculates grad = gradient(first∘f, args...) but returns the whole val = f(args...):

    julia> withgradient([1,2,4]) do x
    +          z = 1 ./ x
    +          sum(z), z  # here z is an auxiliary output
    +       end
    +(val = (1.75, [1.0, 0.5, 0.25]), grad = ([-1.0, -0.25, -0.0625],))
    +
    +julia> withgradient(3.0, 4.0) do x, y
    +          (div = x/y, mul = x*y)
    +       end
    +(val = (div = 0.75, mul = 12.0), grad = (0.25, -0.1875))

    Also supports implicit mode:

    julia> w = [3.0];
    +
    +julia> res = withgradient(() -> sum(abs2, w), Params([w]))
    +(val = 9.0, grad = Grads(...))
    +
    +julia> res.grad[w]
    +1-element Vector{Float64}:
    + 6.0
    Zygote.jacobianMethod
    jacobian(f, args...) -> Tuple

    For each array a ∈ args this returns a matrix with Ja[k,i] = ∂y[k]/∂a[i] where y = f(args...) is usually a vector. Arrays of higher dimension are treated like vec(a), or vec(y) for output.

    For scalar x::Number ∈ args, the result is a vector Jx[k] = ∂y[k]/∂x, while for scalar y all results have just one row.

    With any other argument type, no result is produced, even if gradient would work.

    This reverse-mode Jacobian needs to evaluate the pullback once for each element of y. Doing so is usually only efficient when length(y) is small compared to length(a), otherwise forward mode is likely to be better.
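For intuition about these costs, a finite-difference Jacobian makes the column-by-column structure explicit: it needs one evaluation pair per input element, whereas the reverse-mode jacobian above needs one pullback per output element (a pure-Julia sketch; fd_jacobian is a hypothetical helper, not part of Zygote):

```julia
f(a) = 100 .* a[1:3] .^ 2    # the example function used above

# Central-difference Jacobian, built one input column at a time:
function fd_jacobian(f, a; ϵ = 1e-6)
    y = f(a)
    J = zeros(length(y), length(a))
    for i in eachindex(a)
        e = zeros(length(a)); e[i] = ϵ
        J[:, i] = (f(a .+ e) .- f(a .- e)) ./ 2ϵ
    end
    return J
end

J = fd_jacobian(f, collect(1.0:7.0))
# J[1:3, 1:3] ≈ Diagonal([200, 400, 600]), matching the jacobian example above
```

Counting the loop iterations shows why forward-style approaches win when inputs are few and reverse mode wins when outputs are few.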

    See also withjacobian, hessian, hessian_reverse.

    Examples

    julia> jacobian(a -> 100*a[1:3].^2, 1:7)[1]  # first index (rows) is output
    +3×7 Matrix{Int64}:
    + 200    0    0  0  0  0  0
    +   0  400    0  0  0  0  0
    +   0    0  600  0  0  0  0
    +
    +julia> jacobian((a,x) -> a.^2 .* x, [1,2,3], 1)  # scalar argument has vector jacobian
    +([2 0 0; 0 4 0; 0 0 6], [1, 4, 9])
    +
    +julia> jacobian((a,d) -> prod(a, dims=d), [1 2; 3 4; 5 6], 2)
    +([2 0 … 0 0; 0 4 … 3 0; 0 0 … 0 5], [0, 0, 0])
    Warning

    For arguments of any type except Number & AbstractArray, the result is nothing.

    julia> jacobian((a,s) -> a.^length(s), [1,2,3], "str")
    +([3 0 0; 0 12 0; 0 0 27], nothing)
    +
    +julia> jacobian((a,t) -> sum(a .* t[1]) + t[2], [1,2,3], (4,5))
    +([4 4 4], nothing)
    +
    +julia> gradient((a,t) -> sum(a .* t[1]) + t[2], [1,2,3], (4,5))  # gradient understands the tuple
    +([4 4 4], (6, 1))
    Zygote.withjacobianMethod
    withjacobian(f, args...)

    Returns both the value f(args...) and the jacobian as a named tuple.

    julia> withjacobian(cumsum, [1,2,3])
    +(val = [1, 3, 6], grad = ([1 0 0; 1 1 0; 1 1 1],))
    Zygote.hessianFunction
    hessian(f, x)

    Construct the Hessian ∂²f/∂x², where x is a real number or an array, and f(x) is a real number. When x is an array, the result is a matrix H[i,j] = ∂²f/∂x[i]∂x[j], using linear indexing x[i] even if the argument is higher-dimensional.

    This uses forward over reverse, ForwardDiff over Zygote, calling hessian_dual(f, x). See hessian_reverse for an all-Zygote alternative.

    See also diaghessian to compute only the diagonal part.

    Examples

    julia> hessian(x -> x[1]*x[2], randn(2))
    +2×2 Matrix{Float64}:
    + 0.0  1.0
    + 1.0  0.0
    +
    +julia> hessian(x -> sum(x.^3), [1 2; 3 4])  # uses linear indexing of x
    +4×4 Matrix{Int64}:
    + 6   0   0   0
    + 0  18   0   0
    + 0   0  12   0
    + 0   0   0  24
    +
    +julia> hessian(sin, pi/2)
    +-1.0
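The Hessian entries above can be sanity-checked with central differences in pure Julia, without any AD (fd_hessian is a hypothetical helper written for this check, not part of Zygote):

```julia
g(x) = x[1] * x[2]    # the function from the hessian example

# Central-difference estimate of H[i,j] = ∂²g/∂x[i]∂x[j]:
function fd_hessian(g, x; ϵ = 1e-4)
    n = length(x)
    H = zeros(n, n)
    for i in 1:n, j in 1:n
        ei = zeros(n); ei[i] = ϵ
        ej = zeros(n); ej[j] = ϵ
        H[i, j] = (g(x .+ ei .+ ej) - g(x .+ ei .- ej) -
                   g(x .- ei .+ ej) + g(x .- ei .- ej)) / (4ϵ^2)
    end
    return H
end

H = fd_hessian(g, [0.3, -1.2])   # ≈ [0 1; 1 0], matching hessian(x -> x[1]*x[2], ...)
```

For this bilinear function the estimate is essentially exact; for general functions the forward-over-reverse hessian above is both faster and more accurate.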
    Zygote.hessian_reverseFunction
    hessian_reverse(f, x)

    This should be equivalent to hessian(f, x), but implemented using reverse over reverse mode, all Zygote. (This is usually much slower, and more likely to find errors.)

    Zygote.diaghessianFunction
    diaghessian(f, args...) -> Tuple

    Diagonal part of the Hessian. Returns a tuple containing, for each argument x, h of the same shape with h[i] = Hᵢᵢ = ∂²y/∂x[i]∂x[i]. The original evaluation y = f(args...) must give a real number y.

    For one vector argument x, this is equivalent to (diag(hessian(f,x)),). Like hessian it uses ForwardDiff over Zygote.

    Warning

    For arguments of any type except Number & AbstractArray, the result is nothing.

    Examples

    julia> diaghessian(x -> sum(x.^3), [1 2; 3 4])[1]
    +2×2 Matrix{Int64}:
    +  6  12
    + 18  24
    +
    +julia> Diagonal(vec(ans)) == hessian(x -> sum(x.^3), [1 2; 3 4])  # full Hessian is diagonal
    +true
    +
    +julia> diaghessian((x,y) -> sum(x .* y .* y'), [1 22; 333 4], [0.5, 0.666])  # two array arguments
    +([0.0 0.0; 0.0 0.0], [2.0, 8.0])
    +
    +julia> diaghessian(atan, 1, 2)  # two scalar arguments
    +(-0.16, 0.16)
    +
    +julia> hessian(xy -> atan(xy[1], xy[2]), [1, 2])  # full Hessian is not diagonal
    +2×2 Matrix{Float64}:
    + -0.16  -0.12
    + -0.12   0.16

    Implicit style (Flux ≤ 0.14)

    Flux used to use what Zygote calls "implicit" gradients, described here in its documentation. However, support for this will be removed from Flux 0.15.

    Training

    The blue-green boxes in the training section describe the changes needed to upgrade old code from implicit to explicit style.

    Zygote.gradientMethod
    gradient(f, args...)

    Returns a tuple containing ∂f/∂x for each argument x, the derivative (for scalar x) or the gradient.

    f(args...) must be a real number, see jacobian for array output.

    See also withgradient to keep the value f(args...), and pullback for value and back-propagator.

    julia> gradient(*, 2.0, 3.0, 5.0)
    +(15.0, 10.0, 6.0)
    +
    +julia> gradient(x -> sum(abs2,x), [7.0, 11.0, 13.0])
    +([14.0, 22.0, 26.0],)
    +
    +julia> gradient([7, 11], 0, 1) do x, y, d
    +         p = size(x, d)
    +         sum(x.^p .+ y)
    +       end
    +([14.0, 22.0], 2.0, nothing)
    Zygote.ParamsType
    Params([A, B])

    Container for implicit parameters, used when differentiating a zero-argument function () -> loss(A, B) with respect to A, B.

    Zygote.GradsType
    Grads(...)

    Dictionary-like container returned when taking gradients with respect to implicit parameters. For an array W, appearing within Params([W, A, B...]), the gradient is g[W].

    Zygote.jacobianMethod
    jacobian(loss, ::Params)

    Like gradient with implicit parameters, this method takes a zero-argument function and returns an IdDict-like object, now containing the Jacobian for each parameter.

    Examples

    julia> xs = [1 2; 3 4]; ys = [5,7,9];
    +
    +julia> Jxy = jacobian(() -> ys[1:2] .+ sum(xs.^2), Params([xs, ys]))
    +Grads(...)
    +
    +julia> Jxy[ys]
    +2×3 Matrix{Int64}:
    + 1  0  0
    + 0  1  0
    +
    +julia> Jxy[xs]
    +2×4 Matrix{Int64}:
    + 2  6  4  8
    + 2  6  4  8

    ChainRules

    Sometimes it is necessary to exclude some code, or a whole function, from automatic differentiation. This can be done using ChainRules:

    ChainRulesCore.ignore_derivativesFunction
    ignore_derivatives(f::Function)

    Tells the AD system to ignore the gradients of the wrapped closure. The primal computation (forward pass) is executed normally.

    ignore_derivatives() do
    +    value = rand()
    +    push!(collection, value)
    +end

    Using this incorrectly could lead to incorrect gradients. For example, the following function will have zero gradients with respect to its argument:

    function wrong_grads(x)
    +    y = ones(3)
    +    ignore_derivatives() do
    +        push!(y, x)
    +    end
    +    return sum(y)
    +end
    ignore_derivatives(x)

    Tells the AD system to ignore the gradients of the argument. Can be used to avoid unnecessary computation of gradients.

    ignore_derivatives(x) * w
    ChainRulesCore.@non_differentiableMacro
    @non_differentiable(signature_expression)

    A helper to make it easier to declare that a method is not differentiable. This is a short-hand for defining an frule and rrule that return NoTangent() for all partials (even for the function s̄elf-partial itself).

    Keyword arguments should not be included.

    julia> @non_differentiable Base.:(==)(a, b)
    +
    +julia> _, pullback = rrule(==, 2.0, 3.0);
    +
    +julia> pullback(1.0)
    +(NoTangent(), NoTangent(), NoTangent())

    You can place type-constraints in the signature:

    julia> @non_differentiable Base.length(xs::Union{Number, Array})
    +
    +julia> frule((ZeroTangent(), 1), length, [2.0, 3.0])
    +(2, NoTangent())
    Warning

    This helper macro covers only the simple common cases. It does not support where-clauses. For these you can declare the rrule and frule directly

    To manually supply the gradient for one function, you should define a method of rrule. ChainRules has detailed documentation on how this works.

    ChainRulesCore.rruleFunction
    rrule([::RuleConfig,] f, x...)

    Expressing x as the tuple (x₁, x₂, ...) and the output tuple of f(x...) as Ω, return the tuple:

    (Ω, (Ω̄₁, Ω̄₂, ...) -> (s̄elf, x̄₁, x̄₂, ...))

    Where the second return value is the propagation rule, or pullback. It takes in cotangents corresponding to the outputs (Ω̄₁, Ω̄₂, ...) and returns s̄elf, the cotangent for the internal values of the function itself (for closures), together with cotangents for the inputs (x̄₁, x̄₂, ...).

    If no method matching rrule(f, xs...) has been defined, then return nothing.

    Examples:

    unary input, unary output scalar function:

    julia> x = rand();
    +
    +julia> sinx, sin_pullback = rrule(sin, x);
    +
    +julia> sinx == sin(x)
    +true
    +
    +julia> sin_pullback(1) == (NoTangent(), cos(x))
    +true

    binary input, unary output scalar function:

    julia> x, y = rand(2);
    +
    +julia> hypotxy, hypot_pullback = rrule(hypot, x, y);
    +
    +julia> hypotxy == hypot(x, y)
    +true
    +
    +julia> hypot_pullback(1) == (NoTangent(), (x / hypot(x, y)), (y / hypot(x, y)))
    +true

    The optional RuleConfig option allows specifying rrules only for AD systems that support given features. If not needed, then it can be omitted and the rrule without it will be hit as a fallback. This is the case for most rules.

    See also: frule, @scalar_rule, RuleConfig

    ChainRulesCore.fruleFunction
    frule([::RuleConfig,] (Δf, Δx...), f, x...)

    Expressing the output of f(x...) as Ω, return the tuple:

    (Ω, ΔΩ)

    The second return value is the tangent w.r.t. the output.

    If no method matching frule((Δf, Δx...), f, x...) has been defined, then return nothing.

    Examples:

    unary input, unary output scalar function:

    julia> dself = NoTangent();
    +
    +julia> x = rand()
    +0.8236475079774124
    +
    +julia> sinx, Δsinx = frule((dself, 1), sin, x)
    +(0.7336293678134624, 0.6795498147167869)
    +
    +julia> sinx == sin(x)
    +true
    +
    +julia> Δsinx == cos(x)
    +true

    Unary input, binary output scalar function:

    julia> sincosx, Δsincosx = frule((dself, 1), sincos, x);
    +
    +julia> sincosx == sincos(x)
    +true
    +
    +julia> Δsincosx[1] == cos(x)
    +true
    +
    +julia> Δsincosx[2] == -sin(x)
    +true

    Note that technically speaking Julia does not have multiple-output functions, just functions that return a single output that is iterable, like a Tuple. So this is actually a Tangent:

    julia> Δsincosx
    +Tangent{Tuple{Float64, Float64}}(0.6795498147167869, -0.7336293678134624)

    The optional RuleConfig option allows specifying frules only for AD systems that support given features. If not needed, then it can be omitted and the frule without it will be hit as a fallback. This is the case for most rules.

    See also: rrule, @scalar_rule, RuleConfig

    ChainRulesCore.@scalar_ruleMacro
    @scalar_rule(f(x₁, x₂, ...),
    +             @setup(statement₁, statement₂, ...),
    +             (∂f₁_∂x₁, ∂f₁_∂x₂, ...),
    +             (∂f₂_∂x₁, ∂f₂_∂x₂, ...),
    +             ...)

    A convenience macro that generates simple scalar forward or reverse rules using the provided partial derivatives. Specifically, generates the corresponding methods for frule and rrule:

    function ChainRulesCore.frule((NoTangent(), Δx₁, Δx₂, ...), ::typeof(f), x₁::Number, x₂::Number, ...)
    +    Ω = f(x₁, x₂, ...)
    +    $(statement₁, statement₂, ...)
    +    return Ω, (
    +            (∂f₁_∂x₁ * Δx₁ + ∂f₁_∂x₂ * Δx₂ + ...),
    +            (∂f₂_∂x₁ * Δx₁ + ∂f₂_∂x₂ * Δx₂ + ...),
    +            ...
    +        )
    +end
    +
    +function ChainRulesCore.rrule(::typeof(f), x₁::Number, x₂::Number, ...)
    +    Ω = f(x₁, x₂, ...)
    +    $(statement₁, statement₂, ...)
    +    return Ω, ((ΔΩ₁, ΔΩ₂, ...)) -> (
    +            NoTangent(),
    +            ∂f₁_∂x₁ * ΔΩ₁ + ∂f₂_∂x₁ * ΔΩ₂ + ...,
    +            ∂f₁_∂x₂ * ΔΩ₁ + ∂f₂_∂x₂ * ΔΩ₂ + ...,
    +            ...
    +        )
    +end

    If no type constraints in f(x₁, x₂, ...) within the call to @scalar_rule are provided, each parameter in the resulting frule/rrule definition is given a type constraint of Number. Constraints may also be explicitly be provided to override the Number constraint, e.g. f(x₁::Complex, x₂), which will constrain x₁ to Complex and x₂ to Number.

    At present this does not support defining for closures/functors. Thus in reverse-mode, the first returned partial, representing the derivative with respect to the function itself, is always NoTangent(). And in forward-mode, the first input to the returned propagator is always ignored.

    The result of f(x₁, x₂, ...) is automatically bound to Ω. This allows the primal result to be conveniently referenced (as Ω) within the derivative/setup expressions.

    This macro assumes complex functions are holomorphic. In general, for non-holomorphic functions, the frule and rrule must be defined manually.

    If the derivative is one, (e.g. for identity functions) true can be used as the most general multiplicative identity.

    The @setup argument can be elided if no setup code is needed. In other words:

    @scalar_rule(f(x₁, x₂, ...),
    +             (∂f₁_∂x₁, ∂f₁_∂x₂, ...),
    +             (∂f₂_∂x₁, ∂f₂_∂x₂, ...),
    +             ...)

    is equivalent to:

    @scalar_rule(f(x₁, x₂, ...),
    +             @setup(nothing),
    +             (∂f₁_∂x₁, ∂f₁_∂x₂, ...),
    +             (∂f₂_∂x₁, ∂f₂_∂x₂, ...),
    +             ...)

    For examples, see ChainRules' rulesets directory.

    See also: frule, rrule.

    ChainRulesCore.NoTangentType
    NoTangent() <: AbstractZero

    This tangent indicates that the derivative does not exist. It is the tangent type for primal types that are not differentiable, such as integers or booleans (when they are not being used to represent floating-point values). The only valid way to perturb such values is to not change them at all. As a consequence, NoTangent is functionally identical to ZeroTangent(), but it provides additional semantic information.

    Adding NoTangent() to a primal is generally wrong: gradient-based methods cannot be used to optimize over discrete variables. An optimization package making use of this might want to check for such a case.

    Note

    This does not indicate that the derivative is not implemented, but rather that mathematically it is not defined.

    This mostly shows up as the derivative with respect to dimension, index, or size arguments.

        function rrule(fill, x, len::Int)
    +        y = fill(x, len)
    +        fill_pullback(ȳ) = (NoTangent(), @thunk(sum(ȳ)), NoTangent())
    +        return y, fill_pullback
    +    end
    ChainRulesCore.ZeroTangent — Type
    ZeroTangent() <: AbstractZero

    The additive identity for tangents. This is basically the same as 0. A derivative of ZeroTangent() does not propagate through the primal function.
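These identities can be checked directly (a small sketch, assuming the ChainRulesCore package is installed):

```julia
using ChainRulesCore

# ZeroTangent is a hard zero: adding it to any tangent leaves it unchanged.
@assert ZeroTangent() + 1.5 == 1.5
@assert [1.0, 2.0] + ZeroTangent() == [1.0, 2.0]
@assert ZeroTangent() + ZeroTangent() == ZeroTangent()
```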

    diff --git a/previews/PR2365/tutorials/2020-09-15-deep-learning-flux/index.html b/previews/PR2365/tutorials/2020-09-15-deep-learning-flux/index.html new file mode 100644 index 0000000000..d59f1fadc3 --- /dev/null +++ b/previews/PR2365/tutorials/2020-09-15-deep-learning-flux/index.html @@ -0,0 +1,124 @@ + +Deep Learning with Julia & Flux: A 60 Minute Blitz · Flux

    Deep Learning with Julia & Flux: A 60 Minute Blitz

    This is a quick intro to Flux loosely based on PyTorch's tutorial. It introduces basic Julia programming, as well as Zygote, a source-to-source automatic differentiation (AD) framework in Julia. We'll use these tools to build a very simple neural network.

    Arrays

    The starting point for all of our models is the Array (sometimes referred to as a Tensor in other frameworks). This is really just a list of numbers, which might be arranged into a shape like a square. Let's write down an array with three elements.

    x = [1, 2, 3]

    Here's a matrix – a square array with four elements.

    x = [1 2; 3 4]

    We often work with arrays of thousands of elements, and don't usually write them down by hand. Here's how we can create an array of 5×3 = 15 elements, each a random number from zero to one.

    x = rand(5, 3)

    There are a few functions like this; try replacing rand with ones, zeros, or randn to see what they do.

    By default, Julia stores numbers in a high-precision format called Float64. In ML we often don't need all those digits, and can ask Julia to work with Float32 instead. We can even ask for more digits using BigFloat.

    x = rand(BigFloat, 5, 3)
    +
    +x = rand(Float32, 5, 3)

    We can ask the array how many elements it has.

    length(x)

    Or, more specifically, what size it has.

    size(x)

    We sometimes want to see some elements of the array on their own.

    x
    +
    +x[2, 3]

    This gets the element in the second row and third column. We can also get every row of the third column.

    x[:, 3]

    We can add arrays, and subtract them, which adds or subtracts each element of the array.

    x + x
    +
    +x - x

    Julia supports a feature called broadcasting, using the . syntax. This tiles small arrays (or single numbers) to fill bigger ones.

    x .+ 1

    We can see Julia tile the column vector 1:5 across all rows of the larger array.

    zeros(5,5) .+ (1:5)

    The x' syntax is used to transpose the column 1:5 into an equivalent row, and Julia will tile that across columns.

    zeros(5,5) .+ (1:5)'

    We can use this to make a times table.

    (1:5) .* (1:5)'
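To confirm the broadcast really builds a times table, we can compare it against an explicit comprehension:

```julia
# Broadcasting a column against a row is the same as a nested comprehension:
table = (1:5) .* (1:5)'
@assert table == [i * j for i in 1:5, j in 1:5]
@assert table[3, 4] == 12
```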

    Finally, and importantly for machine learning, we can conveniently do things like matrix multiply.

    W = randn(5, 10)
    +x = rand(10)
    +W * x

    Julia's arrays are very powerful, and you can learn more about what they can do here.

    CUDA Arrays

    CUDA functionality is provided separately by the CUDA package. If you have a GPU and CUDA available, you can run ] add CUDA in a REPL or IJulia to get it.

    Once CUDA is loaded you can move any array to the GPU with the cu function, and it supports all of the above operations with the same syntax.

    using CUDA
    +x = cu(rand(5, 3))

    Automatic Differentiation

    You probably learned to take derivatives in school. We start with a simple mathematical function like

    f(x) = 3x^2 + 2x + 1
    +
    +f(5)

    In simple cases it's pretty easy to work out the gradient by hand – here it's 6x+2. But it's much easier to make Flux do the work for us!

    using Flux: gradient
    +
    +df(x) = gradient(f, x)[1]
    +
    +df(5)

    You can try this with a few different inputs to make sure it's really the same as 6x+2. We can even do this multiple times (but the second derivative is a fairly boring 6).

    ddf(x) = gradient(df, x)[1]
    +
    +ddf(5)

    Flux's AD can handle any Julia code you throw at it, including loops, recursion and custom layers, so long as the mathematical functions you call are differentiable. For example, we can differentiate a Taylor approximation to the sin function.

    mysin(x) = sum((-1)^k*x^(1+2k)/factorial(1+2k) for k in 0:5)
    +
    +x = 0.5
    +
    +mysin(x), gradient(mysin, x)
    +
    +sin(x), cos(x)

    You can see that the derivative we calculated is very close to cos(x), as we expect.
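We can also sanity-check this derivative without any AD at all, using a central finite difference (fd here is a hypothetical helper, not part of Flux):

```julia
# A Taylor approximation to sin, as above.
mysin(x) = sum((-1)^k * x^(1 + 2k) / factorial(1 + 2k) for k in 0:5)

# Central finite difference: (f(x+h) - f(x-h)) / 2h approximates f'(x).
fd(f, x; h = 1e-6) = (f(x + h) - f(x - h)) / 2h

x = 0.5
@assert isapprox(fd(mysin, x), cos(x); atol = 1e-4)
```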

    This gets more interesting when we consider functions that take arrays as inputs, rather than just a single number. For example, here's a function that takes a matrix and two vectors (the definition itself is arbitrary)

    myloss(W, b, x) = sum(W * x .+ b)
    +
    +W = randn(3, 5)
    +b = zeros(3)
    +x = rand(5)
    +
    +gradient(myloss, W, b, x)

    Now we get gradients for each of the inputs W, b and x, which will come in handy when we want to train models.
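For this particular loss the gradients have simple closed forms, which we can use to check the result (a sketch, assuming Flux is installed):

```julia
using Flux: gradient

myloss(W, b, x) = sum(W * x .+ b)

W, b, x = randn(3, 5), zeros(3), rand(5)
gW, gb, gx = gradient(myloss, W, b, x)

# Closed forms for the gradients of sum(W * x .+ b):
@assert gW ≈ ones(3) * x'            # every row of ∂/∂W is x'
@assert gb ≈ ones(3)                 # ∂/∂b is a vector of ones
@assert gx ≈ vec(sum(W, dims = 1))   # ∂/∂x sums the columns of W
```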

    Because ML models can contain hundreds of parameters, Flux provides a slightly different way of writing gradient. We instead mark arrays with params to indicate that we want their derivatives. W and b represent the weight and bias respectively.

    using Flux: params
    +
    +W = randn(3, 5)
    +b = zeros(3)
    +x = rand(5)
    +
    +y(x) = sum(W * x .+ b)
    +
    +grads = gradient(()->y(x), params([W, b]))
    +
    +grads[W], grads[b]

    We can now grab the gradients of W and b directly from those parameters.

    This comes in handy when working with layers. A layer is just a handy container for some parameters. For example, Dense does a linear transform for you.

    using Flux
    +
    +m = Dense(10, 5)
    +
    +x = rand(Float32, 10)

    We can easily get the parameters of any layer or model with params.

    params(m)

    This makes it very easy to calculate the gradient for all parameters in a network, even if it has many parameters.

    x = rand(Float32, 10)
    +m = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)
    +l(x) = sum(Flux.crossentropy(m(x), [0.5, 0.5]))
    +grads = gradient(params(m)) do
    +    l(x)
    +end
    +for p in params(m)
    +    println(grads[p])
    +end

    You don't have to use layers, but they can be convenient for many simple kinds of models and fast iteration.

    The next step is to update our weights and perform optimisation. As you may know, gradient descent is a simple algorithm that updates the weights by stepping in the opposite direction of the gradient, scaled by a learning rate: weights = weights - learning_rate * gradient.
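The update rule is easy to try on a toy problem in plain Julia (a sketch; descend is a hypothetical helper): minimising f(w) = (w - 3)^2, whose gradient is 2(w - 3), should converge to w = 3.

```julia
function descend(w; lr = 0.1, steps = 100)
    for _ in 1:steps
        grad = 2 * (w - 3)   # ∇f for f(w) = (w - 3)^2
        w -= lr * grad       # weights -= learning_rate * gradient
    end
    return w
end

@assert isapprox(descend(0.0), 3.0; atol = 1e-6)
```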

    using Flux.Optimise: update!, Descent
    +η = 0.1
    +for p in params(m)
    +  update!(p, -η * grads[p])
    +end

    While this is a valid way of updating our weights, it can get more complicated as the algorithms we use get more involved.

    Flux comes with a bunch of pre-defined optimisers and makes writing our own really simple. We just give it the learning rate η:

    opt = Descent(0.01)

    Training a network reduces to iterating over a dataset multiple times, performing these steps in order. Just for a quick implementation, let’s train a network that learns to predict 0.5 for every input of 10 floats. Flux defines the train! function to do it for us.

    data, labels = rand(10, 100), fill(0.5, 2, 100)
    +loss(x, y) = sum(Flux.crossentropy(m(x), y))
    +Flux.train!(loss, params(m), [(data,labels)], opt)

    You don't have to use train!. In cases where arbitrary logic might be better suited, you could open up this training loop like so:

      for d in training_set # assuming d looks like (data, labels)
    +    # our super logic
    +    gs = gradient(params(m)) do #m is our model
    +      l = loss(d...)
    +    end
    +    update!(opt, params(m), gs)
    +  end

    Training a Classifier

    Getting a real classifier to work might help cement the workflow a bit more. CIFAR10 is a dataset of 50k tiny training images split into 10 classes.

    We will do the following steps in order:

    • Load CIFAR10 training and test datasets
    • Define a Convolution Neural Network
    • Define a loss function
    • Train the network on the training data
    • Test the network on the test data

    Loading the Dataset

    using Statistics
    +using Flux, Flux.Optimise
    +using MLDatasets: CIFAR10
    +using Images.ImageCore
    +using Flux: onehotbatch, onecold
    +using Base.Iterators: partition
    +using CUDA

    This image will give us an idea of what we are dealing with.


    train_x, train_y = CIFAR10.traindata(Float32)
    +labels = onehotbatch(train_y, 0:9)

    The train_x contains 50000 images converted to 32×32×3 arrays with the third dimension being the 3 channels (R,G,B). Let's take a look at a random image from the train_x. For this, we need to permute the dimensions to 3×32×32 and use colorview to convert it back to an image.

    using Plots
    +image(x) = colorview(RGB, permutedims(x, (3, 2, 1)))
    +plot(image(train_x[:,:,:,rand(1:end)]))

    We can now arrange the training data in batches of, say, 1000 and keep a validation set to track our progress. This process is called minibatch learning, which is a popular method of training large neural networks. Rather than sending the entire dataset at once, we break it down into smaller chunks (called minibatches) that are typically chosen at random, and train only on them. It is shown to help with escaping saddle points.

    The first 49k images (in batches of 1000) will be our training set, and the rest is for validation. partition handily breaks the set we give it into consecutive parts (1000 in this case).
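For example, partition on a small range shows the chunking behaviour (the chunk sizes below are illustrative):

```julia
using Base.Iterators: partition

# partition breaks a range into consecutive chunks of the requested size;
# the last chunk may be shorter.
chunks = collect(partition(1:10, 4))
@assert chunks == [1:4, 5:8, 9:10]

# The training set above therefore yields 49 minibatches of 1000 images each.
@assert length(collect(partition(1:49000, 1000))) == 49
```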

    train = ([(train_x[:,:,:,i], labels[:,i]) for i in partition(1:49000, 1000)]) |> gpu
    +valset = 49001:50000
    +valX = train_x[:,:,:,valset] |> gpu
    +valY = labels[:, valset] |> gpu

    Defining the Classifier

    Now we can define our Convolutional Neural Network (CNN).

    A convolutional neural network defines a kernel and slides it across a matrix to create an intermediate representation from which to extract features. It creates higher-order features as it goes into deeper layers, making it suitable for images, where the structure of the subject is what will help us determine which class it belongs to.
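The sliding-kernel idea can be sketched in a few lines of plain Julia (slide is a hypothetical helper, not part of Flux; it ignores channels, padding, and stride):

```julia
# Slide a k×k kernel across a matrix, taking an elementwise product and sum
# at each position ("valid" cross-correlation).
function slide(A, K)
    m, n = size(A)
    k = size(K, 1)
    [sum(A[i:i+k-1, j:j+k-1] .* K) for i in 1:m-k+1, j in 1:n-k+1]
end

A = reshape(1:16, 4, 4)        # 4×4 input
K = ones(2, 2)                 # 2×2 kernel
out = slide(A, K)
@assert size(out) == (3, 3)    # a 4×4 input and 2×2 kernel give a 3×3 output
@assert out[1, 1] == 1 + 2 + 5 + 6
```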

    m = Chain(
    +  Conv((5,5), 3=>16, relu),
    +  MaxPool((2,2)),
    +  Conv((5,5), 16=>8, relu),
    +  MaxPool((2,2)),
    +  x -> reshape(x, :, size(x, 4)),
    +  Dense(200, 120),
    +  Dense(120, 84),
    +  Dense(84, 10),
    +  softmax) |> gpu

    We will use a crossentropy loss and a Momentum optimiser here. Crossentropy is a good option when working with multiple independent classes. Momentum smooths the updates by accumulating a velocity from past gradients. It helps maintain a bit of adaptivity in our optimisation, preventing us from overshooting our desired destination.

    using Flux: crossentropy, Momentum
    +
    +loss(x, y) = sum(crossentropy(m(x), y))
    +opt = Momentum(0.01)

    We can start writing our train loop where we will keep track of some basic accuracy numbers about our model. We can define an accuracy function for it like so.

    accuracy(x, y) = mean(onecold(m(x), 0:9) .== onecold(y, 0:9))
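Under the hood, onecold is essentially a per-column argmax mapped back to the label set; a sketch (onecold_sketch is a hypothetical helper, not the Flux function):

```julia
# Pick the label with the highest score in each column.
onecold_sketch(scores, labels) = [labels[argmax(col)] for col in eachcol(scores)]

scores = [0.1 0.7;
          0.8 0.2;
          0.1 0.1]            # two "images", three classes 0:2
@assert onecold_sketch(scores, 0:2) == [1, 0]
```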

    Training the Classifier

    Training is where we do a bunch of the interesting operations we defined earlier, and see what our net is capable of. We will loop over the dataset 10 times and feed the inputs to the neural network and optimise.

    epochs = 10
    +
    +for epoch = 1:epochs
    +  for d in train
    +    gs = gradient(params(m)) do
    +      l = loss(d...)
    +    end
    +    update!(opt, params(m), gs)
    +  end
    +  @show accuracy(valX, valY)
    +end

    Seeing our training routine unfold gives us an idea of how the network learnt the function. This is not bad for a small hand-written network, trained for a limited time.

    Training on a GPU

    The gpu functions you see sprinkled through this bit of the code tell Flux to move these entities to an available GPU, and subsequently train on it. No extra faffing about required! The same bit of code would work on any hardware with some small annotations like you saw here.

    Testing the Network

    We have trained the network for 10 passes over the training dataset. But we need to check if the network has learnt anything at all.

    We will check this by predicting the class label that the neural network outputs, and checking it against the ground-truth. If the prediction is correct, we add the sample to the list of correct predictions. This will be done on a yet unseen section of data.

    Okay, first step. Let us perform the exact same preprocessing on this set, as we did on our training set.

    test_x, test_y = CIFAR10.testdata(Float32)
    +test_labels = onehotbatch(test_y, 0:9)
    +
    +test = gpu.([(test_x[:,:,:,i], test_labels[:,i]) for i in partition(1:10000, 1000)])

    Next, display an image from the test set.

    plot(image(test_x[:,:,:,rand(1:end)]))

    The outputs are energies for the 10 classes. The higher the energy for a class, the more the network thinks that the image is of that particular class. Every column corresponds to the output of one image, with the 10 floats in the column being the energies.

    Let's see how the model fared.

    ids = rand(1:10000, 5)
    +rand_test = test_x[:,:,:,ids] |> gpu
    +rand_truth = test_y[ids]
    +m(rand_test)

    This looks similar to how we would expect the results to be. At this point, it's a good idea to see how our net actually performs on new data that we have prepared.

    accuracy(test[1]...)

    This is much better than random chance, which sits at 10% (since we only have 10 classes), and not bad at all for a small hand-written network like ours.

    Let's take a look at how the net performed on each class individually.

    class_correct = zeros(10)
    +class_total = zeros(10)
    +for i in 1:10
    +  preds = m(test[i][1])
    +  lab = test[i][2]
    +  for j = 1:1000
    +    pred_class = findmax(preds[:, j])[2]
    +    actual_class = findmax(lab[:, j])[2]
    +    if pred_class == actual_class
    +      class_correct[pred_class] += 1
    +    end
    +    class_total[actual_class] += 1
    +  end
    +end
    +
    +class_correct ./ class_total

    The spread seems pretty good, with certain classes performing significantly better than the others. Why should that be?

    Info

    Originally published at fluxml.ai on 15 November 2020. Written by Saswat Das, Mike Innes, Andrew Dinhobl, Ygor Canalli, Sudhanshu Agrawal, João Felipe Santos.

    diff --git a/previews/PR2365/tutorials/2021-01-26-mlp/index.html b/previews/PR2365/tutorials/2021-01-26-mlp/index.html new file mode 100644 index 0000000000..1e5ae11a7c --- /dev/null +++ b/previews/PR2365/tutorials/2021-01-26-mlp/index.html @@ -0,0 +1,81 @@ + +Tutorial: Simple Multi-layer Perceptron · Flux

    Tutorial: Simple Multi-layer Perceptron

    In this example, we create a simple multi-layer perceptron (MLP) that classifies handwritten digits using the MNIST dataset. An MLP consists of at least three layers of stacked perceptrons: input, hidden, and output. Each neuron of an MLP has parameters (weights and a bias) and uses an activation function to compute its output.

    To run this example, we need the following packages:

    using Flux, Statistics
    +using Flux.Data: DataLoader
    +using Flux: onehotbatch, onecold, logitcrossentropy, throttle, params
    +using Base.Iterators: repeated
    +using CUDA
    +using MLDatasets
    +if has_cuda()		# Check if CUDA is available
    +    @info "CUDA is on"
    +    CUDA.allowscalar(false)
    +end

    We set default values for learning rate, batch size, epochs, and the usage of a GPU (if available) for our model:

    Base.@kwdef mutable struct Args
    +    rate::Float64 = 3e-4    # learning rate
    +    batchsize::Int = 1024   # batch size
    +    epochs::Int = 10        # number of epochs
    +    device::Function = gpu  # set as gpu, if gpu available
    +end

    If a GPU is available on our local system, then Flux uses it for computing the loss and updating the weights and biases when training our model.

    Data

    We create the function getdata to load the MNIST train and test data sets from MLDatasets and prepare them for the training process. In addition, we set mini-batches of the data sets by loading them onto a DataLoader object.

    function getdata(args)
    +    ENV["DATADEPS_ALWAYS_ACCEPT"] = "true"
    +
    +    # Loading Dataset	
    +    xtrain, ytrain = MLDatasets.MNIST.traindata(Float32)
    +    xtest, ytest = MLDatasets.MNIST.testdata(Float32)
    +	
    +    # Reshape Data in order to flatten each image into a linear array
    +    xtrain = Flux.flatten(xtrain)
    +    xtest = Flux.flatten(xtest)
    +
    +    # One-hot-encode the labels
    +    ytrain, ytest = onehotbatch(ytrain, 0:9), onehotbatch(ytest, 0:9)
    +
    +    # Batching
    +    train_data = DataLoader((xtrain, ytrain), batchsize=args.batchsize, shuffle=true)
    +    test_data = DataLoader((xtest, ytest), batchsize=args.batchsize)
    +
    +    return train_data, test_data
    +end

    getdata performs the following steps:

    • Loads MNIST data set: Loads the train and test set tensors. The shape of the train data is 28×28×60000 and of the test data 28×28×10000.
    • Reshapes the train and test data: Uses the flatten function to reshape the train data set into a 784×60000 array and the test data set into a 784×10000 array. Notice that we reshape the data so that we can pass these as arguments for the input layer of our model (a simple MLP expects a vector as an input).
    • One-hot encodes the train and test labels: Creates a batch of one-hot vectors so we can pass the labels of the data as arguments for the loss function. For this example, we use the logitcrossentropy function and it expects data to be one-hot encoded.
    • Creates batches of data: Creates two DataLoader objects (train and test) that handle data mini-batches of size 1024 (as defined above). We create these two objects so that we can pass the entire data set through the loss function at once when training our model. Also, it shuffles the data points during each iteration (shuffle=true).
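As a quick sanity check on the batching above (the arithmetic is illustrative; nbatches is a hypothetical helper):

```julia
# With 60000 training images and a batch size of 1024, the DataLoader
# yields ceil(60000 / 1024) = 59 mini-batches; the last one is smaller.
nbatches(n, bs) = cld(n, bs)
@assert nbatches(60000, 1024) == 59
@assert 60000 - 58 * 1024 == 608   # size of the final partial batch
```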

    Model

    As we mentioned above, an MLP consists of three fully connected layers. For this example, we define our model with the following layers and dimensions:

    • Input: It has 784 perceptrons (the MNIST image size is 28x28). We flatten the train and test data so that we can pass them as arguments to this layer.
    • Hidden: It has 32 perceptrons that use the relu activation function.
    • Output: It has 10 perceptrons that output the model's prediction or probability that a digit is 0 to 9.

    We define our model with the build_model function:

    function build_model(; imgsize=(28,28,1), nclasses=10)
    +    return Chain(
    + 	    Dense(prod(imgsize), 32, relu),
    +            Dense(32, nclasses))
    +end

    Note that we use the functions Dense so that our model is densely (or fully) connected and Chain to chain the computation of the three layers.

    Loss functions

    Now, we define the loss function loss_all. It expects a DataLoader object and the model function we defined above as arguments. Notice that this function iterates through the dataloader object in mini-batches and uses the function logitcrossentropy to compute the difference between the predicted and actual values.

    function loss_all(dataloader, model)
    +    l = 0f0
    +    for (x,y) in dataloader
    +        l += logitcrossentropy(model(x), y)
    +    end
    +    l/length(dataloader)
    +end
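logitcrossentropy fuses a softmax with the crossentropy in a numerically stable way; a sketch of the computation for a single column from its definition (logsoftmax_sketch and lce are hypothetical helpers, not Flux functions):

```julia
# logsoftmax(z) = z .- log(sum(exp.(z))), computed stably by
# subtracting the maximum before exponentiating.
function logsoftmax_sketch(z)
    m = maximum(z)
    z .- (m + log(sum(exp.(z .- m))))
end

# Crossentropy against one-hot y for one column; Flux's version
# additionally averages over the columns of a batch.
lce(ŷ, y) = -sum(y .* logsoftmax_sketch(ŷ))

ŷ = [2.0, 1.0, 0.1]
y = [1.0, 0.0, 0.0]                 # one-hot ground truth
@assert isapprox(lce(ŷ, y), -log(exp(2.0) / sum(exp.(ŷ))); atol = 1e-12)
```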

    In addition, we define the function (accuracy) to report the accuracy of our model during the training process. To compute the accuracy, we need to decode the output of our model using the onecold function.

    function accuracy(data_loader, model)
    +    acc = 0
    +    for (x,y) in data_loader
    +        acc += sum(onecold(cpu(model(x))) .== onecold(cpu(y)))*1 / size(x,2)
    +    end
    +    acc/length(data_loader)
    +end

    Train our model

    Finally, we create the train function that calls the functions we defined and trains the model.

    function train(; kws...)
    +    # Initializing Model parameters 
    +    args = Args(; kws...)
    +
    +    # Load Data
    +    train_data,test_data = getdata(args)
    +
    +    # Construct model
    +    m = build_model()
    +    train_data = args.device.(train_data)
    +    test_data = args.device.(test_data)
    +    m = args.device(m)
    +    loss(x,y) = logitcrossentropy(m(x), y)
    +    
    +    ## Training
    +    evalcb = () -> @show(loss_all(train_data, m))
    +    opt = Adam(args.rate)
    +	
    +    for epoch in 1:args.epochs
    +        @info "Epoch $epoch"
    +        Flux.train!(loss, params(m), train_data, opt, cb = evalcb)
    +    end
    +
    +    @show accuracy(train_data, m)
    +
    +    @show accuracy(test_data, m)
    +end

    train performs the following steps:

    • Initializes the model parameters: Creates the args object that contains the default values for training our model.
    • Loads the train and test data: Calls the function getdata we defined above.
    • Constructs the model: Builds the model and loads the train and test data sets, and our model onto the GPU (if available).
    • Trains the model: Defines the callback function evalcb to show the value of the loss_all function during the training process. Then, it sets Adam as the optimiser for training our model. Finally, it runs the training process for 10 epochs (as defined in the args object) and shows the accuracy value for the train and test data.

    To see the full version of this example, see Simple multi-layer perceptron - model-zoo.

    Resources

    Info

    Originally published at fluxml.ai on 26 January 2021. Written by Adarsh Kumar, Mike J Innes, Andrew Dinhobl, Jerry Ling, natema, Zhang Shitian, Liliana Badillo, Dhairya Gandhi

    diff --git a/previews/PR2365/tutorials/2021-02-07-convnet/index.html b/previews/PR2365/tutorials/2021-02-07-convnet/index.html new file mode 100644 index 0000000000..1cbb4f368f --- /dev/null +++ b/previews/PR2365/tutorials/2021-02-07-convnet/index.html @@ -0,0 +1,152 @@ + +Tutorial: A Simple ConvNet · Flux

    Tutorial: A Simple ConvNet

    In this tutorial, we build a simple Convolutional Neural Network (ConvNet) to classify the MNIST dataset. This model has a simple architecture with three feature detection layers (Conv -> ReLU -> MaxPool) followed by a final dense layer that classifies MNIST handwritten digits. Note that this model, while simple, should hit around 99% test accuracy after training for approximately 20 epochs.

    This example writes out the saved model to the file mnist_conv.bson. Also, it demonstrates basic model construction, training, saving, conditional early-exit, and learning rate scheduling.

    To run this example, we need the following packages:

    using Flux, MLDatasets, Statistics
    +using Flux: onehotbatch, onecold, logitcrossentropy, params
    +using MLDatasets: MNIST
    +using Base.Iterators: partition
    +using Printf, BSON
    +using CUDA
    +CUDA.allowscalar(false)

    We set default values for learning rate, batch size, number of epochs, and path for saving the file mnist_conv.bson:

    Base.@kwdef mutable struct TrainArgs
    +   lr::Float64 = 3e-3
    +   epochs::Int = 20
    +   batch_size = 128
    +   savepath::String = "./"
    +end

    Data

    To train our model, we need to bundle images together with their labels and group them into mini-batches (this makes the training process faster). We define the function make_minibatch that takes as inputs the images (X) and their labels (Y) as well as the indices for the mini-batches (idxs):

    function make_minibatch(X, Y, idxs)
    +   X_batch = Array{Float32}(undef, size(X)[1:end-1]..., 1, length(idxs))
    +   for i in 1:length(idxs)
    +       X_batch[:, :, :, i] = Float32.(X[:,:,idxs[i]])
    +   end
    +   Y_batch = onehotbatch(Y[idxs], 0:9)
    +   return (X_batch, Y_batch)
    +end

    make_minibatch takes the following steps:

    • Creates the X_batch array of size 28x28x1x128 to store the mini-batches.
    • Stores the mini-batches in X_batch.
    • One hot encodes the labels of the images.
    • Stores the labels in Y_batch.

    get_processed_data loads the train and test data from MLDatasets. First, it loads the images and labels of the train data set, and creates an array that contains the indices of the train images that correspond to each mini-batch (of size args.batch_size). Then, it calls the make_minibatch function to create all of the train mini-batches. Finally, it loads the test images and creates one mini-batch that contains them all.

    function get_processed_data(args)
    +   # Load labels and images
    +   train_imgs, train_labels = MNIST.traindata()
    +   mb_idxs = partition(1:length(train_labels), args.batch_size)
    +   train_set = [make_minibatch(train_imgs, train_labels, i) for i in mb_idxs]
    +  
    +   # Prepare test set as one giant minibatch:
    +   test_imgs, test_labels = MNIST.testdata()
    +   test_set = make_minibatch(test_imgs, test_labels, 1:length(test_labels))
    + 
    +   return train_set, test_set
    + 
    +end

    Model

    Now, we define the build_model function that creates a ConvNet model which is composed of three convolution layers (feature detection) and one classification layer. The input layer size is 28×28. The images are grayscale, which means there is only one channel (compared to 3 for RGB) in every data point. The convolutional layer structure is Conv(kernel, input_channels => output_channels, ...). Each convolution layer applies a Rectified Linear Unit (ReLU) activation and is followed by a MaxPool operation that halves the spatial size of the image. The classification layer is a dense layer that outputs a vector of 10 dimensions, that is, the number of classes that the model will be able to predict.

    function build_model(args; imgsize = (28,28,1), nclasses = 10)
    +   cnn_output_size = Int.(floor.([imgsize[1]/8,imgsize[2]/8,32])) 
    + 
    +   return Chain(
    +   # First convolution, operating upon a 28x28 image
    +   Conv((3, 3), imgsize[3]=>16, pad=(1,1), relu),
    +   MaxPool((2,2)),
    + 
    +   # Second convolution, operating upon a 14x14 image
    +   Conv((3, 3), 16=>32, pad=(1,1), relu),
    +   MaxPool((2,2)),
    + 
    +   # Third convolution, operating upon a 7x7 image
    +   Conv((3, 3), 32=>32, pad=(1,1), relu),
    +   MaxPool((2,2)),
    + 
    +   # Reshape 3d array into a 2d one using `Flux.flatten`, at this point it should be (3, 3, 32, N)
    +   flatten,
    +   Dense(prod(cnn_output_size), 10))
    +end
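The cnn_output_size computation above determines the length of the flattened feature vector fed to the final Dense layer; a quick check of the arithmetic:

```julia
# Three 2×2 max-pools halve each spatial dimension three times:
# 28 → 14 → 7 → 3 (integer floor), so the flattened vector has
# 3 * 3 * 32 = 288 entries, matching Dense(prod(cnn_output_size), 10).
cnn_output_size = Int.(floor.([28 / 8, 28 / 8, 32]))
@assert cnn_output_size == [3, 3, 32]
@assert prod(cnn_output_size) == 288
```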

    To chain the layers of a model we use the Flux function Chain. It enables us to call the layers in sequence on a given input. Also, we use the function flatten to reshape the output image from the last convolution layer. Finally, we call the Dense function to create the classification layer.

    Training

    Before training our model, we need to define a few functions that will be helpful for the process:

    • augment adds Gaussian random noise to our image, to make it more robust:
    • anynan checks whether any element of the params is NaN or not:
    • accuracy computes the proportion of inputs x correctly classified by our ConvNet:
    augment(x) = x .+ gpu(0.1f0*randn(eltype(x), size(x)))
    +anynan(x) = any(y -> any(isnan, y), x)
    +accuracy(x, y, model) = mean(onecold(cpu(model(x))) .== onecold(cpu(y)))

    Finally, we define the train function:

    function train(; kws...)   
    +   args = TrainArgs(; kws...)
    + 
    +   @info("Loading data set")
    +   train_set, test_set = get_processed_data(args)
    + 
    +   # Define our model.  We will use a simple convolutional architecture with
    +   # three iterations of Conv -> ReLU -> MaxPool, followed by a final Dense layer.
    +   @info("Building model...")
    +   model = build_model(args)
    + 
    +   # Load model and datasets onto GPU, if enabled
    +   train_set = gpu.(train_set)
    +   test_set = gpu.(test_set)
    +   model = gpu(model)
    +  
    +   # Make sure our model is nicely precompiled before starting our training loop
    +   model(train_set[1][1])
    + 
    +   # `loss()` calculates the crossentropy loss between our prediction `y_hat`
    +   # (calculated from `model(x)`) and the ground truth `y`.  We augment the data
    +   # a bit, adding gaussian random noise to our image to make it more robust.
    +   function loss(x, y)   
    +       x̂ = augment(x)
    +       ŷ = model(x̂)
    +       return logitcrossentropy(ŷ, y)
    +   end
    +  
    +   # Train our model with the given training set using the Adam optimiser and
    +   # printing out performance against the test set as we go.
    +   opt = Adam(args.lr)
    +  
    +   @info("Beginning training loop...")
    +   best_acc = 0.0
    +   last_improvement = 0
    +   for epoch_idx in 1:args.epochs
    +       # Train for a single epoch
    +       Flux.train!(loss, params(model), train_set, opt)
    +      
    +       # Terminate on NaN
    +       if anynan(Flux.params(model))
    +           @error "NaN params"
    +           break
    +       end
    +  
    +       # Calculate accuracy:
    +       acc = accuracy(test_set..., model)
    +      
    +       @info(@sprintf("[%d]: Test accuracy: %.4f", epoch_idx, acc))
    +       # If our accuracy is good enough, quit out.
    +       if acc >= 0.999
    +           @info(" -> Early-exiting: We reached our target accuracy of 99.9%")
    +           break
    +       end
    +  
    +       # If this is the best accuracy we've seen so far, save the model out
    +       if acc >= best_acc
    +           @info(" -> New best accuracy! Saving model out to mnist_conv.bson")
    +           BSON.@save joinpath(args.savepath, "mnist_conv.bson") params=cpu.(params(model)) epoch_idx acc
    +           best_acc = acc
    +           last_improvement = epoch_idx
    +       end
    +  
    +       # If we haven't seen improvement in 5 epochs, drop our learning rate:
    +       if epoch_idx - last_improvement >= 5 && opt.eta > 1e-6
    +           opt.eta /= 10.0
    +           @warn(" -> Haven't improved in a while, dropping learning rate to $(opt.eta)!")
    + 
    +           # After dropping learning rate, give it a few epochs to improve
    +           last_improvement = epoch_idx
    +       end
    +  
    +       if epoch_idx - last_improvement >= 10
    +           @warn(" -> We're calling this converged.")
    +           break
    +       end
    +   end
    +end

    train calls the functions we defined above and trains our model. It stops when the model achieves 99.9% accuracy (early-exiting) or after performing 20 epochs. More specifically, it performs the following steps:

    • Loads the MNIST dataset.
    • Builds our ConvNet model (as described above).
    • Loads the train and test data sets as well as our model onto a GPU (if available).
    • Defines a loss function that calculates the crossentropy between our prediction and the ground truth.
    • Sets the Adam optimiser to train the model with learning rate args.lr.
    • Runs the training loop. For each step (or epoch), it executes the following:
      • Calls Flux.train! function to execute one training step.
      • If any of the parameters of our model is NaN, then the training process is terminated.
      • Calculates the model accuracy.
      • If the model accuracy is >= 0.999, then early-exiting is executed.
      • If the current accuracy is the best so far, then the model is saved to mnist_conv.bson. Also, the new best accuracy and the current epoch are saved.
      • If there has not been any improvement for the last 5 epochs, then the learning rate is dropped and the process waits a little longer for the accuracy to improve.
      • If the last improvement was more than 10 epochs ago, then the process is terminated.

    Testing

    Finally, to test our model we define the test function:

    function test(; kws...)
    +   args = TrainArgs(; kws...)
    +  
    +   # Loading the test data
    +   _,test_set = get_processed_data(args)
    +  
    +   # Re-constructing the model with random initial weights
    +   model = build_model(args)
    +  
    +   # Loading the saved parameters
    +   BSON.@load joinpath(args.savepath, "mnist_conv.bson") params
    +  
    +   # Loading parameters onto the model
    +   Flux.loadparams!(model, params)
    +  
    +   test_set = gpu.(test_set)
    +   model = gpu(model)
    +   @show accuracy(test_set...,model)
    +end

    test loads the MNIST test data set, reconstructs the model, and loads the saved parameters (in mnist_conv.bson) onto it. Finally, it computes our model's predictions for the test set and shows the test accuracy (around 99%).

    To see the full version of this example, see Simple ConvNets - model-zoo.

    Resources

    Info

    Originally published at fluxml.ai on 7 February 2021. Written by Elliot Saba, Adarsh Kumar, Mike J Innes, Dhairya Gandhi, Sudhanshu Agrawal, Sambit Kumar Dash, fps.io, Carlo Lucibello, Andrew Dinhobl, Liliana Badillo

    diff --git a/previews/PR2365/tutorials/2021-10-08-dcgan-mnist/index.html b/previews/PR2365/tutorials/2021-10-08-dcgan-mnist/index.html new file mode 100644 index 0000000000..32dd2b5d34 --- /dev/null +++ b/previews/PR2365/tutorials/2021-10-08-dcgan-mnist/index.html @@ -0,0 +1,180 @@

    Deep Convolutional Generative Adversarial Network (DCGAN)

    This is a beginner level tutorial for generating images of handwritten digits using a Deep Convolutional Generative Adversarial Network inspired by the TensorFlow tutorial on DCGAN.

    What are GANs?

    Generative Adversarial Networks, or simply GANs, introduced by Goodfellow et al., are one of the most innovative ideas in modern-day machine learning. GANs are used extensively in image and audio processing to generate high-quality synthetic data that can easily be passed off as real data.

    A GAN is composed of two sub-models - the generator and the discriminator acting against one another. The generator can be considered as an artist who draws (generates) new images that look real, whereas the discriminator is a critic who learns to tell real images apart from fakes.

    The GAN starts with a generator and discriminator which have very little or no idea about the underlying data. During training, the generator progressively becomes better at creating images that look real, while the discriminator becomes better at telling them apart. The process reaches equilibrium when the discriminator can no longer distinguish real images from fakes.

    [source]

    This tutorial demonstrates the process of training a DCGAN on the MNIST dataset of handwritten digits. The following animation shows a series of images produced by the generator as it was trained for 25 epochs. The images begin as random noise and, over time, become increasingly similar to handwritten digits.

    Setup

    We need to install some Julia packages before we start with our implementation of DCGAN.

    using Pkg
    +
    +# Activate a new project environment in the current directory
    +Pkg.activate(".")
    +# Add the required packages to the environment
    +Pkg.add(["Images", "Flux", "MLDatasets", "CUDA", "Parameters"])

    Note: Depending on your internet speed, it may take a few minutes for the packages to install.

    After installing the libraries, load the required packages and functions:

    using Base.Iterators: partition
    +using Printf
    +using Statistics
    +using Random
    +using Images
    +using Flux: params, DataLoader
    +using Flux.Optimise: update!
    +using Flux.Losses: logitbinarycrossentropy
    +using MLDatasets: MNIST
    +using CUDA

    Now we set default values for the learning rates, batch size, epochs, the usage of a GPU (if available) and other hyperparameters for our model.

    Base.@kwdef struct HyperParams
    +    batch_size::Int = 128
    +    latent_dim::Int = 100
    +    epochs::Int = 25
    +    verbose_freq::Int = 1000
    +    output_dim::Int = 5
    +    disc_lr::Float64 = 0.0002
    +    gen_lr::Float64 = 0.0002
    +    device::Function = gpu
    +end

    Loading the data

    As mentioned before, we will be using the MNIST dataset for handwritten digits. So we begin with a simple function for loading and pre-processing the MNIST images:

    function load_MNIST_images(hparams)
    +    images = MNIST.traintensor(Float32)
    +
    +    # Normalize the images to (-1, 1)
    +    normalized_images = @. 2f0 * images - 1f0
    +    image_tensor = reshape(normalized_images, 28, 28, 1, :)
    +
    +    # Create a dataloader that iterates over mini-batches of the image tensor
    +    dataloader = DataLoader(image_tensor, batchsize=hparams.batch_size, shuffle=true)
    +
    +    return dataloader
    +end

    To learn more about loading images in Flux, you can check out this tutorial.

    Note: The data returned by the dataloader is on the CPU. To train on the GPU, we need to transfer the data to the GPU beforehand.

    Create the models

    Generator

    Our generator, a.k.a. the artist, is a neural network that maps low dimensional data to a high dimensional form.

    • This low dimensional data (seed) is generally a vector of random values sampled from a normal distribution.
    • The high dimensional data is the generated image.

    The Dense layer takes the seed as input, which is then upsampled several times by ConvTranspose layers until we reach the desired output size (in our case, 28x28x1). Furthermore, after each ConvTranspose layer we apply Batch Normalization to stabilize the learning process.

    We will be using the relu activation function for each layer except the output layer, where we use tanh activation.
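    Before building the model, it is worth checking that these strides and paddings really produce a 28x28 output. The arithmetic below is the standard transposed-convolution output-size formula, sketched as a plain Julia helper (not part of Flux):

```julia
# Output size of a ConvTranspose along one spatial dimension:
# out = (n - 1) * stride - 2 * pad + kernel
convt_out(n, k; stride = 1, pad = 0) = (n - 1) * stride - 2 * pad + k

# Trace the generator's spatial sizes, starting from the 7x7 reshape:
h1 = convt_out(7, 5; stride = 1, pad = 2)   # 7  -> 7
h2 = convt_out(h1, 4; stride = 2, pad = 1)  # 7  -> 14
h3 = convt_out(h2, 4; stride = 2, pad = 1)  # 14 -> 28
```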

    We will also apply the weight initialization method mentioned in the original DCGAN paper.

    # Function for initializing the model weights with values 
    +# sampled from a Gaussian distribution with μ=0 and σ=0.02
    +dcgan_init(shape...) = randn(Float32, shape) * 0.02f0
    function Generator(latent_dim)
    +    Chain(
    +        Dense(latent_dim, 7*7*256, bias=false),
    +        BatchNorm(7*7*256, relu),
    +
    +        x -> reshape(x, 7, 7, 256, :),
    +
    +        ConvTranspose((5, 5), 256 => 128; stride = 1, pad = 2, init = dcgan_init, bias=false),
    +        BatchNorm(128, relu),
    +
    +        ConvTranspose((4, 4), 128 => 64; stride = 2, pad = 1, init = dcgan_init, bias=false),
    +        BatchNorm(64, relu),
    +
    +        # The tanh activation ensures that output is in range of (-1, 1)
    +        ConvTranspose((4, 4), 64 => 1, tanh; stride = 2, pad = 1, init = dcgan_init, bias=false),
    +    )
    +end

    Time for a small test! We create a dummy generator and feed a random vector into it as a seed. If our generator is initialized correctly, it will return an array of size (28, 28, 1, batch_size); the @assert macro will raise an exception for a wrong output size.

    # Create a dummy generator of latent dim 100
    +generator = Generator(100)
    +noise = randn(Float32, 100, 3) # The last axis is the batch size
    +
    +# Feed the random noise to the generator
    +gen_image = generator(noise)
    +@assert size(gen_image) == (28, 28, 1, 3)

    Our generator model is yet to learn the correct weights, so it does not produce a recognizable image for now. To train our poor generator we need its equal rival, the discriminator.

    Discriminator

    The Discriminator is a simple CNN-based image classifier. Each Conv layer is used with a leakyrelu activation function.

    function Discriminator()
    +    Chain(
    +        Conv((4, 4), 1 => 64; stride = 2, pad = 1, init = dcgan_init),
    +        x->leakyrelu.(x, 0.2f0),
    +        Dropout(0.3),
    +
    +        Conv((4, 4), 64 => 128; stride = 2, pad = 1, init = dcgan_init),
    +        x->leakyrelu.(x, 0.2f0),
    +        Dropout(0.3),
    +
    +        # The output is now of the shape (7, 7, 128, batch_size)
    +        flatten,
    +        Dense(7 * 7 * 128, 1) 
    +    )
    +end

    For a more detailed implementation of a CNN-based image classifier, you can refer to this tutorial.

    Now let us check if our discriminator is working:

    # Dummy Discriminator
    +discriminator = Discriminator()
    +# We pass the generated image to the discriminator
    +logits = discriminator(gen_image)
    +@assert size(logits) == (1, 3)

    Just like our dummy generator, the untrained discriminator has no idea what real and fake images look like. It needs to be trained alongside the generator to output positive values for real images and negative values for fake images.

    Loss functions for GAN

    In a GAN problem, there are only two labels involved: fake and real. So Binary CrossEntropy is an easy choice for a preliminary loss function.

    But even though Flux's binarycrossentropy would do the job for us, it is preferable for numerical stability to compute cross-entropy from the raw logits. Flux provides logitbinarycrossentropy specifically for this purpose. Mathematically, it is equivalent to binarycrossentropy(σ(ŷ), y, kwargs...).
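    To see both the equivalence and the stability benefit, here is a small sketch using hand-written definitions (for illustration only; in practice use Flux's loss functions):

```julia
σ(x) = 1 / (1 + exp(-x))

# Naive binary cross-entropy on probabilities vs. the logit form.
bce(p, y)      = -y * log(p) - (1 - y) * log(1 - p)
logitbce(x, y) = (1 - y) * x + log(1 + exp(-x))  # bce(σ(x), y), rearranged

# For moderate logits the two agree:
bce(σ(2.5f0), 1f0) ≈ logitbce(2.5f0, 1f0)

# For large logits, σ(x) rounds to exactly 1f0 and the naive form blows up:
σ(20f0) == 1f0              # true in Float32
bce(σ(20f0), 0f0)           # Inf, since log(1 - 1f0) = -Inf
logitbce(20f0, 0f0)         # ≈ 20, finite and accurate
```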

    Discriminator Loss

    The discriminator loss quantifies how well the discriminator can distinguish real images from fakes. It compares

    • discriminator's predictions on real images to an array of 1s, and
    • discriminator's predictions on fake (generated) images to an array of 0s.

    These two losses are summed together to give a scalar loss. So we can write the loss function of the discriminator as:

    function discriminator_loss(real_output, fake_output)
    +    real_loss = logitbinarycrossentropy(real_output, 1)
    +    fake_loss = logitbinarycrossentropy(fake_output, 0)
    +    return real_loss + fake_loss
    +end

    Generator Loss

    The generator's loss quantifies how well it was able to trick the discriminator. Intuitively, if the generator is performing well, the discriminator will classify the fake images as real (or 1).

    generator_loss(fake_output) = logitbinarycrossentropy(fake_output, 1)

    We also need optimisers for our network. Why you may ask? Read more here. For both the generator and discriminator, we will use the ADAM optimiser.

    Utility functions

    The output of the generator lies in the range (-1, 1), so it needs to be de-normalized before we can display it as an image. To make things a bit easier, we define a function to visualize the output of the generator as a grid of images.

    function create_output_image(gen, fixed_noise, hparams)
    +    fake_images = cpu(gen.(fixed_noise))
    +    image_array = reduce(vcat, reduce.(hcat, partition(fake_images, hparams.output_dim)))
    +    image_array = permutedims(dropdims(image_array; dims=(3, 4)), (2, 1))
    +    image_array = @. Gray(image_array + 1f0) / 2f0
    +    return image_array
    +end

    Training

    To keep the training procedure simple, we divide the generator and discriminator training into two separate functions.

    function train_discriminator!(gen, disc, real_img, fake_img, opt, ps, hparams)
    +
    +    disc_loss, grads = Flux.withgradient(ps) do
    +        discriminator_loss(disc(real_img), disc(fake_img))
    +    end
    +
    +    # Update the discriminator parameters
    +    update!(opt, ps, grads)
    +    return disc_loss
    +end

    We define a similar function for the generator.

    function train_generator!(gen, disc, fake_img, opt, ps, hparams)
    +
    +    gen_loss, grads = Flux.withgradient(ps) do
    +        generator_loss(disc(fake_img))
    +    end
    +
    +    update!(opt, ps, grads)
    +    return gen_loss
    +end

    Now that we have defined every function we need, we integrate everything into a single train function where we first set up all the models and optimisers and then train the GAN for a specified number of epochs.

    function train(hparams)
    +
    +    dev = hparams.device
    +    # Check if CUDA is actually present
    +    if hparams.device == gpu
    +        if !CUDA.has_cuda()
    +            dev = cpu
    +            @warn "No GPU found, falling back to CPU"
    +        end
    +    end
    +
    +    # Load the normalized MNIST images
    +    dataloader = load_MNIST_images(hparams)
    +
    +    # Initialize the models and pass them to correct device
    +    disc = Discriminator() |> dev
    +    gen =  Generator(hparams.latent_dim) |> dev
    +
    +    # Collect the generator and discriminator parameters
    +    disc_ps = params(disc)
    +    gen_ps = params(gen)
    +
    +    # Initialize the ADAM optimisers for both the sub-models
    +    # with respective learning rates
    +    disc_opt = ADAM(hparams.disc_lr)
    +    gen_opt = ADAM(hparams.gen_lr)
    +
    +    # Create a batch of fixed noise for visualizing the training of generator over time
    +    fixed_noise = [randn(Float32, hparams.latent_dim, 1) |> dev for _=1:hparams.output_dim^2]
    +
    +    # Training loop
    +    train_steps = 0
    +    for ep in 1:hparams.epochs
    +        @info "Epoch $ep"
    +        for real_img in dataloader
    +
    +            # Transfer the data to the GPU
    +            real_img = real_img |> dev
    +
    +            # Create a random noise
    +            noise = randn!(similar(real_img, (hparams.latent_dim, hparams.batch_size)))
    +            # Pass the noise to the generator to create a fake image
    +            fake_img = gen(noise)
    +
    +            # Update discriminator and generator
    +            loss_disc = train_discriminator!(gen, disc, real_img, fake_img, disc_opt, disc_ps, hparams)
    +            loss_gen = train_generator!(gen, disc, fake_img, gen_opt, gen_ps, hparams)
    +
    +            if train_steps % hparams.verbose_freq == 0
    +                @info("Train step $(train_steps), Discriminator loss = $(loss_disc), Generator loss = $(loss_gen)")
    +                # Save generated fake image
    +                output_image = create_output_image(gen, fixed_noise, hparams)
    +                save(@sprintf("output/dcgan_steps_%06d.png", train_steps), output_image)
    +            end
    +            train_steps += 1
    +        end
    +    end
    +
    +    output_image = create_output_image(gen, fixed_noise, hparams)
    +    save(@sprintf("output/dcgan_steps_%06d.png", train_steps), output_image)
    +
    +    return nothing
    +end

    Now we finally get to train the GAN:

    # Define the hyper-parameters (here, we go with the default ones)
    +hparams = HyperParams()
    +train(hparams)

    Output

    The generated images are stored inside the output folder. To visualize the output of the generator over time, we create a gif of the generated images.

    folder = "output"
    +# Get the image filenames from the folder
    +img_paths = readdir(folder, join=true)
    +# Load all the images as an array
    +images = load.(img_paths)
    +# Join all the images in the array to create a matrix of images
    +gif_mat = cat(images..., dims=3)
    +save("./output.gif", gif_mat)

    Resources & References

    Info

    Originally published at fluxml.ai on 8 October 2021, by Deeptendu Santra

    diff --git a/previews/PR2365/tutorials/2021-10-14-vanilla-gan/index.html b/previews/PR2365/tutorials/2021-10-14-vanilla-gan/index.html new file mode 100644 index 0000000000..d8447a9299 --- /dev/null +++ b/previews/PR2365/tutorials/2021-10-14-vanilla-gan/index.html @@ -0,0 +1,107 @@

    Tutorial: Generative Adversarial Networks

    This tutorial describes how to implement a vanilla Generative Adversarial Network using Flux and how to train it on the MNIST dataset. It is based on this PyTorch tutorial. The original GAN paper by Goodfellow et al. is a great resource that describes the motivation and theory behind GANs:

    In the proposed adversarial nets framework, the generative model is pitted against an adversary: a discriminative model that learns to determine whether a sample is from the model distribution or the data distribution. The generative model can be thought of as analogous to a team of counterfeiters, trying to produce fake currency and use it without detection, while the discriminative model is analogous to the police, trying to detect the counterfeit currency. Competition in this game drives both teams to improve their methods until the counterfeits are indistinguishable from the genuine articles.

    Let's implement a GAN in Flux. To get started we first import a few useful packages:

    using MLDatasets: MNIST
    +using Flux.Data: DataLoader
    +using Flux
    +using CUDA
    +using Zygote
    +using UnicodePlots

    To download a package in the Julia REPL, type ] to enter package mode and then type add MLDatasets or perform this operation with the Pkg module like this

    > import Pkg
    +> Pkg.add("MLDatasets")

    While UnicodePlots is not necessary, it can be used to plot generated samples into the terminal during training. Having direct feedback, instead of looking at plots in a separate window, is fantastic for debugging.

    Next, let us define values for learning rate, batch size, epochs, and other hyper-parameters. While we are at it, we also define optimisers for the generator and discriminator network. More on what these are later.

        lr_g = 2e-4          # Learning rate of the generator network
    +    lr_d = 2e-4          # Learning rate of the discriminator network
    +    batch_size = 128    # batch size
    +    num_epochs = 1000   # Number of epochs to train for
    +    output_period = 100 # Period length for plots of generator samples
    +    n_features = 28 * 28# Number of pixels in each sample of the MNIST dataset
    +    latent_dim = 100    # Dimension of latent space
    +    opt_dscr = ADAM(lr_d)# Optimiser for the discriminator
    +    opt_gen = ADAM(lr_g) # Optimiser for the generator

    In this tutorial I'm assuming that a CUDA-enabled GPU is available on the system where the script is running. If this is not the case, simply remove the |> gpu pipes from the code below.
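    The |> operator is just ordinary Julia piping: x |> f is the same as f(x). A minimal illustration (the maybe_gpu name below is a hypothetical stand-in, useful for switching the transfer off without deleting the pipes):

```julia
# x |> f is function application, so chains read left to right:
4 |> sqrt          # 2.0
[1, 2, 3] |> sum   # 6

# One way to disable GPU transfer without deleting the pipes is to
# rebind the device function to `identity`:
maybe_gpu = identity
arr = randn(Float32, 3, 3) |> maybe_gpu   # stays an ordinary CPU array
```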

    Data loading

    The MNIST data set is available from MLDatasets. The first time you instantiate it, you will be prompted to download it; you should agree to this.

    GANs can be trained unsupervised, so we only keep the images from the training set and discard the labels.

    After we load the training data, we re-scale it from values in [0, 1] to values in [-1, 1]. GANs are notoriously tricky to train, and this re-scaling is a recommended GAN hack. The re-scaled data is used to define a data loader which handles batching and shuffling the data.

        # Load the dataset
    +    train_x, _ = MNIST.traindata(Float32);
    +    # This dataset has pixel values ∈ [0, 1]. Map these to [-1, 1]
    +    train_x = 2f0 * reshape(train_x, 28, 28, 1, :) .- 1f0 |>gpu;
    +    # DataLoader allows to access data batch-wise and handles shuffling.
    +    train_loader = DataLoader(train_x, batchsize=batch_size, shuffle=true);

    Defining the Networks

    In a vanilla GAN, the discriminator and the generator are both plain, feed-forward multilayer perceptrons. We use leaky rectified linear units (leakyrelu) to ensure our model is non-linear.

    Here, the coefficient α (in the leakyrelu below) is set to 0.2. Empirically, this value allows for good training of the network (based on prior experiments). It has also been found that Dropout ensures good generalization of the learned network, so we will use it below. Dropout is usually active when training a model and inactive during inference. Flux automatically sets the training mode when calling the model in a gradient context. As the final non-linearity, we use the sigmoid activation function.
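    Concretely, leakyrelu passes positive inputs through unchanged and scales negative inputs by the coefficient α. A hand-written sketch of the definition (Flux ships its own leakyrelu; this is only to show what it computes):

```julia
# leakyrelu(x, α) = x for x > 0, α * x otherwise
lrelu(x, α = 0.2f0) = ifelse(x > 0, x, α * x)

lrelu(3.0f0)    # 3.0f0: positive inputs pass through unchanged
lrelu(-3.0f0)   # -0.6f0: negative inputs are scaled by α = 0.2
```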

    discriminator = Chain(Dense(n_features, 1024, x -> leakyrelu(x, 0.2f0)),
    +                        Dropout(0.3),
    +                        Dense(1024, 512, x -> leakyrelu(x, 0.2f0)),
    +                        Dropout(0.3),
    +                        Dense(512, 256, x -> leakyrelu(x, 0.2f0)),
    +                        Dropout(0.3),
    +                        Dense(256, 1, sigmoid)) |> gpu

    Let's define the generator in a similar fashion. This network maps a latent variable (a variable that is not directly observed but instead inferred) to the image space and we set the input and output dimension accordingly. A tanh squashes the output of the final layer to values in [-1:1], the same range that we squashed the training data onto.

    generator = Chain(Dense(latent_dim, 256, x -> leakyrelu(x, 0.2f0)),
    +                    Dense(256, 512, x -> leakyrelu(x, 0.2f0)),
    +                    Dense(512, 1024, x -> leakyrelu(x, 0.2f0)),
    +                    Dense(1024, n_features, tanh)) |> gpu

    Training functions for the networks

    To train the discriminator, we present it with real data from the MNIST dataset and with fake data, and reward it for predicting the correct label for each sample: 1 for in-distribution (real) data and 0 for out-of-distribution data coming from the generator. Binary cross-entropy is the loss function of choice. While the Flux documentation suggests using logit binary cross-entropy, the GAN seems to be difficult to train with that loss function. This function returns the discriminator loss for logging purposes. We can calculate the loss in the same call as evaluating the pullback, so we get the pullback directly from Zygote instead of calling Flux.train! on the model. To calculate the gradients of the loss function with respect to the parameters of the discriminator, we then only have to evaluate the pullback with a seed gradient of 1.0. These gradients are used to update the model parameters.

    function train_dscr!(discriminator, real_data, fake_data)
    +    this_batch = size(real_data)[end] # Number of samples in the batch
    +    # Concatenate real and fake data into one big vector
    +    all_data = hcat(real_data, fake_data)
    +
    +    # Target vector for predictions: 1 for real data, 0 for fake data.
    +    all_target = [ones(eltype(real_data), 1, this_batch) zeros(eltype(fake_data), 1, this_batch)] |> gpu;
    +
    +    ps = Flux.params(discriminator)
    +    loss, pullback = Zygote.pullback(ps) do
    +        preds = discriminator(all_data)
    +        loss = Flux.Losses.binarycrossentropy(preds, all_target)
    +    end
    +    # To get the gradients we evaluate the pullback with 1.0 as a seed gradient.
    +    grads = pullback(1f0)
    +
    +    # Update the parameters of the discriminator with the gradients we calculated above
    +    Flux.update!(opt_dscr, Flux.params(discriminator), grads)
    +    
    +    return loss 
    +end

    Now we need to define a function to train the generator network. The job of the generator is to fool the discriminator so we reward the generator when the discriminator predicts a high probability for its samples to be real data. In the training function we first need to sample some noise, i.e. normally distributed data. This has to be done outside the pullback since we don't want to get the gradients with respect to the noise, but to the generator parameters. Inside the pullback we need to first apply the generator to the noise since we will take the gradient with respect to the parameters of the generator. We also need to call the discriminator in order to evaluate the loss function inside the pullback. Here we need to remember to deactivate the dropout layers of the discriminator. We do this by setting the discriminator into test mode before the pullback. Immediately after the pullback we set it back into training mode. Then we evaluate the pullback, call it with a seed gradient of 1.0 as above, update the parameters of the generator network and return the loss.

    function train_gen!(discriminator, generator)
    +    # Sample noise
    +    noise = randn(latent_dim, batch_size) |> gpu;
    +
    +    # Define parameters and get the pullback
    +    ps = Flux.params(generator)
    +    # Set discriminator into test mode to disable dropout layers
    +    testmode!(discriminator)
    +    # Evaluate the loss function while calculating the pullback. We get the loss for free
    +    loss, back = Zygote.pullback(ps) do
    +        preds = discriminator(generator(noise));
    +        loss = Flux.Losses.binarycrossentropy(preds, 1f0)
    +    end
    +    # Evaluate the pullback with a seed-gradient of 1.0 to get the gradients for
    +    # the parameters of the generator
    +    grads = back(1.0f0)
    +    Flux.update!(opt_gen, Flux.params(generator), grads)
    +    # Set discriminator back into automatic mode
    +    trainmode!(discriminator, mode=:auto)
    +    return loss
    +end

    Training

    Now we are ready to train the GAN. In the training loop we keep track of the per-sample loss of the generator and the discriminator, where we use the batch loss returned by the two training functions defined above. In each epoch we iterate over the mini-batches given by the data loader. Only minimal data processing needs to be done before the training functions can be called.

    lossvec_gen = zeros(num_epochs)
    +lossvec_dscr = zeros(num_epochs)
    +
    +for n in 1:num_epochs
    +    loss_sum_gen = 0.0f0
    +    loss_sum_dscr = 0.0f0
    +
    +    for x in train_loader
    +        # - Flatten the images from 28x28xbatchsize to 784xbatchsize
    +        real_data = flatten(x);
    +
    +        # Train the discriminator
    +        noise = randn(latent_dim, size(x)[end]) |> gpu
    +        fake_data = generator(noise)
    +        loss_dscr = train_dscr!(discriminator, real_data, fake_data)
    +        loss_sum_dscr += loss_dscr
    +
    +        # Train the generator
    +        loss_gen = train_gen!(discriminator, generator)
    +        loss_sum_gen += loss_gen
    +    end
    +
    +    # Add the per-sample loss of the generator and discriminator
    +    lossvec_gen[n] = loss_sum_gen / size(train_x)[end]
    +    lossvec_dscr[n] = loss_sum_dscr / size(train_x)[end]
    +
    +    if n % output_period == 0
    +        @show n
    +        noise = randn(latent_dim, 4) |> gpu;
    +        fake_data = reshape(generator(noise), 28, 4*28);
    +        p = heatmap(fake_data, colormap=:inferno)
    +        print(p)
    +    end
    +end 

    For the hyper-parameters shown in this example, the generator produces useful images after about 1000 epochs, and after about 5000 epochs the results look indistinguishable from real MNIST data. On an Nvidia V100 GPU (host: a 2.7 GHz Power9 CPU with 32 hardware threads), training 100 epochs takes about 80 seconds; GPU utilization is between 30 and 40%. To observe the network more frequently during training you can, for example, set output_period=20. Training the GAN on the CPU takes about 10 minutes per epoch and is not recommended.

    Results

    Below you can see what some of the images output may look like after different numbers of epochs.

    Resources

    Info

    Originally published at fluxml.ai on 14 October 2021, by Ralph Kube.

    diff --git a/previews/PR2365/tutorials/linear_regression/index.html b/previews/PR2365/tutorials/linear_regression/index.html new file mode 100644 index 0000000000..0a2e66c43f --- /dev/null +++ b/previews/PR2365/tutorials/linear_regression/index.html @@ -0,0 +1,109 @@

    Tutorial: Linear Regression

    Flux is a pure Julia ML stack that allows you to build predictive models. Here are the steps for a typical Flux program:

    • Provide training and test data
    • Build a model with configurable parameters to make predictions
    • Iteratively train the model by tweaking the parameters to improve predictions
    • Verify your model

    Under the hood, Flux uses a technique called automatic differentiation to take gradients that help improve predictions. Flux is also fully written in Julia so you can easily replace any layer of Flux with your own code to improve your understanding or satisfy special requirements.
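    As a tiny illustration of automatic differentiation (assuming Flux is loaded; gradient is re-exported from Zygote):

```julia
using Flux

# d/dx (3x^2 + 2x) = 6x + 2, evaluated at x = 2
grads = Flux.gradient(x -> 3x^2 + 2x, 2.0)
# grads == (14.0,)
```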

    The following page contains a step-by-step walkthrough of the linear regression algorithm in Julia using Flux! We will start by creating a simple linear regression model for dummy data and then move on to a real dataset. In the first part we write some parts of the model ourselves, which are later replaced by Flux.


    Let us start by building a simple linear regression model. This model would be trained on the data points of the form (x₁, y₁), (x₂, y₂), ... , (xₙ, yₙ). In the real world, these xs can have multiple features, and the ys denote a label. In our example, each x has a single feature; hence, our data would have n data points, each point mapping a single feature to a single label.

    Importing the required Julia packages -

    julia> using Flux, Plots

    Generating a dataset

    The data usually comes from the real world, which we will be exploring in the last part of this tutorial, but we don't want to jump straight to the relatively harder part. Here we will generate the xs of our data points and map them to the respective ys using a simple function. Remember, here each x is equivalent to a feature, and each y is the corresponding label. Combining all the xs and ys would create the complete dataset.

    julia> x = hcat(collect(Float32, -3:0.1:3)...)
    +1×61 Matrix{Float32}:
    + -3.0  -2.9  -2.8  -2.7  -2.6  -2.5  …  2.4  2.5  2.6  2.7  2.8  2.9  3.0

    The hcat call generates a Matrix with numbers ranging from -3.0 to 3.0 with a gap of 0.1 between them. Each column of this matrix holds a single x, a total of 61 xs. The next step would be to generate the corresponding labels or the ys.

    julia> f(x) = @. 3x + 2;
    +
    +julia> y = f(x)
    +1×61 Matrix{Float32}:
    + -7.0  -6.7  -6.4  -6.1  -5.8  -5.5  …  9.5  9.8  10.1  10.4  10.7  11.0

    The function f maps each x to a y, and as x is a Matrix, the expression broadcasts the scalar values using @. macro. Our data points are ready, but they are too perfect. In a real-world scenario, we will not have an f function to generate y values, but instead, the labels would be manually added.

    julia> x = x .* reshape(rand(Float32, 61), (1, 61));

    Visualizing the final data -

    julia> plot(vec(x), vec(y), lw = 3, seriestype = :scatter, label = "", title = "Generated data", xlabel = "x", ylabel= "y");

    linear-regression-data

    The data looks random enough now! The x and y values are still somewhat correlated; hence, the linear regression algorithm should work fine on our dataset.

    We can now proceed ahead and build a model for our dataset!

    Building a model

    A linear regression model is defined mathematically as -

    \[model(W, b, x) = Wx + b\]

    where W is the weight matrix and b is the bias. For our case, the weight matrix (W) would constitute only a single element, as we have only a single feature. We can define our model in Julia using the exact same notation!

    julia> custom_model(W, b, x) = @. W*x + b
    +custom_model (generic function with 1 method)

    The @. macro allows you to perform the calculations by broadcasting the scalar quantities (for example - the bias).

    The next step would be to initialize the model parameters, which are the weight and the bias. There are a lot of initialization techniques available for different machine learning models, but for the sake of this example, let's pull out the weight from a uniform distribution and initialize the bias as 0.

    julia> W = rand(Float32, 1, 1)
    +1×1 Matrix{Float32}:
    + 0.99285793
    +
    +julia> b = [0.0f0]
    +1-element Vector{Float32}:
    + 0.0

    Time to test if our model works!

    julia> custom_model(W, b, x) |> size
    +(1, 61)
    +
    +julia> custom_model(W, b, x)[1], y[1]
    +(-1.6116865f0, -7.0f0)

    It does! But the predictions are way off. We need to train the model to improve them, and before training we need to define a loss function. The loss function outputs a quantity that we will try to minimize during the entire training process. Here we will use the mean squared error loss function.

    julia> function custom_loss(W, b, x, y)
    +           ŷ = custom_model(W, b, x)
    +           sum((y .- ŷ).^2) / length(x)
    +       end;
    +
    +julia> custom_loss(W, b, x, y)
    +23.772217f0

    Calling the loss function on our xs and ys shows how far our predictions (ŷ) are from the real labels. More precisely, it calculates the sum of the squares of residuals and divides it by the total number of data points.
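    Written out, this mean squared error loss is

    \[L(W, b) = \frac{1}{n} \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2\]

    where $\hat{y}_i = W x_i + b$ is the model's prediction and $n = 61$ is the number of data points.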

    We have successfully defined our model and the loss function, but surprisingly, we haven't used Flux anywhere till now. Let's see how we can write the same code using Flux.

    julia> flux_model = Dense(1 => 1)
    +Dense(1 => 1)       # 2 parameters

    A Dense(1 => 1) layer denotes a layer of one neuron with one input (one feature) and one output. This layer is exactly the same as the mathematical model we defined above! Under the hood, Flux too calculates the output using the same expression! But we don't have to initialize the parameters ourselves this time; instead, Flux does it for us.

    julia> flux_model.weight, flux_model.bias
    +(Float32[-1.2678515;;], Float32[0.0])

    Now we can check if our model is acting right. We can pass the complete data in one go, with each x having exactly one feature (one input) -

    julia> flux_model(x) |> size
    +(1, 61)
    +
    +julia> flux_model(x)[1], y[1]
    +(-1.8525281f0, -7.0f0)

    It is! The next step would be defining the loss function using Flux's functions -

    julia> function flux_loss(flux_model, x, y)
    +           ŷ = flux_model(x)
    +           Flux.mse(ŷ, y)
    +       end;
    +
    +julia> flux_loss(flux_model, x, y)
    +22.74856f0

    Everything works as before! It almost feels like Flux provides us with smart wrappers for the functions we could have written on our own. Now, as the last step of this section, let's see how different the flux_model is from our custom_model. A good way to go about this would be to fix the parameters of both models to be the same. Let's change the parameters of our custom_model to match those of the flux_model -

    julia> W = Float32[1.1412252]
    +1-element Vector{Float32}:
    + 1.1412252

    To check how both the models are performing on the data, let's find out the losses using the custom_loss and flux_loss functions -

    julia> custom_loss(W, b, x, y), flux_loss(flux_model, x, y)
    +(22.74856f0, 22.74856f0)

    The losses are identical! This means that our model and the flux_model are effectively the same, as are the loss functions. The difference is that Flux's Dense layer supports many other arguments that can be used to customize the layer further. But, for this tutorial, let us stick to our simple custom_model.

    Training the model

    Let's train our model using the classic Gradient Descent algorithm. According to the gradient descent algorithm, the weights and biases should be iteratively updated using the following mathematical equations -

    \[\begin{aligned} W &= W - \eta * \frac{dL}{dW} \\ b &= b - \eta * \frac{dL}{db} \end{aligned}\]

    Here, W is the weight matrix, b is the bias vector, $\eta$ is the learning rate, $\frac{dL}{dW}$ is the derivative of the loss function with respect to the weight, and $\frac{dL}{db}$ is the derivative of the loss function with respect to the bias.
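    For the mean squared error loss and our linear model, these derivatives can also be worked out by hand; with $\hat{y}_i = W x_i + b$ and $n$ data points, they are

    \[\frac{dL}{dW} = -\frac{2}{n} \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right) x_i \qquad \frac{dL}{db} = -\frac{2}{n} \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)\]

    For our single-feature model, both of these reduce to scalars.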

    The derivatives are calculated using an Automatic Differentiation tool, and Flux uses Zygote.jl for this. Since Zygote.jl is an independent Julia package, it can be used outside of Flux as well! Refer to the documentation of Zygote.jl for more information.

    Our first step would be to obtain the gradient of the loss function with respect to the weights and the biases. Flux re-exports Zygote's gradient function; hence, we don't need to import Zygote explicitly to use the functionality.

    julia> dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, y);

    We can now update the parameters, following the gradient descent algorithm -

    julia> W .= W .- 0.1 .* dLdW
    +1-element Vector{Float32}:
    + 1.8144473
    +
    +julia> b .= b .- 0.1 .* dLdb
    +1-element Vector{Float32}:
    + 0.41325632

    The parameters have been updated! We can now check the value of the loss function -

    julia> custom_loss(W, b, x, y)
    +17.157953f0

    The loss went down! This means that we successfully trained our model for one epoch. We can plug the training code written above into a loop and train the model for a higher number of epochs. It can be customized either to have a fixed number of epochs or to stop when certain conditions are met, for example, change in loss < 0.1. The loop can be tailored to suit the user's needs, and the conditions can be specified in plain Julia!

    Let's plug our super training logic inside a function and test it again -

    julia> function train_custom_model()
    +           dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, y)
    +           @. W = W - 0.1 * dLdW
    +           @. b = b - 0.1 * dLdb
    +       end;
    +
    +julia> train_custom_model();
    +
    +julia> W, b, custom_loss(W, b, x, y)
    +(Float32[2.340657], Float32[0.7516814], 13.64972f0)

    It works, and the loss went down again! This was the second epoch of our training procedure. Let's plug this in a for loop and train the model for 40 more epochs.

    julia> for i = 1:40
    +          train_custom_model()
    +       end
    +
    +julia> W, b, custom_loss(W, b, x, y)
    +(Float32[4.2422233], Float32[2.2460847], 7.6680417f0)

    There was a significant reduction in loss, and the parameters were updated!

    We can train the model even more or tweak the hyperparameters to achieve the desired result faster, but let's stop here. We trained our model for 42 epochs, and the loss went down from 22.74856 to 7.6680417. Time for some visualization!

    Results

    The main objective of this tutorial was to fit a line to our dataset using the linear regression algorithm. The training procedure went well, and the loss went down significantly! Let's see what the fitted line looks like. Remember, Wx + b is nothing more than a line's equation, with slope = W[1] and y-intercept = b[1] (indexing at 1 as W and b are iterable).

    Plotting the line and the data points using Plots.jl -

    julia> plot(reshape(x, (61, 1)), reshape(y, (61, 1)), lw = 3, seriestype = :scatter, label = "", title = "Simple Linear Regression", xlabel = "x", ylabel= "y");
    +
    +julia> plot!((x) -> b[1] + W[1] * x, -3, 3, label="Custom model", lw=2);

    linear-regression-line

    The line fits well! There is room for improvement, but we leave that up to you! You can play with the optimisers, the number of epochs, learning rate, etc. to improve the fitting and reduce the loss!

    Linear regression model on a real dataset

    We now move on to a relatively complex linear regression model. Here we will use a real dataset from MLDatasets.jl, so our data points will no longer be confined to a single feature. Let's start by importing the required packages -

    julia> using Flux, Statistics, MLDatasets, DataFrames

    Gathering real data

    Let's start by initializing our dataset. We will be using the BostonHousing dataset consisting of 506 data points. Each of these data points has 13 features and a corresponding label, the house's price. The xs are still mapped to a single y, but now, a single x data point has 13 features.

    julia> dataset = BostonHousing();
    +
    +julia> x, y = BostonHousing(as_df=false)[:];
    +
    +julia> x, y = Float32.(x), Float32.(y);

    We can now split the obtained data into training and testing data -

    julia> x_train, x_test, y_train, y_test = x[:, 1:400], x[:, 401:end], y[:, 1:400], y[:, 401:end];
    +
    +julia> x_train |> size, x_test |> size, y_train |> size, y_test |> size
    +((13, 400), (13, 106), (1, 400), (1, 106))

    The features in this data have very different scales, so a wise option here is to normalise the data, making the training process more efficient and fast. Let's check the standard deviation of the training data before normalising it.

    julia> std(x_train)
    +134.06786f0

    The data is indeed not normalised. We can use the Flux.normalise function to normalise the training data.
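    Roughly speaking, normalisation standardises each value as

    \[x' = \frac{x - \mu}{\sigma + \epsilon}\]

    where $\mu$ and $\sigma$ are the mean and standard deviation computed along the chosen dimensions, and $\epsilon$ is a small constant that guards against division by zero.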

    julia> x_train_n = Flux.normalise(x_train);
    +
    +julia> std(x_train_n)
    +1.0000844f0

    The standard deviation is now close to one! Our data is ready!

    Building a Flux model

    We can now directly use Flux and let it do all the work internally! Let's define a model that takes in 13 inputs (13 features) and gives us a single output (the label). We will then pass our entire data through this model in one go, and Flux will handle everything for us! Remember, we could have declared a model in plain Julia as well. The model will have 14 parameters: 13 weights and 1 bias.

    julia> model = Dense(13 => 1)
    +Dense(13 => 1)      # 14 parameters

    Same as before, our next step would be to define a loss function to quantify our accuracy somehow. The lower the loss, the better the model!

    julia> function loss(model, x, y)
    +           ŷ = model(x)
    +           Flux.mse(ŷ, y)
    +       end;
    +
    +julia> loss(model, x_train_n, y_train)
    +676.1656f0

    We can now proceed to the training phase!

    Training the Flux model

    The training procedure would make use of the same mathematics, but now we can pass in the model inside the gradient call and let Flux and Zygote handle the derivatives!

    julia> function train_model()
    +           dLdm, _, _ = gradient(loss, model, x_train_n, y_train)
    +           @. model.weight = model.weight - 0.000001 * dLdm.weight
    +           @. model.bias = model.bias - 0.000001 * dLdm.bias
    +       end;

    Contrary to our last training procedure, let's say that this time we don't want to hardcode the number of epochs. We want the training procedure to stop when the loss converges, that is, when change in loss < δ. The quantity δ can be altered according to a user's need, but let's fix it to 10⁻⁴ for this tutorial.

    We can write such custom training loops effortlessly using Flux and plain Julia!

    julia> loss_init = Inf;
    +
    +julia> while true
    +           train_model()
    +           if loss_init == Inf
    +               loss_init = loss(model, x_train_n, y_train)
    +               continue
    +           end
    +           if abs(loss_init - loss(model, x_train_n, y_train)) < 1e-4
    +               break
    +           else
    +               loss_init = loss(model, x_train_n, y_train)
    +           end
    +       end;

    The code starts by initializing the loss to infinity. Next, it runs a loop that breaks as soon as the change in loss drops below 10⁻⁴; otherwise, it updates loss_init to the current loss and moves on to the next iteration.

    This custom loop works! This shows how easily a user can write down any custom training routine using Flux and Julia!

    Let's have a look at the loss -

    julia> loss(model, x_train_n, y_train)
    +27.1272f0

    The loss went down significantly! It can be minimized further by choosing an even smaller δ.

    Testing the Flux model

    The last step of this tutorial would be to test our model using the testing data. We will first normalise the testing data and then calculate the corresponding loss.

    julia> x_test_n = Flux.normalise(x_test);
    +
    +julia> loss(model, x_test_n, y_test)
    +66.91015f0

    The loss is not as small as the loss of the training data, but it looks good! This also shows that our model is not overfitting!


    Summarising this tutorial, we started by generating a random yet correlated dataset for our custom model. We then saw how a simple linear regression model could be built with and without Flux, and how they were almost identical.

    Next, we trained the model by manually writing down the Gradient Descent algorithm and optimising the loss. We also saw how Flux provides various wrapper functionalities and keeps the API extremely intuitive and simple for the users.

    After getting familiar with the basics of Flux and Julia, we moved ahead to build a machine learning model for a real dataset. We repeated the exact same steps, but this time with a lot more features and data points, and by harnessing Flux's full capabilities. In the end, we developed a training loop that was smarter than the hardcoded one and ran the model on our normalised dataset to conclude the tutorial.

    Info

    Originally published on 21 November 2022, by Saransh Chopra.

    diff --git a/previews/PR2365/tutorials/logistic_regression/index.html b/previews/PR2365/tutorials/logistic_regression/index.html
    new file mode 100644
    index 0000000000..b13fa6b71a
    --- /dev/null
    +++ b/previews/PR2365/tutorials/logistic_regression/index.html
    @@ -0,0 +1,134 @@

    Logistic Regression · Flux

    Logistic Regression

    The following page contains a step-by-step walkthrough of the logistic regression algorithm in Julia using Flux. We will build a simple logistic regression model without using Flux and compare its different working parts with Flux's implementation.

    Let's start by importing the required Julia packages.

    julia> using Flux, Statistics, MLDatasets, DataFrames, OneHotArrays

    Dataset

    Let's start by importing a dataset from MLDatasets.jl. We will use the Iris dataset that contains the data of three different Iris species. The data consists of 150 data points (xs), each having four features. Each of these x is mapped to a y, the name of a particular Iris species. The following code will download the Iris dataset when run for the first time.

    julia> Iris()
    +dataset Iris:
    +  metadata   =>    Dict{String, Any} with 4 entries
    +  features   =>    150×4 DataFrame
    +  targets    =>    150×1 DataFrame
    +  dataframe  =>    150×5 DataFrame
    +
    +julia> x, y = Iris(as_df=false)[:];

    Let's have a look at our dataset -

    julia> y
    +1×150 Matrix{InlineStrings.String15}:
    + "Iris-setosa"  "Iris-setosa"  …  "Iris-virginica"  "Iris-virginica"
    +
    +julia> x |> summary
    +"4×150 Matrix{Float64}"

    The y values here correspond to the type of iris plant, with a total of 150 data points. The x values depict the sepal length, sepal width, petal length, and petal width (all in cm) of 150 iris plants (hence the matrix size 4×150). Different types of iris plants have different lengths and widths of sepals and petals associated with them, and there is a definitive pattern for this in nature. We can leverage this to train a simple classifier that outputs the type of iris plant using the length and width of sepals and petals as inputs.

    Our next step would be to convert this data into a form that can be fed to a machine learning model. The x values are arranged in a matrix and should ideally be converted to Float32 type (see Performance tips), but the labels must be one hot encoded. Here is a great discourse thread on different techniques that can be used to one hot encode data with or without using any external Julia package.

    julia> x = Float32.(x);
    +
    +julia> y = vec(y);
    +
    +julia> custom_y_onehot = unique(y) .== permutedims(y)
    +3×150 BitMatrix:
    + 1  1  1  1  1  1  1  1  1  1  1  1  1  …  0  0  0  0  0  0  0  0  0  0  0  0
    + 0  0  0  0  0  0  0  0  0  0  0  0  0     0  0  0  0  0  0  0  0  0  0  0  0
    + 0  0  0  0  0  0  0  0  0  0  0  0  0     1  1  1  1  1  1  1  1  1  1  1  1

    This same operation can also be performed using OneHotArrays' onehotbatch function. We will use both of these outputs in parallel to show how intuitive FluxML is!

    julia> const classes = ["Iris-setosa", "Iris-versicolor", "Iris-virginica"];
    +
    +julia> flux_y_onehot = onehotbatch(y, classes)
    +3×150 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
    + 1  1  1  1  1  1  1  1  1  1  1  1  1  …  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
    + ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅     ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅
    + ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅  ⋅     1  1  1  1  1  1  1  1  1  1  1  1

    Our data is ready. The next step would be to build a classifier for the same.

    Building a model

    A logistic regression model is defined mathematically as -

    \[model(x) = σ(Wx + b)\]

    where W is the weight matrix, b is the bias vector, and σ is any activation function. For our case, let's use the softmax activation function as we will be performing a multiclass classification task.

    julia> m(W, b, x) = W*x .+ b
    +m (generic function with 1 method)

    Note that this model lacks an activation function, but we will come back to that.

    We can now move ahead to initialize the parameters of our model. Given that our model has four inputs (4 features in every data point), and three outputs (3 different classes), the parameters can be initialized in the following way -

    julia> W = rand(Float32, 3, 4);
    +
    +julia> b = [0.0f0, 0.0f0, 0.0f0];

    Now our model can take in the complete dataset and predict the class of each x in one go. But, we need to ensure that our model outputs the probabilities of an input belonging to the respective classes. As our model has three outputs, each would denote the probability of the input belonging to a particular class.

    We will use an activation function to map our outputs to a probability value. It would make sense to use a softmax activation function here, which is defined mathematically as -

    \[σ(\vec{z})_i = \frac{e^{z_i}}{\sum_{j=1}^{k} e^{z_j}}\]

    The softmax function scales down the outputs to probability values such that the sum of all the final outputs equals 1. Let's implement this in Julia.

    julia> custom_softmax(x) = exp.(x) ./ sum(exp.(x), dims=1)
    +custom_softmax (generic function with 1 method)

    The implementation looks straightforward enough! Note that we specify dims=1 in the sum function to calculate the sum of probabilities in each column. Remember, we will have a 3×150 matrix (predicted ys) as the output of our model, where each column would be an output of a corresponding input.
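    One caveat with this naive implementation: exp.(x) overflows to Inf for large inputs, which turns the output into NaN. A common remedy (and what the softmax in NNlib does internally) is to subtract the column-wise maximum before exponentiating, which does not change the result mathematically. A sketch, using a hypothetical stable_softmax name:

    ```julia
    julia> stable_softmax(x) = begin
               m = maximum(x; dims=1)   # column-wise maximum
               e = exp.(x .- m)         # shifted values are ≤ 0, so exp cannot overflow
               e ./ sum(e; dims=1)
           end;

    julia> stable_softmax(Float32[1000 1; 1000 2])  # naive exp.(1000f0) would overflow to Inf
    2×2 Matrix{Float32}:
     0.5  0.268941
     0.5  0.731059
    ```

    For the small values in this dataset the naive version works fine, so we stick with it below.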

    Let's combine this softmax function with our model to construct the complete custom_model.

    julia> custom_model(W, b, x) = m(W, b, x) |> custom_softmax
    +custom_model (generic function with 1 method)

    Let's check if our model works.

    julia> custom_model(W, b, x) |> size
    +(3, 150)

    It works! Let's check if the softmax function is working.

    julia> all(0 .<= custom_model(W, b, x) .<= 1)
    +true
    +
    +julia> sum(custom_model(W, b, x), dims=1)
    +1×150 Matrix{Float32}:
    + 1.0  1.0  1.0  1.0  1.0  1.0  1.0  1.0  …  1.0  1.0  1.0  1.0  1.0  1.0  1.0

    Every output value is between 0 and 1, and every column adds to 1!

    Let's convert our custom_model to a Flux model. Flux provides the users with a very elegant API that almost feels like writing your own code!

    Note: all the flux_* variables in this tutorial are general, that is, they can be used as they are with other similar-looking datasets, but the custom_* variables remain specific to this tutorial.

    julia> flux_model = Chain(Dense(4 => 3), softmax)
    +Chain(
    +  Dense(4 => 3),                        # 15 parameters
    +  NNlib.softmax,
    +)

    A Dense(4 => 3) layer denotes a layer with four inputs (four features in every data point) and three outputs (three classes or labels). This layer is the same as the mathematical model defined by us above. Under the hood, Flux too calculates the output using the same expression, but we don't have to initialize the parameters ourselves this time, instead Flux does it for us.

    The softmax function used here is provided by NNlib.jl and re-exported by Flux. Lastly, Flux provides users with a Chain struct which makes stacking layers seamless.

    A model's weights and biases can be accessed as follows -

    julia> flux_model[1].weight, flux_model[1].bias
    +(Float32[0.78588694 -0.45968163 -0.77409476 0.2358028; -0.9049773 -0.58643705 0.466441 -0.79523873; 0.82426906 0.4143493 0.7630932 0.020588955], Float32[0.0, 0.0, 0.0])

    We can now pass the complete data in one go, with each data point having four features (four inputs)!

    Loss and accuracy

    Our next step should be to define some quantitative values for our model, which we will maximize or minimize during the complete training procedure. These values will be the loss function and the accuracy metric.

    Let's start by defining a loss function, a logitcrossentropy function.

    julia> custom_logitcrossentropy(ŷ, y) = mean(.-sum(y .* logsoftmax(ŷ; dims = 1); dims = 1));
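    In other words, for $n$ data points and $k$ classes this computes

    \[L(\hat{y}, y) = -\frac{1}{n} \sum_{i=1}^{n} \sum_{c=1}^{k} y_{ci} \log\left(\mathrm{softmax}(\hat{y})_{ci}\right)\]

    Using logsoftmax instead of log.(softmax(ŷ)) fuses the two operations into one numerically stable step.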

    Now we can wrap the custom_logitcrossentropy inside a function that takes in the model parameters, xs, and ys, and returns the loss value.

    julia> function custom_loss(W, b, x, y)
    +           ŷ = custom_model(W, b, x)
    +           custom_logitcrossentropy(ŷ, y)
    +       end;
    +
    +julia> custom_loss(W, b, x, custom_y_onehot)
    +1.1714406827505623

    The loss function works!

    Flux provides us with many minimal yet elegant loss functions. In fact, the custom_logitcrossentropy defined above has been taken directly from Flux. The functions present in Flux include sanity checks, ensure efficient performance, and behave well with the overall FluxML ecosystem.

    julia> function flux_loss(flux_model, x, y)
    +           ŷ = flux_model(x)
    +           Flux.logitcrossentropy(ŷ, y)
    +       end;
    +
    +julia> flux_loss(flux_model, x, flux_y_onehot)
    +1.2156688659673647

    Next, let's define an accuracy function, which we will try to maximize during our training procedure. Before jumping to accuracy, let's define a onecold function. The onecold function converts our output, which, remember, consists of probability values, to the actual class names.

    We can divide this task into two parts -

    1. Identify the index of the maximum element of each column in the output matrix
    2. Convert this index to a class name

    The maximum index should be calculated along the columns (remember, each column is the output of a single x data point). We can use Julia's argmax function to achieve this.

    julia> argmax(custom_y_onehot, dims=1)  # calculate the cartesian index of max element column-wise
    +1×150 Matrix{CartesianIndex{2}}:
    + CartesianIndex(1, 1)  CartesianIndex(1, 2)  …  CartesianIndex(3, 150)
    +
    +julia> max_idx = [x[1] for x in argmax(custom_y_onehot; dims=1)]
    +1×150 Matrix{Int64}:
    + 1  1  1  1  1  1  1  1  1  1  1  1  1  …  3  3  3  3  3  3  3  3  3  3  3  3

    Now we can write a function that calculates the indices of the maximum element in each column, and maps them to a class name.

    julia> function custom_onecold(custom_y_onehot)
    +           max_idx = [x[1] for x in argmax(custom_y_onehot; dims=1)]
    +           vec(classes[max_idx])
    +       end;
    +
    +julia> custom_onecold(custom_y_onehot)
    +150-element Vector{String}:
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + "Iris-setosa"
    + ⋮
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"
    + "Iris-virginica"

    It works!

    Flux provides users with the onecold function so that we don't have to write it on our own. Let's see how our custom_onecold function compares to Flux.onecold.

    julia> istrue = Flux.onecold(flux_y_onehot, classes) .== custom_onecold(custom_y_onehot);
    +
    +julia> all(istrue)
    +true

    Both the functions act identically!

    We now move to the accuracy metric and run it with the untrained custom_model.

    julia> custom_accuracy(W, b, x, y) = mean(custom_onecold(custom_model(W, b, x)) .== y);
    +
    +julia> custom_accuracy(W, b, x, y)
    +0.3333333333333333

    We could also have used Flux's built-in functionality to define this accuracy function.

    julia> flux_accuracy(x, y) = mean(Flux.onecold(flux_model(x), classes) .== y);
    +
    +julia> flux_accuracy(x, y)
    +0.24

    Training the model

    Let's train our model using the classic Gradient Descent algorithm. According to the gradient descent algorithm, the weights and biases should be iteratively updated using the following mathematical equations -

    \[\begin{aligned} W &= W - \eta * \frac{dL}{dW} \\ b &= b - \eta * \frac{dL}{db} \end{aligned}\]

    Here, W is the weight matrix, b is the bias vector, $\eta$ is the learning rate, $\frac{dL}{dW}$ is the derivative of the loss function with respect to the weight, and $\frac{dL}{db}$ is the derivative of the loss function with respect to the bias.

    The derivatives are calculated using an Automatic Differentiation tool, and Flux uses Zygote.jl for this. Since Zygote.jl is an independent Julia package, it can be used outside of Flux as well! Refer to the documentation of Zygote.jl for more information.

    Our first step would be to obtain the gradient of the loss function with respect to the weights and the biases. Flux re-exports Zygote's gradient function; hence, we don't need to import Zygote explicitly to use the functionality. gradient takes in a function and its arguments, and returns a tuple containing ∂f/∂x for each argument x. Let's pass in custom_loss and the arguments required by custom_loss to gradient. We will require the derivatives of the loss function (custom_loss) with respect to the weights (∂f/∂w) and the bias (∂f/∂b) to carry out gradient descent, but we can ignore the partial derivatives of the loss function (custom_loss) with respect to x (∂f/∂x) and one hot encoded y (∂f/∂y).

    julia> dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, custom_y_onehot);

    We can now update the parameters, following the gradient descent algorithm -

    julia> W .= W .- 0.1 .* dLdW;
    +
    +julia> b .= b .- 0.1 .* dLdb;

    The parameters have been updated! We can now check the value of our custom loss function -

    julia> custom_loss(W, b, x, custom_y_onehot)
    +1.164742997664842

    The loss went down! Let's plug our super training logic inside a function.

    julia> function train_custom_model()
    +           dLdW, dLdb, _, _ = gradient(custom_loss, W, b, x, custom_y_onehot)
    +           W .= W .- 0.1 .* dLdW
    +           b .= b .- 0.1 .* dLdb
    +       end;

    We can plug the training function inside a loop and train the model for more epochs. The loop can be tailored to suit the user's needs, and the conditions can be specified in plain Julia. Here we will train the model for a maximum of 500 epochs, but to ensure that the model does not overfit, we will break as soon as our accuracy value crosses or becomes equal to 0.98.

    julia> for i = 1:500
    +            train_custom_model();
    +            custom_accuracy(W, b, x, y) >= 0.98 && break
    +       end
    +    
    +julia> @show custom_accuracy(W, b, x, y);
    +custom_accuracy(W, b, x, y) = 0.98

    Everything works! Our model achieved an accuracy of 0.98! Let's have a look at the loss.

    julia> custom_loss(W, b, x, custom_y_onehot)
    +0.6520349798243569

    As expected, the loss went down too! Now, let's repeat the same steps with our flux_model.

    We can write a similar-looking training loop for our flux_model and train it similarly.

    julia> flux_loss(flux_model, x, flux_y_onehot)
    +1.215731131385928
    +
    +julia> function train_flux_model()
    +           dLdm, _, _ = gradient(flux_loss, flux_model, x, flux_y_onehot)
    +           @. flux_model[1].weight = flux_model[1].weight - 0.1 * dLdm[:layers][1][:weight]
    +           @. flux_model[1].bias = flux_model[1].bias - 0.1 * dLdm[:layers][1][:bias]
    +       end;
    +
    +julia> for i = 1:500
    +            train_flux_model();
    +            flux_accuracy(x, y) >= 0.98 && break
    +       end

    Looking at the accuracy and loss value -

    julia> @show flux_accuracy(x, y);
    +flux_accuracy(x, y) = 0.98
    +
    +julia> flux_loss(flux_model, x, flux_y_onehot)
    +0.6952386604624324

    We see a very similar final loss and accuracy.


    Summarising this tutorial, we saw how we can run a logistic regression algorithm in Julia with and without using Flux. We started by importing the classic Iris dataset, and one hot encoded the labels. Next, we defined our model, the loss function, and the accuracy, all by ourselves.

    Finally, we trained the model by manually writing down the Gradient Descent algorithm and optimising the loss. Interestingly, we implemented most of the functions on our own, and then compared them in parallel with the functionalities provided by Flux!

    Info

    Originally published on 1st April 2023, by Saransh Chopra.

    diff --git a/previews/PR2365/utilities/index.html b/previews/PR2365/utilities/index.html
    new file mode 100644
    index 0000000000..b2f27d23ce
    --- /dev/null
    +++ b/previews/PR2365/utilities/index.html
    @@ -0,0 +1,164 @@

    Weight Initialisation · Flux

    Random Weight Initialisation

    Flux initialises convolutional layers and recurrent cells with glorot_uniform by default. Most layers accept a function as an init keyword, which replaces this default. For example:

    julia> conv = Conv((3, 3), 3 => 2, relu; init=Flux.glorot_normal)
    +Conv((3, 3), 3 => 2, relu)  # 56 parameters
    +
    +julia> conv.bias
    +2-element Vector{Float32}:
    + 0.0
    + 0.0

    Note that init creates the weight array, but not the bias vector.

    Many of the initialisation functions accept keywords such as gain, and a random number generator. To make it easy to pass these to layers, there are methods which return a function:

    julia> Dense(4 => 5, tanh; init=Flux.glorot_uniform(gain=2))
    +Dense(4 => 5, tanh)  # 25 parameters
    +
    +julia> Dense(4 => 5, tanh; init=Flux.randn32(MersenneTwister(1)))
    +Dense(4 => 5, tanh)  # 25 parameters

    Initialisation functions

    Flux.glorot_uniformFunction
    glorot_uniform([rng], size...; gain = 1) -> Array
    +glorot_uniform([rng]; kw...) -> Function

    Return an Array{Float32} of the given size containing random numbers drawn from a uniform distribution on the interval $[-x, x]$, where x = gain * sqrt(6 / (fan_in + fan_out)).

    This method is described in [1] and also known as Xavier initialization.

    Examples

    julia> Flux.glorot_uniform(3, 4) |> summary
    +"3×4 Matrix{Float32}"
    +
    +julia> round.(extrema(Flux.glorot_uniform(10, 100)), digits=3)
    +(-0.232f0, 0.234f0)
    +
    +julia> round.(extrema(Flux.glorot_uniform(100, 10)), digits=3)
    +(-0.233f0, 0.233f0)
    +
    +julia> round.(extrema(Flux.glorot_uniform(100, 100)), digits=3)
    +(-0.173f0, 0.173f0)
    +
    +julia> Dense(3 => 2, tanh; init = Flux.glorot_uniform(MersenneTwister(1)))
    +Dense(3 => 2, tanh)  # 8 parameters
    +
    +julia> ans.bias
    +2-element Vector{Float32}:
    + 0.0
    + 0.0

    References

    [1] Glorot, Xavier, and Yoshua Bengio. "Understanding the difficulty of training deep feedforward neural networks." Proceedings of the thirteenth international conference on artificial intelligence and statistics. 2010.

    source
    Flux.glorot_normalFunction
glorot_normal([rng], size...; gain = 1) -> Array
glorot_normal([rng]; kw...) -> Function

    Return an Array{Float32} of the given size containing random numbers drawn from a normal distribution with standard deviation gain * sqrt(2 / (fan_in + fan_out)), using nfan.

    This method is described in [1] and also known as Xavier initialization.

    Examples

julia> using Statistics

julia> round(std(Flux.glorot_normal(10, 1000)), digits=3)
0.044f0

julia> round(std(Flux.glorot_normal(1000, 10)), digits=3)
0.044f0

julia> round(std(Flux.glorot_normal(1000, 1000)), digits=3)
0.032f0

julia> Dense(10 => 1000, tanh; init = Flux.glorot_normal(gain=100))
Dense(10 => 1000, tanh)  # 11_000 parameters

julia> round(std(ans.weight), sigdigits=3)
4.45f0

    References

    [1] Glorot, Xavier, and Yoshua Bengio. "Understanding the difficulty of training deep feedforward neural networks." Proceedings of the thirteenth international conference on artificial intelligence and statistics. 2010.

    source
    Flux.kaiming_uniformFunction
kaiming_uniform([rng], size...; gain = √2) -> Array
kaiming_uniform([rng]; kw...) -> Function

Return an Array{Float32} of the given size containing random numbers drawn from a uniform distribution on the interval [-x, x], where x = gain * sqrt(3 / fan_in), using nfan.

    This method is described in [1] and also known as He initialization.

    Examples

julia> round.(extrema(Flux.kaiming_uniform(100, 10)), digits=3)
(-0.774f0, 0.774f0)

julia> round.(extrema(Flux.kaiming_uniform(10, 100)), digits=3)
(-0.245f0, 0.244f0)

julia> round.(extrema(Flux.kaiming_uniform(100, 100)), digits=3)
(-0.245f0, 0.245f0)

    References

    [1] He, Kaiming, et al. "Delving deep into rectifiers: Surpassing human-level performance on imagenet classification." Proceedings of the IEEE international conference on computer vision. 2015.

    source
    Flux.kaiming_normalFunction
kaiming_normal([rng], size...; gain = √2) -> Array
kaiming_normal([rng]; kw...) -> Function

Return an Array{Float32} of the given size containing random numbers drawn from a normal distribution with standard deviation gain / sqrt(fan_in), using nfan.

    This method is described in [1] and also known as He initialization.

    Examples

julia> using Statistics

julia> round(std(Flux.kaiming_normal(10, 1000)), digits=3)
0.045f0

julia> round(std(Flux.kaiming_normal(1000, 10)), digits=3)
0.447f0

julia> round(std(Flux.kaiming_normal(1000, 1000)), digits=3)
0.045f0

    References

    [1] He, Kaiming, et al. "Delving deep into rectifiers: Surpassing human-level performance on imagenet classification." Proceedings of the IEEE international conference on computer vision. 2015.

    source
    Flux.truncated_normalFunction
truncated_normal([rng], size...; mean = 0, std = 1, lo = -2, hi = 2) -> Array
truncated_normal([rng]; kw...) -> Function

    Return an Array{Float32} of the given size where each element is drawn from a truncated normal distribution. The numbers are distributed like filter(x -> lo<=x<=hi, mean .+ std .* randn(100)).

    The values are generated by sampling a Uniform(0, 1) (rand()) and then applying the inverse CDF of the truncated normal distribution. This method works best when lo ≤ mean ≤ hi.

    Examples

julia> using Statistics

julia> Flux.truncated_normal(3, 4) |> summary
"3×4 Matrix{Float32}"

julia> round.(extrema(Flux.truncated_normal(10^6)); digits=3)
(-2.0f0, 2.0f0)

julia> round(std(Flux.truncated_normal(10^6; lo = -100, hi = 100)))
1.0f0
    source
    Flux.orthogonalFunction
orthogonal([rng], size...; gain = 1) -> Array
orthogonal([rng]; kw...) -> Function

    Return an Array{Float32} of the given size which is a (semi) orthogonal matrix, as described in [1].

    Cannot construct a vector, i.e. length(size) == 1 is forbidden. For length(size) > 2, a prod(size[1:(end - 1)]) by size[end] orthogonal matrix is computed before reshaping it to the original dimensions.

    Examples

julia> W = Flux.orthogonal(5, 7);

julia> summary(W)
"5×7 Matrix{Float32}"

julia> W * W' ≈ I(5)
true

julia> W2 = Flux.orthogonal(7, 5);

julia> W2 * W2' ≈ I(7)
false

julia> W2' * W2 ≈ I(5)
true

julia> W3 = Flux.orthogonal(3, 3, 2, 4);

julia> transpose(reshape(W3, :, 4)) * reshape(W3, :, 4) ≈ I(4)
true

    References

    [1] Saxe, McClelland, Ganguli. "Exact solutions to the nonlinear dynamics of learning in deep linear neural networks", ICLR 2014, https://arxiv.org/abs/1312.6120

    source
    Flux.sparse_initFunction
sparse_init([rng], rows, cols; sparsity, std = 0.01) -> Array
sparse_init([rng]; kw...) -> Function

    Return a Matrix{Float32} of size rows, cols where each column contains a fixed fraction of zero elements given by sparsity. Non-zero elements are normally distributed with a mean of zero and standard deviation std.

    This method is described in [1].

    Examples

julia> count(iszero, Flux.sparse_init(10, 10, sparsity=1/5))
20

julia> sum(0 .== Flux.sparse_init(10, 11, sparsity=0.9), dims=1)
1×11 Matrix{Int64}:
 9  9  9  9  9  9  9  9  9  9  9

julia> Dense(3 => 10, tanh; init=Flux.sparse_init(sparsity=0.5))
Dense(3 => 10, tanh)  # 40 parameters

julia> count(iszero, ans.weight, dims=1)
1×3 Matrix{Int64}:
 5  5  5

    References

    [1] Martens, J, "Deep learning via Hessian-free optimization" Proceedings of the 27th International Conference on International Conference on Machine Learning. 2010.

    source
    Flux.identity_initFunction
identity_init(size...; gain=1, shift=0) -> Array
identity_init(; kw...) -> Function

    Return an Array{Float32} of the given size which yields an identity mapping when used as parameters in most Flux layers. Use gain to scale the identity by a constant.

Often useful in the context of transfer learning, i.e. when one wants to add more capacity to a model but start from the same mapping.

It has the following behaviour:

    • 1D: A Vector of zeros (useful for an identity bias)
    • 2D: An identity matrix (useful for an identity matrix multiplication)
    • More than 2D: A dense block array of center tap spatial filters (useful for an identity convolution)

    Some caveats:

    • Not all layers will be identity mapping when used with this init. Exceptions include recurrent layers and normalization layers.

    • Layers must have input_size == output_size for identity mapping to be possible. When this is not the case, extra dimensions of the array are padded with zeros.

• For convolutional layers, in addition to the above, the kernel sizes must also be odd and padding must be applied so that output feature maps have the same size as input feature maps, e.g. by using SamePad.

    Use keyword shift (integer or tuple) to apply circular shift to the output, equivalent to Base.circshift(identity_init(size...), shift).

    For consistency with other initialisers, it accepts rng::AbstractRNG as an optional first argument. But this is ignored, since the result is not random.

    Examples

julia> Flux.identity_init(3,5)
3×5 Matrix{Float32}:
 1.0  0.0  0.0  0.0  0.0
 0.0  1.0  0.0  0.0  0.0
 0.0  0.0  1.0  0.0  0.0

julia> Dense(5 => 3, relu, init=Flux.identity_init)([1,-2,3,-4,5])
3-element Vector{Float32}:
 1.0
 0.0
 3.0

julia> Flux.identity_init(3,3,2; gain=100)
3×3×2 Array{Float32, 3}:
[:, :, 1] =
   0.0  0.0  0.0
 100.0  0.0  0.0
   0.0  0.0  0.0

[:, :, 2] =
 0.0    0.0  0.0
 0.0  100.0  0.0
 0.0    0.0  0.0

julia> x4 = cat([1 2 3; 4 5 6; 7 8 9]; dims=4);

julia> Conv((2,2), 1 => 1, init=Flux.identity_init(gain=10), pad=SamePad())(x4)
3×3×1×1 Array{Float32, 4}:
[:, :, 1, 1] =
 10.0  20.0  30.0
 40.0  50.0  60.0
 70.0  80.0  90.0
    source
    Flux.ones32Function
    ones32(size...) = ones(Float32, size...)

    Return an Array{Float32} of the given size filled with 1s.

    source
    Flux.zeros32Function
    zeros32(size...) = zeros(Float32, size...)

    Return an Array{Float32} of the given size filled with 0s.

    source
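Both helpers are thin wrappers, as the signatures above show; for example:

```julia
using Flux

a = Flux.ones32(2, 3)   # 2×3 Matrix{Float32}, all 1.0f0
b = Flux.zeros32(2, 3)  # 2×3 Matrix{Float32}, all 0.0f0

# Handy anywhere Flux expects a Float32 array, e.g. as a bias value:
Dense(4 => 2; bias = Flux.ones32(2))
```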
    Flux.rand32Function
    rand32([rng], size...)

    Return an Array{Float32} of the given size, filled like rand. When the size is not provided, rand32(rng::AbstractRNG) returns a function.

    source
    Flux.randn32Function
    randn32([rng], size...)

    Return an Array{Float32} of the given size, filled like randn. When the size is not provided, randn32(rng::AbstractRNG) returns a function.

    source
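For example (MersenneTwister is used here only to make the draws reproducible):

```julia
using Flux, Random

x = Flux.rand32(3, 4)                        # uniform Float32 samples
y = Flux.randn32(MersenneTwister(1), 3, 4)   # reproducible normal samples

# With only an RNG, a function is returned — convenient as an init keyword:
Dense(2 => 3; init = Flux.randn32(MersenneTwister(1)))
```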
    Flux.create_biasFunction
    create_bias(weights, bias, size...)

    Return a bias parameter for a layer, based on the value given to the constructor's keyword bias=bias.

    • bias == true creates a trainable array of the given size, of the same type as weights, initialised to zero.
    • bias == false returns false, which is understood by AD to be non-differentiable.
• bias::AbstractArray uses the array provided, so long as it has the correct size. The eltype is corrected to match that of weights.
    source
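A sketch of the three cases (the weight array here is just a stand-in):

```julia
using Flux

w = rand(Float32, 2, 3)  # stand-in for a layer's weight array

Flux.create_bias(w, true, 2)        # 2-element Vector{Float32} of zeros
Flux.create_bias(w, false, 2)       # false: no trainable bias
Flux.create_bias(w, [1.0, 2.0], 2)  # kept, but converted to Float32
```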

    These functions call:

    Flux.rng_from_arrayFunction
    rng_from_array(x)

    Create an instance of the RNG most appropriate for x. The current defaults are:

    • x isa CuArray: CUDA.default_rng()
• x isa AbstractArray: Random.default_rng()
    source
    Flux.nfanFunction
nfan(n_out, n_in=1) -> Tuple
nfan(dims...)
nfan(dims::Tuple)

    For a layer characterized by dimensions dims, return a tuple (fan_in, fan_out), where fan_in is the number of input neurons connected to an output one, and fan_out is the number of output neurons connected to an input one.

    This function is mainly used by weight initializers, e.g., kaiming_normal.

    Examples

julia> layer = Dense(10, 20);

julia> Flux.nfan(size(layer.weight))
(10, 20)

julia> layer = Conv((3, 3), 2=>10);

julia> Flux.nfan(size(layer.weight))
(18, 90)
    source

    Changing the type of all parameters

    The default eltype for models is Float32 since models are often trained/run on GPUs. The eltype of model m can be changed to Float64 by f64(m):

    Flux.f64Function
    f64(m)

Converts the eltype of the model's floating-point parameters to Float64. Recurses into structs marked with @functor.

    See also f32 and f16.

    source
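For instance (Dense here is only an illustration; any @functor-marked model works the same way):

```julia
using Flux

m = Dense(2 => 3)      # parameters are Float32 by default
m64 = f64(m)           # same structure, Float64 parameters

eltype(m64.weight)     # Float64
```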
    Flux.f32Function
    f32(m)

Converts the eltype of the model's floating-point parameters to Float32 (which is Flux's default). Recurses into structs marked with @functor.

    See also f64 and f16.

    source
    Flux.f16Function
    f16(m)

Converts the eltype of the model's floating-point parameters to Float16. Recurses into structs marked with @functor.

    Support for Float16 is limited on many CPUs. Julia may convert to Float32 for each operation, which is slow.

    See also f32 and f64.

    Example

julia> m = Chain(Dense(784, 2048, relu), Dense(2048, 10))  # all Float32
Chain(
  Dense(784 => 2048, relu),             # 1_607_680 parameters
  Dense(2048 => 10),                    # 20_490 parameters
)                   # Total: 4 arrays, 1_628_170 parameters, 6.211 MiB.

julia> m |> f16  # takes half the memory
Chain(
  Dense(784 => 2048, relu),             # 1_607_680 parameters
  Dense(2048 => 10),                    # 20_490 parameters
)                   # Total: 4 arrays, 1_628_170 parameters, 3.106 MiB.
    source