[Binary image diff omitted: regenerated build artifacts under `_images/`. Roughly sixty content-hashed `*.png` files were renamed with near-identical content (similarity index 99%), several new images were added, and several obsolete images were deleted. No textual changes in this section.]
_images/be6c92f3d6ceb2ba4bba2c848a305e5a7a4925c1458cbcb3e29282d26cbcb890.png rename to _images/a976c67f79ffb266a7ac3936519c0877c525ccb29aee29b084c9f622a93e8df6.png index 685907c1d..8198aa737 100644 Binary files a/_images/be6c92f3d6ceb2ba4bba2c848a305e5a7a4925c1458cbcb3e29282d26cbcb890.png and b/_images/a976c67f79ffb266a7ac3936519c0877c525ccb29aee29b084c9f622a93e8df6.png differ diff --git a/_images/5517ff75b0df3683c1d00c1d0cc0a9ba636f10e7d27ecfc0f92189d829f2c556.png b/_images/adbbb7d4b9eca41b61d38981d89b534767b785b4466f76eb683aef9080dad326.png similarity index 99% rename from _images/5517ff75b0df3683c1d00c1d0cc0a9ba636f10e7d27ecfc0f92189d829f2c556.png rename to _images/adbbb7d4b9eca41b61d38981d89b534767b785b4466f76eb683aef9080dad326.png index 8c8b81c4d..52a6d7136 100644 Binary files a/_images/5517ff75b0df3683c1d00c1d0cc0a9ba636f10e7d27ecfc0f92189d829f2c556.png and b/_images/adbbb7d4b9eca41b61d38981d89b534767b785b4466f76eb683aef9080dad326.png differ diff --git a/_images/d964234cc1439e0aaf1deb690d9bad92d4f76256534fcab0ac85899a77c99c2c.png b/_images/b0a991aa1fd778f2f0ef3d4f2bb95a33c4e1819dc6126275b87c8b2891aa6fde.png similarity index 99% rename from _images/d964234cc1439e0aaf1deb690d9bad92d4f76256534fcab0ac85899a77c99c2c.png rename to _images/b0a991aa1fd778f2f0ef3d4f2bb95a33c4e1819dc6126275b87c8b2891aa6fde.png index 40f41f1a0..765985d81 100644 Binary files a/_images/d964234cc1439e0aaf1deb690d9bad92d4f76256534fcab0ac85899a77c99c2c.png and b/_images/b0a991aa1fd778f2f0ef3d4f2bb95a33c4e1819dc6126275b87c8b2891aa6fde.png differ diff --git a/_images/b9ae3a198b7e9657cfcd1bb3e387ab53c26153ea18f0437e4d25ce65a1b1658e.png b/_images/b1be39be0840384dec6887d3741e6245a06ade1140d68ec90e56fa51bbb92d83.png similarity index 99% rename from _images/b9ae3a198b7e9657cfcd1bb3e387ab53c26153ea18f0437e4d25ce65a1b1658e.png rename to _images/b1be39be0840384dec6887d3741e6245a06ade1140d68ec90e56fa51bbb92d83.png index e61cbd434..ceebc3ca3 100644 Binary files 
a/_images/b9ae3a198b7e9657cfcd1bb3e387ab53c26153ea18f0437e4d25ce65a1b1658e.png and b/_images/b1be39be0840384dec6887d3741e6245a06ade1140d68ec90e56fa51bbb92d83.png differ diff --git a/_images/16bacb1df7ed9615ca2b3a5b1b2d017eea911c94d95e697331ef9f174cda5ad0.png b/_images/b2c72bebca4f83cb4f4eaf6f8abd90f6e9200fbfb1d2b4c301ce441b41803f8a.png similarity index 99% rename from _images/16bacb1df7ed9615ca2b3a5b1b2d017eea911c94d95e697331ef9f174cda5ad0.png rename to _images/b2c72bebca4f83cb4f4eaf6f8abd90f6e9200fbfb1d2b4c301ce441b41803f8a.png index 909b82898..81d187b70 100644 Binary files a/_images/16bacb1df7ed9615ca2b3a5b1b2d017eea911c94d95e697331ef9f174cda5ad0.png and b/_images/b2c72bebca4f83cb4f4eaf6f8abd90f6e9200fbfb1d2b4c301ce441b41803f8a.png differ diff --git a/_images/569fc8f69cf7812bf8f3a04c31fc90eb497c9de16f18b9bd9de5ab65b360fc37.png b/_images/b3c2bb3bab70cb65e6dbcbdbd1e021ecd70d29be61ff143fb1e69a93c731189a.png similarity index 99% rename from _images/569fc8f69cf7812bf8f3a04c31fc90eb497c9de16f18b9bd9de5ab65b360fc37.png rename to _images/b3c2bb3bab70cb65e6dbcbdbd1e021ecd70d29be61ff143fb1e69a93c731189a.png index d3f639b1e..21e135d29 100644 Binary files a/_images/569fc8f69cf7812bf8f3a04c31fc90eb497c9de16f18b9bd9de5ab65b360fc37.png and b/_images/b3c2bb3bab70cb65e6dbcbdbd1e021ecd70d29be61ff143fb1e69a93c731189a.png differ diff --git a/_images/5adc4e702a00e5378fb581feb37714785469a4a637c9024550a29ebc765a8315.png b/_images/b66c28d1e7df9a9bb7418cf1fd3dcad9402533eba8bfc3b56b9f52bb5d7012a2.png similarity index 99% rename from _images/5adc4e702a00e5378fb581feb37714785469a4a637c9024550a29ebc765a8315.png rename to _images/b66c28d1e7df9a9bb7418cf1fd3dcad9402533eba8bfc3b56b9f52bb5d7012a2.png index 8f67c4f86..cf96dd04c 100644 Binary files a/_images/5adc4e702a00e5378fb581feb37714785469a4a637c9024550a29ebc765a8315.png and b/_images/b66c28d1e7df9a9bb7418cf1fd3dcad9402533eba8bfc3b56b9f52bb5d7012a2.png differ diff --git 
a/_images/94521ffc12e0f43997e97e204432b3e669bdfe84ffb02c1c42e216b27772bc27.png b/_images/b8061ee4e4d4e3abd502af240b0102a0ebb5490a872dfc818001e9a245bee996.png similarity index 99% rename from _images/94521ffc12e0f43997e97e204432b3e669bdfe84ffb02c1c42e216b27772bc27.png rename to _images/b8061ee4e4d4e3abd502af240b0102a0ebb5490a872dfc818001e9a245bee996.png index 571597c0d..0b25af9f8 100644 Binary files a/_images/94521ffc12e0f43997e97e204432b3e669bdfe84ffb02c1c42e216b27772bc27.png and b/_images/b8061ee4e4d4e3abd502af240b0102a0ebb5490a872dfc818001e9a245bee996.png differ diff --git a/_images/95f88f7e75611a5dfb12461c69670991eb7a9f23c58d715c1ed41a11777107d3.png b/_images/ba1198882221690cb498e83f0ecebef51d6b091fcff67ca9438698ea18db5d67.png similarity index 99% rename from _images/95f88f7e75611a5dfb12461c69670991eb7a9f23c58d715c1ed41a11777107d3.png rename to _images/ba1198882221690cb498e83f0ecebef51d6b091fcff67ca9438698ea18db5d67.png index 913e4009b..0f7a3d399 100644 Binary files a/_images/95f88f7e75611a5dfb12461c69670991eb7a9f23c58d715c1ed41a11777107d3.png and b/_images/ba1198882221690cb498e83f0ecebef51d6b091fcff67ca9438698ea18db5d67.png differ diff --git a/_images/6e07f71f6a34699f7e147a428c9a23eaa43793fa82b95325ad0a902972a6495b.png b/_images/bbe15bf07165a4e150e848e15698f27bb86b8727403949b9bff1de1fced30671.png similarity index 99% rename from _images/6e07f71f6a34699f7e147a428c9a23eaa43793fa82b95325ad0a902972a6495b.png rename to _images/bbe15bf07165a4e150e848e15698f27bb86b8727403949b9bff1de1fced30671.png index f0e0aa0dd..e32d4253d 100644 Binary files a/_images/6e07f71f6a34699f7e147a428c9a23eaa43793fa82b95325ad0a902972a6495b.png and b/_images/bbe15bf07165a4e150e848e15698f27bb86b8727403949b9bff1de1fced30671.png differ diff --git a/_images/8fd49f19f5c5553fa3f9e8f8f036659a2b0816046d6d9d13b3610298a44801d6.png b/_images/beed17a196e131b7484b279457fe53ab09d927adebd90b3acdce17e6c5aac9a4.png similarity index 99% rename from 
_images/8fd49f19f5c5553fa3f9e8f8f036659a2b0816046d6d9d13b3610298a44801d6.png rename to _images/beed17a196e131b7484b279457fe53ab09d927adebd90b3acdce17e6c5aac9a4.png index 86c342070..f8e5a0a07 100644 Binary files a/_images/8fd49f19f5c5553fa3f9e8f8f036659a2b0816046d6d9d13b3610298a44801d6.png and b/_images/beed17a196e131b7484b279457fe53ab09d927adebd90b3acdce17e6c5aac9a4.png differ diff --git a/_images/58317876c091f1bd51681944c7f5ba3b486c62f152afab357750ec03adc091c3.png b/_images/c3d6f36997c074b0a8a6d31836859b54f631b834fd9bab539fc1c013a9b96c37.png similarity index 99% rename from _images/58317876c091f1bd51681944c7f5ba3b486c62f152afab357750ec03adc091c3.png rename to _images/c3d6f36997c074b0a8a6d31836859b54f631b834fd9bab539fc1c013a9b96c37.png index 5f53f93b9..1be41dc83 100644 Binary files a/_images/58317876c091f1bd51681944c7f5ba3b486c62f152afab357750ec03adc091c3.png and b/_images/c3d6f36997c074b0a8a6d31836859b54f631b834fd9bab539fc1c013a9b96c37.png differ diff --git a/_images/d8aea21b3d80d298463d9b5b90da9a832b7a4cfa982949836b3ac13c8ae399b7.png b/_images/c66b47652a1108d84fc48ff6084ec8a491fc75c66614eccf023d769bb9d1889e.png similarity index 99% rename from _images/d8aea21b3d80d298463d9b5b90da9a832b7a4cfa982949836b3ac13c8ae399b7.png rename to _images/c66b47652a1108d84fc48ff6084ec8a491fc75c66614eccf023d769bb9d1889e.png index 16f56e98e..b3343157c 100644 Binary files a/_images/d8aea21b3d80d298463d9b5b90da9a832b7a4cfa982949836b3ac13c8ae399b7.png and b/_images/c66b47652a1108d84fc48ff6084ec8a491fc75c66614eccf023d769bb9d1889e.png differ diff --git a/_images/c81c8f6a77c176e76725be01bc6deec7a048dfc2f57487287143f02c94740461.png b/_images/c81c8f6a77c176e76725be01bc6deec7a048dfc2f57487287143f02c94740461.png deleted file mode 100644 index 349bb15bb..000000000 Binary files a/_images/c81c8f6a77c176e76725be01bc6deec7a048dfc2f57487287143f02c94740461.png and /dev/null differ diff --git a/_images/c8df484c737084a7fd8066188d3af74d3fd3a66329025dbc4e3564d1366bb65c.png 
b/_images/c8df484c737084a7fd8066188d3af74d3fd3a66329025dbc4e3564d1366bb65c.png deleted file mode 100644 index a908490d7..000000000 Binary files a/_images/c8df484c737084a7fd8066188d3af74d3fd3a66329025dbc4e3564d1366bb65c.png and /dev/null differ diff --git a/_images/13a482702ff210a0b3c6c0e3d6ad8f63d1d0ed37e15c1eadea804599f0e0e257.png b/_images/c95d42570103145fe081b9b3db1e19128fbfbc67a05af8694998317cadabd971.png similarity index 99% rename from _images/13a482702ff210a0b3c6c0e3d6ad8f63d1d0ed37e15c1eadea804599f0e0e257.png rename to _images/c95d42570103145fe081b9b3db1e19128fbfbc67a05af8694998317cadabd971.png index 2152cc4b7..7553027dd 100644 Binary files a/_images/13a482702ff210a0b3c6c0e3d6ad8f63d1d0ed37e15c1eadea804599f0e0e257.png and b/_images/c95d42570103145fe081b9b3db1e19128fbfbc67a05af8694998317cadabd971.png differ diff --git a/_images/ee2a62c7eda5e489def7f90d1dec3444d24b7a2046768887656da91b19675d09.png b/_images/cf8d4c96f5b4bbc479bbced72f41a91d985d373b1fcd94daf6f08f70a09a97f5.png similarity index 99% rename from _images/ee2a62c7eda5e489def7f90d1dec3444d24b7a2046768887656da91b19675d09.png rename to _images/cf8d4c96f5b4bbc479bbced72f41a91d985d373b1fcd94daf6f08f70a09a97f5.png index b85649060..02ccb5311 100644 Binary files a/_images/ee2a62c7eda5e489def7f90d1dec3444d24b7a2046768887656da91b19675d09.png and b/_images/cf8d4c96f5b4bbc479bbced72f41a91d985d373b1fcd94daf6f08f70a09a97f5.png differ diff --git a/_images/9cac0e245302cda4f4f277a43ee4c8b4b74eb2da2af25944be5022b4c6721177.png b/_images/d08f387ea927ba54e51cf3eac1e928d4cc5d6864c238f0e4664842ca7087a7fa.png similarity index 99% rename from _images/9cac0e245302cda4f4f277a43ee4c8b4b74eb2da2af25944be5022b4c6721177.png rename to _images/d08f387ea927ba54e51cf3eac1e928d4cc5d6864c238f0e4664842ca7087a7fa.png index bab52c645..94a6428be 100644 Binary files a/_images/9cac0e245302cda4f4f277a43ee4c8b4b74eb2da2af25944be5022b4c6721177.png and b/_images/d08f387ea927ba54e51cf3eac1e928d4cc5d6864c238f0e4664842ca7087a7fa.png differ diff 
--git a/_images/3cd532bdc2aa84dba045f16ff44310580f469353eef0e1144af89a8bd28c1647.png b/_images/d50b0bb5be799a5938cc05673941c07c15ae6d4203f35bc93551e6831ce2318c.png similarity index 99% rename from _images/3cd532bdc2aa84dba045f16ff44310580f469353eef0e1144af89a8bd28c1647.png rename to _images/d50b0bb5be799a5938cc05673941c07c15ae6d4203f35bc93551e6831ce2318c.png index 2fa2eae95..8dc8a549b 100644 Binary files a/_images/3cd532bdc2aa84dba045f16ff44310580f469353eef0e1144af89a8bd28c1647.png and b/_images/d50b0bb5be799a5938cc05673941c07c15ae6d4203f35bc93551e6831ce2318c.png differ diff --git a/_images/c8121c9542c8d3f7f2bf90cb4e76695a56130892165b3c0d4f85eafbaedee77c.png b/_images/d86642fd7d9645b88930bbc26359f33b466bf815199f4cddda4926bc2654deab.png similarity index 99% rename from _images/c8121c9542c8d3f7f2bf90cb4e76695a56130892165b3c0d4f85eafbaedee77c.png rename to _images/d86642fd7d9645b88930bbc26359f33b466bf815199f4cddda4926bc2654deab.png index 116a3b92e..6e186b6f3 100644 Binary files a/_images/c8121c9542c8d3f7f2bf90cb4e76695a56130892165b3c0d4f85eafbaedee77c.png and b/_images/d86642fd7d9645b88930bbc26359f33b466bf815199f4cddda4926bc2654deab.png differ diff --git a/_images/b0de31c52ce78630640d081720a3b57ef3508edc188b3a277bf0f64307ec7f08.png b/_images/da42e2626a961a450ddd110e5ba09e05e306fde26455d97b2eb04873f11bbfc0.png similarity index 99% rename from _images/b0de31c52ce78630640d081720a3b57ef3508edc188b3a277bf0f64307ec7f08.png rename to _images/da42e2626a961a450ddd110e5ba09e05e306fde26455d97b2eb04873f11bbfc0.png index 7423445c2..00fb6b7d4 100644 Binary files a/_images/b0de31c52ce78630640d081720a3b57ef3508edc188b3a277bf0f64307ec7f08.png and b/_images/da42e2626a961a450ddd110e5ba09e05e306fde26455d97b2eb04873f11bbfc0.png differ diff --git a/_images/273757cbb77dc45d006e9cef718bc20ea51015358901c0caec78eb2f7c511f86.png b/_images/de36e59077b4c818fabcce4691fc43acd5dce227952140b65db535ec5f9bff24.png similarity index 99% rename from 
_images/273757cbb77dc45d006e9cef718bc20ea51015358901c0caec78eb2f7c511f86.png rename to _images/de36e59077b4c818fabcce4691fc43acd5dce227952140b65db535ec5f9bff24.png index 391b5425a..05b6213f9 100644 Binary files a/_images/273757cbb77dc45d006e9cef718bc20ea51015358901c0caec78eb2f7c511f86.png and b/_images/de36e59077b4c818fabcce4691fc43acd5dce227952140b65db535ec5f9bff24.png differ diff --git a/_images/5f632ca1ca12cdb2c26a4890db3a4e76d29d31c78aaa9913dc2b6e5dea2acf10.png b/_images/e11ad47d91b98ae72765f301c3edb9d68b175197f00373d1d5b26c5831c96317.png similarity index 99% rename from _images/5f632ca1ca12cdb2c26a4890db3a4e76d29d31c78aaa9913dc2b6e5dea2acf10.png rename to _images/e11ad47d91b98ae72765f301c3edb9d68b175197f00373d1d5b26c5831c96317.png index d17a50ad9..522935076 100644 Binary files a/_images/5f632ca1ca12cdb2c26a4890db3a4e76d29d31c78aaa9913dc2b6e5dea2acf10.png and b/_images/e11ad47d91b98ae72765f301c3edb9d68b175197f00373d1d5b26c5831c96317.png differ diff --git a/_images/3c6665d686802f0f4bf3cf60d8cab3be4c09762b585030cdf6311a15b4370ef1.png b/_images/e1383b959d43a9732ddffe7257cf6666e5e98ca6eb420487f5b6507bd1e6c0ff.png similarity index 99% rename from _images/3c6665d686802f0f4bf3cf60d8cab3be4c09762b585030cdf6311a15b4370ef1.png rename to _images/e1383b959d43a9732ddffe7257cf6666e5e98ca6eb420487f5b6507bd1e6c0ff.png index d0a85f8d4..9239826a6 100644 Binary files a/_images/3c6665d686802f0f4bf3cf60d8cab3be4c09762b585030cdf6311a15b4370ef1.png and b/_images/e1383b959d43a9732ddffe7257cf6666e5e98ca6eb420487f5b6507bd1e6c0ff.png differ diff --git a/_images/b18831179318d3ead60db0cd39ed9592b92bac6c042757db9b75eb3027d1808a.png b/_images/e174c1088fb4925cbb8adc7e900bc71d034c2aa67f0619a0b7d1ede19bfd792a.png similarity index 99% rename from _images/b18831179318d3ead60db0cd39ed9592b92bac6c042757db9b75eb3027d1808a.png rename to _images/e174c1088fb4925cbb8adc7e900bc71d034c2aa67f0619a0b7d1ede19bfd792a.png index 9cfad31c8..3467c2146 100644 Binary files 
a/_images/b18831179318d3ead60db0cd39ed9592b92bac6c042757db9b75eb3027d1808a.png and b/_images/e174c1088fb4925cbb8adc7e900bc71d034c2aa67f0619a0b7d1ede19bfd792a.png differ diff --git a/_images/aa33911f4860473fcf5024cc6dcadb3b18a02dbc913af4c29d3781785d8ab9ae.png b/_images/e6cc24c552bd299556dddabb2cf749a7efa8dca694629c0274f424097cf80e9b.png similarity index 99% rename from _images/aa33911f4860473fcf5024cc6dcadb3b18a02dbc913af4c29d3781785d8ab9ae.png rename to _images/e6cc24c552bd299556dddabb2cf749a7efa8dca694629c0274f424097cf80e9b.png index 0e797bc65..00d567022 100644 Binary files a/_images/aa33911f4860473fcf5024cc6dcadb3b18a02dbc913af4c29d3781785d8ab9ae.png and b/_images/e6cc24c552bd299556dddabb2cf749a7efa8dca694629c0274f424097cf80e9b.png differ diff --git a/_images/bcbeed3a7ea37656e384c7107d2396b9f3c947be9eeacdee26aaff7ffcc33cb3.png b/_images/e997693cb4498edd9c19d71a3c1b891785373cfa02e18c47b70ced7e512bf630.png similarity index 99% rename from _images/bcbeed3a7ea37656e384c7107d2396b9f3c947be9eeacdee26aaff7ffcc33cb3.png rename to _images/e997693cb4498edd9c19d71a3c1b891785373cfa02e18c47b70ced7e512bf630.png index 94dcf3a90..d9fed4ea9 100644 Binary files a/_images/bcbeed3a7ea37656e384c7107d2396b9f3c947be9eeacdee26aaff7ffcc33cb3.png and b/_images/e997693cb4498edd9c19d71a3c1b891785373cfa02e18c47b70ced7e512bf630.png differ diff --git a/_images/63b232bbbcd644fcffbd6e619b6ab4e3ac81351028b17be87d47f52439c4a992.png b/_images/eb90e6c18ffc8ac63832ae36125a2dba2880170a40541762168fdf520c9687a3.png similarity index 99% rename from _images/63b232bbbcd644fcffbd6e619b6ab4e3ac81351028b17be87d47f52439c4a992.png rename to _images/eb90e6c18ffc8ac63832ae36125a2dba2880170a40541762168fdf520c9687a3.png index b9024b1c9..9f82118ff 100644 Binary files a/_images/63b232bbbcd644fcffbd6e619b6ab4e3ac81351028b17be87d47f52439c4a992.png and b/_images/eb90e6c18ffc8ac63832ae36125a2dba2880170a40541762168fdf520c9687a3.png differ diff --git 
a/_images/5a356d13bf0631d6fb14465e16c11c4090f2f7a1303ed7fd849d74987a318eed.png b/_images/edec94c24efa7adc6f73aece51381008efd9a7dacc611f859f1c4227b3b4376c.png similarity index 99% rename from _images/5a356d13bf0631d6fb14465e16c11c4090f2f7a1303ed7fd849d74987a318eed.png rename to _images/edec94c24efa7adc6f73aece51381008efd9a7dacc611f859f1c4227b3b4376c.png index 76c350017..58bcbbb26 100644 Binary files a/_images/5a356d13bf0631d6fb14465e16c11c4090f2f7a1303ed7fd849d74987a318eed.png and b/_images/edec94c24efa7adc6f73aece51381008efd9a7dacc611f859f1c4227b3b4376c.png differ diff --git a/_images/ee0219c6b1af7da294c7b8282f6483798df6b41b8798f2663e1a8b3c19d3e74f.png b/_images/ee0219c6b1af7da294c7b8282f6483798df6b41b8798f2663e1a8b3c19d3e74f.png new file mode 100644 index 000000000..67ea915ef Binary files /dev/null and b/_images/ee0219c6b1af7da294c7b8282f6483798df6b41b8798f2663e1a8b3c19d3e74f.png differ diff --git a/_images/61cd63687588ab8ce0f21288e4d0d6fe610bcb5cdb6d33279cd662b03819c4a3.png b/_images/eec9425a8f91b3ee6a494b0e22e78c13bf44761429bffb58850c1611147a5bc7.png similarity index 99% rename from _images/61cd63687588ab8ce0f21288e4d0d6fe610bcb5cdb6d33279cd662b03819c4a3.png rename to _images/eec9425a8f91b3ee6a494b0e22e78c13bf44761429bffb58850c1611147a5bc7.png index a6044ef1e..f4339a57f 100644 Binary files a/_images/61cd63687588ab8ce0f21288e4d0d6fe610bcb5cdb6d33279cd662b03819c4a3.png and b/_images/eec9425a8f91b3ee6a494b0e22e78c13bf44761429bffb58850c1611147a5bc7.png differ diff --git a/_images/6864477d9e8849db0385b53a73ba8cbce3690f54f52e3239e340749f14d4243c.png b/_images/effbe87412081f6f6827cf3b5e342e950298e3905b0ac4550c91d643e4c9191e.png similarity index 99% rename from _images/6864477d9e8849db0385b53a73ba8cbce3690f54f52e3239e340749f14d4243c.png rename to _images/effbe87412081f6f6827cf3b5e342e950298e3905b0ac4550c91d643e4c9191e.png index 2e32fe108..8c79da2e7 100644 Binary files a/_images/6864477d9e8849db0385b53a73ba8cbce3690f54f52e3239e340749f14d4243c.png and 
b/_images/effbe87412081f6f6827cf3b5e342e950298e3905b0ac4550c91d643e4c9191e.png differ diff --git a/_images/92c8a721ca3f1972276d94a0fb7c12964f0ce72c90ab24090445dd10e9060044.png b/_images/f04470b0ce7dd4df80a6d3eb32bc7a026ddd3df7c45c65a45cf960fa4954d5aa.png similarity index 99% rename from _images/92c8a721ca3f1972276d94a0fb7c12964f0ce72c90ab24090445dd10e9060044.png rename to _images/f04470b0ce7dd4df80a6d3eb32bc7a026ddd3df7c45c65a45cf960fa4954d5aa.png index 15df5a52e..31bb54dec 100644 Binary files a/_images/92c8a721ca3f1972276d94a0fb7c12964f0ce72c90ab24090445dd10e9060044.png and b/_images/f04470b0ce7dd4df80a6d3eb32bc7a026ddd3df7c45c65a45cf960fa4954d5aa.png differ diff --git a/_images/657b75caf157f5e8a44e7e14cd2f691d4f1560628aa3d1d8a2e085eeec1c3356.png b/_images/f0e8f6d99d4b7ea50d65e7ac08504675500847058d0a74bb06e228699b609634.png similarity index 99% rename from _images/657b75caf157f5e8a44e7e14cd2f691d4f1560628aa3d1d8a2e085eeec1c3356.png rename to _images/f0e8f6d99d4b7ea50d65e7ac08504675500847058d0a74bb06e228699b609634.png index 050a66585..f1f5f5712 100644 Binary files a/_images/657b75caf157f5e8a44e7e14cd2f691d4f1560628aa3d1d8a2e085eeec1c3356.png and b/_images/f0e8f6d99d4b7ea50d65e7ac08504675500847058d0a74bb06e228699b609634.png differ diff --git a/_images/e2bbfa7fcba2c15582af5f938bb59bfd0604fbbcd2a43c8c054346d49a7ad0b6.png b/_images/f39b59eb6c7c677dfbd0cccfa034f14897e0e44ce39cf11a9b5b0662b7e14383.png similarity index 99% rename from _images/e2bbfa7fcba2c15582af5f938bb59bfd0604fbbcd2a43c8c054346d49a7ad0b6.png rename to _images/f39b59eb6c7c677dfbd0cccfa034f14897e0e44ce39cf11a9b5b0662b7e14383.png index e655e97c7..835533905 100644 Binary files a/_images/e2bbfa7fcba2c15582af5f938bb59bfd0604fbbcd2a43c8c054346d49a7ad0b6.png and b/_images/f39b59eb6c7c677dfbd0cccfa034f14897e0e44ce39cf11a9b5b0662b7e14383.png differ diff --git a/_images/f6ab9f77e2837ddf28ee5f76c8cb41df931cd5789762da1f6effcf890edabc16.png 
b/_images/f6ab9f77e2837ddf28ee5f76c8cb41df931cd5789762da1f6effcf890edabc16.png deleted file mode 100644 index c0d91e9da..000000000 Binary files a/_images/f6ab9f77e2837ddf28ee5f76c8cb41df931cd5789762da1f6effcf890edabc16.png and /dev/null differ diff --git a/_images/2cd8102a1957cda7079606f33897aa474c5b616a78e2756f473f43f5d7cf146f.png b/_images/f72b0a91a6ca01adf773cd29f13805f1bbecef345442382010050772cde30832.png similarity index 99% rename from _images/2cd8102a1957cda7079606f33897aa474c5b616a78e2756f473f43f5d7cf146f.png rename to _images/f72b0a91a6ca01adf773cd29f13805f1bbecef345442382010050772cde30832.png index f81dce0dd..c0fc93695 100644 Binary files a/_images/2cd8102a1957cda7079606f33897aa474c5b616a78e2756f473f43f5d7cf146f.png and b/_images/f72b0a91a6ca01adf773cd29f13805f1bbecef345442382010050772cde30832.png differ diff --git a/_images/f78468b1fd9d04097ff8de8c8037eee381278208493f358d6a14244b194bb0ba.png b/_images/f7677587eea0099d87ca72136e5798438cc531f37a309517501b606513a5fb2d.png similarity index 99% rename from _images/f78468b1fd9d04097ff8de8c8037eee381278208493f358d6a14244b194bb0ba.png rename to _images/f7677587eea0099d87ca72136e5798438cc531f37a309517501b606513a5fb2d.png index 7459f9ae1..7616ccf1c 100644 Binary files a/_images/f78468b1fd9d04097ff8de8c8037eee381278208493f358d6a14244b194bb0ba.png and b/_images/f7677587eea0099d87ca72136e5798438cc531f37a309517501b606513a5fb2d.png differ diff --git a/_images/d7d27e1e38842d3008b50bab3b9329989fe5003502aeb54e0c98dbc923224f90.png b/_images/fc7ba8921f3ef5787aa3fdb5dc0479931549406ee8001113099a562ce771d606.png similarity index 99% rename from _images/d7d27e1e38842d3008b50bab3b9329989fe5003502aeb54e0c98dbc923224f90.png rename to _images/fc7ba8921f3ef5787aa3fdb5dc0479931549406ee8001113099a562ce771d606.png index b93777fb9..07173b27f 100644 Binary files a/_images/d7d27e1e38842d3008b50bab3b9329989fe5003502aeb54e0c98dbc923224f90.png and b/_images/fc7ba8921f3ef5787aa3fdb5dc0479931549406ee8001113099a562ce771d606.png differ diff 
--git a/_images/19e4ae58913a7019a615c61c567203b572272c267e08a4bc78050b83b016f982.png b/_images/fc905809fe9379a02f91849ff8ef8d06c2830a18097bd076ee7f2e4fed1e6e83.png similarity index 99% rename from _images/19e4ae58913a7019a615c61c567203b572272c267e08a4bc78050b83b016f982.png rename to _images/fc905809fe9379a02f91849ff8ef8d06c2830a18097bd076ee7f2e4fed1e6e83.png index 1dc528d62..19a87e427 100644 Binary files a/_images/19e4ae58913a7019a615c61c567203b572272c267e08a4bc78050b83b016f982.png and b/_images/fc905809fe9379a02f91849ff8ef8d06c2830a18097bd076ee7f2e4fed1e6e83.png differ diff --git a/_images/fcb2cefa233355c15605817e37826c6ed3e69137306d11d3bf0e009a8f290296.png b/_images/fcb2cefa233355c15605817e37826c6ed3e69137306d11d3bf0e009a8f290296.png deleted file mode 100644 index bf05c6075..000000000 Binary files a/_images/fcb2cefa233355c15605817e37826c6ed3e69137306d11d3bf0e009a8f290296.png and /dev/null differ diff --git a/_sources/tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial2.ipynb b/_sources/tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial2.ipynb index b69e95d1a..35ff008aa 100644 --- a/_sources/tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial2.ipynb +++ b/_sources/tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial2.ipynb @@ -378,7 +378,7 @@ "\n", "In classical transformer systems, a core principle is encoding and decoding. We can encode an input sequence as a vector (that implicitly codes what we just read). And we can then take this vector and decode it, e.g., as a new sentence. So a sequence-to-sequence (e.g., sentence translation) system may read a sentence (made out of words embedded in a relevant space) and encode it as an overall vector. It then takes the resulting encoding of the sentence and decodes it into a translated sentence.\n", "\n", - "In modern transformer systems, such as GPT, all words are used parallelly. 
In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)." + "In modern transformer systems, such as GPT, all words are used in parallel. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)." ] }, { diff --git a/projects/ComputerVision/data_augmentation.html b/projects/ComputerVision/data_augmentation.html index 90ab553a7..414908acf 100644 --- a/projects/ComputerVision/data_augmentation.html +++ b/projects/ComputerVision/data_augmentation.html @@ -1763,8 +1763,8 @@

Cutout

Mixup#

Mixup is a data augmentation technique that combines pairs of examples via a convex combination of both the images and the labels. Given images \(x_i\) and \(x_j\) with labels \(y_i\) and \(y_j\), respectively, and \(\lambda \in [0, 1]\), mixup creates a new image \(\hat{x}\) with label \(\hat{y}\) as follows:

-
-(128)#\[\begin{align} +
+(128)#\[\begin{align} \hat{x} &= \lambda x_i + (1 - \lambda) x_j \\ \hat{y} &= \lambda y_i + (1 - \lambda) y_j \end{align}\]
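The convex combination above can be sketched in a few lines of NumPy. This is a minimal illustration, not the page's own implementation; in particular, sampling \(\lambda\) from a Beta distribution with parameter `alpha` is an assumption borrowed from common mixup practice rather than something stated here.

```python
import numpy as np

def mixup(x_i, y_i, x_j, y_j, alpha=0.2, rng=None):
    """Blend two labelled examples by a convex combination (mixup sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)          # mixing coefficient lambda in [0, 1]
    x_hat = lam * x_i + (1 - lam) * x_j   # blended image
    y_hat = lam * y_i + (1 - lam) * y_j   # blended (soft) label
    return x_hat, y_hat
```

Note that `y_i` and `y_j` are assumed to be one-hot (or otherwise numeric) label vectors, so the blended label `y_hat` is a soft distribution over classes.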
diff --git a/projects/modelingsteps/Example_Deep_Learning_Project.html b/projects/modelingsteps/Example_Deep_Learning_Project.html index 9d9f19ff5..420982139 100644 --- a/projects/modelingsteps/Example_Deep_Learning_Project.html +++ b/projects/modelingsteps/Example_Deep_Learning_Project.html @@ -2067,33 +2067,33 @@

Build model -
Epoch [100/500], Step [1/2], Loss: 0.8369, Accuracy: 71.51%
+
Epoch [100/500], Step [1/2], Loss: 0.8982, Accuracy: 72.67%
 ------------------------------------------
-Epoch [100/500], Step [2/2], Loss: 0.9079, Accuracy: 67.83%
+Epoch [100/500], Step [2/2], Loss: 0.9187, Accuracy: 68.80%
 ------------------------------------------
 
-
Epoch [200/500], Step [1/2], Loss: 0.6369, Accuracy: 77.52%
+
Epoch [200/500], Step [1/2], Loss: 0.6901, Accuracy: 77.13%
 ------------------------------------------
-Epoch [200/500], Step [2/2], Loss: 0.5581, Accuracy: 81.40%
+Epoch [200/500], Step [2/2], Loss: 0.6060, Accuracy: 79.26%
 ------------------------------------------
 
-
Epoch [300/500], Step [1/2], Loss: 0.5327, Accuracy: 80.23%
+
Epoch [300/500], Step [1/2], Loss: 0.5478, Accuracy: 81.20%
 ------------------------------------------
-Epoch [300/500], Step [2/2], Loss: 0.4692, Accuracy: 85.08%
+Epoch [300/500], Step [2/2], Loss: 0.4853, Accuracy: 83.91%
 ------------------------------------------
 
-
Epoch [400/500], Step [1/2], Loss: 0.3940, Accuracy: 87.98%
+
Epoch [400/500], Step [1/2], Loss: 0.4423, Accuracy: 83.53%
 ------------------------------------------
-Epoch [400/500], Step [2/2], Loss: 0.4626, Accuracy: 82.56%
+Epoch [400/500], Step [2/2], Loss: 0.5842, Accuracy: 78.88%
 ------------------------------------------
 
- -
limb performance: 72.67%
+
limb performance: 78.49%
 
 *** FITTING: Right Leg
 
-
limb performance: 70.93%
+
limb performance: 66.28%
 
 *** FITTING: Left Arm
 
-
limb performance: 66.86%
+
limb performance: 56.40%
 
 *** FITTING: Right Arm
 
-
limb performance: 38.95%
+
limb performance: 40.12%
 
 *** FITTING: Torso
 
-
limb performance: 79.65%
+
limb performance: 78.49%
 
 *** FITTING: Head
 
-
limb performance: 50.00%
+
limb performance: 47.67%
 
@@ -2353,44 +2353,44 @@

Step 9: Model evaluation
*** FITTING: limbs only
 

-
performance: 76.16%
+
performance: 79.65%
 
-
performance: 68.02%
+
performance: 62.21%
 
-
performance: 75.58%
+
performance: 80.81%
 
-
performance: 73.26%
+
performance: 69.77%
 
-
performance: 70.35%
+
performance: 74.42%
 
-
performance: 85.47%
-median performance: 74.42%
+
performance: 84.30%
+median performance: 77.03%
 
 *** FITTING: limbs+torso+head
 
-
performance: 81.98%
+
performance: 74.42%
 
-
performance: 81.98%
+
performance: 71.51%
 
-
performance: 86.63%
+
performance: 78.49%
 
-
performance: 79.07%
+
performance: 78.49%
 
-
performance: 79.65%
+
performance: 75.00%
 
-
performance: 86.05%
-median performance: 81.98%
+
performance: 83.14%
+median performance: 76.74%
 
diff --git a/projects/modelingsteps/ModelingSteps_10_DL.html b/projects/modelingsteps/ModelingSteps_10_DL.html index bf8f2a7fb..641012d84 100644 --- a/projects/modelingsteps/ModelingSteps_10_DL.html +++ b/projects/modelingsteps/ModelingSteps_10_DL.html @@ -58,7 +58,7 @@ const thebe_selector_output = ".output, .cell_output" - + @@ -1451,7 +1451,7 @@

Step 10: publishing the model#

-
+

Guiding principles:

    diff --git a/projects/modelingsteps/ModelingSteps_1through2_DL.html b/projects/modelingsteps/ModelingSteps_1through2_DL.html index 953b3c187..0a9a270e7 100644 --- a/projects/modelingsteps/ModelingSteps_1through2_DL.html +++ b/projects/modelingsteps/ModelingSteps_1through2_DL.html @@ -58,7 +58,7 @@ const thebe_selector_output = ".output, .cell_output" - + @@ -1515,11 +1515,11 @@

    Objectives#

    -
    +
-
+
@@ -1746,7 +1746,7 @@

Disclaimer#

-
+
@@ -1759,7 +1759,7 @@

Step 1: Finding a phenomenon and a question to ask about it#

-
+
@@ -1843,7 +1843,7 @@

Example projects step 1
-
+

@@ -1917,7 +1917,7 @@

Step 2: Understanding the state of the art & background#

-
+
@@ -2059,7 +2059,7 @@

Example projects step 2
<IPython.core.display.Markdown object>
 

-
+

Here you will do a literature review. For the projects, do not spend too much time on this. A thorough literature review could take weeks or months depending on your prior knowledge of the field…

The important thing for your project here is not to exhaustively survey the literature but rather to learn the process of modeling. 1-2 days of digging into the literature should be enough!

diff --git a/projects/modelingsteps/ModelingSteps_3through4_DL.html b/projects/modelingsteps/ModelingSteps_3through4_DL.html index be6614909..17bf57f8c 100644 --- a/projects/modelingsteps/ModelingSteps_3through4_DL.html +++ b/projects/modelingsteps/ModelingSteps_3through4_DL.html @@ -60,7 +60,7 @@ - + @@ -1464,7 +1464,7 @@

Step 3: Determining the basic ingredients#

-
+
@@ -1731,7 +1731,7 @@

Example projects step 3
<IPython.core.display.Markdown object>
 

-
+
@@ -1801,7 +1801,7 @@

Step 4: Formulating specific, mathematically defined hypotheses#

-
+
@@ -1921,7 +1921,7 @@

Example projects step 4
<IPython.core.display.Markdown object>
 

-
+
diff --git a/projects/modelingsteps/ModelingSteps_5through6_DL.html b/projects/modelingsteps/ModelingSteps_5through6_DL.html index 34dd7de13..63200085d 100644 --- a/projects/modelingsteps/ModelingSteps_5through6_DL.html +++ b/projects/modelingsteps/ModelingSteps_5through6_DL.html @@ -58,7 +58,7 @@ const thebe_selector_output = ".output, .cell_output" - + @@ -1441,7 +1441,7 @@

Step 5: Selecting the toolkit#

-
+

Once you have completed Steps 1-4 to your satisfaction, you are now ready to model. You have a specific question, a goal in mind, and precise hypotheses expressed in mathematical language. All these components will empower you to choose an appropriate modeling approach.

In selecting the right toolkit, i.e., the right approaches from mathematics, computer science, engineering, physics, etc., you should consider the following important rules:

@@ -1511,7 +1511,7 @@

Step 6: Planning / drafting the model#

-
+

Planning the model involves thinking about its general outline, its components, and how they might fit together. You want to draw a model diagram, make some sketches, and formalize the necessary equations. This step thus produces a plan of implementation; once you have that plan, actually implementing the model in computer code becomes much easier.

Your model will have:

diff --git a/projects/modelingsteps/ModelingSteps_7through9_DL.html b/projects/modelingsteps/ModelingSteps_7through9_DL.html index 5abc0e98c..5771dd0a2 100644 --- a/projects/modelingsteps/ModelingSteps_7through9_DL.html +++ b/projects/modelingsteps/ModelingSteps_7through9_DL.html @@ -58,7 +58,7 @@ const thebe_selector_output = ".output, .cell_output" - + @@ -1458,7 +1458,7 @@

Step 7: Implementing the model#

-
+

This is the step where you finally start writing code! Separately implement each box, icon, or flow relationship identified in Step 6. Test each of those model components separately! (This is called a unit test.) Unit testing ensures that each model component works as expected/planned.
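The unit-testing advice above can be sketched as follows. This is a toy Python example with a hypothetical Poisson spike-count component (`generate_spike_counts` is invented for illustration, not code from the project itself):

```python
import numpy as np

def generate_spike_counts(rate, n_trials, duration=3.0, rng=None):
    """Hypothetical model component: Poisson spike counts for one condition."""
    if rng is None:
        rng = np.random.default_rng(0)
    return rng.poisson(rate * duration, size=n_trials)

# Unit-test this component in isolation, before wiring it into the full model:
# the output shape must match, and the empirical mean must be near rate * duration.
counts = generate_spike_counts(rate=10.0, n_trials=1000)
assert counts.shape == (1000,)
assert abs(counts.mean() - 10.0 * 3.0) < 1.0
print("component works as planned")
```

Writing each component with a small, deterministic check like this makes it much easier to localize bugs later, when the components are composed into the full model.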

Guiding principles:

@@ -1525,7 +1525,7 @@

Step 8: Completing the model#

-
+

Determining when you’re done modeling is a hard question. Referring back to your original goals will be crucial. This is also where a precise question and specific hypotheses expressed in mathematical relationships come in handy.

Note: you can always keep improving your model, but at some point you need to decide that it is finished. Once you have a model that displays the properties of the system you are interested in, it should be possible to say something about your hypothesis and question. Keeping the model simple makes it easier to understand the phenomenon and answer the research question.

@@ -1575,7 +1575,7 @@

Step 9: Testing and evaluating the model#

-
+

Every model needs to be evaluated quantitatively. There are many ways to achieve this, and not every model should be evaluated in the same way. Ultimately, model testing depends on what your goals are and what you want to get out of the model, e.g., a qualitative vs. quantitative fit to data.

Guiding principles:

diff --git a/projects/modelingsteps/TrainIllusionDataProjectDL.html b/projects/modelingsteps/TrainIllusionDataProjectDL.html index 0d6766352..ae84e8a30 100644 --- a/projects/modelingsteps/TrainIllusionDataProjectDL.html +++ b/projects/modelingsteps/TrainIllusionDataProjectDL.html @@ -1772,8 +1772,8 @@

Question: \(N\) neurons and \(M\) trials for each of 3 motion conditions: no self-motion, slowly accelerating self-motion, and faster accelerating self-motion.

-
-(126)#\[\begin{align} +
+(126)#\[\begin{align} N &= 40 \\ M &= 400 \end{align}\]
@@ -1824,7 +1824,7 @@

Background -../../_images/691998358ad0b84a3aa3e260513e493dea28f2884db29c3b72d2ae13d752d828.png +../../_images/12309e1b0f5f8b2678fe9ce4f50df80f08c0a5c46dd3ef0d18aefa6382df374b.png

Blue is the no-motion condition, and produces flat average spike counts across the 3 s time interval. The orange and green lines show a bell-shaped curve that corresponds to the acceleration profile. But there also seems to be considerable noise: exactly what we need. Let’s see what the spike trains for a single trial look like:

@@ -1836,9 +1836,9 @@

Background -../../_images/86d5ecbd5807f4033da9d40d5f7cda9fe5964ddc6b6f1329e5e61217afe379ff.png -../../_images/54885e65427d84d7b01f9a435a4e143802b2c70c468ce6468848a91c2138d1ea.png -../../_images/d8aea21b3d80d298463d9b5b90da9a832b7a4cfa982949836b3ac13c8ae399b7.png +../../_images/6fbd1e5b4f33d1bd8d6047d2263785e28305ea8ee5c46874e97cb634148224a4.png +../../_images/434b95f0afef8e61b449992591137c1368f3dace3f520f30e5de25a90e62bc3d.png +../../_images/c66b47652a1108d84fc48ff6084ec8a491fc75c66614eccf023d769bb9d1889e.png

You can change the trial number in the bit of code above to compare what the raster plots look like in different trials. You’ll notice that they all look kind of the same: the 3 conditions are very hard (impossible?) to distinguish by eye-balling.
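Data of this sort could be simulated with a short inhomogeneous-Poisson sketch. This is a toy illustration only; the baseline of 10 Hz and the bell-shaped modulation are assumed values, not the project's actual generator parameters:

```python
import numpy as np

# Time axis: a 3 s interval, as in the figures above.
t = np.linspace(0, 3, 300)
dt = t[1] - t[0]

baseline = np.full_like(t, 10.0)                       # no self-motion: flat rate (Hz)
bump = 10.0 + 8.0 * np.exp(-((t - 1.5) ** 2) / 0.5)    # accelerating: bell-shaped rate

# One noisy spike train from the accelerating condition: a spike occurs in a
# small bin with probability rate * dt (valid while rate * dt << 1).
rng = np.random.default_rng(0)
spikes = rng.random(t.shape) < bump * dt
print("spike count in this trial:", spikes.sum())
```

Averaging many such trials recovers the flat and bell-shaped curves, while any single trial looks noisy — which matches what the raster plots show.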

@@ -1986,7 +1986,7 @@

Model implementation

-../../_images/657b75caf157f5e8a44e7e14cd2f691d4f1560628aa3d1d8a2e085eeec1c3356.png +../../_images/f0e8f6d99d4b7ea50d65e7ac08504675500847058d0a74bb06e228699b609634.png

We asked for 8 cross-validations, which show up as the blue dots in the graph (two have the same accuracy). Prediction accuracy ranges from 56% to 72%, with the average at 65%; the orange line is the median. Given the noisy data, that is actually not too bad.
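The cross-validation procedure described above might be sketched like this. Note this is a hypothetical stand-in for classifyMotionFromSpikes(): it uses a simple nearest-centroid classifier on toy data, whereas the project itself uses logistic regression, and all data parameters here are invented:

```python
import numpy as np

def cross_validate(X, y, n_folds=8, rng=None):
    """k-fold cross-validation of a nearest-centroid classifier (toy stand-in)."""
    if rng is None:
        rng = np.random.default_rng(0)
    idx = rng.permutation(len(y))
    accs = []
    for test_idx in np.array_split(idx, n_folds):
        train_idx = np.setdiff1d(idx, test_idx)
        # Per-class mean spike-count vector, computed on the training folds only.
        centroids = {c: X[train_idx][y[train_idx] == c].mean(axis=0)
                     for c in np.unique(y)}
        preds = np.array([min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
                          for x in X[test_idx]])
        accs.append(np.mean(preds == y[test_idx]))
    return np.array(accs)

# Toy spike-count data: 400 trials x 40 neurons, two conditions whose mean
# rates differ slightly, so the classifier should beat chance.
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 200)
lam = np.where(y[:, None] == 0, 5.0, 6.0) * np.ones((1, 40))
X = rng.poisson(lam).astype(float)

accs = cross_validate(X, y)
print("per-fold accuracy:", np.round(accs, 2), "median:", np.median(accs))
```

Plotting the per-fold accuracies as dots with a line at the median would reproduce the style of figure shown above.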

@@ -2037,7 +2037,7 @@

Model implementation

-../../_images/657b75caf157f5e8a44e7e14cd2f691d4f1560628aa3d1d8a2e085eeec1c3356.png +../../_images/f0e8f6d99d4b7ea50d65e7ac08504675500847058d0a74bb06e228699b609634.png

This is the exact same figure as before, so our function classifyMotionFromSpikes() also works as intended.

@@ -2174,7 +2174,7 @@

Model evaluation & testing -../../_images/92c8a721ca3f1972276d94a0fb7c12964f0ce72c90ab24090445dd10e9060044.png +../../_images/f04470b0ce7dd4df80a6d3eb32bc7a026ddd3df7c45c65a45cf960fa4954d5aa.png

Well, that’s interesting! The logistic regression doesn’t do a perfect job, but there is information in these results.

diff --git a/projects/modelingsteps/TrainIllusionModelingProjectDL.html b/projects/modelingsteps/TrainIllusionModelingProjectDL.html index e3143f74d..5830ed819 100644 --- a/projects/modelingsteps/TrainIllusionModelingProjectDL.html +++ b/projects/modelingsteps/TrainIllusionModelingProjectDL.html @@ -1553,8 +1553,8 @@

Selected toolkit: Drift-Diffusion Model (DDM), because it is a well-established framework that allows us to model decision making in the case of 2 alternative choices (here: self-motion vs. other train motion).

For our purposes, the simplest equation looks something like this:

-
-(127)#\[\begin{align} +
+(127)#\[\begin{align} \dot e = \frac{de}{dt}= -c \cdot e + v \, , \end{align}\]

where \(e\) is the accumulated evidence and \(v\) is our vestibular input, which already contains the noise (so we don’t need to add more noise?). \(c\) is the leakage constant, i.e., \(c=0\) means perfect integration; \(c=1\) means no integration (perfect leakage).
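A minimal Euler-integration sketch of this leaky accumulator follows. The time step, input, and parameter values here are arbitrary toy choices for illustration, not the project's settings:

```python
import numpy as np

def integrate_evidence(v, c, dt=0.001):
    """Euler integration of de/dt = -c*e + v; v is assumed to already carry the noise."""
    e = np.zeros_like(v, dtype=float)
    for t in range(1, len(v)):
        e[t] = e[t - 1] + dt * (-c * e[t - 1] + v[t])
    return e

v = np.ones(1000)                       # constant toy vestibular input
perfect = integrate_evidence(v, c=0.0)  # c = 0: perfect integration, evidence keeps ramping
leaky = integrate_evidence(v, c=1.0)    # c = 1: strong leakage, evidence saturates toward v/c
assert perfect[-1] > leaky[-1]
```

Sweeping \(c\) between these extremes shows the continuum from a pure integrator to a heavily leaky one, which is exactly the knob the model tunes.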

@@ -1630,7 +1630,7 @@

1. Vestibular signal generator
Text(0, 0.5, 'vestibular signal (a.u.)')
 

-../../_images/f6ab9f77e2837ddf28ee5f76c8cb41df931cd5789762da1f6effcf890edabc16.png +../../_images/6b176037749651980a9a02a512bd69e994655b36cab66cb2d1a25f8d9af0c528.png

@@ -1671,7 +1671,7 @@

2. Integrator (DDM mechanism) -../../_images/3a10e03b56d5ecdbbd326f9f6a3bfe98d118669c377b69497bff4dde15d0c0cf.png +../../_images/ee0219c6b1af7da294c7b8282f6483798df6b41b8798f2663e1a8b3c19d3e74f.png

@@ -1821,7 +1821,7 @@

Model evaluation & testing -../../_images/44f2df9fa9aecdb09d84a11587e25f6add12dd7d32d158eb0853acd7ab29f906.png +../../_images/7acdaa9daf0507c86354349586991516c97f08178468e1d51655fc8892312fe9.png

There seems to be some parameter redundancy, i.e., we could choose different parameter combinations to make the model do something sensible…

@@ -1842,7 +1842,7 @@

Model evaluation & testing -

Our hypothesis of a linear increase of illusion strength with noise only holds true in a limited range of noise… It’s monotonic but saturating, of course…

diff --git a/searchindex.js b/searchindex.js index 87bccbcf6..ca5df6f75 100644 --- a/searchindex.js +++ b/searchindex.js @@ -1 +1 @@
0, "free": [0, 3, 11, 12, 28, 31, 36, 40, 43, 57, 60, 61, 62, 64, 65, 67, 73, 80, 82, 84, 88, 97, 100, 101], "edx": 0, "research": [0, 5, 15, 19, 21, 23, 25, 27, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 67, 74, 76, 81, 84, 88, 97], "For": [0, 2, 5, 7, 8, 12, 15, 16, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 101, 102], "depth": [0, 62, 64, 65, 70, 80, 82, 84, 100], "intro": [0, 5, 31, 35, 40, 46], "see": [0, 2, 3, 4, 5, 6, 8, 10, 11, 12, 13, 16, 17, 20, 21, 22, 23, 27, 28, 29, 31, 33, 35, 36, 37, 39, 40, 43, 46, 48, 51, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "final": [0, 3, 11, 17, 23, 27, 28, 33, 36, 38, 43, 54, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100], "can": [0, 2, 3, 4, 5, 7, 8, 11, 12, 15, 16, 17, 19, 20, 21, 23, 25, 26, 27, 28, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 51, 52, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "data": [0, 4, 12, 19, 25, 27, 31, 34, 36, 37, 38, 40, 43, 60, 61, 62, 64, 74, 80, 82, 87, 88, 89, 101, 102], "scienc": [0, 15, 31, 35, 37, 39, 57, 88, 97, 101], "handbook": 0, "which": [0, 2, 3, 5, 7, 11, 12, 16, 17, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 39, 40, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "also": [0, 3, 5, 7, 8, 11, 12, 16, 17, 19, 21, 23, 26, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 51, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 88, 89, 91, 94, 100, 101], "print": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 34, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "edit": [0, 31, 46, 53, 62, 77, 80, 82, 85], "matlab": 0, "quickli": [0, 16, 27, 31, 33, 43, 57, 61, 74, 85, 88, 
91, 101], "get": [0, 2, 5, 7, 8, 11, 12, 15, 17, 18, 19, 20, 21, 23, 25, 27, 28, 33, 35, 36, 37, 38, 39, 40, 43, 52, 54, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "up": [0, 2, 11, 12, 15, 17, 20, 21, 25, 28, 31, 33, 35, 36, 38, 39, 40, 43, 57, 62, 64, 65, 67, 69, 70, 73, 77, 81, 82, 84, 87, 88, 94, 100, 101, 102], "speed": [0, 11, 27, 28, 39, 57, 61, 64, 65, 73, 84, 87, 100], "cheatsheet": 0, "mai": [0, 2, 11, 12, 15, 17, 23, 27, 28, 31, 33, 34, 39, 40, 43, 51, 52, 54, 57, 60, 61, 62, 64, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "paperback": 0, "neural": [0, 2, 7, 11, 15, 18, 19, 23, 26, 27, 31, 35, 39, 45, 67, 69, 70, 73, 76, 80, 84, 88, 91, 94, 101, 102], "both": [0, 11, 16, 17, 21, 25, 31, 33, 35, 36, 39, 40, 43, 46, 57, 65, 67, 70, 73, 74, 76, 80, 81, 84, 85, 87, 91, 94, 100, 101], "version": [0, 3, 5, 6, 7, 13, 16, 17, 21, 22, 25, 27, 29, 31, 33, 36, 53, 60, 73, 76, 77, 81, 82, 85, 94, 101], "reli": [0, 25, 31, 33, 34, 39, 40, 62, 67], "linear": [0, 2, 3, 5, 7, 8, 11, 12, 16, 18, 33, 36, 39, 40, 46, 57, 60, 64, 65, 67, 69, 70, 73, 74, 76, 81, 82, 84, 87, 94, 100, 102], "algebra": [0, 57, 62], "probabl": [0, 11, 16, 17, 27, 33, 36, 38, 39, 40, 43, 48, 57, 61, 62, 64, 65, 69, 70, 73, 74, 76, 80, 81, 85, 87, 88, 91, 94, 97, 100, 101, 102], "statist": [0, 12, 18, 38, 64, 65, 67, 73, 84], "calculu": [0, 60], "deriv": [0, 38, 57, 60, 61, 62, 64, 65, 67, 70, 73, 81, 97], "od": [0, 81, 82], "highli": [0, 17, 23, 27, 34, 57, 67, 73, 82], "our": [0, 3, 11, 12, 16, 17, 20, 21, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97, 100, 101], "refresh": [0, 67], "w0d3": 0, "w0d4": 0, "w0d5": 0, "ask": [0, 16, 21, 23, 25, 31, 33, 39, 40, 60, 61, 62, 74, 77, 85, 88, 91, 94, 101], "question": [0, 10, 17, 19, 23, 25, 27, 34, 36, 37, 38, 65, 67, 74, 76, 77, 84, 88, 89, 91, 94, 101], "discord": [0, 
31, 46], "grasp": 0, "along": [0, 19, 27, 31, 33, 57, 60, 64, 65, 69, 73, 74, 80, 85, 88], "crucial": [0, 5, 20, 31, 36, 38, 69, 97], "almost": [0, 33, 35, 62, 69, 84, 88], "anyth": [0, 33, 36, 38, 39, 51, 53, 64, 73, 88, 91], "quantit": [0, 35, 38], "involv": [0, 27, 31, 33, 37, 38, 57, 73, 74, 76, 81, 85, 97], "than": [0, 3, 5, 11, 12, 16, 17, 20, 21, 27, 28, 31, 33, 35, 36, 37, 39, 52, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101], "one": [0, 2, 3, 4, 5, 7, 8, 10, 11, 12, 16, 17, 21, 23, 25, 26, 27, 31, 33, 34, 35, 36, 37, 39, 40, 43, 53, 54, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "number": [0, 2, 3, 12, 16, 17, 20, 21, 25, 26, 27, 28, 33, 35, 36, 39, 54, 60, 61, 62, 64, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 91, 97, 100, 102], "vector": [0, 2, 11, 12, 27, 39, 40, 57, 61, 64, 65, 67, 69, 74, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97], "matrix": [0, 12, 15, 17, 20, 33, 35, 39, 57, 61, 62, 67, 69, 73, 77, 80, 81, 84, 85, 89, 94, 97], "addit": [0, 11, 17, 31, 33, 34, 35, 43, 46, 57, 61, 62, 74, 80, 81, 82, 84, 85, 87, 94, 101], "multipl": [0, 2, 3, 5, 8, 17, 26, 27, 31, 57, 60, 61, 62, 64, 65, 81, 84, 85, 87, 91, 97, 100], "rank": [0, 62, 74, 100], "base": [0, 5, 11, 17, 21, 23, 25, 31, 33, 35, 36, 37, 39, 43, 57, 60, 62, 67, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 94, 97, 101], "determin": [0, 31, 35, 37, 38, 39, 40, 57, 64, 69, 73, 74, 85, 94, 97, 100, 102], "invers": [0, 67, 74, 80], "eigenvalu": [0, 62], "decomposit": [0, 77], "In": [0, 2, 3, 4, 5, 7, 8, 11, 12, 15, 16, 17, 20, 23, 25, 27, 28, 31, 33, 34, 35, 36, 37, 39, 40, 43, 46, 51, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "beauti": [0, 11, 76, 82], "seri": [0, 21, 31, 33, 35, 40, 44, 45, 46, 74, 76, 88, 89, 91, 94, 101], "anoth": [0, 3, 7, 8, 10, 11, 12, 17, 20, 25, 26, 27, 30, 31, 35, 39, 40, 53, 54, 57, 62, 
64, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100], "resourc": [0, 3, 7, 27, 33, 37, 43, 77], "khan": 0, "exercis": [0, 2, 8, 34, 74, 85, 89], "understand": [0, 5, 11, 16, 19, 25, 27, 31, 33, 34, 36, 38, 39, 57, 61, 62, 67, 76, 77, 82, 84, 88, 94, 97, 100, 101, 102], "import": [0, 2, 3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 31, 33, 35, 36, 37, 38, 39, 40, 44, 45, 51, 54], "comfort": [0, 76], "varianc": [0, 18, 62, 65, 67, 74, 80, 81, 82, 87, 91], "normal": [0, 2, 3, 4, 5, 7, 8, 11, 18, 21, 28, 31, 33, 35, 36, 39, 40, 43, 46, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 100], "distribut": [0, 12, 15, 25, 28, 35, 39, 57, 61, 62, 64, 65, 67, 74, 80, 81, 82, 84, 94, 100, 101, 102], "select": [0, 2, 5, 7, 12, 15, 16, 21, 25, 27, 31, 36, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "read": [0, 2, 3, 5, 7, 8, 12, 15, 21, 27, 31, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "i": [0, 2, 3, 5, 11, 12, 15, 16, 17, 18, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 87, 88, 97, 100, 101, 102], "e": [0, 3, 4, 5, 11, 15, 16, 21, 25, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 52, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 100, 101, 102], "chapter": [0, 31], "6": [0, 2, 3, 5, 7, 8, 11, 12, 16, 18, 20, 21, 28, 31, 34, 36, 38, 39, 40, 46, 64, 65, 69, 80, 81, 82, 85, 102], "7": [0, 2, 3, 5, 7, 8, 12, 18, 19, 21, 25, 27, 28, 31, 34, 35, 39, 40, 46, 60, 64, 65, 69, 70, 80, 81, 82, 85, 87, 102], "russ": 0, "poldrack": 0, "s": [0, 2, 3, 7, 11, 15, 16, 17, 19, 20, 21, 25, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 53, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "book": [0, 11, 25, 31, 64, 73, 76, 97], "think": [0, 
5, 11, 12, 20, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 61, 62, 100, 102], "21st": 0, "centuri": 0, "what": [0, 3, 5, 11, 16, 17, 19, 21, 23, 25, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 57, 60, 61, 62, 65, 67, 69, 70, 74, 77, 80, 82, 84, 85, 88, 89, 100, 101], "integr": [0, 36, 39, 52, 81, 82, 101], "differenti": [0, 61, 67, 73, 81, 84, 101], "equat": [0, 37, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 89, 91, 100, 101, 102], "memori": [0, 17, 21, 27, 57, 64, 73, 76, 77, 84, 85, 88], "gilbert": [0, 62], "strang": [0, 62, 76], "studi": [0, 16, 31, 33, 35, 39, 60, 62, 64, 73, 84, 97], "0": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 36, 39, 40, 43, 46, 57, 61, 65, 67, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 102], "includ": [0, 8, 12, 15, 19, 23, 25, 26, 27, 28, 31, 33, 36, 37, 39, 40, 43, 57, 62, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 88, 94, 97, 100, 101], "jiri": 0, "lebl": 0, "engin": [0, 21, 25, 27, 33, 36, 37, 43, 57, 60, 67, 74, 76, 88, 91, 101], "The": [0, 2, 3, 5, 7, 8, 12, 15, 16, 17, 19, 20, 21, 23, 25, 26, 27, 31, 33, 35, 36, 39, 40, 43, 45, 54, 60, 62, 67, 69, 70, 74, 77, 82, 84, 85, 87, 89, 91, 97, 100, 102], "team": [0, 23, 27, 31, 57, 61], "By": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "creator": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "jama": [2, 8], "hussein": [2, 8], "mohamud": [2, 8], "alex": [2, 8, 15, 73], "hernandez": [2, 8], "garcia": [2, 8], "product": [2, 3, 5, 8, 11, 12, 18, 25, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "editor": 
[2, 3, 5, 8, 11, 12, 15, 17, 18, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "spiro": [2, 3, 5, 8, 11, 12, 15, 17, 18, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "chavli": [2, 3, 5, 8, 11, 12, 15, 17, 18, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "saeed": [2, 8, 60, 61, 62, 64, 65, 69, 70, 73, 80], "salehi": [2, 8, 60, 61, 62, 64, 65, 69, 70, 73, 80], "refer": [2, 7, 8, 12, 16, 19, 23, 25, 28, 31, 33, 34, 38, 39, 40, 43, 46, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "synthet": [2, 69], "increas": [2, 12, 17, 19, 21, 27, 31, 39, 40, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 94, 97], "amount": [2, 3, 16, 17, 21, 28, 33, 36, 38, 57, 62, 76, 77, 80, 81, 91], "transform": [2, 5, 7, 8, 11, 12, 16, 18, 21, 23, 28, 31, 33, 36, 39, 43, 46, 65, 67, 69, 70, 73, 76, 80, 81, 82, 87, 88, 89, 100, 101], "exist": [2, 3, 5, 7, 8, 15, 21, 27, 28, 31, 40, 43, 57, 62, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 89, 91, 94, 100, 101, 102], "been": [2, 4, 5, 8, 16, 21, 27, 28, 31, 33, 34, 35, 38, 39, 40, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "shown": [2, 4, 16, 25, 28, 51, 60, 61, 62, 69, 70, 73, 81, 94, 97], "veri": [2, 5, 8, 11, 12, 16, 19, 20, 21, 27, 28, 31, 33, 34, 35, 36, 39, 40, 43, 57, 60, 61, 62, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100], "techniqu": [2, 15, 18, 21, 27, 31, 60, 64, 73, 76, 80, 84, 87, 88], "especi": [2, 27, 28, 31, 38, 39, 54, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 94, 100, 101, 102], "comput": [2, 3, 8, 12, 16, 17, 21, 23, 25, 26, 27, 28, 
30, 31, 33, 34, 35, 36, 37, 39, 40, 43, 54, 57, 61, 64, 65, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "vision": [2, 7, 23, 30, 31, 35, 39, 40, 57, 76, 84, 91, 101], "applic": [2, 26, 27, 31, 39, 67, 70, 77, 80, 82, 87, 88, 94, 100, 101], "howev": [2, 4, 5, 20, 26, 27, 28, 31, 33, 39, 40, 57, 64, 67, 69, 70, 73, 77, 81, 82, 84, 85, 87, 88, 89, 94, 97, 101], "wai": [2, 3, 5, 10, 11, 12, 16, 17, 28, 31, 33, 35, 37, 38, 39, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 87, 88, 89, 91, 94, 97, 100, 101], "perform": [2, 7, 8, 10, 12, 16, 18, 19, 21, 23, 25, 28, 31, 33, 34, 35, 36, 37, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 87, 88, 89, 97, 100, 101, 102], "yet": [2, 11, 12, 16, 28, 30, 31, 33, 35, 40, 43, 57, 62, 97], "understood": [2, 8, 64, 81, 101], "effect": [2, 11, 16, 17, 27, 34, 57, 62, 67, 70, 80, 85, 88, 91, 97, 102], "why": [2, 12, 19, 31, 35, 36, 38, 39, 60, 62, 64, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 101], "how": [2, 3, 5, 7, 8, 10, 11, 12, 15, 17, 19, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 54, 60, 61, 62, 64, 65, 69, 70, 77, 80, 81, 82, 85, 87, 88, 89, 91, 97, 100, 101, 102], "interact": [2, 16, 26, 28, 33, 36, 43, 51, 53, 54, 60, 65, 97, 101], "other": [2, 3, 4, 5, 7, 8, 11, 16, 17, 21, 27, 31, 33, 35, 36, 38, 39, 40, 43, 46, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 85, 87, 88, 89, 91, 94, 100, 101, 102], "fact": [2, 25, 31, 35, 39, 60, 61, 64, 73, 80, 81, 101], "common": [2, 5, 7, 11, 12, 21, 23, 26, 27, 33, 35, 43, 57, 61, 62, 69, 70, 73, 74, 76, 80, 88, 89, 91], "differ": [2, 3, 4, 5, 7, 10, 12, 15, 16, 17, 18, 19, 21, 23, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 65, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101], "scheme": [2, 57, 65, 67, 85], "paper": [2, 4, 5, 8, 12, 17, 19, 26, 27, 28, 31, 33, 35, 37, 40, 57, 64, 70, 73, 74, 76, 81, 84, 91, 101, 102], 
"perceptu": [2, 7, 35, 40, 64], "possibl": [2, 7, 8, 10, 11, 12, 16, 21, 27, 33, 35, 36, 37, 38, 39, 40, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "relat": [2, 12, 17, 19, 20, 21, 31, 33, 34, 36, 37, 38, 57, 61, 65, 67, 74, 80, 81, 82, 84, 85, 87, 88, 91, 94, 101], "human": [2, 3, 15, 16, 19, 27, 33, 35, 36, 43, 62, 91, 97, 101], "percept": [2, 15, 35, 39, 40, 101], "simpl": [2, 3, 5, 11, 12, 17, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 60, 62, 64, 65, 67, 80, 82, 85, 94, 100], "artifici": [2, 15, 39, 40, 57, 62, 69, 74, 97, 101], "even": [2, 8, 11, 12, 16, 23, 28, 31, 33, 35, 36, 40, 57, 60, 61, 62, 65, 67, 70, 76, 77, 81, 87, 88, 91, 94, 100, 101], "label": [2, 3, 5, 7, 8, 12, 15, 16, 17, 20, 21, 28, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 70, 73, 77, 80, 81, 82, 84, 85, 87, 88, 91, 100, 101], "among": [2, 69, 70, 84, 88], "mani": [2, 5, 8, 12, 17, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 88, 91, 94, 97, 101], "notebook": [2, 3, 4, 5, 8, 11, 12, 16, 17, 20, 21, 25, 27, 28, 31, 33, 35, 43, 51, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "show": [2, 3, 4, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 34, 35, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101], "deep": [2, 5, 8, 18, 19, 21, 23, 27, 28, 31, 35, 36, 37, 43, 46, 52, 54, 60, 61, 67, 69, 70, 73, 80, 88, 100, 103], "network": [2, 8, 10, 11, 18, 19, 20, 21, 23, 25, 26, 27, 31, 33, 35, 45, 60, 69, 73, 74, 80, 84, 87, 88, 91, 101], "analys": [2, 31, 38, 60, 84], "result": [2, 3, 10, 15, 17, 23, 25, 27, 28, 34, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "titl": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 43, 57, 
59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "pip": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 40, 43, 54, 57, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 100, 102], "panda": [2, 8, 12, 25, 40, 57, 76, 81, 84, 85], "quiet": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "os": [2, 3, 5, 7, 8, 12, 15, 16, 17, 21, 27, 28, 43, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89, 94, 100, 102], "csv": [2, 8, 12], "multiprocess": [2, 8], "np": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "pd": [2, 8, 12, 25, 40, 57, 81, 84, 85], "pyplot": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 89, 94, 97], "plt": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97], "torch": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 27, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "nn": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 27, 33, 57, 62, 64, 65, 67, 69, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "f": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "backend": [2, 8, 16, 28, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "cudnn": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], 
"autograd": [2, 8, 57, 67, 84, 85], "torchvis": [2, 3, 5, 7, 8, 16, 18, 21, 43, 57, 65, 67, 69, 70, 73, 76, 77, 80, 82, 94], "execut": [2, 3, 8, 23, 25, 27, 28, 43, 51, 57, 59, 72, 74, 79, 91, 93, 96, 97, 99, 101], "set_se": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "markdown": [2, 3, 8, 16, 17, 21, 28, 33, 35, 36, 44, 45, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "dl": [2, 3, 8, 15, 19, 25, 33, 35, 36, 37, 39, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "its": [2, 3, 8, 11, 12, 15, 17, 21, 25, 28, 36, 37, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "critic": [2, 8, 11, 28, 34, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "so": [2, 3, 5, 8, 11, 12, 16, 17, 20, 26, 27, 28, 31, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "baselin": [2, 8, 26, 27, 35, 39, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "compar": [2, 4, 5, 7, 8, 10, 16, 17, 18, 19, 25, 27, 33, 35, 39, 57, 60, 61, 62, 64, 65, 69, 70, 73, 74, 77, 80, 82, 84, 85, 87, 88, 89, 102], "http": [2, 3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 21, 27, 28, 30, 31, 33, 36, 39, 43, 44, 45, 46, 49, 52, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "pytorch": [2, 7, 11, 16, 21, 30, 31, 33, 36, 46, 61, 62, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "org": [2, 5, 7, 8, 11, 17, 21, 27, 30, 33, 43, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "doc": [2, 5, 8, 12, 27, 31, 46, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 
88, 89, 94, 100, 102], "stabl": [2, 8, 27, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 88, 89, 94, 100, 102], "html": [2, 5, 8, 15, 21, 25, 27, 28, 31, 33, 39, 43, 46, 49, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "call": [2, 3, 7, 8, 12, 17, 27, 28, 31, 33, 35, 36, 38, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "ensur": [2, 8, 31, 35, 37, 38, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "reproduc": [2, 8, 25, 34, 35, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "def": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "none": [2, 3, 5, 8, 11, 12, 16, 17, 18, 21, 25, 27, 28, 39, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "seed_torch": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "true": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 21, 25, 27, 28, 31, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "choic": [2, 5, 8, 11, 15, 16, 21, 25, 27, 33, 36, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "2": [2, 3, 5, 7, 8, 10, 11, 16, 17, 18, 19, 20, 21, 23, 25, 31, 34, 36, 39, 48, 54, 89], "32": [2, 3, 5, 7, 8, 11, 12, 16, 17, 21, 25, 27, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "manual_se": [2, 3, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "cuda": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 
21, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "manual_seed_al": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "benchmark": [2, 8, 16, 38, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "fals": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 20, 21, 25, 27, 28, 33, 34, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "determinist": [2, 8, 16, 27, 28, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97, 100, 102], "case": [2, 8, 12, 16, 27, 28, 31, 33, 34, 35, 36, 39, 40, 43, 54, 57, 60, 61, 62, 64, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 89, 91, 94, 97, 100, 102], "dataload": [2, 3, 7, 12, 21, 31, 33, 60, 61, 62, 64, 67, 69, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "seed_work": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "worker_id": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "worker_se": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "initial_se": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "inform": [2, 4, 5, 7, 8, 12, 17, 21, 23, 25, 28, 31, 33, 34, 35, 36, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 102], "user": [2, 8, 12, 15, 16, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "set_devic": [2, 5, 7, 8, 12, 16, 57], "is_avail": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "els": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 36, 37, 39, 57, 60, 61, 62, 
64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "warn": [2, 7, 8, 12, 16, 28, 31, 35, 43, 61, 62, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "best": [2, 7, 8, 12, 16, 17, 18, 27, 31, 35, 37, 39, 40, 43, 61, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "menu": [2, 5, 7, 8, 12, 16, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "under": [2, 5, 7, 8, 12, 16, 40, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 97, 100, 102], "runtim": [2, 5, 7, 8, 12, 16, 20, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "chang": [2, 3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 20, 21, 27, 28, 31, 33, 35, 36, 38, 39, 43, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "type": [2, 3, 5, 7, 8, 12, 15, 16, 17, 20, 21, 23, 27, 28, 31, 33, 36, 37, 39, 43, 57, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 102], "enabl": [2, 3, 5, 7, 8, 12, 16, 20, 25, 27, 43, 54, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 88, 89, 100, 101, 102], "return": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "2021": [2, 8, 15, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94], "reduc": [2, 12, 18, 21, 27, 28, 31, 57, 60, 67, 69, 70, 73, 76, 77, 80, 82], "epoch": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 21, 31, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 100, 102], "end_epoch": 2, "valu": [2, 5, 8, 11, 12, 15, 16, 17, 20, 21, 27, 28, 31, 33, 36, 37, 39, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 85, 87], "wa": [2, 3, 7, 12, 17, 20, 
21, 25, 27, 28, 31, 33, 34, 36, 38, 39, 40, 57, 62, 64, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 91, 97, 102], "200": [2, 8, 20, 27, 28, 33, 36, 39, 40, 43, 62, 67, 69, 70, 73, 76, 77, 81, 87, 100, 102], "pleas": [2, 5, 8, 11, 15, 23, 28, 31, 33, 46, 49, 50, 51, 52, 54, 57, 60, 61, 62, 65, 67, 70, 74, 76, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "back": [2, 11, 12, 23, 28, 31, 34, 36, 38, 39, 40, 43, 57, 73, 76, 77, 80, 81, 82, 84, 88, 97], "code": [2, 5, 7, 8, 12, 15, 16, 17, 21, 23, 26, 27, 28, 31, 33, 34, 35, 37, 38, 39, 43, 51, 52, 53, 54, 74, 85, 89, 91, 101], "hyper": [2, 8, 60, 70], "paramet": [2, 3, 5, 7, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 57, 60, 61, 62, 64, 69, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "use_cuda": [2, 8, 94], "alpha": [2, 5, 18, 57, 61, 62, 65, 67, 73, 74, 77, 80, 81, 85, 87, 94, 97, 100, 102], "best_acc": [2, 8, 69, 70], "accuraci": [2, 3, 7, 8, 12, 21, 25, 33, 35, 39, 64, 65, 67, 69, 70, 73, 85, 87, 94], "start_epoch": [2, 8], "last": [2, 12, 17, 20, 21, 25, 27, 31, 36, 43, 46, 53, 54, 57, 60, 62, 64, 65, 67, 69, 70, 74, 76, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 102], "checkpoint": [2, 3, 8, 17, 21, 25, 82, 85, 94, 100, 102], "batch_siz": [2, 3, 7, 8, 11, 12, 16, 17, 18, 21, 27, 33, 57, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 87, 94, 100, 102], "128": [2, 3, 5, 7, 8, 16, 17, 18, 64, 65, 69, 70, 73, 76, 80, 82, 84, 87], "end_apoch": 2, "15": [2, 3, 5, 7, 8, 12, 17, 19, 20, 23, 28, 33, 46, 61, 62, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 91, 94, 97, 100, 101, 102], "base_learning_r": [2, 8], "n_hole": 2, "hole": [2, 17], "cut": [2, 12, 80], "out": [2, 7, 8, 11, 12, 15, 16, 17, 20, 21, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 46, 49, 52, 57, 60, 62, 64, 65, 67, 70, 73, 74, 80, 84, 85, 87, 88, 91, 94, 100, 101, 102], "length": [2, 5, 7, 12, 17, 20, 21, 26, 27, 39, 57, 60, 73, 80, 84, 85, 87, 88, 94, 100, 102], "16": [2, 3, 5, 7, 8, 17, 21, 28, 
31, 33, 46, 61, 62, 64, 65, 67, 70, 73, 76, 77, 80, 81, 82, 85, 87, 88, 94, 97, 100, 102], "torchvision_transform": [2, 8], "randomli": [2, 3, 5, 20, 25, 67, 69, 70, 76, 80, 81, 82, 85, 94, 97, 100, 102], "mask": [2, 11, 15, 17, 27, 67, 76, 88, 100, 102], "patch": [2, 5, 21, 84], "github": [2, 3, 7, 15, 17, 21, 27, 28, 33, 34, 49, 57, 73, 76, 77, 87, 88, 89, 94, 100, 101, 102], "com": [2, 3, 5, 7, 10, 12, 15, 16, 17, 21, 27, 28, 30, 33, 34, 43, 44, 45, 52, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "uoguelph": 2, "mlrg": 2, "arg": [2, 16, 25, 28, 33, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "int": [2, 3, 5, 8, 12, 16, 17, 18, 21, 25, 28, 33, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 85, 87, 88, 94, 97, 100, 102], "each": [2, 3, 5, 6, 8, 11, 12, 13, 15, 16, 17, 18, 19, 20, 21, 22, 23, 25, 26, 27, 28, 29, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 48, 51, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "pixel": [2, 4, 5, 8, 17, 21, 67, 69, 70, 73, 74, 80, 91], "squar": [2, 28, 57, 60, 61, 62, 67, 69, 70, 76, 80, 81, 82, 84, 94, 97, 100, 102], "__init__": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "self": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "__call__": [2, 57, 85, 88], "img": [2, 3, 5, 7, 15, 17, 18, 21, 43, 64, 65, 69, 70, 73, 77], "tensor": [2, 7, 8, 11, 12, 16, 21, 33, 43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "size": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 40, 43, 57, 60, 61, 62, 64, 65, 69, 70, 74, 
76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "c": [2, 3, 5, 11, 12, 21, 25, 33, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 88, 91, 94, 97], "h": [2, 12, 17, 21, 57, 60, 62, 73, 76, 80, 82, 84, 85], "w": [2, 3, 4, 5, 8, 11, 12, 16, 21, 43, 57, 60, 61, 62, 65, 67, 70, 73, 74, 76, 80, 81, 82, 85, 89], "dimens": [2, 3, 12, 17, 28, 33, 35, 39, 43, 57, 62, 67, 69, 70, 73, 77, 80, 82, 84, 87, 89, 100, 102], "x": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101, 102], "ones": [2, 3, 12, 17, 20, 28, 31, 33, 35, 39, 57, 60, 65, 73, 76, 80, 81, 82, 84, 85, 88, 94, 97, 100], "float32": [2, 3, 17, 21, 25, 27, 28, 33, 57, 80], "n": [2, 3, 5, 7, 11, 12, 16, 17, 21, 27, 28, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "y": [2, 3, 7, 8, 11, 12, 17, 18, 20, 21, 25, 27, 28, 35, 36, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 80, 81, 82, 85, 87, 88, 91, 94, 100, 101, 102], "randint": [2, 5, 17, 25, 57, 62, 80, 85, 100, 102], "y1": [2, 21], "clip": [2, 12, 15, 27, 80, 82, 101], "y2": 2, "x1": [2, 20, 21, 57, 64, 94], "x2": [2, 21, 64, 94], "from_numpi": [2, 5, 12, 17, 20, 21, 57, 73, 76, 80], "expand_a": [2, 76], "combin": [2, 3, 17, 21, 25, 33, 36, 39, 40, 57, 60, 64, 65, 67, 70, 73, 76, 77, 80, 84, 85, 88, 89, 91, 94, 101, 102], "pair": [2, 3, 10, 11, 17, 67, 74, 77, 80, 84, 85, 87, 88, 89, 91, 100, 101, 102], "via": [2, 5, 21, 23, 31, 43, 60, 67, 70, 73, 74, 76, 81, 82, 85, 97, 100], "convex": 2, "given": [2, 5, 11, 15, 16, 17, 25, 27, 31, 33, 36, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97, 100, 101, 102], "x_i": [2, 18, 64, 74, 81, 89], "x_j": [2, 18, 65], "y_i": [2, 89], "y_j": 2, "respect": [2, 5, 15, 27, 28, 33, 36, 38, 51, 57, 60, 61, 62, 64, 67, 69, 73, 76, 80, 85, 87, 94], 
"lambda": [2, 3, 11, 12, 21, 67, 69, 70, 74, 81, 82, 84, 85, 87, 88, 100, 102], "creat": [2, 5, 7, 11, 12, 15, 21, 23, 25, 27, 31, 33, 34, 35, 39, 40, 54, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91], "new": [2, 8, 10, 11, 12, 17, 21, 25, 26, 27, 28, 31, 33, 38, 43, 57, 62, 64, 67, 69, 70, 73, 77, 80, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101, 102], "hat": [2, 76], "begin": [2, 17, 28, 31, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 100, 102], "align": [2, 11, 23, 36, 39, 40, 43, 60, 62, 64, 65, 67, 73, 74, 81, 84, 85, 89], "end": [2, 3, 5, 11, 12, 17, 21, 23, 27, 28, 31, 33, 36, 39, 40, 43, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "check": [2, 3, 11, 12, 15, 16, 17, 23, 25, 28, 31, 33, 35, 36, 38, 39, 43, 46, 49, 57, 62, 64, 65, 67, 70, 73, 74, 77, 82, 88, 91, 94, 97, 100, 101, 102], "origin": [2, 5, 8, 11, 12, 16, 17, 27, 28, 33, 35, 36, 38, 40, 57, 60, 62, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 89, 91, 94, 100, 102], "repositori": [2, 17, 30, 31, 34, 43, 57, 100, 102], "mixup_data": 2, "mix": [2, 35, 39, 74, 76], "input": [2, 4, 5, 8, 11, 12, 17, 18, 20, 21, 27, 33, 35, 36, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "target": [2, 7, 11, 12, 16, 18, 21, 27, 28, 34, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 84, 85, 89, 100, 102], "hongyi": 2, "zhang": [2, 8, 73], "lam": [2, 20], "beta": [2, 15, 57, 67], "index": [2, 5, 12, 18, 21, 30, 33, 43, 62, 64, 65, 69, 70, 73, 74, 84, 85, 94], "randperm": [2, 64, 65], "mixed_x": 2, "y_a": [2, 80], "y_b": [2, 80], "small": [2, 8, 16, 17, 31, 33, 35, 39, 43, 57, 60, 62, 64, 65, 67, 69, 73, 74, 76, 77, 82, 84, 87, 88, 91, 94, 97, 100, 101], "tweak": [2, 8, 67], "ani": [2, 5, 8, 15, 21, 25, 31, 33, 35, 36, 37, 38, 39, 40, 54, 57, 60, 61, 62, 64, 67, 70, 73, 77, 80, 81, 82, 84, 85, 87, 88, 91, 
94, 97], "interest": [2, 3, 8, 11, 12, 15, 16, 23, 26, 27, 28, 31, 33, 35, 36, 38, 39, 40, 60, 62, 69, 70, 76, 80, 81, 82, 85, 94, 97, 100, 101], "download": [2, 5, 7, 8, 10, 11, 15, 31, 33, 36, 39, 43, 57, 69, 70, 87, 88, 89, 94], "prepar": [2, 8, 23, 28, 31, 57, 64, 73, 84, 87, 88, 100, 102], "percentagesplit": [2, 8], "full_dataset": [2, 3, 8], "percent": [2, 5, 8, 12, 76], "set1_siz": [2, 8], "len": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 21, 25, 27, 28, 33, 35, 39, 40, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 102], "set2_siz": [2, 8], "final_dataset": [2, 8], "_": [2, 3, 7, 8, 11, 15, 16, 17, 18, 21, 27, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 94, 100, 102], "util": [2, 3, 7, 8, 11, 12, 16, 17, 18, 21, 25, 27, 28, 33, 36, 57, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 88, 89, 100, 101, 102], "random_split": [2, 3, 8, 21, 67, 69, 70, 73, 87], "cifar100": [2, 8], "5071": [2, 8], "4866": [2, 8], "4409": [2, 8, 12], "std": [2, 5, 8, 35, 39, 43, 62, 65, 67, 69, 70, 76, 81, 82], "2673": [2, 8], "2564": [2, 8], "2762": [2, 8], "cifar10": [2, 8, 80], "4914": [2, 8], "4822": [2, 8], "4465": [2, 8], "2023": [2, 8, 81, 82, 88, 100, 102], "1994": [2, 8], "2010": [2, 8, 65], "transform_train": [2, 8], "compos": [2, 5, 7, 8, 16, 18, 43, 57, 65, 67, 69, 70, 73, 76, 77, 84, 85, 88, 101], "append": [2, 3, 5, 7, 8, 11, 12, 16, 17, 25, 28, 33, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 89, 94, 97, 100, 102], "randomcrop": [2, 8], "pad": [2, 3, 7, 8, 11, 12, 16, 17, 21, 25, 27, 57, 60, 61, 62, 76, 80, 82, 84, 85, 88, 100, 102], "4": [2, 3, 5, 7, 8, 11, 16, 17, 18, 20, 21, 23, 25, 31, 34, 37, 38, 39, 40, 46, 48, 54, 81, 89, 102], "randomhorizontalflip": [2, 8, 18, 65, 70], "totensor": [2, 5, 7, 8, 16, 18, 43, 57, 65, 67, 69, 70, 73, 76, 77, 80, 82], "transform_test": [2, 8], "trainset": [2, 8], "root": [2, 3, 5, 8, 16, 28, 43, 57, 67, 69, 70, 73, 84], "testset": 
[2, 8], "www": [2, 3, 5, 8, 15, 17, 21, 27, 30, 52, 57, 73, 100], "cs": [2, 8, 12, 57], "toronto": [2, 8, 57], "edu": [2, 8, 12, 30, 57], "kriz": [2, 8, 57], "tar": [2, 5, 8, 21, 57, 67, 73, 80, 84, 85, 100, 102], "gz": [2, 5, 8, 21, 57, 67, 73, 80, 82, 84, 85, 87, 89], "extract": [2, 5, 7, 8, 18, 21, 35, 36, 39, 57, 67, 73, 76, 82, 84, 85, 87, 89, 91, 97, 100], "file": [2, 3, 5, 7, 8, 11, 15, 16, 21, 43, 51, 65, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "alreadi": [2, 3, 7, 8, 10, 12, 15, 21, 28, 31, 33, 35, 36, 40, 43, 57, 60, 61, 62, 67, 70, 73, 74, 76, 77, 80, 85, 88, 91, 94, 100, 101, 102], "verifi": [2, 8, 21, 43, 54, 57, 69, 73, 84, 94, 97], "50": [2, 3, 5, 7, 8, 11, 16, 17, 18, 25, 33, 35, 39, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 74, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "000": [2, 8, 15, 19, 57, 67, 73, 94], "colour": [2, 8, 57], "rgb": [2, 3, 8, 16, 18, 27, 57, 76, 77], "plane": [2, 8, 76, 80], "car": [2, 11, 76], "bird": [2, 57, 62, 76, 80], "cat": [2, 16, 21, 57, 64, 65, 69, 70, 76, 77, 81, 82, 87, 89, 91, 94], "deer": [2, 57], "dog": [2, 16, 43, 57, 69, 70, 76, 87, 88, 91], "frog": [2, 57, 76, 87], "hors": [2, 57, 76, 87], "ship": [2, 57, 76], "truck": [2, 57, 76], "store": [2, 5, 8, 15, 17, 21, 27, 28, 39, 43, 57, 60, 67, 69, 70, 73, 76, 84, 85, 97, 100, 102], "custom": [2, 8, 27, 28, 57, 60, 74, 80, 84, 85, 88, 94], "properti": [2, 8, 35, 38, 39, 43, 60, 64, 67, 74, 85, 88, 91], "uniqu": [2, 8, 17, 25, 31, 33, 36, 40, 87, 88], "50000": [2, 8, 27, 57, 67], "3": [2, 3, 5, 7, 8, 11, 15, 16, 17, 18, 19, 20, 21, 23, 25, 26, 31, 34, 39, 46, 48, 54, 89, 102], "10000": [2, 8, 27, 57, 62, 67, 69, 70, 80, 81, 84, 85], "choos": [2, 8, 16, 17, 20, 21, 25, 31, 33, 36, 37, 43, 57, 60, 65, 67, 69, 70, 73, 74, 80, 81, 82, 85, 88, 97, 100, 101, 102], "percentag": [2, 5, 8, 85], "whole": [2, 8, 12, 15, 17, 20, 31, 33, 36, 38, 39, 57, 60, 67, 69, 73, 76, 77, 85, 87, 91, 100, 102], "A": [2, 3, 5, 8, 10, 12, 15, 16, 17, 18, 19, 21, 25, 
26, 27, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 101, 102], "iter": [2, 8, 11, 12, 17, 18, 20, 21, 23, 31, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 84, 85, 87, 88, 94, 100, 102], "effici": [2, 8, 19, 27, 60, 67, 69, 73, 84, 94, 101], "shuffl": [2, 5, 7, 8, 11, 12, 16, 21, 31, 33, 57, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 85, 87, 100, 102], "batch": [2, 3, 7, 8, 11, 12, 17, 21, 27, 33, 43, 57, 61, 62, 65, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "num_work": [2, 7, 8, 57, 64, 65, 67, 69, 70, 73, 76, 80, 82], "cpu_count": [2, 8], "worker": [2, 8, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "trainload": [2, 8], "testload": [2, 8], "To": [2, 3, 5, 7, 8, 10, 12, 15, 17, 21, 26, 27, 28, 31, 33, 34, 39, 40, 57, 60, 61, 62, 64, 67, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "correspond": [2, 3, 17, 18, 21, 25, 28, 31, 35, 39, 40, 43, 57, 61, 62, 64, 65, 67, 73, 76, 77, 82, 84, 85, 87, 94, 97, 100, 102], "flag": [2, 17, 27, 57], "section": [2, 3, 7, 12, 15, 27, 52, 85], "batch_x": 2, "batch_i": 2, "next": [2, 3, 11, 21, 28, 31, 33, 35, 39, 43, 46, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "plot_mixed_imag": 2, "inv_norm": 2, "m": [2, 3, 11, 12, 17, 21, 25, 27, 31, 35, 36, 39, 43, 61, 62, 64, 65, 69, 73, 74, 80, 84, 85, 87, 89], "zip": [2, 3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 40, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 88, 89, 97, 100, 102], "inv_pil": 2, "topilimag": 2, "fig": [2, 3, 7, 12, 16, 21, 28, 33, 61, 62, 64, 67, 69, 70, 73, 76, 80, 81, 94, 97], "figur": [2, 7, 8, 12, 15, 16, 17, 18, 21, 31, 34, 35, 36, 39, 40, 74, 77, 88, 91, 101], "figsiz": [2, 3, 5, 7, 12, 15, 16, 17, 18, 20, 21, 28, 33, 35, 39, 40, 60, 61, 62, 67, 69, 70, 73, 76, 77, 80, 81, 82, 87], "8": [2, 3, 5, 7, 8, 11, 12, 17, 18, 20, 21, 27, 28, 34, 35, 39, 40, 
60, 64, 65, 69, 70, 77, 80, 81, 82, 85, 87, 88, 102], "ax": [2, 3, 5, 15, 16, 18, 20, 21, 28, 35, 39, 40, 57, 60, 61, 62, 64, 67, 69, 70, 73, 76, 80, 81, 94, 97], "add_subplot": [2, 16, 61, 67], "inv_tensor": 2, "imshow": [2, 3, 5, 7, 15, 16, 17, 18, 20, 21, 28, 33, 57, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82], "9": [2, 3, 5, 8, 15, 17, 18, 21, 25, 28, 31, 34, 39, 40, 46, 60, 64, 65, 69, 70, 73, 80, 81, 82, 85, 87, 88, 97, 102], "famili": [2, 8], "whose": [2, 8, 67, 70, 84, 88], "main": [2, 8, 11, 15, 19, 25, 27, 28, 31, 33, 34, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97, 100, 101, 102], "organis": [2, 8, 11, 16, 27], "stack": [2, 8, 12, 17, 33, 73, 76, 77, 88], "residu": [2, 8, 31, 80, 82], "block": [2, 8, 17, 43, 60, 70, 73, 76, 80, 84, 87], "consist": [2, 8, 11, 15, 17, 21, 27, 34, 36, 43, 57, 65, 67, 73, 77, 84, 87, 88, 89, 94, 101], "layer": [2, 5, 7, 12, 17, 18, 20, 27, 31, 57, 60, 61, 62, 64, 67, 69, 70, 73, 74, 80, 82, 84, 85, 87, 91, 94, 100], "output": [2, 5, 7, 8, 11, 12, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 100, 102], "ad": [2, 3, 8, 12, 17, 21, 26, 27, 33, 38, 57, 62, 69, 70, 76, 80, 81, 82, 84, 85, 88, 91, 94, 100], "shortcut": [2, 3, 8, 21, 43], "connect": [2, 7, 8, 12, 17, 54, 57, 60, 65, 69, 70, 76, 80, 81, 82, 94, 100, 102], "just": [2, 3, 5, 8, 11, 12, 20, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 40, 43, 46, 62, 64, 65, 67, 69, 73, 74, 76, 77, 80, 82, 88, 89, 91, 94, 97, 101], "popular": [2, 8, 27, 43, 60, 64, 76, 77, 82], "work": [2, 5, 8, 12, 16, 17, 19, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 46, 52, 54, 60, 61, 62, 64, 69, 70, 73, 74, 76, 77, 80, 82, 85, 87, 88, 89, 91, 97, 100, 101, 102], "gener": [2, 3, 8, 16, 17, 20, 21, 23, 25, 27, 28, 31, 33, 34, 36, 37, 38, 61, 67, 74, 76, 77, 81, 85, 87, 91, 100, 101], "pick": [2, 8, 11, 31, 33, 39, 67, 70, 76, 80, 84, 85, 87, 
88, 97, 100, 102], "illustr": [2, 8, 67, 69, 82, 84], "purpos": [2, 8, 15, 21, 27, 34, 39, 40, 60, 73, 80, 82, 84, 88, 94], "basicblock": [2, 8], "modul": [2, 3, 5, 7, 8, 11, 12, 16, 17, 20, 21, 25, 27, 33, 43, 44, 45, 54, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 84, 85, 87, 88, 89, 94, 101], "kaim": [2, 8], "he": [2, 8, 11, 12, 45, 57, 74, 84, 85, 87, 91, 101], "xiangyu": [2, 8], "shaoq": [2, 8], "ren": [2, 8], "jian": [2, 8], "sun": [2, 8, 46, 76], "learn": [2, 10, 11, 12, 17, 19, 20, 21, 23, 25, 30, 31, 34, 35, 36, 37, 38, 39, 40, 43, 46, 52, 54, 60, 65, 67, 69, 73, 77, 80, 82, 87, 102, 103], "recognit": [2, 8, 16, 19, 70, 76, 80, 91], "arxiv": [2, 8, 15, 27, 77, 81, 84, 91, 101], "1512": [2, 8], "03385": [2, 8], "expans": [2, 8], "in_plan": [2, 8], "stride": [2, 3, 5, 7, 8, 16, 21, 33, 80, 82, 100, 102], "super": [2, 3, 5, 7, 8, 11, 12, 16, 17, 20, 21, 27, 28, 33, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "conv1": [2, 7, 8, 18, 73, 76, 82, 100, 102], "conv2d": [2, 3, 5, 7, 8, 16, 17, 21, 73, 76, 80, 82, 100, 102], "kernel_s": [2, 3, 5, 7, 8, 16, 17, 21, 33, 73, 76, 80, 82, 100, 102], "bia": [2, 8, 12, 20, 57, 60, 62, 64, 65, 67, 70, 73, 76, 80, 81, 82, 84, 87, 94], "bn1": [2, 8, 100, 102], "batchnorm2d": [2, 3, 7, 8, 16, 17, 21, 100, 102], "conv2": [2, 7, 8, 18, 73, 82, 100, 102], "bn2": [2, 8, 100, 102], "sequenti": [2, 3, 5, 8, 16, 17, 21, 25, 28, 33, 57, 60, 62, 64, 65, 67, 73, 81, 82, 84, 88, 94, 102], "forward": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 94, 100, 102], "relu": [2, 3, 7, 8, 11, 16, 17, 20, 21, 27, 33, 57, 67, 69, 70, 76, 80, 84, 87, 100, 102], "bottleneck": [2, 8, 76, 80], "conv3": [2, 7, 8, 18, 82, 100, 102], "bn3": [2, 8, 100, 102], "num_block": [2, 8], "num_class": [2, 8, 16, 33, 73, 76, 84, 87], "64": [2, 3, 5, 7, 8, 12, 16, 17, 27, 57, 65, 67, 69, 70, 73, 82, 85, 87, 100, 102], "layer1": [2, 8, 33], "_make_lay": [2, 8], 
"layer2": [2, 8, 33], "layer3": [2, 8], "256": [2, 3, 5, 8, 16, 17, 18, 28, 43, 57, 67, 70, 73, 76, 77, 80, 82], "layer4": [2, 8], "512": [2, 7, 8, 28, 60, 76, 77, 84, 85, 88, 100, 102], "avg_pool2d": [2, 8], "view": [2, 3, 8, 11, 15, 25, 27, 35, 37, 39, 40, 43, 57, 64, 65, 67, 69, 70, 73, 76, 80, 84, 94, 100, 101, 102], "resnet18": [2, 8, 76], "resnet34": [2, 8], "resnet50": [2, 8], "load": [2, 3, 16, 21, 27, 31, 33, 36, 43, 60, 61, 62, 64, 65, 67, 76, 80, 81, 82, 88, 89], "net": [2, 4, 7, 8, 16, 18, 20, 21, 25, 43, 57, 64, 65, 69, 70, 73, 76, 80, 81, 84, 88, 100, 102], "randn": [2, 8, 20, 57, 60, 64, 65, 80, 81, 82], "result_fold": [2, 8], "path": [2, 3, 5, 7, 8, 15, 16, 17, 21, 27, 28, 57, 62, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 88, 94, 100, 102], "makedir": [2, 8, 27], "lognam": [2, 8], "__class__": [2, 8, 85], "__name__": [2, 8, 43, 57, 60, 73, 82, 85, 87, 100, 102], "dataparallel": [2, 3, 8], "device_count": [2, 8], "cross": [2, 8, 16, 21, 33, 39, 57, 61, 67, 70, 73, 82, 85, 100, 101], "entropi": [2, 8, 16, 21, 57, 67, 70, 80, 85, 87, 100], "commonli": [2, 8, 17, 43, 60, 67, 81, 91, 97], "stochast": [2, 8, 57, 60, 67, 81], "gradient": [2, 7, 8, 11, 12, 16, 17, 18, 21, 27, 28, 57, 62, 64, 65, 73, 74, 76, 80, 81, 84, 100, 101, 102], "descent": [2, 8, 17, 21, 27, 31, 57, 61, 62, 74, 84, 101], "sgd": [2, 5, 8, 11, 17, 18, 57, 60, 62, 65, 67, 69, 73, 81, 87, 100, 102], "momentum": [2, 3, 5, 8, 17, 18, 21, 60, 69, 70, 73, 94, 101], "weight": [2, 3, 7, 8, 12, 16, 17, 18, 21, 27, 28, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 91, 94, 100, 102], "decai": [2, 8, 21, 70, 82], "criterion": [2, 7, 8, 12, 16, 17, 18, 21, 33, 38, 62, 64, 65, 69, 70, 73, 76, 87], "mixup_criterion": 2, "pred": [2, 12, 21, 67, 69, 70, 81, 85], "crossentropyloss": [2, 3, 5, 7, 8, 16, 17, 33, 57, 64, 65, 73, 76, 87], "onli": [2, 3, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 
73, 74, 80, 81, 84, 85, 87, 88, 91, 97, 100, 101], "lr": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 100, 102], "weight_decai": [2, 8, 17, 21, 80, 85], "1e": [2, 3, 8, 17, 21, 28, 57, 61, 62, 64, 65, 67, 69, 70, 76, 80, 81, 82, 84, 94, 97, 102], "nepoch": [2, 8, 17, 21], "d": [2, 3, 5, 7, 8, 11, 17, 20, 21, 25, 33, 35, 36, 38, 39, 40, 57, 60, 61, 62, 64, 65, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 101], "train_loss": [2, 5, 7, 8, 12, 67, 69, 70, 73, 84, 87], "correct": [2, 3, 4, 5, 7, 8, 16, 17, 25, 33, 35, 39, 57, 62, 64, 65, 69, 70, 73, 76, 80, 88, 94, 100, 102], "total": [2, 3, 5, 7, 8, 12, 17, 21, 25, 27, 33, 35, 39, 57, 64, 65, 69, 70, 73, 76, 80, 94, 100, 102], "batch_idx": [2, 8, 16, 69, 70], "enumer": [2, 5, 8, 12, 16, 17, 18, 21, 28, 33, 61, 62, 64, 65, 67, 69, 70, 76, 84, 85, 87, 94, 97], "zero_grad": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 94, 100, 102], "two": [2, 10, 12, 15, 20, 21, 23, 25, 27, 31, 33, 35, 36, 39, 43, 52, 57, 61, 62, 65, 67, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101], "hot": [2, 12, 76, 88], "coeffici": [2, 18, 27, 67, 81, 82, 84], "targets_a": 2, "targets_b": 2, "loss_func": 2, "backward": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 28, 33, 57, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 100, 102], "step": [2, 3, 5, 7, 8, 11, 16, 17, 18, 20, 21, 25, 27, 28, 31, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "item": [2, 3, 5, 7, 8, 12, 16, 17, 18, 20, 21, 28, 33, 43, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "predict": [2, 3, 5, 7, 8, 12, 16, 17, 19, 21, 27, 31, 33, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 81, 82, 87, 88, 94, 100, 101, 102], "max": [2, 5, 7, 8, 12, 16, 17, 18, 21, 28, 33, 34, 35, 39, 57, 61, 62, 64, 65, 67, 69, 70, 
73, 74, 76, 77, 80, 81, 82, 85, 94, 97, 100, 102], "eq": [2, 8, 69, 70, 76, 81, 82], "sum": [2, 3, 7, 8, 11, 12, 16, 17, 21, 28, 33, 35, 39, 40, 57, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 94, 97, 100, 102], "500": [2, 3, 5, 7, 8, 20, 28, 33, 64, 65, 67, 69, 70, 76, 81, 82, 87, 94], "3f": [2, 5, 8, 33, 61, 67, 81, 84, 87], "acc": [2, 8, 12, 16, 64, 65, 67, 69, 73, 76], "100": [2, 3, 5, 11, 12, 16, 17, 20, 21, 27, 33, 34, 35, 39, 40, 44, 45, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 85, 87, 88, 89, 94, 100, 102], "global": [2, 8, 17, 28, 57, 67, 84, 85, 88, 94], "eval": [2, 3, 7, 8, 12, 16, 17, 18, 21, 27, 33, 43, 65, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 94, 100, 102], "test_loss": [2, 5, 8, 67, 69, 70, 84], "no_grad": [2, 5, 8, 16, 21, 33, 43, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 94, 100, 102], "volatil": 2, "save": [2, 8, 11, 12, 15, 16, 17, 21, 31, 43, 53, 57, 64, 65, 67, 69, 70, 81, 82, 84, 85, 87, 88, 100, 102], "adjust_learning_r": [2, 8], "decreas": [2, 8, 12, 20, 21, 26, 31, 33, 36, 57, 73, 82], "rate": [2, 7, 8, 17, 21, 27, 28, 31, 35, 36, 39, 60, 62, 67, 74, 76, 82, 84, 87, 97, 100, 102], "certain": [2, 8, 12, 21, 28, 33, 35, 36, 39, 43, 60, 64, 65, 67, 76, 81, 82, 84, 88, 94, 97], "state": [2, 5, 7, 8, 12, 16, 17, 20, 21, 25, 28, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 81, 82, 84, 87, 89, 94, 97, 100, 101, 102], "state_dict": [2, 8, 12, 16, 17, 18, 21, 76, 82, 100, 102], "rng_state": [2, 8], "get_rng_stat": [2, 8], "isdir": [2, 8], "mkdir": [2, 5, 7, 8, 16, 21, 57, 73, 100, 102], "ckpt": [2, 82], "t7": [2, 8], "150": [2, 8, 12, 28, 35, 39, 44, 45, 67, 70, 76, 77, 87, 88, 94], "warm": [2, 8], "larg": [2, 5, 7, 8, 12, 20, 21, 31, 57, 62, 65, 69, 70, 73, 74, 76, 77, 82, 84, 85, 87, 94, 101], "minibatch": [2, 8, 11, 33, 57, 73], "param_group": [2, 8, 17], "open": [2, 3, 5, 7, 8, 10, 11, 15, 16, 17, 18, 21, 27, 31, 38, 43, 46, 48, 51, 53, 57, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 89, 100, 102], 
"logfil": [2, 8], "logwrit": [2, 8], "writer": [2, 8, 16, 31, 34, 69, 70, 81], "delimit": [2, 8], "writerow": [2, 8], "train_acc": [2, 7, 8, 12, 16, 64, 65, 67, 69, 70, 73, 87, 94], "test_acc": [2, 8, 16, 64, 65, 94], "391": [2, 8, 76], "443": [2, 76, 82], "938": [2, 8, 73, 76, 94], "14": [2, 3, 5, 8, 23, 27, 28, 33, 36, 40, 61, 62, 69, 70, 73, 76, 82, 84, 85, 87, 88, 94, 97, 100], "79": [2, 8, 12, 21, 33, 67, 76, 100], "531": [2, 8, 76], "46": [2, 8, 27, 65, 76, 80, 84, 85, 87, 94, 100], "094": 2, "59": [2, 8, 27, 65, 76, 77, 87, 94, 100], "31": [2, 8, 27, 28, 40, 57, 64, 65, 67, 76, 77, 81, 82, 84, 85, 97, 100], "604000091552734": 2, "44": [2, 57, 65, 67, 76, 77, 85, 94, 100], "2599983215332": 2, "619": [2, 8, 76], "39": [2, 5, 8, 27, 33, 73, 76, 82, 84, 85, 94, 100], "844": [2, 8, 73, 76], "51": [2, 5, 8, 27, 28, 33, 61, 69, 70, 76, 85, 87, 94], "199": [2, 76], "60": [2, 8, 16, 27, 28, 31, 33, 35, 39, 67, 73, 76, 94, 100], "156": [2, 76], "77": [2, 8, 12, 16, 21, 27, 33, 39, 67, 76, 80, 82, 84], "47": [2, 27, 33, 65, 69, 76, 84, 85, 87, 94, 100], "03200149536133": 2, "54": [2, 8, 65, 69, 76, 94, 100], "41999816894531": 2, "301": [2, 39, 76], "53": [2, 8, 12, 33, 69, 76, 84, 94], "906": [2, 73, 76], "69": [2, 8, 21, 65, 73, 76, 84], "013": [2, 8], "61": [2, 5, 12, 16, 21, 27, 28, 40, 61, 62, 76, 94], "719": [2, 76], "56": [2, 8, 25, 28, 33, 39, 65, 67, 69, 76, 84, 94], "257999420166016": 2, "62": [2, 5, 8, 27, 65, 73, 76, 87, 94], "599998474121094": 2, "036": 2, "062": 2, "82": [2, 8, 12, 16, 21, 27, 33, 57, 67, 73, 76, 80, 100], "909": [2, 76], "89": [2, 8, 21, 73, 76], "43199920654297": 2, "65": [2, 5, 8, 21, 27, 28, 33, 39, 61, 76, 84, 85, 88, 94, 100], "6500015258789": 2, "839": [2, 5, 73, 76], "68": [2, 8, 17, 33, 76, 84], "750": [2, 8, 39, 76], "88": [2, 21, 27, 61, 65, 73, 76, 84], "859": [2, 76], "70": [2, 3, 5, 8, 21, 28, 33, 67, 73, 76, 84, 94, 100], "312": [2, 8, 67, 76], "90": [2, 8, 21, 33, 67, 73, 76, 77, 84, 94, 100, 102], "67": [2, 12, 21, 27, 33, 
76, 84, 85, 94], "08999633789062": [2, 8], "5": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 23, 25, 28, 31, 35, 39, 40, 46, 48, 69, 77, 81, 82, 85, 102], "922": [2, 76], "83": [2, 8, 12, 21, 33, 67, 73, 76, 80, 94, 100], "660": [2, 76], "76": [2, 8, 16, 21, 27, 33, 39, 76], "562": [2, 8, 76], "98": [2, 5, 8, 16, 21, 27, 33, 76, 77, 94], "1259994506836": 2, "72": [2, 8, 16, 21, 27, 33, 39, 65, 76, 84, 94], "52999877929688": 2, "833": [2, 76], "625": [2, 76], "84": [2, 5, 12, 21, 33, 76, 80], "616": [2, 8, 76], "78": [2, 8, 15, 39, 76], "125": [2, 8, 76], "73": [2, 8, 16, 21, 28, 33, 61, 76], "45999908447266": 2, "686": [2, 76], "75": [2, 8, 16, 17, 21, 33, 36, 73, 76, 87, 94, 100], "96": [2, 5, 21, 28, 76], "533": [2, 12, 76], "81": [2, 8, 12, 21, 25, 27, 33, 76, 80, 94, 100], "250": [2, 8, 17, 20, 61, 62, 67, 70, 76, 80, 82], "104": [2, 8, 21, 28, 76], "99600219726562": [2, 8], "91000366210938": 2, "626": [2, 8, 76], "458": [2, 12, 76], "031": 2, "105": [2, 61, 76], "42400360107422": 2, "11000061035156": 2, "465": [2, 76], "85": [2, 8, 12, 21, 33, 73, 76, 84, 100, 102], "110": [2, 21, 28, 76], "87": [2, 21, 33, 73, 76, 84], "112": [2, 21, 76], "80": [2, 3, 8, 11, 12, 21, 27, 28, 33, 64, 65, 67, 76, 80, 87, 94, 100, 102], "72599792480469": 2, "37000274658203": 2, "509": [2, 76], "523": [2, 76], "688": [2, 8, 76], "102": [2, 8, 12, 15, 76], "16400146484375": 2, "25": [2, 3, 7, 12, 17, 21, 25, 28, 46, 61, 62, 65, 67, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 100], "11": [2, 3, 5, 7, 8, 16, 25, 28, 33, 46, 64, 67, 73, 76, 82, 84, 85, 87, 97, 100], "423": [2, 76], "610": [2, 76], "96199798583984": 2, "68000030517578": 2, "12": [2, 3, 5, 8, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 39, 40, 46, 60, 62, 67, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100], "221": [2, 76], "115": [2, 76], "467": [2, 76], "812": [2, 8, 73, 76], "106": [2, 8, 76], "61799621582031": 2, "88999938964844": 2, "13": [2, 3, 5, 8, 12, 27, 28, 33, 34, 67, 70, 73, 76, 80, 82, 84, 85, 87, 
94, 97, 100], "427": [2, 76], "522": [2, 76], "21199798583984": 2, "54000091552734": [2, 8], "216": [2, 76, 94], "93": [2, 5, 8, 15, 28, 33, 73, 76, 84], "120": [2, 5, 18, 76], "386": [2, 76], "86": [2, 21, 33, 73, 76], "111": [2, 21, 76], "08000183105469": 2, "44999694824219": [2, 8], "read_csv": [2, 8, 12, 57], "resnet_": [2, 8], "sep": [2, 8, 84, 88], "head": [2, 8, 12, 25, 33, 36, 40, 43, 67, 73], "932130": 2, "604000": 2, "535233": 2, "259998": 2, "446863": 2, "032001": 2, "262779": 2, "419998": 2, "212518": 2, "257999": 2, "069593": 2, "599998": 2, "051850": 2, "431999": 2, "996476": 2, "650002": 2, "928131": 2, "000000": [2, 73], "898354": 2, "089996": [2, 8], "train_accuraci": [2, 8, 12, 67], "test_accuraci": [2, 8, 67], "averag": [2, 8, 12, 15, 16, 17, 31, 35, 36, 39, 40, 57, 61, 64, 67, 73, 76, 77, 80, 82, 84, 87, 94, 100, 102], "over": [2, 3, 7, 8, 11, 16, 17, 18, 21, 23, 27, 28, 33, 36, 39, 40, 43, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 87, 88, 91, 94, 97, 100, 101, 102], "accuracci": [2, 8], "figurenam": [2, 8], "withmixup": 2, "name": [2, 3, 8, 11, 12, 15, 16, 18, 21, 23, 25, 27, 28, 31, 33, 36, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "xlabel": [2, 5, 8, 11, 15, 16, 17, 20, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 87], "ylabel": [2, 5, 8, 11, 15, 16, 17, 18, 20, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 87], "curv": [2, 8, 35, 39, 61, 62, 64, 73, 80, 94], "savefig": [2, 8, 57, 73, 87], "png": [2, 3, 5, 7, 8, 21, 57, 73, 76, 82, 87, 94], "legend": [2, 5, 7, 8, 12, 16, 18, 20, 21, 34, 35, 39, 40, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 87, 94], "jan": [3, 5], "funk": 3, "between": [3, 7, 10, 15, 16, 17, 20, 27, 31, 33, 35, 36, 39, 40, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97, 100], "contain": [3, 5, 8, 
15, 16, 19, 21, 25, 27, 28, 31, 33, 34, 35, 36, 39, 40, 43, 54, 57, 60, 62, 67, 73, 76, 77, 84, 85, 87, 89, 91, 94, 97, 100], "everyth": [3, 35, 57, 60, 61, 64, 73, 74, 88, 91], "vgg": 3, "electron": 3, "microscopi": 3, "drosophila": 3, "synaps": 3, "those": [3, 5, 8, 12, 23, 25, 31, 33, 34, 35, 36, 38, 39, 40, 57, 67, 73, 74, 80, 81, 87, 88, 91, 94, 100, 101], "accord": [3, 21, 39, 43, 57, 64, 65, 74, 80, 81, 84, 85, 94, 97, 100, 102], "neurotransmitt": 3, "thei": [3, 5, 11, 12, 20, 23, 26, 27, 31, 33, 35, 36, 37, 38, 39, 40, 43, 48, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101], "releas": [3, 5], "scikit": [3, 5, 39, 57, 87], "pillow": [3, 16, 18, 43, 73, 76, 77, 80, 81, 82], "glob": [3, 7, 8, 15, 21, 27, 77], "json": [3, 5, 7, 21, 43, 57, 84, 88], "tqdm": [3, 7, 11, 12, 16, 17, 21, 39, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 88, 89, 100, 102], "skimag": [3, 5], "io": [3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 27, 30, 31, 33, 34, 36, 39, 43, 46, 49, 52, 57, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89, 94, 100, 102], "imread": [3, 5, 7, 17, 21, 57], "imagefold": [3, 7, 16, 65, 69, 70, 76, 77], "sampler": [3, 64, 65, 94], "weightedrandomsampl": 3, "inlin": [3, 25, 28, 69, 70, 73, 82, 87, 89, 94], "improv": [3, 5, 12, 16, 17, 21, 27, 38, 67, 73, 80, 82, 84, 88, 94, 97, 100, 101], "classif": [3, 4, 8, 10, 12, 18, 33, 35, 36, 39, 57, 77, 87, 91], "On": [3, 39, 43, 46, 57, 61, 62, 65, 67, 81, 85], "valid": [3, 5, 7, 11, 12, 16, 17, 18, 21, 31, 39, 64, 67, 70, 73, 80, 85, 87, 88, 97, 100, 102], "around": [3, 5, 16, 21, 23, 27, 31, 33, 35, 36, 39, 57, 67, 73, 76, 77, 80, 81, 82, 85, 87], "try": [3, 4, 5, 7, 10, 11, 12, 16, 17, 21, 23, 26, 27, 31, 33, 36, 40, 54, 57, 61, 62, 65, 67, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101], "easi": [3, 21, 35, 38, 43, 57, 60, 61, 67, 74, 85, 88, 91, 101], "augment": [3, 8, 21, 62, 65, 91, 94, 101], "quit": [3, 26, 39, 43, 57, 73, 74, 77, 80, 91, 
94], "enlarg": 3, "avail": [3, 12, 19, 21, 23, 25, 27, 31, 33, 37, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 100, 102], "transpos": [3, 17, 18, 21, 57, 62, 64, 65, 69, 70, 73, 80, 84, 89], "mirror": [3, 76, 80, 100, 102], "add": [3, 7, 11, 12, 16, 17, 20, 21, 27, 28, 37, 38, 40, 43, 48, 57, 60, 62, 65, 67, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "nois": [3, 17, 20, 35, 36, 39, 40, 57, 60, 61, 62, 67, 69, 70, 76, 80, 81, 82, 85, 91], "intens": [3, 17, 57], "etc": [3, 5, 8, 19, 27, 28, 31, 33, 34, 37, 38, 40, 43, 52, 53, 57, 60, 64, 67, 70, 76, 85, 91], "architectur": [3, 11, 12, 19, 27, 31, 43, 57, 60, 65, 67, 69, 70, 73, 74, 76, 77, 94, 101, 102], "few": [3, 31, 33, 39, 40, 57, 60, 70, 73, 76, 80, 82, 85, 88, 89, 91, 97, 100, 101], "tune": [3, 27, 31, 57, 61, 62, 67, 69, 87, 91], "take": [3, 5, 10, 11, 12, 16, 17, 21, 23, 25, 27, 31, 33, 35, 36, 37, 39, 40, 43, 53, 57, 60, 62, 64, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "random": [3, 5, 7, 11, 16, 17, 20, 21, 27, 35, 39, 40, 57, 74, 97], "sampl": [3, 7, 11, 12, 16, 25, 27, 33, 36, 43, 61, 64, 65, 70, 76, 84, 85, 87, 94, 101, 102], "test": [3, 5, 7, 11, 12, 19, 28, 31, 33, 34, 35, 36, 61, 62, 64, 65, 67, 70, 76, 80, 84, 85, 87, 94, 100, 101], "them": [3, 5, 8, 12, 20, 23, 26, 27, 28, 31, 33, 35, 36, 37, 38, 39, 43, 48, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "togeth": [3, 4, 31, 33, 34, 37, 38, 39, 40, 46, 57, 61, 64, 65, 70, 74, 77, 84, 88, 89, 101], "actual": [3, 5, 21, 25, 31, 33, 35, 36, 37, 39, 40, 57, 60, 64, 67, 73, 74, 76, 80, 81, 82, 84, 85, 88, 89, 91, 94, 100, 101], "medium": [3, 21, 27, 85], "g": [3, 4, 5, 7, 11, 15, 21, 25, 27, 33, 34, 35, 36, 37, 38, 39, 40, 53, 57, 60, 65, 67, 70, 73, 74, 77, 81, 82, 84, 85, 88, 91, 94, 97, 101], "resnet": [3, 77, 82], "error": [3, 11, 17, 31, 33, 40, 54, 57, 60, 61, 62, 65, 69, 70, 
73, 74, 76, 81, 82, 84, 85, 87, 88, 89, 97, 100, 102], "most": [3, 8, 11, 12, 25, 28, 31, 33, 34, 35, 36, 37, 38, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 101, 102], "accur": [3, 16, 17, 33, 36, 69, 74, 76, 87, 88, 91], "confus": [3, 33, 73, 100], "explor": [3, 5, 8, 16, 17, 21, 28, 31, 39, 61, 62, 67, 70, 80, 81, 85, 88, 89, 100, 102], "gaba": 3, "glutam": 3, "revers": [3, 11, 12, 28, 57, 82, 85, 97, 100, 102], "direct": [3, 17, 21, 23, 25, 27, 28, 33, 34, 39, 40, 60, 62, 64, 67, 80, 81, 88, 97, 100], "where": [3, 4, 5, 8, 11, 12, 17, 18, 19, 21, 27, 30, 31, 33, 35, 36, 37, 38, 39, 40, 43, 53, 54, 57, 60, 62, 64, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "least": [3, 5, 31, 40, 52, 70, 74, 94, 100], "obviou": [3, 31], "hard": [3, 16, 17, 21, 28, 31, 35, 38, 39, 67, 73, 76, 94, 101], "watch": [3, 19, 31, 33, 74, 76, 80, 91, 101], "bore": [3, 31], "find": [3, 7, 11, 12, 15, 21, 23, 25, 26, 27, 31, 33, 34, 38, 39, 40, 43, 49, 60, 61, 62, 64, 65, 67, 70, 73, 74, 80, 82, 85, 87, 89, 91, 97, 100, 101, 102], "period": [3, 27, 28, 36, 38, 39, 57, 64, 81, 82, 101], "current": [3, 16, 21, 25, 27, 28, 31, 38, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 76, 82, 84, 85, 88, 94, 97, 100, 101, 102], "hint": [3, 5, 36, 57, 60, 65, 69, 73, 74, 76, 77, 88, 91, 94, 97, 100, 101, 102], "cycle_gan": 3, "visual": [3, 7, 17, 19, 21, 27, 28, 31, 33, 34, 35, 39, 40, 61, 64, 65, 67, 76, 80, 82, 84, 85, 91, 100], "might": [3, 5, 19, 20, 21, 27, 31, 33, 35, 37, 38, 39, 40, 57, 61, 67, 69, 70, 73, 74, 76, 80, 82, 84, 85, 88, 89, 91, 94, 100, 101, 102], "help": [3, 5, 11, 12, 16, 17, 26, 27, 31, 34, 35, 36, 37, 43, 48, 49, 57, 60, 61, 62, 67, 74, 77, 82, 84, 85, 88, 91, 94, 97, 100, 101], "look": [3, 5, 11, 12, 19, 20, 21, 25, 27, 31, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 69, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 100, 101], "organ": [3, 11, 23, 31, 39, 57, 69, 76], "raw": [3, 15, 28, 33, 
35, 36, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 89, 94], "copi": [3, 5, 7, 17, 18, 21, 25, 28, 31, 33, 39, 43, 53, 57, 67, 69, 70, 73, 85, 94, 97, 100, 102], "directori": [3, 7, 15, 16, 21, 27, 28, 67, 77, 82, 100, 102], "structur": [3, 5, 17, 21, 27, 31, 34, 35, 36, 46, 57, 60, 62, 67, 69, 70, 76, 80, 82, 84, 87, 94, 101], "adjust": [3, 20, 28, 61, 67, 73, 76, 87], "128x128": [3, 76], "channel": [3, 5, 16, 17, 21, 27, 31, 33, 36, 57, 65, 67, 73, 76, 77, 80, 82], "written": [3, 5, 7, 11, 17, 57, 62], "nil": 3, "eckstein": 3, "modifi": [3, 4, 17, 21, 27, 69, 70, 73, 77, 80, 85, 94], "implement": [3, 5, 12, 17, 21, 23, 28, 34, 37, 43, 57, 60, 62, 65, 69, 76, 80, 82, 85, 87, 94, 101, 102], "six": [3, 33, 76, 88], "acethylcholin": 3, "octopamin": 3, "serotonin": 3, "dopamin": 3, "request": [3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 33, 36, 43, 52, 57, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 88, 89, 94, 100, 102], "zipfil": [3, 7, 11, 12, 15, 16, 65, 69, 70, 73, 76, 77, 80, 82, 84, 87, 89, 94, 100, 102], "fname": [3, 7, 15, 16, 17, 18, 21, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89], "url": [3, 5, 7, 12, 15, 16, 17, 18, 21, 27, 34, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "dropbox": 3, "sh": 3, "ucpjfd3omjieu80": 3, "aaavzynltzvhyfx7_jwvhuk2a": 3, "downlad": 3, "r": [3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 27, 33, 35, 36, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "allow_redirect": [3, 15, 18, 21, 65, 67, 69, 70, 73, 80, 84, 85], "stream": [3, 5, 25, 76, 77, 85, 87, 88, 89, 91], "wb": [3, 5, 7, 15, 16, 17, 18, 21, 65, 67, 69, 70, 73, 76, 80, 84, 85, 87, 89], "fh": [3, 15, 18, 65, 67, 69, 70, 73, 80], "write": [3, 5, 7, 15, 16, 17, 18, 21, 23, 27, 31, 33, 35, 36, 37, 38, 39, 40, 43, 46, 51, 53, 54, 57, 62, 64, 65, 67, 69, 70, 74, 76, 80, 84, 85, 87, 88, 89, 91, 97, 101], 
"cmplete": 3, "unzip": [3, 21, 65, 69, 70, 94, 100, 102], "specifi": [3, 21, 27, 28, 31, 33, 36, 37, 38, 39, 43, 57, 61, 62, 67, 69, 70, 82, 84, 85, 87, 88, 94, 97, 100, 102], "mode": [3, 17, 18, 21, 27, 28, 31, 33, 43, 60, 62, 64, 65, 73, 80, 82, 87, 88, 89, 94, 97], "zf": [3, 88], "all": [3, 5, 7, 8, 11, 12, 15, 17, 18, 21, 23, 25, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "extractal": [3, 5, 7, 11, 12, 15, 16, 21, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89, 94, 100, 102], "done": [3, 11, 17, 21, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 53, 57, 61, 62, 65, 73, 77, 80, 81, 87, 89, 91, 100, 102], "zh": 3, "narchiv": 3, "textract": 3, "order": [3, 10, 12, 15, 19, 26, 28, 31, 33, 34, 36, 39, 43, 51, 54, 57, 65, 67, 73, 77, 84, 87, 88, 89, 91, 94, 97, 100], "match": [3, 5, 8, 17, 18, 20, 21, 25, 31, 33, 36, 57, 65, 67, 73, 80, 82, 84, 85, 94], "pretrain": [3, 5, 8, 10, 12, 16, 18, 30, 43, 73, 80, 82, 84, 87, 88, 91, 94, 100, 101], "renam": [3, 21], "0_gaba": 3, "acetylcholin": 3, "1_acetylcholin": 3, "2_glutam": 3, "3_serotonin": 3, "4_octopamin": 3, "5_dopamin": 3, "remov": [3, 5, 7, 8, 12, 16, 17, 21, 27, 38, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 85, 87, 88, 89, 94, 100, 102], "archiv": [3, 5, 30, 73], "experi": [3, 11, 16, 17, 23, 26, 27, 28, 31, 35, 36, 37, 39, 40, 52, 54, 57, 61, 67, 80, 81, 82, 84, 85, 100, 101], "first": [3, 7, 8, 10, 11, 12, 15, 17, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 46, 54, 57, 60, 62, 64, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 89, 91, 94, 97, 100], "loader": [3, 15, 21, 33, 57, 64, 69, 73, 76, 80, 84, 88], "account": [3, 5, 7, 10, 11, 12, 25, 28, 39, 43, 54, 77, 80, 88, 102], "imbal": [3, 10], "dure": [3, 8, 10, 12, 15, 16, 17, 19, 21, 23, 27, 28, 31, 39, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "load_imag": 3, "filenam": [3, 16, 17, 21, 57, 73, 76, 87, 
94, 100, 102], "grescal": 3, "uint8": [3, 18, 21, 27, 73], "255": [3, 18, 21, 27, 67, 70, 73, 76], "astyp": [3, 5, 12, 17, 18, 21, 25, 27, 62, 73, 80, 100, 102], "split": [3, 5, 7, 11, 12, 15, 16, 17, 21, 23, 31, 33, 35, 36, 39, 43, 57, 64, 65, 67, 69, 70, 73, 76, 84, 85, 87, 88, 94], "num_imag": [3, 77, 94], "num_train": [3, 87], "num_valid": [3, 87], "num_test": 3, "fix": [3, 5, 25, 28, 33, 43, 57, 61, 62, 65, 67, 73, 76, 77, 81, 82, 84, 85, 91], "seed": [3, 16, 17, 20, 25, 27, 35, 39, 57], "train_dataset": [3, 7, 21, 57, 73, 85, 87], "validation_dataset": [3, 73, 85], "test_dataset": [3, 21, 73, 85, 87], "23061912": 3, "uniform": [3, 27, 28, 57, 61, 65, 76, 100, 102], "ys": [3, 67], "arrai": [3, 5, 12, 15, 17, 25, 27, 28, 31, 33, 35, 36, 39, 40, 57, 61, 62, 64, 65, 67, 70, 73, 76, 80, 81, 84, 85, 87, 89, 94, 97, 100, 102], "count": [3, 4, 11, 12, 16, 20, 21, 25, 35, 36, 39, 57, 64, 65, 74, 76, 88, 102], "bincount": 3, "label_weight": 3, "per": [3, 5, 16, 17, 21, 25, 31, 33, 35, 39, 57, 61, 62, 64, 65, 67, 69, 74, 76, 80, 84, 85, 88, 94, 100, 102], "t": [3, 5, 11, 12, 17, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 97, 100, 101, 102], "tn": 3, "tweight": 3, "serv": [3, 35], "mini": [3, 76, 81, 82], "drop_last": [3, 12, 64, 65], "15855": 3, "30715862503942e": 3, "05": [3, 5, 17, 27, 33, 57, 60, 61, 62, 67, 73, 82, 94, 100, 102], "4911": 3, "00020362451639177357": 3, "3550": 3, "00028169014084507044": 3, "2297": 3, "00043535045711797995": 3, "951": [3, 76], "0010515247108307045": 3, "4649": 3, "00021510002151000216": 3, "cell": [3, 4, 11, 12, 16, 20, 21, 25, 27, 28, 33, 36, 43, 51, 57, 60, 61, 64, 65, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "singl": [3, 5, 8, 20, 21, 25, 26, 28, 33, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 88, 100], "chosen": [3, 12, 27, 35, 67, 82, 84, 100, 102], "feel": [3, 12, 16, 
28, 35, 39, 40, 57, 61, 64, 73, 80, 82, 84, 85, 88, 97, 100], "tell": [3, 11, 16, 31, 33, 34, 43, 64, 67, 69, 80, 82, 88], "show_batch": 3, "subplot": [3, 5, 7, 12, 15, 16, 17, 18, 20, 21, 28, 35, 39, 40, 60, 61, 62, 64, 67, 69, 73, 76, 80, 81, 94, 97], "sharei": [3, 18], "squeez": [3, 12, 16, 18, 21, 33, 57, 70, 73, 76, 80], "cmap": [3, 15, 18, 20, 21, 57, 61, 62, 64, 65, 67, 73, 80, 81], "grai": [3, 18, 21, 57, 73, 76, 80, 85, 88], "set_titl": [3, 7, 35, 36, 40, 60, 61, 62, 67, 69, 73, 76, 81, 94], "repeatedli": [3, 73], "break": [3, 5, 11, 21, 25, 35, 46, 60, 62, 67, 69, 70, 84, 88, 91, 100, 102], "vgg2d": 3, "input_s": [3, 5, 11, 12, 76], "fmap": 3, "downsample_factor": 3, "output_class": 3, "current_fmap": 3, "current_s": 3, "tupl": [3, 12, 17, 57, 60, 61, 67, 73, 80, 85, 97, 100, 102], "featur": [3, 7, 8, 12, 16, 17, 18, 21, 25, 27, 33, 36, 39, 43, 44, 45, 57, 60, 61, 62, 64, 67, 69, 70, 73, 80, 81, 82, 87], "inplac": [3, 7, 12, 16, 17, 21], "maxpool2d": [3, 5, 16, 17, 21, 73], "assert": [3, 5, 16, 21, 27, 57, 60, 61, 62, 67, 80, 82, 84, 85, 97, 100, 102], "downsampl": [3, 17, 21, 73, 82], "factor": [3, 21, 31, 62, 67, 82, 84, 91, 94, 97, 101], "4096": [3, 16, 57], "dropout": [3, 5, 7, 12, 16, 20, 33, 64, 84, 87, 100, 102], "reshap": [3, 11, 17, 21, 33, 35, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 81, 82, 84, 87, 94, 100, 102], "optimz": 3, "adam": [3, 7, 12, 16, 20, 28, 33, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 100, 102], "gpu": [3, 5, 7, 12, 16, 17, 20, 21, 28, 33, 43, 54, 97], "devic": [3, 7, 11, 12, 16, 17, 18, 20, 21, 33, 36, 57], "cpu": [3, 5, 7, 11, 12, 16, 17, 18, 20, 21, 28, 33, 43], "Will": [3, 28, 101], "mere": 3, "defin": [3, 5, 7, 11, 12, 21, 27, 28, 31, 35, 37, 40, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 84, 85, 87, 88, 89, 97, 101, 102], "conveni": [3, 43, 60, 61, 62, 67, 88, 94], "function": [3, 11, 12, 20, 25, 31, 37, 38, 40, 88, 97], "epoch_loss": [3, 17, 18, 21], "num_batch": [3, 67], "y_pred": [3, 12, 64, 65], "l": [3, 
5, 11, 12, 18, 62, 64, 67, 70, 73, 76, 82, 85, 91, 100, 102], "evalu": [3, 7, 10, 11, 12, 16, 17, 18, 27, 34, 35, 43, 61, 65, 69, 73, 80, 81, 85, 87, 88, 100], "logit": [3, 21, 64, 65, 76, 85], "prob": [3, 76, 80, 81, 84, 100, 102], "softmax": [3, 7, 11, 64, 67, 73, 76, 84, 100], "dim": [3, 11, 17, 21, 33, 57, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 102], "argmax": [3, 5, 11, 21, 43, 57, 64, 65, 67, 69, 70, 73, 76, 84, 85, 87, 88, 89, 97, 100, 102], "detach": [3, 7, 11, 16, 17, 18, 20, 60, 62, 65, 69, 70, 73, 77, 80, 87], "readi": [3, 11, 12, 28, 33, 36, 37, 43, 60, 88, 100, 101], "after": [3, 12, 15, 16, 17, 21, 23, 27, 31, 33, 35, 36, 38, 39, 43, 46, 54, 57, 60, 62, 64, 65, 67, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101], "roughli": [3, 31, 40, 80, 81], "onc": [3, 17, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 54, 57, 61, 69, 73, 74, 80, 82, 85, 87, 91, 97, 101, 102], "report": [3, 27, 31, 69, 70, 82, 84, 85, 87, 88, 89], "train_from_scratch": 3, "num_epoch": [3, 5, 11, 16, 18, 33, 64, 65, 94], "yes_i_want_the_pretrained_model": 3, "wherea": [3, 69, 73], "scratch": [3, 27, 31, 33, 57, 61, 88, 91], "unceck": 3, "box": [3, 16, 27, 37, 38, 64, 67, 73, 76, 80, 94], "param": [3, 5, 8, 12, 15, 16, 40, 43, 57, 67, 69, 70, 76, 80, 81, 82, 84, 85, 87], "boolean": [3, 33, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 89, 94, 100, 102], "vgg_checkpoint": 3, "map_loc": [3, 17, 21, 67, 82, 100, 102], "load_state_dict": [3, 8, 16, 17, 18, 21, 67, 76, 82, 100, 102], "model_state_dict": 3, "8054750869061413": 3, "conclud": [3, 33, 34, 39, 40], "discrimin": [3, 39], "perfect": [3, 12, 33, 35, 36, 39, 40, 85], "pretti": [3, 21, 33, 73, 74, 91], "consid": [3, 12, 16, 27, 31, 33, 37, 60, 61, 65, 67, 73, 76, 84, 85, 87, 88, 91, 94], "furthermor": [3, 20, 31, 33, 36, 65, 73], "clear": [3, 16, 20, 25, 31, 33, 35, 57, 67, 69, 70, 73, 84, 91, 101], "doe": [3, 5, 7, 10, 12, 16, 19, 21, 25, 28, 31, 33, 35, 36, 38, 39, 40, 43, 53, 
57, 60, 61, 62, 67, 74, 77, 80, 85, 87, 88, 89, 91, 97, 100, 101, 102], "yourself": [3, 17, 31, 34, 57, 61], "betwe": [3, 85], "sai": [3, 31, 38, 46, 57, 61, 62, 74, 76, 80, 81, 89, 91], "gabaerg": 3, "glutamaterg": 3, "situat": [3, 12, 16, 73, 74, 77, 84, 91, 101], "someth": [3, 21, 31, 33, 34, 35, 38, 39, 40, 57, 70, 76, 80, 88, 91, 97, 100, 101, 102], "don": [3, 11, 12, 20, 25, 31, 33, 34, 35, 36, 37, 40, 43, 54, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 97, 101], "relev": [3, 12, 19, 21, 34, 39, 60, 73, 74, 84, 88, 91, 94, 101], "repo": [3, 27, 28, 77, 87, 89, 94, 100, 102], "funkei": 3, "neuromatch_xai": 3, "osf": [3, 5, 7, 16, 17, 18, 21, 33, 34, 36, 39, 57, 65, 67, 69, 70, 73, 76, 77, 80, 84, 87, 89, 94, 100, 102], "vutn5": 3, "z": [3, 11, 12, 15, 18, 20, 27, 36, 40, 57, 60, 61, 67, 73, 76, 77, 80, 81, 82, 85, 91], "bytesio": [3, 11, 12, 15, 33, 36, 43, 73, 76, 77, 94, 100, 102], "domin": [3, 74], "either": [3, 5, 16, 21, 23, 25, 31, 39, 43, 57, 70, 73, 74, 76, 85, 88, 91, 94, 100, 101, 102], "format": [3, 5, 8, 11, 12, 21, 25, 27, 31, 33, 36, 38, 43, 57, 61, 62, 67, 69, 73, 76, 82, 84, 85, 87, 100, 102], "happi": [3, 11], "afterward": [3, 73], "prepare_dataset": 3, "uncom": [3, 7, 16, 25, 27, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 94, 100, 102], "procedur": [3, 39, 40, 64, 81, 82, 91, 94, 101], "lot": [3, 12, 21, 27, 31, 33, 35, 37, 39, 43, 57, 70, 73, 74, 80, 84, 88, 89, 91], "longer": [3, 7, 33, 39, 80, 85, 88, 91], "abov": [3, 5, 7, 15, 17, 27, 28, 33, 35, 36, 39, 52, 57, 60, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 89, 91, 94, 97, 100, 101, 102], "continu": [3, 12, 23, 25, 27, 31, 33, 52, 57, 60, 61, 62, 64, 70, 73, 80, 81, 84, 85, 87, 88, 91, 97, 100, 101, 102], "interrupt": [3, 43, 62], "kernel": [3, 54, 57, 80, 82, 100], "b": [3, 12, 21, 25, 33, 34, 36, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 85, 87, 88, 89, 91, 94, 100, 102], "data_dir": 3, 
"class_a": 3, "class_b": 3, "img_siz": 3, "checkpoints_dir": 3, "gaba_glutam": 3, "option": [3, 8, 10, 11, 17, 20, 23, 34, 35, 37, 39, 43, 54, 64, 67, 73, 80, 84, 85, 88, 89, 97, 100], "aspect_ratio": 3, "aux_checkpoint": 3, "default": [3, 17, 21, 25, 43, 57, 60, 61, 62, 64, 65, 67, 69, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "aux_downsample_factor": 3, "aux_input_nc": 3, "aux_input_s": 3, "aux_net": 3, "aux_output_class": 3, "crop_siz": 3, "dataroot": 3, "0_gaba_2_glutam": 3, "dataset_mod": 3, "atob": 3, "display_wins": 3, "latest": [3, 21], "gpu_id": 3, "init_gain": 3, "02": [3, 27, 33, 60, 61, 62, 73, 81, 94], "init_typ": 3, "input_nc": 3, "istrain": 3, "load_it": 3, "load_siz": 3, "max_dataset_s": 3, "inf": [3, 12, 20, 21, 28, 61, 67, 84, 97, 102], "model_suffix": 3, "_a": [3, 74], "n_layers_d": 3, "experiment_nam": 3, "ndf": 3, "netd": 3, "netg": 3, "resnet_9block": 3, "ngf": 3, "no_dropout": 3, "no_flip": 3, "norm": [3, 25, 35, 39, 40, 62, 64, 67, 77, 80, 84, 87, 89], "instanc": [3, 5, 8, 11, 12, 17, 26, 27, 28, 57, 60, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "ntest": [3, 33], "num_thread": 3, "output_nc": 3, "phase": [3, 18], "preprocess": [3, 5, 12, 15, 65, 67, 73, 76], "results_dir": [3, 15], "serial_batch": 3, "suffix": [3, 82], "verbos": [3, 12, 21, 27, 64, 65, 94, 100, 102], "singledataset": 3, "initi": [3, 4, 5, 11, 12, 17, 20, 21, 26, 27, 28, 31, 36, 43, 57, 60, 64, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 88, 94, 97, 100, 102], "testmodel": 3, "latest_net_g_a": 3, "pth": [3, 17, 21, 76, 82, 94, 100, 102], "resnetgener": 3, "reflectionpad2d": 3, "instancenorm2d": 3, "ep": [3, 17, 60, 64, 65, 80, 81, 82, 94], "affin": [3, 15, 17, 62, 67, 82, 94], "track_running_stat": [3, 94], "10": [3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 19, 20, 25, 27, 28, 31, 39, 40, 46, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 97, 100, 102], "resnetblock": 3, "conv_block": 3, "17": [3, 12, 21, 27, 28, 33, 46, 62, 70, 73, 
76, 81, 82, 84, 85, 87, 91, 94, 97, 100, 102], "18": [3, 5, 8, 12, 25, 27, 28, 33, 46, 70, 73, 76, 81, 82, 84, 85, 87, 94, 97, 100], "19": [3, 5, 12, 16, 27, 28, 33, 34, 46, 57, 62, 65, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 97, 102], "convtranspose2d": [3, 21, 80, 82], "output_pad": [3, 82], "20": [3, 5, 7, 12, 17, 18, 19, 20, 27, 28, 31, 33, 40, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 94, 97, 100, 102], "21": [3, 8, 21, 27, 33, 40, 43, 57, 62, 70, 73, 76, 77, 81, 82, 84, 85, 87, 94, 97, 100, 102], "22": [3, 12, 21, 27, 28, 33, 46, 57, 62, 65, 69, 70, 73, 76, 81, 82, 84, 85, 87, 94, 97, 100], "23": [3, 8, 25, 28, 33, 46, 57, 62, 65, 67, 70, 73, 76, 81, 82, 84, 85, 87, 94, 100, 102], "24": [3, 12, 21, 25, 27, 28, 33, 36, 46, 57, 62, 67, 70, 73, 76, 80, 81, 82, 85, 87, 94, 100], "26": [3, 5, 8, 17, 27, 33, 65, 67, 70, 73, 76, 81, 82, 84, 85, 88, 91, 97, 100], "27": [3, 5, 8, 27, 57, 60, 65, 67, 73, 76, 81, 82, 84, 85, 94, 97], "tanh": [3, 20, 27, 60, 80, 81, 100, 102], "g_a": 3, "366": [3, 76], "web": [3, 17, 76, 88, 97], "test_latest": 3, "process": [3, 11, 12, 16, 17, 19, 20, 23, 25, 26, 27, 31, 34, 35, 36, 37, 39, 40, 45, 46, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 82, 84, 89, 91, 94, 100, 101, 102], "0000": [3, 21, 57], "th": [3, 7, 17, 43, 62, 64, 67, 76, 80, 85], "traina": 3, "0_train": 3, "0005": [3, 7, 67], "10004_train": 3, "0010": [3, 67], "10009_train": 3, "0015": 3, "10013_train": 3, "0020": 3, "10018_train": 3, "0025": 3, "10022_train": 3, "0030": 3, "10027_train": 3, "0035": 3, "10031_train": 3, "0040": [3, 67], "10036_train": 3, "0045": [3, 67], "10040_train": 3, "0050": 3, "10045_train": 3, "0055": 3, "1004_train": 3, "0060": 3, "10054_train": 3, "0065": [3, 57], "10059_train": 3, "0070": [3, 67], "10063_train": 3, "0075": 3, "10068_train": 3, "0080": [3, 67], "10072_train": 3, "0085": 3, "10077_train": 3, "0090": 3, "10081_train": 3, "0095": 3, "10086_train": 3, "0100": [3, 67], "10090_train": 3, "0105": 3, 
"10095_train": 3, "0110": [3, 67], "1009_train": 3, "0115": [3, 67], "10103_train": 3, "0120": 3, "10108_train": 3, "0125": 3, "10112_train": 3, "0130": 3, "10117_train": 3, "0135": 3, "10121_train": 3, "0140": 3, "10126_train": 3, "0145": [3, 67], "10130_train": 3, "0150": 3, "10135_train": 3, "0155": [3, 67], "1013_train": 3, "0160": 3, "10144_train": 3, "0165": 3, "10149_train": 3, "0170": 3, "10153_train": 3, "0175": [3, 67], "10158_train": 3, "0180": [3, 67], "10162_train": 3, "0185": 3, "10167_train": 3, "0190": 3, "10171_train": 3, "0195": [3, 57], "10176_train": 3, "0200": 3, "10180_train": 3, "0205": [3, 67], "10185_train": 3, "0210": 3, "1018_train": 3, "0215": 3, "10194_train": 3, "0220": [3, 67], "10199_train": 3, "0225": 3, "10202_train": 3, "0230": 3, "10207_train": 3, "0235": 3, "10211_train": 3, "0240": [3, 67], "10216_train": 3, "0245": 3, "10220_train": 3, "0250": 3, "10225_train": 3, "0255": [3, 67], "1022_train": 3, "0260": 3, "10234_train": 3, "0265": [3, 67], "10239_train": 3, "0270": [3, 67], "10243_train": 3, "0275": 3, "10248_train": 3, "0280": 3, "10252_train": 3, "0285": 3, "10257_train": 3, "0290": [3, 67], "10261_train": 3, "0295": [3, 67], "10266_train": 3, "0300": 3, "10270_train": 3, "0305": 3, "10275_train": 3, "0310": 3, "1027_train": 3, "0315": 3, "10284_train": 3, "0320": [3, 67], "10289_train": 3, "0325": 3, "10293_train": 3, "0330": 3, "10298_train": 3, "0335": [3, 67], "10301_train": 3, "0340": 3, "10306_train": 3, "0345": 3, "10310_train": 3, "0350": 3, "10315_train": 3, "0355": 3, "1031_train": 3, "0360": 3, "10324_train": 3, "0365": 3, "10329_train": 3, "0370": [3, 67], "10333_train": 3, "0375": 3, "10338_train": 3, "0380": 3, "10342_train": 3, "0385": 3, "10347_train": 3, "0390": 3, "10351_train": 3, "0395": 3, "10356_train": 3, "0400": 3, "10360_train": 3, "0405": 3, "10365_train": 3, "0410": 3, "1036_train": 3, "0415": 3, "10374_train": 3, "0420": 3, "10379_train": 3, "0425": 3, "10383_train": 3, "0430": 3, 
"10388_train": 3, "0435": 3, "10392_train": 3, "0440": [3, 67], "10397_train": 3, "0445": [3, 57], "10400_train": 3, "0450": 3, "10405_train": 3, "0455": 3, "1040_train": 3, "0460": 3, "10414_train": 3, "0465": 3, "10419_train": 3, "0470": 3, "10423_train": 3, "0475": 3, "10428_train": 3, "0480": 3, "10432_train": 3, "0485": 3, "10437_train": 3, "0490": 3, "10441_train": 3, "0495": 3, "10446_train": 3, "sort": [3, 11, 12, 15, 17, 23, 25, 31, 33, 70, 76, 84, 85, 94, 100, 102], "much": [3, 8, 12, 16, 17, 20, 21, 28, 31, 33, 34, 35, 36, 38, 39, 40, 60, 62, 67, 70, 74, 77, 80, 82, 85, 88, 91, 94, 97], "fool": [3, 40, 88], "class_a_index": 3, "class_b_index": 3, "result_dir": 3, "classification_result": 3, "basenam": 3, "replac": [3, 4, 5, 8, 17, 25, 27, 28, 33, 43, 57, 60, 62, 67, 73, 74, 76, 80, 84, 85, 88], "_aux": 3, "kei": [3, 5, 7, 8, 11, 12, 15, 16, 18, 28, 31, 33, 34, 39, 40, 43, 57, 60, 61, 65, 67, 69, 70, 76, 80, 81, 82, 85, 87, 91, 100, 101, 102], "aux_real": 3, "aux_fak": 3, "top": [3, 5, 20, 27, 33, 35, 39, 43, 53, 54, 57, 60, 62, 64, 67, 73, 74, 76, 80, 84, 87, 88, 91, 94, 97, 100, 102], "real": [3, 5, 10, 20, 25, 31, 33, 35, 36, 39, 43, 57, 62, 67, 69, 73, 74, 77, 81, 84, 87, 88, 91, 94, 100, 101], "fake": [3, 10, 25, 80], "mind": [3, 5, 16, 21, 35, 37, 80], "show_pair": 3, "score_a": 3, "score_b": 3, "p": [3, 5, 7, 8, 11, 12, 15, 18, 21, 27, 33, 34, 36, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 88, 91, 97, 100, 101, 102], "str": [3, 15, 21, 25, 27, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "success": [3, 11, 61, 67, 69, 70, 74, 76, 82, 97, 101], "real_a": 3, "_real": 3, "fake_b": 3, "_fake": 3, "segment": [4, 5, 33, 64], "approach": [4, 5, 10, 11, 12, 17, 18, 25, 27, 28, 33, 34, 35, 36, 37, 39, 40, 60, 62, 67, 73, 74, 91, 100, 102], "denois": [4, 82], "noise2void": 4, "u": [4, 12, 15, 21, 25, 40, 57, 60, 62, 65, 67, 76, 80, 85, 88, 89, 102], "thing": [4, 
11, 12, 19, 20, 23, 31, 35, 43, 60, 62, 64, 74, 76, 80, 91, 101], "privaci": 4, "essenti": [4, 27, 33, 34, 35, 39, 40, 60, 80, 84, 97, 100], "particularli": [4, 27, 76, 81, 88, 101, 102], "feder": 4, "environ": [4, 26, 51, 53, 54, 57, 80, 97, 100, 101], "client": [4, 43], "train": [4, 5, 10, 12, 15, 21, 25, 26, 27, 31, 33, 34, 35, 36, 65, 74, 80, 81, 87, 88, 91, 97, 101], "model": [4, 5, 7, 10, 11, 12, 15, 18, 19, 20, 26, 27, 28, 46, 57, 60, 61, 62, 69, 74, 77, 87, 89, 97, 100, 101, 102], "without": [4, 16, 17, 25, 27, 31, 36, 38, 57, 61, 62, 64, 65, 67, 84, 85, 88, 89, 97, 101], "share": [4, 12, 21, 31, 46, 57, 67, 73, 80, 84, 85, 94], "privat": [4, 88], "adopt": [4, 73, 77], "method": [4, 10, 20, 27, 31, 34, 35, 38, 43, 60, 62, 64, 65, 69, 70, 73, 76, 85, 88, 97, 100, 101], "hide": [4, 81], "person": [4, 11, 23, 31, 53, 74, 77, 88, 91, 94], "while": [4, 5, 11, 12, 16, 19, 25, 27, 31, 33, 35, 39, 43, 51, 57, 60, 62, 67, 70, 73, 74, 76, 77, 80, 81, 85, 88, 94, 97, 100, 101], "collabor": [4, 44, 45, 53, 91], "updat": [4, 7, 8, 12, 17, 21, 25, 28, 31, 46, 60, 61, 69, 70, 73, 80, 81, 82, 84, 85, 94, 97, 100, 101, 102], "attack": [4, 76, 85], "retriev": [4, 36, 43, 62, 76, 84, 94], "exact": [4, 21, 35, 39, 43, 73, 80, 81, 100], "simpli": [4, 5, 33, 36, 40, 60, 62, 65, 67, 80, 81, 85, 100], "zhu": [4, 27], "et": [4, 11, 16, 27, 35, 40, 60, 62, 76, 81, 84, 91], "al": [4, 16, 27, 35, 40, 62, 76, 81, 84, 91], "project": [4, 5, 7, 8, 10, 11, 15, 17, 19, 20, 21, 25, 28, 34, 49, 53, 54, 57, 61, 67, 77, 80, 84, 89, 101], "task": [4, 5, 10, 11, 12, 16, 19, 21, 57, 60, 61, 62, 67, 73, 76, 84, 85, 87, 88, 91, 97, 101], "reimplement": [4, 69, 85], "wise": [4, 67, 70, 73, 80, 84], "were": [4, 15, 31, 33, 34, 35, 39, 67, 69, 73, 74, 76, 77, 80, 81, 84, 85, 88, 91, 94, 97, 100, 101, 102], "post": [4, 31, 43, 46, 69, 70, 76, 82, 84, 85, 88], "contrast": [4, 5, 11, 16, 28, 39, 62, 67, 73, 76, 80, 91], "problem": [4, 11, 12, 17, 19, 25, 26, 27, 31, 33, 36, 37, 39, 43, 57, 60, 62, 65, 
67, 69, 70, 73, 74, 76, 77, 80, 88, 91, 101], "vanilla": [4, 60, 87], "vs": [4, 7, 10, 16, 21, 25, 36, 38, 39, 40, 43, 69, 74, 84, 85, 94, 100, 101, 102], "gnn": 4, "deepwalk": 4, "embed": [4, 10, 11, 12, 44, 45, 57, 84, 85, 88, 91, 94], "molecul": [4, 74], "cnn": [4, 33, 57, 67, 74, 80, 82, 100], "cluster": [4, 57, 77, 85, 87], "ref1": 4, "ref2": 4, "openneuro": 4, "kaggl": [4, 5, 7, 16, 30, 51, 57, 85, 87], "visualdata": [4, 30], "joe": 5, "donovan": 5, "nma": [5, 17, 20, 23, 25, 31, 33, 35, 36, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "daili": 5, "guid": [5, 23, 27, 34, 37, 38, 43, 50, 61, 62, 67, 70, 88, 101], "deeplearn": [5, 31, 46, 52], "project_guid": [5, 46], "overal": [5, 31, 36, 38, 39, 65, 67, 69, 74, 77, 81, 85, 88, 94], "goal": [5, 11, 18, 19, 25, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 46, 57, 60, 62, 76, 84, 97, 100, 102], "about": [5, 7, 11, 12, 15, 19, 21, 23, 25, 26, 27, 31, 33, 34, 36, 37, 38, 39, 40, 43, 46, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 87, 88, 100, 101, 102], "potenti": [5, 8, 19, 23, 28, 35, 36, 39, 62, 64, 67, 74, 84, 85, 88, 91, 97, 100, 101], "larger": [5, 12, 17, 27, 40, 61, 62, 65, 67, 70, 73, 76, 82, 87, 88, 91, 100], "pretain": 5, "loss": [5, 7, 11, 12, 16, 17, 20, 21, 33, 57, 60, 62, 65, 70, 73, 74, 76, 77, 80, 84, 85, 87, 88, 102], "optim": [5, 7, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 39, 46, 57, 60, 61, 62, 64, 65, 69, 70, 73, 74, 80, 81, 82, 84, 87, 91, 97, 100, 101, 102], "torchsummari": 5, "ndimag": [5, 17], "geometri": [5, 81], "point": [5, 7, 11, 12, 17, 21, 23, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 57, 60, 61, 62, 65, 67, 69, 70, 73, 74, 77, 80, 84, 85, 87, 94, 97, 100, 101], "polygon": 5, "summari": [5, 34], "rotat": [5, 17, 33, 36, 57, 62, 65, 67, 70, 73, 76, 77, 91, 94, 100, 102], "subimag": 5, "unpack_bbox": 5, "bbox": 5, "coco": [5, 21], "centerx": 5, "centeri": 5, "width": [5, 15, 21, 27, 28, 35, 39, 43, 44, 45, 57, 61, 62, 64, 
73, 76, 77, 80, 81, 94, 97], "height": [5, 21, 27, 28, 39, 44, 45, 57, 73, 76, 77, 80, 94, 97], "theta": [5, 17, 40, 81, 82], "radian": 5, "rot_cent": 5, "pi": [5, 17, 27, 60, 64, 65, 74, 81, 82, 94, 97, 100, 102], "rotcorners_from_coord": 5, "co": [5, 17, 27, 30, 60, 64, 65, 80, 81, 82, 84, 85, 88], "sin": [5, 17, 27, 60, 64, 65, 80, 81, 82, 84], "wvec": 5, "dot": [5, 39, 40, 57, 60, 61, 62, 77, 81, 87, 89, 94], "hvec": 5, "corner_point": 5, "rotbbox_from_coord": 5, "rot_bbox": 5, "min": [5, 16, 17, 18, 28, 31, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 85, 87, 91, 94, 97, 100, 101, 102], "constrain": [5, 37, 89], "insid": [5, 31, 37, 43, 73], "extract_subimg_bbox": 5, "im": [5, 7, 12, 16, 18, 21, 28, 80], "extract_subimg": 5, "subimg": 5, "rotated_im": 5, "degre": [5, 16, 65, 73, 74, 87, 94, 100, 102], "180": [5, 76, 94], "newcent": 5, "drop": [5, 33, 36, 52, 54, 62, 67, 70, 76, 80, 94], "hardwar": [5, 20, 28, 57, 60, 61, 62, 64, 65, 76, 77, 100], "acceler": [5, 20, 28, 35, 36, 39, 40, 54, 57, 60, 61, 62, 64, 65, 74, 77, 82, 85, 88, 100], "dropdown": [5, 57, 60, 61, 62, 64, 65, 67, 80, 100], "disabl": [5, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 87, 89, 94, 100], "rcparam": [5, 20, 69, 70, 81], "gridspec": [5, 62], "plt_transform": 5, "font": [5, 20, 35, 36, 43, 87], "spine": [5, 20, 33, 35, 39, 73, 80], "right": [5, 7, 11, 15, 16, 19, 20, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 43, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 84, 87, 94, 97, 101], "autolayout": [5, 20], "properli": [5, 35, 36, 80, 82, 84, 94, 97], "dataset": [5, 11, 16, 17, 21, 23, 30, 31, 33, 35, 36, 38, 39, 61, 62, 76, 81, 82, 89, 91, 100, 101, 102], "took": [5, 33, 36], "minut": [5, 16, 23, 28, 31, 33, 43, 67, 69, 73, 76, 81, 82, 87, 89, 100], "me": [5, 11, 12, 65, 88], "mvtec": 5, "compani": [5, 43, 82, 88], "tarfil": [5, 21, 67, 73, 76, 80, 84, 85], "ruca6": 5, "tarnam": 5, "mvtec_screws_v1": 5, "isfil": [5, 7, 8, 16, 17, 100, 102], "fd": [5, 73, 
76, 80, 84, 85], "complet": [5, 15, 16, 28, 31, 35, 37, 46, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 87, 88, 89, 91, 97, 100, 101, 102], "unpack": 5, "datafil": 5, "datapath": 5, "screwdata": 5, "folder": [5, 7, 16, 17, 21, 43, 57, 73, 76, 100, 102], "full": [5, 6, 11, 13, 15, 22, 23, 27, 28, 29, 31, 33, 36, 39, 40, 61, 62, 85, 87, 101], "listdir": [5, 7, 16, 73, 76, 100, 102], "mvtec_screws_train": 5, "mvtec_screw": 5, "hdict": 5, "mvtec_screws_split": 5, "mvtec_screws_v": 5, "readme_v1": 5, "txt": [5, 11, 43], "mvtec_screws_test": 5, "There": [5, 12, 17, 21, 23, 26, 31, 33, 34, 35, 36, 38, 39, 40, 43, 57, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101], "readm": 5, "file_cont": 5, "v1": [5, 15, 16, 18, 19, 27, 28, 87, 88, 89], "author": [5, 11, 40, 65, 88, 101], "gmbh": 5, "juli": [5, 57], "2020": [5, 16, 27, 34, 57, 81, 91], "halcon": 5, "licens": [5, 88], "annot": [5, 19, 21, 77, 84, 87, 94], "creativ": [5, 23, 84, 88], "attribut": [5, 28, 57, 60, 62, 65, 67, 84, 85, 88], "noncommerci": 5, "sharealik": 5, "intern": [5, 11, 15, 37, 62, 67, 84, 87, 91, 101], "cc": [5, 17, 64, 87, 89], "BY": [5, 17, 85], "nc": [5, 21], "sa": 5, "creativecommon": 5, "fall": [5, 17, 33, 36, 88], "commerci": [5, 15], "claus": 5, "contact": [5, 23, 27], "scientif": [5, 19, 31, 33, 34, 35, 39, 40, 57], "cite": [5, 27, 88], "marku": 5, "ulrich": 5, "patrick": [5, 77], "follmann": 5, "hendrik": [5, 57], "neudeck": 5, "comparison": [5, 35, 38, 40, 73, 94, 101], "technisch": 5, "messen": 5, "2019": [5, 15, 34, 35, 40, 77], "doi": [5, 34, 91], "1515": [5, 80, 94], "teme": 5, "0076": 5, "384": [5, 16, 76], "nut": [5, 84], "wooden": [5, 76], "categori": [5, 11, 15, 19, 73, 84, 87, 88], "4426": 5, "exemplari": 5, "mention": [5, 15, 64, 65, 74, 81, 91, 94], "public": [5, 11, 16, 30, 43, 81, 88], "approxim": [5, 16, 17, 23, 31, 62, 69, 73, 80, 81, 82, 94, 100], "within": [5, 31, 33, 34, 39, 40, 57, 61, 65, 67, 70, 73, 84, 85, 91, 94], 
"val": [5, 7, 18, 21, 65, 69, 70, 76, 87], "examplari": 5, "dldataset": 5, "unsplit": 5, "usag": [5, 28, 57, 73, 84, 85], "read_dict": 5, "path_to_mvtec_screw": 5, "locat": [5, 21, 28, 67, 74, 80, 88], "path_to_images_fold": 5, "set_dict_tupl": 5, "image_dir": [5, 77], "write_dict": 5, "subpixel": 5, "precis": [5, 12, 17, 21, 23, 25, 28, 35, 36, 37, 38, 73, 94, 97], "center": [5, 36, 43, 67, 73, 76, 77, 80, 85, 94], "coordin": [5, 27, 28, 33, 48, 57, 62, 70, 80], "system": [5, 16, 25, 35, 37, 38, 40, 43, 57, 61, 74, 76, 77, 85, 87, 88, 89, 91, 97, 101], "left": [5, 11, 17, 26, 27, 31, 33, 34, 35, 36, 39, 40, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 84, 94, 97], "corner": [5, 67, 97], "when": [5, 8, 10, 12, 17, 19, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 46, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 97, 100, 101, 102], "convert": [5, 7, 12, 16, 18, 21, 33, 39, 43, 57, 62, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 91, 97], "similar": [5, 8, 10, 16, 19, 21, 23, 27, 33, 35, 36, 39, 40, 43, 57, 60, 65, 67, 70, 73, 74, 80, 81, 82, 84, 85, 88, 89, 91, 97], "cocodataset": [5, 21], "row": [5, 15, 43, 57, 67, 73, 80, 84, 89, 94, 97, 100, 102], "col": 5, "phi": [5, 80], "vertic": [5, 20, 27, 62, 67, 70, 73, 91], "axi": [5, 11, 15, 16, 17, 21, 27, 28, 35, 39, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 85, 88, 89, 97], "column": [5, 15, 40, 57, 62, 67, 73, 81, 84, 85, 94], "horizont": [5, 17, 27, 65, 67, 70, 73, 76, 91], "parallel": [5, 34, 57, 76, 87, 89], "perpendicular": 5, "mathemat": [5, 33, 37, 38, 39, 40, 57, 61, 64, 65, 67, 76, 81, 84], "posit": [5, 12, 17, 21, 27, 31, 35, 39, 57, 67, 73, 74, 76, 77, 81, 82, 87, 94, 97, 100, 102], "sens": [5, 15, 31, 33, 35, 36, 39, 40, 64, 88, 94, 100], "toward": [5, 16, 31, 60, 84, 88, 101], "side": [5, 7, 27, 33, 36, 57, 62, 70, 73, 74, 80, 82, 94, 100], "bottom": [5, 43, 48, 57, 67, 73, 76, 77, 87, 91, 94, 97], "alwai": [5, 11, 25, 31, 33, 
38, 39, 40, 57, 60, 61, 62, 69, 87, 88, 91, 100], "semi": [5, 17, 76, 81], "henc": [5, 20, 57, 60, 64, 65, 67, 69, 70, 73, 85], "shift": [5, 11, 25, 64, 73, 85, 91, 94], "metadata": [5, 27, 28, 57], "join": [5, 7, 11, 15, 16, 21, 25, 31, 84, 85, 94, 97, 100, 102], "dict_kei": 5, "info": [5, 27, 43, 65, 70, 84, 100, 102], "file_nam": [5, 21], "screws_001": 5, "1440": 5, "1920": 5, "id": [5, 11, 12, 15, 21, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "area": [5, 16, 17, 27, 31, 73, 81, 88, 94, 97], "3440": 5, "97": [5, 8, 16, 21, 28, 76, 94], "184": [5, 76, 94], "876": [5, 76, 84], "313": [5, 76], "55": [5, 27, 61, 62, 65, 67, 76, 84, 94], "5631": 5, "category_id": 5, "1001": [5, 40, 82], "image_id": 5, "is_crowd": 5, "map": [5, 12, 15, 17, 21, 25, 33, 36, 40, 57, 61, 62, 64, 65, 67, 73, 77, 80, 82, 84, 85, 87, 88, 97, 101], "imgdir": 5, "remap": 5, "imgdict": 5, "collect": [5, 8, 11, 12, 21, 25, 28, 31, 33, 36, 39, 62, 67, 73, 76, 85, 87, 91, 94, 100, 101, 102], "defaultdict": 5, "annodict": 5, "ncategori": 5, "cat_id": 5, "category_nam": 5, "wood": [5, 76, 101], "lag": 5, "bolt": 5, "black": [5, 57, 67, 73, 76, 77, 81, 94, 100], "oxid": 5, "shini": 5, "short": [5, 7, 19, 23, 31, 34, 57, 74, 76, 88], "long": [5, 11, 15, 16, 17, 27, 31, 33, 35, 57, 62, 64, 65, 73, 76, 81, 82, 84, 85, 88], "machin": [5, 8, 10, 15, 28, 30, 43, 60, 67, 70, 74, 76, 84, 88, 91, 101], "associ": [5, 15, 40, 60, 61, 62, 65, 67, 73, 82, 84, 100, 102], "imageid": 5, "gs": [5, 61, 62], "width_ratio": [5, 62], "wspace": 5, "cmap_norm": 5, "scatter": [5, 21, 35, 39, 57, 60, 61, 64, 65, 67, 69, 70, 76, 77, 80, 81, 87], "color": [5, 16, 21, 28, 39, 57, 60, 61, 62, 67, 69, 70, 73, 77, 81, 85, 87, 91, 94, 100, 103], "cm": [5, 21, 33, 57, 61, 62, 64, 67, 80, 87], "jet": [5, 15, 21, 64, 65], "rect": 5, "rectangl": [5, 73], "linewidth": [5, 35, 39, 61, 62, 65, 67, 69, 70], "edgecolor": 5, "facecolor": 5, "affine2d": 5, "rotate_around": 5, 
"set_transform": 5, "gca": [5, 60, 61, 62, 80, 81], "transdata": 5, "add_patch": [5, 73], "off": [5, 12, 16, 17, 21, 28, 31, 43, 60, 61, 64, 65, 67, 73, 77, 80, 82, 88, 91, 102], "colorbar": [5, 7, 15, 20, 60, 61, 62, 77, 97], "tick": [5, 39, 62, 73, 76, 81], "clim": 5, "cat_imgdict": 5, "img_id": 5, "k": [5, 8, 16, 17, 20, 21, 27, 34, 36, 40, 61, 62, 64, 65, 73, 77, 80, 81, 85, 87, 88, 91, 94, 100, 102], "v": [5, 8, 36, 40, 43, 62, 64, 67, 76, 80, 85, 88, 89, 97, 100, 102], "string": [5, 21, 25, 27, 43, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 88, 94, 100, 102], "neat": [5, 17, 57], "realpython": 5, "365": [5, 76], "317": [5, 76], "314": [5, 28, 76, 88], "367": [5, 76], "393": [5, 76], "387": [5, 76], "315": [5, 76], "320": [5, 76, 82], "346": [5, 76], "347": [5, 76], "322": [5, 76], "321": [5, 27, 76], "catid": 5, "num_exampl": 5, "suptitl": [5, 16, 25, 80, 81], "hetergogen": 5, "throughout": [5, 17, 35, 43, 57, 62, 67, 70, 73, 94, 97, 100, 102], "simpler": [5, 25, 70, 73, 80], "whether": [5, 17, 25, 31, 33, 35, 36, 39, 40, 57, 62, 65, 70, 73, 84, 85, 94, 102], "blank": 5, "use_categori": 5, "smaller": [5, 8, 12, 16, 17, 20, 21, 27, 60, 61, 62, 65, 67, 69, 70, 77, 80, 88, 91, 94, 97, 101], "patch_siz": 5, "num_patches_per_categori": 5, "nut_patch": 5, "blank_patch": 5, "until": [5, 65, 67, 69, 70, 73, 85, 88, 97], "suitabl": [5, 12, 26, 43, 57, 67, 82], "found": [5, 8, 16, 17, 21, 26, 31, 54, 57, 64, 70, 73, 81, 84, 85, 89, 94, 97, 100, 102], "imgid": 5, "imgobj": 5, "place": [5, 21, 30, 37, 38, 62, 67, 73, 76, 80, 85, 88, 100], "half": [5, 15, 23, 39, 57, 73, 76], "edg": [5, 21, 65, 88, 94, 102], "rand_cent": 5, "intersect": [5, 17], "rand_patch": 5, "todo": [5, 57, 67, 81, 82, 85, 100, 102], "seem": [5, 16, 31, 33, 35, 39, 40, 43, 61, 62, 74, 76, 80, 88, 89], "like": [5, 7, 11, 12, 16, 17, 18, 19, 20, 21, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 97, 
101], "rare": [5, 61, 91, 101], "aren": [5, 11, 67, 81, 85], "fulli": [5, 7, 12, 17, 28, 57, 60, 65, 69, 70, 76, 80, 82, 94, 100, 101], "miss": [5, 12, 31, 35, 36, 37, 52, 57, 60, 64, 65, 69, 70, 73, 76, 80, 84, 85, 94], "could": [5, 8, 11, 17, 21, 25, 27, 31, 33, 35, 36, 37, 38, 39, 40, 46, 57, 61, 62, 64, 65, 69, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 91, 100, 101], "cifar": [5, 57, 80], "patch_label": 5, "all_patch": 5, "concat": 5, "shuffle_idx": 5, "immedi": [5, 27, 31, 35, 39, 100], "jump": [5, 11, 73, 85], "often": [5, 17, 21, 23, 31, 35, 36, 37, 38, 39, 40, 57, 60, 61, 62, 64, 70, 73, 74, 76, 80, 81, 100], "dimension": [5, 18, 31, 57, 70, 73, 74, 76, 77, 80, 81, 82, 87, 88], "485": [5, 18, 43, 76], "456": [5, 18, 43, 76, 80], "406": [5, 8, 18, 43, 76], "229": [5, 12, 18, 43, 76], "224": [5, 17, 18, 43, 76], "225": [5, 8, 18, 43, 76], "train_frac": 5, "train_numb": 5, "test_nuumb": 5, "train_patch": 5, "train_label": [5, 17, 33, 36, 84, 87], "test_patch": 5, "test_label": [5, 33, 36, 84], "permut": [5, 16, 17, 57, 69, 70, 76, 77, 80, 82], "simplescrewnet": 5, "leakyrelu": [5, 64, 65], "flatten": [5, 7, 16, 18, 25, 57, 67, 73, 80, 94, 102], "1024": [5, 82, 100, 102], "pass": [5, 11, 16, 17, 18, 20, 21, 25, 28, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "inspect": [5, 7, 54, 57, 60, 67, 80], "snet": 5, "368": [5, 76], "30": [5, 7, 8, 17, 27, 28, 31, 46, 57, 60, 61, 62, 65, 67, 70, 73, 76, 81, 82, 84, 85, 87, 94, 97, 100, 102], "832": [5, 76], "264": [5, 76], "600": [5, 20, 61, 67, 76, 94], "130": [5, 27, 65, 67, 76, 88], "132": [5, 76], "194": [5, 8, 76], "trainabl": [5, 8, 12, 60, 65, 67, 73, 81, 82], "non": [5, 11, 15, 25, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 88, 89, 91, 94, 100, 102], "mb": [5, 27, 28], "48": [5, 8, 27, 33, 36, 57, 62, 70, 76, 84, 85, 87, 94, 100], "estim": [5, 10, 12, 17, 33, 35, 37, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 85, 87, 
91, 94, 97, 100, 101, 102], "loss_fn": [5, 67, 80, 81, 82, 84], "000001": 5, "test_correct": 5, "lbl": [5, 17], "float": [5, 12, 16, 17, 18, 20, 21, 25, 28, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 81, 84, 85, 94, 97, 102], "unsqueez": [5, 16, 17, 21, 43, 57, 64, 65, 73, 76, 80, 84, 94], "as_tensor": [5, 80], "2f": [5, 16, 28, 33, 61, 64, 65, 69, 73, 87], "train_ds_load": 5, "losss": 5, "37": [5, 12, 33, 35, 39, 61, 62, 73, 76, 80, 81, 82, 84, 85, 97, 100], "973": [5, 76], "380": [5, 76], "49": [5, 7, 8, 12, 27, 70, 76, 77, 84, 87, 94, 100], "197": [5, 76], "818": [5, 76], "270": [5, 76], "161": [5, 27, 76, 80], "713": [5, 76], "547": [5, 76], "137": [5, 76], "508": [5, 76], "38": [5, 33, 67, 73, 76, 81, 82, 85, 100], "970": [5, 76], "993": [5, 76], "calcul": [5, 7, 8, 12, 15, 28, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 84, 85, 97, 100, 102], "984": [5, 76], "text": [5, 10, 12, 17, 19, 25, 33, 36, 39, 40, 43, 46, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "anamol": 5, "ruruamour": 5, "snippet": [5, 62, 73, 88], "dirnam": 5, "file_path": 5, "empti": [5, 25, 27, 39, 43, 57, 61, 65, 67, 69, 70, 73, 81, 85, 97, 101], "fp": [5, 17, 27, 69, 70, 81], "api": [5, 7, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "token": [5, 7, 11, 12, 43, 82, 85], "upper": [5, 7, 67, 81, 85, 102], "api_token": [5, 7, 88], "usernam": [5, 7], "enter": [5, 27, 54, 73], "dump": [5, 15], "chnage": 5, "permiss": 5, "chmod": 5, "far": [5, 12, 35, 39, 60, 62, 67, 73, 74, 76, 77, 81, 82, 88, 101], "do": [5, 7, 8, 12, 17, 21, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 97, 100, 101, 102], "classifi": [5, 16, 18, 33, 35, 36, 39, 43, 64, 65, 67, 70, 73, 76, 77, 84, 85, 89, 91], "same": [5, 7, 10, 11, 12, 16, 17, 25, 27, 28, 31, 33, 
35, 36, 38, 39, 43, 53, 57, 60, 62, 64, 67, 70, 73, 74, 76, 77, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101], "configur": [5, 27, 28, 43, 60, 67, 73, 88, 100, 102], "remind": [5, 35, 76, 84, 100], "nice": [5, 31, 38, 67, 82, 88, 100], "world": [5, 15, 16, 27, 39, 43, 69, 73, 74, 76, 84, 87, 88, 91, 94, 101], "machinelearningmasteri": 5, "captur": [5, 33, 35, 36, 64, 84, 87, 91], "won": [5, 12, 25, 28, 31, 64, 67, 73, 74, 85, 91, 94, 100, 101, 102], "enough": [5, 27, 31, 33, 34, 35, 37, 38, 39, 43, 57, 62, 64, 69, 82, 87], "yolo": 5, "algorithm": [5, 10, 11, 17, 21, 33, 35, 36, 39, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "keep": [5, 7, 20, 21, 25, 28, 31, 33, 36, 37, 38, 43, 57, 60, 61, 62, 64, 67, 69, 70, 73, 80, 87, 88, 91, 94], "skill": [5, 23, 31, 35, 91], "program": [5, 57, 73, 88, 97, 101], "debug": [5, 21, 25, 38, 43, 100, 102], "intermedi": [5, 18, 21, 31, 60, 67, 76], "give": [5, 17, 23, 31, 33, 34, 36, 39, 40, 57, 62, 65, 70, 73, 74, 76, 80, 81, 82, 85, 88, 91, 97, 100], "w1d2": [5, 35, 67], "standard": [5, 8, 17, 21, 25, 27, 36, 38, 40, 43, 46, 57, 61, 62, 64, 65, 67, 70, 76, 80, 81, 82, 88, 91, 97, 100, 101], "draw": [5, 20, 37, 38, 39, 80, 81, 94, 97, 100], "doesn": [5, 12, 25, 28, 31, 33, 36, 39, 40, 69, 84, 85, 87, 88, 97], "handl": [5, 11, 12, 16, 21, 25, 26, 43, 57, 67, 73, 80, 81, 84, 85, 88, 94], "elegantli": 5, "sever": [5, 11, 12, 17, 19, 28, 39, 52, 67, 73, 74, 76, 85, 87, 91], "extend": [5, 15, 43, 62, 74, 80, 88, 101], "produc": [5, 7, 11, 17, 25, 35, 38, 39, 67, 73, 80, 82, 84, 85, 87, 88, 100, 102], "form": [5, 11, 15, 31, 36, 39, 43, 52, 62, 64, 69, 73, 74, 80, 81, 82, 84, 85, 97, 100, 101, 102], "supervis": [5, 46, 76, 84, 100, 101], "incomplet": [5, 57], "unsupervis": [5, 46, 80, 94], "group": [5, 12, 23, 31, 33, 36, 39, 46, 53, 67, 70, 74, 76, 77, 80, 81, 82, 87, 91, 100, 101], "classic": [5, 25, 36, 39, 80, 88], "sklearn": [5, 11, 12, 16, 33, 35, 39, 57, 77, 84, 85, 87], 
"yolo3": 5, "minim": [5, 27, 28, 33, 43, 60, 62, 67, 70, 74, 80, 81, 84, 89, 91, 94, 100], "yolov5": 5, "detetectron2": 5, "yolov4": 5, "less": [5, 16, 21, 27, 31, 33, 36, 40, 62, 67, 69, 70, 73, 74, 76, 80, 82, 87, 88, 94], "readabl": 5, "complic": [5, 28, 31, 43, 57, 80], "framework": [5, 19, 27, 40, 43, 60, 65, 81, 85, 97, 101], "3d": [5, 17, 33, 61, 67, 73, 84, 87, 101], "cad": 5, "en": [5, 11, 21, 27, 43, 87, 89], "click": [6, 13, 15, 22, 29, 34, 35, 36, 37, 38, 43, 48, 50, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100, 101, 102], "imag": [6, 7, 8, 13, 15, 17, 19, 21, 22, 25, 27, 29, 30, 31, 64, 65, 69, 70, 74, 81, 84, 85, 91, 103], "browser": [6, 7, 13, 22, 28, 29, 53], "beatrix": 7, "benko": 7, "lina": 7, "teichmann": 7, "audiofil": 7, "part": [7, 15, 17, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 52, 57, 60, 62, 64, 73, 77, 80, 81, 82, 84, 85, 88, 94, 97, 102], "second": [7, 15, 17, 23, 27, 28, 31, 33, 35, 36, 39, 40, 57, 60, 61, 62, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 87, 91, 94, 97], "genr": 7, "link": [7, 21, 23, 28, 36, 39, 43, 46, 48, 54, 57, 73, 76, 77, 85], "harder": [7, 39, 77], "idea": [7, 11, 12, 21, 23, 31, 33, 35, 36, 37, 39, 43, 57, 60, 62, 64, 65, 67, 69, 73, 76, 80, 81, 82, 84, 91, 97, 100, 102], "fun": [7, 65, 101], "benk\u0151": 7, "towardsdatasci": 7, "rwightman": [7, 30], "blob": [7, 17, 21, 27, 76], "master": [7, 17, 21, 27, 43, 97], "timm": 7, "vision_transform": 7, "py": [7, 16, 17, 18, 21, 27, 28, 43, 65, 69, 70, 76, 80, 81, 82, 85, 87, 94, 100], "kamalesh0406": 7, "audio": [7, 19, 44, 45], "zcacer": 7, "spec_aug": 7, "musicinformationretriev": 7, "ipython_audio": 7, "sudo": 7, "apt": [7, 21, 27, 28], "ffmpeg": [7, 27, 28, 69, 70, 81], "librosa": 7, "imageio": [7, 28, 57, 69, 70], "packag": [7, 25, 28, 57, 60, 65, 69, 70, 80, 81, 82, 85, 87, 89, 94], "build": [7, 16, 17, 18, 27, 28, 31, 35, 38, 39, 40, 43, 60, 62, 65, 69, 70, 73, 74, 76, 77, 82, 85, 87, 88, 91, 94, 
100, 101, 102], "tree": [7, 76, 87], "newest": 7, "0ubuntu0": 7, "automat": [7, 33, 40, 43, 57, 60, 61, 67, 73, 76, 85, 89, 91, 100], "requir": [7, 17, 21, 25, 27, 28, 31, 34, 35, 37, 39, 43, 46, 54, 57, 60, 62, 67, 69, 70, 73, 74, 76, 77, 80, 85, 87, 88, 94, 97, 101], "libnvidia": 7, "460": [7, 76, 80], "autoremov": 7, "upgrad": [7, 25, 28, 43, 84, 87], "newli": [7, 54, 60, 67, 100], "necessari": [7, 17, 21, 27, 28, 31, 36, 37, 52, 61, 62, 64, 67, 73, 77, 85, 97, 100, 101], "shutil": [7, 16, 28, 73, 94, 100, 102], "ipython": [7, 15, 25, 27, 28, 35, 36, 39, 44, 45, 57, 64, 65, 69, 70, 73, 76, 81, 94], "displai": [7, 15, 25, 27, 28, 33, 35, 36, 38, 39, 43, 44, 45, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 81, 82, 84, 85, 94, 97, 100, 102], "drjhb": 7, "except": [7, 8, 16, 17, 21, 23, 28, 31, 46, 52, 60, 62, 65, 76, 84, 94], "connectionerror": [7, 16, 17], "fail": [7, 16, 17, 18, 21, 33, 35, 36, 38, 69, 74, 76, 77, 85], "status_cod": [7, 16, 17, 33, 36], "ok": [7, 11, 16, 17, 31, 33, 35, 36, 40, 80], "fid": [7, 16, 17], "dowload": [7, 80], "It": [7, 11, 12, 16, 17, 20, 21, 25, 26, 27, 31, 33, 36, 38, 39, 40, 43, 54, 57, 60, 62, 67, 70, 73, 74, 76, 77, 80, 81, 84, 87, 88, 89, 94, 97, 100, 101, 102], "johnsmith": 7, "123a123a123": 7, "zipobj": [7, 87, 89], "waveform": 7, "Then": [7, 10, 17, 21, 31, 34, 35, 37, 38, 43, 57, 62, 64, 65, 69, 73, 77, 81, 84, 85, 87, 89, 91], "sound": [7, 31, 76, 88], "wave": [7, 33], "sample_path": 7, "genres_origin": 7, "jazz": 7, "00000": 7, "wav": 7, "listen": [7, 19, 31, 44, 45, 74], "support": [7, 12, 28, 31, 57, 67, 85, 97], "element": [7, 38, 39, 43, 57, 61, 62, 64, 67, 69, 70, 73, 81, 85, 88, 94], "sample_r": 7, "khz": 7, "waveplot": 7, "sr": 7, "fontsiz": [7, 28, 61, 62, 67, 77], "00924683": 7, "01177979": 7, "01370239": 7, "0071106": 7, "00561523": 7, "661794": 7, "22050": 7, "013333333333332": 7, "fourier": 7, "stft": 7, "ab": [7, 16, 27, 28, 35, 39, 64, 65, 67, 76, 101], "n_fft": 7, "2048": 7, "hop_length": 7, "object": [7, 15, 
31, 36, 39, 40], "amplitud": [7, 15, 18, 36, 40], "decibel": 7, "scale": [7, 16, 17, 18, 20, 27, 37, 40, 57, 60, 61, 62, 64, 65, 67, 73, 74, 76, 77, 81, 82, 84, 91, 94, 101], "db": [7, 61], "amplitude_to_db": 7, "ref": [7, 80], "spectogram": 7, "specshow": 7, "x_axi": 7, "y_axi": 7, "log": [7, 20, 25, 27, 28, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "1025": 7, "1293": 7, "mel": 7, "sclae": 7, "intead": 7, "pitch": [7, 28, 31, 33, 36, 88], "judg": [7, 39, 76], "equal": [7, 27, 28, 39, 57, 62, 65, 67, 74, 80, 81, 84, 100, 102], "distanc": [7, 17, 18, 57, 67, 73, 74, 82, 85, 88, 89], "frequenc": [7, 28, 36, 40, 64, 67, 82, 87, 88], "measur": [7, 15, 17, 25, 33, 35, 36, 37, 62, 67, 69, 70, 76, 77, 80, 84, 87, 88, 89, 94, 100], "assign": [7, 16, 17, 23, 31, 64, 65, 85, 87, 97, 100, 102], "1000": [7, 15, 16, 20, 25, 28, 34, 40, 57, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 94, 100, 102], "hz": [7, 28, 88], "tone": 7, "40": [7, 11, 12, 27, 31, 33, 35, 39, 46, 57, 61, 67, 69, 70, 73, 76, 81, 85, 94, 100], "threshold": [7, 20, 36, 64, 65, 73, 88, 100, 102], "increasingli": 7, "interv": [7, 28, 35, 39, 61, 74, 81], "increment": [7, 57, 65, 67], "melspectrogram": 7, "s_db": 7, "img_path": [7, 77], "images_origin": 7, "jazz00000": 7, "interpol": [7, 17, 21, 33, 36], "nearest": [7, 17, 21, 73, 85, 101], "288": [7, 76], "432": [7, 76], "plot_loss_accuraci": [7, 73], "validation_loss": [7, 16, 73, 87], "validation_acc": [7, 73], "ax1": [7, 60, 61, 62, 69, 73], "ax2": [7, 60, 61, 62, 69, 73], "set_xlabel": [7, 40, 60, 61, 62, 67, 69, 70, 73, 94], "set_ylabel": [7, 40, 60, 61, 62, 67, 69, 70, 73, 94], "set_size_inch": [7, 73], "spectrograms_dir": 7, "folder_nam": 7, "train_dir": 7, "test_dir": 7, "val_dir": 7, "rmtree": [7, 16, 28, 94], "loop": [7, 11, 17, 21, 28, 36, 38, 43, 61, 62, 64, 65, 67, 84, 88, 97, 102], "src_file_path": 7, "recurs": [7, 37, 67, 102], "test_fil": 7, "val_fil": 7, "train_fil": 7, 
"destin": [7, 73, 87], "train_load": [7, 12, 33, 64, 65, 67, 69, 70, 73, 84], "val_dataset": 7, "val_load": [7, 67, 69, 70], "music_net": 7, "intit": 7, "in_channel": [7, 17, 21, 33, 73, 76, 80, 100, 102], "out_channel": [7, 17, 21, 33, 73, 76, 80, 100, 102], "conv4": [7, 18, 82, 100, 102], "conv5": [7, 18], "fc1": [7, 18, 33, 69, 70, 73, 76, 87, 100, 102], "in_featur": [7, 8, 12, 18, 57, 60, 67, 73, 76, 82, 100, 102], "9856": 7, "out_featur": [7, 12, 57, 60, 67, 73, 80, 82, 100, 102], "batchnorm1": 7, "num_featur": [7, 65, 94, 100, 102], "batchnorm2": 7, "batchnorm3": 7, "batchnorm4": 7, "batchnorm5": 7, "conv": [7, 17, 21, 73, 80, 82], "max_pool2d": [7, 73], "validation_load": [7, 73], "unit": [7, 17, 21, 28, 31, 33, 38, 39, 40, 57, 62, 64, 65, 67, 73, 74, 76, 80, 87, 94], "tepoch": [7, 73], "set_descript": [7, 73, 81, 82], "track": [7, 15, 18, 21, 27, 33, 57, 60, 62, 64, 69, 70, 73, 76, 84, 85, 94], "running_loss": [7, 18, 64, 65, 67, 73, 87], "zero": [7, 11, 12, 15, 17, 18, 20, 21, 27, 35, 39, 57, 60, 62, 64, 65, 67, 70, 73, 80, 81, 82, 84, 97, 100, 101, 102], "set_postfix": [7, 17, 21, 73, 100, 102], "One": [8, 10, 12, 17, 21, 25, 27, 31, 43, 62, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 87, 88, 94, 100], "desir": [8, 27, 57, 62, 64, 65, 67, 80, 88, 97, 101], "capabl": [8, 12, 25, 67, 101], "abil": [8, 62, 64, 73, 101], "knowledg": [8, 26, 27, 31, 36, 37, 74, 81, 88, 94], "domain": [8, 27, 57, 67, 73, 74, 76, 88, 91], "scarc": 8, "unfortun": 8, "recip": 8, "instead": [8, 12, 20, 25, 26, 27, 28, 33, 36, 43, 57, 62, 65, 67, 73, 74, 76, 77, 82, 84, 85, 88, 91, 94, 97, 101], "remain": [8, 57, 73, 88, 97, 101], "scenario": [8, 39, 65, 74, 91, 101], "gc": 8, "obtain": [8, 12, 57, 60, 64, 67, 69, 70, 73, 76, 80, 82, 87, 91, 94, 100], "variou": [8, 23, 25, 28, 33, 36, 57, 73, 74, 76, 81, 85, 87, 91, 97, 101], "max_epoch": [8, 67, 69, 70], "max_epochs_target": 8, "normalis": [8, 67, 73, 84], "substract": 8, "divid": [8, 43, 60, 61, 62, 73, 82, 88, 101], "deviat": [8, 21, 
36, 40, 61, 62, 64, 65, 80, 81, 82], "appli": [8, 11, 16, 17, 20, 26, 27, 28, 31, 34, 35, 39, 57, 60, 65, 67, 70, 73, 76, 77, 80, 82, 84, 85, 88, 97, 102], "_pretrain": [8, 27], "outmodelnam": 8, "748": [8, 76], "781": [8, 28, 76], "545": [8, 76, 94], "527999877929688": 8, "369999885559082": 8, "597": [8, 27, 76], "157": [8, 76], "438": [8, 76], "392000198364258": 8, "829999923706055": 8, "932": [8, 76], "34": [8, 70, 73, 76, 77, 81, 82, 84, 85, 97, 100], "450": [8, 40, 76, 94], "016000747680664": 8, "079999923706055": 8, "649": [8, 76], "35": [8, 27, 28, 33, 62, 64, 73, 76, 81, 82, 84, 85, 94, 97, 100], "134": [8, 76], "84000015258789": 8, "70000076293945": 8, "153": [8, 76], "41": [8, 27, 28, 57, 76, 94, 97, 100], "911": [8, 73, 76], "219": [8, 76], "63": [8, 16, 21, 27, 33, 76, 88, 94], "42": [8, 12, 16, 20, 28, 33, 70, 76, 80, 81, 84, 85, 97, 100], "827999114990234": 8, "43": [8, 16, 65, 73, 76, 84, 85, 87, 88, 94], "619998931884766": 8, "878": [8, 27, 76], "149": [8, 76], "87200164794922": 8, "45": [8, 12, 21, 46, 60, 61, 65, 67, 73, 76, 77, 81, 84, 85, 87, 94], "380001068115234": 8, "814": [8, 76], "66": [8, 33, 67, 76, 94], "847": [8, 73, 76], "875": [8, 73, 76], "59000015258789": 8, "310001373291016": 8, "514": [8, 76], "568": [8, 67, 76], "57": [8, 12, 73, 76, 84, 94], "35200119018555": 8, "209999084472656": 8, "403": [8, 76], "375": [8, 76], "61600112915039": 8, "20000076293945": 8, "124": [8, 69, 70, 76], "339": [8, 27, 76], "55400085449219": 8, "58": [8, 33, 65, 73, 76, 84, 94, 100], "900001525878906": 8, "656": [8, 76], "91999816894531": 8, "83000183105469": 8, "971": [8, 76], "491": [8, 76], "281": [8, 76], "05000305175781": 8, "560001373291016": 8, "028": 8, "358": [8, 76], "099998474121094": 8, "699": [8, 76], "299": [8, 76], "71": [8, 12, 15, 16, 21, 27, 28, 33, 76, 84, 94], "9739990234375": 8, "220001220703125": 8, "768": [8, 76], "182": [8, 76, 94], "74": [8, 12, 16, 21, 33, 67, 76], "95": [8, 21, 33, 73, 76, 94, 100, 102], "69400024414062": 8, 
"90999984741211": 8, "backbon": [8, 97], "del": [8, 57, 73], "again": [8, 11, 27, 28, 31, 43, 60, 61, 62, 69, 70, 73, 80, 94, 102], "0001": [8, 12, 82], "simul": [8, 25, 26, 31, 33, 34, 35, 37, 39, 40, 61, 65, 69, 70, 80, 81, 82, 85, 101, 102], "lower": [8, 11, 35, 39, 40, 61, 73, 80, 81, 85, 88, 100], "regim": [8, 40], "30000": [8, 12, 27, 67], "As": [8, 16, 19, 27, 28, 31, 35, 57, 61, 62, 64, 65, 67, 73, 76, 77, 80, 82, 88, 89, 94, 101], "previous": [8, 17, 43, 67, 73, 85, 94, 101], "checkpointpath": 8, "prefix": [8, 82, 84, 85], "elif": [8, 17, 21, 57, 61, 65, 69, 70, 73, 80, 85, 94, 97], "msg": 8, "strict": [8, 80], "rais": [8, 21, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 97, 100, 102], "No": [8, 20, 31, 36, 43, 46, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "9100": 8, "successfulli": [8, 18, 27, 94, 97], "usual": [8, 12, 20, 25, 31, 33, 35, 39, 40, 46, 57, 62, 70, 73, 74, 81, 88], "requires_grad": [8, 12, 60, 65, 67, 76, 80, 81, 82, 85], "num_ftr": [8, 18, 76], "sinc": [8, 20, 31, 33, 36, 38, 39, 40, 43, 53, 60, 61, 62, 64, 65, 67, 70, 73, 81, 82, 84, 85, 97, 102], "total_param": 8, "numel": [8, 57, 65, 73, 76, 80], "trainable_total_param": 8, "11173962": 8, "5130": 8, "finetun": [8, 82, 101], "235": [8, 67, 76, 80], "302": [8, 33, 43, 76], "630": [8, 76], "086669921875": 8, "757": [8, 76], "86000061035156": 8, "666": [8, 76], "640": [8, 76], "04000091552734": 8, "55000305175781": 8, "579": [8, 76], "577": [8, 76], "56999969482422": 8, "2300033569336": 8, "661": [8, 76], "613": [8, 76], "6866683959961": 8, "627": [8, 76], "469": [8, 76], "103": [8, 11, 21, 76, 88], "163330078125": 8, "37999725341797": 8, "602": [8, 76], "344": [8, 76], "99": [8, 12, 17, 21, 27, 57, 67, 69, 70, 73, 76, 82, 84, 97], "607": [8, 76], "42333221435547": 8, "02999877929688": 8, "537": [8, 76, 94], "608": [8, 76], "49333190917969": 8, "1500015258789": 8, "578": 
[8, 76], "650": [8, 76], "15333557128906": 8, "583": [8, 76], "66999816894531": 8, "20999908447266": 8, "819690": 8, "086670": 8, "713260": 8, "680940": 8, "860001": 8, "674431": 8, "540001": 8, "650245": 8, "040001": 8, "675883": 8, "550003": 8, "638555": 8, "570000": 8, "652776": 8, "230003": 8, "630500": 8, "686668": 8, "666428": 8, "449997": 8, "identifi": [10, 16, 20, 25, 31, 35, 36, 38, 70, 94], "quantifi": [10, 74, 76, 80, 84, 97], "word": [10, 12, 34, 36, 57, 67, 85, 88, 89, 91, 100, 101], "analyz": [10, 12, 27, 28, 31, 39, 40, 62, 65, 85], "sensibl": [10, 40], "rnn": [10, 12, 19, 57, 80, 84, 87, 88, 89, 101], "degrad": 10, "bag": [10, 12, 73, 76, 84], "pro": [10, 27, 65], "con": 10, "pre": [10, 19, 21, 25, 26, 33, 34, 36, 39, 57, 61, 67, 73, 76, 84, 87, 88, 91, 100, 101], "specif": [10, 15, 16, 17, 18, 25, 27, 28, 31, 33, 35, 37, 38, 39, 40, 43, 57, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 84, 85, 88, 89, 91, 94, 100, 101], "suggest": [10, 11, 12, 16, 28, 31, 40, 64, 65, 80, 85, 87, 94, 100], "corpu": [10, 12, 57, 85, 87, 88, 89], "cbow": 10, "let": [10, 12, 16, 17, 20, 21, 25, 27, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "toxic": 10, "wikipedia": [10, 88, 89], "detect": [10, 17, 40, 67, 77], "categor": [10, 16, 25, 57, 80], "comment": [10, 12, 34, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 80, 94], "manag": [10, 16, 67], "multiclass": 10, "translat": [10, 31, 64, 73, 74, 84, 88, 91, 94, 100, 101], "huggingfac": [10, 12, 30, 84, 85, 88], "analyticsindiamag": 10, "nlp": [10, 23, 30, 67, 101], "sourc": [10, 15, 17, 25, 26, 27, 28, 31, 43, 57, 67, 77, 82, 84, 85, 89, 94], "beginn": [10, 23, 33, 67, 84], "descript": [10, 19, 25, 27, 33, 36, 43, 61, 62, 73, 76, 80, 84, 85, 101], "q": [10, 20, 27, 57, 80, 85, 100, 101, 102], "juan": [11, 12], "manuel": [11, 12], "rodriguez": [11, 12], "salomei": [11, 12, 16], "osei": [11, 12, 16], "amita": [11, 12, 44], "kapoor": [11, 12, 44], 
"sequenc": [11, 12, 25, 28, 31, 57, 64, 65, 81, 84, 85, 87, 88, 89, 101], "transtlat": 11, "french": [11, 76, 89], "english": [11, 43, 76, 84, 85, 87, 89, 101], "math": [11, 27, 34, 36, 60, 62, 65, 73, 76, 82, 84, 85, 102], "unicodedata": 11, "zip_file_url": 11, "line": [11, 12, 27, 28, 33, 35, 36, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 102], "eng": 11, "fra": 11, "strip": 11, "va": [11, 67, 73, 77, 87, 100, 102], "cour": 11, "courez": 11, "wow": [11, 16], "\u00e7a": 11, "alor": 11, "fire": [11, 12, 27, 35, 39, 67, 74, 76, 101], "au": 11, "feu": 11, "\u00e0": 11, "aid": [11, 76, 77], "saut": 11, "stop": [11, 12, 26, 31, 35, 39, 43, 62, 67, 70, 73, 76, 82, 88], "suffit": 11, "arr\u00eat": 11, "toi": [11, 39, 40, 69, 70, 73, 76, 94], "represent": [11, 12, 15, 18, 21, 33, 34, 36, 43, 57, 65, 77, 80, 84, 85, 89, 100, 101, 102], "indix": 11, "three": [11, 16, 27, 28, 39, 57, 60, 61, 65, 67, 69, 74, 76, 77, 81, 84, 85, 87, 91, 94], "special": [11, 12, 20, 27, 67, 73, 80, 85, 87, 88], "Of": [11, 12, 33, 39, 67, 88], "sentenc": [11, 12, 31, 34, 84, 85, 87, 88, 89], "eo": 11, "fill": [11, 12, 17, 25, 46, 52, 57, 64, 65, 69, 70, 73, 76, 88, 94, 97, 100, 102], "sos_token": 11, "eos_token": [11, 88], "lang": [11, 43, 80], "word2index": 11, "word2count": 11, "index2word": 11, "n_word": 11, "addsent": 11, "addword": 11, "unicodetoascii": 11, "nfd": [11, 88], "mn": 11, "normalizestr": 11, "sub": [11, 15, 21, 25, 27, 33, 36, 43, 76, 84, 85], "za": 11, "readlang": 11, "lang1": 11, "lang2": 11, "encod": [11, 12, 15, 27, 57, 60, 76, 81, 82, 85, 87, 88, 89, 101], "utf": [11, 43, 82], "input_lang": 11, "output_lang": 11, "max_length": [11, 12, 84, 85, 88], "eng_prefix": 11, "am": [11, 12, 31, 46, 85, 94, 101], "she": [11, 85], "filterpair": 11, "startswith": [11, 87], "preparedata": 11, "trim": 11, "135842": 11, "10599": 11, "4346": 11, "2804": 11, "nou": 11, "somm": 11, "san": 11, "emploi": [11, 26, 65, 88, 94], "unemploi": 11, 
"plot_lang": 11, "top_k": 11, "count_occur": [11, 12], "accumul": [11, 12, 35, 36, 39, 40, 60, 64, 65, 67], "counter": [11, 12, 33, 36, 65, 87], "occurr": [11, 12, 40, 88], "bar": [11, 12, 18, 69, 70, 73, 74, 76, 91], "je": 11, "sui": 11, "est": [11, 73], "vou": 11, "pa": 11, "de": [11, 40, 60, 61, 62, 64, 76, 88], "il": 11, "tu": 11, "ne": 11, "es": [11, 100, 102], "un": [11, 80], "ell": 11, "la": 11, "tre": 11, "que": 11, "le": 11, "sont": 11, "j": [11, 17, 21, 27, 43, 61, 62, 64, 65, 67, 69, 73, 74, 80, 84, 85, 91, 97, 100, 102], "ai": [11, 21, 57, 76, 101], "pour": 11, "plu": [11, 33, 36, 67], "ce": [11, 64, 100], "vai": 11, "moi": 11, "mon": [11, 12, 46], "trop": 11, "fort": 11, "si": 11, "ici": 11, "du": 11, "toujour": 11, "tout": 11, "tou": 11, "vraiment": 11, "sur": 11, "te": 11, "dan": 11, "avec": 11, "avoir": 11, "encor": 11, "qu": 11, "tom": 11, "votr": 11, "peur": 11, "desol": 11, "bien": 11, "ca": [11, 39], "bon": 11, "fai": 11, "heureux": 11, "fair": [11, 38, 62, 73, 82, 84, 94, 100], "etr": 11, "son": 11, "aussi": 11, "assez": 11, "lui": 11, "tellement": 11, "ma": [11, 64, 85], "fatigu": 11, "par": [11, 67], "fait": 11, "ton": [11, 21, 43, 76], "se": 11, "mainten": 11, "grand": [11, 76], "desole": 11, "avon": 11, "allon": 11, "peu": 11, "deux": 11, "vieux": 11, "674188349067465": 11, "0371543427945": 11, "my": [11, 12, 21, 31, 34, 36, 37, 57], "too": [11, 12, 31, 33, 34, 35, 36, 37, 39, 40, 61, 65, 67, 69, 82, 85, 88, 91], "sorri": [11, 67], "glad": 11, "tire": 11, "afraid": [11, 37], "hi": [11, 12, 31, 77, 84, 87, 89, 100], "busi": [11, 15, 31, 84, 87], "still": [11, 12, 21, 25, 28, 33, 36, 40, 43, 60, 65, 70, 73, 81, 87, 88, 101], "old": [11, 27, 28, 57, 62, 76, 84, 85], "friend": [11, 101], "her": [11, 25, 84], "teacher": [11, 27], "him": [11, 91], "alon": [11, 38, 67, 73, 91], "being": [11, 12, 16, 25, 57, 60, 61, 65, 67, 70, 73, 74, 76, 77, 80, 84, 85, 87, 88, 91, 94, 100, 101, 102], "home": [11, 12, 43, 76, 94], "proud": 11, "man": [11, 12, 76, 
87], "marri": 11, "kind": [11, 12, 15, 17, 21, 25, 31, 33, 35, 39, 57, 60, 67, 69, 73, 80, 81, 82, 85, 88, 94, 101], "who": [11, 31, 52, 57, 82, 84, 85, 88, 101], "wait": [11, 31, 57, 69, 70, 76, 94, 100, 102], "young": [11, 84], "late": [11, 23, 31], "anymor": [11, 36, 43, 65], "hungri": [11, 88], "sick": [11, 87], "85540878257765": 11, "00500226665207": 11, "decod": [11, 15, 19, 27, 33, 35, 39, 43, 74, 82, 88], "condens": [11, 34], "explain": [11, 20, 21, 25, 33, 34, 35, 36, 38, 39, 40, 61, 62, 65, 67, 76, 84, 94], "diagram": [11, 37, 84, 100], "encoderrnn": 11, "hidden_s": [11, 12, 84, 87], "gru": 11, "batch_first": [11, 12], "hidden": [11, 12, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 84, 87], "inithidden": 11, "decoderrnn": 11, "output_s": [11, 87], "logsoftmax": [11, 64, 65], "to_train": 11, "max_len": [11, 84], "x_input": 11, "x_output": 11, "o": [11, 16, 57, 73, 76, 77, 84, 85, 87, 88, 100, 102], "s_i": [11, 77], "s_o": 11, "s_to": 11, "x_partial": 11, "sentec": [11, 12], "nrepresent": 11, "partial": [11, 60, 61, 62, 67, 69, 70, 84, 94], "becaus": [11, 17, 20, 28, 31, 34, 36, 37, 38, 39, 40, 43, 57, 62, 69, 73, 74, 76, 85, 88, 94, 97, 100], "rignt": 11, "close": [11, 16, 20, 21, 27, 31, 38, 46, 57, 65, 67, 69, 70, 73, 76, 77, 81, 87, 88, 89, 101], "context": [11, 27, 67, 84, 87, 88], "inmediatli": 11, "notic": [11, 12, 28, 31, 34, 35, 39, 40, 57, 67, 69, 73, 80, 94], "feed": [11, 21, 73, 74, 84, 88], "instant": [11, 20], "affect": [11, 12, 17, 21, 27, 31, 70, 73, 76, 82, 87, 88, 94], "learning_r": [11, 17, 18, 21, 27, 33, 57, 61, 70, 84, 85, 87], "001": [11, 12, 21, 27, 33, 40, 61, 70, 73, 82, 100, 102], "plot_loss": [11, 62, 94], "plot_full_loss": 11, "encoder_optim": 11, "decoder_optim": 11, "c_input": 11, "c_output": 11, "c_target": 11, "dtype": [11, 12, 17, 21, 25, 27, 28, 57, 65, 67, 82, 84, 87, 97], "acc_loss": 11, "c_batch_siz": 11, "r_target": 11, "c_loss": 11, "useful": 11, "ceil": [11, 17, 21, 85], "300": [11, 20, 27, 33, 57, 67, 69, 70, 73, 76, 
87, 94], "epoch_error": 11, "batch_error": 11, "nllloss": [11, 64, 65], "reduct": [11, 31, 57, 69, 70, 73, 80], "seq2seq": 11, "partiar": 11, "repeat": [11, 28, 33, 35, 39, 62, 65, 67, 69, 70, 73, 88, 94], "eof": 11, "candid": [11, 85, 100, 102], "beam": [11, 76], "search": [11, 21, 31, 35, 40, 61, 70, 84, 85, 91], "gen_transl": 11, "pt_out": 11, "idx": [11, 12, 21, 33, 57, 69, 70, 80, 84, 85, 87], "troubl": [11, 69], "gra": 11, "fat": [11, 84], "exhaust": [11, 35], "gro": [11, 88], "fit": [11, 12, 15, 25, 33, 35, 37, 38, 39, 40, 57, 64, 69, 73, 74, 76, 80, 81, 88], "touch": [11, 69], "hit": [11, 39, 40], "touche": 11, "malad": 11, "ill": [11, 52, 62], "trist": 11, "sad": [11, 12], "timid": 11, "shy": 11, "mouill": 11, "wet": 11, "mouille": 11, "revenu": 11, "revoila": 11, "seriou": [11, 69], "chauv": 11, "bald": [11, 76], "occup": 11, "occupe": 11, "calm": 11, "froid": 11, "cold": [11, 62, 84], "fini": 11, "fine": [11, 19, 73, 80, 87, 91], "libr": 11, "dispon": 11, "repu": 11, "rassasi": 11, "chez": 11, "retard": 11, "paresseux": 11, "lazi": [11, 43], "faineant": 11, "paresseus": 11, "okai": 11, "port": [11, 43], "candidat": 11, "aux": 11, "presidentiel": 11, "americain": 11, "american": [11, 76, 88], "presidenti": 11, "mood": 11, "eglis": 11, "contribut": [11, 20, 39, 102], "church": [11, 76], "ag": [11, 19, 76, 84], "quelqu": 11, "difficult": [11, 16, 20, 26, 27, 28, 31, 52, 54, 74, 94, 97], "compil": [11, 30], "programm": 11, "entreprend": 11, "laboratoir": 11, "carri": [11, 12, 27, 33, 62, 94, 101], "laboratori": [11, 76], "seulement": 11, "bell": [11, 35, 39, 76], "intelligent": 11, "intellig": [11, 15, 57, 88, 97, 101], "smart": [11, 31, 61, 65, 73, 74], "enqueton": 11, "meurtr": 11, "jackson": 11, "investig": [11, 25, 27, 33, 35, 36, 40, 67, 70, 76, 94, 101], "murder": 11, "recept": 11, "hypnotiqu": 11, "suscept": 11, "hypnot": 11, "job": [11, 39, 57, 82], "trouv": 11, "redir": 11, "autr": 11, "fault": 11, "complain": 11, "pens": 11, "apprendr": 11, 
"coreen": 11, "semestr": 11, "prochain": 11, "korean": 11, "semest": 11, "jeun": 11, "comprendr": 11, "critiqu": 11, "defaut": 11, "shortcom": 11, "attendon": 11, "ouvrag": 11, "invent": 11, "histoir": 11, "interessant": 11, "stori": [11, 31, 73, 76, 87], "mari": 11, "husband": 11, "di": [11, 76], "constam": 11, "comport": 11, "constantli": 11, "behav": [11, 12, 40, 57, 64, 69], "herself": 11, "interpret": [11, 20, 35, 38, 39, 73, 81, 88], "banqu": 11, "international": 11, "bank": [11, 76, 84], "expert": [11, 31, 74, 91, 103], "litteratur": 11, "francais": 11, "acquaint": 11, "literatur": [11, 23, 27, 31, 36, 37, 39, 40, 81], "batail": 11, "croyanc": 11, "religieus": 11, "grappl": 11, "religi": [11, 84], "belief": 11, "compet": [11, 26, 67], "espagnol": 11, "italien": 11, "profici": [11, 101], "spanish": [11, 89], "italian": [11, 76], "love": [11, 12, 82, 87], "quitt": 11, "narita": 11, "hawaii": 11, "soir": 11, "leav": [11, 31, 65, 80, 89], "amus": 11, "jouant": 11, "jeux": 11, "video": [11, 31, 33, 40, 46, 51], "himself": 11, "plai": [11, 15, 20, 31, 36, 38, 67, 69, 73, 76, 80, 82, 85, 87, 97], "game": [11, 26, 46, 101], "discuteron": 11, "demain": 11, "discuss": [11, 17, 27, 31, 35, 46, 57, 64, 69, 70, 73, 74, 76, 77, 81, 82, 84, 88, 91, 97, 101], "tomorrow": [11, 57, 101], "notr": [11, 31], "nouveau": 11, "voisin": 11, "neighbor": [11, 17, 85, 101], "tard": 11, "recevoir": 11, "repons": 11, "receiv": [11, 15, 25, 27, 31, 34, 43, 57, 77, 97], "repli": 11, "hate": 11, "ta": [11, 23, 46], "frighten": 11, "clotur": 11, "compt": 11, "epargn": 11, "father": 11, "etonn": 11, "attitud": 11, "irrespons": 11, "alarm": [11, 88], "inquiet": 11, "autorit": 11, "reconnu": 11, "sujet": 11, "recogn": [11, 16, 74, 76, 77, 85, 88], "subject": [11, 15, 19, 25, 26, 31, 76], "invit": [11, 23, 31, 87], "guest": 11, "rejouisson": 11, "revoir": 11, "dispose": 11, "discut": 11, "willing": 11, "talk": [11, 31, 33, 43, 73, 101], "dispos": 11, "parlent": 11, "vont": 11, "chanter": 11, 
"sing": 11, "parol": 11, "cett": 11, "institut": [11, 15, 17, 19], "spokesperson": 11, "did": [11, 16, 23, 31, 33, 34, 36, 38, 39, 40, 57, 60, 61, 62, 67, 70, 73, 74, 76, 85, 97], "empir": [11, 61, 65, 67], "metric": [11, 12, 17, 27, 33, 84, 85], "blue": [11, 35, 39, 61, 69, 70, 73, 76, 77, 81, 85], "score": [11, 12, 15, 18, 21, 57, 65, 84, 85], "writ": 11, "rigth": 11, "happen": [11, 16, 17, 19, 23, 31, 33, 34, 53, 57, 64, 65, 67, 70, 73, 74, 76, 77, 80, 85, 87, 91, 97], "would": [11, 12, 16, 17, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 84, 85, 88, 89, 91, 94, 100, 101], "attent": [11, 12, 31, 46, 57, 67, 73, 74, 82, 85, 91, 100, 101, 102], "proper": [11, 21, 74, 97], "noun": 11, "jointli": [11, 101], "gonzalo": 12, "uribarri": 12, "infer": [12, 33, 35, 74, 80, 81, 82, 88], "emot": 12, "tweet": 12, "accross": 12, "torchtext": [12, 84, 85, 87], "tensordataset": [12, 64, 65], "get_token": [12, 87], "classification_report": 12, "linear_model": [12, 35, 39], "logisticregress": [12, 35, 39], "model_select": [12, 16, 35, 39], "train_test_split": [12, 16], "feature_extract": 12, "countvector": 12, "websit": [12, 19, 21, 31, 33, 43, 52, 76, 100, 101], "stanford": 12, "alecmgo": 12, "trainingandtestdata": 12, "header_list": 12, "polar": [12, 76], "date": [12, 62], "queri": [12, 25, 57, 85, 87, 88, 101], "df": [12, 25, 40, 57, 60], "1600000": 12, "noemoticon": 12, "iso": 12, "8859": 12, "1467810369": 12, "apr": 12, "06": [12, 16, 25], "pdt": 12, "2009": [12, 27], "no_queri": 12, "_thespecialone_": 12, "switchfoot": 12, "twitpic": 12, "2y1zl": 12, "awww": 12, "1467810672": 12, "scotthamilton": 12, "upset": [12, 84], "facebook": [12, 89], "1467810917": 12, "mattycu": 12, "kenichan": 12, "dive": [12, 65, 73], "ball": [12, 26, 35, 39, 76], "1467811184": 12, "ellectf": 12, "bodi": [12, 15, 25, 28, 33, 35, 36, 43, 73], "itchi": 12, "1467811193": 12, "karoli": 12, "nationwideclass": 12, "neg": [12, 17, 27, 31, 57, 60, 61, 
62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 87, 89, 100, 102], "x_train_text": 12, "x_test_text": 12, "y_train": [12, 35, 39, 61, 62, 64, 65, 81], "y_test": [12, 35, 39, 64, 65, 69, 70], "test_siz": [12, 16, 64, 65, 67], "random_st": [12, 35, 39, 80, 87], "stratifi": [12, 16], "ourselv": [12, 64, 67, 69], "exploratori": [12, 67], "analisi": 12, "eda": 12, "paisleypaislei": 12, "lol": 12, "advanc": [12, 23, 35, 39, 46, 62, 67, 101], "june": 12, "third": [12, 35, 39, 57, 62, 73, 84, 101], "knitter": 12, "summer": [12, 23], "worst": [12, 33, 69, 87], "headach": 12, "ever": [12, 33, 54, 69, 74], "ewaniesciuszko": 12, "wont": 12, "yeah": 12, "18th": 12, "spell": [12, 34, 87], "conk": 12, "quot": 12, "stand": [12, 36, 40, 73], "gone": 12, "everyon": [12, 31], "basic_english": [12, 87], "x_train_token": 12, "x_test_token": 12, "occur": [12, 33, 35, 36, 39, 40, 46, 57, 64, 69, 70, 73, 85, 88, 94], "present": [12, 19, 23, 25, 33, 40, 46, 62, 67, 69, 73, 88, 94], "sorted_word": 12, "669284": 12, "todai": [12, 31, 39, 46, 57, 60, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 101], "got": [12, 31, 57, 73, 74, 76, 85, 88, 100, 102], "had": [12, 15, 31, 39, 40, 70, 73, 76, 80, 85, 88, 91, 101], "amp": 12, "night": [12, 16, 76, 82, 101], "thank": [12, 87], "oh": 12, "13970153178620734": 12, "00532743602652": 12, "zipf": [12, 16], "law": 12, "dictionari": [12, 16, 43, 62, 67, 69, 70, 76, 84, 85, 87, 89, 100, 102], "puntuat": 12, "steam": [12, 76], "uncommon": 12, "appear": [12, 15, 17, 57, 67, 73, 76, 81, 84, 85, 87, 94, 101], "fewer": [12, 23, 33, 36, 65, 67, 73, 80, 88, 94, 97], "occat": 12, "noth": [12, 27, 28, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "simplest": [12, 39, 40, 61, 81, 97], "quirk": 12, "languag": [12, 23, 26, 31, 36, 37, 45, 46, 80, 81, 82, 87, 89], "moreov": [12, 67], "difer": 12, "univers": [12, 31, 48, 69, 73], "better": [12, 16, 18, 21, 25, 27, 31, 33, 34, 36, 39, 57, 60, 61, 62, 67, 69, 70, 
73, 74, 76, 80, 82, 85, 87, 88, 89, 91, 94, 97, 100, 101], "spaci": 12, "access": [12, 16, 21, 28, 35, 43, 46, 53, 54, 57, 60, 81, 88, 97, 100], "laguag": 12, "nltk": [12, 80, 85], "toktok": 12, "probali": 12, "svm": 12, "repres": [12, 16, 28, 37, 40, 57, 61, 62, 67, 81, 82, 84, 85, 87, 88, 97, 100], "binari": [12, 21, 36, 40, 57, 70, 73, 84, 87], "otherwis": [12, 31, 33, 35, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "sklean": 12, "document": [12, 23, 27, 43, 64, 65, 67, 84, 85, 88], "x_train_cv": 12, "fit_transform": [12, 77, 87], "x_test_cv": 12, "matriz": 12, "spars": [12, 33, 60, 85, 87], "528584": 12, "165468": 12, "300381": 12, "242211": 12, "489893": 12, "134160": 12, "regressor": 12, "solver": [12, 35, 39, 82], "saga": [12, 35, 39], "class_weight": 12, "dual": [12, 101], "fit_intercept": 12, "intercept_sc": 12, "l1_ratio": 12, "max_it": [12, 67], "multi_class": 12, "auto": [12, 15, 20, 43, 64, 65, 67, 69, 70, 101], "n_job": 12, "penalti": [12, 27, 70, 84], "l2": [12, 17, 60, 61, 73, 84, 89, 94], "tol": 12, "warm_start": 12, "recal": [12, 60, 61, 62, 65, 67, 73, 76, 80, 81, 84, 85, 88], "f1": 12, "160000": 12, "320000": 12, "macro": 12, "avg": [12, 100], "regres": 12, "explan": [12, 34, 37, 38, 65, 67, 88], "coef_": [12, 35, 39], "vocabulary_": 12, "words_sk": 12, "589260": 12, "roni": 12, "862597673594883": 12, "inaperfectworld": 12, "5734362290886375": 12, "dontyouh": 12, "500197620227523": 12, "xbllygbsn": 12, "412645372640648": 12, "anqju": 12, "336405291553548": 12, "200522312464158": 12, "pakcricket": 12, "1949158120163412": 12, "condol": 12, "132498019366488": 12, "heartbreak": 12, "066508733796654": 12, "saddest": 12, "041999809733714": 12, "sadd": 12, "029070563580306": 12, "heartbroken": 12, "0287688233900174": 12, "boohoo": 12, "022608649696793": 12, "sadfac": 12, "9918411285807234": 12, "rachelle_lefevr": 12, "925057253107806": 12, "disappoint": 12, "902524113779547": 12, "lvbu": 12, 
"894705935001672": 12, "sadden": 12, "8855127179984654": 12, "bum": 12, "83650014970307": 12, "neda": 12, "792944556837498": 12, "iamsoannoi": 12, "8494314732277672": 12, "myfax": 12, "797451563471618": 12, "jennamadison": 12, "5667257393706113": 12, "yeyi": 12, "478028598852801": 12, "tryout": 12, "4383315790116677": 12, "goldymom": 12, "4374026022205535": 12, "wooohooo": 12, "40297322137544": 12, "thesupergirl": 12, "3565118467330004": 12, "iammaxathotspot": 12, "311648368632618": 12, "londicr": 12, "3074490293400993": 12, "smilin": 12, "2991891636718216": 12, "worri": [12, 57, 80], "2899429774914717": 12, "sinfulsignorita": 12, "2798963640981817": 12, "finchensnail": 12, "264302079155878": 12, "smackthi": 12, "2376679263761083": 12, "kv": 12, "2158393907798413": 12, "tojosan": 12, "211784259253832": 12, "russmarshalek": 12, "2095374025599384": 12, "traciknopp": 12, "1768297770350835": 12, "congratul": [12, 70, 84], "171590496227557": 12, "rememb": [12, 25, 28, 33, 35, 36, 38, 60, 62, 67, 69, 74, 77, 80, 82, 88, 100, 101], "sigma": [12, 40, 61, 62, 64, 65, 74, 76, 80, 81], "wx": 12, "previou": [12, 15, 16, 17, 21, 31, 35, 38, 39, 40, 43, 57, 61, 62, 64, 67, 73, 74, 76, 80, 84, 85, 88], "That": [12, 31, 33, 35, 36, 38, 57, 60, 67, 73, 74, 76, 84, 88, 89], "mea": 12, "didnt": 12, "But": [12, 33, 35, 36, 39, 40, 43, 62, 64, 65, 67, 69, 70, 73, 74, 80, 81, 82, 87, 88, 89, 100], "solv": [12, 19, 27, 37, 62, 67, 81, 84, 88, 97, 101], "unlik": [12, 53, 57, 62, 73, 88, 100], "feedforward": [12, 84], "cyclic": 12, "power": [12, 43, 57, 60, 62, 67, 69, 70, 76, 80, 84, 97], "word_to_idx": 12, "integ": [12, 15, 21, 33, 35, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "limit": [12, 15, 18, 25, 26, 31, 33, 34, 37, 39, 40, 43, 61, 77, 85, 88, 91, 101], "num_words_dict": 12, "ditionari": 12, "reserv": 12, "most_used_word": 12, "extra": [12, 17, 23, 27, 31, 33, 34, 80, 85, 97], "outsid": [12, 17, 21, 35, 39, 40, 43, 57, 64, 
65, 73, 76, 84, 88, 94], "unk": [12, 87, 88], "idx_to_word": 12, "pad_token": [12, 88], "unk_token": [12, 88], "popul": [12, 20, 39], "num": [12, 57, 77, 80, 85, 100, 102], "These": [12, 15, 17, 23, 27, 31, 33, 35, 36, 46, 57, 60, 62, 64, 67, 70, 73, 74, 77, 81, 82, 85, 87, 88, 89, 97, 101, 102], "tokens_to_idx": 12, "sentences_token": 12, "sentences_idx": 12, "sent": [12, 84, 87], "sent_idx": 12, "x_train_idx": 12, "x_test_idx": 12, "some_numb": 12, "721": [12, 76], "237": [12, 76, 80], "adequ": [12, 27], "tweet_len": 12, "asarrai": [12, 65, 73, 76], "median": [12, 33, 39, 80], "quantil": 12, "maximum": [12, 17, 20, 21, 25, 27, 28, 39, 43, 65, 67, 69, 70, 73, 77, 84, 85, 94, 97, 100, 102], "max_lenght": 12, "shorter": 12, "lenght": 12, "seq_len": 12, "ii": [12, 28, 46, 64, 65], "len_tweet": 12, "x_train_pad": 12, "x_test_pad": 12, "y_train_np": 12, "y_test_np": 12, "122": [12, 20, 69, 76], "209": [12, 76], "667": [12, 76], "138": [12, 76, 84], "3296": 12, "train_data": [12, 16, 17, 64, 65, 67, 73, 84, 87], "valid_data": [12, 87], "hyperparamet": [12, 27, 28, 31, 33, 36, 67, 69, 84, 85, 87, 88, 94, 100, 102], "valid_load": 12, "trane": 12, "proccess": 12, "folllow": 12, "datait": [12, 65, 76, 77], "sample_x": 12, "sample_i": 12, "seq_length": [12, 84], "7447": 12, "14027": 12, "22241": 12, "2702": 12, "162": [12, 76, 80, 84], "12904": 12, "sentimentrnn": 12, "pai": [12, 31, 57, 67, 73, 74, 76, 91, 100, 101, 102], "posibl": 12, "inedex": 12, "space": [12, 17, 20, 21, 27, 36, 57, 61, 64, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 100, 102], "embedding_dim": 12, "thread": [12, 43, 84], "particular": [12, 15, 16, 31, 33, 35, 39, 40, 57, 65, 67, 73, 80, 81, 94, 97], "lstm": [12, 33, 57, 84, 88], "decid": [12, 33, 34, 36, 38, 39, 40, 67, 84, 85, 97, 100], "no_lay": 12, "strongli": [12, 62], "colah": 12, "vocab_s": [12, 84, 87, 88], "hidden_dim": [12, 67], "drop_prob": 12, "output_dim": [12, 62, 82], "num_lay": [12, 20], "sigmoid": [12, 21, 76, 82], "fc": [12, 
20, 76], "sig": [12, 40, 80], "emb": [12, 57, 74, 81, 82, 84, 91], "lstm_out": 12, "activ": [12, 15, 16, 19, 20, 21, 25, 35, 39, 43, 57, 60, 62, 64, 65, 67, 73, 76, 80, 81, 82, 91, 101], "contigu": [12, 84, 100, 102], "across": [12, 15, 17, 18, 19, 27, 31, 35, 36, 39, 40, 57, 62, 64, 67, 70, 73, 80, 82, 84, 85, 87, 88, 94, 100, 101, 102], "sig_out": 12, "init_hidden": 12, "n_layer": 12, "h0": 12, "c0": [12, 20], "vocabulari": [12, 84, 85, 87, 88], "regular": [12, 17, 39, 46, 60, 67, 76, 84, 91, 101], "move": [12, 26, 27, 31, 33, 35, 39, 40, 43, 57, 67, 73, 76, 80, 84, 85, 88, 100, 102], "model_paramet": 12, "filter": [12, 17, 21, 36, 40, 57, 67, 80, 84, 85, 87, 88], "prod": [12, 28, 61, 67], "1018433": 12, "procc": 12, "crossentropi": 12, "bceloss": 12, "round": [12, 21, 31, 62, 76, 77, 84, 87, 100, 102], "absolut": [12, 31, 67, 69, 70, 73, 84, 94, 101], "accept": [12, 15, 34, 43, 70, 82, 85, 87, 100, 102], "gradeint": 12, "assum": [12, 21, 25, 28, 35, 62, 64, 65, 73, 74, 76, 80, 84, 88, 97, 100, 101], "big": [12, 21, 31, 43, 46, 67, 69, 70, 81, 84, 88, 101], "valid_loss_min": 12, "evolut": [12, 60, 61, 62, 67, 81], "epoch_tr_loss": 12, "epoch_vl_loss": 12, "epoch_tr_acc": 12, "epoch_vl_acc": 12, "backprop": [12, 33], "clip_grad_norm": 12, "prevent": [12, 34, 35, 37, 67, 81], "explod": [12, 20, 31, 61, 65, 81, 82], "clip_grad_norm_": [12, 87], "val_loss": [12, 67, 87], "val_acc": [12, 67, 69, 70, 87], "val_h": 12, "epoch_train_loss": 12, "epoch_val_loss": 12, "epoch_train_acc": 12, "epoch_val_acc": 12, "val_accuraci": 12, "6f": 12, "pt": [12, 67, 85, 88], "4367361353733577": 12, "39174133955966683": 12, "530625": 12, "3628125": 12, "391741": 12, "3765802335098851": 12, "3724124691961333": 12, "19140625": 12, "42031250000001": 12, "372412": 12, "35746844720793886": 12, "365050206175074": 12, "16882812499999": 12, "7440625": 12, "365050": 12, "34491546426317654": 12, "36467386982403693": 12, "879140625": 12, "364674": 12, "33429012800217606": 12, "36189084346871825": 
12, "44296875": 12, "0221875": 12, "361891": 12, "grid": [12, 28, 57, 70, 76, 80, 81, 87, 100], "migth": 12, "preprocces": 12, "rudimentari": [12, 19], "correctli": [12, 17, 25, 31, 39, 43, 57, 64, 73, 81, 82, 84, 94, 100], "propos": [12, 27, 31, 33, 35, 64, 94, 101], "hyperparament": 12, "bidirecton": 12, "learnt": [12, 16, 73], "beliv": 12, "youtub": 12, "kshitij": 15, "dwivedi": 15, "produtct": [15, 17, 21], "colab": [15, 16, 19, 21, 27, 31, 33, 35, 51, 54, 62, 65, 85, 87, 88, 101], "agre": 15, "educ": 15, "NOT": [15, 17, 35, 54, 60, 84, 88, 94], "thereof": 15, "massachusett": 15, "technolog": 15, "warranti": 15, "regard": [15, 33, 35, 37, 40, 69], "infring": 15, "shall": [15, 76, 81, 82, 85], "defend": [15, 70], "indemnifi": 15, "corpor": 15, "employe": 15, "offic": 15, "agent": [15, 27, 84, 97, 101], "against": [15, 31, 38, 40, 60, 70, 81, 85, 94], "claim": [15, 34, 67], "aris": [15, 40, 57, 67, 84], "copyright": 15, "treat": [15, 57, 74, 76, 80, 85, 88], "digniti": 15, "guarante": [15, 31, 35, 62, 67, 80, 84, 85, 94], "liabil": 15, "roi": [15, 18], "glass": [15, 76], "nilearn": 15, "challeng": [15, 16, 33, 61, 67, 73, 94, 97], "particip": [15, 20, 25, 33, 36, 52, 81, 88], "decord": 15, "pickl": [15, 100], "nibabel": 15, "nib": 15, "fsaverag": 15, "fetch_surf_fsaverag": 15, "year": [15, 31, 85, 88, 101], "googl": [15, 16, 21, 23, 27, 30, 31, 48, 51, 54, 84, 87], "must": [15, 21, 27, 31, 38, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 88, 89, 94, 100, 102], "dropbox_link": 15, "myurl": 15, "participants_data": 15, "agxyxntrbwko7t1": 15, "fname1": [15, 76], "participants_data_v2021": 15, "fname2": [15, 76], "algonautsvideos268_all_30fpsmax": 15, "dynam": [15, 17, 36, 38, 60, 62, 64, 67, 97], "cognit": [15, 20], "fub": 15, "algonauts2021_devkit": 15, "nii": 15, "submit": [15, 23, 31], "everydai": 15, "event": [15, 23, 46, 48, 57, 74, 101], "magnet": [15, 76], "reson": 15, "high": [15, 17, 19, 27, 31, 37, 62, 70, 73, 74, 76, 80, 81, 82, 87, 88, 
94, 97, 100, 102], "spatial": [15, 17, 18, 37, 64, 82], "resolut": [15, 19, 37, 40, 73, 82], "blood": [15, 62, 101], "flow": [15, 31, 37, 38, 43, 67, 81, 84, 101], "independ": [15, 61, 62, 65, 74, 81], "voxel": 15, "reliabl": [15, 62, 67], "region": [15, 16, 20, 54, 73], "known": [15, 33, 35, 39, 40, 60, 70, 73, 74, 77, 80, 81, 82, 87, 97], "role": [15, 39, 69, 76, 84, 88, 94], "earli": [15, 23, 31, 38, 70, 73], "mid": [15, 81, 87], "cortex": [15, 17, 18, 19, 74], "v2": [15, 16, 27, 57, 87, 89], "v3": 15, "v4": [15, 16, 27], "higher": [15, 33, 36, 39, 40, 61, 62, 65, 67, 70, 73, 80, 84, 87, 88, 100], "respond": [15, 25, 31, 43, 82, 88], "preferenti": 15, "eba": 15, "face": [15, 25, 76, 85, 94, 101], "ffa": 15, "st": [15, 76, 87], "loc": [15, 21, 25, 39, 61, 67, 76], "scene": [15, 28, 84], "ppa": 15, "pkl": 15, "num_video": 15, "num_repetit": 15, "num_voxel": 15, "signific": [15, 38, 57, 67, 70], "demonstr": [15, 27, 38, 39, 40, 57, 67, 76, 80, 101], "save_dict": 15, "di_": 15, "filename_": 15, "load_dict": 15, "rb": [15, 27, 57, 73, 82, 100, 102], "_unpickl": 15, "latin1": 15, "ret_di": 15, "visualize_act": 15, "vid_id": 15, "fmri_dir": 15, "full_track": 15, "track_dir": 15, "sub_fmri_dir": 15, "nifti": 15, "fmri_train_al": 15, "voxel_mask": 15, "get_fmri": 15, "visual_mask_3d": 15, "brain_mask": 15, "nii_save_path": 15, "vid_act": 15, "saveasnii": 15, "plot_glass_brain": 15, "plot_ab": 15, "display_mod": 15, "lyr": 15, "train_vid": 15, "repetit": [15, 35, 36, 39, 40], "roi_fil": 15, "roi_data": 15, "roi_data_train": 15, "nii_data": 15, "nii_img": 15, "nifti1imag": 15, "header": [15, 43, 88], "sub05": 15, "sub01": 15, "sub02": 15, "sub03": 15, "sub04": 15, "sub06": 15, "sub07": 15, "sub08": 15, "sub09": 15, "sub10": 15, "wrapper": [15, 25, 27, 28, 57, 76, 85], "mini_track": 15, "heatmap": [15, 21], "stimulu": [15, 18, 25, 39], "aspect": [15, 20, 31, 35, 36, 37, 38, 74, 88, 91, 101], "vmin": [15, 17, 20, 21, 28, 62, 73, 82], "vmax": [15, 17, 20, 21, 28, 62, 73, 82], 
"shrink": [15, 73], "tight_layout": [15, 17, 18, 21, 62, 64, 67, 76], "individu": [15, 17, 31, 33, 36, 38, 46, 62, 74, 76, 77, 85, 88], "999": [15, 25, 76, 82, 94], "base64": [15, 27, 73], "b64encod": [15, 27, 73], "video_dir": 15, "video_list": 15, "mp4": [15, 27], "data_url": 15, "400": [15, 20, 27, 33, 35, 39, 67, 76], "control": [15, 19, 20, 25, 27, 31, 36, 43, 52, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "src": [15, 27, 28, 39, 43, 44, 45, 73], "5s": [15, 67], "9s": 15, "onset": 15, "finit": [15, 76], "impuls": 15, "fir": 15, "radoslaw": 15, "martin": [15, 97], "cichi": [15, 19], "benjamin": [15, 57], "lahner": 15, "lascel": 15, "polina": [15, 60, 61, 62, 73, 76, 77], "iamshchinina": 15, "monika": 15, "graumann": 15, "andonian": 15, "apurva": 15, "ratan": 15, "murti": 15, "kendrick": 15, "kai": [15, 19, 34], "gemma": 15, "roig": 15, "aud": 15, "oliva": 15, "motion": [15, 33, 35, 36, 39, 40], "2104": 15, "13714v1": 15, "yalda": 15, "mohsenzadeh": 15, "kandan": 15, "ramakrishnan": 15, "platform": [15, 43, 57, 76], "commun": [15, 31, 34, 39, 52, 64, 67, 88], "biolog": [15, 17, 101], "1905": 15, "05675": 15, "rishika": [16, 20], "mohanta": [16, 20], "furkan": 16, "\u00f6z\u00e7elik": 16, "imagin": [16, 61, 74, 82, 91], "spectacl": 16, "blur": [16, 21, 81], "stumbl": 16, "anim": [16, 28, 39, 57, 62, 73, 81, 97], "walk": [16, 20, 33, 34, 57, 73, 76], "ye": [16, 33, 39, 40, 57, 62, 73], "foggi": 16, "condit": [16, 17, 28, 35, 36, 39, 40, 57, 61, 65, 80, 81, 85, 87, 88, 97], "poor": [16, 77], "qualiti": [16, 80, 87, 94], "low": [16, 17, 19, 27, 31, 33, 39, 62, 73, 74, 80, 81, 82, 88, 91, 100, 102], "Is": [16, 38, 64, 67, 73, 80, 91, 94, 101], "torch_intermediate_layer_gett": [16, 18], "wheel": [16, 18, 25, 28, 76], "getter": [16, 18], "25l": [16, 18, 28], "25hdone": [16, 18, 28], "pil": [16, 18, 43, 73, 76, 77], "imagefilt": 16, "copyfil": 16, "intermediatelayergett": [16, 18], "layergett": 16, 
"summarywrit": 16, "load_ext": [16, 84, 85], "asirra": 16, "captcha": 16, "autom": [16, 57, 70, 76], "ture": [16, 101], "apart": [16, 69, 77], "hip": [16, 76], "proof": 16, "motiv": [16, 25, 60, 67, 74, 97, 101], "behind": [16, 35, 62, 65, 67, 82, 88, 100, 102], "creation": [16, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "speci": 16, "restrict": [16, 82, 85], "photograph": 16, "accomplish": [16, 38, 88], "competit": [16, 30, 100], "dataset_blur_2": 16, "dataset_blur_5": 16, "subfold": 16, "gaussian": [16, 21, 35, 39, 65, 74, 80, 82], "radiu": [16, 21, 57], "catvdog_clear": 16, "catvdog_blur_2": 16, "catvdog_blur_5": 16, "hj2gd": 16, "xp6qd": 16, "wj43a": 16, "zip_ref": [16, 80, 84], "appropri": [16, 23, 31, 36, 37, 39, 64, 67, 69, 70, 73, 81, 85, 94], "resiz": [16, 17, 18, 27, 43, 73, 76, 77], "clear_train_data": 16, "clear_test_data": 16, "noisy_train_data": 16, "noisy_test_data": 16, "validation_split": 16, "val_ratio": 16, "train_indic": [16, 76], "val_indic": 16, "train_split": 16, "subset": [16, 20, 21, 28, 33, 35, 39, 57, 67, 70, 76, 77, 84, 85, 87, 88, 94], "val_split": 16, "clear_train_split": 16, "clear_val_split": 16, "clear_train_batch": 16, "clear_val_batch": 16, "clear_test_batch": 16, "noisy_train_split": 16, "noisy_val_split": 16, "noisy_train_batch": 16, "noisy_val_batch": 16, "noisy_test_batch": 16, "clear_cat_imag": 16, "clear_dog_imag": 16, "19997": 16, "noisy_cat_imag": 16, "noisy_dog_imag": 16, "141": [16, 76], "142": [16, 76], "143": [16, 76], "144": [16, 76], "tri": [16, 33, 34, 39, 40, 85, 101], "schrimpf": 16, "categoris": 16, "faster": [16, 21, 35, 36, 39, 61, 67, 69, 73, 76, 80, 94], "architechtur": 16, "rel": [16, 19, 21, 31, 33, 35, 39, 40, 57, 62, 67, 76, 77, 80, 82, 84, 87, 88, 94], "computation": [16, 67], "feasibl": [16, 31], "down": [16, 17, 21, 27, 28, 31, 34, 35, 36, 37, 54, 57, 61, 62, 67, 73, 80, 81, 82, 85, 88, 91, 97], "retina": [16, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 
81, 82, 84, 87, 89, 94], "lgn": 16, "192": [16, 76], "avgpool": [16, 73], "adaptiveavgpool2d": 16, "visualis": [16, 67, 73], "add_graph": 16, "logdir": [16, 85], "train_batch": 16, "val_batch": 16, "training_loss": [16, 64, 65, 87], "reset": [16, 25, 27, 28, 57, 60, 64, 73, 76, 82, 97], "train_count": 16, "test_count": 16, "blurri": [16, 94], "h_0": 16, "h_1": 16, "3e": [16, 65], "num_pretraining_epoch": 16, "num_training_epoch": 16, "naive_before_train": 16, "naive_training_loss": 16, "naive_validation_loss": 16, "naive_after_train": 16, "arang": [16, 17, 21, 28, 35, 39, 57, 60, 61, 62, 64, 73, 76, 81, 84, 85, 87], "array_split": 16, "17953": 16, "00": [16, 27, 28, 31, 33, 46, 94], "2494": 16, "09": [16, 33, 61], "distinguish": [16, 33, 35, 36, 39, 77, 80, 100], "expert_before_train": 16, "expert_after_pretrain": 16, "experienced_training_loss": 16, "experienced_validation_loss": 16, "expert_after_train": 16, "axvlin": [16, 67, 69, 94], "linestyl": [16, 67, 70], "dash": [16, 39, 69, 70, 94], "01": [16, 20, 25, 27, 57, 60, 61, 62, 67, 73, 81, 82, 94], "29": [16, 20, 65, 67, 73, 76, 81, 82, 85, 94, 97, 100], "thats": 16, "seen": [16, 35, 39, 60, 61, 62, 65, 67, 70, 74, 76, 77, 80, 84, 94, 101], "further": [16, 20, 31, 38, 39, 70, 73, 81, 82, 85, 97], "itself": [16, 25, 67, 94, 97], "OR": [16, 31, 36, 57], "plot_filt": 16, "filter_index": [16, 76], "row_index": [16, 76], "col_index": [16, 76], "filter_imag": [16, 76], "scaled_imag": [16, 76], "meaning": [16, 35, 87, 89], "visibl": [16, 27, 85, 94], "intermidi": 16, "return_lay": [16, 18], "plot_intermediate_lay": 16, "intermediate_output": [16, 76], "complex": [16, 26, 27, 37, 38, 57, 60, 61, 62, 67, 69, 70, 73, 94, 97, 101], "appar": [16, 38, 82], "clearli": [16, 33, 34, 35, 37, 39, 40, 69, 74], "somewhat": [16, 39, 73, 84, 100], "focus": [16, 74, 101], "contibut": 16, "respons": [16, 19, 25, 31, 43, 67, 76, 84, 85, 87, 88], "wish": [16, 27, 60, 85, 94, 100], "variat": [16, 40, 60, 77], "wget": 16, "certif": [16, 
46, 52], "microsoft": [16, 88], "3e1c3f21": 16, "ecdb": 16, "4869": 16, "8368": 16, "6deba77b919f": 16, "kagglecatsanddogs_3367a": 16, "local_zip": 16, "cat_fold": 16, "petimag": 16, "dog_fold": 16, "check_fil": 16, "cat_fil": 16, "dog_fil": 16, "test_ratio": 16, "training_length": 16, "test_indic": [16, 76], "jpg": [16, 76, 77], "gaussianblur": [16, 21], "zipdir": 16, "ziph": 16, "dir": [16, 60, 77], "relpath": 16, "zip_defl": 16, "carsen": 17, "stringer": [17, 19], "cellular": [17, 76], "cultur": 17, "calcium": 17, "opencv": [17, 21], "numba": [17, 25, 28], "tifffil": 17, "cv2": [17, 21], "hashlib": 17, "jit": [17, 82], "gaussian_filt": 17, "find_object": 17, "binary_fill_hol": 17, "generate_binary_structur": 17, "linear_sum_assign": 17, "answer": [17, 23, 31, 33, 34, 35, 36, 37, 38, 39, 40, 57, 74, 76, 77, 84, 88, 91, 94, 101], "allow": [17, 20, 28, 33, 34, 36, 38, 39, 40, 52, 57, 64, 65, 67, 73, 80, 82, 84, 85, 88, 91, 97, 100, 101], "drug": 17, "surviv": 17, "reason": [17, 28, 31, 33, 52, 61, 62, 65, 67, 69, 74, 76, 82, 87, 94, 100, 101], "tempor": [17, 37, 57, 64], "divis": [17, 67, 84, 94], "movement": [17, 33, 35, 36, 39, 40], "influx": 17, "quantif": 17, "protein": [17, 84], "rna": 17, "expresss": 17, "convolut": [17, 21, 31, 33, 35, 45, 67, 70, 74, 82, 84, 91, 94, 100], "curat": [17, 23, 27], "cytoplasm": 17, "stain": 17, "nuclear": 17, "cost": [17, 20, 31, 36, 60, 61, 70, 76, 85, 100], "transfer": [17, 21, 43, 57, 62, 100, 101], "ann": [17, 19, 21, 69, 70, 73, 87, 89, 91], "carpent": [17, 76], "lab": [17, 19, 21, 25, 57, 73, 76], "broad": [17, 31, 80], "mayb": [17, 28, 31, 33, 34, 36, 39, 40, 77, 91], "worm": [17, 76], "herd": 17, "bison": [17, 76], "rescal": [17, 21], "accordingli": [17, 61, 64, 97], "tool": [17, 19, 21, 31, 37, 38, 43, 57, 61, 62, 67, 76, 77], "napari": 17, "overfit": [17, 21, 70, 100, 102], "finish": [17, 21, 31, 38, 69, 70, 73, 82, 84, 85, 87, 102], "develop": [17, 21, 25, 31, 35, 38, 43, 57, 60, 61, 81, 88, 97, 100], "movi": [17, 
19, 33, 35, 36, 57, 76, 85, 91], "record": [17, 19, 25, 27, 33, 35, 39, 57, 60, 61, 62, 69, 73, 74, 88, 94, 100, 102], "microscop": 17, "therefor": [17, 33, 35, 36, 39, 43, 60, 61, 62, 64, 65, 69, 70, 73, 80, 84, 85, 91, 100], "though": [17, 28, 57, 60, 65, 74, 80, 88, 91, 94, 101], "frame": [17, 27, 37, 39, 40, 57, 69, 70, 76, 81], "suite2p": 17, "acknowledg": [17, 21], "borrow": [17, 21], "cellpos": 17, "mariu": [17, 20, 33, 34, 35, 36, 37, 38, 39, 40], "pachitariu": [17, 20, 35, 36], "kristin": [17, 21], "branson": [17, 21], "poseestim": 17, "cells_train": 17, "npz": [17, 18, 94], "cells_test": 17, "z3h78": 17, "ft5p3": 17, "expected_md5": 17, "85e1fe2ee8d936c1083d62563d79d958": 17, "e8f789abe20a7efde806d9ba03d20fd7": 17, "md5": 17, "hexdigest": 17, "corrupt": 17, "allow_pickl": [17, 33, 36], "arr_0": 17, "imgs_train": 17, "masks_train": 17, "imgs_test": 17, "masks_test": 17, "mostli": [17, 33, 36, 62, 64, 77], "varieti": [17, 31, 34, 39, 80, 88], "fast": [17, 27, 36, 39, 61, 64, 67, 74, 85, 101], "normalize99": 17, "1st": [17, 57, 84], "percentil": [17, 21, 57], "99th": 17, "x01": 17, "x99": 17, "irand": 17, "nuclei": 17, "labels_train": 17, "labels_test": 17, "adapt": [17, 33, 43, 57, 65, 69], "random_rotate_and_res": 17, "scale_rang": 17, "xy": [17, 87], "do_flip": 17, "nimg": 17, "ly": 17, "lx": 17, "nd": [17, 67], "nchan": 17, "nlabel": 17, "IF": 17, "rand": [17, 20, 35, 39, 57, 61, 62, 69, 70, 73, 80, 81, 82, 94], "bool": [17, 28, 67, 73, 84, 85, 94, 97], "flip": [17, 62, 65, 70, 73, 91, 100], "imgi": 17, "ndim": [17, 81], "nt": [17, 20], "dxy": 17, "cc1": 17, "pts1": 17, "pts2": 17, "getaffinetransform": 17, "newaxi": [17, 81], "warpaffin": 17, "inter_linear": 17, "inter_nearest": 17, "img_batch": 17, "lbl_batch": 17, "local": [17, 20, 27, 28, 31, 48, 53, 60, 70, 77, 85, 87, 89, 91, 100, 102], "autoencod": [17, 81], "imagenet": [17, 18, 19, 43, 57, 91], "upsampl": [17, 21, 82], "ultim": [17, 33, 35, 36, 38, 88], "skip": [17, 31, 57, 65, 73, 76, 80, 82, 
94], "TO": [17, 65, 85], "propag": [17, 64, 65, 94, 102], "later": [17, 25, 28, 31, 33, 35, 36, 39, 40, 43, 57, 60, 62, 64, 65, 73, 74, 80, 88, 97, 100], "resnet_torch": 17, "convbatchrelu": 17, "sz": 17, "convdown": 17, "add_modul": [17, 64, 65], "conv_": 17, "nbase": 17, "maxpool": [17, 21], "conv_down_": 17, "xd": 17, "convup": 17, "conv_0": 17, "conv_1": 17, "scale_factor": [17, 21], "conv_up_": 17, "unet": [17, 21], "nout": 17, "nbaseup": 17, "t0": 17, "save_model": 17, "load_model": [17, 87, 89, 100, 102], "concaten": [17, 20, 28, 35, 39, 57, 64, 65, 80, 81], "put": [17, 21, 31, 33, 37, 39, 40, 43, 60, 61, 64, 76, 77, 81, 84, 87, 88, 89, 100], "colon": [17, 85], "datetim": [17, 21], "linearli": [17, 21, 33, 36, 40, 80, 82], "batchsiz": [17, 21, 73], "n_epoch": [17, 60, 61, 62, 82], "cycl": [17, 21], "n_epochs_per_sav": 17, "val_frac": [17, 21], "fraction": [17, 21, 27, 31, 69, 70, 80], "clean": [17, 21, 27, 43, 57, 81, 85, 88], "timestamp": [17, 21], "strftime": [17, 21], "dt": [17, 21, 35, 39, 40, 64, 81], "n_val": [17, 21], "n_train": [17, 21, 64, 84], "iperm": 17, "val_data": 17, "val_label": [17, 18, 87], "train_mask": 17, "val_mask": 17, "flavor": [17, 21, 31, 88], "schedul": [17, 21, 34, 82, 87], "linspac": [17, 21, 40, 57, 60, 61, 64, 65, 67, 69, 70, 80, 81, 82, 87, 94], "nan": [17, 21, 25, 61, 94], "saveepoch": [17, 21], "entir": [17, 19, 21, 27, 28, 31, 69, 73, 76, 80, 85, 87, 89, 101], "batchnorm": [17, 21, 94], "desc": [17, 21, 80, 85, 100, 102], "pbar": [17, 21, 81, 82], "ibatch": 17, "ind": 17, "clip_grad_value_": [17, 21], "nsave": 17, "savefil": [17, 21], "unet_epoch": 17, "pad_image_nd": 17, "img0": 17, "div": [17, 36, 43, 73], "2d": [17, 28, 35, 39, 43, 57, 60, 64, 65, 67, 69, 70, 73, 80, 82, 85, 87, 94], "lz": 17, "slice": [17, 57], "lpad": 17, "xpad1": 17, "xpad2": 17, "ypad1": 17, "ypad2": 17, "constant": [17, 36, 40, 74, 88], "ysub": 17, "xsub": 17, "slc": 17, "img_pad": 17, "img_torch": 17, "rather": [17, 33, 35, 39, 43, 61, 62, 67, 74, 
80, 88, 97], "union": [17, 85], "iou": 17, "overlap": [17, 73, 89], "ground": [17, 27, 28, 33, 35, 36, 39, 61, 64, 76, 100, 101, 102], "truth": [17, 33, 35, 36, 39, 57, 61, 64, 100, 101, 102], "greater": [17, 76, 77, 85], "taken": [17, 27, 57, 65, 67, 73, 100, 102], "stardist": 17, "maxim": [17, 27, 70, 74, 76, 80, 91, 94, 97, 100], "fill_holes_and_remove_small_mask": 17, "min_siz": 17, "discard": [17, 69], "morpholog": 17, "NO": [17, 62], "minimum": [17, 21, 25, 28, 60, 67, 69, 70, 94], "turn": [17, 23, 31, 57, 61, 64, 67, 74, 81, 82, 100, 102], "msk": 17, "npix": 17, "average_precis": 17, "masks_tru": 17, "masks_pr": 17, "ap": 17, "tp": 17, "fn": 17, "heavili": 17, "mpicbg": 17, "csbd": 17, "isinst": [17, 21, 60, 61, 62, 65, 80, 82, 85], "ndarrai": [17, 21, 28, 39, 57, 60, 61, 62, 65, 67, 69, 70, 73, 80, 81, 85, 97, 100, 102], "n_true": 17, "n_pred": 17, "mt": 17, "return_index": 17, "_intersection_over_union": 17, "_true_posit": 17, "nopython": 17, "_label_overlap": 17, "ravel": [17, 57, 67, 76, 100, 102], "uint": 17, "n_pixels_pr": 17, "keepdim": [17, 65, 69, 70, 81], "n_pixels_tru": 17, "isnan": [17, 61], "n_min": 17, "true_ind": 17, "pred_ind": 17, "match_ok": 17, "get_masks_unet": 17, "cell_threshold": 17, "selem": 17, "shape0": 17, "return_invers": 17, "uint16": [17, 21], "capac": [17, 70, 101], "val_pad": 17, "val_torch": 17, "iou_threshold": 17, "ylim": [17, 27, 39, 61, 64, 67, 80, 81], "5039152": 17, "test_pad": 17, "test_torch": 17, "58384985": 17, "typic": [17, 18, 19, 20, 25, 28, 31, 65, 67, 74, 77, 80, 88, 94, 97], "overmerg": 17, "avoid": [17, 21, 28, 31, 34, 35, 36, 37, 38, 43, 61, 67, 76, 77, 88, 100], "boundari": [17, 70, 73], "interfac": [17, 27, 28, 35, 38, 43, 57, 85, 87], "jupyt": [17, 27, 43, 73], "overlaid": 17, "mous": [17, 19, 20, 76], "10hz": 17, "4500": 17, "325": [17, 76], "556": [17, 76], "gt1": 17, "tif": 17, "test_data": [17, 57, 64, 65, 67, 70, 73, 84, 87], "n_time": 17, "max_img": 17, "max_img_filt": 17, "unfilt": 17, 
"max_img_larg": 17, "max_img_2chan": 17, "zeros_lik": [17, 57, 67, 73, 81], "BE": 17, "hand": [17, 28, 31, 33, 36, 38, 39, 57, 60, 61, 62, 65, 67, 73, 76, 81, 82, 87], "IT": 17, "n_cell": [17, 60], "fluoresc": 17, "trace": [17, 73], "middl": [17, 21, 88, 91], "allen": 17, "guidanc": [17, 31, 57], "strategi": [17, 36, 85, 88, 94, 100], "light": [17, 76, 101], "aakash": 18, "agraw": 18, "proven": [18, 27], "pca": [18, 77, 87], "directli": [18, 31, 53, 57, 60, 64, 67, 73, 80, 85, 87, 97, 100, 102], "midgett": 18, "pdist": 18, "stat": [18, 25, 35, 39, 40, 43, 67, 80, 81, 85, 87, 94], "pearsonr": 18, "kay_label": 18, "npy": 18, "kay_labels_v": 18, "kay_imag": 18, "r638": 18, "yqb3e": 18, "ymnjv": 18, "dobj": 18, "dat": 18, "sharex": [18, 62], "flat": [18, 35, 39, 76], "stimuli": [18, 20, 25, 35, 36, 39, 84], "field": [18, 21, 31, 34, 35, 37, 38, 43, 57, 61, 76, 77, 81, 85], "stim": 18, "grayscal": [18, 80], "stimuli_test": 18, "responses_test": 18, "roi_nam": 18, "stimuli_tr": 18, "stimuli_t": 18, "stimuli_tr_xform": 18, "1750": 18, "stimuli_ts_xform": 18, "loc_id": 18, "response_tr": 18, "response_t": 18, "mydataset": 18, "longtensor": [18, 57], "__getitem__": [18, 21, 33, 69, 70, 81], "fromarrai": [18, 73], "__len__": [18, 21, 33], "randomresizedcrop": 18, "centercrop": [18, 43, 76], "dataset_s": 18, "mseloss": [18, 60, 62, 69, 70, 80, 81], "best_model_wt": 18, "deepcopi": [18, 67, 69, 70, 94], "best_loss": 18, "running_correct": 18, "histori": [18, 60, 73], "set_grad_en": 18, "4f": [18, 20, 33, 40, 69], "4805": 18, "0503": 18, "4680": 18, "4679": 18, "0501": 18, "4677": 18, "0500": 18, "0499": 18, "fc2": [18, 33, 69, 70, 73, 87, 100, 102], "fc3": [18, 69, 70, 100, 102], "net_im": 18, "midfeat_ft": 18, "keep_output": 18, "midfeat_im": 18, "mid_outputs_ft": 18, "mid_outputs_im": 18, "v1_id": 18, "rts_v1": 18, "rts_lo": 18, "fmri_dist_metric_ft": 18, "euclidean": [18, 57, 62, 69, 74, 77], "fmri_dist_metric_im": 18, "alexnet_ft_dist_metr": 18, "alexnet_im_dist_metr": 18, 
"dobs_v1_ft": 18, "dobs_lo_ft": 18, "dobs_v1_im": 18, "dobs_lo_im": 18, "dnet_ft": 18, "dnet_im": 18, "xtick": [18, 33, 39, 73], "expertis": 19, "analysi": [19, 31, 34, 35, 36, 38, 39, 67, 77, 80, 91, 94], "toolkit": [19, 21, 35], "behavior": [19, 20, 25, 27, 31, 34, 38, 61, 67, 85, 97, 100, 101], "pipelin": [19, 31, 36, 39, 57, 85, 87], "conceptu": [19, 40, 81, 101], "steinmetz": 19, "neuropixel": 19, "lfp": 19, "spontan": 19, "orient": [19, 20, 27, 73, 84, 91, 94], "2p": 19, "sdk": 19, "simplifi": [19, 25, 39, 62, 73, 80, 81, 84, 85, 94, 101], "connectom": 19, "fmri": [19, 25], "natur": [19, 23, 26, 31, 45, 46, 62, 67, 81, 82, 84, 87, 89, 91, 94, 101], "bonner": 19, "ecog": 19, "caltech": 19, "social": [19, 25, 62], "ibl": 19, "decis": [19, 21, 34, 35, 36, 37, 39, 64, 65, 67, 70, 101], "motor": [19, 28, 64, 74, 76], "hippocampu": 19, "fly": [19, 21, 76, 81, 84], "hipposeq": 19, "mouselight": 19, "openorganel": 19, "stringer1": 19, "neuron": [19, 27, 31, 34, 35, 36, 39, 57, 60, 61, 62, 70, 76], "stringer2": 19, "800": [19, 20, 35, 39, 57, 65, 76], "stringer3": 19, "ephi": 19, "buzsaki": 19, "webpag": [19, 73, 101], "eeg": [19, 30], "bci": 19, "handwrit": 19, "recent": [19, 27, 28, 43, 60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 101, 102], "krishna": 19, "shenoi": 19, "epilept": 19, "seizur": 19, "neurovista": 19, "seizure_recognit": 19, "mnist": [19, 57, 80], "digit": [19, 25, 67, 73, 76, 80, 82, 88], "studyforrest": 19, "forrest": 19, "gump": 19, "speech": [19, 84, 85, 94], "eyegaz": 19, "trend": [19, 27, 57, 69, 80, 82], "neuroimag": 19, "mri": [19, 91], "assess": [19, 69, 94], "brainscor": 19, "preprint": 19, "overview": [19, 23, 33, 34, 39, 40, 88, 97], "behaviour": [19, 25, 28, 37, 38, 43, 61, 100], "deeper": [19, 36, 61, 62, 73, 76], "rule": [19, 31, 34, 37, 43, 57, 61, 67, 70, 76, 100], "influenc": [19, 33, 36, 37, 40], "preliminari": [19, 31, 35], "older": 19, "cn": [19, 20, 25, 35, 39, 64], "subsampl": 19, "cheat": 19, 
"sheet": 19, "fetch": [19, 43, 73, 76], "pars": [19, 21, 43, 85, 87], "session": [19, 31, 44, 45, 46, 54, 87], "pedram": 20, "luca": 20, "tavar": 20, "jonni": [20, 31], "coutinho": 20, "bless": 20, "itoro": 20, "gaurang": 20, "mahajan": 20, "brain": [20, 25, 35, 36, 39, 40, 44, 45, 64, 76, 77, 94, 97], "pattern": [20, 33, 73], "noisi": [20, 35, 39, 40, 69, 70, 80, 81], "brainwid": 20, "isol": [20, 38, 80], "seq": 20, "suffici": [20, 28, 33, 36, 38, 64, 69, 88], "describ": [20, 27, 33, 34, 36, 64, 65, 69, 80, 94, 97, 100, 101, 102], "hundr": [20, 84, 87, 100, 102], "ten": [20, 27, 31, 34, 67, 76, 87, 88], "thousan": 20, "cours": [20, 23, 28, 31, 33, 36, 39, 40, 52, 54, 60, 62, 70, 73, 74, 76, 80, 85, 88, 94, 100, 103], "ntrial": 20, "pretend": [20, 35, 39], "bin": [20, 25, 35, 36, 39, 43, 67, 74, 76, 87, 89, 94], "10m": [20, 36, 39], "2500m": 20, "compon": [20, 27, 28, 31, 33, 34, 36, 37, 38, 39, 40, 43, 46, 67, 73, 76, 77, 84, 88, 97], "ncomp": 20, "recurr": [20, 23, 60, 73, 84, 87, 101], "diagon": [20, 57, 62, 73, 94], "simplic": [20, 27, 33, 36, 39, 57, 67, 81, 84, 97], "stabil": [20, 27, 38, 64, 67, 70, 81, 82, 84], "a0": 20, "diag": 20, "025": [20, 67], "innov": 20, "timestep": [20, 25, 27, 28, 81, 82], "poisson": [20, 35, 39], "spike": [20, 35, 36, 39, 64, 76], "nn1": 20, "nn2": 20, "bidi": 20, "bidirect": 20, "nonlinear": [20, 21, 39, 60, 62, 64, 65], "enforc": [20, 28], "softplu": 20, "smooth": [20, 64, 69, 70, 74, 76, 88, 91], "likelihood": [20, 40, 67, 74, 84, 94, 100, 102], "lead": [20, 27, 31, 35, 40, 61, 65, 67, 70, 73, 76, 84, 87, 88, 94, 97], "failur": 20, "gray_r": [20, 67], "ms": [20, 35, 39, 64, 74, 76], "separ": [20, 28, 31, 37, 38, 39, 57, 64, 67, 77, 80, 84, 85, 88, 89, 94], "x0": [20, 21, 57, 81], "bias": [20, 39, 57, 60, 65, 67, 73, 82, 85, 100, 101, 102], "slow": [20, 27, 28, 36, 39, 61, 62, 84, 85], "005": [20, 61, 67, 81], "poisson_loss": 20, "spk": 20, "niter": 20, "4105": 20, "2552": 20, "2438": 20, "2392": 20, "2377": 20, "2373": 20, 
"2371": 20, "700": [20, 76], "2369": 20, "2368": 20, "900": [20, 27, 76], "2367": 20, "rpred": 20, "121": [20, 69, 76], "ycpu": 20, "Not": [20, 21, 33, 36, 37, 38, 64, 67], "surpris": [20, 31, 38, 61, 69], "gagana": [21, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "fruit": 21, "robust": [21, 61, 67, 70, 76, 94], "perturb": [21, 67, 70, 82, 85], "mmpose": 21, "mmlab": 21, "design": [21, 27, 31, 35, 36, 39, 62, 64, 67, 70, 76, 84, 89, 94, 100, 101, 102], "toolbox": [21, 91], "exot": 21, "definit": [21, 23, 31, 38, 64, 67, 80, 85, 88, 94], "room": [21, 31, 80, 97, 101], "tracker": 21, "4216": 21, "awai": [21, 28, 33, 73, 74, 77, 88, 101], "readthedoc": [21, 27], "2d_animal_keypoint": 21, "deeplabcut": 21, "mackenziemathislab": 21, "kristinbranson": 21, "tensorboard": [21, 84, 85], "monitor": [21, 27, 76, 80, 85], "tensorboard_tutori": 21, "gh": 21, "page": [21, 23, 27, 33, 43, 51, 53, 54, 57], "_download": 21, "tensorboard_with_pytorch": 21, "ipynb": [21, 27], "milesi": 21, "alexandr": 21, "plotlabelandpredict": 21, "hm_pred": 21, "title_str": 21, "isbatch": 21, "locs_pr": 21, "heatmap2landmark": 21, "get_imag": 21, "get_landmark": 21, "nlandmark": 21, "marker": [21, 33, 61, 65, 67, 76, 77, 80, 94], "markerfacecolor": 21, "batchid": 21, "locs_pred_curr": 21, "hmim": 21, "get_heatmap_imag": 21, "predcurr": 21, "heatmap2imag": 21, "__version__": [21, 85, 87], "ncuda": 21, "ntorch": 21, "cu113": 21, "fly_bubble_20201204": 21, "q7vhy": 21, "datadir": 21, "view0": 21, "ftar": 21, "untar": 21, "drive": [21, 31, 36, 39, 53, 80], "1a06zamqxvuqzzqgi9xwwjabl4vof8v6z": 21, "usp": 21, "instruct": [21, 34, 60, 61, 101], "past": [21, 31, 33, 43, 67, 88, 101], "flush_and_unmount": [21, 28], "force_remount": 21, "fly_bubble_pos": 21, "xvzf": 21, "dev": [21, 27, 28, 35, 39, 100, 102], "null": [21, 27, 28, 43], "traindir": 21, "trainannfil": 21, "train_annot": 21, "testdir": 21, "testannfil": 21, "test_annot": 21, "trainann": 21, "ntrainim": 
21, "filestr": 21, "imfil": 21, "imread_unchang": 21, "imsiz": 21, "landmark_nam": 21, "head_fc": 21, "head_bl": 21, "head_br": 21, "thorax_fr": 21, "thorax_fl": 21, "thorax_bc": 21, "abdomen": 21, "leg_ml_in": 21, "leg_ml_c": 21, "leg_mr_in": 21, "leg_mr_c": 21, "leg_fl_tip": 21, "leg_ml_tip": 21, "leg_bl_tip": 21, "leg_br_tip": 21, "leg_mr_tip": 21, "leg_fr_tip": 21, "num_keypoint": 21, "181": [21, 76, 94], "nimsshow": 21, "imsshow": 21, "dpi": [21, 28, 73, 87], "bigger": [21, 61, 76, 87, 91], "keypoint": 21, "hm": 21, "landmark": 21, "indic": [21, 31, 33, 35, 36, 39, 40, 57, 60, 64, 65, 67, 69, 76, 80, 81, 85, 88, 94, 100], "colormap": 21, "get_cmap": [21, 62, 73], "colornorm": 21, "annfil": 21, "label_sigma": 21, "constructor": [21, 57, 88], "scalar": [21, 40, 57, 60, 62, 67, 80, 81, 84, 87], "nlandmarks_al": 21, "precomput": 21, "stuff": [21, 33], "label_filt": 21, "label_filter_r": 21, "label_filter_d": 21, "init_label_filt": 21, "overload": 21, "getitem": [21, 64], "ncolor": 21, "65535": 21, "cannot": [21, 38, 39, 57, 62, 65, 77, 85, 87, 100], "typeerror": [21, 85], "imsz": 21, "make_heatmap_target": 21, "diamet": 21, "alloc": [21, 57, 61], "lose": [21, 27, 28, 73], "border": [21, 57, 73, 76], "y0": 21, "crop": [21, 57, 70, 77, 91], "goe": [21, 31, 33, 64, 73, 80], "fil_x0": 21, "fil_x1": 21, "fil_y0": 21, "fil_y1": 21, "staticmethod": [21, 25, 100, 102], "static": [21, 43, 73, 76, 81], "usabl": [21, 38, 77], "ith": [21, 100], "plottabl": 21, "instanti": [21, 33, 36, 61, 62, 64, 65, 73, 76, 84, 85, 100, 102], "train_dataload": [21, 57, 87], "i_batch": 21, "sample_batch": 21, "8353": 21, "8275": 21, "8235": 21, "8314": 21, "7882": 21, "7922": 21, "8000": [21, 57], "8039": 21, "7804": 21, "7961": 21, "8157": 21, "8118": 21, "8078": 21, "8196": 21, "8392": 21, "8471": 21, "8431": 21, "8510": 21, "7488": 21, "4393": 21, "2865": 21, "8702": 21, "8077": 21, "0938": 21, "8719": 21, "1947": 21, "1545": 21, "3605": 21, "2214": 21, "91": [21, 73, 76, 84], "9388": 21, 
"6487": 21, "113": [21, 76], "5320": 21, "6973": 21, "1256": [21, 61], "5618": 21, "7494": 21, "8496": 21, "9855": 21, "9393": 21, "6579": 21, "4566": 21, "2644": 21, "5434": 21, "8570": 21, "0331": 21, "8386": 21, "8340": 21, "109": [21, 76], "6349": 21, "94": [21, 28, 76, 84], "3467": 21, "3398": 21, "2621": 21, "5554": 21, "6067": 21, "5406": 21, "3683": 21, "4841": 21, "6089": 21, "5981": 21, "6650": 21, "1148": 21, "9521": 21, "5694": 21, "5933": 21, "9952": 21, "0958": 21, "8181": 21, "1196": 21, "0669": 21, "6937": 21, "5386": 21, "0347": 21, "8119": 21, "0003": [21, 67], "2152": 21, "5787": 21, "4639": 21, "1912": 21, "7318": 21, "7608": 21, "107": [21, 27, 76], "6556": 21, "7992": 21, "5985": 21, "5912": 21, "108": [21, 27, 76], "5169": 21, "3186": 21, "2265": 21, "modularli": 21, "outconv": 21, "doubleconv": 21, "2x2": [21, 73], "pool": [21, 76, 84, 94], "bilinear": 21, "incorpor": [21, 67, 73, 74, 76, 82, 97, 100, 101], "bn": 21, "mid_channel": 21, "double_conv": 21, "downscal": [21, 82], "doubl": [21, 28, 57, 62, 67, 70, 73, 77, 80], "maxpool_conv": 21, "upscal": [21, 82], "align_corn": 21, "chw": 21, "diffi": 21, "diffx": 21, "issu": [21, 25, 31, 38, 39, 40, 51, 67, 69, 77, 84, 88], "haiyongjiang": 21, "unstructur": 21, "buggi": 21, "commit": [21, 34, 43], "0e854509c2cea854e247a9c615f175f76fbb2e3a": 21, "xiaopeng": 21, "liao": 21, "8ebac70e633bac59fc22bb5195e513d5832fb3bd": 21, "unet_model": 21, "n_channel": [21, 27], "n_landmark": 21, "nchannels_inc": 21, "nchannels_down1": 21, "nchannels_down2": 21, "nchannels_down3": 21, "nchannels_up1": 21, "nchannels_up2": 21, "nchannels_up3": 21, "layer_inc": 21, "layer_down1": 21, "layer_down2": 21, "layer_down3": 21, "layer_up1": 21, "layer_up2": 21, "layer_up3": 21, "layer_outc": 21, "inc": 21, "x3": 21, "x4": 21, "outc": 21, "__str__": [21, 85], "down1": 21, "down2": 21, "down3": 21, "up1": 21, "up2": 21, "up3": 21, "__repr__": [21, 85], "unravel_index": 21, "insanti": 21, "care": [21, 25, 28, 33, 36, 37, 67, 
84, 85, 91, 101], "hms0": 21, "restart": [21, 25, 54, 57, 77], "poseestimationnet": 21, "unet20210510t140305": 21, "final_epoch4": 21, "loadepoch": 21, "nepochs_per_sav": 21, "forget": [21, 34, 37, 38, 43, 57, 62, 65, 67, 70, 73, 74, 76, 82, 84, 85, 88, 91, 94, 97, 101], "savedir": 21, "checkpointdir": 21, "val_dataload": 21, "rmsprop": 21, "lr_schedul": [21, 82, 87], "reducelronplateau": 21, "patienc": [21, 69, 70], "numer": [21, 37, 57, 61, 62, 64, 67, 74, 80, 81, 82, 85, 94], "bcewithlogitsloss": 21, "hm_label": 21, "savefile0": 21, "cp_latest_epoch": 21, "savefile1": 21, "cp_prev_epoch": 21, "final_epoch": 21, "892069824039936": 21, "0559215631801635": 21, "596667634788901": 21, "train_hms1": 21, "val_hms1": 21, "eval_net": 21, "err": [21, 81, 94], "loc_pr": 21, "loc_label": 21, "l2err": 21, "sqrt": [21, 27, 61, 62, 65, 67, 69, 73, 74, 80, 81, 82, 84, 102], "idscurr": 21, "l2err_per_landmark_v": 21, "val_id": 21, "l2err_per_landmark_train": 21, "train_id": 21, "nbin": [21, 94], "bin_edg": 21, "bin_cent": 21, "frac_val": 21, "frac_train": 21, "histogram": [21, 25, 67], "densiti": [21, 25, 80, 94], "hval": 21, "px": 21, "argsort": 21, "printopt": 21, "errstr": 21, "nr": 21, "fil": 21, "testann": 21, "ntestim": 21, "test_dataload": [21, 57, 87], "l2err_per_landmark_test": 21, "test_id": 21, "1800": 21, "frac_test": 21, "conduct": [23, 52, 57], "teach": [23, 33, 35, 36, 52, 76, 91, 100, 101, 102], "pod": [23, 31, 46, 57, 60, 62, 64, 69, 73, 80], "reinforc": [23, 25, 27, 31, 46, 57, 70, 101, 102], "alphabet": [23, 31, 85, 88], "letter": [23, 31, 33, 39, 40, 73, 76, 85, 88], "topic": [23, 31, 57, 62, 97, 101], "taught": [23, 31], "week": [23, 26, 35, 46, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "comp": [23, 31, 35, 54], "neuro": [23, 25, 31, 35, 54], "lai": [23, 94], "foundat": [23, 81], "w1d4": [23, 39], "review": [23, 31, 36, 37, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 
87, 88, 89, 91, 94, 97, 100, 101, 102], "refin": [23, 31, 39], "encourag": [23, 27, 31, 37, 40, 67, 73, 80], "w3d2": [23, 25], "dedic": [23, 31, 46, 57, 80], "abstract": [23, 25, 31, 33, 36, 37, 39, 46, 57, 85, 88], "rest": [23, 27, 31, 33, 64, 65, 73, 80, 94, 97], "culmin": 23, "slide": [23, 26, 31, 73, 76, 77, 102], "3h": 23, "slot": [23, 31, 46, 48, 76], "substanti": [23, 31, 77, 80], "inspir": [23, 44, 45, 80, 101], "becom": [23, 31, 65, 67, 70, 73, 74, 88], "airtabl": [23, 31, 52], "w3d5": 23, "approx": [23, 33, 61], "five": [23, 88], "member": [23, 80], "due": [23, 27, 28, 52, 61, 64, 67, 69, 73, 84, 85, 94, 100], "style": [23, 28, 31, 33, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 89, 94], "powerpoint": [23, 31], "send": [23, 57, 87, 89], "email": 23, "primari": 23, "logist": [23, 35, 39, 67], "neurmatch": 25, "morteza": 25, "ansarinia": 25, "yamil": [25, 27], "vidal": [25, 27], "aim": [25, 26], "mimic": [25, 101], "mechan": [25, 27, 33, 36, 37, 39, 60, 82, 84, 101], "construct": [25, 27, 33, 40, 57, 64, 65, 67, 73, 84, 87, 97], "multi": [25, 31, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 89, 94, 100, 102], "jedi": [25, 28], "setuptool": [25, 28], "dm": 25, "acm": 25, "jax": [25, 28, 60], "sonnet": [25, 28], "trfl": [25, 28], "ignor": [25, 39, 57, 61, 62, 64, 69, 70, 82, 84, 85, 87, 88, 89, 94], "uninstal": 25, "seaborn": [25, 81], "31merror": [25, 28, 43], "resolv": [25, 28, 35, 39, 40, 43, 67], "conflict": [25, 28, 43], "incompat": [25, 28, 43], "0m": [25, 27, 28, 43], "31m": [25, 28, 43], "chex": 25, "snt": [25, 28], "sn": [25, 81], "dm_env": 25, "spec": [25, 28], "environmentloop": [25, 28], "tf": [25, 28], "dqn": [25, 101], "logger": [25, 28, 85, 87], "clear_output": [25, 76], "scientist": [25, 74, 91, 101], "tap": 25, "stroop": 25, "span": [25, 37, 64, 94], "tmt": 25, "trail": 25, "wcst": 25, "wisconsin": 25, "card": 25, "despit": [25, 52, 73, 101], "extens": [25, 31, 43, 73], "sophist": 
[25, 43, 60, 67, 94], "gain": [25, 35, 38, 39, 40, 52, 76, 81, 82, 88], "underli": [25, 39, 64, 65, 73, 74, 76, 88, 101], "interestingli": [25, 94], "thought": [25, 31, 35, 36, 60, 62, 67, 76, 80, 94], "action": [25, 26, 27, 39, 57, 64, 81, 97, 100, 101, 102], "conson": 25, "formul": [25, 26, 31, 67, 77, 81, 82, 97], "reward": [25, 97], "feedback": [25, 27, 31, 46], "trajectori": [25, 61, 67, 81], "episod": [25, 27, 28, 100, 101, 102], "schema": [25, 43, 88], "correl": [25, 35, 39, 40, 73, 91, 94], "straightforward": [25, 43], "composit": [25, 60], "hcp": 25, "wm": 25, "bound": [25, 28, 65, 81, 102], "symbol": [25, 57, 67, 85, 94], "neutral": 25, "sake": [25, 37], "breviti": 25, "perfom": [25, 57], "participant_id": 25, "bid": 25, "trial_index": 25, "time_step": [25, 82], "observ": [25, 27, 28, 33, 36, 39, 40, 57, 60, 67, 69, 70, 73, 77, 85, 94, 97, 101, 102], "expected_respons": 25, "is_correct": 25, "response_tim": 25, "mock": 25, "generate_mock_nback_dataset": 25, "n_particip": 25, "n_trial": 25, "stimulus_choic": 25, "abcdef": 25, "response_choic": 25, "n_row": 25, "pid": 25, "trial_indic": 25, "stimulus_sequ": 25, "exponenti": [25, 33, 36, 57], "datafram": [25, 40, 57, 81], "mark": [25, 27, 31, 61, 62, 76, 84, 94], "matchig": 25, "_nback_stim": 25, "burn": [25, 101], "trial": [25, 31, 35, 36, 39, 40, 65, 97], "mock_nback_data": 25, "displot": 25, "barplot": 25, "697356": 25, "149110": 25, "277760": 25, "implment": 25, "envinron": 25, "prefer": [25, 31, 67, 85, 88], "nback": 25, "episode_step": [25, 28], "stimuli_choic": 25, "human_data": 25, "_reset_next_step": 25, "_imitate_human": 25, "human_subject_data": 25, "_action_histori": 25, "_current_step": 25, "fixm": 25, "reverb": 25, "iloc": 25, "sort_valu": 25, "to_list": [25, 57], "ord": [25, 80], "_observ": 25, "_episode_return": 25, "agent_act": 25, "human_act": 25, "step_reward": 25, "rational": 25, "expected_act": 25, "termin": [25, 27, 28, 43, 85, 100, 102], "transit": [25, 97], "observation_spec": 25, 
"boundedarrai": [25, 28], "nback_stimuli": 25, "action_spec": [25, 28], "discretearrai": 25, "num_valu": 25, "int32": [25, 97], "ob": 25, "plot_stat": 25, "br": [25, 35, 36], "create_environ": 25, "singleprecisionwrapp": [25, 28], "grab": [25, 31], "environment_spec": [25, 28], "make_environment_spec": [25, 28], "randomag": 25, "actor": [25, 28], "_num_act": 25, "select_act": [25, 28], "uniformli": [25, 81, 82], "observe_first": 25, "next_timestep": 25, "env": [25, 27, 28], "env_spec": [25, 28], "n_episod": 25, "1_000": 25, "n_total_step": 25, "log_loss": 25, "n_step": [25, 28], "all_return": 25, "episode_return": 25, "episode_loss": 25, "start_tim": [25, 67, 69, 87], "polici": [25, 57, 101], "last_loss": 25, "steps_per_second": 25, "episode_length": 25, "loss_avg": 25, "histplot": 25, "kde": 25, "deepmind": [25, 84], "init": [25, 43, 57, 62, 65, 67, 82, 87], "discreteenviron": 25, "num_act": [25, 97], "num_observ": 25, "obs_dtyp": 25, "dqn_make_network": 25, "mlp": [25, 69, 84], "epsilon": [25, 67, 80, 81, 82], "inmemorylogg": 25, "_logger": 25, "_data": 25, "tail": [25, 76, 82], "995": [25, 76, 82], "329": [25, 76], "379165": 25, "996": [25, 76, 82], "31872": 25, "326": [25, 76], "324034": 25, "997": [25, 76], "31904": 25, "373": [25, 76], "017676": 25, "998": [25, 76, 82], "31936": 25, "309": [25, 76], "737031": 25, "31968": 25, "405": [25, 76], "329983": 25, "32000": 25, "cooper": 26, "expand": [26, 57, 80, 85], "theori": [26, 34, 35, 43, 57, 101], "pistonbal": 26, "piston": 26, "obstacl": 26, "extern": [26, 64, 67, 101], "earn": 26, "puckworld": 26, "snake": [26, 76], "minigrid": 26, "rl": [26, 28, 101], "congest": 26, "travel": 26, "queue": [26, 64, 65, 100, 102], "easier": [26, 31, 35, 36, 37, 38, 43, 67, 73, 76, 77, 88, 89, 94], "materi": [26, 31, 34, 52, 67, 70, 73, 84], "zoo": 26, "openai": [26, 27, 84], "gym": [26, 27, 28], "raghuram": 27, "bharadwaj": 27, "diddigi": 27, "geraud": 27, "nangu": 27, "tass": 27, "sanjukta": 27, "krishnagop": 27, "sara": 27, 
"rajae": 27, "shaonan": [27, 81, 82, 88, 97, 101], "wang": [27, 57, 81, 82, 88, 97, 101], "keyword": [27, 31, 57], "exactli": [27, 33, 35, 39, 43, 88], "xvfb": [27, 28], "opengl": 27, "swig": 27, "python3": [27, 65, 69, 70, 80, 81, 82, 85, 87, 94], "x11": 27, "rarfil": 27, "baselines3": 27, "box2d": 27, "pyvirtualdisplai": 27, "pyglet": 27, "pygam": 27, "gymnasium": 27, "pip3": [27, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "2k": [27, 28], "90m": [27, 28], "32m14": 27, "31m28": 27, "eta": [27, 28, 60, 61, 62, 67, 70], "36m0": [27, 28], "25h": [27, 28], "sy": [27, 84, 94, 100, 102], "stable_baselines3": 27, "results_plott": 27, "ts2xy": 27, "load_result": 27, "callback": 27, "evalcallback": 27, "env_util": 27, "make_atari_env": 27, "lunar_land": 27, "video_record": 27, "videorecord": 27, "exist_ok": 27, "1400": 27, "wrap_env": 27, "render_mp4": 27, "videopath": 27, "b4": 27, "base64_encoded_mp4": 27, "reach": [27, 31, 33, 39, 40, 64, 65, 67, 69, 70, 74, 76, 80, 88, 97], "onlin": [27, 70, 74, 97], "fluctuat": [27, 70], "impact": [27, 34, 39, 61, 62, 67, 73, 77, 80], "land": [27, 76], "downward": 27, "graviti": 27, "safe": [27, 65, 76], "fuel": 27, "screen": [27, 31, 76], "140": [27, 76], "leg": [27, 28, 33, 36], "yield": [27, 65, 67, 87, 88], "03": [27, 70, 80], "crash": [27, 76], "veloc": [27, 28, 35, 36, 39, 40, 67], "angl": [27, 28, 31, 33, 36, 57, 65, 74, 94], "angular": [27, 33, 36], "nn_layer": 27, "tip": [27, 31, 34, 57, 73, 82, 84], "log_dir": 27, "tmp": 27, "env_nam": 27, "lunarland": 27, "cartpol": 27, "mountaincar": 27, "acrobot": 27, "statement": [27, 73, 85], "log_path": 27, "policy_kwarg": 27, "activation_fn": 27, "net_arch": 27, "mlppolici": 27, "buffer_s": 27, "replai": 27, "buffer": [27, 94], "learning_start": 27, "gamma": [27, 62, 65, 85, 87, 97], "discount": [27, 97], "facto": 27, "tau": [27, 81, 82], "soft": [27, 70, 76], "target_update_interv": 27, 
"train_freq": 27, "max_grad_norm": 27, "exploration_initial_ep": 27, "exploration_fract": 27, "gradient_step": 27, "pseudo": [27, 100], "a2c": 27, "ppo": 27, "ddpg": 27, "render": [27, 28, 36, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "observation_spac": 27, "action_spac": 27, "render_mod": 27, "rgb_arrai": [27, 28], "vid": 27, "total_reward": 27, "capture_fram": 27, "ntotal": 27, "0358279244006": 27, "total_timestep": 27, "100000": 27, "log_interv": [27, 70, 87], "num_timestep": 27, "episode_reward": 27, "420": [27, 67, 76], "151": [27, 76], "20000": [27, 67], "561": [27, 76], "249": [27, 76, 80], "240": [27, 61, 76], "40000": 27, "338": [27, 76], "08": [27, 33, 73], "160": [27, 76, 80], "241": [27, 62, 69, 76], "60000": 27, "190": [27, 76], "646": [27, 76], "70000": 27, "92": [27, 73, 76], "04": [27, 70], "139": [27, 76], "80000": 27, "267": [27, 76], "52": [27, 33, 62, 67, 69, 76, 77, 85, 94, 100, 102], "90000": 27, "126": [27, 33, 76], "536": [27, 76, 94], "257": [27, 76], "259": [27, 76], "0x7fef99c08c70": 27, "ep_rew_mean": 27, "proce": [27, 57, 91], "monoton": [27, 40], "upward": 27, "hope": [27, 39, 57, 67], "_learn": 27, "252": [27, 67, 70, 76, 80], "88935234615718": 27, "although": [27, 61, 67, 80, 88], "achiev": [27, 33, 38, 57, 67, 69, 70, 73, 74, 76, 77, 80, 84, 87, 88, 94, 101], "greedi": [27, 102], "At": [27, 31, 33, 57, 67, 69, 70, 73, 81, 84], "exploration_final_ep": 27, "defualt": 27, "argument": [27, 36, 39, 57, 60, 61, 64, 65, 67, 69, 70, 77, 81, 84, 85, 94, 100, 101, 102], "custom_env": 27, "customenv": 27, "arg1": 27, "arg2": 27, "discret": [27, 39, 57, 67, 80, 81, 82, 88, 100], "n_discrete_act": 27, "inherit": [27, 28, 43, 57, 67], "custom_lunarland": 27, "wind": [27, 87], "forgot": 27, "enable_wind": 27, "ground_contact": 27, "wind_mag": 27, "wind_idx": 27, "wind_pow": 27, "applyforcetocent": 27, "torqu": 27, "torque_mag": 27, "torque_idx": 
27, "turbulence_pow": 27, "applytorqu": 27, "invalid": [27, 85, 88, 97, 100, 102], "dispers": 27, "np_random": 27, "m_power": 27, "ox": [27, 76], "oy": 27, "impulse_po": 27, "_create_particl": 27, "particl": 27, "decor": 27, "applylinearimpuls": 27, "main_engine_pow": 27, "s_power": 27, "sign": [27, 54, 67, 70, 76, 85, 100, 102], "side_engine_awai": 27, "side_engine_height": 27, "side_engine_pow": 27, "po": [27, 57, 84, 85, 94], "vel": [27, 67], "linearveloc": 27, "viewport_w": 27, "helipad_i": 27, "leg_down": 27, "viewport_h": 27, "angularveloc": 27, "And": [27, 33, 36, 39, 40, 62, 64, 65, 67, 73, 74, 87, 88, 89, 91, 101], "prev_shap": 27, "spent": [27, 73], "heurist": [27, 81], "game_ov": 27, "awak": 27, "cutom": 27, "alter": [27, 40], "eight": [27, 100], "portion": [27, 52, 77], "pong": [27, 76], "210": [27, 70, 76], "sb3": 27, "atari_gam": 27, "scrollto": 27, "f3k4rmxwimbo": 27, "pongnoframeskip": 27, "n_env": 27, "command": [27, 43, 60], "coalb": 27, "vecframestack": 27, "n_stack": 27, "cnnpolici": 27, "collis": 27, "devis": [27, 82], "mechansim": 27, "imaginari": 27, "horizant": 27, "cooridn": 27, "levi": 27, "initialis": [27, 57, 67, 73, 80, 100, 102], "effeci": 27, "load_path": 27, "custom_object": 27, "kwarg": [27, 43, 67, 80, 81, 84, 85, 88, 94], "survei": [27, 35, 46], "taylor": 27, "jmlr": 27, "volume10": 27, "taylor09a": 27, "lazar": 27, "2012": [27, 76], "hal": 27, "inria": 27, "fr": [27, 35, 39, 89], "pdf": [27, 35, 39, 81], "quick": [27, 62, 73, 80, 88], "lin": [27, 73, 80], "zhou": 27, "07888": 27, "barreto": 27, "2016": [27, 76], "successor": 27, "1606": 27, "05312": 27, "gridworld": 27, "lightweight": [27, 43], "5x5": 27, "v0": [27, 28], "wrap": [27, 28, 64, 65, 88], "imgobswrapp": 27, "rgbimgobswrapp": 27, "8x8": 27, "tradit": [27, 94], "earlier": [27, 31, 80, 88], "arbitrari": [27, 35, 40, 67, 84, 101], "straight": [27, 62, 74, 100], "rex": 27, "icml2019": 27, "trex": 27, "roman": 28, "vaxenburg": 28, "diptodip": [28, 80], "deb": [28, 80], 
"sriniva": 28, "turaga": 28, "infrastructur": [28, 88], "hopper": [28, 76], "ant": [28, 76], "humanoid": 28, "easili": [28, 31, 33, 36, 43, 57, 80, 81, 84, 88], "introduct": [28, 31, 34, 82, 101], "worth": [28, 34, 101], "leverag": [28, 54, 73], "shouldn": [28, 85], "workload": 28, "anywai": [28, 31, 52, 94], "freeglut3": 28, "32m1": 28, "31m55": 28, "32m804": 28, "804": [28, 76], "kb": 28, "31m14": 28, "32m314": 28, "31m21": 28, "32m3": 28, "31m97": 28, "32m352": 28, "352": [28, 67, 76], "31m37": 28, "32m131": 28, "131": [28, 76], "31m16": 28, "32m6": 28, "31m78": 28, "31m68": 28, "32m4": 28, "31m98": 28, "32m462": 28, "462": [28, 76], "31m42": 28, "32m497": 28, "497": [28, 76], "31m3": 28, "32m5": 28, "32m42": 28, "31m5": 28, "31m99": 28, "31m44": 28, "32m110": 28, "32m318": 28, "318": [28, 76, 88], "31m33": 28, "32m94": 28, "31m12": 28, "32m17": 28, "31m74": 28, "31m100": 28, "32m781": 28, "31m62": 28, "32m268": 28, "268": [28, 76], "31m10": 28, "32m104": 28, "31m8": 28, "32m80": 28, "31m9": 28, "pybullet_env": 28, "tf2_util": 28, "distributionalmpo": 28, "environment_loop": 28, "gym_locomotion_env": 28, "hopperbulletenv": 28, "walker2dbulletenv": 28, "halfcheetahbulletenv": 28, "antbulletenv": 28, "humanoidbulletenv": 28, "ipywidget": [28, 33, 35, 36, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "widget": [28, 33, 35, 36, 46, 60, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "config": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 89, 94], "inlinebackend": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "figure_format": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "githubusercont": [28, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "neuromatchacademi": [28, 33, 49, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "mplstyle": [28, 33, 57, 60, 61, 62, 64, 65, 67, 
69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "preserv": [28, 57, 76, 77, 81, 85, 89, 94], "save_ckpt_to_dr": 28, "acme_ckpt": 28, "restore_ckpt_from_dr": 28, "recov": [28, 62], "virtual": [28, 31, 43], "mydriv": 28, "_learner": 28, "_checkpoint": 28, "_checkpoint_manag": 28, "dst": 28, "copytre": 28, "checkpoin": 28, "upon": [28, 57, 67, 74, 80, 82, 84, 85], "display_video": 28, "framer": 28, "n_frame": 28, "orig_backend": 28, "get_backend": 28, "agg": 28, "switch": [28, 43, 76], "headless": 28, "inhibit": 28, "set_axis_off": [28, 62, 76], "set_aspect": 28, "set_posit": 28, "set_data": 28, "funcanim": [28, 69, 70, 81], "func": [28, 67, 73], "blit": [28, 69, 70, 81], "to_html5_video": [28, 69, 70, 81], "layer_s": 28, "num_atom": 28, "reqiur": 28, "make_networks_d4pg": 28, "policy_layer_s": 28, "critic_layer_s": 28, "action_s": [28, 100, 102], "policy_network": 28, "batch_concat": 28, "layernormmlp": 28, "tanhtospec": 28, "critic_network": 28, "criticmultiplex": 28, "action_network": 28, "cliptospec": 28, "activate_fin": 28, "discretevaluedhead": 28, "make_networks_ddpg": 28, "make_networks_dmpo": 28, "multivariatenormaldiaghead": 28, "min_scal": 28, "tanh_mean": 28, "init_scal": 28, "fixed_scal": 28, "use_tfd_independ": 28, "multiplex": [28, 81, 82], "getlist": 28, "humanoiddeepmimicbackflipbulletenv": 28, "humanoiddeepmimicwalkbulletenv": 28, "cartpolebulletenv": 28, "cartpolecontinuousbulletenv": 28, "minitaurbulletenv": 28, "minitaurbulletduckenv": 28, "racecarbulletenv": 28, "racecarzedbulletenv": 28, "kukabulletenv": 28, "kukacambulletenv": 28, "invertedpendulumbulletenv": 28, "inverteddoublependulumbulletenv": 28, "invertedpendulumswingupbulletenv": 28, "reacherbulletenv": 28, "pusherbulletenv": 28, "throwerbulletenv": 28, "humanoidflagrunbulletenv": 28, "humanoidflagrunharderbulletenv": 28, "minitaurextendedenv": 28, "minitaurreactiveenv": 28, "minitaurballgymenv": 28, "minitaurtrottingenv": 28, "minitaurstandgymenv": 28, "minitauralternatinglegsenv": 28, 
"minitaurfourlegstandenv": 28, "kukadiverseobjectgrasp": 28, "entri": [28, 43, 67, 80, 84, 89], "hierarchi": 28, "mainli": [28, 65, 82], "realiz": [28, 91], "child": [28, 82], "overrid": [28, 64, 85], "subclass": [28, 85], "piec": [28, 34, 73, 76, 88, 100, 102], "parent": [28, 62, 64], "step_count": 28, "durat": 28, "iii": 28, "modif": 28, "_isdon": 28, "overriden": [28, 85], "entireti": 28, "walkerbasebulletenv": 28, "multiplay": 28, "_step": 28, "apply_act": 28, "global_step": 28, "calc_stat": 28, "joints_at_limit": 28, "body_rpi": 28, "_aliv": 28, "alive_bonu": 28, "initial_z": 28, "isfinit": 28, "potential_old": 28, "calc_potenti": 28, "progress": [28, 31, 67, 76, 81, 84, 85, 94], "feet_collision_cost": 28, "feet": 28, "contact_id": 28, "contact_list": 28, "ground_id": 28, "feet_contact": 28, "dc": 28, "brake": [28, 74, 76], "electricity_cost": 28, "joint_spe": 28, "stall_torque_cost": 28, "joints_at_limit_cost": 28, "hud": 28, "gymwrapp": 28, "nativ": [28, 76], "compat": [28, 57, 65, 69, 70, 80, 85], "adher": [28, 88], "hop": 28, "km": 28, "constraint": [28, 31, 36, 65, 85, 94, 101], "walk_target_x": 28, "walk_target_i": 28, "haven": [28, 31, 62, 85], "actuat": 28, "tag": [28, 69, 70, 81, 84, 85], "cartesian": 28, "joint": [28, 33, 36, 94], "body_part": 28, "pose": [28, 61], "xyz": 28, "link0_2": 28, "2868544": 28, "torso": [28, 33, 36], "0166108": 28, "2329636": 28, "link0_3": 28, "02035943": 28, "link0_4": 28, "link0_6": 28, "03194364": 28, "03894688": 28, "thigh": 28, "03755892": 28, "814017": 28, "link0_8": 28, "0431742": 28, "58908712": 28, "05006377": 28, "33918206": 28, "link0_10": 28, "05695333": 28, "089277": 28, "foot": [28, 76], "12194239": 28, "09046921": 28, "floor": [28, 31, 57, 85], "robot_bodi": 28, "39135196": 28, "97286361": 28, "posteriori": 28, "optimis": [28, 33, 67], "learner_log_everi": 28, "learner": [28, 101], "loop_log_everi": 28, "learner_logg": 28, "terminallogg": 28, "time_delta": 28, "print_fn": 28, "loop_logg": 28, 
"policy_optim": 28, "critic_optim": 28, "observation_network": 28, "ident": [28, 80, 81, 84, 85], "op": [28, 100, 102], "num_step": [28, 82, 97], "100_000": 28, "num_episod": 28, "hopefulli": [28, 33, 39, 40, 74, 100], "env_step": 28, "gdrive": 28, "81b1f746": 28, "216e": 28, "11ee": 28, "93ef": 28, "0242ac1c000c": 28, "d4pg_learner": 28, "flush": 28, "roboflow": 30, "modelzoo": 30, "onnx": 30, "caden": 30, "qubvel": 30, "segmentation_model": 30, "zylo117": 30, "efficientdet": 30, "balavenkatesh3322": 30, "cv": [30, 39], "snakers4": 30, "silero": 30, "hugginfac": 30, "awesom": [30, 43, 57, 101], "awesomedata": 30, "uci": 30, "ic": [30, 76], "ml": [30, 67, 70], "php": 30, "zindi": 30, "africa": 30, "data_typ": 30, "dryad": 30, "datadryad": 30, "datasetsearch": 30, "zenodo": 30, "meagmohit": 30, "discoveri": [30, 84], "plan": [31, 34, 38, 57, 97, 101], "explicitli": [31, 33, 36, 39, 40, 80, 85, 100], "gradual": 31, "hypothes": [31, 34, 35, 37, 38, 84], "balanc": [31, 33, 36, 76, 77, 97, 102], "brainstorm": [31, 35, 36, 39, 40], "testabl": [31, 38], "hypothesi": [31, 33, 34, 38, 39, 40, 62, 67, 84, 85, 101], "evid": [31, 35, 36, 39, 40, 67, 73, 81, 87], "meet": [31, 38, 62, 85], "megapod": 31, "meant": [31, 39, 40, 57], "starter": 31, "reus": 31, "diverg": [31, 80, 94], "hesit": 31, "flexibl": [31, 37, 57, 62, 64, 76, 80, 88], "friendli": 31, "consult": 31, "sometim": [31, 33, 38, 40, 57, 60, 61, 62, 67, 69, 74, 76, 88, 97], "arriv": [31, 62], "resum": 31, "slightli": [31, 69, 70, 73, 84, 88, 91, 94], "footwork": 31, "whenev": [31, 62, 89, 100, 102], "senior": 31, "postdoc": 31, "professor": 31, "industri": [31, 88], "navig": [31, 97, 101], "perspect": [31, 100], "depend": [31, 35, 36, 38, 40, 62, 65, 67, 91], "regardless": [31, 100], "spend": [31, 35, 46, 61, 74, 87, 91, 101], "yourselv": [31, 69], "curiou": [31, 67, 82], "carefulli": [31, 34, 74], "brows": [31, 76], "booklet": 31, "skim": 31, "concret": [31, 35], "suit": [31, 67, 76, 94], "intertwin": 31, 
"headstart": 31, "readili": 31, "paragraph": [31, 33, 39, 40, 88], "stage": [31, 67, 73, 76, 80, 85, 94], "stai": [31, 37, 64, 67, 94], "arrang": [31, 87, 100], "ideal": [31, 37, 60, 69, 80, 82, 85], "threefold": 31, "ingredi": [31, 37], "primarili": [31, 73, 77, 94], "pollut": 31, "climat": 31, "geograph": [31, 88], "surfac": [31, 67, 81], "temperatur": [31, 62, 80, 87, 88, 94, 100, 102], "1900": 31, "convnet": [31, 33, 46, 73, 74], "didn": [31, 33, 34, 36, 85, 88, 100, 102], "favorit": 31, "invest": 31, "upload": [31, 34, 43, 57], "internet": [31, 43, 54, 76], "reformat": 31, "shaki": 31, "vagu": 31, "weak": 31, "cover": [31, 44, 45, 57, 60, 76, 87, 101], "fairli": [31, 33, 74, 94], "aggress": 31, "stuck": [31, 62, 67, 74, 91, 101], "thesi": 31, "confer": [31, 91], "chanc": [31, 33, 46, 67, 73, 94], "venu": 31, "branch": [31, 62, 81, 82], "pursu": [31, 37, 88], "With": [31, 65, 70, 80, 85, 87, 97], "principl": [31, 33, 34, 35, 37, 38, 65, 69, 76, 82, 88, 97], "solut": [31, 38, 57, 60, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100, 101, 102], "importantli": [31, 33, 35, 91, 94], "jargon": 31, "cohes": 31, "explicit": [31, 36, 38, 81, 94, 101], "likewis": 31, "reveal": 31, "someon": [31, 74, 88, 91], "heard": [31, 73, 88], "narrow": [31, 62, 65], "discourag": [31, 61, 74], "month": [31, 35], "term": [31, 33, 36, 37, 57, 60, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 94, 101, 102], "somebodi": 31, "somedai": 31, "logic": [31, 35, 43, 57, 80, 91, 94, 101], "accident": 31, "peak": [31, 35, 36, 39, 69], "circular": 31, "catch": [31, 33], "experienc": [31, 35, 36, 39, 40], "guard": 31, "calendar": [31, 62], "daily_schedul": 31, "graphic": [31, 34, 38, 73], "told": 31, "greet": 31, "zoom": [31, 46, 57, 73], "themselv": 31, "wiggli": 31, "caterpillar": 31, "phd": 31, "dame": 31, "pari": [31, 87], "fli": 31, "bike": [31, 76], "ride": [31, 74], "speak": [31, 33, 37, 38, 39, 40, 57, 62, 76, 80], "wast": 31, "breakout": 31, "anyon": 31, "futur": 
[31, 33, 34, 38, 39, 40, 62, 69, 88, 97, 100, 102], "perhap": [31, 88], "hardest": [31, 35], "subgroup": 31, "timeslot": 31, "hour": [31, 46, 57, 61, 62, 69, 70, 84], "jupyterbook": [31, 36], "cutoff": 31, "superpod": 31, "conclus": [31, 39, 76, 84], "imposs": [31, 35, 38, 39, 61], "elev": 31, "poster": [31, 76], "Or": [31, 36, 38, 88], "zuckerberg": 31, "secur": [31, 43, 77, 88], "million": [31, 43, 67, 77, 84], "dollar": 31, "fund": 31, "art": [31, 39, 40, 61, 76, 77, 82, 84, 101], "act": [31, 70, 82, 85, 97, 100], "music": 31, "instrument": 31, "rehears": 31, "WILL": 31, "annoi": [31, 61], "tenth": [31, 76], "secret": 31, "anecdot": 31, "magic": [31, 35, 60, 61, 67, 87], "engag": 31, "passiv": 31, "bind": [31, 43, 57], "hear": [31, 101], "dream": 31, "pictur": [31, 73, 76, 77, 94], "oppos": [31, 39, 84], "technic": [31, 70, 102], "concis": 31, "rambl": 31, "life": [31, 33, 67, 77, 88, 101], "hart": [33, 34, 35, 36, 37, 38, 39, 40], "megan": [33, 34, 35, 36, 37, 38, 39, 40], "peter": [33, 34, 35, 36, 37, 38, 39, 40, 64], "vladimir": [33, 43, 57, 60, 67], "haltakov": [33, 43, 57, 60, 67], "paul": [33, 34, 35, 36, 37, 38, 39, 40, 44, 45], "schrater": [33, 34, 35, 36, 37, 38, 39, 40], "gunnar": [33, 34, 35, 36, 37, 38, 39, 40, 100, 101, 102], "blohm": [33, 34, 35, 36, 37, 38, 39, 40, 100, 101, 102], "modal": [33, 81, 91, 101], "acceleromet": 33, "skelet": [33, 35], "reconstruct": [33, 80, 85, 87], "pilot": [33, 35], "neccessari": 33, "demo": [33, 36, 40], "matric": [33, 39, 57, 62, 73, 80, 89, 97], "maxpool1d": 33, "confusion_matrix": 33, "unbalanc": [33, 74], "plotconfusionmatrix": 33, "real_label": 33, "predicted_label": [33, 87], "label_nam": [33, 36, 85], "conver": 33, "tick_nam": 33, "ytick": [33, 35, 39, 73], "mnqb7": [33, 36], "train_mov": [33, 36], "test_mov": [33, 36], "joint_nam": [33, 36], "sensor": [33, 35, 39, 80], "wristband": [33, 35], "address": [33, 34, 35, 36, 39, 40, 43, 62, 69, 76, 101], "novel": [33, 35, 101], "1032": [33, 36], "172": [33, 36, 
70, 76, 94], "closer": [33, 35, 36, 39, 57, 65, 88], "major": [33, 36, 82, 87, 88], "limb": [33, 36], "yaw": [33, 36], "roll": [33, 36, 67, 94], "advantag": [33, 36, 43, 73, 88], "agnost": [33, 36, 57], "3rd": [33, 36, 82, 100], "timepoint": [33, 35, 36, 39], "suppos": [33, 36, 82, 84], "cool": [33, 37, 62, 73], "joint_no": [33, 36], "pelvi": 33, "lefthip": 33, "righthip": 33, "spine1": 33, "leftkne": 33, "rightkne": 33, "spine2": 33, "leftankl": 33, "rightankl": 33, "spine3": 33, "leftfoot": 33, "rightfoot": 33, "neck": [33, 76], "leftcollar": 33, "rightcollar": 33, "leftshould": 33, "rightshould": 33, "leftelbow": 33, "rightelbow": 33, "leftwrist": 33, "rightwrist": 33, "lefthand": [33, 94], "righthand": [33, 94], "label_numb": [33, 36], "label_no": [33, 36], "crawl": 33, "throw": [33, 57, 85], "running_in_spot": 33, "cross_legged_sit": 33, "hand_clap": 33, "scratching_head": 33, "kick": [33, 60], "phone_talk": 33, "sitting_down": 33, "checking_watch": 33, "hand_wav": 33, "taking_photo": 33, "spread": [33, 36], "matter": [33, 34, 36, 40, 62, 70, 73, 80, 84, 88, 101], "asid": [33, 36, 70, 73], "stick": [33, 34, 36, 76], "hypothezis": [33, 36], "arm": [33, 36, 64, 76], "four": [33, 36, 62, 76, 80, 84, 87, 88, 101], "outperform": [33, 36, 76, 94, 101], "mathbb": [33, 39, 40, 57, 60, 62, 65, 67, 80, 81, 82], "perf_": 33, "deal": [33, 57, 67, 74, 77], "1d": [33, 39, 57, 60, 62, 64, 67, 73, 80], "sketch": [33, 37, 88], "data_tutori": 33, "movijointdataset": 33, "is_tensor": 33, "tolist": [33, 88, 100, 102], "intend": [33, 39, 62, 85], "movi_train": 33, "movi_test": 33, "1031": 33, "171": [33, 67, 70, 76, 94], "minu": 33, "bxcxt": 33, "516": [33, 76], "test_load": [33, 64, 65, 67, 69, 70, 73, 84], "highest": [33, 57, 62, 65, 88, 100, 102], "mov1dcnn": 33, "njoint": 33, "conv1d": 33, "dropout1": [33, 70, 73], "2200": 33, "nl": [33, 73], "dropout2": [33, 70, 73], "log_softmax": [33, 67, 69, 70, 84, 87, 102], "total_step": 33, "loss_list": 33, "acc_list": 33, "8369": 33, 
"9079": 33, "6369": 33, "5581": 33, "5327": 33, "4692": 33, "3940": 33, "4626": 33, "3307": 33, "3613": 33, "converg": [33, 57, 61, 62, 67, 70, 76, 97, 101], "decent": 33, "dark_background": [33, 40], "dark": [33, 76], "clap": 33, "phone": [33, 54, 76, 77], "photo": [33, 43, 74, 91], "tend": [33, 37, 69, 80, 84, 85], "sit": [33, 35, 39, 40, 101], "misclassifi": 33, "testjointmodel": 33, "cnn6j": 33, "83720930232558": 33, "limb_joint": 33, "limb_fit": 33, "leftleg": 33, "rightleg": 33, "leftarm": 33, "rightarm": 33, "formal": [33, 37, 88, 97], "win": [33, 36, 39, 88, 100, 102], "necessarili": 33, "wors": [33, 39, 76, 77], "limb_set": 33, "limb_set_fit": 33, "07": [33, 73], "fundament": [33, 61, 64, 73, 88, 97], "pocket": [33, 64], "inert": 33, "imu": 33, "gyroscop": 33, "begun": 33, "guidelin": [33, 34, 39, 40, 94], "phenomena": [33, 34, 35, 39, 40, 101], "summar": [33, 34, 39, 40, 73, 84, 101], "character": [33, 64, 94], "articul": [33, 34, 39, 40, 76, 101], "relationship": [33, 34, 36, 37, 38, 39, 40, 61, 62, 64, 67, 73, 74, 76, 80, 81, 84], "neuroal": 33, "publicli": [33, 57], "outcom": [33, 34, 36, 38, 39, 40, 100, 101, 102], "contrari": [33, 35], "intuit": [33, 34, 36, 38, 60, 62, 65, 67, 76, 80, 81, 84, 87, 88, 94, 100], "briefli": [33, 34, 39, 40, 97], "argu": [33, 34, 39, 40, 101], "plausibl": [33, 34, 39, 40], "paraphras": [33, 39, 40], "jean": [34, 37, 38, 39, 76], "lauren": [34, 37, 38, 39], "messag": [34, 85, 100, 102], "experimentalist": 34, "analog": [34, 69, 70, 76], "famou": [34, 57, 88], "criteria": [34, 38], "convinc": 34, "AND": 34, "peer": 34, "pitfal": [34, 35, 36, 37, 38], "recap": [34, 35, 36, 37, 38, 64, 82, 100], "reader": [34, 38, 82], "appreci": [34, 35, 60, 74], "unreason": 34, "reject": 34, "draft": 34, "expeiment": 34, "mesi": 34, "journal": 34, "rightfulli": 34, "cleanli": 34, "stereotyp": [34, 84], "mensh": 34, "kord": [34, 43, 57, 73, 74, 80, 88, 91, 101], "2017": [34, 84], "consider": [34, 35, 37, 39], "strong": [34, 35, 39, 40, 43, 
64], "forc": [34, 43, 52], "focuss": [34, 36, 39], "succinctli": 34, "kp": 34, "pr": [34, 74], "eneuro": 34, "0352": 34, "1523": [34, 80, 94], "nbdt": 34, "scholasticahq": 34, "articl": [34, 57, 70, 73, 76], "16723": 34, "plo": 34, "biol": 34, "e1005619": 34, "1371": 34, "pcbi": 34, "1005619": 34, "mk": 34, "w56vt": 34, "eric": [35, 36], "dewitt": [35, 36], "tara": [35, 36], "van": [35, 36, 73, 76, 82], "viegen": [35, 36], "ella": [35, 36, 101], "batti": [35, 36, 101], "eas": 35, "deconstruct": 35, "inclin": 35, "roleplai": 35, "regress": [35, 39, 60, 67], "cross_val_scor": [35, 39], "rasterplot": [35, 39], "trial_spik": [35, 39], "trial_ev": [35, 39], "nonzero": [35, 39, 100, 102], "eventplot": [35, 39], "plotcrossvalaccuraci": [35, 39], "boxplot": [35, 39], "vert": [35, 39], "set_vis": [35, 39, 73, 80, 97], "generatespiketrain": [35, 39], "subsetpercept": [35, 39], "velocity_sigma": [35, 39], "profil": [35, 39], "velocity_profil": [35, 39], "sensit": [35, 39, 65, 67, 84, 87], "target_shap": [35, 39], "multipli": [35, 39, 57, 61, 73, 74, 76, 81, 85, 88, 89, 91], "s_gain": [35, 39], "s_move": [35, 39], "s_fr": [35, 39], "rv": [35, 39, 40, 76, 80, 81], "hwin": [35, 39], "num_mov": [35, 39], "m_train": [35, 39], "m_test": [35, 39], "w_idx": [35, 39], "w_0": [35, 39], "w_1": [35, 39, 60, 61, 62], "stationari": [35, 39, 40], "spikes_stat": [35, 39], "spikes_mov": [35, 39], "train_spikes_stat": [35, 39], "train_spikes_mov": [35, 39], "test_spikes_stat": [35, 39], "test_spikes_mov": [35, 39], "x_train": [35, 39, 61, 62, 64, 65, 81], "x_test": [35, 39, 64, 65, 69, 70, 81, 82], "population_model": [35, 39], "liblinear": [35, 39], "newton": [35, 39, 67], "cg": [35, 39], "lbfg": [35, 39], "sag": [35, 39], "slope": [35, 39, 64, 65], "intercept_": [35, 39], "intercept": [35, 39], "ground_truth": [35, 39], "getdata": [35, 39], "illus": [35, 36, 62, 88], "introductori": 35, "showcas": 35, "joke": 35, "markdown1": [35, 36], "3pt": [35, 36], "window": [35, 39, 40, 43, 73, 76, 85], 
"suddenli": [35, 39, 40], "wrong": [35, 37, 38, 39, 40, 57, 85, 88], "vice": [35, 36, 39, 40, 57, 85], "versa": [35, 36, 39, 40, 57, 85], "surround": [35, 39, 40, 97, 100], "disambigu": [35, 39, 40], "vibrat": [35, 39, 40, 88], "inde": [35, 39, 40, 57, 81], "vestibular": [35, 36, 39], "illusori": [35, 40], "markdown2": [35, 36], "sensori": [35, 39], "signal": [35, 36, 37, 39, 64, 65, 73, 76, 81, 82], "hold": [35, 39, 40, 60, 65, 97], "slowli": [35, 39, 70, 74, 76, 94], "judgement": [35, 36, 39], "markdown3": [35, 36], "out2": [35, 36], "out1": [35, 36], "out3": [35, 36], "tab": [35, 36, 57], "yesterdai": [35, 76, 101], "lost": [35, 37, 38, 57, 80, 100, 102], "mechanist": [35, 40], "unclear": 35, "deepli": [35, 38], "BUT": 35, "anywher": 35, "revisit": [35, 36, 70], "frequent": [35, 40, 69, 70], "necess": [35, 67], "bad": [35, 38, 39, 67, 69, 70, 80, 88], "nest": [35, 57, 60], "examin": [35, 39, 40, 60, 62, 76, 81, 82, 88], "attempt": [35, 40, 65, 77, 85, 91], "markdown21": 35, "4d": [35, 39, 57, 82], "markdown22": 35, "simultan": [35, 39, 62, 65], "fourth": [35, 39, 44, 62], "mi": [35, 39, 88, 97], "markdown23": 35, "orang": [35, 39, 70, 73, 76, 77, 81, 85], "green": [35, 39, 61, 67, 69, 70, 73, 76, 85], "markdown24": 35, "ey": [35, 39, 62, 76, 80, 81, 94], "move_no": [35, 39], "thorough": [35, 88], "prior": [35, 36, 37, 52, 76, 80, 81, 84, 101, 102], "dig": 35, "emit": [35, 88], "altern": [35, 40, 57, 67, 85, 100, 102], "complementari": [35, 37], "\u03b8": 36, "somehow": [36, 39, 69], "perceiv": [36, 39, 101], "markdown31": 36, "markdown32": 36, "markdown33": 36, "markdown34": 36, "markdown35": 36, "markdown36": 36, "markdown37": 36, "markdown38": 36, "markdown39": 36, "omit": 36, "latent": [36, 81, 88], "uncertainti": 36, "salienc": 36, "plant": [36, 62], "inventori": 36, "acquir": 36, "latex": 36, "strength": [36, 40, 73, 100], "em": [36, 85], "sup": 36, "\u03c3": [36, 62], "strongest": [36, 39], "ratio": [36, 39, 40, 62, 85], "came": [36, 39, 40], "hyp": [36, 
39], "slower": [36, 39], "accum": [36, 39], "denot": [36, 39, 57, 60, 62, 64, 65, 70, 76, 81, 84], "perf": 36, "consecut": [36, 94], "express": [36, 37, 38, 57, 60, 67, 76, 80, 87, 94, 100, 102], "Be": [36, 38, 84, 91], "assumpt": [36, 70], "phenomenon": [36, 38, 67], "justifi": [36, 39], "lack": [36, 37, 57, 73], "clariti": [36, 37, 70], "satisfact": [37, 38], "empow": [37, 81, 101], "chose": [37, 40, 43, 74, 76, 80, 100], "physic": [37, 38], "granular": 37, "wider": [37, 70], "lumpabl": 37, "analyt": [37, 81, 82], "divers": 37, "Being": 37, "w1d1": [37, 46, 67], "meaningfulli": 37, "needless": 37, "highlight": [37, 40, 67, 81], "outlin": [37, 101], "thu": [37, 39, 46, 60, 62, 64, 65, 67, 70, 73, 80, 84, 88, 91, 94, 100], "huge": [37, 73, 76, 84], "facilit": [37, 88, 97, 101], "broken": 37, "portenti": 37, "hypothet": 37, "arrow": [37, 60, 61, 73, 85], "rough": 37, "icon": [38, 67], "easiest": 38, "surprisingli": [38, 62, 69], "insight": [38, 40, 67, 70, 74, 88, 91], "isn": [38, 67, 69, 73, 85], "equilibrium": 38, "asymptot": 38, "wrangl": [38, 57], "mistak": 38, "wore": 38, "useless": 38, "distract": 38, "alreali": 38, "determ": 38, "handi": 38, "satisfi": [38, 39, 65], "parametr": [38, 70, 102], "elimin": 38, "met": 38, "board": [38, 76, 87, 100, 102], "endless": 38, "neglect": 38, "warrant": 38, "qualit": 38, "upfront": 38, "breadth": 38, "bic": 38, "aic": 38, "subsumpt": 38, "uncov": [38, 74, 91, 101], "falsifi": 38, "leverl": 38, "avenu": 38, "experiment": [38, 40, 88, 101], "rethink": [38, 70], "implic": [38, 77, 81], "unanticip": 38, "consequ": [38, 61, 70, 77], "unbias": [38, 94], "disclaim": [39, 40, 67], "inner": [39, 62], "ear": [39, 76], "tricli": 39, "believ": [39, 62, 101], "ifram": [39, 44, 45], "mfr": 39, "57w2p": 39, "26mode": 39, "26action": 39, "scroll": [39, 57, 73], "641px": 39, "marginheight": 39, "framebord": 39, "allowfullscreen": 39, "webkitallowfullscreen": 39, "aka": 39, "problemat": 39, "invas": 39, "obvious": [39, 70, 80], "36": [39, 
70, 73, 76, 81, 82, 84, 85, 97], "7575": 39, "975": [39, 76], "m_r": 39, "m_p": 39, "c_": [39, 40, 57, 64, 102], "cdot": [39, 40, 57, 60, 61, 62, 64, 65, 70, 76, 77, 84, 91, 100, 102], "soon": [39, 60, 88], "glm": 39, "whiteboard": [39, 40], "belong": [39, 65, 69, 70, 73, 100, 102], "halfwin": 39, "getdesignmatrix": 39, "extent": [39, 64, 65], "movstim": 39, "win_idx": 39, "a_r": 39, "desmat": 39, "mov": [39, 40], "33475": 39, "53275": 39, "61975": 39, "saw": [39, 62, 64, 67, 73, 74, 76, 80, 87, 91], "graph": [39, 57, 61, 69, 73, 87, 94], "magnitud": [39, 57, 67, 69, 73, 80, 81], "classifymotionfromspik": 39, "presenc": [39, 73, 88], "runanalysi": 39, "050": 39, "class_set": 39, "halfwin_no": 39, "lty": 39, "leg_hw": 39, "classes_no": 39, "leg_class": 39, "purpl": [39, 62, 85], "motions_no": 39, "cond_acc": 39, "m_acc": 39, "plotaccuraci": 39, "accuarci": 39, "xlim": [39, 64, 80, 81], "proport": [39, 40, 62, 94], "reflect": [39, 46, 57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "clearer": [39, 94], "judgment": [39, 40], "notion": [39, 76], "benefit": [39, 76, 101], "wrt": [39, 67, 85], "comclus": 39, "justifc": 39, "adjac": [39, 40, 85], "unknown": [39, 40, 88, 101], "effort": [39, 67], "cumul": [39, 97], "instantan": 39, "causal": [39, 101], "built": [39, 40, 43, 57, 67, 70, 73, 76, 80, 82, 84, 88, 101], "roadblock": [39, 40], "somewher": [39, 40, 73, 101], "neuroscientist": [39, 74], "drift": 40, "diffus": [40, 46, 62, 84], "establish": [40, 69, 74, 88], "frac": [40, 60, 64, 65, 67, 70, 73, 74, 76, 81, 82, 84, 91, 100, 102], "leakag": [40, 64], "instal": [40, 54], "vestibular_sign": 40, "sd": 40, "white": [40, 73, 76, 77, 94, 100], "1m": 40, "exp": [40, 64, 74, 80, 84, 100, 102], "diff": [40, 85], "leaki": 40, "thr": 40, "run_model": 40, "selfmot": 40, "thrshold": 40, "itertool": 40, "temp": [40, 57, 69, 70, 76, 81, 100, 102], "hypothsi": 40, "panel": [40, 54, 64], "layout": [40, 61, 62, 64, 65, 76, 80], "constrained_layout": [40, 61], "absent": [40, 
85], "mov_": 40, "thr_": 40, "sig_": 40, "thr_n": 40, "c_n": 40, "subdf0": 40, "groupbi": 40, "subdf1": 40, "im0": 40, "im1": [40, 62], "set_ylim": [40, 61, 62, 67, 73, 81], "set_xlim": [40, 61, 62, 76, 81], "set_facecolor": 40, "grei": [40, 61, 76, 81], "redund": 40, "0004": 40, "d0": [40, 67], "d1": [40, 67, 73, 76], "201": [40, 76], "satur": 40, "push": [40, 43], "probabilist": [40, 67, 80], "dokka": 40, "sam": 43, "rai": [43, 76], "konrad": [43, 57, 73, 74, 80, 88, 91, 101], "modern": [43, 46, 60, 67, 84, 88, 101], "maintain": [43, 67, 73, 88], "ui": [43, 73, 76], "servic": [43, 101], "flasgger": 43, "pyngrok": 43, "vibecheck": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "datatop": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "datatopscontentreviewcontain": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "content_review": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "notebook_sect": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "prompt": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "pmyvdlilci": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "east": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "amazonaw": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 
94, 96, 97, 99, 100, 101, 102], "klab": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "neuromatch_dl": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "user_kei": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "f379rz8y": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "feedback_prefix": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "bonus_deplooymodel": 43, "atexit": 43, "subprocess": [43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "timer": 43, "_run_ngrok": 43, "popen": 43, "regist": [43, 54, 60, 61, 77], "localhost_url": 43, "localhost": 43, "4040": 43, "tunnel": 43, "sleep": [43, 61, 73, 76, 94], "tunnel_url": 43, "public_url": 43, "start_ngrok": 43, "ngrok_address": 43, "traffic": [43, 76], "127": [43, 76], "run_with_ngrok": 43, "expos": [43, 85], "stdout": 43, "old_run": 43, "new_run": 43, "5000": [43, 57, 62, 67, 81, 84, 85], "setdaemon": 43, "urllib": [43, 76, 94, 100, 102], "urlopen": [43, 76, 94, 100, 102], "flask_rest": 43, "marshal": 43, "render_template_str": 43, "redirect": [43, 57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "authent": 43, "mail": [43, 76], "dashboard": 43, "authtoken": 43, "your_ngrok_authtoken": 43, "ngrok2": 43, "yml": 43, "delet": [43, 73, 85], "visit": [43, 67, 102], "_deploying_neural_networks_on_the_web_video": 43, "micro": 43, "scalabl": [43, 101], "nowadai": 43, "socket": 43, "linkedin": 43, "pinterest": 43, "_flask_video": 43, "handler": [43, 85, 87], "trick": [43, 67, 70, 74, 91], "server": 43, "__main__": 
43, "wsgi": 43, "press": [43, 61, 76, 91], "ctrl": 43, "xxx": 43, "xx": [43, 57, 60, 61, 81], "button": [43, 53, 54, 61, 67], "site": [43, 65, 69, 70, 76, 80, 81, 82, 85, 87, 88, 94, 100], "manual": [43, 64, 67, 69, 70, 73, 81, 84], "rout": 43, "h1": [43, 82], "_jinja_templates_video": 43, "offer": [43, 57, 67, 84, 85, 94], "reusabl": 43, "WIth": 43, "ifs": 43, "tabl": [43, 57, 62, 73, 76], "template_str": 43, "margin": [43, 77, 81, 82], "100px": [43, 80], "tr": 43, "200px": 43, "td": [43, 97], "endfor": 43, "unam": 43, "_asdict": 43, "_using_the_mvvm_design_pattern_video": 43, "gui": 43, "pointmodel": 43, "pointview": 43, "pointviewmodel": 43, "get_sample_data": 43, "viewmodel": 43, "classmethod": 43, "cl": [43, 84, 88], "add_resourc": 43, "pvm": 43, "_rest_api_video": 43, "platformview": 43, "swag_from": 43, "spec_dict": 43, "processor": [43, 88], "node": [43, 60, 62, 65, 67, 70, 76, 102], "arch": [43, 76], "resource_field": 43, "serial": [43, 82], "platformviewmodel": 43, "apidoc": 43, "redirect_platform": 43, "swagger": 43, "swg": 43, "_vue": 43, "js_video": 43, "front": [43, 87, 97], "benefici": [43, 65], "similarli": [43, 57, 61, 80, 85, 94, 101], "javascript": [43, 88], "axoi": 43, "mount": 43, "vue_templ": 43, "script": [43, 94], "cdn": 43, "jsdelivr": 43, "npm": 43, "dist": [43, 81], "cdnj": 43, "cloudflar": 43, "ajax": 43, "lib": [43, 65, 69, 70, 80, 81, 82, 85, 87, 94], "axio": 43, "ul": 43, "li": [43, 60, 73], "var": [43, 65, 80, 91], "el": 43, "_deploying_a_pytorch_model_video": 43, "densenet": 43, "_classification_with_a_pretrained_model_video": 43, "traini": 43, "densenet121": 43, "class_labels_url": 43, "hub": [43, 76], "imagenet_class": 43, "class_label": 43, "image_tensor": 43, "higherst": 43, "class_id": [43, 67], "class_nam": 43, "dog_imag": 43, "unsplash": 43, "2l0cwtpcchi": 43, "480": [43, 76], "foxhound": [43, 76], "_create_a_dynamic_application_video": 43, "index_templ": 43, "imageform": 43, "enctyp": 43, "multipart": 43, "imagefil": 43, 
"10px": 43, "250px": 43, "50px": 43, "32px": 43, "bold": [43, 85], "20px": 43, "getelementbyid": 43, "addeventlisten": 43, "formdata": 43, "preventdefault": 43, "const": 43, "createobjecturl": 43, "predict_api": 43, "image_fil": 43, "image_byt": 43, "_deploy_on_heroku_video": 43, "paa": 43, "tier": 43, "_prepare_python_environment_video": 43, "venv": 43, "linux": 43, "maco": 43, "bat": 43, "gunicorn": 43, "torchaudio": 43, "caus": [43, 61, 64, 73, 94], "_creating_a_local_application_video": 43, "send_from_directori": 43, "transofrm": 43, "getenv": 43, "_preparing_for_heroku_video": 43, "coupl": [43, 62, 67, 69, 73, 80, 89], "procfil": 43, "freez": [43, 67, 85, 88, 89], "exce": [43, 73], "whl": 43, "torch_stabl": 43, "_deploying_on_heroku_video": 43, "cli": 43, "login": 43, "git": [43, 53, 76, 77, 87, 89, 94, 100, 102], "remot": [43, 76], "lt": 43, "herokuapp": 43, "_summary_video": 43, "middlebrook": [44, 45], "host": [44, 45], "panelist": [44, 45], "lyle": [44, 57, 69, 70, 74, 87, 88, 89, 91, 101], "ungar": [44, 57, 69, 70, 74, 87, 88, 89, 91, 101], "surya": [44, 57, 64, 65], "ganguli": [44, 57, 64, 65], "braininspir": [44, 45], "casto": [44, 45], "player": [44, 45, 76], "596518": 44, "fifth": 45, "brad": 45, "wybl": 45, "kyunghyun": 45, "cho": 45, "jo\u00e3o": 45, "sedoc": 45, "612309": 45, "live": [46, 57, 62, 84, 89], "tbd": 46, "ceremoni": 46, "utc": [46, 48], "pm": 46, "tue": 46, "wed": 46, "multilay": [46, 73], "perceptron": [46, 57, 64, 65, 67, 73], "fri": 46, "vae": 46, "synchron": [46, 73, 76], "eod": 46, "swap": [46, 57, 84, 85, 88], "farewel": 46, "graduat": 46, "portal": [46, 52], "goodby": 46, "impos": 46, "quarter": 46, "crowdcast": [46, 52], "zone": 48, "tz": 49, "launch": [51, 53, 54, 57], "setup": 51, "spot": [51, 76], "unusu": 51, "2022": [52, 91], "violat": 52, "precours": 52, "exempt": 52, "shrubhlgswj8dua7": 52, "shrjdpfwacarn5jop": 52, "assist": 52, "circumst": 52, "beyond": [52, 62, 70], "electr": [52, 76, 94], "blackout": 52, "grant": [52, 
54], "elig": 52, "overwrit": 53, "china": [54, 76, 88], "substitut": [54, 85], "asococi": 54, "workaround": [54, 100, 102], "sidebar": 54, "credenti": 54, "artwork": [55, 82], "daniela": 55, "buchwald": 55, "shubh": 57, "pachchigar": 57, "matthew": 57, "sargent": 57, "deepak": [57, 94], "raya": [57, 94], "siwei": [57, 67, 69, 70], "bai": [57, 67, 69, 70, 76], "kelson": [57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "shill": [57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "scrivo": [57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "anoop": [57, 60, 61, 62, 64, 65, 76, 77, 84, 85, 94], "kulkarni": [57, 60, 61, 62, 64, 65, 76, 77, 84, 85, 94], "arush": [57, 67, 76, 77, 100, 102], "tagad": [57, 67, 76, 77, 100, 102], "naivenet": 57, "preinstal": 57, "fulfil": 57, "w1d1_t1": 57, "getlogg": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 87, 89, 94, 100, 102], "font_manag": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 87, 89, 94], "checkexercise1": 57, "array_equ": 57, "vander": 57, "timefun": 57, "bufferedread": 57, "t_total": 57, "5f": [57, 82], "mess": 57, "insert": [57, 85], "25min": [57, 73, 76, 80], "adventur": 57, "_welcome_and_history_video": 57, "_why_dl_is_cool_video": 57, "multidimension": 57, "modular": 57, "deploi": [57, 94, 102], "_making_tensors_video": 57, "2000": [57, 80, 81], "3000": 57, "float64": [57, 100, 102], "9053e": 57, "0855e": 57, "8856e": 57, "4842e": 57, "seemingli": 57, "rand_lik": 57, "9728": 57, "2797": 57, "9038": 57, "3048": 57, "7761": 57, "7133": 57, "8749": 57, "0401": 57, "0865": 57, "8609": 57, "5365": 57, "4012": 57, "8520": 57, "9674": 57, "7411": 57, "6957": 57, "3873": 57, "rng": 57, "simplefun": 57, "my_se": [57, 82], "4963": 57, "7682": 57, "0885": 57, "3643": 57, "1344": 57, "1642": 57, "3058": 57, "2100": 57, "9056": 57, "6035": 57, "8110": 57, "0451": 57, "8797": 57, "0482": 57, 
"familar": 57, "sim": [57, 65, 80, 81, 82], "mathcal": [57, 61, 62, 64, 65, 74, 80, 81, 82, 84], "dagger": [57, 67, 76, 84], "inclus": 57, "tensor_cr": 57, "notimplementederror": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 97, 100, 102], "_creating_tensors_exercis": 57, "_tensors_operators_video": 57, "pointwis": 57, "0362": 57, "1852": 57, "3734": 57, "3051": 57, "9320": 57, "1759": 57, "2698": 57, "1507": 57, "0317": [57, 67], "2081": 57, "9298": 57, "7231": 57, "7423": 57, "5263": 57, "2437": 57, "overridden": [57, 85], "arithmet": 57, "lift": 57, "elementwis": [57, 84], "3333": 57, "syntax": [57, 88], "equival": [57, 64, 65, 67, 81, 82], "5846": 57, "0332": 57, "1387": 57, "2422": 57, "8155": [57, 73], "7932": 57, "2783": 57, "4820": 57, "8198": 57, "187318325042725": 57, "1051": 57, "3306": 57, "7517": 57, "7565": 57, "8509": 57, "5800": 57, "46525758504867554": 57, "3684": 57, "4435": 57, "5839": 57, "2522": 57, "6170": 57, "5267": 57, "matmul": [57, 80, 89], "bracket": [57, 88], "textbf": [57, 73], "bmatrix": [57, 60, 62, 73], "simple_oper": 57, "a1": 57, "a2": 57, "a3": 57, "matrici": 57, "dot_product": 57, "b1": 57, "b2": 57, "geometr": [57, 73, 76, 84], "cosin": [57, 62, 82, 87, 89, 94], "_simple_tensor_operations_exercis": 57, "_manipulating_tensors_video": 57, "last_el": 57, "exclud": [57, 61, 62, 73, 80], "5d": [57, 87], "3x4": 57, "subtl": 57, "singleton": [57, 80, 94], "compress": [57, 62, 73, 80], "opposit": [57, 100], "zeroth": 57, "7391": 57, "8027": 57, "6817": 57, "1335": 57, "0658": 57, "5919": 57, "7670": 57, "6899": 57, "3282": 57, "5085": 57, "peski": 57, "gave": 57, "rid": [57, 85], "7390837073326111": 57, "times48": 57, "times64": 57, "times3": 57, "image_height": [57, 73], "image_width": [57, 73, 76], "0th": 57, "2nd": [57, 82], "cat_row": 57, "cat_col": 57, "colum": 57, "convers": [57, 91], "minor": [57, 74], "inconveni": 57, "halt": 57, "chunk": [57, 73, 87, 88], "introduc": [57, 60, 69, 70, 73, 76, 80, 84, 
87, 88, 97, 101], "2659": 57, "5148": 57, "0613": 57, "5046": 57, "1385": 57, "floattensor": [57, 100, 102], "26593232": 57, "5148316": 57, "06128114": 57, "5046449": 57, "13848118": 57, "invok": [57, 67], "elmement": 57, "functiona": 57, "my_tensor1": 57, "my_tensor2": 57, "retun": 57, "functionb": 57, "my_tensor": 57, "idx_tensor": 57, "functionc": 57, "_manipulating_tensors_exercis": 57, "_gpu_vs_cpu_video": 57, "rerun": [57, 94], "reimport": 57, "nvidia": 57, "pure": [57, 69, 70, 82, 94, 100], "whilst": 57, "unless": [57, 65, 67, 84, 88], "lazili": [57, 85], "fashion": [57, 84, 88], "forth": 57, "compatibl": 57, "recreat": [57, 76], "74070": 57, "87535": 57, "_how_much_faster_are_gpus_exercis": 57, "_gpus_discuss": 57, "_getting_data_video": 57, "fortun": 57, "vehicl": [57, 76], "cifar10_data": 57, "airplan": 57, "automobil": 57, "reorder": 57, "rearrang": [57, 84], "input_var": 57, "_display_an_image_exercis": 57, "_train_and_test_video": 57, "adress": 57, "training_data": 57, "rese": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "worker_init_fn": [57, 64, 65, 67, 69, 70, 73, 76, 80, 82], "g_seed": [57, 64, 65, 67, 69, 70, 73, 76, 80], "batch_imag": 57, "batch_label": 57, "predefin": [57, 73], "checkout": 57, "excercis": 57, "my_data_load": 57, "3309": 57, "_load_cifar10_exercis": 57, "_csv_files_video": 57, "interleav": 57, "circl": [57, 61, 80, 100], "sample_data": 57, "make_moon": 57, "to_csv": 57, "x_orig": 57, "to_numpi": 57, "y_orig": 57, "interg": 57, "_generating_neural_network_video": 57, "differend": 57, "obligatori": 57, "x_sampl": 57, "nnetwork": 57, "y_predict": [57, 60, 61], "npredict": [57, 85], "9066": 57, "5052": 57, "2024": 57, "1226": [57, 65], "0685": 57, "2809": 57, "6720": 57, "5097": 57, "8548": 57, "5122": 57, "1543": 57, "8018": 57, "2077": 57, "9859": 57, "5745": 57, "1924": 57, "8367": 57, "1818": 57, "8301": 57, "grad_fn": [57, 60, 73], "addmmbackward": 57, 
"_classify_some_examples_exercis": 57, "_train_the_network_video": 57, "jonchar": 57, "kera": 57, "pathlib": [57, 65, 69, 70, 85], "plot_decision_boundari": 57, "frames_path": 57, "x_min": 57, "x_max": [57, 64, 65], "y_min": 57, "y_max": 57, "yy": [57, 60, 61, 81], "meshgrid": [57, 60, 64, 65, 67, 81], "gid": 57, "grid_point": 57, "contour": [57, 61, 67, 80, 81], "contourf": [57, 60, 61], "spectral": [57, 74], "correcspond": 57, "transmit": 57, "loss_funct": [57, 60], "15000": 57, "y_logit": 57, "1000th": 57, "05d": 57, "6582635641098022": 57, "2830354869365692": 57, "24354352056980133": 57, "23178495466709137": 57, "4000": [57, 94], "22571030259132385": 57, "2219410538673401": 57, "6000": 57, "21937936544418335": 57, "7000": 57, "21753723919391632": 57, "21614307165145874": 57, "9000": 57, "21508803963661194": 57, "21437251567840576": 57, "11000": 57, "21384570002555847": 57, "12000": 57, "21345028281211853": 57, "13000": 57, "21314124763011932": 57, "14000": 57, "2128836214542389": 57, "interactiveshel": 57, "ast_node_interact": 57, "gif": 57, "mimsav": 57, "gifpath": 57, "_play_with_it_video": 57, "_tweak_your_network_discuss": 57, "_xor_widget_video": 57, "exclus": [57, 67, 76], "odd": 57, "gate": 57, "inequ": 57, "alik": 57, "hline": [57, 73], "tensorflow": [57, 60, 88], "perfectli": [57, 69, 91], "tini": [57, 60], "infinit": [57, 62, 64, 97], "x_1": [57, 65, 91], "x_2": [57, 65, 91], "w1_min_xor": 57, "theses": 57, "voila": 57, "_xor_interactive_demo": 57, "_ethics_video": 57, "_be_a_group_video": 57, "_syllabus_video": 57, "andrew": [57, 60, 61, 62, 91], "sax": [57, 60, 61, 62, 76], "ioanni": [57, 67], "mitliagka": [57, 67], "alona": [57, 73], "fysh": [57, 73], "alexand": [57, 76, 77, 84], "ecker": [57, 76, 77], "jame": 57, "evan": 57, "vikash": [57, 80], "gilja": [57, 80], "akash": 57, "srivastava": [57, 73], "tim": [57, 94, 100, 102], "lillicrap": [57, 94], "blake": [57, 94, 100, 102], "richard": [57, 73, 76, 77, 80, 94, 97, 100, 102], "jane": 57, 
"feryal": 57, "behbahani": 57, "josh": 57, "vogelstein": 57, "vincenzo": 57, "lamonaco": 57, "iclr": 57, "patient": [57, 61, 62, 65, 67, 70, 74, 76, 80, 82, 84, 88, 91, 94, 97, 101], "delai": [57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "strobelt": 57, "mit": [57, 88], "ibm": 57, "watson": 57, "hoover": 57, "retreiv": 57, "allenai": 57, "s2orc": 57, "methodolog": 57, "alt": [57, 73], "gltr": 57, "ml_regexv1_cs_ma_cit": 57, "_99perc": 57, "pos_umap_cosine_100_d0": 57, "pos_fil": 57, "qyrfn": 57, "_99perc_clean": 57, "meta_fil": 57, "vfdu6": 57, "load_data": 57, "merg": [57, 88], "paper_id": 57, "read_json": 57, "meta": [57, 101], "left_on": 57, "right_on": 57, "year_period": 57, "quinquenni": 57, "selection_multi": 57, "chart": 57, "citation_count": 57, "mark_circl": 57, "opac": [57, 87], "viridi": [57, 76], "clamp": [57, 82, 94], "1955": 57, "pow": 57, "expon": [57, 76], "tooltip": 57, "decad": [57, 76], "add_select": 57, "distant": 57, "publish": [57, 76], "citat": [57, 88], "hover": [57, 73, 77], "boom": 57, "winter": [57, 62], "mileston": 57, "_bonus_section_discuss": 57, "fullfil": 57, "criterria": 57, "bleu": 57, "specter": 57, "umap": 57, "rush": [57, 84], "ignorecas": 57, "issel": 57, "na": [57, 70], "0000000001": 57, "ON": 57, "vi": [57, 100, 102], "stroke": [57, 87], "strokeopac": 57, "strokewidth": 57, "colaboratori": 57, "faq": 57, "deeplearningbook": 57, "ian": 57, "goodfellow": 57, "yoshua": 57, "bengio": [57, 65], "aaron": 57, "courvil": 57, "w1d2_bonuslectur": 59, "_yoshua_bengio_video": 59, "turishcheva": [60, 61, 62, 73, 76, 77], "antoin": [60, 61, 62, 64], "comit": [60, 61, 62, 64], "khalid": [60, 61, 62, 84, 85, 94], "almubarak": [60, 61, 62, 84, 85, 94], "skillset": 60, "w1d2_t1": 60, "mpl_toolkit": [60, 61, 62], "axes_grid1": [60, 61, 62], "make_axes_locat": [60, 61, 62], "ex3_plot": 60, "lss": 60, "mse": [60, 61, 62, 80, 81, 102], "ex1_plot": 60, "fun_z": 60, "fun_dz": 60, "sine": [60, 64, 82], "zz": [60, 61], "xg": 60, "yg": 60, 
"xxg": 60, "yyg": 60, "zxg": 60, "zyg": 60, "contplt": [60, 61], "quiver": [60, 61, 81], "cax": [60, 61, 62], "append_ax": [60, 61, 62], "cbar": [60, 61, 62, 77], "set_label": [60, 61, 77], "workhors": 60, "poorli": [60, 67], "_introduction_video": [60, 64, 67, 84, 94, 100], "risk": [60, 69], "_gradient_descent_video": 60, "clarifi": [60, 74, 91, 101], "dfrac": [60, 61, 62], "equiv": 60, "circ": [60, 80], "dx": [60, 62], "dg": 60, "dh": 60, "rewrit": 60, "2x": [60, 67], "2y": 60, "_gradient_vector_analytical_exercis": 60, "dz_dx": 60, "dz_dy": 60, "x_0": [60, 81, 82], "y_0": 60, "landscap": [60, 67], "steep": 60, "plateau": [60, 62], "minima": [60, 61, 67, 70, 100, 102], "maxima": 60, "aforement": [60, 73], "formula": [60, 64, 65, 81, 91, 102], "1847": 60, "augustin": 60, "loui": [60, 64], "cauchi": 60, "_gradient_vector_exercis": 60, "_gradient_descent_discussion_video": 60, "mathbf": [60, 61, 62, 80, 81, 82], "rightarrow": [60, 61], "nabla": [60, 61, 67, 81], "w_d": [60, 80], "guess": [60, 61, 62, 69], "qquad": 60, "learnabl": [60, 61, 73, 80], "w_2": [60, 61], "ln": [60, 81], "_gradients_analytical_exercis": 60, "_computational_graph_video": 60, "overwhelm": 60, "extraordinarili": 60, "beast": [60, 76, 81], "backpropag": [60, 65], "oper": [60, 61, 62, 64, 67, 70, 73, 76, 80, 82, 84], "_chain_rule_analytical_exercis": 60, "_autodifferentiation_video": 60, "declar": 60, "rebuild": 60, "simplegraph": 60, "sq_loss": 60, "y_true": [60, 61], "squre": 60, "simple_graph": 60, "niniti": 60, "square_loss": 60, "arbitrarili": [60, 100], "_building_a_computational_graph_exercis": 60, "interconnect": 60, "acycl": 60, "addbackward": 60, "addbackward0": 60, "0x7fb5727ae700": 60, "nameerror": [60, 62, 64, 65, 70, 73, 76, 80, 81, 84, 85, 87, 100, 102], "traceback": [60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 102], "y_t": [60, 61], "y_p": [60, 61], "grad": [60, 67, 81], "contagi": 60, "leaf": [60, 76, 87, 102], "method_nam": 60, "my_object": 60, 
"ana_dloss_dw": 60, "ana_dloss_db": 60, "autograd_dloss_dw": 60, "autograd_dloss_db": 60, "28": [60, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 88, 91], "gentl": 60, "_pytorch_nn_module_video": 60, "pack": [60, 76, 88], "n_sampl": [60, 62, 80], "widenet": 60, "wide_net": [60, 65], "stochstic": 60, "003": [60, 81], "sgd_optim": 60, "888942301273346": 60, "loss_fun": 60, "loss_record": [60, 61], "recod": 60, "exercic": 60, "physiqu": 60, "mathematiqu": 60, "_training_loop_exercis": 60, "_tutorial_1_wrapup_video": 60, "w1d2_t2": 61, "intslid": [61, 62, 67, 73, 80], "floatslid": [61, 62, 64, 67, 73, 80], "hbox": [61, 73, 76, 80], "interactive_output": [61, 62, 73, 76, 80], "togglebutton": 61, "plot_x_y_": 61, "x_t_": 61, "y_t_": 61, "x_ev_": 61, "y_ev_": 61, "loss_log_": 61, "weight_log_": 61, "shallownarrownet": 61, "plot_vector_field": 61, "init_weight": [61, 65, 87], "x_po": 61, "endpoint": 61, "y_po": 61, "mgrid": 61, "empty_lik": 61, "x_temp": 61, "y_temp": 61, "gen_sampl": 61, "plasma": 61, "temp_model": 61, "shallownarrowlnn": 61, "da": [61, 81, 82, 88, 97], "dloss_dw": 61, "temp_record": 61, "zorder": [61, 73], "red": [61, 67, 69, 70, 73, 76, 85, 94], "coolwarm": [61, 67], "plot_loss_landscap": 61, "loss_rec_1": 61, "w_rec_1": 61, "loss_rec_2": 61, "w_rec_2": 61, "plot_surfac": [61, 67], "scatter3d": [61, 67], "view_init": [61, 67], "260": [61, 76], "depth_widget": 61, "depth_lr_init_interplai": 61, "lr_widget": 61, "depth_lr_interplai": 61, "deepnarrowlnn": 61, "w_i": 61, "yscale": 61, "plot_init_effect": 61, "init_w": 61, "ncol": [61, 64, 67], "interplai": [61, 67], "min_depth": 61, "max_depth": 61, "depth_list": 61, "i_depth": 61, "min_lr": 61, "max_lr": 61, "slider": [61, 62, 67, 73, 76, 80, 81, 85], "button_styl": 61, "danger": [61, 70], "argwher": [61, 102], "add_gridspec": 61, "ax3": [61, 73], "set_yscal": 61, "datapoint": [61, 64, 65, 67, 69, 70, 80, 81, 82], "offset": [61, 67, 80, 84, 87], "evenli": [61, 76], "w1": [61, 85], "w2": [61, 85], 
"dloss_dw1": 61, "dloss_dw2": 61, "n_ep": [61, 62], "corrspond": 61, "weight_record": 61, "thin": [61, 80, 84], "dw": [61, 81], "ex": 61, "wp": 61, "isinf": 61, "_shallow_narrow_linear_net_video": 61, "incred": [61, 74], "dissect": 61, "comprehend": 61, "occas": 61, "compact": 61, "pressur": 61, "_loss_gradients_analytical_exercis": 61, "shallownarrowexercis": 61, "shallownarrow": 61, "netwrok": 61, "211": [61, 76], "initial_weight": 61, "x_eval": 61, "sn_model": 61, "loss_log": [61, 67], "weight_log": 61, "y_eval": 61, "_simple_narrow_lnn_exercis": 61, "_training_landscape_video": 61, "1x": 61, "ribbon": 61, "yellow": [61, 73, 76, 85, 94], "crowd": 61, "saddl": [61, 62], "_training_landscape_discussion_video": 61, "clever": 61, "_effect_of_depth_video": 61, "realiti": 61, "incap": [61, 62], "unseen": [61, 67, 85, 101], "w_": [61, 62, 65, 67, 70, 100], "vanish": [61, 65, 76], "chain": [61, 62, 76, 81, 85], "vulnerabl": 61, "impair": [61, 94], "fastest": [61, 62], "eventu": [61, 80], "continuous_upd": [61, 62, 80], "_effect_of_depth_discussion_video": 61, "trade": [61, 67, 102], "_learning_rate_video": 61, "045": 61, "readout_format": [61, 62], "_learning_rate_discussion_video": 61, "_depth_and_learning_rate_video": 61, "deliv": 61, "confid": [61, 67, 73, 88, 100, 102], "impli": [61, 64, 100], "intpl_obj": 61, "500px": 61, "widgets_ui": [61, 62, 80], "widgets_out": [61, 62, 80], "_depth_and_learning_rate_interactive_demo": 61, "_depth_and_learning_rate_discussion_video": 61, "_initialization_matt": 61, "_initialization_matters_discussion_video": 61, "_wrapup_video": 61, "overflow": [61, 73], "difficulti": [61, 67], "_hyperparameter_interaction_bonus_discuss": 61, "ethic": 62, "w1d2_t3": 62, "floatlogslid": [62, 67, 76], "vbox": [62, 73, 76, 80], "filterwarn": [62, 94], "plot_x_y_hier_data": 62, "im2": 62, "subplot_ratio": 62, "hierarch": [62, 65], "ax0": 62, "plot_x_y_hier_on": 62, "plot_tree_data": 62, "label_list": [62, 87], "feature_arrai": 62, "new_featur": 62, 
"listedcolormap": 62, "cyan": [62, 85], "magenta": [62, 70, 77], "n_featur": 62, "n_label": 62, "feature_list": 62, "can_grow": 62, "is_mamm": 62, "has_leav": 62, "can_mov": 62, "has_trunk": 62, "can_fli": 62, "can_swim": 62, "has_stem": 62, "is_warmblood": 62, "can_flow": 62, "goldfish": [62, 76], "tuna": 62, "robin": [62, 76], "canari": 62, "rose": [62, 76], "daisi": [62, 76], "pine": 62, "oak": 62, "implt": 62, "set_yticklabel": 62, "set_ytick": [62, 80], "set_xtick": [62, 76, 80], "set_xticklabel": [62, 76], "loss_arrai": 62, "plot_loss_sv": 62, "sv_arrai": 62, "n_sing_valu": 62, "set1": 62, "plot1": [62, 69], "plot2": [62, 69], "plot_loss_sv_twin": 62, "tick_param": 62, "labelcolor": 62, "twinx": 62, "plot_ills_sv_twin": 62, "ill_arrai": 62, "ill_label": 62, "plot_loss_sv_rsm": 62, "rsm_arrai": 62, "i_ep": 62, "yaxi": 62, "tick_right": 62, "implot": 62, "rsm": 62, "item_nam": 62, "axvspan": 62, "build_tre": 62, "n_level": 62, "n_branch": 62, "to_np_arrai": 62, "pflip": 62, "sample_from_tre": 62, "n_item": 62, "rand_temp": 62, "flip_temp": 62, "samp": 62, "prop": 62, "generate_hsd": 62, "tree_label": 62, "tree_featur": 62, "linear_regress": 62, "linalg": [62, 77, 80, 81, 85, 87, 89], "inv": [62, 81], "dy": 62, "add_featur": 62, "existing_featur": 62, "hstack": 62, "net_svd": 62, "in_dim": [62, 67], "orthogon": [62, 89, 94], "w_tot": 62, "net_rsm": 62, "initializer_": 62, "n_out": 62, "n_in": 62, "normal_": [62, 69, 70, 82], "test_initializer_ex": 62, "lnnet": 62, "ex_initializer_": 62, "faulti": 62, "test_net_svd_ex": 62, "net_svd_ex": 62, "u_ex": 62, "\u03c3_ex": 62, "v_ex": 62, "ex_net_svd": 62, "isclos": 62, "atol": 62, "test_net_rsm_ex": 62, "net_rsm_ex": 62, "y_ex": 62, "ex_net_rsm": 62, "tight": [62, 87], "timelin": 62, "hid_dim": 62, "out_dim": [62, 67], "ouput": 62, "in_hid": 62, "hid_out": 62, "hid": 62, "illusory_i": 62, "input_dim": [62, 82], "rs_mat": 62, "pred_ij": 62, "mu": [62, 65, 74, 80, 81], "n_": [62, 65, 76], "underscor": [62, 85], 
"_reinitialization_exercis": 62, "_intro_to_representation_learning_video": 62, "shallow": [62, 87], "hardli": [62, 87], "sacrif": 62, "syntact": [62, 84], "swim": [62, 76], "fin": 62, "cast": [62, 76], "label_tensor": 62, "feature_tensor": 62, "vital": [62, 70], "premis": [62, 85], "dim_input": 62, "dim_hidden": 62, "dim_output": 62, "dlnn_model": 62, "bump": 62, "loss_lr_init": 62, "1f": [62, 67], "_training_the_deep_lnn_interactive_demo": 62, "_svd_video": 62, "prod_": [62, 74], "tot": 62, "2013": [62, 91], "decompos": [62, 88], "untangl": 62, "matix": 62, "evd": 62, "eigenvector": [62, 80], "_svd_exercis": 62, "_svd_discuss": 62, "_svd_discussion_video": 62, "_rsa_video": 62, "remark": [62, 67, 74, 88, 101], "smallest": [62, 82], "fish": [62, 76], "loss_svd_rsm_lr_gamma": 62, "i_ep_slid": 62, "630px": 62, "lr_slider": 62, "gamma_slid": 62, "moment": [62, 73, 80], "naiv": [62, 81], "suprem": 62, "unsurprisingli": 62, "_rsa_exercis": 62, "_rsa_discussion_video": 62, "_illusorycorrelations_video": 62, "sudden": 62, "tempt": 62, "immatur": 62, "bone": [62, 101], "distinct": [62, 65], "shark": [62, 76], "skeleton": [62, 73, 94], "cartilagin": 62, "lighter": [62, 76], "illusion_idx": 62, "its_label": 62, "has_bon": 62, "ill_predict": 62, "medic": [62, 69], "parrot": 62, "cannot_speak": 62, "_illusory_correlations_exercis": 62, "_illusory_correlations_discussion_video": 62, "_outro_video": [62, 65, 100], "_linear_regression_bonus_video": 62, "gp": 62, "air": [62, 80], "predictor": 62, "multivari": 62, "pop": [62, 76, 85, 97], "y_": 62, "vdot": 62, "ddot": 62, "x_": [62, 64], "b_": 62, "broadcast": [62, 81, 82], "notat": [62, 80, 81, 88, 97], "underset": [62, 89, 100, 102], "mathrm": [62, 65, 84], "argmin": [62, 97], "invert": [62, 80], "linear_regression_exercis": 62, "w_true": 62, "w_estim": 62, "nestim": 62, "_analytical_solution_to_lr_exercis": 62, "deep_w_tot": 62, "analytical_weight": 62, "lrnet": 62, "in_out": 62, "lr_model": 62, "zero_depth_model": 62, 
"lr_model_weight": 62, "allclos": [62, 81, 82, 94], "_linear_regression_discussion_video": 62, "arash": [64, 65, 80], "ash": [64, 65, 76, 80], "felix": [64, 65], "bartsch": [64, 65], "yu": [64, 65, 73, 76, 77], "fang": [64, 65, 73, 76, 77], "yang": [64, 65, 73, 76, 77, 81, 82, 88, 97], "melvin": [64, 65, 76, 77, 84, 85, 94, 100, 102], "selim": [64, 65, 76, 77, 84, 85, 94, 100, 102], "atai": [64, 65, 76, 77, 84, 85, 94, 100, 102], "arguabl": [64, 76, 91, 101], "tractabl": 64, "w1d3_t1": 64, "helper": [64, 69, 76, 81], "unnormalis": [64, 65], "unnorm": [64, 65, 69, 70, 76, 81], "npimg": [64, 65, 69, 70], "plot_function_approxim": 64, "relu_act": 64, "y_hat": 64, "incom": [64, 76, 85, 88], "basi": [64, 69, 82], "_universal_approximation_theorem_video": 64, "inflect": 64, "_c": 64, "0x7f8734c83830": 64, "2871": 64, "6413": 64, "8615": [64, 73], "3649": 64, "6931": 64, "7542": [64, 67], "5983": 64, "7588": 64, "3569": 64, "6389": 64, "approximate_funct": 64, "n_relu": 64, "i_relu": 64, "combination_weight": 64, "prev_slop": 64, "delta_x": 64, "_function_approximation_with_relu_exercis": 64, "1hr": 64, "_building_mlps_in_pytorch_video": 64, "lipschitz": 64, "prove": [64, 65, 101], "fascin": 64, "terminolog": [64, 94], "negative_slop": [64, 65], "ge": 64, "actv": [64, 65], "input_feature_num": [64, 65], "hidden_unit_num": [64, 65], "output_feature_num": [64, 65], "in_num": [64, 65], "temporari": [64, 65], "out_num": [64, 65], "linear_": [64, 65], "actv_lay": [64, 65], "activation_": [64, 65], "out_lay": [64, 65], "output_linear": [64, 65], "_implement_a_general_purpose_mlp_in_pytorch_exercis": 64, "_cross_entropy_video": 64, "addition": [64, 100], "_i": [64, 74, 80], "operatornam": [64, 89, 100, 102], "sum_": [64, 65, 69, 74, 77, 80, 100], "cross_entropy_loss": [64, 100, 102], "avg_loss": [64, 82], "x_of_label": 64, "pytorch_loss": 64, "our_loss": 64, "8f": 64, "34672737": 64, "34672749": 64, "00000012": 64, "_implement_batch_cross_entropy_loss_exercis": 64, "fanci": 64, 
"quad": 64, "leq": [64, 65], "ldot": 64, "create_spiral_dataset": 64, "_training_and_evaluating_an_mlp_video": 64, "shuffle_and_split_data": [64, 65], "shuffled_indic": [64, 65], "datset": [64, 65, 76], "_implement_it_for_a_classification_task_exercis": 64, "multithread": [64, 65], "train_test_classif": [64, 65], "training_plot": [64, 65], "data_load": [64, 65, 67, 73, 76, 82], "gaug": [64, 65], "train_tot": [64, 65], "test_tot": [64, 65], "_whats_the_point_of_ev": 64, "_and_train": 64, "_discuss": 64, "ish": 64, "sample_grid": [64, 65, 82], "x_all": [64, 65], "jj": [64, 65], "ij": [64, 65, 69, 70], "plot_decision_map": [64, 65], "decision_map": [64, 65], "33": [64, 65, 73, 76, 81, 82, 84, 85, 94, 97], "_does_it_generalize_well_discuss": 64, "leayer": 64, "biophys": 64, "circuit": 64, "excess": 64, "_biological_to_artificial_neurons_bonus_video": 64, "1907": 64, "\u00e9douard": 64, "lapicqu": 64, "electrophysiolog": 64, "theoret": [64, 65, 67], "dayan": 64, "laurenc": 64, "abbott": 64, "v_m": 64, "c_m": 64, "t_": 64, "v_": [64, 67, 100], "r_": 64, "membran": 64, "voltag": 64, "capacit": 64, "resit": 64, "rm": 64, "momentarili": 64, "mimick": 64, "refractori": 64, "eqnarrai": 64, "sp": 64, "exceed": 64, "synapt": 64, "euler": [64, 82], "delta": [64, 67, 81, 97], "superscript": [64, 70], "run_lif": 64, "tau_ref": 64, "vth": 64, "v_spike": 64, "msec": 64, "resist": 64, "kohm": 64, "uf": 64, "vm": 64, "t_rest": 64, "volatag": 64, "refactori": 64, "sim_tim": 64, "my_layout": [64, 65], "plot_if_curv": 64, "is_max": 64, "spike_count": 64, "_real_and_artificial_neuron_similarities_bonus_discuss": 64, "w1d3_t2": 65, "make_grid": [65, 76, 77, 80, 82], "reshapinng": 65, "naccuraci": 65, "indec": 65, "32x32": 65, "animalfac": [65, 69, 70], "animalfaces32x32": 65, "kgfvj": [65, 69, 70], "zfile": [65, 69, 70, 94, 100, 102], "chdir": 65, "_deep_expressivity_video": 65, "max_par_count": 65, "run_depth_optim": 65, "max_hidden_lay": 65, "hidden_lay": 65, "test_scor": 65, 
"count_paramet": 65, "par_count": 65, "hidden_unit": 65, "_wide_vs_deep_exercis": 65, "optimum": [65, 67, 81], "hurt": 65, "_wide_vs_deep_discuss": 65, "spiral": [65, 76], "polynomi": [65, 102], "upto": 65, "constatnt": 65, "run_poly_classif": 65, "poly_degre": 65, "make_poly_featur": 65, "poly_x": 65, "poly_x_test": 65, "poly_x_train": 65, "poly_test_data": 65, "poly_test_load": 65, "poly_train_data": 65, "poly_train_load": 65, "poly_net": 65, "poly_x_al": 65, "max_poly_degre": 65, "3200": 65, "1325": 65, "_does_a_wide_model_generalize_well_discuss": 65, "_case_study_video": 65, "randomrot": 65, "offici": 65, "get_data_load": [65, 73], "img_train_load": 65, "img_test_load": [65, 70], "augmentation_transform": [65, 73], "preprocessing_transform": [65, 73], "train_transform": [65, 69, 70, 73, 76, 77], "test_transform": [65, 70], "data_path": [65, 69, 70], "afhq": [65, 69, 70], "img_train_dataset": 65, "img_test_dataset": [65, 70], "tpu": 65, "nrow": [65, 76, 77, 80, 82], "fc1_weight": 65, "_dataloader_real_world_exercis": 65, "_why_first_layer_features_are_high_level_discuss": 65, "_ethics_hype_in_ai_video": 65, "chaotic": 65, "chao": [65, 67], "implicitli": [65, 80, 88, 101], "_need_for_good_initialization_bonus_video": 65, "o_i": 65, "o_": 65, "drawn": [65, 80], "2_": 65, "2_j": 65, "albeit": 65, "blow": [65, 76], "dilemma": [65, 97], "glorot": 65, "plug": [65, 102], "geq": 65, "z_": 65, "z_i": [65, 100], "confirm": [65, 87, 101], "ngain": 65, "xavier_normal_": 65, "xavier_uniform_": 65, "best_gain": 65, "theoretical_gain": 65, "valueerror": [65, 80, 85, 94, 97], "opt": [65, 67, 69, 70, 80, 81, 82, 85, 87, 88, 94], "hostedtoolcach": [65, 69, 70, 80, 81, 82, 85, 87, 94], "x64": [65, 69, 70, 80, 81, 82, 85, 87, 94], "fromnumer": 65, "1229": [65, 91], "1142": 65, "1143": 65, "1144": 65, "1227": 65, "1228": 65, "kwd": 65, "_novalu": 65, "_wrapfunc": 65, "obj": 65, "getattr": 65, "_wrapit": 65, "attributeerror": [65, 88], "jose": 67, "gallego": 67, "posada": 67, 
"piyush": [67, 69, 70], "chauhan": [67, 69, 70], "charl": [67, 80], "edelson": [67, 80], "krishnakumaran": 67, "w1d5_t1": 67, "rc": [67, 94], "unicode_minu": [67, 81, 94], "print_param": 67, "named_paramet": [67, 69, 70, 85], "incent": 67, "led": 67, "_unexpected_consequences_discuss": 67, "pedagog": 67, "spotlight": [67, 76], "emphasi": [67, 88], "vet": 67, "handwritten": [67, 73, 80], "strictli": 67, "wavi": 67, "vallei": [67, 76], "deepest": [67, 82], "_case_study_mlp_classification_video": 67, "y2fj6": [67, 80], "ndownload": [67, 73, 80], "load_mnist_data": 67, "change_tensor": 67, "greyscal": [67, 73], "train_set": [67, 80], "train_siz": 67, "784": [67, 73, 76], "train_target": 67, "test_set": 67, "test_target": 67, "std_dev": 67, "tform": 67, "concentr": 67, "subset_index": 67, "num_figur": 67, "sample_id": [67, 100, 102], "matshow": [67, 97], "ndenumer": 67, "steelblu": 67, "use_bia": 67, "multilayerperceptron": 67, "num_hidden_lay": 67, "transformed_x": 67, "hidden_output": 67, "constitut": 67, "encapsul": 67, "deep_learning_tutori": 67, "surrog": 67, "nll_loss": [67, 69, 70, 84], "_thi": 67, "cell_": 67, "opportun": [67, 76, 91], "cell_verbos": 67, "partial_trained_model": 67, "7e": 67, "goto": 67, "ineffici": [67, 101], "_optimization_of_an_objective_function_video": 67, "zero_": [67, 87], "random_upd": 67, "noise_scal": 67, "randn_lik": [67, 81, 82], "gradient_upd": 67, "model1": [67, 76], "0264": 67, "0173": 67, "0297": 67, "0278": 67, "0221": 67, "0086": 67, "0254": 67, "0233": 67, "0231": 67, "0342": 67, "0124": 67, "0157": 67, "0111": 67, "0144": 67, "0301": 67, "0181": 67, "0303": 67, "0208": 67, "0353": 67, "0183": 67, "0271": 67, "0099": 67, "0033": 67, "0022": 67, "0307": 67, "0243": 67, "0159": 67, "0064": 67, "0263": 67, "0174": 67, "0298": 67, "0047": 67, "0302": 67, "0093": 67, "0077": 67, "0248": 67, "0234": 67, "0237": 67, "0117": 67, "0187": 67, "0006": 67, "0156": 67, "0143": 67, "0164": 67, "0286": 67, "0238": 67, "0127": 67, "0191": 67, 
"0188": 67, "0206": 67, "0354": 67, "0184": 67, "0272": 67, "0098": 67, "0002": 67, "0292": 67, "0018": 67, "0054": 67, "0246": 67, "0198": 67, "0061": 67, "_implement_gradient_descent_exercis": 67, "induc": [67, 89], "_run": 67, "model_nam": [67, 85], "my_model": 67, "base_loss": 67, "dummy_model": 67, "loss1": 67, "gd_delta": 67, "trial_id": 67, "hist": [67, 94], "get_legend_handles_label": 67, "bbox_to_anchor": 67, "fancybox": 67, "shadow": 67, "_gradient_descent_vs_random_search_discuss": 67, "haunt": 67, "_momentum_video": 67, "bridg": [67, 76], "gap": 67, "flatter": [67, 70], "exhibit": [67, 101], "recomput": 67, "_how_momentum_works_discuss": 67, "w_t": 67, "quantiti": [67, 80, 81, 91], "v_t": 67, "underbrac": 67, "leftarrow": 67, "loss_2d": 67, "mask_idx": 67, "378": [67, 76], "bias_id": 67, "bias_idx": 67, "ones_lik": [67, 97], "masked_weight": 67, "mesh": 67, "_subplot": 67, "axessubplot": 67, "surf": 67, "antialias": 67, "set_zlabel": 67, "plot_param_dist": 67, "best_u": 67, "best_v": 67, "traj": [67, 81], "use_log": 67, "y_min_v": 67, "y_max_v": 67, "run_optim": 67, "eval_fn": 67, "update_fn": 67, "max_step": [67, 88, 97], "optim_kwarg": 67, "log_traj": 67, "callabl": 67, "customiz": 67, "auxiliari": 67, "aux_tensor": 67, "xs": [67, 73], "momentum_upd": 67, "grad_vel": 67, "model2": [67, 76], "initial_vel": 67, "5898": 67, "0116": 67, "0239": 67, "0871": 67, "4030": 67, "9577": 67, "4653": 67, "6022": 67, "7363": 67, "5485": 67, "2747": 67, "6539": 67, "4117": 67, "1045": 67, "6492": 67, "0201": 67, "6503": 67, "1310": 67, "5098": 67, "5075": 67, "0718": 67, "1192": 67, "2900": 67, "9657": 67, "4405": 67, "1174": 67, "0792": 67, "1857": 67, "3537": 67, "0824": 67, "4254": 67, "3760": 67, "7491": 67, "6025": 67, "4147": 67, "8720": 67, "6201": 67, "9632": 67, "9430": 67, "5180": 67, "3417": 67, "6574": 67, "3677": 67, "_implement_momentum_exercis": 67, "line2d": [67, 81], "run_newton": 67, "init_list": 67, "par_tensor": 67, "t_g": 67, "eval_loss": 67, 
"eval_grad": 67, "eval_hess": 67, "hessian": 67, "fromit": 67, "lstyle": 67, "interact_manu": [67, 73], "momentum_experi": 67, "9e": 67, "sgd_traj": 67, "mom_traj": 67, "plot3d": 67, "lime": 67, "_momentum_vs_gd_interactive_demo": 67, "_momentum_and_oscillations_discuss": 67, "couldn": [67, 85], "onward": 67, "remaind": 67, "_overparameterization_video": 67, "losslandscap": 67, "mental": 67, "undesir": 67, "optima": 67, "ampl": 67, "evolv": [67, 69, 70, 101], "overparam": 67, "5e": [67, 69, 70, 85], "num_init": 67, "hdim": 67, "base_model": 67, "2e": [67, 70], "loss_hist": 67, "num_param": 67, "_overparameterization_interactive_demo": 67, "downsid": 67, "_width_and_depth_of_the_network_discuss": 67, "quest": 67, "thousand": [67, 91], "_mini_batches_video": 67, "measure_update_tim": 67, "num_point": 67, "loss_tim": 67, "gradient_tim": 67, "computation_tim": 67, "times_list": 67, "_cost_of_computation_interactive_demo": 67, "sample_minibatch": 67, "iid": 67, "input_data": 67, "target_data": 67, "batch_input": 67, "batch_target": 67, "batch_indic": 67, "x_batch": 67, "y_batch": 67, "_implement_mini_batch_sampling_exercis": 67, "budget": 67, "minibatch_experi": 67, "time_budget": 67, "plot_data": 67, "precaut": 67, "afford": [67, 101], "steadili": 67, "_compare_different_minibatch_sizes_interactive_demo": 67, "awar": [67, 77, 88], "knob": 67, "prototyp": [67, 76], "_adaptive_methods_video": 67, "rmsprop_upd": 67, "grad_sq": 67, "quotient": 67, "gsq": 67, "model3": [67, 76], "0031": 67, "0193": 67, "0316": 67, "0063": 67, "0318": 67, "0109": 67, "0232": 67, "0218": 67, "0253": 67, "0102": 67, "0203": 67, "0027": 67, "0136": 67, "0089": 67, "0123": 67, "0324": 67, "0166": 67, "0281": 67, "0133": 67, "0197": 67, "0182": 67, "0186": 67, "0376": 67, "0293": 67, "0019": 67, "0313": 67, "0011": 67, "0122": 67, "0199": 67, "0329": 67, "0041": 67, "_implement_rmsprop_exercis": 67, "congrat": 67, "compare_optim": 67, "sgd_dict": 67, "mom_dict": 67, "rms_dict": 67, "fuchsia": 67, 
"all_dict": 67, "opt_dict": 67, "opt_nam": 67, "optim_dict": 67, "_compare_optimizers_interactive_demo": 67, "excel": [67, 73, 87, 89, 97], "_compare_optimizers_discuss": 67, "plain": [67, 81], "undisput": 67, "amsgrad": 67, "adagrad": 67, "burden": 67, "_loss_function_and_optimization_discuss": 67, "15min": [67, 73, 76, 80], "_ethical_concerns_video": 67, "utilis": 67, "unforeseen": 67, "beat": [67, 88, 100], "mission": 67, "tricki": [67, 70], "_putting_it_all_together_bonus_video": 67, "benchmark_model": 67, "sj4e8": 67, "benchmark_state_dict": 67, "eval_model": 67, "acc_log": 67, "batch_id": 67, "log_freq": 67, "val_freq": 67, "train_set_orig": 67, "test_set_orig": 67, "val_set_orig": 67, "val_idx": 67, "step_idx": 67, "running_acc": 67, "193": [67, 76], "274": [67, 76], "930": [67, 76], "068": 67, "979": [67, 76], "166": [67, 70, 76, 94], "289": [67, 76, 94], "789": [67, 76], "807": [67, 76], "935": [67, 76, 94], "266": [67, 76], "381": [67, 76], "203": [67, 76], "066": 67, "414": [67, 76], "361": [67, 76], "989": [67, 76], "_train_your_own_model_bonus_exercis": 67, "_metrics_bonus_discuss": 67, "drum": [67, 76], "nbenchmark": 67, "826": [67, 76], "810": [67, 76], "011": 67, "316": [67, 76], "ravi": [69, 70], "teja": [69, 70, 73], "konkimalla": [69, 70], "mohitrajhu": [69, 70], "lingan": [69, 70], "kumaraian": [69, 70], "kevin": [69, 70], "machado": [69, 70], "gamboa": [69, 70], "roberto": [69, 70, 76, 77], "guidotti": [69, 70, 76, 77], "w2d1_t1": 69, "afhq_random_32x32": [69, 70], "afhq_10_32x32": [69, 70], "9sj7p": [69, 70], "wvgkq": [69, 70], "plot_weight": [69, 70], "ws": [69, 70], "axhlin": [69, 70], "ls": [69, 70, 94], "early_stop_plot": 69, "train_acc_earlystop": 69, "val_acc_earlystop": 69, "best_epoch": [69, 70], "solid": [69, 70, 73, 76, 101], "reg_function1": [69, 70], "reg_function2": [69, 70], "regularis": [69, 70], "lambda1": [69, 70], "lambda2": [69, 70], "view_a": [69, 70], "val_acc_list": [69, 70], "train_acc_list": [69, 70], "param_norm_list": 
[69, 70], "trained_model": [69, 70], "param_norm": [69, 70], "calculate_frobenius_norm": [69, 70], "_introduction_to_regularization_video": 69, "_regularization_as_shrinkage_video": 69, "underperform": [69, 77], "underfit": 69, "_f": 69, "a_": 69, "6572162508964539": 69, "_forbenius_norm_exercis": 69, "6572": 69, "plot_weigt": 69, "forbeniu": 69, "3810": [69, 70], "_overparameterization_and_overfitting_video": 69, "overparametr": [69, 73], "leaky_relu": [69, 70], "normi": 69, "wsi": 69, "model_norm": [69, 70], "norm_per_lay": 69, "running_predict": [69, 70], "pl": [69, 88], "layer_nam": 69, "title1": 69, "title2": 69, "set_text": 69, "repeat_delai": [69, 70], "html_anim": [69, 70], "runtimeerror": [69, 70, 81, 85], "1285": [69, 70, 81], "embed_limit": [69, 70, 81], "1282": [69, 70, 81], "tmpdir": [69, 70, 81], "m4v": [69, 70, 81], "1283": [69, 70, 81], "1284": [69, 70, 81], "mpl": [69, 70, 81], "1286": [69, 70, 81], "codec": [69, 70, 81], "h264": [69, 70, 81], "1287": [69, 70, 81], "bitrat": [69, 70, 81], "1288": [69, 70, 81], "_interv": [69, 70, 81], "1289": [69, 70, 81], "148": [69, 70, 76, 80, 81], "moviewriterregistri": [69, 70, 81], "146": [69, 70, 76, 80, 81], "147": [69, 70, 76, 80, 81], "_regist": [69, 70, 81], "moviewrit": [69, 70, 81], "_interpreting_losses_discuss": 69, "frobeni": 69, "normf": 69, "wsf": 69, "honest": 69, "seldom": 69, "wild": [69, 70, 76], "len_train": 69, "len_val": 69, "len_test": 69, "14430": [69, 70], "img_dataset": [69, 70], "img_train_data": [69, 70], "img_val_data": [69, 70], "afhq_random": [69, 70], "random_img_train_data": [69, 70], "random_img_val_data": [69, 70], "rand_train_load": [69, 70], "rand_val_load": [69, 70], "afhq_10": [69, 70], "partially_random_train_data": [69, 70], "partially_random_val_data": [69, 70], "partial_rand_train_load": [69, 70], "partial_rand_val_load": [69, 70], "biganimalnet": [69, 70], "val_acc_pur": [69, 70], "train_acc_pur": [69, 70], "end_tim": 69, "6447606086731": 69, "p_x": 69, "p_y": 69, 
"visualize_data": [69, 70], "image_class": [69, 70], "val_acc_random": 69, "train_acc_random": 69, "_early_stopping_video": 69, "early_stopping_main": [69, 70], "best_model": [69, 70], "_early_stopping_exercis": 69, "harm": [69, 82], "_early_stopping_discuss": 69, "caveat": 69, "intial": [69, 100], "val_acc_shuffl": 69, "train_acc_shuffl": 69, "_early_stopping_generalization_bonus_discuss": 69, "shrinkag": 70, "peril": 70, "w2d1_t2": 70, "frobeniu": 70, "animalnet": 70, "kep": 70, "_l1_and_l2_regularization_video": 70, "teammat": 70, "reg_train_data": 70, "reg_val_data": 70, "14500": 70, "reg_train_load": 70, "reg_val_load": 70, "val_acc_unreg": 70, "train_acc_unreg": 70, "param_norm_unreg": 70, "lasso": 70, "ddagger": 70, "l_r": 70, "subsect": 70, "sgn": 70, "mbox": 70, "l1_reg": 70, "445133209228516": 70, "_l1_regularization_exercis": 70, "args1": [70, 100, 102], "test_batch_s": 70, "val_acc_l1reg": 70, "train_acc_l1reg": 70, "param_norm_l1reg": 70, "251": [70, 76, 80], "253": [70, 76, 80], "254": [70, 76, 80], "165": [70, 76, 94], "163": [70, 76, 94], "164": [70, 76, 94], "167": [70, 76, 94], "_tune_lambda1_exercis": 70, "quadrat": 70, "\u03b7": 70, "l2_reg": 70, "328375816345215": 70, "_l2_ridge_regularization_exercis": 70, "args2": [70, 102], "val_acc_l2reg": 70, "train_acc_l2reg": 70, "param_norm_l2reg": 70, "168": [70, 76, 94], "169": [70, 76], "170": [70, 76, 94], "_tune_lambda2_exercis": 70, "args3": 70, "val_acc_l1l2reg": 70, "train_acc_l1l2reg": 70, "param_norm_l1l2reg": 70, "lambda_2": 70, "lambda_1": 70, "174": [70, 76, 94], "173": [70, 76, 94], "175": [70, 76, 94], "176": [70, 76, 94], "_dropout_video": 70, "liter": 70, "subsequ": [70, 73, 85, 87], "netdropout": 70, "unsqueeze_": 70, "running_predictions_dp": 70, "train_loss_dp": 70, "test_loss_dp": 70, "model_norm_dp": 70, "_dropout_discuss": 70, "fare": 70, "animalnetdropout": 70, "248": [70, 76], "val_acc_dropout": 70, "train_acc_dropout": 70, "model_dp": 70, "val_acc_big": 70, "train_acc_big": 70, 
"model_big": 70, "dp": 70, "placement": 70, "_dropout_caveats_discuss": 70, "_data_augmentation_video": 70, "14280": 70, "new_transform": 70, "randomverticalflip": 70, "new_train_data": 70, "new_train_load": 70, "model_aug": 70, "val_acc_dataaug": 70, "train_acc_dataaug": 70, "param_norm_dataaug": 70, "model_pur": 70, "param_norm_pur": 70, "_data_augmentation_discussuion": 70, "_overparameterized_vs_small_nn_discussuion": 70, "_sgd_video": 70, "broader": 70, "bewar": 70, "overshoot": 70, "11700": 70, "2930": 70, "full_train_load": 70, "full_val_load": 70, "acc_dict": 70, "val_": 70, "train_": 70, "param_norm_": 70, "0e": 70, "_hyperparameter_tuning_video": 70, "consum": 70, "bayesian": 70, "evolutionari": 70, "_overview_of_regularization_techniques_discuss": 70, "_adversarial_attacks_bonus_video": 70, "inevit": 70, "defens": 70, "distil": 70, "prone": 70, "w2d2_bonuslectur": 72, "_kynghyun_cho_video": 72, "dawn": 73, "mcknight": 73, "gerum": 73, "cassidi": 73, "pirlot": 73, "rohan": 73, "saha": 73, "liam": 73, "peet": 73, "pare": 73, "najafi": 73, "lili": [73, 84, 85, 100, 102], "cheng": [73, 84, 85, 100, 102], "bettina": [73, 76, 77], "hein": [73, 76, 77], "nina": 73, "kudryashova": 73, "anmol": 73, "gupta": [73, 100, 102], "xiaoxiong": 73, "tran": 73, "minh": 73, "hmrishav": 73, "bandyopadhyai": 73, "rahul": 73, "shekhar": 73, "w2d2_t1": 73, "trang": [73, 80, 81, 82], "correlate2d": 73, "gzip": 73, "download_data": 73, "nextract": 73, "fz": 73, "ft": [73, 76, 84, 85], "foldernam": 73, "gunzip": 73, "f_in": 73, "f_out": 73, "copyfileobj": 73, "check_shape_funct": 73, "image_shap": 73, "kernel_shap": 73, "correct_shap": 73, "user_shap": 73, "output_shap": [73, 80], "check_conv_funct": 73, "conv_funct": 73, "solution_us": 73, "solution_scipi": 73, "result_right": 73, "check_pooling_net": 73, "x_img": 73, "emnist_train": 73, "x_img_idx": 73, "output_x": 73, "right_output": 73, "309552": 73, "6216984": 73, "2708383": 73, "6654134": 73, "2271233": 73, "873457": 73, 
"318945": 73, "46229": 73, "663746": 73, "8889914": 73, "31068993": 73, "354934": 73, "378724": 73, "882853": 73, "499334": 73, "8546696": 73, "29296": 73, "096506": 73, "7074604": 73, "984148": 73, "12916": 73, "10037": 73, "667609": 73, "2780352": 73, "436305": 73, "9764223": 73, "98801": 73, "1756": 73, "531992": 73, "664275": 73, "5453291": 73, "2691708": 73, "3217516": 73, "3798618": 73, "05612564": 73, "218788": 73, "360992": 73, "980816": 73, "354935": 73, "8126211": 73, "9199777": 73, "9382377": 73, "076582": 73, "035061": 73, "92164516": 73, "434638": 73, "7816348": 73, "83254766": 73, "right_shap": 73, "display_image_from_greyscale_arrai": 73, "_matrix": 73, "_img": 73, "220": [73, 76], "make_plot": 73, "actual_convolut": 73, "memor": 73, "l1": [73, 84], "_introduction_to_cnns_and_rnns_video": 73, "penal": 73, "dens": [73, 80, 82], "_regularization_and_effective_number_of_params_discuss": 73, "_representations_and_visual_processing_in_the_brain_video": 73, "aristotl": 73, "bc": 73, "certainli": [73, 88], "_what_makes_a_representation_good_discuss": 73, "_details_about_convolution_video": 73, "lipton": 73, "smola": 73, "underlin": 73, "run_demo": 73, "id_html": 73, "w2d2_convnetsanddlthink": 73, "interactive_demo": 73, "convent": [73, 80], "convolv": [73, 81], "conv_check": 73, "incorrect": [73, 85], "_convolution_of_a_simple_kernel_exercis": 73, "calculate_output_shap": 73, "output_height": 73, "output_width": 73, "kernel_height": 73, "kernel_width": 73, "correcli": 73, "_convolution_output_size_exercis": 73, "beneath": 73, "convolve2d": 73, "convolution2d": 73, "im_h": 73, "im_w": 73, "ker_h": 73, "ker_w": 73, "out_h": 73, "out_w": 73, "out_row": 73, "out_col": 73, "overlai": 73, "current_product": 73, "_coding_a_convolution_exercis": 73, "chicago_skyline_shrunk_v2": 73, "bmp": 73, "ipydisplai": 73, "skyline_image_fil": 73, "img_skyline_orig": 73, "img_skyline_mat": 73, "kernel_v": 73, "kernel_hor": 73, "img_processed_mat_v": 73, "img_processed_mat_hor": 
73, "img_processed_mat": 73, "img_process": 73, "plethora": 73, "whatev": 73, "dim1": 73, "dim2": 73, "3x3": 73, "convolutionbackward0": 73, "undefin": [73, 85], "0s": [73, 94, 100], "onto": [73, 76, 77, 80, 88], "_visualization_of_convolution_with_padding_and_stride_interactive_demo": 73, "abrupt": 73, "_edge_detection_discuss": 73, "stripe": 73, "1s": [73, 94, 100], "processed_imag": 73, "_kernel_structure_discuss": 73, "50min": 73, "binar": 73, "charact": [73, 85, 88], "itl": 73, "nist": 73, "gov": 73, "iaui": 73, "vip": 73, "cs_link": 73, "xwfaj": 73, "get_xvs0_dataset": 73, "emnist_test": 73, "1307": 73, "3081": 73, "train_idx": 73, "int64": [73, 87], "test_idx": 73, "o_img_idx": 73, "ax4": 73, "_visualization_of_convolution_with_multiple_filters_interactive_demo": 73, "thicker": 73, "net2": 73, "kernel_1": 73, "kernel_2": 73, "tthird": 73, "checkerboard": 73, "kernel_3": 73, "multiple_kernel": 73, "ax11": 73, "ax12": 73, "ax13": 73, "axesimag": 73, "0x7f79ab6816d0": 73, "_multiple_filters_discuss": 73, "o_img": 73, "output_o": 73, "ax14": 73, "ax21": 73, "ax22": 73, "ax23": 73, "ax24": 73, "ax31": 73, "ax32": 73, "ax33": 73, "ax34": 73, "rectifi": [73, 94], "net3": 73, "output_x_relu": 73, "output_o_relu": 73, "ax15": 73, "ax16": 73, "ax17": 73, "ax25": 73, "ax26": 73, "ax27": 73, "ax35": 73, "ax36": 73, "ax37": 73, "strengthen": 73, "funciton": 73, "cup": [73, 76], "invari": [73, 91, 100], "retain": 73, "translation": 73, "_pooling_video": 73, "systemat": 73, "neighborhood": 73, "depict": 73, "_the_effect_of_the_stride_interactive_demo": 73, "net4": 73, "_implement_maxpooling_exercis": 73, "output_x_pool": 73, "output_o_pool": 73, "intact": 73, "33min": 73, "_putting_it_all_together_video": 73, "times32": 73, "num_dens": 73, "num_conv": 73, "do_plot": 73, "image_s": [73, 77], "number_of_linear": 73, "number_of_conv2d": 73, "final_lay": 73, "sample_imag": 73, "linear_lay": 73, "linear_net": 73, "code_dens": 73, "model_dens": 73, "result_dens": 73, "conv_lay": 
73, "conv_net": 73, "code_conv": 73, "model_conv": 73, "shape_conv": 73, "result_conv": 73, "t_1": 73, "shape_linear": 73, "t_2": 73, "t_3": 73, "p1": 73, "p2": 73, "addbox": 73, "text1": 73, "text2": 73, "text3": 73, "clip_on": 73, "gcf": [73, 76], "set_tight_layout": 73, "my_stringiobyt": 73, "seek": [73, 76], "my_base64_jpgdata": 73, "mystr": 73, "caption": [73, 84], "range1": 73, "range2": 73, "slider_batch_s": 73, "slider_image_s": 73, "images": 73, "slider_number_of_linear": 73, "numdens": 73, "slider_number_of_conv2d": 73, "numconv": 73, "slider_kernel_s": 73, "kernels": 73, "input_pool": 73, "checkbox": [73, 85], "input_final_lay": 73, "output_code1": 73, "output_plot": 73, "plot_func": 73, "code1": 73, "code2": 73, "doctyp": 73, "5px": 73, "clearfix": 73, "2em": 73, "h2": [73, 82], "irrespect": 73, "_number_of_parameters_interactive_demo": 73, "_implement_your_own_cnn_video": 73, "9216": 73, "therebi": 73, "emnist_net": 73, "emnistnet": 73, "10d": 73, "_implement_your_own_cnn_exercis": 73, "nresult": 73, "ouselv": 73, "lean": [73, 74], "20min": [73, 84, 85, 94], "_writing_your_own_training_loop_bonus_video": 73, "shirt": [73, 76], "trouser": 73, "pullov": 73, "dress": [73, 88], "coat": [73, 76], "sandal": [73, 76], "sneaker": 73, "ankl": 73, "boot": [73, 76], "10min": [73, 77, 84, 85, 94], "2min": 73, "zalandoresearch": 73, "fashionmnist": 73, "dfhu5": 73, "reduce_class": 73, "get_fashion_mnist_dataset": 73, "validation_data": 73, "_the_training_loop_bonus_video": 73, "ourput": 73, "mnist_train": 73, "mnist_test": 73, "udpat": 73, "fmnist_net1": 73, "FOR": 73, "_code_the_training_loop_bonus_exercis": 73, "combat": [73, 76], "greatli": [73, 82], "_overfitting_bonus_discuss": 73, "30min": [73, 80, 94], "fmnist_net2": 73, "_adding_regularization_bonus_exercis": 73, "_adding_regularization_bonus_discuss": 73, "precalcul": 73, "3495898238046372": 73, "2901147632522786": 73, "2504794800931469": 73, "23571575765914105": 73, "21297093365896255": 73, 
"19087818914905508": 73, "186408187797729": 73, "19487689035211472": 73, "16774938120803934": 73, "1548648244958926": 73, "1390149021382503": 73, "10919439224922593": 73, "10054351237820501": 73, "09900783193594914": 73, "08370604479507088": 73, "07831853718318521": 73, "06859792241866285": 73, "06152600247383197": 73, "046342475851873885": 73, "055123823092992796": 73, "83475": 73, "8659166666666667": 73, "8874166666666666": 73, "8913333333333333": 73, "8998333333333334": 73, "9140833333333334": 73, "9178333333333333": 73, "9138333333333334": 73, "9251666666666667": 73, "92975": 73, "939": [73, 76, 94], "9525833333333333": 73, "9548333333333333": 73, "9585833333333333": 73, "9655833333333333": 73, "9661666666666666": 73, "9704166666666667": 73, "9743333333333334": 73, "9808333333333333": 73, "9775": 73, "334623601436615": 73, "2977438402175903": 73, "2655304968357086": 73, "25506321132183074": 73, "2588835284113884": 73, "2336345863342285": 73, "3029863876104355": 73, "240766831189394": 73, "2719801160693169": 73, "25231350839138034": 73, "2500132185220718": 73, "26699506521224975": 73, "2934862145781517": 73, "361227530837059": 73, "33196919202804565": 73, "36985905408859254": 73, "4042587959766388": 73, "3716402840614319": 73, "3707024946808815": 73, "4652537405490875": 73, "866875": 73, "851875": 73, "8775": 73, "889375": 73, "881875": 73, "900625": 73, "898125": 73, "885625": 73, "876875": 73, "899375": 73, "90625": 73, "89875": 73, "884375": 73, "874375": 73, "89375": 73, "903125": 73, "890625": 73, "35404509995528993": 73, "30616586227366266": 73, "2872369573946963": 73, "27564131199045383": 73, "25969504263806853": 73, "24728168408445855": 73, "23505379509260046": 73, "21552803914280647": 73, "209761732277718": 73, "19977611067526518": 73, "19632092922767427": 73, "18672360206379535": 73, "16564940239124476": 73, "1654047035671612": 73, "1684555298985636": 73, "1627526102349796": 73, "13878319327263755": 73, "12881529055773577": 73, "12628930977525862": 73, 
"11346105090837846": 73, "8324166666666667": 73, "8604166666666667": 73, "8680833333333333": 73, "8728333333333333": 73, "8829166666666667": 73, "88625": 73, "89425": 73, "90125": 73, "9015833333333333": 73, "90925": 73, "9114166666666667": 73, "917": [73, 76], "9268333333333333": 73, "92475": 73, "921": [73, 76], "9255833333333333": 73, "9385": 73, "9428333333333333": 73, "9424166666666667": 73, "9484166666666667": 73, "3533937376737595": 73, "29569859683513644": 73, "27531551957130435": 73, "2576177391409874": 73, "26947550356388095": 73, "25361743807792664": 73, "2527468180656433": 73, "24179009914398195": 73, "28664454460144045": 73, "23347773611545564": 73, "24672816634178163": 73, "27822364538908007": 73, "2380720081925392": 73, "24426509588956832": 73, "2443918392062187": 73, "24207917481660843": 73, "2519641682505608": 73, "3075403380393982": 73, "2798181238770485": 73, "26709021866321564": 73, "826875": 73, "870625": 73, "8875": 73, "883125": 73, "891875": 73, "888125": 73, "905": [73, 76], "905625": 73, "901875": 73, "39775496332886373": 73, "33771887778284704": 73, "321900939132939": 73, "3079229625774191": 73, "304149763301966": 73, "28249239723416086": 73, "2861261191044716": 73, "27356165798103554": 73, "2654648520686525": 73, "2697350280557541": 73, "25354846321204877": 73, "24612889034633942": 73, "23482802549892284": 73, "2389904112416379": 73, "23742155821875055": 73, "232423192127905": 73, "22337309338469455": 73, "2141852991932884": 73, "20677659985549907": 73, "19355326712607068": 73, "83625": 73, "8481666666666666": 73, "8530833333333333": 73, "8571666666666666": 73, "86775": 73, "8623333333333333": 73, "8711666666666666": 73, "8748333333333334": 73, "8685833333333334": 73, "8785": 73, "8804166666666666": 73, "8835833333333334": 73, "8840833333333333": 73, "88875": 73, "8919166666666667": 73, "8946666666666667": 73, "8960833333333333": 73, "9063333333333333": 73, "3430288594961166": 73, "4062050700187683": 73, "29745822548866274": 73, 
"27728439271450045": 73, "28092808067798614": 73, "2577864158153534": 73, "2651400637626648": 73, "25632822573184966": 73, "3082498562335968": 73, "2812121778726578": 73, "26345942318439486": 73, "2577408078312874": 73, "25757989794015884": 73, "26434457510709763": 73, "24917411386966706": 73, "27261342853307724": 73, "2445397639274597": 73, "26001051396131514": 73, "24147838801145555": 73, "2471102523803711": 73, "82875": 73, "795625": 73, "87375": 73, "865625": 73, "8825": 73, "87625": 73, "848125": 73, "87875": 73, "8675": 73, "8925": 73, "87125": 73, "895625": 73, "90375": 73, "4454924576777093": 73, "43416607585993217": 73, "42200265769311723": 73, "40520024616667566": 73, "41137005166804536": 73, "404100904280835": 73, "40118067664034823": 73, "40139733080534223": 73, "3797615355158106": 73, "3596332479030528": 73, "3600061919460905": 73, "3554147962242999": 73, "34480382890460337": 73, "3329520877054397": 73, "33164913056695716": 73, "31860941466181836": 73, "30702565340919696": 73, "30605297186907304": 73, "2953788426486736": 73, "2877389984403519": 73, "7788333333333334": 73, "7825": 73, "7854166666666667": 73, "7916666666666666": 73, "7885": 73, "7833333333333333": 73, "7923333333333333": 73, "79525": 73, "805": [73, 76], "81475": 73, "8161666666666667": 73, "8188333333333333": 73, "817": [73, 76], "8266666666666667": 73, "82225": 73, "8360833333333333": 73, "8456666666666667": 73, "8430833333333333": 73, "8491666666666666": 73, "8486666666666667": 73, "3507828885316849": 73, "3337512403726578": 73, "34320746660232543": 73, "3476085543632507": 73, "3326113569736481": 73, "33033264458179473": 73, "32014619171619413": 73, "3182142299413681": 73, "30076164126396177": 73, "3263852882385254": 73, "27597591280937195": 73, "29062016785144806": 73, "2765174686908722": 73, "269492534995079": 73, "2679423809051514": 73, "2691828978061676": 73, "2726386785507202": 73, "2541181230545044": 73, "2580208206176758": 73, "26315389811992645": 73, "839375": 73, "843125": 
"5383681068733048": 73, "5242803116787725": 73, "49926126579930785": 73, "48940120944018556": 73, "4789252862779062": 73, "46633604049746163": 73, "4596060775458686": 73, "4464966354847971": 73, "4418302221593064": 73, "43759817490254893": 73, "42892070028827645": 73, "4226101264516428": 73, "418694807601763": 73, "4110745745840103": 73, "58005": 73, "6824666666666667": 73, "7223333333333334": 73, "7464333333333333": 73, "7711333333333333": 73, "7891833333333333": 73, "8012333333333334": 73, "80635": 73, "8172666666666667": 73, "8271833333333334": 73, "8335833333333333": 73, "8371833333333333": 73, "8412166666666666": 73, "84265": 73, "8458833333333333": 73, "8471166666666666": 73, "8497666666666667": 73, "8522833333333333": 73, "5945872340202332": 73, "518519122838974": 73, "4681703653335571": 73, "42978407418727876": 73, "40349935555458066": 73, "37377681517601014": 73, "35234942865371705": 73, "3359788683652878": 73, "3217720929384232": 73, "3279728285074234": 73, "3114012089371681": 73, "3060767319202423": 73, "2949701727628708": 73, "2981588536500931": 73, "2855641575455666": 73, "28112928783893587": 73, "28212732630968096": 73, "27846804082393645": 73, "27372796374559405": 73, "27415593349933626": 73, "78525": 73, "820125": 73, "875125": 73, "876625": 73, "882": [73, 76], "887875": 73, "884625": 73, "892125": 73, "894125": 73, "902625": 73, "89975": 73, "90075": 73, "d2": 73, "description_width": [73, 80], "800px": 73, "aasdsd": 73, "_dropout_exploration_bonus_interactive_demo": 73, "transforms_custom": 73, "get_augmentation_transform": 73, "_how_much_augmentation_help_bonus_exercis": 73, "_data_augmentation_bonus_discuss": 73, "ashish": 74, "sahoo": 74, "practition": [74, 88, 91], "w2d2_t2": 74, "_intro_to_dl_thinking_video": 74, "_spiking_neuron_predictions_video": 74, "_spiking_neuron_predictions_setup_video": 74, "motorcycl": 74, "emerg": [74, 94], "millisecond": 74, "k_": [74, 76], "lambda_": 74, "\u03bb_": 74, "stamp": 74, "milisecond": 74, "all_data": 
74, "_designing_a_cost_function_to_predict_neural_activities_discuss": 74, "_spiking_neurons_wrapup_video": 74, "_nonpoisson_neurons_bonus_discuss": 74, "_ann_uncertainty_vignette_video": 74, "_ann_uncertainty_setup_video": 74, "atom": 74, "chemic": 74, "mu_i": [74, 77], "sigma_i": 74, "_ann_uncertainty_discuss": 74, "_ann_uncertainty_wrapup_video": 74, "rapid": 74, "nmr": 74, "imit": 74, "molecular": 74, "_negative_standard_deviations_bonus_discuss": 74, "_embedding_faces_vignette_video": 74, "_embedding_faces_setup_video": 74, "nearbi": 74, "_j": 74, "i_c": 74, "j_c": 74, "phrase": [74, 88], "_p": 74, "fed": [74, 94], "_n": 74, "dissimiliarti": 74, "dissimilar": [74, 89], "triplet": [74, 77], "subscript": 74, "anchor": [74, 77], "_embedding_faces_discuss": 74, "_embedding_faces_wrapup_video": 74, "dwell": [74, 76], "entiti": 74, "pull": 74, "probe": 74, "laura": [76, 77], "pede": [76, 77], "vogg": [76, 77], "marissa": [76, 77], "wei": [76, 77], "timo": [76, 77], "l\u00fcddeck": [76, 77], "cari": [76, 77], "murrai": [76, 77], "ben": [76, 77], "heil": [76, 77], "w2d3_t1": 76, "_modern_cnns_and_transfer_learning_video": 76, "image_length": 76, "image_channel": 76, "num_of_param": 76, "_l": 76, "k_l": 76, "n_l": 76, "characterist": 76, "fullyconnectednet": 76, "get_parameter_count": 76, "param_count": 76, "fccnet": 76, "fccn": 76, "12583168": 76, "7168": 76, "_calculate_number_of_params_exercis": 76, "calculate_paramet": 76, "filter_count": 76, "fcnn_node": 76, "filter_width": 76, "image_area": 76, "image_volum": 76, "fcnn_paramet": 76, "cnn_paramet": 76, "_check_your_results_interactive_demo": 76, "1980": 76, "predat": 76, "revolut": 76, "_history_of_convnets_video": 76, "_challenges_of_improving_cnns_discuss": 76, "18min": [76, 84], "_alexnet_and_vgg_video": 76, "paralel": 76, "input_imag": 76, "input_batch": 76, "s3": 76, "owt": 76, "4df8aa71": 76, "load_state_dict_from_url": 76, "9dzeu": 76, "w2d3_modernconvnet": 76, "urlretriev": 76, "input_tensor": 76, 
"_filter_similarity_discuss": 76, "alexnet_intermediate_output": 76, "browse_imag": 76, "view_imag": 76, "_what_does_alexnet_see_interactive_demo": 76, "_filter_purpose_discuss": 76, "_residual_networks_resnets_video": 76, "subtract": [76, 80], "preceed": 76, "imagenette2": 76, "mnve4": 76, "tgz": 76, "dict_map": 76, "tench": [76, 80], "tinca": 76, "carassiu": 76, "auratu": 76, "eater": 76, "eat": 76, "carcharodon": 76, "carcharia": 76, "tiger": [76, 84], "galeocerdo": 76, "cuvieri": 76, "hammerhead": 76, "crampfish": 76, "numbfish": 76, "torpedo": 76, "stingrai": 76, "cock": 76, "hen": 76, "ostrich": 76, "struthio": 76, "camelu": 76, "brambl": 76, "fringilla": 76, "montifringilla": 76, "goldfinch": 76, "cardu": 76, "hous": 76, "finch": 76, "linnet": 76, "carpodacu": 76, "mexicanu": 76, "junco": 76, "snowbird": 76, "indigo": 76, "bunt": 76, "passerina": 76, "cyanea": 76, "turdu": 76, "migratoriu": 76, "bulbul": 76, "jai": [76, 84], "magpi": [76, 80], "chickade": 76, "water": [76, 84], "ouzel": 76, "dipper": 76, "kite": 76, "eagl": 76, "haliaeetu": 76, "leucocephalu": 76, "vultur": 76, "owl": [76, 88], "strix": 76, "nebulosa": 76, "european": [76, 88], "salamand": 76, "salamandra": 76, "newt": 76, "trituru": 76, "vulgari": 76, "eft": 76, "ambystoma": 76, "maculatum": 76, "axolotl": 76, "mud": 76, "puppi": 76, "mexicanum": 76, "bullfrog": 76, "rana": 76, "catesbeiana": 76, "toad": 76, "rib": 76, "ascaphu": 76, "trui": 76, "loggerhead": 76, "turtl": 76, "caretta": 76, "leatherback": 76, "leatheri": 76, "dermoch": 76, "coriacea": 76, "terrapin": 76, "tortois": 76, "band": 76, "gecko": 76, "iguana": 76, "chameleon": 76, "anol": 76, "anoli": 76, "carolinensi": 76, "whiptail": 76, "lizard": 76, "agama": 76, "frill": 76, "chlamydosauru": 76, "kingi": 76, "allig": 76, "gila": 76, "monster": 76, "heloderma": 76, "suspectum": 76, "lacerta": 76, "african": 76, "chamaeleo": 76, "chamaeleon": 76, "komodo": 76, "dragon": 76, "giant": 76, "varanu": 76, "komodoensi": 76, 
"crocodil": 76, "nile": 76, "crocodylu": 76, "niloticu": 76, "mississipiensi": 76, "triceratop": 76, "thunder": 76, "carphophi": 76, "amoenu": 76, "ringneck": 76, "ring": 76, "hognos": 76, "puff": 76, "adder": 76, "sand": 76, "viper": 76, "grass": 76, "king": [76, 87], "kingsnak": 76, "garter": 76, "vine": 76, "hypsiglena": 76, "torquata": 76, "boa": 76, "constrictor": 76, "rock": [76, 87], "seba": 76, "indian": 76, "cobra": 76, "naja": 76, "mamba": 76, "sea": 76, "horn": [76, 88], "cerast": 76, "asp": 76, "cornutu": 76, "diamondback": 76, "rattlesnak": 76, "crotalu": 76, "adamanteu": 76, "sidewind": 76, "trilobit": 76, "harvestman": 76, "daddi": 76, "longleg": 76, "phalangium": 76, "opilio": 76, "scorpion": 76, "gold": 76, "garden": 76, "spider": 76, "argiop": 76, "aurantia": 76, "barn": 76, "araneu": 76, "cavaticu": 76, "aranea": 76, "diademata": 76, "widow": 76, "latrodectu": 76, "mactan": 76, "tarantula": 76, "wolf": [76, 88], "hunt": [76, 88], "centiped": 76, "grous": 76, "ptarmigan": 76, "ruf": 76, "partridg": 76, "bonasa": 76, "umbellu": 76, "prairi": 76, "chicken": 76, "fowl": 76, "peacock": 76, "quail": 76, "psittacu": 76, "erithacu": 76, "macaw": 76, "sulphur": 76, "crest": 76, "cockatoo": 76, "kakato": 76, "galerita": 76, "cacatua": 76, "lorikeet": 76, "coucal": 76, "bee": [76, 80], "hornbil": 76, "hummingbird": 76, "jacamar": 76, "toucan": 76, "drake": 76, "breast": 76, "mergans": 76, "mergu": 76, "serrat": 76, "goos": 76, "swan": 76, "cygnu": 76, "atratu": 76, "101": 76, "tusker": 76, "echidna": 76, "spini": 76, "anteat": 76, "platypu": 76, "duckbil": 76, "duck": 76, "bill": 76, "ornithorhynchu": 76, "anatinu": 76, "wallabi": 76, "brush": 76, "kangaroo": 76, "koala": 76, "bear": 76, "phascolarcto": 76, "cinereu": 76, "wombat": 76, "jellyfish": [76, 80], "anemon": 76, "coral": 76, "flatworm": 76, "platyhelminth": 76, "nematod": 76, "roundworm": 76, "conch": 76, "snail": 76, "114": [76, 84], "slug": 76, "nudibranch": 76, "116": 76, "chiton": 76, "shell": 
[76, 84, 94], "cradl": 76, "polyplacophor": 76, "117": 76, "chamber": 76, "nautilu": 76, "pearli": 76, "118": 76, "dung": 76, "crab": 76, "cancer": [76, 91], "magist": 76, "119": [76, 84], "irroratu": 76, "fiddler": 76, "alaska": 76, "alaskan": 76, "paralithod": 76, "camtschatica": 76, "lobster": 76, "northern": 76, "homaru": 76, "americanu": 76, "123": 76, "langoust": 76, "crawfish": 76, "crayfish": 76, "crawdad": 76, "crawdaddi": 76, "hermit": 76, "isopod": 76, "stork": 76, "ciconia": 76, "nigra": 76, "129": 76, "spoonbil": 76, "flamingo": [76, 84], "heron": 76, "egretta": 76, "caerulea": 76, "egret": 76, "albu": 76, "133": 76, "bittern": 76, "crane": 76, "135": 76, "limpkin": 76, "aramu": 76, "pictu": 76, "136": 76, "gallinul": 76, "porphyrio": 76, "coot": 76, "marsh": 76, "fulica": 76, "americana": 76, "bustard": 76, "ruddi": 76, "turnston": 76, "arenaria": 76, "interpr": 76, "sandpip": 76, "dunlin": 76, "erolia": 76, "alpina": 76, "redshank": 76, "tringa": 76, "totanu": 76, "dowitch": 76, "oystercatch": 76, "oyster": 76, "catcher": 76, "pelican": 76, "145": [76, 80], "penguin": 76, "aptenodyt": 76, "patagonica": 76, "albatross": 76, "mollymawk": 76, "whale": 76, "devilfish": 76, "eschrichtiu": 76, "gibbosu": 76, "robustu": 76, "killer": 76, "orca": 76, "grampu": 76, "orcinu": 76, "dugong": 76, "dugon": 76, "lion": 76, "chihuahua": 76, "152": 76, "japanes": 76, "spaniel": 76, "maltes": 76, "terrier": 76, "154": 76, "pekines": 76, "pekinges": 76, "peke": 76, "155": 76, "shih": 76, "tzu": 76, "blenheim": 76, "papillon": 76, "158": [76, 80], "159": [76, 80], "rhodesian": 76, "ridgeback": 76, "afghan": 76, "hound": 76, "basset": 76, "beagl": 76, "bloodhound": 76, "sleuthhound": 76, "bluetick": 76, "tan": 76, "coonhound": 76, "walker": 76, "redbon": 76, "borzoi": 76, "russian": 76, "wolfhound": 76, "irish": 76, "greyhound": 76, "whippet": 76, "ibizan": 76, "podenco": 76, "norwegian": 76, "elkhound": 76, "otterhound": 76, "otter": 76, "saluki": 76, "gazel": 76, 
"177": [76, 94], "scottish": 76, "deerhound": 76, "178": [76, 94], "weimaran": 76, "179": [76, 94], "staffordshir": 76, "bullterri": 76, "bull": [76, 87], "pit": 76, "bedlington": 76, "183": [76, 94], "kerri": 76, "185": [76, 94], "norfolk": 76, "186": [76, 94], "norwich": 76, "187": [76, 94], "yorkshir": 76, "188": 76, "wire": 76, "hair": 76, "fox": 76, "189": [76, 84], "lakeland": 76, "sealyham": 76, "191": 76, "airedal": 76, "cairn": 76, "australian": 76, "dandi": 76, "dinmont": 76, "195": 76, "boston": 76, "196": 76, "miniatur": 76, "schnauzer": 76, "198": 76, "scotch": 76, "scotti": 76, "tibetan": 76, "chrysanthemum": 76, "silki": 76, "sydnei": 76, "202": 76, "wheaten": 76, "west": [76, 100], "highland": 76, "204": 76, "lhasa": 76, "apso": 76, "205": 76, "206": 76, "curli": [76, 88], "207": 76, "golden": 76, "208": 76, "labrador": 76, "chesapeak": 76, "german": [76, 80], "pointer": [76, 85], "vizsla": 76, "hungarian": 76, "212": 76, "setter": 76, "213": 76, "214": 76, "gordon": 76, "215": [76, 94], "brittani": 76, "clumber": 76, "217": [76, 94], "springer": 76, "218": [76, 94], "welsh": 76, "cocker": 76, "sussex": 76, "222": 76, "kuvasz": 76, "223": 76, "schipperk": 76, "groenendael": 76, "malinoi": 76, "226": 76, "briard": 76, "227": 76, "kelpi": 76, "228": 76, "komondor": 76, "sheepdog": 76, "bobtail": 76, "230": 76, "shetland": 76, "sheep": 76, "231": 76, "colli": 76, "232": 76, "233": 76, "bouvier": 76, "flandr": 76, "234": [76, 80], "rottweil": 76, "shepherd": [76, 80], "polic": 76, "alsatian": 76, "236": [76, 80], "doberman": 76, "pinscher": 76, "238": [76, 80], "swiss": 76, "mountain": 76, "239": 76, "bernes": 76, "appenzel": 76, "entlebuch": 76, "242": 76, "boxer": 76, "243": [76, 80], "mastiff": 76, "244": 76, "245": 76, "bulldog": 76, "246": [76, 80], "dane": 76, "247": [76, 80], "saint": 76, "bernard": 76, "eskimo": 76, "huski": 76, "malamut": 76, "malemut": 76, "siberian": 76, "dalmatian": 76, "coach": 76, "carriag": 76, "affenpinsch": 76, 
"monkei": 76, "basenji": 76, "pug": 76, "leonberg": 76, "newfoundland": 76, "pyrene": 76, "258": 76, "samoi": 76, "samoyed": 76, "pomeranian": 76, "chow": 76, "261": 76, "keeshond": 76, "262": 76, "brabancon": 76, "griffon": 76, "263": 76, "pembrok": 76, "corgi": 76, "cardigan": 76, "265": 76, "poodl": 76, "mexican": 76, "hairless": 76, "269": 76, "timber": 76, "cani": 76, "lupu": 76, "arctic": 76, "tundrarum": 76, "271": 76, "mane": 76, "rufu": 76, "niger": 76, "272": [76, 88], "coyot": 76, "latran": 76, "273": 76, "dingo": 76, "warrig": 76, "warrag": 76, "dhole": 76, "cuon": 76, "alpinu": 76, "275": 76, "hyena": 76, "cape": 76, "lycaon": 76, "276": 76, "hyaena": 76, "277": 76, "vulp": 76, "278": 76, "kit": 76, "macroti": 76, "279": 76, "alopex": 76, "lagopu": 76, "280": 76, "urocyon": 76, "cinereoargenteu": 76, "tabbi": 76, "282": 76, "283": 76, "persian": 76, "284": 76, "siames": 76, "285": 76, "egyptian": 76, "286": 76, "cougar": 76, "puma": 76, "catamount": 76, "painter": 76, "panther": 76, "feli": 76, "concolor": 76, "287": 76, "lynx": 76, "leopard": 76, "panthera": 76, "pardu": 76, "snow": 76, "ounc": 76, "uncia": 76, "290": 76, "jaguar": 76, "onca": 76, "291": [76, 94], "leo": 76, "292": [76, 94], "tigri": 76, "293": [76, 94], "cheetah": 76, "chetah": 76, "acinonyx": 76, "jubatu": 76, "294": [76, 94], "brown": [76, 85, 87], "bruin": 76, "ursu": 76, "arcto": 76, "295": 76, "euarcto": 76, "296": 76, "maritimu": 76, "thalarcto": 76, "297": 76, "sloth": 76, "melursu": 76, "ursinu": 76, "298": 76, "mongoos": 76, "meerkat": 76, "mierkat": 76, "beetl": 76, "ladybug": 76, "ladybeetl": 76, "ladi": 76, "ladybird": 76, "carabid": 76, "303": 76, "longicorn": 76, "304": 76, "chrysomelid": 76, "305": 76, "306": 76, "rhinocero": 76, "307": 76, "weevil": 76, "308": 76, "310": 76, "emmet": 76, "pismir": 76, "311": 76, "grasshopp": 76, "cricket": [76, 87], "walkingstick": 76, "insect": [76, 87], "cockroach": 76, "roach": 76, "manti": 76, "mantid": 76, "cicada": 76, "cicala": 
76, "leafhopp": 76, "lacew": 76, "319": 76, "dragonfli": 76, "darn": 76, "needl": 76, "devil": 76, "sew": 76, "feeder": 76, "doctor": [76, 84], "mosquito": 76, "hawk": 76, "skeeter": 76, "damselfli": 76, "admir": 76, "ringlet": 76, "butterfli": 76, "323": 76, "monarch": 76, "milkwe": 76, "danau": 76, "plexippu": 76, "324": 76, "cabbag": 76, "sulfur": 76, "lycaenid": 76, "327": 76, "starfish": 76, "star": 76, "328": 76, "urchin": 76, "cucumb": 76, "holothurian": 76, "330": [76, 88], "rabbit": 76, "cottontail": 76, "331": 76, "hare": 76, "332": 76, "angora": 76, "333": 76, "hamster": 76, "334": 76, "porcupin": 76, "hedgehog": 76, "335": 76, "squirrel": 76, "eastern": 76, "sciuru": 76, "336": 76, "marmot": 76, "337": 76, "beaver": 76, "guinea": 76, "pig": 76, "cavia": 76, "cobaya": 76, "sorrel": 76, "340": 76, "zebra": 76, "341": 76, "hog": 76, "grunter": 76, "squealer": 76, "su": 76, "scrofa": 76, "342": 76, "boar": 76, "343": 76, "warthog": 76, "hippopotamu": 76, "hippo": 76, "river": [76, 84], "amphibiu": 76, "345": 76, "buffalo": 76, "asiat": 76, "bubalu": 76, "bubali": 76, "348": 76, "ram": [76, 77], "tup": 76, "349": 76, "bighorn": 76, "cimarron": 76, "rocki": 76, "ovi": 76, "canadensi": 76, "350": 76, "ibex": 76, "capra": 76, "351": 76, "hartebeest": 76, "impala": 76, "aepycero": 76, "melampu": 76, "353": 76, "354": 76, "arabian": 76, "camel": 76, "dromedari": 76, "dromedariu": 76, "355": 76, "llama": 76, "356": 76, "weasel": 76, "357": 76, "mink": 76, "polecat": 76, "fitch": 76, "foulmart": 76, "foumart": 76, "mustela": 76, "putoriu": 76, "359": 76, "ferret": 76, "nigrip": 76, "360": 76, "skunk": 76, "pussi": 76, "362": 76, "badger": 76, "363": 76, "armadillo": 76, "364": 76, "toed": 76, "bradypu": 76, "tridactylu": 76, "orangutan": 76, "orangutang": 76, "pongo": 76, "pygmaeu": 76, "gorilla": 76, "chimpanze": 76, "chimp": 76, "pan": 76, "troglodyt": 76, "gibbon": 76, "hylob": 76, "lar": 76, "369": 76, "siamang": 76, "syndactylu": 76, "symphalangu": 76, "370": 
76, "guenon": 76, "371": 76, "pata": 76, "hussar": 76, "erythrocebu": 76, "372": 76, "baboon": 76, "macaqu": 76, "374": 76, "langur": 76, "colobu": 76, "376": 76, "probosci": 76, "nasali": 76, "larvatu": 76, "377": 76, "marmoset": 76, "capuchin": 76, "ringtail": 76, "cebu": 76, "capucinu": 76, "379": 76, "howler": 76, "titi": 76, "atel": 76, "geoffroyi": 76, "382": 76, "saimiri": 76, "sciureu": 76, "383": 76, "madagascar": 76, "lemur": 76, "catta": 76, "indri": 76, "brevicaudatu": 76, "385": 76, "eleph": 76, "elepha": 76, "maximu": 76, "loxodonta": 76, "africana": 76, "lesser": 76, "ailuru": 76, "fulgen": 76, "388": 76, "coon": 76, "ailuropoda": 76, "melanoleuca": 76, "389": 76, "barracouta": 76, "snoek": 76, "390": 76, "eel": 76, "coho": 76, "salmon": 76, "jack": 76, "silver": 76, "oncorhynchu": 76, "kisutch": 76, "392": 76, "holocanthu": 76, "tricolor": 76, "394": 76, "sturgeon": 76, "395": 76, "gar": 76, "garfish": 76, "garpik": 76, "billfish": 76, "lepisosteu": 76, "osseu": 76, "396": 76, "lionfish": 76, "397": 76, "puffer": 76, "pufferfish": 76, "blowfish": 76, "globefish": 76, "398": 76, "abacu": 76, "399": 76, "abaya": 76, "academ": 76, "gown": 76, "robe": 76, "401": 76, "accordion": 76, "piano": 76, "402": 76, "acoust": [76, 80], "guitar": [76, 80], "aircraft": 76, "carrier": 76, "flattop": 76, "404": 76, "airlin": 76, "airship": 76, "dirig": 76, "altar": 76, "407": 76, "ambul": 76, "408": [76, 88], "amphibian": 76, "amphibi": 76, "409": 76, "clock": 76, "410": 76, "apiari": 76, "411": 76, "apron": 76, "412": 76, "ashcan": 76, "trash": 76, "garbag": [76, 87], "wastebin": 76, "ashbin": 76, "dustbin": 76, "barrel": 76, "413": 76, "assault": 76, "rifl": 76, "gun": 76, "backpack": 76, "knapsack": 76, "packsack": 76, "rucksack": 76, "haversack": 76, "415": 76, "bakeri": 76, "bakeshop": 76, "bakehous": 76, "416": 76, "417": 76, "balloon": 76, "418": 76, "ballpoint": 76, "pen": 76, "ballpen": 76, "biro": 76, "419": 76, "banjo": 76, "421": 76, "bannist": 76, 
"banist": 76, "balustrad": 76, "balust": 76, "handrail": 76, "422": 76, "barbel": 76, "barber": 76, "chair": [76, 101], "424": 76, "barbershop": 76, "425": [76, 82], "426": [76, 82], "baromet": 76, "cask": 76, "428": 76, "barrow": 76, "cart": 76, "lawn": 76, "wheelbarrow": 76, "429": 76, "basebal": 76, "430": 76, "basketbal": 76, "431": 76, "bassinet": 76, "bassoon": 76, "433": 76, "bath": 76, "cap": 76, "434": 76, "towel": 76, "435": 76, "bathtub": 76, "tub": 76, "436": [76, 88], "beach": [76, 84], "wagon": 76, "station": 76, "estat": 76, "waggon": 76, "437": 76, "beacon": 76, "lighthous": 76, "pharo": 76, "beaker": 76, "439": 76, "bearskin": 76, "busbi": 76, "shako": 76, "440": 76, "beer": 76, "bottl": 76, "441": 76, "442": 76, "cote": 76, "cot": 76, "bib": 76, "444": [76, 82], "bicycl": 76, "tandem": 76, "445": [76, 82], "bikini": 76, "446": [76, 82], "binder": 76, "447": [76, 82], "binocular": 76, "opera": 76, "448": 76, "birdhous": 76, "449": 76, "boathous": 76, "bobsl": 76, "bobsleigh": 76, "bob": 76, "451": 76, "bolo": 76, "tie": 76, "bola": 76, "452": [76, 80], "bonnet": 76, "poke": 76, "453": [76, 80], "bookcas": 76, "454": [76, 80], "bookshop": 76, "bookstor": 76, "bookstal": 76, "455": [76, 80], "bottlecap": 76, "bow": 76, "457": [76, 80, 84], "bowti": 76, "brass": 76, "tablet": 76, "plaqu": 76, "459": [76, 80], "brassier": 76, "bra": 76, "bandeau": 76, "breakwat": 76, "groin": 76, "groyn": 76, "mole": 76, "bulwark": 76, "seawal": 76, "jetti": 76, "461": 76, "breastplat": 76, "aegi": 76, "egi": 76, "broom": 76, "463": 76, "bucket": 76, "pail": 76, "464": 76, "buckl": 76, "bulletproof": 76, "vest": 76, "466": 76, "bullet": 76, "butcher": 76, "shop": 76, "meat": 76, "market": [76, 88], "468": 76, "cab": 76, "hack": 76, "taxi": 76, "taxicab": 76, "caldron": 76, "cauldron": 76, "470": 76, "candl": 76, "taper": 76, "wax": 76, "471": 76, "cannon": 76, "472": 76, "cano": 76, "473": 76, "tin": 76, "474": 76, "475": 76, "476": 76, "carousel": 76, "carrousel": 76, 
"merri": 76, "roundabout": 76, "whirligig": 76, "477": 76, "478": 76, "carton": 76, "479": 76, "cash": 76, "dispens": 76, "teller": 76, "atm": 76, "481": 76, "cassett": 76, "482": 76, "483": 76, "castl": 76, "484": 76, "catamaran": 76, "cd": 76, "486": 76, "cello": 76, "violoncello": 76, "487": 76, "telephon": 76, "cellphon": 76, "mobil": 76, "488": 76, "489": 76, "chainlink": 76, "fenc": 76, "490": 76, "armor": 76, "armour": 76, "chainsaw": 76, "492": 76, "chest": 76, "493": 76, "chiffoni": 76, "commod": 76, "494": 76, "chime": 76, "gong": 76, "495": 76, "cabinet": 76, "closet": 76, "496": 76, "christma": 76, "stock": 76, "498": 76, "cinema": 76, "theater": 76, "theatr": 76, "palac": 76, "499": 76, "cleaver": 76, "chopper": 76, "cliff": 76, "501": 76, "cloak": 76, "502": 76, "clog": 76, "geta": 76, "patten": 76, "sabot": 76, "503": 76, "cocktail": 76, "shaker": 76, "504": 76, "coffe": [76, 80], "mug": [76, 80], "505": 76, "coffeepot": 76, "506": 76, "coil": 76, "volut": 76, "whorl": 76, "helix": 76, "507": 76, "lock": 76, "keyboard": [76, 85], "keypad": 76, "confectioneri": 76, "confectionari": 76, "candi": 76, "510": 76, "containership": 76, "vessel": 76, "511": 76, "corkscrew": 76, "screw": 76, "513": 76, "cornet": 76, "trumpet": 76, "trump": [76, 88], "cowboi": 76, "515": 76, "gallon": 76, "517": 76, "518": 76, "helmet": 76, "519": 76, "crate": 76, "520": 76, "crib": 76, "521": 76, "crock": 76, "pot": 76, "croquet": 76, "crutch": 76, "524": 76, "cuirass": 76, "525": [76, 88], "dam": 76, "dike": 76, "dyke": 76, "526": 76, "desk": 76, "527": 76, "desktop": 76, "528": 76, "dial": 76, "529": 76, "diaper": 76, "nappi": 76, "napkin": 76, "530": 76, "532": 76, "dine": 76, "dishrag": 76, "dishcloth": 76, "534": 76, "dishwash": [76, 101], "dish": 76, "washer": 76, "535": 76, "disk": [76, 88, 100], "disc": [76, 100], "dock": 76, "dockag": 76, "facil": 76, "dogsl": 76, "sled": 76, "sleigh": 76, "538": [76, 94], "dome": 76, "539": [76, 94], "doormat": 76, "mat": 76, "540": 
[76, 94], "drill": 76, "offshor": 76, "rig": 76, "541": [76, 94], "membranophon": 76, "tympan": 76, "542": [76, 94], "drumstick": 76, "543": [76, 94], "dumbbel": 76, "544": [76, 94], "dutch": 76, "oven": 76, "fan": 76, "blower": 76, "546": 76, "locomot": 76, "548": 76, "entertain": 76, "549": 76, "envelop": 76, "550": 76, "espresso": 76, "maker": 76, "551": 76, "powder": 76, "552": 76, "feather": 76, "553": 76, "554": 76, "fireboat": 76, "555": 76, "fireguard": 76, "557": 76, "flagpol": 76, "flagstaff": 76, "558": 76, "flute": 76, "transvers": 76, "559": 76, "fold": [76, 84], "560": 76, "footbal": 76, "forklift": 76, "fountain": 76, "563": 76, "564": 76, "565": 76, "freight": 76, "566": 76, "567": 76, "fry": 76, "frypan": 76, "skillet": 76, "fur": [76, 91], "569": 76, "dustcart": 76, "570": 76, "gasmask": 76, "respir": 76, "ga": 76, "571": 76, "pump": 76, "gasolin": 76, "petrol": 76, "island": 76, "572": 76, "goblet": 76, "573": 76, "kart": 76, "574": 76, "golf": 76, "575": 76, "golfcart": 76, "576": 76, "gondola": 76, "tam": 76, "580": 76, "greenhous": 76, "nurseri": 76, "glasshous": 76, "581": 76, "grill": 76, "radiat": 76, "582": 76, "groceri": 76, "food": [76, 87], "guillotin": 76, "584": 76, "585": 76, "sprai": 76, "586": 76, "587": 76, "hammer": 76, "588": 76, "hamper": 76, "589": 76, "dryer": 76, "drier": 76, "590": 76, "held": [76, 94], "microcomput": 76, "591": 76, "handkerchief": 76, "hanki": 76, "hankei": 76, "592": 76, "593": 76, "harmonica": 76, "mouth": 76, "harp": 76, "594": 76, "595": 76, "harvest": 76, "reaper": 76, "596": 76, "hatchet": 76, "holster": 76, "598": 76, "599": 76, "honeycomb": 76, "hook": [76, 80, 94], "claw": 76, "601": 76, "hoopskirt": 76, "crinolin": 76, "603": 76, "604": 76, "hourglass": 76, "605": 76, "ipod": 76, "606": 76, "iron": 76, "lantern": 76, "denim": 76, "609": 76, "jeep": 76, "landrov": 76, "jersei": 76, "tee": 76, "611": 76, "jigsaw": 76, "puzzl": 76, "612": 76, "jinrikisha": 76, "ricksha": 76, "rickshaw": 76, 
"joystick": 76, "614": 76, "kimono": 76, "615": 76, "knee": 76, "knot": 76, "617": 76, "618": 76, "ladl": 76, "lampshad": 76, "lamp": 76, "shade": 76, "620": 76, "laptop": 76, "621": 76, "mower": 76, "622": 76, "623": 76, "knife": 76, "paperknif": 76, "624": 76, "lifeboat": 76, "ignit": 76, "ignitor": 76, "limousin": 76, "limo": 76, "628": 76, "liner": [76, 97], "ocean": 76, "629": 76, "lipstick": 76, "lip": 76, "roug": 76, "loafer": 76, "631": 76, "lotion": 76, "632": 76, "loudspeak": 76, "speaker": 76, "633": 76, "loup": 76, "jewel": 76, "634": 76, "lumbermil": 76, "sawmil": 76, "635": 76, "compass": 76, "636": 76, "mailbag": 76, "postbag": 76, "637": 76, "mailbox": 76, "638": 76, "maillot": 76, "639": 76, "tank": 76, "manhol": 76, "641": 76, "maraca": 76, "642": 76, "marimba": 76, "xylophon": 76, "643": 76, "644": 76, "matchstick": 76, "645": 76, "maypol": 76, "maze": 76, "labyrinth": 76, "647": 76, "648": 76, "medicin": 76, "megalith": 76, "microphon": 76, "mike": 76, "651": 76, "microwav": 76, "652": 76, "militari": 76, "653": 76, "milk": 76, "654": 76, "minibu": [76, 80], "655": 76, "miniskirt": 76, "minivan": 76, "657": 76, "missil": 76, "658": 76, "mitten": 76, "659": 76, "bowl": 76, "manufactur": 76, "662": 76, "modem": 76, "663": 76, "monasteri": 76, "664": 76, "665": 76, "mope": 76, "mortar": 76, "mortarboard": 76, "668": 76, "mosqu": 76, "669": 76, "670": 76, "scooter": 76, "671": 76, "terrain": 76, "roader": 76, "672": 76, "tent": 76, "673": 76, "674": 76, "mousetrap": 76, "675": 76, "676": 76, "muzzl": 76, "677": 76, "nail": 76, "678": 76, "brace": 76, "679": 76, "necklac": 76, "680": 76, "nippl": 76, "681": 76, "682": 76, "obelisk": 76, "683": 76, "obo": 76, "hautboi": 76, "684": 76, "ocarina": 76, "sweet": 76, "potato": 76, "685": 76, "odomet": 76, "hodomet": 76, "mileomet": 76, "milomet": 76, "oil": 76, "687": 76, "pipe": [76, 82], "oscilloscop": 76, "scope": [76, 85], "cathod": 76, "cro": 76, "689": 76, "overskirt": 76, "690": 76, "oxcart": 76, 
"691": 76, "oxygen": 76, "692": 76, "packet": 76, "693": 76, "paddl": 76, "boat": 76, "694": 76, "paddlewheel": 76, "695": 76, "padlock": 76, "696": 76, "paintbrush": 76, "697": 76, "pajama": 76, "pyjama": 76, "pj": 76, "jammi": 76, "698": 76, "panpip": 76, "pandean": 76, "syrinx": 76, "701": 76, "parachut": 76, "chute": 76, "702": 76, "703": 76, "park": [76, 84], "bench": 76, "704": 76, "meter": 76, "705": 76, "passeng": 76, "706": 76, "patio": 76, "terrac": 76, "707": 76, "708": 76, "pedest": 76, "plinth": 76, "footstal": 76, "709": 76, "pencil": 76, "710": 76, "sharpen": 76, "711": 76, "perfum": 76, "essenc": 76, "712": 76, "petri": 76, "photocopi": 76, "714": 76, "plectrum": 76, "plectron": 76, "715": 76, "pickelhaub": 76, "716": 76, "picket": 76, "pale": 76, "717": 76, "pickup": 76, "718": 76, "pier": 76, "piggi": 76, "penni": 76, "720": 76, "pill": 76, "722": 76, "ping": 76, "723": 76, "pinwheel": 76, "724": 76, "pirat": 76, "725": 76, "pitcher": 76, "ewer": 76, "726": 76, "woodwork": 76, "727": 76, "planetarium": 76, "728": 76, "plastic": 76, "729": 76, "plate": 76, "rack": 76, "730": 76, "plow": 76, "plough": 76, "731": 76, "plunger": 76, "plumber": 76, "732": 76, "polaroid": 76, "camera": [76, 77], "733": 76, "pole": 76, "734": 76, "paddi": 76, "patrol": 76, "maria": 76, "735": 76, "poncho": 76, "736": 76, "billiard": 76, "snooker": 76, "737": 76, "soda": 76, "738": 76, "flowerpot": 76, "739": 76, "potter": 76, "740": 76, "741": 76, "prayer": 76, "rug": 76, "742": 76, "printer": 76, "743": 76, "prison": 76, "744": 76, "projectil": 76, "745": 76, "projector": [76, 89], "746": 76, "puck": 76, "hockei": 76, "747": 76, "punch": 76, "punchbal": 76, "purs": 76, "749": 76, "quill": 76, "quilt": 76, "751": 76, "racer": 76, "race": [76, 77], "752": 76, "racket": 76, "racquet": 76, "753": 76, "754": [76, 88], "radio": 76, "wireless": 76, "755": 76, "telescop": 76, "reflector": 76, "756": 76, "rain": [76, 87], "758": 76, "reel": 76, "759": 76, "reflex": 76, "760": 
76, "refriger": 76, "icebox": 76, "761": 76, "762": 76, "restaur": 76, "eateri": 76, "763": 76, "revolv": 76, "shooter": 76, "764": 76, "765": 76, "rocker": 76, "766": 76, "rotisseri": 76, "767": 76, "rubber": 76, "eras": [76, 101], "rugbi": 76, "769": 76, "ruler": 76, "770": 76, "shoe": 76, "771": 76, "772": 76, "safeti": 76, "pin": 76, "773": 76, "saltshak": 76, "salt": 76, "774": 76, "775": 76, "sarong": 76, "776": 76, "saxophon": 76, "777": 76, "scabbard": 76, "778": 76, "weigh": 76, "779": 76, "bu": 76, "780": [76, 88], "schooner": 76, "scoreboard": 76, "782": 76, "crt": 76, "783": 76, "screwdriv": 76, "785": 76, "seat": 76, "belt": [76, 80], "seatbelt": 76, "786": 76, "787": 76, "shield": 76, "buckler": 76, "788": 76, "shoji": 76, "790": 76, "basket": 76, "791": 76, "792": 76, "shovel": 76, "793": 76, "shower": 76, "794": 76, "curtain": 76, "795": 76, "ski": 76, "796": 76, "797": 76, "798": 76, "slipstick": 76, "799": 76, "door": 76, "bandit": 76, "801": 76, "snorkel": 76, "802": 76, "snowmobil": 76, "803": 76, "snowplow": 76, "snowplough": 76, "soap": 76, "soccer": 76, "806": 76, "sock": 76, "solar": 76, "collector": 76, "furnac": 76, "808": 76, "sombrero": 76, "809": 76, "soup": 76, "heater": 76, "shuttl": 76, "813": 76, "spatula": 76, "speedboat": 76, "816": 76, "spindl": 76, "sport": [76, 87], "819": 76, "820": 76, "821": 76, "steel": 76, "822": 76, "823": 76, "stethoscop": 76, "824": 76, "stole": 76, "stone": 76, "wall": [76, 97], "stopwatch": 76, "827": 76, "stove": 76, "828": 76, "strainer": 76, "829": 76, "streetcar": 76, "tram": 76, "tramcar": 76, "trollei": 76, "830": 76, "stretcher": 76, "studio": 76, "couch": 76, "bed": [76, 85], "stupa": 76, "tope": 76, "submarin": 76, "pigboat": 76, "834": 76, "cloth": 76, "835": 76, "sundial": 76, "sunglass": 76, "838": 76, "sunscreen": 76, "sunblock": 76, "blocker": 76, "suspens": 76, "840": 76, "swab": 76, "swob": 76, "mop": 76, "841": 76, "sweatshirt": 76, "842": 76, "trunk": 76, "843": 76, "swing": 76, 
"845": 76, "syring": 76, "846": 76, "armi": 76, "tape": 76, "849": 76, "teapot": 76, "850": 76, "teddi": 76, "851": 76, "televis": 76, "852": 76, "tenni": 76, "853": 76, "thatch": 76, "roof": 76, "854": 76, "thimbl": 76, "856": 76, "thresher": 76, "thrasher": 76, "thresh": 76, "857": 76, "throne": 76, "858": 76, "tile": 76, "toaster": 76, "860": 76, "tobacco": 76, "tobacconist": 76, "861": 76, "toilet": 76, "863": 76, "totem": 76, "864": 76, "tow": 76, "wrecker": 76, "toyshop": 76, "tractor": 76, "867": 76, "trailer": [76, 84], "lorri": 76, "868": 76, "trai": 76, "869": 76, "trench": 76, "870": 76, "tricycl": 76, "trike": 76, "velociped": 76, "871": 76, "trimaran": 76, "872": 76, "tripod": 76, "873": 76, "triumphal": 76, "874": 76, "trolleybu": 76, "trackless": 76, "trombon": 76, "vat": 76, "877": 76, "turnstil": 76, "typewrit": 76, "879": 76, "umbrella": 76, "880": 76, "unicycl": 76, "monocycl": 76, "881": 76, "upright": 76, "vacuum": 76, "cleaner": 76, "883": 76, "vase": 76, "884": 76, "vault": 76, "velvet": 76, "886": 76, "vend": 76, "887": 76, "vestment": 76, "viaduct": 76, "889": 76, "violin": 76, "fiddl": 76, "890": 76, "volleybal": 76, "891": 76, "waffl": 76, "892": 76, "893": 76, "wallet": 76, "billfold": 76, "notecas": 76, "pocketbook": 76, "wardrob": 76, "warplan": 76, "washbasin": 76, "handbasin": 76, "washbowl": 76, "lavabo": 76, "wash": 76, "basin": 76, "898": 76, "899": 76, "jug": 76, "tower": 76, "901": 76, "whiskei": 76, "whistl": 76, "903": 76, "wig": 76, "904": 76, "windsor": 76, "wine": 76, "wing": 76, "wok": 76, "910": 76, "spoon": 76, "wool": 76, "woolen": 76, "woollen": 76, "912": 76, "rail": 76, "virginia": 76, "913": 76, "wreck": 76, "914": 76, "yawl": 76, "yurt": 76, "comic": 76, "918": 76, "crossword": 76, "919": 76, "street": 76, "920": 76, "stoplight": 76, "jacket": 76, "dust": 76, "guacamol": 76, "consomm": 76, "hotpot": 76, "trifl": 76, "cream": 76, "icecream": 76, "lolli": 76, "lollipop": 76, "popsicl": 76, "loaf": 76, "931": 76, 
"bagel": 76, "beigel": 76, "pretzel": 76, "933": 76, "cheeseburg": 76, "934": 76, "hotdog": 76, "mash": 76, "936": [76, 94], "937": [76, 94], "broccoli": 76, "cauliflow": 76, "zucchini": 76, "courgett": 76, "940": [76, 94], "spaghetti": 76, "squash": 76, "941": [76, 94], "acorn": 76, "942": [76, 94], "butternut": 76, "943": 76, "cuke": 76, "artichok": 76, "globe": 76, "945": [76, 94], "pepper": 76, "946": 76, "cardoon": 76, "947": 76, "mushroom": 76, "948": 76, "granni": 76, "smith": 76, "949": 76, "strawberri": 76, "950": 76, "lemon": 76, "952": 76, "953": 76, "pineappl": 76, "anana": 76, "954": 76, "banana": 76, "955": 76, "jackfruit": 76, "jak": 76, "956": 76, "custard": 76, "appl": 76, "957": 76, "pomegran": 76, "958": 76, "hai": 76, "959": 76, "carbonara": 76, "960": 76, "chocol": 76, "sauc": 76, "syrup": 76, "961": 76, "dough": 76, "962": 76, "meatloaf": 76, "963": 76, "pizza": 76, "pie": [76, 87], "964": 76, "potpi": 76, "965": 76, "burrito": 76, "966": 76, "967": 76, "968": 76, "969": 76, "eggnog": 76, "alp": 76, "bubbl": 76, "972": 76, "reef": 76, "974": 76, "geyser": 76, "lakesid": 76, "lakeshor": 76, "976": 76, "promontori": 76, "headland": 76, "foreland": 76, "977": 76, "sandbar": 76, "978": 76, "seashor": 76, "coast": 76, "seacoast": 76, "vale": 76, "980": 76, "volcano": 76, "981": 76, "ballplay": 76, "982": 76, "groom": 76, "bridegroom": 76, "983": 76, "scuba": 76, "diver": 76, "rapese": 76, "985": 76, "986": 76, "slipper": 76, "cypripedium": 76, "calceolu": 76, "parviflorum": 76, "987": 76, "corn": 76, "988": 76, "rosehip": 76, "990": 76, "buckey": 76, "chestnut": 76, "conker": 76, "991": 76, "fungu": 76, "992": 76, "agar": 76, "gyromitra": 76, "994": 76, "stinkhorn": 76, "carrion": 76, "earthstar": 76, "polyporu": 76, "frondosu": 76, "grifola": 76, "frondosa": 76, "bolet": 76, "capitulum": 76, "tissu": 76, "bathroom": 76, "dir_to_imagenet_index": 76, "n03888257": 76, "n03425413": 76, "n03394916": 76, "n03000684": 76, "n02102040": 76, "n03445777": 
76, "n03417042": 76, "n03028079": 76, "n02979186": 76, "n01440764": 76, "dir_index_to_imagenet_label": 76, "ordered_dir": 76, "dir_index": 76, "dir_nam": 76, "val_transform": 76, "imagenette_v": 76, "imagenette_train": 76, "random_indic": 76, "imagenette_train_subset": 76, "imagenette_train_load": 76, "imagenette_val_load": 76, "dataset_length": 76, "loss_sum": 76, "total_1_correct": 76, "total_5_correct": 76, "bearpaw": 76, "cc9106d598ff1fe375cc030873ceacfea0499d77": 76, "topk": [76, 100, 102], "top_k_correct": 76, "top_1_correct": 76, "top_1_acc": 76, "top_5_acc": 76, "imagenette_train_loop": 76, "untrain": [76, 94], "imagenette_batch": 76, "top_1_accuraci": 76, "top_5_accuraci": 76, "resnet18_weight": 76, "resnet_opt": 76, "predict_top5": 76, "top5_prob": 76, "top5_nam": 76, "top5_idc": 76, "_use_the_resnet_model_exercis": 76, "moveaxi": [76, 80], "bonsai": 76, "svg": 76, "pok\u00e9mon_pikachu_art": 76, "data2": 76, "27min": 76, "_improving_efficiency_inception_and_resnext_video": 76, "xie": 76, "calculate_parameters_resnet": 76, "d_in": 76, "resnet_channel": 76, "d_out": 76, "resnet_paramet": 76, "calculate_parameters_resnext": 76, "resnext_channel": 76, "num_path": 76, "pathwai": 76, "resnext_paramet": 76, "descriptions_resnet": 76, "descriptions_resnext": 76, "cardin": 76, "lbox_resnet": 76, "lbox_resnext": 76, "rbox_resnet": 76, "rbox_resnext": 76, "ui_resnet": 76, "ui_resnet_label": 76, "1px": 76, "ui_resnext": 76, "ui_resnext_label": 76, "out_resnet": 76, "out_resnext": 76, "_resnet_vs_resnext_interactive_demo": 76, "_resnet_vs_resnext_discuss": 76, "biggest": 76, "23min": 76, "_improving_efficiency_mobilenet_video": 76, "convolution_math": 76, "filter_s": [76, 80], "conv_paramet": 76, "depthwise_conv_paramet": 76, "_calculation_of_parameters_exercis": 76, "_parameter_savings_discuss": 76, "24min": 76, "_transfer_learning_video": 76, "twice": [76, 80, 87], "pokemon": 76, "cis_522_data": [76, 77], "u4njm": 76, "small_pokemon_dataset": 76, "charmeleon": 76, 
"venusaur": 76, "ivysaur": 76, "squirtl": 76, "charizard": 76, "blastois": 76, "bulbasaur": 76, "wartortl": 76, "charmand": 76, "pokemon_dataset": 76, "image_count": [76, 77], "pokemon_test_set": 76, "pokemon_train_set": 76, "pokemon_train_load": 76, "pokemon_test_load": 76, "pretrained_acc": 76, "total_correct": 76, "num_correct": 76, "linreadout_acc": 76, "scratch_acc": 76, "_pretrained_resnet_vs_resnet_exercis": 76, "_training_only_the_classification_exercis": 76, "facial": 76, "_summary_and_outlook_video": 76, "21min": [76, 84], "_speedaccuracy_tradeoff_different_backbones_bonus_video": 76, "era": 76, "tradeoff": [76, 97], "t_start": 76, "top_1_acciraci": 76, "aux_logit": 76, "googlenet": 76, "model_tim": 76, "plot_acc_spe": 76, "ti": [76, 100], "create_model": 76, "weight_list": 76, "alexnet_weight": 76, "vgg19_weight": 76, "_accuracy_vs_training_speed_exercis": 76, "_finding_best_model_exercis": 76, "_speed_and_accuracy_correlation_exercis": 76, "facenet": 77, "w2d3_t2_bonu": 77, "facenet_pytorch": 77, "mtcnn": 77, "inceptionresnetv1": 77, "12min": 77, "2kyfb": 77, "_face_recognition_using_cnns_video": 77, "retrain": [77, 80], "bruce": 77, "lee": 77, "neil": 77, "harri": 77, "pam": 77, "grier": 77, "face_dataset": 77, "face_load": 77, "process_imag": 77, "model_tensor": 77, "display_tensor": 77, "img_crop": 77, "bruce_tensor": 77, "bruce_displai": 77, "neil_tensor": 77, "neil_displai": 77, "pam_tensor": 77, "pam_displai": 77, "tensor_to_displai": 77, "vggface2": 77, "9131": 77, "bruce_embed": 77, "neil_embed": 77, "pam_embed": 77, "_embedding_vectors_discuss": 77, "princip": [77, 80], "embedding_tensor": 77, "n_compon": [77, 81, 87], "pca_tensor": 77, "categ": 77, "pc": [77, 80], "unlock": 77, "19min": 77, "casia": 77, "webfac": 77, "caucasian": 77, "crimin": 77, "justic": 77, "utkfac": 77, "women": 77, "imbalanc": 77, "_ethical_aspects_video": [77, 84], "richardvogg": 77, "face_sampl": 77, "36wyh": 77, "face_sample2": 77, "black_female_tensor": 77, 
"black_female_displai": 77, "_1_1_": 77, "white_female_tensor": 77, "white_female_displai": 77, "_1_0_": 77, "black_female_embed": 77, "white_female_embed": 77, "cdist": 77, "calculate_pairwise_dist": 77, "embedding_dimens": [77, 89], "femal": [77, 84, 88], "_face_similarity_discuss": 77, "_embeddings_discuss": 77, "lastli": 77, "men": 77, "fairfac": 77, "male": [77, 84, 88], "k\u00e4rkk\u00e4inen": 77, "joo": 77, "centroid": [77, 87], "complement": 77, "embedding_s": 77, "sum_sq": 77, "1x1": 77, "w2d4_bonuslectur": 79, "_geoffrey_hinton_video": 79, "upenn": 80, "instructor": 80, "libsixel": 80, "w2d4_t1": 80, "pylab": 80, "pytorch_pretrained_biggan": 80, "one_hot_from_nam": 80, "image_mo": 80, "image_batch": 80, "n_batch": 80, "covari": [80, 91], "m1": 80, "cov": [80, 81, 91], "m2": 80, "num_interp": 80, "kl_q_p": 80, "zs": 80, "kl": 80, "mu_p": 80, "sigma_p": 80, "log_q": 80, "log_p": 80, "mu_q": 80, "log_sig_q": 80, "log_p_x": 80, "mu_x": 80, "sig_x": 80, "squared_error": 80, "pca_encoder_decod": 80, "svd_lowrank": 80, "w_encod": 80, "w_decod": 80, "pca_encod": 80, "pca_decod": 80, "cout": 80, "unnecessarili": 80, "dilat": 80, "in_depth": 80, "in_height": 80, "in_width": 80, "out_depth": 80, "out_height": 80, "out_width": 80, "plot_gen_samples_ppca": 80, "therm1": 80, "therm2": 80, "therm_data_sim": 80, "thermomet": 80, "them2": 80, "plot_linear_a": 80, "lin_loss": 80, "plot_conv_a": 80, "conv_loss": 80, "ae": 80, "plot_imag": 80, "plt_titl": 80, "plot_torch_imag": 80, "plot_phi": 80, "entropu": 80, "inter": 80, "28318": 80, "rsampl": 80, "lie": [80, 88], "im_plt": 80, "nltk_data": [80, 84, 85], "omw": 80, "ekjxi": 80, "kuwep": 80, "corpora": [80, 85, 87], "_generative_modeling_video": 80, "biggan_model": 80, "from_pretrain": [80, 82, 84, 85, 87, 88], "3yvhw": 80, "biggan_deep_256": 80, "sneak": 80, "peek": 80, "truncat": [80, 81, 84, 85, 88], "ins": [80, 84], "z_magnitud": 80, "truncnorm": 80, "truncated_noise_sampl": 80, "dim_z": 80, "randomst": 80, 
"sample_from_biggan": 80, "instabl": 80, "clone": [80, 94, 100, 102], "z_slider": 80, "440px": 80, "category_dropdown": 80, "realist": 80, "_generated_images_discuss": 80, "interpolate_biggan": 80, "category_a": 80, "category_b": 80, "z_magnitude_a": 80, "z_magnitude_b": 80, "interpolate_and_shap": 80, "interp": 80, "unit_vector": 80, "z_a": 80, "z_b": 80, "z_interp": 80, "y_interp": 80, "output_grid": 80, "z_a_slid": 80, "z_b_slider": 80, "magntud": 80, "category_a_dropdown": 80, "category_b_dropdown": 80, "_biggan_interpolation_interactive_demo": 80, "_samples_from_the_same_category_discuss": 80, "_latent_variable_models_video": 80, "generate_data": 80, "mean_of_temp": 80, "cov_of_temp": 80, "temparatur": 80, "kx1": 80, "kxk": 80, "psudo": 80, "multivariate_norm": [80, 81], "sqrt2": 80, "pc_ax": 80, "therm_data": 80, "therm_data_mean": 80, "therm_data_cent": 80, "outer": 80, "therm_data_zero_cent": 80, "pc_project": 80, "pc_axes_vari": 80, "sensor_noise_std": 80, "sensor_noise_var": 80, "gen_from_ppca": 80, "noise_var": 80, "data_mean": 80, "pc_varianc": 80, "epsilon_cov": 80, "sim_mean": 80, "rand_ep": 80, "_coding_ppca_exercis": 80, "_autoencoders_video": 80, "jbpme": 80, "mnist_val": 80, "cifar10_v": 80, "dataset_nam": 80, "get_data": 80, "my_dataset": 80, "my_dataset_nam": 80, "my_dataset_shap": 80, "my_dataset_s": 80, "my_valset": 80, "data_shap": 80, "data_s": 80, "valid_set": 80, "longrightarrow": 80, "2_2": 80, "plenti": 80, "linearautoencod": 80, "x_dim": 80, "my_dataset_dim": 80, "h_dim": 80, "train_autoencod": 80, "mse_loss": 80, "pin_memori": 80, "im_batch": 80, "enc_lin": 80, "dec_lin": 80, "x_prime": 80, "flat_x": 80, "lin_a": 80, "_linear_autoencoder_exercis": 80, "n_plot": 80, "h_pca": 80, "recon_pca": 80, "nimag": 80, "1000x450": 80, "_pca_vs_linearautoencod": 80, "Such": 80, "biaslay": 80, "grain": 80, "requisit": 80, "init_bia": 80, "tour": [80, 87], "deconvolut": [80, 94], "ubiquit": 80, "schemat": 80, "dummy_imag": 80, "dummy_conv": 80, 
"dummy_deconv": 80, "n_filter": 80, "enc_bia": 80, "enc_conv_1": 80, "conv_1_shap": 80, "enc_conv_2": 80, "conv_2_shap": 80, "enc_flatten": 80, "flat_after_conv": 80, "undo": 80, "ing": 80, "unflatten": 80, "dec_unflatten": 80, "unflattened_s": 80, "dec_deconv_1": 80, "dec_deconv_2": 80, "dec_bia": 80, "trained_conv_a": 80, "lin_recon": 80, "nonlin_recon": 80, "nonlin": 80, "_nonlinear_autoencoder_exercis": 80, "_variational_autoencoder_video": 80, "ambiti": 80, "k_vae": 80, "convva": 80, "num_filt": 80, "filter_reduct": 80, "shape_after_conv": 80, "flat_size_after_conv": 80, "q_bia": 80, "q_conv_1": 80, "q_conv_2": 80, "q_flatten": 80, "q_fc_phi": 80, "p_fc_upsampl": 80, "p_unflatten": 80, "p_deconv_1": 80, "p_deconv_2": 80, "p_bia": 80, "log_sig_x": 80, "flat_": 80, "mu_z": 80, "elbo": [80, 81], "expected_z": 80, "kplus1": 80, "train_va": 80, "elbo_v": 80, "trained_conv_vara": 80, "sigma_x": 80, "keyboardinterrupt": [80, 94], "1511": [80, 94], "_wrapped_call_impl": [80, 94], "1509": [80, 94], "_compiled_call_impl": [80, 94], "misc": [80, 94], "1510": [80, 94], "_call_impl": [80, 94], "1520": [80, 94], "1516": [80, 94], "1517": [80, 94], "_backward_hook": [80, 94], "_backward_pre_hook": [80, 94], "_forward_hook": [80, 94], "_forward_pre_hook": [80, 94], "1518": [80, 94], "_global_backward_pre_hook": [80, 94], "_global_backward_hook": [80, 94], "1519": [80, 94], "_global_forward_hook": [80, 94], "_global_forward_pre_hook": [80, 94], "forward_cal": [80, 94], "1522": [80, 94], "_conv_forward": 80, "padding_mod": 80, "_reversed_padding_repeated_twic": 80, "_pair": 80, "overset": 80, "q_": 80, "w_e": 80, "parametar": 80, "p_": [80, 82], "partli": 80, "prime": 80, "generate_imag": 80, "n_imag": 80, "_generating_images_exercis": 80, "_autoencoders_vs_variational_autoencoders_discuss": 80, "_sota_vaes_and_wrapup_video": 80, "binxu": [81, 82], "dongrui": [81, 82], "deng": [81, 82], "dora": [81, 82, 88, 97], "zhiyu": [81, 82, 88, 97], "adrita": [81, 82, 88, 97], "w2d4_t2": 
81, "mline": 81, "plotting_z": 81, "kdeplot": 81, "pnt": 81, "titlestr": 81, "figh": 81, "hacki": 81, "stackoverflow": [81, 89], "73739704": 81, "14392829": 81, "_get_lin": 81, "prop_cycl": 81, "quiver_plot": 81, "vec": 81, "gmm_pdf_contour_plot": 81, "gmm": 81, "logprob": [81, 84], "dstack": 81, "visualize_diffusion_distr": 81, "x_traj_rev": 81, "leftt": 81, "rightt": 81, "explabel": 81, "x_t": [81, 82], "interchang": 81, "markov": 81, "sde": [81, 82], "synopsi": 81, "_intro_and_principles_video": 81, "_math_behind_diffusion_video": 81, "vpsde": 81, "wiener": 81, "z_t": 81, "_0": 81, "_t": 81, "sigma_t": [81, 82], "p_t": 81, "p_0": [81, 82], "int_": [81, 82], "undergo": 81, "diffusion_1d_forward": 81, "samplen": 81, "bimod": 81, "cumsum": [81, 87], "scatter1": 81, "scatter2": 81, "set_offset": 81, "to_jshtml": 81, "_visualizing_diffusion_interactive_demo": 81, "gaussianmixtur": 81, "signifi": [81, 100, 102], "prec": 81, "norm_weight": 81, "add_compon": 81, "pdf_decompos": 81, "component_pdf": 81, "nabla_x": 81, "weighted_compon_pdf": 81, "gradvec": 81, "score_decompos": 81, "gradvec_list": 81, "rand_compon": 81, "all_sampl": 81, "gmm_samp": 81, "mu1": 81, "cov1": 81, "mu2": 81, "cov2": 81, "show_sampl": 81, "gmm_sampl": 81, "gmm_samps_few": 81, "scorevecs_few": 81, "gauss": 81, "mode1": 81, "mode2": 81, "silenc": [81, 82], "_what_does_score_tell_us_discuss": 81, "equip": 81, "nabla_": 81, "recoveri": 81, "sigma_t_squar": 81, "diffuse_gmm": 81, "teleport": 81, "sigma_t_2": 81, "noise_cov": 81, "covs_dif": 81, "reverse_diffusion_sde_sampling_gmm": 81, "sampn": 81, "nstep": 81, "gausian": 81, "sigmat2": 81, "xt": 81, "eps_z": 81, "transport": 81, "gmm_t": 81, "score_xt": 81, "2500": [81, 85], "x0_rev": 81, "_score_enables_reversal_of_diffusion_exercis": 81, "dsm": [81, 82], "j_": 81, "e_": [81, 82], "tild": 81, "s_": [81, 82, 100], "esm": 81, "2_t": 81, "gamma_t": [81, 82], "1dt": [81, 82], "emphas": [81, 82], "sigma_ts_": 81, "rapidli": [81, 88], "scare": 81, 
"disguis": 81, "absorb": 81, "alpha_t": 81, "2303": 81, "00848": 81, "2206": 81, "00364": 81, "2106": 81, "05527": 81, "_denoising_objective_discuss": 81, "sigma_t_fun": 81, "toler": [81, 82, 97], "random_t": [81, 82], "perturbed_x": [81, 82], "sigma_t_test": 81, "score_analyt_test": [81, 82], "_implementing_denoising_score_matching_objective_exercis": 81, "gaussianfourierproject": [81, 82], "embed_dim": [81, 82], "t_proj": 81, "scoremodel_tim": 81, "t_emb": 81, "induct": [81, 82], "sample_x_and_score_t_depend": 81, "trainn": 81, "partit": 81, "trainn_part": 81, "x_train_col": 81, "y_train_col": 81, "t_train_col": 81, "gmm_dif": 81, "x_train_tsr": 81, "y_train_tsr": 81, "t_train_tsr": 81, "test_dsm_object": 81, "x_train_samp": 81, "y_train_samp": 81, "t_train_samp": 81, "x_test_samp": 81, "y_test_samp": 81, "t_test_samp": 81, "score_model_td": 81, "sigma_t_f": 81, "5k": [81, 85], "y_pred_train": 81, "mse_train": 81, "y_pred_test": 81, "mse_test": 81, "stats_df": 81, "dsm_loss": 81, "reverse_diffusion_sde_sampl": 81, "tvec": 81, "x_traj_rev_appr_denoi": 81, "x_traj_rev_exact": 81, "x_samp": 81, "bravo": 81, "bigg": 81, "ordinari": 81, "brian": 81, "anderson": 81, "1986": 81, "song": 81, "practis": [81, 88], "pascal": 81, "vincent": 81, "2011": 81, "w2d4_t3": 82, "functool": 82, "multiplicativelr": 82, "lambdalr": 82, "takeawai": 82, "highwai": 82, "_network_architecture_video": 82, "marginal_prob_std": 82, "diffusion_coeff": 82, "int_0": 82, "tg": 82, "2t": 82, "0t": 82, "diff_coeff": 82, "_train_diffusion_for_mnist_exercis": 82, "readout": 82, "freq": 82, "x_proj": 82, "repr": 82, "time_emb": 82, "t_mod1": 82, "gnorm1": 82, "groupnorm": 82, "num_channel": [82, 100, 102], "t_mod2": 82, "gnorm2": 82, "t_mod3": 82, "gnorm3": 82, "t_mod4": 82, "gnorm4": 82, "tconv4": 82, "t_mod5": 82, "tgnorm4": 82, "tconv3": 82, "t_mod6": 82, "tgnorm3": 82, "tconv2": 82, "t_mod7": 82, "tgnorm2": 82, "tconv1": 82, "swish": 82, "h3": 82, "4th": [82, 88, 100], "h4": 82, 
"_unet_architecture_discuss": 82, "irregular": 82, "_2": [82, 91], "marginal_prob_std_test": 82, "_defining_the_loss_function_exercis": 82, "suffic": 82, "marginal_prob_std_fn": 82, "diffusion_coeff_fn": 82, "score_model": 82, "10e": 82, "lr_lambda": 82, "tqdm_epoch": 82, "num_item": [82, 94], "get_last_lr": 82, "yann": 82, "lecun": [82, 91], "exdb": 82, "idx3": 82, "ubyt": 82, "idx1": 82, "t10k": 82, "euler_maruyama_sampl": 82, "x_shape": 82, "maruyama": 82, "init_x": 82, "step_siz": 82, "batch_time_step": 82, "mean_x": 82, "save_samples_uncond": 82, "sample_batch_s": 82, "sample_np": 82, "imsav": 82, "uncondition_diffus": 82, "uncond_score_model": 82, "filenotfounderror": [82, 100, 102], "pickle_modul": 82, "weights_onli": 82, "mmap": 82, "pickle_load_arg": 82, "_open_file_lik": 82, "opened_fil": 82, "_is_zipfil": 82, "1002": 82, "1003": 82, "orig_posit": 82, "name_or_buff": 82, "_is_path": 82, "_open_fil": 82, "errno": 82, "advis": 82, "effortless": 82, "_conditional_diffusion_model_video": 82, "uncondit": 82, "_advanced_techinque_stable_diffusion_video": 82, "potent": 82, "stablediffusionpipelin": 82, "dpmsolvermultistepschedul": 82, "pndmschedul": 82, "model_id": [82, 88], "stabilityai": 82, "torch_dtyp": 82, "float16": 82, "pndm": 82, "from_config": 82, "dpm": 82, "loos": 82, "dessert": 82, "gogh": 82, "ballerina": 82, "danc": 82, "starri": 82, "monet": 82, "num_inference_step": 82, "_stable_diffusion_interactive_demo": 82, "babi": 82, "recursive_print": 82, "text_encod": 82, "named_children": 82, "modulelist": 82, "unet2dconditionmodel": 82, "conv_in": 82, "time_proj": 82, "time_embed": 82, "timestepembed": 82, "linear_1": 82, "1280": 82, "silu": 82, "linear_2": 82, "down_block": 82, "crossattndownblock2d": 82, "downblock2d": 82, "up_block": 82, "upblock2d": 82, "crossattnupblock2d": 82, "mid_block": 82, "unetmidblock2dcrossattn": 82, "conv_norm_out": 82, "conv_act": 82, "conv_out": 82, "cliptextmodel": 82, "text_model": 82, "cliptexttransform": 82, 
"cliptextembed": 82, "token_embed": [82, 84], "49408": 82, "position_embed": 82, "clipencod": 82, "clipencoderlay": 82, "final_layer_norm": 82, "layernorm": [82, 84], "elementwise_affin": 82, "_architecture_of_stable_diffusion_model_discuss": 82, "_ethical_consideration_video": 82, "artist": 82, "prompter": 82, "deserv": 82, "credit": 82, "_copyrights_discuss": 82, "misinform": 82, "unet_condit": 82, "text_dim": 82, "nclass": 82, "cond_emb": 82, "y_mod2": 82, "y_mod3": 82, "y_mod4": 82, "y_mod5": 82, "y_mod6": 82, "y_mod7": 82, "constant_": 82, "y_emb": 82, "loss_fn_cond": 82, "initil": 82, "score_model_cond": 82, "lr_current": 82, "ckpt_cond": 82, "empty_cach": 82, "bikram": [84, 85], "khastgir": [84, 85], "rajaswa": [84, 85], "patil": [84, 85], "egor": [84, 85], "zverev": [84, 85], "alish": [84, 85, 87, 88, 89], "dipani": [84, 85, 87, 88, 89], "ezekiel": [84, 85], "william": [84, 85], "hadi": [84, 85, 94], "vafaei": [84, 85, 94], "pytorch_pretrained_bert": 84, "gensim": [84, 85, 87, 88], "w2d5_t1": 84, "ta_cache_dir": [84, 85], "pprint": [84, 85], "abc": [84, 85], "abstractmethod": [84, 85], "word2vec": [84, 85, 87, 88], "manifold": [84, 85, 87], "tsne": [84, 85, 87], "vocab": [84, 85, 87, 88], "autotoken": [84, 85, 88], "berttoken": 84, "bertformaskedlm": 84, "modulenotfounderror": [84, 85], "get_ipython": [84, 85], "run_line_mag": [84, 85], "zqw5": 84, "brown_wordlist": [84, 87], "w2vmodel": [84, 87], "editori": [84, 87], "fiction": [84, 87], "govern": [84, 87], "mysteri": [84, 87, 88], "religion": [84, 87], "romanc": [84, 87], "science_fict": [84, 87], "create_word2vec_model": [84, 87], "sg": [84, 87], "min_count": [84, 87], "vector_s": [84, 87], "model_dictionari": [84, 87], "wv": [84, 87], "get_embed": [84, 87], "keyerror": 84, "check_word_in_corpu": 84, "word_embed": [84, 87], "layer1_s": 84, "embed_list": 84, "f_x": 84, "ambienc": 84, "aggreg": 84, "dataset_dict": 84, "datasetdict": 84, "sentiment": [84, 87], "cach": 84, "set_format": 84, "input_id": [84, 
85, 88], "hf_datasets_cach": [84, 85], "kthjg": [84, 85], "load_dataset": [84, 85, 87, 88], "yelp_review_ful": [84, 85], "download_mod": [84, 85], "reuse_dataset_if_exist": [84, 85], "cache_dir": [84, 85], "ignore_verif": [84, 85], "charg": 84, "variant": 84, "mlm": [84, 88], "pred_text": 84, "actual_label": 84, "batch1": 84, "transform_sentence_for_bert": 84, "masked_word": 84, "___": 84, "parse_text_and_word": 84, "raw_lin": 84, "option1": 84, "optionn": 84, "mask_index": 84, "get_probabilities_of_masked_word": 84, "uncas": 84, "words_idx": 84, "convert_tokens_to_id": 84, "tokenized_text": 84, "indexed_token": 84, "masked_index": 84, "tokens_tensor": 84, "pretrained_masked_model": 84, "predicted_index": 84, "ix": 84, "suffer": [84, 101], "attend": 84, "_application_of_attention_discuss": 84, "40min": [84, 94], "_queries_keys_and_values_video": 84, "ambigi": 84, "get_value_attent": 84, "query_embed": 84, "query_similar_word": 84, "similar_by_word": 84, "key_embed": 84, "unscal": 84, "scaled_attent": 84, "softmax_attent": 84, "nscale": 84, "value_similar_word": 84, "similar_by_vector": 84, "monei": 84, "random_word": [84, 87], "_intution_behind_attention_interactive_demo": 84, "_does_this_model_perform_well_discuss": 84, "dotproductattent": 84, "calculate_scor": 84, "databas": 84, "bmm": 84, "softmax_weight": 84, "dot_product_attent": 84, "_dot_product_attention_exercis": 84, "_multihead_attention_video": 84, "semant": [84, 101], "to_kei": 84, "to_queri": 84, "to_valu": 84, "selfattent": 84, "unify_head": 84, "unifi": 84, "multiheadattent": 84, "_q_k_v_attention_exercis": 84, "_transformer_overview_i_video": 84, "transformerblock": 84, "norm1": 84, "norm2": 84, "norm_1": 84, "norm_2": 84, "1607": 84, "06450": 84, "_transformer_encoder_exercis": 84, "_transformer_overview_ii_video": 84, "autoregress": 84, "explanatori": 84, "vaswani": 84, "_complexity_of_decoding_discuss": 84, "_positional_encoding_video": 84, "concern": 84, "sinusoid": 84, "pe_": 84, "2i": 84, 
"d_": 84, "pe": 84, "bonu": [84, 87], "familiaris": 84, "transformer_tutori": 84, "emb_siz": 84, "inject": 84, "div_term": 84, "wavelength": 84, "2\u03c0": 84, "pepo": 84, "register_buff": 84, "gehr": 84, "alammar": 84, "phillip": 84, "lipp": 84, "num_token": 84, "pos_enc": 84, "transformer_block": 84, "classification_head": 84, "sequence_avg": 84, "_transformer_architecture_for_classification_exercis": 84, "n_iter": [84, 87], "l2_penalti": 84, "l1_penalti": 84, "opim": 84, "n_neuron": 84, "n_test": 84, "placehold": [84, 85], "cf": 84, "appendix": 84, "iter_train_loss": 84, "iter_loss_test": 84, "test_batch": 84, "out_test": 84, "loss_test": [84, 87], "pred_batch": 84, "predicted_label28": 84, "11min": 84, "keen": 84, "favor": 84, "racial": 84, "gender": 84, "crow": 84, "gather": 84, "wouldn": [84, 85], "astrophysicist": 84, "gross": 84, "socioeconom": 84, "statu": 84, "alcohol": 84, "mansion": 84, "favour": 84, "u2019t": 84, "attract": 84, "masked_text": 84, "_find_biases_in_the_model_interactive_demo": 84, "creatur": 84, "ago": [84, 87], "prei": [84, 88], "jungl": 84, "compsognathu": 84, "_problems_of_this_approach_discuss": 84, "protbert": 84, "_biases_of_using_these_models_in_other_fields_discuss": 84, "5min": 84, "vit": 84, "dall": 84, "parti": 84, "nerf": 84, "wav2vec": 84, "generalist": 84, "gato": 84, "demand": 84, "w2d5_t2_bonu": 85, "trainer": [85, 88], "trainingargu": [85, 88], "automodelforcausallm": 85, "automodelforsequenceclassif": 85, "_pretraining_video": 85, "sentiment_dict": 85, "clean_text": 85, "backslash": 85, "sample_review_from_yelp": 85, "use_custom_review": 85, "custom_review": 85, "gpt2": [85, 88], "xlnet": 85, "extension_prompt": 85, "num_output_respons": 85, "input_text": 85, "generated_respons": 85, "num_return_sequ": [85, 88], "THE": 85, "generated_text": [85, 88], "custom_positive_extens": 85, "custom_negative_extens": 85, "positive_input_id": 85, "positive_attention_mask": 85, "attention_mask": 85, "positive_label_id": 85, 
"positive_extension_likelihood": 85, "nlog": 85, "negative_input_id": 85, "negative_attention_mask": 85, "negative_label_id": 85, "negative_extension_likelihood": 85, "nposit": 85, "nneg": 85, "_finetuning_video": 85, "billion": [85, 87], "bert": [85, 101], "tokenize_funct": 85, "tokenis": 85, "tokenized_dataset": 85, "10k": 85, "train_lay": 85, "num_label": 85, "pooler": 85, "frozen": [85, 94], "training_arg": [85, 88], "output_dir": [85, 88], "yelp_bert": 85, "overwrite_output_dir": 85, "evaluation_strategi": 85, "per_device_train_batch_s": [85, 88], "per_device_eval_batch_s": 85, "num_train_epoch": 85, "fp16": 85, "save_step": 85, "logging_step": 85, "report_to": 85, "compute_metr": [85, 88], "eval_pr": [85, 88], "load_metr": 85, "eval_dataset": 85, "22min": 85, "_robustness_video": 85, "deceiv": 85, "persuad": 85, "impart": 85, "_load_an_original_review_interactive_demo": 85, "wordswapqwerti": 85, "wordswapextend": 85, "wordswapcontract": 85, "wordswaphomoglyphswap": 85, "compositetransform": 85, "wordswaprandomcharacterdelet": 85, "wordswapneighboringcharacterswap": 85, "wordswaprandomcharacterinsert": 85, "wordswaprandomcharactersubstitut": 85, "flair": 85, "ordereddict": 85, "wordswap": 85, "_get_replacement_word": 85, "default_class_repr": 85, "hasattr": 85, "extra_repr_kei": 85, "extra_param": 85, "extra_str": 85, "__dict__": 85, "label_color": 85, "pink": 85, "adversari": 85, "current_text": 85, "pre_transformation_constraint": 85, "indices_to_modifi": 85, "shifted_idx": 85, "_get_transform": 85, "attackedtext": 85, "pretransformationconstraint": 85, "dictat": 85, "searchmethod": 85, "transformed_text": 85, "convert_from_original_idx": 85, "attack_attr": 85, "last_transform": 85, "indicies_to_modifi": 85, "letters_to_insert": 85, "char": 85, "ascii_lett": 85, "_get_random_lett": 85, "lowercas": [85, 88], "uppercas": 85, "word_to_replac": 85, "replacement_word": 85, "transformed_texts_idx": 85, "replace_word_at_index": 85, "qwerti": 85, "replic": [85, 89, 
91], "random_on": 85, "skip_first_char": 85, "skip_last_char": 85, "wordswapqwert": 85, "fabul": 85, "_keyboard_adjac": 85, "_get_adjac": 85, "adjacent_kei": 85, "s_lower": 85, "isupp": 85, "candidate_word": 85, "start_idx": 85, "end_idx": 85, "randrang": 85, "swap_kei": 85, "extension_map": 85, "ain": 85, "hadn": 85, "hasn": [85, 88], "madam": 85, "mightn": 85, "mustn": 85, "needn": 85, "oughtn": 85, "ought": 85, "shan": 85, "wasn": 85, "weren": 85, "expend": 85, "contract": 85, "reverse_contraction_map": 85, "word_idx": 85, "next_idx": 85, "next_word": 85, "delete_word_at_index": 85, "homoglyph": 85, "graphem": 85, "glyph": 85, "substr": [85, 88], "homo": 85, "\u09ed": 85, "\u0223": 85, "\ud835\udfd5": 85, "\u0431": 85, "\u01bd": 85, "\uab9e": 85, "\u0292": 85, "\u14bf": 85, "\u0251": 85, "\u044c": 85, "\u03f2": 85, "\u0501": 85, "\u0435": 85, "\ud835\ude8f": 85, "\u0261": 85, "\u0570": 85, "\u0456": 85, "\u03f3": 85, "\ud835\udc8c": 85, "\u217c": 85, "\uff4d": 85, "\u0578": 85, "\u043e": 85, "\u0440": 85, "\u051b": 85, "\u2c85": 85, "\u0455": 85, "\ud835\ude9d": 85, "\u057d": 85, "\u0475": 85, "\u051d": 85, "\u0443": 85, "\u1d22": 85, "repl_lett": 85, "disregard": 85, "optoin": 85, "new_attacked_text": 85, "main_str": 85, "transformation_lin": 85, "add_ind": 85, "stopword": 85, "_get_modifiable_indic": 85, "check_compat": 85, "wordembeddingdist": 85, "words_from_text": 85, "words_to_ignor": 85, "alphanumer": 85, "legitim": 85, "isalnum": 85, "apostroph": 85, "hyphen": 85, "asterisk": 85, "_flair_pos_tagg": 85, "flair_tag": 85, "tag_typ": 85, "upo": 85, "tagger": 85, "sequencetagg": 85, "zip_flair_result": 85, "split_token": 85, "previous_attacked_text": 85, "text_input": 85, "_text_input": 85, "_word": 85, "_words_per_input": 85, "_pos_tag": 85, "_ner_tag": 85, "setdefault": 85, "original_index_map": 85, "num_word": 85, "modified_indic": 85, "__eq__": 85, "__hash__": 85, "hash": 85, "free_memori": 85, "text_window_around_index": 85, "window_s": 85, "half_siz": 
85, "text_idx_start": 85, "_text_index_of_word_index": 85, "text_idx_end": 85, "pos_of_word_index": 85, "desired_word_idx": 85, "use_token": 85, "flair_word_list": 85, "flair_pos_list": 85, "word_idx_in_flair_tag": 85, "ner_of_word_index": 85, "ner": 85, "flair_ner_list": 85, "look_after_index": 85, "pre_word": 85, "lower_text": 85, "text_until_word_index": 85, "text_after_word_index": 85, "first_word_diff": 85, "other_attacked_text": 85, "first_word_diff_index": 85, "all_words_diff": 85, "ith_word_diff": 85, "words_diff_num": 85, "generate_token": 85, "words_to_token": 85, "edit_dist": 85, "w1_t": 85, "w2_t": 85, "cal_dif": 85, "replace_words_at_indic": 85, "new_word": 85, "generate_new_attacked_text": 85, "insert_text_after_word_index": 85, "word_at_index": 85, "new_text": 85, "insert_text_before_word_index": 85, "capit": 85, "get_deletion_indic": 85, "punctuat": [85, 88], "preturb": 85, "perturbed_text": 85, "original_text": 85, "new_attack_attr": 85, "newly_modified_indic": 85, "new_i": 85, "input_word": 85, "adv_word_seq": 85, "word_start": 85, "word_end": 85, "adv_word": 85, "adv_num_word": 85, "num_words_diff": 85, "shifted_modified_indic": 85, "modified_idx": 85, "original_modification_idx": 85, "new_idx_map": 85, "preced": 85, "reform": 85, "perturbed_input_text": 85, "perturbed_input": 85, "words_diff_ratio": 85, "align_with_model_token": 85, "model_wrapp": 85, "subword": [85, 87, 88], "ding": 85, "modelwrapp": 85, "word2token_map": 85, "tokenizer_input": 85, "strip_prefix": 85, "last_match": 85, "matched_token": 85, "input_tupl": 85, "column_label": 85, "words_per_input": 85, "_input": 85, "printable_text": 85, "key_color": 85, "key_color_method": 85, "entail": 85, "ck": [85, 88], "color_text": 85, "pct_words_to_swap": 85, "transformations_per_exampl": 85, "_filter_transform": 85, "compare_against_origin": 85, "call_mani": 85, "attacked_text": 85, "all_transformed_text": 85, "num_words_to_swap": 85, "words_swap": 85, "augment_mani": 85, "text_list": [85, 
87], "show_progress": [85, 88], "augment_text_with_id": 85, "id_list": 85, "supplement": 85, "all_text_list": 85, "all_id_list": 85, "_id": 85, "augmented_text": 85, "constraints_lin": 85, "constraints_str": 85, "importerror": [85, 87], "_arrow": 85, "noqa": [85, 87], "e402": 85, "dictconfig": 85, "clusteringmodel": 85, "entity_linker_model": 85, "spanclassifi": 85, "language_model": 85, "languagemodel": 85, "_iter_dataset": 85, "documentembed": 85, "scalarmix": 85, "documentcnnembed": 85, "documentlmembed": 85, "documentpoolembed": 85, "documentrnnembed": 85, "documenttfidfembed": 85, "sentencetransformerdocumentembed": 85, "transformerdocumentembed": 85, "convtransformnetworkimageembed": 85, "identityimageembed": 85, "precomputedimageembed": 85, "load_embed": 85, "register_embed": 85, "flairembed": 85, "stackedembed": 85, "tokenembed": 85, "transformerembed": 85, "transformeronnxdocumentembed": 85, "lockeddropout": 85, "worddropout": 85, "matutil": [85, 87], "f401": [85, 87], "reload": [85, 87, 94], "namespac": [85, 87], "indexedcorpu": [85, 87], "mmcorpu": [85, 87], "bleicorpu": [85, 87], "corpusabc": [85, 87], "saveload": [85, 87], "get_blas_func": [85, 87], "triu": [85, 87], "lapack": [85, 87], "get_lapack_func": [85, 87], "psi": [85, 87], "word_swap_contract": 85, "word_swap_extend": 85, "word_swap_homoglyph_swap": 85, "word_swap_neighboring_character_swap": 85, "word_swap_qwerti": 85, "word_swap_random_character_delet": 85, "word_swap_random_character_insert": 85, "word_swap_random_character_substitut": 85, "augmented_review": 85, "getpredict": 85, "return_tensor": [85, 88], "naugment": 85, "_textattack_module_interactive_demo": 85, "anushre": [87, 89], "hede": [87, 89], "pooja": [87, 89], "consul": [87, 89], "katrin": [87, 89], "reuel": [87, 89], "levenshtein": [87, 89], "portalock": 87, "w3d1_t1": 87, "facebookresearch": [87, 89], "vkuz7": [87, 89], "word_token": 87, "pad_sequ": 87, "imdb": 87, "ag_new": 87, "build_vocab_from_iter": 87, 
"to_map_style_dataset": 87, "suppress": 87, "simplefilt": 87, "punkt": 87, "download_file_from_google_dr": 87, "uc": 87, "export": 87, "get_confirm_token": 87, "save_response_cont": 87, "cooki": 87, "download_warn": 87, "chunk_siz": 87, "32768": 87, "iter_cont": 87, "aliv": 87, "_time_series_and_nlp_video": 87, "_what_is_nlp_video": 87, "_nlp_tokenization_video": 87, "linguist": 87, "prune": 87, "uninterest": 87, "typo": 87, "bewteen": 87, "freedom": 87, "random_word_embed": 87, "voter": 87, "god": 87, "administr": 87, "get_cluster_embed": 87, "embedding_clust": 87, "word_clust": 87, "closest": 87, "similar_word": 87, "most_similar": 87, "topn": 87, "cluser": 87, "tsne_model_en_2d": 87, "perplex": 87, "3500": 87, "embeddings_en_2d": 87, "tsne_plot_similar_word": 87, "rainbow": 87, "markers": 87, "xytext": 87, "textcoord": 87, "bbox_inch": 87, "farther": 87, "_similarity_discuss": 87, "_embeddings_rule_video": 87, "_distributional_similarity_and_vector_embeddings_video": 87, "morphem": 87, "oblivi": 87, "2frqg": [87, 89], "ft_en_vector": [87, 89], "get_word_vector": [87, 89], "nembed": 87, "04045481": 87, "10617249": 87, "27222311": 87, "06879666": 87, "16408321": 87, "00276707": 87, "27080125": 87, "05805573": 87, "31865698": 87, "03748008": 87, "00254088": 87, "13805169": 87, "00182498": 87, "08973497": 87, "00319015": 87, "19619396": 87, "09858181": 87, "10103802": 87, "08279888": 87, "0082208": 87, "13119364": 87, "15956607": 87, "17203182": 87, "0315701": 87, "25064597": 87, "06182072": 87, "03929246": 87, "05157393": 87, "03543638": 87, "13660161": 87, "05473648": 87, "06072914": 87, "04709269": 87, "17394426": 87, "02101276": 87, "11402624": 87, "24489872": 87, "08576579": 87, "00322696": 87, "04509873": 87, "00614253": 87, "05772085": 87, "073414": 87, "06718913": 87, "06057961": 87, "10963406": 87, "1245006": 87, "04819863": 87, "11408057": 87, "11081408": 87, "06752145": 87, "01689911": 87, "01186301": 87, "11716368": 87, "01287614": 87, "10639337": 87, 
"04243141": 87, "01057278": 87, "0230855": 87, "04930984": 87, "04717607": 87, "03696446": 87, "0015999": 87, "02193867": 87, "01331578": 87, "11102925": 87, "1686794": 87, "05814958": 87, "00296521": 87, "04252011": 87, "00352389": 87, "06267346": 87, "07747819": 87, "08959802": 87, "02445797": 87, "08913022": 87, "13422231": 87, "1258949": 87, "01296814": 87, "0531218": 87, "00541025": 87, "16908626": 87, "06323182": 87, "11510128": 87, "08352032": 87, "07224389": 87, "01023453": 87, "08263734": 87, "03859017": 87, "00798539": 87, "01498295": 87, "05448429": 87, "02708506": 87, "00549948": 87, "14634523": 87, "12550676": 87, "04641578": 87, "10164826": 87, "05370862": 87, "01217492": 87, "get_nearest_neighbor": 87, "8168574571609497": 87, "princ": 87, "796097457408905": 87, "emperor": 87, "7907207608222961": 87, "7655220627784729": 87, "lord": 87, "7435404062271118": 87, "7394551634788513": 87, "chieftain": 87, "7307553291320801": 87, "tyrant": 87, "7226710319519043": 87, "conqueror": 87, "719561755657196": 87, "kingli": 87, "718187689781189": 87, "queen": 87, "cosine_similar": [87, 89, 94], "vec_a": [87, 89], "vec_b": [87, 89], "getsimilar": [87, 89], "word1": [87, 89], "word2": [87, 89], "knight": 87, "twenti": 87, "nsimilar": 87, "ascend": 87, "descend": 87, "victori": 87, "defeat": 87, "7181877493858337": 87, "6881008744239807": 87, "2892838716506958": 87, "19655467569828033": 87, "833964467048645": 87, "8707448840141296": 87, "7478055953979492": 87, "8461978435516357": 87, "595384955406189": 87, "word_similar": 87, "5649225115776062": 87, "pronunci": 87, "4072215259075165": 87, "5812374353408813": 87, "_check_similarity_between_words_interactive_demo": 87, "context_word_1": 87, "context_word_2": 87, "word_similarity_1": 87, "word_similarity_2": 87, "7297980785369873": 87, "340322345495224": 87, "woman": 87, "_____": 87, "germani": [87, 88], "berlin": 87, "franc": 87, "petal": 87, "get_analog": 87, "funnction": 87, "positv": 87, "______": 87, "frannc": 87, 
"8162637948989868": 87, "8568049669265747": 87, "7037209272384644": 87, "flower": 87, "poverti": 87, "wealth": 87, "615874171257019": 87, "afflict": 87, "5437814593315125": 87, "_explore_homonyms_interactive_demo": 87, "_using_embeddings_video": 87, "cheap": 87, "attach": 87, "neuralnet": [87, 100, 102], "embedding_length": 87, "voabulari": 87, "embeddingbag": 87, "embedding_fasttext": 87, "requiresgrad": 87, "initrang": 87, "uniform_": 87, "_simple_feed_forward_net_exercis": 87, "train_it": 87, "valid_it": 87, "test_it": 87, "emb_vector": 87, "plot_train_v": 87, "total_acc": 87, "total_count": 87, "elaps": 87, "valid_split": 87, "yield_token": 87, "data_it": 87, "set_default_index": 87, "text_pipelin": 87, "label_pipelin": 87, "collate_batch": 87, "_label": 87, "_text": 87, "processed_text": 87, "split_train_": 87, "split_valid_": 87, "collate_fn": 87, "valid_dataload": 87, "steplr": 87, "total_accu": 87, "epoch_start_tim": 87, "accu_train": 87, "loss_train": 87, "accu_v": 87, "loss_val": 87, "accu_test": 87, "training_accuraci": 87, "validation_accuraci": 87, "ag_news_label": 87, "sci": 87, "tec": 87, "ex_text_str": 87, "memphi": 87, "tenn": 87, "jon": 87, "rahm": 87, "endur": 87, "season": 87, "weather": 87, "sundai": 87, "royal": 87, "portrush": 87, "thursdai": 87, "wgc": 87, "fedex": 87, "jude": 87, "spaniard": 87, "flawless": 87, "pga": 87, "impress": [87, 101], "nine": 87, "tpc": 87, "southwind": 87, "multilingu": 87, "jordan": 88, "matelski": 88, "weizh": 88, "yuan": 88, "dalia": 88, "nasr": 88, "stephen": 88, "kiilu": 88, "konstantin": 88, "tsafatino": 88, "comprehens": 88, "influenti": 88, "pytorch_lightn": 88, "typing_extens": 88, "w3d1_t2": 88, "regex": 88, "parallelli": 88, "_intro_to_nlps_and_llms_video": 88, "_nlp_pipeline_video": 88, "march": 88, "grade": 88, "exchang": 88, "hf": 88, "wikitext": 88, "41492": 88, "wolv": 88, "howl": 88, "assembl": 88, "den": 88, "storm": 88, "unfamiliar": 88, "territori": 88, "km2": 88, "sq": 88, "indistinguish": 88, 
"voic": 88, "octav": 88, "bass": 88, "stress": 88, "nasal": 88, "bariton": 88, "pup": 88, "yearl": 88, "yelp": 88, "harmon": 88, "overton": 88, "smoothli": 88, "mate": 88, "kill": 88, "cry": 88, "bark": 88, "choru": 88, "lone": 88, "protract": 88, "melodi": 88, "north": [88, 100], "louder": 88, "stronger": 88, "syllabl": 88, "mutual": 88, "biologist": 88, "generate_n_exampl": 88, "protocol": 88, "ecosystem": 88, "reinvent": 88, "richer": 88, "workflow": 88, "embedd": 88, "shelf": 88, "splitter": 88, "12_000": 88, "wordpiec": 88, "workpiec": 88, "subchunk": 88, "diacrit": 88, "accent": 88, "whitespac": 88, "stripacc": 88, "whitespacesplit": 88, "individual_digit": 88, "bpe": 88, "ee": 88, "tokenizer_train": 88, "wordpiecetrain": 88, "special_token": 88, "sample_ratio": 88, "dataset_smal": 88, "train_from_iter": 88, "hello": [88, 89], "toastersock": 88, "groommpi": 88, "hell": 88, "9140": 88, "2273": 88, "4375": 88, "aster": 88, "omm": 88, "downstream": [88, 94], "_is_it_a_good_idea_to_do_pre_tokenizers_discuss": 88, "_tokenizer_good_practices_discuss": 88, "unicod": 88, "\u997f": 88, "_chinese_and_english_tokenizer_discuss": 88, "_bert_video": 88, "_nlg_video": 88, "codeparrot": 88, "offroad": 88, "_tokenizers_discuss": 88, "7gb": 88, "500mb": 88, "automodelwithlmhead": 88, "generation_pipelin": 88, "input_prompt": 88, "simple_add": 88, "input_token_id": 88, "input_str": 88, "convert_ids_to_token": 88, "\u0121simpl": 88, "3486": 88, "\u0121int": 88, "1109": 88, "\u0121b": 88, "\u0121": 88, "1035": 88, "\u010b\u0121\u0121\u0121": 88, "\u0121add": 88, "15747": 88, "\u0121two": 88, "2877": 88, "\u0121number": 88, "5579": 88, "\u0121togeth": 88, "10451": 88, "\u0121and": 88, "\u0121return": 88, "2529": 88, "\u0121the": 88, "\u0121result": 88, "weirdli": 88, "copilot": 88, "wilder": 88, "simple_mul": 88, "simple_div": 88, "simpleadd": 88, "yike": 88, "ew": 88, "hobbyist": 88, "java": 88, "devolv": 88, "resembl": [88, 101], "diagnos": 88, "nonexpert": 88, 
"_using_sota_models_discuss": 88, "alright": 88, "_some_": 88, "isc": 88, "apach": 88, "repo_nam": 88, "overwhelmingli": 88, "lightn": 88, "serializ": 88, "bunch": 88, "collat": 88, "datacollatorforlanguagemodel": 88, "encoded_dataset": 88, "remove_column": 88, "data_col": 88, "ellipsi": 88, "simpleadder2": 88, "simpleadder3": 88, "jam": 88, "imperfect": 88, "_finetune_the_model_exercis": 88, "_accuracy_metric_observations_discuss": 88, "_conclusion_video": [88, 94], "payload": 88, "bearer": 88, "api_url": 88, "hf_": 88, "chatgpt": [88, 101], "gpt3": 88, "gptbing": 88, "gpt4": 88, "latter": [88, 91], "musk": 88, "biographi": 88, "uk": 88, "japan": 88, "capita": 88, "_play_around_with_llms_act": 88, "tutor": 88, "exam": 88, "_what_models_video": [88, 94], "w3d1_t3_bonu": 89, "tradition": [89, 97], "sum_i": [89, 100], "wy_i": 89, "deeptext": 89, "unchang": 89, "rqadk": 89, "neither": [89, 97], "nor": [89, 97], "bonjour": 89, "7028388977050781": 89, "20523205399513245": 89, "chatt": 89, "chat": 89, "013087842613458633": 89, "02490561455488205": 89, "6003134250640869": 89, "en_word": 89, "fr_word": 89, "bilingual_dictionari": 89, "make_training_matric": 89, "learn_transform": 89, "svd": 89, "source_dictionari": 89, "target_dictionari": 89, "source_matrix": 89, "target_matrix": 89, "21030391": 89, "atleast_1d": 89, "expand_dim": 89, "normalize_vector": 89, "dictionary_length": 89, "bilingu": 89, "source_training_matrix": 89, "target_training_matrix": 89, "5818599462509155": 89, "4327273964881897": 89, "6866635084152222": 89, "6003133654594421": 89, "_multilingual_embeddings_bonus_act": 89, "central": [91, 97], "w3d2_t1": 91, "_intro_to_dl_thinking_2_video": 91, "_getting_more_vignette_video": 91, "bui": 91, "costli": 91, "shear": 91, "bright": 91, "1000x": 91, "recogniz": 91, "_getting_more_data_discuss": 91, "_getting_more_data_wrapup_video": 91, "balestriero": 91, "bottou": 91, "2204": 91, "03632": 91, "_classbased_strategies_bonus_discuss": 91, 
"_detecting_tumors_vignette_video": 91, "_detecting_tumors_setup_video": 91, "hospit": 91, "scan": 91, "coher": 91, "chop": 91, "former": 91, "_detecting_tumors_discuss": 91, "_detecting_tumors_wrapup_video": 91, "tschandl": 91, "rinner": 91, "apalla": 91, "skin": 91, "nat": 91, "med": 91, "1234": [91, 94], "1038": 91, "s41591": 91, "020": 91, "0942": 91, "_brains_on_forrest_gump_vignette_video": 91, "_brains_on_forrest_gump_setup_video": 91, "_1": 91, "pearson": 91, "rho": 91, "_brains_on_forrest_gump_discuss": 91, "_brains_on_forrest_gump_wrapup_video": 91, "arora": 91, "bilm": 91, "livescu": 91, "canon": [91, 97, 100, 102], "proceed": 91, "30th": 91, "pmlr": 91, "1247": 91, "1255": 91, "mlr": 91, "v28": 91, "andrew13": 91, "_wrapup_of_dl_thinking_video": 91, "cca": 91, "w3d3_bonuslectur": 93, "_melanie_mitchell_video": 93, "arna": 94, "ghosh": 94, "colleen": 94, "gillon": 94, "atnafu": 94, "lambebo": 94, "colleenjg": 94, "neuromatch_ssl_tutori": 94, "importlib": 94, "repo_path": [94, 100, 102], "download_str": [94, 100, 102], "redownload": 94, "zipurl": [94, 100, 102], "smqvg": 94, "zipresp": [94, 100, 102], "w3d3_t1": 94, "plot_util": 94, "runner": 94, "w3d3_unsupervisedandselfsupervisedlearn": 94, "xkcd": 94, "plot_rsm_histogram": 94, "min_val": 94, "nanmin": 94, "max_val": 94, "nanmax": 94, "test_custom_torch_rsm_fct": 94, "custom_torch_rsm_fct": 94, "f_name": 94, "rand_feat": 94, "rsm_custom": 94, "rsm_ground_truth": 94, "calculate_torch_rsm": 94, "equal_nan": 94, "test_custom_contrastive_loss_fct": 94, "custom_simclr_contrastive_loss": 94, "rand_proj_feat1": 94, "rand_proj_feat2": 94, "loss_custom": 94, "loss_ground_truth": 94, "contrastive_loss": 94, "dspritesdataset": 94, "dsprites_subset": 94, "dsprites_torchdataset": 94, "dspritestorchdataset": 94, "target_lat": 94, "train_sampl": 94, "test_sampl": 94, "train_test_split_idx": 94, "fraction_train": 94, "randst": 94, "supervised_encod": 94, "load_encod": 94, "model_typ": 94, "random_encod": 94, 
"vae_encod": 94, "invariance_transform": 94, "randomaffin": 94, "dsprites_invariance_torchdataset": 94, "simclr_transform": 94, "simclr_encod": 94, "_why_do_representations_matter_video": 94, "openli": 94, "oval": 94, "heart": 94, "show_imag": 94, "posx": 94, "posi": 94, "compris": 94, "1x84": 94, "feat_encoder_schemat": 94, "1200": 94, "seed_process": 94, "train_test_splix_idx": 94, "16000": 94, "train_classifi": 94, "freeze_featur": 94, "substructur": 94, "encodercor": 94, "train_supervised_encod": 94, "_logistic_regression_classifier_exercis": 94, "_supervised_learning_and_invariance_video": 94, "nbr": [94, 97], "pairwis": 94, "_function_that_calculates_rsms_exercis": 94, "whichev": 94, "sorting_lat": 94, "plot_model_rsm": 94, "rsm_fct": 94, "encoder_rsm": 94, "encoder_lat": 94, "all_lat": 94, "all_featur": 94, "_supervised_network_encoder_rsm_interactive_demo": 94, "rsms_supervised_encoder_10ep_bs1000_seed2021": 94, "_what_patterns_do_the_rsms_reveal_discuss": 94, "_random_representations_video": 94, "trivial": 94, "plot_rsm": 94, "_plotting_a_random_network_encoder_exercis": 94, "rsms_random_encoder_0ep_bs0_seed2021": 94, "_trained_vs_random_encoder_discuss": 94, "ahead": 94, "random_loss_arrai": 94, "_evaluating_the_classification_performance_exercis": 94, "_random_projections_with_dsprites_discuss": 94, "_generative_models_video": 94, "absenc": 94, "kullback": 94, "leibler": 94, "kld": 94, "load_decod": 94, "vae_decod": 94, "load_vae_decod": 94, "vae_encoder_300ep_bs500_seed2021": 94, "vae_decoder_300ep_bs500_seed2021": 94, "plot_vae_reconstruct": 94, "_pretrained_vae_interactive_demo": 94, "overcom": [94, 101], "_vae_on_the_reconstruction_task_discuss": 94, "_vae_encoder_rsms_interactive_demo": 94, "rsms_vae_encoder_300ep_bs500_seed2021": 94, "_construct_a_meaningful_representation_space_discuss": 94, "kept": 94, "vae_train_loss": 94, "vae_loss_arrai": 94, "_evaluate_performance_using_pretrained_vae_exercis": 94, 
"_modern_approach_in_selfsupervised_learning_video": 94, "predetermin": 94, "_image_transformations_interactive_demo": 94, "_data_transformations_video": 94, "proj_feat1": 94, "feat_siz": 94, "proj_feat2": 94, "similarity_matrix": 94, "pos_sample_ind": 94, "neg_sample_ind": 94, "denomin": 94, "relax": 94, "z1": 94, "z2": 94, "proj_featur": 94, "2n": 94, "_simclr_loss_function_exercis": 94, "test_simclr_loss_arrai": 94, "train_simclr": 94, "loss_fct": 94, "neg_pair": 94, "total_loss": 94, "num_tot": 94, "z_aug1": 94, "z_aug2": 94, "simclr_encoder_60ep_bs1000_deg90_trans0": 94, "2_scale0": 94, "8to1": 94, "2_seed2021": 94, "dsprites_torch": 94, "dsprites_invariance_torch": 94, "simclr_loss_arrai": 94, "_evaluate_performance_using_pretrained_simclr_interactive_demo": 94, "_un_self_supervised_learning_video": 94, "train_sampler_bias": 94, "significantli": 94, "6x": 94, "train_sampler_bias_ctrl": 94, "bias_typ": 94, "shape_posx_spac": 94, "test_sampler_for_bias": 94, "compens": 94, "train_bia": 94, "test_sampler_for_bias_ctrl": 94, "5808": 94, "posx_quadr": 94, "full_training_procedur": 94, "dataset_typ": 94, "funtion": 94, "bias_ctrl": 94, "encoder_label": 94, "num_clf_epoch": 94, "ntrain": 94, "train_encoder_clfs_by_fraction_label": 94, "subset_se": 94, "vae_encoder_bias_ctrl_450ep_bs500_seed2021": 94, "simclr_encoder_bias_ctrl_150ep_bs1000_deg90_trans0": 94, "labelled_fract": 94, "plot_accuraci": 94, "1232": 94, "1233": 94, "train_clfs_by_fraction_label": 94, "1235": 94, "1236": 94, "1237": 94, "1238": 94, "1239": 94, "1240": 94, "plot_chanc": 94, "1241": 94, "1242": 94, "1244": 94, "1245": 94, "1064": 94, "1062": 94, "fresh": 94, "1063": 94, "orig_encod": 94, "1065": 94, "1066": 94, "num_epochs_use_al": 94, "1067": 94, "fraction_of_label": 94, "1068": 94, "1069": 94, "progress_bar": 94, "1070": 94, "1072": 94, "1073": 94, "classification_optim": 94, "get_featur": 94, "feats_extr": 94, "feature_extractor": 94, "feats_flat": 94, "feats_proj": 94, "linear_project": 94, 
"_batchnorm": 94, "bn_train": 94, "running_mean": 94, "running_var": 94, "batch_norm": 94, "exponential_average_factor": 94, "2482": 94, "2479": 94, "2480": 94, "_verify_batch_s": 94, "2483": 94, "2484": 94, "notabl": 94, "weaker": 94, "mitig": 94, "_biased_training_dataset_discuss": 94, "_general_principles_discuss": 94, "_invariant_representations_bonus_video": 94, "_simclr_network_encoder_rsms_bonus_interactive_demo": 94, "rsms_simclr_encoder_60ep_bs1000_deg90_trans0": 94, "_contrastive_models_bonus_discuss": 94, "_avoiding_representational_collapse_bonus_video": 94, "hood": 94, "simclr_encoder_neg_pair": 94, "rsms_and_histogram_plot": 94, "simclr_rsm": 94, "simclr_neg_pairs_rsm": 94, "random_rsm": 94, "calc": 94, "_visualizing_the_network_encoder_rsms_bonus_exercis": 94, "nuse": 94, "simclr_neg_pairs_loss_arrai": 94, "rsms_simclr_encoder_2neg_60ep_bs1000_deg90_trans0": 94, "_negative_pairs_in_computing_the_contrastive_loss_bonus_discuss": 94, "_simclr_network_encoder_pretrained_with_only_a_few_negative_pairs_bonus_interactive_demo": 94, "_fewshot_supervised_learning_bonus_video": 94, "thoroughli": 94, "unlabel": 94, "new_supervised_encod": 94, "nwith": 94, "_use_a_fraction_of_the_labelled_dataset_bonus_interactive_demo": 94, "_advantages_and_disadvantages_of_encoders_bonus_discuss": 94, "w3d4_bonuslectur": 96, "_chelsea_finn_video": 96, "pablo": 97, "samuel": 97, "castro": 97, "xiaomei": 97, "julia": 97, "costacurta": 97, "w3d4_t1": 97, "chronolog": 97, "_intro_to_rl_video": 97, "sutton": 97, "barto": 97, "playground": 97, "app": 97, "_grid_world_video": 97, "ascii_to_emoji": 97, "action_effect": 97, "get_emoji": 97, "gridworldbas": 97, "world_spec": 97, "full_lik": 97, "goal_cel": 97, "get_neighbour": 97, "neighbour": 97, "neighbour_po": 97, "include_polici": 97, "row_rang": 97, "row_char": 97, "gwb": 97, "goal_queu": 97, "goals_don": 97, "goal_neighbour": 97, "gwp": 97, "planer": 97, "_make_a_better_planner_exercis": 97, "harder_grid": 97, "gwb_2": 97, 
"gwp_2": 97, "puterman": 97, "_markov_decision_process_video": 97, "mdpbase": 97, "grid_world": 97, "num_stat": 97, "state_idx": 97, "cell_to_st": 97, "state_to_cel": 97, "goal_stat": 97, "s2": 97, "nbr_state": 97, "nbr_action": 97, "mdpb": 97, "_create_an_mdp_exercis": 97, "_q_values_video": 97, "mdptogo": 97, "computeq": 97, "sxa": 97, "steps_to_go": 97, "mdptg": 97, "_create_a_step_to_go_solver_exercis": 97, "bellman": 97, "backup": 97, "curriculum": 97, "_value_iteration_video": 97, "mdpvalueiter": 97, "error_toler": 97, "num_iter": 97, "new_q": 97, "max_next_q": 97, "_draw_v": 97, "min_v": 97, "max_v": 97, "wall_v": 97, "grid_valu": 97, "get_xaxi": 97, "get_yaxi": 97, "set_clim": 97, "draw_mod": 97, "mdpvi": 97, "_implement_value_iteration_exercis": 97, "_policy_iteration_video": 97, "mdppolicyiter": 97, "findpi": 97, "\u03c0": 97, "new_pi": 97, "next_v": 97, "mdppi": 97, "_implement_policy_iteration_exercis": 97, "mild": 97, "_q_learning_video": 97, "qlearner": 97, "current_st": 97, "new_stat": 97, "pickact": 97, "maybereset": 97, "learnq": 97, "10_000": 97, "base_q_learn": 97, "_implement_q_learning_exercis": 97, "_epsilon_greedy_exploration_video": 97, "qlearnerexplor": 97, "_implement_epsilon_greedy_exploration_exercis": 97, "testb": 97, "greatest": 97, "occasion": 97, "discov": 97, "w3d5_bonuslectur": 99, "_amita_kapoor_video": 99, "mandana": [100, 102], "samiei": [100, 102], "raymond": [100, 102], "chua": [100, 102], "kushaan": [100, 102], "lilicrap": [100, 102], "namrata": [100, 102], "bafna": [100, 102], "coloredlog": [100, 102], "w3d5_t1": 100, "unpickl": [100, 102], "loadtrainexampl": [100, 102], "trainexampleshistori": [100, 102], "modelfil": [100, 102], "examplesfil": [100, 102], "trainexampl": [100, 102], "exit": [100, 102], "save_model_checkpoint": [100, 102], "nnet": [100, 102], "filepath": [100, 102], "load_model_checkpoint": [100, 102], "raymondchua": [100, 102], "nma_rl_gam": [100, 102], "kf4p9": [100, 102], "arena": [100, 102], "mct": 100, 
"othelloplay": [100, 102], "othellolog": [100, 102], "nnetwrapp": [100, 102], "dotdict": [100, 102], "numit": [100, 102], "numep": [100, 102], "tempthreshold": [100, 102], "exploit": [100, 102], "updatethreshold": [100, 102], "playoff": [100, 102], "maxlenofqueu": [100, 102], "nummctssim": [100, 102], "arenacompar": [100, 102], "cpuct": [100, 102], "maxdepth": [100, 102], "nummcsim": [100, 102], "mc_topk": [100, 102], "load_folder_fil": [100, 102], "8x100x50": [100, 102], "numitersfortrainexampleshistori": [100, 102], "_a_game_loop_for_rl_video": 100, "centr": 100, "south": 100, "outflank": 100, "opppon": 100, "voluntarili": 100, "forfeit": 100, "eothello": 100, "6x6": [100, 102], "getinitboard": [100, 102], "getvalidmov": [100, 102], "square_cont": [100, 102], "getsquarepiec": [100, 102], "getboards": [100, 102], "getactions": [100, 102], "getcanonicalform": [100, 102], "stringrepresent": [100, 102], "tobyt": [100, 102], "stringrepresentationread": [100, 102], "board_": [100, 102], "getscor": [100, 102], "countdiff": [100, 102], "displayvalidmov": [100, 102], "getnextst": [100, 102], "execute_mov": [100, 102], "legalmov": [100, 102], "get_legal_mov": [100, 102], "getgameend": [100, 102], "has_legal_mov": [100, 102], "getsymmetri": [100, 102], "pi_board": [100, 102], "newb": [100, 102], "rot90": [100, 102], "newpi": [100, 102], "fliplr": [100, 102], "randomplay": [100, 102], "_implement_a_random_player_excercis": 100, "player1": [100, 102], "player2": [100, 102], "num_gam": [100, 102], "playgam": [100, 102], "nnumber": [100, 102], "win_rate_player1": [100, 102], "nwin": [100, 102], "w3d5_reinforcementlearningforgamesanddlthinking3": 100, "gameresult": 100, "onewon": 100, "curplay": 100, "smarter": 100, "_train_a_value_function_video": 100, "pretrained_model": [100, 102], "loaded_gam": [100, 102], "checkpoint_1": [100, 102], "l_": 100, "othellonet": [100, 102], "board_x": [100, 102], "board_i": [100, 102], "bn4": [100, 102], "fc_bn1": [100, 102], "batchnorm1d": 
[100, 102], "fc_bn2": [100, 102], "fc4": [100, 102], "_implement_othelonn_excercis": 100, "v_loss": [100, 102], "batch_count": [100, 102], "target_v": [100, 102], "loss_v": [100, 102], "out_v": [100, 102], "l_v": [100, 102], "save_checkpoint": [100, 102], "load_checkpoint": [100, 102], "_implement_the_value_network_excercis": 100, "rl_for_gam": 100, "vnet": [100, 102], "_play_games_using_a_value_function_video": 100, "model_save_nam": [100, 102], "valuebasedplay": [100, 102], "max_num_act": [100, 102], "va_list": [100, 102], "negat": 100, "nextboard": [100, 102], "_implement_the_value_based_player_excercis": 100, "_train_a_policy_network_video": 100, "t_i": 100, "output_i": 100, "pi_loss": [100, 102], "target_pi": [100, 102], "loss_pi": [100, 102], "out_pi": [100, 102], "l_pi": [100, 102], "nll": [100, 102], "aspir": [100, 102], "gombru": [100, 102], "2018": [100, 102], "_implement_the_policy_network_exercis": 100, "pnet": [100, 102], "_play_games_using_a_policy_network_video": 100, "probabilit": 100, "sum_vap": [100, 102], "action_prob": [100, 102], "vap": [100, 102], "renorm": [100, 102], "_implement_the_policy_based_player_exercis": 100, "_play_using_monte_carlo_rollouts_video": 100, "recapitul": 100, "ps": [100, 102], "canonicalboard": [100, 102], "temp_v": [100, 102], "init_start_st": [100, 102], "isfirstact": [100, 102], "current_play": [100, 102], "sum_ps_": [100, 102], "nb": [100, 102], "insuffici": [100, 102], "dozen": [100, 102], "next_": [100, 102], "next_play": [100, 102], "_implement_the_monte_carlo_planner_exercis": 100, "_play_with_planning_video": 100, "averg": 100, "s_t": [100, 102], "mc": [100, 102], "mc_model_save_nam": [100, 102], "montecarlobasedplay": [100, 102], "best_act": [100, 102], "avg_valu": [100, 102], "qsa": [100, 102], "num_valid_act": [100, 102], "top_k_act": [100, 102], "argpartit": [100, 102], "getactionprob": [100, 102], "rp": [100, 102], "n1": [100, 102], "maxrollout": [100, 102], "mc1": 100, "n1p": [100, 102], "mc_result": 
[100, 102], "_monte_carlo_simulations_exercis": 100, "vp": [100, 102], "pp": [100, 102], "_unbeatable_opponents_video": 100, "w3d5_t2": 101, "_intro_to_dl_thinking_3_video": 101, "_the_future_video": 101, "curios": 101, "Their": 101, "constitu": 101, "mammal": 101, "flesh": 101, "2201": 101, "07372": 101, "brief": 101, "retrospect": 101, "prospect": 101, "uncertain": 101, "revel": 101, "unmet": 101, "conting": 101, "_the_future_discuss": 101, "_in_context_learning_vignette_video": 101, "llm": 101, "2211": 101, "15561": 101, "2212": 101, "10559": 101, "implicit": 101, "possess": 101, "icl": 101, "theorem": 101, "_in_context_learning_discuss": 101, "_memories_vignette_video": 101, "dinner": 101, "blackboard": 101, "1410": 101, "5401": 101, "1805": 101, "07603": 101, "1703": 101, "03129": 101, "overlook": 101, "intric": 101, "ntm": 101, "enrich": 101, "dnn": 101, "emdqn": 101, "atari": 101, "lifelong": 101, "shot": 101, "seamlessli": 101, "omniglot": 101, "agil": 101, "_memories_discuss": 101, "_multiple_information_sources_vignette_video": 101, "synergist": 101, "gpt": 101, "2302": 101, "14045": 101, "palm": 101, "multimod": 101, "mllm": 101, "ocr": 101, "ccombin": 101, "_multiple_information_sources_discuss": 101, "_language_for_robotics_video": 101, "subproblem": 101, "robotoc": 101, "manipul": 101, "embodi": 101, "rt": 101, "socrat": 101, "outstand": 101, "_language_for_robotics_discuss": 101, "w3d5_t3_bonu": 102, "othello": 102, "othellogam": 102, "othellonnet": 102, "valuenetwork": 102, "policynetwork": 102, "policybasedplay": 102, "montecarlo": 102, "rollout": 102, "oppon": 102, "_plan_with_mcts_video": 102, "puct": 102, "sum_b": 102, "uct": 102, "alorithm": 102, "nsa": 102, "ns": 102, "till": 102, "cur_best": 102, "getnsa": 102, "_mcts_planner_exercis": 102, "_play_with_mcts_video": 102, "mcts_model_save_nam": 102, "montecarlotreesearchbasedplay": 102, "besta": 102, "counts_sum": 102, "versu": 102, "mcts1": 102, "mcts_result": 102, "n2": 102, "n2p": 102, 
"_play_games_mcts_exercis": 102, "john": 103, "butler": 103, "advic": 103, "isabel": 103}, "objects": {}, "objtypes": {}, "objnames": {}, "titleterms": {"prerequisit": 0, "preparatori": 0, "materi": [0, 23], "nma": [0, 19, 28], "deep": [0, 4, 16, 26, 33, 44, 45, 57, 58, 59, 62, 64, 65, 74, 76, 91, 101], "learn": [0, 4, 5, 8, 16, 24, 26, 27, 28, 33, 44, 45, 57, 58, 59, 61, 62, 64, 70, 74, 76, 81, 84, 88, 91, 92, 94, 95, 96, 97, 98, 100, 101], "prepar": [0, 3, 5, 20, 43, 76, 77], "yourself": 0, "cours": [0, 46, 57], "program": 0, "math": [0, 81], "skill": 0, "comput": [1, 60, 67, 94], "vision": [1, 16], "data": [2, 3, 5, 7, 8, 11, 15, 16, 17, 18, 20, 21, 30, 33, 35, 39, 57, 65, 67, 69, 70, 73, 76, 77, 81, 84, 85, 91, 94, 100], "augment": [2, 17, 57, 70, 73, 85], "imag": [2, 3, 4, 5, 16, 18, 43, 57, 67, 73, 76, 77, 80, 82, 94], "classif": [2, 5, 7, 43, 64, 67, 76, 84, 85, 94], "model": [2, 3, 8, 16, 17, 23, 25, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 64, 65, 67, 70, 73, 76, 78, 80, 81, 82, 84, 85, 88, 94], "object": [2, 3, 5, 8, 11, 16, 17, 20, 21, 25, 27, 28, 33, 35, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "setup": [2, 3, 5, 7, 8, 11, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "instal": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "depend": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 43, 57, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "set": [2, 5, 8, 17, 18, 20, 21, 25, 28, 30, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 102], "random": [2, 8, 25, 28, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 
100, 102], "seed": [2, 8, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "devic": [2, 5, 8, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "gpu": [2, 8, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "cpu": [2, 8, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "train": [2, 3, 7, 8, 11, 16, 17, 18, 20, 28, 39, 40, 43, 57, 60, 61, 62, 64, 67, 69, 70, 73, 76, 77, 82, 84, 85, 89, 94, 100, 102], "hyperparamet": [2, 8, 61, 70], "cutout": 2, "mixup": 2, "dataset": [2, 3, 4, 7, 8, 10, 12, 15, 18, 19, 25, 57, 60, 64, 65, 67, 69, 70, 73, 77, 80, 84, 85, 87, 88, 94], "cifar": [2, 8], "10": [2, 33, 34, 35, 38, 43, 57, 61, 62, 74, 91, 94], "loader": [2, 8, 65, 70], "visual": [2, 15, 16, 18, 20, 57, 62, 69, 70, 73, 77, 81, 87, 94], "architectur": [2, 8, 17, 21, 80, 82, 84, 88, 91, 100], "resnet": [2, 8, 76], "test": [2, 8, 16, 17, 18, 21, 25, 38, 39, 40, 43, 57, 69, 73, 81, 82], "loss": [2, 3, 8, 18, 61, 64, 67, 69, 81, 82, 94, 100], "function": [2, 5, 7, 8, 15, 16, 17, 21, 27, 28, 33, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "optim": [2, 3, 8, 66, 67, 76], "loop": [2, 8, 16, 25, 60, 73, 76, 100], "auxiliari": [2, 8], "knowledg": [3, 35], "extract": 3, "from": [3, 4, 8, 20, 28, 43, 57, 62, 73, 76, 80, 82, 84, 85, 87, 94, 97, 100, 102], "convolut": [3, 16, 73, 76, 80], "neural": [3, 4, 12, 16, 17, 20, 43, 57, 60, 61, 62, 64, 74, 81, 82, 87, 100], "network": [3, 4, 5, 12, 16, 17, 28, 43, 57, 61, 62, 64, 65, 67, 70, 76, 77, 81, 82, 94, 100, 102], "project": [3, 12, 23, 27, 31, 32, 33, 35, 36, 39, 40, 46, 52, 94], "idea": [3, 4, 10, 19, 26, 27, 88], "acknowledg": [3, 7], "an": [3, 57, 64, 67, 73, 74, 85, 94, 97, 102], "classifi": [3, 57, 94], "download": [3, 16, 17, 18, 21, 65, 67, 73, 76, 77, 80, 82, 84, 85, 100, 102], "creat": [3, 16, 
17, 28, 43, 57, 87, 97, 100], "inspect": [3, 15, 28, 82], "gan": 3, "translat": [3, 11], "get": [3, 16, 31, 57, 73, 77, 91], "cyclegan": 3, "code": [3, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 94, 97, 100, 102], "biolog": [4, 64], "analysi": [4, 12, 17, 27, 62], "us": [4, 5, 8, 15, 16, 18, 25, 28, 43, 50, 51, 53, 54, 57, 69, 76, 77, 81, 84, 87, 88, 94, 100, 102], "featur": [4, 65, 76, 88, 94], "predict": [4, 11, 18, 20, 74, 84, 85], "human": [4, 25], "behavior": 4, "leakag": 4, "gradient": [4, 60, 61, 67, 70], "flow": 4, "base": [4, 27, 28, 81, 91, 100, 102], "vae": [4, 80, 94], "self": [4, 92, 94, 96], "supervis": [4, 92, 94, 96], "graph": [4, 60], "someth": 5, "screwi": 5, "recognit": [5, 77], "detect": [5, 73, 91], "screw": 5, "helper": [5, 7, 16, 17, 21, 57, 61, 62, 65, 67, 70, 73, 80, 82, 84, 85, 87, 89, 94, 100, 102], "choos": [5, 12, 61], "figur": [5, 20, 25, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "load": [5, 7, 8, 12, 15, 17, 18, 57, 69, 70, 73, 77, 84, 85, 87, 94, 100, 102], "let": [5, 28], "s": [5, 12, 28, 60, 64], "check": [5, 8, 76, 84, 85, 87], "out": [5, 76, 97], "some": [5, 20, 28, 57, 67, 77], "up": [5, 8, 18, 44, 60, 61, 74, 76, 80, 85, 91], "our": [5, 57], "first": [5, 16, 43, 65], "challeng": [5, 76], "damag": 5, "multi": [5, 26, 63, 84], "class": [5, 28, 76, 81, 91], "perform": [5, 17, 27, 84, 94], "introspect": 5, "orient": [5, 103], "bound": 5, "box": 5, "cluster": 5, "perspect": 5, "scale": 5, "transfer": [5, 8, 26, 27, 64, 65, 76], "link": [5, 15, 31, 52], "slide": [6, 13, 22, 29], "music": 7, "gener": [7, 11, 35, 39, 40, 46, 57, 60, 62, 64, 65, 69, 70, 73, 78, 80, 82, 84, 88, 94], "spectrogram": 7, "thi": [7, 62, 67, 77, 84], "notebook": 7, "gtzan": 7, "includ": 7, "have": 7, "look": [7, 16, 28], "simpl": [7, 57, 61, 73, 87, 97], "cnn": [7, 18, 73, 76, 77], "run": [7, 15, 25, 28, 46, 62, 67, 69, 70, 73, 76, 77, 94], "me": [7, 67, 69, 70, 73], "sourc": [8, 101], "100": 8, 
"dataload": [8, 16, 18, 57, 65, 70, 73], "pytorch": [8, 12, 43, 56, 57, 60, 64, 73], "re": [8, 62], "improv": [8, 76], "differ": [8, 64, 67, 76, 94], "delet": 8, "variabl": [8, 57, 80, 84, 85, 94], "previou": [8, 102], "target": [8, 94], "select": [8, 28, 33, 37, 39, 40, 80], "subset": 8, "pre": [8, 43, 77, 85, 94], "freez": 8, "paramet": [8, 65, 67, 73, 76], "unfreez": 8, "last": [8, 73], "layer": [8, 16, 21, 63, 65, 76], "number": [8, 57, 65, 73, 76, 94], "plot": [8, 16, 18, 20, 28, 33, 35, 39, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 80, 81, 94], "result": [8, 31, 76], "natur": [9, 86, 88], "languag": [9, 11, 84, 85, 86, 88, 101], "process": [9, 57, 73, 81, 85, 86, 87, 88, 97], "machin": [11, 57], "repres": [11, 94], "word": [11, 84, 87], "distribut": [11, 21, 76, 87], "The": [11, 28, 44, 57, 61, 64, 65, 73, 76, 80, 81, 88, 94, 101], "rnn": [11, 20, 73], "text": [11, 57, 85], "To": 11, "do": [11, 73, 76, 88, 91, 94], "further": [11, 76], "read": [11, 34, 76], "twitter": 12, "sentiment": [12, 85], "welcom": [12, 57], "nlp": [12, 45, 87, 88], "templat": [12, 23, 31, 32, 43], "step": [12, 23, 33, 34, 35, 36, 37, 38, 41, 54, 97], "1": [12, 27, 28, 31, 33, 35, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "question": [12, 31, 33, 35, 39, 40, 57], "goal": [12, 88], "2": [12, 27, 28, 33, 35, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101, 102], "literatur": [12, 33, 35], "review": [12, 33, 35, 85], "3": [12, 27, 28, 33, 35, 36, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101], "explor": [12, 27, 64, 73, 87, 94, 97], "4": [12, 27, 28, 33, 36, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101], "toolkit": [12, 31, 33, 37, 39, 40], "logist": [12, 31, 94], "regress": [12, 18, 62, 94], "explain": 12, "ai": [12, 59, 65, 93], 
"recurr": 12, "what": [12, 20, 64, 73, 76, 81, 87, 91, 94, 97], "next": 12, "neurosci": [14, 73], "algonaut": 15, "video": [15, 27, 28, 34, 35, 36, 37, 38, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 91, 93, 94, 96, 97, 99, 100, 101, 102, 103], "term": 15, "algonauts2021": 15, "enter": [15, 84], "dropbox": 15, "cell": [15, 17, 62, 67, 77], "util": [15, 81], "fmri": 15, "dimens": [15, 94], "correspond": 15, "brain": [15, 16, 17, 19, 73, 79, 91], "respons": [15, 18, 94], "refer": [15, 27, 57, 60], "lost": 16, "glass": 16, "how": [16, 23, 57, 67, 73, 74, 76, 84, 94], "deal": 16, "noisi": 16, "input": [16, 76], "clean": 16, "defin": [16, 17, 20, 25, 36, 43, 57, 69, 81, 82, 100], "preprocess": [16, 77], "pipelin": [16, 88], "exampl": [16, 18, 28, 33, 35, 36, 39, 40, 76, 77, 81, 94, 97], "nois": 16, "free": 16, "ventral": 16, "stream": 16, "alexnet": [16, 18, 76], "2012": 16, "batch": [16, 64, 67], "normal": [16, 17], "downscal": 16, "factor": 16, "tensorboard": 16, "accuraci": [16, 76, 88], "calcul": [16, 76, 77, 94], "hypothesi": [16, 36], "naiv": 16, "learner": 16, "expert": [16, 100], "experienc": 16, "kernel": [16, 73], "16": [16, 57], "filter": [16, 73, 76], "intermedi": 16, "output": [16, 73], "segment": 17, "denois": [17, 81], "intro": [17, 62, 74, 81, 87, 88, 91, 97, 101], "activ": [17, 18, 74], "neuron": [17, 20, 64, 74], "dish": 17, "transform": [17, 57, 77, 83, 84, 85, 94], "u": [17, 82], "net": [17, 60, 61, 62, 82, 87], "threshold": [17, 40], "find": [17, 35, 57, 76, 84], "move": 18, "beyond": [18, 82, 84], "label": [18, 69, 76, 94], "finetun": [18, 76], "bold": 18, "kai": 18, "structur": [18, 23, 73], "fine": [18, 44, 76, 85, 88], "tune": [18, 44, 70, 72, 76, 85, 88], "voxel": 18, "loc": 18, "region": 18, "custom": [18, 81], "numpi": [18, 57], "arrai": 18, "dissimilar": 18, "correl": [18, 62, 76], "between": [18, 87], "observ": [18, 88, 100], "valu": [18, 62, 84, 94, 97, 100, 102], "curat": 19, "crcn": 19, 
"janelia": 19, "figshar": 19, "other": [19, 26, 28, 84], "score": [19, 81, 82], "allen": 19, "observatori": 19, "bciaut": 19, "p300": 19, "focu": 20, "matter": [20, 61, 94], "infer": 20, "low": 20, "dimension": [20, 67], "dynam": [20, 43], "record": 20, "simul": [20, 64, 100], "linear": [20, 58, 61, 62, 80], "system": [20, 27], "compar": [20, 67, 76, 81, 94, 100], "true": 20, "fire": [20, 64], "rate": [20, 61, 70], "view": [20, 77], "all": [20, 28, 67, 69, 73, 94], "one": [20, 28, 57], "trial": 20, "latent": [20, 80, 94], "anim": [21, 65, 69, 70], "pose": 21, "estim": 21, "mount": [21, 28], "your": [21, 35, 36, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "gdrive": 21, "visula": 21, "evalu": [21, 33, 38, 39, 40, 64, 67, 94], "error": [21, 80], "final": [21, 31, 39, 40, 46], "introduct": [23, 27, 35, 43, 60, 64, 67, 69, 73, 76, 80, 84, 87, 94, 100, 103], "daili": [23, 31, 46, 57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "schedul": [23, 31, 46, 47], "ten": 23, "import": [23, 28, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "deadlin": 23, "reinforc": [24, 26, 28, 95, 97, 98, 100], "rl": [25, 27, 97, 99, 100], "cognit": 25, "task": [25, 27, 28, 64, 94], "background": [25, 35, 39, 40], "n": 25, "back": 25, "environ": [25, 27, 28, 43, 84, 85], "implement": [25, 27, 31, 33, 38, 39, 40, 61, 64, 67, 70, 73, 81, 84, 88, 97, 100], "scheme": 25, "agent": [25, 26, 28, 100, 102], "initi": [25, 61, 62, 65], "traffic": 26, "signal": [26, 40], "control": [26, 28], "resourc": [26, 57], "dqn": 27, "algorithm": [27, 60, 97], "lunar": 27, "lander": 27, "updat": [27, 67], "upgrad": 27, "lib": 27, "plai": [27, 57, 88, 100, 102], "basic": [27, 36, 44, 56, 57, 95, 97], "addit": [27, 28], "exploit": [27, 97], "trade": [27, 76], "off": [27, 76], "reward": [27, 28], "shape": 27, "identifi": 
27, "state": [27, 35, 80, 88], "inform": [27, 101], "crucial": 27, "its": [27, 74], "extens": [27, 85], "atari": 27, "game": [27, 98, 100, 102], "5": [27, 33, 36, 37, 43, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 80, 84, 87, 88, 91, 94, 97, 100, 101], "obstacl": 27, "avoid": [27, 85, 94], "b": 27, "minigrid": 27, "6": [27, 33, 37, 43, 57, 60, 61, 62, 67, 70, 73, 74, 76, 84, 87, 88, 91, 94, 97, 100, 101], "prefer": 27, "pbrl": 27, "robolymp": 28, "robot": [28, 96, 101], "colab": [28, 53, 57], "limit": 28, "pybullet": 28, "locomot": 28, "save": [28, 76], "restor": 28, "checkpoint": 28, "befor": [28, 69], "runtim": 28, "restart": 28, "after": [28, 69, 73, 76], "conveni": 28, "factori": 28, "method": [28, 57, 67, 94], "continu": 28, "list": 28, "modifi": [28, 43], "instanti": 28, "default": [28, 70, 97], "bit": 28, "frame": 28, "action": 28, "take": 28, "properti": 28, "dm": 28, "acm": 28, "d4pg": 28, "examin": [28, 94], "polici": [28, 52, 97, 100, 102], "total": 28, "googl": [28, 53, 57], "drive": 28, "temporarili": 28, "option": [28, 60, 61, 62, 94], "unmount": 28, "two": 28, "dmpo": 28, "ddpg": 28, "good": [28, 65, 73, 88, 94], "luck": 28, "search": [30, 57, 67, 102], "metadataset": 30, "guid": [31, 41], "summari": [31, 39, 40, 43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100, 101, 102], "submiss": 31, "ta": 31, "mentor": 31, "week": [31, 57, 73], "start": 31, "w1d4": [31, 46], "dai": [31, 46], "w1d5": 31, "w2d1": 31, "3h": 31, "w2d2": 31, "w3d1": 31, "w3d2": [31, 46], "half": [31, 46], "w3d3": 31, "w3d4": 31, "w3d5": [31, 46], "present": [31, 43, 84], "content": 31, "retriev": 33, "ingredi": [33, 36, 39, 40], "hypothes": [33, 36, 39, 40], "draft": [33, 37, 39, 40], "7": [33, 37, 38, 43, 57, 61, 62, 67, 73, 74, 76, 84, 91, 94, 97, 100], "build": [33, 57, 64, 80], "8": [33, 38, 43, 46, 57, 61, 62, 67, 73, 74, 76, 84, 91, 94, 97, 100], "complet": [33, 38, 39, 40, 94], "9": [33, 38, 43, 57, 61, 62, 67, 74, 76, 84, 91, 94, 
100], "public": 33, "publish": 34, "11": [34, 43, 57, 61, 94], "write": [34, 73], "abstract": [34, 40, 93], "paper": 34, "guidanc": 34, "suggest": 34, "0": [35, 60, 62, 64, 69, 73, 94, 100], "overview": [35, 46, 70, 84], "demo": [35, 57, 61, 62, 64, 67, 73, 76, 80, 81, 82, 84, 85, 87, 94], "disclaim": 35, "tutori": [35, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "phenomenon": [35, 39, 40], "ask": 35, "about": [35, 73, 91, 94], "own": [35, 67, 73, 84], "understand": [35, 73, 81, 85], "art": [35, 80, 88], "determin": [36, 76], "formul": 36, "specif": [36, 46, 97], "mathemat": 36, "plan": [37, 100, 102], "ethic": [38, 39, 57, 65, 67, 77, 82, 84, 94, 100], "illus": [39, 40], "media": 39, "thought": [39, 40], "vestibular": 40, "integr": [40, 64], "ddm": 40, "mechan": 40, "decis": [40, 57, 97], "assembl": 40, "deploi": [42, 43], "bonu": [43, 57, 59, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 85, 88, 89, 91, 93, 94, 96, 99, 102], "web": 43, "feedback": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "gadget": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "flask": 43, "ngrok": 43, "packag": 43, "which": 43, "doesn": 43, "t": [43, 91, 94], "work": [43, 57, 67, 84, 94], "latest": 43, "version": 43, "section": [43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 89, 91, 94, 97, 100, 101, 102], "submit": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "app": 43, "jinja2": 43, "jinja": 43, "appli": [43, 94], "mvvm": 43, "design": [43, 74, 91], "pattern": [43, 94], "rest": 43, "api": 43, "vue": 43, "js": 43, "serv": 43, "applic": [43, 84, 99], "heroku": 43, "python": 43, "12": [43, 
57, 61], "local": [43, 67], "deploy": 43, "13": [43, 57], "14": [43, 57], "15": [43, 57], "wrap": [44, 60, 61, 74, 80, 91], "podcast": [44, 45], "panel": [44, 45, 46], "discuss": [44, 45, 60, 61, 62, 67, 94], "convnet": [45, 71, 75, 76, 77], "2024": 46, "juli": 46, "26": 46, "coursework": [46, 52], "time": [46, 82, 86, 87], "propos": 46, "dl": [46, 57, 71, 74, 90, 91, 98, 101], "think": [46, 64, 65, 67, 69, 70, 71, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 101], "profession": 46, "develop": 46, "share": [48, 91], "calendar": 48, "timezon": 49, "widget": [49, 57, 61, 62], "discord": 50, "jupyterbook": 51, "quick": 52, "attend": 52, "advic": 53, "kaggl": 54, "technic": 55, "help": [55, 73], "And": [56, 57, 71, 83, 86, 92, 98], "neuromatch": 57, "histori": [57, 76, 97], "why": [57, 61, 65, 94], "cool": 57, "tensor": 57, "make": [57, 62, 73], "like": [57, 94], "rang": 57, "exercis": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 87, 88, 94, 97, 100, 102], "oper": 57, "manipul": [57, 87], "index": 57, "vs": [57, 61, 62, 64, 65, 67, 70, 73, 76, 80, 81], "just": 57, "much": [57, 73], "faster": 57, "ar": [57, 65, 67, 87, 94], "displai": [57, 76, 77], "cifar10": 57, "grayscal": 57, "csv": 57, "file": 57, "sampl": [57, 60, 62, 67, 73, 80, 81, 82, 100], "boundari": 57, "tweak": 57, "xor": 57, "interact": [57, 61, 62, 64, 67, 73, 76, 80, 81, 82, 84, 85, 87, 94], "solv": 57, "info": [57, 91], "Be": 57, "group": 57, "17": 57, "syllabu": 57, "meet": 57, "lectur": [57, 59, 72, 79, 93, 96, 99], "block": 57, "thing": 57, "more": [57, 60, 91, 94], "magic": 57, "survei": [57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "60": 57, "year": 57, "research": [57, 99], "altair": 57, "vega_dataset": 57, "author": 57, "edit": 57, "author_filt": 57, "full": [57, 67, 94], "appendix": 57, "offici": 57, "document": 57, "book": 57, "yoshua": 59, "bengio": 59, "descent": [60, 67, 70], "autograd": 60, "execut": [60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 
85, 87, 88, 89, 94, 100, 102], "set_devic": [60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "steepest": 60, "ascent": 60, "analyt": [60, 61, 62], "vector": [60, 62, 77, 87], "backprop": 60, "chain": 60, "rule": [60, 87], "auto": [60, 80], "differenti": 60, "forward": [60, 87], "propag": 60, "buid": 60, "backward": 60, "modul": [60, 80, 82, 100, 102], "nn": [60, 70], "A": [61, 97, 100], "shallow": 61, "narrow": 61, "solut": [61, 62], "lnn": [61, 62], "landscap": 61, "depth": [61, 67], "effect": [61, 73, 94], "prelud": 62, "represent": [62, 73, 94], "tree": [62, 102], "sure": 62, "you": [62, 73], "enabl": [62, 81, 94], "singular": 62, "decomposit": 62, "svd": 62, "similar": [62, 64, 76, 77, 87, 94], "rsa": 62, "illusori": 62, "demonstr": [62, 73], "outro": [62, 65, 100], "lr": 62, "dlnn": 62, "perceptron": 63, "artifici": 64, "mlp": [64, 65, 67], "need": [64, 65], "univers": 64, "approxim": 64, "theorem": 64, "relu": [64, 65, 73], "purpos": [64, 76], "cross": 64, "entropi": 64, "spiral": 64, "classfic": 64, "point": 64, "eval": 64, "doe": [64, 65, 69, 73, 76, 81, 84, 94], "well": [64, 65, 84, 94], "physiolog": 64, "motiv": [64, 73], "leaki": [64, 65], "lif": 64, "r_m": 64, "tau_": 64, "ref": 64, "real": [64, 65], "face": [65, 69, 70, 74, 77], "wider": 65, "deeper": 65, "express": 65, "wide": 65, "while": 65, "keep": 65, "same": [65, 80], "tradeoff": 65, "where": 65, "fail": [65, 94], "case": [65, 67], "studi": [65, 67], "world": [65, 97], "high": [65, 67], "level": 65, "aspect": [65, 77, 84, 100], "hype": 65, "xavier": 65, "best": [65, 76], "gain": 65, "techniqu": [67, 69, 70], "unexpect": 67, "consequ": [67, 94], "successfulli": 67, "mnist": [67, 73, 82], "interpret": [67, 69], "poor": 67, "condit": [67, 82, 94], "momentum": 67, "gd": 67, "oscil": 67, "non": [67, 74], "convex": 67, "overparameter": [67, 69, 70], "rescu": 67, "width": 67, "expens": 67, "mini": 67, "cost": [67, 74], "minibatch": 67, "size": [67, 73], 
"adapt": 67, "rmsprop": 67, "concern": 67, "put": [67, 73], "togeth": [67, 69, 73], "benchmark": 67, "metric": [67, 88], "regular": [68, 69, 70, 73], "part": [69, 70], "shrinkag": 69, "frobeniu": 69, "norm": [69, 70], "overfit": [69, 73], "valid": 69, "memor": 69, "animalnet": 69, "earli": 69, "stop": 69, "them": 69, "l1": 70, "l2": 70, "unregular": 70, "ridg": 70, "dropout": [70, 73], "caveat": 70, "without": [70, 94], "small": 70, "stochast": 70, "sgd": 70, "adversari": 70, "attack": 70, "kyunghyun": 72, "cho": 72, "onlin": 72, "hyperparmet": 72, "kynghyun": 72, "recap": 73, "experi": 73, "param": 73, "edg": 73, "detail": 73, "definit": 73, "note": [73, 81], "chicago": 73, "skylin": 73, "pad": 73, "stride": 73, "pool": 73, "subsampl": 73, "emnist": 73, "multipl": [73, 101], "see": [73, 76], "would": 73, "recogn": 73, "x": 73, "maxpool": 73, "fulli": 73, "connect": 73, "revisit": 73, "fashion": 73, "backpropag": 73, "remind": 73, "symptom": 73, "cure": 73, "ad": 73, "ha": 73, "been": 73, "spike": 74, "vignett": [74, 91, 101], "poisson": 74, "can": [74, 94], "ann": 74, "know": 74, "uncertainti": 74, "so": 74, "we": [74, 94], "measur": 74, "neg": [74, 85, 94], "standard": 74, "deviat": 74, "embed": [74, 77, 82, 87, 89], "modern": [75, 76, 77, 94], "fcnn": 76, "big": 76, "vgg": 76, "residu": 76, "imagenett": 76, "textual": 76, "imagenet": 76, "map": [76, 103], "eval_imagenett": 76, "incept": 76, "resnext": 76, "effici": 76, "depthwis": 76, "separ": 76, "mobilenet": 76, "64": 76, "onli": [76, 94], "readout": 76, "scratch": 76, "head": [76, 84], "comparison": [76, 80, 100], "pretrain": [76, 77], "outlook": 76, "speed": 76, "backbon": 76, "train_loop": 76, "train_load": 76, "loss_fn": 76, "run_model": 76, "lr_rate": 76, "facial": 77, "bia": 77, "discrimin": 77, "due": 77, "pairwis": 77, "distanc": 77, "within": 77, "sum": 77, "squar": 77, "wss": 77, "geoffrei": 79, "hinton": 79, "distil": 79, "variat": [80, 94], "autoencod": [80, 94], "pleas": 80, "ignor": 80, "warn": 
80, "dure": 80, "wordnet": 80, "biggan": 80, "interpol": 80, "categori": 80, "ppca": 80, "conceptu": 80, "pca": 80, "autoencond": 80, "nonlinear": 80, "fill": 80, "convautoencod": 80, "encod": [80, 84, 94], "compon": 80, "novel": 80, "decod": [80, 84, 94], "Of": 80, "diffus": [81, 82], "principl": [81, 94], "behind": [81, 84], "1d": 81, "2d": 81, "gaussian": 81, "mixtur": 81, "log": 81, "densiti": 81, "each": [81, 94], "mode": 81, "individu": 81, "tell": 81, "revers": 81, "match": 81, "unet": 82, "sampler": 82, "advanc": 82, "techinqu": 82, "stabl": 82, "consider": [82, 94], "copyright": 82, "imageri": 82, "attent": [83, 84], "nltk": [84, 87], "punkt": 84, "averaged_perceptron_tagg": 84, "brown": 84, "webtext": 84, "yelp": [84, 85], "load_yelp_data": 84, "token": [84, 87, 88], "bert": [84, 88], "infil": 84, "queri": 84, "kei": 84, "intut": 84, "corpu": 84, "dot": 84, "product": 84, "multihead": 84, "q": [84, 97], "k": 84, "v": 84, "i": [84, 94], "ii": 84, "complex": 84, "posit": [84, 85], "positionalencod": 84, "bias": [84, 94], "probabl": 84, "mask": 84, "problem": [84, 97], "approach": [84, 94], "hint": 84, "field": 84, "robust": 85, "gpt": [85, 88], "osf": 85, "context": [85, 101], "extend": 85, "binari": 85, "likelihood": 85, "light": 85, "weight": 85, "break": 85, "origin": 85, "textattack": 85, "issu": 85, "seri": [86, 87], "fasttext": [87, 89], "homonym": 87, "analog": [87, 93], "feed": 87, "llm": 88, "Is": 88, "pre_token": 88, "practic": 88, "chines": 88, "english": 88, "nlg": 88, "sota": 88, "todai": 88, "tomorrow": 88, "conclus": [88, 94], "around": 88, "larg": 88, "multilingu": 89, "thinking2": 90, "multimod": 91, "strategi": 91, "tumor": 91, "still": 91, "isn": 91, "enough": 91, "forrest": 91, "gump": 91, "pull": 91, "unsupervis": 92, "melani": 93, "mitchel": 93, "un": 94, "allow": 94, "independ": 94, "introduc": 94, "dsprite": 94, "schemat": 94, "directli": 94, "along": 94, "induc": 94, "invari": 94, "matric": 94, "rsm": 94, "reveal": 94, "support": 
94, "don": 94, "potenti": 94, "versu": [94, 100], "produc": 94, "conclud": 94, "reconstruct": 94, "organ": 94, "abil": 94, "construct": 94, "meaning": 94, "space": 94, "few": 94, "avail": 94, "could": 94, "ssl": 94, "simclr": 94, "wa": 94, "cope": 94, "contrast": 94, "collaps": 94, "reduc": 94, "histogram": 94, "pair": 94, "shot": 94, "benefit": 94, "short": 94, "scenario": 94, "e": 94, "when": 94, "fraction": 94, "advantag": 94, "disadvantag": 94, "type": 94, "under": 94, "variou": 94, "chealsea": 96, "finn": 96, "chelsea": 96, "grid": 97, "shortest": 97, "path": 97, "planner": [97, 100, 102], "gridworld": 97, "gridworldplann": 97, "try": 97, "harder": 97, "markov": 97, "mdp": 97, "go": 97, "solver": 97, "iter": 97, "epsilon": 97, "greedi": [97, 100], "For": [98, 100], "thinking3": 98, "amita": 99, "kapoor": 99, "futur": [99, 101], "othellogam": 100, "player": [100, 102], "compet": 100, "othello": 100, "othellonnet": 100, "valuenetwork": 100, "mse": 100, "progress": 100, "policynetwork": 100, "policybasedplay": 100, "mont": [100, 102], "carlo": [100, 102], "rollout": 100, "montecarlo": 100, "against": [100, 102], "unbeat": 100, "oppon": 100, "19": 100, "In": 101, "memori": 101, "mct": 102, "concept": 103}, "envversion": {"sphinx.domains.c": 2, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 6, "sphinx.domains.index": 1, "sphinx.domains.javascript": 2, "sphinx.domains.math": 2, "sphinx.domains.python": 3, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx": 56}}) \ No newline at end of file +Search.setIndex({"docnames": ["prereqs/DeepLearning", "projects/ComputerVision/README", "projects/ComputerVision/data_augmentation", "projects/ComputerVision/em_synapses", "projects/ComputerVision/ideas_and_datasets", "projects/ComputerVision/screws", "projects/ComputerVision/slides", "projects/ComputerVision/spectrogram_analysis", "projects/ComputerVision/transfer_learning", 
"projects/NaturalLanguageProcessing/README", "projects/NaturalLanguageProcessing/ideas_and_datasets", "projects/NaturalLanguageProcessing/machine_translation", "projects/NaturalLanguageProcessing/sentiment_analysis", "projects/NaturalLanguageProcessing/slides", "projects/Neuroscience/README", "projects/Neuroscience/algonauts_videos", "projects/Neuroscience/blurry_vision", "projects/Neuroscience/cellular_segmentation", "projects/Neuroscience/finetuning_fmri", "projects/Neuroscience/ideas_and_datasets", "projects/Neuroscience/neuro_seq_to_seq", "projects/Neuroscience/pose_estimation", "projects/Neuroscience/slides", "projects/README", "projects/ReinforcementLearning/README", "projects/ReinforcementLearning/human_rl", "projects/ReinforcementLearning/ideas_and_datasets", "projects/ReinforcementLearning/lunar_lander", "projects/ReinforcementLearning/robolympics", "projects/ReinforcementLearning/slides", "projects/docs/datasets_and_models", "projects/docs/project_guidance", "projects/docs/projects_overview", "projects/modelingsteps/Example_Deep_Learning_Project", "projects/modelingsteps/ModelingSteps_10_DL", "projects/modelingsteps/ModelingSteps_1through2_DL", "projects/modelingsteps/ModelingSteps_3through4_DL", "projects/modelingsteps/ModelingSteps_5through6_DL", "projects/modelingsteps/ModelingSteps_7through9_DL", "projects/modelingsteps/TrainIllusionDataProjectDL", "projects/modelingsteps/TrainIllusionModelingProjectDL", "projects/modelingsteps/intro", "tutorials/Bonus_DeployModels/chapter_title", "tutorials/Bonus_DeployModels/student/Bonus_Tutorial1", "tutorials/Module_WrapUps/FineTuning", "tutorials/Module_WrapUps/NaturalLanguageProcessing", "tutorials/Schedule/daily_schedules", "tutorials/Schedule/schedule_intro", "tutorials/Schedule/shared_calendars", "tutorials/Schedule/timezone_widget", "tutorials/TechnicalHelp/Discord", "tutorials/TechnicalHelp/Jupyterbook", "tutorials/TechnicalHelp/Links_Policy", "tutorials/TechnicalHelp/Tutorial_colab", 
"tutorials/TechnicalHelp/Tutorial_kaggle", "tutorials/TechnicalHelp/tech_intro", "tutorials/W1D1_BasicsAndPytorch/chapter_title", "tutorials/W1D1_BasicsAndPytorch/student/W1D1_Tutorial1", "tutorials/W1D2_LinearDeepLearning/chapter_title", "tutorials/W1D2_LinearDeepLearning/student/W1D2_BonusLecture", "tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial1", "tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial2", "tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial3", "tutorials/W1D3_MultiLayerPerceptrons/chapter_title", "tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial1", "tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial2", "tutorials/W1D5_Optimization/chapter_title", "tutorials/W1D5_Optimization/student/W1D5_Tutorial1", "tutorials/W2D1_Regularization/chapter_title", "tutorials/W2D1_Regularization/student/W2D1_Tutorial1", "tutorials/W2D1_Regularization/student/W2D1_Tutorial2", "tutorials/W2D2_ConvnetsAndDlThinking/chapter_title", "tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_BonusLecture", "tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial1", "tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial2", "tutorials/W2D3_ModernConvnets/chapter_title", "tutorials/W2D3_ModernConvnets/student/W2D3_Tutorial1", "tutorials/W2D3_ModernConvnets/student/W2D3_Tutorial2", "tutorials/W2D4_GenerativeModels/chapter_title", "tutorials/W2D4_GenerativeModels/student/W2D4_BonusLecture", "tutorials/W2D4_GenerativeModels/student/W2D4_Tutorial1", "tutorials/W2D4_GenerativeModels/student/W2D4_Tutorial2", "tutorials/W2D4_GenerativeModels/student/W2D4_Tutorial3", "tutorials/W2D5_AttentionAndTransformers/chapter_title", "tutorials/W2D5_AttentionAndTransformers/student/W2D5_Tutorial1", "tutorials/W2D5_AttentionAndTransformers/student/W2D5_Tutorial2", "tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/chapter_title", "tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial1", 
"tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial2", "tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial3", "tutorials/W3D2_DlThinking2/chapter_title", "tutorials/W3D2_DlThinking2/student/W3D2_Tutorial1", "tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/chapter_title", "tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/student/W3D3_BonusLecture", "tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/student/W3D3_Tutorial1", "tutorials/W3D4_BasicReinforcementLearning/chapter_title", "tutorials/W3D4_BasicReinforcementLearning/student/W3D4_BonusLecture", "tutorials/W3D4_BasicReinforcementLearning/student/W3D4_Tutorial1", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/chapter_title", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_BonusLecture", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_Tutorial1", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_Tutorial2", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_Tutorial3", "tutorials/intro"], "filenames": ["prereqs/DeepLearning.md", "projects/ComputerVision/README.md", "projects/ComputerVision/data_augmentation.ipynb", "projects/ComputerVision/em_synapses.ipynb", "projects/ComputerVision/ideas_and_datasets.md", "projects/ComputerVision/screws.ipynb", "projects/ComputerVision/slides.md", "projects/ComputerVision/spectrogram_analysis.ipynb", "projects/ComputerVision/transfer_learning.ipynb", "projects/NaturalLanguageProcessing/README.md", "projects/NaturalLanguageProcessing/ideas_and_datasets.md", "projects/NaturalLanguageProcessing/machine_translation.ipynb", "projects/NaturalLanguageProcessing/sentiment_analysis.ipynb", "projects/NaturalLanguageProcessing/slides.md", "projects/Neuroscience/README.md", "projects/Neuroscience/algonauts_videos.ipynb", "projects/Neuroscience/blurry_vision.ipynb", "projects/Neuroscience/cellular_segmentation.ipynb", 
"projects/Neuroscience/finetuning_fmri.ipynb", "projects/Neuroscience/ideas_and_datasets.md", "projects/Neuroscience/neuro_seq_to_seq.ipynb", "projects/Neuroscience/pose_estimation.ipynb", "projects/Neuroscience/slides.md", "projects/README.md", "projects/ReinforcementLearning/README.md", "projects/ReinforcementLearning/human_rl.ipynb", "projects/ReinforcementLearning/ideas_and_datasets.md", "projects/ReinforcementLearning/lunar_lander.ipynb", "projects/ReinforcementLearning/robolympics.ipynb", "projects/ReinforcementLearning/slides.md", "projects/docs/datasets_and_models.md", "projects/docs/project_guidance.md", "projects/docs/projects_overview.md", "projects/modelingsteps/Example_Deep_Learning_Project.ipynb", "projects/modelingsteps/ModelingSteps_10_DL.ipynb", "projects/modelingsteps/ModelingSteps_1through2_DL.ipynb", "projects/modelingsteps/ModelingSteps_3through4_DL.ipynb", "projects/modelingsteps/ModelingSteps_5through6_DL.ipynb", "projects/modelingsteps/ModelingSteps_7through9_DL.ipynb", "projects/modelingsteps/TrainIllusionDataProjectDL.ipynb", "projects/modelingsteps/TrainIllusionModelingProjectDL.ipynb", "projects/modelingsteps/intro.md", "tutorials/Bonus_DeployModels/chapter_title.md", "tutorials/Bonus_DeployModels/student/Bonus_Tutorial1.ipynb", "tutorials/Module_WrapUps/FineTuning.ipynb", "tutorials/Module_WrapUps/NaturalLanguageProcessing.ipynb", "tutorials/Schedule/daily_schedules.md", "tutorials/Schedule/schedule_intro.md", "tutorials/Schedule/shared_calendars.md", "tutorials/Schedule/timezone_widget.md", "tutorials/TechnicalHelp/Discord.md", "tutorials/TechnicalHelp/Jupyterbook.md", "tutorials/TechnicalHelp/Links_Policy.md", "tutorials/TechnicalHelp/Tutorial_colab.md", "tutorials/TechnicalHelp/Tutorial_kaggle.md", "tutorials/TechnicalHelp/tech_intro.md", "tutorials/W1D1_BasicsAndPytorch/chapter_title.md", "tutorials/W1D1_BasicsAndPytorch/student/W1D1_Tutorial1.ipynb", "tutorials/W1D2_LinearDeepLearning/chapter_title.md", 
"tutorials/W1D2_LinearDeepLearning/student/W1D2_BonusLecture.ipynb", "tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial1.ipynb", "tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial2.ipynb", "tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial3.ipynb", "tutorials/W1D3_MultiLayerPerceptrons/chapter_title.md", "tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial1.ipynb", "tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial2.ipynb", "tutorials/W1D5_Optimization/chapter_title.md", "tutorials/W1D5_Optimization/student/W1D5_Tutorial1.ipynb", "tutorials/W2D1_Regularization/chapter_title.md", "tutorials/W2D1_Regularization/student/W2D1_Tutorial1.ipynb", "tutorials/W2D1_Regularization/student/W2D1_Tutorial2.ipynb", "tutorials/W2D2_ConvnetsAndDlThinking/chapter_title.md", "tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_BonusLecture.ipynb", "tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial1.ipynb", "tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial2.ipynb", "tutorials/W2D3_ModernConvnets/chapter_title.md", "tutorials/W2D3_ModernConvnets/student/W2D3_Tutorial1.ipynb", "tutorials/W2D3_ModernConvnets/student/W2D3_Tutorial2.ipynb", "tutorials/W2D4_GenerativeModels/chapter_title.md", "tutorials/W2D4_GenerativeModels/student/W2D4_BonusLecture.ipynb", "tutorials/W2D4_GenerativeModels/student/W2D4_Tutorial1.ipynb", "tutorials/W2D4_GenerativeModels/student/W2D4_Tutorial2.ipynb", "tutorials/W2D4_GenerativeModels/student/W2D4_Tutorial3.ipynb", "tutorials/W2D5_AttentionAndTransformers/chapter_title.md", "tutorials/W2D5_AttentionAndTransformers/student/W2D5_Tutorial1.ipynb", "tutorials/W2D5_AttentionAndTransformers/student/W2D5_Tutorial2.ipynb", "tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/chapter_title.md", "tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial1.ipynb", "tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial2.ipynb", 
"tutorials/W3D1_TimeSeriesAndNaturalLanguageProcessing/student/W3D1_Tutorial3.ipynb", "tutorials/W3D2_DlThinking2/chapter_title.md", "tutorials/W3D2_DlThinking2/student/W3D2_Tutorial1.ipynb", "tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/chapter_title.md", "tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/student/W3D3_BonusLecture.ipynb", "tutorials/W3D3_UnsupervisedAndSelfSupervisedLearning/student/W3D3_Tutorial1.ipynb", "tutorials/W3D4_BasicReinforcementLearning/chapter_title.md", "tutorials/W3D4_BasicReinforcementLearning/student/W3D4_BonusLecture.ipynb", "tutorials/W3D4_BasicReinforcementLearning/student/W3D4_Tutorial1.ipynb", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/chapter_title.md", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_BonusLecture.ipynb", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_Tutorial1.ipynb", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_Tutorial2.ipynb", "tutorials/W3D5_ReinforcementLearningForGamesAndDlThinking3/student/W3D5_Tutorial3.ipynb", "tutorials/intro.ipynb"], "titles": ["Prerequisites and preparatory materials for NMA Deep Learning", "Computer Vision", "Data Augmentation in image classification models", "Knowledge Extraction from a Convolutional Neural Network", "Ideas", "Something Screwy - image recognition, detection, and classification of screws", "Slides", "Music classification and generation with spectrograms", "Transfer Learning", "Natural Language Processing", "Ideas", "Machine Translation", "Twitter Sentiment Analysis", "Slides", "Neuroscience", "Load algonauts videos", "Vision with Lost Glasses: Modelling how the brain deals with noisy input", "Segmentation and Denoising", "Moving beyond Labels: Finetuning CNNs on BOLD response", "Ideas", "Focus on what matters: inferring low-dimensional dynamics from neural recordings", "Animal Pose Estimation", "Slides", "Introduction to projects", "Reinforcement Learning", 
"Using RL to Model Cognitive Tasks", "Ideas", "Performance Analysis of DQN Algorithm on the Lunar Lander task", "NMA Robolympics: Controlling robots using reinforcement learning", "Slides", "Models and Data sets", "Daily guide for projects", "Project Templates", "Example Deep Learning Project", "Modeling Steps 10", "Modeling Steps 1 - 2", "Modeling Steps 3 - 4", "Modeling Steps 5 - 6", "Modeling Steps 7 - 9", "Example Data Project: the Train Illusion", "Example Model Project: the Train Illusion", "Modeling Step-by-Step Guide", "Deploy Models", "Bonus Tutorial: Deploying Neural Networks on the Web", "Deep Learning: The Basics and Fine Tuning Wrap-up", "Deep Learning: Convnets and NLP", "General schedule", "Schedule", "Shared calendars", "Timezone widget", "Using Discord", "Using jupyterbook", "Quick links and policies", "Using Google Colab", "Using Kaggle", "Technical Help", "Basics And Pytorch", "Tutorial 1: PyTorch", "Linear Deep Learning", "Bonus Lecture: Yoshua Bengio", "Tutorial 1: Gradient Descent and AutoGrad", "Tutorial 2: Learning Hyperparameters", "Tutorial 3: Deep linear neural networks", "Multi Layer Perceptrons", "Tutorial 1: Biological vs. 
Artificial Neural Networks", "Tutorial 2: Deep MLPs", "Optimization", "Tutorial 1: Optimization techniques", "Regularization", "Tutorial 1: Regularization techniques part 1", "Tutorial 2: Regularization techniques part 2", "Convnets And Dl Thinking", "Bonus Lecture: Kyunghyun Cho", "Tutorial 1: Introduction to CNNs", "Tutorial 2: Deep Learning Thinking 1: Cost Functions", "Modern Convnets", "Tutorial 1: Learn how to use modern convnets", "Bonus Tutorial: Facial recognition using modern convnets", "Generative Models", "Bonus Lecture: Geoffrey Hinton", "Tutorial 1: Variational Autoencoders (VAEs)", "Tutorial 2: Diffusion models", "Tutorial 3: Image, Conditional Diffusion and Beyond", "Attention And Transformers", "Tutorial 1: Learn how to work with Transformers", "Bonus Tutorial: Understanding Pre-training, Fine-tuning and Robustness of Transformers", "Time Series And Natural Language Processing", "Tutorial 1: Introduction to processing time series", "Tutorial 2: Natural Language Processing and LLMs", "Bonus Tutorial: Multilingual Embeddings", "Dl Thinking2", "Tutorial 1: Deep Learning Thinking 2: Architectures and Multimodal DL thinking", "Unsupervised And Self Supervised Learning", "Bonus Lecture: Melanie Mitchell", "Tutorial 1: Un/Self-supervised learning methods", "Basic Reinforcement Learning", "Bonus Lecture: Chealsea Finn", "Tutorial 1: Basic Reinforcement Learning", "Reinforcement Learning For Games And Dl Thinking3", "Bonus Lecture: Amita Kapoor", "Tutorial 1: Reinforcement Learning For Games", "Tutorial 2: Deep Learning Thinking 3", "Bonus Tutorial: Planning with Monte Carlo Tree Search", "Introduction"], "terms": {"welcom": [0, 43, 76, 82, 103], "neuromatch": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 44, 45, 46, 52, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102, 103], "academi": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 
35, 36, 37, 38, 39, 40, 43, 44, 45, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "we": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 19, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 46, 48, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "re": [0, 11, 20, 21, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 54, 57, 64, 65, 67, 73, 76, 77, 80, 82, 85, 88, 91, 97, 100], "realli": [0, 11, 12, 31, 33, 34, 35, 36, 37, 38, 40, 60, 84, 88], "excit": [0, 27, 31], "bring": [0, 39, 67, 69, 70, 73, 85, 87], "wide": [0, 4, 12, 15, 43, 57, 60, 62, 67, 70, 97], "vari": [0, 17, 27, 33, 40, 62, 67, 76, 80, 88, 91], "audienc": [0, 31, 34, 57], "an": [0, 2, 4, 5, 7, 8, 10, 11, 12, 15, 16, 17, 19, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 51, 52, 53, 54, 60, 62, 65, 69, 70, 76, 77, 80, 81, 82, 84, 87, 88, 91, 100, 101], "amaz": [0, 62], "set": [0, 3, 4, 7, 12, 15, 16, 23, 26, 27, 31, 34, 35, 36, 37, 38, 39, 40, 43, 54, 97, 101], "lectur": [0, 46, 64, 67, 73, 80], "tutori": [0, 3, 5, 7, 11, 19, 21, 26, 28, 31, 33, 46, 49, 54], "you": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 20, 21, 23, 25, 26, 27, 28, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 48, 51, 53, 54, 57, 60, 61, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "peopl": [0, 11, 12, 15, 16, 21, 23, 31, 33, 35, 36, 39, 40, 43, 61, 70, 74, 77, 84, 87, 91], "ar": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 19, 20, 21, 23, 25, 26, 27, 28, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 51, 52, 54, 60, 61, 62, 64, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 97, 100, 101, 102], "come": [0, 12, 15, 27, 31, 36, 37, 38, 39, 43, 60, 62, 67, 70, 73, 74, 77, 80, 81, 85, 91], "thi": [0, 2, 3, 4, 5, 8, 11, 12, 15, 16, 17, 18, 19, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 49, 51, 52, 53, 54, 57, 
60, 61, 64, 65, 69, 70, 73, 74, 76, 80, 81, 82, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "from": [0, 2, 5, 7, 10, 11, 12, 15, 16, 17, 18, 19, 21, 23, 25, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 44, 45, 54, 59, 60, 61, 64, 65, 67, 69, 70, 72, 74, 77, 79, 81, 88, 89, 91, 93, 96, 99, 101, 103], "rang": [0, 2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 37, 39, 40, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 85, 87, 88, 94, 97, 100, 102], "disciplin": 0, "level": [0, 15, 25, 31, 36, 37, 40, 60, 61, 62, 67, 70, 73, 81, 82, 85, 88, 94, 97, 100, 102], "background": [0, 4, 5, 23, 31, 37, 61, 73, 80], "want": [0, 2, 3, 4, 5, 7, 8, 11, 12, 16, 17, 20, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 53, 54, 57, 60, 61, 62, 64, 65, 69, 70, 73, 74, 76, 77, 80, 82, 84, 87, 88, 89, 91, 94, 97, 100, 101, 102], "make": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 19, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 52, 53, 54, 60, 61, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "sure": [0, 2, 3, 11, 12, 17, 21, 25, 28, 31, 33, 34, 35, 36, 37, 38, 40, 43, 54, 57, 60, 61, 64, 65, 73, 76, 80, 84, 85, 88, 97, 100], "everybodi": 0, "abl": [0, 5, 16, 25, 27, 28, 31, 33, 36, 40, 43, 57, 67, 69, 73, 74, 77, 84, 85, 91, 100, 101], "follow": [0, 2, 5, 7, 8, 11, 12, 15, 18, 21, 23, 25, 27, 28, 31, 33, 35, 36, 37, 39, 40, 43, 46, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "enjoi": [0, 35, 76], "school": [0, 23, 76], "dai": [0, 3, 12, 23, 35, 39, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "1": [0, 2, 3, 5, 7, 8, 10, 11, 15, 16, 17, 18, 20, 21, 23, 25, 34, 36, 37, 38, 39, 46, 48, 54, 59, 72, 79, 93, 96, 99], "mean": [0, 2, 5, 8, 12, 15, 16, 17, 20, 25, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 
80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "need": [0, 3, 5, 7, 12, 15, 17, 19, 21, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 51, 54, 57, 60, 61, 62, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 85, 88, 91, 94, 100, 101, 102], "know": [0, 3, 5, 12, 31, 33, 34, 35, 36, 37, 39, 57, 61, 62, 64, 67, 69, 70, 73, 76, 77, 80, 88, 91, 94, 97, 101], "basic": [0, 3, 17, 19, 25, 33, 35, 38, 39, 43, 46, 60, 61, 64, 73, 76, 80, 81, 82, 85, 87, 91, 94, 101], "python": [0, 2, 5, 7, 8, 17, 21, 25, 27, 31, 35, 39, 57, 60, 65, 67, 69, 70, 76, 80, 81, 82, 85, 87, 88, 89, 94], "some": [0, 2, 3, 4, 7, 8, 11, 12, 16, 17, 19, 21, 23, 25, 26, 27, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 51, 61, 64, 65, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "core": [0, 23, 28, 35, 36, 57, 60, 62, 65, 67, 73, 84, 88, 94, 97, 100, 102], "concept": [0, 23, 25, 36, 37, 46, 57, 60, 67, 76, 77, 80, 88, 97], "below": [0, 3, 6, 7, 8, 11, 13, 15, 16, 17, 22, 27, 28, 29, 33, 36, 39, 43, 44, 45, 48, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "provid": [0, 2, 3, 5, 7, 8, 11, 12, 15, 17, 21, 23, 25, 27, 28, 31, 34, 35, 36, 37, 38, 43, 57, 60, 62, 64, 65, 67, 70, 73, 80, 81, 85, 88, 89, 94, 97, 100, 101], "more": [0, 2, 3, 5, 8, 11, 12, 15, 16, 17, 19, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 39, 40, 43, 46, 52, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 97, 100, 101, 102], "detail": [0, 2, 5, 8, 15, 23, 26, 27, 28, 31, 33, 34, 36, 37, 38, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "run": [0, 2, 3, 5, 8, 11, 16, 17, 20, 21, 27, 31, 33, 35, 36, 37, 39, 40, 43, 51, 53, 54, 57, 60, 61, 64, 65, 74, 80, 81, 82, 84, 85, 87, 88, 89, 100, 102], "us": [0, 2, 3, 7, 10, 11, 12, 17, 20, 21, 23, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 59, 60, 61, 62, 64, 65, 67, 70, 72, 73, 74, 79, 80, 82, 85, 89, 91, 93, 96, 97, 
99, 101], "If": [0, 3, 5, 8, 11, 12, 16, 20, 21, 27, 28, 31, 33, 35, 36, 38, 39, 40, 43, 48, 51, 52, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "ve": [0, 5, 17, 31, 34, 60, 61, 67, 73, 74, 76, 77, 80, 85, 88, 97, 100, 102], "never": [0, 27, 31, 34, 62, 84, 87, 88, 101], "now": [0, 3, 5, 8, 11, 12, 16, 17, 19, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "good": [0, 3, 8, 11, 12, 16, 17, 25, 27, 31, 33, 34, 35, 36, 38, 39, 40, 43, 57, 60, 61, 62, 64, 67, 69, 70, 74, 76, 80, 91, 97, 101], "time": [0, 3, 5, 7, 8, 12, 15, 16, 17, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 48, 49, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 85, 88, 89, 91, 94, 97, 100, 101, 102], "start": [0, 2, 3, 5, 8, 10, 11, 12, 15, 17, 19, 20, 23, 27, 28, 33, 34, 35, 36, 37, 38, 39, 43, 46, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "practic": [0, 5, 8, 31, 39, 43, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 84, 91, 101], "expect": [0, 2, 8, 11, 17, 25, 33, 34, 35, 36, 38, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "student": [0, 2, 8, 23, 31, 46, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "familiar": [0, 12, 31, 62, 64, 73, 76, 88, 97], "variabl": [0, 2, 12, 31, 36, 37, 39, 43, 60, 62, 64, 65, 67, 69, 73, 76, 77, 81, 88, 100, 102], "list": [0, 5, 7, 8, 11, 12, 16, 17, 18, 21, 25, 31, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "dict": [0, 5, 8, 18, 21, 27, 40, 43, 57, 62, 76, 85, 88, 97], "numpi": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 20, 21, 25, 27, 28, 31, 33, 35, 36, 39, 40, 60, 61, 62, 64, 65, 67, 
69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "scipi": [0, 5, 17, 18, 35, 39, 40, 73, 80, 81, 85, 87], "librari": [0, 3, 5, 7, 8, 12, 28, 31, 36, 43, 57, 67, 76, 84, 85, 87, 88, 89, 94], "well": [0, 2, 5, 8, 11, 12, 16, 19, 21, 28, 31, 33, 34, 35, 38, 39, 40, 43, 57, 60, 62, 67, 69, 70, 73, 76, 77, 80, 81, 85, 88, 91, 101], "plot": [0, 2, 5, 7, 11, 12, 15, 17, 21, 25, 27, 31, 40, 67, 87, 88], "matplotlib": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97], "littl": [0, 16, 33, 39, 40, 57, 74, 76, 77, 84, 91, 97], "bit": [0, 3, 27, 33, 35, 39, 43, 57, 74, 80, 88, 89, 91, 94], "everi": [0, 11, 17, 20, 21, 23, 27, 28, 31, 33, 34, 38, 43, 54, 57, 60, 62, 64, 65, 67, 69, 70, 73, 74, 81, 85, 94, 100], "ll": [0, 5, 7, 12, 17, 28, 31, 33, 35, 36, 38, 39, 43, 57, 64, 65, 70, 73, 74, 76, 80, 85, 87, 88, 89, 97, 100], "great": [0, 11, 31, 33, 34, 43, 57, 67, 73, 74, 76, 77, 88, 91, 101], "shape": [0, 2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 28, 33, 35, 36, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97, 100, 102], "class": [0, 2, 3, 7, 8, 10, 11, 12, 16, 17, 18, 20, 21, 25, 27, 33, 36, 39, 43, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 77, 80, 82, 84, 85, 87, 88, 94, 97, 100, 101, 102], "have": [0, 2, 3, 4, 5, 8, 11, 12, 16, 17, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 48, 51, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "workshop": [0, 31, 46], "w0d1": 0, "w0d2": 0, "here": [0, 2, 5, 8, 11, 12, 15, 16, 17, 19, 21, 23, 25, 26, 27, 28, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 44, 45, 46, 50, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "should": [0, 3, 5, 7, 8, 11, 12, 17, 20, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 
37, 38, 39, 40, 43, 48, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 91, 94, 97, 100, 101, 102], "go": [0, 3, 7, 11, 12, 17, 20, 21, 28, 31, 33, 34, 36, 38, 39, 40, 43, 54, 57, 60, 62, 64, 65, 67, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 89, 94], "through": [0, 3, 5, 17, 18, 21, 27, 31, 34, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 67, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 101], "made": [0, 3, 20, 21, 31, 43, 62, 76, 88, 94, 100, 103], "content": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 23, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 44, 45, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "your": [0, 3, 4, 5, 7, 10, 11, 12, 15, 16, 17, 23, 25, 27, 28, 31, 33, 34, 37, 38, 39, 40, 46, 48, 49, 53, 54], "own": [0, 3, 5, 7, 11, 12, 21, 23, 27, 28, 31, 33, 38, 39, 40, 43, 57, 61, 74, 80, 85, 88, 94, 97], "pace": 0, "befor": [0, 5, 8, 12, 16, 21, 23, 27, 31, 34, 35, 36, 38, 39, 43, 57, 60, 62, 64, 65, 67, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "note": [0, 2, 5, 8, 11, 12, 16, 17, 20, 21, 23, 25, 27, 28, 31, 33, 35, 36, 38, 39, 40, 43, 46, 52, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "ha": [0, 2, 3, 4, 5, 8, 12, 16, 17, 18, 21, 25, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 44, 45, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "neurosci": [0, 17, 19, 23, 31, 34, 35, 39, 54, 64, 76], "focu": [0, 17, 25, 27, 31, 33, 36, 38, 57, 67, 73, 74, 80, 81, 91, 101, 102], "exampl": [0, 2, 3, 5, 7, 12, 15, 17, 20, 21, 25, 26, 27, 31, 43, 53, 57, 60, 61, 62, 64, 65, 67, 69, 73, 74, 80, 82, 84, 85, 87, 88, 89, 91, 100, 101, 102], "extrem": [0, 39, 40, 62, 67, 76, 91], "besid": [0, 62], "recommend": [0, 5, 12, 17, 23, 31, 33, 43, 54, 57, 62, 67, 85, 88, 100], "softwar": [0, 5, 11, 21, 43], 
"carpentri": 0, "free": [0, 3, 11, 12, 28, 31, 36, 40, 43, 57, 60, 61, 62, 64, 65, 67, 73, 80, 82, 84, 88, 97, 100, 101], "edx": 0, "research": [0, 5, 15, 19, 21, 23, 25, 27, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 67, 74, 76, 81, 84, 88, 97], "For": [0, 2, 5, 7, 8, 12, 15, 16, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 101, 102], "depth": [0, 62, 64, 65, 70, 80, 82, 84, 100], "intro": [0, 5, 31, 35, 40, 46], "see": [0, 2, 3, 4, 5, 6, 8, 10, 11, 12, 13, 16, 17, 20, 21, 22, 23, 27, 28, 29, 31, 33, 35, 36, 37, 39, 40, 43, 46, 48, 51, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "final": [0, 3, 11, 17, 23, 27, 28, 33, 36, 38, 43, 54, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100], "can": [0, 2, 3, 4, 5, 7, 8, 11, 12, 15, 16, 17, 19, 20, 21, 23, 25, 26, 27, 28, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 51, 52, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "data": [0, 4, 12, 19, 25, 27, 31, 34, 36, 37, 38, 40, 43, 60, 61, 62, 64, 74, 80, 82, 87, 88, 89, 101, 102], "scienc": [0, 15, 31, 35, 37, 39, 57, 88, 97, 101], "handbook": 0, "which": [0, 2, 3, 5, 7, 11, 12, 16, 17, 20, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 39, 40, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "also": [0, 3, 5, 7, 8, 11, 12, 16, 17, 19, 21, 23, 26, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 51, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 88, 89, 91, 94, 100, 101], "print": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 34, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "edit": [0, 31, 46, 53, 62, 77, 80, 82, 85], "matlab": 0, "quickli": [0, 16, 27, 31, 33, 43, 57, 61, 
74, 85, 88, 91, 101], "get": [0, 2, 5, 7, 8, 11, 12, 15, 17, 18, 19, 20, 21, 23, 25, 27, 28, 33, 35, 36, 37, 38, 39, 40, 43, 52, 54, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "up": [0, 2, 11, 12, 15, 17, 20, 21, 25, 28, 31, 33, 35, 36, 38, 39, 40, 43, 57, 62, 64, 65, 67, 69, 70, 73, 77, 81, 82, 84, 87, 88, 94, 100, 101, 102], "speed": [0, 11, 27, 28, 39, 57, 61, 64, 65, 73, 84, 87, 100], "cheatsheet": 0, "mai": [0, 2, 11, 12, 15, 17, 23, 27, 28, 31, 33, 34, 39, 40, 43, 51, 52, 54, 57, 60, 61, 62, 64, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "paperback": 0, "neural": [0, 2, 7, 11, 15, 18, 19, 23, 26, 27, 31, 35, 39, 45, 67, 69, 70, 73, 76, 80, 84, 88, 91, 94, 101, 102], "both": [0, 11, 16, 17, 21, 25, 31, 33, 35, 36, 39, 40, 43, 46, 57, 65, 67, 70, 73, 74, 76, 80, 81, 84, 85, 87, 91, 94, 100, 101], "version": [0, 3, 5, 6, 7, 13, 16, 17, 21, 22, 25, 27, 29, 31, 33, 36, 53, 60, 73, 76, 77, 80, 81, 82, 85, 94, 101], "reli": [0, 25, 31, 33, 34, 39, 40, 62, 67], "linear": [0, 2, 3, 5, 7, 8, 11, 12, 16, 18, 33, 36, 39, 40, 46, 57, 60, 64, 65, 67, 69, 70, 73, 74, 76, 81, 82, 84, 87, 94, 100, 102], "algebra": [0, 57, 62], "probabl": [0, 11, 16, 17, 27, 33, 36, 38, 39, 40, 43, 48, 57, 61, 62, 64, 65, 69, 70, 73, 74, 76, 80, 81, 85, 87, 88, 91, 94, 97, 100, 101, 102], "statist": [0, 12, 18, 38, 64, 65, 67, 73, 84], "calculu": [0, 60], "deriv": [0, 38, 57, 60, 61, 62, 64, 65, 67, 70, 73, 81, 97], "od": [0, 81, 82], "highli": [0, 17, 23, 27, 34, 57, 67, 73, 82], "our": [0, 3, 11, 12, 16, 17, 20, 21, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97, 100, 101], "refresh": [0, 67], "w0d3": 0, "w0d4": 0, "w0d5": 0, "ask": [0, 16, 21, 23, 25, 31, 33, 39, 40, 60, 61, 62, 74, 77, 85, 88, 91, 94, 101], "question": [0, 10, 17, 19, 23, 25, 27, 34, 36, 37, 38, 65, 67, 74, 76, 77, 84, 88, 89, 91, 94, 101], 
"discord": [0, 31, 46], "grasp": 0, "along": [0, 19, 27, 31, 33, 57, 60, 64, 65, 69, 73, 74, 80, 85, 88], "crucial": [0, 5, 20, 31, 36, 38, 69, 97], "almost": [0, 33, 35, 62, 69, 84, 88], "anyth": [0, 33, 36, 38, 39, 51, 53, 64, 73, 88, 91], "quantit": [0, 35, 38], "involv": [0, 27, 31, 33, 37, 38, 57, 73, 74, 76, 81, 85, 97], "than": [0, 3, 5, 11, 12, 16, 17, 20, 21, 27, 28, 31, 33, 35, 36, 37, 39, 52, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101], "one": [0, 2, 3, 4, 5, 7, 8, 10, 11, 12, 16, 17, 21, 23, 25, 26, 27, 31, 33, 34, 35, 36, 37, 39, 40, 43, 53, 54, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "number": [0, 2, 3, 12, 16, 17, 20, 21, 25, 26, 27, 28, 33, 35, 36, 39, 54, 60, 61, 62, 64, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 91, 97, 100, 102], "vector": [0, 2, 11, 12, 27, 39, 40, 57, 61, 64, 65, 67, 69, 74, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97], "matrix": [0, 12, 15, 17, 20, 33, 35, 39, 57, 61, 62, 67, 69, 73, 77, 80, 81, 84, 85, 89, 94, 97], "addit": [0, 11, 17, 31, 33, 34, 35, 43, 46, 57, 61, 62, 74, 80, 81, 82, 84, 85, 87, 94, 101], "multipl": [0, 2, 3, 5, 8, 17, 26, 27, 31, 57, 60, 61, 62, 64, 65, 81, 84, 85, 87, 91, 97, 100], "rank": [0, 62, 74, 100], "base": [0, 5, 11, 17, 21, 23, 25, 31, 33, 35, 36, 37, 39, 43, 57, 60, 62, 67, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 94, 97, 101], "determin": [0, 31, 35, 37, 38, 39, 40, 57, 64, 69, 73, 74, 85, 94, 97, 100, 102], "invers": [0, 67, 74, 80], "eigenvalu": [0, 62], "decomposit": [0, 77], "In": [0, 2, 3, 4, 5, 7, 8, 11, 12, 15, 16, 17, 20, 23, 25, 27, 28, 31, 33, 34, 35, 36, 37, 39, 40, 43, 46, 51, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "beauti": [0, 11, 76, 82], "seri": [0, 21, 31, 33, 35, 40, 44, 45, 46, 74, 76, 88, 89, 91, 94, 101], "anoth": [0, 3, 7, 8, 10, 11, 12, 17, 20, 25, 26, 27, 30, 31, 35, 39, 40, 
53, 54, 57, 62, 64, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100], "resourc": [0, 3, 7, 27, 33, 37, 43, 77], "khan": 0, "exercis": [0, 2, 8, 34, 74, 85, 89], "understand": [0, 5, 11, 16, 19, 25, 27, 31, 33, 34, 36, 38, 39, 57, 61, 62, 67, 76, 77, 82, 84, 88, 94, 97, 100, 101, 102], "import": [0, 2, 3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 31, 33, 35, 36, 37, 38, 39, 40, 44, 45, 51, 54], "comfort": [0, 76], "varianc": [0, 18, 62, 65, 67, 74, 80, 81, 82, 87, 91], "normal": [0, 2, 3, 4, 5, 7, 8, 11, 18, 21, 28, 31, 33, 35, 36, 39, 40, 43, 46, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 100], "distribut": [0, 12, 15, 25, 28, 35, 39, 57, 61, 62, 64, 65, 67, 74, 80, 81, 82, 84, 94, 100, 101, 102], "select": [0, 2, 5, 7, 12, 15, 16, 21, 25, 27, 31, 36, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "read": [0, 2, 3, 5, 7, 8, 12, 15, 21, 27, 31, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "i": [0, 2, 3, 5, 11, 12, 15, 16, 17, 18, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 87, 88, 97, 100, 101, 102], "e": [0, 3, 4, 5, 11, 15, 16, 21, 25, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 52, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 100, 101, 102], "chapter": [0, 31], "6": [0, 2, 3, 5, 7, 8, 11, 12, 16, 18, 20, 21, 28, 31, 34, 36, 38, 39, 40, 46, 64, 65, 69, 80, 81, 82, 85, 102], "7": [0, 2, 3, 5, 7, 8, 12, 18, 19, 21, 25, 27, 28, 31, 34, 35, 39, 40, 46, 60, 64, 65, 69, 70, 80, 81, 82, 85, 87, 102], "russ": 0, "poldrack": 0, "s": [0, 2, 3, 7, 11, 15, 16, 17, 19, 20, 21, 25, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 53, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "book": [0, 11, 25, 31, 64, 73, 76, 
97], "think": [0, 5, 11, 12, 20, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 61, 62, 100, 102], "21st": 0, "centuri": 0, "what": [0, 3, 5, 11, 16, 17, 19, 21, 23, 25, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 57, 60, 61, 62, 65, 67, 69, 70, 74, 77, 80, 82, 84, 85, 88, 89, 100, 101], "integr": [0, 36, 39, 52, 81, 82, 101], "differenti": [0, 61, 67, 73, 81, 84, 101], "equat": [0, 37, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 89, 91, 100, 101, 102], "memori": [0, 17, 21, 27, 57, 64, 73, 76, 77, 84, 85, 88], "gilbert": [0, 62], "strang": [0, 62, 76], "studi": [0, 16, 31, 33, 35, 39, 60, 62, 64, 73, 84, 97], "0": [0, 2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 36, 39, 40, 43, 46, 57, 61, 65, 67, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 102], "includ": [0, 8, 12, 15, 19, 23, 25, 26, 27, 28, 31, 33, 36, 37, 39, 40, 43, 57, 62, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 88, 94, 97, 100, 101], "jiri": 0, "lebl": 0, "engin": [0, 21, 25, 27, 33, 36, 37, 43, 57, 60, 67, 74, 76, 80, 88, 91, 101], "The": [0, 2, 3, 5, 7, 8, 12, 15, 16, 17, 19, 20, 21, 23, 25, 26, 27, 31, 33, 35, 36, 39, 40, 43, 45, 54, 60, 62, 67, 69, 70, 74, 77, 82, 84, 85, 87, 89, 91, 97, 100, 102], "team": [0, 23, 27, 31, 57, 61], "By": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "creator": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "jama": [2, 8], "hussein": [2, 8], "mohamud": [2, 8], "alex": [2, 8, 15, 73], "hernandez": [2, 8], "garcia": [2, 8], "product": [2, 3, 5, 8, 11, 12, 18, 25, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 87, 88, 89, 91, 94, 97, 100, 
101, 102], "editor": [2, 3, 5, 8, 11, 12, 15, 17, 18, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "spiro": [2, 3, 5, 8, 11, 12, 15, 17, 18, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "chavli": [2, 3, 5, 8, 11, 12, 15, 17, 18, 21, 25, 27, 28, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "saeed": [2, 8, 60, 61, 62, 64, 65, 69, 70, 73, 80], "salehi": [2, 8, 60, 61, 62, 64, 65, 69, 70, 73, 80], "refer": [2, 7, 8, 12, 16, 19, 23, 25, 28, 31, 33, 34, 38, 39, 40, 43, 46, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "synthet": [2, 69], "increas": [2, 12, 17, 19, 21, 27, 31, 39, 40, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 94, 97], "amount": [2, 3, 16, 17, 21, 28, 33, 36, 38, 57, 62, 76, 77, 80, 81, 91], "transform": [2, 5, 7, 8, 11, 12, 16, 18, 21, 23, 28, 31, 33, 36, 39, 43, 46, 65, 67, 69, 70, 73, 76, 80, 81, 82, 87, 88, 89, 100, 101], "exist": [2, 3, 5, 7, 8, 15, 21, 27, 28, 31, 40, 43, 57, 62, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 89, 91, 94, 100, 101, 102], "been": [2, 4, 5, 8, 16, 21, 27, 28, 31, 33, 34, 35, 38, 39, 40, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "shown": [2, 4, 16, 25, 28, 51, 60, 61, 62, 69, 70, 73, 81, 94, 97], "veri": [2, 5, 8, 11, 12, 16, 19, 20, 21, 27, 28, 31, 33, 34, 35, 36, 39, 40, 43, 57, 60, 61, 62, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100], "techniqu": [2, 15, 18, 21, 27, 31, 60, 64, 73, 76, 80, 84, 87, 88], "especi": [2, 27, 28, 31, 38, 39, 54, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 94, 100, 101, 102], "comput": [2, 3, 8, 12, 16, 17, 
21, 23, 25, 26, 27, 28, 30, 31, 33, 34, 35, 36, 37, 39, 40, 43, 54, 57, 61, 64, 65, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "vision": [2, 7, 23, 30, 31, 35, 39, 40, 57, 76, 84, 91, 101], "applic": [2, 26, 27, 31, 39, 67, 70, 77, 80, 82, 87, 88, 94, 100, 101], "howev": [2, 4, 5, 20, 26, 27, 28, 31, 33, 39, 40, 57, 64, 67, 69, 70, 73, 77, 81, 82, 84, 85, 87, 88, 89, 94, 97, 101], "wai": [2, 3, 5, 10, 11, 12, 16, 17, 28, 31, 33, 35, 37, 38, 39, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 87, 88, 89, 91, 94, 97, 100, 101], "perform": [2, 7, 8, 10, 12, 16, 18, 19, 21, 23, 25, 28, 31, 33, 34, 35, 36, 37, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 87, 88, 89, 97, 100, 101, 102], "yet": [2, 11, 12, 16, 28, 30, 31, 33, 35, 40, 43, 57, 62, 97], "understood": [2, 8, 64, 81, 101], "effect": [2, 11, 16, 17, 27, 34, 57, 62, 67, 70, 80, 85, 88, 91, 97, 102], "why": [2, 12, 19, 31, 35, 36, 38, 39, 60, 62, 64, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 101], "how": [2, 3, 5, 7, 8, 10, 11, 12, 15, 17, 19, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 54, 60, 61, 62, 64, 65, 69, 70, 77, 80, 81, 82, 85, 87, 88, 89, 91, 97, 100, 101, 102], "interact": [2, 16, 26, 28, 33, 36, 43, 51, 53, 54, 60, 65, 97, 101], "other": [2, 3, 4, 5, 7, 8, 11, 16, 17, 21, 27, 31, 33, 35, 36, 38, 39, 40, 43, 46, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 85, 87, 88, 89, 91, 94, 100, 101, 102], "fact": [2, 25, 31, 35, 39, 60, 61, 64, 73, 80, 81, 101], "common": [2, 5, 7, 11, 12, 21, 23, 26, 27, 33, 35, 43, 57, 61, 62, 69, 70, 73, 74, 76, 80, 88, 89, 91], "differ": [2, 3, 4, 5, 7, 10, 12, 15, 16, 17, 18, 19, 21, 23, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 65, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101], "scheme": [2, 57, 65, 67, 85], "paper": [2, 4, 5, 8, 12, 17, 19, 26, 27, 28, 31, 33, 35, 37, 40, 57, 64, 70, 73, 74, 76, 
81, 84, 91, 101, 102], "perceptu": [2, 7, 35, 40, 64], "possibl": [2, 7, 8, 10, 11, 12, 16, 21, 27, 33, 35, 36, 37, 38, 39, 40, 57, 61, 62, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "relat": [2, 12, 17, 19, 20, 21, 31, 33, 34, 36, 37, 38, 57, 61, 65, 67, 74, 80, 81, 82, 84, 85, 87, 88, 91, 94, 101], "human": [2, 3, 15, 16, 19, 27, 33, 35, 36, 43, 62, 91, 97, 101], "percept": [2, 15, 35, 39, 40, 101], "simpl": [2, 3, 5, 11, 12, 17, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 60, 62, 64, 65, 67, 80, 82, 85, 88, 94, 100], "artifici": [2, 15, 39, 40, 57, 62, 69, 74, 97, 101], "even": [2, 8, 11, 12, 16, 23, 28, 31, 33, 35, 36, 40, 57, 60, 61, 62, 65, 67, 70, 76, 77, 81, 87, 88, 91, 94, 100, 101], "label": [2, 3, 5, 7, 8, 12, 15, 16, 17, 20, 21, 28, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 70, 73, 77, 80, 81, 82, 84, 85, 87, 88, 91, 100, 101], "among": [2, 69, 70, 84, 88], "mani": [2, 5, 8, 12, 17, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 88, 91, 94, 97, 101], "notebook": [2, 3, 4, 5, 8, 11, 12, 16, 17, 20, 21, 25, 27, 28, 31, 33, 35, 43, 51, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "show": [2, 3, 4, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 34, 35, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101], "deep": [2, 5, 8, 18, 19, 21, 23, 27, 28, 31, 35, 36, 37, 43, 46, 52, 54, 60, 61, 67, 69, 70, 73, 80, 88, 100, 103], "network": [2, 8, 10, 11, 18, 19, 20, 21, 23, 25, 26, 27, 31, 33, 35, 45, 60, 69, 73, 74, 80, 84, 87, 88, 91, 101], "analys": [2, 31, 38, 60, 84], "result": [2, 3, 10, 15, 17, 23, 25, 27, 28, 34, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "titl": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 
28, 33, 35, 36, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "pip": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 40, 43, 54, 57, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 100, 102], "panda": [2, 8, 12, 25, 40, 57, 76, 81, 84, 85], "quiet": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "os": [2, 3, 5, 7, 8, 12, 15, 16, 17, 21, 27, 28, 43, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89, 94, 100, 102], "csv": [2, 8, 12], "multiprocess": [2, 8], "np": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "pd": [2, 8, 12, 25, 40, 57, 81, 84, 85], "pyplot": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 89, 94, 97], "plt": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97], "torch": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 27, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "nn": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 27, 33, 57, 62, 64, 65, 67, 69, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "f": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "backend": [2, 8, 16, 28, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "cudnn": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 
85, 87, 89, 94, 100, 102], "autograd": [2, 8, 57, 67, 80, 84, 85], "torchvis": [2, 3, 5, 7, 8, 16, 18, 21, 43, 57, 65, 67, 69, 70, 73, 76, 77, 80, 82, 94], "execut": [2, 3, 8, 23, 25, 27, 28, 43, 51, 57, 59, 72, 74, 79, 91, 93, 96, 97, 99, 101], "set_se": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "markdown": [2, 3, 8, 16, 17, 21, 28, 33, 35, 36, 44, 45, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "dl": [2, 3, 8, 15, 19, 25, 33, 35, 36, 37, 39, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "its": [2, 3, 8, 11, 12, 15, 17, 21, 25, 28, 36, 37, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "critic": [2, 8, 11, 28, 34, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "so": [2, 3, 5, 8, 11, 12, 16, 17, 20, 26, 27, 28, 31, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "baselin": [2, 8, 26, 27, 35, 39, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "compar": [2, 4, 5, 7, 8, 10, 16, 17, 18, 19, 25, 27, 33, 35, 39, 57, 60, 61, 62, 64, 65, 69, 70, 73, 74, 77, 80, 82, 84, 85, 87, 88, 89, 102], "http": [2, 3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 21, 27, 28, 30, 31, 33, 36, 39, 43, 44, 45, 46, 49, 52, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "pytorch": [2, 7, 11, 16, 21, 30, 31, 33, 36, 46, 61, 62, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "org": [2, 5, 7, 8, 11, 17, 21, 27, 30, 33, 43, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "doc": [2, 5, 8, 12, 27, 31, 46, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 
76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "stabl": [2, 8, 27, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 88, 89, 94, 100, 102], "html": [2, 5, 8, 15, 21, 25, 27, 28, 31, 33, 39, 43, 46, 49, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "call": [2, 3, 7, 8, 12, 17, 27, 28, 31, 33, 35, 36, 38, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "ensur": [2, 8, 31, 35, 37, 38, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "reproduc": [2, 8, 25, 34, 35, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "def": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "none": [2, 3, 5, 8, 11, 12, 16, 17, 18, 21, 25, 27, 28, 39, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "seed_torch": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "true": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 21, 25, 27, 28, 31, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "choic": [2, 5, 8, 11, 15, 16, 21, 25, 27, 33, 36, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "2": [2, 3, 5, 7, 8, 10, 11, 16, 17, 18, 19, 20, 21, 23, 25, 31, 34, 36, 39, 48, 54, 89], "32": [2, 3, 5, 7, 8, 11, 12, 16, 17, 21, 25, 27, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "manual_se": [2, 3, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "cuda": [2, 3, 
5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "manual_seed_al": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "benchmark": [2, 8, 16, 38, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "fals": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 20, 21, 25, 27, 28, 33, 34, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "determinist": [2, 8, 16, 27, 28, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97, 100, 102], "case": [2, 8, 12, 16, 27, 28, 31, 33, 34, 35, 36, 39, 40, 43, 54, 57, 60, 61, 62, 64, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 89, 91, 94, 97, 100, 102], "dataload": [2, 3, 7, 12, 21, 31, 33, 60, 61, 62, 64, 67, 69, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "seed_work": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "worker_id": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "worker_se": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "initial_se": [2, 8, 16, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "inform": [2, 4, 5, 7, 8, 12, 17, 21, 23, 25, 28, 31, 33, 34, 35, 36, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 102], "user": [2, 8, 12, 15, 16, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "set_devic": [2, 5, 7, 8, 12, 16, 57], "is_avail": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "els": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 
33, 36, 37, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "warn": [2, 7, 8, 12, 16, 28, 31, 35, 43, 61, 62, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "best": [2, 7, 8, 12, 16, 17, 18, 27, 31, 35, 37, 39, 40, 43, 61, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 101, 102], "menu": [2, 5, 7, 8, 12, 16, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "under": [2, 5, 7, 8, 12, 16, 40, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 97, 100, 102], "runtim": [2, 5, 7, 8, 12, 16, 20, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "chang": [2, 3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 20, 21, 27, 28, 31, 33, 35, 36, 38, 39, 43, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "type": [2, 3, 5, 7, 8, 12, 15, 16, 17, 20, 21, 23, 27, 28, 31, 33, 36, 37, 39, 43, 57, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 102], "enabl": [2, 3, 5, 7, 8, 12, 16, 20, 25, 27, 43, 54, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 88, 89, 100, 101, 102], "return": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 40, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "2021": [2, 8, 15, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94], "reduc": [2, 12, 18, 21, 27, 28, 31, 57, 60, 67, 69, 70, 73, 76, 77, 80, 82], "epoch": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 21, 31, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 100, 102], "end_epoch": 2, "valu": [2, 5, 8, 11, 12, 15, 16, 17, 20, 21, 27, 28, 31, 33, 36, 37, 39, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 85, 
87], "wa": [2, 3, 7, 12, 17, 20, 21, 25, 27, 28, 31, 33, 34, 36, 38, 39, 40, 57, 62, 64, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 91, 97, 102], "200": [2, 8, 20, 27, 28, 33, 36, 39, 40, 43, 62, 67, 69, 70, 73, 76, 77, 81, 87, 100, 102], "pleas": [2, 5, 8, 11, 15, 23, 28, 31, 33, 46, 49, 50, 51, 52, 54, 57, 60, 61, 62, 65, 67, 70, 74, 76, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "back": [2, 11, 12, 23, 28, 31, 34, 36, 38, 39, 40, 43, 57, 73, 76, 77, 80, 81, 82, 84, 88, 97], "code": [2, 5, 7, 8, 12, 15, 16, 17, 21, 23, 26, 27, 28, 31, 33, 34, 35, 37, 38, 39, 43, 51, 52, 53, 54, 74, 85, 89, 91, 101], "hyper": [2, 8, 60, 70], "paramet": [2, 3, 5, 7, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 57, 60, 61, 62, 64, 69, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "use_cuda": [2, 8, 94], "alpha": [2, 5, 18, 57, 61, 62, 65, 67, 73, 74, 77, 80, 81, 85, 87, 94, 97, 100, 102], "best_acc": [2, 8, 69, 70], "accuraci": [2, 3, 7, 8, 12, 21, 25, 33, 35, 39, 64, 65, 67, 69, 70, 73, 85, 87, 94], "start_epoch": [2, 8], "last": [2, 12, 17, 20, 21, 25, 27, 31, 36, 43, 46, 53, 54, 57, 60, 62, 64, 65, 67, 69, 70, 74, 76, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 102], "checkpoint": [2, 3, 8, 17, 21, 25, 82, 85, 94, 100, 102], "batch_siz": [2, 3, 7, 8, 11, 12, 16, 17, 18, 21, 27, 33, 57, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 87, 94, 100, 102], "128": [2, 3, 5, 7, 8, 16, 17, 18, 64, 65, 69, 70, 73, 76, 80, 82, 84, 87], "end_apoch": 2, "15": [2, 3, 5, 7, 8, 12, 17, 19, 20, 23, 28, 33, 46, 61, 62, 67, 69, 70, 73, 76, 77, 81, 82, 84, 85, 87, 91, 94, 97, 100, 101, 102], "base_learning_r": [2, 8], "n_hole": 2, "hole": [2, 17], "cut": [2, 12, 80], "out": [2, 7, 8, 11, 12, 15, 16, 17, 20, 21, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 46, 49, 52, 57, 60, 62, 64, 65, 67, 70, 73, 74, 80, 84, 85, 87, 88, 91, 94, 100, 101, 102], "length": [2, 5, 7, 12, 17, 20, 21, 26, 27, 39, 57, 60, 73, 80, 84, 85, 87, 88, 94, 100, 102], 
"16": [2, 3, 5, 7, 8, 17, 21, 28, 31, 33, 46, 61, 62, 64, 65, 67, 70, 73, 76, 77, 80, 81, 82, 85, 87, 88, 94, 97, 100, 102], "torchvision_transform": [2, 8], "randomli": [2, 3, 5, 20, 25, 67, 69, 70, 76, 80, 81, 82, 85, 94, 97, 100, 102], "mask": [2, 11, 15, 17, 27, 67, 76, 88, 100, 102], "patch": [2, 5, 21, 84], "github": [2, 3, 7, 15, 17, 21, 27, 28, 33, 34, 49, 57, 73, 76, 77, 87, 88, 89, 94, 100, 101, 102], "com": [2, 3, 5, 7, 10, 12, 15, 16, 17, 21, 27, 28, 30, 33, 34, 43, 44, 45, 52, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "uoguelph": 2, "mlrg": 2, "arg": [2, 16, 25, 28, 33, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "int": [2, 3, 5, 8, 12, 16, 17, 18, 21, 25, 28, 33, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 85, 87, 88, 94, 97, 100, 102], "each": [2, 3, 5, 6, 8, 11, 12, 13, 15, 16, 17, 18, 19, 20, 21, 22, 23, 25, 26, 27, 28, 29, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 48, 51, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "pixel": [2, 4, 5, 8, 17, 21, 67, 69, 70, 73, 74, 80, 91], "squar": [2, 28, 57, 60, 61, 62, 67, 69, 70, 76, 80, 81, 82, 84, 94, 97, 100, 102], "__init__": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "self": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "__call__": [2, 57, 85, 88], "img": [2, 3, 5, 7, 15, 17, 18, 21, 43, 64, 65, 69, 70, 73, 77], "tensor": [2, 7, 8, 11, 12, 16, 21, 33, 43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "size": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 35, 36, 40, 43, 57, 
60, 61, 62, 64, 65, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "c": [2, 3, 5, 11, 12, 21, 25, 33, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 88, 91, 94, 97], "h": [2, 12, 17, 21, 57, 60, 62, 73, 76, 80, 82, 84, 85], "w": [2, 3, 4, 5, 8, 11, 12, 16, 21, 43, 57, 60, 61, 62, 65, 67, 70, 73, 74, 76, 80, 81, 82, 85, 89], "dimens": [2, 3, 12, 17, 28, 33, 35, 39, 43, 57, 62, 67, 69, 70, 73, 77, 80, 82, 84, 87, 89, 100, 102], "x": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101, 102], "ones": [2, 3, 12, 17, 20, 28, 31, 33, 35, 39, 57, 60, 65, 73, 76, 80, 81, 82, 84, 85, 88, 94, 97, 100], "float32": [2, 3, 17, 21, 25, 27, 28, 33, 57, 80], "n": [2, 3, 5, 7, 11, 12, 16, 17, 21, 27, 28, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "y": [2, 3, 7, 8, 11, 12, 17, 18, 20, 21, 25, 27, 28, 35, 36, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 80, 81, 82, 85, 87, 88, 91, 94, 100, 101, 102], "randint": [2, 5, 17, 25, 57, 62, 80, 85, 100, 102], "y1": [2, 21], "clip": [2, 12, 15, 27, 80, 82, 101], "y2": 2, "x1": [2, 20, 21, 57, 64, 94], "x2": [2, 21, 64, 94], "from_numpi": [2, 5, 12, 17, 20, 21, 57, 73, 76, 80], "expand_a": [2, 76], "combin": [2, 3, 17, 21, 25, 33, 36, 39, 40, 57, 60, 64, 65, 67, 70, 73, 76, 77, 80, 84, 85, 88, 89, 91, 94, 101, 102], "pair": [2, 3, 10, 11, 17, 67, 74, 77, 80, 84, 85, 87, 88, 89, 91, 100, 101, 102], "via": [2, 5, 21, 23, 31, 43, 60, 67, 70, 73, 74, 76, 81, 82, 85, 97, 100], "convex": 2, "given": [2, 5, 11, 15, 16, 17, 25, 27, 31, 33, 36, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97, 100, 101, 102], "x_i": [2, 18, 64, 74, 81, 89], "x_j": [2, 18, 65], "y_i": [2, 89], "y_j": 2, "respect": [2, 5, 15, 27, 28, 33, 36, 38, 51, 57, 60, 61, 62, 64, 67, 
69, 73, 76, 80, 85, 87, 94], "lambda": [2, 3, 11, 12, 21, 67, 69, 70, 74, 81, 82, 84, 85, 87, 88, 100, 102], "creat": [2, 5, 7, 11, 12, 15, 21, 23, 25, 27, 31, 33, 34, 35, 39, 40, 54, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91], "new": [2, 8, 10, 11, 12, 17, 21, 25, 26, 27, 28, 31, 33, 38, 43, 57, 62, 64, 67, 69, 70, 73, 77, 80, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101, 102], "hat": [2, 76], "begin": [2, 17, 28, 31, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 100, 102], "align": [2, 11, 23, 36, 39, 40, 43, 60, 62, 64, 65, 67, 73, 74, 81, 84, 85, 89], "end": [2, 3, 5, 11, 12, 17, 21, 23, 27, 28, 31, 33, 36, 39, 40, 43, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "check": [2, 3, 11, 12, 15, 16, 17, 23, 25, 28, 31, 33, 35, 36, 38, 39, 43, 46, 49, 57, 62, 64, 65, 67, 70, 73, 74, 77, 82, 88, 91, 94, 97, 100, 101, 102], "origin": [2, 5, 8, 11, 12, 16, 17, 27, 28, 33, 35, 36, 38, 40, 57, 60, 62, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 89, 91, 94, 100, 102], "repositori": [2, 17, 30, 31, 34, 43, 57, 100, 102], "mixup_data": 2, "mix": [2, 35, 39, 74, 76], "input": [2, 4, 5, 8, 11, 12, 17, 18, 20, 21, 27, 33, 35, 36, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "target": [2, 7, 11, 12, 16, 18, 21, 27, 28, 34, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 84, 85, 89, 100, 102], "hongyi": 2, "zhang": [2, 8, 73], "lam": [2, 20], "beta": [2, 15, 57, 67], "index": [2, 5, 12, 18, 21, 30, 33, 43, 62, 64, 65, 69, 70, 73, 74, 84, 85, 94], "randperm": [2, 64, 65], "mixed_x": 2, "y_a": [2, 80], "y_b": [2, 80], "small": [2, 8, 16, 17, 31, 33, 35, 39, 43, 57, 60, 62, 64, 65, 67, 69, 73, 74, 76, 77, 82, 84, 87, 88, 91, 94, 97, 100, 101], "tweak": [2, 8, 67], "ani": [2, 5, 8, 15, 21, 25, 31, 33, 35, 36, 37, 38, 39, 40, 54, 57, 60, 61, 62, 64, 67, 70, 73, 77, 
80, 81, 82, 84, 85, 87, 88, 91, 94, 97], "interest": [2, 3, 8, 11, 12, 15, 16, 23, 26, 27, 28, 31, 33, 35, 36, 38, 39, 40, 60, 62, 69, 70, 76, 80, 81, 82, 85, 94, 97, 100, 101], "download": [2, 5, 7, 8, 10, 11, 15, 31, 33, 36, 39, 43, 57, 69, 70, 87, 88, 89, 94], "prepar": [2, 8, 23, 28, 31, 57, 64, 73, 84, 87, 88, 100, 102], "percentagesplit": [2, 8], "full_dataset": [2, 3, 8], "percent": [2, 5, 8, 12, 76], "set1_siz": [2, 8], "len": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 21, 25, 27, 28, 33, 35, 39, 40, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 102], "set2_siz": [2, 8], "final_dataset": [2, 8], "_": [2, 3, 7, 8, 11, 15, 16, 17, 18, 21, 27, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 94, 100, 102], "util": [2, 3, 7, 8, 11, 12, 16, 17, 18, 21, 25, 27, 28, 33, 36, 57, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 88, 89, 100, 101, 102], "random_split": [2, 3, 8, 21, 67, 69, 70, 73, 87], "cifar100": [2, 8], "5071": [2, 8], "4866": [2, 8], "4409": [2, 8, 12], "std": [2, 5, 8, 35, 39, 43, 62, 65, 67, 69, 70, 76, 81, 82], "2673": [2, 8], "2564": [2, 8], "2762": [2, 8], "cifar10": [2, 8, 80], "4914": [2, 8], "4822": [2, 8], "4465": [2, 8], "2023": [2, 8, 81, 82, 88, 100, 102], "1994": [2, 8], "2010": [2, 8, 65], "transform_train": [2, 8], "compos": [2, 5, 7, 8, 16, 18, 43, 57, 65, 67, 69, 70, 73, 76, 77, 84, 85, 88, 101], "append": [2, 3, 5, 7, 8, 11, 12, 16, 17, 25, 28, 33, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 89, 94, 97, 100, 102], "randomcrop": [2, 8], "pad": [2, 3, 7, 8, 11, 12, 16, 17, 21, 25, 27, 57, 60, 61, 62, 76, 80, 82, 84, 85, 88, 94, 100, 102], "4": [2, 3, 5, 7, 8, 11, 16, 17, 18, 20, 21, 23, 25, 31, 34, 37, 38, 39, 40, 46, 48, 54, 81, 89, 102], "randomhorizontalflip": [2, 8, 18, 65, 70], "totensor": [2, 5, 7, 8, 16, 18, 43, 57, 65, 67, 69, 70, 73, 76, 77, 80, 82], "transform_test": [2, 8], "trainset": [2, 8], "root": [2, 3, 5, 8, 16, 28, 43, 
57, 67, 69, 70, 73, 84], "testset": [2, 8], "www": [2, 3, 5, 8, 15, 17, 21, 27, 30, 52, 57, 73, 100], "cs": [2, 8, 12, 57], "toronto": [2, 8, 57], "edu": [2, 8, 12, 30, 57], "kriz": [2, 8, 57], "tar": [2, 5, 8, 21, 57, 67, 73, 80, 84, 85, 100, 102], "gz": [2, 5, 8, 21, 57, 67, 73, 80, 82, 84, 85, 87, 89], "extract": [2, 5, 7, 8, 18, 21, 35, 36, 39, 57, 67, 73, 76, 82, 84, 85, 87, 89, 91, 97, 100], "file": [2, 3, 5, 7, 8, 11, 15, 16, 21, 43, 51, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "alreadi": [2, 3, 7, 8, 10, 12, 15, 21, 28, 31, 33, 35, 36, 40, 43, 57, 60, 61, 62, 67, 70, 73, 74, 76, 77, 80, 85, 88, 91, 94, 100, 101, 102], "verifi": [2, 8, 21, 43, 54, 57, 69, 73, 84, 94, 97], "50": [2, 3, 5, 7, 8, 11, 16, 17, 18, 25, 33, 35, 39, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 74, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "000": [2, 8, 15, 19, 57, 67, 73, 94], "colour": [2, 8, 57], "rgb": [2, 3, 8, 16, 18, 27, 57, 76, 77], "plane": [2, 8, 76, 80], "car": [2, 11, 76], "bird": [2, 57, 62, 76, 80], "cat": [2, 16, 21, 57, 64, 65, 69, 70, 76, 77, 81, 82, 87, 89, 91, 94], "deer": [2, 57], "dog": [2, 16, 43, 57, 69, 70, 76, 87, 88, 91], "frog": [2, 57, 76, 87], "hors": [2, 57, 76, 87], "ship": [2, 57, 76], "truck": [2, 57, 76], "store": [2, 5, 8, 15, 17, 21, 27, 28, 39, 43, 57, 60, 67, 69, 70, 73, 76, 84, 85, 97, 100, 102], "custom": [2, 8, 27, 28, 57, 60, 74, 80, 84, 85, 88, 94], "properti": [2, 8, 35, 38, 39, 43, 60, 64, 67, 74, 85, 88, 91], "uniqu": [2, 8, 17, 25, 31, 33, 36, 40, 87, 88], "50000": [2, 8, 27, 57, 67], "3": [2, 3, 5, 7, 8, 11, 15, 16, 17, 18, 19, 20, 21, 23, 25, 26, 31, 34, 39, 46, 48, 54, 89, 102], "10000": [2, 8, 27, 57, 62, 67, 69, 70, 80, 81, 84, 85], "choos": [2, 8, 16, 17, 20, 21, 25, 31, 33, 36, 37, 43, 57, 60, 65, 67, 69, 70, 73, 74, 80, 81, 82, 85, 88, 97, 100, 101, 102], "percentag": [2, 5, 8, 85], "whole": [2, 8, 12, 15, 17, 20, 31, 33, 36, 38, 39, 57, 60, 67, 69, 73, 76, 77, 85, 87, 91, 100, 102], "A": [2, 3, 5, 
8, 10, 12, 15, 16, 17, 18, 19, 21, 25, 26, 27, 33, 34, 35, 36, 37, 38, 39, 40, 43, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 101, 102], "iter": [2, 8, 11, 12, 17, 18, 20, 21, 23, 31, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 84, 85, 87, 88, 94, 100, 102], "effici": [2, 8, 19, 27, 60, 67, 69, 73, 84, 94, 101], "shuffl": [2, 5, 7, 8, 11, 12, 16, 21, 31, 33, 57, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 85, 87, 100, 102], "batch": [2, 3, 7, 8, 11, 12, 17, 21, 27, 33, 43, 57, 61, 62, 65, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "num_work": [2, 7, 8, 57, 64, 65, 67, 69, 70, 73, 76, 80, 82], "cpu_count": [2, 8], "worker": [2, 8, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "trainload": [2, 8], "testload": [2, 8], "To": [2, 3, 5, 7, 8, 10, 12, 15, 17, 21, 26, 27, 28, 31, 33, 34, 39, 40, 57, 60, 61, 62, 64, 67, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "correspond": [2, 3, 17, 18, 21, 25, 28, 31, 35, 39, 40, 43, 57, 61, 62, 64, 65, 67, 73, 76, 77, 82, 84, 85, 87, 94, 97, 100, 102], "flag": [2, 17, 27, 57], "section": [2, 3, 7, 12, 15, 27, 52, 85], "batch_x": 2, "batch_i": 2, "next": [2, 3, 11, 21, 28, 31, 33, 35, 39, 43, 46, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "plot_mixed_imag": 2, "inv_norm": 2, "m": [2, 3, 11, 12, 17, 21, 25, 27, 31, 35, 36, 39, 43, 61, 62, 64, 65, 69, 73, 74, 80, 84, 85, 87, 89], "zip": [2, 3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 40, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 88, 89, 97, 100, 102], "inv_pil": 2, "topilimag": 2, "fig": [2, 3, 7, 12, 16, 21, 28, 33, 61, 62, 64, 67, 69, 70, 73, 76, 80, 81, 94, 97], "figur": [2, 7, 8, 12, 15, 16, 17, 18, 21, 31, 34, 35, 36, 39, 40, 74, 77, 88, 91, 101], "figsiz": [2, 3, 5, 7, 12, 15, 16, 17, 18, 20, 21, 28, 33, 35, 39, 40, 60, 61, 62, 67, 69, 70, 73, 76, 77, 80, 81, 82, 87], "8": [2, 3, 5, 7, 8, 11, 12, 
17, 18, 20, 21, 27, 28, 34, 35, 39, 40, 60, 64, 65, 69, 70, 77, 80, 81, 82, 85, 87, 88, 102], "ax": [2, 3, 5, 15, 16, 18, 20, 21, 28, 35, 39, 40, 57, 60, 61, 62, 64, 67, 69, 70, 73, 76, 80, 81, 94, 97], "add_subplot": [2, 16, 61, 67], "inv_tensor": 2, "imshow": [2, 3, 5, 7, 15, 16, 17, 18, 20, 21, 28, 33, 57, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82], "9": [2, 3, 5, 8, 15, 17, 18, 21, 25, 28, 31, 34, 39, 40, 46, 60, 64, 65, 69, 70, 73, 80, 81, 82, 85, 87, 88, 97, 102], "famili": [2, 8], "whose": [2, 8, 67, 70, 84, 88], "main": [2, 8, 11, 15, 19, 25, 27, 28, 31, 33, 34, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 97, 100, 101, 102], "organis": [2, 8, 11, 16, 27], "stack": [2, 8, 12, 17, 33, 73, 76, 77, 88], "residu": [2, 8, 31, 80, 82], "block": [2, 8, 17, 43, 60, 70, 73, 76, 80, 84, 87], "consist": [2, 8, 11, 15, 17, 21, 27, 34, 36, 43, 57, 65, 67, 73, 77, 84, 87, 88, 89, 94, 101], "layer": [2, 5, 7, 12, 17, 18, 20, 27, 31, 57, 60, 61, 62, 64, 67, 69, 70, 73, 74, 80, 82, 84, 85, 87, 91, 94, 100], "output": [2, 5, 7, 8, 11, 12, 17, 18, 20, 21, 25, 27, 28, 33, 35, 36, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 100, 102], "ad": [2, 3, 8, 12, 17, 21, 26, 27, 33, 38, 57, 62, 69, 70, 76, 80, 81, 82, 84, 85, 88, 91, 94, 100], "shortcut": [2, 3, 8, 21, 43], "connect": [2, 7, 8, 12, 17, 54, 57, 60, 65, 69, 70, 76, 80, 81, 82, 94, 100, 102], "just": [2, 3, 5, 8, 11, 12, 20, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 40, 43, 46, 62, 64, 65, 67, 69, 73, 74, 76, 77, 80, 82, 88, 89, 91, 94, 97, 101], "popular": [2, 8, 27, 43, 60, 64, 76, 77, 82], "work": [2, 5, 8, 12, 16, 17, 19, 21, 23, 25, 26, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 46, 52, 54, 60, 61, 62, 64, 69, 70, 73, 74, 76, 77, 80, 82, 85, 87, 88, 89, 91, 97, 100, 101, 102], "gener": [2, 3, 8, 16, 17, 20, 21, 23, 25, 27, 28, 31, 33, 34, 36, 37, 38, 61, 67, 74, 76, 77, 81, 85, 87, 91, 100, 101], "pick": [2, 8, 11, 
31, 33, 39, 67, 70, 76, 80, 84, 85, 87, 88, 97, 100, 102], "illustr": [2, 8, 67, 69, 82, 84], "purpos": [2, 8, 15, 21, 27, 34, 39, 40, 60, 73, 80, 82, 84, 88, 94], "basicblock": [2, 8], "modul": [2, 3, 5, 7, 8, 11, 12, 16, 17, 20, 21, 25, 27, 33, 43, 44, 45, 54, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 81, 84, 85, 87, 88, 89, 94, 101], "kaim": [2, 8], "he": [2, 8, 11, 12, 45, 57, 74, 84, 85, 87, 91, 101], "xiangyu": [2, 8], "shaoq": [2, 8], "ren": [2, 8], "jian": [2, 8], "sun": [2, 8, 46, 76], "learn": [2, 10, 11, 12, 17, 19, 20, 21, 23, 25, 30, 31, 34, 35, 36, 37, 38, 39, 40, 43, 46, 52, 54, 60, 65, 67, 69, 73, 77, 80, 82, 87, 102, 103], "recognit": [2, 8, 16, 19, 70, 76, 80, 91], "arxiv": [2, 8, 15, 27, 77, 81, 84, 91, 101], "1512": [2, 8], "03385": [2, 8], "expans": [2, 8], "in_plan": [2, 8], "stride": [2, 3, 5, 7, 8, 16, 21, 33, 80, 82, 94, 100, 102], "super": [2, 3, 5, 7, 8, 11, 12, 16, 17, 20, 21, 27, 28, 33, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "conv1": [2, 7, 8, 18, 73, 76, 82, 100, 102], "conv2d": [2, 3, 5, 7, 8, 16, 17, 21, 73, 76, 80, 82, 94, 100, 102], "kernel_s": [2, 3, 5, 7, 8, 16, 17, 21, 33, 73, 76, 80, 82, 100, 102], "bia": [2, 8, 12, 20, 57, 60, 62, 64, 65, 67, 70, 73, 76, 80, 81, 82, 84, 87, 94], "bn1": [2, 8, 100, 102], "batchnorm2d": [2, 3, 7, 8, 16, 17, 21, 100, 102], "conv2": [2, 7, 8, 18, 73, 82, 100, 102], "bn2": [2, 8, 100, 102], "sequenti": [2, 3, 5, 8, 16, 17, 21, 25, 28, 33, 57, 60, 62, 64, 65, 67, 73, 81, 82, 84, 88, 94, 102], "forward": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 94, 100, 102], "relu": [2, 3, 7, 8, 11, 16, 17, 20, 21, 27, 33, 57, 67, 69, 70, 76, 80, 84, 87, 100, 102], "bottleneck": [2, 8, 76, 80], "conv3": [2, 7, 8, 18, 82, 100, 102], "bn3": [2, 8, 100, 102], "num_block": [2, 8], "num_class": [2, 8, 16, 33, 73, 76, 84, 87], "64": [2, 3, 5, 7, 8, 12, 16, 17, 27, 57, 65, 67, 69, 70, 73, 82, 85, 87, 100, 102], 
"layer1": [2, 8, 33], "_make_lay": [2, 8], "layer2": [2, 8, 33], "layer3": [2, 8], "256": [2, 3, 5, 8, 16, 17, 18, 28, 43, 57, 67, 70, 73, 76, 77, 80, 82], "layer4": [2, 8], "512": [2, 7, 8, 28, 60, 76, 77, 80, 84, 85, 88, 100, 102], "avg_pool2d": [2, 8], "view": [2, 3, 8, 11, 15, 25, 27, 35, 37, 39, 40, 43, 57, 64, 65, 67, 69, 70, 73, 76, 80, 84, 94, 100, 101, 102], "resnet18": [2, 8, 76], "resnet34": [2, 8], "resnet50": [2, 8], "load": [2, 3, 16, 21, 27, 31, 33, 36, 43, 60, 61, 62, 64, 65, 67, 76, 80, 81, 82, 88, 89], "net": [2, 4, 7, 8, 16, 18, 20, 21, 25, 43, 57, 64, 65, 69, 70, 73, 76, 80, 81, 84, 88, 100, 102], "randn": [2, 8, 20, 57, 60, 64, 65, 80, 81, 82], "result_fold": [2, 8], "path": [2, 3, 5, 7, 8, 15, 16, 17, 21, 27, 28, 57, 62, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 88, 94, 100, 102], "makedir": [2, 8, 27], "lognam": [2, 8], "__class__": [2, 8, 85], "__name__": [2, 8, 43, 57, 60, 73, 82, 85, 87, 100, 102], "dataparallel": [2, 3, 8], "device_count": [2, 8], "cross": [2, 8, 16, 21, 33, 39, 57, 61, 67, 70, 73, 82, 85, 100, 101], "entropi": [2, 8, 16, 21, 57, 67, 70, 80, 85, 87, 100], "commonli": [2, 8, 17, 43, 60, 67, 81, 91, 97], "stochast": [2, 8, 57, 60, 67, 81], "gradient": [2, 7, 8, 11, 12, 16, 17, 18, 21, 27, 28, 57, 62, 64, 65, 73, 74, 76, 80, 81, 84, 100, 101, 102], "descent": [2, 8, 17, 21, 27, 31, 57, 61, 62, 74, 84, 101], "sgd": [2, 5, 8, 11, 17, 18, 57, 60, 62, 65, 67, 69, 73, 81, 87, 100, 102], "momentum": [2, 3, 5, 8, 17, 18, 21, 60, 69, 70, 73, 101], "weight": [2, 3, 7, 8, 12, 16, 17, 18, 21, 27, 28, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 91, 94, 100, 102], "decai": [2, 8, 21, 70, 82], "criterion": [2, 7, 8, 12, 16, 17, 18, 21, 33, 38, 62, 64, 65, 69, 70, 73, 76, 87], "mixup_criterion": 2, "pred": [2, 12, 21, 67, 69, 70, 81, 85], "crossentropyloss": [2, 3, 5, 7, 8, 16, 17, 33, 57, 64, 65, 73, 76, 87], "onli": [2, 3, 8, 11, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 
54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 80, 81, 84, 85, 87, 88, 91, 97, 100, 101], "lr": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 100, 102], "weight_decai": [2, 8, 17, 21, 80, 85], "1e": [2, 3, 8, 17, 21, 28, 57, 61, 62, 64, 65, 67, 69, 70, 76, 80, 81, 82, 84, 94, 97, 102], "nepoch": [2, 8, 17, 21], "d": [2, 3, 5, 7, 8, 11, 17, 20, 21, 25, 33, 35, 36, 38, 39, 40, 57, 60, 61, 62, 64, 65, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 101], "train_loss": [2, 5, 7, 8, 12, 67, 69, 70, 73, 84, 87], "correct": [2, 3, 4, 5, 7, 8, 16, 17, 25, 33, 35, 39, 57, 62, 64, 65, 69, 70, 73, 76, 80, 88, 94, 100, 102], "total": [2, 3, 5, 7, 8, 12, 17, 21, 25, 27, 33, 35, 39, 57, 64, 65, 69, 70, 73, 76, 80, 94, 100, 102], "batch_idx": [2, 8, 16, 69, 70], "enumer": [2, 5, 8, 12, 16, 17, 18, 21, 28, 33, 61, 62, 64, 65, 67, 69, 70, 76, 84, 85, 87, 94, 97], "zero_grad": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 33, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 94, 100, 102], "two": [2, 10, 12, 15, 20, 21, 23, 25, 27, 31, 33, 35, 36, 39, 43, 52, 57, 61, 62, 65, 67, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101], "hot": [2, 12, 76, 88], "coeffici": [2, 18, 27, 67, 81, 82, 84], "targets_a": 2, "targets_b": 2, "loss_func": 2, "backward": [2, 3, 5, 7, 8, 11, 12, 16, 17, 18, 20, 21, 28, 33, 57, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 100, 102], "step": [2, 3, 5, 7, 8, 11, 16, 17, 18, 20, 21, 25, 27, 28, 31, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 101, 102], "item": [2, 3, 5, 7, 8, 12, 16, 17, 18, 20, 21, 28, 33, 43, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "predict": [2, 3, 5, 7, 8, 12, 16, 17, 19, 21, 27, 31, 33, 35, 36, 37, 38, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 81, 82, 87, 88, 94, 100, 101, 102], "max": [2, 5, 7, 8, 12, 16, 17, 18, 21, 28, 33, 34, 
35, 39, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 85, 94, 97, 100, 102], "eq": [2, 8, 69, 70, 76, 81, 82], "sum": [2, 3, 7, 8, 11, 12, 16, 17, 21, 28, 33, 35, 39, 40, 57, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 94, 97, 100, 102], "500": [2, 3, 5, 7, 8, 20, 28, 33, 64, 65, 67, 69, 70, 76, 81, 82, 87, 94], "3f": [2, 5, 8, 33, 61, 67, 81, 84, 87], "acc": [2, 8, 12, 16, 64, 65, 67, 69, 73, 76], "100": [2, 3, 5, 11, 12, 16, 17, 20, 21, 27, 33, 34, 35, 39, 40, 44, 45, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 85, 87, 88, 89, 94, 100, 102], "global": [2, 8, 17, 28, 57, 67, 84, 85, 88, 94], "eval": [2, 3, 7, 8, 12, 16, 17, 18, 21, 27, 33, 43, 65, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 100, 102], "test_loss": [2, 5, 8, 67, 69, 70, 84], "no_grad": [2, 5, 8, 16, 21, 33, 43, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 94, 100, 102], "volatil": 2, "save": [2, 8, 11, 12, 15, 16, 17, 21, 31, 43, 53, 57, 64, 65, 67, 69, 70, 81, 82, 84, 85, 87, 88, 100, 102], "adjust_learning_r": [2, 8], "decreas": [2, 8, 12, 20, 21, 26, 31, 33, 36, 57, 73, 82], "rate": [2, 7, 8, 17, 21, 27, 28, 31, 35, 36, 39, 60, 62, 67, 74, 76, 82, 84, 87, 97, 100, 102], "certain": [2, 8, 12, 21, 28, 33, 35, 36, 39, 43, 60, 64, 65, 67, 76, 81, 82, 84, 88, 94, 97], "state": [2, 5, 7, 8, 12, 16, 17, 20, 21, 25, 28, 37, 39, 40, 43, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 81, 82, 84, 87, 89, 94, 97, 100, 101, 102], "state_dict": [2, 8, 12, 16, 17, 18, 21, 76, 82, 100, 102], "rng_state": [2, 8], "get_rng_stat": [2, 8], "isdir": [2, 8], "mkdir": [2, 5, 7, 8, 16, 21, 57, 73, 100, 102], "ckpt": [2, 82], "t7": [2, 8], "150": [2, 8, 12, 28, 35, 39, 44, 45, 67, 70, 76, 77, 87, 88, 94], "warm": [2, 8], "larg": [2, 5, 7, 8, 12, 20, 21, 31, 57, 62, 65, 69, 70, 73, 74, 76, 77, 82, 84, 85, 87, 94, 101], "minibatch": [2, 8, 11, 33, 57, 73], "param_group": [2, 8, 17], "open": [2, 3, 5, 7, 8, 10, 11, 15, 16, 17, 18, 21, 27, 31, 38, 43, 46, 48, 51, 53, 57, 65, 67, 69, 70, 73, 76, 77, 
80, 82, 84, 85, 87, 89, 100, 102], "logfil": [2, 8], "logwrit": [2, 8], "writer": [2, 8, 16, 31, 34, 69, 70, 81], "delimit": [2, 8], "writerow": [2, 8], "train_acc": [2, 7, 8, 12, 16, 64, 65, 67, 69, 70, 73, 87, 94], "test_acc": [2, 8, 16, 64, 65, 94], "391": [2, 8, 76], "443": [2, 76, 82], "938": [2, 8, 73, 76, 94], "14": [2, 3, 5, 8, 23, 27, 28, 33, 36, 40, 61, 62, 67, 69, 70, 73, 76, 82, 84, 85, 87, 88, 94, 97, 100], "79": [2, 8, 12, 21, 33, 76, 100], "531": [2, 8, 76], "46": [2, 8, 27, 65, 76, 80, 84, 85, 87, 94, 100], "094": 2, "59": [2, 8, 27, 65, 76, 77, 87, 94, 100], "31": [2, 8, 27, 28, 40, 57, 64, 65, 67, 76, 77, 81, 82, 84, 85, 97, 100], "604000091552734": 2, "44": [2, 57, 65, 76, 77, 85, 94, 100], "2599983215332": 2, "619": [2, 8, 76], "39": [2, 5, 8, 27, 33, 73, 76, 82, 84, 85, 94, 100], "844": [2, 8, 73, 76], "51": [2, 5, 8, 27, 28, 33, 61, 69, 70, 76, 85, 87, 94], "199": [2, 76], "60": [2, 8, 16, 27, 28, 31, 35, 39, 67, 73, 76, 94, 100], "156": [2, 76], "77": [2, 8, 12, 16, 21, 27, 33, 39, 76, 80, 82, 84], "47": [2, 27, 33, 65, 69, 76, 84, 85, 87, 94, 100], "03200149536133": 2, "54": [2, 8, 65, 69, 76, 94, 100], "41999816894531": 2, "301": [2, 39, 76], "53": [2, 8, 12, 33, 69, 76, 84, 94], "906": [2, 73, 76], "69": [2, 8, 21, 33, 65, 67, 73, 76, 84], "013": [2, 8], "61": [2, 5, 12, 16, 21, 27, 28, 40, 61, 62, 76, 94], "719": [2, 76], "56": [2, 8, 25, 28, 33, 39, 65, 69, 76, 84, 94], "257999420166016": 2, "62": [2, 5, 8, 27, 33, 65, 73, 76, 87, 94], "599998474121094": 2, "036": 2, "062": 2, "82": [2, 8, 12, 16, 21, 27, 57, 73, 76, 100], "909": [2, 76], "89": [2, 8, 21, 33, 73, 76], "43199920654297": 2, "65": [2, 5, 8, 21, 27, 28, 33, 39, 61, 67, 76, 84, 85, 88, 94, 100], "6500015258789": 2, "839": [2, 5, 73, 76], "68": [2, 8, 17, 33, 67, 76, 84], "750": [2, 8, 39, 76], "88": [2, 21, 27, 33, 61, 65, 73, 76, 84], "859": [2, 76], "70": [2, 3, 5, 8, 21, 28, 33, 67, 73, 76, 84, 94, 100], "312": [2, 8, 76], "90": [2, 8, 21, 33, 67, 73, 76, 77, 84, 94, 100, 
102], "67": [2, 12, 21, 27, 33, 67, 76, 84, 85, 94], "08999633789062": [2, 8], "5": [2, 3, 5, 7, 8, 11, 12, 15, 16, 17, 18, 20, 21, 23, 25, 28, 31, 35, 39, 40, 46, 48, 69, 77, 81, 82, 85, 102], "922": [2, 76], "83": [2, 8, 12, 21, 33, 73, 76, 94, 100], "660": [2, 76], "76": [2, 8, 16, 21, 27, 33, 39, 76], "562": [2, 8, 76], "98": [2, 5, 8, 16, 21, 27, 76, 77, 94], "1259994506836": 2, "72": [2, 8, 16, 21, 27, 33, 39, 65, 67, 76, 84, 94], "52999877929688": 2, "833": [2, 76], "625": [2, 76], "84": [2, 5, 12, 21, 33, 76], "616": [2, 8, 76], "78": [2, 8, 15, 33, 39, 76], "125": [2, 8, 76], "73": [2, 8, 16, 21, 28, 33, 61, 67, 76], "45999908447266": 2, "686": [2, 76], "75": [2, 8, 16, 17, 21, 33, 36, 73, 76, 87, 94, 100], "96": [2, 5, 21, 28, 76], "533": [2, 12, 76], "81": [2, 8, 12, 21, 25, 27, 33, 76, 94, 100], "250": [2, 8, 17, 20, 61, 62, 67, 70, 76, 80, 82], "104": [2, 8, 21, 28, 76], "99600219726562": [2, 8], "91000366210938": 2, "626": [2, 8, 76], "458": [2, 12, 76], "031": 2, "105": [2, 61, 76], "42400360107422": 2, "11000061035156": 2, "465": [2, 76], "85": [2, 8, 12, 21, 73, 76, 84, 100, 102], "110": [2, 21, 28, 76], "87": [2, 21, 33, 73, 76, 84], "112": [2, 21, 76], "80": [2, 3, 8, 11, 12, 21, 27, 28, 33, 64, 65, 67, 76, 87, 94, 100, 102], "72599792480469": 2, "37000274658203": 2, "509": [2, 76], "523": [2, 76, 80], "688": [2, 8, 76], "102": [2, 8, 12, 15, 76], "16400146484375": 2, "25": [2, 3, 7, 12, 17, 21, 25, 28, 46, 61, 62, 65, 67, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 100], "11": [2, 3, 5, 7, 8, 16, 25, 28, 33, 46, 64, 67, 73, 76, 82, 84, 85, 87, 97, 100], "423": [2, 76], "610": [2, 76], "96199798583984": 2, "68000030517578": 2, "12": [2, 3, 5, 8, 12, 15, 16, 17, 18, 20, 21, 25, 27, 28, 33, 39, 40, 46, 60, 62, 67, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100], "221": [2, 76], "115": [2, 76], "467": [2, 76], "812": [2, 8, 73, 76], "106": [2, 8, 76], "61799621582031": 2, "88999938964844": 2, "13": [2, 3, 5, 8, 12, 27, 28, 33, 34, 67, 70, 73, 
76, 80, 82, 84, 85, 87, 94, 97, 100], "427": [2, 76], "522": [2, 76, 80], "21199798583984": 2, "54000091552734": [2, 8], "216": [2, 76, 94], "93": [2, 5, 8, 15, 28, 73, 76, 84], "120": [2, 5, 18, 76], "386": [2, 76], "86": [2, 21, 73, 76], "111": [2, 21, 76], "08000183105469": 2, "44999694824219": [2, 8], "read_csv": [2, 8, 12, 57], "resnet_": [2, 8], "sep": [2, 8, 84, 88], "head": [2, 8, 12, 25, 33, 36, 40, 43, 67, 73], "932130": 2, "604000": 2, "535233": 2, "259998": 2, "446863": 2, "032001": 2, "262779": 2, "419998": 2, "212518": 2, "257999": 2, "069593": 2, "599998": 2, "051850": 2, "431999": 2, "996476": 2, "650002": 2, "928131": 2, "000000": [2, 73], "898354": 2, "089996": [2, 8], "train_accuraci": [2, 8, 12, 67], "test_accuraci": [2, 8, 67], "averag": [2, 8, 12, 15, 16, 17, 31, 35, 36, 39, 40, 57, 61, 64, 67, 73, 76, 77, 80, 82, 84, 87, 94, 100, 102], "over": [2, 3, 7, 8, 11, 16, 17, 18, 21, 23, 27, 28, 33, 36, 39, 40, 43, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 87, 88, 91, 94, 97, 100, 101, 102], "accuracci": [2, 8], "figurenam": [2, 8], "withmixup": 2, "name": [2, 3, 8, 11, 12, 15, 16, 18, 21, 23, 25, 27, 28, 31, 33, 36, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "xlabel": [2, 5, 8, 11, 15, 16, 17, 20, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 87], "ylabel": [2, 5, 8, 11, 15, 16, 17, 18, 20, 25, 27, 28, 33, 35, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 76, 77, 80, 81, 87], "curv": [2, 8, 35, 39, 61, 62, 64, 73, 80, 94], "savefig": [2, 8, 57, 73, 87], "png": [2, 3, 5, 7, 8, 21, 57, 73, 76, 82, 87, 94], "legend": [2, 5, 7, 8, 12, 16, 18, 20, 21, 34, 35, 39, 40, 57, 60, 61, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 87, 94], "jan": [3, 5], "funk": 3, "between": [3, 7, 10, 15, 16, 17, 20, 27, 31, 33, 35, 36, 39, 40, 57, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 97, 100], 
"contain": [3, 5, 8, 15, 16, 19, 21, 25, 27, 28, 31, 33, 34, 35, 36, 39, 40, 43, 54, 57, 60, 62, 67, 73, 76, 77, 84, 85, 87, 89, 91, 94, 97, 100], "everyth": [3, 35, 57, 60, 61, 64, 73, 74, 88, 91], "vgg": 3, "electron": 3, "microscopi": 3, "drosophila": 3, "synaps": 3, "those": [3, 5, 8, 12, 23, 25, 31, 33, 34, 35, 36, 38, 39, 40, 57, 67, 73, 74, 80, 81, 87, 88, 91, 94, 100, 101], "accord": [3, 21, 39, 43, 57, 64, 65, 74, 80, 81, 84, 85, 94, 97, 100, 102], "neurotransmitt": 3, "thei": [3, 5, 11, 12, 20, 23, 26, 27, 31, 33, 35, 36, 37, 38, 39, 40, 43, 48, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101], "releas": [3, 5], "scikit": [3, 5, 39, 57, 87], "pillow": [3, 16, 18, 43, 73, 76, 77, 80, 81, 82], "glob": [3, 7, 8, 15, 21, 27, 77], "json": [3, 5, 7, 21, 43, 57, 84, 88], "tqdm": [3, 7, 11, 12, 16, 17, 21, 39, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 88, 89, 100, 102], "skimag": [3, 5], "io": [3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 27, 30, 31, 33, 34, 36, 39, 43, 46, 49, 52, 57, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89, 94, 100, 102], "imread": [3, 5, 7, 17, 21, 57], "imagefold": [3, 7, 16, 65, 69, 70, 76, 77], "sampler": [3, 64, 65, 94], "weightedrandomsampl": 3, "inlin": [3, 25, 28, 69, 70, 73, 82, 87, 89, 94], "improv": [3, 5, 12, 16, 17, 21, 27, 38, 67, 73, 80, 82, 84, 88, 94, 97, 100, 101], "classif": [3, 4, 8, 10, 12, 18, 33, 35, 36, 39, 57, 77, 87, 91], "On": [3, 39, 43, 46, 57, 61, 62, 65, 67, 81, 85], "valid": [3, 5, 7, 11, 12, 16, 17, 18, 21, 31, 39, 64, 67, 70, 73, 80, 85, 87, 88, 97, 100, 102], "around": [3, 5, 16, 21, 23, 27, 31, 33, 35, 36, 39, 57, 67, 73, 76, 77, 80, 81, 82, 85, 87], "try": [3, 4, 5, 7, 10, 11, 12, 16, 17, 21, 23, 26, 27, 31, 33, 36, 40, 54, 57, 61, 62, 65, 67, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101], "easi": [3, 21, 35, 38, 43, 57, 60, 61, 67, 74, 85, 88, 91, 101], "augment": [3, 8, 21, 62, 65, 91, 94, 101], "quit": [3, 26, 39, 43, 
57, 73, 74, 77, 80, 91, 94], "enlarg": 3, "avail": [3, 12, 19, 21, 23, 25, 27, 31, 33, 37, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 100, 102], "transpos": [3, 17, 18, 21, 57, 62, 64, 65, 69, 70, 73, 80, 84, 89], "mirror": [3, 76, 80, 100, 102], "add": [3, 7, 11, 12, 16, 17, 20, 21, 27, 28, 37, 38, 40, 43, 48, 57, 60, 62, 65, 67, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 97, 100, 102], "nois": [3, 17, 20, 35, 36, 39, 40, 57, 60, 61, 62, 67, 69, 70, 76, 80, 81, 82, 85, 91], "intens": [3, 17, 57], "etc": [3, 5, 8, 19, 27, 28, 31, 33, 34, 37, 38, 40, 43, 52, 53, 57, 60, 64, 67, 70, 76, 85, 91], "architectur": [3, 11, 12, 19, 27, 31, 43, 57, 60, 65, 67, 69, 70, 73, 74, 76, 77, 94, 101, 102], "few": [3, 31, 33, 39, 40, 57, 60, 70, 73, 76, 80, 82, 85, 88, 89, 91, 97, 100, 101], "tune": [3, 27, 31, 57, 61, 62, 67, 69, 87, 91], "take": [3, 5, 10, 11, 12, 16, 17, 21, 23, 25, 27, 31, 33, 35, 36, 37, 39, 40, 43, 53, 57, 60, 62, 64, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "random": [3, 5, 7, 11, 16, 17, 20, 21, 27, 35, 39, 40, 57, 74, 97], "sampl": [3, 7, 11, 12, 16, 25, 27, 33, 36, 43, 61, 64, 65, 70, 76, 84, 85, 87, 94, 101, 102], "test": [3, 5, 7, 11, 12, 19, 28, 31, 33, 34, 35, 36, 61, 62, 64, 65, 67, 70, 76, 80, 84, 85, 87, 94, 100, 101], "them": [3, 5, 8, 12, 20, 23, 26, 27, 28, 31, 33, 35, 36, 37, 38, 39, 43, 48, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "togeth": [3, 4, 31, 33, 34, 37, 38, 39, 40, 46, 57, 61, 64, 65, 70, 74, 77, 84, 88, 89, 101], "actual": [3, 5, 21, 25, 31, 33, 35, 36, 37, 39, 40, 57, 60, 64, 67, 73, 74, 76, 80, 81, 82, 84, 85, 88, 89, 91, 94, 100, 101], "medium": [3, 21, 27, 85], "g": [3, 4, 5, 7, 11, 15, 21, 25, 27, 33, 34, 35, 36, 37, 38, 39, 40, 53, 57, 60, 65, 67, 70, 73, 74, 77, 81, 82, 84, 85, 88, 91, 94, 97, 101], "resnet": [3, 77, 82], "error": [3, 11, 17, 31, 33, 40, 54, 57, 
60, 61, 62, 65, 69, 70, 73, 74, 76, 81, 82, 84, 85, 87, 88, 89, 97, 100, 102], "most": [3, 8, 11, 12, 25, 28, 31, 33, 34, 35, 36, 37, 38, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 101, 102], "accur": [3, 16, 17, 33, 36, 69, 74, 76, 87, 88, 91], "confus": [3, 33, 73, 100], "explor": [3, 5, 8, 16, 17, 21, 28, 31, 39, 61, 62, 67, 70, 80, 81, 85, 88, 89, 100, 102], "gaba": 3, "glutam": 3, "revers": [3, 11, 12, 28, 57, 82, 85, 97, 100, 102], "direct": [3, 17, 21, 23, 25, 27, 28, 33, 34, 39, 40, 60, 62, 64, 67, 80, 81, 88, 97, 100], "where": [3, 4, 5, 8, 11, 12, 17, 18, 19, 21, 27, 30, 31, 33, 35, 36, 37, 38, 39, 40, 43, 53, 54, 57, 60, 62, 64, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 94, 97, 100, 101, 102], "least": [3, 5, 31, 40, 52, 70, 74, 94, 100], "obviou": [3, 31], "hard": [3, 16, 17, 21, 28, 31, 35, 38, 39, 67, 73, 76, 94, 101], "watch": [3, 19, 31, 33, 74, 76, 80, 91, 101], "bore": [3, 31], "find": [3, 7, 11, 12, 15, 21, 23, 25, 26, 27, 31, 33, 34, 38, 39, 40, 43, 49, 60, 61, 62, 64, 65, 67, 70, 73, 74, 80, 82, 85, 87, 89, 91, 97, 100, 101, 102], "period": [3, 27, 28, 36, 38, 39, 57, 64, 81, 82, 101], "current": [3, 16, 21, 25, 27, 28, 31, 38, 40, 43, 57, 60, 61, 64, 65, 67, 69, 70, 76, 82, 84, 85, 88, 94, 97, 100, 101, 102], "hint": [3, 5, 36, 57, 60, 65, 69, 73, 74, 76, 77, 88, 91, 94, 97, 100, 101, 102], "cycle_gan": 3, "visual": [3, 7, 17, 19, 21, 27, 28, 31, 33, 34, 35, 39, 40, 61, 64, 65, 67, 76, 80, 82, 84, 85, 91, 100], "might": [3, 5, 19, 20, 21, 27, 31, 33, 35, 37, 38, 39, 40, 57, 61, 67, 69, 70, 73, 74, 76, 80, 82, 84, 85, 88, 89, 91, 94, 100, 101, 102], "help": [3, 5, 11, 12, 16, 17, 26, 27, 31, 34, 35, 36, 37, 43, 48, 49, 57, 60, 61, 62, 67, 74, 77, 82, 84, 85, 88, 91, 94, 97, 100, 101], "look": [3, 5, 11, 12, 19, 20, 21, 25, 27, 31, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 69, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 89, 91, 94, 100, 101], "organ": [3, 11, 23, 31, 39, 57, 69, 76], 
"raw": [3, 15, 28, 33, 35, 36, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 89, 94], "copi": [3, 5, 7, 17, 18, 21, 25, 28, 31, 33, 39, 43, 53, 57, 67, 69, 70, 73, 85, 94, 97, 100, 102], "directori": [3, 7, 15, 16, 21, 27, 28, 67, 77, 82, 100, 102], "structur": [3, 5, 17, 21, 27, 31, 34, 35, 36, 46, 57, 60, 62, 67, 69, 70, 76, 80, 82, 84, 87, 94, 101], "adjust": [3, 20, 28, 61, 67, 73, 76, 87], "128x128": [3, 76], "channel": [3, 5, 16, 17, 21, 27, 31, 33, 36, 57, 65, 67, 73, 76, 77, 80, 82], "written": [3, 5, 7, 11, 17, 57, 62], "nil": 3, "eckstein": 3, "modifi": [3, 4, 17, 21, 27, 69, 70, 73, 77, 80, 85, 94], "implement": [3, 5, 12, 17, 21, 23, 28, 34, 37, 43, 57, 60, 62, 65, 69, 76, 80, 82, 85, 87, 94, 101, 102], "six": [3, 33, 76, 88], "acethylcholin": 3, "octopamin": 3, "serotonin": 3, "dopamin": 3, "request": [3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 33, 36, 43, 52, 57, 65, 67, 69, 70, 73, 76, 77, 80, 81, 84, 85, 87, 88, 89, 94, 100, 102], "zipfil": [3, 7, 11, 12, 15, 16, 65, 69, 70, 73, 76, 77, 80, 82, 84, 87, 89, 94, 100, 102], "fname": [3, 7, 15, 16, 17, 18, 21, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89], "url": [3, 5, 7, 12, 15, 16, 17, 18, 21, 27, 34, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "dropbox": 3, "sh": 3, "ucpjfd3omjieu80": 3, "aaavzynltzvhyfx7_jwvhuk2a": 3, "downlad": 3, "r": [3, 5, 7, 11, 12, 15, 16, 17, 18, 21, 27, 33, 35, 36, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 102], "allow_redirect": [3, 15, 18, 21, 65, 67, 69, 70, 73, 80, 84, 85], "stream": [3, 5, 25, 76, 77, 85, 87, 88, 89, 91], "wb": [3, 5, 7, 15, 16, 17, 18, 21, 65, 67, 69, 70, 73, 76, 80, 84, 85, 87, 89], "fh": [3, 15, 18, 65, 67, 69, 70, 73, 80], "write": [3, 5, 7, 15, 16, 17, 18, 21, 23, 27, 31, 33, 35, 36, 37, 38, 39, 40, 43, 46, 51, 53, 54, 57, 62, 64, 65, 67, 69, 70, 74, 76, 80, 84, 85, 87, 
88, 89, 91, 97, 101], "cmplete": 3, "unzip": [3, 21, 65, 69, 70, 94, 100, 102], "specifi": [3, 21, 27, 28, 31, 33, 36, 37, 38, 39, 43, 57, 61, 62, 67, 69, 70, 82, 84, 85, 87, 88, 94, 97, 100, 102], "mode": [3, 17, 18, 21, 27, 28, 31, 33, 43, 60, 62, 64, 65, 73, 82, 87, 88, 89, 94, 97], "zf": [3, 88], "all": [3, 5, 7, 8, 11, 12, 15, 17, 18, 21, 23, 25, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 57, 60, 61, 62, 64, 65, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 97, 100, 101, 102], "extractal": [3, 5, 7, 11, 12, 15, 16, 21, 65, 67, 69, 70, 73, 76, 77, 80, 84, 85, 87, 89, 94, 100, 102], "done": [3, 11, 17, 21, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 53, 57, 61, 62, 65, 73, 77, 80, 81, 87, 89, 91, 100, 102], "zh": 3, "narchiv": 3, "textract": 3, "order": [3, 10, 12, 15, 19, 26, 28, 31, 33, 34, 36, 39, 43, 51, 54, 57, 65, 67, 73, 77, 84, 87, 88, 89, 91, 94, 97, 100], "match": [3, 5, 8, 17, 18, 20, 21, 25, 31, 33, 36, 57, 65, 67, 73, 80, 82, 84, 85, 94], "pretrain": [3, 5, 8, 10, 12, 16, 18, 30, 43, 73, 80, 82, 84, 87, 88, 91, 94, 100, 101], "renam": [3, 21], "0_gaba": 3, "acetylcholin": 3, "1_acetylcholin": 3, "2_glutam": 3, "3_serotonin": 3, "4_octopamin": 3, "5_dopamin": 3, "remov": [3, 5, 7, 8, 12, 16, 17, 21, 27, 38, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 85, 87, 88, 89, 94, 100, 102], "archiv": [3, 5, 30, 73], "experi": [3, 11, 16, 17, 23, 26, 27, 28, 31, 35, 36, 37, 39, 40, 52, 54, 57, 61, 67, 80, 81, 82, 84, 85, 100, 101], "first": [3, 7, 8, 10, 11, 12, 15, 17, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 46, 54, 57, 60, 62, 64, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 89, 91, 94, 97, 100], "loader": [3, 15, 21, 33, 57, 64, 69, 73, 76, 80, 84, 88], "account": [3, 5, 7, 10, 11, 12, 25, 28, 39, 43, 54, 77, 80, 88, 102], "imbal": [3, 10], "dure": [3, 8, 10, 12, 15, 16, 17, 19, 21, 23, 27, 28, 31, 39, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "load_imag": 3, "filenam": [3, 16, 17, 
21, 57, 73, 76, 87, 94, 100, 102], "grescal": 3, "uint8": [3, 18, 21, 27, 73], "255": [3, 18, 21, 27, 67, 70, 73, 76], "astyp": [3, 5, 12, 17, 18, 21, 25, 27, 62, 73, 80, 100, 102], "split": [3, 5, 7, 11, 12, 15, 16, 17, 21, 23, 31, 33, 35, 36, 39, 43, 57, 64, 65, 67, 69, 70, 73, 76, 84, 85, 87, 88, 94], "num_imag": [3, 77, 94], "num_train": [3, 87], "num_valid": [3, 87], "num_test": 3, "fix": [3, 5, 25, 28, 33, 43, 57, 61, 62, 65, 67, 73, 76, 77, 81, 82, 84, 85, 91], "seed": [3, 16, 17, 20, 25, 27, 35, 39, 57], "train_dataset": [3, 7, 21, 57, 73, 85, 87], "validation_dataset": [3, 73, 85], "test_dataset": [3, 21, 73, 85, 87], "23061912": 3, "uniform": [3, 27, 28, 57, 61, 65, 76, 100, 102], "ys": [3, 67], "arrai": [3, 5, 12, 15, 17, 25, 27, 28, 31, 33, 35, 36, 39, 40, 57, 61, 62, 64, 65, 67, 70, 73, 76, 80, 81, 84, 85, 87, 89, 94, 97, 100, 102], "count": [3, 4, 11, 12, 16, 20, 21, 25, 35, 36, 39, 57, 64, 65, 74, 76, 88, 102], "bincount": 3, "label_weight": 3, "per": [3, 5, 16, 17, 21, 25, 31, 33, 35, 39, 57, 61, 62, 64, 65, 67, 69, 74, 76, 80, 84, 85, 88, 94, 100, 102], "t": [3, 5, 11, 12, 17, 20, 21, 25, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 97, 100, 101, 102], "tn": 3, "tweight": 3, "serv": [3, 35], "mini": [3, 76, 81, 82], "drop_last": [3, 12, 64, 65], "15855": 3, "30715862503942e": 3, "05": [3, 5, 17, 27, 57, 60, 61, 62, 67, 73, 82, 94, 100, 102], "4911": 3, "00020362451639177357": 3, "3550": 3, "00028169014084507044": 3, "2297": 3, "00043535045711797995": 3, "951": [3, 76], "0010515247108307045": 3, "4649": 3, "00021510002151000216": 3, "cell": [3, 4, 11, 12, 16, 20, 21, 25, 27, 28, 33, 36, 43, 51, 57, 60, 61, 64, 65, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "singl": [3, 5, 8, 20, 21, 25, 26, 28, 33, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 88, 100], "chosen": [3, 12, 27, 35, 67, 82, 84, 100, 102], 
"feel": [3, 12, 16, 28, 35, 39, 40, 57, 61, 64, 73, 80, 82, 84, 85, 88, 97, 100], "tell": [3, 11, 16, 31, 33, 34, 43, 64, 67, 69, 80, 82, 88], "show_batch": 3, "subplot": [3, 5, 7, 12, 15, 16, 17, 18, 20, 21, 28, 35, 39, 40, 60, 61, 62, 64, 67, 69, 73, 76, 80, 81, 94, 97], "sharei": [3, 18], "squeez": [3, 12, 16, 18, 21, 33, 57, 70, 73, 76, 80], "cmap": [3, 15, 18, 20, 21, 57, 61, 62, 64, 65, 67, 73, 80, 81], "grai": [3, 18, 21, 57, 73, 76, 80, 85, 88], "set_titl": [3, 7, 35, 36, 40, 60, 61, 62, 67, 69, 73, 76, 81, 94], "repeatedli": [3, 73], "break": [3, 5, 11, 21, 25, 35, 46, 60, 62, 67, 69, 70, 84, 88, 91, 100, 102], "vgg2d": 3, "input_s": [3, 5, 11, 12, 76], "fmap": 3, "downsample_factor": 3, "output_class": 3, "current_fmap": 3, "current_s": 3, "tupl": [3, 12, 17, 57, 60, 61, 67, 73, 80, 85, 97, 100, 102], "featur": [3, 7, 8, 12, 16, 17, 18, 21, 25, 27, 33, 36, 39, 43, 44, 45, 57, 60, 61, 62, 64, 67, 69, 70, 73, 80, 81, 82, 87], "inplac": [3, 7, 12, 16, 17, 21], "maxpool2d": [3, 5, 16, 17, 21, 73], "assert": [3, 5, 16, 21, 27, 57, 60, 61, 62, 67, 80, 82, 84, 85, 97, 100, 102], "downsampl": [3, 17, 21, 73, 82], "factor": [3, 21, 31, 62, 67, 82, 84, 91, 94, 97, 101], "4096": [3, 16, 57], "dropout": [3, 5, 7, 12, 16, 20, 33, 64, 84, 87, 100, 102], "reshap": [3, 11, 17, 21, 33, 35, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 81, 82, 84, 87, 94, 100, 102], "optimz": 3, "adam": [3, 7, 12, 16, 20, 28, 33, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 100, 102], "gpu": [3, 5, 7, 12, 16, 17, 20, 21, 28, 33, 43, 54, 97], "devic": [3, 7, 11, 12, 16, 17, 18, 20, 21, 33, 36, 57], "cpu": [3, 5, 7, 11, 12, 16, 17, 18, 20, 21, 28, 33, 43], "Will": [3, 28, 101], "mere": 3, "defin": [3, 5, 7, 11, 12, 21, 27, 28, 31, 35, 37, 40, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 84, 85, 87, 88, 89, 97, 101, 102], "conveni": [3, 43, 60, 61, 62, 67, 88, 94], "function": [3, 11, 12, 20, 25, 31, 37, 38, 40, 88, 97], "epoch_loss": [3, 17, 18, 21], "num_batch": [3, 67], "y_pred": [3, 
12, 64, 65], "l": [3, 5, 11, 12, 18, 62, 64, 67, 70, 73, 76, 82, 85, 91, 100, 102], "evalu": [3, 7, 10, 11, 12, 16, 17, 18, 27, 34, 35, 43, 61, 65, 69, 73, 80, 81, 85, 87, 88, 100], "logit": [3, 21, 64, 65, 76, 85], "prob": [3, 76, 80, 81, 84, 100, 102], "softmax": [3, 7, 11, 64, 67, 73, 76, 84, 100], "dim": [3, 11, 17, 21, 33, 57, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 102], "argmax": [3, 5, 11, 21, 43, 57, 64, 65, 67, 69, 70, 73, 76, 84, 85, 87, 88, 89, 97, 100, 102], "detach": [3, 7, 11, 16, 17, 18, 20, 60, 62, 65, 69, 70, 73, 77, 80, 87], "readi": [3, 11, 12, 28, 33, 36, 37, 43, 60, 88, 100, 101], "after": [3, 12, 15, 16, 17, 21, 23, 27, 31, 33, 35, 36, 38, 39, 43, 46, 54, 57, 60, 62, 64, 65, 67, 70, 74, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101], "roughli": [3, 31, 40, 80, 81], "onc": [3, 17, 25, 27, 28, 31, 33, 35, 36, 37, 38, 39, 40, 43, 54, 57, 61, 69, 73, 74, 80, 82, 85, 87, 91, 97, 101, 102], "report": [3, 27, 31, 69, 70, 82, 84, 85, 87, 88, 89], "train_from_scratch": 3, "num_epoch": [3, 5, 11, 16, 18, 33, 64, 65, 94], "yes_i_want_the_pretrained_model": 3, "wherea": [3, 69, 73], "scratch": [3, 27, 31, 33, 57, 61, 88, 91], "unceck": 3, "box": [3, 16, 27, 37, 38, 64, 67, 73, 76, 80, 94], "param": [3, 5, 8, 12, 15, 16, 40, 43, 57, 67, 69, 70, 76, 80, 81, 82, 84, 85, 87], "boolean": [3, 33, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 89, 94, 100, 102], "vgg_checkpoint": 3, "map_loc": [3, 17, 21, 67, 82, 100, 102], "load_state_dict": [3, 8, 16, 17, 18, 21, 67, 76, 82, 100, 102], "model_state_dict": 3, "8054750869061413": 3, "conclud": [3, 33, 34, 39, 40], "discrimin": [3, 39], "perfect": [3, 12, 33, 35, 36, 39, 40, 85], "pretti": [3, 21, 33, 73, 74, 91], "consid": [3, 12, 16, 27, 31, 33, 37, 60, 61, 65, 67, 73, 76, 84, 85, 87, 88, 91, 94], "furthermor": [3, 20, 31, 33, 36, 65, 73], "clear": [3, 16, 20, 25, 31, 33, 35, 57, 67, 69, 70, 73, 84, 91, 101], "doe": [3, 5, 7, 10, 12, 16, 19, 21, 25, 28, 31, 33, 35, 
36, 38, 39, 40, 43, 53, 57, 60, 61, 62, 67, 74, 77, 80, 85, 87, 88, 89, 91, 97, 100, 101, 102], "yourself": [3, 17, 31, 34, 57, 61], "betwe": [3, 85], "sai": [3, 31, 38, 46, 57, 61, 62, 74, 76, 80, 81, 89, 91], "gabaerg": 3, "glutamaterg": 3, "situat": [3, 12, 16, 73, 74, 77, 84, 91, 101], "someth": [3, 21, 31, 33, 34, 35, 38, 39, 40, 57, 70, 76, 80, 88, 91, 97, 100, 101, 102], "don": [3, 11, 12, 20, 25, 31, 33, 34, 35, 36, 37, 40, 43, 54, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 97, 101], "relev": [3, 12, 19, 21, 34, 39, 60, 73, 74, 84, 88, 91, 94, 101], "repo": [3, 27, 28, 77, 87, 89, 94, 100, 102], "funkei": 3, "neuromatch_xai": 3, "osf": [3, 5, 7, 16, 17, 18, 21, 33, 34, 36, 39, 57, 65, 67, 69, 70, 73, 76, 77, 80, 84, 87, 89, 94, 100, 102], "vutn5": 3, "z": [3, 11, 12, 15, 18, 20, 27, 36, 40, 57, 60, 61, 67, 73, 76, 77, 80, 81, 82, 85, 91], "bytesio": [3, 11, 12, 15, 33, 36, 43, 73, 76, 77, 94, 100, 102], "domin": [3, 74], "either": [3, 5, 16, 21, 23, 25, 31, 39, 43, 57, 70, 73, 74, 76, 85, 88, 91, 94, 100, 101, 102], "format": [3, 5, 8, 11, 12, 21, 25, 27, 31, 33, 36, 38, 43, 57, 61, 62, 67, 69, 73, 76, 82, 84, 85, 87, 100, 102], "happi": [3, 11], "afterward": [3, 73], "prepare_dataset": 3, "uncom": [3, 7, 16, 25, 27, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 94, 100, 102], "procedur": [3, 39, 40, 64, 81, 82, 91, 94, 101], "lot": [3, 12, 21, 27, 31, 33, 35, 37, 39, 43, 57, 70, 73, 74, 80, 84, 88, 89, 91], "longer": [3, 7, 33, 39, 80, 85, 88, 91], "abov": [3, 5, 7, 15, 17, 27, 28, 33, 35, 36, 39, 52, 57, 60, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 89, 91, 94, 97, 100, 101, 102], "continu": [3, 12, 23, 25, 27, 31, 33, 52, 57, 60, 61, 62, 64, 70, 73, 80, 81, 84, 85, 87, 88, 91, 97, 100, 101, 102], "interrupt": [3, 43, 62], "kernel": [3, 54, 57, 80, 82, 100], "b": [3, 12, 21, 25, 33, 34, 36, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 85, 87, 88, 89, 91, 94, 100, 
102], "data_dir": 3, "class_a": 3, "class_b": 3, "img_siz": 3, "checkpoints_dir": 3, "gaba_glutam": 3, "option": [3, 8, 10, 11, 17, 20, 23, 34, 35, 37, 39, 43, 54, 64, 67, 73, 80, 84, 85, 88, 89, 97, 100], "aspect_ratio": 3, "aux_checkpoint": 3, "default": [3, 17, 21, 25, 43, 57, 60, 61, 62, 64, 65, 67, 69, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "aux_downsample_factor": 3, "aux_input_nc": 3, "aux_input_s": 3, "aux_net": 3, "aux_output_class": 3, "crop_siz": 3, "dataroot": 3, "0_gaba_2_glutam": 3, "dataset_mod": 3, "atob": 3, "display_wins": 3, "latest": [3, 21], "gpu_id": 3, "init_gain": 3, "02": [3, 27, 33, 60, 61, 62, 73, 81, 94], "init_typ": 3, "input_nc": 3, "istrain": 3, "load_it": 3, "load_siz": 3, "max_dataset_s": 3, "inf": [3, 12, 20, 21, 28, 61, 67, 84, 97, 102], "model_suffix": 3, "_a": [3, 74], "n_layers_d": 3, "experiment_nam": 3, "ndf": 3, "netd": 3, "netg": 3, "resnet_9block": 3, "ngf": 3, "no_dropout": 3, "no_flip": 3, "norm": [3, 25, 35, 39, 40, 62, 64, 67, 77, 80, 84, 87, 89], "instanc": [3, 5, 8, 11, 12, 17, 26, 27, 28, 57, 60, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "ntest": [3, 33], "num_thread": 3, "output_nc": 3, "phase": [3, 18], "preprocess": [3, 5, 12, 15, 65, 67, 73, 76], "results_dir": [3, 15], "serial_batch": 3, "suffix": [3, 82], "verbos": [3, 12, 21, 27, 64, 65, 94, 100, 102], "singledataset": 3, "initi": [3, 4, 5, 11, 12, 17, 20, 21, 26, 27, 28, 31, 36, 43, 57, 60, 64, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 88, 94, 97, 100, 102], "testmodel": 3, "latest_net_g_a": 3, "pth": [3, 17, 21, 76, 82, 94, 100, 102], "resnetgener": 3, "reflectionpad2d": 3, "instancenorm2d": 3, "ep": [3, 17, 60, 64, 65, 80, 81, 82], "affin": [3, 15, 17, 62, 67, 82, 94], "track_running_stat": 3, "10": [3, 5, 7, 8, 10, 11, 12, 15, 16, 17, 18, 19, 20, 25, 27, 28, 31, 39, 40, 46, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 97, 100, 102], "resnetblock": 3, "conv_block": 3, "17": [3, 12, 21, 27, 28, 33, 46, 
62, 70, 73, 76, 81, 82, 84, 85, 87, 91, 94, 97, 100, 102], "18": [3, 5, 8, 12, 25, 27, 28, 33, 46, 70, 73, 76, 81, 82, 84, 85, 87, 94, 97, 100], "19": [3, 5, 12, 16, 27, 28, 33, 34, 46, 57, 62, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 97, 102], "convtranspose2d": [3, 21, 80, 82], "output_pad": [3, 82], "20": [3, 5, 7, 12, 17, 18, 19, 20, 27, 28, 31, 33, 40, 57, 60, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 94, 97, 100, 102], "21": [3, 8, 21, 27, 33, 40, 43, 57, 62, 70, 73, 76, 77, 81, 82, 84, 85, 87, 94, 97, 100, 102], "22": [3, 12, 21, 27, 28, 33, 46, 62, 65, 69, 70, 73, 76, 81, 82, 84, 85, 87, 94, 97, 100], "23": [3, 8, 25, 28, 33, 46, 57, 62, 65, 67, 70, 73, 76, 81, 82, 84, 85, 87, 94, 100, 102], "24": [3, 12, 21, 25, 27, 28, 33, 36, 46, 57, 62, 67, 70, 73, 76, 80, 81, 82, 85, 87, 94, 100], "26": [3, 5, 8, 17, 27, 33, 65, 67, 70, 73, 76, 81, 82, 84, 85, 88, 91, 97, 100], "27": [3, 5, 8, 27, 57, 60, 65, 67, 73, 76, 81, 82, 84, 85, 94, 97], "tanh": [3, 20, 27, 60, 80, 81, 100, 102], "g_a": 3, "366": [3, 76], "web": [3, 17, 76, 88, 97], "test_latest": 3, "process": [3, 11, 12, 16, 17, 19, 20, 23, 25, 26, 27, 31, 34, 35, 36, 37, 39, 40, 45, 46, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 82, 84, 89, 91, 94, 100, 101, 102], "0000": [3, 21, 57], "th": [3, 7, 17, 43, 62, 64, 67, 76, 80, 85], "traina": 3, "0_train": 3, "0005": [3, 7, 67], "10004_train": 3, "0010": [3, 67], "10009_train": 3, "0015": 3, "10013_train": 3, "0020": 3, "10018_train": 3, "0025": 3, "10022_train": 3, "0030": 3, "10027_train": 3, "0035": 3, "10031_train": 3, "0040": [3, 67], "10036_train": 3, "0045": [3, 67], "10040_train": 3, "0050": 3, "10045_train": 3, "0055": 3, "1004_train": 3, "0060": 3, "10054_train": 3, "0065": 3, "10059_train": 3, "0070": [3, 67], "10063_train": 3, "0075": 3, "10068_train": 3, "0080": [3, 67], "10072_train": 3, "0085": 3, "10077_train": 3, "0090": 3, "10081_train": 3, "0095": 3, "10086_train": 3, "0100": [3, 67], "10090_train": 3, "0105": 3, 
"10095_train": 3, "0110": [3, 67], "1009_train": 3, "0115": [3, 67], "10103_train": 3, "0120": 3, "10108_train": 3, "0125": 3, "10112_train": 3, "0130": 3, "10117_train": 3, "0135": 3, "10121_train": 3, "0140": 3, "10126_train": 3, "0145": [3, 67], "10130_train": 3, "0150": 3, "10135_train": 3, "0155": [3, 67], "1013_train": 3, "0160": 3, "10144_train": 3, "0165": 3, "10149_train": 3, "0170": 3, "10153_train": 3, "0175": [3, 67], "10158_train": 3, "0180": [3, 67], "10162_train": 3, "0185": 3, "10167_train": 3, "0190": 3, "10171_train": 3, "0195": [3, 57], "10176_train": 3, "0200": 3, "10180_train": 3, "0205": [3, 67], "10185_train": 3, "0210": 3, "1018_train": 3, "0215": 3, "10194_train": 3, "0220": [3, 67], "10199_train": 3, "0225": 3, "10202_train": 3, "0230": 3, "10207_train": 3, "0235": 3, "10211_train": 3, "0240": [3, 67], "10216_train": 3, "0245": 3, "10220_train": 3, "0250": 3, "10225_train": 3, "0255": [3, 67], "1022_train": 3, "0260": 3, "10234_train": 3, "0265": [3, 67], "10239_train": 3, "0270": [3, 67], "10243_train": 3, "0275": 3, "10248_train": 3, "0280": 3, "10252_train": 3, "0285": 3, "10257_train": 3, "0290": [3, 67], "10261_train": 3, "0295": [3, 67], "10266_train": 3, "0300": 3, "10270_train": 3, "0305": 3, "10275_train": 3, "0310": 3, "1027_train": 3, "0315": 3, "10284_train": 3, "0320": [3, 67], "10289_train": 3, "0325": 3, "10293_train": 3, "0330": 3, "10298_train": 3, "0335": [3, 67], "10301_train": 3, "0340": 3, "10306_train": 3, "0345": 3, "10310_train": 3, "0350": 3, "10315_train": 3, "0355": 3, "1031_train": 3, "0360": 3, "10324_train": 3, "0365": 3, "10329_train": 3, "0370": [3, 67], "10333_train": 3, "0375": 3, "10338_train": 3, "0380": 3, "10342_train": 3, "0385": 3, "10347_train": 3, "0390": 3, "10351_train": 3, "0395": 3, "10356_train": 3, "0400": 3, "10360_train": 3, "0405": 3, "10365_train": 3, "0410": 3, "1036_train": 3, "0415": 3, "10374_train": 3, "0420": 3, "10379_train": 3, "0425": 3, "10383_train": 3, "0430": 3, 
"10388_train": 3, "0435": 3, "10392_train": 3, "0440": [3, 67], "10397_train": 3, "0445": [3, 57], "10400_train": 3, "0450": 3, "10405_train": 3, "0455": 3, "1040_train": 3, "0460": 3, "10414_train": 3, "0465": 3, "10419_train": 3, "0470": 3, "10423_train": 3, "0475": 3, "10428_train": 3, "0480": 3, "10432_train": 3, "0485": 3, "10437_train": 3, "0490": 3, "10441_train": 3, "0495": 3, "10446_train": 3, "sort": [3, 11, 12, 15, 17, 23, 25, 31, 33, 70, 76, 84, 85, 94, 100, 102], "much": [3, 8, 12, 16, 17, 20, 21, 28, 31, 33, 34, 35, 36, 38, 39, 40, 60, 62, 67, 70, 74, 77, 80, 82, 85, 88, 91, 94, 97], "fool": [3, 40, 88], "class_a_index": 3, "class_b_index": 3, "result_dir": 3, "classification_result": 3, "basenam": 3, "replac": [3, 4, 5, 8, 17, 25, 27, 28, 33, 43, 57, 60, 62, 67, 73, 74, 76, 80, 84, 85, 88], "_aux": 3, "kei": [3, 5, 7, 8, 11, 12, 15, 16, 18, 28, 31, 33, 34, 39, 40, 43, 57, 60, 61, 65, 67, 69, 70, 76, 80, 81, 82, 85, 87, 91, 100, 101, 102], "aux_real": 3, "aux_fak": 3, "top": [3, 5, 20, 27, 33, 35, 39, 43, 53, 54, 57, 60, 62, 64, 67, 73, 74, 76, 80, 84, 87, 88, 91, 94, 97, 100, 102], "real": [3, 5, 10, 20, 25, 31, 33, 35, 36, 39, 43, 57, 62, 67, 69, 73, 74, 77, 81, 84, 87, 88, 91, 94, 100, 101], "fake": [3, 10, 25, 80], "mind": [3, 5, 16, 21, 35, 37, 80], "show_pair": 3, "score_a": 3, "score_b": 3, "p": [3, 5, 7, 8, 11, 12, 15, 18, 21, 27, 33, 34, 36, 65, 67, 69, 70, 73, 74, 76, 80, 81, 84, 85, 88, 91, 97, 100, 101, 102], "str": [3, 15, 21, 25, 27, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "success": [3, 11, 61, 67, 69, 70, 74, 76, 82, 97, 101], "real_a": 3, "_real": 3, "fake_b": 3, "_fake": 3, "segment": [4, 5, 33, 64], "approach": [4, 5, 10, 11, 12, 17, 18, 25, 27, 28, 33, 34, 35, 36, 37, 39, 40, 60, 62, 67, 73, 74, 91, 100, 102], "denois": [4, 82], "noise2void": 4, "u": [4, 12, 15, 21, 25, 40, 57, 60, 62, 65, 67, 76, 80, 85, 88, 89, 102], "thing": [4, 
11, 12, 19, 20, 23, 31, 35, 43, 60, 62, 64, 74, 76, 80, 91, 101], "privaci": 4, "essenti": [4, 27, 33, 34, 35, 39, 40, 60, 80, 84, 97, 100], "particularli": [4, 27, 76, 81, 88, 101, 102], "feder": 4, "environ": [4, 26, 51, 53, 54, 57, 80, 97, 100, 101], "client": [4, 43], "train": [4, 5, 10, 12, 15, 21, 25, 26, 27, 31, 33, 34, 35, 36, 65, 74, 80, 81, 87, 88, 91, 97, 101], "model": [4, 5, 7, 10, 11, 12, 15, 18, 19, 20, 26, 27, 28, 46, 57, 60, 61, 62, 69, 74, 77, 87, 89, 97, 100, 101, 102], "without": [4, 16, 17, 25, 27, 31, 36, 38, 57, 61, 62, 64, 65, 67, 84, 85, 88, 89, 97, 101], "share": [4, 12, 21, 31, 46, 57, 67, 73, 80, 84, 85, 94], "privat": 4, "adopt": [4, 73, 77], "method": [4, 10, 20, 27, 31, 34, 35, 38, 43, 60, 62, 64, 65, 69, 70, 73, 76, 85, 88, 97, 100, 101], "hide": [4, 81], "person": [4, 11, 23, 31, 53, 74, 77, 88, 91, 94], "while": [4, 5, 11, 12, 16, 19, 25, 27, 31, 33, 35, 39, 43, 51, 57, 60, 62, 67, 70, 73, 74, 76, 77, 80, 81, 85, 88, 94, 97, 100, 101], "collabor": [4, 44, 45, 53, 91], "updat": [4, 7, 8, 12, 17, 21, 25, 28, 31, 46, 60, 61, 69, 70, 73, 80, 81, 82, 84, 85, 94, 97, 100, 101, 102], "attack": [4, 76, 85], "retriev": [4, 36, 43, 62, 76, 84, 94], "exact": [4, 21, 35, 39, 43, 73, 80, 81, 100], "simpli": [4, 5, 33, 36, 40, 60, 62, 65, 67, 80, 81, 85, 100], "zhu": [4, 27], "et": [4, 11, 16, 27, 35, 40, 60, 62, 76, 81, 84, 91], "al": [4, 16, 27, 35, 40, 62, 76, 81, 84, 91], "project": [4, 5, 7, 8, 10, 11, 15, 17, 19, 20, 21, 25, 28, 34, 49, 53, 54, 57, 61, 67, 77, 80, 84, 89, 101], "task": [4, 5, 10, 11, 12, 16, 19, 21, 57, 60, 61, 62, 67, 73, 76, 84, 85, 87, 88, 91, 97, 101], "reimplement": [4, 69, 85], "wise": [4, 67, 70, 73, 80, 84], "were": [4, 15, 31, 33, 34, 35, 39, 67, 69, 73, 74, 76, 77, 80, 81, 84, 85, 88, 91, 94, 97, 100, 101, 102], "post": [4, 31, 43, 46, 69, 70, 76, 82, 84, 85, 88], "contrast": [4, 5, 11, 16, 28, 39, 62, 67, 73, 76, 80, 91], "problem": [4, 11, 12, 17, 19, 25, 26, 27, 31, 33, 36, 37, 39, 43, 57, 60, 62, 65, 67, 69, 
70, 73, 74, 76, 77, 80, 88, 91, 101], "vanilla": [4, 60, 87], "vs": [4, 7, 10, 16, 21, 25, 36, 38, 39, 40, 43, 69, 74, 84, 85, 94, 100, 101, 102], "gnn": 4, "deepwalk": 4, "embed": [4, 10, 11, 12, 44, 45, 57, 84, 85, 88, 91, 94], "molecul": [4, 74], "cnn": [4, 33, 57, 67, 74, 80, 82, 100], "cluster": [4, 57, 77, 85, 87], "ref1": 4, "ref2": 4, "openneuro": 4, "kaggl": [4, 5, 7, 16, 30, 51, 57, 85, 87], "visualdata": [4, 30], "joe": 5, "donovan": 5, "nma": [5, 17, 20, 23, 25, 31, 33, 35, 36, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "daili": 5, "guid": [5, 23, 27, 34, 37, 38, 43, 50, 61, 62, 67, 70, 88, 101], "deeplearn": [5, 31, 46, 52], "project_guid": [5, 46], "overal": [5, 31, 36, 38, 39, 65, 67, 69, 74, 77, 81, 85, 88, 94], "goal": [5, 11, 18, 19, 25, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 40, 46, 57, 60, 62, 76, 84, 97, 100, 102], "about": [5, 7, 11, 12, 15, 19, 21, 23, 25, 26, 27, 31, 33, 34, 36, 37, 38, 39, 40, 43, 46, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 87, 88, 100, 101, 102], "potenti": [5, 8, 19, 23, 28, 35, 36, 39, 62, 64, 67, 74, 84, 85, 88, 91, 97, 100, 101], "larger": [5, 12, 17, 27, 40, 61, 62, 65, 67, 70, 73, 76, 82, 87, 88, 91, 100], "pretain": 5, "loss": [5, 7, 11, 12, 16, 17, 20, 21, 33, 57, 60, 62, 65, 70, 73, 74, 76, 77, 80, 84, 85, 87, 88, 102], "optim": [5, 7, 11, 12, 16, 17, 18, 20, 21, 25, 27, 28, 31, 33, 39, 46, 57, 60, 61, 62, 64, 65, 69, 70, 73, 74, 80, 81, 82, 84, 87, 91, 97, 100, 101, 102], "torchsummari": 5, "ndimag": [5, 17], "geometri": [5, 81], "point": [5, 7, 11, 12, 17, 21, 23, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 57, 60, 61, 62, 65, 67, 69, 70, 73, 74, 77, 80, 84, 85, 87, 94, 97, 100, 101], "polygon": 5, "summari": [5, 34], "rotat": [5, 17, 33, 36, 57, 62, 65, 67, 70, 73, 76, 77, 91, 94, 100, 102], "subimag": 5, "unpack_bbox": 5, "bbox": 5, "coco": [5, 21], "centerx": 5, "centeri": 5, "width": [5, 15, 21, 27, 28, 35, 39, 43, 44, 45, 57, 61, 62, 64, 73, 76, 
77, 80, 81, 94, 97], "height": [5, 21, 27, 28, 39, 44, 45, 57, 73, 76, 77, 80, 94, 97], "theta": [5, 17, 40, 81, 82], "radian": 5, "rot_cent": 5, "pi": [5, 17, 27, 60, 64, 65, 74, 81, 82, 94, 97, 100, 102], "rotcorners_from_coord": 5, "co": [5, 17, 27, 30, 60, 64, 65, 80, 81, 82, 84, 85, 88], "sin": [5, 17, 27, 60, 64, 65, 80, 81, 82, 84], "wvec": 5, "dot": [5, 39, 40, 57, 60, 61, 62, 77, 81, 87, 89, 94], "hvec": 5, "corner_point": 5, "rotbbox_from_coord": 5, "rot_bbox": 5, "min": [5, 16, 17, 18, 28, 31, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 85, 87, 91, 94, 97, 100, 101, 102], "constrain": [5, 37, 89], "insid": [5, 31, 37, 43, 73], "extract_subimg_bbox": 5, "im": [5, 7, 12, 16, 18, 21, 28, 80], "extract_subimg": 5, "subimg": 5, "rotated_im": 5, "degre": [5, 16, 65, 73, 74, 87, 94, 100, 102], "180": [5, 76], "newcent": 5, "drop": [5, 33, 36, 52, 54, 62, 67, 70, 76, 80, 94], "hardwar": [5, 20, 28, 57, 60, 61, 62, 64, 65, 76, 77, 100], "acceler": [5, 20, 28, 35, 36, 39, 40, 54, 57, 60, 61, 62, 64, 65, 74, 77, 82, 85, 88, 100], "dropdown": [5, 57, 60, 61, 62, 64, 65, 67, 80, 100], "disabl": [5, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 87, 89, 94, 100], "rcparam": [5, 20, 69, 70, 81], "gridspec": [5, 62], "plt_transform": 5, "font": [5, 20, 35, 36, 43, 87], "spine": [5, 20, 33, 35, 39, 73, 80], "right": [5, 7, 11, 15, 16, 19, 20, 26, 27, 31, 33, 34, 35, 36, 37, 38, 39, 43, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 84, 87, 94, 97, 101], "autolayout": [5, 20], "properli": [5, 35, 36, 80, 82, 84, 94, 97], "dataset": [5, 11, 16, 17, 21, 23, 30, 31, 33, 35, 36, 38, 39, 61, 62, 76, 81, 82, 89, 91, 100, 101, 102], "took": [5, 33, 36], "minut": [5, 16, 23, 28, 31, 33, 43, 67, 69, 73, 76, 81, 82, 87, 89, 100], "me": [5, 11, 12, 65, 88], "mvtec": 5, "compani": [5, 43, 82, 88], "tarfil": [5, 21, 67, 73, 76, 80, 84, 85], "ruca6": 5, "tarnam": 5, "mvtec_screws_v1": 5, "isfil": [5, 7, 8, 16, 17, 100, 102], "fd": [5, 73, 76, 80, 84, 
85], "complet": [5, 15, 16, 28, 31, 35, 37, 46, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 82, 84, 87, 88, 89, 91, 97, 100, 101, 102], "unpack": 5, "datafil": 5, "datapath": 5, "screwdata": 5, "folder": [5, 7, 16, 17, 21, 43, 57, 73, 76, 100, 102], "full": [5, 6, 11, 13, 15, 22, 23, 27, 28, 29, 31, 33, 36, 39, 40, 61, 62, 85, 87, 101], "listdir": [5, 7, 16, 73, 76, 100, 102], "mvtec_screws_train": 5, "mvtec_screw": 5, "hdict": 5, "mvtec_screws_split": 5, "mvtec_screws_v": 5, "readme_v1": 5, "txt": [5, 11, 43], "mvtec_screws_test": 5, "There": [5, 12, 17, 21, 23, 26, 31, 33, 34, 35, 36, 38, 39, 40, 43, 57, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101], "readm": 5, "file_cont": 5, "v1": [5, 15, 16, 18, 19, 27, 28, 87, 88, 89], "author": [5, 11, 40, 65, 88, 101], "gmbh": 5, "juli": [5, 57], "2020": [5, 16, 27, 34, 57, 81, 91], "halcon": 5, "licens": [5, 88], "annot": [5, 19, 21, 77, 84, 87, 94], "creativ": [5, 23, 84, 88], "attribut": [5, 28, 57, 60, 62, 65, 67, 84, 85, 88], "noncommerci": 5, "sharealik": 5, "intern": [5, 11, 15, 37, 62, 67, 84, 87, 91, 101], "cc": [5, 17, 64, 87, 89], "BY": [5, 17, 85], "nc": [5, 21], "sa": 5, "creativecommon": 5, "fall": [5, 17, 33, 36, 88], "commerci": [5, 15], "claus": 5, "contact": [5, 23, 27], "scientif": [5, 19, 31, 33, 34, 35, 39, 40, 57], "cite": [5, 27, 88], "marku": 5, "ulrich": 5, "patrick": [5, 77], "follmann": 5, "hendrik": [5, 57], "neudeck": 5, "comparison": [5, 35, 38, 40, 73, 94, 101], "technisch": 5, "messen": 5, "2019": [5, 15, 34, 35, 40, 77], "doi": [5, 34, 91], "1515": [5, 94], "teme": 5, "0076": 5, "384": [5, 16, 76], "nut": [5, 84], "wooden": [5, 76], "categori": [5, 11, 15, 19, 73, 84, 87, 88], "4426": 5, "exemplari": 5, "mention": [5, 15, 64, 65, 74, 81, 91, 94], "public": [5, 11, 16, 30, 43, 81, 88], "approxim": [5, 16, 17, 23, 31, 62, 69, 73, 80, 81, 82, 94, 100], "within": [5, 31, 33, 34, 39, 40, 57, 61, 65, 67, 70, 73, 84, 85, 91, 94], "val": [5, 7, 
18, 21, 65, 69, 70, 76, 87], "examplari": 5, "dldataset": 5, "unsplit": 5, "usag": [5, 28, 57, 73, 84, 85], "read_dict": 5, "path_to_mvtec_screw": 5, "locat": [5, 21, 28, 67, 74, 80, 88], "path_to_images_fold": 5, "set_dict_tupl": 5, "image_dir": [5, 77], "write_dict": 5, "subpixel": 5, "precis": [5, 12, 17, 21, 23, 25, 28, 35, 36, 37, 38, 73, 94, 97], "center": [5, 36, 43, 67, 73, 76, 77, 80, 85, 94], "coordin": [5, 27, 28, 33, 48, 57, 62, 70, 80], "system": [5, 16, 25, 35, 37, 38, 40, 43, 57, 61, 74, 76, 77, 85, 87, 88, 89, 91, 97, 101], "left": [5, 11, 17, 26, 27, 31, 33, 34, 35, 36, 39, 40, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 84, 94, 97], "corner": [5, 67, 97], "when": [5, 8, 10, 12, 17, 19, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 46, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 97, 100, 101, 102], "convert": [5, 7, 12, 16, 18, 21, 33, 39, 43, 57, 62, 69, 70, 73, 74, 76, 80, 81, 84, 85, 87, 88, 91, 97], "similar": [5, 8, 10, 16, 19, 21, 23, 27, 33, 35, 36, 39, 40, 43, 57, 60, 65, 67, 70, 73, 74, 80, 81, 82, 84, 85, 88, 89, 91, 97], "cocodataset": [5, 21], "row": [5, 15, 43, 57, 67, 73, 80, 84, 89, 94, 97, 100, 102], "col": 5, "phi": [5, 80], "vertic": [5, 20, 27, 62, 67, 70, 73, 91], "axi": [5, 11, 15, 16, 17, 21, 27, 28, 35, 39, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 85, 88, 89, 97], "column": [5, 15, 40, 57, 62, 67, 73, 81, 84, 85, 94], "horizont": [5, 17, 27, 65, 67, 70, 73, 76, 91], "parallel": [5, 34, 57, 76, 87, 88, 89], "perpendicular": 5, "mathemat": [5, 33, 37, 38, 39, 40, 57, 61, 64, 65, 67, 76, 81, 84], "posit": [5, 12, 17, 21, 27, 31, 35, 39, 57, 67, 73, 74, 76, 77, 81, 82, 87, 94, 97, 100, 102], "sens": [5, 15, 31, 33, 35, 36, 39, 40, 64, 88, 94, 100], "toward": [5, 16, 31, 60, 84, 88, 101], "side": [5, 7, 27, 33, 36, 57, 62, 70, 73, 74, 80, 82, 94, 100], "bottom": [5, 43, 48, 57, 67, 73, 76, 77, 87, 91, 94, 97], "alwai": [5, 11, 25, 31, 33, 38, 39, 40, 
57, 60, 61, 62, 69, 87, 88, 91, 100], "semi": [5, 17, 76, 81], "henc": [5, 20, 57, 60, 64, 65, 67, 69, 70, 73, 85], "shift": [5, 11, 25, 64, 73, 85, 91, 94], "metadata": [5, 27, 28, 57], "join": [5, 7, 11, 15, 16, 21, 25, 31, 84, 85, 94, 97, 100, 102], "dict_kei": 5, "info": [5, 27, 43, 65, 70, 84, 100, 102], "file_nam": [5, 21], "screws_001": 5, "1440": [5, 57], "1920": 5, "id": [5, 11, 12, 15, 21, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "area": [5, 16, 17, 27, 31, 73, 81, 88, 94, 97], "3440": 5, "97": [5, 8, 16, 21, 28, 76, 94], "184": [5, 76], "876": [5, 76, 84], "313": [5, 76], "55": [5, 27, 61, 62, 65, 76, 84, 94], "5631": 5, "category_id": 5, "1001": [5, 40, 82], "image_id": 5, "is_crowd": 5, "map": [5, 12, 15, 17, 21, 25, 33, 36, 40, 57, 61, 62, 64, 65, 67, 73, 77, 80, 82, 84, 85, 87, 88, 97, 101], "imgdir": 5, "remap": 5, "imgdict": 5, "collect": [5, 8, 11, 12, 21, 25, 28, 31, 33, 36, 39, 62, 67, 73, 76, 85, 87, 91, 94, 100, 101, 102], "defaultdict": 5, "annodict": 5, "ncategori": 5, "cat_id": 5, "category_nam": 5, "wood": [5, 76, 101], "lag": 5, "bolt": 5, "black": [5, 57, 67, 73, 76, 77, 81, 94, 100], "oxid": 5, "shini": 5, "short": [5, 7, 19, 23, 31, 34, 57, 74, 76, 88], "long": [5, 11, 15, 16, 17, 27, 31, 33, 35, 57, 62, 64, 65, 73, 76, 81, 82, 84, 85, 88], "machin": [5, 8, 10, 15, 28, 30, 43, 60, 67, 70, 74, 76, 84, 88, 91, 101], "associ": [5, 15, 40, 60, 61, 62, 65, 67, 73, 82, 84, 100, 102], "imageid": 5, "gs": [5, 61, 62], "width_ratio": [5, 62], "wspace": 5, "cmap_norm": 5, "scatter": [5, 21, 35, 39, 57, 60, 61, 64, 65, 67, 69, 70, 76, 77, 80, 81, 87], "color": [5, 16, 21, 28, 39, 57, 60, 61, 62, 67, 69, 70, 73, 77, 81, 85, 87, 91, 94, 100, 103], "cm": [5, 21, 33, 57, 61, 62, 64, 67, 80, 87], "jet": [5, 15, 21, 64, 65], "rect": 5, "rectangl": [5, 73], "linewidth": [5, 35, 39, 61, 62, 65, 67, 69, 70], "edgecolor": 5, "facecolor": 5, "affine2d": 5, "rotate_around": 5, "set_transform": 5, 
"gca": [5, 60, 61, 62, 80, 81], "transdata": 5, "add_patch": [5, 73], "off": [5, 12, 16, 17, 21, 28, 31, 43, 60, 61, 64, 65, 67, 73, 77, 80, 82, 88, 91, 102], "colorbar": [5, 7, 15, 20, 60, 61, 62, 77, 97], "tick": [5, 39, 62, 73, 76, 81], "clim": 5, "cat_imgdict": 5, "img_id": 5, "k": [5, 8, 16, 17, 20, 21, 27, 34, 36, 40, 61, 62, 64, 65, 73, 77, 80, 81, 85, 87, 88, 91, 94, 100, 102], "v": [5, 8, 36, 40, 43, 62, 64, 67, 76, 80, 85, 88, 89, 97, 100, 102], "string": [5, 21, 25, 27, 43, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 85, 87, 88, 94, 100, 102], "neat": [5, 17, 57], "realpython": 5, "365": [5, 76], "317": [5, 76], "314": [5, 28, 76, 88], "367": [5, 76], "393": [5, 76], "387": [5, 76], "315": [5, 76], "320": [5, 76, 82], "346": [5, 76], "347": [5, 76], "322": [5, 76], "321": [5, 27, 76], "catid": 5, "num_exampl": 5, "suptitl": [5, 16, 25, 80, 81], "hetergogen": 5, "throughout": [5, 17, 35, 43, 57, 62, 67, 70, 73, 94, 97, 100, 102], "simpler": [5, 25, 70, 73, 80], "whether": [5, 17, 25, 31, 33, 35, 36, 39, 40, 57, 62, 65, 70, 73, 84, 85, 94, 102], "blank": 5, "use_categori": 5, "smaller": [5, 8, 12, 16, 17, 20, 21, 27, 60, 61, 62, 65, 67, 69, 70, 77, 80, 88, 91, 94, 97, 101], "patch_siz": 5, "num_patches_per_categori": 5, "nut_patch": 5, "blank_patch": 5, "until": [5, 65, 67, 69, 70, 73, 85, 88, 97], "suitabl": [5, 12, 26, 43, 57, 67, 82], "found": [5, 8, 16, 17, 21, 26, 31, 54, 57, 64, 70, 73, 81, 84, 85, 89, 94, 97, 100, 102], "imgid": 5, "imgobj": 5, "place": [5, 21, 30, 37, 38, 62, 67, 73, 76, 80, 85, 88, 100], "half": [5, 15, 23, 39, 57, 73, 76], "edg": [5, 21, 65, 88, 94, 102], "rand_cent": 5, "intersect": [5, 17], "rand_patch": 5, "todo": [5, 57, 67, 81, 82, 85, 100, 102], "seem": [5, 16, 31, 33, 35, 39, 40, 43, 61, 62, 74, 76, 80, 88, 89], "like": [5, 7, 11, 12, 16, 17, 18, 19, 20, 21, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 88, 91, 97, 101], "rare": [5, 61, 
91, 101], "aren": [5, 11, 67, 81, 85], "fulli": [5, 7, 12, 17, 28, 57, 60, 65, 69, 70, 76, 80, 82, 94, 100, 101], "miss": [5, 12, 31, 35, 36, 37, 52, 57, 60, 64, 65, 69, 70, 73, 76, 80, 84, 85, 94], "could": [5, 8, 11, 17, 21, 25, 27, 31, 33, 35, 36, 37, 38, 39, 40, 46, 57, 61, 62, 64, 65, 69, 73, 74, 77, 80, 81, 82, 84, 85, 87, 88, 91, 100, 101], "cifar": [5, 57, 80], "patch_label": 5, "all_patch": 5, "concat": 5, "shuffle_idx": 5, "immedi": [5, 27, 31, 35, 39, 100], "jump": [5, 11, 73, 85], "often": [5, 17, 21, 23, 31, 35, 36, 37, 38, 39, 40, 57, 60, 61, 62, 64, 70, 73, 74, 76, 80, 81, 100], "dimension": [5, 18, 31, 57, 70, 73, 74, 76, 77, 80, 81, 82, 87, 88], "485": [5, 18, 43, 76], "456": [5, 18, 43, 76, 94], "406": [5, 8, 18, 43, 76], "229": [5, 12, 18, 43, 76], "224": [5, 17, 18, 43, 76], "225": [5, 8, 18, 43, 76], "train_frac": 5, "train_numb": 5, "test_nuumb": 5, "train_patch": 5, "train_label": [5, 17, 33, 36, 84, 87], "test_patch": 5, "test_label": [5, 33, 36, 84], "permut": [5, 16, 17, 57, 69, 70, 76, 77, 80, 82], "simplescrewnet": 5, "leakyrelu": [5, 64, 65], "flatten": [5, 7, 16, 18, 25, 57, 67, 73, 80, 94, 102], "1024": [5, 82, 100, 102], "pass": [5, 11, 16, 17, 18, 20, 21, 25, 28, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 100, 102], "inspect": [5, 7, 54, 57, 60, 67, 80], "snet": 5, "368": [5, 76], "30": [5, 7, 8, 17, 27, 28, 31, 33, 46, 57, 60, 61, 62, 65, 67, 70, 73, 76, 81, 82, 84, 85, 87, 94, 97, 100, 102], "832": [5, 76], "264": [5, 76, 80], "600": [5, 20, 61, 76, 94], "130": [5, 27, 65, 67, 76, 88], "132": [5, 76], "194": [5, 8, 76], "trainabl": [5, 8, 12, 60, 65, 67, 73, 81, 82], "non": [5, 11, 15, 25, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 88, 89, 91, 94, 100, 102], "mb": [5, 27, 28], "48": [5, 8, 27, 33, 36, 57, 62, 70, 76, 84, 85, 87, 94, 100], "estim": [5, 10, 12, 17, 33, 35, 37, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 84, 85, 87, 91, 94, 97, 100, 
101, 102], "loss_fn": [5, 67, 80, 81, 82, 84], "000001": 5, "test_correct": 5, "lbl": [5, 17], "float": [5, 12, 16, 17, 18, 20, 21, 25, 28, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 81, 84, 85, 94, 97, 102], "unsqueez": [5, 16, 17, 21, 43, 57, 64, 65, 73, 76, 80, 84, 94], "as_tensor": [5, 80], "2f": [5, 16, 28, 33, 61, 64, 65, 69, 73, 87], "train_ds_load": 5, "losss": 5, "37": [5, 12, 33, 35, 39, 61, 62, 73, 76, 80, 81, 82, 84, 85, 97, 100], "973": [5, 76], "380": [5, 76], "49": [5, 7, 8, 12, 27, 33, 70, 76, 77, 84, 87, 94, 100], "197": [5, 76], "818": [5, 76], "270": [5, 76, 80], "161": [5, 27, 76], "713": [5, 76], "547": [5, 76], "137": [5, 76], "508": [5, 76], "38": [5, 73, 76, 81, 82, 85, 100], "970": [5, 76], "993": [5, 76], "calcul": [5, 7, 8, 12, 15, 28, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 84, 85, 97, 100, 102], "984": [5, 76], "text": [5, 10, 12, 17, 19, 25, 33, 36, 39, 40, 43, 46, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "anamol": 5, "ruruamour": 5, "snippet": [5, 62, 73, 88], "dirnam": 5, "file_path": 5, "empti": [5, 25, 27, 39, 43, 57, 61, 65, 67, 69, 70, 73, 81, 85, 97, 101], "fp": [5, 17, 27, 69, 70, 81], "api": [5, 7, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "token": [5, 7, 11, 12, 43, 82, 85], "upper": [5, 7, 67, 81, 85, 102], "api_token": [5, 7, 88], "usernam": [5, 7], "enter": [5, 27, 54, 73], "dump": [5, 15], "chnage": 5, "permiss": 5, "chmod": 5, "far": [5, 12, 35, 39, 60, 62, 67, 73, 74, 76, 77, 81, 82, 88, 101], "do": [5, 7, 8, 12, 17, 21, 23, 27, 28, 31, 33, 34, 35, 36, 37, 38, 39, 40, 43, 46, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 77, 80, 81, 82, 84, 85, 97, 100, 101, 102], "classifi": [5, 16, 18, 33, 35, 36, 39, 43, 64, 65, 67, 70, 73, 76, 77, 84, 85, 89, 91], "same": [5, 7, 10, 11, 12, 16, 17, 25, 27, 28, 31, 33, 35, 36, 38, 39, 43, 
53, 57, 60, 62, 64, 67, 70, 73, 74, 76, 77, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101], "configur": [5, 27, 28, 43, 60, 67, 73, 88, 100, 102], "remind": [5, 35, 76, 84, 100], "nice": [5, 31, 38, 67, 82, 88, 100], "world": [5, 15, 16, 27, 39, 43, 69, 73, 74, 76, 84, 87, 88, 91, 94, 101], "machinelearningmasteri": 5, "captur": [5, 33, 35, 36, 64, 84, 87, 91], "won": [5, 12, 25, 28, 31, 64, 67, 73, 74, 85, 91, 100, 101, 102], "enough": [5, 27, 31, 33, 34, 35, 37, 38, 39, 43, 57, 62, 64, 69, 82, 87], "yolo": 5, "algorithm": [5, 10, 11, 17, 21, 33, 35, 36, 39, 57, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "keep": [5, 7, 20, 21, 25, 28, 31, 33, 36, 37, 38, 43, 57, 60, 61, 62, 64, 67, 69, 70, 73, 80, 87, 88, 91, 94], "skill": [5, 23, 31, 35, 91], "program": [5, 57, 73, 88, 97, 101], "debug": [5, 21, 25, 38, 43, 100, 102], "intermedi": [5, 18, 21, 31, 60, 67, 76], "give": [5, 17, 23, 31, 33, 34, 36, 39, 40, 57, 62, 65, 70, 73, 74, 76, 80, 81, 82, 85, 88, 91, 97, 100], "w1d2": [5, 35, 67], "standard": [5, 8, 17, 21, 25, 27, 36, 38, 40, 43, 46, 57, 61, 62, 64, 65, 67, 70, 76, 80, 81, 82, 88, 91, 97, 100, 101], "draw": [5, 20, 37, 38, 39, 80, 81, 94, 97, 100], "doesn": [5, 12, 25, 28, 31, 33, 36, 39, 40, 69, 84, 85, 87, 88, 97], "handl": [5, 11, 12, 16, 21, 25, 26, 43, 57, 67, 73, 80, 81, 84, 85, 88, 94], "elegantli": 5, "sever": [5, 11, 12, 17, 19, 28, 39, 52, 67, 73, 74, 76, 85, 87, 91], "extend": [5, 15, 43, 62, 74, 80, 88, 101], "produc": [5, 7, 11, 17, 25, 35, 38, 39, 67, 73, 80, 82, 84, 85, 87, 88, 100, 102], "form": [5, 11, 15, 31, 36, 39, 43, 52, 62, 64, 69, 73, 74, 80, 81, 82, 84, 85, 97, 100, 101, 102], "supervis": [5, 46, 76, 84, 100, 101], "incomplet": [5, 57], "unsupervis": [5, 46, 80, 94], "group": [5, 12, 23, 31, 33, 36, 39, 46, 53, 67, 70, 74, 76, 77, 81, 82, 87, 91, 94, 100, 101], "classic": [5, 25, 36, 39, 80, 88], "sklearn": [5, 11, 12, 16, 33, 35, 39, 57, 77, 84, 85, 87], "yolo3": 5, "minim": [5, 27, 
28, 33, 43, 60, 62, 67, 70, 74, 80, 81, 84, 89, 91, 94, 100], "yolov5": 5, "detetectron2": 5, "yolov4": 5, "less": [5, 16, 21, 27, 31, 33, 36, 40, 62, 67, 69, 70, 73, 74, 76, 80, 82, 87, 88, 94], "readabl": 5, "complic": [5, 28, 31, 43, 57, 80], "framework": [5, 19, 27, 40, 43, 60, 65, 81, 85, 97, 101], "3d": [5, 17, 33, 61, 67, 73, 84, 87, 101], "cad": 5, "en": [5, 11, 21, 27, 43, 87, 89], "click": [6, 13, 15, 22, 29, 34, 35, 36, 37, 38, 43, 48, 50, 53, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100, 101, 102], "imag": [6, 7, 8, 13, 15, 17, 19, 21, 22, 25, 27, 29, 30, 31, 64, 65, 69, 70, 74, 81, 84, 85, 91, 103], "browser": [6, 7, 13, 22, 28, 29, 53], "beatrix": 7, "benko": 7, "lina": 7, "teichmann": 7, "audiofil": 7, "part": [7, 15, 17, 21, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 43, 52, 57, 60, 62, 64, 73, 77, 80, 81, 82, 84, 85, 88, 94, 97, 102], "second": [7, 15, 17, 23, 27, 28, 31, 33, 35, 36, 39, 40, 57, 60, 61, 62, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 87, 91, 94, 97], "genr": 7, "link": [7, 21, 23, 28, 36, 39, 43, 46, 48, 54, 57, 73, 76, 77, 85], "harder": [7, 39, 77], "idea": [7, 11, 12, 21, 23, 31, 33, 35, 36, 37, 39, 43, 57, 60, 62, 64, 65, 67, 69, 73, 76, 80, 81, 82, 84, 91, 97, 100, 102], "fun": [7, 65, 101], "benk\u0151": 7, "towardsdatasci": 7, "rwightman": [7, 30], "blob": [7, 17, 21, 27, 76], "master": [7, 17, 21, 27, 43, 97], "timm": 7, "vision_transform": 7, "py": [7, 16, 17, 18, 21, 27, 28, 43, 65, 67, 69, 70, 76, 80, 81, 82, 85, 87, 94, 100], "kamalesh0406": 7, "audio": [7, 19, 44, 45], "zcacer": 7, "spec_aug": 7, "musicinformationretriev": 7, "ipython_audio": 7, "sudo": 7, "apt": [7, 21, 27, 28], "ffmpeg": [7, 27, 28, 69, 70, 81], "librosa": 7, "imageio": [7, 28, 57, 69, 70], "packag": [7, 25, 28, 57, 60, 65, 67, 69, 70, 80, 81, 82, 85, 87, 89, 94], "build": [7, 16, 17, 18, 27, 28, 31, 35, 38, 39, 40, 43, 60, 62, 65, 69, 70, 73, 74, 76, 77, 82, 85, 87, 88, 91, 94, 100, 101, 102], 
"tree": [7, 76, 87], "newest": 7, "0ubuntu0": 7, "automat": [7, 33, 40, 43, 57, 60, 61, 67, 73, 76, 85, 89, 91, 100], "requir": [7, 17, 21, 25, 27, 28, 31, 34, 35, 37, 39, 43, 46, 54, 57, 60, 62, 67, 69, 70, 73, 74, 76, 77, 80, 85, 87, 88, 94, 97, 101], "libnvidia": 7, "460": [7, 76, 94], "autoremov": 7, "upgrad": [7, 25, 28, 43, 84, 87], "newli": [7, 54, 60, 67, 100], "necessari": [7, 17, 21, 27, 28, 31, 36, 37, 52, 61, 62, 64, 67, 73, 77, 85, 97, 100, 101], "shutil": [7, 16, 28, 73, 94, 100, 102], "ipython": [7, 15, 25, 27, 28, 35, 36, 39, 44, 45, 57, 64, 65, 69, 70, 73, 76, 81, 94], "displai": [7, 15, 25, 27, 28, 33, 35, 36, 38, 39, 43, 44, 45, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 81, 82, 84, 85, 94, 97, 100, 102], "drjhb": 7, "except": [7, 8, 16, 17, 21, 23, 28, 31, 46, 52, 60, 62, 65, 67, 76, 84, 94], "connectionerror": [7, 16, 17], "fail": [7, 16, 17, 18, 21, 33, 35, 36, 38, 69, 74, 76, 77, 85], "status_cod": [7, 16, 17, 33, 36], "ok": [7, 11, 16, 17, 31, 33, 35, 36, 40, 80], "fid": [7, 16, 17], "dowload": [7, 80], "It": [7, 11, 12, 16, 17, 20, 21, 25, 26, 27, 31, 33, 36, 38, 39, 40, 43, 54, 57, 60, 62, 67, 70, 73, 74, 76, 77, 80, 81, 84, 87, 88, 89, 94, 97, 100, 101, 102], "johnsmith": 7, "123a123a123": 7, "zipobj": [7, 87, 89], "waveform": 7, "Then": [7, 10, 17, 21, 31, 34, 35, 37, 38, 43, 57, 62, 64, 65, 69, 73, 77, 81, 84, 85, 87, 89, 91], "sound": [7, 31, 76, 88], "wave": [7, 33], "sample_path": 7, "genres_origin": 7, "jazz": 7, "00000": 7, "wav": 7, "listen": [7, 19, 31, 44, 45, 74], "support": [7, 12, 28, 31, 57, 67, 85, 97], "element": [7, 38, 39, 43, 57, 61, 62, 64, 67, 69, 70, 73, 81, 85, 88, 94], "sample_r": 7, "khz": 7, "waveplot": 7, "sr": 7, "fontsiz": [7, 28, 61, 62, 67, 77], "00924683": 7, "01177979": 7, "01370239": 7, "0071106": 7, "00561523": 7, "661794": 7, "22050": 7, "013333333333332": 7, "fourier": 7, "stft": 7, "ab": [7, 16, 27, 28, 35, 39, 64, 65, 67, 76, 101], "n_fft": 7, "2048": 7, "hop_length": 7, "object": [7, 15, 31, 36, 39, 
40], "amplitud": [7, 15, 18, 36, 40], "decibel": 7, "scale": [7, 16, 17, 18, 20, 27, 37, 40, 57, 60, 61, 62, 64, 65, 67, 73, 74, 76, 77, 81, 82, 84, 91, 94, 101], "db": [7, 61], "amplitude_to_db": 7, "ref": [7, 80], "spectogram": 7, "specshow": 7, "x_axi": 7, "y_axi": 7, "log": [7, 20, 25, 27, 28, 43, 54, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "1025": 7, "1293": 7, "mel": 7, "sclae": 7, "intead": 7, "pitch": [7, 28, 31, 33, 36, 88], "judg": [7, 39, 76], "equal": [7, 27, 28, 39, 57, 62, 65, 67, 74, 80, 81, 84, 100, 102], "distanc": [7, 17, 18, 57, 67, 73, 74, 82, 85, 88, 89], "frequenc": [7, 28, 36, 40, 64, 67, 82, 87, 88], "measur": [7, 15, 17, 25, 33, 35, 36, 37, 62, 67, 69, 70, 76, 77, 80, 84, 87, 88, 89, 94, 100], "assign": [7, 16, 17, 23, 31, 64, 65, 85, 87, 97, 100, 102], "1000": [7, 15, 16, 20, 25, 28, 34, 40, 57, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 94, 100, 102], "hz": [7, 28, 88], "tone": 7, "40": [7, 11, 12, 27, 31, 33, 35, 39, 46, 57, 61, 67, 69, 70, 73, 76, 81, 85, 94, 100], "threshold": [7, 20, 36, 64, 65, 73, 88, 100, 102], "increasingli": 7, "interv": [7, 28, 35, 39, 61, 74, 81], "increment": [7, 57, 65, 67], "melspectrogram": 7, "s_db": 7, "img_path": [7, 77], "images_origin": 7, "jazz00000": 7, "interpol": [7, 17, 21, 33, 36], "nearest": [7, 17, 21, 73, 85, 101], "288": [7, 76], "432": [7, 76], "plot_loss_accuraci": [7, 73], "validation_loss": [7, 16, 73, 87], "validation_acc": [7, 73], "ax1": [7, 60, 61, 62, 69, 73], "ax2": [7, 60, 61, 62, 69, 73], "set_xlabel": [7, 40, 60, 61, 62, 67, 69, 70, 73, 94], "set_ylabel": [7, 40, 60, 61, 62, 67, 69, 70, 73, 94], "set_size_inch": [7, 73], "spectrograms_dir": 7, "folder_nam": 7, "train_dir": 7, "test_dir": 7, "val_dir": 7, "rmtree": [7, 16, 28, 94], "loop": [7, 11, 17, 21, 28, 36, 38, 43, 61, 62, 64, 65, 67, 84, 88, 97, 102], "src_file_path": 7, "recurs": [7, 37, 67, 102], "test_fil": 7, "val_fil": 7, "train_fil": 7, "destin": [7, 73, 
87], "train_load": [7, 12, 33, 64, 65, 67, 69, 70, 73, 84], "val_dataset": 7, "val_load": [7, 67, 69, 70], "music_net": 7, "intit": 7, "in_channel": [7, 17, 21, 33, 73, 76, 80, 100, 102], "out_channel": [7, 17, 21, 33, 73, 76, 80, 100, 102], "conv4": [7, 18, 82, 100, 102], "conv5": [7, 18], "fc1": [7, 18, 33, 69, 70, 73, 76, 87, 100, 102], "in_featur": [7, 8, 12, 18, 57, 60, 67, 73, 76, 82, 100, 102], "9856": 7, "out_featur": [7, 12, 57, 60, 67, 73, 80, 82, 100, 102], "batchnorm1": 7, "num_featur": [7, 65, 94, 100, 102], "batchnorm2": 7, "batchnorm3": 7, "batchnorm4": 7, "batchnorm5": 7, "conv": [7, 17, 21, 73, 80, 82, 94], "max_pool2d": [7, 73], "validation_load": [7, 73], "unit": [7, 17, 21, 28, 31, 33, 38, 39, 40, 57, 62, 64, 65, 67, 73, 74, 76, 80, 87, 94], "tepoch": [7, 73], "set_descript": [7, 73, 81, 82], "track": [7, 15, 18, 21, 27, 33, 57, 60, 62, 64, 69, 70, 73, 76, 84, 85], "running_loss": [7, 18, 64, 65, 67, 73, 87], "zero": [7, 11, 12, 15, 17, 18, 20, 21, 27, 35, 39, 57, 60, 62, 64, 65, 67, 70, 73, 80, 81, 82, 84, 94, 97, 100, 101, 102], "set_postfix": [7, 17, 21, 73, 100, 102], "One": [8, 10, 12, 17, 21, 25, 27, 31, 43, 62, 65, 67, 69, 70, 73, 76, 77, 80, 82, 84, 87, 88, 94, 100], "desir": [8, 27, 57, 62, 64, 65, 67, 80, 88, 97, 101], "capabl": [8, 12, 25, 67, 101], "abil": [8, 62, 64, 73, 101], "knowledg": [8, 26, 27, 31, 36, 37, 74, 81, 88, 94], "domain": [8, 27, 57, 67, 73, 74, 76, 88, 91], "scarc": 8, "unfortun": 8, "recip": 8, "instead": [8, 12, 20, 25, 26, 27, 28, 33, 36, 43, 57, 62, 65, 67, 73, 74, 76, 77, 82, 84, 85, 88, 91, 94, 97, 101], "remain": [8, 57, 73, 88, 97, 101], "scenario": [8, 39, 65, 74, 91, 101], "gc": 8, "obtain": [8, 12, 57, 60, 64, 67, 69, 70, 73, 76, 80, 82, 87, 91, 94, 100], "variou": [8, 23, 25, 28, 33, 36, 57, 73, 74, 76, 81, 85, 87, 91, 97, 101], "max_epoch": [8, 67, 69, 70], "max_epochs_target": 8, "normalis": [8, 67, 73, 84], "substract": 8, "divid": [8, 43, 60, 61, 62, 73, 82, 101], "deviat": [8, 21, 36, 40, 61, 62, 
64, 65, 80, 81, 82], "appli": [8, 11, 16, 17, 20, 26, 27, 28, 31, 34, 35, 39, 57, 60, 65, 67, 70, 73, 76, 77, 80, 82, 84, 85, 88, 97, 102], "_pretrain": [8, 27], "outmodelnam": 8, "748": [8, 76], "781": [8, 28, 76], "545": [8, 76, 94], "527999877929688": 8, "369999885559082": 8, "597": [8, 27, 76], "157": [8, 76], "438": [8, 76], "392000198364258": 8, "829999923706055": 8, "932": [8, 76], "34": [8, 70, 73, 76, 77, 81, 82, 84, 85, 97, 100], "450": [8, 40, 76, 94], "016000747680664": 8, "079999923706055": 8, "649": [8, 76], "35": [8, 27, 28, 62, 64, 73, 76, 81, 82, 84, 85, 94, 97, 100], "134": [8, 76], "84000015258789": 8, "70000076293945": 8, "153": [8, 76], "41": [8, 27, 28, 57, 76, 94, 97, 100], "911": [8, 73, 76], "219": [8, 76], "63": [8, 16, 21, 27, 76, 88, 94], "42": [8, 12, 16, 20, 28, 33, 70, 76, 80, 81, 84, 85, 97, 100], "827999114990234": 8, "43": [8, 16, 65, 73, 76, 84, 85, 87, 88, 94], "619998931884766": 8, "878": [8, 27, 76], "149": [8, 76], "87200164794922": 8, "45": [8, 12, 21, 46, 60, 61, 65, 67, 73, 76, 77, 81, 84, 85, 87, 94], "380001068115234": 8, "814": [8, 76], "66": [8, 33, 67, 76, 94], "847": [8, 73, 76], "875": [8, 73, 76], "59000015258789": 8, "310001373291016": 8, "514": [8, 76, 80], "568": [8, 76], "57": [8, 12, 73, 76, 84, 94], "35200119018555": 8, "209999084472656": 8, "403": [8, 76], "375": [8, 76], "61600112915039": 8, "20000076293945": 8, "124": [8, 69, 70, 76], "339": [8, 27, 76], "55400085449219": 8, "58": [8, 33, 65, 73, 76, 84, 94, 100], "900001525878906": 8, "656": [8, 76], "91999816894531": 8, "83000183105469": 8, "971": [8, 76], "491": [8, 76], "281": [8, 76], "05000305175781": 8, "560001373291016": 8, "028": 8, "358": [8, 76], "099998474121094": 8, "699": [8, 76], "299": [8, 76], "71": [8, 12, 15, 16, 21, 27, 28, 33, 67, 76, 84, 94], "9739990234375": 8, "220001220703125": 8, "768": [8, 76], "182": [8, 76], "74": [8, 12, 16, 21, 33, 76], "95": [8, 21, 73, 76, 94, 100, 102], "69400024414062": 8, "90999984741211": 8, "backbon": 
[8, 97], "del": [8, 57, 73], "again": [8, 11, 27, 28, 31, 43, 60, 61, 62, 69, 70, 73, 80, 94, 102], "0001": [8, 12, 82], "simul": [8, 25, 26, 31, 33, 34, 35, 37, 39, 40, 61, 65, 69, 70, 80, 81, 82, 85, 101, 102], "lower": [8, 11, 35, 39, 40, 61, 73, 80, 81, 85, 88, 100], "regim": [8, 40], "30000": [8, 12, 27, 67], "As": [8, 16, 19, 27, 28, 31, 35, 57, 61, 62, 64, 65, 67, 73, 76, 77, 80, 82, 88, 89, 94, 101], "previous": [8, 17, 43, 67, 73, 85, 94, 101], "checkpointpath": 8, "prefix": [8, 82, 84, 85], "elif": [8, 17, 21, 57, 61, 65, 69, 70, 73, 80, 85, 94, 97], "msg": 8, "strict": [8, 80], "rais": [8, 21, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 97, 100, 102], "No": [8, 20, 31, 36, 43, 46, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "9100": 8, "successfulli": [8, 18, 27, 94, 97], "usual": [8, 12, 20, 25, 31, 33, 35, 39, 40, 46, 57, 62, 70, 73, 74, 81, 88], "requires_grad": [8, 12, 60, 65, 67, 76, 80, 81, 82, 85], "num_ftr": [8, 18, 76], "sinc": [8, 20, 31, 33, 36, 38, 39, 40, 43, 53, 60, 61, 62, 64, 65, 67, 70, 73, 81, 82, 84, 85, 97, 102], "total_param": 8, "numel": [8, 57, 65, 73, 76, 80], "trainable_total_param": 8, "11173962": 8, "5130": 8, "finetun": [8, 82, 101], "235": [8, 76, 80], "302": [8, 43, 76], "630": [8, 76], "086669921875": 8, "757": [8, 76], "86000061035156": 8, "666": [8, 76], "640": [8, 76], "04000091552734": 8, "55000305175781": 8, "579": [8, 76], "577": [8, 76], "56999969482422": 8, "2300033569336": 8, "661": [8, 76], "613": [8, 76], "6866683959961": 8, "627": [8, 76], "469": [8, 76], "103": [8, 11, 21, 76, 88], "163330078125": 8, "37999725341797": 8, "602": [8, 76], "344": [8, 76], "99": [8, 12, 17, 21, 27, 57, 67, 69, 70, 73, 76, 82, 84, 97], "607": [8, 76], "42333221435547": 8, "02999877929688": 8, "537": [8, 76, 94], "608": [8, 76], "49333190917969": 8, "1500015258789": 8, "578": [8, 76], "650": [8, 76], 
"15333557128906": 8, "583": [8, 76], "66999816894531": 8, "20999908447266": 8, "819690": 8, "086670": 8, "713260": 8, "680940": 8, "860001": 8, "674431": 8, "540001": 8, "650245": 8, "040001": 8, "675883": 8, "550003": 8, "638555": 8, "570000": 8, "652776": 8, "230003": 8, "630500": 8, "686668": 8, "666428": 8, "449997": 8, "identifi": [10, 16, 20, 25, 31, 35, 36, 38, 70, 94], "quantifi": [10, 74, 76, 80, 84, 97], "word": [10, 12, 34, 36, 57, 67, 85, 88, 89, 91, 100, 101], "analyz": [10, 12, 27, 28, 31, 39, 40, 62, 65, 85], "sensibl": [10, 40], "rnn": [10, 12, 19, 57, 80, 84, 87, 88, 89, 101], "degrad": 10, "bag": [10, 12, 73, 76, 84], "pro": [10, 27, 65], "con": 10, "pre": [10, 19, 21, 25, 26, 33, 34, 36, 39, 57, 61, 67, 73, 76, 84, 87, 88, 91, 100, 101], "specif": [10, 15, 16, 17, 18, 25, 27, 28, 31, 33, 35, 37, 38, 39, 40, 43, 57, 62, 64, 65, 67, 70, 73, 74, 76, 77, 80, 84, 85, 88, 89, 91, 94, 100, 101], "suggest": [10, 11, 12, 16, 28, 31, 40, 64, 65, 80, 85, 87, 94, 100], "corpu": [10, 12, 57, 85, 87, 88, 89], "cbow": 10, "let": [10, 12, 16, 17, 20, 21, 25, 27, 33, 35, 36, 39, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 100, 101, 102], "toxic": 10, "wikipedia": [10, 88, 89], "detect": [10, 17, 40, 67, 77], "categor": [10, 16, 25, 57, 80], "comment": [10, 12, 34, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 80, 94], "manag": [10, 16, 67], "multiclass": 10, "translat": [10, 31, 64, 73, 74, 84, 88, 91, 94, 100, 101], "huggingfac": [10, 12, 30, 84, 85, 88], "analyticsindiamag": 10, "nlp": [10, 23, 30, 67, 101], "sourc": [10, 15, 17, 25, 26, 27, 28, 31, 43, 57, 67, 77, 82, 84, 85, 89, 94], "beginn": [10, 23, 33, 67, 84], "descript": [10, 19, 25, 27, 33, 36, 43, 61, 62, 73, 76, 80, 84, 85, 101], "q": [10, 20, 27, 57, 80, 85, 100, 101, 102], "juan": [11, 12], "manuel": [11, 12], "rodriguez": [11, 12], "salomei": [11, 12, 16], "osei": [11, 12, 16], "amita": [11, 12, 44], "kapoor": [11, 12, 44], "sequenc": [11, 12, 25, 
28, 31, 57, 64, 65, 81, 84, 85, 87, 88, 89, 101], "transtlat": 11, "french": [11, 76, 89], "english": [11, 43, 76, 84, 85, 87, 89, 101], "math": [11, 27, 34, 36, 60, 62, 65, 73, 76, 82, 84, 85, 102], "unicodedata": 11, "zip_file_url": 11, "line": [11, 12, 27, 28, 33, 35, 36, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 102], "eng": 11, "fra": 11, "strip": 11, "va": [11, 67, 73, 77, 87, 100, 102], "cour": 11, "courez": 11, "wow": [11, 16], "\u00e7a": 11, "alor": 11, "fire": [11, 12, 27, 35, 39, 67, 74, 76, 101], "au": 11, "feu": 11, "\u00e0": 11, "aid": [11, 76, 77], "saut": 11, "stop": [11, 12, 26, 31, 35, 39, 43, 62, 67, 70, 73, 76, 82, 88], "suffit": 11, "arr\u00eat": 11, "toi": [11, 39, 40, 69, 70, 73, 76, 94], "represent": [11, 12, 15, 18, 21, 33, 34, 36, 43, 57, 65, 77, 80, 84, 85, 89, 100, 101, 102], "indix": 11, "three": [11, 16, 27, 28, 39, 57, 60, 61, 65, 67, 69, 74, 76, 77, 81, 84, 85, 87, 91, 94], "special": [11, 12, 20, 27, 67, 73, 80, 85, 87, 88], "Of": [11, 12, 33, 39, 67, 88], "sentenc": [11, 12, 31, 34, 84, 85, 87, 88, 89], "eo": 11, "fill": [11, 12, 17, 25, 46, 52, 57, 64, 65, 69, 70, 73, 76, 88, 94, 97, 100, 102], "sos_token": 11, "eos_token": [11, 88], "lang": [11, 43, 80], "word2index": 11, "word2count": 11, "index2word": 11, "n_word": 11, "addsent": 11, "addword": 11, "unicodetoascii": 11, "nfd": [11, 88], "mn": 11, "normalizestr": 11, "sub": [11, 15, 21, 25, 27, 33, 36, 43, 76, 84, 85], "za": 11, "readlang": 11, "lang1": 11, "lang2": 11, "encod": [11, 12, 15, 27, 57, 60, 76, 81, 82, 85, 87, 88, 89, 101], "utf": [11, 43, 82], "input_lang": 11, "output_lang": 11, "max_length": [11, 12, 84, 85, 88], "eng_prefix": 11, "am": [11, 12, 31, 46, 85, 94, 101], "she": [11, 85], "filterpair": 11, "startswith": [11, 87], "preparedata": 11, "trim": 11, "135842": 11, "10599": 11, "4346": 11, "2804": 11, "nou": 11, "somm": 11, "san": 11, "emploi": [11, 26, 65, 88, 94], "unemploi": 11, "plot_lang": 11, "top_k": 
11, "count_occur": [11, 12], "accumul": [11, 12, 35, 36, 39, 40, 60, 64, 65, 67], "counter": [11, 12, 33, 36, 65, 87], "occurr": [11, 12, 40, 88], "bar": [11, 12, 18, 69, 70, 73, 74, 76, 91], "je": 11, "sui": 11, "est": [11, 73], "vou": 11, "pa": 11, "de": [11, 40, 60, 61, 62, 64, 76, 88], "il": 11, "tu": 11, "ne": 11, "es": [11, 100, 102], "un": [11, 80], "ell": 11, "la": 11, "tre": 11, "que": 11, "le": 11, "sont": 11, "j": [11, 17, 21, 27, 43, 61, 62, 64, 65, 67, 69, 73, 74, 80, 84, 85, 91, 97, 100, 102], "ai": [11, 21, 57, 76, 101], "pour": 11, "plu": [11, 33, 36, 67], "ce": [11, 64, 100], "vai": 11, "moi": 11, "mon": [11, 12, 46], "trop": 11, "fort": 11, "si": 11, "ici": 11, "du": 11, "toujour": 11, "tout": 11, "tou": 11, "vraiment": 11, "sur": 11, "te": 11, "dan": 11, "avec": 11, "avoir": 11, "encor": 11, "qu": 11, "tom": 11, "votr": 11, "peur": 11, "desol": 11, "bien": 11, "ca": [11, 39], "bon": 11, "fai": 11, "heureux": 11, "fair": [11, 38, 62, 73, 82, 84, 94, 100], "etr": 11, "son": 11, "aussi": 11, "assez": 11, "lui": 11, "tellement": 11, "ma": [11, 64, 85], "fatigu": 11, "par": [11, 67], "fait": 11, "ton": [11, 21, 43, 76], "se": 11, "mainten": 11, "grand": [11, 76], "desole": 11, "avon": 11, "allon": 11, "peu": 11, "deux": 11, "vieux": 11, "674188349067465": 11, "0371543427945": 11, "my": [11, 12, 21, 31, 34, 36, 37, 57], "too": [11, 12, 31, 33, 34, 35, 36, 37, 39, 40, 61, 65, 67, 69, 82, 85, 88, 91], "sorri": [11, 67], "glad": 11, "tire": 11, "afraid": [11, 37], "hi": [11, 12, 31, 77, 84, 87, 89, 100], "busi": [11, 15, 31, 84, 87], "still": [11, 12, 21, 25, 28, 33, 36, 40, 43, 60, 65, 70, 73, 81, 87, 88, 101], "old": [11, 27, 28, 57, 62, 76, 84, 85], "friend": [11, 101], "her": [11, 25, 84], "teacher": [11, 27], "him": [11, 91], "alon": [11, 38, 67, 73, 91], "being": [11, 12, 16, 25, 57, 60, 61, 65, 67, 70, 73, 74, 76, 77, 80, 84, 85, 87, 88, 91, 94, 100, 101, 102], "home": [11, 12, 43, 76, 94], "proud": 11, "man": [11, 12, 76, 87], "marri": 11, "kind": 
[11, 12, 15, 17, 21, 25, 31, 33, 35, 39, 57, 60, 67, 69, 73, 80, 81, 82, 85, 88, 94, 101], "who": [11, 31, 52, 57, 82, 84, 85, 88, 101], "wait": [11, 31, 57, 69, 70, 76, 94, 100, 102], "young": [11, 84], "late": [11, 23, 31], "anymor": [11, 36, 43, 65], "hungri": [11, 88], "sick": [11, 87], "85540878257765": 11, "00500226665207": 11, "decod": [11, 15, 19, 27, 33, 35, 39, 43, 74, 82, 88], "condens": [11, 34], "explain": [11, 20, 21, 25, 33, 34, 35, 36, 38, 39, 40, 61, 62, 65, 67, 76, 84, 94], "diagram": [11, 37, 84, 100], "encoderrnn": 11, "hidden_s": [11, 12, 84, 87], "gru": 11, "batch_first": [11, 12], "hidden": [11, 12, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 80, 84, 87], "inithidden": 11, "decoderrnn": 11, "output_s": [11, 87], "logsoftmax": [11, 64, 65], "to_train": 11, "max_len": [11, 84], "x_input": 11, "x_output": 11, "o": [11, 16, 57, 73, 76, 77, 84, 85, 87, 88, 100, 102], "s_i": [11, 77], "s_o": 11, "s_to": 11, "x_partial": 11, "sentec": [11, 12], "nrepresent": 11, "partial": [11, 60, 61, 62, 67, 69, 70, 84, 94], "becaus": [11, 17, 20, 28, 31, 34, 36, 37, 38, 39, 40, 43, 57, 62, 69, 73, 74, 76, 85, 88, 94, 97, 100], "rignt": 11, "close": [11, 16, 20, 21, 27, 31, 38, 46, 57, 65, 67, 69, 70, 73, 76, 77, 81, 87, 88, 89, 101], "context": [11, 27, 67, 84, 87, 88], "inmediatli": 11, "notic": [11, 12, 28, 31, 34, 35, 39, 40, 57, 67, 69, 73, 80, 94], "feed": [11, 21, 73, 74, 84, 88], "instant": [11, 20], "affect": [11, 12, 17, 21, 27, 31, 70, 73, 76, 82, 87, 88, 94], "learning_r": [11, 17, 18, 21, 27, 33, 57, 61, 70, 84, 85, 87], "001": [11, 12, 21, 27, 33, 40, 61, 70, 73, 82, 100, 102], "plot_loss": [11, 62, 94], "plot_full_loss": 11, "encoder_optim": 11, "decoder_optim": 11, "c_input": 11, "c_output": 11, "c_target": 11, "dtype": [11, 12, 17, 21, 25, 27, 28, 57, 65, 67, 82, 84, 87, 97], "acc_loss": 11, "c_batch_siz": 11, "r_target": 11, "c_loss": 11, "useful": 11, "ceil": [11, 17, 21, 85], "300": [11, 20, 27, 33, 57, 67, 69, 70, 73, 76, 87, 94], "epoch_error": 
11, "batch_error": 11, "nllloss": [11, 64, 65], "reduct": [11, 31, 57, 69, 70, 73, 80], "seq2seq": 11, "partiar": 11, "repeat": [11, 28, 33, 35, 39, 62, 65, 67, 69, 70, 73, 80, 88, 94], "eof": 11, "candid": [11, 85, 100, 102], "beam": [11, 76], "search": [11, 21, 31, 35, 40, 61, 70, 84, 85, 91], "gen_transl": 11, "pt_out": 11, "idx": [11, 12, 21, 33, 57, 69, 70, 80, 84, 85, 87], "troubl": [11, 69], "gra": 11, "fat": [11, 84], "exhaust": [11, 35], "gro": [11, 88], "fit": [11, 12, 15, 25, 33, 35, 37, 38, 39, 40, 57, 64, 69, 73, 74, 76, 80, 81, 88], "touch": [11, 69], "hit": [11, 39, 40], "touche": 11, "malad": 11, "ill": [11, 52, 62], "trist": 11, "sad": [11, 12], "timid": 11, "shy": 11, "mouill": 11, "wet": 11, "mouille": 11, "revenu": 11, "revoila": 11, "seriou": [11, 69], "chauv": 11, "bald": [11, 76], "occup": 11, "occupe": 11, "calm": 11, "froid": 11, "cold": [11, 62, 84], "fini": 11, "fine": [11, 19, 73, 80, 87, 91], "libr": 11, "dispon": 11, "repu": 11, "rassasi": 11, "chez": 11, "retard": 11, "paresseux": 11, "lazi": [11, 43], "faineant": 11, "paresseus": 11, "okai": 11, "port": [11, 43], "candidat": 11, "aux": 11, "presidentiel": 11, "americain": 11, "american": [11, 76, 88], "presidenti": 11, "mood": 11, "eglis": 11, "contribut": [11, 20, 39, 102], "church": [11, 76], "ag": [11, 19, 76, 84], "quelqu": 11, "difficult": [11, 16, 20, 26, 27, 28, 31, 52, 54, 74, 94, 97], "compil": [11, 30], "programm": 11, "entreprend": 11, "laboratoir": 11, "carri": [11, 12, 27, 33, 62, 94, 101], "laboratori": [11, 76], "seulement": 11, "bell": [11, 35, 39, 76], "intelligent": 11, "intellig": [11, 15, 57, 88, 97, 101], "smart": [11, 31, 61, 65, 73, 74], "enqueton": 11, "meurtr": 11, "jackson": 11, "investig": [11, 25, 27, 33, 35, 36, 40, 67, 70, 76, 94, 101], "murder": 11, "recept": 11, "hypnotiqu": 11, "suscept": 11, "hypnot": 11, "job": [11, 39, 57, 82], "trouv": 11, "redir": 11, "autr": 11, "fault": 11, "complain": 11, "pens": 11, "apprendr": 11, "coreen": 11, "semestr": 
11, "prochain": 11, "korean": 11, "semest": 11, "jeun": 11, "comprendr": 11, "critiqu": 11, "defaut": 11, "shortcom": 11, "attendon": 11, "ouvrag": 11, "invent": 11, "histoir": 11, "interessant": 11, "stori": [11, 31, 73, 76, 87], "mari": 11, "husband": 11, "di": [11, 76], "constam": 11, "comport": 11, "constantli": 11, "behav": [11, 12, 40, 57, 64, 69], "herself": 11, "interpret": [11, 20, 35, 38, 39, 73, 81, 88], "banqu": 11, "international": 11, "bank": [11, 76, 84], "expert": [11, 31, 74, 91, 103], "litteratur": 11, "francais": 11, "acquaint": 11, "literatur": [11, 23, 27, 31, 36, 37, 39, 40, 81], "batail": 11, "croyanc": 11, "religieus": 11, "grappl": 11, "religi": [11, 84], "belief": 11, "compet": [11, 26, 67], "espagnol": 11, "italien": 11, "profici": [11, 101], "spanish": [11, 89], "italian": [11, 76], "love": [11, 12, 82, 87], "quitt": 11, "narita": 11, "hawaii": 11, "soir": 11, "leav": [11, 31, 65, 80, 89], "amus": 11, "jouant": 11, "jeux": 11, "video": [11, 31, 33, 40, 46, 51], "himself": 11, "plai": [11, 15, 20, 31, 36, 38, 67, 69, 73, 76, 80, 82, 85, 87, 97], "game": [11, 26, 46, 101], "discuteron": 11, "demain": 11, "discuss": [11, 17, 27, 31, 35, 46, 57, 64, 69, 70, 73, 74, 76, 77, 81, 82, 84, 88, 91, 97, 101], "tomorrow": [11, 57, 101], "notr": [11, 31], "nouveau": 11, "voisin": 11, "neighbor": [11, 17, 85, 101], "tard": 11, "recevoir": 11, "repons": 11, "receiv": [11, 15, 25, 27, 31, 34, 43, 57, 67, 77, 97], "repli": 11, "hate": 11, "ta": [11, 23, 46], "frighten": 11, "clotur": 11, "compt": 11, "epargn": 11, "father": 11, "etonn": 11, "attitud": 11, "irrespons": 11, "alarm": [11, 88], "inquiet": 11, "autorit": 11, "reconnu": 11, "sujet": 11, "recogn": [11, 16, 74, 76, 77, 85, 88], "subject": [11, 15, 19, 25, 26, 31, 76], "invit": [11, 23, 31, 87], "guest": 11, "rejouisson": 11, "revoir": 11, "dispose": 11, "discut": 11, "willing": 11, "talk": [11, 31, 33, 43, 73, 101], "dispos": 11, "parlent": 11, "vont": 11, "chanter": 11, "sing": 11, "parol": 11, 
"cett": 11, "institut": [11, 15, 17, 19], "spokesperson": 11, "did": [11, 16, 23, 31, 33, 34, 36, 38, 39, 40, 57, 60, 61, 62, 67, 70, 73, 74, 76, 85, 97], "empir": [11, 61, 65, 67], "metric": [11, 12, 17, 27, 33, 84, 85], "blue": [11, 35, 39, 61, 69, 70, 73, 76, 77, 81, 85], "score": [11, 12, 15, 18, 21, 57, 65, 84, 85], "writ": 11, "rigth": 11, "happen": [11, 16, 17, 19, 23, 31, 33, 34, 53, 57, 64, 65, 67, 70, 73, 74, 76, 77, 80, 85, 87, 91, 97], "would": [11, 12, 16, 17, 23, 25, 27, 28, 31, 33, 34, 35, 36, 38, 39, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 84, 85, 88, 89, 91, 94, 100, 101], "attent": [11, 12, 31, 46, 57, 67, 73, 74, 82, 85, 91, 100, 101, 102], "proper": [11, 21, 74, 97], "noun": 11, "jointli": [11, 101], "gonzalo": 12, "uribarri": 12, "infer": [12, 33, 35, 74, 80, 81, 82, 88], "emot": 12, "tweet": 12, "accross": 12, "torchtext": [12, 84, 85, 87], "tensordataset": [12, 64, 65], "get_token": [12, 87], "classification_report": 12, "linear_model": [12, 35, 39], "logisticregress": [12, 35, 39], "model_select": [12, 16, 35, 39], "train_test_split": [12, 16], "feature_extract": 12, "countvector": 12, "websit": [12, 19, 21, 31, 33, 43, 52, 76, 100, 101], "stanford": 12, "alecmgo": 12, "trainingandtestdata": 12, "header_list": 12, "polar": [12, 76], "date": [12, 62], "queri": [12, 25, 57, 85, 87, 88, 101], "df": [12, 25, 40, 57, 60], "1600000": 12, "noemoticon": 12, "iso": 12, "8859": 12, "1467810369": 12, "apr": 12, "06": [12, 16, 25, 57], "pdt": 12, "2009": [12, 27], "no_queri": 12, "_thespecialone_": 12, "switchfoot": 12, "twitpic": 12, "2y1zl": 12, "awww": 12, "1467810672": 12, "scotthamilton": 12, "upset": [12, 84], "facebook": [12, 89], "1467810917": 12, "mattycu": 12, "kenichan": 12, "dive": [12, 65, 73], "ball": [12, 26, 35, 39, 76], "1467811184": 12, "ellectf": 12, "bodi": [12, 15, 25, 28, 33, 35, 36, 43, 73], "itchi": 12, "1467811193": 12, "karoli": 12, "nationwideclass": 12, "neg": [12, 17, 27, 31, 57, 60, 61, 62, 64, 65, 67, 69, 
70, 73, 76, 77, 80, 81, 82, 84, 87, 89, 100, 102], "x_train_text": 12, "x_test_text": 12, "y_train": [12, 35, 39, 61, 62, 64, 65, 81], "y_test": [12, 35, 39, 64, 65, 69, 70], "test_siz": [12, 16, 64, 65, 67], "random_st": [12, 35, 39, 80, 87], "stratifi": [12, 16], "ourselv": [12, 64, 67, 69], "exploratori": [12, 67], "analisi": 12, "eda": 12, "paisleypaislei": 12, "lol": 12, "advanc": [12, 23, 35, 39, 46, 62, 67, 101], "june": 12, "third": [12, 35, 39, 57, 62, 73, 84, 101], "knitter": 12, "summer": [12, 23], "worst": [12, 33, 69, 87], "headach": 12, "ever": [12, 33, 54, 69, 74], "ewaniesciuszko": 12, "wont": 12, "yeah": 12, "18th": 12, "spell": [12, 34, 87], "conk": 12, "quot": 12, "stand": [12, 36, 40, 73], "gone": 12, "everyon": [12, 31], "basic_english": [12, 87], "x_train_token": 12, "x_test_token": 12, "occur": [12, 33, 35, 36, 39, 40, 46, 57, 64, 69, 70, 73, 85, 88], "present": [12, 19, 23, 25, 33, 40, 46, 62, 67, 69, 73, 88, 94], "sorted_word": 12, "669284": 12, "todai": [12, 31, 39, 46, 57, 60, 65, 67, 69, 70, 73, 74, 76, 80, 81, 82, 84, 101], "got": [12, 31, 57, 73, 74, 76, 85, 88, 100, 102], "had": [12, 15, 31, 39, 40, 70, 73, 76, 80, 85, 88, 91, 101], "amp": 12, "night": [12, 16, 76, 82, 101], "thank": [12, 87], "oh": 12, "13970153178620734": 12, "00532743602652": 12, "zipf": [12, 16], "law": 12, "dictionari": [12, 16, 43, 62, 67, 69, 70, 76, 84, 85, 87, 89, 100, 102], "puntuat": 12, "steam": [12, 76], "uncommon": 12, "appear": [12, 15, 17, 57, 67, 73, 76, 81, 84, 85, 87, 94, 101], "fewer": [12, 23, 33, 36, 65, 67, 73, 80, 88, 94, 97], "occat": 12, "noth": [12, 27, 28, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "simplest": [12, 39, 40, 61, 81, 97], "quirk": 12, "languag": [12, 23, 26, 31, 36, 37, 45, 46, 80, 81, 82, 87, 89], "moreov": [12, 67], "difer": 12, "univers": [12, 31, 48, 69, 73], "better": [12, 16, 18, 21, 25, 27, 31, 33, 34, 36, 39, 57, 60, 61, 62, 67, 69, 70, 73, 74, 76, 80, 82, 85, 
87, 88, 89, 91, 94, 97, 100, 101], "spaci": 12, "access": [12, 16, 21, 28, 35, 43, 46, 53, 54, 57, 60, 81, 88, 97, 100], "laguag": 12, "nltk": [12, 80, 85], "toktok": 12, "probali": 12, "svm": 12, "repres": [12, 16, 28, 37, 40, 57, 61, 62, 67, 81, 82, 84, 85, 87, 88, 97, 100], "binari": [12, 21, 36, 40, 57, 70, 73, 84, 87], "otherwis": [12, 31, 33, 35, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 97, 100, 102], "sklean": 12, "document": [12, 23, 27, 43, 64, 65, 67, 84, 85, 88], "x_train_cv": 12, "fit_transform": [12, 77, 87], "x_test_cv": 12, "matriz": 12, "spars": [12, 33, 60, 85, 87], "528584": 12, "165468": 12, "300381": 12, "242211": 12, "489893": 12, "134160": 12, "regressor": 12, "solver": [12, 35, 39, 82], "saga": [12, 35, 39], "class_weight": 12, "dual": [12, 101], "fit_intercept": 12, "intercept_sc": 12, "l1_ratio": 12, "max_it": [12, 67], "multi_class": 12, "auto": [12, 15, 20, 43, 64, 65, 67, 69, 70, 101], "n_job": 12, "penalti": [12, 27, 70, 84], "l2": [12, 17, 60, 61, 73, 84, 89, 94], "tol": 12, "warm_start": 12, "recal": [12, 60, 61, 62, 65, 67, 73, 76, 80, 81, 84, 85, 88], "f1": 12, "160000": 12, "320000": 12, "macro": 12, "avg": [12, 100], "regres": 12, "explan": [12, 34, 37, 38, 65, 67, 88], "coef_": [12, 35, 39], "vocabulary_": 12, "words_sk": 12, "589260": 12, "roni": 12, "862597673594883": 12, "inaperfectworld": 12, "5734362290886375": 12, "dontyouh": 12, "500197620227523": 12, "xbllygbsn": 12, "412645372640648": 12, "anqju": 12, "336405291553548": 12, "200522312464158": 12, "pakcricket": 12, "1949158120163412": 12, "condol": 12, "132498019366488": 12, "heartbreak": 12, "066508733796654": 12, "saddest": 12, "041999809733714": 12, "sadd": 12, "029070563580306": 12, "heartbroken": 12, "0287688233900174": 12, "boohoo": 12, "022608649696793": 12, "sadfac": 12, "9918411285807234": 12, "rachelle_lefevr": 12, "925057253107806": 12, "disappoint": 12, "902524113779547": 12, "lvbu": 12, "894705935001672": 12, 
"sadden": 12, "8855127179984654": 12, "bum": 12, "83650014970307": 12, "neda": 12, "792944556837498": 12, "iamsoannoi": 12, "8494314732277672": 12, "myfax": 12, "797451563471618": 12, "jennamadison": 12, "5667257393706113": 12, "yeyi": 12, "478028598852801": 12, "tryout": 12, "4383315790116677": 12, "goldymom": 12, "4374026022205535": 12, "wooohooo": 12, "40297322137544": 12, "thesupergirl": 12, "3565118467330004": 12, "iammaxathotspot": 12, "311648368632618": 12, "londicr": 12, "3074490293400993": 12, "smilin": 12, "2991891636718216": 12, "worri": [12, 57, 80], "2899429774914717": 12, "sinfulsignorita": 12, "2798963640981817": 12, "finchensnail": 12, "264302079155878": 12, "smackthi": 12, "2376679263761083": 12, "kv": 12, "2158393907798413": 12, "tojosan": 12, "211784259253832": 12, "russmarshalek": 12, "2095374025599384": 12, "traciknopp": 12, "1768297770350835": 12, "congratul": [12, 70, 84], "171590496227557": 12, "rememb": [12, 25, 28, 33, 35, 36, 38, 60, 62, 67, 69, 74, 77, 80, 82, 88, 100, 101], "sigma": [12, 40, 61, 62, 64, 65, 74, 76, 80, 81], "wx": 12, "previou": [12, 15, 16, 17, 21, 31, 35, 38, 39, 40, 43, 57, 61, 62, 64, 67, 73, 74, 76, 80, 84, 85, 88], "That": [12, 31, 33, 35, 36, 38, 57, 60, 67, 73, 74, 76, 84, 88, 89], "mea": 12, "didnt": 12, "But": [12, 33, 35, 36, 39, 40, 43, 62, 64, 65, 67, 69, 70, 73, 74, 80, 81, 82, 87, 88, 89, 100], "solv": [12, 19, 27, 37, 62, 67, 81, 84, 88, 97, 101], "unlik": [12, 53, 57, 62, 73, 88, 100], "feedforward": [12, 84], "cyclic": 12, "power": [12, 43, 57, 60, 62, 67, 69, 70, 76, 80, 84, 97], "word_to_idx": 12, "integ": [12, 15, 21, 33, 35, 39, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "limit": [12, 15, 18, 25, 26, 31, 33, 34, 37, 39, 40, 43, 61, 77, 85, 88, 91, 101], "num_words_dict": 12, "ditionari": 12, "reserv": 12, "most_used_word": 12, "extra": [12, 17, 23, 27, 31, 33, 34, 80, 85, 97], "outsid": [12, 17, 21, 35, 39, 40, 43, 57, 64, 65, 73, 76, 84, 88, 94], 
"unk": [12, 87, 88], "idx_to_word": 12, "pad_token": [12, 88], "unk_token": [12, 88], "popul": [12, 20, 39], "num": [12, 57, 77, 80, 85, 100, 102], "These": [12, 15, 17, 23, 27, 31, 33, 35, 36, 46, 57, 60, 62, 64, 67, 70, 73, 74, 77, 81, 82, 85, 87, 88, 89, 97, 101, 102], "tokens_to_idx": 12, "sentences_token": 12, "sentences_idx": 12, "sent": [12, 84, 87], "sent_idx": 12, "x_train_idx": 12, "x_test_idx": 12, "some_numb": 12, "721": [12, 76], "237": [12, 76, 80], "adequ": [12, 27], "tweet_len": 12, "asarrai": [12, 65, 73, 76], "median": [12, 33, 39, 80], "quantil": 12, "maximum": [12, 17, 20, 21, 25, 27, 28, 39, 43, 65, 67, 69, 70, 73, 77, 84, 85, 94, 97, 100, 102], "max_lenght": 12, "shorter": 12, "lenght": 12, "seq_len": 12, "ii": [12, 28, 46, 64, 65], "len_tweet": 12, "x_train_pad": 12, "x_test_pad": 12, "y_train_np": 12, "y_test_np": 12, "122": [12, 20, 69, 76], "209": [12, 76], "667": [12, 76], "138": [12, 76, 84], "3296": 12, "train_data": [12, 16, 17, 64, 65, 67, 73, 84, 87], "valid_data": [12, 87], "hyperparamet": [12, 27, 28, 31, 33, 36, 67, 69, 84, 85, 87, 88, 94, 100, 102], "valid_load": 12, "trane": 12, "proccess": 12, "folllow": 12, "datait": [12, 65, 76, 77], "sample_x": 12, "sample_i": 12, "seq_length": [12, 84], "7447": 12, "14027": 12, "22241": 12, "2702": 12, "162": [12, 76, 84], "12904": 12, "sentimentrnn": 12, "pai": [12, 31, 57, 67, 73, 74, 76, 91, 100, 101, 102], "posibl": 12, "inedex": 12, "space": [12, 17, 20, 21, 27, 36, 57, 61, 64, 70, 73, 74, 76, 77, 80, 82, 84, 85, 87, 88, 89, 91, 100, 102], "embedding_dim": 12, "thread": [12, 43, 84], "particular": [12, 15, 16, 31, 33, 35, 39, 40, 57, 65, 67, 73, 80, 81, 94, 97], "lstm": [12, 33, 57, 84, 88], "decid": [12, 33, 34, 36, 38, 39, 40, 67, 84, 85, 97, 100], "no_lay": 12, "strongli": [12, 62], "colah": 12, "vocab_s": [12, 84, 87, 88], "hidden_dim": [12, 67], "drop_prob": 12, "output_dim": [12, 62, 82], "num_lay": [12, 20], "sigmoid": [12, 21, 76, 82], "fc": [12, 20, 76], "sig": [12, 40, 80], 
"emb": [12, 57, 74, 81, 82, 84, 91], "lstm_out": 12, "activ": [12, 15, 16, 19, 20, 21, 25, 35, 39, 43, 57, 60, 62, 64, 65, 67, 73, 76, 80, 81, 82, 91, 101], "contigu": [12, 84, 100, 102], "across": [12, 15, 17, 18, 19, 27, 31, 35, 36, 39, 40, 57, 62, 64, 67, 70, 73, 80, 82, 84, 85, 87, 88, 94, 100, 101, 102], "sig_out": 12, "init_hidden": 12, "n_layer": 12, "h0": 12, "c0": [12, 20], "vocabulari": [12, 84, 85, 87, 88], "regular": [12, 17, 39, 46, 60, 67, 76, 84, 91, 101], "move": [12, 26, 27, 31, 33, 35, 39, 40, 43, 57, 67, 73, 76, 80, 84, 85, 88, 100, 102], "model_paramet": 12, "filter": [12, 17, 21, 36, 40, 57, 67, 80, 84, 85, 87, 88], "prod": [12, 28, 61, 67], "1018433": 12, "procc": 12, "crossentropi": 12, "bceloss": 12, "round": [12, 21, 31, 62, 76, 77, 84, 87, 100, 102], "absolut": [12, 31, 67, 69, 70, 73, 84, 94, 101], "accept": [12, 15, 34, 43, 70, 82, 85, 87, 100, 102], "gradeint": 12, "assum": [12, 21, 25, 28, 35, 62, 64, 65, 73, 74, 76, 80, 84, 88, 97, 100, 101], "big": [12, 21, 31, 43, 46, 67, 69, 70, 81, 84, 88, 101], "valid_loss_min": 12, "evolut": [12, 60, 61, 62, 67, 81], "epoch_tr_loss": 12, "epoch_vl_loss": 12, "epoch_tr_acc": 12, "epoch_vl_acc": 12, "backprop": [12, 33], "clip_grad_norm": 12, "prevent": [12, 34, 35, 37, 67, 81], "explod": [12, 20, 31, 61, 65, 81, 82], "clip_grad_norm_": [12, 87], "val_loss": [12, 67, 87], "val_acc": [12, 67, 69, 70, 87], "val_h": 12, "epoch_train_loss": 12, "epoch_val_loss": 12, "epoch_train_acc": 12, "epoch_val_acc": 12, "val_accuraci": 12, "6f": 12, "pt": [12, 67, 85, 88], "4367361353733577": 12, "39174133955966683": 12, "530625": 12, "3628125": 12, "391741": 12, "3765802335098851": 12, "3724124691961333": 12, "19140625": 12, "42031250000001": 12, "372412": 12, "35746844720793886": 12, "365050206175074": 12, "16882812499999": 12, "7440625": 12, "365050": 12, "34491546426317654": 12, "36467386982403693": 12, "879140625": 12, "364674": 12, "33429012800217606": 12, "36189084346871825": 12, "44296875": 12, 
"0221875": 12, "361891": 12, "grid": [12, 28, 57, 70, 76, 80, 81, 87, 100], "migth": 12, "preprocces": 12, "rudimentari": [12, 19], "correctli": [12, 17, 25, 31, 39, 43, 57, 64, 73, 81, 82, 84, 94, 100], "propos": [12, 27, 31, 33, 35, 64, 94, 101], "hyperparament": 12, "bidirecton": 12, "learnt": [12, 16, 73], "beliv": 12, "youtub": 12, "kshitij": 15, "dwivedi": 15, "produtct": [15, 17, 21], "colab": [15, 16, 19, 21, 27, 31, 33, 35, 51, 54, 62, 65, 85, 87, 88, 101], "agre": 15, "educ": 15, "NOT": [15, 17, 35, 54, 60, 84, 88, 94], "thereof": 15, "massachusett": 15, "technolog": 15, "warranti": 15, "regard": [15, 33, 35, 37, 40, 69], "infring": 15, "shall": [15, 76, 81, 82, 85], "defend": [15, 70], "indemnifi": 15, "corpor": 15, "employe": 15, "offic": 15, "agent": [15, 27, 84, 97, 101], "against": [15, 31, 38, 40, 60, 70, 81, 85, 94], "claim": [15, 34, 67], "aris": [15, 40, 57, 67, 84], "copyright": 15, "treat": [15, 57, 74, 76, 80, 85, 88], "digniti": 15, "guarante": [15, 31, 35, 62, 67, 80, 84, 85, 94], "liabil": 15, "roi": [15, 18], "glass": [15, 76], "nilearn": 15, "challeng": [15, 16, 33, 61, 67, 73, 94, 97], "particip": [15, 20, 25, 33, 36, 52, 81, 88], "decord": 15, "pickl": [15, 67, 100], "nibabel": 15, "nib": 15, "fsaverag": 15, "fetch_surf_fsaverag": 15, "year": [15, 31, 85, 88, 101], "googl": [15, 16, 21, 23, 27, 30, 31, 48, 51, 54, 84, 87], "must": [15, 21, 27, 31, 38, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 88, 89, 94, 100, 102], "dropbox_link": 15, "myurl": 15, "participants_data": 15, "agxyxntrbwko7t1": 15, "fname1": [15, 76], "participants_data_v2021": 15, "fname2": [15, 76], "algonautsvideos268_all_30fpsmax": 15, "dynam": [15, 17, 36, 38, 60, 62, 64, 67, 97], "cognit": [15, 20], "fub": 15, "algonauts2021_devkit": 15, "nii": 15, "submit": [15, 23, 31], "everydai": 15, "event": [15, 23, 46, 48, 57, 74, 101], "magnet": [15, 76], "reson": 15, "high": [15, 17, 19, 27, 31, 37, 62, 70, 73, 74, 76, 80, 81, 82, 87, 88, 94, 97, 100, 
102], "spatial": [15, 17, 18, 37, 64, 82], "resolut": [15, 19, 37, 40, 73, 82], "blood": [15, 62, 101], "flow": [15, 31, 37, 38, 43, 67, 81, 84, 101], "independ": [15, 61, 62, 65, 74, 81], "voxel": 15, "reliabl": [15, 62, 67], "region": [15, 16, 20, 54, 73], "known": [15, 33, 35, 39, 40, 60, 70, 73, 74, 77, 80, 81, 82, 87, 97], "role": [15, 39, 69, 76, 84, 88, 94], "earli": [15, 23, 31, 38, 70, 73], "mid": [15, 81, 87], "cortex": [15, 17, 18, 19, 74], "v2": [15, 16, 27, 57, 87, 89], "v3": 15, "v4": [15, 16, 27], "higher": [15, 33, 36, 39, 40, 61, 62, 65, 67, 70, 73, 80, 84, 87, 88, 100], "respond": [15, 25, 31, 43, 82, 88], "preferenti": 15, "eba": 15, "face": [15, 25, 76, 85, 94, 101], "ffa": 15, "st": [15, 76, 87], "loc": [15, 21, 25, 39, 61, 67, 76], "scene": [15, 28, 84], "ppa": 15, "pkl": 15, "num_video": 15, "num_repetit": 15, "num_voxel": 15, "signific": [15, 38, 57, 67, 70], "demonstr": [15, 27, 38, 39, 40, 57, 67, 76, 80, 101], "save_dict": 15, "di_": 15, "filename_": 15, "load_dict": 15, "rb": [15, 27, 57, 73, 82, 100, 102], "_unpickl": 15, "latin1": 15, "ret_di": 15, "visualize_act": 15, "vid_id": 15, "fmri_dir": 15, "full_track": 15, "track_dir": 15, "sub_fmri_dir": 15, "nifti": 15, "fmri_train_al": 15, "voxel_mask": 15, "get_fmri": 15, "visual_mask_3d": 15, "brain_mask": 15, "nii_save_path": 15, "vid_act": 15, "saveasnii": 15, "plot_glass_brain": 15, "plot_ab": 15, "display_mod": 15, "lyr": 15, "train_vid": 15, "repetit": [15, 35, 36, 39, 40], "roi_fil": 15, "roi_data": 15, "roi_data_train": 15, "nii_data": 15, "nii_img": 15, "nifti1imag": 15, "header": [15, 43, 88], "sub05": 15, "sub01": 15, "sub02": 15, "sub03": 15, "sub04": 15, "sub06": 15, "sub07": 15, "sub08": 15, "sub09": 15, "sub10": 15, "wrapper": [15, 25, 27, 28, 57, 76, 85], "mini_track": 15, "heatmap": [15, 21], "stimulu": [15, 18, 25, 39], "aspect": [15, 20, 31, 35, 36, 37, 38, 74, 88, 91, 101], "vmin": [15, 17, 20, 21, 28, 62, 73, 82], "vmax": [15, 17, 20, 21, 28, 62, 73, 82], "shrink": 
[15, 73], "tight_layout": [15, 17, 18, 21, 62, 64, 67, 76], "individu": [15, 17, 31, 33, 36, 38, 46, 62, 74, 76, 77, 85, 88], "999": [15, 25, 76, 82, 94], "base64": [15, 27, 73], "b64encod": [15, 27, 73], "video_dir": 15, "video_list": 15, "mp4": [15, 27], "data_url": 15, "400": [15, 20, 27, 33, 35, 39, 76], "control": [15, 19, 20, 25, 27, 31, 36, 43, 52, 53, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 101, 102], "src": [15, 27, 28, 39, 43, 44, 45, 73], "5s": [15, 67], "9s": 15, "onset": 15, "finit": [15, 76], "impuls": 15, "fir": 15, "radoslaw": 15, "martin": [15, 97], "cichi": [15, 19], "benjamin": [15, 57], "lahner": 15, "lascel": 15, "polina": [15, 60, 61, 62, 73, 76, 77], "iamshchinina": 15, "monika": 15, "graumann": 15, "andonian": 15, "apurva": 15, "ratan": 15, "murti": 15, "kendrick": 15, "kai": [15, 19, 34], "gemma": 15, "roig": 15, "aud": 15, "oliva": 15, "motion": [15, 33, 35, 36, 39, 40], "2104": 15, "13714v1": 15, "yalda": 15, "mohsenzadeh": 15, "kandan": 15, "ramakrishnan": 15, "platform": [15, 43, 57, 76], "commun": [15, 31, 34, 39, 52, 64, 67, 88], "biolog": [15, 17, 101], "1905": 15, "05675": 15, "rishika": [16, 20], "mohanta": [16, 20], "furkan": 16, "\u00f6z\u00e7elik": 16, "imagin": [16, 61, 74, 82, 91], "spectacl": 16, "blur": [16, 21, 81], "stumbl": 16, "anim": [16, 28, 39, 57, 62, 73, 81, 97], "walk": [16, 20, 33, 34, 57, 73, 76], "ye": [16, 33, 39, 40, 57, 62, 73], "foggi": 16, "condit": [16, 17, 28, 35, 36, 39, 40, 57, 61, 65, 80, 81, 85, 87, 88, 97], "poor": [16, 77], "qualiti": [16, 80, 87, 94], "low": [16, 17, 19, 27, 31, 33, 39, 62, 73, 74, 80, 81, 82, 88, 91, 100, 102], "Is": [16, 38, 64, 67, 73, 80, 91, 94, 101], "torch_intermediate_layer_gett": [16, 18], "wheel": [16, 18, 25, 28, 76], "getter": [16, 18], "25l": [16, 18, 28], "25hdone": [16, 18, 28], "pil": [16, 18, 43, 73, 76, 77], "imagefilt": 16, "copyfil": 16, "intermediatelayergett": [16, 18], "layergett": 16, "summarywrit": 16, 
"load_ext": [16, 84, 85], "asirra": 16, "captcha": 16, "autom": [16, 57, 70, 76], "ture": [16, 101], "apart": [16, 69, 77], "hip": [16, 76], "proof": 16, "motiv": [16, 25, 60, 67, 74, 97, 101], "behind": [16, 35, 62, 65, 67, 82, 88, 100, 102], "creation": [16, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "speci": 16, "restrict": [16, 82, 85], "photograph": 16, "accomplish": [16, 38, 88], "competit": [16, 30, 100], "dataset_blur_2": 16, "dataset_blur_5": 16, "subfold": 16, "gaussian": [16, 21, 35, 39, 65, 74, 80, 82], "radiu": [16, 21, 57], "catvdog_clear": 16, "catvdog_blur_2": 16, "catvdog_blur_5": 16, "hj2gd": 16, "xp6qd": 16, "wj43a": 16, "zip_ref": [16, 80, 84], "appropri": [16, 23, 31, 36, 37, 39, 64, 67, 69, 70, 73, 81, 85, 94], "resiz": [16, 17, 18, 27, 43, 73, 76, 77], "clear_train_data": 16, "clear_test_data": 16, "noisy_train_data": 16, "noisy_test_data": 16, "validation_split": 16, "val_ratio": 16, "train_indic": [16, 76], "val_indic": 16, "train_split": 16, "subset": [16, 20, 21, 28, 33, 35, 39, 57, 67, 70, 76, 77, 84, 85, 87, 88, 94], "val_split": 16, "clear_train_split": 16, "clear_val_split": 16, "clear_train_batch": 16, "clear_val_batch": 16, "clear_test_batch": 16, "noisy_train_split": 16, "noisy_val_split": 16, "noisy_train_batch": 16, "noisy_val_batch": 16, "noisy_test_batch": 16, "clear_cat_imag": 16, "clear_dog_imag": 16, "19997": 16, "noisy_cat_imag": 16, "noisy_dog_imag": 16, "141": [16, 76], "142": [16, 76], "143": [16, 76], "144": [16, 76], "tri": [16, 33, 34, 39, 40, 85, 101], "schrimpf": 16, "categoris": 16, "faster": [16, 21, 35, 36, 39, 61, 67, 69, 73, 76, 80, 94], "architechtur": 16, "rel": [16, 19, 21, 31, 33, 35, 39, 40, 57, 62, 67, 76, 77, 80, 82, 84, 87, 88, 94], "computation": [16, 67], "feasibl": [16, 31], "down": [16, 17, 21, 27, 28, 31, 34, 35, 36, 37, 54, 57, 61, 62, 67, 73, 80, 81, 82, 85, 88, 91, 97], "retina": [16, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 
89, 94], "lgn": 16, "192": [16, 76], "avgpool": [16, 73], "adaptiveavgpool2d": 16, "visualis": [16, 67, 73], "add_graph": 16, "logdir": [16, 85], "train_batch": 16, "val_batch": 16, "training_loss": [16, 64, 65, 87], "reset": [16, 25, 27, 28, 57, 60, 64, 73, 76, 82, 97], "train_count": 16, "test_count": 16, "blurri": [16, 94], "h_0": 16, "h_1": 16, "3e": [16, 65], "num_pretraining_epoch": 16, "num_training_epoch": 16, "naive_before_train": 16, "naive_training_loss": 16, "naive_validation_loss": 16, "naive_after_train": 16, "arang": [16, 17, 21, 28, 35, 39, 57, 60, 61, 62, 64, 73, 76, 81, 84, 85, 87], "array_split": 16, "17953": 16, "00": [16, 27, 28, 31, 33, 46, 94], "2494": 16, "09": [16, 33, 61], "distinguish": [16, 33, 35, 36, 39, 77, 80, 100], "expert_before_train": 16, "expert_after_pretrain": 16, "experienced_training_loss": 16, "experienced_validation_loss": 16, "expert_after_train": 16, "axvlin": [16, 67, 69, 94], "linestyl": [16, 67, 70], "dash": [16, 39, 69, 70, 94], "01": [16, 20, 25, 27, 57, 60, 61, 62, 67, 73, 81, 82, 94], "29": [16, 20, 65, 67, 73, 76, 81, 82, 85, 94, 97, 100], "thats": 16, "seen": [16, 35, 39, 60, 61, 62, 65, 67, 70, 74, 76, 77, 80, 84, 94, 101], "further": [16, 20, 31, 38, 39, 70, 73, 81, 82, 85, 97], "itself": [16, 25, 67, 94, 97], "OR": [16, 31, 36, 57], "plot_filt": 16, "filter_index": [16, 76], "row_index": [16, 76], "col_index": [16, 76], "filter_imag": [16, 76], "scaled_imag": [16, 76], "meaning": [16, 35, 87, 89], "visibl": [16, 27, 85, 94], "intermidi": 16, "return_lay": [16, 18], "plot_intermediate_lay": 16, "intermediate_output": [16, 76], "complex": [16, 26, 27, 37, 38, 57, 60, 61, 62, 67, 69, 70, 73, 94, 97, 101], "appar": [16, 38, 82], "clearli": [16, 33, 34, 35, 37, 39, 40, 69, 74], "somewhat": [16, 39, 73, 84, 100], "focus": [16, 74, 101], "contibut": 16, "respons": [16, 19, 25, 31, 43, 67, 76, 84, 85, 87, 88], "wish": [16, 27, 60, 85, 94, 100], "variat": [16, 40, 60, 77], "wget": 16, "certif": [16, 46, 52], 
"microsoft": [16, 88], "3e1c3f21": 16, "ecdb": 16, "4869": 16, "8368": 16, "6deba77b919f": 16, "kagglecatsanddogs_3367a": 16, "local_zip": 16, "cat_fold": 16, "petimag": 16, "dog_fold": 16, "check_fil": 16, "cat_fil": 16, "dog_fil": 16, "test_ratio": 16, "training_length": 16, "test_indic": [16, 76], "jpg": [16, 76, 77], "gaussianblur": [16, 21], "zipdir": 16, "ziph": 16, "dir": [16, 60, 77], "relpath": 16, "zip_defl": 16, "carsen": 17, "stringer": [17, 19], "cellular": [17, 76], "cultur": 17, "calcium": 17, "opencv": [17, 21], "numba": [17, 25, 28], "tifffil": 17, "cv2": [17, 21], "hashlib": 17, "jit": [17, 82], "gaussian_filt": 17, "find_object": 17, "binary_fill_hol": 17, "generate_binary_structur": 17, "linear_sum_assign": 17, "answer": [17, 23, 31, 33, 34, 35, 36, 37, 38, 39, 40, 57, 74, 76, 77, 84, 88, 91, 94, 101], "allow": [17, 20, 28, 33, 34, 36, 38, 39, 40, 52, 57, 64, 65, 67, 73, 80, 82, 84, 85, 88, 91, 97, 100, 101], "drug": 17, "surviv": 17, "reason": [17, 28, 31, 33, 52, 61, 62, 65, 67, 69, 74, 76, 80, 82, 87, 94, 100, 101], "tempor": [17, 37, 57, 64], "divis": [17, 67, 84, 94], "movement": [17, 33, 35, 36, 39, 40], "influx": 17, "quantif": 17, "protein": [17, 84], "rna": 17, "expresss": 17, "convolut": [17, 21, 31, 33, 35, 45, 67, 70, 74, 82, 84, 91, 94, 100], "curat": [17, 23, 27], "cytoplasm": 17, "stain": 17, "nuclear": 17, "cost": [17, 20, 31, 36, 60, 61, 70, 76, 85, 100], "transfer": [17, 21, 43, 57, 62, 100, 101], "ann": [17, 19, 21, 69, 70, 73, 87, 89, 91], "carpent": [17, 76], "lab": [17, 19, 21, 25, 57, 73, 76], "broad": [17, 31, 80], "mayb": [17, 28, 31, 33, 34, 36, 39, 40, 77, 91], "worm": [17, 76], "herd": 17, "bison": [17, 76], "rescal": [17, 21], "accordingli": [17, 61, 64, 97], "tool": [17, 19, 21, 31, 37, 38, 43, 57, 61, 62, 67, 76, 77], "napari": 17, "overfit": [17, 21, 70, 100, 102], "finish": [17, 21, 31, 38, 69, 70, 73, 82, 84, 85, 87, 102], "develop": [17, 21, 25, 31, 35, 38, 43, 57, 60, 61, 81, 88, 97, 100], "movi": [17, 19, 33, 
35, 36, 57, 76, 85, 91], "record": [17, 19, 25, 27, 33, 35, 39, 57, 60, 61, 62, 69, 73, 74, 88, 94, 100, 102], "microscop": 17, "therefor": [17, 33, 35, 36, 39, 43, 60, 61, 62, 64, 65, 69, 70, 73, 80, 84, 85, 91, 100], "though": [17, 28, 57, 60, 65, 74, 80, 88, 91, 94, 101], "frame": [17, 27, 37, 39, 40, 57, 69, 70, 76, 81], "suite2p": 17, "acknowledg": [17, 21], "borrow": [17, 21], "cellpos": 17, "mariu": [17, 20, 33, 34, 35, 36, 37, 38, 39, 40], "pachitariu": [17, 20, 35, 36], "kristin": [17, 21], "branson": [17, 21], "poseestim": 17, "cells_train": 17, "npz": [17, 18, 94], "cells_test": 17, "z3h78": 17, "ft5p3": 17, "expected_md5": 17, "85e1fe2ee8d936c1083d62563d79d958": 17, "e8f789abe20a7efde806d9ba03d20fd7": 17, "md5": 17, "hexdigest": 17, "corrupt": [17, 67], "allow_pickl": [17, 33, 36], "arr_0": 17, "imgs_train": 17, "masks_train": 17, "imgs_test": 17, "masks_test": 17, "mostli": [17, 33, 36, 62, 64, 77], "varieti": [17, 31, 34, 39, 80, 88], "fast": [17, 27, 36, 39, 61, 64, 67, 74, 85, 101], "normalize99": 17, "1st": [17, 57, 84], "percentil": [17, 21, 57], "99th": 17, "x01": 17, "x99": 17, "irand": 17, "nuclei": 17, "labels_train": 17, "labels_test": 17, "adapt": [17, 33, 43, 57, 65, 69], "random_rotate_and_res": 17, "scale_rang": 17, "xy": [17, 87], "do_flip": 17, "nimg": 17, "ly": 17, "lx": 17, "nd": [17, 67], "nchan": 17, "nlabel": 17, "IF": 17, "rand": [17, 20, 35, 39, 57, 61, 62, 69, 70, 73, 80, 81, 82, 94], "bool": [17, 28, 67, 73, 84, 85, 94, 97], "flip": [17, 62, 65, 70, 73, 91, 100], "imgi": 17, "ndim": [17, 81], "nt": [17, 20], "dxy": 17, "cc1": 17, "pts1": 17, "pts2": 17, "getaffinetransform": 17, "newaxi": [17, 81], "warpaffin": 17, "inter_linear": 17, "inter_nearest": 17, "img_batch": 17, "lbl_batch": 17, "local": [17, 20, 27, 28, 31, 48, 53, 60, 70, 77, 85, 87, 89, 91, 100, 102], "autoencod": [17, 81], "imagenet": [17, 18, 19, 43, 57, 91], "upsampl": [17, 21, 82], "ultim": [17, 33, 35, 36, 38, 88], "skip": [17, 31, 57, 65, 73, 76, 82, 94], 
"TO": [17, 65, 85], "propag": [17, 64, 65, 94, 102], "later": [17, 25, 28, 31, 33, 35, 36, 39, 40, 43, 57, 60, 62, 64, 65, 73, 74, 80, 88, 97, 100], "resnet_torch": 17, "convbatchrelu": 17, "sz": 17, "convdown": 17, "add_modul": [17, 64, 65], "conv_": 17, "nbase": 17, "maxpool": [17, 21], "conv_down_": 17, "xd": 17, "convup": 17, "conv_0": 17, "conv_1": 17, "scale_factor": [17, 21], "conv_up_": 17, "unet": [17, 21], "nout": 17, "nbaseup": 17, "t0": 17, "save_model": 17, "load_model": [17, 87, 89, 100, 102], "concaten": [17, 20, 28, 35, 39, 57, 64, 65, 80, 81], "put": [17, 21, 31, 33, 37, 39, 40, 43, 60, 61, 64, 76, 77, 81, 84, 87, 88, 89, 100], "colon": [17, 85], "datetim": [17, 21], "linearli": [17, 21, 33, 36, 40, 80, 82], "batchsiz": [17, 21, 73], "n_epoch": [17, 60, 61, 62, 82], "cycl": [17, 21], "n_epochs_per_sav": 17, "val_frac": [17, 21], "fraction": [17, 21, 27, 31, 69, 70, 80], "clean": [17, 21, 27, 43, 57, 81, 85, 88], "timestamp": [17, 21], "strftime": [17, 21], "dt": [17, 21, 35, 39, 40, 64, 81], "n_val": [17, 21], "n_train": [17, 21, 64, 84], "iperm": 17, "val_data": 17, "val_label": [17, 18, 87], "train_mask": 17, "val_mask": 17, "flavor": [17, 21, 31, 88], "schedul": [17, 21, 34, 82, 87], "linspac": [17, 21, 40, 57, 60, 61, 64, 65, 67, 69, 70, 80, 81, 82, 87, 94], "nan": [17, 21, 25, 61, 94], "saveepoch": [17, 21], "entir": [17, 19, 21, 27, 28, 31, 69, 73, 76, 80, 85, 87, 89, 101], "batchnorm": [17, 21], "desc": [17, 21, 80, 85, 100, 102], "pbar": [17, 21, 81, 82], "ibatch": 17, "ind": 17, "clip_grad_value_": [17, 21], "nsave": 17, "savefil": [17, 21], "unet_epoch": 17, "pad_image_nd": 17, "img0": 17, "div": [17, 36, 43, 73], "2d": [17, 28, 35, 39, 43, 57, 60, 64, 65, 67, 69, 70, 73, 80, 82, 85, 87, 94], "lz": 17, "slice": [17, 57], "lpad": 17, "xpad1": 17, "xpad2": 17, "ypad1": 17, "ypad2": 17, "constant": [17, 36, 40, 74, 88], "ysub": 17, "xsub": 17, "slc": 17, "img_pad": 17, "img_torch": 17, "rather": [17, 33, 35, 39, 43, 61, 62, 67, 74, 80, 88, 
97], "union": [17, 85], "iou": 17, "overlap": [17, 73, 89], "ground": [17, 27, 28, 33, 35, 36, 39, 61, 64, 76, 100, 101, 102], "truth": [17, 33, 35, 36, 39, 57, 61, 64, 100, 101, 102], "greater": [17, 76, 77, 85], "taken": [17, 27, 57, 65, 67, 73, 100, 102], "stardist": 17, "maxim": [17, 27, 70, 74, 76, 80, 91, 94, 97, 100], "fill_holes_and_remove_small_mask": 17, "min_siz": 17, "discard": [17, 69], "morpholog": 17, "NO": [17, 62], "minimum": [17, 21, 25, 28, 60, 67, 69, 70, 94], "turn": [17, 23, 31, 57, 61, 64, 67, 74, 81, 82, 100, 102], "msk": 17, "npix": 17, "average_precis": 17, "masks_tru": 17, "masks_pr": 17, "ap": 17, "tp": 17, "fn": 17, "heavili": 17, "mpicbg": 17, "csbd": 17, "isinst": [17, 21, 60, 61, 62, 65, 80, 82, 85], "ndarrai": [17, 21, 28, 39, 57, 60, 61, 62, 65, 67, 69, 70, 73, 80, 81, 85, 97, 100, 102], "n_true": 17, "n_pred": 17, "mt": 17, "return_index": 17, "_intersection_over_union": 17, "_true_posit": 17, "nopython": 17, "_label_overlap": 17, "ravel": [17, 57, 67, 76, 100, 102], "uint": 17, "n_pixels_pr": 17, "keepdim": [17, 65, 69, 70, 81], "n_pixels_tru": 17, "isnan": [17, 61], "n_min": 17, "true_ind": 17, "pred_ind": 17, "match_ok": 17, "get_masks_unet": 17, "cell_threshold": 17, "selem": 17, "shape0": 17, "return_invers": 17, "uint16": [17, 21], "capac": [17, 70, 101], "val_pad": 17, "val_torch": 17, "iou_threshold": 17, "ylim": [17, 27, 39, 61, 64, 67, 80, 81], "5039152": 17, "test_pad": 17, "test_torch": 17, "58384985": 17, "typic": [17, 18, 19, 20, 25, 28, 31, 65, 67, 74, 77, 80, 88, 94, 97], "overmerg": 17, "avoid": [17, 21, 28, 31, 34, 35, 36, 37, 38, 43, 61, 67, 76, 77, 88, 100], "boundari": [17, 70, 73], "interfac": [17, 27, 28, 35, 38, 43, 57, 85, 87], "jupyt": [17, 27, 43, 73], "overlaid": 17, "mous": [17, 19, 20, 76], "10hz": 17, "4500": 17, "325": [17, 76], "556": [17, 76], "gt1": 17, "tif": 17, "test_data": [17, 57, 64, 65, 67, 70, 73, 84, 87], "n_time": 17, "max_img": 17, "max_img_filt": 17, "unfilt": 17, "max_img_larg": 17, 
"max_img_2chan": 17, "zeros_lik": [17, 57, 67, 73, 81], "BE": 17, "hand": [17, 28, 31, 33, 36, 38, 39, 57, 60, 61, 62, 65, 67, 73, 76, 81, 82, 87], "IT": 17, "n_cell": [17, 60], "fluoresc": 17, "trace": [17, 73], "middl": [17, 21, 88, 91], "allen": 17, "guidanc": [17, 31, 57], "strategi": [17, 36, 85, 88, 94, 100], "light": [17, 76, 101], "aakash": 18, "agraw": 18, "proven": [18, 27], "pca": [18, 77, 87], "directli": [18, 31, 53, 57, 60, 64, 67, 73, 80, 85, 87, 97, 100, 102], "midgett": 18, "pdist": 18, "stat": [18, 25, 35, 39, 40, 43, 67, 80, 81, 85, 87], "pearsonr": 18, "kay_label": 18, "npy": 18, "kay_labels_v": 18, "kay_imag": 18, "r638": 18, "yqb3e": 18, "ymnjv": 18, "dobj": 18, "dat": 18, "sharex": [18, 62], "flat": [18, 35, 39, 76], "stimuli": [18, 20, 25, 35, 36, 39, 84], "field": [18, 21, 31, 34, 35, 37, 38, 43, 57, 61, 76, 77, 81, 85], "stim": 18, "grayscal": [18, 80], "stimuli_test": 18, "responses_test": 18, "roi_nam": 18, "stimuli_tr": 18, "stimuli_t": 18, "stimuli_tr_xform": 18, "1750": 18, "stimuli_ts_xform": 18, "loc_id": 18, "response_tr": 18, "response_t": 18, "mydataset": 18, "longtensor": [18, 57], "__getitem__": [18, 21, 33, 69, 70, 81], "fromarrai": [18, 73], "__len__": [18, 21, 33], "randomresizedcrop": 18, "centercrop": [18, 43, 76], "dataset_s": 18, "mseloss": [18, 60, 62, 69, 70, 80, 81], "best_model_wt": 18, "deepcopi": [18, 67, 69, 70, 94], "best_loss": 18, "running_correct": 18, "histori": [18, 60, 73], "set_grad_en": 18, "4f": [18, 20, 33, 40, 69], "4805": 18, "0503": 18, "4680": 18, "4679": 18, "0501": 18, "4677": 18, "0500": 18, "0499": 18, "fc2": [18, 33, 69, 70, 73, 87, 100, 102], "fc3": [18, 69, 70, 100, 102], "net_im": 18, "midfeat_ft": 18, "keep_output": 18, "midfeat_im": 18, "mid_outputs_ft": 18, "mid_outputs_im": 18, "v1_id": 18, "rts_v1": 18, "rts_lo": 18, "fmri_dist_metric_ft": 18, "euclidean": [18, 57, 62, 69, 74, 77], "fmri_dist_metric_im": 18, "alexnet_ft_dist_metr": 18, "alexnet_im_dist_metr": 18, "dobs_v1_ft": 18, 
"dobs_lo_ft": 18, "dobs_v1_im": 18, "dobs_lo_im": 18, "dnet_ft": 18, "dnet_im": 18, "xtick": [18, 33, 39, 73], "expertis": 19, "analysi": [19, 31, 34, 35, 36, 38, 39, 67, 77, 80, 91, 94], "toolkit": [19, 21, 35], "behavior": [19, 20, 25, 27, 31, 34, 38, 61, 67, 85, 97, 100, 101], "pipelin": [19, 31, 36, 39, 57, 85, 87], "conceptu": [19, 40, 81, 101], "steinmetz": 19, "neuropixel": 19, "lfp": 19, "spontan": 19, "orient": [19, 20, 27, 73, 84, 91, 94], "2p": 19, "sdk": 19, "simplifi": [19, 25, 39, 62, 73, 80, 81, 84, 85, 94, 101], "connectom": 19, "fmri": [19, 25], "natur": [19, 23, 26, 31, 45, 46, 62, 67, 81, 82, 84, 87, 89, 91, 94, 101], "bonner": 19, "ecog": 19, "caltech": 19, "social": [19, 25, 62], "ibl": 19, "decis": [19, 21, 34, 35, 36, 37, 39, 64, 65, 67, 70, 101], "motor": [19, 28, 64, 74, 76], "hippocampu": 19, "fly": [19, 21, 76, 81, 84], "hipposeq": 19, "mouselight": 19, "openorganel": 19, "stringer1": 19, "neuron": [19, 27, 31, 34, 35, 36, 39, 57, 60, 61, 62, 70, 76], "stringer2": 19, "800": [19, 20, 35, 39, 57, 65, 76], "stringer3": 19, "ephi": 19, "buzsaki": 19, "webpag": [19, 73, 101], "eeg": [19, 30], "bci": 19, "handwrit": 19, "recent": [19, 27, 28, 43, 60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 101, 102], "krishna": 19, "shenoi": 19, "epilept": 19, "seizur": 19, "neurovista": 19, "seizure_recognit": 19, "mnist": [19, 57, 80], "digit": [19, 25, 67, 73, 76, 80, 82, 88], "studyforrest": 19, "forrest": 19, "gump": 19, "speech": [19, 84, 85, 94], "eyegaz": 19, "trend": [19, 27, 57, 69, 80, 82], "neuroimag": 19, "mri": [19, 91], "assess": [19, 69, 94], "brainscor": 19, "preprint": 19, "overview": [19, 23, 33, 34, 39, 40, 88, 97], "behaviour": [19, 25, 28, 37, 38, 43, 61, 100], "deeper": [19, 36, 61, 62, 73, 76], "rule": [19, 31, 34, 37, 43, 57, 61, 67, 70, 76, 100], "influenc": [19, 33, 36, 37, 40], "preliminari": [19, 31, 35], "older": 19, "cn": [19, 20, 25, 35, 39, 64], "subsampl": 19, "cheat": 19, "sheet": 19, "fetch": 
[19, 43, 73, 76], "pars": [19, 21, 43, 85, 87], "session": [19, 31, 44, 45, 46, 54, 87], "pedram": 20, "luca": 20, "tavar": 20, "jonni": [20, 31], "coutinho": 20, "bless": 20, "itoro": 20, "gaurang": 20, "mahajan": 20, "brain": [20, 25, 35, 36, 39, 40, 44, 45, 64, 76, 77, 94, 97], "pattern": [20, 33, 73], "noisi": [20, 35, 39, 40, 69, 70, 80, 81], "brainwid": 20, "isol": [20, 38, 80], "seq": 20, "suffici": [20, 28, 33, 36, 38, 64, 69, 88], "describ": [20, 27, 33, 34, 36, 64, 65, 69, 80, 94, 97, 100, 101, 102], "hundr": [20, 84, 87, 100, 102], "ten": [20, 27, 31, 34, 67, 76, 87, 88], "thousan": 20, "cours": [20, 23, 28, 31, 33, 36, 39, 40, 52, 54, 60, 62, 70, 73, 74, 76, 80, 85, 88, 94, 100, 103], "ntrial": 20, "pretend": [20, 35, 39], "bin": [20, 25, 35, 36, 39, 43, 67, 74, 76, 87, 89, 94], "10m": [20, 36, 39], "2500m": 20, "compon": [20, 27, 28, 31, 33, 34, 36, 37, 38, 39, 40, 43, 46, 67, 73, 76, 77, 84, 88, 97], "ncomp": 20, "recurr": [20, 23, 60, 73, 84, 87, 101], "diagon": [20, 57, 62, 73, 94], "simplic": [20, 27, 33, 36, 39, 57, 67, 81, 84, 97], "stabil": [20, 27, 38, 64, 67, 70, 81, 82, 84], "a0": 20, "diag": 20, "025": 20, "innov": 20, "timestep": [20, 25, 27, 28, 81, 82], "poisson": [20, 35, 39], "spike": [20, 35, 36, 39, 64, 76], "nn1": 20, "nn2": 20, "bidi": 20, "bidirect": 20, "nonlinear": [20, 21, 39, 60, 62, 64, 65], "enforc": [20, 28], "softplu": 20, "smooth": [20, 64, 69, 70, 74, 76, 88, 91], "likelihood": [20, 40, 67, 74, 84, 94, 100, 102], "lead": [20, 27, 31, 35, 40, 61, 65, 67, 70, 73, 76, 84, 87, 88, 94, 97], "failur": 20, "gray_r": [20, 67], "ms": [20, 35, 39, 64, 74, 76], "separ": [20, 28, 31, 37, 38, 39, 57, 64, 67, 77, 80, 84, 85, 88, 89, 94], "x0": [20, 21, 57, 81], "bias": [20, 39, 57, 60, 65, 67, 73, 82, 85, 100, 101, 102], "slow": [20, 27, 28, 36, 39, 61, 62, 84, 85], "005": [20, 61, 67, 81], "poisson_loss": 20, "spk": 20, "niter": 20, "4105": 20, "2552": 20, "2438": 20, "2392": 20, "2377": 20, "2373": 20, "2371": 20, "700": [20, 76], 
"2369": 20, "2368": 20, "900": [20, 27, 76], "2367": 20, "rpred": 20, "121": [20, 69, 76], "ycpu": 20, "Not": [20, 21, 33, 36, 37, 38, 64, 67], "surpris": [20, 31, 38, 61, 69], "gagana": [21, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "fruit": 21, "robust": [21, 61, 67, 70, 76, 94], "perturb": [21, 67, 70, 82, 85], "mmpose": 21, "mmlab": 21, "design": [21, 27, 31, 35, 36, 39, 62, 64, 67, 70, 76, 84, 89, 94, 100, 101, 102], "toolbox": [21, 91], "exot": 21, "definit": [21, 23, 31, 38, 64, 67, 80, 85, 88, 94], "room": [21, 31, 80, 97, 101], "tracker": 21, "4216": 21, "awai": [21, 28, 33, 73, 74, 77, 88, 101], "readthedoc": [21, 27], "2d_animal_keypoint": 21, "deeplabcut": 21, "mackenziemathislab": 21, "kristinbranson": 21, "tensorboard": [21, 84, 85], "monitor": [21, 27, 76, 80, 85], "tensorboard_tutori": 21, "gh": 21, "page": [21, 23, 27, 33, 43, 51, 53, 54, 57], "_download": 21, "tensorboard_with_pytorch": 21, "ipynb": [21, 27], "milesi": 21, "alexandr": 21, "plotlabelandpredict": 21, "hm_pred": 21, "title_str": 21, "isbatch": 21, "locs_pr": 21, "heatmap2landmark": 21, "get_imag": 21, "get_landmark": 21, "nlandmark": 21, "marker": [21, 33, 61, 65, 67, 76, 77, 80, 94], "markerfacecolor": 21, "batchid": 21, "locs_pred_curr": 21, "hmim": 21, "get_heatmap_imag": 21, "predcurr": 21, "heatmap2imag": 21, "__version__": [21, 85, 87], "ncuda": 21, "ntorch": 21, "cu113": 21, "fly_bubble_20201204": 21, "q7vhy": 21, "datadir": 21, "view0": 21, "ftar": 21, "untar": 21, "drive": [21, 31, 36, 39, 53, 80], "1a06zamqxvuqzzqgi9xwwjabl4vof8v6z": 21, "usp": 21, "instruct": [21, 34, 60, 61, 101], "past": [21, 31, 33, 43, 67, 88, 101], "flush_and_unmount": [21, 28], "force_remount": 21, "fly_bubble_pos": 21, "xvzf": 21, "dev": [21, 27, 28, 35, 39, 100, 102], "null": [21, 27, 28, 43], "traindir": 21, "trainannfil": 21, "train_annot": 21, "testdir": 21, "testannfil": 21, "test_annot": 21, "trainann": 21, "ntrainim": 21, "filestr": 21, "imfil": 
21, "imread_unchang": 21, "imsiz": 21, "landmark_nam": 21, "head_fc": 21, "head_bl": 21, "head_br": 21, "thorax_fr": 21, "thorax_fl": 21, "thorax_bc": 21, "abdomen": 21, "leg_ml_in": 21, "leg_ml_c": 21, "leg_mr_in": 21, "leg_mr_c": 21, "leg_fl_tip": 21, "leg_ml_tip": 21, "leg_bl_tip": 21, "leg_br_tip": 21, "leg_mr_tip": 21, "leg_fr_tip": 21, "num_keypoint": 21, "181": [21, 76], "nimsshow": 21, "imsshow": 21, "dpi": [21, 28, 73, 87], "bigger": [21, 61, 76, 87, 91], "keypoint": 21, "hm": 21, "landmark": 21, "indic": [21, 31, 33, 35, 36, 39, 40, 57, 60, 64, 65, 67, 69, 76, 80, 81, 85, 88, 94, 100], "colormap": 21, "get_cmap": [21, 62, 73], "colornorm": 21, "annfil": 21, "label_sigma": 21, "constructor": [21, 57, 88], "scalar": [21, 40, 57, 60, 62, 67, 80, 81, 84, 87], "nlandmarks_al": 21, "precomput": 21, "stuff": [21, 33], "label_filt": 21, "label_filter_r": 21, "label_filter_d": 21, "init_label_filt": 21, "overload": 21, "getitem": [21, 64], "ncolor": 21, "65535": 21, "cannot": [21, 38, 39, 57, 62, 65, 77, 85, 87, 100], "typeerror": [21, 85], "imsz": 21, "make_heatmap_target": 21, "diamet": 21, "alloc": [21, 57, 61], "lose": [21, 27, 28, 73], "border": [21, 57, 73, 76], "y0": 21, "crop": [21, 57, 70, 77, 91], "goe": [21, 31, 33, 64, 73, 80], "fil_x0": 21, "fil_x1": 21, "fil_y0": 21, "fil_y1": 21, "staticmethod": [21, 25, 100, 102], "static": [21, 43, 73, 76, 81], "usabl": [21, 38, 77], "ith": [21, 100], "plottabl": 21, "instanti": [21, 33, 36, 61, 62, 64, 65, 73, 76, 84, 85, 100, 102], "train_dataload": [21, 57, 87], "i_batch": 21, "sample_batch": 21, "8353": 21, "8275": 21, "8235": 21, "8314": 21, "7882": 21, "7922": 21, "8000": [21, 57], "8039": 21, "7804": 21, "7961": 21, "8157": 21, "8118": 21, "8078": 21, "8196": 21, "8392": 21, "8471": 21, "8431": 21, "8510": 21, "7488": 21, "4393": 21, "2865": 21, "8702": 21, "8077": 21, "0938": 21, "8719": 21, "1947": 21, "1545": 21, "3605": 21, "2214": 21, "91": [21, 33, 73, 76, 84], "9388": 21, "6487": 21, "113": [21, 76], 
"5320": 21, "6973": 21, "1256": [21, 61, 67], "5618": 21, "7494": 21, "8496": 21, "9855": 21, "9393": 21, "6579": 21, "4566": 21, "2644": 21, "5434": 21, "8570": 21, "0331": 21, "8386": 21, "8340": 21, "109": [21, 76], "6349": 21, "94": [21, 28, 76, 84], "3467": 21, "3398": 21, "2621": 21, "5554": 21, "6067": 21, "5406": 21, "3683": 21, "4841": 21, "6089": 21, "5981": 21, "6650": 21, "1148": 21, "9521": 21, "5694": 21, "5933": 21, "9952": 21, "0958": 21, "8181": 21, "1196": 21, "0669": 21, "6937": 21, "5386": 21, "0347": 21, "8119": 21, "0003": [21, 67], "2152": 21, "5787": 21, "4639": 21, "1912": 21, "7318": 21, "7608": 21, "107": [21, 27, 76], "6556": 21, "7992": 21, "5985": 21, "5912": 21, "108": [21, 27, 76], "5169": 21, "3186": 21, "2265": 21, "modularli": 21, "outconv": 21, "doubleconv": 21, "2x2": [21, 73], "pool": [21, 76, 84, 94], "bilinear": 21, "incorpor": [21, 67, 73, 74, 76, 82, 97, 100, 101], "bn": 21, "mid_channel": 21, "double_conv": 21, "downscal": [21, 82], "doubl": [21, 28, 57, 62, 67, 70, 73, 77, 80], "maxpool_conv": 21, "upscal": [21, 82], "align_corn": 21, "chw": 21, "diffi": 21, "diffx": 21, "issu": [21, 25, 31, 38, 39, 40, 51, 67, 69, 77, 84, 88], "haiyongjiang": 21, "unstructur": 21, "buggi": 21, "commit": [21, 34, 43], "0e854509c2cea854e247a9c615f175f76fbb2e3a": 21, "xiaopeng": 21, "liao": 21, "8ebac70e633bac59fc22bb5195e513d5832fb3bd": 21, "unet_model": 21, "n_channel": [21, 27], "n_landmark": 21, "nchannels_inc": 21, "nchannels_down1": 21, "nchannels_down2": 21, "nchannels_down3": 21, "nchannels_up1": 21, "nchannels_up2": 21, "nchannels_up3": 21, "layer_inc": 21, "layer_down1": 21, "layer_down2": 21, "layer_down3": 21, "layer_up1": 21, "layer_up2": 21, "layer_up3": 21, "layer_outc": 21, "inc": 21, "x3": 21, "x4": 21, "outc": 21, "__str__": [21, 85], "down1": 21, "down2": 21, "down3": 21, "up1": 21, "up2": 21, "up3": 21, "__repr__": [21, 85], "unravel_index": 21, "insanti": 21, "care": [21, 25, 28, 33, 36, 37, 67, 84, 85, 91, 101], 
"hms0": 21, "restart": [21, 25, 54, 57, 77], "poseestimationnet": 21, "unet20210510t140305": 21, "final_epoch4": 21, "loadepoch": 21, "nepochs_per_sav": 21, "forget": [21, 34, 37, 38, 43, 57, 62, 65, 67, 70, 73, 74, 76, 82, 84, 85, 88, 91, 94, 97, 101], "savedir": 21, "checkpointdir": 21, "val_dataload": 21, "rmsprop": 21, "lr_schedul": [21, 82, 87], "reducelronplateau": 21, "patienc": [21, 69, 70], "numer": [21, 37, 57, 61, 62, 64, 67, 74, 80, 81, 82, 85, 94], "bcewithlogitsloss": 21, "hm_label": 21, "savefile0": 21, "cp_latest_epoch": 21, "savefile1": 21, "cp_prev_epoch": 21, "final_epoch": 21, "892069824039936": 21, "0559215631801635": 21, "596667634788901": 21, "train_hms1": 21, "val_hms1": 21, "eval_net": 21, "err": [21, 81, 94], "loc_pr": 21, "loc_label": 21, "l2err": 21, "sqrt": [21, 27, 61, 62, 65, 67, 69, 73, 74, 80, 81, 82, 84, 102], "idscurr": 21, "l2err_per_landmark_v": 21, "val_id": 21, "l2err_per_landmark_train": 21, "train_id": 21, "nbin": [21, 94], "bin_edg": 21, "bin_cent": 21, "frac_val": 21, "frac_train": 21, "histogram": [21, 25, 67], "densiti": [21, 25, 80, 94], "hval": 21, "px": 21, "argsort": 21, "printopt": 21, "errstr": 21, "nr": 21, "fil": 21, "testann": 21, "ntestim": 21, "test_dataload": [21, 57, 87], "l2err_per_landmark_test": 21, "test_id": 21, "1800": 21, "frac_test": 21, "conduct": [23, 52, 57], "teach": [23, 33, 35, 36, 52, 76, 91, 100, 101, 102], "pod": [23, 31, 46, 57, 60, 62, 64, 69, 73, 80], "reinforc": [23, 25, 27, 31, 46, 57, 70, 101, 102], "alphabet": [23, 31, 85, 88], "letter": [23, 31, 33, 39, 40, 73, 76, 85, 88], "topic": [23, 31, 57, 62, 97, 101], "taught": [23, 31], "week": [23, 26, 35, 46, 60, 61, 62, 64, 65, 67, 69, 70, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "comp": [23, 31, 35, 54], "neuro": [23, 25, 31, 35, 54], "lai": [23, 94], "foundat": [23, 81], "w1d4": [23, 39], "review": [23, 31, 36, 37, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 89, 91, 94, 
97, 100, 101, 102], "refin": [23, 31, 39], "encourag": [23, 27, 31, 37, 40, 67, 73, 80], "w3d2": [23, 25], "dedic": [23, 31, 46, 57, 80], "abstract": [23, 25, 31, 33, 36, 37, 39, 46, 57, 85, 88], "rest": [23, 27, 31, 33, 64, 65, 73, 80, 94, 97], "culmin": 23, "slide": [23, 26, 31, 73, 76, 77, 102], "3h": 23, "slot": [23, 31, 46, 48, 76], "substanti": [23, 31, 77, 80], "inspir": [23, 44, 45, 80, 101], "becom": [23, 31, 65, 67, 70, 73, 74, 88], "airtabl": [23, 31, 52], "w3d5": 23, "approx": [23, 33, 61], "five": [23, 88], "member": [23, 80], "due": [23, 27, 28, 52, 61, 64, 67, 69, 73, 84, 85, 94, 100], "style": [23, 28, 31, 33, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 89, 94], "powerpoint": [23, 31], "send": [23, 57, 87, 89], "email": 23, "primari": 23, "logist": [23, 35, 39, 67], "neurmatch": 25, "morteza": 25, "ansarinia": 25, "yamil": [25, 27], "vidal": [25, 27], "aim": [25, 26], "mimic": [25, 101], "mechan": [25, 27, 33, 36, 37, 39, 60, 82, 84, 101], "construct": [25, 27, 33, 40, 57, 64, 65, 67, 73, 84, 87, 97], "multi": [25, 31, 40, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 85, 87, 89, 94, 100, 102], "jedi": [25, 28], "setuptool": [25, 28], "dm": 25, "acm": 25, "jax": [25, 28, 60], "sonnet": [25, 28], "trfl": [25, 28], "ignor": [25, 39, 57, 61, 62, 64, 69, 70, 82, 84, 85, 87, 88, 89, 94], "uninstal": 25, "seaborn": [25, 81], "31merror": [25, 28, 43], "resolv": [25, 28, 35, 39, 40, 43, 67], "conflict": [25, 28, 43], "incompat": [25, 28, 43], "0m": [25, 27, 28, 43], "31m": [25, 28, 43], "chex": 25, "snt": [25, 28], "sn": [25, 81], "dm_env": 25, "spec": [25, 28], "environmentloop": [25, 28], "tf": [25, 28], "dqn": [25, 101], "logger": [25, 28, 85, 87], "clear_output": [25, 76], "scientist": [25, 74, 91, 101], "tap": 25, "stroop": 25, "span": [25, 37, 64, 94], "tmt": 25, "trail": 25, "wcst": 25, "wisconsin": 25, "card": 25, "despit": [25, 52, 73, 101], "extens": [25, 31, 43, 73], "sophist": [25, 43, 60, 67, 94], 
"gain": [25, 35, 38, 39, 40, 52, 76, 81, 82, 88], "underli": [25, 39, 64, 65, 73, 74, 76, 88, 101], "interestingli": [25, 94], "thought": [25, 31, 35, 36, 60, 62, 67, 76, 80, 94], "action": [25, 26, 27, 39, 57, 64, 81, 97, 100, 101, 102], "conson": 25, "formul": [25, 26, 31, 67, 77, 81, 82, 97], "reward": [25, 97], "feedback": [25, 27, 31, 46], "trajectori": [25, 61, 67, 81], "episod": [25, 27, 28, 100, 101, 102], "schema": [25, 43, 88], "correl": [25, 35, 39, 40, 73, 91, 94], "straightforward": [25, 43], "composit": [25, 60], "hcp": 25, "wm": 25, "bound": [25, 28, 65, 81, 102], "symbol": [25, 57, 67, 85, 94], "neutral": 25, "sake": [25, 37], "breviti": 25, "perfom": [25, 57], "participant_id": 25, "bid": 25, "trial_index": 25, "time_step": [25, 82], "observ": [25, 27, 28, 33, 36, 39, 40, 57, 60, 67, 69, 70, 73, 77, 85, 94, 97, 101, 102], "expected_respons": 25, "is_correct": 25, "response_tim": 25, "mock": 25, "generate_mock_nback_dataset": 25, "n_particip": 25, "n_trial": 25, "stimulus_choic": 25, "abcdef": 25, "response_choic": 25, "n_row": 25, "pid": 25, "trial_indic": 25, "stimulus_sequ": 25, "exponenti": [25, 33, 36, 57], "datafram": [25, 40, 57, 81], "mark": [25, 27, 31, 61, 62, 76, 84, 94], "matchig": 25, "_nback_stim": 25, "burn": [25, 101], "trial": [25, 31, 35, 36, 39, 40, 65, 97], "mock_nback_data": 25, "displot": 25, "barplot": 25, "697356": 25, "149110": 25, "277760": 25, "implment": 25, "envinron": 25, "prefer": [25, 31, 67, 85, 88], "nback": 25, "episode_step": [25, 28], "stimuli_choic": 25, "human_data": 25, "_reset_next_step": 25, "_imitate_human": 25, "human_subject_data": 25, "_action_histori": 25, "_current_step": 25, "fixm": 25, "reverb": 25, "iloc": 25, "sort_valu": 25, "to_list": [25, 57], "ord": [25, 80], "_observ": 25, "_episode_return": 25, "agent_act": 25, "human_act": 25, "step_reward": 25, "rational": 25, "expected_act": 25, "termin": [25, 27, 28, 43, 85, 100, 102], "transit": [25, 97], "observation_spec": 25, "boundedarrai": [25, 28], 
"nback_stimuli": 25, "action_spec": [25, 28], "discretearrai": 25, "num_valu": 25, "int32": [25, 97], "ob": 25, "plot_stat": 25, "br": [25, 35, 36], "create_environ": 25, "singleprecisionwrapp": [25, 28], "grab": [25, 31], "environment_spec": [25, 28], "make_environment_spec": [25, 28], "randomag": 25, "actor": [25, 28], "_num_act": 25, "select_act": [25, 28], "uniformli": [25, 81, 82], "observe_first": 25, "next_timestep": 25, "env": [25, 27, 28], "env_spec": [25, 28], "n_episod": 25, "1_000": 25, "n_total_step": 25, "log_loss": 25, "n_step": [25, 28], "all_return": 25, "episode_return": 25, "episode_loss": 25, "start_tim": [25, 67, 69, 87], "polici": [25, 57, 101], "last_loss": 25, "steps_per_second": 25, "episode_length": 25, "loss_avg": 25, "histplot": 25, "kde": 25, "deepmind": [25, 84], "init": [25, 43, 57, 62, 65, 67, 82, 87], "discreteenviron": 25, "num_act": [25, 97], "num_observ": 25, "obs_dtyp": 25, "dqn_make_network": 25, "mlp": [25, 69, 84], "epsilon": [25, 67, 80, 81, 82], "inmemorylogg": 25, "_logger": 25, "_data": 25, "tail": [25, 76, 82], "995": [25, 76, 82], "329": [25, 76], "379165": 25, "996": [25, 76, 82], "31872": 25, "326": [25, 76], "324034": 25, "997": [25, 76], "31904": 25, "373": [25, 76], "017676": 25, "998": [25, 76, 82], "31936": 25, "309": [25, 76], "737031": 25, "31968": 25, "405": [25, 76], "329983": 25, "32000": 25, "cooper": 26, "expand": [26, 57, 80, 85], "theori": [26, 34, 35, 43, 57, 101], "pistonbal": 26, "piston": 26, "obstacl": 26, "extern": [26, 64, 67, 101], "earn": 26, "puckworld": 26, "snake": [26, 76], "minigrid": 26, "rl": [26, 28, 101], "congest": 26, "travel": 26, "queue": [26, 64, 65, 100, 102], "easier": [26, 31, 35, 36, 37, 38, 43, 67, 73, 76, 77, 88, 89, 94], "materi": [26, 31, 34, 52, 67, 70, 73, 84], "zoo": 26, "openai": [26, 27, 84], "gym": [26, 27, 28], "raghuram": 27, "bharadwaj": 27, "diddigi": 27, "geraud": 27, "nangu": 27, "tass": 27, "sanjukta": 27, "krishnagop": 27, "sara": 27, "rajae": 27, "shaonan": 
[27, 81, 82, 88, 97, 101], "wang": [27, 57, 81, 82, 88, 97, 101], "keyword": [27, 31, 57], "exactli": [27, 33, 35, 39, 43, 88], "xvfb": [27, 28], "opengl": 27, "swig": 27, "python3": [27, 65, 67, 69, 70, 80, 81, 82, 85, 87, 94], "x11": 27, "rarfil": 27, "baselines3": 27, "box2d": 27, "pyvirtualdisplai": 27, "pyglet": 27, "pygam": 27, "gymnasium": 27, "pip3": [27, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "2k": [27, 28], "90m": [27, 28], "32m14": 27, "31m28": 27, "eta": [27, 28, 60, 61, 62, 67, 70], "36m0": [27, 28], "25h": [27, 28], "sy": [27, 67, 84, 94, 100, 102], "stable_baselines3": 27, "results_plott": 27, "ts2xy": 27, "load_result": 27, "callback": 27, "evalcallback": 27, "env_util": 27, "make_atari_env": 27, "lunar_land": 27, "video_record": 27, "videorecord": 27, "exist_ok": 27, "1400": 27, "wrap_env": 27, "render_mp4": 27, "videopath": 27, "b4": 27, "base64_encoded_mp4": 27, "reach": [27, 31, 33, 39, 40, 64, 65, 67, 69, 70, 74, 76, 80, 88, 97], "onlin": [27, 70, 74, 97], "fluctuat": [27, 70], "impact": [27, 34, 39, 61, 62, 67, 73, 77, 80], "land": [27, 76], "downward": 27, "graviti": 27, "safe": [27, 65, 76], "fuel": 27, "screen": [27, 31, 76], "140": [27, 76], "leg": [27, 28, 33, 36], "yield": [27, 65, 67, 87, 88], "03": [27, 33, 70, 80], "crash": [27, 76], "veloc": [27, 28, 35, 36, 39, 40, 67], "angl": [27, 28, 31, 33, 36, 57, 65, 74, 94], "angular": [27, 33, 36], "nn_layer": 27, "tip": [27, 31, 34, 57, 73, 82, 84], "log_dir": 27, "tmp": 27, "env_nam": 27, "lunarland": 27, "cartpol": 27, "mountaincar": 27, "acrobot": 27, "statement": [27, 73, 85], "log_path": 27, "policy_kwarg": 27, "activation_fn": 27, "net_arch": 27, "mlppolici": 27, "buffer_s": 27, "replai": 27, "buffer": [27, 94], "learning_start": 27, "gamma": [27, 62, 65, 85, 87, 97], "discount": [27, 97], "facto": 27, "tau": [27, 81, 82], "soft": [27, 70, 76], "target_update_interv": 27, "train_freq": 
27, "max_grad_norm": 27, "exploration_initial_ep": 27, "exploration_fract": 27, "gradient_step": 27, "pseudo": [27, 100], "a2c": 27, "ppo": 27, "ddpg": 27, "render": [27, 28, 36, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "observation_spac": 27, "action_spac": 27, "render_mod": 27, "rgb_arrai": [27, 28], "vid": 27, "total_reward": 27, "capture_fram": 27, "ntotal": 27, "0358279244006": 27, "total_timestep": 27, "100000": 27, "log_interv": [27, 70, 87], "num_timestep": 27, "episode_reward": 27, "420": [27, 76], "151": [27, 76], "20000": [27, 67], "561": [27, 76], "249": [27, 76, 80], "240": [27, 61, 76, 80], "40000": 27, "338": [27, 76], "08": [27, 73], "160": [27, 76], "241": [27, 62, 76], "60000": 27, "190": [27, 76], "646": [27, 76], "70000": 27, "92": [27, 73, 76], "04": [27, 70], "139": [27, 76], "80000": 27, "267": [27, 76, 80], "52": [27, 62, 69, 76, 77, 85, 94, 100, 102], "90000": 27, "126": [27, 33, 76], "536": [27, 76, 94], "257": [27, 76], "259": [27, 76], "0x7fef99c08c70": 27, "ep_rew_mean": 27, "proce": [27, 57, 91], "monoton": [27, 40], "upward": 27, "hope": [27, 39, 57, 67], "_learn": 27, "252": [27, 70, 76, 80], "88935234615718": 27, "although": [27, 61, 67, 80, 88], "achiev": [27, 33, 38, 57, 67, 69, 70, 73, 74, 76, 77, 80, 84, 87, 88, 94, 101], "greedi": [27, 102], "At": [27, 31, 33, 57, 67, 69, 70, 73, 81, 84], "exploration_final_ep": 27, "defualt": 27, "argument": [27, 36, 39, 57, 60, 61, 64, 65, 67, 69, 70, 77, 81, 84, 85, 94, 100, 101, 102], "custom_env": 27, "customenv": 27, "arg1": 27, "arg2": 27, "discret": [27, 39, 57, 67, 80, 81, 82, 88, 100], "n_discrete_act": 27, "inherit": [27, 28, 43, 57, 67], "custom_lunarland": 27, "wind": [27, 87], "forgot": 27, "enable_wind": 27, "ground_contact": 27, "wind_mag": 27, "wind_idx": 27, "wind_pow": 27, "applyforcetocent": 27, "torqu": 27, "torque_mag": 27, "torque_idx": 27, "turbulence_pow": 27, 
"applytorqu": 27, "invalid": [27, 67, 85, 88, 97, 100, 102], "dispers": 27, "np_random": 27, "m_power": 27, "ox": [27, 76], "oy": 27, "impulse_po": 27, "_create_particl": 27, "particl": 27, "decor": 27, "applylinearimpuls": 27, "main_engine_pow": 27, "s_power": 27, "sign": [27, 54, 67, 70, 76, 85, 100, 102], "side_engine_awai": 27, "side_engine_height": 27, "side_engine_pow": 27, "po": [27, 57, 84, 85, 94], "vel": [27, 67], "linearveloc": 27, "viewport_w": 27, "helipad_i": 27, "leg_down": 27, "viewport_h": 27, "angularveloc": 27, "And": [27, 33, 36, 39, 40, 62, 64, 65, 67, 73, 74, 87, 88, 89, 91, 101], "prev_shap": 27, "spent": [27, 73], "heurist": [27, 81], "game_ov": 27, "awak": 27, "cutom": 27, "alter": [27, 40], "eight": [27, 100], "portion": [27, 52, 77], "pong": [27, 76], "210": [27, 70, 76], "sb3": 27, "atari_gam": 27, "scrollto": 27, "f3k4rmxwimbo": 27, "pongnoframeskip": 27, "n_env": 27, "command": [27, 43, 60], "coalb": 27, "vecframestack": 27, "n_stack": 27, "cnnpolici": 27, "collis": 27, "devis": [27, 82], "mechansim": 27, "imaginari": 27, "horizant": 27, "cooridn": 27, "levi": 27, "initialis": [27, 57, 67, 73, 80, 100, 102], "effeci": 27, "load_path": 27, "custom_object": 27, "kwarg": [27, 43, 67, 81, 84, 85, 88, 94], "survei": [27, 35, 46], "taylor": 27, "jmlr": 27, "volume10": 27, "taylor09a": 27, "lazar": 27, "2012": [27, 76], "hal": 27, "inria": 27, "fr": [27, 35, 39, 89], "pdf": [27, 35, 39, 81], "quick": [27, 62, 73, 80, 88], "lin": [27, 73, 80], "zhou": 27, "07888": 27, "barreto": 27, "2016": [27, 76], "successor": 27, "1606": 27, "05312": 27, "gridworld": 27, "lightweight": [27, 43], "5x5": 27, "v0": [27, 28], "wrap": [27, 28, 64, 65, 88], "imgobswrapp": 27, "rgbimgobswrapp": 27, "8x8": 27, "tradit": [27, 94], "earlier": [27, 31, 80, 88], "arbitrari": [27, 35, 40, 67, 84, 101], "straight": [27, 62, 74, 100], "rex": 27, "icml2019": 27, "trex": 27, "roman": 28, "vaxenburg": 28, "diptodip": [28, 80], "deb": [28, 80], "sriniva": 28, "turaga": 28, 
"infrastructur": [28, 88], "hopper": [28, 76], "ant": [28, 76], "humanoid": 28, "easili": [28, 31, 33, 36, 43, 57, 80, 81, 84, 88], "introduct": [28, 31, 34, 82, 101], "worth": [28, 34, 101], "leverag": [28, 54, 73], "shouldn": [28, 85], "workload": 28, "anywai": [28, 31, 52, 94], "freeglut3": 28, "32m1": 28, "31m55": 28, "32m804": 28, "804": [28, 76], "kb": 28, "31m14": 28, "32m314": 28, "31m21": 28, "32m3": 28, "31m97": 28, "32m352": 28, "352": [28, 76], "31m37": 28, "32m131": 28, "131": [28, 76], "31m16": 28, "32m6": 28, "31m78": 28, "31m68": 28, "32m4": 28, "31m98": 28, "32m462": 28, "462": [28, 76], "31m42": 28, "32m497": 28, "497": [28, 76], "31m3": 28, "32m5": 28, "32m42": 28, "31m5": 28, "31m99": 28, "31m44": 28, "32m110": 28, "32m318": 28, "318": [28, 76, 88], "31m33": 28, "32m94": 28, "31m12": 28, "32m17": 28, "31m74": 28, "31m100": 28, "32m781": 28, "31m62": 28, "32m268": 28, "268": [28, 76, 80], "31m10": 28, "32m104": 28, "31m8": 28, "32m80": 28, "31m9": 28, "pybullet_env": 28, "tf2_util": 28, "distributionalmpo": 28, "environment_loop": 28, "gym_locomotion_env": 28, "hopperbulletenv": 28, "walker2dbulletenv": 28, "halfcheetahbulletenv": 28, "antbulletenv": 28, "humanoidbulletenv": 28, "ipywidget": [28, 33, 35, 36, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "widget": [28, 33, 35, 36, 46, 60, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "config": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 89, 94], "inlinebackend": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "figure_format": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "githubusercont": [28, 33, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "neuromatchacademi": [28, 33, 49, 52, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "mplstyle": [28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 
84, 87, 89, 94], "preserv": [28, 57, 76, 77, 81, 85, 89, 94], "save_ckpt_to_dr": 28, "acme_ckpt": 28, "restore_ckpt_from_dr": 28, "recov": [28, 62], "virtual": [28, 31, 43], "mydriv": 28, "_learner": 28, "_checkpoint": 28, "_checkpoint_manag": 28, "dst": 28, "copytre": 28, "checkpoin": 28, "upon": [28, 57, 67, 74, 80, 82, 84, 85], "display_video": 28, "framer": 28, "n_frame": 28, "orig_backend": 28, "get_backend": 28, "agg": 28, "switch": [28, 43, 76], "headless": 28, "inhibit": 28, "set_axis_off": [28, 62, 76], "set_aspect": 28, "set_posit": 28, "set_data": 28, "funcanim": [28, 69, 70, 81], "func": [28, 67, 73], "blit": [28, 69, 70, 81], "to_html5_video": [28, 69, 70, 81], "layer_s": 28, "num_atom": 28, "reqiur": 28, "make_networks_d4pg": 28, "policy_layer_s": 28, "critic_layer_s": 28, "action_s": [28, 100, 102], "policy_network": 28, "batch_concat": 28, "layernormmlp": 28, "tanhtospec": 28, "critic_network": 28, "criticmultiplex": 28, "action_network": 28, "cliptospec": 28, "activate_fin": 28, "discretevaluedhead": 28, "make_networks_ddpg": 28, "make_networks_dmpo": 28, "multivariatenormaldiaghead": 28, "min_scal": 28, "tanh_mean": 28, "init_scal": 28, "fixed_scal": 28, "use_tfd_independ": 28, "multiplex": [28, 81, 82], "getlist": 28, "humanoiddeepmimicbackflipbulletenv": 28, "humanoiddeepmimicwalkbulletenv": 28, "cartpolebulletenv": 28, "cartpolecontinuousbulletenv": 28, "minitaurbulletenv": 28, "minitaurbulletduckenv": 28, "racecarbulletenv": 28, "racecarzedbulletenv": 28, "kukabulletenv": 28, "kukacambulletenv": 28, "invertedpendulumbulletenv": 28, "inverteddoublependulumbulletenv": 28, "invertedpendulumswingupbulletenv": 28, "reacherbulletenv": 28, "pusherbulletenv": 28, "throwerbulletenv": 28, "humanoidflagrunbulletenv": 28, "humanoidflagrunharderbulletenv": 28, "minitaurextendedenv": 28, "minitaurreactiveenv": 28, "minitaurballgymenv": 28, "minitaurtrottingenv": 28, "minitaurstandgymenv": 28, "minitauralternatinglegsenv": 28, "minitaurfourlegstandenv": 28, 
"kukadiverseobjectgrasp": 28, "entri": [28, 43, 67, 80, 84, 89], "hierarchi": 28, "mainli": [28, 65, 82], "realiz": [28, 91], "child": [28, 82], "overrid": [28, 64, 85], "subclass": [28, 85], "piec": [28, 34, 73, 76, 88, 100, 102], "parent": [28, 62, 64], "step_count": 28, "durat": 28, "iii": 28, "modif": 28, "_isdon": 28, "overriden": [28, 85], "entireti": 28, "walkerbasebulletenv": 28, "multiplay": 28, "_step": 28, "apply_act": 28, "global_step": 28, "calc_stat": 28, "joints_at_limit": 28, "body_rpi": 28, "_aliv": 28, "alive_bonu": 28, "initial_z": 28, "isfinit": 28, "potential_old": 28, "calc_potenti": 28, "progress": [28, 31, 67, 76, 81, 84, 85, 94], "feet_collision_cost": 28, "feet": 28, "contact_id": 28, "contact_list": 28, "ground_id": 28, "feet_contact": 28, "dc": 28, "brake": [28, 74, 76], "electricity_cost": 28, "joint_spe": 28, "stall_torque_cost": 28, "joints_at_limit_cost": 28, "hud": 28, "gymwrapp": 28, "nativ": [28, 76], "compat": [28, 57, 65, 69, 70, 80, 85], "adher": [28, 88], "hop": 28, "km": 28, "constraint": [28, 31, 36, 65, 85, 94, 101], "walk_target_x": 28, "walk_target_i": 28, "haven": [28, 31, 62, 85], "actuat": 28, "tag": [28, 69, 70, 81, 84, 85], "cartesian": 28, "joint": [28, 33, 36, 94], "body_part": 28, "pose": [28, 61], "xyz": 28, "link0_2": 28, "2868544": 28, "torso": [28, 33, 36], "0166108": 28, "2329636": 28, "link0_3": 28, "02035943": 28, "link0_4": 28, "link0_6": 28, "03194364": 28, "03894688": 28, "thigh": 28, "03755892": 28, "814017": 28, "link0_8": 28, "0431742": 28, "58908712": 28, "05006377": 28, "33918206": 28, "link0_10": 28, "05695333": 28, "089277": 28, "foot": [28, 76], "12194239": 28, "09046921": 28, "floor": [28, 31, 57, 85], "robot_bodi": 28, "39135196": 28, "97286361": 28, "posteriori": 28, "optimis": [28, 33, 67], "learner_log_everi": 28, "learner": [28, 101], "loop_log_everi": 28, "learner_logg": 28, "terminallogg": 28, "time_delta": 28, "print_fn": 28, "loop_logg": 28, "policy_optim": 28, "critic_optim": 28, 
"observation_network": 28, "ident": [28, 80, 81, 84, 85], "op": [28, 100, 102], "num_step": [28, 82, 97], "100_000": 28, "num_episod": 28, "hopefulli": [28, 33, 39, 40, 74, 100], "env_step": 28, "gdrive": 28, "81b1f746": 28, "216e": 28, "11ee": 28, "93ef": 28, "0242ac1c000c": 28, "d4pg_learner": 28, "flush": 28, "roboflow": 30, "modelzoo": 30, "onnx": 30, "caden": 30, "qubvel": 30, "segmentation_model": 30, "zylo117": 30, "efficientdet": 30, "balavenkatesh3322": 30, "cv": [30, 39], "snakers4": 30, "silero": 30, "hugginfac": 30, "awesom": [30, 43, 57, 101], "awesomedata": 30, "uci": 30, "ic": [30, 76], "ml": [30, 67, 70], "php": 30, "zindi": 30, "africa": 30, "data_typ": 30, "dryad": 30, "datadryad": 30, "datasetsearch": 30, "zenodo": 30, "meagmohit": 30, "discoveri": [30, 84], "plan": [31, 34, 38, 57, 97, 101], "explicitli": [31, 33, 36, 39, 40, 80, 85, 100], "gradual": 31, "hypothes": [31, 34, 35, 37, 38, 84], "balanc": [31, 33, 36, 76, 77, 97, 102], "brainstorm": [31, 35, 36, 39, 40], "testabl": [31, 38], "hypothesi": [31, 33, 34, 38, 39, 40, 62, 67, 84, 85, 101], "evid": [31, 35, 36, 39, 40, 67, 73, 81, 87], "meet": [31, 38, 62, 85], "megapod": 31, "meant": [31, 39, 40, 57], "starter": 31, "reus": 31, "diverg": [31, 80, 94], "hesit": 31, "flexibl": [31, 37, 57, 62, 64, 76, 80, 88], "friendli": 31, "consult": 31, "sometim": [31, 33, 38, 40, 57, 60, 61, 62, 67, 69, 74, 76, 88, 97], "arriv": [31, 62], "resum": 31, "slightli": [31, 69, 70, 73, 84, 88, 91, 94], "footwork": 31, "whenev": [31, 62, 89, 100, 102], "senior": 31, "postdoc": 31, "professor": 31, "industri": [31, 88], "navig": [31, 97, 101], "perspect": [31, 100], "depend": [31, 35, 36, 38, 40, 62, 65, 67, 91], "regardless": [31, 100], "spend": [31, 35, 46, 61, 74, 87, 91, 101], "yourselv": [31, 69], "curiou": [31, 67, 82], "carefulli": [31, 34, 74], "brows": [31, 76], "booklet": 31, "skim": 31, "concret": [31, 35], "suit": [31, 67, 76, 94], "intertwin": 31, "headstart": 31, "readili": 31, "paragraph": [31, 
33, 39, 40, 88], "stage": [31, 67, 73, 76, 80, 85, 94], "stai": [31, 37, 64, 67, 94], "arrang": [31, 87, 100], "ideal": [31, 37, 60, 69, 80, 82, 85], "threefold": 31, "ingredi": [31, 37], "primarili": [31, 73, 77, 94], "pollut": 31, "climat": 31, "geograph": [31, 88], "surfac": [31, 67, 81], "temperatur": [31, 62, 80, 87, 88, 94, 100, 102], "1900": 31, "convnet": [31, 33, 46, 73, 74], "didn": [31, 33, 34, 36, 85, 88, 100, 102], "favorit": 31, "invest": 31, "upload": [31, 34, 43, 57], "internet": [31, 43, 54, 76], "reformat": 31, "shaki": 31, "vagu": 31, "weak": 31, "cover": [31, 44, 45, 57, 60, 76, 87, 101], "fairli": [31, 33, 74, 94], "aggress": 31, "stuck": [31, 62, 67, 74, 91, 101], "thesi": 31, "confer": [31, 91], "chanc": [31, 33, 46, 67, 73, 94], "venu": 31, "branch": [31, 62, 81, 82], "pursu": [31, 37, 88], "With": [31, 65, 70, 80, 85, 87, 97], "principl": [31, 33, 34, 35, 37, 38, 65, 69, 76, 82, 88, 97], "solut": [31, 38, 57, 60, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100, 101, 102], "importantli": [31, 33, 35, 91, 94], "jargon": 31, "cohes": 31, "explicit": [31, 36, 38, 81, 94, 101], "likewis": 31, "reveal": 31, "someon": [31, 74, 88, 91], "heard": [31, 73, 88], "narrow": [31, 62, 65], "discourag": [31, 61, 74], "month": [31, 35], "term": [31, 33, 36, 37, 57, 60, 64, 65, 67, 69, 70, 73, 74, 76, 80, 81, 94, 101, 102], "somebodi": 31, "somedai": 31, "logic": [31, 35, 43, 57, 91, 94, 101], "accident": 31, "peak": [31, 35, 36, 39, 69], "circular": 31, "catch": [31, 33], "experienc": [31, 35, 36, 39, 40], "guard": 31, "calendar": [31, 62], "daily_schedul": 31, "graphic": [31, 34, 38, 73], "told": 31, "greet": 31, "zoom": [31, 46, 57, 73], "themselv": 31, "wiggli": 31, "caterpillar": 31, "phd": 31, "dame": 31, "pari": [31, 87], "fli": 31, "bike": [31, 76], "ride": [31, 74], "speak": [31, 33, 37, 38, 39, 40, 57, 62, 76, 80], "wast": 31, "breakout": 31, "anyon": 31, "futur": [31, 33, 34, 38, 39, 40, 62, 69, 88, 97, 100, 102], 
"perhap": [31, 88], "hardest": [31, 35], "subgroup": 31, "timeslot": 31, "hour": [31, 46, 57, 61, 62, 69, 70, 84], "jupyterbook": [31, 36], "cutoff": 31, "superpod": 31, "conclus": [31, 39, 76, 84], "imposs": [31, 35, 38, 39, 61], "elev": 31, "poster": [31, 76], "Or": [31, 36, 38, 88], "zuckerberg": 31, "secur": [31, 43, 77, 88], "million": [31, 43, 67, 77, 84], "dollar": 31, "fund": 31, "art": [31, 39, 40, 61, 76, 77, 82, 84, 101], "act": [31, 70, 82, 85, 97, 100], "music": 31, "instrument": 31, "rehears": 31, "WILL": 31, "annoi": [31, 61], "tenth": [31, 76], "secret": 31, "anecdot": 31, "magic": [31, 35, 60, 61, 67, 87], "engag": 31, "passiv": 31, "bind": [31, 43, 57], "hear": [31, 101], "dream": 31, "pictur": [31, 73, 76, 77, 94], "oppos": [31, 39, 84], "technic": [31, 70, 102], "concis": 31, "rambl": 31, "life": [31, 33, 67, 77, 88, 101], "hart": [33, 34, 35, 36, 37, 38, 39, 40], "megan": [33, 34, 35, 36, 37, 38, 39, 40], "peter": [33, 34, 35, 36, 37, 38, 39, 40, 64], "vladimir": [33, 43, 57, 60, 67], "haltakov": [33, 43, 57, 60, 67], "paul": [33, 34, 35, 36, 37, 38, 39, 40, 44, 45], "schrater": [33, 34, 35, 36, 37, 38, 39, 40], "gunnar": [33, 34, 35, 36, 37, 38, 39, 40, 100, 101, 102], "blohm": [33, 34, 35, 36, 37, 38, 39, 40, 100, 101, 102], "modal": [33, 81, 91, 101], "acceleromet": 33, "skelet": [33, 35], "reconstruct": [33, 80, 85, 87], "pilot": [33, 35], "neccessari": 33, "demo": [33, 36, 40], "matric": [33, 39, 57, 62, 73, 80, 89, 97], "maxpool1d": 33, "confusion_matrix": 33, "unbalanc": [33, 74], "plotconfusionmatrix": 33, "real_label": 33, "predicted_label": [33, 87], "label_nam": [33, 36, 85], "conver": 33, "tick_nam": 33, "ytick": [33, 35, 39, 73], "mnqb7": [33, 36], "train_mov": [33, 36], "test_mov": [33, 36], "joint_nam": [33, 36], "sensor": [33, 35, 39, 80], "wristband": [33, 35], "address": [33, 34, 35, 36, 39, 40, 43, 62, 69, 76, 101], "novel": [33, 35, 101], "1032": [33, 36], "172": [33, 36, 70, 76], "closer": [33, 35, 36, 39, 57, 65, 88], 
"major": [33, 36, 82, 87, 88], "limb": [33, 36], "yaw": [33, 36], "roll": [33, 36, 67, 94], "advantag": [33, 36, 43, 73, 88], "agnost": [33, 36, 57], "3rd": [33, 36, 82, 100], "timepoint": [33, 35, 36, 39], "suppos": [33, 36, 82, 84], "cool": [33, 37, 62, 73], "joint_no": [33, 36], "pelvi": 33, "lefthip": 33, "righthip": 33, "spine1": 33, "leftkne": 33, "rightkne": 33, "spine2": 33, "leftankl": 33, "rightankl": 33, "spine3": 33, "leftfoot": 33, "rightfoot": 33, "neck": [33, 76], "leftcollar": 33, "rightcollar": 33, "leftshould": 33, "rightshould": 33, "leftelbow": 33, "rightelbow": 33, "leftwrist": 33, "rightwrist": 33, "lefthand": [33, 94], "righthand": [33, 94], "label_numb": [33, 36], "label_no": [33, 36], "crawl": 33, "throw": [33, 57, 85], "running_in_spot": 33, "cross_legged_sit": 33, "hand_clap": 33, "scratching_head": 33, "kick": [33, 60], "phone_talk": 33, "sitting_down": 33, "checking_watch": 33, "hand_wav": 33, "taking_photo": 33, "spread": [33, 36], "matter": [33, 34, 36, 40, 62, 70, 73, 80, 84, 88, 101], "asid": [33, 36, 70, 73], "stick": [33, 34, 36, 76], "hypothezis": [33, 36], "arm": [33, 36, 64, 76], "four": [33, 36, 62, 76, 80, 84, 87, 88, 101], "outperform": [33, 36, 76, 94, 101], "mathbb": [33, 39, 40, 57, 60, 62, 65, 67, 80, 81, 82], "perf_": 33, "deal": [33, 57, 67, 74, 77], "1d": [33, 39, 57, 60, 62, 64, 67, 73, 80], "sketch": [33, 37, 88], "data_tutori": 33, "movijointdataset": 33, "is_tensor": 33, "tolist": [33, 88, 100, 102], "intend": [33, 39, 62, 85], "movi_train": 33, "movi_test": 33, "1031": 33, "171": [33, 70, 76], "minu": 33, "bxcxt": 33, "516": [33, 76], "test_load": [33, 64, 65, 67, 69, 70, 73, 84], "highest": [33, 57, 62, 65, 88, 100, 102], "mov1dcnn": 33, "njoint": 33, "conv1d": 33, "dropout1": [33, 70, 73], "2200": 33, "nl": [33, 73], "dropout2": [33, 70, 73], "log_softmax": [33, 67, 69, 70, 84, 87, 102], "total_step": 33, "loss_list": 33, "acc_list": 33, "8982": 33, "9187": 33, "6901": 33, "6060": 33, "5478": 33, "4853": 33, 
"4423": 33, "5842": 33, "3525": 33, "3686": 33, "070": 33, "converg": [33, 57, 61, 62, 67, 70, 76, 97, 101], "decent": 33, "dark_background": [33, 40], "dark": [33, 76], "clap": 33, "phone": [33, 54, 76, 77], "photo": [33, 43, 74, 91], "tend": [33, 37, 69, 80, 84, 85], "sit": [33, 35, 39, 40, 101], "misclassifi": 33, "testjointmodel": 33, "cnn6j": 33, "51162790697676": 33, "limb_joint": 33, "limb_fit": 33, "28": [33, 60, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 88, 91], "leftleg": 33, "rightleg": 33, "leftarm": 33, "rightarm": 33, "formal": [33, 37, 88, 97], "win": [33, 36, 39, 88, 100, 102], "necessarili": 33, "wors": [33, 39, 76, 77], "limb_set": 33, "limb_set_fit": 33, "fundament": [33, 61, 64, 73, 88, 97], "pocket": [33, 64], "inert": 33, "imu": 33, "gyroscop": 33, "begun": 33, "guidelin": [33, 34, 39, 40, 94], "phenomena": [33, 34, 35, 39, 40, 101], "summar": [33, 34, 39, 40, 73, 84, 101], "character": [33, 64, 94], "articul": [33, 34, 39, 40, 76, 101], "relationship": [33, 34, 36, 37, 38, 39, 40, 61, 62, 64, 67, 73, 74, 76, 80, 81, 84], "neuroal": 33, "publicli": [33, 57], "outcom": [33, 34, 36, 38, 39, 40, 100, 101, 102], "contrari": [33, 35], "intuit": [33, 34, 36, 38, 60, 62, 65, 67, 76, 80, 81, 84, 87, 88, 94, 100], "briefli": [33, 34, 39, 40, 97], "argu": [33, 34, 39, 40, 101], "plausibl": [33, 34, 39, 40], "paraphras": [33, 39, 40], "jean": [34, 37, 38, 39, 76], "lauren": [34, 37, 38, 39], "messag": [34, 85, 100, 102], "experimentalist": 34, "analog": [34, 69, 70, 76], "famou": [34, 57, 88], "criteria": [34, 38], "convinc": 34, "AND": 34, "peer": 34, "pitfal": [34, 35, 36, 37, 38], "recap": [34, 35, 36, 37, 38, 64, 82, 100], "reader": [34, 38, 82], "appreci": [34, 35, 60, 74], "unreason": 34, "reject": 34, "draft": 34, "expeiment": 34, "mesi": 34, "journal": 34, "rightfulli": 34, "cleanli": 34, "stereotyp": [34, 84], "mensh": 34, "kord": [34, 43, 57, 73, 74, 80, 88, 91, 101], "2017": [34, 84], "consider": [34, 35, 37, 39], "strong": [34, 35, 39, 
40, 43, 64], "forc": [34, 43, 52], "focuss": [34, 36, 39], "succinctli": 34, "kp": 34, "pr": [34, 74], "eneuro": 34, "0352": 34, "1523": [34, 94], "nbdt": 34, "scholasticahq": 34, "articl": [34, 57, 70, 73, 76], "16723": 34, "plo": 34, "biol": 34, "e1005619": 34, "1371": 34, "pcbi": 34, "1005619": 34, "mk": 34, "w56vt": 34, "eric": [35, 36], "dewitt": [35, 36], "tara": [35, 36], "van": [35, 36, 73, 76, 82], "viegen": [35, 36], "ella": [35, 36, 101], "batti": [35, 36, 101], "eas": 35, "deconstruct": 35, "inclin": 35, "roleplai": 35, "regress": [35, 39, 60, 67], "cross_val_scor": [35, 39], "rasterplot": [35, 39], "trial_spik": [35, 39], "trial_ev": [35, 39], "nonzero": [35, 39, 100, 102], "eventplot": [35, 39], "plotcrossvalaccuraci": [35, 39], "boxplot": [35, 39], "vert": [35, 39], "set_vis": [35, 39, 73, 80, 97], "generatespiketrain": [35, 39], "subsetpercept": [35, 39], "velocity_sigma": [35, 39], "profil": [35, 39], "velocity_profil": [35, 39], "sensit": [35, 39, 65, 67, 84, 87], "target_shap": [35, 39], "multipli": [35, 39, 57, 61, 73, 74, 76, 81, 85, 88, 89, 91], "s_gain": [35, 39], "s_move": [35, 39], "s_fr": [35, 39], "rv": [35, 39, 40, 76, 80, 81], "hwin": [35, 39], "num_mov": [35, 39], "m_train": [35, 39], "m_test": [35, 39], "w_idx": [35, 39], "w_0": [35, 39], "w_1": [35, 39, 60, 61, 62], "stationari": [35, 39, 40], "spikes_stat": [35, 39], "spikes_mov": [35, 39], "train_spikes_stat": [35, 39], "train_spikes_mov": [35, 39], "test_spikes_stat": [35, 39], "test_spikes_mov": [35, 39], "x_train": [35, 39, 61, 62, 64, 65, 81], "x_test": [35, 39, 64, 65, 69, 70, 81, 82], "population_model": [35, 39], "liblinear": [35, 39], "newton": [35, 39, 67], "cg": [35, 39], "lbfg": [35, 39], "sag": [35, 39], "slope": [35, 39, 64, 65], "intercept_": [35, 39], "intercept": [35, 39], "ground_truth": [35, 39], "getdata": [35, 39], "illus": [35, 36, 62, 88], "introductori": 35, "showcas": 35, "joke": 35, "markdown1": [35, 36], "3pt": [35, 36], "window": [35, 39, 40, 43, 73, 76, 
85], "suddenli": [35, 39, 40], "wrong": [35, 37, 38, 39, 40, 57, 85, 88], "vice": [35, 36, 39, 40, 57, 85], "versa": [35, 36, 39, 40, 57, 85], "surround": [35, 39, 40, 97, 100], "disambigu": [35, 39, 40], "vibrat": [35, 39, 40, 88], "inde": [35, 39, 40, 57, 81], "vestibular": [35, 36, 39], "illusori": [35, 40], "markdown2": [35, 36], "sensori": [35, 39], "signal": [35, 36, 37, 39, 64, 65, 73, 76, 81, 82], "hold": [35, 39, 40, 60, 65, 97], "slowli": [35, 39, 70, 74, 76, 94], "judgement": [35, 36, 39], "markdown3": [35, 36], "out2": [35, 36], "out1": [35, 36], "out3": [35, 36], "tab": [35, 36, 57], "yesterdai": [35, 76, 101], "lost": [35, 37, 38, 57, 80, 100, 102], "mechanist": [35, 40], "unclear": 35, "deepli": [35, 38], "BUT": 35, "anywher": 35, "revisit": [35, 36, 70], "frequent": [35, 40, 69, 70], "necess": [35, 67], "bad": [35, 38, 39, 67, 69, 70, 80, 88], "nest": [35, 57, 60], "examin": [35, 39, 40, 60, 62, 76, 81, 82, 88], "attempt": [35, 40, 65, 77, 85, 91], "markdown21": 35, "4d": [35, 39, 57, 82], "markdown22": 35, "simultan": [35, 39, 62, 65], "fourth": [35, 39, 44, 62], "mi": [35, 39, 88, 97], "markdown23": 35, "orang": [35, 39, 70, 73, 76, 77, 81, 85], "green": [35, 39, 61, 67, 69, 70, 73, 76, 85], "markdown24": 35, "ey": [35, 39, 62, 76, 80, 81, 94], "move_no": [35, 39], "thorough": [35, 88], "prior": [35, 36, 37, 52, 76, 80, 81, 84, 101, 102], "dig": 35, "emit": [35, 88], "altern": [35, 40, 57, 67, 85, 100, 102], "complementari": [35, 37], "\u03b8": 36, "somehow": [36, 39, 69], "perceiv": [36, 39, 101], "markdown31": 36, "markdown32": 36, "markdown33": 36, "markdown34": 36, "markdown35": 36, "markdown36": 36, "markdown37": 36, "markdown38": 36, "markdown39": 36, "omit": 36, "latent": [36, 81, 88], "uncertainti": 36, "salienc": 36, "plant": [36, 62], "inventori": 36, "acquir": 36, "latex": 36, "strength": [36, 40, 73, 100], "em": [36, 85], "sup": 36, "\u03c3": [36, 62], "strongest": [36, 39], "ratio": [36, 39, 40, 62, 85], "came": [36, 39, 40], "hyp": 
[36, 39], "slower": [36, 39], "accum": [36, 39], "denot": [36, 39, 57, 60, 62, 64, 65, 70, 76, 81, 84], "perf": 36, "consecut": [36, 94], "express": [36, 37, 38, 57, 60, 67, 76, 80, 87, 94, 100, 102], "Be": [36, 38, 84, 91], "assumpt": [36, 70], "phenomenon": [36, 38, 67], "justifi": [36, 39], "lack": [36, 37, 57, 73], "clariti": [36, 37, 70], "satisfact": [37, 38], "empow": [37, 81, 101], "chose": [37, 40, 43, 74, 76, 80, 100], "physic": [37, 38], "granular": 37, "wider": [37, 70], "lumpabl": 37, "analyt": [37, 81, 82], "divers": 37, "Being": 37, "w1d1": [37, 46, 67], "meaningfulli": 37, "needless": 37, "highlight": [37, 40, 67, 81], "outlin": [37, 101], "thu": [37, 39, 46, 60, 62, 64, 65, 67, 70, 73, 80, 84, 88, 91, 94, 100], "huge": [37, 73, 76, 84], "facilit": [37, 88, 97, 101], "broken": 37, "portenti": 37, "hypothet": 37, "arrow": [37, 60, 61, 73, 85], "rough": 37, "icon": [38, 67], "easiest": 38, "surprisingli": [38, 62, 69], "insight": [38, 40, 67, 70, 74, 88, 91], "isn": [38, 67, 69, 73, 85], "equilibrium": 38, "asymptot": 38, "wrangl": [38, 57], "mistak": 38, "wore": 38, "useless": 38, "distract": 38, "alreali": 38, "determ": 38, "handi": 38, "satisfi": [38, 39, 65], "parametr": [38, 70, 102], "elimin": 38, "met": 38, "board": [38, 76, 87, 100, 102], "endless": 38, "neglect": 38, "warrant": 38, "qualit": 38, "upfront": 38, "breadth": 38, "bic": 38, "aic": 38, "subsumpt": 38, "uncov": [38, 74, 91, 101], "falsifi": 38, "leverl": 38, "avenu": 38, "experiment": [38, 40, 88, 101], "rethink": [38, 70], "implic": [38, 77, 81], "unanticip": 38, "consequ": [38, 61, 70, 77], "unbias": [38, 94], "disclaim": [39, 40, 67], "inner": [39, 62], "ear": [39, 76], "tricli": 39, "believ": [39, 62, 101], "ifram": [39, 44, 45], "mfr": 39, "57w2p": 39, "26mode": 39, "26action": 39, "scroll": [39, 57, 73], "641px": 39, "marginheight": 39, "framebord": 39, "allowfullscreen": 39, "webkitallowfullscreen": 39, "aka": 39, "problemat": 39, "invas": 39, "obvious": [39, 70, 80], "36": 
[39, 70, 73, 76, 81, 82, 84, 85, 97], "7575": 39, "975": [39, 76], "m_r": 39, "m_p": 39, "c_": [39, 40, 57, 64, 102], "cdot": [39, 40, 57, 60, 61, 62, 64, 65, 70, 76, 77, 84, 91, 100, 102], "soon": [39, 60, 88], "glm": 39, "whiteboard": [39, 40], "belong": [39, 65, 69, 70, 73, 100, 102], "halfwin": 39, "getdesignmatrix": 39, "extent": [39, 64, 65], "movstim": 39, "win_idx": 39, "a_r": 39, "desmat": 39, "mov": [39, 40], "33475": 39, "53275": 39, "61975": 39, "saw": [39, 62, 64, 67, 73, 74, 76, 80, 87, 91], "graph": [39, 57, 61, 69, 73, 87, 94], "magnitud": [39, 57, 67, 69, 73, 80, 81], "classifymotionfromspik": 39, "presenc": [39, 73, 88], "runanalysi": 39, "050": 39, "class_set": 39, "halfwin_no": 39, "lty": 39, "leg_hw": 39, "classes_no": 39, "leg_class": 39, "purpl": [39, 62, 85], "motions_no": 39, "cond_acc": 39, "m_acc": 39, "plotaccuraci": 39, "accuarci": 39, "xlim": [39, 64, 80, 81], "proport": [39, 40, 62, 94], "reflect": [39, 46, 57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "clearer": [39, 94], "judgment": [39, 40], "notion": [39, 76], "benefit": [39, 76, 101], "wrt": [39, 67, 85], "comclus": 39, "justifc": 39, "adjac": [39, 40, 85], "unknown": [39, 40, 88, 101], "effort": [39, 67], "cumul": [39, 97], "instantan": 39, "causal": [39, 101], "built": [39, 40, 43, 57, 67, 70, 73, 76, 80, 82, 84, 88, 101], "roadblock": [39, 40], "somewher": [39, 40, 73, 101], "neuroscientist": [39, 74], "drift": 40, "diffus": [40, 46, 62, 84], "establish": [40, 69, 74, 88], "frac": [40, 60, 64, 65, 67, 70, 73, 74, 76, 81, 82, 84, 91, 100, 102], "leakag": [40, 64], "instal": [40, 54], "vestibular_sign": 40, "sd": 40, "white": [40, 73, 76, 77, 94, 100], "1m": 40, "exp": [40, 64, 74, 80, 84, 100, 102], "diff": [40, 85], "leaki": 40, "thr": 40, "run_model": 40, "selfmot": 40, "thrshold": 40, "itertool": 40, "temp": [40, 57, 69, 70, 76, 81, 100, 102], "hypothsi": 40, "panel": [40, 54, 64], "layout": [40, 61, 62, 64, 65, 76, 80], "constrained_layout": [40, 61], "absent": 
[40, 85], "mov_": 40, "thr_": 40, "sig_": 40, "thr_n": 40, "c_n": 40, "subdf0": 40, "groupbi": 40, "subdf1": 40, "im0": 40, "im1": [40, 62], "set_ylim": [40, 61, 62, 67, 73, 81], "set_xlim": [40, 61, 62, 76, 81], "set_facecolor": 40, "grei": [40, 61, 76, 81], "redund": 40, "0004": 40, "d0": [40, 67], "d1": [40, 67, 73, 76], "201": [40, 76], "satur": 40, "push": [40, 43], "probabilist": [40, 67, 80], "dokka": 40, "sam": 43, "rai": [43, 76], "konrad": [43, 57, 73, 74, 80, 88, 91, 101], "modern": [43, 46, 60, 67, 84, 88, 101], "maintain": [43, 67, 73, 88], "ui": [43, 73, 76], "servic": [43, 101], "flasgger": 43, "pyngrok": 43, "vibecheck": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "datatop": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "datatopscontentreviewcontain": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "content_review": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "notebook_sect": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "prompt": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "pmyvdlilci": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "east": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "amazonaw": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 
93, 94, 96, 97, 99, 100, 101, 102], "klab": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "neuromatch_dl": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "user_kei": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "f379rz8y": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "feedback_prefix": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "bonus_deplooymodel": 43, "atexit": 43, "subprocess": [43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "timer": 43, "_run_ngrok": 43, "popen": 43, "regist": [43, 54, 60, 61, 77], "localhost_url": 43, "localhost": 43, "4040": 43, "tunnel": 43, "sleep": [43, 61, 73, 76, 94], "tunnel_url": 43, "public_url": 43, "start_ngrok": 43, "ngrok_address": 43, "traffic": [43, 76], "127": [43, 76], "run_with_ngrok": 43, "expos": [43, 85], "stdout": 43, "old_run": 43, "new_run": 43, "5000": [43, 57, 62, 67, 81, 84, 85], "setdaemon": 43, "urllib": [43, 76, 94, 100, 102], "urlopen": [43, 76, 94, 100, 102], "flask_rest": 43, "marshal": 43, "render_template_str": 43, "redirect": [43, 57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "authent": 43, "mail": [43, 76], "dashboard": 43, "authtoken": 43, "your_ngrok_authtoken": 43, "ngrok2": 43, "yml": 43, "delet": [43, 73, 85], "visit": [43, 67, 102], "_deploying_neural_networks_on_the_web_video": 43, "micro": 43, "scalabl": [43, 101], "nowadai": 43, "socket": 43, "linkedin": 43, "pinterest": 43, "_flask_video": 43, "handler": [43, 85, 87], "trick": [43, 67, 70, 74, 91], "server": 43, 
"__main__": 43, "wsgi": 43, "press": [43, 61, 76, 91], "ctrl": 43, "xxx": 43, "xx": [43, 57, 60, 61, 81], "button": [43, 53, 54, 61, 67], "site": [43, 65, 67, 69, 70, 76, 80, 81, 82, 85, 87, 88, 94, 100], "manual": [43, 64, 67, 69, 70, 73, 81, 84], "rout": 43, "h1": [43, 82], "_jinja_templates_video": 43, "offer": [43, 57, 67, 84, 85, 94], "reusabl": 43, "WIth": 43, "ifs": 43, "tabl": [43, 57, 62, 73, 76], "template_str": 43, "margin": [43, 77, 81, 82], "100px": [43, 80], "tr": 43, "200px": 43, "td": [43, 97], "endfor": 43, "unam": 43, "_asdict": 43, "_using_the_mvvm_design_pattern_video": 43, "gui": 43, "pointmodel": 43, "pointview": 43, "pointviewmodel": 43, "get_sample_data": 43, "viewmodel": 43, "classmethod": 43, "cl": [43, 84, 88], "add_resourc": 43, "pvm": 43, "_rest_api_video": 43, "platformview": 43, "swag_from": 43, "spec_dict": 43, "processor": [43, 88], "node": [43, 60, 62, 65, 67, 70, 76, 102], "arch": [43, 76], "resource_field": 43, "serial": [43, 67, 82], "platformviewmodel": 43, "apidoc": 43, "redirect_platform": 43, "swagger": 43, "swg": 43, "_vue": 43, "js_video": 43, "front": [43, 87, 97], "benefici": [43, 65], "similarli": [43, 57, 61, 80, 85, 94, 101], "javascript": [43, 88], "axoi": 43, "mount": 43, "vue_templ": 43, "script": [43, 94], "cdn": 43, "jsdelivr": 43, "npm": 43, "dist": [43, 81], "cdnj": 43, "cloudflar": 43, "ajax": 43, "lib": [43, 65, 67, 69, 70, 80, 81, 82, 85, 87, 94], "axio": 43, "ul": 43, "li": [43, 60, 73], "var": [43, 65, 80, 91], "el": 43, "_deploying_a_pytorch_model_video": 43, "densenet": 43, "_classification_with_a_pretrained_model_video": 43, "traini": 43, "densenet121": 43, "class_labels_url": 43, "hub": [43, 76], "imagenet_class": 43, "class_label": 43, "image_tensor": 43, "higherst": 43, "class_id": [43, 67], "class_nam": 43, "dog_imag": 43, "unsplash": 43, "2l0cwtpcchi": 43, "480": [43, 76], "foxhound": [43, 76], "_create_a_dynamic_application_video": 43, "index_templ": 43, "imageform": 43, "enctyp": 43, "multipart": 
43, "imagefil": 43, "10px": 43, "250px": 43, "50px": 43, "32px": 43, "bold": [43, 85], "20px": 43, "getelementbyid": 43, "addeventlisten": 43, "formdata": 43, "preventdefault": 43, "const": 43, "createobjecturl": 43, "predict_api": 43, "image_fil": 43, "image_byt": 43, "_deploy_on_heroku_video": 43, "paa": 43, "tier": 43, "_prepare_python_environment_video": 43, "venv": 43, "linux": 43, "maco": 43, "bat": 43, "gunicorn": 43, "torchaudio": 43, "caus": [43, 61, 64, 73, 94], "_creating_a_local_application_video": 43, "send_from_directori": 43, "transofrm": 43, "getenv": 43, "_preparing_for_heroku_video": 43, "coupl": [43, 62, 67, 69, 73, 80, 89], "procfil": 43, "freez": [43, 67, 85, 88, 89], "exce": [43, 73], "whl": 43, "torch_stabl": 43, "_deploying_on_heroku_video": 43, "cli": 43, "login": 43, "git": [43, 53, 76, 77, 87, 89, 94, 100, 102], "remot": [43, 76], "lt": 43, "herokuapp": 43, "_summary_video": 43, "middlebrook": [44, 45], "host": [44, 45], "panelist": [44, 45], "lyle": [44, 57, 69, 70, 74, 87, 88, 89, 91, 101], "ungar": [44, 57, 69, 70, 74, 87, 88, 89, 91, 101], "surya": [44, 57, 64, 65], "ganguli": [44, 57, 64, 65], "braininspir": [44, 45], "casto": [44, 45], "player": [44, 45, 76], "596518": 44, "fifth": 45, "brad": 45, "wybl": 45, "kyunghyun": 45, "cho": 45, "jo\u00e3o": 45, "sedoc": 45, "612309": 45, "live": [46, 57, 62, 84, 89], "tbd": 46, "ceremoni": 46, "utc": [46, 48], "pm": 46, "tue": 46, "wed": 46, "multilay": [46, 73], "perceptron": [46, 57, 64, 65, 67, 73], "fri": 46, "vae": 46, "synchron": [46, 73, 76], "eod": 46, "swap": [46, 57, 84, 85, 88], "farewel": 46, "graduat": 46, "portal": [46, 52], "goodby": 46, "impos": 46, "quarter": 46, "crowdcast": [46, 52], "zone": 48, "tz": 49, "launch": [51, 53, 54, 57], "setup": 51, "spot": [51, 76], "unusu": 51, "2022": [52, 91], "violat": 52, "precours": 52, "exempt": 52, "shrubhlgswj8dua7": 52, "shrjdpfwacarn5jop": 52, "assist": 52, "circumst": 52, "beyond": [52, 62, 70], "electr": [52, 76, 94], 
"blackout": 52, "grant": [52, 54], "elig": 52, "overwrit": 53, "china": [54, 76, 88], "substitut": [54, 85], "asococi": 54, "workaround": [54, 100, 102], "sidebar": 54, "credenti": 54, "artwork": [55, 82], "daniela": 55, "buchwald": 55, "shubh": 57, "pachchigar": 57, "matthew": 57, "sargent": 57, "deepak": [57, 94], "raya": [57, 94], "siwei": [57, 67, 69, 70], "bai": [57, 67, 69, 70, 76], "kelson": [57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "shill": [57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "scrivo": [57, 60, 61, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 84, 85, 87, 89, 91, 94, 100, 102], "anoop": [57, 60, 61, 62, 64, 65, 76, 77, 84, 85, 94], "kulkarni": [57, 60, 61, 62, 64, 65, 76, 77, 84, 85, 94], "arush": [57, 67, 76, 77, 100, 102], "tagad": [57, 67, 76, 77, 100, 102], "naivenet": 57, "preinstal": 57, "fulfil": 57, "w1d1_t1": 57, "getlogg": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 85, 87, 89, 94, 100, 102], "font_manag": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 84, 87, 89, 94], "checkexercise1": 57, "array_equ": 57, "vander": 57, "timefun": 57, "bufferedread": 57, "t_total": 57, "5f": [57, 82], "mess": 57, "insert": [57, 85], "25min": [57, 73, 76, 80], "adventur": 57, "_welcome_and_history_video": 57, "_why_dl_is_cool_video": 57, "multidimension": 57, "modular": 57, "deploi": [57, 94, 102], "_making_tensors_video": 57, "2000": [57, 80, 81], "3000": 57, "float64": [57, 100, 102], "6054e": 57, "0865e": 57, "7638e": 57, "4842e": 57, "seemingli": 57, "rand_lik": 57, "5906": 57, "6871": 57, "6338": 57, "1422": 57, "4803": 57, "0736": 57, "2737": 57, "5090": 57, "9543": 57, "2056": 57, "1748": 57, "2209": 57, "6177": 57, "6247": 57, "5401": [57, 101], "5953": 57, "9354": 57, "rng": 57, "simplefun": 57, "my_se": [57, 82], "4963": 57, "7682": 57, "0885": 57, "3643": 57, "1344": 57, "1642": 57, "3058": 57, "2100": 57, "9056": 57, "6035": 57, "8110": 57, 
"0451": 57, "8797": 57, "0482": 57, "familar": 57, "sim": [57, 65, 80, 81, 82], "mathcal": [57, 61, 62, 64, 65, 74, 80, 81, 82, 84], "dagger": [57, 67, 76, 84], "inclus": 57, "tensor_cr": 57, "notimplementederror": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 94, 97, 100, 102], "_creating_tensors_exercis": 57, "_tensors_operators_video": 57, "pointwis": 57, "0362": 57, "1852": 57, "3734": 57, "3051": 57, "9320": 57, "1759": 57, "2698": 57, "1507": 57, "0317": [57, 67], "2081": 57, "9298": 57, "7231": 57, "7423": 57, "5263": 57, "2437": 57, "overridden": [57, 85], "arithmet": 57, "lift": 57, "elementwis": [57, 84], "3333": 57, "syntax": [57, 88], "equival": [57, 64, 65, 67, 81, 82], "5846": 57, "0332": 57, "1387": 57, "2422": 57, "8155": [57, 73], "7932": 57, "2783": 57, "4820": 57, "8198": 57, "187318325042725": 57, "1051": 57, "3306": 57, "7517": 57, "7565": 57, "8509": 57, "5800": 57, "46525758504867554": 57, "3684": 57, "4435": 57, "5839": 57, "2522": 57, "6170": 57, "5267": 57, "matmul": [57, 80, 89], "bracket": [57, 88], "textbf": [57, 73], "bmatrix": [57, 60, 62, 73], "simple_oper": 57, "a1": 57, "a2": 57, "a3": 57, "matrici": 57, "dot_product": 57, "b1": 57, "b2": 57, "geometr": [57, 73, 76, 84], "cosin": [57, 62, 82, 87, 89, 94], "_simple_tensor_operations_exercis": 57, "_manipulating_tensors_video": 57, "last_el": 57, "exclud": [57, 61, 62, 73, 80], "5d": [57, 87], "3x4": 57, "subtl": 57, "singleton": [57, 80, 94], "compress": [57, 62, 73, 80], "opposit": [57, 100], "zeroth": 57, "7391": 57, "8027": 57, "6817": 57, "1335": 57, "0658": 57, "5919": 57, "7670": 57, "6899": 57, "3282": 57, "5085": 57, "peski": 57, "gave": 57, "rid": [57, 85], "7390837073326111": 57, "times48": 57, "times64": 57, "times3": 57, "image_height": [57, 73], "image_width": [57, 73, 76], "0th": 57, "2nd": [57, 82], "cat_row": 57, "cat_col": 57, "colum": 57, "convers": [57, 91], "minor": [57, 74], "inconveni": 57, "halt": 57, "chunk": [57, 73, 87, 88], 
"introduc": [57, 60, 69, 70, 73, 76, 80, 84, 87, 88, 97, 101], "2659": 57, "5148": 57, "0613": 57, "5046": 57, "1385": 57, "floattensor": [57, 100, 102], "26593232": 57, "5148316": 57, "06128114": 57, "5046449": 57, "13848118": 57, "invok": [57, 67], "elmement": 57, "functiona": 57, "my_tensor1": 57, "my_tensor2": 57, "retun": 57, "functionb": 57, "my_tensor": 57, "idx_tensor": 57, "functionc": 57, "_manipulating_tensors_exercis": 57, "_gpu_vs_cpu_video": 57, "rerun": [57, 94], "reimport": 57, "nvidia": 57, "pure": [57, 69, 70, 82, 94, 100], "whilst": 57, "unless": [57, 65, 67, 84, 88], "lazili": [57, 85], "fashion": [57, 84, 88], "forth": 57, "compatibl": 57, "recreat": [57, 76], "74070": 57, "87535": 57, "_how_much_faster_are_gpus_exercis": 57, "_gpus_discuss": 57, "_getting_data_video": 57, "fortun": 57, "vehicl": [57, 76], "cifar10_data": 57, "airplan": 57, "automobil": 57, "reorder": 57, "rearrang": [57, 84], "input_var": 57, "_display_an_image_exercis": 57, "_train_and_test_video": 57, "adress": 57, "training_data": 57, "rese": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "worker_init_fn": [57, 64, 65, 67, 69, 70, 73, 76, 80, 82], "g_seed": [57, 64, 65, 67, 69, 70, 73, 76, 80], "batch_imag": 57, "batch_label": 57, "predefin": [57, 73], "checkout": 57, "excercis": 57, "my_data_load": 57, "3309": 57, "_load_cifar10_exercis": 57, "_csv_files_video": 57, "interleav": 57, "circl": [57, 61, 80, 100], "sample_data": 57, "make_moon": 57, "to_csv": 57, "x_orig": 57, "to_numpi": 57, "y_orig": 57, "interg": 57, "_generating_neural_network_video": 57, "differend": 57, "obligatori": 57, "x_sampl": 57, "nnetwork": 57, "y_predict": [57, 60, 61], "npredict": [57, 85], "9066": 57, "5052": 57, "2024": 57, "1226": [57, 65], "0685": 57, "2809": 57, "6720": 57, "5097": 57, "8548": 57, "5122": 57, "1543": 57, "8018": 57, "2077": 57, "9859": 57, "5745": 57, "1924": 57, "8367": 57, "1818": 57, "8301": 57, "grad_fn": [57, 60, 73], 
"addmmbackward": 57, "_classify_some_examples_exercis": 57, "_train_the_network_video": 57, "jonchar": 57, "kera": 57, "pathlib": [57, 65, 69, 70, 85], "plot_decision_boundari": 57, "frames_path": 57, "x_min": 57, "x_max": [57, 64, 65], "y_min": 57, "y_max": 57, "yy": [57, 60, 61, 81], "meshgrid": [57, 60, 64, 65, 67, 81], "gid": 57, "grid_point": 57, "contour": [57, 61, 67, 80, 81], "contourf": [57, 60, 61], "spectral": [57, 74], "correcspond": 57, "transmit": 57, "loss_funct": [57, 60], "15000": 57, "y_logit": 57, "1000th": 57, "05d": 57, "6582635641098022": 57, "2830354869365692": 57, "24354352056980133": 57, "23178495466709137": 57, "4000": [57, 94], "22571030259132385": 57, "2219410538673401": 57, "6000": 57, "21937936544418335": 57, "7000": 57, "21753723919391632": 57, "21614307165145874": 57, "9000": 57, "21508803963661194": 57, "21437251567840576": 57, "11000": 57, "21384570002555847": 57, "12000": 57, "21345028281211853": 57, "13000": 57, "21314124763011932": 57, "14000": 57, "2128836214542389": 57, "interactiveshel": 57, "ast_node_interact": 57, "gif": 57, "mimsav": 57, "gifpath": 57, "_play_with_it_video": 57, "_tweak_your_network_discuss": 57, "_xor_widget_video": 57, "exclus": [57, 67, 76], "odd": 57, "gate": 57, "inequ": 57, "alik": 57, "hline": [57, 73], "tensorflow": [57, 60, 88], "perfectli": [57, 69, 91], "tini": [57, 60], "infinit": [57, 62, 64, 97], "x_1": [57, 65, 91], "x_2": [57, 65, 91], "w1_min_xor": 57, "theses": 57, "voila": 57, "_xor_interactive_demo": 57, "_ethics_video": 57, "_be_a_group_video": 57, "_syllabus_video": 57, "andrew": [57, 60, 61, 62, 91], "sax": [57, 60, 61, 62, 76], "ioanni": [57, 67], "mitliagka": [57, 67], "alona": [57, 73], "fysh": [57, 73], "alexand": [57, 76, 77, 84], "ecker": [57, 76, 77], "jame": 57, "evan": 57, "vikash": [57, 80], "gilja": [57, 80], "akash": 57, "srivastava": [57, 73], "tim": [57, 94, 100, 102], "lillicrap": [57, 94], "blake": [57, 94, 100, 102], "richard": [57, 73, 76, 77, 80, 94, 97, 100, 102], 
"jane": 57, "feryal": 57, "behbahani": 57, "josh": 57, "vogelstein": 57, "vincenzo": 57, "lamonaco": 57, "iclr": 57, "patient": [57, 61, 62, 65, 67, 70, 74, 76, 80, 82, 84, 88, 91, 94, 97, 101], "delai": [57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "strobelt": 57, "mit": [57, 88], "ibm": 57, "watson": 57, "hoover": 57, "retreiv": 57, "allenai": 57, "s2orc": 57, "methodolog": 57, "alt": [57, 73], "gltr": 57, "ml_regexv1_cs_ma_cit": 57, "_99perc": 57, "pos_umap_cosine_100_d0": 57, "pos_fil": 57, "qyrfn": 57, "_99perc_clean": 57, "meta_fil": 57, "vfdu6": 57, "load_data": 57, "merg": [57, 88], "paper_id": 57, "read_json": 57, "meta": [57, 101], "left_on": 57, "right_on": 57, "year_period": 57, "quinquenni": 57, "selection_multi": 57, "chart": 57, "citation_count": 57, "mark_circl": 57, "opac": [57, 87], "viridi": [57, 76], "clamp": [57, 82, 94], "1955": 57, "pow": 57, "expon": [57, 76], "tooltip": 57, "decad": [57, 76], "add_select": 57, "distant": 57, "publish": [57, 76], "citat": [57, 88], "hover": [57, 73, 77], "boom": 57, "winter": [57, 62], "mileston": 57, "_bonus_section_discuss": 57, "fullfil": 57, "criterria": 57, "bleu": 57, "specter": 57, "umap": 57, "rush": [57, 84], "ignorecas": 57, "issel": 57, "na": [57, 70], "0000000001": 57, "ON": 57, "vi": [57, 100, 102], "stroke": [57, 87], "strokeopac": 57, "strokewidth": 57, "colaboratori": 57, "faq": 57, "deeplearningbook": 57, "ian": 57, "goodfellow": 57, "yoshua": 57, "bengio": [57, 65], "aaron": 57, "courvil": 57, "w1d2_bonuslectur": 59, "_yoshua_bengio_video": 59, "turishcheva": [60, 61, 62, 73, 76, 77], "antoin": [60, 61, 62, 64], "comit": [60, 61, 62, 64], "khalid": [60, 61, 62, 84, 85, 94], "almubarak": [60, 61, 62, 84, 85, 94], "skillset": 60, "w1d2_t1": 60, "mpl_toolkit": [60, 61, 62], "axes_grid1": [60, 61, 62], "make_axes_locat": [60, 61, 62], "ex3_plot": 60, "lss": 60, "mse": [60, 61, 62, 80, 81, 102], "ex1_plot": 60, "fun_z": 60, "fun_dz": 60, "sine": [60, 64, 82], "zz": [60, 61], "xg": 
60, "yg": 60, "xxg": 60, "yyg": 60, "zxg": 60, "zyg": 60, "contplt": [60, 61], "quiver": [60, 61, 81], "cax": [60, 61, 62], "append_ax": [60, 61, 62], "cbar": [60, 61, 62, 77], "set_label": [60, 61, 77], "workhors": 60, "poorli": [60, 67], "_introduction_video": [60, 64, 67, 84, 94, 100], "risk": [60, 69], "_gradient_descent_video": 60, "clarifi": [60, 74, 91, 101], "dfrac": [60, 61, 62], "equiv": 60, "circ": [60, 80], "dx": [60, 62], "dg": 60, "dh": 60, "rewrit": 60, "2x": [60, 67], "2y": 60, "_gradient_vector_analytical_exercis": 60, "dz_dx": 60, "dz_dy": 60, "x_0": [60, 81, 82], "y_0": 60, "landscap": [60, 67], "steep": 60, "plateau": [60, 62], "minima": [60, 61, 67, 70, 100, 102], "maxima": 60, "aforement": [60, 73], "formula": [60, 64, 65, 81, 91, 102], "1847": 60, "augustin": 60, "loui": [60, 64], "cauchi": 60, "_gradient_vector_exercis": 60, "_gradient_descent_discussion_video": 60, "mathbf": [60, 61, 62, 80, 81, 82], "rightarrow": [60, 61], "nabla": [60, 61, 67, 81], "w_d": [60, 80], "guess": [60, 61, 62, 69], "qquad": 60, "learnabl": [60, 61, 73, 80], "w_2": [60, 61], "ln": [60, 81], "_gradients_analytical_exercis": 60, "_computational_graph_video": 60, "overwhelm": 60, "extraordinarili": 60, "beast": [60, 76, 81], "backpropag": [60, 65], "oper": [60, 61, 62, 64, 67, 70, 73, 76, 80, 82, 84], "_chain_rule_analytical_exercis": 60, "_autodifferentiation_video": 60, "declar": 60, "rebuild": 60, "simplegraph": 60, "sq_loss": 60, "y_true": [60, 61], "squre": 60, "simple_graph": 60, "niniti": 60, "square_loss": 60, "arbitrarili": [60, 100], "_building_a_computational_graph_exercis": 60, "interconnect": 60, "acycl": 60, "addbackward": 60, "addbackward0": 60, "0x7f8a7bb7ba90": 60, "nameerror": [60, 62, 64, 65, 67, 70, 73, 76, 80, 81, 84, 85, 87, 100, 102], "traceback": [60, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 85, 87, 88, 94, 97, 100, 102], "y_t": [60, 61], "y_p": [60, 61], "grad": [60, 67, 81], "contagi": 60, "leaf": [60, 76, 87, 102], "method_nam": 60, 
"my_object": 60, "ana_dloss_dw": 60, "ana_dloss_db": 60, "autograd_dloss_dw": 60, "autograd_dloss_db": 60, "gentl": 60, "_pytorch_nn_module_video": 60, "pack": [60, 76, 88], "n_sampl": [60, 62, 80], "widenet": 60, "wide_net": [60, 65], "stochstic": 60, "003": [60, 81], "sgd_optim": 60, "888942301273346": 60, "loss_fun": 60, "loss_record": [60, 61], "recod": 60, "exercic": 60, "physiqu": 60, "mathematiqu": 60, "_training_loop_exercis": 60, "_tutorial_1_wrapup_video": 60, "w1d2_t2": 61, "intslid": [61, 62, 67, 73, 80], "floatslid": [61, 62, 64, 67, 73, 80], "hbox": [61, 73, 76, 80], "interactive_output": [61, 62, 73, 76, 80], "togglebutton": 61, "plot_x_y_": 61, "x_t_": 61, "y_t_": 61, "x_ev_": 61, "y_ev_": 61, "loss_log_": 61, "weight_log_": 61, "shallownarrownet": 61, "plot_vector_field": 61, "init_weight": [61, 65, 87], "x_po": 61, "endpoint": 61, "y_po": 61, "mgrid": 61, "empty_lik": 61, "x_temp": 61, "y_temp": 61, "gen_sampl": 61, "plasma": 61, "temp_model": 61, "shallownarrowlnn": 61, "da": [61, 81, 82, 88, 97], "dloss_dw": 61, "temp_record": 61, "zorder": [61, 73], "red": [61, 67, 69, 70, 73, 76, 85, 94], "coolwarm": [61, 67], "plot_loss_landscap": 61, "loss_rec_1": 61, "w_rec_1": 61, "loss_rec_2": 61, "w_rec_2": 61, "plot_surfac": [61, 67], "scatter3d": [61, 67], "view_init": [61, 67], "260": [61, 76], "depth_widget": 61, "depth_lr_init_interplai": 61, "lr_widget": 61, "depth_lr_interplai": 61, "deepnarrowlnn": 61, "w_i": 61, "yscale": 61, "plot_init_effect": 61, "init_w": 61, "ncol": [61, 64, 67], "interplai": [61, 67], "min_depth": 61, "max_depth": 61, "depth_list": 61, "i_depth": 61, "min_lr": 61, "max_lr": 61, "slider": [61, 62, 67, 73, 76, 80, 81, 85], "button_styl": 61, "danger": [61, 70], "argwher": [61, 102], "add_gridspec": 61, "ax3": [61, 73], "set_yscal": 61, "datapoint": [61, 64, 65, 67, 69, 70, 80, 81, 82], "offset": [61, 67, 80, 84, 87], "evenli": [61, 76], "w1": [61, 85], "w2": [61, 85], "dloss_dw1": 61, "dloss_dw2": 61, "n_ep": [61, 62], 
"corrspond": 61, "weight_record": 61, "thin": [61, 80, 84], "dw": [61, 81], "ex": 61, "wp": 61, "isinf": 61, "_shallow_narrow_linear_net_video": 61, "incred": [61, 74], "dissect": 61, "comprehend": 61, "occas": 61, "compact": 61, "pressur": 61, "_loss_gradients_analytical_exercis": 61, "shallownarrowexercis": 61, "shallownarrow": 61, "netwrok": 61, "211": [61, 76], "initial_weight": 61, "x_eval": 61, "sn_model": 61, "loss_log": [61, 67], "weight_log": 61, "y_eval": 61, "_simple_narrow_lnn_exercis": 61, "_training_landscape_video": 61, "1x": 61, "ribbon": 61, "yellow": [61, 73, 76, 85, 94], "crowd": 61, "saddl": [61, 62], "_training_landscape_discussion_video": 61, "clever": 61, "_effect_of_depth_video": 61, "realiti": 61, "incap": [61, 62], "unseen": [61, 67, 85, 101], "w_": [61, 62, 65, 67, 70, 100], "vanish": [61, 65, 76], "chain": [61, 62, 76, 81, 85], "vulnerabl": 61, "impair": [61, 94], "fastest": [61, 62], "eventu": [61, 80], "continuous_upd": [61, 62, 80], "_effect_of_depth_discussion_video": 61, "trade": [61, 67, 102], "_learning_rate_video": 61, "045": 61, "readout_format": [61, 62], "_learning_rate_discussion_video": 61, "_depth_and_learning_rate_video": 61, "deliv": 61, "confid": [61, 67, 73, 88, 100, 102], "impli": [61, 64, 100], "intpl_obj": 61, "500px": 61, "widgets_ui": [61, 62, 80], "widgets_out": [61, 62, 80], "_depth_and_learning_rate_interactive_demo": 61, "_depth_and_learning_rate_discussion_video": 61, "_initialization_matt": 61, "_initialization_matters_discussion_video": 61, "_wrapup_video": 61, "overflow": [61, 73], "difficulti": [61, 67], "_hyperparameter_interaction_bonus_discuss": 61, "ethic": 62, "w1d2_t3": 62, "floatlogslid": [62, 67, 76], "vbox": [62, 73, 76, 80], "filterwarn": [62, 94], "plot_x_y_hier_data": 62, "im2": 62, "subplot_ratio": 62, "hierarch": [62, 65], "ax0": 62, "plot_x_y_hier_on": 62, "plot_tree_data": 62, "label_list": [62, 87], "feature_arrai": 62, "new_featur": 62, "listedcolormap": 62, "cyan": [62, 85], "magenta": 
[62, 70, 77], "n_featur": 62, "n_label": 62, "feature_list": 62, "can_grow": 62, "is_mamm": 62, "has_leav": 62, "can_mov": 62, "has_trunk": 62, "can_fli": 62, "can_swim": 62, "has_stem": 62, "is_warmblood": 62, "can_flow": 62, "goldfish": [62, 76], "tuna": 62, "robin": [62, 76], "canari": 62, "rose": [62, 76], "daisi": [62, 76], "pine": 62, "oak": 62, "implt": 62, "set_yticklabel": 62, "set_ytick": [62, 80], "set_xtick": [62, 76, 80], "set_xticklabel": [62, 76], "loss_arrai": 62, "plot_loss_sv": 62, "sv_arrai": 62, "n_sing_valu": 62, "set1": 62, "plot1": [62, 69], "plot2": [62, 69], "plot_loss_sv_twin": 62, "tick_param": 62, "labelcolor": 62, "twinx": 62, "plot_ills_sv_twin": 62, "ill_arrai": 62, "ill_label": 62, "plot_loss_sv_rsm": 62, "rsm_arrai": 62, "i_ep": 62, "yaxi": 62, "tick_right": 62, "implot": 62, "rsm": 62, "item_nam": 62, "axvspan": 62, "build_tre": 62, "n_level": 62, "n_branch": 62, "to_np_arrai": 62, "pflip": 62, "sample_from_tre": 62, "n_item": 62, "rand_temp": 62, "flip_temp": 62, "samp": 62, "prop": 62, "generate_hsd": 62, "tree_label": 62, "tree_featur": 62, "linear_regress": 62, "linalg": [62, 77, 80, 81, 85, 87, 89], "inv": [62, 81], "dy": 62, "add_featur": 62, "existing_featur": 62, "hstack": 62, "net_svd": 62, "in_dim": [62, 67], "orthogon": [62, 89, 94], "w_tot": 62, "net_rsm": 62, "initializer_": 62, "n_out": 62, "n_in": 62, "normal_": [62, 69, 70, 82], "test_initializer_ex": 62, "lnnet": 62, "ex_initializer_": 62, "faulti": 62, "test_net_svd_ex": 62, "net_svd_ex": 62, "u_ex": 62, "\u03c3_ex": 62, "v_ex": 62, "ex_net_svd": 62, "isclos": 62, "atol": 62, "test_net_rsm_ex": 62, "net_rsm_ex": 62, "y_ex": 62, "ex_net_rsm": 62, "tight": [62, 87], "timelin": 62, "hid_dim": 62, "out_dim": [62, 67], "ouput": 62, "in_hid": 62, "hid_out": 62, "hid": 62, "illusory_i": 62, "input_dim": [62, 82], "rs_mat": 62, "pred_ij": 62, "mu": [62, 65, 74, 80, 81], "n_": [62, 65, 76], "underscor": [62, 85], "_reinitialization_exercis": 62, 
"_intro_to_representation_learning_video": 62, "shallow": [62, 87], "hardli": [62, 87], "sacrif": 62, "syntact": [62, 84], "swim": [62, 76], "fin": 62, "cast": [62, 76], "label_tensor": 62, "feature_tensor": 62, "vital": [62, 70], "premis": [62, 85], "dim_input": 62, "dim_hidden": 62, "dim_output": 62, "dlnn_model": 62, "bump": 62, "loss_lr_init": 62, "1f": [62, 67], "_training_the_deep_lnn_interactive_demo": 62, "_svd_video": 62, "prod_": [62, 74], "tot": 62, "2013": [62, 91], "decompos": [62, 88], "untangl": 62, "matix": 62, "evd": 62, "eigenvector": [62, 80], "_svd_exercis": 62, "_svd_discuss": 62, "_svd_discussion_video": 62, "_rsa_video": 62, "remark": [62, 67, 74, 88, 101], "smallest": [62, 82], "fish": [62, 76], "loss_svd_rsm_lr_gamma": 62, "i_ep_slid": 62, "630px": 62, "lr_slider": 62, "gamma_slid": 62, "moment": [62, 73, 80], "naiv": [62, 81], "suprem": 62, "unsurprisingli": 62, "_rsa_exercis": 62, "_rsa_discussion_video": 62, "_illusorycorrelations_video": 62, "sudden": 62, "tempt": 62, "immatur": 62, "bone": [62, 101], "distinct": [62, 65], "shark": [62, 76], "skeleton": [62, 73, 94], "cartilagin": 62, "lighter": [62, 76], "illusion_idx": 62, "its_label": 62, "has_bon": 62, "ill_predict": 62, "medic": [62, 69], "parrot": 62, "cannot_speak": 62, "_illusory_correlations_exercis": 62, "_illusory_correlations_discussion_video": 62, "_outro_video": [62, 65, 100], "_linear_regression_bonus_video": 62, "gp": 62, "air": [62, 80], "predictor": 62, "multivari": 62, "pop": [62, 76, 85, 97], "y_": 62, "vdot": 62, "ddot": 62, "x_": [62, 64], "b_": 62, "broadcast": [62, 81, 82], "notat": [62, 80, 81, 88, 97], "underset": [62, 89, 100, 102], "mathrm": [62, 65, 84], "argmin": [62, 97], "invert": [62, 80], "linear_regression_exercis": 62, "w_true": 62, "w_estim": 62, "nestim": 62, "_analytical_solution_to_lr_exercis": 62, "deep_w_tot": 62, "analytical_weight": 62, "lrnet": 62, "in_out": 62, "lr_model": 62, "zero_depth_model": 62, "lr_model_weight": 62, "allclos": [62, 
81, 82, 94], "_linear_regression_discussion_video": 62, "arash": [64, 65, 80], "ash": [64, 65, 76, 80], "felix": [64, 65], "bartsch": [64, 65], "yu": [64, 65, 73, 76, 77], "fang": [64, 65, 73, 76, 77], "yang": [64, 65, 73, 76, 77, 81, 82, 88, 97], "melvin": [64, 65, 76, 77, 84, 85, 94, 100, 102], "selim": [64, 65, 76, 77, 84, 85, 94, 100, 102], "atai": [64, 65, 76, 77, 84, 85, 94, 100, 102], "arguabl": [64, 76, 91, 101], "tractabl": 64, "w1d3_t1": 64, "helper": [64, 69, 76, 81], "unnormalis": [64, 65], "unnorm": [64, 65, 69, 70, 76, 81], "npimg": [64, 65, 69, 70], "plot_function_approxim": 64, "relu_act": 64, "y_hat": 64, "incom": [64, 76, 85, 88], "basi": [64, 69, 82], "_universal_approximation_theorem_video": 64, "inflect": 64, "_c": 64, "0x7f8734c83830": 64, "2871": 64, "6413": 64, "8615": [64, 73], "3649": 64, "6931": 64, "7542": [64, 67], "5983": 64, "7588": 64, "3569": 64, "6389": 64, "approximate_funct": 64, "n_relu": 64, "i_relu": 64, "combination_weight": 64, "prev_slop": 64, "delta_x": 64, "_function_approximation_with_relu_exercis": 64, "1hr": 64, "_building_mlps_in_pytorch_video": 64, "lipschitz": 64, "prove": [64, 65, 101], "fascin": 64, "terminolog": [64, 94], "negative_slop": [64, 65], "ge": 64, "actv": [64, 65], "input_feature_num": [64, 65], "hidden_unit_num": [64, 65], "output_feature_num": [64, 65], "in_num": [64, 65], "temporari": [64, 65], "out_num": [64, 65], "linear_": [64, 65], "actv_lay": [64, 65], "activation_": [64, 65], "out_lay": [64, 65], "output_linear": [64, 65], "_implement_a_general_purpose_mlp_in_pytorch_exercis": 64, "_cross_entropy_video": 64, "addition": [64, 100], "_i": [64, 74, 80], "operatornam": [64, 89, 100, 102], "sum_": [64, 65, 69, 74, 77, 80, 100], "cross_entropy_loss": [64, 100, 102], "avg_loss": [64, 82], "x_of_label": 64, "pytorch_loss": 64, "our_loss": 64, "8f": 64, "34672737": 64, "34672749": 64, "00000012": 64, "_implement_batch_cross_entropy_loss_exercis": 64, "fanci": 64, "quad": 64, "leq": [64, 65], "ldot": 
64, "create_spiral_dataset": 64, "_training_and_evaluating_an_mlp_video": 64, "shuffle_and_split_data": [64, 65], "shuffled_indic": [64, 65], "datset": [64, 65, 76], "_implement_it_for_a_classification_task_exercis": 64, "multithread": [64, 65], "train_test_classif": [64, 65], "training_plot": [64, 65], "data_load": [64, 65, 67, 73, 76, 82], "gaug": [64, 65], "train_tot": [64, 65], "test_tot": [64, 65], "_whats_the_point_of_ev": 64, "_and_train": 64, "_discuss": 64, "ish": 64, "sample_grid": [64, 65, 82], "x_all": [64, 65], "jj": [64, 65], "ij": [64, 65, 69, 70], "plot_decision_map": [64, 65], "decision_map": [64, 65], "33": [64, 65, 73, 76, 81, 82, 84, 85, 94, 97], "_does_it_generalize_well_discuss": 64, "leayer": 64, "biophys": 64, "circuit": 64, "excess": 64, "_biological_to_artificial_neurons_bonus_video": 64, "1907": 64, "\u00e9douard": 64, "lapicqu": 64, "electrophysiolog": 64, "theoret": [64, 65, 67], "dayan": 64, "laurenc": 64, "abbott": 64, "v_m": 64, "c_m": 64, "t_": 64, "v_": [64, 67, 100], "r_": 64, "membran": 64, "voltag": 64, "capacit": 64, "resit": 64, "rm": 64, "momentarili": 64, "mimick": 64, "refractori": 64, "eqnarrai": 64, "sp": 64, "exceed": 64, "synapt": 64, "euler": [64, 82], "delta": [64, 67, 81, 97], "superscript": [64, 70], "run_lif": 64, "tau_ref": 64, "vth": 64, "v_spike": 64, "msec": 64, "resist": 64, "kohm": 64, "uf": 64, "vm": 64, "t_rest": 64, "volatag": 64, "refactori": 64, "sim_tim": 64, "my_layout": [64, 65], "plot_if_curv": 64, "is_max": 64, "spike_count": 64, "_real_and_artificial_neuron_similarities_bonus_discuss": 64, "w1d3_t2": 65, "make_grid": [65, 76, 77, 80, 82], "reshapinng": 65, "naccuraci": 65, "indec": 65, "32x32": 65, "animalfac": [65, 69, 70], "animalfaces32x32": 65, "kgfvj": [65, 69, 70], "zfile": [65, 69, 70, 94, 100, 102], "chdir": 65, "_deep_expressivity_video": 65, "max_par_count": 65, "run_depth_optim": 65, "max_hidden_lay": 65, "hidden_lay": 65, "test_scor": 65, "count_paramet": 65, "par_count": 65, 
"hidden_unit": 65, "_wide_vs_deep_exercis": 65, "optimum": [65, 67, 81], "hurt": 65, "_wide_vs_deep_discuss": 65, "spiral": [65, 76], "polynomi": [65, 102], "upto": 65, "constatnt": 65, "run_poly_classif": 65, "poly_degre": 65, "make_poly_featur": 65, "poly_x": 65, "poly_x_test": 65, "poly_x_train": 65, "poly_test_data": 65, "poly_test_load": 65, "poly_train_data": 65, "poly_train_load": 65, "poly_net": 65, "poly_x_al": 65, "max_poly_degre": 65, "3200": 65, "1325": 65, "_does_a_wide_model_generalize_well_discuss": 65, "_case_study_video": 65, "randomrot": 65, "offici": 65, "get_data_load": [65, 73], "img_train_load": 65, "img_test_load": [65, 70], "augmentation_transform": [65, 73], "preprocessing_transform": [65, 73], "train_transform": [65, 69, 70, 73, 76, 77], "test_transform": [65, 70], "data_path": [65, 69, 70], "afhq": [65, 69, 70], "img_train_dataset": 65, "img_test_dataset": [65, 70], "tpu": 65, "nrow": [65, 76, 77, 80, 82], "fc1_weight": 65, "_dataloader_real_world_exercis": 65, "_why_first_layer_features_are_high_level_discuss": 65, "_ethics_hype_in_ai_video": 65, "chaotic": 65, "chao": [65, 67], "implicitli": [65, 80, 88, 101], "_need_for_good_initialization_bonus_video": 65, "o_i": 65, "o_": 65, "drawn": [65, 80], "2_": 65, "2_j": 65, "albeit": 65, "blow": [65, 76], "dilemma": [65, 97], "glorot": 65, "plug": [65, 102], "geq": 65, "z_": 65, "z_i": [65, 100], "confirm": [65, 87, 101], "ngain": 65, "xavier_normal_": 65, "xavier_uniform_": 65, "best_gain": 65, "theoretical_gain": 65, "valueerror": [65, 80, 85, 94, 97], "opt": [65, 67, 69, 70, 80, 81, 82, 85, 87, 88, 94], "hostedtoolcach": [65, 67, 69, 70, 80, 81, 82, 85, 87, 94], "x64": [65, 67, 69, 70, 80, 81, 82, 85, 87, 94], "fromnumer": 65, "1229": [65, 91], "1142": 65, "1143": 65, "1144": 65, "1227": 65, "1228": 65, "kwd": 65, "_novalu": 65, "_wrapfunc": 65, "obj": 65, "getattr": 65, "_wrapit": 65, "attributeerror": [65, 88], "jose": 67, "gallego": 67, "posada": 67, "piyush": [67, 69, 70], "chauhan": 
[67, 69, 70], "charl": [67, 80], "edelson": [67, 80], "krishnakumaran": 67, "w1d5_t1": 67, "rc": [67, 94], "unicode_minu": [67, 81, 94], "print_param": 67, "named_paramet": [67, 69, 70, 85], "incent": 67, "led": 67, "_unexpected_consequences_discuss": 67, "pedagog": 67, "spotlight": [67, 76], "emphasi": [67, 88], "vet": 67, "handwritten": [67, 73, 80], "strictli": 67, "wavi": 67, "vallei": [67, 76], "deepest": [67, 82], "_case_study_mlp_classification_video": 67, "y2fj6": [67, 80], "ndownload": [67, 73, 80], "load_mnist_data": 67, "change_tensor": 67, "greyscal": [67, 73], "train_set": [67, 80], "train_siz": 67, "784": [67, 73, 76], "train_target": 67, "test_set": 67, "test_target": 67, "std_dev": 67, "tform": 67, "concentr": 67, "subset_index": 67, "num_figur": 67, "sample_id": [67, 100, 102], "matshow": [67, 97], "ndenumer": 67, "steelblu": 67, "use_bia": 67, "multilayerperceptron": 67, "num_hidden_lay": 67, "transformed_x": 67, "hidden_output": 67, "constitut": 67, "encapsul": 67, "deep_learning_tutori": 67, "surrog": 67, "nll_loss": [67, 69, 70, 84], "_thi": 67, "cell_": 67, "opportun": [67, 76, 91], "cell_verbos": 67, "partial_trained_model": 67, "7e": 67, "goto": 67, "ineffici": [67, 101], "_optimization_of_an_objective_function_video": 67, "zero_": [67, 87], "random_upd": 67, "noise_scal": 67, "randn_lik": [67, 81, 82], "gradient_upd": 67, "model1": [67, 76], "0264": 67, "0173": 67, "0297": 67, "0278": 67, "0221": 67, "0086": 67, "0254": 67, "0233": 67, "0231": 67, "0342": 67, "0124": 67, "0157": 67, "0111": 67, "0144": 67, "0301": 67, "0181": 67, "0303": 67, "0208": 67, "0353": 67, "0183": 67, "0271": 67, "0099": 67, "0033": 67, "0022": 67, "0307": 67, "0243": 67, "0159": 67, "0064": 67, "0263": 67, "0174": 67, "0298": 67, "0047": 67, "0302": 67, "0093": 67, "0077": 67, "0248": 67, "0234": 67, "0237": 67, "0117": 67, "0187": 67, "0006": 67, "0156": 67, "0143": 67, "0164": 67, "0286": 67, "0238": 67, "0127": 67, "0191": 67, "0188": 67, "0206": 67, "0354": 
67, "0184": 67, "0272": 67, "0098": 67, "0002": 67, "0292": 67, "0018": 67, "0054": 67, "0246": 67, "0198": 67, "0061": 67, "_implement_gradient_descent_exercis": 67, "induc": [67, 89], "_run": 67, "model_nam": [67, 85], "my_model": 67, "base_loss": 67, "dummy_model": 67, "loss1": 67, "gd_delta": 67, "trial_id": 67, "hist": [67, 94], "get_legend_handles_label": 67, "bbox_to_anchor": 67, "fancybox": 67, "shadow": 67, "_gradient_descent_vs_random_search_discuss": 67, "haunt": 67, "_momentum_video": 67, "bridg": [67, 76], "gap": 67, "flatter": [67, 70], "exhibit": [67, 101], "recomput": 67, "_how_momentum_works_discuss": 67, "w_t": 67, "quantiti": [67, 80, 81, 91], "v_t": 67, "underbrac": 67, "leftarrow": 67, "loss_2d": 67, "mask_idx": 67, "378": [67, 76], "bias_id": 67, "bias_idx": 67, "ones_lik": [67, 97], "masked_weight": 67, "mesh": 67, "_subplot": 67, "axessubplot": 67, "surf": 67, "antialias": 67, "set_zlabel": 67, "plot_param_dist": 67, "best_u": 67, "best_v": 67, "traj": [67, 81], "use_log": 67, "y_min_v": 67, "y_max_v": 67, "run_optim": 67, "eval_fn": 67, "update_fn": 67, "max_step": [67, 88, 97], "optim_kwarg": 67, "log_traj": 67, "callabl": 67, "customiz": 67, "auxiliari": 67, "aux_tensor": 67, "xs": [67, 73], "momentum_upd": 67, "grad_vel": 67, "model2": [67, 76], "initial_vel": 67, "5898": 67, "0116": 67, "0239": 67, "0871": 67, "4030": 67, "9577": 67, "4653": 67, "6022": 67, "7363": 67, "5485": 67, "2747": 67, "6539": 67, "4117": 67, "1045": 67, "6492": 67, "0201": 67, "6503": 67, "1310": 67, "5098": 67, "5075": 67, "0718": 67, "1192": 67, "2900": 67, "9657": 67, "4405": 67, "1174": 67, "0792": 67, "1857": 67, "3537": 67, "0824": 67, "4254": 67, "3760": 67, "7491": 67, "6025": 67, "4147": 67, "8720": 67, "6201": 67, "9632": 67, "9430": 67, "5180": 67, "3417": 67, "6574": 67, "3677": 67, "_implement_momentum_exercis": 67, "line2d": [67, 81], "run_newton": 67, "init_list": 67, "par_tensor": 67, "t_g": 67, "eval_loss": 67, "eval_grad": 67, "eval_hess": 67, 
"hessian": 67, "fromit": 67, "lstyle": 67, "interact_manu": [67, 73], "momentum_experi": 67, "9e": 67, "sgd_traj": 67, "mom_traj": 67, "plot3d": 67, "lime": 67, "_momentum_vs_gd_interactive_demo": 67, "_momentum_and_oscillations_discuss": 67, "couldn": [67, 85], "onward": 67, "remaind": 67, "_overparameterization_video": 67, "losslandscap": 67, "mental": 67, "undesir": 67, "optima": 67, "ampl": 67, "evolv": [67, 69, 70, 101], "overparam": 67, "5e": [67, 69, 70, 85], "num_init": 67, "hdim": 67, "base_model": 67, "2e": [67, 70], "loss_hist": 67, "num_param": 67, "_overparameterization_interactive_demo": 67, "downsid": 67, "_width_and_depth_of_the_network_discuss": 67, "quest": 67, "thousand": [67, 91], "_mini_batches_video": 67, "measure_update_tim": 67, "num_point": 67, "loss_tim": 67, "gradient_tim": 67, "computation_tim": 67, "times_list": 67, "_cost_of_computation_interactive_demo": 67, "sample_minibatch": 67, "iid": 67, "input_data": 67, "target_data": 67, "batch_input": 67, "batch_target": 67, "batch_indic": 67, "x_batch": 67, "y_batch": 67, "_implement_mini_batch_sampling_exercis": 67, "budget": 67, "minibatch_experi": 67, "time_budget": 67, "plot_data": 67, "precaut": 67, "afford": [67, 101], "steadili": 67, "_compare_different_minibatch_sizes_interactive_demo": 67, "awar": [67, 77, 88], "knob": 67, "prototyp": [67, 76], "_adaptive_methods_video": 67, "rmsprop_upd": 67, "grad_sq": 67, "quotient": 67, "gsq": 67, "model3": [67, 76], "0031": 67, "0193": 67, "0316": 67, "0063": 67, "0318": 67, "0109": 67, "0232": 67, "0218": 67, "0253": 67, "0102": 67, "0203": 67, "0027": 67, "0136": 67, "0089": 67, "0123": 67, "0324": 67, "0166": 67, "0281": 67, "0133": 67, "0197": 67, "0182": 67, "0186": 67, "0376": 67, "0293": 67, "0019": 67, "0313": 67, "0011": 67, "0122": 67, "0199": 67, "0329": 67, "0041": 67, "_implement_rmsprop_exercis": 67, "congrat": 67, "compare_optim": 67, "sgd_dict": 67, "mom_dict": 67, "rms_dict": 67, "fuchsia": 67, "all_dict": 67, "opt_dict": 67, 
"opt_nam": 67, "optim_dict": 67, "_compare_optimizers_interactive_demo": 67, "excel": [67, 73, 87, 89, 97], "_compare_optimizers_discuss": 67, "plain": [67, 81], "undisput": 67, "amsgrad": 67, "adagrad": 67, "burden": 67, "_loss_function_and_optimization_discuss": 67, "15min": [67, 73, 76, 80], "_ethical_concerns_video": 67, "utilis": 67, "unforeseen": 67, "beat": [67, 88, 100], "mission": 67, "tricki": [67, 70], "_putting_it_all_together_bonus_video": 67, "benchmark_model": 67, "sj4e8": 67, "benchmark_state_dict": 67, "eoferror": 67, "1040": 67, "pickle_modul": [67, 82], "weights_onli": [67, 82], "mmap": [67, 82], "pickle_load_arg": [67, 82], "1038": [67, 91], "runtimeerror": [67, 69, 70, 81, 85], "1039": 67, "unpicklingerror": 67, "unsafe_messag": 67, "_legacy_load": 67, "opened_fil": [67, 82], "1258": 67, "1252": 67, "hasattr": [67, 85], "readinto": 67, "version_info": 67, "1253": 67, "1254": 67, "1255": [67, 91], "newer": 67, "restor": 67, "magic_numb": 67, "1259": 67, "1260": 67, "ran": 67, "eval_model": 67, "acc_log": 67, "batch_id": 67, "log_freq": 67, "val_freq": 67, "train_set_orig": 67, "test_set_orig": 67, "val_set_orig": 67, "val_idx": 67, "step_idx": 67, "running_acc": 67, "_train_your_own_model_bonus_exercis": 67, "_metrics_bonus_discuss": 67, "drum": [67, 76], "nbenchmark": 67, "ravi": [69, 70], "teja": [69, 70, 73], "konkimalla": [69, 70], "mohitrajhu": [69, 70], "lingan": [69, 70], "kumaraian": [69, 70], "kevin": [69, 70], "machado": [69, 70], "gamboa": [69, 70], "roberto": [69, 70, 76, 77], "guidotti": [69, 70, 76, 77], "w2d1_t1": 69, "afhq_random_32x32": [69, 70], "afhq_10_32x32": [69, 70], "9sj7p": [69, 70], "wvgkq": [69, 70], "plot_weight": [69, 70], "ws": [69, 70], "axhlin": [69, 70], "ls": [69, 70, 94], "early_stop_plot": 69, "train_acc_earlystop": 69, "val_acc_earlystop": 69, "best_epoch": [69, 70], "solid": [69, 70, 73, 76, 101], "reg_function1": [69, 70], "reg_function2": [69, 70], "regularis": [69, 70], "lambda1": [69, 70], "lambda2": 
[69, 70], "view_a": [69, 70], "val_acc_list": [69, 70], "train_acc_list": [69, 70], "param_norm_list": [69, 70], "trained_model": [69, 70], "param_norm": [69, 70], "calculate_frobenius_norm": [69, 70], "_introduction_to_regularization_video": 69, "_regularization_as_shrinkage_video": 69, "underperform": [69, 77], "underfit": 69, "_f": 69, "a_": 69, "6572162508964539": 69, "_forbenius_norm_exercis": 69, "6572": 69, "plot_weigt": 69, "forbeniu": 69, "3810": [69, 70], "_overparameterization_and_overfitting_video": 69, "overparametr": [69, 73], "leaky_relu": [69, 70], "normi": 69, "wsi": 69, "model_norm": [69, 70], "norm_per_lay": 69, "running_predict": [69, 70], "pl": [69, 88], "layer_nam": 69, "title1": 69, "title2": 69, "set_text": 69, "repeat_delai": [69, 70], "html_anim": [69, 70], "1285": [69, 70, 81], "embed_limit": [69, 70, 81], "1282": [69, 70, 81], "tmpdir": [69, 70, 81], "m4v": [69, 70, 81], "1283": [69, 70, 81], "1284": [69, 70, 81], "mpl": [69, 70, 81], "1286": [69, 70, 81], "codec": [69, 70, 81], "h264": [69, 70, 81], "1287": [69, 70, 81], "bitrat": [69, 70, 81], "1288": [69, 70, 81], "_interv": [69, 70, 81], "1289": [69, 70, 81], "148": [69, 70, 76, 81], "moviewriterregistri": [69, 70, 81], "146": [69, 70, 76, 81], "147": [69, 70, 76, 81], "_regist": [69, 70, 81], "moviewrit": [69, 70, 81], "_interpreting_losses_discuss": 69, "frobeni": 69, "normf": 69, "wsf": 69, "honest": 69, "seldom": 69, "wild": [69, 70, 76], "len_train": 69, "len_val": 69, "len_test": 69, "14430": [69, 70], "img_dataset": [69, 70], "img_train_data": [69, 70], "img_val_data": [69, 70], "afhq_random": [69, 70], "random_img_train_data": [69, 70], "random_img_val_data": [69, 70], "rand_train_load": [69, 70], "rand_val_load": [69, 70], "afhq_10": [69, 70], "partially_random_train_data": [69, 70], "partially_random_val_data": [69, 70], "partial_rand_train_load": [69, 70], "partial_rand_val_load": [69, 70], "biganimalnet": [69, 70], "val_acc_pur": [69, 70], "train_acc_pur": [69, 70], 
"end_tim": 69, "223": [69, 76], "69637608528137": 69, "p_x": 69, "p_y": 69, "visualize_data": [69, 70], "image_class": [69, 70], "val_acc_random": 69, "train_acc_random": 69, "_early_stopping_video": 69, "early_stopping_main": [69, 70], "best_model": [69, 70], "_early_stopping_exercis": 69, "harm": [69, 82], "_early_stopping_discuss": 69, "caveat": 69, "intial": [69, 100], "val_acc_shuffl": 69, "train_acc_shuffl": 69, "_early_stopping_generalization_bonus_discuss": 69, "shrinkag": 70, "peril": 70, "w2d1_t2": 70, "frobeniu": 70, "animalnet": 70, "kep": 70, "_l1_and_l2_regularization_video": 70, "teammat": 70, "reg_train_data": 70, "reg_val_data": 70, "14500": 70, "reg_train_load": 70, "reg_val_load": 70, "val_acc_unreg": 70, "train_acc_unreg": 70, "param_norm_unreg": 70, "lasso": 70, "ddagger": 70, "l_r": 70, "subsect": 70, "sgn": 70, "mbox": 70, "l1_reg": 70, "445133209228516": 70, "_l1_regularization_exercis": 70, "args1": [70, 100, 102], "test_batch_s": 70, "val_acc_l1reg": 70, "train_acc_l1reg": 70, "param_norm_l1reg": 70, "251": [70, 76, 80], "253": [70, 76, 80], "254": [70, 76, 80], "165": [70, 76, 94], "163": [70, 76, 94], "164": [70, 76, 94], "166": [70, 76, 94], "167": [70, 76, 94], "_tune_lambda1_exercis": 70, "quadrat": 70, "\u03b7": 70, "l2_reg": 70, "328375816345215": 70, "_l2_ridge_regularization_exercis": 70, "args2": [70, 102], "val_acc_l2reg": 70, "train_acc_l2reg": 70, "param_norm_l2reg": 70, "168": [70, 76], "169": [70, 76], "170": [70, 76], "_tune_lambda2_exercis": 70, "args3": 70, "val_acc_l1l2reg": 70, "train_acc_l1l2reg": 70, "param_norm_l1l2reg": 70, "lambda_2": 70, "lambda_1": 70, "174": [70, 76], "173": [70, 76], "175": [70, 76], "176": [70, 76], "_dropout_video": 70, "liter": 70, "subsequ": [70, 73, 85, 87], "netdropout": 70, "unsqueeze_": 70, "running_predictions_dp": 70, "train_loss_dp": 70, "test_loss_dp": 70, "model_norm_dp": 70, "_dropout_discuss": 70, "fare": 70, "animalnetdropout": 70, "248": [70, 76], "val_acc_dropout": 70, 
"train_acc_dropout": 70, "model_dp": 70, "val_acc_big": 70, "train_acc_big": 70, "model_big": 70, "dp": 70, "placement": 70, "_dropout_caveats_discuss": 70, "_data_augmentation_video": 70, "14280": 70, "new_transform": 70, "randomverticalflip": 70, "new_train_data": 70, "new_train_load": 70, "model_aug": 70, "val_acc_dataaug": 70, "train_acc_dataaug": 70, "param_norm_dataaug": 70, "model_pur": 70, "param_norm_pur": 70, "_data_augmentation_discussuion": 70, "_overparameterized_vs_small_nn_discussuion": 70, "_sgd_video": 70, "broader": 70, "bewar": 70, "overshoot": 70, "11700": 70, "2930": 70, "full_train_load": 70, "full_val_load": 70, "acc_dict": 70, "val_": 70, "train_": 70, "param_norm_": 70, "0e": 70, "_hyperparameter_tuning_video": 70, "consum": 70, "bayesian": 70, "evolutionari": 70, "_overview_of_regularization_techniques_discuss": 70, "_adversarial_attacks_bonus_video": 70, "inevit": 70, "defens": 70, "distil": 70, "prone": 70, "w2d2_bonuslectur": 72, "_kynghyun_cho_video": 72, "dawn": 73, "mcknight": 73, "gerum": 73, "cassidi": 73, "pirlot": 73, "rohan": 73, "saha": 73, "liam": 73, "peet": 73, "pare": 73, "najafi": 73, "lili": [73, 84, 85, 100, 102], "cheng": [73, 84, 85, 100, 102], "bettina": [73, 76, 77], "hein": [73, 76, 77], "nina": 73, "kudryashova": 73, "anmol": 73, "gupta": [73, 100, 102], "xiaoxiong": 73, "tran": 73, "minh": 73, "hmrishav": 73, "bandyopadhyai": 73, "rahul": 73, "shekhar": 73, "w2d2_t1": 73, "trang": [73, 80, 81, 82], "correlate2d": 73, "gzip": 73, "download_data": 73, "nextract": 73, "fz": 73, "ft": [73, 76, 84, 85], "foldernam": 73, "gunzip": 73, "f_in": 73, "f_out": 73, "copyfileobj": 73, "check_shape_funct": 73, "image_shap": 73, "kernel_shap": 73, "correct_shap": 73, "user_shap": 73, "output_shap": [73, 80], "check_conv_funct": 73, "conv_funct": 73, "solution_us": 73, "solution_scipi": 73, "result_right": 73, "check_pooling_net": 73, "x_img": 73, "emnist_train": 73, "x_img_idx": 73, "output_x": 73, "right_output": 73, "309552": 
73, "6216984": 73, "2708383": 73, "6654134": 73, "2271233": 73, "873457": 73, "318945": 73, "46229": 73, "663746": 73, "8889914": 73, "31068993": 73, "354934": 73, "378724": 73, "882853": 73, "499334": 73, "8546696": 73, "29296": 73, "096506": 73, "7074604": 73, "984148": 73, "12916": 73, "10037": 73, "667609": 73, "2780352": 73, "436305": 73, "9764223": 73, "98801": 73, "1756": 73, "531992": 73, "664275": 73, "5453291": 73, "2691708": 73, "3217516": 73, "3798618": 73, "05612564": 73, "218788": 73, "360992": 73, "980816": 73, "354935": 73, "8126211": 73, "9199777": 73, "9382377": 73, "076582": 73, "035061": 73, "92164516": 73, "434638": 73, "7816348": 73, "83254766": 73, "right_shap": 73, "display_image_from_greyscale_arrai": 73, "_matrix": 73, "_img": 73, "220": [73, 76], "make_plot": 73, "actual_convolut": 73, "memor": 73, "l1": [73, 84], "_introduction_to_cnns_and_rnns_video": 73, "penal": 73, "dens": [73, 80, 82], "_regularization_and_effective_number_of_params_discuss": 73, "_representations_and_visual_processing_in_the_brain_video": 73, "aristotl": 73, "bc": 73, "certainli": [73, 88], "_what_makes_a_representation_good_discuss": 73, "_details_about_convolution_video": 73, "lipton": 73, "smola": 73, "underlin": 73, "run_demo": 73, "id_html": 73, "w2d2_convnetsanddlthink": 73, "interactive_demo": 73, "convent": [73, 80], "convolv": [73, 81], "conv_check": 73, "incorrect": [73, 85], "_convolution_of_a_simple_kernel_exercis": 73, "calculate_output_shap": 73, "output_height": 73, "output_width": 73, "kernel_height": 73, "kernel_width": 73, "correcli": 73, "_convolution_output_size_exercis": 73, "beneath": 73, "convolve2d": 73, "convolution2d": 73, "im_h": 73, "im_w": 73, "ker_h": 73, "ker_w": 73, "out_h": 73, "out_w": 73, "out_row": 73, "out_col": 73, "overlai": 73, "current_product": 73, "_coding_a_convolution_exercis": 73, "chicago_skyline_shrunk_v2": 73, "bmp": 73, "ipydisplai": 73, "skyline_image_fil": 73, "img_skyline_orig": 73, "img_skyline_mat": 73, 
"kernel_v": 73, "kernel_hor": 73, "img_processed_mat_v": 73, "img_processed_mat_hor": 73, "img_processed_mat": 73, "img_process": 73, "plethora": 73, "whatev": 73, "dim1": 73, "dim2": 73, "3x3": 73, "convolutionbackward0": 73, "undefin": [73, 85], "0s": [73, 94, 100], "onto": [73, 76, 77, 80, 88], "_visualization_of_convolution_with_padding_and_stride_interactive_demo": 73, "abrupt": 73, "_edge_detection_discuss": 73, "stripe": 73, "1s": [73, 94, 100], "processed_imag": 73, "_kernel_structure_discuss": 73, "50min": 73, "binar": 73, "charact": [73, 85, 88], "itl": 73, "nist": 73, "gov": 73, "iaui": 73, "vip": 73, "cs_link": 73, "xwfaj": 73, "get_xvs0_dataset": 73, "emnist_test": 73, "1307": 73, "3081": 73, "train_idx": 73, "int64": [73, 87], "test_idx": 73, "o_img_idx": 73, "ax4": 73, "_visualization_of_convolution_with_multiple_filters_interactive_demo": 73, "thicker": 73, "net2": 73, "kernel_1": 73, "kernel_2": 73, "tthird": 73, "checkerboard": 73, "kernel_3": 73, "multiple_kernel": 73, "ax11": 73, "ax12": 73, "ax13": 73, "axesimag": 73, "0x7f79eadc18b0": 73, "_multiple_filters_discuss": 73, "o_img": 73, "output_o": 73, "ax14": 73, "ax21": 73, "ax22": 73, "ax23": 73, "ax24": 73, "ax31": 73, "ax32": 73, "ax33": 73, "ax34": 73, "rectifi": [73, 94], "net3": 73, "output_x_relu": 73, "output_o_relu": 73, "ax15": 73, "ax16": 73, "ax17": 73, "ax25": 73, "ax26": 73, "ax27": 73, "ax35": 73, "ax36": 73, "ax37": 73, "strengthen": 73, "funciton": 73, "cup": [73, 76], "invari": [73, 91, 100], "retain": 73, "translation": 73, "_pooling_video": 73, "systemat": 73, "neighborhood": 73, "depict": 73, "_the_effect_of_the_stride_interactive_demo": 73, "net4": 73, "_implement_maxpooling_exercis": 73, "output_x_pool": 73, "output_o_pool": 73, "intact": 73, "33min": 73, "_putting_it_all_together_video": 73, "times32": 73, "num_dens": 73, "num_conv": 73, "do_plot": 73, "image_s": [73, 77], "number_of_linear": 73, "number_of_conv2d": 73, "final_lay": 73, "sample_imag": 73, "linear_lay": 
73, "linear_net": 73, "code_dens": 73, "model_dens": 73, "result_dens": 73, "conv_lay": 73, "conv_net": 73, "code_conv": 73, "model_conv": 73, "shape_conv": 73, "result_conv": 73, "t_1": 73, "shape_linear": 73, "t_2": 73, "t_3": 73, "p1": 73, "p2": 73, "addbox": 73, "text1": 73, "text2": 73, "text3": 73, "clip_on": 73, "gcf": [73, 76], "set_tight_layout": 73, "my_stringiobyt": 73, "seek": [73, 76], "my_base64_jpgdata": 73, "mystr": 73, "caption": [73, 84], "range1": 73, "range2": 73, "slider_batch_s": 73, "slider_image_s": 73, "images": 73, "slider_number_of_linear": 73, "numdens": 73, "slider_number_of_conv2d": 73, "numconv": 73, "slider_kernel_s": 73, "kernels": 73, "input_pool": 73, "checkbox": [73, 85], "input_final_lay": 73, "output_code1": 73, "output_plot": 73, "plot_func": 73, "code1": 73, "code2": 73, "doctyp": 73, "5px": 73, "clearfix": 73, "2em": 73, "h2": [73, 82], "irrespect": 73, "_number_of_parameters_interactive_demo": 73, "_implement_your_own_cnn_video": 73, "9216": 73, "therebi": 73, "emnist_net": 73, "emnistnet": 73, "10d": 73, "_implement_your_own_cnn_exercis": 73, "nresult": 73, "ouselv": 73, "lean": [73, 74], "20min": [73, 84, 85, 94], "_writing_your_own_training_loop_bonus_video": 73, "shirt": [73, 76], "trouser": 73, "pullov": 73, "dress": [73, 88], "coat": [73, 76], "sandal": [73, 76], "sneaker": 73, "ankl": 73, "boot": [73, 76], "10min": [73, 77, 84, 85, 94], "2min": 73, "zalandoresearch": 73, "fashionmnist": 73, "dfhu5": 73, "reduce_class": 73, "get_fashion_mnist_dataset": 73, "validation_data": 73, "_the_training_loop_bonus_video": 73, "ourput": 73, "mnist_train": 73, "mnist_test": 73, "udpat": 73, "07": 73, "fmnist_net1": 73, "FOR": 73, "_code_the_training_loop_bonus_exercis": 73, "combat": [73, 76], "greatli": [73, 82], "_overfitting_bonus_discuss": 73, "30min": [73, 80, 94], "fmnist_net2": 73, "_adding_regularization_bonus_exercis": 73, "_adding_regularization_bonus_discuss": 73, "precalcul": 73, "3495898238046372": 73, 
"2901147632522786": 73, "2504794800931469": 73, "23571575765914105": 73, "21297093365896255": 73, "19087818914905508": 73, "186408187797729": 73, "19487689035211472": 73, "16774938120803934": 73, "1548648244958926": 73, "1390149021382503": 73, "10919439224922593": 73, "10054351237820501": 73, "09900783193594914": 73, "08370604479507088": 73, "07831853718318521": 73, "06859792241866285": 73, "06152600247383197": 73, "046342475851873885": 73, "055123823092992796": 73, "83475": 73, "8659166666666667": 73, "8874166666666666": 73, "8913333333333333": 73, "8998333333333334": 73, "9140833333333334": 73, "9178333333333333": 73, "9138333333333334": 73, "9251666666666667": 73, "92975": 73, "939": [73, 76, 94], "9525833333333333": 73, "9548333333333333": 73, "9585833333333333": 73, "9655833333333333": 73, "9661666666666666": 73, "9704166666666667": 73, "9743333333333334": 73, "9808333333333333": 73, "9775": 73, "334623601436615": 73, "2977438402175903": 73, "2655304968357086": 73, "25506321132183074": 73, "2588835284113884": 73, "2336345863342285": 73, "3029863876104355": 73, "240766831189394": 73, "2719801160693169": 73, "25231350839138034": 73, "2500132185220718": 73, "26699506521224975": 73, "2934862145781517": 73, "361227530837059": 73, "33196919202804565": 73, "36985905408859254": 73, "4042587959766388": 73, "3716402840614319": 73, "3707024946808815": 73, "4652537405490875": 73, "866875": 73, "851875": 73, "8775": 73, "889375": 73, "881875": 73, "900625": 73, "898125": 73, "885625": 73, "876875": 73, "899375": 73, "90625": 73, "89875": 73, "884375": 73, "874375": 73, "89375": 73, "903125": 73, "890625": 73, "35404509995528993": 73, "30616586227366266": 73, "2872369573946963": 73, "27564131199045383": 73, "25969504263806853": 73, "24728168408445855": 73, "23505379509260046": 73, "21552803914280647": 73, "209761732277718": 73, "19977611067526518": 73, "19632092922767427": 73, "18672360206379535": 73, "16564940239124476": 73, "1654047035671612": 73, "1684555298985636": 73, 
"1627526102349796": 73, "13878319327263755": 73, "12881529055773577": 73, "12628930977525862": 73, "11346105090837846": 73, "8324166666666667": 73, "8604166666666667": 73, "8680833333333333": 73, "8728333333333333": 73, "8829166666666667": 73, "88625": 73, "89425": 73, "90125": 73, "9015833333333333": 73, "90925": 73, "9114166666666667": 73, "917": [73, 76], "9268333333333333": 73, "92475": 73, "921": [73, 76], "9255833333333333": 73, "9385": 73, "9428333333333333": 73, "9424166666666667": 73, "9484166666666667": 73, "3533937376737595": 73, "29569859683513644": 73, "27531551957130435": 73, "2576177391409874": 73, "26947550356388095": 73, "25361743807792664": 73, "2527468180656433": 73, "24179009914398195": 73, "28664454460144045": 73, "23347773611545564": 73, "24672816634178163": 73, "27822364538908007": 73, "2380720081925392": 73, "24426509588956832": 73, "2443918392062187": 73, "24207917481660843": 73, "2519641682505608": 73, "3075403380393982": 73, "2798181238770485": 73, "26709021866321564": 73, "826875": 73, "870625": 73, "8875": 73, "883125": 73, "891875": 73, "888125": 73, "905": [73, 76], "905625": 73, "901875": 73, "39775496332886373": 73, "33771887778284704": 73, "321900939132939": 73, "3079229625774191": 73, "304149763301966": 73, "28249239723416086": 73, "2861261191044716": 73, "27356165798103554": 73, "2654648520686525": 73, "2697350280557541": 73, "25354846321204877": 73, "24612889034633942": 73, "23482802549892284": 73, "2389904112416379": 73, "23742155821875055": 73, "232423192127905": 73, "22337309338469455": 73, "2141852991932884": 73, "20677659985549907": 73, "19355326712607068": 73, "83625": 73, "8481666666666666": 73, "8530833333333333": 73, "8571666666666666": 73, "86775": 73, "8623333333333333": 73, "8711666666666666": 73, "8748333333333334": 73, "8685833333333334": 73, "8785": 73, "8804166666666666": 73, "8835833333333334": 73, "8840833333333333": 73, "88875": 73, "8919166666666667": 73, "8946666666666667": 73, "8960833333333333": 73, 
"9063333333333333": 73, "3430288594961166": 73, "4062050700187683": 73, "29745822548866274": 73, "27728439271450045": 73, "28092808067798614": 73, "2577864158153534": 73, "2651400637626648": 73, "25632822573184966": 73, "3082498562335968": 73, "2812121778726578": 73, "26345942318439486": 73, "2577408078312874": 73, "25757989794015884": 73, "26434457510709763": 73, "24917411386966706": 73, "27261342853307724": 73, "2445397639274597": 73, "26001051396131514": 73, "24147838801145555": 73, "2471102523803711": 73, "82875": 73, "795625": 73, "87375": 73, "865625": 73, "8825": 73, "87625": 73, "848125": 73, "87875": 73, "8675": 73, "8925": 73, "87125": 73, "895625": 73, "90375": 73, "4454924576777093": 73, "43416607585993217": 73, "42200265769311723": 73, "40520024616667566": 73, "41137005166804536": 73, "404100904280835": 73, "40118067664034823": 73, "40139733080534223": 73, "3797615355158106": 73, "3596332479030528": 73, "3600061919460905": 73, "3554147962242999": 73, "34480382890460337": 73, "3329520877054397": 73, "33164913056695716": 73, "31860941466181836": 73, "30702565340919696": 73, "30605297186907304": 73, "2953788426486736": 73, "2877389984403519": 73, "7788333333333334": 73, "7825": 73, "7854166666666667": 73, "7916666666666666": 73, "7885": 73, "7833333333333333": 73, "7923333333333333": 73, "79525": 73, "805": [73, 76], "81475": 73, "8161666666666667": 73, "8188333333333333": 73, "817": [73, 76], "8266666666666667": 73, "82225": 73, "8360833333333333": 73, "8456666666666667": 73, "8430833333333333": 73, "8491666666666666": 73, "8486666666666667": 73, "3507828885316849": 73, "3337512403726578": 73, "34320746660232543": 73, "3476085543632507": 73, "3326113569736481": 73, "33033264458179473": 73, "32014619171619413": 73, "3182142299413681": 73, "30076164126396177": 73, "3263852882385254": 73, "27597591280937195": 73, "29062016785144806": 73, "2765174686908722": 73, "269492534995079": 73, "2679423809051514": 73, "2691828978061676": 73, "2726386785507202": 73, 
"2541181230545044": 73, "2580208206176758": 73, "26315389811992645": 73, "839375": 73, "843125": 73, "823125": 73, "821875": 73, "81875": 73, "819375": 73, "8225": 73, "835625": 73, "865": [73, 76], "868125": 73, "855625": 73, "8975": 73, "885": [73, 76], "34561181647029326": 73, "2834314257699124": 73, "2583787844298368": 73, "23892096465730922": 73, "23207981773513428": 73, "20245029634617745": 73, "183908417583146": 73, "17489413774393975": 73, "17696723581707857": 73, "15615438255778652": 73, "14469048382833283": 73, "12424647461305907": 73, "11314761043189371": 73, "11249036608422373": 73, "10725672634199579": 73, "09081190969160896": 73, "0942245383271353": 73, "08525650047677312": 73, "06622548752583246": 73, "06039895973307021": 73, "8356666666666667": 73, "8675833333333334": 73, "88175": 73, "8933333333333333": 73, "8975833333333333": 73, "91175": 73, "91825": 73, "9249166666666667": 73, "9238333333333333": 73, "9305": 73, "9465833333333333": 73, "9539166666666666": 73, "9555": 73, "9615": 73, "9606666666666667": 73, "96275": 73, "9725": 73, "9764166666666667": 73, "31630186855792997": 73, "2702121251821518": 73, "2915778249502182": 73, "26050266206264494": 73, "27837209939956664": 73, "24276352763175965": 73, "3567117482423782": 73, "2752074319124222": 73, "2423130339384079": 73, "2565067422389984": 73, "28710135877132414": 73, "266545415520668": 73, "31818037331104276": 73, "28757534325122835": 73, "2777567034959793": 73, "2998969575762749": 73, "3292293107509613": 73, "30775387287139894": 73, "32681577146053314": 73, "44882203072309496": 73, "85375": 73, "879375": 73, "875625": 73, "86125": 73, "89625": 73, "895": [73, 76], "89125": 73, "880625": 73, "894375": 73, "35970850011452715": 73, "31336131549261986": 73, "2881505932421126": 73, "2732012960267194": 73, "26232245425753137": 73, "2490472443639598": 73, "24866499093935845": 73, "22930880945096624": 73, "21745950407645803": 73, "20700296882460725": 73, "197304340356842": 73, "20665066804182022": 73, 
"19864868348900308": 73, "184807124210799": 73, "1684703354703936": 73, "17377675851767369": 73, "16638460063791655": 73, "15944768343754906": 73, "14876513817208878": 73, "1388207479835825": 73, "83375": 73, "85175": 73, "86725": 73, "8719166666666667": 73, "8761666666666666": 73, "8865833333333333": 73, "88275": 73, "8956666666666667": 73, "8995833333333333": 73, "9034166666666666": 73, "90825": 73, "9043333333333333": 73, "9093333333333333": 73, "9145": 73, "9196666666666666": 73, "9216666666666666": 73, "9273333333333333": 73, "9299166666666666": 73, "93675": 73, "3166788029670715": 73, "28422485530376435": 73, "38055971562862395": 73, "2586472672224045": 73, "2588653892278671": 73, "27983254253864287": 73, "25693483114242555": 73, "26412731170654297": 73, "2733065390586853": 73, "24399636536836625": 73, "24481021404266357": 73, "2689305514097214": 73, "2527604129910469": 73, "24829535871744157": 73, "2654112687706947": 73, "23074268400669098": 73, "24625462979078294": 73, "26423920392990113": 73, "25540480852127073": 73, "25536185175180437": 73, "856875": 73, "86625": 73, "815": [73, 76], "88125": 73, "893125": 73, "3975753842040579": 73, "34884724409339274": 73, "3296900932142075": 73, "3150389680361494": 73, "31285368667003954": 73, "30415422033439293": 73, "29553352716438314": 73, "289314468094009": 73, "2806722329969102": 73, "2724469883486311": 73, "26634286379719035": 73, "2645016222241077": 73, "2619251853766594": 73, "2551752221473354": 73, "26411766035759704": 73, "24515971153023394": 73, "2390686312412962": 73, "23573122312255362": 73, "221005061562074": 73, "22358600648635246": 73, "8106666666666666": 73, "8286666666666667": 73, "8513333333333334": 73, "84975": 73, "8570833333333333": 73, "8624166666666667": 73, "8626666666666667": 73, "866": [73, 76], "8706666666666667": 73, "8738333333333334": 73, "8778333333333334": 73, "8798333333333334": 73, "8865": 73, "8898333333333334": 73, "8885833333333333": 73, "8991666666666667": 73, "8968333333333334": 
73, "3597823417186737": 73, "31115993797779085": 73, "29929635107517244": 73, "2986589139699936": 73, "2938830828666687": 73, "28118040919303894": 73, "2711684626340866": 73, "2844697123765945": 73, "26613601863384245": 73, "2783134698867798": 73, "2540236383676529": 73, "25821100890636445": 73, "2618845862150192": 73, "2554920208454132": 73, "26543013513088226": 73, "24074569433927537": 73, "26475649774074556": 73, "25578504264354707": 73, "2648500043153763": 73, "25700133621692656": 73, "825": [73, 76], "8375": 73, "85875": 73, "861875": 73, "886875": 73, "86375": 73, "88375": 73, "4584837538447786": 73, "4506375778545725": 73, "4378386567089152": 73, "4066803843734112": 73, "3897064097542712": 73, "3855383962868376": 73, "39160584618753574": 73, "3731403942120836": 73, "37915910170116324": 73, "36966170814443144": 73, "35735995298687445": 73, "35630573094525236": 73, "346426092167484": 73, "34040802899510303": 73, "32829743726773464": 73, "3284692421872565": 73, "3186114077713895": 73, "32295761503120685": 73, "3201326223764014": 73, "30581602454185486": 73, "7803333333333333": 73, "7709166666666667": 73, "7723333333333333": 73, "7850833333333334": 73, "7903333333333333": 73, "7986666666666666": 73, "8011666666666667": 73, "8068333333333333": 73, "8095833333333333": 73, "8226666666666667": 73, "8285": 73, "83125": 73, "8369166666666666": 73, "8395": 73, "8441666666666666": 73, "8393333333333334": 73, "8490833333333333": 73, "8546666666666667": 73, "43526833415031435": 73, "3598956459760666": 73, "3492005372047424": 73, "33501910269260404": 73, "31689528703689573": 73, "3113307124376297": 73, "32388085544109346": 73, "3084335786104202": 73, "3013568025827408": 73, "28992725372314454": 73, "28726822674274444": 73, "26945948660373686": 73, "276592333316803": 73, "27462401330471037": 73, "27574350595474245": 73, "2710308712720871": 73, "2702724140882492": 73, "27323003828525544": 73, "25551479041576386": 73, "26488787233829497": 73, "808125": 73, "81625": 73, 
"8325": 73, "846875": 73, "850625": 73, "838125": 73, "836875": 73, "858125": 73, "86875": 73, "3579516930783049": 73, "29596046564426826": 73, "2779693031247626": 73, "2563994538356015": 73, "24771526356802342": 73, "2324555875693864": 73, "2139121579362991": 73, "20474095547452886": 73, "19138856208387842": 73, "18883306279461434": 73, "1763652620757831": 73, "1698919345248253": 73, "16033914366221808": 73, "1557997044651432": 73, "1432509447467771": 73, "13817814606776896": 73, "12609625801919622": 73, "11830132696381275": 73, "11182412960903441": 73, "112559904720872": 73, "8314166666666667": 73, "8611666666666666": 73, "8736666666666667": 73, "8800833333333333": 73, "8944166666666666": 73, "9036666666666666": 73, "9090833333333334": 73, "9193333333333333": 73, "9161666666666667": 73, "92225": 73, "9255": 73, "93075": 73, "93225": 73, "9414166666666667": 73, "94375": 73, "9485833333333333": 73, "9535833333333333": 73, "9524166666666667": 73, "30677567660808563": 73, "32954772651195524": 73, "25747098088264464": 73, "2736126834154129": 73, "2561805549263954": 73, "23671718776226044": 73, "24553639352321624": 73, "2338863667845726": 73, "24586652517318724": 73, "23423030972480774": 73, "26579618513584136": 73, "2781539523601532": 73, "27084136098623274": 73, "23948652744293214": 73, "26023868829011915": 73, "2419952344894409": 73, "2511997854709625": 73, "23935708701610564": 73, "2701922015845776": 73, "27307246536016466": 73, "878125": 73, "896875": 73, "904375": 73, "906875": 73, "3712943946903056": 73, "3198322071594761": 73, "29978102302931725": 73, "295274139798068": 73, "2861913934032968": 73, "27165328782606635": 73, "25972246442069397": 73, "2543164194819141": 73, "24795781916126292": 73, "24630710007028378": 73, "23296909834793272": 73, "23382153587931015": 73, "2239028559799524": 73, "21443849290780564": 73, "2149274461367663": 73, "20642021417300752": 73, "19801520536396097": 73, "1978839404009124": 73, "19118623847657062": 73, "18144798041024107": 73, 
"8235833333333333": 73, "8538333333333333": 73, "86075": 73, "8664166666666666": 73, "8754166666666666": 73, "8799166666666667": 73, "8815833333333334": 73, "88725": 73, "8848333333333334": 73, "8936666666666667": 73, "8935": 73, "8995": 73, "9068333333333334": 73, "9098333333333334": 73, "9120833333333334": 73, "91375": 73, "9175833333333333": 73, "3184810388088226": 73, "2948088157176971": 73, "29438531696796416": 73, "27669853866100313": 73, "2634278678894043": 73, "25847582578659056": 73, "2500907778739929": 73, "2538330048322678": 73, "25127841770648957": 73, "2519759064912796": 73, "2455715072154999": 73, "2437664610147476": 73, "259639236330986": 73, "24515749186277389": 73, "2553828465938568": 73, "2324645048379898": 73, "24492083072662355": 73, "24482838332653045": 73, "23327024638652802": 73, "2520161652565002": 73, "855": [73, 76], "8525": 73, "40442772225496615": 73, "36662670541951": 73, "355034276367502": 73, "3396551510755052": 73, "3378269396563794": 73, "32084332002287214": 73, "31314464951766297": 73, "2982726935693558": 73, "2885229691387491": 73, "2888992782285873": 73, "2893476904706752": 73, "281817957996688": 73, "2771622718490185": 73, "2693793097550565": 73, "2617615883416952": 73, "2657115764995205": 73, "25631817549150043": 73, "24793559907281654": 73, "2538738044652533": 73, "23912971732305718": 73, "8093333333333333": 73, "82825": 73, "8341666666666666": 73, "84525": 73, "8515": 73, "8583333333333333": 73, "8688333333333333": 73, "8685": 73, "8689166666666667": 73, "8693333333333333": 73, "8766666666666667": 73, "8839166666666667": 73, "8866666666666667": 73, "8929166666666667": 73, "38392188608646394": 73, "3653419762849808": 73, "3050421380996704": 73, "30614266455173494": 73, "2937217426300049": 73, "30008585572242735": 73, "2794034606218338": 73, "27541795969009397": 73, "31378355383872986": 73, "2670704126358032": 73, "26745485186576845": 73, "2471194839477539": 73, "26509816259145735": 73, "25458798944950106": 73, 
"2481587851047516": 73, "25591064751148224": 73, "2596563971042633": 73, "2569611769914627": 73, "2435744071006775": 73, "2507249677181244": 73, "820625": 73, "860625": 73, "46106574311852455": 73, "4519433615372536": 73, "4446939624687459": 73, "4284856241751224": 73, "4527993325857406": 73, "4220876024758562": 73, "40969764266876463": 73, "39233948219012704": 73, "42498463344700793": 73, "3869199570506177": 73, "38021832910623954": 73, "3855376149270129": 73, "3721433773319772": 73, "3662295250340979": 73, "3629763710530514": 73, "358500304691335": 73, "3490118366131123": 73, "34879197790584665": 73, "33399240054348683": 73, "3347948451149971": 73, "7866666666666666": 73, "7865": 73, "79375": 73, "7755833333333333": 73, "79125": 73, "7973333333333333": 73, "8085833333333333": 73, "7913333333333333": 73, "8125833333333333": 73, "81675": 73, "8173333333333334": 73, "831": [73, 76], "8306666666666667": 73, "8353333333333334": 73, "8320833333333333": 73, "84375": 73, "8410833333333333": 73, "35159709095954894": 73, "3579048192501068": 73, "3501501774787903": 73, "33594816565513613": 73, "3741619431972504": 73, "34183687329292295": 73, "3353554099798203": 73, "32617265462875367": 73, "3640907108783722": 73, "33187183618545535": 73, "32401839792728426": 73, "30536725163459777": 73, "31303414940834046": 73, "2893040508031845": 73, "3063929396867752": 73, "2909839802980423": 73, "2858921372890472": 73, "2850045281648636": 73, "28049838364124297": 73, "2873564797639847": 73, "816875": 73, "793125": 73, "810625": 73, "8175": 73, "814375": 73, "828125": 73, "83875": 73, "818125": 73, "834375": 73, "37716902824158366": 73, "3260373148195287": 73, "3128290904012132": 73, "2998493126732238": 73, "29384377892030045": 73, "2759418967873492": 73, "26431119905665834": 73, "2577077782455277": 73, "25772295725789474": 73, "24954422610871335": 73, "24065862928933285": 73, "23703582263848882": 73, "23237684028262787": 73, "2200249534575863": 73, "22110319957929722": 73, 
"21804759631607126": 73, "21419822757548473": 73, "19927451733816812": 73, "19864692467641323": 73, "18966749441274938": 73, "8215833333333333": 73, "848": [73, 76], "8526666666666667": 73, "8585": 73, "8639166666666667": 73, "8716666666666667": 73, "8783333333333333": 73, "8849166666666667": 73, "88325": 73, "8918333333333334": 73, "896": [73, 76], "9010833333333333": 73, "8996666666666666": 73, "9016666666666666": 73, "902": [73, 76], "9105833333333333": 73, "9160833333333334": 73, "3255926352739334": 73, "3397491586208343": 73, "3148202610015869": 73, "30447013437747955": 73, "27427292466163633": 73, "2607581865787506": 73, "2583494257926941": 73, "24150457441806794": 73, "24839721441268922": 73, "24157819360494615": 73, "24594406485557557": 73, "2547012311220169": 73, "24132476687431337": 73, "2433958488702774": 73, "2358475297689438": 73, "24675665378570558": 73, "23343635857105255": 73, "22841362684965133": 73, "2247604575753212": 73, "24281086921691894": 73, "85125": 73, "853125": 73, "3795942336796446": 73, "33614943612446174": 73, "3235826115024851": 73, "3267444484728448": 73, "30353531146303137": 73, "29750882636042353": 73, "2964640334248543": 73, "28714796314214136": 73, "2744278162717819": 73, "27310871372514584": 73, "2624819800257683": 73, "2579742945889209": 73, "25963644726954876": 73, "25635017161356644": 73, "2501001837960583": 73, "24249463702769988": 73, "23696896695393196": 73, "23254455582417072": 73, "22419108628751117": 73, "22851746232110134": 73, "8204166666666667": 73, "8506666666666667": 73, "8635": 73, "87475": 73, "87925": 73, "8805833333333334": 73, "8845": 73, "88675": 73, "8908333333333334": 73, "8926666666666667": 73, "89525": 73, "8985": 73, "8955833333333333": 73, "3383863967657089": 73, "31120560944080355": 73, "32110977828502657": 73, "3080899566411972": 73, "2866462391614914": 73, "27701647162437437": 73, "29040718913078306": 73, "2702513742446899": 73, "2590403389930725": 73, "26199558019638064": 73, "26484714448451996": 
73, "2940529054403305": 73, "2654808533191681": 73, "25154681205749513": 73, "26637687146663663": 73, "24435366928577423": 73, "24174826145172118": 73, "2444209086894989": 73, "247626873254776": 73, "24192263156175614": 73, "8575": 73, "85625": 73, "41032169133107715": 73, "37122817583223605": 73, "35897897873470125": 73, "3438001747064768": 73, "33858899811797954": 73, "3389760729797343": 73, "32536247420184156": 73, "3152934226425404": 73, "30936657058748795": 73, "3078679118226183": 73, "30974164977669716": 73, "30031369174731537": 73, "29489042173991814": 73, "28921707251921613": 73, "28369594476324445": 73, "2849519875772456": 73, "27076949349584734": 73, "26930386248104116": 73, "26349931491657774": 73, "26431971300948176": 73, "8086666666666666": 73, "8284166666666667": 73, "8381666666666666": 73, "837": [73, 76], "8389166666666666": 73, "8488333333333333": 73, "8533333333333334": 73, "8551666666666666": 73, "8509166666666667": 73, "8628333333333333": 73, "86225": 73, "8715": 73, "8814166666666666": 73, "8835": 73, "3464747530221939": 73, "3193131250143051": 73, "3464068531990051": 73, "3129056388139725": 73, "3131117367744446": 73, "30689118325710296": 73, "2929005026817322": 73, "3131696957349777": 73, "302835636138916": 73, "27934255003929137": 73, "300513002872467": 73, "26962003886699676": 73, "2676294481754303": 73, "26430738389492037": 73, "2525753951072693": 73, "2508367341756821": 73, "25303518533706665": 73, "24774718701839446": 73, "24518848478794097": 73, "26084545016288757": 73, "849375": 73, "869375": 73, "863125": 73, "8725": 73, "4765880586619073": 73, "4503744399928032": 73, "4249279998401378": 73, "42333967214886176": 73, "4236916420941657": 73, "4269233151002133": 73, "4192506206479478": 73, "41413671872083174": 73, "41084911515738104": 73, "389948022413127": 73, "39566395788433706": 73, "3741930383951106": 73, "3794517093040842": 73, "3692300356131919": 73, "3640432547223061": 73, "3608953575504587": 73, "3419572095129084": 73, 
"34907091543712515": 73, "33601277535583113": 73, "3408893179544743": 73, "77625": 73, "7823333333333333": 73, "80075": 73, "7810833333333334": 73, "7928333333333333": 73, "7930833333333334": 73, "7951666666666667": 73, "8015833333333333": 73, "8000833333333334": 73, "8126666666666666": 73, "811": [73, 76], "81775": 73, "8236666666666667": 73, "8215": 73, "8305833333333333": 73, "8251666666666667": 73, "8299166666666666": 73, "836": [73, 76], "3674533206224442": 73, "36733597874641416": 73, "35894496202468873": 73, "3514183223247528": 73, "35345671892166136": 73, "36494161546230314": 73, "35217500329017637": 73, "3447349113225937": 73, "34697150766849516": 73, "36931039452552794": 73, "3350031852722168": 73, "3416145300865173": 73, "32389605045318604": 73, "3109715062379837": 73, "3322615468502045": 73, "327584428191185": 73, "31910278856754304": 73, "311815539598465": 73, "2950947880744934": 73, "2948034608364105": 73, "789375": 73, "81375": 73, "804375": 73, "80625": 73, "8125": 73, "84625": 73, "824375": 73, "825625": 73, "840625": 73, "8475": 73, "844375": 73, "400307985173582": 73, "2597426520640662": 73, "20706942731312025": 73, "17091670006251475": 73, "13984850759524653": 73, "11444453444522518": 73, "0929887340481538": 73, "07584588486117436": 73, "06030314570384176": 73, "04997897459031356": 73, "037156337104278056": 73, "02793900864590992": 73, "02030197833807442": 73, "01789472087045391": 73, "0175876492686666": 73, "019220354652448274": 73, "013543135874294319": 73, "006956856955481477": 73, "0024507183060002227": 73, "00206579088377317": 73, "8547833333333333": 73, "9049": 73, "9241666666666667": 73, "9360166666666667": 73, "94695": 73, "9658666666666667": 73, "9723166666666667": 73, "9780333333333333": 73, "9820166666666666": 73, "9868": 73, "9906666666666667": 73, "9936833333333334": 73, "9941333333333333": 73, "99405": 73, "9932833333333333": 73, "9960666666666667": 73, "9979666666666667": 73, "9996666666666667": 73, "9995666666666667": 73, 
"36797549843788147": 73, "2586278670430183": 73, "24208260095119477": 73, "24353929474949837": 73, "24164094921946525": 73, "2638056704550982": 73, "2579395814836025": 73, "27675500786304474": 73, "2851512663513422": 73, "30380481338500975": 73, "3235128371268511": 73, "3284085538983345": 73, "3443841063082218": 73, "41086878085136413": 73, "457796107493341": 73, "4356938077956438": 73, "4109785168170929": 73, "4433729724138975": 73, "4688420155197382": 73, "4773445381522179": 73, "908375": 73, "91475": 73, "915125": 73, "91525": 73, "91725": 73, "924875": 73, "91975": 73, "922375": 73, "92025": 73, "920375": 73, "9235": 73, "918125": 73, "918875": 73, "923625": 73, "92625": 73, "925": [73, 76], "4710115425463424": 73, "3166707545550647": 73, "25890692547440275": 73, "22350736999753187": 73, "19296910860009794": 73, "17304379170113154": 73, "15315235079105285": 73, "13728606270383925": 73, "12178339355929034": 73, "10961619754736898": 73, "10074329449495337": 73, "08793247367408294": 73, "07651288138686625": 73, "06934997136779089": 73, "06243234033510685": 73, "056774082654433795": 73, "05116950291028218": 73, "04961718403588313": 73, "04289388027836952": 73, "040430180404756245": 73, "8289666666666666": 73, "8851833333333333": 73, "9045166666666666": 73, "9167666666666666": 73, "9294166666666667": 73, "93545": 73, "94275": 73, "9486666666666667": 73, "95365": 73, "95855": 73, "9618833333333333": 73, "9667": 73, "9717666666666667": 73, "9745833333333334": 73, "9765833333333334": 73, "9793": 73, "9809833333333333": 73, "9820333333333333": 73, "9839166666666667": 73, "9849166666666667": 73, "3629846270084381": 73, "31240448981523516": 73, "24729759228229523": 73, "2697310926616192": 73, "24718070650100707": 73, "23403583562374114": 73, "2295891786813736": 73, "22117181441187858": 73, "2475375788807869": 73, "23771390727162361": 73, "2562992911040783": 73, "25533875498175623": 73, "27057862806320193": 73, "2820998176634312": 73, "29471745146811007": 73, 
"2795617451965809": 73, "3008101430237293": 73, "28815430629253386": 73, "31814645100384953": 73, "3106237706840038": 73, "874125": 73, "908875": 73, "9045": 73, "919375": 73, "9245": 73, "926": [73, 76], "925875": 73, "926375": 73, "925125": 73, "92525": 73, "924625": 73, "930875": 73, "926625": 73, "6091368444629316": 73, "40709905083309106": 73, "33330900164873106": 73, "29541655938063605": 73, "26824146830864043": 73, "24633059249535552": 73, "22803501166832219": 73, "21262132842689435": 73, "20038021789160745": 73, "18430457027680647": 73, "1744787511763288": 73, "165271017740149": 73, "15522625095554507": 73, "1432937567076608": 73, "13617747858651222": 73, "12876031456241158": 73, "12141566201230325": 73, "11405601029369686": 73, "11116664642408522": 73, "10308189516060992": 73, "7803833333333333": 73, "8559166666666667": 73, "8823": 73, "89505": 73, "9027333333333334": 73, "9099166666666667": 73, "9162333333333333": 73, "9224833333333333": 73, "9243166666666667": 73, "9321": 73, "9345833333333333": 73, "9375333333333333": 73, "9418833333333333": 73, "9456666666666667": 73, "9482333333333334": 73, "9513666666666667": 73, "9527333333333333": 73, "9559": 73, "9576166666666667": 73, "9611": 73, "36491659212112426": 73, "29200539910793305": 73, "2840233483910561": 73, "2591339669823646": 73, "24114771646261215": 73, "2436459481716156": 73, "2374294084906578": 73, "24284198743104934": 73, "22679156363010405": 73, "2229055170416832": 73, "21932773572206496": 73, "23045065227150918": 73, "23631879675388337": 73, "22048399156332016": 73, "2563135535418987": 73, "2494968646839261": 73, "24099056956171988": 73, "23974315640330315": 73, "24684958010911942": 73, "25887142738699914": 73, "8665": 73, "897": [73, 76], "907375": 73, "914125": 73, "9125": 73, "913875": 73, "911875": 73, "921125": 73, "922625": 73, "923375": 73, "924125": 73, "915625": 73, "926125": 73, "932625": 73, "927875": 73, "187068938827718": 73, "9080034740316842": 73, "6863665148329887": 73, 
"5706229420867301": 73, "5069490017921432": 73, "46316734996876485": 73, "42913920047885573": 73, "4107565824855874": 73, "3908677859061054": 73, "37283689377785745": 73, "3606657798388111": 73, "353545261082301": 73, "34009441143986": 73, "3239413740506559": 73, "3193119444620253": 73, "31045137204404577": 73, "3003838519091164": 73, "29092520530194615": 73, "28635713599447504": 73, "2760026559138349": 73, "5551333333333334": 73, "6467": 73, "7338666666666667": 73, "7841333333333333": 73, "8128": 73, "82845": 73, "8501666666666666": 73, "8580833333333333": 73, "8646166666666667": 73, "8667666666666667": 73, "8709833333333333": 73, "8766166666666667": 73, "8816666666666667": 73, "8812": 73, "88465": 73, "8898833333333334": 73, "8934666666666666": 73, "8940833333333333": 73, "8977666666666667": 73, "6463955206871033": 73, "5193838343620301": 73, "4155286856889725": 73, "3316091845035553": 73, "3148408111333847": 73, "29354524302482604": 73, "2875490103960037": 73, "26903486740589144": 73, "27737221759557723": 73, "262776792883873": 73, "25498255288600924": 73, "2390553195178509": 73, "24918611392378806": 73, "23830307483673097": 73, "23538302001357078": 73, "24996423116326333": 73, "2464654156267643": 73, "24081429636478424": 73, "23204647853970528": 73, "23771219885349273": 73, "763875": 73, "81925": 73, "8885": 73, "8895": 73, "904125": 73, "906125": 73, "908": [73, 76], "909375": 73, "916125": 73, "9175": 73, "91875": 73, "91425": 73, "915375": 73, "4140813298491654": 73, "27481235485118843": 73, "22397600941614174": 73, "1890777693286951": 73, "16538111197112848": 73, "1448796250478132": 73, "12440053254032313": 73, "10817898457734855": 73, "09634132136696025": 73, "08548538653410352": 73, "07339220296349257": 73, "06470446296305314": 73, "060030178171393875": 73, "053294485403614034": 73, "04429284706704323": 73, "04014099264770115": 73, "03974721442450951": 73, "03304463665041803": 73, "02955428938137994": 73, "026940144761875052": 73, "8496666666666667": 73, 
"ipod": 76, "606": 76, "iron": 76, "lantern": 76, "denim": 76, "609": 76, "jeep": 76, "landrov": 76, "jersei": 76, "tee": 76, "611": 76, "jigsaw": 76, "puzzl": 76, "612": 76, "jinrikisha": 76, "ricksha": 76, "rickshaw": 76, "joystick": 76, "614": 76, "kimono": 76, "615": 76, "knee": 76, "knot": 76, "617": 76, "618": 76, "ladl": 76, "lampshad": 76, "lamp": 76, "shade": 76, "620": 76, "laptop": 76, "621": 76, "mower": 76, "622": 76, "623": 76, "knife": 76, "paperknif": 76, "624": 76, "lifeboat": 76, "ignit": 76, "ignitor": 76, "limousin": 76, "limo": 76, "628": 76, "liner": [76, 97], "ocean": 76, "629": 76, "lipstick": 76, "lip": 76, "roug": 76, "loafer": 76, "631": 76, "lotion": 76, "632": 76, "loudspeak": 76, "speaker": 76, "633": 76, "loup": 76, "jewel": 76, "634": 76, "lumbermil": 76, "sawmil": 76, "635": 76, "compass": 76, "636": 76, "mailbag": 76, "postbag": 76, "637": 76, "mailbox": 76, "638": 76, "maillot": 76, "639": 76, "tank": 76, "manhol": 76, "641": 76, "maraca": 76, "642": 76, "marimba": 76, "xylophon": 76, "643": 76, "644": 76, "matchstick": 76, "645": 76, "maypol": 76, "maze": 76, "labyrinth": 76, "647": 76, "648": 76, "medicin": 76, "megalith": 76, "microphon": 76, "mike": 76, "651": 76, "microwav": 76, "652": 76, "militari": 76, "653": 76, "milk": 76, "654": 76, "minibu": [76, 80], "655": 76, "miniskirt": 76, "minivan": 76, "657": 76, "missil": 76, "658": 76, "mitten": 76, "659": 76, "bowl": 76, "manufactur": 76, "662": 76, "modem": 76, "663": 76, "monasteri": 76, "664": 76, "665": 76, "mope": 76, "mortar": 76, "mortarboard": 76, "668": 76, "mosqu": 76, "669": 76, "670": 76, "scooter": 76, "671": 76, "terrain": 76, "roader": 76, "672": 76, "tent": 76, "673": 76, "674": 76, "mousetrap": 76, "675": 76, "676": 76, "muzzl": 76, "677": 76, "nail": 76, "678": 76, "brace": 76, "679": 76, "necklac": 76, "680": 76, "nippl": 76, "681": 76, "682": 76, "obelisk": 76, "683": 76, "obo": 76, "hautboi": 76, "684": 76, "ocarina": 76, "sweet": 76, "potato": 76, 
"685": 76, "odomet": 76, "hodomet": 76, "mileomet": 76, "milomet": 76, "oil": 76, "687": 76, "pipe": [76, 82], "oscilloscop": 76, "scope": [76, 85], "cathod": 76, "cro": 76, "689": 76, "overskirt": 76, "690": 76, "oxcart": 76, "691": 76, "oxygen": 76, "692": 76, "packet": 76, "693": 76, "paddl": 76, "boat": 76, "694": 76, "paddlewheel": 76, "695": 76, "padlock": 76, "696": 76, "paintbrush": 76, "697": 76, "pajama": 76, "pyjama": 76, "pj": 76, "jammi": 76, "698": 76, "panpip": 76, "pandean": 76, "syrinx": 76, "701": 76, "parachut": 76, "chute": 76, "702": 76, "703": 76, "park": [76, 84], "bench": 76, "704": 76, "meter": 76, "705": 76, "passeng": 76, "706": 76, "patio": 76, "terrac": 76, "707": 76, "708": 76, "pedest": 76, "plinth": 76, "footstal": 76, "709": 76, "pencil": 76, "710": 76, "sharpen": 76, "711": 76, "perfum": 76, "essenc": 76, "712": 76, "petri": 76, "photocopi": 76, "714": 76, "plectrum": 76, "plectron": 76, "715": 76, "pickelhaub": 76, "716": 76, "picket": 76, "pale": 76, "717": 76, "pickup": 76, "718": 76, "pier": 76, "piggi": 76, "penni": 76, "720": 76, "pill": 76, "722": 76, "ping": 76, "723": 76, "pinwheel": 76, "724": 76, "pirat": 76, "725": 76, "pitcher": 76, "ewer": 76, "726": 76, "woodwork": 76, "727": 76, "planetarium": 76, "728": 76, "plastic": 76, "729": 76, "plate": 76, "rack": 76, "730": 76, "plow": 76, "plough": 76, "731": 76, "plunger": 76, "plumber": 76, "732": 76, "polaroid": 76, "camera": [76, 77], "733": 76, "pole": 76, "734": 76, "paddi": 76, "patrol": 76, "maria": 76, "735": 76, "poncho": 76, "736": 76, "billiard": 76, "snooker": 76, "737": 76, "soda": 76, "738": 76, "flowerpot": 76, "739": 76, "potter": 76, "740": 76, "741": 76, "prayer": 76, "rug": 76, "742": 76, "printer": 76, "743": 76, "prison": 76, "744": 76, "projectil": 76, "745": 76, "projector": [76, 89], "746": 76, "puck": 76, "hockei": 76, "747": 76, "punch": 76, "punchbal": 76, "purs": 76, "749": 76, "quill": 76, "quilt": 76, "751": 76, "racer": 76, "race": [76, 77], 
"752": 76, "racket": 76, "racquet": 76, "753": 76, "754": [76, 88], "radio": 76, "wireless": 76, "755": 76, "telescop": 76, "reflector": 76, "756": 76, "rain": [76, 87], "758": 76, "reel": 76, "759": 76, "reflex": 76, "760": 76, "refriger": 76, "icebox": 76, "761": 76, "762": 76, "restaur": 76, "eateri": 76, "763": 76, "revolv": 76, "shooter": 76, "764": 76, "765": 76, "rocker": 76, "766": 76, "rotisseri": 76, "767": 76, "rubber": 76, "eras": [76, 101], "rugbi": 76, "769": 76, "ruler": 76, "770": 76, "shoe": 76, "771": 76, "772": 76, "safeti": 76, "pin": 76, "773": 76, "saltshak": 76, "salt": 76, "774": 76, "775": 76, "sarong": 76, "776": 76, "saxophon": 76, "777": 76, "scabbard": 76, "778": 76, "weigh": 76, "779": 76, "bu": 76, "780": [76, 88], "schooner": 76, "scoreboard": 76, "782": 76, "crt": 76, "783": 76, "screwdriv": 76, "785": 76, "seat": 76, "belt": [76, 80], "seatbelt": 76, "786": 76, "787": 76, "shield": 76, "buckler": 76, "788": 76, "789": 76, "shoji": 76, "790": 76, "basket": 76, "791": 76, "792": 76, "shovel": 76, "793": 76, "shower": 76, "794": 76, "curtain": 76, "795": 76, "ski": 76, "796": 76, "797": 76, "798": 76, "slipstick": 76, "799": 76, "door": 76, "bandit": 76, "801": 76, "snorkel": 76, "802": 76, "snowmobil": 76, "803": 76, "snowplow": 76, "snowplough": 76, "soap": 76, "soccer": 76, "806": 76, "sock": 76, "807": 76, "solar": 76, "collector": 76, "furnac": 76, "808": 76, "sombrero": 76, "809": 76, "soup": 76, "810": 76, "heater": 76, "shuttl": 76, "813": 76, "spatula": 76, "speedboat": 76, "816": 76, "spindl": 76, "sport": [76, 87], "819": 76, "820": 76, "821": 76, "steel": 76, "822": 76, "823": 76, "stethoscop": 76, "824": 76, "stole": 76, "stone": 76, "wall": [76, 97], "826": 76, "stopwatch": 76, "827": 76, "stove": 76, "828": 76, "strainer": 76, "829": 76, "streetcar": 76, "tram": 76, "tramcar": 76, "trollei": 76, "830": 76, "stretcher": 76, "studio": 76, "couch": 76, "bed": [76, 85], "stupa": 76, "tope": 76, "submarin": 76, "pigboat": 
76, "834": 76, "cloth": 76, "835": 76, "sundial": 76, "sunglass": 76, "838": 76, "sunscreen": 76, "sunblock": 76, "blocker": 76, "suspens": 76, "840": 76, "swab": 76, "swob": 76, "mop": 76, "841": 76, "sweatshirt": 76, "842": 76, "trunk": 76, "843": 76, "swing": 76, "845": 76, "syring": 76, "846": 76, "armi": 76, "tape": 76, "849": 76, "teapot": 76, "850": 76, "teddi": 76, "851": 76, "televis": 76, "852": 76, "tenni": 76, "853": 76, "thatch": 76, "roof": 76, "854": 76, "thimbl": 76, "856": 76, "thresher": 76, "thrasher": 76, "thresh": 76, "857": 76, "throne": 76, "858": 76, "tile": 76, "toaster": 76, "860": 76, "tobacco": 76, "tobacconist": 76, "861": 76, "toilet": 76, "863": 76, "totem": 76, "864": 76, "tow": 76, "wrecker": 76, "toyshop": 76, "tractor": 76, "867": 76, "trailer": [76, 84], "lorri": 76, "868": 76, "trai": 76, "869": 76, "trench": 76, "870": 76, "tricycl": 76, "trike": 76, "velociped": 76, "871": 76, "trimaran": 76, "872": 76, "tripod": 76, "873": 76, "triumphal": 76, "874": 76, "trolleybu": 76, "trackless": 76, "trombon": 76, "vat": 76, "877": 76, "turnstil": 76, "typewrit": 76, "879": 76, "umbrella": 76, "880": 76, "unicycl": 76, "monocycl": 76, "881": 76, "upright": 76, "vacuum": 76, "cleaner": 76, "883": 76, "vase": 76, "884": 76, "vault": 76, "velvet": 76, "886": 76, "vend": 76, "887": 76, "vestment": 76, "viaduct": 76, "889": 76, "violin": 76, "fiddl": 76, "890": 76, "volleybal": 76, "891": 76, "waffl": 76, "892": 76, "893": 76, "wallet": 76, "billfold": 76, "notecas": 76, "pocketbook": 76, "wardrob": 76, "warplan": 76, "washbasin": 76, "handbasin": 76, "washbowl": 76, "lavabo": 76, "wash": 76, "basin": 76, "898": 76, "899": 76, "jug": 76, "tower": 76, "901": 76, "whiskei": 76, "whistl": 76, "903": 76, "wig": 76, "904": 76, "windsor": 76, "wine": 76, "wing": 76, "wok": 76, "910": 76, "spoon": 76, "wool": 76, "woolen": 76, "woollen": 76, "912": 76, "rail": 76, "virginia": 76, "913": 76, "wreck": 76, "914": 76, "yawl": 76, "yurt": 76, "comic": 
76, "918": 76, "crossword": 76, "919": 76, "street": 76, "920": 76, "stoplight": 76, "jacket": 76, "dust": 76, "guacamol": 76, "consomm": 76, "hotpot": 76, "trifl": 76, "cream": 76, "icecream": 76, "lolli": 76, "lollipop": 76, "popsicl": 76, "930": 76, "loaf": 76, "931": 76, "bagel": 76, "beigel": 76, "pretzel": 76, "933": 76, "cheeseburg": 76, "934": 76, "hotdog": 76, "935": [76, 94], "mash": 76, "936": [76, 94], "937": [76, 94], "broccoli": 76, "cauliflow": 76, "zucchini": 76, "courgett": 76, "940": [76, 94], "spaghetti": 76, "squash": 76, "941": [76, 94], "acorn": 76, "942": [76, 94], "butternut": 76, "943": 76, "cuke": 76, "artichok": 76, "globe": 76, "945": [76, 94], "pepper": 76, "946": 76, "cardoon": 76, "947": 76, "mushroom": 76, "948": 76, "granni": 76, "smith": 76, "949": 76, "strawberri": 76, "950": 76, "lemon": 76, "952": 76, "953": 76, "pineappl": 76, "anana": 76, "954": 76, "banana": 76, "955": 76, "jackfruit": 76, "jak": 76, "956": 76, "custard": 76, "appl": 76, "957": 76, "pomegran": 76, "958": 76, "hai": 76, "959": 76, "carbonara": 76, "960": 76, "chocol": 76, "sauc": 76, "syrup": 76, "961": 76, "dough": 76, "962": 76, "meatloaf": 76, "963": 76, "pizza": 76, "pie": [76, 87], "964": 76, "potpi": 76, "965": 76, "burrito": 76, "966": 76, "967": 76, "968": 76, "969": 76, "eggnog": 76, "alp": 76, "bubbl": 76, "972": 76, "reef": 76, "974": 76, "geyser": 76, "lakesid": 76, "lakeshor": 76, "976": 76, "promontori": 76, "headland": 76, "foreland": 76, "977": 76, "sandbar": 76, "978": 76, "seashor": 76, "coast": 76, "seacoast": 76, "979": 76, "vale": 76, "980": 76, "volcano": 76, "981": 76, "ballplay": 76, "982": 76, "groom": 76, "bridegroom": 76, "983": 76, "scuba": 76, "diver": 76, "rapese": 76, "985": 76, "986": 76, "slipper": 76, "cypripedium": 76, "calceolu": 76, "parviflorum": 76, "987": 76, "corn": 76, "988": 76, "989": 76, "rosehip": 76, "990": 76, "buckey": 76, "chestnut": 76, "conker": 76, "991": 76, "fungu": 76, "992": 76, "agar": 76, "gyromitra": 
76, "994": 76, "stinkhorn": 76, "carrion": 76, "earthstar": 76, "polyporu": 76, "frondosu": 76, "grifola": 76, "frondosa": 76, "bolet": 76, "capitulum": 76, "tissu": 76, "bathroom": 76, "dir_to_imagenet_index": 76, "n03888257": 76, "n03425413": 76, "n03394916": 76, "n03000684": 76, "n02102040": 76, "n03445777": 76, "n03417042": 76, "n03028079": 76, "n02979186": 76, "n01440764": 76, "dir_index_to_imagenet_label": 76, "ordered_dir": 76, "dir_index": 76, "dir_nam": 76, "val_transform": 76, "imagenette_v": 76, "imagenette_train": 76, "random_indic": 76, "imagenette_train_subset": 76, "imagenette_train_load": 76, "imagenette_val_load": 76, "dataset_length": 76, "loss_sum": 76, "total_1_correct": 76, "total_5_correct": 76, "bearpaw": 76, "cc9106d598ff1fe375cc030873ceacfea0499d77": 76, "topk": [76, 100, 102], "top_k_correct": 76, "top_1_correct": 76, "top_1_acc": 76, "top_5_acc": 76, "imagenette_train_loop": 76, "untrain": [76, 94], "imagenette_batch": 76, "top_1_accuraci": 76, "top_5_accuraci": 76, "resnet18_weight": 76, "resnet_opt": 76, "predict_top5": 76, "top5_prob": 76, "top5_nam": 76, "top5_idc": 76, "_use_the_resnet_model_exercis": 76, "moveaxi": [76, 80], "bonsai": 76, "svg": 76, "pok\u00e9mon_pikachu_art": 76, "data2": 76, "27min": 76, "_improving_efficiency_inception_and_resnext_video": 76, "xie": 76, "calculate_parameters_resnet": 76, "d_in": 76, "resnet_channel": 76, "d_out": 76, "resnet_paramet": 76, "calculate_parameters_resnext": 76, "resnext_channel": 76, "num_path": 76, "pathwai": 76, "resnext_paramet": 76, "descriptions_resnet": 76, "descriptions_resnext": 76, "cardin": 76, "lbox_resnet": 76, "lbox_resnext": 76, "rbox_resnet": 76, "rbox_resnext": 76, "ui_resnet": 76, "ui_resnet_label": 76, "1px": 76, "ui_resnext": 76, "ui_resnext_label": 76, "out_resnet": 76, "out_resnext": 76, "_resnet_vs_resnext_interactive_demo": 76, "_resnet_vs_resnext_discuss": 76, "biggest": 76, "23min": 76, "_improving_efficiency_mobilenet_video": 76, "convolution_math": 76, 
"filter_s": [76, 80], "conv_paramet": 76, "depthwise_conv_paramet": 76, "_calculation_of_parameters_exercis": 76, "_parameter_savings_discuss": 76, "24min": 76, "_transfer_learning_video": 76, "twice": [76, 80, 87], "pokemon": 76, "cis_522_data": [76, 77], "u4njm": 76, "small_pokemon_dataset": 76, "charmand": 76, "charmeleon": 76, "ivysaur": 76, "charizard": 76, "bulbasaur": 76, "blastois": 76, "squirtl": 76, "venusaur": 76, "wartortl": 76, "pokemon_dataset": 76, "image_count": [76, 77], "pokemon_test_set": 76, "pokemon_train_set": 76, "pokemon_train_load": 76, "pokemon_test_load": 76, "pretrained_acc": 76, "total_correct": 76, "num_correct": 76, "linreadout_acc": 76, "scratch_acc": 76, "_pretrained_resnet_vs_resnet_exercis": 76, "_training_only_the_classification_exercis": 76, "facial": 76, "_summary_and_outlook_video": 76, "21min": [76, 84], "_speedaccuracy_tradeoff_different_backbones_bonus_video": 76, "era": 76, "tradeoff": [76, 97], "t_start": 76, "top_1_acciraci": 76, "aux_logit": 76, "googlenet": 76, "model_tim": 76, "plot_acc_spe": 76, "ti": [76, 100], "create_model": 76, "weight_list": 76, "alexnet_weight": 76, "vgg19_weight": 76, "_accuracy_vs_training_speed_exercis": 76, "_finding_best_model_exercis": 76, "_speed_and_accuracy_correlation_exercis": 76, "facenet": 77, "w2d3_t2_bonu": 77, "facenet_pytorch": 77, "mtcnn": 77, "inceptionresnetv1": 77, "12min": 77, "2kyfb": 77, "_face_recognition_using_cnns_video": 77, "retrain": [77, 80], "bruce": 77, "lee": 77, "neil": 77, "harri": 77, "pam": 77, "grier": 77, "face_dataset": 77, "face_load": 77, "process_imag": 77, "model_tensor": 77, "display_tensor": 77, "img_crop": 77, "bruce_tensor": 77, "bruce_displai": 77, "neil_tensor": 77, "neil_displai": 77, "pam_tensor": 77, "pam_displai": 77, "tensor_to_displai": 77, "vggface2": 77, "9131": 77, "bruce_embed": 77, "neil_embed": 77, "pam_embed": 77, "_embedding_vectors_discuss": 77, "princip": [77, 80], "embedding_tensor": 77, "n_compon": [77, 81, 87], "pca_tensor": 
77, "categ": 77, "pc": [77, 80], "unlock": 77, "19min": 77, "casia": 77, "webfac": 77, "caucasian": 77, "crimin": 77, "justic": 77, "utkfac": 77, "women": 77, "imbalanc": 77, "_ethical_aspects_video": [77, 84], "richardvogg": 77, "face_sampl": 77, "36wyh": 77, "face_sample2": 77, "black_female_tensor": 77, "black_female_displai": 77, "_1_1_": 77, "white_female_tensor": 77, "white_female_displai": 77, "_1_0_": 77, "black_female_embed": 77, "white_female_embed": 77, "cdist": 77, "calculate_pairwise_dist": 77, "embedding_dimens": [77, 89], "femal": [77, 84, 88], "_face_similarity_discuss": 77, "_embeddings_discuss": 77, "lastli": 77, "men": 77, "fairfac": 77, "male": [77, 84, 88], "k\u00e4rkk\u00e4inen": 77, "joo": 77, "centroid": [77, 87], "complement": 77, "embedding_s": 77, "sum_sq": 77, "1x1": 77, "w2d4_bonuslectur": 79, "_geoffrey_hinton_video": 79, "upenn": 80, "instructor": 80, "libsixel": 80, "w2d4_t1": 80, "pylab": 80, "pytorch_pretrained_biggan": 80, "one_hot_from_nam": 80, "image_mo": 80, "image_batch": 80, "n_batch": 80, "covari": [80, 91], "m1": 80, "cov": [80, 81, 91], "m2": 80, "num_interp": 80, "kl_q_p": 80, "zs": 80, "kl": 80, "mu_p": 80, "sigma_p": 80, "log_q": 80, "log_p": 80, "mu_q": 80, "log_sig_q": 80, "log_p_x": 80, "mu_x": 80, "sig_x": 80, "squared_error": 80, "pca_encoder_decod": 80, "svd_lowrank": 80, "w_encod": 80, "w_decod": 80, "pca_encod": 80, "pca_decod": 80, "cout": 80, "unnecessarili": 80, "dilat": [80, 94], "in_depth": 80, "in_height": 80, "in_width": 80, "out_depth": 80, "out_height": 80, "out_width": 80, "plot_gen_samples_ppca": 80, "therm1": 80, "therm2": 80, "therm_data_sim": 80, "thermomet": 80, "them2": 80, "plot_linear_a": 80, "lin_loss": 80, "plot_conv_a": 80, "conv_loss": 80, "ae": 80, "plot_imag": 80, "plt_titl": 80, "plot_torch_imag": 80, "plot_phi": 80, "entropu": 80, "inter": 80, "28318": 80, "rsampl": 80, "lie": [80, 88], "im_plt": 80, "nltk_data": [80, 84, 85], "omw": 80, "ekjxi": 80, "kuwep": 80, "corpora": [80, 85, 
87], "_generative_modeling_video": 80, "biggan_model": 80, "from_pretrain": [80, 82, 84, 85, 87, 88], "3yvhw": 80, "biggan_deep_256": 80, "sneak": 80, "peek": 80, "truncat": [80, 81, 84, 85, 88], "ins": [80, 84], "z_magnitud": 80, "truncnorm": 80, "truncated_noise_sampl": 80, "dim_z": 80, "randomst": 80, "sample_from_biggan": 80, "instabl": 80, "clone": [80, 94, 100, 102], "z_slider": 80, "440px": 80, "category_dropdown": 80, "realist": 80, "_generated_images_discuss": 80, "interpolate_biggan": 80, "category_a": 80, "category_b": 80, "z_magnitude_a": 80, "z_magnitude_b": 80, "interpolate_and_shap": 80, "interp": 80, "unit_vector": 80, "z_a": 80, "z_b": 80, "z_interp": 80, "y_interp": 80, "output_grid": 80, "z_a_slid": 80, "z_b_slider": 80, "magntud": 80, "category_a_dropdown": 80, "category_b_dropdown": 80, "_biggan_interpolation_interactive_demo": 80, "_samples_from_the_same_category_discuss": 80, "_latent_variable_models_video": 80, "generate_data": 80, "mean_of_temp": 80, "cov_of_temp": 80, "temparatur": 80, "kx1": 80, "kxk": 80, "psudo": 80, "multivariate_norm": [80, 81], "sqrt2": 80, "pc_ax": 80, "therm_data": 80, "therm_data_mean": 80, "therm_data_cent": 80, "outer": 80, "therm_data_zero_cent": 80, "pc_project": 80, "pc_axes_vari": 80, "sensor_noise_std": 80, "sensor_noise_var": 80, "gen_from_ppca": 80, "noise_var": 80, "data_mean": 80, "pc_varianc": 80, "epsilon_cov": 80, "sim_mean": 80, "rand_ep": 80, "_coding_ppca_exercis": 80, "_autoencoders_video": 80, "jbpme": 80, "mnist_val": 80, "cifar10_v": 80, "dataset_nam": 80, "get_data": 80, "my_dataset": 80, "my_dataset_nam": 80, "my_dataset_shap": 80, "my_dataset_s": 80, "my_valset": 80, "data_shap": 80, "data_s": 80, "valid_set": 80, "longrightarrow": 80, "2_2": 80, "plenti": 80, "linearautoencod": 80, "x_dim": 80, "my_dataset_dim": 80, "h_dim": 80, "train_autoencod": 80, "mse_loss": 80, "pin_memori": 80, "im_batch": 80, "enc_lin": 80, "dec_lin": 80, "x_prime": 80, "flat_x": 80, "lin_a": 80, 
"_linear_autoencoder_exercis": 80, "n_plot": 80, "h_pca": 80, "recon_pca": 80, "nimag": 80, "1000x450": 80, "_pca_vs_linearautoencod": 80, "Such": 80, "biaslay": 80, "grain": 80, "requisit": 80, "init_bia": 80, "tour": [80, 87], "deconvolut": [80, 94], "ubiquit": 80, "schemat": 80, "dummy_imag": 80, "dummy_conv": 80, "dummy_deconv": 80, "n_filter": 80, "enc_bia": 80, "enc_conv_1": 80, "conv_1_shap": 80, "enc_conv_2": 80, "conv_2_shap": 80, "enc_flatten": 80, "flat_after_conv": 80, "undo": 80, "ing": 80, "unflatten": 80, "dec_unflatten": 80, "unflattened_s": 80, "dec_deconv_1": 80, "dec_deconv_2": 80, "dec_bia": 80, "trained_conv_a": 80, "lin_recon": 80, "nonlin_recon": 80, "nonlin": 80, "_nonlinear_autoencoder_exercis": 80, "_variational_autoencoder_video": 80, "ambiti": 80, "k_vae": 80, "convva": 80, "num_filt": 80, "filter_reduct": 80, "shape_after_conv": 80, "flat_size_after_conv": 80, "q_bia": 80, "q_conv_1": 80, "q_conv_2": 80, "q_flatten": 80, "q_fc_phi": 80, "p_fc_upsampl": 80, "p_unflatten": 80, "p_deconv_1": 80, "p_deconv_2": 80, "p_bia": 80, "log_sig_x": 80, "flat_": 80, "mu_z": 80, "elbo": [80, 81], "expected_z": 80, "kplus1": 80, "train_va": 80, "elbo_v": 80, "trained_conv_vara": 80, "sigma_x": 80, "keyboardinterrupt": [80, 94], "_tensor": 80, "retain_graph": 80, "create_graph": 80, "has_torch_function_unari": 80, "handle_torch_funct": 80, "grad_tensor": 80, "grad_vari": 80, "_execution_engin": 80, "run_backward": 80, "grad_tensors_": 80, "allow_unreach": 80, "accumulate_grad": 80, "overset": 80, "q_": 80, "w_e": 80, "parametar": 80, "p_": [80, 82], "partli": 80, "prime": 80, "generate_imag": 80, "n_imag": 80, "_generating_images_exercis": 80, "_autoencoders_vs_variational_autoencoders_discuss": 80, "_sota_vaes_and_wrapup_video": 80, "binxu": [81, 82], "dongrui": [81, 82], "deng": [81, 82], "dora": [81, 82, 88, 97], "zhiyu": [81, 82, 88, 97], "adrita": [81, 82, 88, 97], "w2d4_t2": 81, "mline": 81, "plotting_z": 81, "kdeplot": 81, "pnt": 81, "titlestr": 
81, "figh": 81, "hacki": 81, "stackoverflow": [81, 89], "73739704": 81, "14392829": 81, "_get_lin": 81, "prop_cycl": 81, "quiver_plot": 81, "vec": 81, "gmm_pdf_contour_plot": 81, "gmm": 81, "logprob": [81, 84], "dstack": 81, "visualize_diffusion_distr": 81, "x_traj_rev": 81, "leftt": 81, "rightt": 81, "explabel": 81, "x_t": [81, 82], "interchang": 81, "markov": 81, "sde": [81, 82], "synopsi": 81, "_intro_and_principles_video": 81, "_math_behind_diffusion_video": 81, "vpsde": 81, "wiener": 81, "z_t": 81, "_0": 81, "_t": 81, "sigma_t": [81, 82], "p_t": 81, "p_0": [81, 82], "int_": [81, 82], "undergo": 81, "diffusion_1d_forward": 81, "samplen": 81, "bimod": 81, "cumsum": [81, 87], "scatter1": 81, "scatter2": 81, "set_offset": 81, "to_jshtml": 81, "_visualizing_diffusion_interactive_demo": 81, "gaussianmixtur": 81, "signifi": [81, 100, 102], "prec": 81, "norm_weight": 81, "add_compon": 81, "pdf_decompos": 81, "component_pdf": 81, "nabla_x": 81, "weighted_compon_pdf": 81, "gradvec": 81, "score_decompos": 81, "gradvec_list": 81, "rand_compon": 81, "all_sampl": 81, "gmm_samp": 81, "mu1": 81, "cov1": 81, "mu2": 81, "cov2": 81, "show_sampl": 81, "gmm_sampl": 81, "gmm_samps_few": 81, "scorevecs_few": 81, "gauss": 81, "mode1": 81, "mode2": 81, "silenc": [81, 82], "_what_does_score_tell_us_discuss": 81, "equip": 81, "nabla_": 81, "recoveri": 81, "sigma_t_squar": 81, "diffuse_gmm": 81, "teleport": 81, "sigma_t_2": 81, "noise_cov": 81, "covs_dif": 81, "reverse_diffusion_sde_sampling_gmm": 81, "sampn": 81, "nstep": 81, "gausian": 81, "sigmat2": 81, "xt": 81, "eps_z": 81, "transport": 81, "gmm_t": 81, "score_xt": 81, "2500": [81, 85], "x0_rev": 81, "_score_enables_reversal_of_diffusion_exercis": 81, "dsm": [81, 82], "j_": 81, "e_": [81, 82], "tild": 81, "s_": [81, 82, 100], "esm": 81, "2_t": 81, "gamma_t": [81, 82], "1dt": [81, 82], "emphas": [81, 82], "sigma_ts_": 81, "rapidli": [81, 88], "scare": 81, "disguis": 81, "absorb": 81, "alpha_t": 81, "2303": 81, "00848": 81, "2206": 
81, "00364": 81, "2106": 81, "05527": 81, "_denoising_objective_discuss": 81, "sigma_t_fun": 81, "toler": [81, 82, 97], "random_t": [81, 82], "perturbed_x": [81, 82], "sigma_t_test": 81, "score_analyt_test": [81, 82], "_implementing_denoising_score_matching_objective_exercis": 81, "gaussianfourierproject": [81, 82], "embed_dim": [81, 82], "t_proj": 81, "scoremodel_tim": 81, "t_emb": 81, "induct": [81, 82], "sample_x_and_score_t_depend": 81, "trainn": 81, "partit": 81, "trainn_part": 81, "x_train_col": 81, "y_train_col": 81, "t_train_col": 81, "gmm_dif": 81, "x_train_tsr": 81, "y_train_tsr": 81, "t_train_tsr": 81, "test_dsm_object": 81, "x_train_samp": 81, "y_train_samp": 81, "t_train_samp": 81, "x_test_samp": 81, "y_test_samp": 81, "t_test_samp": 81, "score_model_td": 81, "sigma_t_f": 81, "5k": [81, 85], "y_pred_train": 81, "mse_train": 81, "y_pred_test": 81, "mse_test": 81, "stats_df": 81, "dsm_loss": 81, "reverse_diffusion_sde_sampl": 81, "tvec": 81, "x_traj_rev_appr_denoi": 81, "x_traj_rev_exact": 81, "x_samp": 81, "bravo": 81, "bigg": 81, "ordinari": 81, "brian": 81, "anderson": 81, "1986": 81, "song": 81, "practis": [81, 88], "pascal": 81, "vincent": 81, "2011": 81, "w2d4_t3": 82, "functool": 82, "multiplicativelr": 82, "lambdalr": 82, "takeawai": 82, "highwai": 82, "_network_architecture_video": 82, "marginal_prob_std": 82, "diffusion_coeff": 82, "int_0": 82, "tg": 82, "2t": 82, "0t": 82, "diff_coeff": 82, "_train_diffusion_for_mnist_exercis": 82, "readout": 82, "freq": 82, "x_proj": 82, "repr": 82, "time_emb": 82, "t_mod1": 82, "gnorm1": 82, "groupnorm": 82, "num_channel": [82, 100, 102], "t_mod2": 82, "gnorm2": 82, "t_mod3": 82, "gnorm3": 82, "t_mod4": 82, "gnorm4": 82, "tconv4": 82, "t_mod5": 82, "tgnorm4": 82, "tconv3": 82, "t_mod6": 82, "tgnorm3": 82, "tconv2": 82, "t_mod7": 82, "tgnorm2": 82, "tconv1": 82, "swish": 82, "h3": 82, "4th": [82, 88, 100], "h4": 82, "_unet_architecture_discuss": 82, "irregular": 82, "_2": [82, 91], "marginal_prob_std_test": 
82, "_defining_the_loss_function_exercis": 82, "suffic": 82, "marginal_prob_std_fn": 82, "diffusion_coeff_fn": 82, "score_model": 82, "10e": 82, "lr_lambda": 82, "tqdm_epoch": 82, "num_item": [82, 94], "get_last_lr": 82, "yann": 82, "lecun": [82, 91], "exdb": 82, "idx3": 82, "ubyt": 82, "idx1": 82, "t10k": 82, "euler_maruyama_sampl": 82, "x_shape": 82, "maruyama": 82, "init_x": 82, "step_siz": 82, "batch_time_step": 82, "mean_x": 82, "save_samples_uncond": 82, "sample_batch_s": 82, "sample_np": 82, "imsav": 82, "uncondition_diffus": 82, "uncond_score_model": 82, "filenotfounderror": [82, 100, 102], "_open_file_lik": 82, "_is_zipfil": 82, "1002": 82, "1003": 82, "orig_posit": 82, "name_or_buff": 82, "_is_path": 82, "_open_fil": 82, "errno": 82, "advis": 82, "effortless": 82, "_conditional_diffusion_model_video": 82, "uncondit": 82, "_advanced_techinque_stable_diffusion_video": 82, "potent": 82, "stablediffusionpipelin": 82, "dpmsolvermultistepschedul": 82, "pndmschedul": 82, "model_id": [82, 88], "stabilityai": 82, "torch_dtyp": 82, "float16": 82, "pndm": 82, "from_config": 82, "dpm": 82, "loos": 82, "dessert": 82, "gogh": 82, "ballerina": 82, "danc": 82, "starri": 82, "monet": 82, "num_inference_step": 82, "_stable_diffusion_interactive_demo": 82, "babi": 82, "recursive_print": 82, "text_encod": 82, "named_children": 82, "modulelist": 82, "unet2dconditionmodel": 82, "conv_in": 82, "time_proj": 82, "time_embed": 82, "timestepembed": 82, "linear_1": 82, "1280": 82, "silu": 82, "linear_2": 82, "down_block": 82, "crossattndownblock2d": 82, "downblock2d": 82, "up_block": 82, "upblock2d": 82, "crossattnupblock2d": 82, "mid_block": 82, "unetmidblock2dcrossattn": 82, "conv_norm_out": 82, "conv_act": 82, "conv_out": 82, "cliptextmodel": 82, "text_model": 82, "cliptexttransform": 82, "cliptextembed": 82, "token_embed": [82, 84], "49408": 82, "position_embed": 82, "clipencod": 82, "clipencoderlay": 82, "final_layer_norm": 82, "layernorm": [82, 84], "elementwise_affin": 82, 
"_architecture_of_stable_diffusion_model_discuss": 82, "_ethical_consideration_video": 82, "artist": 82, "prompter": 82, "deserv": 82, "credit": 82, "_copyrights_discuss": 82, "misinform": 82, "unet_condit": 82, "text_dim": 82, "nclass": 82, "cond_emb": 82, "y_mod2": 82, "y_mod3": 82, "y_mod4": 82, "y_mod5": 82, "y_mod6": 82, "y_mod7": 82, "constant_": 82, "y_emb": 82, "loss_fn_cond": 82, "initil": 82, "score_model_cond": 82, "lr_current": 82, "ckpt_cond": 82, "empty_cach": 82, "bikram": [84, 85], "khastgir": [84, 85], "rajaswa": [84, 85], "patil": [84, 85], "egor": [84, 85], "zverev": [84, 85], "alish": [84, 85, 87, 88, 89], "dipani": [84, 85, 87, 88, 89], "ezekiel": [84, 85], "william": [84, 85], "hadi": [84, 85, 94], "vafaei": [84, 85, 94], "pytorch_pretrained_bert": 84, "gensim": [84, 85, 87, 88], "w2d5_t1": 84, "ta_cache_dir": [84, 85], "pprint": [84, 85], "abc": [84, 85], "abstractmethod": [84, 85], "word2vec": [84, 85, 87, 88], "manifold": [84, 85, 87], "tsne": [84, 85, 87], "vocab": [84, 85, 87, 88], "autotoken": [84, 85, 88], "berttoken": 84, "bertformaskedlm": 84, "modulenotfounderror": [84, 85], "get_ipython": [84, 85], "run_line_mag": [84, 85], "zqw5": 84, "brown_wordlist": [84, 87], "w2vmodel": [84, 87], "editori": [84, 87], "fiction": [84, 87], "govern": [84, 87], "mysteri": [84, 87, 88], "religion": [84, 87], "romanc": [84, 87], "science_fict": [84, 87], "create_word2vec_model": [84, 87], "sg": [84, 87], "min_count": [84, 87], "vector_s": [84, 87], "model_dictionari": [84, 87], "wv": [84, 87], "get_embed": [84, 87], "keyerror": 84, "check_word_in_corpu": 84, "word_embed": [84, 87], "layer1_s": 84, "embed_list": 84, "f_x": 84, "ambienc": 84, "aggreg": 84, "dataset_dict": 84, "datasetdict": 84, "sentiment": [84, 87], "cach": 84, "set_format": 84, "input_id": [84, 85, 88], "hf_datasets_cach": [84, 85], "kthjg": [84, 85], "load_dataset": [84, 85, 87, 88], "yelp_review_ful": [84, 85], "download_mod": [84, 85], "reuse_dataset_if_exist": [84, 85], 
"cache_dir": [84, 85], "ignore_verif": [84, 85], "charg": 84, "variant": 84, "mlm": [84, 88], "pred_text": 84, "actual_label": 84, "batch1": 84, "transform_sentence_for_bert": 84, "masked_word": 84, "___": 84, "parse_text_and_word": 84, "raw_lin": 84, "option1": 84, "optionn": 84, "mask_index": 84, "get_probabilities_of_masked_word": 84, "uncas": 84, "words_idx": 84, "convert_tokens_to_id": 84, "tokenized_text": 84, "indexed_token": 84, "masked_index": 84, "tokens_tensor": 84, "pretrained_masked_model": 84, "predicted_index": 84, "ix": 84, "suffer": [84, 101], "attend": 84, "_application_of_attention_discuss": 84, "40min": [84, 94], "_queries_keys_and_values_video": 84, "ambigi": 84, "get_value_attent": 84, "query_embed": 84, "query_similar_word": 84, "similar_by_word": 84, "key_embed": 84, "unscal": 84, "scaled_attent": 84, "softmax_attent": 84, "nscale": 84, "value_similar_word": 84, "similar_by_vector": 84, "monei": 84, "random_word": [84, 87], "_intution_behind_attention_interactive_demo": 84, "_does_this_model_perform_well_discuss": 84, "dotproductattent": 84, "calculate_scor": 84, "databas": 84, "bmm": 84, "softmax_weight": 84, "dot_product_attent": 84, "_dot_product_attention_exercis": 84, "_multihead_attention_video": 84, "semant": [84, 101], "to_kei": 84, "to_queri": 84, "to_valu": 84, "selfattent": 84, "unify_head": 84, "unifi": 84, "multiheadattent": 84, "_q_k_v_attention_exercis": 84, "_transformer_overview_i_video": 84, "transformerblock": 84, "norm1": 84, "norm2": 84, "norm_1": 84, "norm_2": 84, "1607": 84, "06450": 84, "_transformer_encoder_exercis": 84, "_transformer_overview_ii_video": 84, "autoregress": 84, "explanatori": 84, "vaswani": 84, "_complexity_of_decoding_discuss": 84, "_positional_encoding_video": 84, "concern": 84, "sinusoid": 84, "pe_": 84, "2i": 84, "d_": 84, "pe": 84, "bonu": [84, 87], "familiaris": 84, "transformer_tutori": 84, "emb_siz": 84, "inject": 84, "div_term": 84, "wavelength": 84, "2\u03c0": 84, "pepo": 84, 
"register_buff": 84, "gehr": 84, "alammar": 84, "phillip": 84, "lipp": 84, "num_token": 84, "pos_enc": 84, "transformer_block": 84, "classification_head": 84, "sequence_avg": 84, "_transformer_architecture_for_classification_exercis": 84, "n_iter": [84, 87], "l2_penalti": 84, "l1_penalti": 84, "opim": 84, "n_neuron": 84, "n_test": 84, "placehold": [84, 85], "cf": 84, "appendix": 84, "iter_train_loss": 84, "iter_loss_test": 84, "test_batch": 84, "out_test": 84, "loss_test": [84, 87], "pred_batch": 84, "predicted_label28": 84, "11min": 84, "keen": 84, "favor": 84, "racial": 84, "gender": 84, "crow": 84, "gather": 84, "wouldn": [84, 85], "astrophysicist": 84, "gross": 84, "socioeconom": 84, "statu": 84, "alcohol": 84, "mansion": 84, "favour": 84, "u2019t": 84, "attract": 84, "masked_text": 84, "_find_biases_in_the_model_interactive_demo": 84, "creatur": 84, "ago": [84, 87], "prei": [84, 88], "jungl": 84, "compsognathu": 84, "_problems_of_this_approach_discuss": 84, "protbert": 84, "_biases_of_using_these_models_in_other_fields_discuss": 84, "5min": 84, "vit": 84, "dall": 84, "parti": 84, "nerf": 84, "wav2vec": 84, "generalist": 84, "gato": 84, "demand": 84, "w2d5_t2_bonu": 85, "trainer": [85, 88], "trainingargu": [85, 88], "automodelforcausallm": 85, "automodelforsequenceclassif": 85, "_pretraining_video": 85, "sentiment_dict": 85, "clean_text": 85, "backslash": 85, "sample_review_from_yelp": 85, "use_custom_review": 85, "custom_review": 85, "gpt2": [85, 88], "xlnet": 85, "extension_prompt": 85, "num_output_respons": 85, "input_text": 85, "generated_respons": 85, "num_return_sequ": [85, 88], "THE": 85, "generated_text": [85, 88], "custom_positive_extens": 85, "custom_negative_extens": 85, "positive_input_id": 85, "positive_attention_mask": 85, "attention_mask": 85, "positive_label_id": 85, "positive_extension_likelihood": 85, "nlog": 85, "negative_input_id": 85, "negative_attention_mask": 85, "negative_label_id": 85, "negative_extension_likelihood": 85, "nposit": 85, 
"nneg": 85, "_finetuning_video": 85, "billion": [85, 87], "bert": [85, 101], "tokenize_funct": 85, "tokenis": 85, "tokenized_dataset": 85, "10k": 85, "train_lay": 85, "num_label": 85, "pooler": 85, "frozen": [85, 94], "training_arg": [85, 88], "output_dir": [85, 88], "yelp_bert": 85, "overwrite_output_dir": 85, "evaluation_strategi": 85, "per_device_train_batch_s": [85, 88], "per_device_eval_batch_s": 85, "num_train_epoch": 85, "fp16": 85, "save_step": 85, "logging_step": 85, "report_to": 85, "compute_metr": [85, 88], "eval_pr": [85, 88], "load_metr": 85, "eval_dataset": 85, "22min": 85, "_robustness_video": 85, "deceiv": 85, "persuad": 85, "impart": 85, "_load_an_original_review_interactive_demo": 85, "wordswapqwerti": 85, "wordswapextend": 85, "wordswapcontract": 85, "wordswaphomoglyphswap": 85, "compositetransform": 85, "wordswaprandomcharacterdelet": 85, "wordswapneighboringcharacterswap": 85, "wordswaprandomcharacterinsert": 85, "wordswaprandomcharactersubstitut": 85, "flair": 85, "ordereddict": 85, "wordswap": 85, "_get_replacement_word": 85, "default_class_repr": 85, "extra_repr_kei": 85, "extra_param": 85, "extra_str": 85, "__dict__": 85, "label_color": 85, "pink": 85, "adversari": 85, "current_text": 85, "pre_transformation_constraint": 85, "indices_to_modifi": 85, "shifted_idx": 85, "_get_transform": 85, "attackedtext": 85, "pretransformationconstraint": 85, "dictat": 85, "searchmethod": 85, "transformed_text": 85, "convert_from_original_idx": 85, "attack_attr": 85, "last_transform": 85, "indicies_to_modifi": 85, "letters_to_insert": 85, "char": 85, "ascii_lett": 85, "_get_random_lett": 85, "lowercas": [85, 88], "uppercas": 85, "word_to_replac": 85, "replacement_word": 85, "transformed_texts_idx": 85, "replace_word_at_index": 85, "qwerti": 85, "replic": [85, 89, 91], "random_on": 85, "skip_first_char": 85, "skip_last_char": 85, "wordswapqwert": 85, "fabul": 85, "_keyboard_adjac": 85, "_get_adjac": 85, "adjacent_kei": 85, "s_lower": 85, "isupp": 85, 
"candidate_word": 85, "start_idx": 85, "end_idx": 85, "randrang": 85, "swap_kei": 85, "extension_map": 85, "ain": 85, "hadn": 85, "hasn": [85, 88], "madam": 85, "mightn": 85, "mustn": 85, "needn": 85, "oughtn": 85, "ought": 85, "shan": 85, "wasn": 85, "weren": 85, "expend": 85, "contract": 85, "reverse_contraction_map": 85, "word_idx": 85, "next_idx": 85, "next_word": 85, "delete_word_at_index": 85, "homoglyph": 85, "graphem": 85, "glyph": 85, "substr": [85, 88], "homo": 85, "\u09ed": 85, "\u0223": 85, "\ud835\udfd5": 85, "\u0431": 85, "\u01bd": 85, "\uab9e": 85, "\u0292": 85, "\u14bf": 85, "\u0251": 85, "\u044c": 85, "\u03f2": 85, "\u0501": 85, "\u0435": 85, "\ud835\ude8f": 85, "\u0261": 85, "\u0570": 85, "\u0456": 85, "\u03f3": 85, "\ud835\udc8c": 85, "\u217c": 85, "\uff4d": 85, "\u0578": 85, "\u043e": 85, "\u0440": 85, "\u051b": 85, "\u2c85": 85, "\u0455": 85, "\ud835\ude9d": 85, "\u057d": 85, "\u0475": 85, "\u051d": 85, "\u0443": 85, "\u1d22": 85, "repl_lett": 85, "disregard": 85, "optoin": 85, "new_attacked_text": 85, "main_str": 85, "transformation_lin": 85, "add_ind": 85, "stopword": 85, "_get_modifiable_indic": 85, "check_compat": 85, "wordembeddingdist": 85, "words_from_text": 85, "words_to_ignor": 85, "alphanumer": 85, "legitim": 85, "isalnum": 85, "apostroph": 85, "hyphen": 85, "asterisk": 85, "_flair_pos_tagg": 85, "flair_tag": 85, "tag_typ": 85, "upo": 85, "tagger": 85, "sequencetagg": 85, "zip_flair_result": 85, "split_token": 85, "previous_attacked_text": 85, "text_input": 85, "_text_input": 85, "_word": 85, "_words_per_input": 85, "_pos_tag": 85, "_ner_tag": 85, "setdefault": 85, "original_index_map": 85, "num_word": 85, "modified_indic": 85, "__eq__": 85, "__hash__": 85, "hash": 85, "free_memori": 85, "text_window_around_index": 85, "window_s": 85, "half_siz": 85, "text_idx_start": 85, "_text_index_of_word_index": 85, "text_idx_end": 85, "pos_of_word_index": 85, "desired_word_idx": 85, "use_token": 85, "flair_word_list": 85, "flair_pos_list": 85, 
"word_idx_in_flair_tag": 85, "ner_of_word_index": 85, "ner": 85, "flair_ner_list": 85, "look_after_index": 85, "pre_word": 85, "lower_text": 85, "text_until_word_index": 85, "text_after_word_index": 85, "first_word_diff": 85, "other_attacked_text": 85, "first_word_diff_index": 85, "all_words_diff": 85, "ith_word_diff": 85, "words_diff_num": 85, "generate_token": 85, "words_to_token": 85, "edit_dist": 85, "w1_t": 85, "w2_t": 85, "cal_dif": 85, "replace_words_at_indic": 85, "new_word": 85, "generate_new_attacked_text": 85, "insert_text_after_word_index": 85, "word_at_index": 85, "new_text": 85, "insert_text_before_word_index": 85, "capit": 85, "get_deletion_indic": 85, "punctuat": [85, 88], "preturb": 85, "perturbed_text": 85, "original_text": 85, "new_attack_attr": 85, "newly_modified_indic": 85, "new_i": 85, "input_word": 85, "adv_word_seq": 85, "word_start": 85, "word_end": 85, "adv_word": 85, "adv_num_word": 85, "num_words_diff": 85, "shifted_modified_indic": 85, "modified_idx": 85, "original_modification_idx": 85, "new_idx_map": 85, "preced": 85, "reform": 85, "perturbed_input_text": 85, "perturbed_input": 85, "words_diff_ratio": 85, "align_with_model_token": 85, "model_wrapp": 85, "subword": [85, 87, 88], "ding": 85, "modelwrapp": 85, "word2token_map": 85, "tokenizer_input": 85, "strip_prefix": 85, "last_match": 85, "matched_token": 85, "input_tupl": 85, "column_label": 85, "words_per_input": 85, "_input": 85, "printable_text": 85, "key_color": 85, "key_color_method": 85, "entail": 85, "ck": [85, 88], "color_text": 85, "pct_words_to_swap": 85, "transformations_per_exampl": 85, "_filter_transform": 85, "compare_against_origin": 85, "call_mani": 85, "attacked_text": 85, "all_transformed_text": 85, "num_words_to_swap": 85, "words_swap": 85, "augment_mani": 85, "text_list": [85, 87], "show_progress": [85, 88], "augment_text_with_id": 85, "id_list": 85, "supplement": 85, "all_text_list": 85, "all_id_list": 85, "_id": 85, "augmented_text": 85, "constraints_lin": 85, 
"constraints_str": 85, "importerror": [85, 87], "_arrow": 85, "noqa": [85, 87], "e402": 85, "dictconfig": 85, "clusteringmodel": 85, "entity_linker_model": 85, "spanclassifi": 85, "language_model": 85, "languagemodel": 85, "_iter_dataset": 85, "documentembed": 85, "scalarmix": 85, "documentcnnembed": 85, "documentlmembed": 85, "documentpoolembed": 85, "documentrnnembed": 85, "documenttfidfembed": 85, "sentencetransformerdocumentembed": 85, "transformerdocumentembed": 85, "convtransformnetworkimageembed": 85, "identityimageembed": 85, "precomputedimageembed": 85, "load_embed": 85, "register_embed": 85, "flairembed": 85, "stackedembed": 85, "tokenembed": 85, "transformerembed": 85, "transformeronnxdocumentembed": 85, "lockeddropout": 85, "worddropout": 85, "matutil": [85, 87], "f401": [85, 87], "reload": [85, 87, 94], "namespac": [85, 87], "indexedcorpu": [85, 87], "mmcorpu": [85, 87], "bleicorpu": [85, 87], "corpusabc": [85, 87], "saveload": [85, 87], "get_blas_func": [85, 87], "triu": [85, 87], "lapack": [85, 87], "get_lapack_func": [85, 87], "psi": [85, 87], "word_swap_contract": 85, "word_swap_extend": 85, "word_swap_homoglyph_swap": 85, "word_swap_neighboring_character_swap": 85, "word_swap_qwerti": 85, "word_swap_random_character_delet": 85, "word_swap_random_character_insert": 85, "word_swap_random_character_substitut": 85, "augmented_review": 85, "getpredict": 85, "return_tensor": [85, 88], "naugment": 85, "_textattack_module_interactive_demo": 85, "anushre": [87, 89], "hede": [87, 89], "pooja": [87, 89], "consul": [87, 89], "katrin": [87, 89], "reuel": [87, 89], "levenshtein": [87, 89], "portalock": 87, "w3d1_t1": 87, "facebookresearch": [87, 89], "vkuz7": [87, 89], "word_token": 87, "pad_sequ": 87, "imdb": 87, "ag_new": 87, "build_vocab_from_iter": 87, "to_map_style_dataset": 87, "suppress": 87, "simplefilt": 87, "punkt": 87, "download_file_from_google_dr": 87, "uc": 87, "export": 87, "get_confirm_token": 87, "save_response_cont": 87, "cooki": 87, 
"download_warn": 87, "chunk_siz": 87, "32768": 87, "iter_cont": 87, "aliv": 87, "_time_series_and_nlp_video": 87, "_what_is_nlp_video": 87, "_nlp_tokenization_video": 87, "linguist": 87, "prune": 87, "uninterest": 87, "typo": 87, "bewteen": 87, "freedom": 87, "random_word_embed": 87, "voter": 87, "god": 87, "administr": 87, "get_cluster_embed": 87, "embedding_clust": 87, "word_clust": 87, "closest": 87, "similar_word": 87, "most_similar": 87, "topn": 87, "cluser": 87, "tsne_model_en_2d": 87, "perplex": 87, "3500": 87, "embeddings_en_2d": 87, "tsne_plot_similar_word": 87, "rainbow": 87, "markers": 87, "xytext": 87, "textcoord": 87, "bbox_inch": 87, "farther": 87, "_similarity_discuss": 87, "_embeddings_rule_video": 87, "_distributional_similarity_and_vector_embeddings_video": 87, "morphem": 87, "oblivi": 87, "2frqg": [87, 89], "ft_en_vector": [87, 89], "get_word_vector": [87, 89], "nembed": 87, "04045481": 87, "10617249": 87, "27222311": 87, "06879666": 87, "16408321": 87, "00276707": 87, "27080125": 87, "05805573": 87, "31865698": 87, "03748008": 87, "00254088": 87, "13805169": 87, "00182498": 87, "08973497": 87, "00319015": 87, "19619396": 87, "09858181": 87, "10103802": 87, "08279888": 87, "0082208": 87, "13119364": 87, "15956607": 87, "17203182": 87, "0315701": 87, "25064597": 87, "06182072": 87, "03929246": 87, "05157393": 87, "03543638": 87, "13660161": 87, "05473648": 87, "06072914": 87, "04709269": 87, "17394426": 87, "02101276": 87, "11402624": 87, "24489872": 87, "08576579": 87, "00322696": 87, "04509873": 87, "00614253": 87, "05772085": 87, "073414": 87, "06718913": 87, "06057961": 87, "10963406": 87, "1245006": 87, "04819863": 87, "11408057": 87, "11081408": 87, "06752145": 87, "01689911": 87, "01186301": 87, "11716368": 87, "01287614": 87, "10639337": 87, "04243141": 87, "01057278": 87, "0230855": 87, "04930984": 87, "04717607": 87, "03696446": 87, "0015999": 87, "02193867": 87, "01331578": 87, "11102925": 87, "1686794": 87, "05814958": 87, "00296521": 
87, "04252011": 87, "00352389": 87, "06267346": 87, "07747819": 87, "08959802": 87, "02445797": 87, "08913022": 87, "13422231": 87, "1258949": 87, "01296814": 87, "0531218": 87, "00541025": 87, "16908626": 87, "06323182": 87, "11510128": 87, "08352032": 87, "07224389": 87, "01023453": 87, "08263734": 87, "03859017": 87, "00798539": 87, "01498295": 87, "05448429": 87, "02708506": 87, "00549948": 87, "14634523": 87, "12550676": 87, "04641578": 87, "10164826": 87, "05370862": 87, "01217492": 87, "get_nearest_neighbor": 87, "8168574571609497": 87, "princ": 87, "796097457408905": 87, "emperor": 87, "7907207608222961": 87, "7655220627784729": 87, "lord": 87, "7435404062271118": 87, "7394551634788513": 87, "chieftain": 87, "7307553291320801": 87, "tyrant": 87, "7226710319519043": 87, "conqueror": 87, "719561755657196": 87, "kingli": 87, "718187689781189": 87, "queen": 87, "cosine_similar": [87, 89, 94], "vec_a": [87, 89], "vec_b": [87, 89], "getsimilar": [87, 89], "word1": [87, 89], "word2": [87, 89], "knight": 87, "twenti": 87, "nsimilar": 87, "ascend": 87, "descend": 87, "victori": 87, "defeat": 87, "7181877493858337": 87, "6881008744239807": 87, "2892838716506958": 87, "19655467569828033": 87, "833964467048645": 87, "8707448840141296": 87, "7478055953979492": 87, "8461978435516357": 87, "595384955406189": 87, "word_similar": 87, "5649225115776062": 87, "pronunci": 87, "4072215259075165": 87, "5812374353408813": 87, "_check_similarity_between_words_interactive_demo": 87, "context_word_1": 87, "context_word_2": 87, "word_similarity_1": 87, "word_similarity_2": 87, "7297980785369873": 87, "340322345495224": 87, "woman": 87, "_____": 87, "germani": [87, 88], "berlin": 87, "franc": 87, "petal": 87, "get_analog": 87, "funnction": 87, "positv": 87, "______": 87, "frannc": 87, "8162637948989868": 87, "8568049669265747": 87, "7037209272384644": 87, "flower": 87, "poverti": 87, "wealth": 87, "615874171257019": 87, "afflict": 87, "5437814593315125": 87, 
"_explore_homonyms_interactive_demo": 87, "_using_embeddings_video": 87, "cheap": 87, "attach": 87, "neuralnet": [87, 100, 102], "embedding_length": 87, "voabulari": 87, "embeddingbag": 87, "embedding_fasttext": 87, "requiresgrad": 87, "initrang": 87, "uniform_": 87, "_simple_feed_forward_net_exercis": 87, "train_it": 87, "valid_it": 87, "test_it": 87, "emb_vector": 87, "plot_train_v": 87, "total_acc": 87, "total_count": 87, "elaps": 87, "valid_split": 87, "yield_token": 87, "data_it": 87, "set_default_index": 87, "text_pipelin": 87, "label_pipelin": 87, "collate_batch": 87, "_label": 87, "_text": 87, "processed_text": 87, "split_train_": 87, "split_valid_": 87, "collate_fn": 87, "valid_dataload": 87, "steplr": 87, "total_accu": 87, "epoch_start_tim": 87, "accu_train": 87, "loss_train": 87, "accu_v": 87, "loss_val": 87, "accu_test": 87, "training_accuraci": 87, "validation_accuraci": 87, "ag_news_label": 87, "sci": 87, "tec": 87, "ex_text_str": 87, "memphi": 87, "tenn": 87, "jon": 87, "rahm": 87, "endur": 87, "season": 87, "weather": 87, "sundai": 87, "royal": 87, "portrush": 87, "thursdai": 87, "wgc": 87, "fedex": 87, "jude": 87, "spaniard": 87, "flawless": 87, "pga": 87, "impress": [87, 101], "nine": 87, "tpc": 87, "southwind": 87, "multilingu": 87, "jordan": 88, "matelski": 88, "weizh": 88, "yuan": 88, "dalia": 88, "nasr": 88, "stephen": 88, "kiilu": 88, "konstantin": 88, "tsafatino": 88, "comprehens": 88, "influenti": 88, "pytorch_lightn": 88, "typing_extens": 88, "w3d1_t2": 88, "regex": 88, "_intro_to_nlps_and_llms_video": 88, "_nlp_pipeline_video": 88, "march": 88, "grade": 88, "exchang": 88, "hf": 88, "wikitext": 88, "41492": 88, "wolv": 88, "howl": 88, "assembl": 88, "den": 88, "storm": 88, "unfamiliar": 88, "territori": 88, "km2": 88, "sq": 88, "indistinguish": 88, "voic": 88, "octav": 88, "bass": 88, "stress": 88, "nasal": 88, "bariton": 88, "pup": 88, "yearl": 88, "yelp": 88, "harmon": 88, "overton": 88, "smoothli": 88, "mate": 88, "kill": 88, "cry": 88, 
"bark": 88, "choru": 88, "lone": 88, "protract": 88, "melodi": 88, "north": [88, 100], "louder": 88, "stronger": 88, "syllabl": 88, "mutual": 88, "biologist": 88, "generate_n_exampl": 88, "protocol": 88, "ecosystem": 88, "reinvent": 88, "richer": 88, "workflow": 88, "embedd": 88, "shelf": 88, "splitter": 88, "12_000": 88, "wordpiec": 88, "workpiec": 88, "subchunk": 88, "diacrit": 88, "accent": 88, "whitespac": 88, "stripacc": 88, "whitespacesplit": 88, "individual_digit": 88, "bpe": 88, "ee": 88, "tokenizer_train": 88, "wordpiecetrain": 88, "special_token": 88, "sample_ratio": 88, "dataset_smal": 88, "train_from_iter": 88, "hello": [88, 89], "toastersock": 88, "groommpi": 88, "hell": 88, "9140": 88, "2264": 88, "4375": 88, "aster": 88, "omm": 88, "downstream": [88, 94], "_is_it_a_good_idea_to_do_pre_tokenizers_discuss": 88, "_tokenizer_good_practices_discuss": 88, "unicod": 88, "\u997f": 88, "_chinese_and_english_tokenizer_discuss": 88, "_bert_video": 88, "_nlg_video": 88, "codeparrot": 88, "offroad": 88, "_tokenizers_discuss": 88, "7gb": 88, "500mb": 88, "automodelwithlmhead": 88, "generation_pipelin": 88, "input_prompt": 88, "simple_add": 88, "input_token_id": 88, "input_str": 88, "convert_ids_to_token": 88, "\u0121simpl": 88, "3486": 88, "\u0121int": 88, "1109": 88, "\u0121b": 88, "\u0121": 88, "1035": 88, "\u010b\u0121\u0121\u0121": 88, "\u0121add": 88, "15747": 88, "\u0121two": 88, "2877": 88, "\u0121number": 88, "5579": 88, "\u0121togeth": 88, "10451": 88, "\u0121and": 88, "\u0121return": 88, "2529": 88, "\u0121the": 88, "\u0121result": 88, "weirdli": 88, "copilot": 88, "wilder": 88, "simple_sub": 88, "simple_mul": 88, "simpleadd": 88, "simpleadder2": 88, "__": 88, "yike": 88, "ew": 88, "hobbyist": 88, "java": 88, "devolv": 88, "resembl": [88, 101], "diagnos": 88, "nonexpert": 88, "_using_sota_models_discuss": 88, "alright": 88, "_some_": 88, "isc": 88, "apach": 88, "repo_nam": 88, "overwhelmingli": 88, "lightn": 88, "serializ": 88, "bunch": 88, "collat": 88, 
"datacollatorforlanguagemodel": 88, "encoded_dataset": 88, "remove_column": 88, "data_col": 88, "ellipsi": 88, "adder_typ": 88, "adder_nam": 88, "adder_type_nam": 88, "adder_name_nam": 88, "jam": 88, "imperfect": 88, "_finetune_the_model_exercis": 88, "_accuracy_metric_observations_discuss": 88, "_conclusion_video": [88, 94], "payload": 88, "bearer": 88, "api_url": 88, "hf_": 88, "chatgpt": [88, 101], "gpt3": 88, "gptbing": 88, "gpt4": 88, "latter": [88, 91], "musk": 88, "biographi": 88, "uk": 88, "japan": 88, "capita": 88, "_play_around_with_llms_act": 88, "tutor": 88, "exam": 88, "_what_models_video": [88, 94], "w3d1_t3_bonu": 89, "tradition": [89, 97], "sum_i": [89, 100], "wy_i": 89, "deeptext": 89, "unchang": 89, "rqadk": 89, "neither": [89, 97], "nor": [89, 97], "bonjour": 89, "7028388977050781": 89, "20523205399513245": 89, "chatt": 89, "chat": 89, "013087842613458633": 89, "02490561455488205": 89, "6003134250640869": 89, "en_word": 89, "fr_word": 89, "bilingual_dictionari": 89, "make_training_matric": 89, "learn_transform": 89, "svd": 89, "source_dictionari": 89, "target_dictionari": 89, "source_matrix": 89, "target_matrix": 89, "21030391": 89, "atleast_1d": 89, "expand_dim": 89, "normalize_vector": 89, "dictionary_length": 89, "bilingu": 89, "source_training_matrix": 89, "target_training_matrix": 89, "5818601846694946": 89, "43272727727890015": 89, "6866631507873535": 89, "6003133654594421": 89, "_multilingual_embeddings_bonus_act": 89, "central": [91, 97], "w3d2_t1": 91, "_intro_to_dl_thinking_2_video": 91, "_getting_more_vignette_video": 91, "bui": 91, "costli": 91, "shear": 91, "bright": 91, "1000x": 91, "recogniz": 91, "_getting_more_data_discuss": 91, "_getting_more_data_wrapup_video": 91, "balestriero": 91, "bottou": 91, "2204": 91, "03632": 91, "_classbased_strategies_bonus_discuss": 91, "_detecting_tumors_vignette_video": 91, "_detecting_tumors_setup_video": 91, "hospit": 91, "scan": 91, "coher": 91, "chop": 91, "former": 91, 
"_detecting_tumors_discuss": 91, "_detecting_tumors_wrapup_video": 91, "tschandl": 91, "rinner": 91, "apalla": 91, "skin": 91, "nat": 91, "med": 91, "1234": [91, 94], "s41591": 91, "020": 91, "0942": 91, "_brains_on_forrest_gump_vignette_video": 91, "_brains_on_forrest_gump_setup_video": 91, "_1": 91, "pearson": 91, "rho": 91, "_brains_on_forrest_gump_discuss": 91, "_brains_on_forrest_gump_wrapup_video": 91, "arora": 91, "bilm": 91, "livescu": 91, "canon": [91, 97, 100, 102], "proceed": 91, "30th": 91, "pmlr": 91, "1247": 91, "mlr": 91, "v28": 91, "andrew13": 91, "_wrapup_of_dl_thinking_video": 91, "cca": 91, "w3d3_bonuslectur": 93, "_melanie_mitchell_video": 93, "arna": 94, "ghosh": 94, "colleen": 94, "gillon": 94, "atnafu": 94, "lambebo": 94, "colleenjg": 94, "neuromatch_ssl_tutori": 94, "importlib": 94, "repo_path": [94, 100, 102], "download_str": [94, 100, 102], "redownload": 94, "zipurl": [94, 100, 102], "smqvg": 94, "zipresp": [94, 100, 102], "w3d3_t1": 94, "plot_util": 94, "runner": 94, "w3d3_unsupervisedandselfsupervisedlearn": 94, "xkcd": 94, "plot_rsm_histogram": 94, "min_val": 94, "nanmin": 94, "max_val": 94, "nanmax": 94, "test_custom_torch_rsm_fct": 94, "custom_torch_rsm_fct": 94, "f_name": 94, "rand_feat": 94, "rsm_custom": 94, "rsm_ground_truth": 94, "calculate_torch_rsm": 94, "equal_nan": 94, "test_custom_contrastive_loss_fct": 94, "custom_simclr_contrastive_loss": 94, "rand_proj_feat1": 94, "rand_proj_feat2": 94, "loss_custom": 94, "loss_ground_truth": 94, "contrastive_loss": 94, "dspritesdataset": 94, "dsprites_subset": 94, "dsprites_torchdataset": 94, "dspritestorchdataset": 94, "target_lat": 94, "train_sampl": 94, "test_sampl": 94, "train_test_split_idx": 94, "fraction_train": 94, "randst": 94, "supervised_encod": 94, "load_encod": 94, "model_typ": 94, "random_encod": 94, "vae_encod": 94, "invariance_transform": 94, "randomaffin": 94, "dsprites_invariance_torchdataset": 94, "simclr_transform": 94, "simclr_encod": 94, 
"_why_do_representations_matter_video": 94, "openli": 94, "oval": 94, "heart": 94, "show_imag": 94, "posx": 94, "posi": 94, "compris": 94, "1x84": 94, "feat_encoder_schemat": 94, "1200": 94, "seed_process": 94, "train_test_splix_idx": 94, "16000": 94, "train_classifi": 94, "freeze_featur": 94, "substructur": 94, "encodercor": 94, "train_supervised_encod": 94, "_logistic_regression_classifier_exercis": 94, "_supervised_learning_and_invariance_video": 94, "nbr": [94, 97], "pairwis": 94, "_function_that_calculates_rsms_exercis": 94, "whichev": 94, "sorting_lat": 94, "plot_model_rsm": 94, "rsm_fct": 94, "encoder_rsm": 94, "encoder_lat": 94, "all_lat": 94, "all_featur": 94, "_supervised_network_encoder_rsm_interactive_demo": 94, "rsms_supervised_encoder_10ep_bs1000_seed2021": 94, "_what_patterns_do_the_rsms_reveal_discuss": 94, "_random_representations_video": 94, "trivial": 94, "plot_rsm": 94, "_plotting_a_random_network_encoder_exercis": 94, "rsms_random_encoder_0ep_bs0_seed2021": 94, "_trained_vs_random_encoder_discuss": 94, "ahead": 94, "random_loss_arrai": 94, "_evaluating_the_classification_performance_exercis": 94, "_random_projections_with_dsprites_discuss": 94, "_generative_models_video": 94, "absenc": 94, "kullback": 94, "leibler": 94, "kld": 94, "load_decod": 94, "vae_decod": 94, "load_vae_decod": 94, "vae_encoder_300ep_bs500_seed2021": 94, "vae_decoder_300ep_bs500_seed2021": 94, "plot_vae_reconstruct": 94, "_pretrained_vae_interactive_demo": 94, "overcom": [94, 101], "_vae_on_the_reconstruction_task_discuss": 94, "_vae_encoder_rsms_interactive_demo": 94, "rsms_vae_encoder_300ep_bs500_seed2021": 94, "_construct_a_meaningful_representation_space_discuss": 94, "kept": 94, "vae_train_loss": 94, "vae_loss_arrai": 94, "_evaluate_performance_using_pretrained_vae_exercis": 94, "_modern_approach_in_selfsupervised_learning_video": 94, "predetermin": 94, "_image_transformations_interactive_demo": 94, "_data_transformations_video": 94, "proj_feat1": 94, "feat_siz": 94, 
"proj_feat2": 94, "similarity_matrix": 94, "pos_sample_ind": 94, "neg_sample_ind": 94, "denomin": 94, "relax": 94, "z1": 94, "z2": 94, "proj_featur": 94, "2n": 94, "_simclr_loss_function_exercis": 94, "test_simclr_loss_arrai": 94, "train_simclr": 94, "loss_fct": 94, "neg_pair": 94, "total_loss": 94, "num_tot": 94, "z_aug1": 94, "z_aug2": 94, "simclr_encoder_60ep_bs1000_deg90_trans0": 94, "2_scale0": 94, "8to1": 94, "2_seed2021": 94, "dsprites_torch": 94, "dsprites_invariance_torch": 94, "simclr_loss_arrai": 94, "_evaluate_performance_using_pretrained_simclr_interactive_demo": 94, "_un_self_supervised_learning_video": 94, "train_sampler_bias": 94, "significantli": 94, "6x": 94, "train_sampler_bias_ctrl": 94, "bias_typ": 94, "shape_posx_spac": 94, "test_sampler_for_bias": 94, "compens": 94, "train_bia": 94, "test_sampler_for_bias_ctrl": 94, "5808": 94, "posx_quadr": 94, "full_training_procedur": 94, "dataset_typ": 94, "funtion": 94, "bias_ctrl": 94, "encoder_label": 94, "num_clf_epoch": 94, "ntrain": 94, "train_encoder_clfs_by_fraction_label": 94, "subset_se": 94, "vae_encoder_bias_ctrl_450ep_bs500_seed2021": 94, "simclr_encoder_bias_ctrl_150ep_bs1000_deg90_trans0": 94, "labelled_fract": 94, "plot_accuraci": 94, "1232": 94, "1233": 94, "train_clfs_by_fraction_label": 94, "1235": 94, "1236": 94, "1237": 94, "1238": 94, "1239": 94, "1240": 94, "plot_chanc": 94, "1241": 94, "1242": 94, "1244": 94, "1245": 94, "1064": 94, "1062": 94, "fresh": 94, "1063": 94, "orig_encod": 94, "1065": 94, "1066": 94, "num_epochs_use_al": 94, "1067": 94, "fraction_of_label": 94, "1068": 94, "1069": 94, "progress_bar": 94, "1070": 94, "1072": 94, "1073": 94, "classification_optim": 94, "get_featur": 94, "feats_extr": 94, "feature_extractor": 94, "feats_flat": 94, "feats_proj": 94, "linear_project": 94, "1511": 94, "_wrapped_call_impl": 94, "1509": 94, "_compiled_call_impl": 94, "misc": 94, "1510": 94, "_call_impl": 94, "1520": 94, "1516": 94, "1517": 94, "_backward_hook": 94, 
"_backward_pre_hook": 94, "_forward_hook": 94, "_forward_pre_hook": 94, "1518": 94, "_global_backward_pre_hook": 94, "_global_backward_hook": 94, "1519": 94, "_global_forward_hook": 94, "_global_forward_pre_hook": 94, "forward_cal": 94, "1522": 94, "_conv_forward": 94, "padding_mod": 94, "_reversed_padding_repeated_twic": 94, "_pair": 94, "notabl": 94, "weaker": 94, "mitig": 94, "_biased_training_dataset_discuss": 94, "_general_principles_discuss": 94, "_invariant_representations_bonus_video": 94, "_simclr_network_encoder_rsms_bonus_interactive_demo": 94, "rsms_simclr_encoder_60ep_bs1000_deg90_trans0": 94, "_contrastive_models_bonus_discuss": 94, "_avoiding_representational_collapse_bonus_video": 94, "hood": 94, "simclr_encoder_neg_pair": 94, "rsms_and_histogram_plot": 94, "simclr_rsm": 94, "simclr_neg_pairs_rsm": 94, "random_rsm": 94, "calc": 94, "_visualizing_the_network_encoder_rsms_bonus_exercis": 94, "nuse": 94, "simclr_neg_pairs_loss_arrai": 94, "rsms_simclr_encoder_2neg_60ep_bs1000_deg90_trans0": 94, "_negative_pairs_in_computing_the_contrastive_loss_bonus_discuss": 94, "_simclr_network_encoder_pretrained_with_only_a_few_negative_pairs_bonus_interactive_demo": 94, "_fewshot_supervised_learning_bonus_video": 94, "thoroughli": 94, "unlabel": 94, "new_supervised_encod": 94, "nwith": 94, "_use_a_fraction_of_the_labelled_dataset_bonus_interactive_demo": 94, "_advantages_and_disadvantages_of_encoders_bonus_discuss": 94, "w3d4_bonuslectur": 96, "_chelsea_finn_video": 96, "pablo": 97, "samuel": 97, "castro": 97, "xiaomei": 97, "julia": 97, "costacurta": 97, "w3d4_t1": 97, "chronolog": 97, "_intro_to_rl_video": 97, "sutton": 97, "barto": 97, "playground": 97, "app": 97, "_grid_world_video": 97, "ascii_to_emoji": 97, "action_effect": 97, "get_emoji": 97, "gridworldbas": 97, "world_spec": 97, "full_lik": 97, "goal_cel": 97, "get_neighbour": 97, "neighbour": 97, "neighbour_po": 97, "include_polici": 97, "row_rang": 97, "row_char": 97, "gwb": 97, "goal_queu": 97, 
"goals_don": 97, "goal_neighbour": 97, "gwp": 97, "planer": 97, "_make_a_better_planner_exercis": 97, "harder_grid": 97, "gwb_2": 97, "gwp_2": 97, "puterman": 97, "_markov_decision_process_video": 97, "mdpbase": 97, "grid_world": 97, "num_stat": 97, "state_idx": 97, "cell_to_st": 97, "state_to_cel": 97, "goal_stat": 97, "s2": 97, "nbr_state": 97, "nbr_action": 97, "mdpb": 97, "_create_an_mdp_exercis": 97, "_q_values_video": 97, "mdptogo": 97, "computeq": 97, "sxa": 97, "steps_to_go": 97, "mdptg": 97, "_create_a_step_to_go_solver_exercis": 97, "bellman": 97, "backup": 97, "curriculum": 97, "_value_iteration_video": 97, "mdpvalueiter": 97, "error_toler": 97, "num_iter": 97, "new_q": 97, "max_next_q": 97, "_draw_v": 97, "min_v": 97, "max_v": 97, "wall_v": 97, "grid_valu": 97, "get_xaxi": 97, "get_yaxi": 97, "set_clim": 97, "draw_mod": 97, "mdpvi": 97, "_implement_value_iteration_exercis": 97, "_policy_iteration_video": 97, "mdppolicyiter": 97, "findpi": 97, "\u03c0": 97, "new_pi": 97, "next_v": 97, "mdppi": 97, "_implement_policy_iteration_exercis": 97, "mild": 97, "_q_learning_video": 97, "qlearner": 97, "current_st": 97, "new_stat": 97, "pickact": 97, "maybereset": 97, "learnq": 97, "10_000": 97, "base_q_learn": 97, "_implement_q_learning_exercis": 97, "_epsilon_greedy_exploration_video": 97, "qlearnerexplor": 97, "_implement_epsilon_greedy_exploration_exercis": 97, "testb": 97, "greatest": 97, "occasion": 97, "discov": 97, "w3d5_bonuslectur": 99, "_amita_kapoor_video": 99, "mandana": [100, 102], "samiei": [100, 102], "raymond": [100, 102], "chua": [100, 102], "kushaan": [100, 102], "lilicrap": [100, 102], "namrata": [100, 102], "bafna": [100, 102], "coloredlog": [100, 102], "w3d5_t1": 100, "unpickl": [100, 102], "loadtrainexampl": [100, 102], "trainexampleshistori": [100, 102], "modelfil": [100, 102], "examplesfil": [100, 102], "trainexampl": [100, 102], "exit": [100, 102], "save_model_checkpoint": [100, 102], "nnet": [100, 102], "filepath": [100, 102], 
"load_model_checkpoint": [100, 102], "raymondchua": [100, 102], "nma_rl_gam": [100, 102], "kf4p9": [100, 102], "arena": [100, 102], "mct": 100, "othelloplay": [100, 102], "othellolog": [100, 102], "nnetwrapp": [100, 102], "dotdict": [100, 102], "numit": [100, 102], "numep": [100, 102], "tempthreshold": [100, 102], "exploit": [100, 102], "updatethreshold": [100, 102], "playoff": [100, 102], "maxlenofqueu": [100, 102], "nummctssim": [100, 102], "arenacompar": [100, 102], "cpuct": [100, 102], "maxdepth": [100, 102], "nummcsim": [100, 102], "mc_topk": [100, 102], "load_folder_fil": [100, 102], "8x100x50": [100, 102], "numitersfortrainexampleshistori": [100, 102], "_a_game_loop_for_rl_video": 100, "centr": 100, "south": 100, "outflank": 100, "opppon": 100, "voluntarili": 100, "forfeit": 100, "eothello": 100, "6x6": [100, 102], "getinitboard": [100, 102], "getvalidmov": [100, 102], "square_cont": [100, 102], "getsquarepiec": [100, 102], "getboards": [100, 102], "getactions": [100, 102], "getcanonicalform": [100, 102], "stringrepresent": [100, 102], "tobyt": [100, 102], "stringrepresentationread": [100, 102], "board_": [100, 102], "getscor": [100, 102], "countdiff": [100, 102], "displayvalidmov": [100, 102], "getnextst": [100, 102], "execute_mov": [100, 102], "legalmov": [100, 102], "get_legal_mov": [100, 102], "getgameend": [100, 102], "has_legal_mov": [100, 102], "getsymmetri": [100, 102], "pi_board": [100, 102], "newb": [100, 102], "rot90": [100, 102], "newpi": [100, 102], "fliplr": [100, 102], "randomplay": [100, 102], "_implement_a_random_player_excercis": 100, "player1": [100, 102], "player2": [100, 102], "num_gam": [100, 102], "playgam": [100, 102], "nnumber": [100, 102], "win_rate_player1": [100, 102], "nwin": [100, 102], "w3d5_reinforcementlearningforgamesanddlthinking3": 100, "gameresult": 100, "onewon": 100, "curplay": 100, "smarter": 100, "_train_a_value_function_video": 100, "pretrained_model": [100, 102], "loaded_gam": [100, 102], "checkpoint_1": [100, 102], 
"l_": 100, "othellonet": [100, 102], "board_x": [100, 102], "board_i": [100, 102], "bn4": [100, 102], "fc_bn1": [100, 102], "batchnorm1d": [100, 102], "fc_bn2": [100, 102], "fc4": [100, 102], "_implement_othelonn_excercis": 100, "v_loss": [100, 102], "batch_count": [100, 102], "target_v": [100, 102], "loss_v": [100, 102], "out_v": [100, 102], "l_v": [100, 102], "save_checkpoint": [100, 102], "load_checkpoint": [100, 102], "_implement_the_value_network_excercis": 100, "rl_for_gam": 100, "vnet": [100, 102], "_play_games_using_a_value_function_video": 100, "model_save_nam": [100, 102], "valuebasedplay": [100, 102], "max_num_act": [100, 102], "va_list": [100, 102], "negat": 100, "nextboard": [100, 102], "_implement_the_value_based_player_excercis": 100, "_train_a_policy_network_video": 100, "t_i": 100, "output_i": 100, "pi_loss": [100, 102], "target_pi": [100, 102], "loss_pi": [100, 102], "out_pi": [100, 102], "l_pi": [100, 102], "nll": [100, 102], "aspir": [100, 102], "gombru": [100, 102], "2018": [100, 102], "_implement_the_policy_network_exercis": 100, "pnet": [100, 102], "_play_games_using_a_policy_network_video": 100, "probabilit": 100, "sum_vap": [100, 102], "action_prob": [100, 102], "vap": [100, 102], "renorm": [100, 102], "_implement_the_policy_based_player_exercis": 100, "_play_using_monte_carlo_rollouts_video": 100, "recapitul": 100, "ps": [100, 102], "canonicalboard": [100, 102], "temp_v": [100, 102], "init_start_st": [100, 102], "isfirstact": [100, 102], "current_play": [100, 102], "sum_ps_": [100, 102], "nb": [100, 102], "insuffici": [100, 102], "dozen": [100, 102], "next_": [100, 102], "next_play": [100, 102], "_implement_the_monte_carlo_planner_exercis": 100, "_play_with_planning_video": 100, "averg": 100, "s_t": [100, 102], "mc": [100, 102], "mc_model_save_nam": [100, 102], "montecarlobasedplay": [100, 102], "best_act": [100, 102], "avg_valu": [100, 102], "qsa": [100, 102], "num_valid_act": [100, 102], "top_k_act": [100, 102], "argpartit": [100, 102], 
"getactionprob": [100, 102], "rp": [100, 102], "n1": [100, 102], "maxrollout": [100, 102], "mc1": 100, "n1p": [100, 102], "mc_result": [100, 102], "_monte_carlo_simulations_exercis": 100, "vp": [100, 102], "pp": [100, 102], "_unbeatable_opponents_video": 100, "w3d5_t2": 101, "_intro_to_dl_thinking_3_video": 101, "_the_future_video": 101, "curios": 101, "Their": 101, "constitu": 101, "mammal": 101, "flesh": 101, "2201": 101, "07372": 101, "brief": 101, "retrospect": 101, "prospect": 101, "uncertain": 101, "revel": 101, "unmet": 101, "conting": 101, "_the_future_discuss": 101, "_in_context_learning_vignette_video": 101, "llm": 101, "2211": 101, "15561": 101, "2212": 101, "10559": 101, "implicit": 101, "possess": 101, "icl": 101, "theorem": 101, "_in_context_learning_discuss": 101, "_memories_vignette_video": 101, "dinner": 101, "blackboard": 101, "1410": 101, "1805": 101, "07603": 101, "1703": 101, "03129": 101, "overlook": 101, "intric": 101, "ntm": 101, "enrich": 101, "dnn": 101, "emdqn": 101, "atari": 101, "lifelong": 101, "shot": 101, "seamlessli": 101, "omniglot": 101, "agil": 101, "_memories_discuss": 101, "_multiple_information_sources_vignette_video": 101, "synergist": 101, "gpt": 101, "2302": 101, "14045": 101, "palm": 101, "multimod": 101, "mllm": 101, "ocr": 101, "ccombin": 101, "_multiple_information_sources_discuss": 101, "_language_for_robotics_video": 101, "subproblem": 101, "robotoc": 101, "manipul": 101, "embodi": 101, "rt": 101, "socrat": 101, "outstand": 101, "_language_for_robotics_discuss": 101, "w3d5_t3_bonu": 102, "othello": 102, "othellogam": 102, "othellonnet": 102, "valuenetwork": 102, "policynetwork": 102, "policybasedplay": 102, "montecarlo": 102, "rollout": 102, "oppon": 102, "_plan_with_mcts_video": 102, "puct": 102, "sum_b": 102, "uct": 102, "alorithm": 102, "nsa": 102, "ns": 102, "till": 102, "cur_best": 102, "getnsa": 102, "_mcts_planner_exercis": 102, "_play_with_mcts_video": 102, "mcts_model_save_nam": 102, 
"montecarlotreesearchbasedplay": 102, "besta": 102, "counts_sum": 102, "versu": 102, "mcts1": 102, "mcts_result": 102, "n2": 102, "n2p": 102, "_play_games_mcts_exercis": 102, "john": 103, "butler": 103, "advic": 103, "isabel": 103}, "objects": {}, "objtypes": {}, "objnames": {}, "titleterms": {"prerequisit": 0, "preparatori": 0, "materi": [0, 23], "nma": [0, 19, 28], "deep": [0, 4, 16, 26, 33, 44, 45, 57, 58, 59, 62, 64, 65, 74, 76, 91, 101], "learn": [0, 4, 5, 8, 16, 24, 26, 27, 28, 33, 44, 45, 57, 58, 59, 61, 62, 64, 70, 74, 76, 81, 84, 88, 91, 92, 94, 95, 96, 97, 98, 100, 101], "prepar": [0, 3, 5, 20, 43, 76, 77], "yourself": 0, "cours": [0, 46, 57], "program": 0, "math": [0, 81], "skill": 0, "comput": [1, 60, 67, 94], "vision": [1, 16], "data": [2, 3, 5, 7, 8, 11, 15, 16, 17, 18, 20, 21, 30, 33, 35, 39, 57, 65, 67, 69, 70, 73, 76, 77, 81, 84, 85, 91, 94, 100], "augment": [2, 17, 57, 70, 73, 85], "imag": [2, 3, 4, 5, 16, 18, 43, 57, 67, 73, 76, 77, 80, 82, 94], "classif": [2, 5, 7, 43, 64, 67, 76, 84, 85, 94], "model": [2, 3, 8, 16, 17, 23, 25, 30, 31, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 64, 65, 67, 70, 73, 76, 78, 80, 81, 82, 84, 85, 88, 94], "object": [2, 3, 5, 8, 11, 16, 17, 20, 21, 25, 27, 28, 33, 35, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "setup": [2, 3, 5, 7, 8, 11, 16, 17, 18, 20, 21, 25, 27, 28, 33, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "instal": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "depend": [2, 3, 5, 7, 12, 15, 16, 17, 18, 21, 25, 27, 28, 39, 43, 57, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "set": [2, 5, 8, 17, 18, 20, 21, 25, 28, 30, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 
85, 87, 88, 89, 91, 94, 100, 102], "random": [2, 8, 25, 28, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "seed": [2, 8, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "devic": [2, 5, 8, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "gpu": [2, 8, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "cpu": [2, 8, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "train": [2, 3, 7, 8, 11, 16, 17, 18, 20, 28, 39, 40, 43, 57, 60, 61, 62, 64, 67, 69, 70, 73, 76, 77, 82, 84, 85, 89, 94, 100, 102], "hyperparamet": [2, 8, 61, 70], "cutout": 2, "mixup": 2, "dataset": [2, 3, 4, 7, 8, 10, 12, 15, 18, 19, 25, 57, 60, 64, 65, 67, 69, 70, 73, 77, 80, 84, 85, 87, 88, 94], "cifar": [2, 8], "10": [2, 33, 34, 35, 38, 43, 57, 61, 62, 74, 91, 94], "loader": [2, 8, 65, 70], "visual": [2, 15, 16, 18, 20, 57, 62, 69, 70, 73, 77, 81, 87, 94], "architectur": [2, 8, 17, 21, 80, 82, 84, 88, 91, 100], "resnet": [2, 8, 76], "test": [2, 8, 16, 17, 18, 21, 25, 38, 39, 40, 43, 57, 69, 73, 81, 82], "loss": [2, 3, 8, 18, 61, 64, 67, 69, 81, 82, 94, 100], "function": [2, 5, 7, 8, 15, 16, 17, 21, 27, 28, 33, 35, 39, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 89, 94, 100, 102], "optim": [2, 3, 8, 66, 67, 76], "loop": [2, 8, 16, 25, 60, 73, 76, 100], "auxiliari": [2, 8], "knowledg": [3, 35], "extract": 3, "from": [3, 4, 8, 20, 28, 43, 57, 62, 73, 76, 80, 82, 84, 85, 87, 94, 97, 100, 102], "convolut": [3, 16, 73, 76, 80], "neural": [3, 4, 12, 16, 17, 20, 43, 57, 60, 61, 62, 64, 74, 81, 82, 87, 100], "network": [3, 4, 5, 12, 16, 17, 28, 43, 57, 61, 62, 64, 65, 67, 70, 76, 77, 81, 82, 94, 100, 102], "project": [3, 12, 23, 27, 31, 32, 33, 35, 36, 39, 40, 46, 52, 94], "idea": [3, 4, 10, 19, 26, 27, 88], "acknowledg": [3, 7], "an": [3, 57, 64, 67, 73, 74, 
85, 94, 97, 102], "classifi": [3, 57, 94], "download": [3, 16, 17, 18, 21, 65, 67, 73, 76, 77, 80, 82, 84, 85, 100, 102], "creat": [3, 16, 17, 28, 43, 57, 87, 97, 100], "inspect": [3, 15, 28, 82], "gan": 3, "translat": [3, 11], "get": [3, 16, 31, 57, 73, 77, 91], "cyclegan": 3, "code": [3, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 88, 94, 97, 100, 102], "biolog": [4, 64], "analysi": [4, 12, 17, 27, 62], "us": [4, 5, 8, 15, 16, 18, 25, 28, 43, 50, 51, 53, 54, 57, 69, 76, 77, 81, 84, 87, 88, 94, 100, 102], "featur": [4, 65, 76, 88, 94], "predict": [4, 11, 18, 20, 74, 84, 85], "human": [4, 25], "behavior": 4, "leakag": 4, "gradient": [4, 60, 61, 67, 70], "flow": 4, "base": [4, 27, 28, 81, 91, 100, 102], "vae": [4, 80, 94], "self": [4, 92, 94, 96], "supervis": [4, 92, 94, 96], "graph": [4, 60], "someth": 5, "screwi": 5, "recognit": [5, 77], "detect": [5, 73, 91], "screw": 5, "helper": [5, 7, 16, 17, 21, 57, 61, 62, 65, 67, 70, 73, 80, 82, 84, 85, 87, 89, 94, 100, 102], "choos": [5, 12, 61], "figur": [5, 20, 25, 28, 33, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 80, 81, 82, 84, 87, 89, 94], "load": [5, 7, 8, 12, 15, 17, 18, 57, 69, 70, 73, 77, 84, 85, 87, 94, 100, 102], "let": [5, 28], "s": [5, 12, 28, 60, 64], "check": [5, 8, 76, 84, 85, 87], "out": [5, 76, 97], "some": [5, 20, 28, 57, 67, 77], "up": [5, 8, 18, 44, 60, 61, 74, 76, 80, 85, 91], "our": [5, 57], "first": [5, 16, 43, 65], "challeng": [5, 76], "damag": 5, "multi": [5, 26, 63, 84], "class": [5, 28, 76, 81, 91], "perform": [5, 17, 27, 84, 94], "introspect": 5, "orient": [5, 103], "bound": 5, "box": 5, "cluster": 5, "perspect": 5, "scale": 5, "transfer": [5, 8, 26, 27, 64, 65, 76], "link": [5, 15, 31, 52], "slide": [6, 13, 22, 29], "music": 7, "gener": [7, 11, 35, 39, 40, 46, 57, 60, 62, 64, 65, 69, 70, 73, 78, 80, 82, 84, 88, 94], "spectrogram": 7, "thi": [7, 62, 67, 77, 84], "notebook": 7, "gtzan": 7, "includ": 7, "have": 7, "look": [7, 16, 28], "simpl": [7, 57, 61, 73, 87, 97], "cnn": 
[7, 18, 73, 76, 77], "run": [7, 15, 25, 28, 46, 62, 67, 69, 70, 73, 76, 77, 94], "me": [7, 67, 69, 70, 73], "sourc": [8, 101], "100": 8, "dataload": [8, 16, 18, 57, 65, 70, 73], "pytorch": [8, 12, 43, 56, 57, 60, 64, 73], "re": [8, 62], "improv": [8, 76], "differ": [8, 64, 67, 76, 94], "delet": 8, "variabl": [8, 57, 80, 84, 85, 94], "previou": [8, 102], "target": [8, 94], "select": [8, 28, 33, 37, 39, 40, 80], "subset": 8, "pre": [8, 43, 77, 85, 94], "freez": 8, "paramet": [8, 65, 67, 73, 76], "unfreez": 8, "last": [8, 73], "layer": [8, 16, 21, 63, 65, 76], "number": [8, 57, 65, 73, 76, 94], "plot": [8, 16, 18, 20, 28, 33, 35, 39, 57, 60, 61, 62, 64, 65, 69, 70, 73, 76, 80, 81, 94], "result": [8, 31, 76], "natur": [9, 86, 88], "languag": [9, 11, 84, 85, 86, 88, 101], "process": [9, 57, 73, 81, 85, 86, 87, 88, 97], "machin": [11, 57], "repres": [11, 94], "word": [11, 84, 87], "distribut": [11, 21, 76, 87], "The": [11, 28, 44, 57, 61, 64, 65, 73, 76, 80, 81, 88, 94, 101], "rnn": [11, 20, 73], "text": [11, 57, 85], "To": 11, "do": [11, 73, 76, 88, 91, 94], "further": [11, 76], "read": [11, 34, 76], "twitter": 12, "sentiment": [12, 85], "welcom": [12, 57], "nlp": [12, 45, 87, 88], "templat": [12, 23, 31, 32, 43], "step": [12, 23, 33, 34, 35, 36, 37, 38, 41, 54, 97], "1": [12, 27, 28, 31, 33, 35, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "question": [12, 31, 33, 35, 39, 40, 57], "goal": [12, 88], "2": [12, 27, 28, 33, 35, 40, 43, 46, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101, 102], "literatur": [12, 33, 35], "review": [12, 33, 35, 85], "3": [12, 27, 28, 33, 35, 36, 40, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101], "explor": [12, 27, 64, 73, 87, 94, 97], "4": [12, 27, 28, 33, 36, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 80, 82, 84, 85, 87, 88, 91, 94, 97, 100, 101], 
"toolkit": [12, 31, 33, 37, 39, 40], "logist": [12, 31, 94], "regress": [12, 18, 62, 94], "explain": 12, "ai": [12, 59, 65, 93], "recurr": 12, "what": [12, 20, 64, 73, 76, 81, 87, 91, 94, 97], "next": 12, "neurosci": [14, 73], "algonaut": 15, "video": [15, 27, 28, 34, 35, 36, 37, 38, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 91, 93, 94, 96, 97, 99, 100, 101, 102, 103], "term": 15, "algonauts2021": 15, "enter": [15, 84], "dropbox": 15, "cell": [15, 17, 62, 67, 77], "util": [15, 81], "fmri": 15, "dimens": [15, 94], "correspond": 15, "brain": [15, 16, 17, 19, 73, 79, 91], "respons": [15, 18, 94], "refer": [15, 27, 57, 60], "lost": 16, "glass": 16, "how": [16, 23, 57, 67, 73, 74, 76, 84, 94], "deal": 16, "noisi": 16, "input": [16, 76], "clean": 16, "defin": [16, 17, 20, 25, 36, 43, 57, 69, 81, 82, 100], "preprocess": [16, 77], "pipelin": [16, 88], "exampl": [16, 18, 28, 33, 35, 36, 39, 40, 76, 77, 81, 94, 97], "nois": 16, "free": 16, "ventral": 16, "stream": 16, "alexnet": [16, 18, 76], "2012": 16, "batch": [16, 64, 67], "normal": [16, 17], "downscal": 16, "factor": 16, "tensorboard": 16, "accuraci": [16, 76, 88], "calcul": [16, 76, 77, 94], "hypothesi": [16, 36], "naiv": 16, "learner": 16, "expert": [16, 100], "experienc": 16, "kernel": [16, 73], "16": [16, 57], "filter": [16, 73, 76], "intermedi": 16, "output": [16, 73], "segment": 17, "denois": [17, 81], "intro": [17, 62, 74, 81, 87, 88, 91, 97, 101], "activ": [17, 18, 74], "neuron": [17, 20, 64, 74], "dish": 17, "transform": [17, 57, 77, 83, 84, 85, 94], "u": [17, 82], "net": [17, 60, 61, 62, 82, 87], "threshold": [17, 40], "find": [17, 35, 57, 76, 84], "move": 18, "beyond": [18, 82, 84], "label": [18, 69, 76, 94], "finetun": [18, 76], "bold": 18, "kai": 18, "structur": [18, 23, 73], "fine": [18, 44, 76, 85, 88], "tune": [18, 44, 70, 72, 76, 85, 88], "voxel": 18, "loc": 18, "region": 18, "custom": [18, 81], "numpi": [18, 57], "arrai": 18, "dissimilar": 18, 
"correl": [18, 62, 76], "between": [18, 87], "observ": [18, 88, 100], "valu": [18, 62, 84, 94, 97, 100, 102], "curat": 19, "crcn": 19, "janelia": 19, "figshar": 19, "other": [19, 26, 28, 84], "score": [19, 81, 82], "allen": 19, "observatori": 19, "bciaut": 19, "p300": 19, "focu": 20, "matter": [20, 61, 94], "infer": 20, "low": 20, "dimension": [20, 67], "dynam": [20, 43], "record": 20, "simul": [20, 64, 100], "linear": [20, 58, 61, 62, 80], "system": [20, 27], "compar": [20, 67, 76, 81, 94, 100], "true": 20, "fire": [20, 64], "rate": [20, 61, 70], "view": [20, 77], "all": [20, 28, 67, 69, 73, 94], "one": [20, 28, 57], "trial": 20, "latent": [20, 80, 94], "anim": [21, 65, 69, 70], "pose": 21, "estim": 21, "mount": [21, 28], "your": [21, 35, 36, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "gdrive": 21, "visula": 21, "evalu": [21, 33, 38, 39, 40, 64, 67, 94], "error": [21, 80], "final": [21, 31, 39, 40, 46], "introduct": [23, 27, 35, 43, 60, 64, 67, 69, 73, 76, 80, 84, 87, 94, 100, 103], "daili": [23, 31, 46, 57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "schedul": [23, 31, 46, 47], "ten": 23, "import": [23, 28, 43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "deadlin": 23, "reinforc": [24, 26, 28, 95, 97, 98, 100], "rl": [25, 27, 97, 99, 100], "cognit": 25, "task": [25, 27, 28, 64, 94], "background": [25, 35, 39, 40], "n": 25, "back": 25, "environ": [25, 27, 28, 43, 84, 85], "implement": [25, 27, 31, 33, 38, 39, 40, 61, 64, 67, 70, 73, 81, 84, 88, 97, 100], "scheme": 25, "agent": [25, 26, 28, 100, 102], "initi": [25, 61, 62, 65], "traffic": 26, "signal": [26, 40], "control": [26, 28], "resourc": [26, 57], "dqn": 27, "algorithm": [27, 60, 97], "lunar": 27, "lander": 27, "updat": [27, 67], "upgrad": 27, "lib": 27, "plai": [27, 57, 88, 100, 102], "basic": [27, 36, 44, 56, 
57, 95, 97], "addit": [27, 28], "exploit": [27, 97], "trade": [27, 76], "off": [27, 76], "reward": [27, 28], "shape": 27, "identifi": 27, "state": [27, 35, 80, 88], "inform": [27, 101], "crucial": 27, "its": [27, 74], "extens": [27, 85], "atari": 27, "game": [27, 98, 100, 102], "5": [27, 33, 36, 37, 43, 57, 60, 61, 62, 64, 65, 67, 70, 73, 74, 76, 80, 84, 87, 88, 91, 94, 97, 100, 101], "obstacl": 27, "avoid": [27, 85, 94], "b": 27, "minigrid": 27, "6": [27, 33, 37, 43, 57, 60, 61, 62, 67, 70, 73, 74, 76, 84, 87, 88, 91, 94, 97, 100, 101], "prefer": 27, "pbrl": 27, "robolymp": 28, "robot": [28, 96, 101], "colab": [28, 53, 57], "limit": 28, "pybullet": 28, "locomot": 28, "save": [28, 76], "restor": 28, "checkpoint": 28, "befor": [28, 69], "runtim": 28, "restart": 28, "after": [28, 69, 73, 76], "conveni": 28, "factori": 28, "method": [28, 57, 67, 94], "continu": 28, "list": 28, "modifi": [28, 43], "instanti": 28, "default": [28, 70, 97], "bit": 28, "frame": 28, "action": 28, "take": 28, "properti": 28, "dm": 28, "acm": 28, "d4pg": 28, "examin": [28, 94], "polici": [28, 52, 97, 100, 102], "total": 28, "googl": [28, 53, 57], "drive": 28, "temporarili": 28, "option": [28, 60, 61, 62, 94], "unmount": 28, "two": 28, "dmpo": 28, "ddpg": 28, "good": [28, 65, 73, 88, 94], "luck": 28, "search": [30, 57, 67, 102], "metadataset": 30, "guid": [31, 41], "summari": [31, 39, 40, 43, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 94, 97, 100, 101, 102], "submiss": 31, "ta": 31, "mentor": 31, "week": [31, 57, 73], "start": 31, "w1d4": [31, 46], "dai": [31, 46], "w1d5": 31, "w2d1": 31, "3h": 31, "w2d2": 31, "w3d1": 31, "w3d2": [31, 46], "half": [31, 46], "w3d3": 31, "w3d4": 31, "w3d5": [31, 46], "present": [31, 43, 84], "content": 31, "retriev": 33, "ingredi": [33, 36, 39, 40], "hypothes": [33, 36, 39, 40], "draft": [33, 37, 39, 40], "7": [33, 37, 38, 43, 57, 61, 62, 67, 73, 74, 76, 84, 91, 94, 97, 100], "build": [33, 57, 64, 80], "8": [33, 38, 43, 46, 57, 
61, 62, 67, 73, 74, 76, 84, 91, 94, 97, 100], "complet": [33, 38, 39, 40, 94], "9": [33, 38, 43, 57, 61, 62, 67, 74, 76, 84, 91, 94, 100], "public": 33, "publish": 34, "11": [34, 43, 57, 61, 94], "write": [34, 73], "abstract": [34, 40, 93], "paper": 34, "guidanc": 34, "suggest": 34, "0": [35, 60, 62, 64, 69, 73, 94, 100], "overview": [35, 46, 70, 84], "demo": [35, 57, 61, 62, 64, 67, 73, 76, 80, 81, 82, 84, 85, 87, 94], "disclaim": 35, "tutori": [35, 43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 91, 94, 97, 100, 101, 102], "phenomenon": [35, 39, 40], "ask": 35, "about": [35, 73, 91, 94], "own": [35, 67, 73, 84], "understand": [35, 73, 81, 85], "art": [35, 80, 88], "determin": [36, 76], "formul": 36, "specif": [36, 46, 97], "mathemat": 36, "plan": [37, 100, 102], "ethic": [38, 39, 57, 65, 67, 77, 82, 84, 94, 100], "illus": [39, 40], "media": 39, "thought": [39, 40], "vestibular": 40, "integr": [40, 64], "ddm": 40, "mechan": 40, "decis": [40, 57, 97], "assembl": 40, "deploi": [42, 43], "bonu": [43, 57, 59, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 85, 88, 89, 91, 93, 94, 96, 99, 102], "web": 43, "feedback": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "gadget": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "flask": 43, "ngrok": 43, "packag": 43, "which": 43, "doesn": 43, "t": [43, 91, 94], "work": [43, 57, 67, 84, 94], "latest": 43, "version": 43, "section": [43, 57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 89, 91, 94, 97, 100, 101, 102], "submit": [43, 57, 59, 60, 61, 62, 64, 65, 67, 69, 70, 72, 73, 74, 76, 77, 79, 80, 81, 82, 84, 85, 87, 88, 89, 91, 93, 94, 96, 97, 99, 100, 101, 102], "app": 43, "jinja2": 43, "jinja": 43, "appli": [43, 94], "mvvm": 43, "design": [43, 74, 91], 
"pattern": [43, 94], "rest": 43, "api": 43, "vue": 43, "js": 43, "serv": 43, "applic": [43, 84, 99], "heroku": 43, "python": 43, "12": [43, 57, 61], "local": [43, 67], "deploy": 43, "13": [43, 57], "14": [43, 57], "15": [43, 57], "wrap": [44, 60, 61, 74, 80, 91], "podcast": [44, 45], "panel": [44, 45, 46], "discuss": [44, 45, 60, 61, 62, 67, 94], "convnet": [45, 71, 75, 76, 77], "2024": 46, "juli": 46, "26": 46, "coursework": [46, 52], "time": [46, 82, 86, 87], "propos": 46, "dl": [46, 57, 71, 74, 90, 91, 98, 101], "think": [46, 64, 65, 67, 69, 70, 71, 73, 74, 76, 77, 80, 81, 82, 84, 87, 88, 91, 101], "profession": 46, "develop": 46, "share": [48, 91], "calendar": 48, "timezon": 49, "widget": [49, 57, 61, 62], "discord": 50, "jupyterbook": 51, "quick": 52, "attend": 52, "advic": 53, "kaggl": 54, "technic": 55, "help": [55, 73], "And": [56, 57, 71, 83, 86, 92, 98], "neuromatch": 57, "histori": [57, 76, 97], "why": [57, 61, 65, 94], "cool": 57, "tensor": 57, "make": [57, 62, 73], "like": [57, 94], "rang": 57, "exercis": [57, 60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 87, 88, 94, 97, 100, 102], "oper": 57, "manipul": [57, 87], "index": 57, "vs": [57, 61, 62, 64, 65, 67, 70, 73, 76, 80, 81], "just": 57, "much": [57, 73], "faster": 57, "ar": [57, 65, 67, 87, 94], "displai": [57, 76, 77], "cifar10": 57, "grayscal": 57, "csv": 57, "file": 57, "sampl": [57, 60, 62, 67, 73, 80, 81, 82, 100], "boundari": 57, "tweak": 57, "xor": 57, "interact": [57, 61, 62, 64, 67, 73, 76, 80, 81, 82, 84, 85, 87, 94], "solv": 57, "info": [57, 91], "Be": 57, "group": 57, "17": 57, "syllabu": 57, "meet": 57, "lectur": [57, 59, 72, 79, 93, 96, 99], "block": 57, "thing": 57, "more": [57, 60, 91, 94], "magic": 57, "survei": [57, 62, 65, 67, 70, 74, 76, 82, 84, 88, 91, 94, 97, 101], "60": 57, "year": 57, "research": [57, 99], "altair": 57, "vega_dataset": 57, "author": 57, "edit": 57, "author_filt": 57, "full": [57, 67, 94], "appendix": 57, "offici": 57, "document": 57, "book": 57, 
"yoshua": 59, "bengio": 59, "descent": [60, 67, 70], "autograd": 60, "execut": [60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "set_devic": [60, 61, 62, 64, 65, 67, 69, 70, 73, 76, 77, 80, 81, 82, 84, 85, 87, 88, 89, 94, 100, 102], "steepest": 60, "ascent": 60, "analyt": [60, 61, 62], "vector": [60, 62, 77, 87], "backprop": 60, "chain": 60, "rule": [60, 87], "auto": [60, 80], "differenti": 60, "forward": [60, 87], "propag": 60, "buid": 60, "backward": 60, "modul": [60, 80, 82, 100, 102], "nn": [60, 70], "A": [61, 97, 100], "shallow": 61, "narrow": 61, "solut": [61, 62], "lnn": [61, 62], "landscap": 61, "depth": [61, 67], "effect": [61, 73, 94], "prelud": 62, "represent": [62, 73, 94], "tree": [62, 102], "sure": 62, "you": [62, 73], "enabl": [62, 81, 94], "singular": 62, "decomposit": 62, "svd": 62, "similar": [62, 64, 76, 77, 87, 94], "rsa": 62, "illusori": 62, "demonstr": [62, 73], "outro": [62, 65, 100], "lr": 62, "dlnn": 62, "perceptron": 63, "artifici": 64, "mlp": [64, 65, 67], "need": [64, 65], "univers": 64, "approxim": 64, "theorem": 64, "relu": [64, 65, 73], "purpos": [64, 76], "cross": 64, "entropi": 64, "spiral": 64, "classfic": 64, "point": 64, "eval": 64, "doe": [64, 65, 69, 73, 76, 81, 84, 94], "well": [64, 65, 84, 94], "physiolog": 64, "motiv": [64, 73], "leaki": [64, 65], "lif": 64, "r_m": 64, "tau_": 64, "ref": 64, "real": [64, 65], "face": [65, 69, 70, 74, 77], "wider": 65, "deeper": 65, "express": 65, "wide": 65, "while": 65, "keep": 65, "same": [65, 80], "tradeoff": 65, "where": 65, "fail": [65, 94], "case": [65, 67], "studi": [65, 67], "world": [65, 97], "high": [65, 67], "level": 65, "aspect": [65, 77, 84, 100], "hype": 65, "xavier": 65, "best": [65, 76], "gain": 65, "techniqu": [67, 69, 70], "unexpect": 67, "consequ": [67, 94], "successfulli": 67, "mnist": [67, 73, 82], "interpret": [67, 69], "poor": 67, "condit": [67, 82, 94], "momentum": 67, "gd": 67, "oscil": 67, "non": [67, 74], "convex": 67, 
"overparameter": [67, 69, 70], "rescu": 67, "width": 67, "expens": 67, "mini": 67, "cost": [67, 74], "minibatch": 67, "size": [67, 73], "adapt": 67, "rmsprop": 67, "concern": 67, "put": [67, 73], "togeth": [67, 69, 73], "benchmark": 67, "metric": [67, 88], "regular": [68, 69, 70, 73], "part": [69, 70], "shrinkag": 69, "frobeniu": 69, "norm": [69, 70], "overfit": [69, 73], "valid": 69, "memor": 69, "animalnet": 69, "earli": 69, "stop": 69, "them": 69, "l1": 70, "l2": 70, "unregular": 70, "ridg": 70, "dropout": [70, 73], "caveat": 70, "without": [70, 94], "small": 70, "stochast": 70, "sgd": 70, "adversari": 70, "attack": 70, "kyunghyun": 72, "cho": 72, "onlin": 72, "hyperparmet": 72, "kynghyun": 72, "recap": 73, "experi": 73, "param": 73, "edg": 73, "detail": 73, "definit": 73, "note": [73, 81], "chicago": 73, "skylin": 73, "pad": 73, "stride": 73, "pool": 73, "subsampl": 73, "emnist": 73, "multipl": [73, 101], "see": [73, 76], "would": 73, "recogn": 73, "x": 73, "maxpool": 73, "fulli": 73, "connect": 73, "revisit": 73, "fashion": 73, "backpropag": 73, "remind": 73, "symptom": 73, "cure": 73, "ad": 73, "ha": 73, "been": 73, "spike": 74, "vignett": [74, 91, 101], "poisson": 74, "can": [74, 94], "ann": 74, "know": 74, "uncertainti": 74, "so": 74, "we": [74, 94], "measur": 74, "neg": [74, 85, 94], "standard": 74, "deviat": 74, "embed": [74, 77, 82, 87, 89], "modern": [75, 76, 77, 94], "fcnn": 76, "big": 76, "vgg": 76, "residu": 76, "imagenett": 76, "textual": 76, "imagenet": 76, "map": [76, 103], "eval_imagenett": 76, "incept": 76, "resnext": 76, "effici": 76, "depthwis": 76, "separ": 76, "mobilenet": 76, "64": 76, "onli": [76, 94], "readout": 76, "scratch": 76, "head": [76, 84], "comparison": [76, 80, 100], "pretrain": [76, 77], "outlook": 76, "speed": 76, "backbon": 76, "train_loop": 76, "train_load": 76, "loss_fn": 76, "run_model": 76, "lr_rate": 76, "facial": 77, "bia": 77, "discrimin": 77, "due": 77, "pairwis": 77, "distanc": 77, "within": 77, "sum": 77, "squar": 
77, "wss": 77, "geoffrei": 79, "hinton": 79, "distil": 79, "variat": [80, 94], "autoencod": [80, 94], "pleas": 80, "ignor": 80, "warn": 80, "dure": 80, "wordnet": 80, "biggan": 80, "interpol": 80, "categori": 80, "ppca": 80, "conceptu": 80, "pca": 80, "autoencond": 80, "nonlinear": 80, "fill": 80, "convautoencod": 80, "encod": [80, 84, 94], "compon": 80, "novel": 80, "decod": [80, 84, 94], "Of": 80, "diffus": [81, 82], "principl": [81, 94], "behind": [81, 84], "1d": 81, "2d": 81, "gaussian": 81, "mixtur": 81, "log": 81, "densiti": 81, "each": [81, 94], "mode": 81, "individu": 81, "tell": 81, "revers": 81, "match": 81, "unet": 82, "sampler": 82, "advanc": 82, "techinqu": 82, "stabl": 82, "consider": [82, 94], "copyright": 82, "imageri": 82, "attent": [83, 84], "nltk": [84, 87], "punkt": 84, "averaged_perceptron_tagg": 84, "brown": 84, "webtext": 84, "yelp": [84, 85], "load_yelp_data": 84, "token": [84, 87, 88], "bert": [84, 88], "infil": 84, "queri": 84, "kei": 84, "intut": 84, "corpu": 84, "dot": 84, "product": 84, "multihead": 84, "q": [84, 97], "k": 84, "v": 84, "i": [84, 94], "ii": 84, "complex": 84, "posit": [84, 85], "positionalencod": 84, "bias": [84, 94], "probabl": 84, "mask": 84, "problem": [84, 97], "approach": [84, 94], "hint": 84, "field": 84, "robust": 85, "gpt": [85, 88], "osf": 85, "context": [85, 101], "extend": 85, "binari": 85, "likelihood": 85, "light": 85, "weight": 85, "break": 85, "origin": 85, "textattack": 85, "issu": 85, "seri": [86, 87], "fasttext": [87, 89], "homonym": 87, "analog": [87, 93], "feed": 87, "llm": 88, "Is": 88, "pre_token": 88, "practic": 88, "chines": 88, "english": 88, "nlg": 88, "sota": 88, "todai": 88, "tomorrow": 88, "conclus": [88, 94], "around": 88, "larg": 88, "multilingu": 89, "thinking2": 90, "multimod": 91, "strategi": 91, "tumor": 91, "still": 91, "isn": 91, "enough": 91, "forrest": 91, "gump": 91, "pull": 91, "unsupervis": 92, "melani": 93, "mitchel": 93, "un": 94, "allow": 94, "independ": 94, "introduc": 94, 
"dsprite": 94, "schemat": 94, "directli": 94, "along": 94, "induc": 94, "invari": 94, "matric": 94, "rsm": 94, "reveal": 94, "support": 94, "don": 94, "potenti": 94, "versu": [94, 100], "produc": 94, "conclud": 94, "reconstruct": 94, "organ": 94, "abil": 94, "construct": 94, "meaning": 94, "space": 94, "few": 94, "avail": 94, "could": 94, "ssl": 94, "simclr": 94, "wa": 94, "cope": 94, "contrast": 94, "collaps": 94, "reduc": 94, "histogram": 94, "pair": 94, "shot": 94, "benefit": 94, "short": 94, "scenario": 94, "e": 94, "when": 94, "fraction": 94, "advantag": 94, "disadvantag": 94, "type": 94, "under": 94, "variou": 94, "chealsea": 96, "finn": 96, "chelsea": 96, "grid": 97, "shortest": 97, "path": 97, "planner": [97, 100, 102], "gridworld": 97, "gridworldplann": 97, "try": 97, "harder": 97, "markov": 97, "mdp": 97, "go": 97, "solver": 97, "iter": 97, "epsilon": 97, "greedi": [97, 100], "For": [98, 100], "thinking3": 98, "amita": 99, "kapoor": 99, "futur": [99, 101], "othellogam": 100, "player": [100, 102], "compet": 100, "othello": 100, "othellonnet": 100, "valuenetwork": 100, "mse": 100, "progress": 100, "policynetwork": 100, "policybasedplay": 100, "mont": [100, 102], "carlo": [100, 102], "rollout": 100, "montecarlo": 100, "against": [100, 102], "unbeat": 100, "oppon": 100, "19": 100, "In": 101, "memori": 101, "mct": 102, "concept": 103}, "envversion": {"sphinx.domains.c": 2, "sphinx.domains.changeset": 1, "sphinx.domains.citation": 1, "sphinx.domains.cpp": 6, "sphinx.domains.index": 1, "sphinx.domains.javascript": 2, "sphinx.domains.math": 2, "sphinx.domains.python": 3, "sphinx.domains.rst": 2, "sphinx.domains.std": 2, "sphinx.ext.intersphinx": 1, "sphinx": 56}}) \ No newline at end of file diff --git a/tutorials/Bonus_DeployModels/student/Bonus_Tutorial1.html b/tutorials/Bonus_DeployModels/student/Bonus_Tutorial1.html index 7ea47578f..a4390cd32 100644 --- a/tutorials/Bonus_DeployModels/student/Bonus_Tutorial1.html +++ 
b/tutorials/Bonus_DeployModels/student/Bonus_Tutorial1.html @@ -48,7 +48,7 @@ const thebe_selector_output = ".output, .cell_output" - + @@ -1077,7 +1077,7 @@

Tutorial Objectives
-
+
@@ -1256,7 +1256,7 @@

Section 1: IntroductionVideo 1: Deploying Neural Networks on the Web#

-
+
@@ -1275,7 +1275,7 @@

Submit your feedback
-
+

We will start by building a simple web application in Flask, which we’ll keep extending throughout the tutorial. In the end, you will have a web app where you can upload an image and have it classified automatically by a neural network model.

@@ -1291,7 +1291,7 @@

Section 2.1: Your First Flask App#

-
+
@@ -1310,7 +1310,7 @@

Submit your feedback - +

Creating a minimal Flask app is very simple. You need to create a Flask object and define the handler for the root URL returning the HTML response. You need to provide the application's module or package, but we can use __name__ as a convenient shortcut.

We need one small trick because the app will be running in a notebook. If you just run the app, it will be accessible at http://127.0.0.1:5000. The problem is that this is a local address on the server where the notebook is running, so you can't reach it from your browser. This is where ngrok helps - it creates a tunnel from the notebook server to the outside world. Make sure you use the ngrok URL when testing your app.
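The steps above can be sketched as a minimal Flask app (assuming Flask is installed; the ngrok tunnel itself is opened separately, e.g. with pyngrok, and is not shown here). Instead of running the server, this sketch exercises the handler with Flask's built-in test client:

```python
from flask import Flask

# __name__ identifies the application's module, as described above
app = Flask(__name__)

@app.route("/")
def index():
    # Handler for the root URL returning an HTML response
    return "<h1>Hello from Flask!</h1>"

# In the notebook you would call app.run() after opening the ngrok tunnel;
# here we call the handler directly through the test client instead.
client = app.test_client()
response = client.get("/")
print(response.data.decode())
```

Calling `app.run()` would serve the same handler at http://127.0.0.1:5000, which is why the tunnel is needed in the notebook setting.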

@@ -1363,7 +1363,7 @@

Section 2.2: Using Jinja2 Templates#

-
+

@@ -1382,7 +1382,7 @@

Submit your feedback - +

The default template engine used by Flask is Jinja2. Jinja2 offers features that help you write clean and reusable templates, such as inheritance, humanizing and formatting data (there's an extension for this), dividing components into sub-modules, etc.

In this section, we are going to add Jinja2 templates to the app. With Jinja2 you can use variables and control-flow constructs, such as conditionals and loops, in your HTML code. Then you can pass data from your Python code to the template when it is rendered.
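A minimal standalone sketch of those features (Jinja2 ships with Flask; the template content and variable names here are illustrative, not from the tutorial):

```python
from jinja2 import Template

# A template mixing a loop, a conditional, and variable substitution
template = Template("""
<ul>
{% for item in items %}
  <li>{{ item }}{% if item == highlight %} *{% endif %}</li>
{% endfor %}
</ul>
""")

# Data is passed from Python to the template at render time
html = template.render(items=["cat", "dog", "fish"], highlight="dog")
print(html)
```

In a Flask app you would put the template in a file under `templates/` and call `render_template` with the same keyword arguments.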

@@ -1451,7 +1451,7 @@

Section 2.3: Apply the MVVM Design Pattern#

-
+

@@ -1470,7 +1470,7 @@

Submit your feedback - +

Design patterns provide a way of writing reusable, adaptable, and extendable code. Design patterns are not libraries, but rather a set of best practices to follow when designing your software.

Model-View-ViewModel (MVVM) is a powerful design pattern commonly used in web applications (and other GUI applications).
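A plain-Python sketch of the MVVM separation (all class and field names here are illustrative): the Model holds raw data, the ViewModel shapes it for display, and the View only renders what the ViewModel exposes.

```python
class Model:
    """Holds the raw application data."""
    def __init__(self):
        self.scores = {"cat": 0.71, "dog": 0.22}

class ViewModel:
    """Prepares the model's data for presentation."""
    def __init__(self, model):
        self.model = model

    def top_prediction(self):
        # Pick the highest-scoring label and format it for display
        label, score = max(self.model.scores.items(), key=lambda kv: kv[1])
        return f"{label} ({score:.0%})"

def view(viewmodel):
    # The view only formats what the view-model already prepared
    return f"<p>Prediction: {viewmodel.top_prediction()}</p>"

print(view(ViewModel(Model())))  # <p>Prediction: cat (71%)</p>
```

The payoff is that the view can be swapped (HTML template, JSON API, CLI) without touching the model or the formatting logic.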

@@ -1551,7 +1551,7 @@

Section 2.4: Creating a REST API#

-
+

@@ -1570,7 +1570,7 @@

Submit your feedback - +

REST (Representational State Transfer) is a set of conventions for designing APIs so that your service can interact with other services. If HTML pages are interfaces designed for humans, you can think of REST APIs as interfaces made for computers.

A common way to implement a REST API is for your application to respond to certain requests by returning a JSON string containing the required data.
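As a sketch of that idea, a Flask route can return JSON directly (the endpoint name and payload below are illustrative, not from the tutorial):

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/status")
def status():
    # jsonify serializes the dict and sets the JSON content type
    return jsonify({"service": "classifier", "ready": True})

# Exercise the endpoint with the test client rather than a running server
client = app.test_client()
resp = client.get("/api/status")
print(resp.get_json())
```

Any other service can now consume `/api/status` without caring how the data is produced.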

@@ -1671,7 +1671,7 @@

Section 3: Vue.js#

-
+

@@ -1690,7 +1690,7 @@

Submit your feedback - +

We already talked about the MVVM pattern and implemented it using a back-end framework, Flask. Applying the same pattern on the front end can also be beneficial when creating dynamic applications.

Vue.js is a great front-end library that implements the MVVM design pattern. It is widely used for creating user interfaces and single-page applications.

@@ -1780,7 +1780,7 @@

Section 4: Model Presentation#

-
+

@@ -1799,7 +1799,7 @@

Submit your feedback - +

Now (finally) we have all the tools we need to deploy our neural network! We are going to use a pre-trained DenseNet model. In the first step, we are going to create an API entry point that accepts an image as input and classifies it. After that, we will create a dynamic UI for easier interaction.

@@ -1809,7 +1809,7 @@

Section 4.1: Image Classification API#

-
+
@@ -1828,7 +1828,7 @@

Submit your feedback - +

First, we need to load a pre-trained DenseNet trained on ImageNet. You can use torchvision.models to quickly get a pre-trained model for many popular neural network architectures.

@@ -1904,7 +1904,7 @@

Section 4.2: Create a Dynamic Application#

-
+

@@ -1923,7 +1923,7 @@

Submit your feedback - +

We will now create a Flask app that receives an image at /predict and passes it through the model. We will also implement an interactive UI to upload the image and call the API.

The UI consists of a file upload field, a classify button, and an image element displaying the uploaded file.
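A minimal sketch of such an endpoint, with the model call stubbed out (the route name /predict comes from the text; the classify helper and its output are placeholders for the real network inference):

```python
import io
from flask import Flask, jsonify, request

app = Flask(__name__)

def classify(image_bytes):
    # Placeholder for passing the image through the model
    return {"class": "golden retriever", "confidence": 0.93}

@app.route("/predict", methods=["POST"])
def predict():
    uploaded = request.files["file"]   # the uploaded image file
    return jsonify(classify(uploaded.read()))

# Exercise the endpoint with the test client and an in-memory "image"
client = app.test_client()
resp = client.post("/predict",
                   data={"file": (io.BytesIO(b"fake-bytes"), "dog.png")})
print(resp.get_json())
```

The UI's classify button would POST the chosen file to this route and render the returned JSON.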

@@ -2040,7 +2040,7 @@

Section 5: Deploy a Flask app on Heroku#

-
+

@@ -2059,7 +2059,7 @@

Submit your feedback - +

Now you are going to deploy your application as a real web server outside of the notebook. We are going to use Heroku for this. Heroku is a PaaS (Platform-as-a-Service) that offers pre-configured environments so you can deploy an application easily and quickly. They also offer a free tier, which is enough for deploying simple apps.

But first, you need to test your application locally.

@@ -2070,7 +2070,7 @@

Section 5.1: Preparing Your Environment#

-
+

@@ -2089,7 +2089,7 @@

Submit your feedback - +

From this point on, you need to do all the steps on your own machine, not in the notebook. Make sure that you have Python 3 installed, along with a code editor (for example VS Code). You will also be using the terminal a lot in this section.

First, you need to prepare your Python environment and install all required dependencies. You should first create an empty folder where you will store your application and do the following steps.

@@ -2133,7 +2133,7 @@

Section 5.2: Create Your Application#

-
+

@@ -2152,7 +2152,7 @@

Submit your feedback - +

You are now ready to create the files needed for your application. For now, you need just two files.

app.py

@@ -2300,7 +2300,7 @@

Section 5.4: Preparing for Deployment on Heroku#

-
+

@@ -2319,7 +2319,7 @@

Submit your feedback - +

Before we can deploy on Heroku there are a couple of things we need to prepare.

Create Procfile
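A typical Procfile for serving a Flask app with gunicorn looks like the following (assuming the Flask object is named `app` inside `app.py`; adjust both names to match your files):

```
web: gunicorn app:app
```

The `web:` prefix tells Heroku this process should receive HTTP traffic, and `app:app` means "the `app` object in the `app` module".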

@@ -2348,7 +2348,7 @@

Section 5.5: Deploying on Heroku#


You are now finally ready to deploy to Heroku! There are just a couple of steps needed.

1. Create a Heroku account
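After creating the account, the remaining steps are a short terminal sequence. This is an illustrative transcript, assuming the Heroku CLI is installed and your app folder is (or becomes) a git repository:

```shell
heroku login                          # opens a browser to authenticate
git init                              # only if the folder is not a repo yet
git add . && git commit -m "initial commit"
heroku create                         # creates the app and adds the "heroku" git remote
git push heroku main                  # build and deploy (use "master" on older setups)
heroku open                           # open the deployed app in your browser
```

If the build fails, `heroku logs --tail` streams the server logs so you can see what went wrong.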


Summary#


In this tutorial you learned the basics of some modern tools for creating dynamic web applications and REST APIs. You also learned how you can deploy your neural network model as a web app.

You can now build on top of that and create more sophisticated and awesome applications and make them available to millions of people!

tutorials/W1D1_BasicsAndPytorch/student/W1D1_Tutorial1.html

Tutorial Objectives

Section 1: Welcome to the Neuromatch Deep Learning course#

Video 1: Welcome and History#


This will be an intensive 3-week adventure. We will all learn Deep Learning (DL) in a group. Groups need standards. Read our Code of Conduct.


Video 2: Why DL is cool#


Discuss with your pod: What do you hope to get out of this course? [in about 100 words]


Section 2.1: Creating Tensors#


There are various ways of creating tensors, and in any real deep learning project we will usually need several of them.

Construct tensors directly:
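A few of the most common constructors (values and shapes are illustrative):

```python
import torch

# Directly from (nested) Python lists
a = torch.tensor([[1, 2], [3, 4]])

# Constructors for common patterns
z = torch.zeros(2, 3)        # all zeros
o = torch.ones(2, 3)         # all ones
r = torch.rand(2, 3)         # uniform samples in [0, 1)
s = torch.arange(0, 10, 2)   # like Python's range

print(a.shape, z.shape, s)
```

Each constructor also accepts `dtype=` and `device=` arguments to control the element type and where the tensor lives.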



Section 2.2: Operations in PyTorch#


Tensor-Tensor operations

We can perform operations on tensors using methods in the torch module.
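For example (values illustrative):

```python
import torch

x = torch.tensor([[1., 2.], [3., 4.]])
y = torch.tensor([[10., 20.], [30., 40.]])

add = torch.add(x, y)       # elementwise sum, same as x + y
mul = torch.mul(x, y)       # elementwise product, same as x * y
mm = torch.matmul(x, y)     # matrix product, same as x @ y
row_means = x.mean(dim=1)   # reduce across columns -> one mean per row

print(add, mm, row_means)
```

Note the difference between `torch.mul` (elementwise) and `torch.matmul` (matrix multiplication); mixing them up is a classic bug.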


Coding Exercise 2.2: Simple tensor operations#

Below are two expressions involving operations on matrices.

(1)#\[\begin{equation}
\textbf{A} = \begin{bmatrix} 2 & 4 \\ 5 & 7 \end{bmatrix} \dots
\end{equation}\]

(2)#\[\begin{equation}
b = \begin{bmatrix} 3 \\ 5 \\ 7 \end{bmatrix} \cdot \dots
\end{equation}\]


Section 2.3: Manipulating Tensors in PyTorch#


Indexing

Just as in NumPy, elements in a tensor can be accessed by index. As in any NumPy array, the first element has index 0, and ranges are specified to include the first index up to last_element - 1. We can access elements according to their position relative to the end of the list by using negative indices. Selecting a range of elements this way is also referred to as slicing.
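A few representative indexing and slicing operations:

```python
import torch

t = torch.arange(10)   # tensor([0, 1, ..., 9])

first = t[0]           # index 0 is the first element
last = t[-1]           # negative indices count from the end
middle = t[2:5]        # slice: indices 2, 3, 4 (the end index is excluded)

m = torch.arange(12).reshape(3, 4)
col = m[:, 1]          # all rows, second column

print(first, last, middle, col)
```

The same syntax extends to any number of dimensions, one index or slice per axis.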


Coding Exercise 2.3: Manipulating Tensors. Write a function that takes tensors \(A\) and \(B\) and returns the column sum of \(A\) multiplied by the sum of all the elements of \(B\), i.e., a scalar, e.g.,

(3)#\[\begin{equation}
\text{If } A = \begin{bmatrix} 1 & 1 \\ \dots \end{bmatrix} \dots
\end{equation}\]

Coding Exercise 2.3: Manipulating Tensors. Write a function that takes \(C\) and returns a 2D tensor consisting of a flattened \(C\) with the index of each element appended to this tensor in the row dimension, e.g.,

(4)#\[\begin{equation}
\text{If } C = \begin{bmatrix} 2 & 3 \\ \dots \end{bmatrix} \dots
\end{equation}\]

Coding Exercise 2.3: Manipulating Tensors. Write a function that takes \(D\) and \(E\). If the dimensions allow it, this function returns the elementwise sum of \(D\)-shaped \(E\) and \(D\); else it returns a 1D tensor that is the concatenation of the two tensors, e.g.,

(5)#\[\begin{equation}
\text{If } D = \begin{bmatrix} 1 & -1 \\ \dots \end{bmatrix} \dots
\end{equation}\]

(6)#\[\begin{equation}
\text{If } D = \begin{bmatrix} 1 & -1 \\ \dots \end{bmatrix} \dots
\end{equation}\]


Section 2.4: GPUs#


By default, when we create a tensor it will not live on the GPU!
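The standard pattern for moving work to a GPU, falling back to the CPU when none is present:

```python
import torch

# Pick the GPU when one is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.ones(2, 2)      # created on the CPU by default
x_dev = x.to(device)      # .to() returns a copy on the target device

print(x.device, "->", x_dev.device)
```

Operations require their operands to live on the same device, so models and data are usually both moved with `.to(device)`.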


Discuss!

Try reducing the dimensions of the tensors and increasing the number of iterations. You can get to a point where the CPU-only function is faster than the GPU function. Why might this be?


Section 2.5: Datasets and Dataloaders#


When training neural network models you will be working with large amounts of data. Fortunately, PyTorch offers some great tools that help you organize and manipulate your data samples.


Video 8: Train and Test#


Training and Test Datasets

When loading a dataset, you can specify whether you want the training or the test samples using the train argument. We can load the training and test datasets separately. For simplicity, today we will not use both datasets separately, but this topic will be addressed in the next days.


Dataloader

Another important concept is the Dataloader. It is a wrapper around the Dataset that splits it into minibatches (important for training the neural network) and makes the data iterable. The shuffle argument is used to shuffle the order of the samples across the minibatches.
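A self-contained sketch using a toy tensor dataset in place of the image data (sizes are illustrative):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# A toy dataset: 100 samples with 3 features each, and binary labels
features = torch.randn(100, 3)
labels = torch.randint(0, 2, (100,))
dataset = TensorDataset(features, labels)

# Wrap it: 32-sample minibatches, order reshuffled every epoch
loader = DataLoader(dataset, batch_size=32, shuffle=True)

for batch_x, batch_y in loader:
    print(batch_x.shape, batch_y.shape)  # the last batch may be smaller
    break
```

Since 100 is not divisible by 32, the final minibatch has only 4 samples; pass `drop_last=True` if you want to discard it instead.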


Video 9: Data Augmentation - Transformations

Transformations


Section 3: Neural Networks#



Generate sample data

Prepare Data for PyTorch


Section 3.2: Create a Simple Neural Network#


For this example we want to have a simple neural network consisting of 3 layers:
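The layer list itself is elided in the diff; a sketch of such a 3-layer network (the 2-in / 16-hidden / 2-out sizes are assumptions matching the 2D classification example) could be:

```python
import torch
import torch.nn as nn

# A 3-layer fully connected net; layer sizes here are illustrative.
class NaiveNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(2, 16),   # input layer: 2 features in, 16 hidden units
            nn.ReLU(),
            nn.Linear(16, 16),  # hidden layer
            nn.ReLU(),
            nn.Linear(16, 2),   # output layer: one score per class
        )

    def forward(self, x):
        return self.layers(x)

model = NaiveNet()
out = model(torch.randn(5, 2))
print(out.shape)
```

Subclassing `nn.Module` and defining `forward` is all PyTorch needs; parameters of the inner layers are registered automatically.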


Section 3.3: Train Your Neural Network#


Now it is time to train your network on your dataset. Don’t worry if you don’t fully understand everything yet; we will cover training in much more detail in the next days. For now, the goal is just to see your network in action!

You will usually implement the train method directly when implementing your class NaiveNet. Here, we will implement it as a function outside of the class in order to have it in a separate cell.
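A compact sketch of such a standalone train function on toy data (the model, loss, and optimizer choices here are illustrative, not the tutorial's exact ones):

```python
import torch
import torch.nn as nn

def train(model, X, y, epochs=200, lr=0.1):
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    losses = []
    for _ in range(epochs):
        optimizer.zero_grad()        # clear gradients from the last step
        loss = loss_fn(model(X), y)  # forward pass + loss
        loss.backward()              # backpropagate
        optimizer.step()             # update the parameters
        losses.append(loss.item())
    return losses

torch.manual_seed(0)
X = torch.randn(64, 2)
y = (X[:, 0] > 0).long()             # toy labels: sign of the first feature
model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
losses = train(model, X, y)
print(losses[0], "->", losses[-1])
```

The zero_grad / backward / step triple is the skeleton of essentially every PyTorch training loop.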


Helper function to plot the decision boundary

Plot the loss during training




Visualize the training process#


Video 14: XOR Widget#


The exclusive OR (XOR) logical operation gives a true (1) output when the number of true inputs is odd. That is, a true output results if one, and only one, of the inputs is true. If both inputs are false (0) or both are true (1), a false output results. Mathematically speaking, XOR represents the inequality function, i.e., the output is true if the inputs are not alike; otherwise, the output is false.

For two inputs (\(X\) and \(Y\)), the following truth table applies:

(7)#\[\begin{matrix}
X & Y & \text{XOR} \\
\hline
0 & 0 & 0 \\
0 & 1 & 1 \\
1 & 0 & 1 \\
1 & 1 & 0
\end{matrix}\]

Interactive Demo 3.3: Solving XOR. One solution, where \(f(x)\) is ReLU, is:

(8)#\[\begin{equation}
y = f(x_1) + f(x_2) - f(x_1 + x_2)
\end{equation}\]

Once you have played enough, try to set the weights and biases to implement this function. :)

Play with the parameters to solve XOR


Do you think we can solve the discrete XOR (only 4 possibilities) with only 2 hidden units?
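The identity in Eq. (8) can be checked numerically over the four Boolean inputs. Note the encoding below (False as -1, True as +1) is an assumption of this sketch; with 0/1 inputs the expression is identically zero, so the widget presumably uses a shifted encoding:

```python
import torch

relu = torch.relu

def xor_net(x1, x2):
    # y = f(x1) + f(x2) - f(x1 + x2), with f = ReLU
    return relu(x1) + relu(x2) - relu(x1 + x2)

# Encode False as -1 and True as +1 (an assumption of this sketch).
# The output is 1 exactly when the inputs differ, i.e., XOR.
for a in (-1.0, 1.0):
    for b in (-1.0, 1.0):
        y = xor_net(torch.tensor(a), torch.tensor(b))
        print(a, b, "->", y.item())
```

Working through the four cases by hand is a good way to see why the subtraction of \(f(x_1 + x_2)\) cancels the "both true" case.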


Section 4: Ethics And Course Info#


Video 16: Be a group#


Video 17: Syllabus#



Define Visualization using Altair

Edit the AUTHOR

Install and import feedback gadget#


tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial1.html

Tutorial Objectives

Section 0: Introduction#

Video 0: Introduction#



Section 1.1: Gradients & Steepest Ascent#


Before introducing the gradient descent algorithm, let’s review a very important property of gradients. The gradient of a function always points in the direction of the steepest ascent. The following exercise will help clarify this.

Analytical Exercise 1.1: Gradient vector (Optional)#

Given the following function:

(9)#\[\begin{equation}
z = h(x, y) = \sin(x^2 + y^2)
\end{equation}\]

find the gradient vector:

(10)#\[\begin{equation}
\begin{bmatrix} \dfrac{\partial z}{\partial x} \\ \\ \dfrac{\partial z}{\partial y} \end{bmatrix}
\end{equation}\]

Hint: Use the chain rule!

Chain rule: For a composite function \(F(x) = g(h(x)) \equiv (g \circ h)(x)\):

(11)#\[\begin{equation}
F'(x) = g'(h(x)) \cdot h'(x)
\end{equation}\]

or differently denoted:

(12)#\[\begin{equation}
\frac{dF}{dx} = \frac{dg}{dh} ~ \frac{dh}{dx}
\end{equation}\]
Click here for the solution

We can rewrite the function as a composite function:

(13)#\[\begin{equation}
z = f\left( g(x,y) \right), ~~ f(u) = \sin(u), ~~ g(x, y) = x^2 + y^2
\end{equation}\]

Using the chain rule:

(14)#\[\begin{align}
\dfrac{\partial z}{\partial x} &= \dfrac{\partial f}{\partial g} \dfrac{\partial g}{\partial x} = \cos(g(x,y)) ~ (2x) = \cos(x^2 + y^2) \cdot 2x \\ \\
\dfrac{\partial z}{\partial y} &= \dfrac{\partial f}{\partial g} \dfrac{\partial g}{\partial y} = \cos(g(x,y)) ~ (2y) = \cos(x^2 + y^2) \cdot 2y
\end{align}\]

Video 2: Gradient Descent - Discussion#



Section 1.2: Gradient Descent Algorithm#

Let \(f(\mathbf{w}): \mathbb{R}^d \rightarrow \mathbb{R}\) be a differentiable function. Gradient Descent is an iterative algorithm for minimizing the function \(f\), starting with an initial value for variables \(\mathbf{w}\), taking steps of size \(\eta\) (learning rate) in the direction of the negative gradient at the current point to update the variables \(\mathbf{w}\).

(15)#\[\begin{equation}
\mathbf{w}^{(t+1)} = \mathbf{w}^{(t)} - \eta \nabla f \left( \mathbf{w}^{(t)} \right)
\end{equation}\]

where \(\eta > 0\) and \(\nabla f (\mathbf{w})= \left( \frac{\partial f(\mathbf{w})}{\partial w_1}, ..., \frac{\partial f(\mathbf{w})}{\partial w_d} \right)\). Since negative gradients always point locally in the direction of steepest descent, the algorithm makes small steps at each point towards the minimum.
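The update rule is easy to see in action on a one-dimensional example. Minimizing \(f(w) = (w-3)^2\), whose gradient is \(2(w-3)\):

```python
# Gradient descent on f(w) = (w - 3)^2, a toy 1-D example.
eta = 0.1   # learning rate
w = 0.0     # initial value

for _ in range(100):
    grad = 2 * (w - 3)   # analytical gradient of f at the current w
    w = w - eta * grad   # the update rule from Eq. (15)

print(w)  # converges toward the minimum at w = 3
```

Try `eta = 1.1` instead: the iterates diverge, previewing the learning-rate discussion later in this tutorial.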


Hence, all we need is to calculate the gradient of the loss function with respect to the learnable parameters (i.e., weights):

(16)#\[\begin{equation}
\dfrac{\partial Loss}{\partial \mathbf{w}} = \left[ \dfrac{\partial Loss}{\partial w_1}, \dfrac{\partial Loss}{\partial w_2}, \dots, \dfrac{\partial Loss}{\partial w_d} \right]^{\top}
\end{equation}\]


Section 1.3: Computational Graphs and Backprop#


Exercise 1.2 is an example of how overwhelming the derivation of gradients can get, as the number of variables and nested functions increases. This function is still extraordinarily simple compared to the loss functions of modern neural networks. So how can we (as well as PyTorch and similar frameworks) approach such beasts?

Let’s look at the function again:

(17)#\[\begin{equation}
f(x, y, z) = \tanh \left( \ln \left[ 1 + z \frac{2x}{\sin(y)} \right] \right)
\end{equation}\]

We can build a so-called computational graph (shown below) to break the original function into smaller and more approachable expressions.


We start from the output \(f\) and work our way against the arrows, calculating the gradient of each expression as we go. This is called the backward pass, from which the backpropagation of errors algorithm gets its name.

Computation Graph full

By breaking the computation into simple operations on intermediate variables, we can use the chain rule to calculate any gradient:

(18)#\[\begin{equation}
\dfrac{\partial f}{\partial x} = \dfrac{\partial f}{\partial e}~\dfrac{\partial e}{\partial d}~\dfrac{\partial d}{\partial c}~\dfrac{\partial c}{\partial a}~\dfrac{\partial a}{\partial x} = \left( 1-\tanh^2(e) \right) \cdot \frac{1}{d+1} \cdot z \cdot \frac{1}{b} \cdot 2
\end{equation}\]

Conveniently, the values for \(e\), \(b\), and \(d\) are available to us from when we did the forward pass through the graph. That is, the partial derivatives have simple expressions in terms of the intermediate variables \(a,b,c,d,e\) that we calculated and stored during the forward pass.


Analytical Exercise 1.3: Chain Rule (Optional). Calculate \(\dfrac{\partial f}{\partial y}\) using the computational graph and the chain rule.

Click here for the solution

(19)#\[\begin{equation}
\dfrac{\partial f}{\partial y} = \dfrac{\partial f}{\partial e}~\dfrac{\partial e}{\partial d}~\dfrac{\partial d}{\partial c}~\dfrac{\partial c}{\partial b}~\dfrac{\partial b}{\partial y} = \left( 1-\tanh^2(e) \right) \cdot \frac{1}{d+1} \cdot z \cdot \frac{-a}{b^2} \cdot \cos(y)
\end{equation}\]

For more: Calculus on Computational Graphs: Backpropagation



Section 2: PyTorch AutoGrad#


Deep learning frameworks such as PyTorch, JAX, and TensorFlow come with a very efficient and sophisticated set of algorithms, commonly known as Automatic Differentiation. AutoGrad is PyTorch’s automatic differentiation engine. Here we start by covering the essentials of AutoGrad, and you will learn more in the coming days.
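The essentials in a few lines: mark a tensor with `requires_grad=True`, build an expression, and call `.backward()` to get the gradient:

```python
import torch

# requires_grad=True tells AutoGrad to record operations on this tensor
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2 * x   # the forward pass builds the computational graph

y.backward()         # backward pass: compute dy/dx

print(x.grad)        # dy/dx = 2x + 2 = 8 at x = 3
```

This is exactly the computational-graph machinery from Section 1.3, built and differentiated for us automatically.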


Section 2.2: Backward Propagation#

Gradient function = <AddBackward0 object at 0x7f8a7bb7ba90>

Section 3: PyTorch’s Neural Net module (nn module)#


PyTorch provides us with ready-to-use neural network building blocks, such as layers (e.g., linear, recurrent, etc.), different activation and loss functions, and much more, packed in the torch.nn module. If we build a neural network using torch.nn layers, the weights and biases are already in requires_grad mode and will be registered as model parameters.

For training, we need three things:
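The list itself is elided in the diff; conventionally the three ingredients are a model, a loss function, and an optimizer (an assumption here, but the standard PyTorch recipe). A minimal sketch:

```python
import torch
import torch.nn as nn

# 1. A model built from torch.nn layers (parameters auto-registered)
model = nn.Linear(3, 1)

# 2. A loss function
loss_fn = nn.MSELoss()

# 3. An optimizer that will update the model's parameters
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# One training step on random data ties the three together
X, y = torch.randn(8, 3), torch.randn(8, 1)
before = model.weight.detach().clone()
optimizer.zero_grad()
loss = loss_fn(model(X), y)
loss.backward()
optimizer.step()
print("weights changed:", not torch.equal(before, model.weight))
```

Everything else in a training loop is repetition of this single step over minibatches and epochs.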


Generate the sample dataset

Let’s define a very wide (512 neurons) neural net with one hidden layer and nn.Tanh() activation function.
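A sketch of that network (the 1-in / 1-out dimensions are an assumption based on the 1-D regression example):

```python
import torch
import torch.nn as nn

# One hidden layer, 512 units, Tanh activation
wide_net = nn.Sequential(
    nn.Linear(1, 512),
    nn.Tanh(),
    nn.Linear(512, 1),
)

out = wide_net(torch.randn(10, 1))
print(out.shape)
```

`nn.Sequential` simply chains the modules, so the forward pass is layer-by-layer application in order.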


Summary#


tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial2.html

Tutorial Objectives

Section 1: A Shallow Narrow Linear Neural Network#

Video 1: Shallow Narrow Linear Net#



Coding Exercise 1.1: Implement a simple narrow LNN (Optional). Implement the forward pass for our model from scratch, without using PyTorch.

Also, although our model gets a single input feature and outputs a single prediction, we could calculate the loss and perform training for multiple samples at once. This is the common practice for neural networks, since computers are incredibly fast doing matrix (or tensor) operations on batches of data, rather than processing samples one at a time through for loops. Therefore, for the loss function, please implement the mean squared error (MSE), and adjust your analytical gradients accordingly when implementing the dloss_dw function.

Finally, complete the train function for the gradient descent algorithm:

(20)#\[\begin{equation}
\mathbf{w}^{(t+1)} = \mathbf{w}^{(t)} - \eta \nabla loss (\mathbf{w}^{(t)})
\end{equation}\]

Section 1.2: Learning landscapes#


As you may have already asked yourself, we can analytically find \(w_1\) and \(w_2\) without using gradient descent:

(21)#\[\begin{equation}
w_1 \cdot w_2 = \dfrac{y}{x}
\end{equation}\]

In fact, we can plot the gradients, the loss function and all the possible solutions in one figure. In this example, we use the \(y = 1x\) mapping:


Here, we also visualize the loss landscape in a 3-D plot, with two training trajectories for different initial conditions.



Section 2.1: The effect of depth#


Why might depth be useful? What makes a network or learning system “deep”? The reality is that shallow neural nets are often incapable of learning complex functions due to data limitations. On the other hand, depth seems like magic. Depth can change the functions a network can represent, the way a network learns, and how a network generalizes to unseen data.

So let’s look at the challenges that depth poses in training a neural network. Imagine a single input, single output linear network with 50 hidden layers and only one neuron per layer (i.e. a narrow deep neural network). The output of the network is easy to calculate:
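Since each layer has a single weight, the network's output is just the product of all 50 weights times the input, which collapses or blows up unless the weights sit very close to 1:

```python
# Output scale of a 50-layer, one-neuron-per-layer linear net:
# the product of its 50 weights (here taken identical for simplicity).
depth = 50
for w in (0.9, 1.0, 1.1):
    w_tot = w ** depth
    print(w, "->", w_tot)
```

Weights of 0.9 shrink the signal by a factor of roughly 200 (vanishing), while 1.1 amplifies it by more than 100 (exploding); the same multiplicative effect hits the gradients during the backward pass.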

Interactive Demo 2.1: Depth widget

Video 5: Effect of Depth - Discussion#



Section 2.2: Choosing a learning rate#


Interactive Demo 2.2: Learning rate widget

Video 7: Learning Rate - Discussion#



Section 2.3: Depth vs Learning Rate#


Interactive Demo 2.3: Depth and Learning Rate

Video 9: Depth and Learning Rate - Discussion#



Section 2.4: Why initialization is important#


We’ve seen, even in the simplest of cases, that depth can slow learning. Why? From the chain rule, gradients are multiplied by the current weight at each layer, so the product can vanish or explode. Therefore, weight initialization is a fundamentally important hyperparameter.

Although in practice initial values for learnable parameters are often sampled from \(\mathcal{Uniform}\) or \(\mathcal{Normal}\) probability distributions, here we use a single value for all the parameters.



Summary#


Hyperparameter interaction

tutorials/W1D2_LinearDeepLearning/student/W1D2_Tutorial3.html

Tutorial Objectives

Section 0: Prelude

Coding Exercise 0: Re-initialization (Optional)#

Complete the function ex_initializer_, such that the weights are sampled from the following distribution:

(22)#\[\begin{equation}
\mathcal{N}\left(\mu=0, ~~\sigma=\gamma \sqrt{\dfrac{1}{n_{in} + n_{out}}} \right)
\end{equation}\]

where \(\gamma\) is the initialization scale, and \(n_{in}\) and \(n_{out}\) are respectively the input and output dimensions of the layer. The underscore (“_”) in ex_initializer_ and other function names denotes an “in-place” operation.
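One possible sketch of such a function (iterating over Linear layers and using `nn.init.normal_` is an implementation choice of this sketch, not necessarily the tutorial's):

```python
import math
import torch
import torch.nn as nn

def ex_initializer_(model, gamma):
    """In-place: resample every Linear layer's weights from
    N(0, gamma * sqrt(1 / (n_in + n_out)))."""
    for module in model.modules():
        if isinstance(module, nn.Linear):
            n_in, n_out = module.in_features, module.out_features
            sigma = gamma * math.sqrt(1.0 / (n_in + n_out))
            nn.init.normal_(module.weight, mean=0.0, std=sigma)

net = nn.Sequential(nn.Linear(100, 50), nn.Linear(50, 10))
ex_initializer_(net, gamma=1.0)
print(net[0].weight.std())  # empirical std, close to sqrt(1/150)
```

With \(\gamma = 1\) this is essentially Xavier (Glorot) normal initialization; making \(\gamma\) tiny produces the small-initialization regime studied in this tutorial.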



Section 1: Deep Linear Neural Nets#


So far, depth just seems to slow down the learning. And we know that a single nonlinear hidden layer (given enough number of neurons and infinite training samples) has the potential to approximate any function. So it seems fair to ask: What is depth good for?

One reason can be that shallow nonlinear neural networks hardly meet their true potential in practice. In contrast, deep neural nets are often surprisingly powerful in learning complex functions without sacrificing generalization. A core intuition behind deep learning is that deep nets derive their power through learning internal representations. How does this work? To address representation learning, we have to go beyond the 1D chain.


Run to generate and visualize training samples from tree

Think!


Make sure you execute this cell to enable the widget!


Section 2: Singular Value Decomposition (SVD)#


In this section, we intend to study the learning (training) dynamics we just saw. First, we should know that a linear neural network is performing sequential matrix multiplications, which can be simplified to:

(23)#\[\begin{align}
\mathbf{y} &= \mathbf{W}_{L}~\mathbf{W}_{L-1}~\dots~\mathbf{W}_{1} ~ \mathbf{x} \\
&= \left(\prod_{i=1}^{L}{\mathbf{W}_{i}}\right) ~ \mathbf{x} \\
&= \mathbf{W}_{tot} ~ \mathbf{x}
\end{align}\]

Any matrix \(A\) (yes, ANY) can be decomposed (factorized) into 3 matrices:

(24)#\[\begin{equation}
\mathbf{A} = \mathbf{U} \mathbf{\Sigma} \mathbf{V}^{\top}
\end{equation}\]

where \(U\) is an orthogonal matrix, \(\Sigma\) is a diagonal matrix, and \(V\) is again an orthogonal matrix. The diagonal elements of \(\Sigma\) are called singular values.
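The factorization is one call in PyTorch, and multiplying the three factors back together recovers the original matrix:

```python
import torch

torch.manual_seed(0)
A = torch.randn(4, 3)

# A = U @ diag(S) @ Vh; S holds the singular values in descending order
U, S, Vh = torch.linalg.svd(A, full_matrices=False)

A_rec = U @ torch.diag(S) @ Vh   # reconstruct A from the three factors
print(S)
print(torch.allclose(A, A_rec, atol=1e-5))
```

Truncating `S` to its top-k entries (zeroing the rest) before the reconstruction gives the best rank-k approximation of `A`, which is the lens used below to split the deep net into independent "modes".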


Make sure you execute this cell to train the network and plot#


Think!


Video 3: SVD - Discussion#



Section 3: Representational Similarity Analysis (RSA)#

Video 4: RSA#


The previous section ended with an interesting remark. SVD helped to break our deep “wide” linear neural net into 8 deep “narrow” linear neural nets.

The first narrow net (highest singular value) converges fastest, while the last four narrow nets converge almost simultaneously and have the smallest singular values. Can it be that the narrow net with the larger mode is learning the difference between “living things” and “objects”, while another narrow net with a smaller mode is learning the difference between Fish and Birds? How could we check this hypothesis?

Representational Similarity Analysis (RSA) is an approach that could help us understand the internal representation of our network. The main idea is that the activity of hidden units (neurons) in the network must be similar when the network is presented with similar input. For our dataset (hierarchically structured data), we expect the activity of neurons in the hidden layer to be more similar for Tuna and Canary, and less similar for Tuna and Oak.

As a similarity measure, we can use the good old dot (scalar) product (which, for normalized vectors, is the cosine similarity). For calculating the dot product between multiple vectors (which would be our case), we can simply use matrix multiplication. Therefore, the Representational Similarity Matrix for multiple-input (batch) activity can be calculated as follows:

(25)#\[\begin{equation}
RSM = \mathbf{H} \mathbf{H}^{\top}
\end{equation}\]

where \(\mathbf{H} = \mathbf{X} \mathbf{W_1}\) is the activity of hidden neurons for a given batch \(\mathbf{X}\).
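In code this is two matrix multiplications (sizes below are illustrative):

```python
import torch

torch.manual_seed(0)
X = torch.randn(8, 5)    # a batch of 8 inputs with 5 features each
W1 = torch.randn(5, 16)  # first-layer weights (16 hidden units)

H = X @ W1               # hidden-layer activity: one row per input
RSM = H @ H.T            # Eq. (25): pairwise similarity of the rows

print(RSM.shape)         # one similarity entry per pair of inputs
```

`RSM[i, j]` is the dot product between the hidden representations of inputs i and j, so the matrix is symmetric with the activity norms on its diagonal.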


Make sure you execute this cell to enable widgets

Let’s take a moment to analyze this more. A deep neural net is learning representations, rather than a naive mapping (look-up table). This is thought to be the reason for deep neural nets’ superior generalization and transfer-learning ability. Unsurprisingly, neural nets with no hidden layer are incapable of representation learning, even with extremely small initialization.


Video 5: RSA - Discussion#



Section 4: Illusory Correlations#


Let’s recall the training loss curves. There was often a long plateau (where the weights are stuck at a saddle point), followed by a sudden drop. For very deep complex neural nets, such plateaus can take hours of training, and we are often tempted to stop the training, because we believe it is “as good as it gets”! Another side effect of “immature interruption” of training is the network finding (learning) illusory correlations.

To better understand this, let’s do the next demonstration and exercise.

Demonstration: Illusory Correlations

You can see the new feature shown in the last column of the plot above.

Make sure you execute this cell to train the network and plot

It seems that the network starts by learning an “illusory correlation” that sharks have bones, and in later epochs, as it learns deeper representations, it can see (learn) beyond the illusory correlation. It is important to remember that we never presented the network with any data saying that sharks have bones.

Exercise 4: Illusory Correlations

Make sure you execute this cell to train the network and plot


Video 7: Illusory Correlations - Discussion#



Summary#



Bonus#

Video 9: Linear Regression#



Section 5.1: Linear Regression#

The independent variables are collected in a vector \(\mathbf{x} \in \mathbb{R}^M\), where \(M\) denotes the number of independent variables, while the dependent variables are collected in a vector \(\mathbf{y} \in \mathbb{R}^N\), where \(N\) denotes the number of dependent variables. The mapping between them is represented by the weight matrix \(\mathbf{W} \in \mathbb{R}^{N \times M}\) and a bias vector \(\mathbf{b} \in \mathbb{R}^{N}\) (generalizing to affine mappings).

The multivariate regression model can be written as:

(26)#\[\begin{equation}
\mathbf{y} = \mathbf{W} ~ \mathbf{x} + \mathbf{b}
\end{equation}\]

or it can be written in matrix format as:

(27)#\[\begin{equation}
\begin{bmatrix} y_{1} \\ y_{2} \\ \vdots \\ y_{N} \end{bmatrix} =
\begin{bmatrix}
w_{1,1} & w_{1,2} & \dots & w_{1,M} \\
w_{2,1} & w_{2,2} & \dots & w_{2,M} \\
\vdots & \ddots & \ddots & \vdots \\
w_{N,1} & w_{N,2} & \dots & w_{N,M}
\end{bmatrix}
\begin{bmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{M} \end{bmatrix} +
\begin{bmatrix} b_{1} \\ b_{2} \\ \vdots \\ b_{N} \end{bmatrix}
\end{equation}\]

Section 5.2: Vectorized regression#

Linear regression can be simply extended to a multi-sample (\(D\)) input-output mapping, which we can collect in a matrix \(\mathbf{X} \in \mathbb{R}^{M \times D}\), sometimes called the design matrix. The sample dimension also shows up in the output matrix \(\mathbf{Y} \in \mathbb{R}^{N \times D}\). Thus, linear regression takes the following form:

(28)#\[\begin{equation} \mathbf{Y} = \mathbf{W} ~ \mathbf{X} + \mathbf{b} \end{equation}\]

where matrix \(\mathbf{W} \in \mathbb{R}^{N \times M}\) and the vector \(\mathbf{b} \in \mathbb{R}^{N}\) (broadcasted over sample dimension) are the desired parameters to find.

@@ -3106,30 +3106,30 @@

Section 5.2: Vectorized regression#

Linear regression is a relatively simple optimization problem. Unlike most other models that we will see in this course, linear regression for mean squared loss can be solved analytically.

For \(D\) samples (batch size), \(\mathbf{X} \in \mathbb{R}^{M \times D}\), and \(\mathbf{Y} \in \mathbb{R}^{N \times D}\), the goal of linear regression is to find \(\mathbf{W} \in \mathbb{R}^{N \times M}\) such that:

(29)#\[\begin{equation} \mathbf{Y} = \mathbf{W} ~ \mathbf{X} \end{equation}\]

Given the Squared Error loss function, we have:

(30)#\[\begin{equation} Loss(\mathbf{W}) = ||\mathbf{Y} - \mathbf{W} ~ \mathbf{X}||^2 \end{equation}\]

So, using matrix notation, the optimization problem is given by:

(31)#\[\begin{align} \mathbf{W^{*}} &= \underset{\mathbf{W}}{\mathrm{argmin}} \left( Loss (\mathbf{W}) \right) \\ &= \underset{\mathbf{W}}{\mathrm{argmin}} \left( ||\mathbf{Y} - \mathbf{W} ~ \mathbf{X}||^2 \right) \\ &= \underset{\mathbf{W}}{\mathrm{argmin}} \left( \left( \mathbf{Y} - \mathbf{W} ~ \mathbf{X}\right)^{\top} \left( \mathbf{Y} - \mathbf{W} ~ \mathbf{X}\right) \right) \end{align}\]

To solve the minimization problem, we can simply set the derivative of the loss with respect to \(\mathbf{W}\) to zero.

(32)#\[\begin{equation} \dfrac{\partial Loss}{\partial \mathbf{W}} = 0 \end{equation}\]

Assuming that \(\mathbf{X}\mathbf{X}^{\top}\) is full-rank, and thus invertible, we can write:

(33)#\[\begin{equation} \mathbf{W}^{\mathbf{*}} = \mathbf{Y} \mathbf{X}^{\top} \left( \mathbf{X} \mathbf{X}^{\top} \right) ^{-1} \end{equation}\]
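A quick NumPy check of the closed-form solution above (a sketch, not part of the tutorial code; shapes follow the text's conventions, with samples as the columns of \(\mathbf{X}\)):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, D = 3, 2, 100                      # inputs, outputs, samples

W_true = rng.normal(size=(N, M))
X = rng.normal(size=(M, D))              # design matrix, one sample per column
Y = W_true @ X                           # noiseless targets

# Closed-form least squares: W* = Y X^T (X X^T)^(-1)
W_star = Y @ X.T @ np.linalg.inv(X @ X.T)

# with noiseless data the true weights are recovered exactly
assert np.allclose(W_star, W_true)
```

In practice one would use `np.linalg.lstsq` rather than an explicit inverse, but the explicit form mirrors the equation.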

@@ -3199,7 +3199,7 @@

Submit your feedback -

+
@@ -3338,7 +3338,7 @@

Demonstration: Linear Regression vs. DLNN -../../../_images/7dab5cd492e47eabc79f18cfc91295b3a6fba687cd26f117f1d712277ef030da.png +../../../_images/378be3d26490c74e544d93d36b6495079af1f963fbab130eeb7397e22381bc71.png
@@ -3359,7 +3359,7 @@

Demonstration: Linear Regression vs. DLNN#

-
+
@@ -3378,7 +3378,7 @@

Submit your feedback - +

diff --git a/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial1.html b/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial1.html index 2f9b77b56..0d6b76edd 100644 --- a/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial1.html +++ b/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial1.html @@ -50,7 +50,7 @@ - + @@ -1040,7 +1040,7 @@

Tutorial objectives
-
+
@@ -1333,7 +1333,7 @@

Section 0: Introduction to MLPs#

-
+
@@ -1352,7 +1352,7 @@

Submit your feedback
-
+

@@ -1364,7 +1364,7 @@

Section 1: The Need for MLPs#

-
+
@@ -1383,7 +1383,7 @@

Submit your feedback - +

@@ -1403,8 +1403,8 @@

Coding Exercise 1: Function approximation with ReLU

These are the points we will use to learn how to approximate the function. We have 10 training data points, so we will have 9 ReLUs (we don’t need a ReLU for the last data point, as there is nothing to the right of it to model).

We first need to figure out the bias term for each ReLU and compute the activation of each ReLU where:

(34)#\[\begin{equation} y(x) = \text{max}(0, x+b) \end{equation}\]

We then need to figure out the correct weights on each ReLU so the linear combination approximates the desired function.
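A minimal sketch of this idea, assuming the target is a smooth function such as `np.sin` and the ReLU kinks sit at the training points (variable names are illustrative, not the exercise's own):

```python
import numpy as np

f = np.sin
x_train = np.linspace(0, 2 * np.pi, 10)        # 10 data points -> 9 ReLUs
y_train = f(x_train)

# bias of ReLU i places its kink at x_train[i]: max(0, x + b_i) with b_i = -x_i
biases = -x_train[:-1]

# weight of ReLU i is the *change* in slope at its kink,
# so the summed slopes match each segment's slope
slopes = np.diff(y_train) / np.diff(x_train)
weights = np.diff(slopes, prepend=0.0)

def approx(x):
    relus = np.maximum(0.0, x[:, None] + biases[None, :])
    return y_train[0] + relus @ weights

# the piecewise-linear combination passes through every training point
assert np.allclose(approx(x_train), y_train)
```

Between the training points the approximation is linear, so the error shrinks as more points (and ReLUs) are added.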

@@ -1504,7 +1504,7 @@

Submit your feedback -

+

@@ -1517,7 +1517,7 @@

Section 2: MLPs in PyTorch#

-
+
@@ -1536,7 +1536,7 @@

Submit your feedback - +

In the previous segment, we implemented a function to approximate any smooth function using MLPs. We saw that, using Lipschitz continuity, we can prove that our approximation is mathematically correct. MLPs are fascinating, but before we get into the details of designing them, let’s familiarize ourselves with some basic terminology of MLPs: layer, neuron, depth, width, weight, bias, and activation function. Armed with these ideas, we can now design an MLP given its input, hidden layers, and output size.

@@ -1549,8 +1549,8 @@

Coding Exercise 2: Implement a general-purpose MLP in PytorchLeaky ReLU) in all hidden layers

Leaky ReLU is described by the following mathematical formula:

(35)#\[\begin{align} \text{LeakyReLU}(x) &= \text{max}(0,x) + \text{negative_slope} \cdot \text{min}(0, x) \\ &= \left\{ \begin{array}{ll} x & \text{for } x \geq 0 \\ \text{negative_slope} \cdot x & \text{for } x < 0 \end{array} \right. \end{align}\]
@@ -1664,7 +1664,7 @@
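A one-line NumPy sketch of the Leaky ReLU formula (illustrative; PyTorch’s `nn.LeakyReLU` implements the same rule):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # max(0, x) + negative_slope * min(0, x)
    return np.maximum(0.0, x) + negative_slope * np.minimum(0.0, x)

x = np.array([-2.0, 0.0, 3.0])
assert np.allclose(leaky_relu(x), [-0.02, 0.0, 3.0])
```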

Submit your feedback -

+
@@ -1674,7 +1674,7 @@

Section 2.1: Classification with MLPs#

-
+
@@ -1693,7 +1693,7 @@

Submit your feedback - +

The main loss function we could use out of the box for multi-class classification for N samples and C number of classes is:

To get the CrossEntropyLoss of a sample \(i\), we could first calculate \(-\log(\text{softmax}(x))\) and then take the element corresponding to \(\text{labels}_i\) as the loss. However, for numerical stability, we implement the following equivalent but more stable form,

(36)#\[\begin{equation} \operatorname{loss}(x_i, \text {labels}_i)=-\log \left(\frac{\exp (x[\text {labels}_i])}{\sum_{j} \exp (x[j])}\right)=-x_i[\text {labels}_i]+\log \left(\sum_{j=1}^C \exp (x_i[j])\right) \end{equation}\]

@@ -1714,8 +1714,8 @@

Coding Exercise 2.1: Implement Batch Cross Entropy Loss

A batch of labels with shape (N, ) that ranges from 0 to C-1

Returns the average loss \(L\) calculated according to:

(37)#\[\begin{align} \text{loss}(x_i, \text {labels}_i) &= -x_i[\text {labels}_i]+\log \left(\sum_{j=1}^C \exp (x_i[j])\right) \\ L &= \frac{1}{N} \sum_{i=1}^{N}{\text{loss}(x_i, \text {labels}_i)} \end{align}\]
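A NumPy sketch of the batch loss above, using the standard max-subtraction trick so the log-sum-exp stays stable (variable names are illustrative, not the exercise's solution):

```python
import numpy as np

def batch_cross_entropy(x, labels):
    """x: (N, C) logits; labels: (N,) integer class indices in [0, C)."""
    # stable log-sum-exp: subtract the row-wise max before exponentiating
    m = x.max(axis=1, keepdims=True)
    log_sum_exp = m.squeeze(1) + np.log(np.exp(x - m).sum(axis=1))
    # loss_i = -x_i[labels_i] + log(sum_j exp(x_i[j]))
    per_sample = -x[np.arange(len(labels)), labels] + log_sum_exp
    return per_sample.mean()

logits = np.array([[2.0, 0.5, -1.0],
                   [0.0, 0.0, 0.0]])
loss = batch_cross_entropy(logits, np.array([0, 2]))
```

With uniform logits (second row) the per-sample loss is exactly \(\log C\), a handy sanity check for any implementation.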
@@ -1793,7 +1793,7 @@

Submit your feedback -

+ @@ -1802,8 +1802,8 @@

Submit your feedback#

Before we could start optimizing these loss functions, we need a dataset!

Let’s turn this fancy-looking equation into a classification dataset

(38)#\[\begin{equation} \begin{array}{c} X_{k}(t)=t\left(\begin{array}{c} \sin \left[\frac{2 \pi}{K}\left(2 t+k-1\right)\right]+\mathcal{N}\left(0, \sigma\right) \\ \cos \left[\frac{2 \pi}{K}\left(2 t+k-1\right)\right]+\mathcal{N}\left(0, \sigma\right) \end{array}\right) \end{array} \end{equation}\]
@@ -1864,7 +1864,7 @@

Section 2.2: Spiral Classification Dataset
Random seed 2021 has been set.
 

-../../../_images/338911a3d51a8d2332a6edb77e91f79d860a36bea2c7a8ca6c0e1c78c16ea3aa.png +../../../_images/85460b363e5ef3fea7e7fa140a35f2f2f1cfe37ab393709587f3156a10f3cd53.png
@@ -1874,7 +1874,7 @@

Section 2.3: Training and Evaluation#

-
+
@@ -1893,7 +1893,7 @@

Submit your feedback - +

@@ -1985,7 +1985,7 @@

Submit your feedback - +

And we need to make a Pytorch data loader out of it. Data loading in PyTorch can be separated in 2 parts:

@@ -2279,7 +2279,7 @@

Bonus: Neuron Physiology and Motivation to Deep Learning#

-
+
@@ -2298,23 +2298,23 @@

Submit your feedback - +

Leaky Integrate-and-fire (LIF) neuronal model#

The basic idea of the LIF neuron was proposed in 1907 by Louis Édouard Lapicque, long before we understood the electrophysiology of a neuron (see a translation of Lapicque’s paper). More details of the model can be found in the book Theoretical Neuroscience by Peter Dayan and Laurence F. Abbott.

The model dynamics are defined with the following formula,

(39)#\[\begin{equation} \frac{d V_m}{d t}=\left\{\begin{array}{cc} \frac{1}{C_m}\left(-\frac{V_m}{R_m} + I \right) & t>t_{rest} \\ 0 & \text { otherwise } \end{array}\right. \end{equation}\]

Note that \(V_{m}\), \(C_{m}\), and \(R_{m}\) are the membrane voltage, capacitance, and resistance of the neuron, respectively, so \(-\frac{V_{m}}{R_{m}}\) denotes the leakage current. When \(I\) is sufficiently strong that \(V_{m}\) reaches a certain threshold value \(V_{\rm th}\), it momentarily spikes and then \(V_{m}\) is reset to \(V_{\rm reset}< V_{\rm th}\), and the voltage stays at \(V_{\rm reset}\) for \(\tau_{\rm ref}\) ms, mimicking the refractoriness of the neuron during an action potential (note that \(V_{\rm reset}\) and \(\tau_{\rm ref}\) are assumed to be zero in the lecture):

(40)#\[\begin{eqnarray} V_{m}(t)=V_{\rm reset} \text{ for } t\in(t_{\text{sp}}, t_{\text{sp}} + \tau_{\text{ref}}] \end{eqnarray}\]

where \(t_{\rm sp}\) is the spike time when \(V_{m}(t)\) just exceeded \(V_{\rm th}\).

@@ -2331,8 +2331,8 @@

Leaky Integrate-and-fire (LIF) neuronal model#

The cell below defines a function for the LIF neuron model, with its arguments described.

Note that we will use Euler’s method to numerically approximate the derivative. Hence, we will use the following implementation of the model dynamics,

(41)#\[\begin{equation} V_m^{[n]}=\left\{\begin{array}{cc} V_m^{[n-1]} + \frac{1}{C_m}\left(-\frac{V_m^{[n-1]}}{R_m}+I \right) \Delta t & t>t_{rest} \\ 0 & \text { otherwise } \end{array}\right. \end{equation}\]
@@ -2408,7 +2408,7 @@
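A minimal Euler-update sketch of these dynamics (parameter values are arbitrary illustrations, not the tutorial's defaults):

```python
import numpy as np

def lif(I=3.0, C_m=1.0, R_m=1.0, V_th=1.0, V_reset=0.0, tau_ref=4.0,
        T=50.0, dt=0.1):
    """Euler simulation of an LIF neuron; returns voltage trace and spike count."""
    steps = int(T / dt)
    V = np.zeros(steps)
    t_rest = 0.0                     # end of the current refractory period
    spikes = 0
    for n in range(1, steps):
        t = n * dt
        if t > t_rest:               # integrate only outside refractoriness
            V[n] = V[n - 1] + (dt / C_m) * (-V[n - 1] / R_m + I)
            if V[n] >= V_th:         # threshold crossing: spike and reset
                spikes += 1
                V[n] = V_reset
                t_rest = t + tau_ref
    return V, spikes

V, n_spikes = lif()
```

With \(I R_m > V_{\rm th}\), as here, the neuron fires regularly; with a weaker input it settles below threshold and never spikes.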

Simulating an LIF Neuron -../../../_images/fdf6936a17ef4f3d30a069cae1cc47564e133fba51034ae49fd322bf2d5e90b1.png +../../../_images/5b7bee7b59e463ff8e6f7eb25a97d7e743499c4f94b358ed72743336ebe7dfcb.png

@@ -2471,7 +2471,7 @@

#
-
+

@@ -2496,7 +2496,7 @@

Submit your feedback - + diff --git a/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial2.html b/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial2.html index a9ee86bc7..a638278d1 100644 --- a/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial2.html +++ b/tutorials/W1D3_MultiLayerPerceptrons/student/W1D3_Tutorial2.html @@ -50,7 +50,7 @@ - + @@ -989,7 +989,7 @@

Tutorial Objectives
-
+
@@ -1626,7 +1626,7 @@

Section 1: Wider vs deeper networks#

-
+
@@ -1645,7 +1645,7 @@

Submit your feedback
-
+

@@ -1769,7 +1769,7 @@

Submit your feedback - +

@@ -1793,7 +1793,7 @@

Submit your feedback - + @@ -1801,8 +1801,8 @@

Submit your feedback#

Let’s use the same spiral dataset generated before, with two features. We then add more polynomial features (which makes the first layer wider) and finally train a single linear layer. We could use the same MLP network with no hidden layers (though it would not be called an MLP anymore!).

Note that we will add polynomial terms up to \(P=50\), which means that for every \(x_1^n x_2^m\) term, \(n+m\leq P\). It is a fun math exercise to prove why the total number of polynomial features up to degree \(P\) becomes:

(42)#\[\begin{equation} \text{# of terms} = \frac{(P+1)(P+2)}{2} \end{equation}\]

Also, we don’t need the polynomial term with degree zero (the constant term), since nn.Linear layers have bias terms. Therefore, we will have one fewer polynomial feature.
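A quick brute-force check of the counting formula (illustrative):

```python
P = 50

# count all monomials x1^n * x2^m with n + m <= P
n_terms = sum(1 for n in range(P + 1) for m in range(P + 1) if n + m <= P)
assert n_terms == (P + 1) * (P + 2) // 2          # 1326 for P = 50

# drop the degree-zero (constant) term, since nn.Linear already has a bias
n_features = n_terms - 1                          # 1325
```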

@@ -1909,11 +1909,11 @@

Section 1.1: Where Wide Fails
Random seed 2021 has been set.
 

- @@ -1953,7 +1953,7 @@

Section 2: Deeper MLPsVideo 2: Case study#

-
+
@@ -1972,7 +1972,7 @@

Submit your feedback - +

@@ -2098,7 +2098,7 @@

Coding Exercise 2: Dataloader on a real-world dataset
-../../../_images/6864477d9e8849db0385b53a73ba8cbce3690f54f52e3239e340749f14d4243c.png +../../../_images/effbe87412081f6f6827cf3b5e342e950298e3905b0ac4550c91d643e4c9191e.png
@@ -2117,7 +2117,7 @@

Submit your feedback - +

@@ -2141,7 +2141,7 @@

Submit your feedback - + @@ -2154,7 +2154,7 @@

Section 3: Ethical aspects#

-
+
@@ -2173,7 +2173,7 @@

Submit your feedback - +

@@ -2185,7 +2185,7 @@

Summary#

-
+
@@ -2204,7 +2204,7 @@

Submit your feedback - +

@@ -2224,7 +2224,7 @@

Bonus: The need for good initialization#

-
+
@@ -2243,19 +2243,19 @@

Submit your feedback - +

Xavier initialization#

Let us look at the scale distribution of an output (e.g., a hidden variable) \(o_i\) for some fully-connected layer without nonlinearities, with \(n_{in}\) inputs (\(x_j\)) and their associated weights \(w_{ij}\). An output is then given by,

(43)#\[\begin{equation} o_{i} = \sum_{j=1}^{n_\mathrm{in}} w_{ij} x_j \end{equation}\]

The weights \(w_{ij}\) are all drawn independently from the same distribution. Furthermore, let us assume that this distribution has zero mean and variance \(\sigma^2\). Note that this does not mean that the distribution has to be Gaussian, just that the mean and variance need to exist. For now, let us assume that the inputs to the layer \(x_j\) also have zero mean and variance \(\gamma^2\) and that they are independent of \(w_{ij}\) and independent of each other. In this case, we can compute the mean and variance of \(o_i\) as follows:

(44)#\[\begin{align} E[o_i] &= \sum_{j=1}^{n_\mathrm{in}} E[w_{ij} x_j] \\ \\ &= \sum_{j=1}^{n_\mathrm{in}} E[w_{ij}] E[x_j] = 0, \\ \\ \\ \mathrm{Var}[o_i] &= E[o_i^2] - (E[o_i])^2 \\ \\ &= \sum_{j=1}^{n_\mathrm{in}} E[w_{ij}^2 x_j^2] - 0 \\ \\ &= n_\mathrm{in} \sigma^2 \gamma^2 \end{align}\]
@@ -2265,20 +2265,20 @@

Xavier initialization

One way to keep the variance fixed is to set \(n_{in}\sigma^2=1\) . Now consider backpropagation. There we face a similar problem, albeit with gradients being propagated from the layers closer to the output. Using the same reasoning as for forward propagation, we see that the gradients’ variance can blow up unless \(n_{out}\sigma^2=1\) , where \(n_{out}\) is the number of outputs of this layer. This leaves us in a dilemma: we cannot possibly satisfy both conditions simultaneously. Instead, we simply try to satisfy:

(45)#\[\begin{align} \frac{1}{2} (n_\mathrm{in} + n_\mathrm{out}) \sigma^2 = 1 \text{ or equivalently } \sigma = \sqrt{\frac{2}{n_\mathrm{in} + n_\mathrm{out}}} \end{align}\]

This is the reasoning underlying the now-standard and practically beneficial Xavier initialization, named after Xavier Glorot, the first author of Glorot and Bengio, 2010. Typically, Xavier initialization samples weights from a Gaussian distribution with zero mean and variance \(\sigma^2=\frac{2}{(n_{in}+n_{out})}\),

(46)#\[\begin{equation} w_{ij} \sim \mathcal{N} \left (\mu=0, \sigma=\sqrt{\frac{2}{(n_{in}+n_{out})}} \right) \end{equation}\]

We can also adapt Xavier’s intuition to choose the variance when sampling weights from a uniform distribution. Note that the uniform distribution \(\mathcal{U}(−a,a)\) has variance \(\frac{a^2}{3}\). Plugging this into our condition on \(\sigma^2\) yields the suggestion to initialize according to

(47)#\[\begin{equation} w_{ij} \sim \mathcal{U} \left(-\sqrt{\frac{6}{n_\mathrm{in} + n_\mathrm{out}}}, \sqrt{\frac{6}{n_\mathrm{in} + n_\mathrm{out}}}\right) \end{equation}\]

This explanation is mainly taken from here.
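A small sanity check of both sampling schemes (a sketch; the layer sizes are arbitrary):

```python
import numpy as np

n_in, n_out = 256, 128
sigma2 = 2.0 / (n_in + n_out)                # target Xavier variance

rng = np.random.default_rng(0)

# Gaussian variant: N(0, sigma^2)
w_normal = rng.normal(0.0, np.sqrt(sigma2), size=(n_out, n_in))

# Uniform variant: U(-a, a) has variance a^2 / 3, so a = sqrt(6 / (n_in + n_out))
a = np.sqrt(6.0 / (n_in + n_out))
w_uniform = rng.uniform(-a, a, size=(n_out, n_in))

# both empirical variances match the Xavier target
assert np.isclose(w_normal.var(), sigma2, rtol=0.1)
assert np.isclose(w_uniform.var(), sigma2, rtol=0.1)
```

In PyTorch the same schemes are available as `nn.init.xavier_normal_` and `nn.init.xavier_uniform_`.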

@@ -2288,8 +2288,8 @@

Xavier initializationInitialization with transfer function#

Let’s derive the optimal gain for LeakyReLU following similar steps.

LeakyReLU is described mathematically:

(48)#\[\begin{equation} f(x)=\left\{ \begin{array}{ll} \alpha \cdot x & \text { for } x<0 \\ x & \text { for } x \geq 0 \end{array} \right. \end{equation}\]
@@ -2298,15 +2298,15 @@

Initialization with transfer function\(\alpha\) controls the angle of the negative slope.

Considering a single layer with this activation function gives,

(49)#\[\begin{align} o_{i} &= \sum_{j=1}^{n_\mathrm{in}} w_{ij} x_j\\ z_{i} &= f\left( o_{i} \right) \end{align}\]

where \(z_i\) denotes the activation of node \(i\).

The expectation of the output is still zero, i.e., \(\mathbb{E}[f(o_i)]=0\), but the variance changes; assuming that the probability \(P(x < 0) = 0.5\), we have that:

(50)#\[\begin{align} \mathrm{Var}[f(o_i)] &= \mathbb{E}[f(o_i)^2] - \left( \mathbb{E}[f(o_i)] \right)^{2} \\ \\ &= \frac{\mathrm{Var}[o_i] + \alpha^2 \mathrm{Var}[o_i]}{2} \\ \\ &= \frac{1+\alpha^2}{2}n_\mathrm{in} \sigma^2 \gamma^2 \end{align}\]
@@ -2314,8 +2314,8 @@

Initialization with transfer function: \(\gamma^2\) is the variance of the distribution of the inputs \(x_j\) and \(\sigma^2\) is the variance of the distribution of the weights \(w_{ij}\), as before.

Therefore, following the rest of derivation as before,


(51)#\[\begin{equation} \sigma = gain\sqrt{\frac{2}{n_\mathrm{in} + n_\mathrm{out}}}, \, \text{where} \,\, gain = \sqrt{\frac{2}{1+\alpha^2}} \end{equation}\]

As we can see from the derived formula for \(\sigma\), the transfer function we choose is related to the variance of the distribution of the weights. As the negative slope \(\alpha\) of the LeakyReLU becomes larger, the \(gain\) becomes smaller and thus the distribution of the weights is narrower; as \(\alpha\) becomes smaller, the distribution of the weights is wider. Recall that we initialize our weights, for example, by sampling from a normal distribution with zero mean and variance \(\sigma^2\).
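A sketch of the resulting gain formula; note that \(\alpha=1\) degenerates to a linear unit (plain Xavier, gain 1), while \(\alpha=0\) recovers the ReLU case (gain \(\sqrt{2}\), i.e., He/Kaiming initialization):

```python
import numpy as np

def leaky_relu_gain(alpha):
    # gain = sqrt(2 / (1 + alpha^2)) from the derivation above
    return np.sqrt(2.0 / (1.0 + alpha ** 2))

assert np.isclose(leaky_relu_gain(1.0), 1.0)          # linear unit: plain Xavier
assert np.isclose(leaky_relu_gain(0.0), np.sqrt(2.0)) # ReLU: He/Kaiming gain
```

PyTorch exposes the same quantity as `nn.init.calculate_gain('leaky_relu', alpha)`.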

diff --git a/tutorials/W1D5_Optimization/student/W1D5_Tutorial1.html b/tutorials/W1D5_Optimization/student/W1D5_Tutorial1.html index 13668f763..81b71c569 100644 --- a/tutorials/W1D5_Optimization/student/W1D5_Tutorial1.html +++ b/tutorials/W1D5_Optimization/student/W1D5_Tutorial1.html @@ -50,7 +50,7 @@ - + @@ -1246,7 +1246,7 @@

Tutorial Objectives
-
+


@@ -1498,7 +1498,7 @@

Section 1. IntroductionVideo 1: Introduction#

-
+
@@ -1517,7 +1517,7 @@

Submit your feedback
-
+

@@ -1540,7 +1540,7 @@

Submit your feedback - +

@@ -1557,7 +1557,7 @@

Section 2: Case study: successfully training an MLP for image classification

Video 2: Case Study - MLP Classification#

-
+
@@ -1576,7 +1576,7 @@

Submit your feedback - +

@@ -1731,7 +1731,7 @@

Run me! -../../../_images/94521ffc12e0f43997e97e204432b3e669bdfe84ffb02c1c42e216b27772bc27.png +../../../_images/b8061ee4e4d4e3abd502af240b0102a0ebb5490a872dfc818001e9a245bee996.png

@@ -1897,7 +1897,7 @@

Section 2.4: Interpretability -../../../_images/53200be50c429ba186794203173fbe578ab9800d98547ee4ad2ef3bb6d9e015a.png +../../../_images/a89102cec942cb125061c328e1d0676155805a2f0051d2961c1236f4cb4e0053.png @@ -1912,7 +1912,7 @@

Section 3: High dimensional search#

-
+
@@ -1931,7 +1931,7 @@

Submit your feedback - +

@@ -2081,7 +2081,7 @@

Submit your feedback - +

@@ -2161,7 +2161,7 @@

Submit your feedback - + @@ -2175,7 +2175,7 @@

Section 4: Poor conditioning#

-
+
@@ -2194,7 +2194,7 @@

Submit your feedback - +

We illustrate this issue in a 2-dimensional setting. We freeze all but two parameters of the network: one of them is an element of the weight matrix (filter) for class 0, while the other is the bias for class 7. This results in an optimization problem with two decision variables.

@@ -2234,7 +2234,7 @@

Submit your feedback - +

@@ -2242,22 +2242,22 @@

Submit your feedback

Coding Exercise 4: Implement momentum#

In this exercise you will implement the momentum update given by:

(52)#\[\begin{equation} w_{t+1} = w_t - \eta \nabla J(w_t) + \beta (w_t - w_{t-1}) \end{equation}\]

It is convenient to re-express this update rule in terms of a recursion. For that, we define ‘velocity’ as the quantity:

(53)#\[\begin{equation} v_{t-1} := w_{t} - w_{t-1} \end{equation}\]

which leads to the two-step update rule:

(54)#\[\begin{equation} v_t = - \eta \nabla J(w_t) + \beta (\underbrace{w_t - w_{t-1}}_{v_{t-1}}) \end{equation}\]
(55)#\[\begin{equation} w_{t+1} \leftarrow w_t + v_{t} \end{equation}\]

Pay attention to the positive sign of the update in the last equation, given the definition of \(v_t\), above.
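A minimal sketch of the two-step update on a toy quadratic loss \(J(w)=\frac{1}{2}w^2\) (hyperparameters are illustrative, not the exercise's own):

```python
def grad(w):
    # J(w) = 0.5 * w^2  ->  dJ/dw = w
    return w

w, v = 5.0, 0.0
eta, beta = 0.1, 0.9
for _ in range(500):
    v = -eta * grad(w) + beta * v   # v_t = -eta * grad J(w_t) + beta * v_{t-1}
    w = w + v                       # note the positive sign: w_{t+1} = w_t + v_t

assert abs(w) < 1e-3                # converges to the minimum at w = 0
```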

@@ -2556,7 +2556,7 @@

Submit your feedback -

+
@@ -2678,7 +2678,7 @@

Interactive Demo 4: Momentum vs. GD
Random seed 2021 has been set.
 

- +

Submit your feedback#

@@ -2696,7 +2696,7 @@

Submit your feedback - +

@@ -2723,7 +2723,7 @@

Submit your feedback - + @@ -2738,7 +2738,7 @@

Section 5: Non-convexity#

-
+
@@ -2757,7 +2757,7 @@

Submit your feedback - +

Take a couple of minutes to play with a more complex 3D visualization of the loss landscape of a neural network on a non-convex problem. Visit https://losslandscape.com/explorer.

    @@ -2845,7 +2845,7 @@

    Interactive Demo 5: Overparameterization to the rescue!
    -
    +

    Submit your feedback#

    @@ -2863,7 +2863,7 @@

    Submit your feedback - +

    @@ -2889,7 +2889,7 @@

    Submit your feedback - +

@@ -2905,7 +2905,7 @@

Section 6: Full gradients are expensive#

-
+
@@ -2924,7 +2924,7 @@

Submit your feedback - +

@@ -3025,7 +3025,7 @@

Interactive Demo 6.1: Cost of computation - +

Submit your feedback#

@@ -3043,7 +3043,7 @@

Submit your feedback - +

@@ -3111,7 +3111,7 @@

Submit your feedback - + @@ -3183,7 +3183,7 @@

Interactive Demo 6.2: Compare different minibatch sizes - +

Remarks: SGD works! We have an algorithm that can be applied (with due precautions) to learn datasets of arbitrary size.

However, note the difference in the vertical scale across the plots above. When using a larger minibatch, we can perform fewer parameter updates as the forward and backward passes are more expensive.

@@ -3205,7 +3205,7 @@

Submit your feedback - + @@ -3225,7 +3225,7 @@

Section 7: Adaptive methods#

-
+
@@ -3244,14 +3244,14 @@

Submit your feedback - +

Coding Exercise 7: Implement RMSprop#

In this exercise you will implement the update of the RMSprop optimizer:

(56)#\[\begin{align} v_{t} &= \alpha v_{t-1} + (1 - \alpha) \nabla J(w_t)^2 \\ \\ w_{t+1} &= w_t - \eta \frac{\nabla J(w_t)}{\sqrt{v_t + \epsilon}} \end{align}\]
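A minimal sketch of this update on a toy quadratic loss \(J(w)=\frac{1}{2}w^2\) (hyperparameters are illustrative):

```python
import numpy as np

def grad(w):
    # J(w) = 0.5 * w^2  ->  dJ/dw = w
    return w

w, v = 5.0, 0.0
eta, alpha, eps = 0.01, 0.9, 1e-8
for _ in range(2000):
    g = grad(w)
    v = alpha * v + (1 - alpha) * g ** 2     # running average of squared gradients
    w = w - eta * g / np.sqrt(v + eps)       # per-parameter adaptive step

assert abs(w) < 0.05                         # settles near the minimum at w = 0
```

Because the step is normalized by \(\sqrt{v_t}\), each parameter effectively moves by about \(\eta\) per step regardless of the raw gradient scale.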
@@ -3366,7 +3366,7 @@

Submit your feedback -

+
@@ -3447,7 +3447,7 @@

Interactive Demo 7: Compare optimizers - +

Submit your feedback#

@@ -3465,7 +3465,7 @@

Submit your feedback - +

@@ -3489,7 +3489,7 @@

Submit your feedback - +

Remarks: Note that RMSprop allows us to use a ‘per-dimension’ learning rate without having to tune one learning rate for each dimension ourselves. The method uses information collected about the variance of the gradients throughout training to adapt the step size for each of the parameters automatically. The savings in tuning efforts of RMSprop over SGD or ‘plain’ momentum are undisputed on this task.

Moreover, adaptive optimization methods are currently a highly active research domain, with many related algorithms like Adam, AMSgrad, Adagrad being used in practical application and theoretically investigated.

@@ -3520,7 +3520,7 @@

Submit your feedback - + @@ -3533,7 +3533,7 @@

Section 8: Ethical concerns#

-
+
@@ -3552,7 +3552,7 @@

Submit your feedback - +

@@ -3585,7 +3585,7 @@

Bonus: Putting it all together#

-
+
@@ -3604,7 +3604,7 @@

Submit your feedback - +

@@ -3639,6 +3639,31 @@

Download parameters of the benchmark model
WARNING: For this notebook to perform best, if possible, in the menu under `Runtime` -> `Change runtime type.`  select `GPU` 
 
+
---------------------------------------------------------------------------
+EOFError                                  Traceback (most recent call last)
+Cell In[65], line 15
+     13   benchmark_state_dict = torch.load(fname)
+     14 else:
+---> 15   benchmark_state_dict = torch.load(fname, map_location=torch.device('cpu'))
+
+File /opt/hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/torch/serialization.py:1040, in load(f, map_location, pickle_module, weights_only, mmap, **pickle_load_args)
+   1038     except RuntimeError as e:
+   1039         raise pickle.UnpicklingError(UNSAFE_MESSAGE + str(e)) from None
+-> 1040 return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
+
+File /opt/hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/torch/serialization.py:1258, in _legacy_load(f, map_location, pickle_module, **pickle_load_args)
+   1252 if not hasattr(f, 'readinto') and (3, 8, 0) <= sys.version_info < (3, 8, 2):
+   1253     raise RuntimeError(
+   1254         "torch.load does not work with file-like objects that do not implement readinto on Python 3.8.0 and 3.8.1. "
+   1255         f"Received object of type \"{type(f)}\". Please update to Python 3.8.2 or newer to restore this "
+   1256         "functionality.")
+-> 1258 magic_number = pickle_module.load(f, **pickle_load_args)
+   1259 if magic_number != MAGIC_NUMBER:
+   1260     raise RuntimeError("Invalid magic number; corrupt file?")
+
+EOFError: Ran out of input
+
+
+
@@ -3895,7 +3893,7 @@

Submit your feedback - +

@@ -3919,7 +3917,7 @@

Submit your feedback - +
@@ -3942,21 +3940,7 @@

Evaluation -
Your model...
-
-
-
Train Loss 0.826 / Test Loss 0.810
-Train Accuracy 82.171% / Test Accuracy 83.252%
 
-Benchmark model
-
-
-
Train Loss 0.011 / Test Loss 0.025
-Train Accuracy 99.784% / Test Accuracy 99.316%
-
-
-

diff --git a/tutorials/W2D1_Regularization/student/W2D1_Tutorial1.html b/tutorials/W2D1_Regularization/student/W2D1_Tutorial1.html index 4cd58a7e8..54da6d358 100644 --- a/tutorials/W2D1_Regularization/student/W2D1_Tutorial1.html +++ b/tutorials/W2D1_Regularization/student/W2D1_Tutorial1.html @@ -50,7 +50,7 @@ - + @@ -1005,7 +1005,7 @@

Tutorial Objectives
-
+
@@ -1615,7 +1615,7 @@

Section 1: Regularization is Shrinkage#

-
+
@@ -1634,7 +1634,7 @@

Submit your feedback
-
+

A key idea of neural nets is that they use models that are “too complex”: complex enough to fit all the noise in the data. One then needs to “regularize” them so that the model is complex enough, but not too complex. The more complex the model, the better it fits the training data, but if it is too complex, it generalizes less well; it memorizes the training data but is less accurate on future test data.

@@ -1642,7 +1642,7 @@

Submit your feedbackVideo 2: Regularization as Shrinkage#

-
+
@@ -1661,7 +1661,7 @@

Submit your feedback - +

One way to think about regularization is in terms of the magnitude of the overall weights of the model. A model with big weights can fit more data perfectly, whereas a model with smaller weights tends to underperform on the train set but can, surprisingly, do very well on the test set. Having the weights too small can also be an issue, as the model can then underfit.

In these tutorials, we use the sum of the Frobenius norm of all the tensors in the model as a measure of the “size of the model”.

@@ -1670,8 +1670,8 @@

Submit your feedback#

Before we start, let’s define the Frobenius norm, sometimes also called the Euclidean norm of an \(m×n\) matrix \(A\) as the square root of the sum of the absolute squares of its elements.


(57)#\[\begin{equation} ||A||_F= \sqrt{\sum_{i=1}^m\sum_{j=1}^n|a_{ij}|^2} \end{equation}\]
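A direct NumPy sketch of this definition; `np.linalg.norm` with `ord='fro'` computes the same quantity:

```python
import numpy as np

def frobenius_norm(A):
    # square root of the sum of the absolute squares of the entries
    return np.sqrt((np.abs(A) ** 2).sum())

A = np.array([[3.0, 0.0],
              [0.0, 4.0]])
assert np.isclose(frobenius_norm(A), 5.0)
assert np.isclose(frobenius_norm(A), np.linalg.norm(A, 'fro'))
```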

@@ -1743,7 +1743,7 @@

Submit your feedback -

+

Apart from calculating the weight size for an entire model, we could also determine the weight size in every layer. For this, we can modify our calculate_frobenius_norm function as shown below.

Have a look at how it works!

@@ -1797,7 +1797,7 @@

Submit your feedback +../../../_images/7f96db181f1246bf7033cc953839a1c263a55b81faf06b89209f8ed1237fcf00.png

Using the last function, calculate_frobenius_norm, we can also obtain the Frobenius norm per layer for a whole ANN model and use the plot_weigts function to visualize them.

@@ -1822,7 +1822,7 @@

Submit your feedback +../../../_images/c3d6f36997c074b0a8a6d31836859b54f631b834fd9bab539fc1c013a9b96c37.png

@@ -1836,7 +1836,7 @@

Section 2: OverfittingVideo 3: Overparameterization and Overfitting#

-
+
@@ -1855,7 +1855,7 @@

Submit your feedback - +

@@ -1889,7 +1889,7 @@

Section 2.1: Visualizing Overfitting
Random seed 2021 has been set.
 
-../../../_images/eaff432b30bae519b5199bab45f93f26b9bc95e7f7a3ba0f7eb32dd0cbd0cbcf.png +../../../_images/0367ec8db8ae4d65ea628497d7d5f591ff01c92e4fc01ec7dad4cccfb52e97ba.png

Let’s create an overparametrized Neural Network that can fit on the dataset that we just created and train it.

@@ -2013,7 +2013,7 @@

Section 2.1: Visualizing Overfitting
Random seed 2021 has been set.
 
- +

Now that we have finished training, let’s see how the model has evolved over the training process.

@@ -2119,7 +2119,7 @@

Animation (Run Me!)RuntimeError: Requested MovieWriter (ffmpeg) not available -../../../_images/4cb1e04a307d81d656f10cdd6f20e9d547bb9f89a008a8e0c786186fd442e89b.png +../../../_images/66423f415c5ed2c155e6d4c8df35ddcaef422be800f1752e6da3c3185e7cab70.png

@@ -2146,7 +2146,7 @@

Plot the train and test losses -../../../_images/d90acb9fd01dedb1a3276986f7b1eb8f5fe7d446d98eac8459a2be7e04269511.png +../../../_images/a5d3cad3009c0a9f95bd70801d179e595c8bf2e6d474bcc0b05541d7407725f3.png

@@ -2174,7 +2174,7 @@

Submit your feedback - +

Now let’s visualize the Frobenius norm of the model as we trained. You should see that the value of the weights increases over the epochs.

Frobenius norm of the model

@@ -2197,7 +2197,7 @@

Submit your feedback -../../../_images/630efa1975514c4e86d9488d4675e6e15ff7695ac20e920ee98c7805fd6cbaf7.png +../../../_images/9d2db9ca12067f24eff8013482b01d974c3ad921f256ab2cec85fa57d86e789d.png

Finally, you can compare the Frobenius norm per layer in the model, before and after training.

@@ -2221,8 +2221,8 @@

Submit your feedback -../../../_images/f78468b1fd9d04097ff8de8c8037eee381278208493f358d6a14244b194bb0ba.png -../../../_images/307916fb13ee9b3502c1b5695ea462b08284f61dcdd50e464d44748b9d9aa93e.png +../../../_images/f7677587eea0099d87ca72136e5798438cc531f37a309517501b606513a5fb2d.png +../../../_images/096424eaa56ce6bb1650b00527c1964ed06db0349fcf1d81cfe3926ced2d0ebd.png @@ -2466,10 +2466,10 @@

Section 3: Memorization
Random seed 2021 has been set.
 
-
Time to memorize the dataset: 241.6447606086731
+
Time to memorize the dataset: 223.69637608528137
 
-../../../_images/02c33c12e77d12754a4f51ba67fa89bce734a4c0d766e2935d2f27d56b1acc27.png +../../../_images/8ae7cd982c48a5de102c8d6905769be84e5acc537ae8b89cd6292cd2f0993030.png
@@ -2491,8 +2491,8 @@

Frobenius norm for AnimalNet before and after training
-../../../_images/b0de31c52ce78630640d081720a3b57ef3508edc188b3a277bf0f64307ec7f08.png -../../../_images/0fffeba293de5453c66782e5f84871228cdf184f3cdb32706f755776007d1baa.png +../../../_images/da42e2626a961a450ddd110e5ba09e05e306fde26455d97b2eb04873f11bbfc0.png +../../../_images/7115401207a7c015679b0137158af5a34f99ab1c583958840de04e8653a18543.png

@@ -2549,7 +2549,7 @@

Data Visualizer
The image belongs to : cat
 
-../../../_images/a1a77da928eec784639eb05c87e63483f8f041a1dc6fa10f34391b8187021844.png +../../../_images/2e0155f095b8d0008f42c3e8e6d2607c01f87f0e446fed56b38405866f541fb3.png

Now let’s train the network on the shuffled data and see if it memorizes.

@@ -2594,7 +2594,7 @@

Data Visualizer
Random seed 2021 has been set.
 
-../../../_images/e2bbfa7fcba2c15582af5f938bb59bfd0604fbbcd2a43c8c054346d49a7ad0b6.png +../../../_images/f39b59eb6c7c677dfbd0cccfa034f14897e0e44ce39cf11a9b5b0662b7e14383.png

Isn’t it surprising to see that the ANN was able to achieve 100% training accuracy on randomly shuffled labels? This is one of the reasons why training accuracy is not a good indicator of model performance.

@@ -2608,7 +2608,7 @@

Section 4: Early Stopping#

-
+
@@ -2627,7 +2627,7 @@

Submit your feedback - +

Now that we have established that the validation accuracy reaches the peak well before the model overfits, we want to stop the training somehow early. You should have also observed from the above plots that the train/test loss on real data is not very smooth, and hence you might guess that the choice of the epoch can play a crucial role in the validation/test accuracy.

Early stopping stops training when the validation accuracies stop increasing.

@@ -2756,7 +2756,7 @@

Submit your feedback - +

@@ -2783,7 +2783,7 @@

Submit your feedback - + @@ -2838,7 +2838,7 @@

Bonus: Train with randomized labels
Random seed 2021 has been set.
 
-../../../_images/95f88f7e75611a5dfb12461c69670991eb7a9f23c58d715c1ed41a11777107d3.png +../../../_images/ba1198882221690cb498e83f0ecebef51d6b091fcff67ca9438698ea18db5d67.png
@@ -2869,7 +2869,7 @@

Plotting them all together (Run Me!) -../../../_images/4c430f878a4bdde5171ac45480975bfd8f1cd968428f3debd9b04c31f22b5df5.png +../../../_images/48b2f3dcdc7f8ce5bfffc4c5841322d8b0dc4f8448f79906826ce8a8013cc772.png

@@ -2897,7 +2897,7 @@

Submit your feedback - +

Also, it is interesting to note that sometimes the model trained on slightly shuffled data does slightly better than the one trained on pure data. Shuffling some of the data is a form of regularization, i.e., one of many ways of adding noise to the training data.

diff --git a/tutorials/W2D1_Regularization/student/W2D1_Tutorial2.html b/tutorials/W2D1_Regularization/student/W2D1_Tutorial2.html index 16978d4fb..7cf9ef2a0 100644 --- a/tutorials/W2D1_Regularization/student/W2D1_Tutorial2.html +++ b/tutorials/W2D1_Regularization/student/W2D1_Tutorial2.html @@ -50,7 +50,7 @@ - + @@ -1098,7 +1098,7 @@

Tutorial Objectives
-
+
@@ -1985,7 +1985,7 @@

Section 1: L1 and L2 Regularization#

-
+
@@ -2004,12 +2004,12 @@

Submit your feedback
-
+

Some of you might have already come across L1 and L2 regularization before in other courses. L1 and L2 are the most common types of regularization. These update the general cost function by adding another term known as the regularization term.


-
-(58)#\[\begin{equation} +
+(58)#\[\begin{equation} \text{Cost function} = Loss(\text{e.g., binary cross entropy}) + \text{Regularization term} \end{equation}\]

@@ -2093,7 +2093,7 @@

Dataloaders for Regularization
Random seed 2021 has been set.
 

-../../../_images/b9ae3a198b7e9657cfcd1bb3e387ab53c26153ea18f0437e4d25ce65a1b1658e.png +../../../_images/b1be39be0840384dec6887d3741e6245a06ade1140d68ec90e56fa51bbb92d83.png
Maximum Validation Accuracy reached: 51.0
 
@@ -2104,19 +2104,19 @@

Dataloaders for Regularization

Section 1.2: L1 Regularization#

L1 Regularization (or LASSO\(^{\ddagger}\)) uses a penalty which is the sum of the absolute value of all the weights in the Deep Learning architecture, resulting in the following loss function (\(L\) is the usual Cross-Entropy loss):

-
-(59)#\[\begin{equation} +
+(59)#\[\begin{equation} L_R = L + \lambda \sum \left| w^{(r)}_{ij} \right| \end{equation}\]

where \(r\) denotes the layer, and \(ij\) the specific weight in that layer.

At a high level, L1 Regularization is similar to L2 Regularization since it leads to smaller weights (you will see the analogy in the next subsection). It results in the following weight update equation when using Stochastic Gradient Descent:

-
-(60)#\[\begin{equation} +
+(60)#\[\begin{equation} w^{(r)}_{ij}←w^{(r)}_{ij} − \eta \cdot \lambda \cdot \text{sgn}\left(w^{(r)}_{ij}\right)−\eta \frac{\partial L}{\partial w_{ij}^{(r)}} \end{equation}\]

where \(\text{sgn}(\cdot)\) is the sign function, such that

-
-(61)#\[\begin{equation} +
+(61)#\[\begin{equation} \text{sgn}(w) = \left\{ \begin{array}{ll} 1 & w > 0 \\ -1 & w < 0 \\ 0 & w = 0 \end{array} \right. \end{equation}\] @@ -2191,7 +2191,7 @@

Submit your feedback -

+

Now, let’s train a classifier that uses L1 regularization. Tune the hyperparameter lambda1 such that the validation accuracy is higher than that of the unregularized model.
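Before tuning, it may help to see where the penalty enters. A hedged sketch of adding an L1 term to a PyTorch loss (the one-layer model and the `lambda1` value are illustrative stand-ins, not the tutorial's actual network):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 2)          # illustrative stand-in for the network
criterion = nn.CrossEntropyLoss()
lambda1 = 1e-3                   # hypothetical L1 strength

x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))

loss = criterion(model(x), y)
# L1 term: lambda times the sum of absolute values of the weights
l1_penalty = lambda1 * model.weight.abs().sum()
total_loss = loss + l1_penalty
total_loss.backward()            # gradients now include the sgn(w) term
```

Autograd differentiates the `abs()` term, which is exactly the \(\lambda \cdot \text{sgn}(w)\) contribution in the update rule above.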

+

@@ -2263,19 +2263,19 @@

Submit your feedback

Section 1.3: L2 / Ridge Regularization#

L2 Regularization (or Ridge), also referred to as “Weight Decay”, is widely used. It works by adding a quadratic penalty term to the Cross-Entropy Loss Function \(L\), which results in a new Loss Function \(L_R\) given by:

-
-(62)#\[\begin{equation} +
+(62)#\[\begin{equation} L_R = L + \lambda \sum \left( w^{(r)}_{ij} \right)^2 \end{equation}\]

where, again, \(r\) superscript denotes the layer, and \(ij\) the specific weight in that layer.

To get further insight into L2 Regularization, we investigate its effect on the Gradient Descent based update equations for the weight and bias parameters. Taking the derivative on both sides of the above equation, we obtain

-
-(63)#\[\begin{equation} +
+(63)#\[\begin{equation} \frac{\partial L_R}{\partial w^{(r)}_{ij}}=\frac{\partial L}{\partial w^{(r)}_{ij}} + 2\lambda w^{(r)}_{ij} \end{equation}\]

Thus the weight update rule becomes:

-
-(64)#\[\begin{equation} +
+(64)#\[\begin{equation} w^{(r)}_{ij}←w^{(r)}_{ij}−η\frac{\partial L}{\partial w^{(r)}_{ij}}−2 \eta \lambda w^{(r)}_{ij}=(1−2 \eta \lambda)w^{(r)}_{ij} − \eta \frac{\partial L}{\partial w^{(r)}_{ij}} \end{equation}\]

where \(\eta\) is the learning rate.

@@ -2343,7 +2343,7 @@

Submit your feedback -

+

Now we’ll train a classifier that uses L2 regularization. Tune the hyperparameter lambda2 such that the validation accuracy is higher than that of the unregularized model.
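In PyTorch, the shrinkage term in the update rule above is usually supplied through the optimizer's `weight_decay` argument rather than added to the loss by hand. A sketch (model and values are illustrative):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 2)
# SGD's `weight_decay` adds wd * w to the gradient, so it plays the role
# of 2*lambda in the weight update equation above
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-2)

x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))
loss = nn.CrossEntropyLoss()(model(x), y)
optimizer.zero_grad()
loss.backward()
w_before = model.weight.detach().clone()
optimizer.step()  # applies the (1 - 2*eta*lambda) shrinkage plus the gradient step
```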

+

Now, let’s run a model with both L1 and L2 regularization terms.

@@ -2534,7 +2534,7 @@

Section 2: Dropout#

-
+
@@ -2553,7 +2553,7 @@

Submit your feedback - +

With Dropout, we literally drop out (zero out) some neurons during training. On each training iteration, standard dropout zeroes out some fraction (usually 50%) of the nodes in each layer before computing the subsequent layer. Randomly selecting a different subset to drop out each time introduces noise into the process and reduces overfitting.
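The mechanism can be sketched in a few lines of NumPy. This is "inverted" dropout, the variant PyTorch's `nn.Dropout` uses: survivors are rescaled at training time so that nothing needs to change at test time:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Inverted dropout: zero each unit with probability p during training,
    scaling survivors by 1/(1-p) so the expected activation is unchanged."""
    if not train:
        return x  # at test time, dropout is a no-op
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

h = np.ones(10_000)
out = dropout(h, p=0.5)
# Roughly half the units are zeroed; the mean stays near 1 in expectation
print((out == 0).mean(), out.mean())
```

The train/test asymmetry here is the source of the caveats discussed in Section 2.1 below.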


@@ -2674,7 +2674,7 @@

Run to train the default network
Random seed 2021 has been set.
 
- +
+

Now that we have finished the training, let’s see how the model has evolved over the training process.

Animation! (Run Me!)

@@ -2793,7 +2793,7 @@

Run to train the default networkRuntimeError: Requested MovieWriter (ffmpeg) not available -../../../_images/d964234cc1439e0aaf1deb690d9bad92d4f76256534fcab0ac85899a77c99c2c.png +../../../_images/b0a991aa1fd778f2f0ef3d4f2bb95a33c4e1819dc6126275b87c8b2891aa6fde.png

Plot the train and test losses with epoch

@@ -2819,7 +2819,7 @@

Run to train the default network -../../../_images/9cac0e245302cda4f4f277a43ee4c8b4b74eb2da2af25944be5022b4c6721177.png +../../../_images/d08f387ea927ba54e51cf3eac1e928d4cc5d6864c238f0e4664842ca7087a7fa.png

Plot the train and test losses with epoch

@@ -2845,7 +2845,7 @@

Run to train the default network -../../../_images/dd316b865a698b994ee3b13ad6dfe0de1aa3197969512a9600b4eebb980581aa.png +../../../_images/499b33296fd75f367398a8553dd918bb8947197f08ced98162d08fb3bc2387ad.png

Plot model weights with epoch

@@ -2870,7 +2870,7 @@

Run to train the default network -../../../_images/67a8d761ab6a11cdfddb806c3bb84ec6a68eb16014104c97ad3a53da8a969d84.png +../../../_images/5de67be11870e5173f10f30c2aa1331af6cd02956d72a7ad17e696a0ce5d2705.png

@@ -2895,7 +2895,7 @@

Submit your feedback - + @@ -3009,10 +3009,10 @@

Section 2.1: Dropout Implementation Caveats
Random seed 2021 has been set.
 
-
Random seed 2021 has been set.
+
Random seed 2021 has been set.
 
-../../../_images/47359cdf22e90dceb054e9bd8166287f7d88a2f652faf51fd3b3d90b464ec7de.png +../../../_images/3f322b109c93aa7bfd89efd8694090ad39571d6ab42da715e35728eff66c27de.png
@@ -3035,7 +3035,7 @@

Submit your feedback - +

@@ -3049,7 +3049,7 @@

Section 3: Data Augmentation#

-
+
@@ -3068,7 +3068,7 @@

Submit your feedback - +

Data augmentation is often used to increase the number of training samples. Now we will explore the effects of data augmentation on regularization. Here regularization is achieved by adding noise into training data after every epoch.
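As a sketch of what "adding noise after every epoch" means in practice, here is a random horizontal flip written out in NumPy (the tutorial itself uses torchvision's built-in transforms for this; the image values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_horizontal_flip(img, p=0.5):
    """Flip an (H, W) image left-right with probability p. A fresh random
    draw is made every time the sample is loaded, so each epoch sees a
    slightly different training set."""
    if rng.random() < p:
        return img[:, ::-1]
    return img

img = np.arange(6).reshape(2, 3)
print(random_horizontal_flip(img, p=1.0))  # force the flip for illustration
```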

PyTorch’s torchvision module provides a few built-in data augmentation techniques, which we can use on image datasets. Some of the techniques we most frequently use are:

@@ -3206,10 +3206,10 @@

Data Loader without Data Augmentation
Random seed 2021 has been set.
 
-
Random seed 2021 has been set.
+
Random seed 2021 has been set.
 
-../../../_images/c463454c7b069521e9e2351b0002bac6cd2b0228a2619239ac92dcabc5f64185.png +../../../_images/a8766aa3dd02a279ca9ed688dde42f61fb55f03cbe1b2fc5f2c72f7a9f6880f0.png

@@ -3253,7 +3253,7 @@

Submit your feedback - + @@ -3277,7 +3277,7 @@

Submit your feedback - + @@ -3291,7 +3291,7 @@

Section 4: Stochastic Gradient Descent#

-
+
@@ -3310,7 +3310,7 @@

Submit your feedback - +

@@ -3407,13 +3407,13 @@

Generating Data Loaders
Random seed 2021 has been set.
 
-
-../../../_images/99db78bd1535fe12692e1f5329d742f857f7af1cfaeb27d07a71910ae1911136.png +../../../_images/9c8f994db318385a11b92995d28fee11addaa2d6c634d95c5fd115df3b530ed4.png

Plot parametric norms (Run me)

@@ -3472,7 +3472,7 @@

Generating Data Loaders
-../../../_images/3887af56402df0c0078b7262a0fc6f9b5635ae9181292ee8d6fc1f9554b57d75.png +../../../_images/2d59c9a8b535a107b091da62bd4a34913660601ebd751fd91c267099443fedb9.png

In the model above, we observe something different from what we expected. Why do you think this is happening?

@@ -3487,7 +3487,7 @@

Section 5: Hyperparameter Tuning#

-
+

@@ -3506,7 +3506,7 @@

Submit your feedback - +

Hyperparameter tuning is often tricky and time-consuming, and it is a vital part of training any Deep Learning model to give good generalization. There are a few techniques that we can use to guide us during the search.

@@ -3565,7 +3565,7 @@

Bonus: Adversarial Attacks#

-
+
@@ -3584,7 +3584,7 @@

Submit your feedback - +

Designing perturbations to the input data to trick a machine learning model is called an “adversarial attack”. These attacks are an inevitable consequence of learning in high dimensional space using complex decision boundaries. Depending on the application, these attacks can be very dangerous.


diff --git a/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_BonusLecture.html b/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_BonusLecture.html index b9f7a4936..5e334ef04 100644 --- a/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_BonusLecture.html +++ b/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_BonusLecture.html @@ -58,7 +58,7 @@ const thebe_selector_output = ".output, .cell_output" - + @@ -1453,11 +1453,11 @@

Install and import feedback gadget#

-
+
-
+

@@ -1476,7 +1476,7 @@

Submit your feedback
-
+

diff --git a/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial1.html b/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial1.html index b8175d084..7bd804ffc 100644 --- a/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial1.html +++ b/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial1.html @@ -50,7 +50,7 @@ - + @@ -1343,7 +1343,7 @@

Tutorial Objectives
-
+
@@ -1886,7 +1886,7 @@

Section 0: Recap the Experience from Last Week#

-
+
@@ -1905,7 +1905,7 @@

Submit your feedback
-
+

@@ -1934,7 +1934,7 @@

Submit your feedback - +

Coming Up

The rest of these lectures focus on another way to reduce parameters: weight-sharing. Weight-sharing is based on the idea that the same set of weights can be used at multiple points in a network. We will focus primarily on CNNs today, where the weights are shared across the 2D space of an image. This weight-sharing across space can reduce the number of parameters and increase a network’s ability to generalize. For completeness, Recurrent Neural Networks (RNNs) take a similar approach, sharing parameters across time, but we will not dive into them in this tutorial.

@@ -1949,7 +1949,7 @@

Section 1: Neuroscience motivation, General CNN structure#

-
+

@@ -1968,7 +1968,7 @@

Submit your feedback - +

@@ -1993,7 +1993,7 @@

Submit your feedback - +

@@ -2007,7 +2007,7 @@

Section 2: Convolutions and Edge Detection#

-
+
@@ -2026,7 +2026,7 @@

Submit your feedback - +

Before jumping into coding exercises, take a moment to look at this animation that steps through the process of convolution.

Recall from the video that convolution involves sliding the kernel across the image, taking the element-wise product, and adding those products together.
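That sliding-multiply-sum procedure can be written out directly. A sketch in NumPy (a "valid" cross-correlation with stride 1, which is what deep learning libraries call convolution; the 2x2 averaging kernel is illustrative, not the exercise's own kernel):

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` over `image`, taking the element-wise product and
    summing at each position (no padding, stride 1)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i+kh, j:j+kw] * kernel).sum()
    return out

image = np.array([[0, 200, 200],
                  [0,   0, 200],
                  [0,   0,   0]])
kernel = np.array([[0.25, 0.25],     # illustrative 2x2 averaging kernel
                   [0.25, 0.25]])
print(conv2d(image, kernel))  # 2x2 output: each entry averages one window
```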

@@ -2071,8 +2071,8 @@

Definitional Note

Coding Exercise 2.1: Convolution of a Simple Kernel#

At its core, convolution is just repeatedly multiplying a matrix, known as a kernel or filter, with some other, larger matrix (in our case the pixels of an image). Consider the below image and kernel:

-
-(65)#\[\begin{align} +
+(65)#\[\begin{align} \textbf{Image} &= \begin{bmatrix}0 & 200 & 200 \\0 & 0 & 200 \\ 0 & 0 & 0 \end{bmatrix} \\ \\ @@ -2158,7 +2158,7 @@

Submit your feedback -

+

@@ -2224,7 +2224,7 @@

Submit your feedback - + @@ -2313,7 +2313,7 @@

Submit your feedback - +
@@ -2558,15 +2558,15 @@

Submit your feedback - +

Think! 2.2.1: Edge Detection#

One of the simpler tasks performed by a convolutional layer is edge detection; that is, finding a place in the image where there is a large and abrupt change in color. Edge-detecting filters are usually learned by the first layers in a CNN. Observe the following simple kernel and discuss whether this will detect vertical edges (where the trace of the edge is vertical; i.e. there is a boundary between left and right), or whether it will detect horizontal edges (where the trace of the edge is horizontal; i.e., there is a boundary between top and bottom).

-
-(66)#\[\begin{equation} +
+(66)#\[\begin{equation} \textbf{Kernel} = \begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix} @@ -2588,7 +2588,7 @@

Submit your feedback -

+

Consider the image below, which has a black vertical stripe with white on the side. This is like a very zoomed-in vertical edge within an image!

As you can see, this kernel detects vertical edges (the black stripe corresponds to a highly positive result, while the white stripe corresponds to a highly negative result. However, to display the image, all the pixels are normalized between 0=black and 1=white).
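A quick numerical check of this claim, using plain sliding cross-correlation (the image values are illustrative; 1 = white, 0 = black):

```python
import numpy as np

# Zoomed-in vertical edge: white (1) on the left, black (0) on the right
image = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [1, 1, 0, 0]])
kernel = np.array([[1, -1],
                   [1, -1]])

kh, kw = kernel.shape
out = np.array([[(image[i:i+kh, j:j+kw] * kernel).sum()
                 for j in range(image.shape[1] - kw + 1)]
                for i in range(image.shape[0] - kh + 1)])
# The response is strongly positive exactly where the left column is
# brighter than the right one, i.e. along the vertical edge
print(out)
```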

@@ -2668,7 +2668,7 @@

Submit your feedback - +

@@ -2847,7 +2847,7 @@

Dataset/DataLoader Functions (Run me!) -../../../_images/3f6d3f33c0fcdc25fc5308b9f32b8a7a8eb8f1f22b32fda64c3085ee693dc64a.png +../../../_images/a0a6d8ea4435c866e16fc976e9f0b970931df02213bcbd9d015aab5767801928.png @@ -2893,7 +2893,7 @@

Submit your feedback - + @@ -2987,10 +2987,10 @@

Section 3.1: Multiple Filters -
<matplotlib.image.AxesImage at 0x7f79ab6816d0>
+
<matplotlib.image.AxesImage at 0x7f79eadc18b0>
 
-../../../_images/bf146facce2c5b4465c1c96f05129d951f75bdbd498dc91c9138c7961df38e04.png +../../../_images/5b8f96e235f4a03a0ed108b74da9718b0a752da3453498ad54d8d009d545f62d.png
@@ -3012,7 +3012,7 @@

Submit your feedback - +

We apply the filters to the images.

@@ -3243,7 +3243,7 @@

Section 3.2: ReLU after convolutions -../../../_images/0c9ad0f380bd8fd67092f23f2f95e539674e45608d4c52ccf0c01d12a475829f.png +../../../_images/70d03ba61a7131910eebb4ce352b35f6d93a1779fc0bcaabe5e333c7dac971b3.png

Discuss with your pod how the ReLU activations help strengthen the features necessary to detect an \(X\).

@@ -3259,7 +3259,7 @@

Section 3.3: PoolingVideo 4: Pooling#

-
+
@@ -3278,7 +3278,7 @@

Submit your feedback - +

Like convolutional layers, pooling layers have fixed-shape windows (pooling windows) that are systematically applied to the input. As with filters, we can change the shape of the window and the size of the stride. And, just like with filters, every time we apply a pooling operation we produce a single output.

Pooling performs a kind of information compression that provides summary statistics for a neighborhood of the input.
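A sketch of max pooling in NumPy (2x2 window, stride 2; the input values are illustrative):

```python
import numpy as np

def max_pool2d(x, size=2, stride=2):
    """Apply a size x size max-pooling window with the given stride."""
    oh = (x.shape[0] - size) // stride + 1
    ow = (x.shape[1] - size) // stride + 1
    return np.array([[x[i*stride:i*stride+size, j*stride:j*stride+size].max()
                      for j in range(ow)]
                     for i in range(oh)])

x = np.array([[1, 3, 2, 0],
              [4, 2, 1, 1],
              [0, 1, 5, 2],
              [2, 2, 3, 4]])
print(max_pool2d(x))  # each output is the max of one 2x2 neighborhood
```

Replacing `.max()` with `.mean()` would give average pooling, the other common summary statistic.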

@@ -3337,7 +3337,7 @@

Submit your feedback - +

@@ -3455,7 +3455,7 @@

Submit your feedback - +
@@ -3534,7 +3534,7 @@

Section 4: Putting it all together#

-
+
@@ -3553,7 +3553,7 @@

Submit your feedback -

+
@@ -3835,7 +3835,7 @@

Interactive Demo 4.1: Number of Parameters - +

The difference in parameters is huge, and it continues to increase as the input image size increases. Larger images require that the linear layer use a matrix that can be directly multiplied with the input pixels.


@@ -3859,14 +3859,14 @@

Submit your feedback - +

Video 6: Implement your own CNN#

-
+
@@ -3885,7 +3885,7 @@

Submit your feedback - +

@@ -4099,7 +4099,7 @@

Submit your feedback - +

Note: We are using a softmax function here which converts a real value to a value between 0 and 1, which can be interpreted as a probability.

@@ -4161,7 +4161,7 @@

Bonus 1: Write your own training loop revisited#

-
+
@@ -4180,15 +4180,15 @@

Submit your feedback - +

Bonus 1.1: Understand the Dataset#

The dataset we are going to use for this task is called Fashion-MNIST. It consists of a training set of 60,000 examples and a test set of 10,000 examples. We further divide the test set into a validation set and a test set (8,000 and 2,000, respectively). Each example is a \(28 \times 28\) gray scale image, associated with a label from 10 classes. Following are the labels of the dataset:


-
-(67)#\[\begin{matrix} +
+(67)#\[\begin{matrix} \text{label} && \text{category} \\ \hline 0 && \text{T-shirt/top} \\ @@ -4366,7 +4366,7 @@

Loading Fashion-MNIST Data -../../../_images/ee2a62c7eda5e489def7f90d1dec3444d24b7a2046768887656da91b19675d09.png +../../../_images/cf8d4c96f5b4bbc479bbced72f41a91d985d373b1fcd94daf6f08f70a09a97f5.png

Take a minute with your pod and talk about which classes you think would be most confusable. How hard will it be to differentiate t-shirt/tops from shirts?

@@ -4375,7 +4375,7 @@

Loading Fashion-MNIST Data#

-
+
@@ -4394,7 +4394,7 @@

Submit your feedback - +

@@ -4579,7 +4579,7 @@

Load a sample dataset (EMNIST) - + @@ -4892,7 +4892,7 @@

Submit your feedback - + @@ -5004,7 +5004,7 @@

Submit your feedback - + @@ -5031,7 +5031,7 @@

Submit your feedback - + @@ -5083,7 +5083,7 @@

Interactive Demo Bonus 2.1: Dropout exploration - +

Submit your feedback#

@@ -5101,7 +5101,7 @@

Submit your feedback - +

@@ -5237,7 +5237,7 @@

Submit your feedback - + @@ -5262,7 +5262,7 @@

Submit your feedback - + diff --git a/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial2.html b/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial2.html index 3b9494977..d501ae40a 100644 --- a/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial2.html +++ b/tutorials/W2D2_ConvnetsAndDlThinking/student/W2D2_Tutorial2.html @@ -50,7 +50,7 @@ - + @@ -976,7 +976,7 @@

Tutorial Objectives
-
+
@@ -1023,7 +1023,7 @@

Section 1: Intro to Deep Learning Thinking#

-
+
@@ -1042,7 +1042,7 @@

Submit your feedback
-
+

This tutorial is a bit different from others - there will be no coding! Instead you will watch a series of vignettes about various scenarios where you want to use a neural network. This tutorial will focus on cost functions, a tutorial you will see later in the course will be similar but focused on designing architectures.

Each section below will start with a vignette where either Lyle or Konrad is trying to figure out how to set up a neural network for a specific problem. Try to think of questions you want to ask them as you watch, then pay attention to what questions Lyle and Konrad are asking. Were they what you would have asked? How do their questions help quickly clarify the situation?

@@ -1057,7 +1057,7 @@

Section 2: Cost function for neurons#

-
+

@@ -1076,14 +1076,14 @@

Submit your feedback - +

Video 3: Spiking Neuron Predictions Set-up#

-
+
@@ -1102,14 +1102,14 @@

Submit your feedback - +

Konrad, a neuroscientist, wants to predict what neurons in someone’s motor cortex are doing while they are riding a motorcycle.

Upon discussion with Lyle, it emerges that we have data on 12 parameters of motorcycle riding, including acceleration, angle, braking, and degree of leaning. These inputs are fairly smooth over time; the angle of the motorcycle typically does not change much in 100 ms, for example.

We also have recorded data on the timing of spikes of \(N\) neurons in motor cortex. The underlying firing rate is smooth but every millisecond spikes are random and independent. This means we can assume that the number of spikes in a short interval can be modeled using a Poisson distribution with an underlying firing rate for that interval \(\lambda\).

For neuron \(i\), the probability of seeing \(k_{i}\) spikes in some interval given an underlying firing rate \(\lambda_{i}\) is:

-
-(68)#\[\begin{equation} +
+(68)#\[\begin{equation} \mathcal{f(k_{i}:λ_{i})} = \mathcal{Pr(X=k_{i})} = \frac {\lambda_{i}^{k_{i}}e^{-\lambda_{i}}}{k_{i}!} \end{equation}\]

So this Poisson distribution may be relevant if we want a good model for the spiking of neurons.

@@ -1135,28 +1135,28 @@

Think! 1: Designing a cost function to predict neural activitiesClick here for the solution

First, we will convert our spike timing data to the number of spikes per time bin for time bins of size 50 ms. This gives us \(k_{i,t}\) for every neuron \(i\) and time bin \(t\).

We are assuming a Poisson distribution for our spiking. That means that we get the probability of seeing spike count \(k_{i, t}\) given underlying firing rate \(\lambda_{i, t}\) using this equation:

-
-(69)#\[\begin{equation} +
+(69)#\[\begin{equation} \mathcal{f(k_{i,t}:\lambda_{i,t})} = \mathcal{Pr}(X=k_{i,t}) = \frac {\lambda_{i,t}^{k_{i,t}}e^{-\lambda_{i,t}}}{k_{i,t}!} \end{equation}\]

That seems like a pretty good thing to optimize to make our predictions as good as possible! We want a high probability of seeing the actual spike count we recorded, given the neural network’s prediction of the underlying firing rate.

We will make this negative later so we have an equation that we want to minimize rather than maximize, so we can use all our normal tricks for minimization (instead of maximization). First though, let’s scale up to include all our neurons and time bins.

We can treat each time bin as independent because, while the underlying probability of firing changes slowly, spiking in every millisecond is random and independent. From probability, we know that we can compute the probability of a set of independent events (all the spike counts) by multiplying the probabilities of each event. So the probability of seeing all of our data given the neural network predictions is all of our probabilities of \(k_{i,t}\) multiplied together:

-
-(70)#\[\begin{align} +
+(70)#\[\begin{align} \mathcal{Pr}(\text{all_data}) &= \prod_{i=1}^{N}\prod_{t=1}^{T} \mathcal{Pr}(X=k_{i,t})\\ &= \prod_{i=1}^{N}\prod_{t=1}^{T} \frac {\lambda_{i,t}^{k_{i,t}}e^{-\lambda_{i,t}}}{k_{i,t}!} \end{align}\]

This is also known as our likelihood!

We usually use the log likelihood instead of the likelihood when minimizing or maximizing, for numerical computation reasons. We can convert the above equation to the log likelihood:

-
-(71)#\[\begin{align} +
+(71)#\[\begin{align} \text{log likelihood} &= \sum_{i=1}^N\sum_{t=1}^{T} \text{log}(\mathcal{Pr}(X=k_{i,t})) \\ &= \sum_{i=1}^N\sum_{t=1}^{T} k_{i,t} \text{log}(\lambda_{i,t}) - \lambda_{i,t} - \text{log}(k_{i,t}!) \end{align}\]

And last but not least, we want to make it negative so we can minimize instead of maximize:

-
-(72)#\[\begin{equation} +
+(72)#\[\begin{equation} \text{negative log likelihood} = \sum_{i=1}^N\sum_{t=1}^{T} - k_{i,t} \text{log}(\lambda_{i,t}) + \lambda_{i,t} + \text{log}(k_{i,t}!) \end{equation}\]
@@ -1176,14 +1176,14 @@

Submit your feedback -

+

Video 4: Spiking Neurons Wrap-up#

-
+
@@ -1202,7 +1202,7 @@

Submit your feedback - +

Check out the papers mentioned in the above video:

@@ -1242,7 +1242,7 @@

Section 3: How can an ANN know its uncertainty#

-
+
@@ -1261,14 +1261,14 @@

Submit your feedback - +

Video 6: ANN Uncertainty Set-up#

-
+
@@ -1287,13 +1287,13 @@

Submit your feedback - +

Lyle wants to build an artificial neural network that has a measure of its own uncertainty about its predictions. He wants the neural network to give a prediction/estimate along with an uncertainty, or standard deviation, for that estimate.

Let’s say Lyle wants to estimate the location of an atom in a chemical molecule based on various inputs. He wants to have the estimate of the location and an estimate of the variance. We don’t train neural networks on one data point at a time though - he wants a cost function that takes in N data points (input and atom location pairings).

We think we may be able to use a Gaussian distribution to help Lyle here:

-
-(73)#\[\begin{equation} +
+(73)#\[\begin{equation} g(x) = \frac{1}{\sigma\sqrt{2\pi}} \text{exp} \left( -\frac{1}{2}\frac{(x-\mu)^2}{\sigma^2} \right) \end{equation}\]

@@ -1316,18 +1316,18 @@

Think! 2: Designing a cost function so we measure uncertainty\(i\), the neural network predicts the mean (\(\mu_i\)) and standard deviation (\(\sigma_i\)) of the location given the inputs. We can then compute the probability of seeing the actual recorded location (\(x_i\)) given these predictions:

-
-(74)#\[\begin{equation} +
+(74)#\[\begin{equation} g(x_i) = \frac{1}{\sigma_i\sqrt{2\pi}} \text{exp}\left( -\frac{1}{2}\frac{(x_i-\mu_i)^2}{\sigma_i^2} \right) \end{equation}\]

The location of the atom is independent in each data point so we can get the overall likelihood by multiplying the probabilities for the individual data points.

-
-(75)#\[\begin{equation} +
+(75)#\[\begin{equation} \text{likelihood} = \prod_{i=1}^N\frac{1}{\sigma_i\sqrt{2\pi}} \text{exp}\left( -\frac{1}{2}\frac{(x_i-\mu_i)^2}{\sigma_i^2} \right) \end{equation}\]

And, as before, we want to take the log of this for numerical reasons and convert to negative log likelihood:

-
-(76)#\[\begin{equation} +
+(76)#\[\begin{equation} \text{negative log likelihood} = -\sum_{i=1}^N \text{log} \left( \frac{1}{\sigma_i\sqrt{2\pi}} \text{exp}\left( -\frac{1}{2}\frac{(x_i-\mu_i)^2}{\sigma_i^2} \right) \right) \end{equation}\]

Changing the parameters of the neural network so it predicts \(\mu_i\) and \(\sigma_i\) that minimize this equation will give us (hopefully fairly accurate) predictions of the location and the network uncertainty about the location!
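A numerical sketch of this cost (the locations and predictions are hypothetical; `torch.nn.GaussianNLLLoss` provides an equivalent built-in, up to constants):

```python
import math
import numpy as np

def gaussian_nll(x, mu, sigma):
    """Negative log likelihood of observations x under per-point Gaussian
    predictions (mu_i, sigma_i): sum_i of
    log(sigma_i * sqrt(2*pi)) + (x_i - mu_i)**2 / (2 * sigma_i**2)."""
    x, mu, sigma = (np.asarray(a, float) for a in (x, mu, sigma))
    return float(np.sum(np.log(sigma * math.sqrt(2 * math.pi))
                        + (x - mu) ** 2 / (2 * sigma ** 2)))

# Hypothetical atom locations and two candidate network outputs
x = np.array([1.0, 2.0, 3.0])
confident_right = gaussian_nll(x, mu=[1.0, 2.0, 3.0], sigma=[0.1, 0.1, 0.1])
confident_wrong = gaussian_nll(x, mu=[2.0, 3.0, 4.0], sigma=[0.1, 0.1, 0.1])
# Being confidently wrong is punished hard; accurate (mu, sigma) pairs win
print(confident_right, confident_wrong)
```

Note how the network can trade off accuracy against claimed confidence: when its mean prediction is poor, the cost is lower if it also reports a larger sigma.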

@@ -1347,14 +1347,14 @@

Submit your feedback -

+

Video 7: ANN Uncertainty Wrap-up#

-
+
@@ -1373,7 +1373,7 @@

Submit your feedback - +

Check out the papers mentioned in the above video:

@@ -1413,7 +1413,7 @@

Section 4: Embedding faces#

-
+
@@ -1432,14 +1432,14 @@

Submit your feedback - +

Video 9: Embedding Faces Set-up#

-
+
@@ -1458,13 +1458,13 @@

Submit your feedback - +

Konrad needs help recognizing faces. He wants to build a network that embeds photos of faces so that photos of the same person are nearby in the embedding space and photos of different people are far in the embedding space. We can’t just use pixel space because the pixels will be very different between a photo of someone straight on vs. from their side!

We will use a neural network to go from the pixels of each image to an embedding space. Let’s say you have a convolutional neural network with m units in the last layer. If you feed a face photo \(i\) through the CNN, the activities of the units in the last layer form an \(m\) dimensional vector \(\bar{y}_i\) - this is an embedding of that face photo in \(m\) dimensional space.

We think we might be able to incorporate Euclidean distance to help us here. The Euclidean distance between two vectors is:

-
-(77)#\[\begin{equation} +
+(77)#\[\begin{equation} d(\bar{y}_i, \bar{y}_j) = \sqrt{\sum_{c=1}^m(\bar{y}_{i_c} - \bar{y}_{j_c})^2} \end{equation}\]

@@ -1487,31 +1487,31 @@

Think! 3: Designing a cost function for face embedding Click here for the solution

We want the same faces to have similar embeddings. Let’s say we have one photo of Lyle \(a\) and another photo of Lyle \(p\). We want the embeddings of those photos to be very similar: we want the Euclidean distance between \(\bar{y}_a\) and \(\bar{y}_p\) (the activities of the last layer of the CNN when photos \(a\) and \(p\) are fed through) to be small.

So one possible cost function is:

-
-(78)#\[\begin{equation} +
+(78)#\[\begin{equation} \text{Cost function} = d(\bar{y}_a, \bar{y}_p) \end{equation}\]

Imagine, though, if we just fed in pairs of the same face and minimized that. There would be no motivation to ever have different embeddings; we would only be minimizing the distance between embeddings. If the CNN were smart, it would just use the same embedding for every single photo - then the cost function would equal 0!

This is clearly not what we want. We want to motivate the CNN to have similar embeddings only when the faces are the same. This means we need to also train it to maximize distance when the faces are different.

We could choose another two photos of different people and maximize that distance but then there’s no relation to the embeddings we’ve already established of the two photos of Lyle. Instead, we will add one more photo to the mix: a photo of Konrad \(n\). We want the distance of this photo to be far from our original photos of Lyle \(a\) and \(p\). So we want the distance between \(a\) and \(p\) to be small and the distance between \(a\) and \(n\) for example to be large:

-
-(79)#\[\begin{equation} +
+(79)#\[\begin{equation} \text{Cost function} = d(\bar{y}_a, \bar{y}_p) - d(\bar{y}_a, \bar{y}_n) \end{equation}\]

We could compare \(n\) to both \(a\) and \(p\):

-
-(80)#\[\begin{equation} +
+(80)#\[\begin{equation} \text{Cost function} = d(\bar{y}_a, \bar{y}_p) - d(\bar{y}_a, \bar{y}_n) - d(\bar{y}_p, \bar{y}_n) \end{equation}\]

But then the cost function is a bit unbalanced: there are two dissimilarity terms, and they might dominate (so achieving the similarity becomes less important). So let’s go with including just one dissimilarity term.

This is an established cost function - triplet loss! We chose the subscripts \(a\), \(p\), and \(n\) for a reason: we have an anchor image, a positive image (the same person’s face as the anchor) and a negative image (a different person’s face as the anchor). We can then sum over N data points where each data point is a set of three images:

-
-(81)#\[\begin{equation} +
+(81)#\[\begin{equation} \text{Cost function} = \sum_{i=1}^N [d(\bar{y}_{a, i}, \bar{y}_{p, i}) - d(\bar{y}_{a, i}, \bar{y}_{n, i})] \end{equation}\]

There’s one little addition in triplet loss. Instead of just using the above cost function, researchers add a constant \(\alpha\) and then make the cost function 0 if it becomes negative. Why do you think they do this?

-
-(82)#\[\begin{equation} +
+(82)#\[\begin{equation} \text{Cost function} = \text{max} \left( \sum_{i=1}^N \left[ d(\bar{y}_{a, i}, \bar{y}_{p, i}) - d(\bar{y}_{a, i}, \bar{y}_{n, i}) + \alpha \right], 0 \right) \end{equation}\]
@@ -1530,14 +1530,14 @@

Submit your feedback -

+

Video 10: Embedding Faces Wrap-up#

-
+
@@ -1556,7 +1556,7 @@

Submit your feedback - +

Check out the papers mentioned in the above video:


@@ -1594,7 +1594,7 @@

Section 1: Modern CNNs and Transfer Learning#

-
+
@@ -1613,19 +1613,19 @@

Submit your feedback
-
+

Images are high dimensional. That is to say that image_length * image_width * image_channels is a big number, and multiplying that big number by a normal sized fully-connected layer leads to a ton of parameters to learn. Yesterday, we learned about convolutional neural networks, one way of working around high dimensionality in images and other domains.

The widget below (i.e., Interactive Demo 1) calculates the parameters required for a single convolutional or fully connected layer that operates on an image of a certain height and width.

Recall that, the number of parameters of a convolutional layer \(l\) are calculated as:

-
-(83)#\[\begin{equation} +
+(83)#\[\begin{equation} \text{num_of_params}_l = \left[ \left( H \times W \times K_{l-1} \right) + 1 \right] \times K_l \end{equation}\]

where \(H\) denotes the height of the filter, \(W\) the width of the filter, and \(K_l\) the number of filters in the \(l\)-th layer. The added \(1\) is because of the bias term for each filter.

While a fully connected layer contains:

-
-(84)#\[\begin{equation} +
+(84)#\[\begin{equation} \text{num_of_params}_l = \left( N_{l-1} \times N_l \right) + N_l = \left( N_{l-1} + 1 \right) \times N_l \end{equation}\]

where \(N_l\) denotes the number of nodes in the \(l\)-th layer.
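Both formulas are easy to check in code. A sketch with illustrative shapes (3x3 filters over an RGB input versus a fully connected layer on flattened 32x32x3 pixels):

```python
def conv_params(H, W, K_prev, K):
    """Parameters of a conv layer: ((H * W * K_prev) + 1) * K."""
    return ((H * W * K_prev) + 1) * K

def fc_params(N_prev, N):
    """Parameters of a fully connected layer: weights plus one bias per node."""
    return N_prev * N + N

# 3x3 filters, 3 input channels, 64 filters
print(conv_params(3, 3, 3, 64))    # (27 + 1) * 64 = 1792
# Fully connected layer from flattened 32x32x3 pixels to 64 nodes
print(fc_params(32 * 32 * 3, 64))  # 3072 * 64 + 64 = 196672
```

The conv count is independent of the image's height and width, which is why the gap widens as the input image grows.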

@@ -1786,7 +1786,7 @@

Submit your feedback -

+

@@ -1858,7 +1858,7 @@

Parameter Calculator
-
+
@@ -1877,7 +1877,7 @@


Section 2: The History of Convnets#



Section 3: Big and Deep Convnets#



Section 3.2: What does AlexNet learn?#


Interactive Demo 3.2: What does AlexNet see?




Section 4: Convnets After AlexNet#


In this section we’ll be working with a state-of-the-art CNN model called ResNet. ResNet has two particularly interesting features. First, it uses skip connections to avoid the vanishing gradient problem. Second, each block (a collection of layers) in a ResNet can be treated as learning a residual function.

Mathematically, a neural network can be thought of as a series of operations that maps an input (like an image of a dog) to an output (like the label “dog”). In math-speak a mapping from an input to an output is called a function. Neural networks are a flexible way of expressing that function.
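The skip-connection idea can be sketched in a few lines. This is a deliberately minimal block (my own simplification, not the exact block used later in the notebook; real ResNet blocks also use batch normalization): the layers learn a residual \(F(x)\), and the block outputs \(x + F(x)\), so the identity path lets gradients flow even if the layers contribute little.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = x + F(x), where F is two convs.
    The layers only have to learn the residual on top of the identity."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        residual = self.conv2(self.relu(self.conv1(x)))
        # Skip connection: the gradient always has a direct path through x
        return self.relu(x + residual)

x = torch.randn(1, 16, 32, 32)
block = ResidualBlock(16)
out = block(x)  # same shape as the input: (1, 16, 32, 32)
```

Because the input and output shapes match, such blocks can be stacked to arbitrary depth without the multiplicative shrinking of gradients that plagues plain deep stacks.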


Prepare Imagenette Data


Section 5: Inception + ResNeXt#


Parameter Calculator


Now we want to look at the number of parameters.