{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "RJSnI0Xy-kCm"
},
"source": [
"![Meta---Logo@1x.jpg](data:image/jpeg;base64,/9j/4QAYRXhpZgAASUkqAAgAAAAAAAAAAAAAAP/sABFEdWNreQABAAQAAABkAAD/4QMxaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wLwA8P3hwYWNrZXQgYmVnaW49Iu+7vyIgaWQ9Ilc1TTBNcENlaGlIenJlU3pOVGN6a2M5ZCI/PiA8eDp4bXBtZXRhIHhtbG5zOng9ImFkb2JlOm5zOm1ldGEvIiB4OnhtcHRrPSJBZG9iZSBYTVAgQ29yZSA5LjAtYzAwMCA3OS5kYTRhN2U1ZWYsIDIwMjIvMTEvMjItMTM6NTA6MDcgICAgICAgICI+IDxyZGY6UkRGIHhtbG5zOnJkZj0iaHR0cDovL3d3dy53My5vcmcvMTk5OS8wMi8yMi1yZGYtc3ludGF4LW5zIyI+IDxyZGY6RGVzY3JpcHRpb24gcmRmOmFib3V0PSIiIHhtbG5zOnhtcD0iaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wLyIgeG1sbnM6eG1wTU09Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9tbS8iIHhtbG5zOnN0UmVmPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvc1R5cGUvUmVzb3VyY2VSZWYjIiB4bXA6Q3JlYXRvclRvb2w9IkFkb2JlIFBob3Rvc2hvcCAyNC4xIChNYWNpbnRvc2gpIiB4bXBNTTpJbnN0YW5jZUlEPSJ4bXAuaWlkOjlDN0Y5QzBDNEIxRDExRUU5MjgwQUNGNjU1QzlDQjREIiB4bXBNTTpEb2N1bWVudElEPSJ4bXAuZGlkOjlDN0Y5QzBENEIxRDExRUU5MjgwQUNGNjU1QzlDQjREIj4gPHhtcE1NOkRlcml2ZWRGcm9tIHN0UmVmOmluc3RhbmNlSUQ9InhtcC5paWQ6OUM3RjlDMEE0QjFEMTFFRTkyODBBQ0Y2NTVDOUNCNEQiIHN0UmVmOmRvY3VtZW50SUQ9InhtcC5kaWQ6OUM3RjlDMEI0QjFEMTFFRTkyODBBQ0Y2NTVDOUNCNEQiLz4gPC9yZGY6RGVzY3JpcHRpb24+IDwvcmRmOlJERj4gPC94OnhtcG1ldGE+IDw/eHBhY2tldCBlbmQ9InIiPz7/7gAOQWRvYmUAZMAAAAAB/9sAhAABAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAgICAgICAgICAgIDAwMDAwMDAwMDAQEBAQEBAQIBAQICAgECAgMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwP/wAARCAA1APADAREAAhEBAxEB/8QAwQAAAgIDAQEBAAAAAAAAAAAACQoACwYHCAUDBAEAAQQDAQEBAAAAAAAAAAAABgAFCAkBAwQCBwoQAAAGAQEGBAMDCAYGCwAAAAECAwQFBgcIABESExQJIRUWFyIYCjEjJEFhMyW3eBkaUTK0djg5lLU2d9dYcYGhQkQ1JrY3RygRAAIBAgMEBAsGBAcAAwAAAAECAxEEABIFIRMGBzFBFAhRYXGBkbEiMnI0FaHB0UJSM/DhIxbxYqIkFxgJU3NU/9oADAMBAAIRAxEAPwB/jZYWNCaj9TWF9J2NZHK2cbi0qVXZqdGwR5aj6ds00oiqs0rtWhGwGezU09KiYSpkAE0kymVXOkgRRUhzy95ccYc0eIo+GOC7R7rUnGZjULHDGCA0s0h9mONaipO1iQiKzsqkU4y424a4B0V9e4ouVt7FTRR7zyPQkRxINruadA2AVZiqgsFTtS31DeerpPqIaZKohhmqslTJM5G1I1S8WSdQAxhK8lYuSrT+Jg3CoDu6ds5dETAP0xx3jtZ9y67g3A2j2IfmPdNrGqOKssBntoYz+lHSZXkA/U6IT+gdGIGca977ivUrsrwTANNsFNA0oinkcfqZWjZEJ/SrMB+o4zvSr9RJfa7JtYLVpRXOQYB84STd3+iBXIWwwCZlClM4JSmkFCRE42KQwioQHzZYALvIJx+AWTmf3AtD1C2a95WXq2F8ikra3O9kilNOjtDSSSRnwHduu3bTpDrwH3wdVs51teP7Vru0cis8G7SSPx7kIiOPCM6nwV6MNP4ZzXizUJjyCyphu6RF7oliTOaOnIhRTcRwgIFdxsmxcpt5GGmY9QeBwzdpIuUDeByF3htWTxfwdxNwFr8/DHF1nLY63bkZ45ANoPuujAlJI2G1JEZkYdBOJ2cN8TaFxfo8WvcOXMd1pUw9l0r0jpVlIDI69DI4DKekDGstVOrzC2j6heuMuTyiK7/qW9TpsMRJ9cLrJNkyHVYwEYos3TBFuChBcPHKiDJqBygoqU6iZDmXKLkvx1zq4h+gcGW4aOPKbi5lJS2tUY0DzSAE1NDkjRXlehyoQrFQ3mpze4L5P6D9c4unIkkqILeMBri5cCpWJCQKCozyOVjSozMCyhlocw98zVDbLctI4haQ2JqemsJWldeR9XvL5w1THhIq+l5qppqpOnBA4lCpBwEMYQKIgACNpnBXcC5TaPoy23Gjz6zrRX2plee1QMekJHFcEFVOwFtpAqaE0xWjxh35eaGraubjhBIdJ0cN7MLJBdMVHQWkkgBDHpIXYCaCo24710f98ah3V9D0DVDCHx3MvFE2TXLDN02fUx47VMQiQ2uNZxUWvUUTqGEvVJEdMybwMuLdMplAjzzp7g3EOhW8/EfKecalYoCzaeyslyqipPZ3aSQXBA27tjHIeiPeMQuPvXJ/vxaDrc8PD/NCA6deuQq36srWzMaU36LGhtwTszqHjHS+7UFsMAtXTZ82bvWThB4zeIIumjtqsm4bOmzhMqqDhuukY6S6C6RwMQ5REpiiAgIgO1cssUtvK0E6sk6MVZWBDKwNCrA7QQdhB2g7Dif8UsU8SzQsrwuoZWUgqykVBBGwgjaCNhG0Y++2vGzE2WFhVLN31UmDsJZny5hmU0m5Ym5LEmTr5jKQmWV+p7ZnLvaHaZWrOpRo2WjlFm7WQXijKppnMY5CHABHeA7OqaU7oHzjaAejw4ZZNZjjkaMo1VJHSOrBpu2z3F8Rdy/AC2b8XRMpTn8DbJalXzHFifsJCx0ueYgk9jercx4JoP4uwwDxu8aOiJkTOJ1UP0rdYC8VzbPbSZG2ilQfDhwtLuO7i3ibCDQjwYIPtz46sTZYWNN6hs7490xYQyhqAytKeUY/xNTpe42NynyjPHKEaj+DholFZVFN5PWGTUQYR7fjKLl85SSAd5w29xxtK4jT3ica5ZEhjMr+6orhWYfq88Abh3aOcwiPjuAci0oAH+jeIRQ7t/5ft3fn2dPpEn6x6Dhm+uxf/G3pGGwcWXpvlHGOOcmNI1zDNci0OoXptDvVkHLyKb26vx9gRjXbhqItl3LFOQBJQ6Y8BjEES+Ahs1MuVivgNMPaNnQP0VAPpxnm3nHrE2WFibLCxNlhY8iwT
0TVoGbs888LHwVciJKemn501liMYmIZLSEi8Mi2TWcKlbM25ziVMhzmAu4oCO4NsgEmg6TjBIUFj0DAxcQd7DtkZ6ybRsO4o1PRlsyRkifZ1im1pPHOXotWXnX4HFow6+boEbFMjLCmIAdwukmBtwCYN+3S9lcxqXdaKOnaPxxxx6jZyuI0erk7Nh/DBUduXHbibLCxNlhYmywsTZYWJssLHiWWyQVNrlgt9olGkHWarCStjsU0/U5TGIgoNivJy0o9V3Dy2jBg1UVUNuHcQgjt2adp97q+oQaVpkTzajdTJFFGoq0kkjBERR1szEKB4Tjmvb2106zm1C+kWKygiaSR22KiIpZ2J6gqgk+IYrue4drdu2vDUNM358pJs8dwLp7WcL0RQ6gpVun9WUiDxZgkdREbbbzoJPJVUvMOZYU2xTmbtW5SX7cg+TWjckeAodChEb6/OqzahcilZZ8u1QxodxBUxwqaALmkKiSSQmn7m/zN1PmpxfJq0pddHiZo7ODqjhrsJUVG9loHlO0k0QEoiAG30QfT5Vuw49hciazrFdYiz2eOSkmOG6U7Y19zUWTxMirMl4sLxhKvHFkMgcDLx7RJsVgp92osspxkThvzm7+Wo6fr03D/ACgt7OXTbaQo1/cK0onZTRuzRKyKIqiiyuXMo9pURaM0muWPdGsrzSItY5kTXMd9OgZbOErGYgdo38hVyZKe9GoURnYzMagas1+9g59iSlzWXtINgtmRYSttXMracRWwrOTvDaGap853KUeYh2EcnaTMEimUUi1Wib4yJBFBV0sJUBJ+RXfmh4q1iHhTmxBa6fe3DBIb6DMlsZGNFS5jkZzDmNAJlcxhiM6xpVwxc2e6hLw/psvEPLya4vLWFS0tpLRpwgFS0Doq73KKkxFQ9B7DO1FwMft1dwTI2gnKnn8aWRteIbWok2yji8r3kt5xsmmZJpYoIXHG1jLjBiYDIL8IA5Q42yo8BynTkj3gOQ/D3PHhjsNyY7Xiu1qbO8y1aIk1aKQCjPBJ+ZK1VqSJ7QIb4hyd5t6zyp17tUGe44fuNlza5qLJsosiE7ElQ0o9KFao2wgr17Qa3qA7w+r99MTMspHQzoiUrP2BNNw/qWHMTt3igRUDX2ih0EnDw4LHRYteJJaTklFnLgxQ6twm365rfLXuYck4rbTIlnuKFbeOoSfU75lGeaZgCQuwNLJRlghVIYwSIY2CtL0LmP3tucs0mrO1vGrVuHoWh02zRiFhiUkAttKxJUGeVmmcgGWRWjMYdtTRRi6ltqY0wHQrkBWhW8nZ8jQMbdrbNr7gFd88mZlqudkquoHECTEjRskPgkkQA3bVP8Wd6Tntxbrr65NxFqNj7dY4LKV7W3iHUixRMAwA2ZpTI7fnZjizvhfu1clOF9FXRYtAsL32KPNeRJc3Ep62aSRTlJO3LEI0X8qqMBO7o/agrGHKhKajNMkY/ZUmEOLrJ2MRdO5YlXjnK4F9YVFw8O4kvTzJZUpZBkqosLJI3UJGK2IqRGd3dM74OrcbazDyy5qyxya9OMtjfZVjM7qPlrgKFTfMATDKqrvWG7cGVkLwn70fdQ0vg7SJeY3LKKRNEgOa9sszSCBCfmLcsS+6UkCWNi27U7xSIlYJtPsha45OWWU0cZNmln52ca+msGSsk4FV0mwi0TvbDjbnKGMqs3j2CaklFEHf07ZF2hxAkRqkQR7+nIK0s0HO3hSBY1eVItVjQUUvIQsN7QbAzuVhuD+d2hkpnaV2Ku5Dzxurtzyc4nmMjJG0mmSOasFQFpbOp2kIgM0A/IiypXKsSBkrar3FkmJssLFP5r4SUW14azkUUzqrK6s9QySSSRDKKKqKZetxSJpkKAmOc5hAAAAEREdi+D9hPgHqwC3XzUnxt68EJ7EHcEd9vrXFEwuRZNzAYKz05jsQ5uZSxlWLOpSgSayFGyJJtnAogzcY/sz1VB8osG9tDSMiPAKgEAOe+t+0QVXbIu0fePP66Y6tNuuy3NH2RtsPi8B83qriz62GMGGJssLCNv1UfcR9Q2ipduzGU4Iw9NWhcnajXEe4HgfWx4yK/wAaY3eGSMQToV6GfBPv0D81FVy+jDBwrMjAD5pdvQG4bpOwfefu9OBzWrqrC1ToG1vL1D7/AEYTgfR7+Lcizk2LyOdlSbODNXzZZo5BB62Res1hQcETVBJ2zcJrJG3blEjlMXeUwCLxWvRhhII6cXGGkz/Ctpn/AHfsNfs5rewfN+63xH14PIP2U+EerHQO2vG3Gj8mam9N+FnfQZh1AYUxVICRNQI/I2U6PSX5k1SlOkcjKyTka6OVQhwEogQd4CAh4be1ikf3FY+QE41vNFGaSMqnxkDHv41zhhbM7Vd9h/L2MMrMmpCqOneN79VLw2bEOJSlM4WrEtKJoFMYwAHGIeI7tsMjp74I8opjKSRybY2Vh4iD6sbR284940Rn+zVr2SzawGxQJHoYryQ1M1UmY1Ncjn0hMpigomo5KZNUqngIG3CA/btsjB3i+UY1ykbtto6D6sVdnZpWQbd0jRC4croNm6GdK8qs4crJN0Ek02siY51FljkTIUCh+UfEfD7die9+Vf4cBth85HX9WLWYblTygJjWutgUAEREZ2LAAAPERERdbgAA2FaHwYNcy+EYyFNRNVMiqRyKpKkKomomYp01EzlAxDkOURKchyiAgIDuENsYzj8UtLxUDGvJick4+GiI5AzmQlZZ62jo1i2Ju43Dx88URbNkCb/E5zFKH9O2QCTQdOMEgCp6Mc2sNcGi6VsAVOM1daY5G0GVK3LXmOecWO5o7gxgIDZONQtSjxRwJx3cspBPv/Jts3EwFSjU8hxqFxbk5RImb4h+OOlVZKOQYeaLv2SMZyU3PmKrpBNh06oFFJx1Z1Ab8lUDlEp+LhNvDcPjtqoejrxuqKV6sfiZ2SuyLgrSPnoV86OBjEbM5Ri6cHApTHMJUUFzqGApCiI7g8AAR2zQ4xUHYDgLfftz4+xXoySxhAvTM5/UJb2tMdnROKbktEryRbLcTInKIG4HrlCNjly7hA7WQVKPgO01O4rwBDxbzfbiS+TPYaBZtcCu0dplO5twfGoM0qnqeJT1Yit3u+Nn4X5aJolq+W91m5EJpsO4jG9mI8pEUbeFZGGAK9jjSbH5/wBY7O9W2NTkaHp2iW+S3rR0kVZlIXlV8DDG8c6IYogPSyqbiYIA/Cc8PwG3lMIDODvr8y7jl/ykbRdLkMet8QSmzVgaMtsFzXbqfGhSA9YFxUbRURU7qvBcHGvMZdUv0D6Vo0YuWB2q05bLbKfI4aYdRMNDsNMPUbUlYtVxNlhYr3e6OTA77WPl6w6b40jHHTuwqNJ5Rgo2NW3uSUTrEuc1TkGqZUmlSmpkqhm4FOoiq5Kss3ErZZBJO/zu66XzC0rkxoicyJN5rbW4MYYMJorVgDaxXJY1adYqZqhWUZY5Kyq7NTTze4k4D1zm1rNrwImTT4ptrKQYZ5lqLqS3A2CIS1oASrCssdI2VQSX6f3WRWsbZEtOky6toSJbZllC2bHNuFq3aSTi+xca
KDmjTUqbhO7j5yIbGVhklDlK3kiLIpFOrIlAsZO/byn1TiPh605naTJPMdGiMNzb5maNLaR83aYo+hWSRqXBAq8RR2IW3NZEd0rj/TdD1u64H1COCJ9VdZIZwoWR540yiCWTpZWjH9AE+zIHVatNhv3ap7Fh2PMmoaKscNLV6dj2stBz0Y/hpmKfJFXZScVKNVWMjHvEDgJFmrxoudNQg+BiGEB26rG+vNMvYdS0+R4b+3lSWKRDRkkjYMjqR0MrAMD1EA45r2ytNRs5tPv41lsZ4mjkRhVXjdSrow61ZSQR1g4QiyZCWbQprasEdWl3JZXAeY0JiqLrKnSWlK4wkm1grAPzgG86VhqTtuR0XcYh03ByjxFHx/Q9wtqOld4HkRbXOqKps+ItEMdwAARHM6NDPk8BhuFcxnYQUU7CMUJ8S6fqfIvnZcW+mswutA1kSQEkgvCrrLDm8UsDIHHQQ7DaDh8+o2eLu1UrFyg1RWhbbXoWzw6xgADKxc9GtpWPVMACIAKjR2QR3CIeO356dZ0q70LWLvRL8Zb6zuZYJB4JIXaNx5mU4vl0jU7XWtKtdZsTmsru3jmjPhSVA6HzqwxkOzbhwxT+69znT146zVEznTUJq01CnIoQwkOQ5cv24xTkOUQMU5TBvAQ8QHYvg/YT4B6sAt181J8bes4NN9SNoBd4IzpRtaNHhio4r1axkW6vPl7MjeNq+oRrXWz6zJKFRIVJsTKES2PPIcRjKOJJGXMPCQhA24tOuN4hhb3k6PJ/Lo9GHDVrXdyC4X3H6fi/n0+nDLf07vcPHWnoxYYvv86Mjn3SuhB44uSj5wZWVtuPDNV0cWX5U6xjrvXK8PGKw8ksc6q6sjFHdLCUXiYC2ahb7mbMv7b7R5esYdtKuu0W+Rj/AFU2HxjqP3ebBONf2sak6DNJ2W9TF06V4ekwRmtJrDhcUVLxkmdEYyi09uCZyujJSs6smZ6oiB1GcYi5dCUSIH3c1vC08oiXr6fEOvHZdTrbQNM3UNnjPUMVxfbN0mZH7unccbkyu+lbPXZm3zeoLVXdlTqIKOqp6iTlrDFpu0TJFYSmQrDJowrFNAQOzTeHcJJiizUApHcyraW3sbDSij+PB04FLSB7679vaCczHxfz6Mao7xBSp90DW42SImi2YZ3tMYxbIJJoN2cbFkZx0awaoJFIkg0YMGqaKSZQApEyFKAAAberP5VPhx4v/nJPiOLRPSZ/hW0z/u/Ya/ZzW9hib91viPrwYwfsp8I9WFMe/t33sjY+yNbdDeii5OKVJ0xRWB1AZ0rboE7W2tXCQX+LcbTDc4nrStaA3JnZZAxZIslxsW5motHB3LtYWKsonnFa9A+8/dhk1PUnVzbW5oR7xHTXwDweM4BhpN7HXcm19U1HPFXp8RW6PdTnloTJefbq9rS+QSOBMc9gh2gx1mu05FvB3GSlFWJWbwDcSK6oAYQ7pb62tzuyfaHUB0fdhtg067uV3qiinrY9P34wHU925u5F2j7TT8yW2MsOOG7eZatqdqFwZd3j+tMLIYy7ltCL2qCNFzdalHqTA502ko1ZlfpEOCQLFIqUnqK4trsFBQ+IjHma0u7EiRqjwMD9+HS+wj3eZXuLYqsuJs5uYpLVVhCKjn1hko5u3jW2XMdOXCUUyyS1h24Ebxs/FSqiLGwoNyEZFdOmjlAqRHvStWW/tBbuHT9pvsPg/DBBpl8btCkn7y/aPD+P88LTa2ewX3NrjqP1b55gML1I+MrRmnPGW4aXWzLi5ByvRpm7Wq4sJJSLcWhOTbrrwLkqotlEirJmHlmKAhu2coL+2EaRljmCgdB6aYaLjTLxpnkCjIWY9I6Kk+HABsE4SyJqQy/j/BeJoppOZIydYW1Xp8S+lo6DaP5l0mqqi3Xl5dy0jWBDEQMPMWUIQN27fvENnB3WNC7+6BhsijeaQRptcnZgxQ/TXd3IAEfYWljuD7Aznh7eP5g33EA3jtx/UrT9R9B/DHf9Jvv0D0j8cP6qZZqegPt7U7JOoxYlVidN2mvGcff46PeMpZ0e01ekVuqkpdddJuE4+am5+4FSiY0SqlQdO3CX3hUzCcGDIbi4Kx7SzGnp6fRgmzrbWoeXYEQV8oHR6dmK3HXB3HNafdgze2hptzcZOu2G0+VYX0tYwLOStbieseiSvxbKrxCQusgX1VPgBxLOmyrxwvxcgjVty2qRHBbQ2iVFK02sf42DAnc3dxeyUNaE7FH8bT48dT1D6ajuu2yoNbWvifHtRcvWZXren2/LVSj7eBFCcxJB0yj15WKjXihd29Fy8RUSEeFQCGAQDUdStA1Kk+OmzG9dIvWXNlA8RIrjRl5z13D+35hfUL20tVlQv8di7M1GjoyLxrk+Rdu46iyEHbIGyQGQsIWtstMwr2sjJVwWr1nEu1oR/wAagG5btLmE2LHb3DrcxEZlPSOvxH+K41NLdWsb2k4ORh0Hq21qD+GzG7fpqyFN3Z8GHEB4k6pmnhHeYADjwzfQNvKA8I7wD8oDu216l8o3lHrGNukfOr5D6jg4ff8AckvrhqSpOMxVMZjiivPToIcQimVe9w9JnHCok/qgocrUgb92/cUNraP/AD+4Ti0vlpe8UKv9bVrhQT4rWS5iA8gzH04rd77PFTX3MC04cZv6WmwMQPHcR28hPlIA9GCHfTz44b13TJl7IhkkiyV9zEaDMqUoc08PRarDHjyKH3bxAkna34gH2Bxfn2jn/wCh2vSXfM/SOGwT2ew0YS06t5dTyBiPKkEWPvHcc0lIeXep6+ab681Ux168lvDHl/1zSYYA2r+xNjAe+8BraHTXhUMTUOX6XM2bI2QjWbhmvwP6Zjw3Mj7JbCnSEVmclKiY8bFqfdmBUzhwkcFGW4Zq9yzkOOaPHX948Qw5+B9BlR2DCqXN5seC327GSPZPONoyiKN1yz1EPu+BztPLXgr+09BmycZ63E6KVNHtrTak0+zarybYYDsOYySI2aGmAydqLt31vV85yneszRDxxhiBrs1j+HKkdRqrNZGs0OZIJKLdEHcC+OIp8nIFEwbiyLpiYOMpFibTa75XeMv+UdhpnCnBsyDjO8njupagMIrKCUHK6n/9kqGLZt3Mc49ksjYh93P+Q9tzK1W+4w4ojf8AtWwikt4aVXe3k0ZUlSOkWsTiQg7N7JAfaCuuA5Z6w1k7RrqMteL5948hb9iK5NXletMSK8ed8kxctpylXyuLgcV2yEqxFrINTAbmtzHAh+FVM5Q+58EcXcOc3eX9rxLYok2h6raFZYXo+UsDHcW0o6CUbPE4plYCoqrAlm4q4c1vlzxhPol0zRarp9wDHKtVqFIeGeM9IDLlkXbVSaGjAjD3vbn1kw+trTPUsmmWYt8iwZU6fl+vteBEYm+xLVDrJFuyLuFvCWxoonJsQDjTTScGb8ZlW6u6krvAco7vk3zFuuHArtw/NWexlapz2zk5ULdckDAwydBJUSZQsi1tI5PcxrbmZwXBrdUGsRf0buMbMk6AVYDqSUUkTpADFKlkand+3xLH1PCcnfVp6Vd1rs7AggCQX3D
1IsDtUAAOpfxchZKec5t3iYU4+ttSbx/IUA/Jtdl/5/60+pcin06Rq/TtauoVH6UkSG5A87zufPinfvz6Qmn86k1BFp2/R7aVj4XR5revmSFB5sMYdsu0r3DQdpnlnKhlVmmPgrHEcwmMCVJnJimtiCIiI/A1gSAH5gDas3vUaRHoveE4qs4gAj6lv/PdRR3LelpTixTuz6pJrHIjhm7lJLpp+581tLJbr/piGO69o/4+6Yp+9fX+O/Wf+9lqH/a9bti+D9hPgHqwC3XzMnxt6zi0Z1caP6Nrt0N2rTPeitmqd7xpBKVCyLN+etSMiw0Szk6Lc2nAUXAeSWBBEXSaRiHeR53DUxgTXOAi8UzQT71eo+kdYwYzwLc2xhbrGzxHqOK4rt/an8q9oPuNMZnI8RMQKWP7pP4L1P0EnGuvIURWcTh7qg3RQMCcu8rEjGt56HOkcEXrqOb8Kgt1jCYjuIkvLai9Yqp8fV+BwKWsz2N3V6ihow8XX6OkYIT9St3IorVhqMrGmfDtujrNgDTq3bTDyfrMuzmKxkbMFshG7uRsUfIxjlwwlYqj1mRTh2KgDxJPVpXhMZNYg7c+m2xijMrikjfYP5/hjq1e7E8ohjNYk8HQSfw6PThpDsF9u8ug/RTBTN4gxjtQeo4kPlLLfWN+TLVmKWYqGx1jJwByprIDTq/IKOXqCheYjNyj9MTGTIlwtd/cb+ai/trsH3nz+rDzplr2a3BYf1X2n7h5vWThDPvGf5o2uf8AeFu/9pS2fbP5WP4Rgav/AJyT4ziy0p2ST4a7blUy8mmksrivRFA5GSRXDeiurScENLKkiqXeXiTWUjAKIbw3gOw2y57kp4Xp6TguV93aCT9MdfQMVZWmSVw7fNX+K7RrLuj9jhqey80u+oG2OIydsclOQXmy9qtrV0xrTKQn3bq8O0jsFVWyCiqRnwrbtxBECiUOsJEI9ulB/HiwGQmN51Nwf6ZarH7T6cWHsd9RZ2dIiPYxMTqJkIyLjGbaOjY2OwFnBlHx0eyRI2ZsWLNtjZJu0ZtG6RU0kkylImQoFKAAABsPHTrwmpXb5R+OCkarYAUD7Phb8MaG1Zd7vssaqtNWbtPN01CP5WEyvjmzVUib7A+cVSx065j1V6pYWgr47IkhLVe0N2ciyWES8l21TPvDh22RWV7FKsirtB8I/HGqfUdPnhaJm2MP0nzdXUcKK9hfMU5hrur6UnkS8VQj8i22Tw5aGZDmIhMQeSoGSgWzN2UBLzEWVnPHSCZR8OoZJjuHdu2d79A9q9eoV9GGPTZDHepToJp6cWb2oD/4Gzb/ALo8k/8As2Z2GY/3F+IevBhL+23wn1Yq1uzJ/mm6HP8AfxWv7PIbFF78q/w4DdP+dj+LFsLsKYNcKR/VvZnm6vpk0xYMi3qrSMy5lu13SzJIH4BkY/ElcjG8ZGPADxUYnmcipO+AfAXDFI32kDZ20lAZWc9IFPT/AIYY9ckKwpGOhmJPm/xxzR9JRpRpc251F6zLLEspe302YisHYsdu0E1z1I8nAls2SpiPBYpwbS8zES8RHpOkuBZJkd6hxCm7VKO3VpWGWEdB2n7satDgU57g+8Ng8XWfu+3DuezJghwAv6kvBGM8pdrzLuSrdX27q96fpah3fFtoRSQJLwElZciU2hWaNB6KYuT1+xVyxKleMwOCKzls0XMUVGqIl79Ndlugo91qg+gnDZq0aPZs7D2loR6QDhSX6ar/ADZMH/3UzP8Asav2ztqXyjeUesYZNI+dXyH1HBk++HSZeI1pzdseoKJxV2rtXPCrHIIJuArtNqcTIckwhuMCTr4TbvsHa5buG6zZX/I6DSIGBu7G5nEo6131zcSJXyrtGKpu+tpl5Yc45tTmUi1vLeExnqO6t4EenkbYcF+7ClrhJTR9a6i0dJDO07MllUmWG8oOEWdkgq2/hpA6YCJumfi1cpJmHdxHaKAH9XaGf/oRot9Y857PWJkP0++0SERP1FoZZklSv6kzIxHUJFPXiWfcV1myv+Ul1pcTjt9nrE28TrCzRQtG9P0tR1B6yjDqwVvPec8facMU27MGTJUkZWapHqOOSQyYyU9LKFMSIrUE2UOTrZyde8KDdPeBAEwqKGIiRRQkRuXfAHEnM/i+y4L4VhMuq3koWprkijG2SeVgDliiWrudpIGVQzsqmUfHvHPD/LjhS74w4mlEWmWkZNBTPLIdkcMQJGaWVqKg2DbmYqiswRpu9tzT3DdWQyBWgymSM0W9pB1evJOHCkNU4MgCjFQ7dYUjGZVimwDcyztzygHlIOHiwCodUxr+NB0bgXu18nezF9zwvoVk0s8xAEtxKdskhFfanuZmCxpm95o4UIVUAo11vVuNe8NzZ7QE3vEmtXixQRAkxwRDZHGDT2YbeIFpHp7qyTOCxYl4jTfgeo6Z8KUDCtLIB4mlQqTR1JnRIg7sM86Od9YrK/IUx+F5OzLhZwYnEYqJTlSIIJpkAKD+Z/MLWeafHeo8da6aXl/OWWOtVhhUBIYEOz2YolVAaAsQXb2mJN4fLfgPSOWfBOn8FaKK2llAFZ6UaaViWmmcbfalkLORUhQQo9lQMB+76mhc+dcOtNTWO4UXeU8FRDklvaMUBO/tmHSrLSMn8JQEXDzHbxdeURDeX9XryH9c4IE2lP3JudK8FcXty44gmycM63KNwzGiwX9AieRbpQsLdP8AVWD3VznHwfvT8sH4n4aHG+jRZtd0qM74KPals6lm8ptyWlHR/TM3Scgwvd2wNbkjoh1Ex1kmHDxbDmQisajmGFbAsvwQguTmibkyZJcfPmqO9cncpgUh1VmSrtsTcZwByz/7yfJODnRy+k06zVF4vsM09hIaD+pT27dmPRHcqAhqQFkWKRqiOhhlyQ5tS8reNEvbpmPDV5lhvEFTRK+zMFHS8DEsNhLIZEFC9Q/dCTcPZYaJsVelGE3AT0YxmYSZi3SL6MlomTapPY6SjnrY6jd2xfM1yKpKkMYihDAYBEB2okvbK7028l07UIpIL+CRo5I3Uq8ciMVdHU0KsrAqykAggg4t1tLu2v7WO+spEls5o1eN0IZXRwGVlYVBVlIII2EGowo137LZETWrukV2PXTXfUzCVcj50E1CHFnIzFpuE+2YrlKYTpLhDyDZxwmABFNyQweA7XK/+eekXlhyZv8AUrlStvfa9M8VQRmSOC2hZx4RvEdKj8yMOrFSPfy1S1vubllp9uwaey0SFJaEey8k9xKFPgO7dHoepwevB7O1LCO4Ht/acmr0h013les02UhwEB6Sfv1rmY5Qu8AHgXjnySgfmNtXp3vb+HUe8ZxNNAQY0uYIqj9UNpbxOPM6MPNiePdUsZrDkDw5FOCHe3mkof0y3U8iHzoynz4IbtGzEhcU/evr/HfrP/ey1D/tet2xfB+wnwD1YBbr5mT429Zxbs0T/Yem/wB1K7/qhnsJN7x8uDhfdHkwn59SF2gcsZuynQtZGkPEdmyddbui0x7n2iY/hlJewO5KBjeCh5STimZTu3pFoBiaEllg3EblYRhgKIqrqA76beIiGG
YgKNoJ+0ff6cMWrWLyOLiBSzHYwH2H7j5sDt7MnY61K3LWvR73rM07ZDxNgzBwtspvmOTqu6gWmTLpByDY1EorJrIFDzSP8+AknLEMkq1Ujo5Rotwi8T39F5fRCArCwLts2dQ6zjl0/TpmuA1whWNdu0dJ6h95/nixA2HsFOKmzvGf5o2uf94W7/2lLYrs/lY/hGAm/wDnJPjOLJVDHkll3tbNsVQqIuJrJWgdrQ4ZAo7jKy9t09pwMYmUd4eJnz9MNhzMEus56BJX7cFmQyWWQdJip6VxVoaT8eYhyNqgwvirUdabNjXEt2yRC0TIVwgDxUdPUptPvBgkJpZayMJCLjWULOOm6kio5bqAgyTXNw8RQ2KJWdYmeMAuBUePAbAkbzKkpIQmhPgw7p/KP6KP+ZHVL/peJv8AhtsyfVp/0p9v44Ivodv+t/s/DE/lH9FH/Mjql/0vE3/DbZfVp/0p9v44X0O3/W/2fhjdOnH6Y/SVpoz5h3UHUc+6jZuz4YyLVckQUNYHONDQcrJ1OWby7OPlgjaEwfjHO1mwEWBFZNQUxECmAfEPEmpyyxmMqtGFOv8AHGyLR4IZVlVnqpB6urzYPrn4pj4JzWQhRMc+JMjlKUobzGManTIFKUA8RERHw24I/wBxfKPXhzl/bb4T6sVZ3Zrct2ndK0NKuVk0Ez5/qLYp1DAUpnD3q2bREBH7VHDpciZA/KYwB+XYovPlX+HAZYbL2P4hi2M2FMG2E9Pq8sbTEphXRxlxo1WVhKXkzJ2P5p0QhzpNn2RaxW5+AKsYoCVIFk8avwAR3AJgAPt3bPGkMA7p1kA+j/HDFrqExxv1Aken/DHlfSKZvqrjFurHTcu/bNrvEX+tZvi4xVUpXk1VbHXY6hzr9gjvE6rasS9Wjk3ZtwAmaXbB48fgtXQ50k/LSn34xoci5Hh/NWvm6P48uHINmfD9gIn1FF1qdS7SOpiNss8xh5C+usUUylsnSnC6stqNlql2jyOKSDeZw9TrdYkX5wDwI1ZLKD4EHbt05SbtSOqpPoOG7VWVbFwTtNAPLUH7sJ2fTVf5smD/AO6mZ/2NX7Z41L5RvKPWMMWkfOr5D6jh1bvB6M7DqhwTDXPG0OpNZVwk9lZ6LgmLcV5a3U2abNUrbXYpFIAVeTaB4tm/Zo/GdbpFW6JDLOCAMqe5Xzv03lPzBn0PiiYQcIa9HHFJKzUjt7mJmNvNITsWI7ySKRtgXeJI7BI2OI6d8Dk5qPM/gSHWeGoTPxVojvKkSislxbyBRPDGBtaQZI5Y12lt28aAvIowqbp61O510lXWQtmGbc9p0y9b+T2WIeMW0lCTrVqsoJGFirssguydLR7gxxRUMQjpqc5+Uonxn4re+ZPKjl/zj0KPRuOLKO9sY23kEiuySxMwFXhmjIZQ4pmUExyALnVsq0ql5e8z+O+U2tyatwbePZ3rru5o2VXjlVSfZmhkBVihrlNA6EtlZatXI9Q2rvUprGn4BPLdzk7iZi7TaVKlQMW3i4BnJyBisyDEVaCbJIvZyQOoCQLqEcPVAMCQH4OEgNnLbkxyu5JadctwbYxWQkQtcXUshkmaNPaO8nlYlYkAzZAUiFM5WtThx5h83eZXOO/t14uvZbwxuFt7aJAkSu/sjdwRABpXrlzENIa5Q1KDDMfaW7dTvTDV1835jiE0c632IKzi4F0Qiq+Lqa8FJypFK+JiI3CwmTTPImAROzQIRoUSGF2ClWHfG7zEPNfVl4C4JmLcv9OmzSTLUC/uVqokHWbaGpEI6JHLTEECErZh3S+7rLyw0tuN+MYQvHV/DlSJtpsbdqExnqFxLQGY9MahYgQTKGNJtBjE0sfNVJJdJRFZNNZFZM6SqSpCqJKpKFEiiaiZwEp0zlEQEBAQEB29KzIwdCQ4NQRsII6CD1EYwyq6lWAKkUIPQR4DhJ3uxdsue0rZEl8yYhrTp7pqvMod8mnEtVHCeH7DIqmO5qUwmiU5mlScujiMK9MAJJkODFUQWSSUdXS91DvJafzT0CHg7i25VOZNlFlJcgG/iQUE8ZPvTquy4jFWJBnUZGdYqp+8lyLvuXesS8U8OQM/Ad3Jm9gE9ilY7YXp7sJP7Eh2AHdMcyqZOdNO3cx1j6ZKF7Z4xyiX0Q2Kp5FA2uvwtub1MzlVddwFXVm2blzFNVXLgyotOM7IFRMcEQMc4m+r8wO7FyZ5m66OJuKNLP1tqb2WCaW3M9AAN+ImUOwUBd5QSZaLnoFp8m4N7xHNfl9o50HhzUR9IWu7jmijnENSSdyZFJQEktkqY81TkqTXEcPY3znr11GtK83kJq7ZHyXYPOLveZgqr5GCiTLIEm7hZHCYJIMYSAYcJUkScog8KLNqTjOgkJtxbxZwD3fuWL6lLHBY8M6Xbbu1tY6IZZKExW0INS0sr1LMcx2vNK2VZHx884c4T44548x00+KSa94h1K43lzcyVYRR1AkuJiKBY4loAoyjYkMQqUTD92PKNA4xoVKxxV0BbVuh1Sv0+CRNwcwkTXIprEMOcKZCEOuZs0KKhgAOI4iP5dvzy8Sa/qHFXEN9xNqzZtT1C8muZTtoZJpGkelakDMxoK7BQYvd4e0Ox4Z0Gy4d0tcunWFrFbxDrCQosa1pTbRRU9ZqcZjsy4eMU/WvkQ+e7WgO8N3zZah/Hf4eGXrfv8fzbF8H7CfAPVgFuvmZPjb1nFu1RP8AYem/3Urv+qGewi3vHy4OF90eTGV7Yx6xNlhYmywsVNneLEB7o2ufcO//APQ14Dw/pB0kAh/1CGxXZ/Kx/CMBN/8AOSfGcWiOksQHStpnEB3gOn3DIgIeICA45re4QHYYm/db4j68GMH7KfCPVhFz6gfsyZDwBmDIWtTTrTZK16bcpTUleMnwtZjnD59gm+TbpaQtT2TjGRFlUMXWWVWUftJBMhGkS4cqMFit0iMjuXzT7xZEEMhpINg8Y/HA5qmntFIbiIViY1PiPX5vV0eDGv8AQr9TZqz0p41ruHsxY9reqak02NZQlQnbJaZSk5SiIJgQG7KGk7q3irSxtbGMZEKk1O9jRflIQCqO1CgUC+p9MhlYuhKMfOPRjzbaxPCgjkAdR0baH07a46Cz19WxqXuVYk4LT9ptxrhCafomboXi2WySzBMw4H4d72GhFq3SKySSS3CBBft5Nr47zIG+zbXHpMQNZGLDwdH442Sa5MwpEgU+Emv4Y6v+mhz33MsoZQzDMZXgr5lnSPlqWsd+tuccqzL5j6bzSLZMDOsWvZVqsa7pWnp0GEvDRxU42KTSQdFWaGRFpIatSjtlRQlBKNlB4PH4PL/A36RLeO7FwWgY1JPh8Xh8Y6vW5TIMGkowexj9AjljItHLB62UDem4aPETt3KCgflIqioYo/mHZm6NuH8iooejFSZrH01Zx7X+uGw0RUs5TbTiLJTPIuB8hJt1E0rFVIizDO4syRWnrhJRpIfDHodQUorFaSbZw0W+9QVKBbDKlzAG6QRQj1jAPPDJZ3BXaGU1B8XUcH2qv1dmoWOpkdGW/SJiSz3ttHJN39uichWyr1+SkE0gI
aSGmKQdgdMyuFA4zoJy/CAiIEEhdwA3nSIy1VchfJ9+HNdclC0aNS3hqfV/PDVOoLTtW+6d23mGNspIx1UldQeEMbZJiJiITcSLLG+VZOrwl5rE/DA6OhIPomv2ZwVFZEVEVn8UddsZQnPMYGuOQ2tzmTaFYjyjow9SxC8tMj7C6g+Q9P8AHixWuScRra7QGsVE6pLFgzUJiWUcniJhJt11XulZeGWZHkIlV+1GCyHjK5MSHIPEmogsXiTVIk6RMRIkBgvIepoz9n4EYEiLiwn61lX0H8QcHxrX1depJjTm8datJWF7De0WSaC1rirrdK3XHT0iYEM+VpazSfepkVOHEZJOZIG8RApihuAOA6RHXY7ZfIPX/LDmNcmC0ZFLeGp9X88Ck1OZ37ifeFr+Z9VuXXLVLAGkeqqWB+zh2EpVcI44c2icgICNplKYnNNL2HJtvfS7PjO8dO5EzFHmOHKTVJAm3VFHb2ZWJP3HPnPjPixxTS3d+Gmf9pB5APEPGcbd+mrMUO7Lg0omADGqmaOEoiG827DN+37g+0d2/wAdvGpfKN5R6xjZpHzq+Q+o4sz9hrBdgC/ch/hWes1vfPqPdXqVPVny9e3fuD1/Efi9fdV+I803fb1X4nh4eLw3bWGd2D/t19DX+wMv9oZB2f6x2zseTZ8pl9nd/wD1+xWtNtcQM7yH/Vb603985v7qzHf/AEnsna8235rNtz/H7dKV6sZN20P4YfqFf5deP3X3H8k98/QHuvyOX+N9FdD+N4eV+n6X77l79/3fFs1d6f8A7XfTF/5Mp/Z+ze/Su2fT619ntWf2en3N57OalPaphz7tH/WH6i3/AB1X+69u7+p9l7dSntdmy+10e9k9qla+zXBwtoEYnBibLCxNlhY8ax+nvIJr1b5N6W8rfeovUfQ+QeS9Mp5n515n+rvK+j4+fz/uuXv4/h37dunfUfqEH0jffVN6u53Obe7yoybvJ7efNTLl9qtKbccl/wBh7FN9T3X07dtvd7l3e7oc+8z+zky1zZvZpWuzCpupH+CF7sS2/wB2+Z15ud8uHtj7V83nm5nlvH8HR8e/9H8HD/V8N21r/LT/ALxf2nFT6Rl3ez6x23t1KbM/+by7a9O3FaXML/p7/csub6pXPt+ldk7HWu3JX8vk2eDZg8Wgf5NPaJL5PPRHkn4X1d5P6W9feZ8KnR+5PkH4rzbp9/I6j4OXv5f/AHtoF94D/mn+7z/zL27tvtdn3m/7Jk2Zuxb32d3X3sm3N73ViaHJH/iT+1h/xR2Psns7/d7ntOfbl7Xuvaz093Nsp7vXjunb4Pj7RibLCwvHlb+XS9z8le7HyKe6fuBbvcr1J7U+pPX/AKhfesfPet/Geceoup6vnfe8/j4/i37OKfUcgyZ8lBTp6MNb/Ss5z7rPXb0Vr14YMifLvKozyfp/KfL2XlfScHS+XdMn0PTcv4On6bh4N3hw7t2zcenb04cxSmzox6GyxnE2WFibLCwAbUR/L8+9mW/mK+Sn3z9YzXur639r/WnrT4POvOvNP1l5v1G/m877zncXF47d8f1Ddjd58lNnThsl+mbxt7u95XbWla4OTjz0Z6Ao3tz5T7e+j6z6D8g6fyL0Z5Ky9L+S9J+E8p8k5HTcr7vk8PD4btuFs2Y5vert8uHFMuUZPcps8nVjKXXTdM463kdHyFur6rl9N03LNz+o5v3XI5W/j4vh4d+/w2xj1hNzuffy4vuPIe4HN9d9a59WfIf7IcHnnH+P9T9N+rvPOo4up4fvOdxcz49+zza/Ucvs+7/mrhgvPpOf2ve68lPtxiPbo/lqfcNh5J1/n/VN/JPn59kPTHmnGHQ9L1X6o6vquHl9T91zOHf4bZufqWXb0f5K4xafSc+zp/z5aYc9rfpz0/C+kPJPSvlbH076b6H0/wCS9On5b5L5X+rvK+k4eRyPuuXu4fDdszGtdvTh/FKDLTLj29sYzgbfc2/h3exh/wCIX7P+kN7/ANDev/Q3r7zvko9d7R+rfx/qLpuDndF4crdzvh3bdNr2jef7eubrpWnnxyXnZd3/ALrLl6q0r5sKP4q/llPdBpzPmm4PM/H3V9nPa/dzf/F9N+I8s/6PHg2dn+p5fy+atcMafSM/5/PSmHzMY+hPbbHvtd5N7Z+h6n7denOm9PehPIWHpHyHovwfk3p/p+l5X3XI4eH4d2zE2bMc3vV2+XBKmXIMlMlBTydWOJe5P/D39jHH8Qb2Z9Ebn3o/3K9CesvO+Wj1XtP6v/Hep+RwcfQfFyv0vwbbrbtG8/2+bN4q/bTHPd9l3f8AusuXqrSvmrhPjE38sr7zRvH83fL86/8Atn2Z9md3ON/5l0/4nyX/ALeDds8P9Tyfk81a4Yo/pG8/P56Uw5Faf4dfyMS/XfLZ8gfpyF8w9O+gfYLyL1LCeTb/ACv/ANHb/VvQ7uP7zzDg4/vtmcdo3+zN2ivjrh+bsvZtuTs1PFl/DpxyJoz/AIHvzB1L5Lfk++Yfy+zejvab259c9B6amPVXk/p/9a8r0v1fVcvw6bj4vh37bZu3bs77Pu/HWmNFv9O3o7Pu971UpXx4/9k=)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "LERqQn5v8-ak"
},
"source": [
"# **Getting to know Llama 3: Everything you need to start building**\n",
"Our goal in this session is to provide a guided tour of Llama 3 with comparison with Llama 2, including understanding different Llama 3 models, how and where to access them, Generative AI and Chatbot architectures, prompt engineering, RAG (Retrieval Augmented Generation), Fine-tuning and more. All this is implemented with a starter code for you to take it and use it in your Llama 3 projects."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "ioVMNcTesSEk"
},
"source": [
"### **0 - Prerequisites**\n",
"* Basic understanding of Large Language Models\n",
"* Basic understanding of Python"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"executionInfo": {
"elapsed": 248,
"status": "ok",
"timestamp": 1695832228254,
"user": {
"displayName": "Amit Sangani",
"userId": "11552178012079240149"
},
"user_tz": 420
},
"id": "ktEA7qXmwdUM"
},
"outputs": [],
"source": [
"# presentation layer code\n",
"\n",
"import base64\n",
"from IPython.display import Image, display\n",
"import matplotlib.pyplot as plt\n",
"\n",
"def mm(graph):\n",
" graphbytes = graph.encode(\"ascii\")\n",
" base64_bytes = base64.b64encode(graphbytes)\n",
" base64_string = base64_bytes.decode(\"ascii\")\n",
" display(Image(url=\"https://mermaid.ink/img/\" + base64_string))\n",
"\n",
"def genai_app_arch():\n",
" mm(\"\"\"\n",
" flowchart TD\n",
" A[Users] --> B(Applications e.g. mobile, web)\n",
" B --> |Hosted API|C(Platforms e.g. Custom, HuggingFace, Replicate)\n",
" B -- optional --> E(Frameworks e.g. LangChain)\n",
" C-->|User Input|D[Llama 3]\n",
" D-->|Model Output|C\n",
" E --> C\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def rag_arch():\n",
" mm(\"\"\"\n",
" flowchart TD\n",
" A[User Prompts] --> B(Frameworks e.g. LangChain)\n",
" B <--> |Database, Docs, XLS|C[fa:fa-database External Data]\n",
" B -->|API|D[Llama 3]\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def llama2_family():\n",
" mm(\"\"\"\n",
" graph LR;\n",
" llama-2 --> llama-2-7b\n",
" llama-2 --> llama-2-13b\n",
" llama-2 --> llama-2-70b\n",
" llama-2-7b --> llama-2-7b-chat\n",
" llama-2-13b --> llama-2-13b-chat\n",
" llama-2-70b --> llama-2-70b-chat\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def llama3_family():\n",
" mm(\"\"\"\n",
" graph LR;\n",
" llama-3 --> llama-3-8b\n",
" llama-3 --> llama-3-70b\n",
" llama-3-8b --> llama-3-8b-base\n",
" llama-3-8b --> llama-3-8b-instruct\n",
" llama-3-70b --> llama-3-70b-base\n",
" llama-3-70b --> llama-3-70b-instruct\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def apps_and_llms():\n",
" mm(\"\"\"\n",
" graph LR;\n",
" users --> apps\n",
" apps --> frameworks\n",
" frameworks --> platforms\n",
" platforms --> Llama 2\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"import ipywidgets as widgets\n",
"from IPython.display import display, Markdown\n",
"\n",
"# Create a text widget\n",
"API_KEY = widgets.Password(\n",
" value='',\n",
" placeholder='',\n",
" description='API_KEY:',\n",
" disabled=False\n",
")\n",
"\n",
"def md(t):\n",
" display(Markdown(t))\n",
"\n",
"def bot_arch():\n",
" mm(\"\"\"\n",
" graph LR;\n",
" user --> prompt\n",
" prompt --> i_safety\n",
" i_safety --> context\n",
" context --> Llama_3\n",
" Llama_3 --> output\n",
" output --> o_safety\n",
" i_safety --> memory\n",
" o_safety --> memory\n",
" memory --> context\n",
" o_safety --> user\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def fine_tuned_arch():\n",
" mm(\"\"\"\n",
" graph LR;\n",
" Custom_Dataset --> Pre-trained_Llama\n",
" Pre-trained_Llama --> Fine-tuned_Llama\n",
" Fine-tuned_Llama --> RLHF\n",
" RLHF --> |Loss:Cross-Entropy|Fine-tuned_Llama\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def load_data_faiss_arch():\n",
" mm(\"\"\"\n",
" graph LR;\n",
" documents --> textsplitter\n",
" textsplitter --> embeddings\n",
" embeddings --> vectorstore\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n",
"\n",
"def mem_context():\n",
" mm(\"\"\"\n",
" graph LR\n",
" context(text)\n",
" user_prompt --> context\n",
" instruction --> context\n",
" examples --> context\n",
" memory --> context\n",
" context --> tokenizer\n",
" tokenizer --> embeddings\n",
" embeddings --> LLM\n",
" classDef default fill:#CCE6FF,stroke:#84BCF5,textColor:#1C2B33,fontFamily:trebuchet ms;\n",
" \"\"\")\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "i4Np_l_KtIno"
},
"source": [
"### **1 - Understanding Llama 3**"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "PGPSI3M5PGTi"
},
"source": [
"### **1.1 - What is Llama 3?**\n",
"\n",
"* State of the art (SOTA), Open Source LLM\n",
"* 8B, 70B\n",
"* Choosing model: Size, Quality, Cost, Speed\n",
"* Pretrained + Chat\n",
"* [Meta Llama 3 Blog](https://ai.meta.com/blog/meta-llama-3/)\n",
"* [Getting Started with Meta Llama](https://llama.meta.com/docs/get-started)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 240
},
"executionInfo": {
"elapsed": 248,
"status": "ok",
"timestamp": 1695832233087,
"user": {
"displayName": "Amit Sangani",
"userId": "11552178012079240149"
},
"user_tz": 420
},
"id": "OXRCC7wexZXd",
"outputId": "1feb1918-df4b-4cec-d09e-ffe55c12090b"
},
"outputs": [
{
"data": {
"text/html": [
""
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"llama2_family()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
""
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"llama3_family()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "aYeHVVh45bdT"
},
"source": [
"### **1.2 - Accessing Llama 3**\n",
"* Download + Self Host (i.e. [download Llama](https://ai.meta.com/resources/models-and-libraries/llama-downloads))\n",
"* Hosted API Platform (e.g. [Groq](https://console.groq.com/), [Replicate](https://replicate.com/meta/meta-llama-3-8b-instruct), [Together](https://api.together.xyz/playground/language/meta-llama/Llama-3-8b-hf), [Anyscale](https://app.endpoints.anyscale.com/playground))\n",
"\n",
"* Hosted Container Platform (e.g. [Azure](https://techcommunity.microsoft.com/t5/ai-machine-learning-blog/introducing-llama-2-on-azure/ba-p/3881233), [AWS](https://aws.amazon.com/blogs/machine-learning/llama-2-foundation-models-from-meta-are-now-available-in-amazon-sagemaker-jumpstart/), [GCP](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/139))\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "kBuSay8vtzL4"
},
"source": [
"### **1.3 - Use Cases of Llama 3**\n",
"* Content Generation\n",
"* Summarization\n",
"* General Chatbots\n",
"* RAG (Retrieval Augmented Generation): Chat about Your Own Data\n",
"* Fine-tuning\n",
"* Agents"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "sd54g0OHuqBY"
},
"source": [
"## **2 - Using and Comparing Llama 3 and Llama 2**\n",
"\n",
"In this notebook, we will use the Llama 2 70b chat and Llama 3 8b and 70b instruct models hosted on [Groq](https://console.groq.com/). You'll need to first [sign in](https://console.groq.com/) with your github or gmail account, then get an [API token](https://console.groq.com/keys) to try Groq out for free. (Groq runs Llama models very fast and they only support one Llama 2 model: the Llama 2 70b chat).\n",
"\n",
"**Note: You can also use other Llama hosting providers such as [Replicate](https://replicate.com/blog/run-llama-3-with-an-api?input=python), [Togther](https://docs.together.ai/docs/quickstart). Simply click the links here to see how to run `pip install` and use their freel trial API key with example code to modify the following three cells in 2.1 and 2.2.**\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "h3YGMDJidHtH"
},
"source": [
"### **2.1 - Install dependencies**"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"id": "VhN6hXwx7FCp"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Collecting groq\n",
" Downloading groq-0.5.0-py3-none-any.whl.metadata (12 kB)\n",
"Requirement already satisfied: anyio<5,>=3.5.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq) (4.3.0)\n",
"Requirement already satisfied: distro<2,>=1.7.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq) (1.9.0)\n",
"Requirement already satisfied: httpx<1,>=0.23.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq) (0.27.0)\n",
"Requirement already satisfied: pydantic<3,>=1.9.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq) (2.7.0)\n",
"Requirement already satisfied: sniffio in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq) (1.3.1)\n",
"Requirement already satisfied: typing-extensions<5,>=4.7 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq) (4.9.0)\n",
"Requirement already satisfied: idna>=2.8 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from anyio<5,>=3.5.0->groq) (3.6)\n",
"Requirement already satisfied: certifi in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from httpx<1,>=0.23.0->groq) (2024.2.2)\n",
"Requirement already satisfied: httpcore==1.* in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from httpx<1,>=0.23.0->groq) (1.0.5)\n",
"Requirement already satisfied: h11<0.15,>=0.13 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->groq) (0.14.0)\n",
"Requirement already satisfied: annotated-types>=0.4.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1.9.0->groq) (0.6.0)\n",
"Requirement already satisfied: pydantic-core==2.18.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1.9.0->groq) (2.18.1)\n",
"Downloading groq-0.5.0-py3-none-any.whl (75 kB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.0/75.0 kB\u001b[0m \u001b[31m3.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hInstalling collected packages: groq\n",
"Successfully installed groq-0.5.0\n"
]
}
],
"source": [
"!pip install groq"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### **2.2 - Create helpers for Llama 2 and Llama 3**\n",
"First, set your Groq API token as environment variables.\n"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"id": "8hkWpqWD28ho"
},
"outputs": [
{
"name": "stdin",
"output_type": "stream",
"text": [
" ········\n"
]
}
],
"source": [
"import os\n",
"from getpass import getpass\n",
"\n",
"GROQ_API_TOKEN = getpass()\n",
"\n",
"os.environ[\"GROQ_API_KEY\"] = GROQ_API_TOKEN"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Create Llama 2 and Llama 3 helper functions - for chatbot type of apps, we'll use Llama 3 8b/70b instruct models, not the base models."
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {
"id": "bVCHZmETk36v"
},
"outputs": [],
"source": [
"from groq import Groq\n",
"\n",
"client = Groq(\n",
" api_key=os.environ.get(\"GROQ_API_KEY\"),\n",
")\n",
"\n",
"def llama2(prompt, temperature=0.0, input_print=True):\n",
" chat_completion = client.chat.completions.create(\n",
" messages=[\n",
" {\n",
" \"role\": \"user\",\n",
" \"content\": prompt,\n",
" }\n",
" ],\n",
" model=\"llama2-70b-4096\",\n",
" temperature=temperature,\n",
" )\n",
"\n",
" return (chat_completion.choices[0].message.content)\n",
"\n",
"def llama3_8b(prompt, temperature=0.0, input_print=True):\n",
" chat_completion = client.chat.completions.create(\n",
" messages=[\n",
" {\n",
" \"role\": \"user\",\n",
" \"content\": prompt,\n",
" }\n",
" ],\n",
" model=\"llama3-8b-8192\",\n",
" temperature=temperature,\n",
" )\n",
"\n",
" return (chat_completion.choices[0].message.content)\n",
"\n",
"def llama3_70b(prompt, temperature=0.0, input_print=True):\n",
" chat_completion = client.chat.completions.create(\n",
" messages=[\n",
" {\n",
" \"role\": \"user\",\n",
" \"content\": prompt,\n",
" }\n",
" ],\n",
" model=\"llama3-70b-8192\",\n",
" temperature=temperature,\n",
" )\n",
"\n",
" return (chat_completion.choices[0].message.content)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "5Jxq0pmf6L73"
},
"source": [
"### **2.3 - Basic QA with Llama 2 and 3**"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"id": "H93zZBIk6tNU"
},
"outputs": [
{
"data": {
"text/markdown": [
"The typical color of a llama is a light brown or beige color, often with a darker brown or black patches on their ears, neck, and legs. Some llamas may also have a white or pale colored patch on their forehead. However, it's worth noting that llamas can come in a wide range of colors, including white, black, gray, and various shades of brown and red. Some breeds, such as the Suri alpaca, can have a more diverse range of colors, including shades of red, orange, and purple."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"prompt = \"The typical color of a llama is: \"\n",
"output = llama2(prompt)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"The typical color of a llama is white! However, llamas can also come in a variety of other colors, including:\n",
"\n",
"* Suri: a soft, fluffy coat that can be white, cream, or light brown\n",
"* Huacaya: a dense, soft coat that can be white, cream, or various shades of brown, gray, or black\n",
"* Rose-gray: a light grayish-pink color\n",
"* Dark brown: a rich, dark brown color\n",
"* Black: a glossy black coat\n",
"* Red: a reddish-brown color\n",
"* Cream: a light cream or beige color\n",
"* Fawn: a light reddish-brown color\n",
"\n",
"It's worth noting that llamas can also have various patterns and markings on their coats, such as white markings on the face, legs, or belly."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Brown."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(\"The typical color of a llama is what? Answer in one word.\")\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "cWs_s9y-avIT"
},
"source": [
"## **3 - Chat conversation**"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "r4DyTLD5ys6t"
},
"source": [
"### **3.1 - Single-turn chat**"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"id": "EMM_egWMys6u"
},
"outputs": [
{
"data": {
"text/markdown": [
"Sure, here's a short answer:\n",
"\n",
"The average lifespan of a llama is 15-25 years."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"prompt_chat = \"What is the average lifespan of a Llama? Answer the question in few words.\"\n",
"output = llama2(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"id": "sZ7uVKDYucgi"
},
"outputs": [
{
"data": {
"text/markdown": [
"15-20 years."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"id": "WQl3wmfbyBQ1"
},
"outputs": [
{
"data": {
"text/markdown": [
"The lion, tiger, leopard, and jaguar are all members of the Felidae family."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# example without previous context. LLM's are stateless and cannot understand \"they\" without previous context\n",
"prompt_chat = \"What animal family are they? Answer the question in few words.\"\n",
"output = llama2(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Canidae."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"I'm happy to help! However, I don't see a specific animal mentioned in your question. Could you please clarify or provide more context about which animal you're referring to?"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_70b(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: Llama 3 70b doesn't hallucinate.**"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### **3.2 - Multi-turn chat**\n",
"Chat app requires us to send in previous context to LLM to get in valid responses. Below is an example of Multi-turn chat."
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"id": "t7SZe5fT3HG3"
},
"outputs": [
{
"data": {
"text/markdown": [
"Assistant: Llamas are part of the Camelidae family, which includes camels and alpacas."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# example of multi-turn chat, with storing previous context\n",
"prompt_chat = \"\"\"\n",
"User: What is the average lifespan of a Llama?\n",
"Assistant: 15-20 years.\n",
"User: What animal family are they?\n",
"\"\"\"\n",
"output = llama2(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Llamas belong to the camelid family (Camelidae)."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: Llama 2 and 3 both behave well for using the chat history for follow up questions.**"
]
},
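{
"cell_type": "markdown",
"metadata": {},
"source": [
"Instead of packing the history into a single prompt string, the chat history can also be sent as role-tagged messages (`user`/`assistant`), which is how most chat apps store context. Below is a minimal sketch that reuses the `client` created in 2.2; the message format is the OpenAI-compatible one accepted by Groq's chat completions API."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# A minimal sketch: send the chat history as role-tagged messages instead of one packed string.\n",
"# Reuses the `client` from 2.2; roles follow the OpenAI-compatible chat format that Groq accepts.\n",
"history = [\n",
"    {\"role\": \"user\", \"content\": \"What is the average lifespan of a Llama?\"},\n",
"    {\"role\": \"assistant\", \"content\": \"15-20 years.\"},\n",
"    {\"role\": \"user\", \"content\": \"What animal family are they?\"}\n",
"]\n",
"chat_completion = client.chat.completions.create(\n",
"    messages=history,\n",
"    model=\"llama3-8b-8192\",\n",
"    temperature=0.0,\n",
")\n",
"md(chat_completion.choices[0].message.content)"
]
},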
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### **3.3 - Multi-turn chat with more instruction**\n",
"Adding the instructon \"Answer the question with one word\" to see the difference of Llama 2 and 3."
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Camelids"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# example of multi-turn chat, with storing previous context\n",
"prompt_chat = \"\"\"\n",
"User: What is the average lifespan of a Llama?\n",
"Assistant: Sure! The average lifespan of a llama is around 20-30 years.\n",
"User: What animal family are they?\n",
"\n",
"Answer the question with one word.\n",
"\"\"\"\n",
"output = llama2(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Camelid."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt_chat)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Both Llama 3 8b and Llama 2 70b follows instructions (e.g. \"Answer the question with one word\") better than Llama 2 7b.**"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "moXnmJ_xyD10"
},
"source": [
"### **4.2 - Prompt Engineering**\n",
"* Prompt engineering refers to the science of designing effective prompts to get desired responses\n",
"\n",
"* Helps reduce hallucination\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "t-v-FeZ4ztTB"
},
"source": [
"#### **4.2.1 - In-Context Learning (e.g. Zero-shot, Few-shot)**\n",
" * In-context learning - specific method of prompt engineering where demonstration of task are provided as part of prompt.\n",
" 1. Zero-shot learning - model is performing tasks without any\n",
"input examples.\n",
" 2. Few or “N-Shot” Learning - model is performing and behaving based on input examples in user's prompt."
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"id": "6W71MFNZyRkQ"
},
"outputs": [
{
"data": {
"text/markdown": [
"Curious"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Zero-shot example. To get positive/negative/neutral sentiment, we need to give examples in the prompt\n",
"prompt = '''\n",
"Classify: I saw a Gecko.\n",
"Sentiment: ?\n",
"\n",
"Give one word response.\n",
"'''\n",
"output = llama2(prompt)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"id": "MCQRjf1Y1RYJ"
},
"outputs": [
{
"data": {
"text/markdown": [
"Neutral"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: Llama 3 has different opinions than Llama 2.**"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {
"id": "8UmdlTmpDZxA"
},
"outputs": [
{
"data": {
"text/markdown": [
"Neutral"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# By giving examples to Llama, it understands the expected output format.\n",
"\n",
"prompt = '''\n",
"Classify: I love Llamas!\n",
"Sentiment: Positive\n",
"Classify: I dont like Snakes.\n",
"Sentiment: Negative\n",
"Classify: I saw a Gecko.\n",
"Sentiment:\n",
"\n",
"Give one word response.\n",
"'''\n",
"\n",
"output = llama2(prompt)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {
"id": "M_EcsUo1zqFD"
},
"outputs": [
{
"data": {
"text/markdown": [
"Neutral"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_8b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: Llama 2, with few shots, has the same output \"Neutral\" as Llama 3.**"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "mbr124Y197xl"
},
"source": [
"#### **4.2.2 - Chain of Thought**\n",
"\"Chain of thought\" enables complex reasoning through logical step by step thinking and generates meaningful and contextually relevant responses."
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {
"id": "Xn8zmLBQzpgj"
},
"outputs": [
{
"data": {
"text/markdown": [
"Seven."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Standard prompting\n",
"prompt = '''\n",
"Llama started with 5 tennis balls. It buys 2 more cans of tennis balls. Each can has 3 tennis balls.\n",
"How many tennis balls does Llama have?\n",
"\n",
"Answer in one word.\n",
"'''\n",
"\n",
"output = llama3_8b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"id": "lKNOj79o1Kwu"
},
"outputs": [
{
"data": {
"text/markdown": [
"Eleven."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_70b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: Llama 3-8b did not get the right answer because it was asked to answer in one word.**"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Let's break it down step by step!\n",
"\n",
"Llama started with 5 tennis balls.\n",
"\n",
"It buys 2 more cans of tennis balls. Each can has 3 tennis balls, so that's a total of 2 x 3 = 6 new tennis balls.\n",
"\n",
"Adding the new tennis balls to the original 5, Llama now has:\n",
"5 (initial tennis balls) + 6 (new tennis balls) = 11 tennis balls\n",
"\n",
"So, Llama now has 11 tennis balls!"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# By default, Llama 3 models follow \"Chain-Of-Thought\" prompting\n",
"prompt = '''\n",
"Llama started with 5 tennis balls. It buys 2 more cans of tennis balls. Each can has 3 tennis balls.\n",
"How many tennis balls does Llama have?\n",
"'''\n",
"\n",
"output = llama3_8b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"Llama started with 5 tennis balls. Then it bought 2 cans of tennis balls. Each can has 3 tennis balls. So that is 2 x 3 = 6 tennis balls. 5 + 6 = 11.\n",
"#### 11"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"output = llama3_70b(prompt)\n",
"md(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: By default, Llama 3 models identify word problems and solves it step by step!**"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"**Yes**\n",
"\n",
"Here's the step-by-step breakdown:\n",
"\n",
"1. We have 15 people who want to go to the restaurant.\n",
"2. Two people have cars that can seat 5 people each. This means we can accommodate 10 people in cars (2 cars x 5 seats per car).\n",
"3. We still have 5 people left who can't fit in the cars. We'll consider the motorcycles now.\n",
"4. Two people have motorcycles that can fit 2 people each. This means we can accommodate 4 people in motorcycles (2 motorcycles x 2 seats per motorcycle).\n",
"5. We still have 1 person left who can't fit in the cars or motorcycles. Unfortunately, we can't fit all 15 people in cars or motorcycles.\n",
"6. However, we can fit 10 people in cars (10 seats available) and 4 people in motorcycles (4 seats available), which adds up to 14 people. We still have 1 person left over.\n",
"7. Since we can't fit all 15 people in cars or motorcycles, we can't take everyone to the restaurant.\n",
"\n",
"However, we can take 14 people to the restaurant, which is the maximum number of people we can accommodate using the available cars and motorcycles.\n"
]
}
],
"source": [
"prompt = \"\"\"\n",
"15 of us want to go to a restaurant.\n",
"Two of them have cars\n",
"Each car can seat 5 people.\n",
"Two of us have motorcycles.\n",
"Each motorcycle can fit 2 people.\n",
"Can we all get to the restaurant by car or motorcycle?\n",
"Think step by step.\n",
"Provide the answer as a single yes/no answer first.\n",
"Then explain each intermediate step.\n",
"\"\"\"\n",
"output = llama3_8b(prompt)\n",
"print(output)"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"**Answer:** NO\n",
"\n",
"Here's the step-by-step explanation:\n",
"\n",
"1. We have 15 people who want to go to the restaurant.\n",
"2. We have 2 cars, each of which can seat 5 people. So, the cars can accommodate a total of 2 x 5 = 10 people.\n",
"3. This leaves 15 - 10 = 5 people who still need transportation.\n",
"4. We have 2 motorcycles, each of which can fit 2 people. So, the motorcycles can accommodate a total of 2 x 2 = 4 people.\n",
"5. This still leaves 5 - 4 = 1 person who doesn't have a ride.\n",
"6. Since we can't fit all 15 people in the available cars and motorcycles, the answer is NO, we cannot all get to the restaurant by car or motorcycle.\n"
]
}
],
"source": [
"output = llama3_70b(prompt)\n",
"print(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: Llama 3 70b model works correctly in this example.**"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Summary: Llama 2 often needs encourgement for step by step thinking to correctly reasoning. Llama 3 understands, reasons and explains better, making chain of thought unnecessary in the cases above.**"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "C7tDW-AH770Y"
},
"source": [
"### **4.3 - Retrieval Augmented Generation (RAG)**\n",
"* Prompt Eng Limitations - Knowledge cutoff & lack of specialized data\n",
"\n",
"* Retrieval Augmented Generation(RAG) allows us to retrieve snippets of information from external data sources and augment it to the user's prompt to get tailored responses from Llama 2.\n",
"\n",
"For our demo, we are going to download an external PDF file from a URL and query against the content in the pdf file to get contextually relevant information back with the help of Llama!\n",
"\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 40,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 259
},
"executionInfo": {
"elapsed": 329,
"status": "ok",
"timestamp": 1695832267093,
"user": {
"displayName": "Amit Sangani",
"userId": "11552178012079240149"
},
"user_tz": 420
},
"id": "Fl1LPltpRQD9",
"outputId": "4410c9bf-3559-4a05-cebb-a5731bb094c1"
},
"outputs": [
{
"data": {
"text/html": [
""
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"rag_arch()"
]
},
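{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before wiring up LangChain, here is the core RAG idea in a few lines: retrieve snippets relevant to the question from an external source and prepend them to the prompt. The `retrieve` helper below is a hypothetical placeholder standing in for the vector-store lookup we will build with LangChain next."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# A minimal sketch of the RAG pattern: augment the user's prompt with retrieved context.\n",
"# `retrieve` is a hypothetical placeholder; the real vector-store lookup is built with LangChain below.\n",
"def retrieve(question):\n",
"    # In a real app this would search an index (e.g. FAISS) of document chunks for relevant snippets.\n",
"    return [\"Llamas are members of the camelid family.\", \"Llamas typically live 15-25 years.\"]\n",
"\n",
"def rag_answer(question):\n",
"    context = \"\\n\".join(retrieve(question))\n",
"    prompt = f\"Answer the question using only the context below.\\n\\nContext:\\n{context}\\n\\nQuestion: {question}\"\n",
"    return llama3_8b(prompt)\n",
"\n",
"md(rag_answer(\"How long do llamas live?\"))"
]
},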
{
"cell_type": "markdown",
"metadata": {
"id": "JJaGMLl_4vYm"
},
"source": [
"#### **4.3.1 - LangChain**\n",
"LangChain is a framework that helps make it easier to implement RAG."
]
},
{
"cell_type": "code",
"execution_count": 41,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: langchain in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (0.1.16)\n",
"Requirement already satisfied: PyYAML>=5.3 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (6.0.1)\n",
"Requirement already satisfied: SQLAlchemy<3,>=1.4 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (2.0.29)\n",
"Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (3.9.4)\n",
"Requirement already satisfied: dataclasses-json<0.7,>=0.5.7 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (0.6.4)\n",
"Requirement already satisfied: jsonpatch<2.0,>=1.33 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (1.33)\n",
"Requirement already satisfied: langchain-community<0.1,>=0.0.32 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (0.0.34)\n",
"Requirement already satisfied: langchain-core<0.2.0,>=0.1.42 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (0.1.45)\n",
"Requirement already satisfied: langchain-text-splitters<0.1,>=0.0.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (0.0.1)\n",
"Requirement already satisfied: langsmith<0.2.0,>=0.1.17 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (0.1.47)\n",
"Requirement already satisfied: numpy<2,>=1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (1.26.4)\n",
"Requirement already satisfied: pydantic<3,>=1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (2.7.0)\n",
"Requirement already satisfied: requests<3,>=2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (2.31.0)\n",
"Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain) (8.2.3)\n",
"Requirement already satisfied: aiosignal>=1.1.2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.3.1)\n",
"Requirement already satisfied: attrs>=17.3.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (23.2.0)\n",
"Requirement already satisfied: frozenlist>=1.1.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.4.1)\n",
"Requirement already satisfied: multidict<7.0,>=4.5 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (6.0.5)\n",
"Requirement already satisfied: yarl<2.0,>=1.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.9.4)\n",
"Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from dataclasses-json<0.7,>=0.5.7->langchain) (3.21.1)\n",
"Requirement already satisfied: typing-inspect<1,>=0.4.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from dataclasses-json<0.7,>=0.5.7->langchain) (0.9.0)\n",
"Requirement already satisfied: jsonpointer>=1.9 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from jsonpatch<2.0,>=1.33->langchain) (2.4)\n",
"Requirement already satisfied: packaging<24.0,>=23.2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-core<0.2.0,>=0.1.42->langchain) (23.2)\n",
"Requirement already satisfied: orjson<4.0.0,>=3.9.14 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langsmith<0.2.0,>=0.1.17->langchain) (3.10.0)\n",
"Requirement already satisfied: annotated-types>=0.4.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1->langchain) (0.6.0)\n",
"Requirement already satisfied: pydantic-core==2.18.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1->langchain) (2.18.1)\n",
"Requirement already satisfied: typing-extensions>=4.6.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1->langchain) (4.9.0)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests<3,>=2->langchain) (3.3.2)\n",
"Requirement already satisfied: idna<4,>=2.5 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests<3,>=2->langchain) (3.6)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests<3,>=2->langchain) (2.2.0)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests<3,>=2->langchain) (2024.2.2)\n",
"Requirement already satisfied: mypy-extensions>=0.3.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain) (1.0.0)\n",
"Requirement already satisfied: sentence-transformers in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (2.6.1)\n",
"Requirement already satisfied: transformers<5.0.0,>=4.32.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (4.40.0)\n",
"Requirement already satisfied: tqdm in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (4.66.2)\n",
"Requirement already satisfied: torch>=1.11.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (2.3.0.dev20240205)\n",
"Requirement already satisfied: numpy in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (1.26.4)\n",
"Requirement already satisfied: scikit-learn in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (1.4.2)\n",
"Requirement already satisfied: scipy in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (1.13.0)\n",
"Requirement already satisfied: huggingface-hub>=0.15.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (0.22.2)\n",
"Requirement already satisfied: Pillow in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sentence-transformers) (10.3.0)\n",
"Requirement already satisfied: filelock in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from huggingface-hub>=0.15.1->sentence-transformers) (3.13.1)\n",
"Requirement already satisfied: fsspec>=2023.5.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from huggingface-hub>=0.15.1->sentence-transformers) (2024.2.0)\n",
"Requirement already satisfied: packaging>=20.9 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from huggingface-hub>=0.15.1->sentence-transformers) (23.2)\n",
"Requirement already satisfied: pyyaml>=5.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from huggingface-hub>=0.15.1->sentence-transformers) (6.0.1)\n",
"Requirement already satisfied: requests in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from huggingface-hub>=0.15.1->sentence-transformers) (2.31.0)\n",
"Requirement already satisfied: typing-extensions>=3.7.4.3 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from huggingface-hub>=0.15.1->sentence-transformers) (4.9.0)\n",
"Requirement already satisfied: sympy in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from torch>=1.11.0->sentence-transformers) (1.11.1)\n",
"Requirement already satisfied: networkx in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from torch>=1.11.0->sentence-transformers) (3.0rc1)\n",
"Requirement already satisfied: jinja2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from torch>=1.11.0->sentence-transformers) (3.1.2)\n",
"Requirement already satisfied: regex!=2019.12.17 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from transformers<5.0.0,>=4.32.0->sentence-transformers) (2023.12.25)\n",
"Requirement already satisfied: tokenizers<0.20,>=0.19 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from transformers<5.0.0,>=4.32.0->sentence-transformers) (0.19.1)\n",
"Requirement already satisfied: safetensors>=0.4.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from transformers<5.0.0,>=4.32.0->sentence-transformers) (0.4.2)\n",
"Requirement already satisfied: joblib>=1.2.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from scikit-learn->sentence-transformers) (1.4.0)\n",
"Requirement already satisfied: threadpoolctl>=2.0.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from scikit-learn->sentence-transformers) (3.4.0)\n",
"Requirement already satisfied: MarkupSafe>=2.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from jinja2->torch>=1.11.0->sentence-transformers) (2.1.3)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests->huggingface-hub>=0.15.1->sentence-transformers) (3.3.2)\n",
"Requirement already satisfied: idna<4,>=2.5 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests->huggingface-hub>=0.15.1->sentence-transformers) (3.6)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests->huggingface-hub>=0.15.1->sentence-transformers) (2.2.0)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests->huggingface-hub>=0.15.1->sentence-transformers) (2024.2.2)\n",
"Requirement already satisfied: mpmath>=0.19 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from sympy->torch>=1.11.0->sentence-transformers) (1.2.1)\n",
"Requirement already satisfied: faiss-cpu in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (1.8.0)\n",
"Requirement already satisfied: numpy in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from faiss-cpu) (1.26.4)\n",
"Requirement already satisfied: bs4 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (0.0.2)\n",
"Requirement already satisfied: beautifulsoup4 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from bs4) (4.12.3)\n",
"Requirement already satisfied: soupsieve>1.2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from beautifulsoup4->bs4) (2.5)\n",
"Collecting langchain-groq\n",
" Downloading langchain_groq-0.1.2-py3-none-any.whl.metadata (2.8 kB)\n",
"Requirement already satisfied: groq<1,>=0.4.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-groq) (0.5.0)\n",
"Requirement already satisfied: langchain-core<0.2.0,>=0.1.42 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-groq) (0.1.45)\n",
"Requirement already satisfied: anyio<5,>=3.5.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq<1,>=0.4.1->langchain-groq) (4.3.0)\n",
"Requirement already satisfied: distro<2,>=1.7.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq<1,>=0.4.1->langchain-groq) (1.9.0)\n",
"Requirement already satisfied: httpx<1,>=0.23.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq<1,>=0.4.1->langchain-groq) (0.27.0)\n",
"Requirement already satisfied: pydantic<3,>=1.9.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq<1,>=0.4.1->langchain-groq) (2.7.0)\n",
"Requirement already satisfied: sniffio in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq<1,>=0.4.1->langchain-groq) (1.3.1)\n",
"Requirement already satisfied: typing-extensions<5,>=4.7 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from groq<1,>=0.4.1->langchain-groq) (4.9.0)\n",
"Requirement already satisfied: PyYAML>=5.3 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-core<0.2.0,>=0.1.42->langchain-groq) (6.0.1)\n",
"Requirement already satisfied: jsonpatch<2.0,>=1.33 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-core<0.2.0,>=0.1.42->langchain-groq) (1.33)\n",
"Requirement already satisfied: langsmith<0.2.0,>=0.1.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-core<0.2.0,>=0.1.42->langchain-groq) (0.1.47)\n",
"Requirement already satisfied: packaging<24.0,>=23.2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-core<0.2.0,>=0.1.42->langchain-groq) (23.2)\n",
"Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langchain-core<0.2.0,>=0.1.42->langchain-groq) (8.2.3)\n",
"Requirement already satisfied: idna>=2.8 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from anyio<5,>=3.5.0->groq<1,>=0.4.1->langchain-groq) (3.6)\n",
"Requirement already satisfied: certifi in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from httpx<1,>=0.23.0->groq<1,>=0.4.1->langchain-groq) (2024.2.2)\n",
"Requirement already satisfied: httpcore==1.* in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from httpx<1,>=0.23.0->groq<1,>=0.4.1->langchain-groq) (1.0.5)\n",
"Requirement already satisfied: h11<0.15,>=0.13 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->groq<1,>=0.4.1->langchain-groq) (0.14.0)\n",
"Requirement already satisfied: jsonpointer>=1.9 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from jsonpatch<2.0,>=1.33->langchain-core<0.2.0,>=0.1.42->langchain-groq) (2.4)\n",
"Requirement already satisfied: orjson<4.0.0,>=3.9.14 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langsmith<0.2.0,>=0.1.0->langchain-core<0.2.0,>=0.1.42->langchain-groq) (3.10.0)\n",
"Requirement already satisfied: requests<3,>=2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from langsmith<0.2.0,>=0.1.0->langchain-core<0.2.0,>=0.1.42->langchain-groq) (2.31.0)\n",
"Requirement already satisfied: annotated-types>=0.4.0 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1.9.0->groq<1,>=0.4.1->langchain-groq) (0.6.0)\n",
"Requirement already satisfied: pydantic-core==2.18.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from pydantic<3,>=1.9.0->groq<1,>=0.4.1->langchain-groq) (2.18.1)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests<3,>=2->langsmith<0.2.0,>=0.1.0->langchain-core<0.2.0,>=0.1.42->langchain-groq) (3.3.2)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages (from requests<3,>=2->langsmith<0.2.0,>=0.1.0->langchain-core<0.2.0,>=0.1.42->langchain-groq) (2.2.0)\n",
"Downloading langchain_groq-0.1.2-py3-none-any.whl (11 kB)\n",
"Installing collected packages: langchain-groq\n",
"Successfully installed langchain-groq-0.1.2\n"
]
}
],
"source": [
"!pip install langchain\n",
"!pip install sentence-transformers\n",
"!pip install faiss-cpu\n",
"!pip install bs4\n",
"!pip install langchain-groq"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### **4.3.2 - LangChain Q&A Retriever**\n",
"* ConversationalRetrievalChain\n",
"\n",
"* Query the Source documents\n"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {
"id": "gAV2EkZqcruF"
},
"outputs": [],
"source": [
"from langchain_community.embeddings import HuggingFaceEmbeddings\n",
"from langchain_community.vectorstores import FAISS\n",
"from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
"from langchain_community.document_loaders import WebBaseLoader\n",
"import bs4\n",
"\n",
"# Step 1: Load the document from a web url\n",
"loader = WebBaseLoader([\"https://huggingface.co/blog/llama3\"])\n",
"documents = loader.load()\n",
"\n",
"# Step 2: Split the document into chunks with a specified chunk size\n",
"text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)\n",
"all_splits = text_splitter.split_documents(documents)\n",
"\n",
"# Step 3: Store the document into a vector store with a specific embedding model\n",
"vectorstore = FAISS.from_documents(all_splits, HuggingFaceEmbeddings(model_name=\"sentence-transformers/all-mpnet-base-v2\"))"
]
},
{
"cell_type": "code",
"execution_count": 43,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/Users/jeffxtang/anaconda3/envs/python3.11/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:119: LangChainDeprecationWarning: The method `Chain.__call__` was deprecated in langchain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
" warn_deprecated(\n"
]
},
{
"data": {
"text/markdown": [
"According to the provided context, the main changes in Llama 3 compared to Llama 2 are:\n",
"\n",
"1. A new tokenizer that expands the vocabulary size to 128,256 (from 32K tokens in the previous version), which can encode text more efficiently and potentially yield stronger multilingualism.\n",
"2. The introduction of two sizes: 8B for efficient deployment and development on consumer-size GPU, and 70B for large-scale AI native applications.\n",
"3. The availability of base and instruction-tuned variants for each model size.\n",
"4. The release of Llama Guard 2, a new version of Llama Guard that was fine-tuned on Llama 3 8B."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from langchain_groq import ChatGroq\n",
"llm = ChatGroq(temperature=0, model_name=\"llama3-8b-8192\")\n",
"\n",
"from langchain.chains import ConversationalRetrievalChain\n",
"chain = ConversationalRetrievalChain.from_llm(llm,\n",
" vectorstore.as_retriever(),\n",
" return_source_documents=True)\n",
"\n",
"result = chain({\"question\": \"What’s new with Llama 3?\", \"chat_history\": []})\n",
"md(result['answer'])\n"
]
},
{
"cell_type": "code",
"execution_count": 45,
"metadata": {
"id": "NmEhBe3Kiyre"
},
"outputs": [
{
"data": {
"text/markdown": [
"According to the provided context, the main changes in Llama 3 compared to Llama 2 are:\n",
"\n",
"1. A new tokenizer that expands the vocabulary size to 128,256 (from 32K tokens in the previous version), which can encode text more efficiently and potentially yield stronger multilingualism.\n",
"2. The introduction of two sizes: 8B for efficient deployment and development on consumer-size GPU, and 70B for large-scale AI native applications.\n",
"3. The availability of base and instruction-tuned variants for each model size.\n",
"4. The release of Llama Guard 2, a new version of Llama Guard that was fine-tuned on Llama 3 8B."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Query against your own data\n",
"from langchain.chains import ConversationalRetrievalChain\n",
"chain = ConversationalRetrievalChain.from_llm(llm, vectorstore.as_retriever(), return_source_documents=True)\n",
"\n",
"chat_history = []\n",
"query = \"What’s new with Llama 3?\"\n",
"result = chain({\"question\": query, \"chat_history\": chat_history})\n",
"md(result['answer'])"
]
},
{
"cell_type": "code",
"execution_count": 46,
"metadata": {
"id": "CelLHIvoy2Ke"
},
"outputs": [
{
"data": {
"text/markdown": [
"According to the text, the two sizes of Llama 3 are 8B and 70B parameters."
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# This time your previous question and answer will be included as a chat history which will enable the ability\n",
"# to ask follow up questions.\n",
"chat_history = [(query, result[\"answer\"])]\n",
"query = \"What two sizes?\"\n",
"result = chain({\"question\": query, \"chat_history\": chat_history})\n",
"md(result['answer'])"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "TEvefAWIJONx"
},
"source": [
"## **5 - Fine-Tuning Models**\n",
"\n",
"* Limitatons of Prompt Eng and RAG\n",
"* Fine-Tuning Arch\n",
"* Types (PEFT, LoRA, QLoRA)\n",
"* Using PyTorch for Pre-Training & Fine-Tuning\n",
"\n",
"* Evals + Quality\n",
"\n",
"Examples of Fine-Tuning:\n",
"* [Meta Llama Recipes](https://github.com/meta-llama/llama-recipes/tree/main/recipes/finetuning)\n",
"* [Hugging Face fine-tuning with Llama 3](https://huggingface.co/blog/llama3#fine-tuning-with-%F0%9F%A4%97-trl)\n"
]
},
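{
"cell_type": "markdown",
"metadata": {},
"source": [
"The links above cover full fine-tuning recipes. As a quick illustration of the PEFT/LoRA idea, below is a minimal, hedged sketch using Hugging Face `peft` and `trl`. The model ID, dataset, and hyperparameters are illustrative assumptions rather than the exact recipe from those repos, and the Llama 3 weights are gated, so you need access on the Hugging Face Hub (plus a capable GPU) to actually run it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal LoRA fine-tuning sketch (assumptions: gated Llama 3 8B access,\n",
"# a small illustrative dataset slice, and untuned hyperparameters).\n",
"import torch\n",
"from datasets import load_dataset\n",
"from transformers import AutoModelForCausalLM, AutoTokenizer\n",
"from peft import LoraConfig\n",
"from trl import SFTTrainer\n",
"\n",
"model_id = \"meta-llama/Meta-Llama-3-8B\"  # assumes access to the gated repo\n",
"train_ds = load_dataset(\"tatsu-lab/alpaca\", split=\"train[:1%]\")  # tiny slice for illustration\n",
"\n",
"tokenizer = AutoTokenizer.from_pretrained(model_id)\n",
"tokenizer.pad_token = tokenizer.eos_token  # Llama 3 has no pad token by default\n",
"model = AutoModelForCausalLM.from_pretrained(\n",
"    model_id, torch_dtype=torch.bfloat16, device_map=\"auto\")\n",
"\n",
"# LoRA adapters on the attention projections; rank/alpha values are illustrative\n",
"peft_config = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,\n",
"                         target_modules=[\"q_proj\", \"v_proj\"], task_type=\"CAUSAL_LM\")\n",
"\n",
"trainer = SFTTrainer(\n",
"    model=model,\n",
"    tokenizer=tokenizer,\n",
"    train_dataset=train_ds,\n",
"    peft_config=peft_config,\n",
"    dataset_text_field=\"text\",  # the alpaca dataset ships a pre-formatted 'text' column\n",
"    max_seq_length=512,\n",
")\n",
"trainer.train()\n",
"trainer.save_model(\"llama3-8b-lora-sft\")  # saves the small LoRA adapter weights"
]
},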
{
"cell_type": "markdown",
"metadata": {
"id": "_8lcgdZa8onC"
},
"source": [
"## **6 - Responsible AI**\n",
"\n",
"* Power + Responsibility\n",
"* Hallucinations\n",
"* Input & Output Safety\n",
"* Red-teaming (simulating real-world cyber attackers)\n",
"* [Responsible Use Guide](https://ai.meta.com/llama/responsible-use-guide/)\n",
"\n"
]
},
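{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sketch of input/output safety, Llama Guard 2 (the safety classifier mentioned earlier) can label a conversation as safe or unsafe. The snippet below follows the standard `transformers` chat-template pattern; the checkpoint is gated and large, so running it requires Hub access and a capable GPU, and the exact output format should be verified against the model card."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Hedged sketch: moderating a user/assistant exchange with Llama Guard 2.\n",
"# Assumes access to the gated meta-llama/Meta-Llama-Guard-2-8B checkpoint.\n",
"import torch\n",
"from transformers import AutoModelForCausalLM, AutoTokenizer\n",
"\n",
"guard_id = \"meta-llama/Meta-Llama-Guard-2-8B\"\n",
"guard_tokenizer = AutoTokenizer.from_pretrained(guard_id)\n",
"guard_model = AutoModelForCausalLM.from_pretrained(\n",
"    guard_id, torch_dtype=torch.bfloat16, device_map=\"auto\")\n",
"\n",
"def moderate(chat):\n",
"    # The tokenizer's chat template wraps the turns in Llama Guard's safety prompt\n",
"    input_ids = guard_tokenizer.apply_chat_template(chat, return_tensors=\"pt\").to(guard_model.device)\n",
"    output = guard_model.generate(input_ids=input_ids, max_new_tokens=64)\n",
"    return guard_tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)\n",
"\n",
"# Expected to return 'safe', or 'unsafe' followed by the violated category codes\n",
"print(moderate([\n",
"    {\"role\": \"user\", \"content\": \"What's new with Llama 3?\"},\n",
"    {\"role\": \"assistant\", \"content\": \"Llama 3 comes in 8B and 70B sizes.\"},\n",
"]))"
]
},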
{
"cell_type": "markdown",
"metadata": {
"id": "pbqb006R-T_k"
},
"source": [
"## **7 - Conclusion**\n",
"* Active research on LLMs and Llama\n",
"* Leverage the power of Llama and its open community\n",
"* Safety and responsible use is paramount!\n",
"\n",
"* Call-To-Action\n",
" * [Replicate Free Credits](https://replicate.fyi/connect2023) for Connect attendees!\n",
" * This notebook is available through Llama Github recipes\n",
" * Use Llama in your projects and give us feedback\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "gSz5dTMxp7xo"
},
"source": [
"#### **Resources**\n",
"- [Meta Llama 3 Blog](https://ai.meta.com/blog/meta-llama-3/)\n",
"- [Getting Started with Meta Llama](https://llama.meta.com/docs/get-started)\n",
"- [Llama 3 repo](https://github.com/meta-llama/llama3)\n",
"- [Llama 3 model card](https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md)\n",
"- [LLama 3 Recipes repo](https://github.com/meta-llama/llama-recipes)\n",
"- [Responsible Use Guide](https://ai.meta.com/llama/responsible-use-guide/)\n",
"- [Acceptable Use Policy](https://ai.meta.com/llama/use-policy/)\n",
"\n"
]
}
],
"metadata": {
"colab": {
"collapsed_sections": [
"ioVMNcTesSEk"
],
"machine_shape": "hm",
"provenance": [],
"toc_visible": true
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.7"
}
},
"nbformat": 4,
"nbformat_minor": 4
}