diff --git a/documentation/T1_2024/Group_Tasks_and_Handover_Documents/DataBytesReport.md b/documentation/T1_2024/Group_Tasks_and_Handover_Documents/DataBytesReport.md index 043a9e26..b89281d8 100644 --- a/documentation/T1_2024/Group_Tasks_and_Handover_Documents/DataBytesReport.md +++ b/documentation/T1_2024/Group_Tasks_and_Handover_Documents/DataBytesReport.md @@ -404,6 +404,7 @@ height="4.840277777777778in"} |:-------------|:-------------------| | Nicholas Lane | Write contributions here !
  • I developed an NLP model that uses transfer learning to classify transactions with a BERT transformer.
  • I wrote code to clean and preprocess the text data, encode the class labels, tokenize the transaction descriptions with the BERT tokenizer, and convert them to BERT input format.
  • Created a function to build the training, validation, and test datasets.
  • Loaded the pre-trained BERT model for NLP classification, then added an output layer for class prediction.
  • Compiled and trained the BERT model on the training dataset, fine-tuned it, and evaluated its performance.
  • Updated the code to use the DolFin colour code format and resubmitted it.
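The cleaning and label-encoding steps described above can be sketched in plain Python. The function names and exact cleaning rules here are illustrative assumptions, and the BERT-specific tokenisation itself would be done with a library such as Hugging Face's `BertTokenizer` (omitted, since the exact stack isn't stated):

```python
import re

def clean_text(description: str) -> str:
    """Lowercase a transaction description and strip punctuation and
    repeated whitespace (assumed cleaning rules, for illustration only)."""
    text = description.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def encode_labels(labels):
    """Map class names to integer ids, as a classification head expects."""
    classes = sorted(set(labels))
    to_id = {c: i for i, c in enumerate(classes)}
    return [to_id[label] for label in labels], to_id

# Hypothetical raw transaction descriptions and their categories.
descriptions = ["COLES 0427 MELBOURNE  AUS", "Netflix.com subscription!"]
cleaned = [clean_text(d) for d in descriptions]
ids, mapping = encode_labels(["groceries", "entertainment"])
```

The cleaned strings would then go through the BERT tokenizer to produce input ids and attention masks before training.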
  • I developed a deep neural network model to identify and classify fraudulent bank transactions.
  • I searched for a suitable dataset containing the kind of information DolFin would be able to access through the Open Banking platform.
  • I wrote code to clean and preprocess the various data types and prepare them for the deep learning model.
  • I split the data into training and testing sets, which were then converted into TensorFlow datasets with shuffling and prefetching applied.
  • I developed a model and added regularisation to improve its generalisability and reduce overfitting to the training data.
  • The model was compiled and evaluated; I also updated the colours to the DolFin colour format and submitted the code as a .py file.
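A framework-agnostic sketch of the split step (the function name, ratio, and seed are assumptions, not the author's exact values); in TensorFlow, the resulting partitions would then typically be wrapped with `tf.data.Dataset.from_tensor_slices(...)` and chained with `.shuffle(...)` and `.prefetch(...)` as described above:

```python
import random

def train_test_split(rows, test_frac=0.2, seed=42):
    """Shuffle rows deterministically, then split into train/test partitions."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_frac))
    return shuffled[:cut], shuffled[cut:]

# Hypothetical data: 100 preprocessed transaction records.
data = list(range(100))
train, test = train_test_split(data)
```

Shuffling before the split prevents any ordering in the source data (e.g. by date) from leaking into the evaluation; prefetching then overlaps data loading with training on the previous batch.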
  • Assisted with handover documentation and presentation slides.
  • | | Junkai Jiang | Write notes about how to set up Dolfin_new
    Set up Dolfin_new GitHub repository
    Develop JWT service for user authentication (Dolfin_new)
    Develop Basiq API service (Dolfin_new)
    Develop database service (Dolfin_new)
    Review the pull request by Deepak: Optimization of the clear transaction function.
    Review the pull request by Sagar: Email verification function.
    Review the pull request for rebuilding the login route and updating the login page
    Review the pull request for rebuilding the dashboard
    Set up Dolfin_new Trello backlog
    Redesigned and developed the dashboard web interface of the new project
    Discussions with Junior Developer
    Discuss with Junior about setting up a Dolfin account
    Discuss with Junior about the transition of the project (React part)
    Discuss with Junior about the account delete functionality
    Discuss with Juniors about the handover document
    Connect the reported financial well-being feature to the database and backend (Dolfin_new)
    Connect income and expenditure overview to the backend (Dolfin_new)
    Connect D-cloud to the backend (Dolfin_new)
    Add linking to the bank account feature (Dolfin_new)
    Fix the backend Docker file
    Complete the showcase video (Dolfin_new part) | | Ata Colak |
  • Developed a chatbot that uses the Groq API as the inference engine and LLAMA3-70b as the large language model.
  • Experimented with running LLMs locally; the best-performing local model was "Phi3" via "Ollama".
  • Introduced logic to pass the LLM only the relevant transaction info, greatly reducing the risk of hallucination.
  • Introduced the capability to store different files corresponding to different dates.
  • Got the PR merged for the final state of the chatbot, which extracts date information from the user message, searches the KnowledgeBase folder for the relevant date, and answers the user's question about their transactions.
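The chatbot's retrieval step can be sketched as follows. The ISO date format and the dict-backed store are simplifying assumptions standing in for the KnowledgeBase folder, and the final answer generation via the Groq API is omitted:

```python
import re
from datetime import datetime

def extract_date(message: str):
    """Pull the first ISO-style date (YYYY-MM-DD, an assumed format)
    out of a user message, or return None if no date is present."""
    m = re.search(r"\b(\d{4}-\d{2}-\d{2})\b", message)
    return datetime.strptime(m.group(1), "%Y-%m-%d").date() if m else None

def lookup_transactions(knowledge_base: dict, message: str):
    """Return only the transactions for the date mentioned in the message,
    so the LLM prompt contains just the relevant context."""
    date = extract_date(message)
    if date is None:
        return []
    return knowledge_base.get(date.isoformat(), [])

# Hypothetical knowledge base keyed by date.
kb = {"2024-05-01": ["COLES $42.10", "MYKI $5.00"]}
relevant = lookup_transactions(kb, "What did I spend on 2024-05-01?")
```

Filtering before prompting is what limits hallucination: the model only ever sees the transactions that actually match the requested date.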
  • |