Update the code to pass test cases #45

Open · wants to merge 1 commit into base: master
23 changes: 23 additions & 0 deletions README.md
@@ -40,3 +40,26 @@ make test
```

Good luck!


## Description

I worked mainly on two files:
```
main.go
index.html
```
To fix the failing **TestSearchCaseSensitive** test case, I keep two copies of the data read from `completeworks.txt`.

I lowercase one of the copies and build the suffix array index over that lowercased data.
I also make sure the query is converted to lower case before the lookup, which makes the index search case insensitive.
When returning a portion of the text to the user, I use the first copy, i.e. the unaltered `completeworks.txt` data.
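
A rough standalone sketch of that idea (not the PR code itself; it uses a plain `Lookup` for brevity, while the actual change in the `main.go` diff below uses `FindAllIndex`):

```go
package main

import (
	"fmt"
	"index/suffixarray"
	"os"
	"strings"
)

func main() {
	// Keep two views of the corpus: the original text for display,
	// and a lowercased copy that the suffix array is built from.
	dat, err := os.ReadFile("completeworks.txt")
	if err != nil {
		panic(err)
	}
	original := string(dat)
	index := suffixarray.New([]byte(strings.ToLower(original)))

	// Lowercase the query too, so "Hamlet", "HAMLET" and "hamlet" all match.
	query := strings.ToLower("Hamlet")
	for _, idx := range index.Lookup([]byte(query), 3) {
		// Offsets are valid in both copies (only letter case differs),
		// so slice the unaltered text when showing results.
		fmt.Println(original[idx : idx+len(query)])
	}
}
```
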
To fix the failing **TestSearchDrunk** test case, I use the `FindAllIndex` function from the `index/suffixarray` package.
That function finds the indices of all substrings that match a given regular expression.

So I convert the incoming query into an exact-match (whole-word) regular expression and pass it to `FindAllIndex` to get the indices of every matching substring.
Each match is returned as a pair of indices, the start and the end of the matching substring, and I use these values to return a section of the surrounding text to the user.
Since the test case expects at most 20 items and a substring could occur more than 20 times, I limit the number of matches by passing 20 as the second argument to `FindAllIndex`.
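
A minimal self-contained sketch of this approach (an illustrative helper, not the PR's actual `Search` method, which lives in the `main.go` diff below):

```go
package main

import (
	"fmt"
	"index/suffixarray"
	"regexp"
	"strings"
)

// findMatches returns up to limit whole-word, case-insensitive matches of
// query within text, using suffixarray.FindAllIndex.
func findMatches(text, query string, limit int) ([]string, error) {
	index := suffixarray.New([]byte(strings.ToLower(text)))

	// Wrapping the query in \b...\b turns it into an "exact match"
	// (whole-word) pattern; a query containing regexp metacharacters may fail here.
	pattern, err := regexp.Compile(fmt.Sprintf(`\b%s\b`, strings.ToLower(query)))
	if err != nil {
		return nil, fmt.Errorf("regular expressions error: %w", err)
	}

	results := []string{}
	// Each entry is a [start, end) pair; passing limit as the second
	// argument caps how many matches FindAllIndex returns.
	for _, m := range index.FindAllIndex(pattern, limit) {
		results = append(results, text[m[0]:m[1]])
	}
	return results, nil
}

func main() {
	matches, err := findMatches("A drunk man met a drunken sailor.", "drunk", 20)
	if err != nil {
		panic(err)
	}
	fmt.Println(matches) // prints [drunk]; "drunken" is excluded by the word boundaries
}
```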

To fix the failing `should load more results for "horse" when clicking "Load More"` test case, I gave the Load More button an id of `load-more` (see the `index.html` diff below),
since the Puppeteer call `page.click('#load-more')` looks for a clickable element with that id.
28 changes: 28 additions & 0 deletions index.html
@@ -0,0 +1,28 @@
<!doctype html>
<html class="no-js" lang="">

<head>
  <meta charset="utf-8">
  <title>ShakeSearch</title>
  <link rel="stylesheet" href="style.css">
  <meta name="theme-color" content="#fafafa">
</head>

<body>
  <p>
    <form id="form">
      <label for="query">Query</label>
      <input type="text" id="query" name="query">
      <button type="submit">Search</button>
    </form>
  </p>
  <p>
    <table id="table">
      <tbody id="table-body"></tbody>
    </table>
  </p>
  <button id="load-more">Load More</button>
  <script src="app.js"></script>
</body>

</html>
27 changes: 20 additions & 7 deletions main.go
@@ -9,6 +9,8 @@ import (
 	"log"
 	"net/http"
 	"os"
+	"regexp"
+	"strings"
 )
 
 func main() {
@@ -48,10 +50,15 @@ func handleSearch(searcher Searcher) func(w http.ResponseWriter, r *http.Request
 			w.Write([]byte("missing search query in URL params"))
 			return
 		}
-		results := searcher.Search(query[0])
+		results, err := searcher.Search(query[0])
+		if err != nil {
+			w.WriteHeader(http.StatusInternalServerError)
+			w.Write([]byte(err.Error()))
+			return
+		}
 		buf := &bytes.Buffer{}
 		enc := json.NewEncoder(buf)
-		err := enc.Encode(results)
+		err = enc.Encode(results)
 		if err != nil {
 			w.WriteHeader(http.StatusInternalServerError)
 			w.Write([]byte("encoding failure"))
@@ -68,15 +75,21 @@ func (s *Searcher) Load(filename string) error {
 		return fmt.Errorf("Load: %w", err)
 	}
 	s.CompleteWorks = string(dat)
-	s.SuffixArray = suffixarray.New(dat)
+	datToLower := strings.ToLower(s.CompleteWorks)
+	s.SuffixArray = suffixarray.New([]byte(datToLower))
 	return nil
 }
 
-func (s *Searcher) Search(query string) []string {
-	idxs := s.SuffixArray.Lookup([]byte(query), -1)
+func (s *Searcher) Search(query string) ([]string, error) {
+	queryToLower := strings.ToLower(query)
+	regexQuery, err := regexp.Compile(fmt.Sprintf("\\b%s\\b", queryToLower))
+	if err != nil {
+		return []string{}, fmt.Errorf("regular expressions error: %w", err)
+	}
+	idxs := s.SuffixArray.FindAllIndex(regexQuery, 20)
 	results := []string{}
 	for _, idx := range idxs {
-		results = append(results, s.CompleteWorks[idx-250:idx+250])
+		results = append(results, s.CompleteWorks[idx[0]-250:idx[1]+250])
 	}
-	return results
+	return results, nil
 }