cmc-csci143
sql.normalized_batch/04.sql fail
Hello @mikeizbicki,
I have successfully created an index that speeds up the sql.normalized_batch/04.sql query. However, when I run the test script, the sql.normalized_batch/04.sql test fails. I imagine that this cannot be caused by my index. Does this mean that I loaded the tweets incorrectly? All of my other test cases are passing.
It probably means the tweets were loaded incorrectly (although there are some settings on the GIN index that can change the results of the query; see for example the gin_fuzzy_search_limit section of the GIN index reading https://habr.com/en/companies/postgrespro/articles/448746/).
The expected number of tweets is 6056. Can you post the number that you're getting back? Depending on the number, I might not make you reinsert all the data (because that's a lot of work and not really the point of the assignment).
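For reference, an index of roughly this shape targets that query. This is a sketch only, not the assignment's reference solution; the index name and the partial-index predicate are illustrative:

```sql
-- Hypothetical sketch: a GIN index over the tsvector expression lets
-- Postgres answer the @@ full-text match without scanning every row.
CREATE INDEX tweets_text_tsv_idx ON tweets
    USING gin (to_tsvector('english', text))
    WHERE lang = 'en';
```

The `WHERE lang='en'` clause makes this a partial index, so the planner will only consider it for queries that also filter on `lang='en'`, as query 4 does.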
Hi @mikeizbicki! Here are the results I am getting:
postgres=# SELECT
count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
AND lang='en'
;
count
-------
11415
(1 row)
This seems much greater than the expected number. What might be causing this?
Somehow duplicate entries have been added to your database. (There are no UNIQUE constraints, for example, that would prevent this from happening.) The other queries all do something like SELECT count(DISTINCT id) which causes this to not be a problem for those queries.
You don't have to redo inserting all the data. I'll waive for you the requirement that question 4 pass the test case.
This waiver is for @henrylong612 only. If anyone else here is in a similar situation, you can reply to this comment and request a similar waiver, but I'll have to approve it.
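The deduplicating pattern described above looks like the following. This is a sketch assuming the tweets table has an `id` primary-key column, as the other queries imply:

```sql
-- Sketch: with duplicated rows present, count(*) double-counts,
-- while count(DISTINCT id) collapses the duplicates back down to
-- the number of unique tweets.
SELECT count(DISTINCT id)
FROM tweets
WHERE to_tsvector('english', text) @@ to_tsquery('english', 'coronavirus')
  AND lang = 'en';
```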
Thank you @mikeizbicki!
@mikeizbicki I am getting this output:
postgres=# SELECT
count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
AND lang='en'
;
count
-------
12112
I believe I am having a similar issue.
@luisgomez214 I'll also waive for you the requirement that problem 4 pass the test cases.
Thank you @mikeizbicki
My run time is
lambda-server:~/bigdata/twitter_postgres_indexes (master *%=) $ time docker-compose exec pg_normalized_batch ./run_tests.sh sql.normalized_batch
/home/Luis.Gomez.25/.local/lib/python3.6/site-packages/paramiko/transport.py:32: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6.
from cryptography.hazmat.backends import default_backend
sql.normalized_batch/01.sql pass
sql.normalized_batch/02.sql pass
sql.normalized_batch/03.sql pass
sql.normalized_batch/04.sql fail
sql.normalized_batch/05.sql pass
real 0m8.743s
user 0m0.572s
sys 0m0.387s
Is this correct or should I aim for 3 seconds?
@luisgomez214 You should be able to do better than this with the right indexes. You can submit as-is for 12/16 points on this section (-1 point/second over 5 seconds).
My runtime for normalized_batch varies from 2-8 seconds when I run it. I was wondering whether we will be graded on our indexes or on our times when the test is run?
@JTan242 The grading is described in the homework README file (https://github.com/mikeizbicki/twitter_postgres_indexes/?tab=readme-ov-file#grading), where it states that the grade is based on the runtimes.
Hi @mikeizbicki, I am getting a similar error for query 4 as the others.
postgres=# SELECT
count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
AND lang='en'
;
count
-------
12112
(1 row)
@mmendiratta27 I'm not sure if this is still relevant for you, but I will do the same waiver in that you don't need to have test case 4 passing.
Hi @mikeizbicki, I saw this thread from a previous semester. I am running into the same issue where my test cases run in under 5 seconds but test case 4 is failing.
postgres=# SELECT
count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
AND lang='en'
;
count
-------
12112
(1 row)
I was wondering if I could also have the requirement for it to pass waived?
I'm not sure if this is relevant, but initially my test 4 was failing. For an unrelated reason I was trying to reset my environment, so I pruned all volumes
$ docker stop $(docker ps -q)
$ docker rm $(docker ps -qa)
$ docker volume prune --all
and reset the containers
$ docker-compose run pg_normalized_batch bash -c 'rm -rf $PGDATA/*'
$ docker-compose run pg_denormalized bash -c 'rm -rf $PGDATA/*'
When I loaded the data in again, all tests passed. I don't recall what my test output was, or whether I also had a count of 12112, so I'm not sure whether this will help or whether I had a different issue.
@jchopra05 The expected number of tweets is 6056 and the number that you are getting is exactly double. This means that you have somehow inserted each tweet twice. I don't want you to waste time at this point resetting the volumes and reinserting. So I will waive the requirement for you that test case 4 passes.
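The duplicate-insertion hypothesis can be checked directly without reloading anything. A sketch, again assuming an `id` column identifying each tweet:

```sql
-- Sketch: list ids that appear more than once. If every tweet was
-- inserted exactly twice, every matching id will show copies = 2.
SELECT id, count(*) AS copies
FROM tweets
GROUP BY id
HAVING count(*) > 1
LIMIT 10;
```

An empty result here would mean the count discrepancy has some other cause than duplicate rows.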