Analytics pipeline: Failed to run task AnswerDistributionWorkflow

The issue above is fixed if I add one more parameter, remote_log_level, in the code.
However, a new issue appears:

    WARNING:luigi-interface:Will not run ImportEnrollmentsIntoMysql(source=["hdfs://localhost:9000/data/logs/tracking/"], expand_interval=4 w 2 d 0 h 0 m 0 s, pattern=[".*tracking.log-([0-9]).*"], date_pattern=%Y%m%d, warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/, date=2021-03-26, partner_short_codes=[], partner_api_urls=[], api_root_url=None, interval=2021-03-25, enable_course_catalog=False) or any dependencies due to error in complete() method:
Traceback (most recent call last):
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/worker.py", line 401, in check_complete
    is_complete = task.complete()
  File "/var/lib/analytics-tasks/analyticstack/edx-analytics-pipeline/edx/analytics/tasks/util/overwrite.py", line 47, in complete
    return super(OverwriteOutputMixin, self).complete()
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/task.py", line 827, in complete
    return all(r.complete() for r in flatten(self.requires()))
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/task.py", line 827, in <genexpr>
    return all(r.complete() for r in flatten(self.requires()))
  File "/var/lib/analytics-tasks/analyticstack/edx-analytics-pipeline/edx/analytics/tasks/util/overwrite.py", line 47, in complete
    return super(OverwriteOutputMixin, self).complete()
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/task.py", line 573, in complete
    return all(map(lambda output: output.exists(), outputs))
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/task.py", line 573, in <lambda>
    return all(map(lambda output: output.exists(), outputs))
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/contrib/hive.py", line 438, in exists
    return self.client.table_exists(self.table, self.database, self.partition)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/contrib/hive.py", line 148, in table_exists
    (%s)""" % (database, table, self.partition_spec(partition)))
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/contrib/hive.py", line 80, in run_hive_cmd
    return run_hive(['-e', hivecmd], check_return_code)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/contrib/hive.py", line 67, in run_hive
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
  File "/usr/lib/python2.7/subprocess.py", line 394, in __init__
    errread, errwrite)
  File "/usr/lib/python2.7/subprocess.py", line 1047, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

At first glance the new issue just looks like the edX analytics pipeline failing in general, but the traceback ends in subprocess.Popen, which raises OSError: [Errno 2] No such file or directory when the command it tries to execute (here, the hive command that luigi.contrib.hive shells out to) cannot be found.
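
A quick way to confirm this theory is to check whether the `hive` executable is actually reachable from the environment the pipeline runs in. The sketch below (works on both Python 2.7 and 3) is an assumption-laden illustration, not part of the pipeline: the `find_on_path` helper is hypothetical, and the note about the `[hive]` command setting refers to luigi's configurable hive command, which defaults to `hive`.

```python
import os


def find_on_path(program):
    """Return the full path of *program* if it is on $PATH, else None."""
    for directory in os.environ.get('PATH', '').split(os.pathsep):
        candidate = os.path.join(directory, program)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None


# luigi.contrib.hive runs the hive CLI via subprocess.Popen; if the
# command is missing, Popen raises OSError: [Errno 2] No such file
# or directory before hive ever starts.
hive_path = find_on_path('hive')
if hive_path is None:
    print('hive is not on PATH -- install Hive or configure luigi '
          'to point at the binary')
else:
    print('hive found at %s' % hive_path)
```

If `hive` lives somewhere non-standard, either add its directory to the PATH seen by the user running the pipeline, or configure the command luigi invokes in its config file.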