Newbie, installed dalai with llama locally, trying to make sense of responses

Like the other commenter said, LLaMA 7B is the smallest model and not very capable on its own. The only decent 7B models are fine-tuned derivatives of it, like Alpaca Native. You won't get good results from the base LLaMA 7B.

As for your other questions, I should mention that the dalai repo hasn't been updated in a month; it's effectively an abandoned project. You should be using llama.cpp or text generation web UI instead. This subreddit has an install guide linked in the sidebar to help with that.

Once llama.cpp or text generation web UI is installed, switch to a better model. Use the Models wiki page to find which models can run on your system and download one.
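If you go the llama.cpp route and want to script against it, the llama-cpp-python bindings are the simplest option. Here's a minimal sketch, assuming you've pip-installed llama-cpp-python and downloaded a quantized GGML file (the model path and filename below are just placeholders for whatever you grab from the Models wiki):

```python
# Minimal sketch using llama-cpp-python (Python bindings for llama.cpp).
# The model path is a placeholder -- point it at whichever GGML model
# you downloaded (e.g. an Alpaca Native 7B 4-bit quant).
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-alpaca-7b-native-q4.bin")

output = llm(
    "Below is an instruction. Write a response that completes the request.\n\n"
    "### Instruction:\nExplain what a quantized model is in one paragraph.\n\n"
    "### Response:\n",
    max_tokens=256,
    stop=["### Instruction:"],
    echo=False,
)
print(output["choices"][0]["text"])
```

Note the Alpaca-style prompt template: the fine-tuned derivatives generally expect the instruction/response format they were trained on, which is a big part of why they respond so much more coherently than base LLaMA.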
