
What is the meaning of a goal for an agent?

We can quantify the amount of predicted learning given some actions, and we can predict the possibility of reward in the same way.

Is that all a “goal” has to be? Can a future-state reward be reasoned into some form of logic, some kind of logical expression? I’m having trouble defining it nicely. How can a goal ever be expressed or internalized?
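One way to make the question concrete is to write the same goal in both forms: as a logical predicate over future states, and as an expected-reward objective estimated under a predictive model. This is a minimal toy sketch, not a claim about how agents actually work; the dynamics, state encoding, and all names here are invented for illustration:

```python
import random

def goal_reached(state):
    # Logical form: the goal is just a predicate on states.
    return state >= 10

def transition(state, action, rng):
    # Toy stochastic dynamics (hypothetical): action shifts the
    # state, plus a little noise.
    return state + action + rng.choice([-1, 0, 1])

def expected_reward(state, action, n_samples=1000, seed=0):
    # Reward form: estimate the probability of satisfying the goal
    # predicate by Monte Carlo rollouts of the model.
    rng = random.Random(seed)
    hits = sum(goal_reached(transition(state, action, rng))
               for _ in range(n_samples))
    return hits / n_samples

def choose_action(state, actions):
    # Operationally, the agent "has a goal" if it picks the action
    # that maximizes predicted reward under its model.
    return max(actions, key=lambda a: expected_reward(state, a))

print(choose_action(state=5, actions=[0, 2, 6]))  # → 6
```

The predicate and the reward here carry the same information, which is one way to read the question: the "logical expression" of a goal may just be the condition whose predicted satisfaction the agent is maximizing.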

submitted by /u/42GOLDSTANDARD42
