hobbyjogger | 7 years ago
> "One technical problem with Computational Law, familiar to many individuals with legal training, is due to the open texture of laws. Consider a municipal regulation stating "No vehicles in the park". At first blush this is fine, but it is really quite problematic. Just what constitutes a vehicle? Is a bicycle a vehicle? What about a skateboard? How about roller skates? What about a baby stroller? A horse? A repair vehicle? For that matter, what is the park? At what altitude does it end? If a helicopter hovers at 10 feet, is that a violation? What if it flies over at 100 feet?
> The resolution of this problem is to limit the application of Computational Law to those cases where such issues can be externalized or marginalized. We allow human users to make judgments about such open texture concepts in entering data or we avoid regulatory applications where such concepts abound.
> A different sort of challenge to Computational Law stems from the fact that not all legal reasoning is deductive. Edwina Rissland [Rissland et al.] notes that, "Law is not a matter of simply applying rules to facts via modus ponens"; and, when regarding the broad application of AI techniques to law, this is certainly true. The rules that apply to a real-world situation, as well as even the facts themselves, may be open to interpretation, and many legal decisions are made through case-based reasoning, bypassing explicit reasoning about laws and statutes. The general problem of open texture when interpreting rules, along with the parallel problem of running out of rules to apply when resolving terms, presents significant obstacles to implementable automated rule-based reasoning."
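The externalization strategy the excerpt describes — let the machine do the deductive step while a human resolves the open-texture predicates — can be sketched roughly as follows. This is an illustrative assumption of mine, not code from the linked article; the rule, the function names, and the judgment table are all hypothetical.

```python
from typing import Callable

def violates_no_vehicles_rule(
    obj: str,
    in_park: bool,
    is_vehicle: Callable[[str], bool],
) -> bool:
    """Deductive step (modus ponens): vehicle(x) AND in_park(x) -> violation(x).

    The open-texture question "is x a vehicle?" is deliberately NOT
    answered here; it is delegated to the is_vehicle callback, i.e.
    externalized to a human judgment at data-entry time.
    """
    return in_park and is_vehicle(obj)

# Human judgments entered alongside the data (hypothetical values):
human_judgments = {"car": True, "bicycle": True, "baby stroller": False}

car_violates = violates_no_vehicles_rule("car", True, lambda o: human_judgments[o])
stroller_violates = violates_no_vehicles_rule("baby stroller", True, lambda o: human_judgments[o])
```

The point of the sketch is that the rule engine stays purely deductive; everything contestable lives in the judgment table, which is exactly where the excerpt says Computational Law has to draw the line.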
[1] https://law.stanford.edu/2016/01/13/michael-genesereths-comp...