Results for "table-question-answering"

20 matches found.

All 20 results carry the table-question-answering task tag. The model-card descriptions (truncated in the search results) are shared across each family:

- google/tapas-* models: "TAPAS is a BERT-like transformers model pretrained on a large corpus of English data from Wikipedia in a self-supervised fashion. This means..."
- microsoft/tapex-* models: "TAPEX (Table Pre-training via Execution) is a conceptually simple and empirically powerful pre-training approach to empower existing models ..."
- lysandre/tiny-tapas-random-* and QuantFactory/TableLLM-13b-GGUF: no description available.

Model                                             Downloads
microsoft/tapex-base-finetuned-wikisql              973,852
google/tapas-large-finetuned-sqa                     69,112
google/tapas-base-finetuned-wtq                      10,157
google/tapas-small-finetuned-wikisql-supervised       4,836
google/tapas-tiny-finetuned-sqa                       2,579
lysandre/tiny-tapas-random-sqa                        1,987
lysandre/tiny-tapas-random-wtq                        1,935
google/tapas-base-finetuned-sqa                       1,410
google/tapas-medium-finetuned-wtq                     1,113
microsoft/tapex-large-finetuned-wtq                     892
microsoft/tapex-large-finetuned-tabfact                 735
google/tapas-large-finetuned-wtq                        727
microsoft/tapex-large                                   595
microsoft/tapex-base                                    586
google/tapas-small-finetuned-wtq                        427
google/tapas-medium-finetuned-wikisql-supervised        413
microsoft/tapex-base-finetuned-wtq                      381
google/tapas-base-finetuned-wikisql-supervised          322
google/tapas-small-finetuned-sqa                        308
QuantFactory/TableLLM-13b-GGUF                          192
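Any of the TAPAS or TAPEX checkpoints above can be loaded through the transformers "table-question-answering" pipeline. A minimal sketch, assuming the transformers library (with a PyTorch backend) is installed; the checkpoint choice and the toy table below are illustrative, not part of the search results:

```python
# Sketch: query a table with a table-question-answering pipeline.
# Assumes `transformers` + torch are installed; the checkpoint is
# downloaded from the Hub on first use.
from transformers import pipeline

# TAPAS expects every cell as a string; each dict key is a column header.
table = {
    "model": ["tapex-base-finetuned-wikisql", "tapas-large-finetuned-sqa"],
    "downloads": ["973852", "69112"],
}

tqa = pipeline("table-question-answering",
               model="google/tapas-base-finetuned-wtq")
result = tqa(table=table, query="How many downloads does tapas-large-finetuned-sqa have?")
print(result["answer"])
```

The result dict also carries `coordinates` (the selected cells) and, for WTQ-style checkpoints, an `aggregator` field (e.g. NONE, SUM, COUNT), which is how TAPAS answers questions that span multiple cells.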