Find answers to your questions about Data Science or help others by answering their Data Science questions.
I am trying to get a better understanding of the outputs given by Google's sentiment analysis API. It takes in a sentence and gives out two values - magnitude and...
Asked on 06/10/2021 by Samarth · 0 answers
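For context on the two values, here is a minimal sketch of calling the API with the google-cloud-language Python client (assuming the library is installed and credentials are configured): score is the overall polarity in [-1, 1], and magnitude is the total emotional intensity, which grows with text length.

```python
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="I love this product, but shipping was slow.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)
sentiment = client.analyze_sentiment(
    request={"document": document}
).document_sentiment

# score in [-1.0, 1.0]: overall emotional leaning of the text
# magnitude in [0, inf): total strength of emotion, regardless of sign
print(sentiment.score, sentiment.magnitude)
```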
So let's say I create a logistic model to predict who will open a loan based on an email list that includes who opened and who didn't; that's 90%...
Asked on 06/10/2021 by happinessiskey · 1 answer
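If the 90% refers to the majority class (most recipients did not open), one common adjustment is reweighting the classes during fitting. A hypothetical sklearn sketch with synthetic stand-in data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))             # stand-in features from the email list
y = (rng.random(1000) < 0.1).astype(int)   # ~10% positives (opened)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" upweights the rare "opened" class in the loss
model = LogisticRegression(class_weight="balanced").fit(X_tr, y_tr)
print(model.predict_proba(X_te)[:5, 1])    # scores for the positive class
```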
I am working with data from the government regarding COVID immunization numbers and would like to convert object columns into numeric. I am using Python in Jupyter Notebook. I tried...
Asked on 06/10/2021 · 1 answer
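Without seeing the attempted code, the usual pandas route is pd.to_numeric with errors="coerce". A small sketch (the column name and values are illustrative):

```python
import pandas as pd

df = pd.DataFrame({"doses": ["1,234", "567", "n/a"]})

# Strip thousands separators, then coerce; unparseable entries become NaN
df["doses"] = pd.to_numeric(
    df["doses"].str.replace(",", "", regex=False), errors="coerce"
)
print(df.dtypes)  # doses is now float64
```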
I have a very simple program below that builds a model using multi-output regression. Even though all the training data consists of positive float values, I'm discovering that predictions made...
Asked on 06/10/2021 · 0 answers
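One likely cause, sketched under the assumption the model ends in a linear Dense layer: a linear output can extrapolate below zero even when every target is positive, and a non-negative output activation such as softplus (or relu) rules that out. The shapes below are invented:

```python
import numpy as np
from tensorflow import keras

X = np.random.rand(256, 8).astype("float32")
y = np.random.rand(256, 3).astype("float32")   # 3 positive regression targets

model = keras.Sequential([
    keras.layers.Input(shape=(8,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(3, activation="softplus"),  # outputs >= 0 by construction
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)

print(model.predict(X[:3]).min())  # never negative
```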
Is it possible to compute attention/adapt existing transformer architectures (like Longformer) to be used on multi-dimensional sequence input? As in, instead of a 1D array of tokens (like a Python...
Asked on 06/10/2021 · 0 answers
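One workaround people use (ViT-style, not specific to Longformer): flatten the grid into a 1D sequence and encode the extra dimension with separate row and column positional embeddings, then feed any standard 1D attention stack. A toy numpy sketch with arbitrary dimensions:

```python
import numpy as np

H, W, d = 4, 6, 16                    # grid height, width, embedding dim
grid = np.random.randn(H, W, d)       # 2D "sequence" of token embeddings

row_emb = np.random.randn(H, d)       # learned in practice; random here
col_emb = np.random.randn(W, d)

# Broadcast row/col embeddings across the grid, then flatten to (H*W, d)
tokens = grid + row_emb[:, None, :] + col_emb[None, :, :]
seq = tokens.reshape(H * W, d)        # ready for a standard 1D transformer
print(seq.shape)                      # (24, 16)
```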
So I have about 3000 images with 6 classes and this is what I did: 1 - split into training set and test set prior to anything with 20% test...
Asked on 06/10/2021 by Marco Ramos · 0 answers
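For reference, step 1 as described might look like the following stratified 80/20 split, done before any preprocessing so class proportions match across both sets (the paths and labels here are placeholders):

```python
from sklearn.model_selection import train_test_split

paths = [f"img_{i}.jpg" for i in range(3000)]   # hypothetical image paths
labels = [i % 6 for i in range(3000)]           # 6 classes

train_paths, test_paths, train_y, test_y = train_test_split(
    paths, labels, test_size=0.20, stratify=labels, random_state=42
)
print(len(train_paths), len(test_paths))        # 2400 600
```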
Disclaimer: I am not 100% sure that this is the appropriate place to ask this question. Here is a little bit of context about the problem. I have a dataset...
Asked on 06/10/2021 by Murcielago · 1 answer
I was going through the GANs notebook by fchollet on Generative Adversarial Networks where, in the generator network, he creates a Dense layer with ...
Asked on 06/10/2021 by thanatoz · 1 answer
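From memory of that example (so the exact sizes below are assumptions, not a quote), the pattern is a Dense layer wide enough that its flat output can be reshaped into a small feature map for later conv layers to upsample:

```python
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 32
generator_input = keras.Input(shape=(latent_dim,))

# 128 * 16 * 16 units so the flat vector can become a 16x16x128 feature map
x = layers.Dense(128 * 16 * 16)(generator_input)
x = layers.LeakyReLU()(x)
x = layers.Reshape((16, 16, 128))(x)

print(keras.Model(generator_input, x).output_shape)  # (None, 16, 16, 128)
```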
All three terms sound super similar: [...] document layout analysis is the process of identifying and categorizing the regions of interest in the scanned image of a text document. A reading...
Asked on 06/10/2021 · 0 answers
I understand the transformer architecture (from "Attention is All You Need"), as well as how attention is computed in the multi-headed attention layers. What I'm confused about is why...
Asked on 06/10/2021 by Nick Koprowicz · 1 answer
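As background for that question, a toy numpy sketch of multi-head attention: each head projects Q, K, and V into its own low-dimensional subspace and attends independently, so different heads can learn different attention patterns before their outputs are concatenated.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

T, d_model, n_heads = 5, 16, 4
d_k = d_model // n_heads
x = np.random.randn(T, d_model)

heads = []
for _ in range(n_heads):
    Wq, Wk, Wv = (np.random.randn(d_model, d_k) for _ in range(3))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k))   # (T, T) pattern, unique per head
    heads.append(attn @ V)                   # (T, d_k)

out = np.concatenate(heads, axis=-1)         # (T, d_model); a final W_O follows in practice
print(out.shape)
```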