Inference using a DataArray #23
Hi @Dhuige, thanks for trying it out. In order to create a multivariate input, you should use the following:

```julia
Data_Array = datavar(Vector{Float64}, N)
```

You can't use a single array for both input and output; instead, declare separate data variables:

```julia
Data_Array_Univariate = datavar(Float64, N)
Data_Array_Multivariate = datavar(Vector{Float64}, N)

for i in 1:N
    Data_Array_Univariate[i] ~ dot(Data_Array_Multivariate[i], theta) + epsilon
end
```

We will be happy to help further if you provide more details on what you are trying to achieve with this model.
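For context, a minimal sketch of how these pieces might fit together in a full `@model` block. This is illustrative only: the model name `linear_dot`, the priors on `theta` and `epsilon`, and the dimension `d` are assumptions for this sketch, not taken from the thread.

```julia
# Sketch only; assumes the RxInfer.jl / ReactiveMP.jl model syntax used above.
@model function linear_dot(N, d)
    Data_Array_Univariate   = datavar(Float64, N)          # univariate outputs
    Data_Array_Multivariate = datavar(Vector{Float64}, N)  # multivariate inputs

    # Assumed priors, purely illustrative:
    theta   ~ MvNormalMeanCovariance(zeros(d), diageye(d))
    epsilon ~ NormalMeanVariance(0.0, 1.0)

    for i in 1:N
        # Each observation is a dot product of one input vector with theta,
        # plus a shared noise term.
        Data_Array_Univariate[i] ~ dot(Data_Array_Multivariate[i], theta) + epsilon
    end
end
```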
Thank you for the improvement to my code. What I am trying to say is that it would be nice if one could use a single array as both input and output. In this example, that would imply that a slice of the multivariate array is treated as Data_Array_Univariate. In other words: preferably, the dot product would also work if Data_Array_Multivariate is sliced into an array (since this makes mathematical sense and would thus be convenient to use).
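To make the request concrete, the desired (currently unsupported) usage might look like the following sketch, where the split into `1:end-1` as input and `end` as output is a hypothetical convention chosen for illustration:

```julia
# Desired: a single datavar holds each full data row; the model slices it.
Data_Array = datavar(Vector{Float64}, N)

for i in 1:N
    # Hypothetical: last entry treated as the univariate output,
    # the remaining entries as the multivariate input of the same row.
    Data_Array[i][end] ~ dot(Data_Array[i][1:end-1], theta) + epsilon
end
```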
I see @Dhuige. Thanks for raising the issue.
@albertpod @wouterwln @Dhuige We may consider this improvement in the next iteration of the
ping @wouterwln |
I think this is possible in
I'm not entirely sure, better to add a test for it |
I added a test for this. The slicing might not work because of #246, but I think that might also be an illegal syntax. What do you think @albertpod? I think it does not make sense to pass a vector of individual random variables into the
It might be nice to infer data using an array, slicing the array into vectors inside the model so it can be read out.
By which I mean that one would be able to use an array given the following: