Let $U_{0}$ be a random vector taking its values in a measurable space and having an unknown distribution $P$. Let $U_{1},U_{2},\ldots,U_{N}$ and $V_{1},V_{2},\ldots,V_{m}$ be independent simple random samples from $P$ of random size $N$ and fixed size $m$, respectively. Further, let $z_{1},z_{2},\ldots,z_{k}$ be real-valued bounded functions defined on the same space. Assuming that only the first sample is observed, we find a minimax predictor $\mbox{\boldmath $d^{0}$}(N,U_{1},\ldots,U_{N})$ of the vector $\mbox{\boldmath $Y^{m}$} = \sum_{j=1}^{m} (z_{1}(V_{j}),z_{2}(V_{j}),\ldots,z_{k}(V_{j}))^{T}$ with respect to a quadratic error loss function.
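To fix ideas, one common reading of this criterion is sketched below; the symbols $L$ and $d_{i}$ are introduced only for this illustration, and the exact weighting of coordinates used in the loss may differ from the unweighted form shown here. Writing $\mbox{\boldmath $d$}=(d_{1},\ldots,d_{k})^{T}$ for a predictor based on $(N,U_{1},\ldots,U_{N})$, the coordinatewise quadratic error loss is
\[
L\bigl(\mbox{\boldmath $d$},\mbox{\boldmath $Y^{m}$}\bigr)
  = \sum_{i=1}^{k}\Bigl(d_{i}(N,U_{1},\ldots,U_{N})-\sum_{j=1}^{m}z_{i}(V_{j})\Bigr)^{2},
\]
and a minimax predictor $\mbox{\boldmath $d^{0}$}$ is one minimizing the worst-case expected loss $\sup_{P}\mathrm{E}_{P}\,L\bigl(\mbox{\boldmath $d$},\mbox{\boldmath $Y^{m}$}\bigr)$ over all such predictors.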