In the present paper, we formulate a sequential decision problem on a partially observable Markov process that allows for partial maintenance, and we develop an optimal maintenance policy for a product. Over its life cycle, the condition of the item deteriorates, causing troubles of varying severity: a minor trouble can be handled individually, whereas a more serious one may require replacing a faulty component. The decision-maker cannot observe the condition of the item directly; information about it is obtained only through the magnitude of the observed trouble. The state of the item evolves according to a Markovian transition rule satisfying the ${\rm TP}_2$ (totally positive of order 2) property. At each decision epoch, the decision-maker chooses a level of repair, whose cost varies with that level. The problem is how much to expend on maintaining the item so as to minimize the total expected cost. A dynamic programming formulation yields a recursive equation for the expected cost attainable under the optimal policy, and the purpose of this paper is to establish monotonicity properties of this value.
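To indicate the general shape of such a recursion, a minimal sketch under generic POMDP assumptions is given below; the symbols used here (the belief vector $x$ over the unobservable states, the repair level $a$, the one-step cost $c(x,a)$, the discount factor $\beta$, the observation probability $\sigma(\theta \mid x,a)$ for a trouble of magnitude $\theta$, and the Bayes-updated belief $T(x,a,\theta)$) are illustrative placeholders, not the notation of this paper.
\[
v_n(x) \;=\; \min_{a} \Bigl\{\, c(x,a) \;+\; \beta \sum_{\theta} \sigma(\theta \mid x,a)\, v_{n-1}\bigl(T(x,a,\theta)\bigr) \,\Bigr\},
\]
where $v_n(x)$ denotes the minimal total expected cost over $n$ remaining periods starting from belief $x$. Monotonicity results of the kind pursued in this paper are typically obtained by showing that $v_n$ is monotone with respect to a stochastic order on beliefs preserved by the ${\rm TP}_2$ transition rule.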