Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


sql server - FIFO Implementation in Inventory using SQL

This is basically an inventory project which tracks the "Stock In" and "Stock Out" of items through purchases and sales, respectively.

The inventory system follows the FIFO method (the items which are purchased first are always sold first). For example:

If we purchased Item A in January, February and March, then when a customer comes we give away the items purchased in January; only when the January items are finished do we start giving away the February items, and so on.

So I have to show the total stock in hand and its break-up, so that I can see the total cost incurred.

Actual table data: (screenshot, not reproduced here)
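Since the screenshot is lost, here is a minimal sketch of what the table presumably looks like. The column names item, wh, stock_in, stock_out, price and value are taken from the queries in the answer below; the types and everything else are assumptions:

```sql
-- Hypothetical layout, inferred from the queries in the answer;
-- each row is either a stock-in (purchase) or a stock-out (sale).
CREATE TABLE myTable (
    item      varchar(20),   -- e.g. 'Item A'
    wh        varchar(20),   -- warehouse code
    date      date,          -- transaction date (drives the FIFO order)
    stock_in  int,           -- quantity purchased (0 on a sale row)
    stock_out int,           -- quantity sold (0 on a purchase row)
    price     decimal(10,2), -- unit price
    value     decimal(10,2)  -- quantity * unit price
);
```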

The result set I need to obtain: (screenshot, not reproduced here)

My client insists that I should not use a cursor, so is there any other way of doing this?


1 Reply

As some comments already said, a CTE can solve this:

with cte as (
select item, wh, stock_in, stock_out, price, value
     -- number the rows inside every item/warehouse group,
     -- in the physical order of the data (see the caveat in the update below)
     , row_number() over (partition by item, wh order by item, wh) as rank
from   myTable)
select a.item, a.wh
     -- remaining stock = what came in, minus what the following row took out
     , a.stock_in - coalesce(b.stock_out, 0) stock
     , a.price
     , a.value - coalesce(b.value, 0) value
from cte a
     -- pair each row with the row that immediately follows it
     left join cte b on a.item = b.item and a.wh = b.wh and a.rank = b.rank - 1
where a.stock_in - coalesce(b.stock_out, 0) > 0

Note that the second "Item B" has the wrong price (the IN price is 25, the OUT price is 35).
SQL 2008 fiddle

Just for fun: with SQL Server 2012 and the introduction of the LEAD and LAG functions, the same thing is possible in a somewhat easier way:

with cte as (
select item, wh, stock_in
     -- LEAD picks the stock_out (and value) of the next row
     -- within the same item/warehouse group
     , coalesce(LEAD(stock_out) 
                OVER (partition by item, wh order by item, wh), 0) stock_out
     , price, value
     , coalesce(LEAD(value) 
                OVER (partition by item, wh order by item, wh), 0) value_out
from   myTable)
select item
     , wh
     , (stock_in - stock_out) stock
     , price
     , (value - value_out) value
from   cte
where  (stock_in - stock_out) > 0

SQL2012 fiddle

Update
ATTENTION: to use the two queries above, the data must already be in the correct order.

To get the details with more than one row per day, you need something reliable to order rows that share the same date, like a date column with time, an auto-incremental ID or something along the same lines; it is not possible to use the queries already written, because they rely on the physical position of the data.

A better idea is to split the data into IN and OUT, order each set by item, wh and date, and apply a rank to both, like this:

SELECT d_in.item
     , d_in.wh
     , d_in.stock_in - coalesce(d_out.stock_out, 0) stock
     , d_in.price
     , d_in.value - coalesce(d_out.value, 0) value
FROM   (-- IN rows only (purchases), ranked in date order
        SELECT item, wh, stock_in, price, value
             , rank = row_number() OVER 
               (PARTITION BY item, wh ORDER BY item, wh, date)
        FROM   myTable
        WHERE  stock_out = 0) d_in
       LEFT JOIN
       (-- OUT rows only (sales), ranked the same way
        SELECT item, wh, stock_out, price, value
             , rank = row_number() OVER 
               (PARTITION BY item, wh ORDER BY item, wh, date)
        FROM   myTable
        WHERE  stock_in = 0) d_out
       -- match the n-th IN with the n-th OUT
       ON d_in.item = d_out.item AND d_in.wh = d_out.wh 
      AND d_in.rank = d_out.rank
WHERE d_in.stock_in - coalesce(d_out.stock_out, 0) > 0

SQLFiddle

But this query is NOT completely reliable either: the order of rows within the same order group is not stable.

I haven't changed the query to recalculate the price when the IN price differs from the OUT price.
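One way to stabilise that order, assuming the table can be given an auto-incremental ID column (called id here purely for illustration; it is not in the original table), is to add it as a tiebreaker in each ROW_NUMBER. For the IN side the subquery would become something like:

```sql
-- id breaks ties between rows sharing the same item, wh and date,
-- so the FIFO ranking becomes deterministic
SELECT item, wh, stock_in, price, value
     , rank = row_number() OVER
              (PARTITION BY item, wh ORDER BY date, id)
FROM   myTable
WHERE  stock_out = 0
```

with the same change applied to the OUT side of the join.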


