Create a view that adds a row number for a specific set of data filtered by a column index

I have a database table like the one mentioned below.

id    | lecture            | subject_id | date       | is_deleted
------|--------------------|------------|------------|-----------
 1    | Introduction       | 1          | 2012-08-10 | 0   
 2    | Structure          | 2          | 2012-08-15 | 1   
 3    | Introduction       | 2          | 2012-08-12 | 0   
 4    | Functions          | 1          | 2012-08-14 | 1   
 5    | Material           | 2          | 2012-08-18 | 0   
 6    | Requirements       | 1          | 2012-08-16 | 0   
 7    | Analysis           | 1          | 2012-08-11 | 0

I need to create a view over this table (Lecture) that displays a row number (flow_no) for each subject, ordered by date, excluding rows where is_deleted = 1. In short: a flow number for each lecture within a particular subject, ordered by date, counting only non-deleted lectures. The view built from the data above should look like the following.

flow_no | id   | date       | lecture            | subject_id 
--------|------|------------|--------------------|------------
 1      | 1    | 2012-08-10 | Introduction       | 1          
 2      | 7    | 2012-08-11 | Analysis           | 1          
 3      | 6    | 2012-08-16 | Requirements       | 1          
 1      | 3    | 2012-08-12 | Introduction       | 2          
 2      | 5    | 2012-08-18 | Material           | 2          

I tried to do this in several ways and everything failed. It would be highly appreciated if someone could help me resolve this. (MySQL)

asked Aug 28 '12 at 08:08

2 Answers

SELECT @rownum := @rownum + 1 AS flow_no, id, date, lecture, subject_id
FROM Lecture, (SELECT @rownum := 0) r
WHERE is_deleted = 0
ORDER BY subject_id, date, id

I'll leave it to you to turn that into a view.
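
One caveat before wrapping it: MySQL rejects user variables inside a view body (ERROR 1351), so the @rownum trick cannot go into CREATE VIEW directly. A variable-free sketch that a view will accept (global numbering only, not yet per subject; the view name lecture_flow is made up):

```sql
-- Hypothetical view name; table and columns are from the question.
-- Numbers every non-deleted row globally, ordered by (subject_id, date, id),
-- by counting how many non-deleted rows sort before the current one.
CREATE VIEW lecture_flow AS
SELECT (SELECT COUNT(*)
          FROM Lecture AS l
         WHERE l.is_deleted = 0
           AND (l.subject_id < Lecture.subject_id
                OR (l.subject_id = Lecture.subject_id AND l.date < Lecture.date)
                OR (l.subject_id = Lecture.subject_id AND l.date = Lecture.date
                    AND l.id < Lecture.id))
       ) + 1 AS flow_no,
       id, date, lecture, subject_id
  FROM Lecture
 WHERE is_deleted = 0;
```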

To restart the flow_no per subject you have a few choices:

Copy the query above and wrap it as a subquery, then calculate the min(flow_no) grouped by subject_id, join that back to the query above, and subtract the min flow_no from each row (adding 1 so numbering starts at 1).

You could assign the subject_id to a variable, then check the variable against the current subject_id and reset the rownum variable each time - I'm not even sure this is possible.

You could write a stored procedure to do this - get a list of unique subject_id, then run a bunch of queries for each one and output them.

None of these options sound appealing to me. If this was me I would abandon doing this in a query.
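
For what it's worth, the second option (reset the counter when the subject changes) is possible, though it leans on MySQL evaluating the select list left to right, which is not guaranteed, and user-variable assignment in SELECT is deprecated as of MySQL 8.0. A sketch under those caveats:

```sql
-- flow_no restarts at 1 whenever subject_id changes.
-- Relies on @prev being compared before it is reassigned on the next line,
-- which MySQL does not formally guarantee.
SELECT flow_no, id, date, lecture, subject_id
FROM (
  SELECT @rownum := IF(@prev = subject_id, @rownum + 1, 1) AS flow_no,
         @prev   := subject_id AS subject_id,
         id, date, lecture
  FROM Lecture, (SELECT @rownum := 0, @prev := NULL) AS init
  WHERE is_deleted = 0
  ORDER BY subject_id, date, id
) AS numbered;
```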

answered Aug 28 '12 at 09:08

OP wanted to partition flow_no by subject_id. - eggyal

@eggyal Ouch. Good point. It's possible to do this, but I recommend against doing this much in a query. You should probably do this in the outer language. - Ariel

@Ariel thank you for your prompt answer. I had actually gotten to this point myself, and what I really need is the partitioning of flow_no by subject_id. I tried it as you mentioned but couldn't achieve what I wanted. - Chandimak

@chandimak I've helped as much as I could, sorry it's not complete. Databases are really bad at doing this kind of thing, that's why it's so hard. I, again, recommend you do this in the outside language. - Ariel

@Ariel Thank you very much for your time and support. - Chandimak

I found the answer with the help of a member of another online forum. The following gives the exact result required. If any rows share the same Lecture.date, it generates flow_no in Lecture.id order for those rows.

SELECT Lecture.id, lecture, Lecture.subject_id, Lecture.date, c,
 (SELECT COUNT(subject_id) + 1 FROM Lecture AS l
  WHERE l.subject_id = Lecture.subject_id
  AND (l.date < Lecture.date OR (l.date = Lecture.date AND l.id < Lecture.id))
  AND l.is_deleted != 1
 ) AS flow_no
FROM Lecture
INNER JOIN
(
 SELECT subject_id, COUNT(subject_id) AS c
 FROM Lecture
 WHERE is_deleted != 1
 GROUP BY subject_id
) AS counts
ON Lecture.subject_id = counts.subject_id
WHERE Lecture.is_deleted != 1
ORDER BY Lecture.subject_id, Lecture.date;
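
For readers on MySQL 8.0 or later (which postdates this question): a window function produces the same partitioned numbering directly, and, unlike user variables, it is allowed inside a view body. A sketch, with a made-up view name:

```sql
-- ROW_NUMBER() restarts at 1 for each subject_id, ordered by date then id.
CREATE VIEW lecture_flow AS
SELECT ROW_NUMBER() OVER (PARTITION BY subject_id
                          ORDER BY date, id) AS flow_no,
       id, date, lecture, subject_id
FROM Lecture
WHERE is_deleted = 0;
```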

answered Aug 29 '12 at 08:08
